A CASE STUDY OF A TEACHER SHIFTING TOWARD RESPONSIVE PLANNING USING A LEARNING PROGRESSION

By

Julia Alexander Christensen

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Curriculum, Instruction, and Teacher Education—Doctor of Philosophy

2023

ABSTRACT

The Framework for K-12 Science Education and the Next Generation Science Standards have called on teachers to shift from having students learn about science topics to having students engage in a process of figuring out how and why natural phenomena occur. These reform documents push teachers away from a transmission-style (TS) approach to science teaching, in which students' ideas and ways of knowing are often ignored or treated as misconceptions. Instead, these reform documents require that teachers use a responsive instructional (RI) approach, in which student ideas are attended to and used as resources for learning. Research on the use of RI provides evidence of greater student learning gains compared to those observed with the use of TS. Though RI has been called for within reform efforts and its benefits have been demonstrated, this approach is not common in most science classrooms, and it can be challenging for teachers to learn to plan for and enact this type of instruction.

Most studies of RI focus on teachers' enactment, often examining how a teacher elicits, interprets, and/or responds to student ideas during instruction. However, the way a teacher plans for instruction strongly influences enactment, as teachers do not commonly stray from their lesson plans once instruction has begun. Therefore, if a teacher is to enact RI, they must first plan ways to attend to and use student ideas as resources for learning, or engage in what I call responsive planning. One tool that may support a teacher's responsive planning is a learning progression (LP). As a model of how student ideas may change over time, an LP used during planning may help (1) to increase teacher awareness of commonly held student ideas to plan for and (2) to inform decisions about instructional next steps that build on student ideas.

Therefore, this case study investigated how a high school physics teacher with a TS approach to science instruction began to intentionally engage, with PD support, in responsive planning; how his planning changed over time; and how his planning was supported with the use of an LP. Data were collected during PD-supported planning meetings held during a 6-week force and motion unit that was taught twice, once in the Fall and again in the Spring. At the start of the study, the participating teacher's planning was informed by his TS instructional approach. However, over time, the teacher made changes across three dimensions of responsiveness: (1) the amount of attention he planned to give to student ideas during instruction, (2) the type of discourse structures he planned to use, and (3) the types of roles he planned to have students take on during instruction. As compared to his original TS-informed lesson plans, the teacher's Fall eliciting lesson plans showed increased attention and role responsiveness. The attention responsiveness of his Fall responding lessons also increased from the beginning to the end of the unit. As compared to the Fall responding lessons, the teacher's Spring responding lessons showed more discourse and role responsiveness. Over time, these gradual shifts in attention, discourse, and role amounted to significant changes in the responsiveness of his lesson plans.
The data also suggest that these shifts occurred when PD support focused on shifting one dimension at a time, rather than multiple at once. Additionally, the teacher used the LP both when his planning followed a TS approach and as it transitioned to being more responsive. Together, these findings suggest (1) that PD providers could use a similar approach to supporting teachers as they shift toward responsive planning (starting with one dimension, rather than trying to shift multiple dimensions at once) and (2) that an LP, when coupled with PD, may be useful to teachers as they shift toward more responsive planning.

This dissertation is dedicated to my husband, Adam, for supporting my stupid, stupid decision to go to grad school (See Ruben, 2010).

ACKNOWLEDGEMENTS

I wish to acknowledge the following people for their unending support during my graduate program and dissertation process: Alicia Alonzo, my doctoral advisor and dissertation chair; Amelia Gotwals, Christina Schwarz and David Stroupe, my dissertation committee members; Christie Morrison Thomas, my cohort member and friend; Kate Miller, my analysis partner and friend; the many members of the science ed writing group; Adam Christensen, my husband; Sandy Alexander, my mother; and, finally, AJ (pseudonym), my friend and former colleague who acted as my participating teacher during this study.

Lastly, this work was supported by the National Science Foundation under Grant No. DRL-1253036. Any opinions, findings and conclusions or recommendations expressed in this dissertation are those of the author and do not necessarily reflect the views of the National Science Foundation.

TABLE OF CONTENTS

CHAPTER ONE: INTRODUCTION
    Research Questions (1.A)
CHAPTER TWO: LITERATURE REVIEW
    Responsive Instruction (2.A)
    Dimensions of Responsiveness (2.B)
    NGSS Reform Context (2.C)
    Planning (2.D)
    Learning Progressions (2.E)
    Responsive Planning: A Conceptual Framework (2.F)
CHAPTER THREE: METHODS
    Participant Teacher and Support for Responsive Planning (3.A)
    Researcher's Roles and Positionality (3.B)
    Data Generation (3.C)
    Data Analysis (3.D)
CHAPTER FOUR: FINDINGS
    Findings for Research Question One (4.A)
    Findings for Research Question Two (4.B)
    Findings for Research Question Three (4.C)
CHAPTER FIVE: DISCUSSION
    Notable Themes (5.A)
    Implications and Recommendations (5.B)
REFERENCES
APPENDIX A: FORCE & MOTION LEARNING PROGRESSION (FMLP)
APPENDIX B: INTRODUCTION TO LEARNING PROGRESSIONS PRESENTATION SLIDES
APPENDIX C: POST-UNIT REFLECTIVE CONVERSATION PROTOCOL
APPENDIX D: LESSON PLAN MEMO EXAMPLE

CHAPTER ONE: INTRODUCTION

Students come to school with ideas and ways of thinking about the world that play a critical role in their science learning (National Research Council, 2007, 2012). However, these ideas and ways of thinking are often ignored or treated as misconceptions, rather than as resources for learning (Campbell et al., 2016). Over the last several decades, research in science education has focused on responsive instruction (i.e., instruction that attends to and uses student ideas as resources for learning), its benefits, and ways to support teachers in enacting it. Such research includes three forms of responsive instruction: formative assessment (Andersson & Palm, 2016; Black & Wiliam, 1998a; Ruiz-Primo & Furtak, 2007), responsive teaching (Fennema et al., 1996; Pierson, 2008; Richards & Robertson, 2016), and anticipating student ideas and responses (Cartier et al., 2013). Collectively, the research on these forms of responsive instruction provides evidence that when teachers engage in responsive instruction, students demonstrate higher learning gains than when teachers use a transmission model of instruction, such as lectures on content with confirmatory lab activities (Driver et al., 1994).

Responsive instruction is not just called for, but required, by the most recent reform efforts in science education, articulated by A Framework for K-12 Science Education (Framework; National Research Council, 2012) and embodied in the Next Generation Science Standards (NGSS; NGSS Lead States, 2013). These reform documents promote a shift away from learning about science topics toward engaging students in a process of figuring out how natural phenomena occur. This figuring out process generally begins with students sharing their ideas about a phenomenon, followed by teachers using the shared ideas as resources for learning and to inform their instruction (i.e., responsive instruction).

However, responsive instruction is not common in most science classrooms (Gotwals et al., 2015; Smith, 2020).
This type of instruction requires a substantial amount of time and support to develop (Bennett, 2011) and has been found to be quite challenging for teachers to enact (Gotwals & Birmingham, 2016; Heritage, 2007; Levin et al., 2009; Maskiewicz, 2016; Stahnke et al., 2016). Contributing to the rarity of responsive instruction may be the commonly taken evaluative stance toward students' ideas that "focuse[s] on how much students have learned" with respect to canonical ideas (Minstrell et al., 2011, p. 2). When using such an evaluative stance, teachers are likely to categorize student ideas as either correct or incorrect (Otero & Nathan, 2008). Therefore, teachers using the evaluative stance are likely to miss the many different ideas students may have about a topic or to see those ideas as obstacles to learning rather than as resources for learning (Larkin, 2012).

The use of the evaluative stance aligns with the transmission model of science instruction (Abd-El-Khalick et al., 2004). When using a transmission model, the goal for students is to learn about or acquire scientific content, and the goal for instruction is to deliver accurate science content. Instructional approaches taken to achieve this goal typically include didactic lectures and confirmatory demonstrations or lab experiences (Abd-El-Khalick et al., 2004). While students may contribute to the discourse of lectures, these contributions most often occur in an initiation-response-evaluation (IRE) format directed by the teacher (Howe & Abedin, 2013; Mehan, 1979). Within the IRE format, the teacher evaluates the accuracy of a student's response to a teacher-initiated question. Demonstrations carried out by the teacher and lab experiences completed by students supplement lectures and provide students with physical representations and confirmation of science concepts. With such a focus on 'correct' ideas, it is understandable that teachers would rely so heavily on an evaluative stance toward the novice and alternative ideas students have about the world when using a transmission model of instruction.

The kind of instruction teachers use, whether responsive or transmission-style, influences the entire 'plan-enact-reflect' teaching cycle, not just enactment. For example, for lectures, a common transmission-style enactment, a teacher's planning approach may focus on how to present information to students. This may include deciding how to break down large concepts into smaller pieces, how to sequence content pieces, what stories to tell, what analogies to use, and when to check for accuracy of student understanding. A teacher does not need to consider student ideas to plan for this kind of instruction. However, if student ideas are considered during planning for transmission-style enactment, a teacher may plan to correct ideas they view as misconceptions during the lecture and craft questions that specifically check to make sure students now hold the correct idea.

In contrast, when planning to enact responsive instruction, a teacher must plan ways to use student ideas as resources for learning (as opposed to as misconceptions). For example, planning using a responsive approach could include selecting a phenomenon and crafting questions that will draw out ideas the teacher anticipates his students holding, considering which of the elicited student ideas to respond to first, and selecting investigation activities that test students' ideas. In all of these examples of a responsive approach to planning, student ideas drive the planning.
Just as enactment influences planning, a teacher's plans strongly influence his enactment; teachers rarely make decisions while enacting instruction that radically change the direction of their plans (Clark & Peterson, 1986; Joyce, 1978), regardless of their instructional approach. Therefore, if a teacher plans a transmission-style lesson to present content or evaluate and/or correct student ideas, he is most likely to enact this type of instruction. Similarly, if a teacher plans a responsive lesson, he is most likely to enact a lesson that attends to and uses students' ideas. Because teachers only 'fine-tune' their plans during enactment (Joyce, 1978), if a teacher does not plan for responsive instruction, it is highly unlikely to occur. Therefore, it is important for teachers to engage in responsive planning practices: those that attend to and use student ideas to plan instruction.

Finally, a teacher's instructional approach also influences how teachers reflect. When reflecting, teachers consider what they noticed during instruction, which can come from a variety of different instructional elements (van Es & Sherin, 2002). During a transmission-style lesson, the teacher may notice how students are engaging with the lesson (e.g., note-taking, talking to a neighbor) and understanding the material (e.g., responding with correct or incorrect answers to questions). They may not notice the variety of ways students may be thinking about a topic. By definition, teachers using a responsive approach must notice student ideas in order to attend to these ideas and use them for instruction. Teachers using a responsive approach may notice student ideas during a discussion or as they review students' written work. Regardless of instructional approach, what teachers notice is available for reflection and for planning upcoming instruction. While teacher noticing of student ideas during instruction is essential to responsive instruction, in this study, I focus specifically on how a teacher plans to both elicit student ideas based on anticipated student responses and use the student ideas he has already noticed so that the decisions made during enactment work in the direction of responsive instruction.

Enactment of new standards (i.e., the NGSS) will require teachers using a transmission-style instructional approach to transition to a responsive instructional approach. Engaging in a responsive planning approach for the first time is likely to be challenging and to require time, PD, and new tools to support teachers' planning processes. Much of the research related to responsive instruction focuses on teacher practices enacted during instruction in the moment-to-moment interactions with students (e.g., Lineback, 2016; Ruiz-Primo & Furtak, 2007). The current literature contains very few examples of research on in-service teachers' responsive planning (e.g., Lampert, 2001; Mangiante, 2018). Even less attention has been paid to how teachers begin to intentionally engage in responsive planning and how their engagement may change over time with support. Since planning so strongly influences enactment, research on how to support shifts toward responsive planning may play an important role in improving teachers' enactment of responsive instruction. One tool that has been proposed as a support for planning and enacting responsive instruction is the learning progression.
Learning progressions (LPs) are "descriptions of the successively more sophisticated ways of thinking about a topic that can follow one another as children learn" (National Research Council, 2007, p. 214). LPs may organize commonly held student conceptions into a model of how students may progress in their understanding of scientific ideas. Some studies show that LPs, along with LP-related PD, may support responsive planning in a number of ways: through increased attention to and awareness of students' commonly held ideas (Christensen & Alonzo, 2018; Furtak, 2012), shifts away from an evaluative stance (Gunckel et al., 2018), and decisions about next steps or the ordering of instructional activities (Yin et al., 2014; Zhai et al., 2018). Generally, these studies focus on how an LP may support responsive instruction. This study aimed to understand how an LP may support responsive planning practices, particularly for a teacher transitioning to responsive instruction.

Research Questions (1.A)

For this study, I used an interpretivist case study (Dyson & Genishi, 2005) of a teacher beginning to intentionally engage in responsive planning. My case study teacher was a high school physics teacher whose teaching practice at the start of the study represented a transmission style of science instruction still common in classrooms today (Smith, 2020). I have identified this teacher as a critical case (Flyvbjerg, 2011) because he shifted his practice under several ideal conditions: receiving on-going PD support for his planning process and a local teaching context that was supportive of his decision to shift toward responsive instructional practices. I ask the following research questions:

1. How does a teacher with a transmission style of science instruction begin to intentionally engage, with PD support, in responsive planning?
2. How does the teacher's responsive planning change over time?
3. How does the teacher use the LP to support his responsive planning, both in the beginning and over time?

CHAPTER TWO: LITERATURE REVIEW

This is a study of responsive planning: attending to and using student ideas as resources to plan instruction. In the first three sections, I provide a review of literature relevant to responsive planning. First, I define responsive instruction, the instructional aim of responsive planning, and draw on three forms of responsive instruction to help explicate my definition: formative assessment, responsive teaching, and anticipating student ideas and responses. Second, I describe three dimensions of responsiveness and how they have been used to describe responsiveness in the literature. Third, I describe the NGSS reform context and its potential influence on my study. In these first three sections, I aim to describe what responsive instruction looks like in the classroom, thereby providing a vision or instructional goal for a teacher's responsive planning. In the next section, I draw on research related to teacher planning to motivate my study's focus on the planning aspects of responsive instruction. In the following section, I describe LPs, research on their influence on teacher awareness and view of student ideas, and research on their use as supports for responsive instruction. In the final section, drawing on the literature from all five previous sections, I present my conceptual framework for responsive planning.
Responsive Instruction (2.A)

According to constructivist learning theories, new knowledge can only be built on prior knowledge derived from lived experiences (Dewey, 1916), including previous education. New information that does not relate to current ideas is "either quickly forgotten or is remembered only as rote-learned statements" (Wiser et al., 2012, p. 361). Thus, students' previously held ideas, which may or may not be consistent with scientific knowledge, play an important role as students learn new ideas. This understanding of the learning process suggests that teachers should engage in responsive instruction, which I define as attending to and using students' ideas as resources for learning. I draw on literature describing three forms of responsive instruction: formative assessment, responsive teaching, and anticipating student ideas and responses. I have identified these three forms of instruction as 'responsive' because they are all based on a constructivist theory of learning and have a goal of drawing teachers' attention to and building on students' ideas (i.e., using ideas as resources for learning). In the sub-sections below, I present the three forms of responsive instruction (formative assessment, responsive teaching, and anticipating student ideas and responses), along with brief overviews of the reported benefits and challenges of each.

Formative Assessment (2.A.1)

Formative assessment is a process by which teachers "collect evidence about how student learning is progressing during instruction so that necessary instructional adjustments can be made to close the gap between students' current understanding and the desired goals" (McManus, 2008, p. 3). For a teacher, formative assessment consists of three practices: eliciting and interpreting students' ideas and ways of thinking about a topic or phenomenon in order to respond with appropriate instructional support (e.g., Gotwals et al., 2015; Ruiz-Primo & Furtak, 2007). Eliciting involves collecting information about student thinking. This can be done using tasks, such as a group modeling activity or writing prompt, intended to draw out students' ideas about a topic or phenomenon. Decisions about what phenomenon and writing prompts will best elicit student ideas are made during planning. Interpreting involves analyzing or making sense of the collected information about student thinking. For example, a teacher may review students' responses to a writing prompt to see what kinds of ideas they are holding, which can inform the way the teacher plans subsequent instruction. Eliciting and interpreting can also be enacted in less formalized ways (e.g., Bell & Cowie, 2001; Shavelson et al., 2008). For example, a teacher may ask 'on the fly' questions (i.e., elicit) as students discuss their ideas in small groups. The teacher could then interpret students' responses in-the-moment. Regardless of when or how evidence of student thinking is gathered, the teacher's understanding from eliciting and interpreting can then inform how the teacher might respond. Responding decisions can be made during instruction, such as asking students to clarify an idea (respond by further eliciting) or providing feedback to help students revise their thinking during an activity. However, teachers can also plan to respond on a longer time scale. For example, a teacher may write new lessons or adjust future lessons to address student thinking (Andersson & Palm, 2016; Ruiz-Primo & Furtak, 2007).
Research suggests that teachers' use of formative assessment practices can improve student learning outcomes. In one of the earliest syntheses of research on formative assessment, Black and Wiliam (1998b) found that a large variety of empirical studies "all show that attention to formative assessment can lead to significant learning gains" (p. 17). As an example of one of the studies in this synthesis, Bergan, Sladeczek, Schwarz, and Smith (1991) trained an experimental group of kindergarten teachers on how to collect (i.e., elicit) and analyze (i.e., interpret) formative assessment data from the students in their classes. These teachers used the data to inform their instructional responses. After controlling for pre-test scores, the learning gains of students in the experimental classrooms were significantly higher than those of students in the control classrooms, demonstrating a potential influence of the use of formative assessment on student learning gains. In a more recent study demonstrating similar outcomes, students whose teachers participated in PD on formative assessment practices were found to have higher learning outcomes than students whose teachers were in the control group (Andersson & Palm, 2016). In a study looking directly at teachers' formative assessment practices, Ruiz-Primo and Furtak (2007) found that greater frequency in use of informal formative assessment practices was linked to higher student performance on assessment tasks. In all of these examples, teachers elicited and interpreted their students' ideas and ways of thinking about a topic and responded with instruction that supported students as they progressed in their learning.

Despite its promising effects on student learning, analyses of teachers' practices show that formative assessment is not widely used (Gotwals et al., 2015). A number of studies have documented that even with support, teachers do not quickly or easily take up formative assessment practices (e.g., Gotwals & Birmingham, 2016; Heritage et al., 2009; Kang & Anderson, 2015; Stahnke et al., 2016). Some of the areas in which teachers struggle when trying to enact formative assessment practices include a limited ability to frame questions that elicit more than declarative information from students (Gotwals & Birmingham, 2016), not noticing student ideas even when they are elicited (van Es & Sherin, 2002), and not knowing how to adapt or plan future instruction (i.e., respond) based on student thinking (Shepard, 2009). Responding is often seen as the most challenging practice, yet the most critical for student learning (Grant et al., 2009; Ruiz-Primo & Furtak, 2007). Contributing both to these struggles and to the infrequent use of formative assessment may be teachers' limited awareness of, and evaluative view toward, student ideas.

Commonly, teachers take an evaluative 'gets it' or 'doesn't get it' (Minstrell et al., 2011) stance toward student understanding (Gotwals & Birmingham, 2016; Otero & Nathan, 2004). Both evaluations reflect a focus on canonically correct understanding, as something students understand ('gets it') or do not understand ('doesn't get it'). The 'gets it' category includes seeing students as 'getting it' if they can reproduce an idea using language from the textbook and/or can express the idea in a different way. The 'doesn't get it' category includes both 'missing' and 'wrong' interpretations of student ideas and thinking.
When interpreting student ideas as 'missing,' teachers are generally focused on the canonical knowledge they see as absent from student responses. One contributing factor to this interpretation may be a lack of awareness of student ideas, or not knowing what ideas (other than the correct ones) to pay attention to. In the extreme, teachers may interpret student thinking as 'missing' because they believe a student has no prior knowledge about a topic or view the student as a 'blank slate' (Kang & Anderson, 2015). If teachers focus on student ideas in terms of the completeness of alignment with canonical ideas as they elicit, the instructional implication is that teachers are more likely to plan responses that simply provide the 'right' answer (or content that students are missing), without regard for what ideas students already have (Otero & Nathan, 2004). This approach counters constructivist instructional approaches, which rely on students' prior knowledge as the basis for further learning.

When interpreting student ideas as 'wrong,' teachers may simply not see any correct ideas within a student response or may see the idea within a student response as different from canonical knowledge. A teacher may interpret student ideas as 'wrong' when they do not understand the substance of the student ideas or consider the ideas to be misconceptions (Campbell et al., 2016). Student ideas interpreted as 'wrong' are often thought of as obstacles for learning that need to be eliminated (Larkin, 2012). The instructional implication of treating students' 'wrong' ideas as obstacles is that teachers may plan responding activities that aim to replace (Christensen & Alonzo, 2018) or "squash" (Furtak, 2012, p. 1195) these ideas, instead of using them as building blocks for learning.

Responsive Teaching (2.A.2)

Robertson and colleagues (2016) define responsive teaching using three themes. First, responsive teaching foregrounds the substance of students' ideas. In practice, this means teachers are seeking to understand what students are saying (or writing) from the perspective of the student, "rather than to evaluate or correct it" (p. 2). This theme helps to describe the purpose of a teacher engaging in the formative assessment practice of interpreting, mentioned in the previous sub-section. As teachers elicit student ideas through discussion or written responses, the purpose is to listen for the ideas students have, regardless of scientific accuracy, rather than to evaluate the ideas as simply 'right' or 'wrong' (Coffey et al., 2011). Teachers can plan to foreground the substance of student thinking by crafting questions that will help them uncover how students are thinking about a topic or phenomenon, rather than simply whether the students hold canonical ideas.

Second, responsive teaching involves recognizing the disciplinary connections within students' ideas. Disciplinary connections, also described as the seeds of science (Hammer & van Zee, 2006), may be the first flickers of a scientific conception, such as the notion of air as material as a nascent version of kinetic molecular theory, or a desire for proof as a nascent version of the scientific practice of argumentation. In other words, responsive teaching means seeing, or interpreting, canonical ideas as 'present' or 'partially present' within student thinking (i.e., the 'gets it' category). Similar to interpreting (from formative assessment), recognizing disciplinary connections may occur during or after enactment.
When a teacher's interpreting considers the ways in which students' thinking connects with or is the 'seed of' disciplinary knowledge, the focus shifts toward the substance of student thinking (theme one) rather than whether students hold correct knowledge.

The third theme of responsive teaching, taking up and pursuing the substance of student thinking, overlaps with the formative assessment practice of responding in that teachers use student thinking to inform or plan upcoming instruction. To do this, teachers may adapt instruction, or respond, in the moment, such as by asking students to assess one another's ideas (Ball, 1993). Teachers may also plan to pursue the substance of student thinking by creating lessons that ask students to design and conduct experiments to test the varying ideas shared by students (Hammer, 1997) or by tailoring entire units to answer students' questions (Richards et al., 2015).

As with formative assessment, research demonstrates that responsive teaching can improve student learning outcomes (Carpenter et al., 1989; Empson & Jacobs, 2008; Fennema et al., 1993, 1996; Pierson, 2008; Richards & Robertson, 2016). Multiple studies (e.g., Empson & Jacobs, 2008; Pierson, 2008) have found greater student learning gains in classrooms with high degrees of responsive teaching as compared to those with transmission-style teaching (i.e., low degrees of responsive teaching). In addition, Fennema and colleagues (1996) found that as teacher practice shifted toward responsive teaching, students' mathematical achievement gains improved.

Also similar to the formative assessment literature, researchers note the challenges of enacting responsive teaching (Chazan & Schnepp, 2002; Levin et al., 2009; Maskiewicz, 2016; Robertson & Atkins Elliott, 2020; Rop, 2002; Tang et al., 2009). Maskiewicz (2016) notes challenges from the perspective of a teacher learning to enact responsive teaching, such as balancing facilitation and control and knowing what to do with 'wrong' ideas. Teachers' perceptions of responsive teaching, such as the view that responsive teaching leaves students feeling confused and without the 'right' answer, have also been cited as challenges to enactment (Robertson & Atkins Elliott, 2020). Other challenges are linked to how responsive teaching may be at odds with teachers' contexts, such as a local administrator who values classroom management over attending to student thinking (Chazan & Schnepp, 2002) or high-stakes assessments that direct teachers' attention toward the correctness of student thinking (Levin et al., 2009). One factor underlying all of these challenges of shifting toward responsive teaching is likely that teachers are moving away from a transmission model of science education.

Anticipating Student Ideas and Responses (2.A.3)

Another way for teachers to be responsive to student ideas is through the practice of anticipating student ideas and responses (Cartier et al., 2013). Teachers may anticipate a range of things when envisioning plans for the flow of instruction, from how two students will work well (or poorly) together to whether an instructional activity is at the right level of difficulty for students. Anticipating student ideas and responses entails considering the ideas students may already have about a topic (anticipating student ideas; ASI) and actively envisioning how these ideas might shape the way students interpret and respond to assessment and instructional tasks (anticipating student responses; ASR).
For example, using anticipated student ideas about force and motion, a teacher can anticipate student responses, such as the ways students may represent forces when modeling a collision or the kinds of hypotheses students may suggest during an investigation of the relationship between force and acceleration. These anticipations may then shape the planning decisions the teacher will make about the kinds of instruction to provide students during the unit.

Anticipating can inform the way teachers engage in formative assessment and responsive teaching. As part of formative assessment, anticipating the kinds of ideas students may hold can shape the way teachers elicit. As they plan, teachers can develop discussion questions and writing prompts in ways that will either elicit the specific ideas they anticipate and/or ensure that elicitation tasks can be answered from a broad range of anticipated student perspectives. Anticipating can also inform the development of lesson plans prior to instruction so that teachers are prepared to respond to or take up and pursue student thinking as part of formative assessment or responsive teaching, respectively.

The research on anticipating student ideas and responses is less extensive than that on formative assessment or responsive teaching. However, the several studies that do exist demonstrate some benefits and challenges of the practice. For example, in a study on pre-service elementary teachers' planning of investigation-based science discussions, teachers' anticipation of alternative student ideas during planning seemed to support how they crafted questions that would surface (i.e., elicit) a variety of student ideas (Kademian & Davis, 2018). In a study on supportive curriculum for pursuing student thinking (Grant et al., 2009), the curricular materials used included examples of potential student ideas (i.e., student ideas teachers can anticipate) about how to solve mathematical problems. As both observed during instruction and reported in teacher interviews, teachers used the potential student ideas to plan and enact instruction. In this study, the curricular materials supported teachers in anticipating student ideas and responses, addressing one of the biggest challenges teachers face when anticipating: a lack of awareness of the many ideas and ways of thinking that students may have about a topic. Without an awareness of student ideas, it is difficult, if not impossible, for teachers to use student ideas to inform their instructional plans until after they have elicited ideas from students. Therefore, Fennema and colleagues (1996) suggest that teachers rely on "researcher literature…of common student understandings" (p. 322) to support their anticipations, particularly during planning.

Dimensions of Responsiveness (2.B)

In the literature, responsive teaching is described in terms of 1) attention to students' ideas during instruction, 2) classroom discourse structures, and 3) roles students play in constructing knowledge. I refer to these as dimensions of responsiveness; together, they describe what responsive instruction might look like during enactment. I use these dimensions to evaluate my participating teacher's lesson plans for responsiveness (see section 3.D.2). The first dimension, attention, captures how much instruction incorporates student ideas and thinking. The discourse dimension captures who is doing most of the instructional talking, the teacher or the students.
The student role dimension captures who is constructing knowledge and deciding the learning pathway, again, the teacher or the students. When instruction is high in responsiveness, teachers draw the attention of the class to the substance of student thinking, use discourse structures high in student-talk, and allow students to play a significant role in driving the construction of knowledge and the direction that instruction may take.

Some studies of responsive instruction (e.g., Coffey et al., 2011; Pierson, 2008; Ruiz-Primo & Furtak, 2007) use one or two of these dimensions (not all three) and often integrate them into a single description of responsiveness. Generally, this is because these studies focused on a specific instructional context (e.g., instruction during discussions; Pierson, 2008) that may not have warranted all three dimensions. However, because I anticipated a wide range of instructional contexts for this study, I treat these dimensions as separate ways of describing responsiveness. In the subsections that follow, I describe how scholars have described instruction as less or more responsive using each dimension.

Attention to Students' Ideas (2.B.1)

Teacher attention to student ideas is at the heart of responsive instruction, with instruction considered high in responsiveness when teachers attend to the substance of students' ideas and thinking and low in responsiveness when teachers' attention simply acknowledges or evaluates the correctness of student thinking. I use three example studies (Gotwals et al., 2015; Levin et al., 2009; Pierson, 2008) to illustrate these descriptions of low and high attention responsiveness from the literature.

When identifying what counted as responsive in their study of novice teacher attention to student thinking, Levin and colleagues (2009) stated they "consider[ed] it evidence of attention to student thinking when the intern notice[d] and respond[ed] to a student's idea" with a "focus on the sense of the idea from the student's perspective" (p. 147). Therefore, interns needed to attend not just to the student idea but to the substance of the idea for the move to be considered highly responsive. Similarly, Pierson (2008) considered instruction to be high in responsiveness when teachers "explore student thinking and allow [student] reasoning to be the focal point" of classroom discussions (p. 79). In this example, "student reasoning" acted as the substance of students' thinking to which teachers attend. Finally, when analyzing video of classroom formative assessment, Gotwals and colleagues (2015) considered attention to student ideas for the purpose of seeing the understanding (i.e., substance) within their ideas.

In contrast, when teachers' attention is limited to evaluations of correctness or simple acknowledgements, scholars considered this evidence of low responsiveness. Evaluation of student thinking was represented by classifying student ideas or thinking as either correct or incorrect. For example, Gotwals and colleagues (2015) considered instruction low in responsiveness when teachers attend to students' ideas "to see if they are right/wrong" (p. 411). Levin et al. (2009) noted that if the novice teachers "notic[ed] or respond[ed] only to correctness," then the instructional move would not be considered responsive. Finally, Pierson's (2008) study also included the use of the IRE pattern (Lemke, 1990; Mehan, 1979) during discussions as an indication of low responsiveness.
During class discussions, the IRE pattern begins when a teacher asks a question of the class ('initiate'), students 'respond,' and the teacher 'evaluates' the response. Therefore, the attention teachers give to student ideas when using the IRE format for eliciting is for the purpose of evaluating for correctness. Pierson also includes simple acknowledgements such as "oh" or "thank you" (p. 70) as representing low attention responsiveness. While attention was given to student ideas during instruction in all three studies, when the attention did not focus on the substance of students' ideas or thinking, the attention was considered low in responsiveness.

Discourse (2.B.2)

Another dimension of responsiveness focuses on who is doing the majority of the talking during instruction, the teacher or the students (Pierson, 2008; Richards et al., 2020). When classroom discourse is high in student-talk, the instruction is often seen as highly responsive. In their framework for responsiveness, Richards and colleagues (2020) used student-to-student talk moves as a way to identify highly responsive classroom discourse. Since the teacher is removed from the classroom discourse when students talk to each other (e.g., during a classroom debate or when in small groups), student-to-student talk represents moments when the amount of student-talk is high and, therefore, classroom discourse is highly responsive. Thompson and colleagues (2016) counted the number of talk turns between students and teachers during classroom discussions. More talk turns that engaged students in classroom discourse represented a greater amount of student-talk, which contributed to their description of highly responsive classroom discourse.

In contrast, instruction low in responsiveness has high amounts of teacher-talk. Therefore, Richards and colleagues (2020) considered lectures to be a discourse structure low in responsiveness, as a lecture is a classroom activity in which teachers often present content with little to no student input. If students are allowed to contribute to the discourse during a lecture, such as when responding to a teacher's question, student contributions are often short and follow an IRE pattern. Thompson and colleagues (2016) considered episodes of classroom discourse to be not responsive "when the teacher was the only one talking" (p. 16). To summarize, as the amount of student-talk increased within classroom discourse, scholars considered the responsiveness of the instruction to increase as well.

Student Role (2.B.3)

The last dimension of responsiveness commonly used in the literature to describe the responsiveness of instruction is the role students play in constructing knowledge. Generally, instruction is considered responsive if students are given an opportunity to construct knowledge during instruction. For example, Gotwals and colleagues (2015) considered classroom discussions that were "co-lead by students" and "guided by student ideas" to be at the highest level of responsiveness (p. 411). Since students co-led the discussion, they would get to decide (play a role in) whether and how student ideas contributed to their understanding of the content (constructing knowledge). The teacher's role in this process is, therefore, reduced. Elby and colleagues (2020) similarly include student role in their analytic tool for capturing the responsiveness of instruction. They describe students as "the knowledge-creators" (p. 2087) if, for example, students sustained their classroom debate with no nudges from the teacher.
When students were positioned as knowledge-creators, the classroom instruction was considered high in responsiveness.

In contrast, instruction was considered less responsive when students were not given a role in constructing knowledge, or rather when the teacher was the sole constructor of knowledge. This aligns with how Gotwals and colleagues (2015) considered discussions that were solely guided by the teacher (no student role in constructing knowledge) as being low in responsiveness. As an additional example, Elby and colleagues (2020) describe low instructional responsiveness in their analytic tool as positioning the teacher as the "source and/or arbitrator of knowledge claims" (p. 2087). Therefore, if a teacher explained content to students, Elby and colleagues considered this to be low responsiveness since students played no role in constructing knowledge. To summarize, the more students (are allowed to) contribute to knowledge construction, the more responsive scholars consider the instruction to be.

NGSS Reform Context (2.C)

The most recent reform in science education, articulated by the Framework for K-12 Science Education (Framework; National Research Council, 2012) and Next Generation Science Standards (NGSS; NGSS Lead States, 2013), provided a motivating context for my study's focus on responsiveness. These guiding documents advocate a shift in the focus of science education, from students learning about science topics to students figuring out how or why something happens. In this framing of science education, students should "continually build on and revise their knowledge and abilities, starting from their curiosity about what they see around them and their initial conceptions about how the world works" (National Research Council, 2012, p. 11). In other words, student ideas are the necessary starting point of the figuring out process. By emphasizing attention to and use of student ideas, the Framework and NGSS align with responsive instruction.

The Framework calls for students to build on their ideas by engaging in science and engineering practices (SEPs), i.e., the figuring out process. In the years following the release of the Framework and NGSS, most states (44 at the time of the study) have either fully adopted the NGSS or used the Framework to guide the creation of their own state science standards (NSTA, 2017). Within the Framework, knowledge and practices are integrated, rather than treated as separate learning targets. This parallels the way knowledge is constructed within the discipline of science, by using "a set of practices…to establish, extend, and refine" knowledge (National Research Council, 2012, p. 26). Novices tend to hold disconnected and even contradictory pieces of knowledge as isolated facts and struggle to find ways to integrate or make sense of them, whereas experts come to understand and organize the knowledge of their discipline through their experiences engaging in disciplinary practices (National Research Council, 1999). The assumption, then, is that by supporting students' engagement in the SEPs as they build toward or are introduced to scientific concepts, students will gain a deeper and more integrated understanding of science and engineering knowledge.
Therefore, the Framework advocates for students to engage in SEPs to support the construction of understanding from their initial ideas toward scientific understandings of the natural world. The NGSS identify eight core SEPs for this purpose: 1) asking questions and defining problems, 2) developing and using models, 3) planning and carrying out investigations, 4) analyzing and interpreting data, 5) using mathematics and computational thinking, 6) constructing explanations and designing solutions, 7) engaging in argument from evidence, and 8) obtaining, evaluating, and communicating information.

The integration of knowledge and practice requires a meaningful context or focus for sense-making, such as a natural phenomenon. The use of phenomena to anchor instructional experiences allows students to learn through the application of science concepts and practices (Achieve & NextGenStorylines, 2016). This contrasts with a transmission model that teaches about concepts as discrete facts and may ask students to apply them in context later, if at all. For example, using the NGSS, students might construct an evidence-based explanation (SEP #6) for how a tree grows (the natural phenomenon) that uses the scientific concept of mitosis, rather than simply learning about the topic of mitosis. Good anchoring phenomena are often observable events and build on everyday or family experiences that are relevant to students, allowing them to share their ideas about the way the world works (Penuel & Bell, 2016).

By calling for the integration of science knowledge and practice through the exploration of natural phenomena, the Framework and NGSS provide a new vision for science classroom instruction that both aligns with responsive instruction (i.e., attending to and using student ideas as learning resources) and pushes against a transmission model of science instruction that focuses on students' acquisition of 'correct ideas.' For example, NGSS-aligned instruction might have students create models (SEP #2) of a phenomenon based on their initial ideas about how the phenomenon occurs. Later, the teacher may have students analyze data (i.e., SEP #4) to evaluate and revise their ideas about the phenomenon. In this scenario, student ideas are an essential element of both activities. The modeling activity provides an opportunity for the teacher to attend to students' initial ideas and then use them to inform their plans for later instruction (the data analysis activity). In shifting from instructional activities that draw on a transmission model to those called for by the Framework and NGSS, as in the examples above, teachers will necessarily need to plan differently.

Planning (2.D)

Teacher planning is an important part of the plan-enact-reflect instructional cycle. Planning informs the way instruction is enacted. During enactment, teachers notice a variety of instructional elements including student ideas (van Es & Sherin, 2021), which can be reflected on to inform how teachers plan for upcoming instruction. Kennedy (2006) conceptualizes teacher planning as a mental act of "envisioning." Teachers "see" in their minds what will happen as they construct each lesson or episode, as if they are creating a play with different characters and props, important elements of timing, and a purpose for the episode. In fact, studies of teacher planning have found that many of the details within teachers' plans are not actually written down but exist in the mental agendas teachers create (Borko & Livingston, 1989).
Teachers continue to update their mental agenda for upcoming lessons based on their reflections from classroom experiences (Borko et al., 1990; Edmunds, 2011). However, the "pre-active decisions" (Joyce, 1978, p. 75), or those made before instruction begins, generally establish the boundaries within which in-the-moment decision making occurs. In-the-moment decisions have been characterized as "fine tuning" (Joyce, 1978, p. 75) the set of activities that have been previously prepared. Once instruction has begun, teachers rarely make decisions that radically change the direction of their instructional plans (Clark & Peterson, 1986; Joyce, 1978). In other words, if teachers' instructional plans use a transmission style of instruction, teachers are more likely to enact transmission-style instruction. Therefore, the planning process is particularly important to attend to and support when asking teachers to shift their instructional practice, such as the shift to responsive instruction called for by the NGSS reform.

Planning Within The NGSS Reform Context (2.D.1)

Due to the shifts called for by the Framework and NGSS, teachers will be faced with new kinds of planning decisions. Teachers will need to select phenomena with attention to the science concepts needed to explain the event, ensure that the phenomena are both relevant to students and complex enough to engage students in the figuring out process, and decide how to present phenomena to students so that they elicit student ideas and thinking about the underlying science concepts (National Research Council, 2015; Penuel & Bell, 2016). To integrate science concepts and SEPs, teachers will need to plan SEP-based activities in ways that leave the intellectual work to students (National Research Council, 2015). This contrasts with the use of confirmatory lab experiences (Katchevich et al., 2013), which may appear to include SEPs. In confirmatory lab activities, students may carry out the investigation but are not involved in the planning of it. However, if students are to engage in the practice of planning and carrying out investigations (i.e., SEP #3), teachers will need to plan activities that support students in making some of their own decisions about what data to collect and how to collect it, rather than planning activities that ask students to follow the procedural steps of a confirmatory lab in which these important decisions are already made.

When planning within the NGSS context, teachers will also need to adapt their initial plans in order to respond to or take up and pursue student thinking as students share and revise their ideas throughout the unit (McManus, 2008; Robertson et al., 2016). For example, a teacher may need to choose a different reading, one that more effectively speaks to the line of questioning students are taking or provides information necessary for students to refine their thinking.

As illustrated by the development of resources and PD to support teachers in planning NGSS-aligned instruction (e.g., Colson & Colson, 2016; Krajcik et al., 2014), these new kinds of planning decisions, such as choosing a phenomenon, may be challenging for teachers. Some of the challenges identified in the responsive instruction literature may contribute to the challenges teachers face within the NGSS context. For example, teachers may struggle with anticipating the kinds of ideas students will have about a phenomenon and, therefore, face uncertainty in choosing the phenomenon.
This may be related to a lack of awareness of the common ideas students hold about a topic because teachers have traditionally focused on listening for the 'correct' answer (Otero & Nathan, 2004) rather than listening to understand student thinking. Even if teachers are able to anticipate students' ideas about a phenomenon, they may struggle to adapt the flow of instruction in ways that build on students' ideas (i.e., respond; Ruiz-Primo & Furtak, 2007). In other words, they may struggle with organizing activities that support shifts in students' thinking over the course of a unit, as they may be used to designing activities that aim to replace student ideas with 'correct' ones (i.e., treating student ideas as misconceptions; Christensen & Alonzo, 2018). One type of tool that may support teachers with these planning challenges is the learning progression.

Learning Progressions (2.E)

Learning progressions (LPs) are models of how student ideas or thinking develop over time (Alonzo, 2012; Lehrer & Schauble, 2015). Scholars have noted that some LPs highlight student ideas, while others highlight patterns of reasoning (Jin et al., 2019; Shavelson & Kurpius, 2012). I focus on the former, as this best describes the LP used for this study. These LPs contain common ideas about a topic, organized to represent a hypothesis about how student understanding may progress. The least sophisticated student ideas are found in the lower anchor (i.e., bottom level), and the most sophisticated or canonically correct ideas are found in the upper anchor (i.e., top level). The middle levels contain intermediary ideas, in increasing sophistication, that students may hold as they learn; these ideas are likely important to the construction of ideas later in the progression (National Research Council, 2007). For example, realizing that objects have some properties because they are made of a particular material is likely "a critical first step toward understanding" atomic-molecular theory (p. 220). Additionally, students typically hold the set of ideas at each LP level at the same time (Duncan & Hmelo-Silver, 2009). For example, the Force and Motion LP used in this study indicates that students commonly characterize the effect of an external force acting on an object as causing the constant velocity of the object. These students also tend to hold the idea that multiple external forces acting on an object should be added to determine the amount of force the object is experiencing. Therefore, these two ideas are found in the same level of the LP. Typically developed using constructivist theories of learning, LPs provide models of how students might build understanding at one level by leveraging the ideas found in the previous level. In the following sub-sections, I discuss empirical research that has explored a) the influence of LPs on teacher awareness and views of student thinking and b) the use of LPs as a support for teachers' responsive instruction.

Influence on Awareness and Views of Student Ideas (2.E.1)

The use of LPs has been hypothesized to support teachers' shifts away from an evaluative stance (i.e., interpretations of 'right,' 'wrong,' or 'missing'). Some empirical evidence supports this hypothesis. Gunckel and colleagues (2018) found that teachers working with an LP and associated curricular materials took a more nuanced perspective when considering student ideas.
Teachers shifted away from treating student ideas as misconceptions (i.e., an evaluative stance) and toward one that considered students’ ideas as resources for learning and teaching (i.e., an interpretive stance). Other researchers hypothesize that LP-supported shifts toward a more nuanced perspective may be due to an increase in teacher awareness of students’ ideas. With the support of an LP and associated PD, teachers in Furtak’s (2012) study showed an increase in their ability to identify student ideas during class discussions, demonstrating increased awareness of specific student ideas. In another study reporting increased awareness of student ideas, teachers with support from an LP and associated PD reported greater awareness of specific student ideas during instruction and when reflecting on student work (Christensen & Alonzo, 2018). Most of these teachers additionally reported a greater awareness that students hold ideas about physics topics in general (i.e., general awareness of student ideas). Increased awareness of student ideas (both specific and general) may be a first step toward disrupting a ‘gets it’/’doesn’t get it’ perspective and focusing teachers’ attention on the ideas that students have, regardless of their correctness.

Support For Responsive Instruction (2.E.2)

Much of the work on LPs as a support for teachers has been conducted within the framework of formative assessment (i.e., eliciting, interpreting, and responding). LPs have been hypothesized as a support for teachers’ formative assessment practices (Alonzo, 2012; Alonzo & Elby, 2019). Empirical studies, showing that LPs offer some support for teachers’ eliciting, interpreting, and responding to their students’ ideas, provide preliminary evidence for this hypothesis. In a four-year study of teachers’ formative assessment practices, teachers using LP-aligned elicitation tasks improved their eliciting over time while using these tasks (Furtak, Bakeman, et al., 2018) by asking more questions aimed at surfacing student thinking. These same teachers also improved their responding during class discussions by making fewer evaluative responses and by asking more questions that pushed on student thinking. In another study, preservice teachers used an LP to analyze video-recordings of student interviews (von Aufschnaiter & Alonzo, 2018). As they interpreted the students’ thinking, the preservice teachers used the LP to focus on relevant aspects of student thinking. In a study by Furtak (2012), LPs supported in-service teachers in making inferences about (i.e., interpreting) student thinking as they listened to student ideas shared during class. These inferences often aligned with the commonly held student ideas found on the LP used within the study.

Some of the research on LPs as a support for teachers’ responsive instruction suggests its potential as a useful tool when planning such instruction. In the same four-year study mentioned above, teachers developed elicitation tasks with the aid of an LP (Furtak, Circi, et al., 2018). In early versions of the tasks, only students with a correct (i.e., upper anchor) understanding could demonstrate their thinking; all other ideas could only be interpreted as ‘doesn’t get it.’ Over time, teachers developed tasks that were more aligned with the LP, allowing students with ideas at different levels of the LP to share their thinking.
In this case, the use of the LP to inform the development of elicitation tasks (a common goal of planning) supported the teachers’ ability to engage in ‘in-the-moment’ responsive instruction (during enactment). In a study of teachers’ formative assessment specifically focused on responding through adaptations to future instruction during planning, teachers made instructional adjustments to their original unit plans in response to LP-aligned class assessment data (Zhai et al., 2018). These instructional adjustments included changes to the sequence of activities to better align with the LP. In this case, the LP supported teacher planning by helping teachers organize the flow of activities within the unit.

Beyond formative assessment, LPs may also support teachers in anticipating student ideas and responses, as “research on student learning of the scientific ideas” embedded in classroom tasks is recommended to support teachers’ anticipating (Cartier et al., 2013, p. 29). While research on LPs as tools for anticipating is limited, two studies of formative assessment (Furtak & Heredia, 2014; Wooten et al., 2019) offer some evidence of LPs as supports for teachers’ anticipating of student ideas and responses during planning. In both studies, teachers planned instructional activities to address student ideas in the LP that they anticipated their students might hold, before determining whether their own students actually held these ideas. While formative assessment focuses on how teachers respond to the ideas elicited from students, these studies offer some evidence that teachers used an LP for anticipating and that this anticipating informed their planning of future instructional responses.

Responsive Planning: A Conceptual Framework (2.F)

Drawing on the literature discussed above, I define responsive planning as a set of practices that attends to and uses student ideas to create planned responsive instruction. Table 1 outlines the responsive planning practices used in this study and the form(s) of responsive instruction (formative assessment, responsive teaching, and anticipating student ideas and responses) that each reflects. I separate these four responsive planning practices into two categories based on their relationship, either direct or indirect, to planned responsive instruction. Figure 1 presents a visual representation of the relationship between each of the four responsive planning practices and planned responsive instruction.

Table 1
Responsive Planning Practices and their Alignment with Forms of Responsive Instruction

Responsive Planning Practices                                    FA    RT    ASI/R
1. Anticipating student ideas and/or responses                                 X
2. Interpreting student responses (verbal or written)            X     X
3. Planning to elicit student ideas and thinking                 X     X     (X)
4. Planning to respond to elicited student ideas and thinking    X     X

Note. FA – Formative assessment; RT – Responsive teaching; ASI/R – Anticipating student ideas & responses; X – Indicates that the practice reflects the form of responsive instruction; (X) – Indicates that the practice may reflect the form of responsive instruction.

Figure 1
Responsive Planning Practices and their Relationship to Planned Responsive Instruction

[Figure: a diagram in which the indirectly linked practices (anticipating and interpreting) produce student ideas that inform the directly linked practices (planning to elicit and planning to respond), which in turn produce planned responsive instruction.]

Note. Squares contain responsive planning practices, while hexagons contain the output/input of teachers’ engagement in the practices.

The two direct responsive planning practices are planning to elicit and planning to respond. When teachers engage in these practices, they envision how ideas for classroom activities will play out with students, for the purpose of eliciting or responding to students’ ideas and thinking, respectively. The direct products of these two responsive planning practices are teachers’ planned responsive instruction, which includes both their vision of the lesson and any classroom materials they have created to help them enact their vision. Teachers’ planned responsive instruction contains evidence of how the teacher plans to structure the classroom discourse and attend to students’ ideas, along with the roles the teacher plans to provide students during instruction.

The two indirect responsive planning practices are anticipating and interpreting. When teachers engage in these practices, they consider what student responses (anticipated or elicited) reveal about the ideas students (may) hold. Interpretations may be that students are missing canonical knowledge, hold wrong ideas, or that their ideas have a disciplinary connection to canonical knowledge (i.e., canonical ideas are present). The interpretations of student ideas can shape the way teachers use the ideas to inform their preparation of classroom activities (i.e., plans to elicit or respond). Therefore, the student ideas teachers anticipate and how teachers interpret student ideas (either anticipated or elicited) indirectly shape their envisioned lesson plans and classroom materials.

These responsive planning practices sit within the ‘planning’ portion of the plan-enact-reflect teaching cycle but are influenced by how a teacher enacts and reflects on their instruction. For example, the teacher’s ability to notice student thinking (van Es & Sherin, 2021) during enactment influences the kinds of student ideas they reflect on and that can be used to inform the teacher’s plans. In other words, a teacher’s noticing practice influences a teacher’s responsive planning practices and, ultimately, a teacher’s responsiveness. In this study, I focus on responsive planning only. In the following sub-sections, I describe each responsive planning practice identified in Table 1 and how it may connect to the other responsive planning practices as a teacher works to plan responsive instruction. Next, I describe how an LP may support a teacher when engaging in each of the responsive planning practices.

Responsive Planning Practices (2.F.1)

In this section, I discuss how each responsive planning practice (Table 1) draws from the form(s) of responsive instruction (formative assessment, responsive teaching, and anticipating student ideas and responses). For each practice, I provide NGSS-aligned examples (i.e., using phenomena and integrating practices and content).

Anticipating Student Ideas and/or Responses (2.F.1.a). One way a teacher can attend to and use student ideas to create planned responsive instruction is by anticipating student ideas and/or responses (Cartier et al., 2013). A teacher can use the anticipated ideas/responses to inform the classroom activities he plans and/or envision how the anticipated ideas may be used in responsive ways during instruction. As an example of the former, anticipating student ideas may help teachers prepare investigations that will allow students to test those ideas, should students hold them.
As an example of the latter, a teacher may anticipate student responses to a small group argumentation activity by considering the ideas students shared during a class discussion. The teacher might then use the anticipated responses to prepare follow-up questions aimed at uncovering or understanding the anticipated ideas, which he can use during the activity.

Interpreting Student Responses (2.F.1.b). The practice of interpreting is an explicit step in the process of formative assessment. The themes of foregrounding the substance of students’ thinking and seeing the disciplinary connections within students’ ideas, both from responsive teaching, describe the way teachers should engage in the practice of interpreting if it is to be considered responsive. The goal of interpreting is to understand student thinking from the student’s perspective, not to simply evaluate it for its accuracy, and to consider ways these ideas may be foundational to canonical ideas. Examples of student written responses that might be interpreted during planning include initial models of a phenomenon or students’ written explanations. While planning, teachers may also interpret the verbal responses students shared during a whole class discussion or as students worked in small groups. Similar to the practice of anticipating, the purpose of engaging in this practice is to develop understanding of student thinking to inform the planning of upcoming instruction.

Planning to Elicit (2.F.1.c). Both formative assessment and responsive teaching ask teachers to draw out student ideas so they can be used to inform instruction. Formative assessment makes this explicit through the practice of eliciting. Responsive teaching implies the practice of eliciting within the theme of foregrounding the substance of student ideas, which calls on teachers to listen for understanding. A teacher may plan to support this aspect of responsive instruction by preparing classroom activities that elicit student ideas. Teachers can do this in two ways: generally or specifically. First, a teacher may plan activities that elicit student ideas in general. For example, a teacher may have students construct an initial model of a phenomenon to elicit their ideas about how the phenomenon occurs. In this case, the activity allows a wide variety of student ideas to be elicited, not just the ones the teacher may have anticipated. Second, a teacher may plan to elicit specific student ideas, either ones previously elicited during instruction or those the teacher anticipates students holding. As an example of the former, a teacher may want to track how students’ ideas are evolving over time and therefore develop an elicitation task to see if students are still holding the initial ideas shared in a modeling activity. As an example of the latter, a teacher may select a phenomenon and craft questions related to the phenomenon to draw out the specific ideas he anticipates students holding while also listening for unanticipated ideas.

Planning to Respond (2.F.1.d). Planning classroom activities specifically designed to build on student ideas is another responsive planning practice. It aligns with the practice of responding from formative assessment and the theme of taking up and pursuing the substance of student thinking from responsive teaching. This type of responsive planning captures the ways that teachers use elicited or anticipated student ideas as resources for learning.
A teacher may plan an activity that uses an elicited student idea, such as the idea of ‘air as material,’ as a preliminary theoretical model and ask students to consider a variety of structures for the ‘material’ (e.g., uniform and blob-like, made of tiny individual pieces). This planned activity builds on the elicited student idea and uses it as a resource for learning. A teacher may also use anticipated student ideas and responses to help plan instruction before students’ ideas are elicited. For example, if a teacher anticipates three student ideas that might surface during a discussion, he can plan several responding classroom activities that ensure the three anticipated ideas can be investigated. Anticipating student ideas can help teachers prepare activities well in advance so that their mid-unit planning process feels like choosing between a set of pre-planned options, rather than feeling like they must develop an activity from scratch. Following from the example above, if two of the three ideas surfaced during the discussion, the teacher could then pick the two investigation activities that align with the ideas his students hold.

LP Support For Responsive Planning Practices (2.F.2)

LPs may support each of the four responsive planning practices (Table 1). First, an LP contains common student understandings about a topic and may increase teachers’ awareness of these ideas (Christensen & Alonzo, 2018; Furtak, 2012; Furtak & Heredia, 2014; Wooten et al., 2019). By increasing teachers’ awareness of student ideas, an LP may directly support teachers as they anticipate student ideas and responses (Cartier et al., 2013). For example, teachers may be able to better anticipate how students will respond to planned instruction once they are aware of the ideas that may be underlying students’ responses, or they may be able to anticipate a broader range of student ideas to address with planned instruction.

Second, an LP may support teachers as they interpret student responses (Furtak, 2012; Gunckel et al., 2018; von Aufschnaiter & Alonzo, 2018). Similar to how an LP supported teachers’ interpreting during classroom discussions (Furtak, 2012), an LP may support teachers in taking a more nuanced perspective when interpreting student responses during planning (Gunckel et al., 2018). This perspective counters the commonly used evaluative stance (i.e., right v. wrong/missing) toward student ideas (Otero & Nathan, 2008).

Third, an LP may support teachers in planning classroom activities that elicit student ideas (Furtak, Bakeman, et al., 2018; Furtak, Circi, et al., 2018; Wooten et al., 2019). For example, an LP can be used to ensure that elicitation activities allow students at each level of the LP to share their ideas. Therefore, LPs may support teachers as they develop elicitation activities that allow a broader range of student ideas to be elicited (e.g., Furtak, Circi, et al., 2018).

Finally, the hypothesized learning pathway of an LP could support teachers in planning to respond by preparing classroom activities that build on student ideas (Yin et al., 2014; Zhai et al., 2018). For example, teachers could compare the ideas at two adjacent levels of an LP to inform the kinds of classroom activities that could build on the ideas at the lower level toward the ideas at the next level (Yin et al., 2014). Teachers could also use the LP to organize activities within a unit (e.g., Zhai et al., 2018).
By doing so, the student ideas aligned with the lowest level would be built on or addressed first in ways that help bridge them to ideas in the next level of the LP (i.e., from level 1 to level 2), followed by instruction to create a bridge to the next level (i.e., from level 2 to level 3), all the way to the top level of the LP.

Summary (2.F.3)

When teachers engage in responsive planning practices, they are planning in ways that attend to and use student ideas as resources for learning to create planned responsive instruction. This contrasts with a transmission model of science instruction that relies on lectures and confirmatory labs to deliver content to students. As a teacher begins to engage in responsive planning practices, he is likely to struggle due to the challenges of responsive instruction. Along with PD, LPs may be useful to teachers engaging in responsive planning practices by increasing teacher awareness of student ideas, supporting a more nuanced perspective toward student ideas, and providing a model of a learning pathway.

CHAPTER THREE: METHODS

This is an interpretive case study (Dyson & Genishi, 2005) of a teacher beginning to intentionally engage in responsive planning, with support from an LP and on-going PD. At the start of the study, my case study teacher, AJ (pseudonym), taught using a transmission-style of instruction (heavy use of lectures and confirmatory lab experiences) and was interested in shifting toward more responsive teaching practices. Therefore, he made an ideal candidate for this study. I used a case study approach because AJ’s engagement in responsive planning informs and is informed by other aspects of his teaching practice and teaching context during the study, including his enactment of the planned responsive instruction and engagement with the supports offered (LP and on-going PD). The use of a case study approach allowed me to capture “the type of concrete, context-dependent knowledge that research on learning shows to be necessary” (Flyvbjerg, 2011, p. 302). From the rich descriptions of AJ’s responsive planning, in his local context, this study provides others with vicarious experiences through which they can make naturalistic generalizations to their own contexts (Stake & Trumbull, 1982).

AJ represents a critical case (Flyvbjerg, 2011) because he was shifting his planning under idealized conditions. Critical cases allow us to consider what is possible under the best circumstances. If, under ideal conditions, a teacher struggles to shift toward responsive planning, we can assume that teachers under less idealized conditions might face similar or additional challenges in making such shifts. AJ represents a critical case in two ways. First, AJ’s state had recently begun working to align its science standards with the Framework and did not assess students in AJ’s subject, physics. This circumstance provided AJ with more freedom to try shifting his practice without the threat of evaluation from standardized test scores, contributed to his local administration’s support of his participation in this study, and supported AJ’s interest in responsive instruction generally. Second, AJ and I spent nine years working together as science teachers at the same high school. During our time working as colleagues, we developed a level of understanding of each other and mutual trust that, in this study, facilitated rich discussions about his teaching practice, which might otherwise be a sensitive topic.
A teacher with whom I did not have such an open and honest relationship might not have been as willing to share his thoughts, opinions and feelings as freely as AJ did during this study.

In the sections below, I describe the methods used for this case study. In the first sub-section, I further describe my participating teacher and the supports provided to him during the study. Next, I describe my roles and positionality during the study. I then describe my data sources and how they were generated. Finally, I describe the data analysis methods I used to answer my research questions.

Participant Teacher and Support for Responsive Planning (3.A)

In this section, I begin by further describing my participating teacher (AJ) and his teaching context. I then describe the two forms of support provided to AJ during the study: the LP AJ used during planning and the PD I provided. Because of my close relationship with AJ and the fact that I provided the PD during the study, I conclude this section by describing my role in the study.

Participant Teacher and His Teaching Context (3.A.1)

AJ is a White, middle-class man who, at the beginning of the study, was in his mid-30s and had 13 years of high school science teaching experience. During the first ten years of his career, AJ taught chemistry and physical science in an NGSS Lead State. He then moved back to his home state, which was only in the initial phase of updating state science standards to align with the Framework. The (upcoming) NGSS reform context provided a new vision of science instruction for AJ. With this new NGSS-inspired vision (i.e., students engaging in a process of figuring out), he (re)considered the kinds of opportunities he would need to plan for and provide to students during instruction that were different from his current instruction. Since AJ anticipated his state’s shift to NGSS in the near future, he wanted to try incorporating two of the prominent elements of the Framework and NGSS as he shifted his instruction to be more responsive: integrating science and engineering practices with science content and using an anchoring phenomenon as a meaningful context for the figuring out process.

With the move back to his home state, AJ took a position teaching physics. This new physics position was at a small, suburban high school in a rural part of the state; it was the only high school in the district, which neighbored the district AJ attended as a student. During the academic year of the study, there were just under 500 students enrolled at the school, with a student body that was 80% White and 31% low-income (qualifying for free or reduced-price lunch). All students at AJ’s school took some level of physics as a district graduation requirement. AJ taught three different levels of physics courses each year: AP Calculus-based Physics, an honors physics course, and a general physics course. This study focuses on the general physics course, which was offered as a semester-long junior/senior level course for college-bound students not seeking a degree in a science-related field. AJ taught the course in both the Fall and Spring semesters.

AJ chose a unit on force and motion (F&M) as the focus of this study for two reasons. First, the F&M unit was the first major unit of instruction in the course.
He hoped that starting the course using responsive instruction would make it easier to support students in making the transition to a new type of instruction, as compared to trying to shift in the middle of the course. Second, of the three LPs (F&M, momentum and energy) that I had made available as supportive tools, he was most interested in and drawn to the F&M LP. I collected data from the F&M unit during the Fall 2019 and Spring 2020 semesters. In the 2018-2019 academic year (pre-study), AJ’s F&M unit lasted approximately two weeks each semester. Versions of the F&M unit from this study (i.e., during the 2019-2020 academic year) lasted approximately six weeks during each semester.

Supports for Responsive Planning (3.A.2)

During the study, AJ was supported by: 1) the use of the FMLP, 2) professional development I provided during the summer prior to the study, and 3) regular planning meetings I held with AJ during the implementation of his Fall and Spring F&M units. Below, I describe each of these supports in more detail.

Force & Motion LP (3.A.2.a). The LP used as a support during the study (FMLP; adapted from Alonzo & Steedle, 2009) focused on the relationship between force and motion. The FMLP was a major focus of the PD work I did with AJ during Summer PD (See 3.A.2.b next). While other LPs highlight patterns in reasoning that could apply to other topics (Jin et al., 2019), the FMLP describes commonly held student ideas about the relationship between force and motion in four levels. At each level, it describes how students with ideas at that level are likely to make predictions about four motion/force situations: 1) not moving, 2) moving, 3) not experiencing force(s), and 4) experiencing force(s). (See Appendix A.)

Summer PD (3.A.2.b). To launch AJ’s intentional engagement in responsive planning, I provided one-on-one, in-person professional development (PD) over four consecutive days during Summer 2019. Each day’s PD was between 6 and 8 hours long. All four days of PD were video- and audio-recorded and used as a secondary data source for the study (See 3.C.2 for details). Each day's activities are detailed below.

Day One (3.A.2.b.i). On the first day of PD (Day 1), I began by conducting a pre-PD interview with AJ (See 3.C.2). This was followed by an introduction to the general concept of LPs via a PowerPoint presentation (Appendix B). The presentation defined LPs and gave a brief overview of research on how LPs have been useful to teachers (e.g., increased awareness of student ideas). I planned to provide PD around LPs for three different physics concepts: force and motion (the FMLP), energy, and momentum. AJ chose the FMLP as the LP/topic to work with first. We used the FMLP to practice looking for student ideas in responses to LP-aligned ordered multiple-choice (OMC; Briggs et al., 2006) and open-ended questions and in transcripts of students being interviewed about their F&M ideas. The student responses in these materials were collected from related studies with the same grant funding as this one. Additionally, we used the FMLP to look for student ideas in responses that AJ collected during Spring 2019 (prior to the study). When I recruited AJ for this project in the Spring of 2019, I provided him with the same LP-aligned OMC and open-ended questions as those mentioned above, along with others that were aligned to the energy and momentum LPs. I encouraged AJ to use them with his students before the end of the Spring 2019 semester, which he did.
He brought these student responses to the summer PD, and we incorporated the F&M-related items into the FMLP work described above.

We then focused on how an LP could be used as an instructional pathway. This began with a brief discussion of what it would look like to organize the flow of instruction along the increasing levels of the LP (i.e., LP as an instructional pathway) rather than following the instructional pathway laid out in a textbook or a logical progression of science content. After the presentation, we compared adjacent levels of the FMLP and identified the incremental shifts in thinking a student would make if his or her thinking were to shift along the FMLP. We called these ‘gains in understanding,’ and I suggested using these ‘gains’ to inform the kinds of activities that would be part of the Fall 2019 version of AJ’s F&M unit.

At two points during the day, I gave AJ 15-20 minutes of independent writing time to reflect on his own understanding and ideas. I referred to these as ‘free writes’ and did not provide a specific prompt for his writing. The first free write came just before breaking for lunch, in the middle of our use of the FMLP to look for student ideas in collected responses; the second was the last activity of the day. We discussed his reflections together immediately following each independent writing time.

Day Two (3.A.2.b.ii). On Day 2, we briefly reviewed LPs on momentum and energy and interpreted student responses to LP-aligned open-ended questions (collected from AJ’s students in Spring 2019) in terms of the commonly held ideas about momentum and energy highlighted in the LPs. Afterward, AJ noted that he wanted to focus on the FMLP and changing his F&M unit.

We then focused on AJ’s understanding of the NGSS. I presented on, and we discussed, four major elements of the Framework (National Research Council, 2012) and NGSS: disciplinary core ideas, cross-cutting concepts, science and engineering practices (SEPs), and phenomena-based instruction. At the close of this discussion, AJ indicated that he wanted to focus on integrating SEPs with content and using a phenomenon-based approach.

The rest of the second day was devoted to responsive instruction, drawing on formative assessment, responsive teaching and anticipating student ideas and responses to frame the discussion. I did not distinguish between the three forms of responsive instruction; rather, we discussed the various aspects, drawing on a set of readings on the topic. For example, I introduced the idea of eliciting (a formative assessment practice) with the framing of ‘foregrounding the substance of student thinking’ (a theme from responsive teaching). While I had several sections of text from articles or books for AJ to read and planned to discuss these with him, AJ ended up asking a lot of questions during the presentation. Therefore, I decided to forgo the reading activities and extend the presentation time, allowing AJ’s questions to steer much of the discussion. As on Day 1, just before lunch and as the last activity of the day, I gave AJ 15-20 minutes of free write time to reflect on his own understanding and ideas, and we discussed these ideas together immediately following each independent writing time.

Days Three and Four (3.A.2.b.iii). During the last two days of PD (Days 3 & 4), AJ and I discussed preliminary ideas for the F&M unit of instruction.
This included:
• Discussing and searching for phenomena,
• Considering which SEPs made sense to integrate into the unit and how to scaffold these for students,
• Considering the ‘gains in understanding’ (i.e., incremental shifts along the FMLP) identified in the previous PD days when considering what lesson plans to create,
• Anticipating the ideas his students might hold about F&M and how to elicit them, and
• Creating outlines of some of the lesson plans for the unit.

AJ then used some additional time before the start of the Fall 2019 semester to continue his planning for the F&M unit. We discussed this post-PD planning during a conversation held before AJ began teaching in Fall 2019.

Planning Meetings (3.A.2.c). To provide on-going support as AJ began to engage in responsive planning, he and I had regular planning meetings as he planned and implemented his F&M unit. During both the Fall and Spring semesters, I met with AJ once or twice a week via an online videoconferencing platform (i.e., Zoom). During these meetings, AJ regularly shared:
• Student ideas he noticed, both during instruction and in student work,
• How students’ ideas were utilized in his planning decisions,
• The SEPs and the phenomenon he was incorporating into the unit,
• Accounts of what happened during the most recent days of instruction,
• Plans for upcoming instructional days,
• Perceived successes and challenges as he planned and enacted instruction, and
• Questions related to the topics above (e.g., questions about the student ideas he noticed).

To support AJ’s responsive planning during these meetings, I regularly:
• Clarified concepts we had previously discussed during the summer PD,
• Offered suggestions for how to structure classroom activities to align with responsive instruction (especially when AJ asked for input),
• Pressed AJ to consider alternative perspectives (e.g., pressed AJ to consider partial understandings within student ideas as opposed to only seeing ideas as wrong), and
• Shared my experiences learning to be responsive in my own teaching practice.

Our planning meetings were audio- and video-recorded and used as a primary data source for the study (For details, see 3.C.1.a).

Researcher’s Roles and Positionality (3.B)

I held (at least) three roles in this study: 1) AJ’s good friend and former colleague, 2) professional development provider, and 3) researcher. Within each sub-section below, I describe one of these roles, how it (may have) influenced my study, and the steps I took to mitigate potentially problematic influences.

Friend and Former Colleague of Teacher Participant (3.B.1)

At the start of the study, I had known AJ for roughly 12 years; we considered each other friends (and still do). When AJ was teaching chemistry and physical science, I taught biology and physics in the same high school. We regularly relied on each other to clarify our understandings of science content and to develop classroom routines and strategies. We both assisted with high school musicals and served at various levels in our local teachers’ union. Our science department also met socially out of school fairly regularly.

After entering graduate school, I often shared my experiences and learning with my former colleagues, including AJ. Part of my sharing included my experiences observing and working with preservice secondary science teachers during their science methods courses.
These courses included the development and use of phenomenon-based instructional units, which allowed preservice teachers to practice responsive instruction. I believe sharing my experiences within the methods courses contributed to AJ’s interest in participating with me in this study. This sharing included some of the phenomena preservice teachers chose for their instructional units, some of the ideas that were elicited during the units, and activities they used to address elicited ideas. Essentially, AJ got to hear about the phenomenon-based units that preservice teachers were trying out. I believe this intrigued AJ and, therefore, he was ready and willing to try out a phenomenon-based unit of instruction for my study.

During the academic year of the study, I had the opportunity to co-teach two semesters of secondary science methods courses. As I developed and used my own phenomenon-based unit within the methods course, I relied on AJ’s expertise about the sport of curling, as the phenomenon I used was set in the context of the sport. In tandem with our work focused on AJ’s planning, I occasionally asked AJ questions about curling and shared my own experiences as an instructor of a phenomenon-based science unit, including successes and challenges. Therefore, I was positioned as a learner with support from AJ at the same time AJ was positioned as a learner with support from me. While AJ and I already had a great deal of trust and mutual respect as friends and colleagues, the fact that we were both positioned as learners and were helping each other supported the openness and honesty AJ brought to our conversations about his own teaching.

While open and honest conversations allowed for a rich data set, my friendship with AJ also presented challenges to the research process. For example, because AJ and I knew each other well, we often shared our thoughts using shared references or short-hand language, making it challenging for a third party (i.e., someone reading a transcript of the conversation) to fully understand everything being discussed. To help mitigate this challenge during data collection, I regularly asked AJ to expand on his thinking during discussions. I would often say something like, “I know exactly what you’re saying, but can you expand on your thinking so I have it in your words?” However, naturally, I missed opportunities to have AJ expand on his thinking in-the-moment. Therefore, I also incorporated member checking (Lincoln & Guba, 1985) with AJ to ensure that he shared my interpretations of the data.

PD Provider (3.B.2)

As described above, I provided PD during the summer to launch AJ’s intentional engagement in responsive planning and provided on-going support during planning meetings. In both forms of PD, I often gave AJ suggestions or recommendations about strategies he might try and regularly pushed him to consider his students’ ideas and perspectives as he planned. AJ always made the final planning decisions for his classroom and carried them out during instruction in whatever way he saw fit. However, I took a very active role in the planning process during both the PD and planning meetings. To help account for my PD provider role in the study, I considered all my contributions as ‘PD support’ when coding the transcripts of my conversations with AJ. This allowed me to reflect on whether and how the PD support I offered was taken up and its role in shaping AJ’s engagement in responsive planning during my analysis (See 3.D.4).
Researcher (3.B.3)

Prior to my graduate school experience, I taught high school science for nine years. During graduate school, I have learned more about the intent of the NGSS reform, the benefits of being responsive to student ideas, and the ways LPs may support teachers. This has caused me to reflect on my own science teaching and to make sense of some of the challenges I faced when working with students and still hear about in the conversations I have with former colleagues. Therefore, I approach this work with a strong commitment to supporting teachers as they work through challenges to enact responsive instruction. Consequently, my research involves understanding how teachers take up responsiveness and the tools that support it. My interest in teacher responsiveness and LPs certainly influenced the way I provided PD and may have biased my interpretation of the data in this study. To help mitigate potential biases, I had a colleague assist in the coding process and shared my claims and evidence with a group of critical friends (Discussed throughout 3.D).

Data Generation (3.C)

In this section, I describe my primary and secondary data sources and how they were generated. In general, the goal of this data was to capture AJ’s engagement in responsive planning practices.

Primary Sources (3.C.1)

The two primary data sources for this study were: 1) video recordings of planning meetings during both the Fall and Spring F&M units and 2) two post-unit interviews (one after the Fall F&M unit and one after the Spring F&M unit).

Planning Meetings (3.C.1.a). As previously mentioned (3.A.2.c), AJ and I held regular planning meetings during both the Fall and Spring F&M units. These planning meetings provided a place for AJ to share his thinking (e.g., plans, reflections, questions) and receive support as he intentionally engaged in responsive planning. These data, therefore, captured some of AJ’s engagement in responsive planning practices, the details of his lesson plans, and whether/how AJ utilized the supports I provided (FMLP, my suggestions during PD). For the Fall 2019 version of the F&M unit, AJ and I had six planning meetings. One planning meeting occurred before the start of the Fall 2019 version of the unit; the other five occurred throughout the 6 weeks of implementation. The Fall 2019 planning meetings lasted anywhere from 45 minutes to just over 2 hours; the average length was 81.5 minutes. For the Spring 2020 version of the F&M unit, AJ and I had seven planning meetings. We intended to meet before the start of the Spring 2020 version of the F&M unit; however, we had to cancel the pre-unit meeting due to scheduling conflicts. Therefore, all planning meetings associated with the Spring 2020 version of the F&M unit occurred after the unit began. Spring 2020 planning meetings lasted between 1 hour and 2 hours, 20 minutes, with an average of 95 minutes. The planning meetings were held virtually using Zoom, audio- and video-recorded, and later transcribed. I refer to these meetings throughout the rest of the study using the following shorthand: PM-[semester abbreviation, number]. For example, PM-F1 refers to the first planning meeting of the Fall semester. Table 2 provides a list of planning meeting names, the dates these meetings took place and the length of each meeting.

Post-Unit Reflective Conversations (3.C.1.b).
I conducted two post-unit reflective conversations (RC), one after AJ’s Fall 2019 version of the F&M unit (7 hrs, 28 mins; completed in three sessions) and one after AJ’s implementation of the Spring 2020 version (4 hrs; completed in two sessions). I refer to these as RC-1 and RC-2, respectively. These conversations were held virtually using Zoom, audio- and video-recorded, and later transcribed. The protocol for the RCs can be found in Appendix C. AJ and I spent a large portion of each RC walking through the F&M unit AJ had just finished teaching. AJ also answered questions about what guided his instruction and his understandings of the NGSS reform, responsive instruction, and LPs, with additional questions asking him to compare his current understandings to his understandings at the start of the study. In the RCs, AJ was also asked to share successes and challenges as he planned and enacted his Fall and Spring F&M units.

Table 2
Planning Meeting Data

Name     Date          Length
PM-F1    2019-07-26    2 hours, 0 minutes
PM-F2    2019-09-06    0 hours, 45 minutes
PM-F3    2019-09-12    0 hours, 45 minutes
PM-F4    2019-09-22    0 hours, 59 minutes
PM-F5    2019-09-27    2 hours, 9 minutes
PM-F6    2019-10-07    1 hour, 28 minutes
PM-S1    2020-02-02    0 hours, 58 minutes
PM-S2    2020-02-04    1 hour, 9 minutes
PM-S3    2020-02-09    1 hour, 48 minutes
PM-S4    2020-02-11    0 hours, 58 minutes
PM-S5    2020-02-17    2 hours, 8 minutes
PM-S6    2020-02-24    1 hour, 45 minutes
PM-S7    2020-03-05    2 hours, 20 minutes

Secondary Data Sources (3.C.2)

There are four secondary data sources for this study: a video recording of a pre-PD interview, video recordings of the summer PD, artifacts from AJ’s instruction, and two member-check sessions (Lincoln & Guba, 1985). These secondary data sources were used to support my interpretations and verify information from the primary data set. Each secondary data source and its collection is described below.

I conducted a one-hour, in-person, semi-structured interview with AJ before PD began. There were two goals for the pre-PD interview. The first was to get a sense of AJ’s current instruction and planning process. AJ talked through the instructional plans he had used for the unit on F&M that he would be changing during the study and answered questions about what guides his thinking as he plans instruction and how he prioritizes those considerations. The second goal was to get a sense of AJ’s understandings of the NGSS reform, responsive instruction, and learning progressions. Questions phrased as ‘What do you know about…?’ were used to prompt a discussion of each of these topics. A protocol similar to that of the RCs was used for the pre-PD interview.

I provided four days of PD during the summer prior to the study (See 3.A.2.b). All four days were video recorded.

Most of AJ’s instructional materials were collected, including both teacher materials (e.g., PowerPoint presentations) and student materials (e.g., activity handouts). With permission, some student work was also collected, such as student-generated models of the phenomenon and student responses to writing prompts.

I held two member-check sessions with AJ, one during the summer of 2021 (one year after the study) and one during the summer of 2022 (two years after the study), each lasting approximately one hour. Member checks offer a way to explore the validity of a study’s findings from the perspective of the participant (Lincoln & Guba, 1985). Therefore, I used member checks to consider AJ’s perspective on my interpretations of the data. I discuss this data as part of my analysis below.
Data Analysis (3.D)

I began my analysis by coding transcripts of my primary data sources using two sets of a priori codes based on my conceptual framework of responsive planning (See 2.F). These were codes for 1) the four responsive planning practices (anticipating, interpreting, planning to elicit and planning to respond) and 2) the level of responsiveness of planned instruction along the three dimensions of responsiveness (i.e., discourse, attention, and student role). First, I applied my responsive planning practice codes to ‘idea units’: sections of text, most commonly several talk-turns between speakers, during which AJ and I discussed or engaged in one of the responsive planning practices. This included sub-codes for the practice of interpreting. Next, I coded for the responsiveness of planned instruction within each idea unit for all three dimensions of responsiveness, using the levels low, medium, and high. If there was no evidence for one of the dimensions (e.g., no discussion of planned discourse structures), then I coded this as uncertain. In a final round of coding, I coded each idea unit by the semester and lesson plan it referenced (e.g., first lesson in the Spring semester), since each planning meeting and post-unit interview included references to multiple lesson plans and I could not number the lesson plans until all had been identified.

A fellow doctoral student reviewed my application of the codes in two ways. First, my colleague used my code descriptions (See 3.D.1 & 3.D.2 below) to apply the codes to two planning meeting transcripts (due to her limited time) on her own. These two transcripts were selected from the primary sources because they were the two transcripts with the most coding when I applied codes myself. We then discussed our coding together, resolving any differences. Since there were very few differences when coding separately, my colleague then reviewed my coding for four additional planning meeting transcripts. We then met and resolved any issues in coding she noted.

Following the review of my coding, I sorted the data by lesson plan and used my codes to develop lesson plan memos: analytic memos describing the responsive planning practices AJ engaged in, the responsiveness of each lesson plan by dimension, and whether/how supports were used as AJ planned each lesson. Finally, I looked for patterns across these lesson plan memos to answer my research questions. In the sub-sections below, I provide further details of my coding process for each of my three sets of codes: 1) responsive planning practices, 2) responsiveness of planned instruction, and 3) lesson plan reference. Next, I further describe my lesson plan memos and the process I used to create them. Finally, I explain how I looked for and used patterns in my lesson plan memos to answer each of my research questions.

Coding for Responsive Planning Practices (3.D.1)

As mentioned above, I began my analysis by coding transcripts of my primary data sources using a priori codes for my four responsive planning practices: anticipating, interpreting, planning to elicit, and planning to respond. See Table 3 for all codes used in this round of coding. These codes were applied to idea units during which AJ and I discussed or engaged in responsive planning practices. I double-coded some idea units due to the integrated nature of the planning practices. For example, in one instance, AJ anticipated that students would hold an idea from the FMLP, but that his plans to elicit may not uncover the idea well.
Therefore, he changed the language of his eliciting activity (planning to elicit) so that students holding the idea might provide responses indicating so. In this case, the whole discussion was captured as one ‘idea unit’ and double-coded for the two types of responsive planning practices. Examples of all responsive planning practice codes can be found in Table 3.

I also sub-coded the practice of interpreting whenever AJ compared a student idea (either anticipated or elicited) to a canonical science concept. These a priori sub-codes were: 1) present – applied when AJ considered a science concept to be present in the student idea or thinking, 2) missing – applied when AJ considered a science concept to be missing from the student idea or thinking, and 3) wrong – applied when AJ considered a student idea or thinking as incorrect or as a misconception. I used an additional sub-sub-code of partial when AJ interpreted a student idea or thinking as being partially present, partially missing, or partially wrong.

A sub-code of ‘LP’ was applied to an idea unit whenever AJ or I used or referenced LPs generally or the FMLP specifically, including ideas and levels within the FMLP. This included times when AJ or I referenced an idea within the FMLP without explicitly mentioning the FMLP.

Table 3
A priori Codes and Subcodes for Responsive Planning Practices

(1) Anticipating
Description: Predicting what students will say/write if asked a particular question, which may be based on previous interactions with students.
Example of ‘LP’ sub-code: Anticipating ideas from the LP or using the ideas from the LP to help anticipate student responses.

(2) Interpreting (four sub-codes, used when applicable)
Description: Analyzing either anticipated or elicited student responses to understand students’ ideas. Examples of elicited responses include students’ written work and verbal contributions during prior class sessions.
Example of ‘LP’ sub-code: Using the LP or ideas from the LP to guide interpretations.
– Present: Evaluating a student idea or thinking as similar to or the same as a science idea or science way of thinking.
– Missing: Evaluating a student idea or thinking as missing a science idea.
– Wrong: Evaluating a student idea as wrong or a misconception.
– Partial (sub-sub-code): Evaluating a student idea or thinking as being partially present, partially missing or partially wrong.

(3) Planning to elicit
Description: Developing an activity with the goal of eliciting student ideas about a particular topic.
Example of ‘LP’ sub-code: Using the LP to ensure that a broad range of student ideas is elicited in the planned activity or developing questions to elicit specific ideas from the LP.

(4) Planning to respond
Description: Developing an activity based on an idea or multiple ideas that the teacher either anticipated (not yet elicited) or elicited from students during a previous class session.
Example of ‘LP’ sub-code: Developing a lesson aimed at building on student ideas (either elicited or anticipated) from one level of the LP to the next.

Coding for Dimensions of Responsiveness (3.D.2)

Once idea units were identified and coded with the responsive planning practices above, I coded the idea units using a priori codes for the dimensions of responsiveness identified in my
conceptual framework: 1) the discourse structures AJ planned to use during instruction, 2) the attention AJ planned to give to students’ ideas during instruction, and 3) the role AJ planned to give students in constructing knowledge during instruction. I coded idea units for applicable dimensions of responsiveness, using both the dimension and a level of responsiveness (low, medium, or high). Descriptions and examples of each level of responsiveness for each dimension can be found in Table 4.

Table 4
Dimensions of Responsiveness Codes and Examples at Each Level of Responsiveness

Discourse
– Low: Planned instruction only includes teacher-dominated discourse. Ex: Planned lecture for most of the instructional time, with IRE patterns of student-teacher talk.
– Medium: Planned instruction provides some opportunities for students to engage in student-to-student talk or non-IRE student-teacher talk. Ex: Planned instruction includes some time dedicated to having students share out their work to the whole class, with the remaining time being teacher-led.
– High: Planned discourse is mostly student-to-student or student-led talk. Ex: Students assigned to work in small groups; teacher plans to facilitate a student-led whole group discussion.

Attention
– Low: Teacher plans to give no attention to students’ ideas/thinking during instruction. Ex: Lab activity in which students confirm scientific ideas.
– Medium: Teacher plans to give some attention to student ideas/thinking during instruction. Ex: Teacher brings up student ideas/thinking he’s seen/heard in a previous lesson throughout instruction.
– High: Teacher plans to center the instruction around student ideas/thinking. Ex: Lab activity tailored to investigate previously elicited student ideas/thinking.

Student Role
– Low: Teacher plans to be the source of knowledge; students’ role is to receive knowledge during the lesson. Ex: Teacher provides students with data and explains the conclusion they should draw from it.
– Medium: Teacher plans for students to have some role in the construction of knowledge during the lesson. Ex: Students are allowed to design a portion of an investigation.
– High: Teacher plans for students to be the main source of knowledge construction during the lesson. Ex: Students design an investigation; students analyze data and determine conclusions.

Coding for Lesson Plan Sequence and Content (3.D.3)

Finally, I labeled each coded idea unit by the lesson plan(s) discussed or referenced. To do this, I began with the Fall semester post-unit interview data. From this data, I constructed a brief description of each lesson plan AJ implemented during the Fall semester, which included the various science topics and/or activities, along with the dates of implementation. I then compared these Fall descriptions of the lesson plans to the descriptions of lesson plans provided in the Spring post-unit interview. Generally, AJ’s lesson plans maintained a similar overarching science topic or activity when comparing Fall and Spring. However, AJ implemented some of his lesson plans in a different order in the Spring than in the Fall. Because the sequence of the lesson plans was important for seeing a timeline of changes in how AJ was planning, I coded each lesson plan by the timing of its implementation (i.e., the first lesson plan implemented each semester was labeled as ‘L1’) for each semester. I then used the above lesson plan descriptions and sequencing to apply lesson plan codes to all of the idea units within the primary data sources.

Because AJ created and implemented different versions of his lesson plans, I additionally applied a ‘version’ sub-code to each lesson plan number code. I used three ‘version’ sub-codes. A sub-code of initial was applied to the lesson plans that AJ developed between the summer PD and the Fall semester, as these versions of the lesson plans represented AJ’s initial way of engaging in responsive planning. Not all lesson plans had an initial version, as AJ did not make plans for the entire unit over the summer. I applied the sub-codes Fall and Spring to lesson plan descriptions when implemented in the Fall and Spring, respectively. For example, in both semesters, AJ provided whole-class instruction on Newton’s Third Law (i.e., a lesson plan description); however, this lesson plan topic was the sixth topic of the Fall semester and the seventh topic of the Spring semester. Therefore, any idea unit in which AJ and I discussed or referred to the Newton’s Third Law lesson plan was coded as ‘L6-Fall’ when discussing the Fall version and ‘L7-Spring’ when discussing the Spring version. Idea units were double coded with a lesson plan number if multiple lesson plans were referenced in one idea unit. Table 5 provides a sequential list of lesson plan descriptions and any applicable versions (i.e., initial, Fall, Spring).

Table 5
Sequential List of Lesson Plan Descriptions and Applicable Versions

Lesson Plan Topic/Activity Description                            Initial    Fall    Spring
Segmentation of the phenomenon video                              L1         L1      L1
Initial phenomenon modeling activity                              L2         L2      L2
‘Slanted v. Flat Board’ lab activity                              N/A        L3
Overview of forces, topics included Newton’s First Law (N1L),     L4         L4      L4
  force types and free-body diagrams (FBDs)
Revising initial phenomenon models                                L5         L5      L5
Lab activity that was only considered as part of AJ’s initial     Lab        N/A     N/A
  set of lesson plans; was to follow L5-initial
Newton’s Third Law                                                N/A        L6      L7
Newton’s Second Law lab activity                                  N/A        L7      L8
Newton’s Second Law review                                        N/A        L8      L9
Friction lab activity                                             N/A        L9      L6
Gravity lab activity                                              N/A        L10     L10
Gravity review                                                    N/A        L11     L11

This process was repeated for any instructional artifacts I collected. For example, student handouts that AJ used to implement the Fall version of the Newton’s Third Law lesson plan mentioned above were coded as L6-Fall. This allowed me to review all relevant data sources for a given lesson plan, which informed my next analytic step of developing lesson plan memos.

Developing Lesson Plan Memos (3.D.4)

My next analytical step was to construct lesson plan memos: analytic memos for each of AJ’s lesson plans. Since I had multiple sources of data for each lesson plan, and some data sources contained information relevant to multiple lesson plans, constructing memos for each lesson was a useful way of organizing the data, synthesizing data across sources and looking for trends across lesson plans. The goal of the lesson plan memos was to capture AJ’s engagement in responsive planning, the responsiveness of his planned instruction, and whether/how he used support (FMLP or PD) during the planning process. To create my lesson plan memos, I pulled all relevant data for a given lesson plan and used evidence from this data to complete a memo template. The template included space for the following lesson plan details:

1) Plan Description – A brief description of the lesson plan, including (when possible) how the plan developed across planning meetings.
2) Summary of Codes – A list of codes (and any sub-codes) applied to all idea units (across data sources) for the lesson plan, along with any relevant evidence (i.e., pulled quotes from transcripts or links to instructional artifacts).
3) Support – Any evidence of support provided and/or used when planning the lesson. Examples include support I provided during planning meetings, the FMLP, and anything else AJ mentioned as supportive to his planning.
4) Label of Eliciting or Responding – A label of whether the plan was for a primarily eliciting or a primarily responding lesson. Eliciting lesson plans were those with a main focus on drawing out student thinking; often this corresponded to more planning to elicit codes than planning to respond codes. Responding lessons were those planned with the intention of supporting changes in student thinking; often this corresponded to more planning to respond codes than planning to elicit codes.
An example lesson plan memo is included in Appendix D.
My lesson plan memos acted as a summary of the data for any given lesson plan. Therefore, I used these lesson plan memos to look for potential patterns in the data and then followed up on these potential patterns by reviewing the associated coded data directly. I shared approximately half of my lesson plan memos and my preliminary claims about AJ's shifts in practice with fellow doctoral students to evaluate the trustworthiness of my findings. I incorporated their feedback into my analysis process. Below, I discuss how this process was used to answer my research questions.

Answering My Research Questions (3.D.5)

To answer my first research question, How does a teacher with a transmission-style of science instruction begin to intentionally engage, with PD support, in responsive planning?, I looked for patterns across lesson plan memos for all of the lesson plans AJ developed between the summer PD and the start of the Fall semester (coded as initial). These were L1-initial, L2-initial, L4-initial, L5-initial, and Lab-initial. I began by looking for similarities and differences in the codes for dimensions of responsiveness across the lesson plan memos. Starting with lesson plans that shared a level of responsiveness for a given dimension (e.g., those with low attention), I then looked at how AJ was engaging in the responsive planning practices within the associated data. This allowed me to look for patterns in the way AJ engaged in the practices that resulted in the same levels of responsiveness and compare them to AJ's engagement in the practices that resulted in a different level of responsiveness. Additionally, I looked for whether/how AJ took up any of the PD support I provided. By doing so, I was able to develop a description of how AJ initially began, with PD support, to engage in responsive planning and the way this informed the responsiveness of his planned instruction.
To answer my second research question, How does the teacher's responsive planning change over time?, I started with a similar process as the one for my first research question. For each set of AJ's lesson plans with a similar level of responsiveness for a given dimension (e.g., those with low attention), I developed descriptions for how AJ was engaging in the responsive planning practices. I operationalize 'change' as a difference in level of a dimension of responsiveness when comparing similar lesson plans (e.g., comparing eliciting lessons within the same semester or comparing the Fall and Spring versions of the same lesson plan).
For example, I considered AJ's planning to have 'changed' when his L1-initial lesson was coded as low in attention and his L1-Fall was coded as high in attention. Therefore, I then compared AJ's engagement in responsive planning practices across lesson plans to develop claims about how AJ's engagement shifted over time. Additionally, I looked for whether/how AJ took up any of the PD support I provided and whether/how this support potentially contributed to the shifts in practice I observed.
To answer my third research question, How does the teacher use the LP to support his responsive planning, both in the beginning and over time?, I began by focusing on lesson plan memos that noted AJ using the FMLP in any way during planning (i.e., those with the LP sub-code). By looking across the data from these lesson plans, I was able to develop claims for whether/how AJ used the FMLP when engaging in the responsive planning practices described in findings from the first two research questions.

CHAPTER FOUR: FINDINGS

My findings are organized by my three research questions, which are listed below as a helpful reference:
1. How does a teacher with a transmission-style of science instruction begin to intentionally engage, with PD support, in responsive planning?
2. How does the teacher's responsive planning change over time?
3. How does the teacher use the LP to support his responsive planning, both in the beginning and over time?

Findings for Research Question One (4.A)

After the summer PD and prior to the start of the Fall semester, AJ planned five lessons. We had briefly discussed the first four (the start of the unit) during the summer PD. After the summer PD, AJ created materials he planned to use when implementing those four lessons. These four initial lesson plans became L1-initial, L2-initial, L4-initial, and L5-initial. In addition, AJ created a preliminary plan for a lab-based lesson to be implemented at some point after the initial set of four. He had not yet developed any materials for that lesson. He shared all five lesson plans during our first planning meeting, which took place prior to the start of the Fall semester. I refer to the lessons with created materials as 'initial' (e.g., L1-initial). I refer to AJ's preliminary lab-based lesson idea as Lab-initial, as this idea was never developed into a lesson used during the study. I analyzed these five lesson plans to answer my first research question: How does a teacher with a transmission-style of science instruction begin to intentionally engage, with PD support, in responsive planning?

Focus of Initial Way of Planning: Correct Knowledge (4.A.1)

I found that AJ's initial way of planning the F&M unit focused heavily on correct knowledge. When planning, he evaluated the correctness of the student ideas and responses he anticipated. These interpretations revealed AJ's use of the 'gets it/doesn't get it' evaluative stance and informed the way he planned to elicit and respond. AJ's interpretations of anticipated student ideas informed his plans to elicit in two ways, based on whether he expected students to 'get it' or 'not get it.' When he anticipated students would 'get it' (interpreted as present), AJ used the correct knowledge he anticipated students would hold to create questions that would elicit correct responses. When he anticipated students would 'not get it' (interpreted as missing or wrong), AJ made plans to provide students with correct knowledge prior to asking them to complete an eliciting task.
When planning to respond, AJ planned to provide a lecture on correct knowledge or have students confirm the knowledge with a lab activity. These plans seemed informed by his anticipation of missing or wrong ideas, though he did plan to review student work (i.e., interpret elicited ideas) prior to settling on the exact correct knowledge he would provide. All of the initial lesson plans were low in the attention he planned to give student ideas during instruction and in the roles students were given in constructing knowledge. His planned discourse structures, however, generally alternated between high and low responsiveness depending on whether the lesson was planned as a small group activity (high discourse responsiveness) or whole group activity (low discourse responsiveness). See Table 6 for an overview of the responsiveness of these initial lessons.

Table 6
Overview of Responsiveness of AJ's Initial Lesson Plans

Descriptors | L1-initial | L2-initial | L4-initial | L5-initial | Lab-initial*
Main lesson grouping format | Small group | Small group | Whole group | Small group | Small group
Responsiveness: Discourse | High | High | Low | High | Uncertain
Responsiveness: Attention | Low | Low | Low | Low | Low
Responsiveness: Student Role | Low | Low | Low | Low | Low
Note. *Lab-initial did not align to any Fall lesson.

In the sections below, I support these broad descriptions, illustrating how AJ's planning practices (anticipating, interpreting, planning to elicit, and planning to respond) focused on correct knowledge and worked together to create instructional plans with low responsiveness in attention and student role. I begin by illustrating how this focus manifested in his plans to elicit when anticipating missing or wrong ideas (i.e., students will 'not get it'). Next, I illustrate the correct knowledge focus of his initial plans to elicit when anticipating present ideas (i.e., students will 'get it'). Lastly, I illustrate a focus on correct knowledge when planning to respond, using an example from L4-initial with additional support from Lab-initial.

Planning to Elicit When Anticipating Students Will 'Not Get It' (4.A.1.a). AJ planned for his first two lessons to elicit ideas from students about the anchoring phenomenon, a video of a Rube Goldberg machine. AJ used a 'won't get it' lens when anticipating and interpreting several student responses for these eliciting activities. An example of this kind of interpreting can be found in his L1-initial lesson plan, an eliciting lesson plan with the general goal of students segmenting the video of the Rube Goldberg phenomenon based on the objects' motion within the phenomenon. AJ and I did a similar activity during the summer PD to help us decide how best to connect various science ideas to the phenomenon. AJ decided to have students do a similar segmenting activity during their first eliciting lesson about the phenomenon and planned the details of his L1-initial plan between the summer PD and our first planning meeting (PM-F1). During PM-F1, AJ described his elicitation plan for L1-initial, which was informed by his anticipation of student responses and his interpretation of these anticipated responses.
AJ shared the following anticipation and interpretation:

'Cause I think when we did that, you and I looked at that [phenomenon video] with a pretty solid understanding of what motion is and what acceleration is and...that kind of thing, and…if they [students] didn't have that sort of focus a little bit, they might go in 9000 different directions on it… Without any sort of basis of 'what is physics', what are they're looking for, will they get there, I guess, was the question. (PM-F1)

AJ anticipated that students would not have "any sort of basis of 'what is physics,'" a missing interpretation. This anticipation and interpretation informed his plans to provide students with correct knowledge prior to asking them to segment the phenomenon video (elicitation task). During our discussion about his plan, I asked why he felt providing this correct knowledge prior to the eliciting task was better than moving straight to the elicitation task, and he replied:

If we do the vocab instruction [on speed, velocity and acceleration] first, I think it can focus that video discussion a little bit and let them be more successful just talking to each other, and let the students be more successful talking to each other and trying to categorize the video and pick out specific similarities and differences. (PM-F1)

Based on this description, AJ planned to have students talk to each other during the eliciting task (high-discourse), which adds a dimension of responsiveness to the plan. However, AJ seemed to want students to use the definitions he planned to provide as foundational knowledge to steer their conversations and thinking, limiting the attention on students' own ideas about motion (low-attention) and their ability to use their ideas to segment the phenomenon (low-student role). AJ may have been concerned about students feeling frustrated or confused when completing the assignment and wanted to mitigate those negative feelings. His concern seemed born from his anticipation and interpretation that students were missing correct knowledge and from his view that the way to support students would be to provide this correct knowledge to them prior to the eliciting task.
AJ's planning of L2-initial offers another example of how he planned when anticipating that students would 'not get it.' AJ's plan for L2-initial was informed by an interpretation that students would hold a wrong idea about the scientific practice of modeling. AJ shared:

I have a feeling that if I say the word 'model' to my students, they're gonna go glassy-eyed on me. So I think I need to give them an idea of what I'm talking about. I'm not talking about, take... I think when they hear the word model, they think, I need to go get things that are in my kitchen or my junk drawer and create something out of it. So I came up with just a little, almost little PowerPoint intro that just simplifies what a model is, and I think that I might share that with them before doing the cog model activity. (PM-F1)

AJ anticipated that students would think of scientific modeling as a process of creating a representation (e.g., creating a model airplane) instead of explaining how and why a scientific phenomenon occurred. He also indicated an interpretation of students' thinking as wrong with the phrase "I'm not talking about…" when referring to this anticipated idea. This interpretation of the anticipated idea shaped AJ's plans to elicit.
As in L1-initial, AJ planned to provide correct knowledge about modeling during a "little PowerPoint intro" and have students use this knowledge, along with the correct knowledge on motion provided in L1-initial, to create an initial model about a segment of the phenomenon. As in L1-initial, the plan was for students to work in small groups (i.e., high-discourse); however, the students' own ideas about motion and how to construct a model were eclipsed by the correct knowledge AJ was planning to provide prior to the eliciting activity. Thus, the planned instruction was low in the attention and student role dimensions of responsiveness.

Planning to Elicit When Anticipating Students 'Will Get It' (4.A.1.b). While AJ more commonly anticipated that students would be missing or hold wrong ideas, he also anticipated several correct (present) ideas and responses. When anticipating students would hold correct knowledge, AJ sometimes planned to elicit this correct knowledge. An example of this comes from AJ's L4-initial plan. L4-initial was planned as a responding lesson that would follow L2-initial and cover a variety of force topics, including the definition of force, types of forces, and Newton's first law. During the section on types of forces, AJ planned to roll a ball across the table and ask students why it eventually stopped. He anticipated that students would know friction was the reason it stopped. He shared this anticipation as we discussed this part of the plan: "I mean, I think it [friction] will probably come up 'cause it's another word that they're gonna associate with the word force" (PM-F1). Here, AJ anticipated some correct (present) knowledge regarding the concept of friction. This appeared to inform his plans to embed an IRE-style question into L4-initial that would elicit a correct understanding of friction from students.
AJ's plans for L4-initial and L4-Fall were the same. Therefore, additional evidence that AJ's anticipation of correct knowledge of friction informed his plans to elicit this correct knowledge comes from AJ's reflection on L4-Fall (same as L4-initial) during the planning meeting immediately following his implementation of the lesson:

I literally took a golf ball and rolled it across the floor and asked them why it stopped. And I would say the vast majority of my answers that I got were, "Gravity stopped it." That's it, like, that's it. That was interesting to me that they... And it took until friction was on the PowerPoint as a type of force, until anyone mentioned the word, it never came up until that point, and I was shocked, I was shocked at that. (PM-F3)

AJ referenced his anticipation that students would have correct knowledge of friction and would be able to respond correctly to his planned question (elicitation), 'why did the golf ball stop?' In this case, the students did not respond in the way he anticipated, leaving him "shocked." The interpretation of correct anticipated knowledge was connected to his plan to elicit that correct knowledge during instruction. Details of the responsiveness of L4-initial are discussed in the next section.

Planning to Respond (4.A.1.c). AJ also focused on correct knowledge when planning to respond in his initial lesson plans. These plans appeared to be informed by the missing and wrong ideas AJ anticipated. L4-initial was planned as a responding lesson.
Since AJ had not yet elicited any student thinking at this point in the study, he planned a lesson on a variety of force topics because students' ideas about forces "might need some streamlining" (PM-F1). This suggests that AJ generally anticipated that students might be holding wrong ideas about forces, in addition to anticipating that students would hold correct knowledge about friction, as described previously. AJ later referred to his method of responding as "lecturing" (PM-F3), which was supported by the slides he shared during our first planning meeting (PM-F1). Thus, the planned discourse during this whole group activity was low in responsiveness. He planned to have students take notes on the various force topics (low-student role) and did not plan to give any attention during instruction to the ideas he would have elicited from L1-initial and L2-initial (low-attention). While he did not share that he was anticipating any specific wrong or missing ideas, his general anticipation of ideas that "might need some streamlining" appeared to inform his preliminary plans to respond, which took the form of a lecture on correct knowledge of forces.
As another example of AJ's initial way of planning to respond, AJ also shared a preliminary idea he had for a lab activity (Lab-initial) that would follow his plans for the first four lessons (L1-initial, L2-initial, L4-initial & L5-initial). He shared his planned placement of the Lab-initial idea in the sequence of the lessons after we discussed the timing of the first four lesson plans: "And I think that's right, and I think that's [the first four lessons are] reasonable to do in that time frame, and something that I can roll into those sort of demonstration or lab experiences [Lab-initial]" (PM-F1). He then went on to provide a general description of what students might be doing during this preliminary lab activity:

I'm thinking like mini-labs almost like stations for some of these things, like "In this one you're gonna experience this" …and then 10 minutes later, you'll go and do this one, and 10 minutes later you'll go and do that one. (PM-F1)

As we discussed his idea further, it was clear that "these things" were science ideas presented in the L4-initial lecture and that the "mini-labs" would help demonstrate or confirm these ideas. For example, we discussed a potential lab activity (part of Lab-initial) that could be tied to some of the friction-related content he planned for L4-initial: "I could conceive of a lab experience where they see that things that are on wheels [reference to rolling friction], versus things that are sliding across something [reference to sliding friction], react very differently" (PM-F1). In this case, the plan was for students to "see" the difference between rolling and sliding friction that he planned to present to students in the L4-initial lesson, not to 'figure out' the differences between the friction types. AJ's description of planning for students to "see" concepts, coupled with the placement of the Lab-initial idea after a lecture on related content, highlights the focus of Lab-initial on correct knowledge. Because the plan for Lab-initial aimed to confirm correct knowledge through a lab-based activity, it was low in responsiveness in terms of attention and student roles. The planned discourse for the Lab-initial lesson was unclear based on AJ's descriptions and, therefore, was coded as uncertain.
There is no direct evidence that AJ was planning a mini-lab based on any specific anticipated missing or wrong ideas. Rather, his Lab-initial plan seems like an extension of his L4-initial plan to provide correct knowledge about forces.

Summary (4.A.2)

AJ's planning for all of his initial lessons focused on correct knowledge. He anticipated and interpreted things students would 'get' and 'not get.' These anticipations and interpretations appeared to inform his plans to elicit and respond. When planning to elicit, he planned to provide correct knowledge when he anticipated missing or wrong ideas and planned to elicit specific correct knowledge when he anticipated present ideas. When planning to respond, he prepared a collection of correct knowledge that students might be missing or have wrong in their work, with the plan to lecture on that correct knowledge and then confirm it with lab-based experiences. Ultimately, this focus on correct knowledge resulted in low responsiveness, particularly in terms of attention and student roles, for his initial lesson plans. As AJ refined his plans during the study, he focused less on correct knowledge and shifted toward being more responsive. The details of these changes are discussed in the next section.

Findings for Research Question Two (4.B)

As a reminder, my second research question asked how AJ's engagement in responsive planning changed over time. I identified three shifts in AJ's responsive planning practices over the course of the study, each of which made his planning more responsive than his initial way of planning. First, AJ stopped providing correct knowledge to students prior to eliciting their thinking about the phenomenon and began planning to elicit student ideas as if they were hypotheses about the phenomenon. I describe this shift as planning to elicit initial hypotheses. Second, AJ stopped planning to provide and confirm correct knowledge based on missing and/or wrong interpretations of student thinking. Instead, he began interpreting student ideas as partial understandings and connecting elicited student ideas to science ideas within his plans to respond. I describe this as planning to respond with attention to disciplinary connections. Finally, AJ began planning to elicit all types of student thinking, as opposed to just correct knowledge, during whole group instruction, which increased his discourse responsiveness during whole group instruction. I describe this as a shift toward planning discussions.
In the sub-sections below, I support my descriptions of each shift identified above using illustrative evidence from the data. For each shift, I provide evidence of the changes in responsiveness and the support that potentially influenced each shift.

Planning to Elicit Initial Hypotheses (4.B.1)

AJ's first shift was in the way he planned to elicit students' initial ideas about the phenomenon. Table 7 provides an overview of this shift. As described previously (See 4.A.1), AJ's initial way of planning to elicit in a phenomenon-based unit focused on eliciting correct knowledge. In his initial plans to elicit ideas about the phenomenon (L1-initial and L2-initial), AJ planned to provide correct knowledge prior to the eliciting task. Although these plans were high in discourse responsiveness (as students were working in small groups), they were low in responsiveness in the attention and student role dimensions (because AJ planned to provide students with correct knowledge prior to eliciting).
After several planning meetings providing support on eliciting, AJ no longer planned to provide students with correct knowledge to use during the eliciting activities. Instead, he started seeing student ideas as hypotheses and planned to allow students to use their own ideas and vocabulary (high-attention) and ways of representing their ideas (high-student role) to complete the eliciting tasks of segmenting (L1-Fall) and creating an initial model (L2-Fall) of the phenomenon. A potential reason behind this shift was the support provided to AJ during the first planning meeting (PM-F1), specifically targeting his L1-initial and L2-initial plans. In the Spring, AJ decided to begin his unit with a similar set of highly responsive lessons (L1-Spring and L2-Spring) without support.

Table 7
Overview of Shifts in Unit's Initial Eliciting Lesson Plans

Lesson | Responsiveness of initial version | Shift from initial to Fall | Responsiveness of Fall version | Shift from Fall to Spring | Responsiveness of Spring version
L1 | Discourse – high; Attention – low; Role – low | Stopped providing correct knowledge prior to eliciting; planned to elicit student ideas as hypotheses (increased attention & role). Support provided. | Discourse – high; Attention – high; Role – high | No shift. No support provided. | Discourse – high; Attention – high; Role – high
L2 | Discourse – high; Attention – low; Role – low | Same as L1. | Discourse – high; Attention – high; Role – high | No shift. No support provided. | Discourse – high; Attention – high; Role – high

In the sub-sections below, I provide evidence from AJ's L1-Fall and L2-Fall planning to support my claim that AJ stopped planning to provide correct knowledge prior to eliciting (i.e., his initial way of planning to elicit) and began planning to elicit initial hypotheses. For each lesson, I provide evidence of the support provided and the responsiveness of AJ's Fall versions of the lessons. I then describe AJ's equivalent lesson plans for the Spring semester (L1-Spring & L2-Spring) to support my claim that AJ made a full shift in this aspect of his planning practice.

Evidence from L1-Fall (4.B.1.a). As described previously (See 4.A.1.a), AJ shared his anticipation that students would be missing correct knowledge about motion, which informed his decision to start with a brief lecture on this correct knowledge prior to eliciting student thinking (L1-initial plan). In our planning meeting, I reminded AJ that the purpose of eliciting is to uncover the ideas and definitions of motion that students already hold, regardless of whether the ideas are right or wrong. Below is a portion of this exchange between me and AJ:

JC: But I think that the idea here literally day one or day two of class is not for what they [students] do in this initial activity…for them to be correct.
AJ: Right. Yeah, that's true.
JC: And so it's more to see where they're at before you... This is just the elicitation of what they see and notice inherently.
AJ: Yeah, that's true.
JC: They [students] do have definitions in their brain of what motion is.
AJ: That's true, yeah.
JC: So I think if you just go with "motion" and let them describe the kinds of motion that they're talking about, let them use their own words, I think the idea would be to listen for those words…especially in the beginning, and then sort of... And then do the instruction around the vocab connected to the words that they use.
AJ: That makes sense. Yeah, alright, I'll buy that.
(PM-F1)

My support re-purposed AJ's plan, steering it away from trying to elicit correct knowledge and toward eliciting the variety of ideas that students may hold, and suggested introducing vocabulary to build on those ideas. After some additional discussion, AJ summarized his L1-Fall plans, to be implemented on days two and three of the semester. He said:

Days two and three, they [students] are categorizing the video. And basically, I'm thinking a couple iterations of that and having them talk with each other and work through that, as we talked about. I think we have a good solid plan for that. I don't think there's a lot of prep work that I'm gonna do for that. I think a lot of it is, "Hey, here's a video. Take some big pieces of paper and start to chunk it, and describe your chunks, be able to describe your chunks." (PM-F1)

In L1-Fall, he no longer planned to provide students with correct knowledge to use in the eliciting task but instead planned to allow students to use their own ideas and terms, or "whatever they wanna put on it" (high-attention), to decide how to segment the phenomenon (high-student role) as they talked in their small groups (high-discourse). This resulted in a highly responsive L1-Fall plan with the goal of drawing out student ideas about motion, not only correct knowledge about motion as in the L1-initial plan. AJ made the decision to use, essentially, the same lesson again in the Spring semester on his own (without support) and thus did not make any significant changes to L1-Fall when planning for L1-Spring. Therefore, L1-Spring was also high in responsiveness across all three dimensions.

Evidence from L2-Fall (4.B.1.b). During our first planning meeting (PM-F1), AJ shared his screen, revealing slides for a brief lecture on modeling conventions that he planned to provide to students prior to the eliciting task of creating an initial model about the phenomenon. At this time, I suggested that AJ use the students' ideas about 'how to model' elicited from the modeling activity as the starting point for a 'modeling conventions' conversation, implicitly pushing against the perceived need to provide modeling conventions (i.e., correct knowledge) prior to the elicitation. AJ took up this suggestion and made changes to his presentation during the planning meeting. Below is an excerpt of our dialogue:

JC: Agreement about drawing conventions is important…But I think, like, what will arrows mean? What will molecules look like? And how do you show the passing of time? So, but I think, those conversations will come out of what's on the [students'] models.
AJ: So looking at my screen for a second, should I take this [slide on modeling conventions] and move this after the first iteration of a model, so that it can then sort of lead into that, so they get their idea of what a model is out, and then we start talking about the conventions of what are dots, what are arrows, what are... What is all that?
JC: Yeah, 'cause I think they'll just naturally use these things.
AJ: I think they will. Yeah. (PM-F1)

In this exchange, AJ began to re-organize the flow of instruction so that eliciting, rather than providing correct knowledge, came first.
Later in our discussion, I suggested directly addressing the wrong idea he anticipated from students (modeling = representation) and a way of introducing this to students that frames their ideas as hypotheses:

JC: One thing that might help trigger that this is different than a representative model, which is sort of like, "let's create a cell out of Styrofoam balls," or whatever, would be calling it an explanation or an explanatory model.
AJ: Gotcha…
JC: And you could even put pictures of, 'cause they've already taken biology, right?
AJ: Yes.
JC: Biology has just a ton of representative models, and so like, "This is a model of a chloroplast. It's just a picture diagram. But what we want is more of an explanatory model."
AJ: (Modifies language on the screen)
JC: The other word that might trigger for them is like "we are going to treat these as hypotheses."
AJ: I change [the language on the slide] to hypothesize. (PM-F1)

AJ's final plans for the initial modeling lesson (L2-Fall) entailed introducing the eliciting task (creating an initial model) with the terms 'explanatory model' and 'hypothesize,' rather than providing correct knowledge on modeling prior to the modeling activity. Coupled with the changes that resulted in his L1-Fall plan, his L2-Fall plan would allow students to use their own ideas about the phenomenon (high-attention) and how to represent those ideas (high-student role) when constructing their models. AJ made the decision to use, essentially, the same lesson again in the Spring semester on his own (without support). Similar to L1-Spring, he did not make any significant changes to his L2-Fall plans when planning for L2-Spring. Therefore, L2-Spring was also high in responsiveness across all three dimensions.

Planning to Respond with Attention to Student Ideas (4.B.2)

AJ's next shift was in the way he interpreted and planned to respond to the ideas he had elicited from students. AJ's initial plans to respond, L4-initial and Lab-initial, aimed to provide and confirm correct knowledge, respectively. However, with support, AJ began planning to respond with an increase in the attention he planned to give student ideas during instruction. This shift began during the Fall semester and continued into the Spring semester.
AJ's shift in planning to respond occurred in two different ways based on the type of lesson he planned: labs or lectures. For lab-based planned responses, AJ began planning to respond to student ideas as if they were hypotheses to be investigated, rather than using lab activities to confirm correct knowledge (his initial way of planning to respond). For lecture-based planned responses, AJ began attending to the disciplinary connections between students' ideas and canonical science ideas, rather than providing correct knowledge without any links to student thinking. In the sub-sections below, I further describe and support my claims that AJ shifted to planning to respond with attention to student ideas. I begin with AJ's shift in lab-based responding lessons, followed by his shift in lecture-based responding lessons.

Investigating Student Hypotheses: Planning Lab-Based Responding Lessons (4.B.2.a). AJ's initial plan for a lab-based lesson (Lab-initial) would have confirmed correct knowledge provided in an earlier lecture-based lesson (L4-initial). However, with support, AJ did not end up using lab activities for this purpose.
Instead, he began planning lab-based responding lessons to investigate student ideas, which affected both the focus and placement of lab activities. First, AJ began treating students' ideas like hypotheses for the lab investigations, which increased attention on student ideas during instruction. Three of the four lab-based responding lessons in the Fall were at least medium in attention. In the Spring, all lab-based responding lessons were at least medium in attention, with two of the four showing an increase in attention responsiveness as compared to their Fall versions. See Table 8 for an overview of the shifts in responsiveness of AJ's lab-based responding lessons.

Table 8
Overview of Shift in Responsiveness of Lab-Based Responding Lesson Plans

Lesson | Responsiveness of initial version | Shift from initial to Fall | Responsiveness of Fall version | Shift from Fall to Spring | Responsiveness of Spring version
Lab | Discourse – uncertain; Attention – low; Role – low | Labs no longer used to confirm correct knowledge; labs used to investigate student hypotheses (increased attention). Support provided. | N/A | N/A | N/A
L3 | N/A | Same as above. | Discourse – high; Attention – high; Role – high | Continuation of 'initial to Fall' shift (increased attention). No support provided. | Discourse – high; Attention – high; Role – high
L7 | N/A | Same as above. | Discourse – high; Attention – med; Role – low | Same as above. | Discourse – high; Attention – med; Role – med
L9* | N/A | Same as above. | Discourse – high; Attention – low; Role – low | Same as above. | Discourse – high; Attention – med; Role – low
L10 | N/A | Same as above. | Discourse – high; Attention – med; Role – low | Same as above. | Discourse – high; Attention – high; Role – med
Note. *The Spring version of L9-Fall was L6-Spring.

Second, AJ's lab-based lessons were no longer placed after a lecture on the corresponding science topic (See Table 5, in Methods). Rather, AJ planned several lab experiences that explored elicited student ideas prior to providing a lecture-based lesson on the same topic. By placing lab-based lessons prior to lecture-based lessons on the same topics, the lab-based lessons were no longer used to confirm previously provided correct knowledge (as in AJ's initial way of planning).
In the subsections below, I support my claim that AJ's planning of lab-based responding lessons shifted from using confirmation labs to investigating student ideas. I begin with L3-Fall, as this was the first lab-based responding lesson plan that AJ implemented during the study. I describe his planning of this lesson, including the support I provided. Then I provide additional examples of AJ's new approach to planning lab-based responses.

Evidence from L3-Fall (4.B.2.a.i). AJ's planning for L3-Fall, a lab-based responding lesson, was likely shaped by support initially offered for AJ's planning to elicit practice, which suggested that AJ consider student ideas as hypotheses. This support originated during our first planning meeting (PM-F1) and was carried into the next planning meeting (PM-F2), which occurred after the first day of the L2-Fall multi-day eliciting lesson. During the PM-F2 planning meeting, AJ and I continued to use the term 'hypothesis' when referencing student ideas as we spent some time reviewing students' work after their first day of creating initial models (L2-Fall). For example, toward the end of our planning meeting, AJ shared how he planned to talk with students about the ideas we noticed in their models thus far: "Yeah, so here are the predominant hypotheses that I saw" (PM-F2).
A few days after the PM-F2 planning meeting, AJ emailed me with some follow-up questions and ideas, which included his preliminary plans to respond to students' hypotheses about whether the wooden board in the phenomenon video was slanted or not:

It could be an interesting discussion point (pose the Q [to students]: did the board have to be slanted? Could you design an experiment to prove a "yes/[slanted]" hypothesis incorrect?) Ask them [students] to design and perform it. (Email)

Near the end of our email exchange, AJ shared:

Thinking of student ideas as hypotheses is also super helpful for me too. It helps me see where I can put more of the science as process stuff [SEPs] into class instead of treating everything as a didactic lecture. (Email)

Ultimately, this led to AJ adding L3-Fall, a planned lab-based response informed by the student hypotheses elicited during L2-Fall. This plan would allow students to design an investigation (high-student role) to test their ideas about whether the wooden board in the phenomenon video was slanted or not (high-attention). Students would design and carry out these investigations in small groups and then share their results across groups through a jigsaw activity (high-discourse). AJ's plans for L3-Spring, the corresponding lesson from the Spring semester, were similarly high in all three dimensions of responsiveness.

Additional Evidence from Fall Semester (4.B.2.a.ii). During the Fall semester, AJ planned three additional lab-based lessons without support. Two of these lesson plans (L7-Fall and L10-Fall) were medium in attention to elicited student ideas and positioned prior to a lecture on a related topic. Similar to AJ's planning of L3-Fall, these two lessons were informed by student thinking he noticed during a previous lesson. For example, L7-Fall was a lab on Newton's Second Law. During a planning meeting (PM-F4), AJ shared his decision to do the Newton's Second Law lab based on a student question posed during his lecture on Newton's Third Law:

For the most part, like [the students] said, "Okay, I'll buy that there are these interactions and there's a forward and a backwards in each case", but there were still, obviously, the [question from students], "Does that really happen every time?" Like, when something really tiny hit something really big, or when something really big hit something really tiny, and so that's where I thought we needed to go towards Newton's second law. (PM-F4)

Within the students' question is an implied hypothesis: that the described interaction does not happen every time when something really tiny hits something really big (or vice versa); in other words, that mass plays a role in these interactions. Rather than planning a lecture that would provide correct knowledge on Newton's Second Law as his next lesson, AJ planned an investigation that would help students answer their question and test their implied hypothesis about the role of mass, along with other variables (net force and acceleration). AJ planned to draw attention to students' thinking during the lesson, which is evident in the lab handout provided to students. For example, the handout began with:

Many of you have expressed concern about Newton's 3rd law, specifically questioning whether large objects and smaller objects REALLY DO exert the exact same equal and opposite force on each other.
Perhaps we should take some time to see just how three important variables interact: the heft of an object (we'll call this MASS), the unbalanced force on an object (we'll call this NET FORCE), and the change in motion that the unbalanced force can cause (we'll call this ACCELERATION). (L7-Fall Handout, emphasis in original)

This planned response was considered medium-attention because the lab activity explicitly used student thinking as motivation and regularly returned to the student question and hypothesis. It was not considered high-attention because AJ inserted additional content ideas (e.g., acceleration) that were not evident in the student thinking at this point. Additionally, the plan called for students to work in small groups (high-discourse) to carry out the lab activity that AJ designed (low-student role). In the Spring, AJ maintained this plan's medium-attention and high-discourse but increased the responsiveness of the student role to medium by allowing students to help design the investigation.
Another lab-based responding lesson was L10-Fall, an investigation about acceleration due to gravity. As with the planning of L7-Fall, AJ's decision to do this lab investigation was based on elicited student thinking from L5-Fall. In a planning meeting prior to the start of the lesson, AJ shared:

They are all very well aware that gravity pulls on everything pulls downward with a force on everything, they…now if we get into the details of does the force differ between a cellphone and a book, or does the acceleration differ between a cell phone and a book? I mean, I've [got] answers all over the place on that. (PM-F5)

Based on this student thinking, AJ planned an investigation, which he designed (low-student role), that allowed students to test some of their hypotheses (and some science ideas AJ introduced) about the acceleration due to gravity (medium-attention) while working in small groups (high-discourse). One elicited student hypothesis to be tested was that mass influenced the acceleration. Therefore, part of the plan included dropping objects with different masses. AJ also planned for students to test whether different ways of throwing an object (e.g., drop, toss straight up, toss at an angle) would affect its acceleration due to gravity, an anticipated idea. Since the hypotheses to be tested included anticipated ideas, the attention was considered medium, rather than high, for this responding lesson.
In the Spring version of this lesson plan (L10-Spring), AJ increased both the attention and student role. He planned to have the class compile a list of potential hypotheses to be tested, such that all the hypotheses to be tested were to be elicited from students (high-attention). The L10-Spring plan included a combination of student- and teacher-designed elements, thus making the plans medium in student role responsiveness. For example, AJ told students to measure acceleration, not speed, using a video-capturing tool but allowed students to pick their own objects (including those from home) and decide how they would drop and/or throw the objects to test their hypotheses.

Attending to Disciplinary Connections: Planning Lecture-Based Responding Lessons (4.B.2.b). During the Fall semester, AJ began attending to the disciplinary connections between students' ideas and canonical science ideas when planning lecture-based responses. As a part of this shift, AJ's interpreting became more nuanced.
He relied less on a 'right/wrong' perspective to interpret student ideas and began interpreting student ideas as having pieces of correct knowledge (e.g., partial-present, partial-present-missing). AJ often used these pieces of correctness, or the disciplinary connections within student ideas, to plan his lecture-based responses. AJ planned to draw students' attention to the ideas previously elicited and to explain the disciplinary connections he saw between those ideas and the content of the lecture, which he referred to as his way of "formalizing" (PM-F6) students' ideas.
Table 9 provides an overview of the shift in responsiveness of AJ's lecture-based responding lesson plans. AJ's shift toward attending to disciplinary connections during lecture-based responding lessons was associated with two different attempts at support (referred to as Round 1 and Round 2) provided over several Fall planning meetings and can be first observed when comparing the attention of L4-Fall (low) to that of L6-Fall (medium). AJ continued to attend to disciplinary connections within students' ideas, thereby maintaining a medium level of attention for the remainder of the Fall semester, as well as the Spring semester.

Table 9
Overview of Shift in Responsiveness of Lecture-Based Responding Lesson Plans

Lesson | Responsiveness of initial version | Shift from initial to Fall | Responsiveness of Fall version | Shift from Fall to Spring | Responsiveness of Spring version
L4 | Discourse – low; Attention – low; Role – low | No shift. Support provided (Round 1). | Discourse – low; Attention – low; Role – low | Continuation of 'initial to Fall' shift. | Discourse – med; Attention – med; Role – med
L6* | N/A | Lectures now include disciplinary connections within student ideas (increased attention). Support provided (Round 2). | Discourse – low; Attention – med; Role – low | No shift in attention. No support provided. | Discourse – low; Attention – med; Role – low
L8** | N/A | Same as above. | Discourse – low; Attention – med; Role – low | Same as above. | Discourse – med; Attention – med; Role – uncertain
L11 | N/A | Same as above. | Discourse – low; Attention – med; Role – low | Same as above. | Discourse – med; Attention – med; Role – med
Note. *The Spring version of L6-Fall was L7-Spring; **The Spring version of L8-Fall was L9-Spring.

I provided two rounds of support over several planning meetings that likely influenced AJ's lecture-based lesson plans. My first attempt at supporting AJ's planned responding lessons (Round 1) included support for both interpreting and planning to respond. While my interpreting support appeared productive in shifting AJ's practice toward using a more nuanced perspective when considering student ideas, AJ did not take up my suggestions for how to shift his L4-initial plan away from a lecture-based response. He did not make any significant changes to L4-initial, so L4-Fall was low in all dimensions of responsiveness. In the planning meeting immediately following AJ's implementation of L4-Fall, I provided a different suggestion (than those in Round 1) for his plans to respond (Round 2), based on his reflections on his implementation of the lesson. AJ appeared to take up my Round 2 suggestion, shifting the attention of the next lecture-based lesson (L6-Fall) to medium.
I support my claim that AJ shifted his planning of lecture-based responding lessons in the sub-sections below. First, I provide evidence of AJ's shift in interpreting practice as we reviewed student work together (Round 1 support).
Next, I demonstrate how AJ's planning of lecture-based responses remained focused on providing correct knowledge, despite Round 1 support. Then, I provide evidence of how AJ's planning of his next lecture-based responding lesson (L6-Fall) demonstrated a shift toward attending to disciplinary connections within students' ideas. I include the planning meeting discussions that illustrate AJ's struggle to move away from traditional lecturing and the support (Rounds 1 and 2) offered for AJ's planning to respond practice. Lastly, I use L8-Fall as an additional example of AJ's continued use of attending to disciplinary connections when planning lecture-based responses.

L4-Fall – Interpreting Shift (4.B.2.b.i). Initially, AJ relied on a 'gets it/doesn't get it' perspective when interpreting the ideas he anticipated from students. During our first review of students' work, AJ's interpreting became more nuanced, considering the ways in which the ideas represented partial understanding of science concepts. This shift was observed during the PM-F2 planning meeting, which took place after the first day of L2-Fall, a multi-day lesson in which AJ planned to elicit student hypotheses about the phenomenon. As we started reviewing students' initial models from L2-Fall, AJ shared his interpretations of student thinking made during class, which used a 'doesn't get it' perspective:

From what I noticed as I walked around and looked over shoulders and talked to here and there. They did not get much into why does...They didn't get to forces, so I need to prompt them to deal with interactions between them specifically. That's where I want them to go, and maybe that's the way to get them into thinking about forces. (PM-F2)

From this statement, AJ appears to be interpreting the responses his students provided as missing ideas about forces. At this point in the conversation, I asked if the students simply did not use the term 'force' in their models (a partial-present interpretation; Round 1 support). This prompted AJ to read from a student model during the planning meeting and consider whether the 'idea' of force was present without the academic vocabulary. We then had the following exchange:

JC: Did they just not use the word 'force'?
AJ: Well, so this one says, "The first cog began rolling, moving because of gravity, on a tilted board. It's unknown how the first cog began motion, the first cog then hit the second cog, causing it to roll." So there's a, they got the idea there without the word.
JC: 'Hit', right? That's like…
AJ: Yes, the 'hit'. Yes.
JC: The 'hit', I think, is sort of the seed of force.
AJ: Yep. (PM-F2)

In this exchange, AJ interpreted the student response as having "the idea without the word" (partial-present-missing), showing more nuance in his interpretation, similar to the support provided. As we continued to interpret student ideas from their initial models, AJ continued to demonstrate a more nuanced perspective, or one that looks for the disciplinary connections within student ideas. Below is another example in which AJ read a student response and then interpreted it using a more nuanced perspective:

And this one's actually interesting, it says "The first cog then hit the second cog, causing it to roll and the first to stop." So, they've got at least a semblance of an idea that that hit had two outcomes, like there's both sides of that. So that's nifty. "The second cog began rolling until hitting the third cog and stopping. Because of the force from the second, the third cog began moving until falling off the board and finishing the cog cycle." So there's definitely pieces there. (PM-F2)
In this example, AJ described the students' response as having "a semblance of an idea" and "pieces there," reflecting a partial-present interpretation. This time, AJ made the interpretation without my specific prompting.

L4-Fall – Still Providing Correct Knowledge (4.B.2.b.ii). While the Round 1 interpreting support appeared to help AJ look for disciplinary connections within student thinking (i.e., use a nuanced interpreting perspective), the Round 1 suggestions for planning to respond were not taken up. As we interpreted student thinking within the students' models, I also offered ways AJ could modify his L4-initial plan. This plan, his first attempt at planning a responding lesson, focused on providing correct knowledge using a lecture. He had created a presentation of various force concepts with the general anticipation that students would be missing correct knowledge about these concepts.
I encouraged AJ to build on the students' thinking within the models in his responding lesson. For example, AJ and I noticed that some students drew simple circles to represent gears (the objects in the phenomenon video), and some students drew more accurate depictions of gears. I suggested that AJ use these different approaches to initiate a discussion about how students represented their ideas slightly differently (i.e., variations in modeling conventions) as a way to connect student thinking to free-body diagrams, a science topic he planned to address with his L4-initial plan:

But the other thing is, you may then sort of elevate the fact that some of them [student groups] just used circles, and some of them used straight up pictures of cogs [gears] and that maybe just having a conversation around, did you understand what this group meant, even though they just used circles? And I think that's the bridge to then get you to free-body diagrams. (PM-F2)

Like others offered, this suggestion implied a disciplinary connection within the students' idea. In this case, the disciplinary connection is between the "circles" that students used to represent the cogs in the phenomenon and how scientists represent complex systems using simple images (e.g., a car represented by a square). The suggestion also encouraged AJ to consider planning conversations or discussions (high-discourse) with students about elicited ideas as a way to connect them to science ideas.
Though offered several times during the first two planning meetings (PM-F1 & PM-F2), AJ did not take up these suggestions on how to plan responses as he planned L4-Fall. Therefore, his L4-Fall plan was relatively unchanged from his L4-initial plan and maintained low responsiveness across all dimensions. After implementing L4-Fall, he summarized what happened as, "A lot more sage on the stage-y more than anything else" (PM-F3).

L6-Fall – Lecturing with Attention to Disciplinary Connections (4.B.2.b.iii). AJ's plans for his next lecture-based lesson, L6-Fall, demonstrated a shift in how he planned to respond during whole group instruction. Though he maintained his lecture-based approach (low-discourse), he drew students' attention to their previously elicited ideas by connecting the correct pieces of those ideas (partial-present) to science topics. This offered some (medium) attention to student ideas during instruction, even though discourse and student roles were still low in responsiveness.
This plan to lecture with attention to disciplinary connections was suggested (Round 2) during the PM-F3 planning meeting, which took place after, and on the same day as, AJ's implementation of L4-Fall, a lesson that was low in all dimensions of responsiveness. I made this suggestion after AJ shared that he did not know how to support student progress without using a lecture-based approach:

[I'm] feeling in the weeds 'cause I don't know how to take this style of presentation or this style of learning and have [students make] progress without the stand-up and give them [students] the lecture about what the different forces are and what Newton's laws are. So, that's the challenging part for me, and that's where I feel like the last couple of days were very much more in the traditional style of things, it was me presenting stuff. (PM-F3)

In this excerpt, AJ appears to know he should be doing something different, something more "organic," but is struggling with how to change his practice. This quote also provides additional evidence that previous suggestions on how to change his planned responses were not helpful. The previous suggestions on how to "have progress" during instruction (Round 1) included the use of discussions (high-discourse), which may have been too far removed from AJ's current lecture-bound practice to have meaningful impact.
Once AJ shared his challenge, I made a new suggestion (Round 2). I encouraged AJ to lecture, but to do so in ways that attended to elicited student thinking by talking about the connections he saw between students' ideas and the science ideas of his lecture. For example, I suggested that AJ could connect students' use of 'speeding up' and 'slowing down' in their models with the academic vocabulary of 'acceleration' in his potential plans for Newton's Second Law: "In terms of the vocab, like when you introduce Newton's Second Law, I think that's the point when you say, 'when we've been saying speed up and slow down, that's what acceleration is'" (PM-F3). I made another suggestion for how he might connect another science topic (Newton's Third Law) to a student idea that he had previously interpreted as partial-present:

JC: And so, then the other thing is, I think you could... You know, like the idea of the organic. You may be able to do that with Newton's Third Law. You may be able... [JC references student idea on screen]
AJ: Yeah, I think that's true, yeah…specifically in those two bumps, I think that's a natural one. (PM-F3)

In this exchange, AJ seems to be taking up this suggestion, seeing the disciplinary connection between the student idea and Newton's Third Law. This conversation prompted AJ to plan his next lecture-based responding lesson, L6-Fall, on Newton's Third Law. Though he continued to lecture (low-discourse) while students took notes (low-student role), AJ began planning his lecture-based responding lesson using the disciplinary connections to elicited student ideas, with plans to elevate these connections during instruction (medium-attention). During a planning meeting, AJ described how he elevated the Newton's Third Law connection during L6-Fall:

[I used] the [phenomenon] video to come up with some examples of Newton's third law, and specifically that interaction and really honing in on, saying, "A lot of you really picked up on the fact that when there's... When one cog hits the next, there's an acceleration for the second cog, and it speeds up.
You will all tell me that it [points to the first cog] slows down, and there's this pairing that happens." And then we looked at the video in many different situations of it interacting, and I picked different spots to stop and focus on…so we used that to jump into Newton's third law. (PM-F4) AJ planned his response so that attention was drawn to the student thinking he noticed in their models and had interpreted as partial-present. The student idea became the starting point of the lesson, which was then used to “jump into,” or make a disciplinary connection to, the content of the lesson (Newton’s Third Law). Additional Evidence from L8-Fall (4.B.2.b.iv). Moving forward, AJ was able to connect his new lecture-based approach, attending to disciplinary connections, to his new lab-based approach, investigating student hypotheses. Since he was providing students with a lab experience prior to lecturing on the same topic, he was able to listen and watch for student ideas that he could then connect to science topics in the next lesson. For example, AJ anticipated his students would come up with the mathematical relationship, “As one goes up, the other goes down” (PM-F4), when considering how forces and weight/size are related within the lab equipment setup. This is a novice version of the mathematical relationship found within the Newton’s Second Law equation, F = ma. In the equation, ‘m’ stands for ‘mass,’ a concept associated with, but different from, the students’ idea of weight and size. AJ planned to use this novice relationship as part of his L8-Fall lecture. He shared this during the planning meeting that took place after the first day of the L7-Fall lab and a few days before his implementation of L8-Fall: I think they'll [students will] come up with a pretty secure relationship, so then we can do F = ma [Newton’s Second Law content] and kinda move forward from there, and then I'll relate that back to really big things and really tiny things [student thinking that prompted the L7-Fall lab] and talk about that relationship. (PM-F4) AJ planned to connect the relationship he anticipated from students during the L7-Fall lab, along with the student thinking that prompted the L7-Fall lab (i.e., “really big and really tiny things”), to his lecture on Newton’s Second Law (L8-Fall). The use of the term “secure” implies that AJ’s interpretation of the idea is at least partial-present. When his students came up with the relationship that AJ anticipated during the L7-Fall lab, AJ included the related student data and responses as part of the L8-Fall lecture presentation (medium-attention). After implementation, he summarized this lesson set (L7-Fall lab & L8-Fall lecture) as: “They [students] came up with that relationship [from the L7-Fall lab] and then we formalized it [with L8-Fall lecture]” (2019-12-02). “Formalizing” became AJ’s way of referring to his lectures in which he attended to and built on students’ thinking. Toward Planning Discussions – Planning to Elicit During Whole Group Instruction (4.B.3) AJ’s final shift was in the way he planned to elicit during whole group instruction. During the Fall semester, AJ’s planned discourse structures during whole group instruction were limited to lectures (low-discourse). In the Spring, AJ shifted toward planning discussions for whole group instruction and increased the discourse responsiveness of these lesson plans. This shift occurred in two ways depending on the grouping format (whole or small group) of the associated Fall lesson plan.
For Spring versions of AJ’s Fall whole group instruction, AJ began planning to elicit by having students ‘share out’ their ideas and thinking. For Spring versions of AJ’s Fall small group instruction, AJ added whole group segments during which he planned to elicit by having students ‘share out’ their ideas and thinking. In the subsections below, I first discuss how AJ shifted toward planning discussions by making changes to his Fall plans for whole group instruction, followed by changes to his Fall plans for small group instruction. In the last sub-section, I discuss support associated with this shift. Changes for Spring Whole Group Instruction – Having Students ‘Share Out’ (4.B.3.a). When planning in the Spring, AJ generally used or modified his Fall lesson plans. When considering whether and how to modify Fall versions of lesson plans with whole group instruction, AJ began planning for students to ‘share out’ their own ideas and thinking rather than planning to bring up and talk about the student ideas himself (as he planned for his Fall semester whole group instruction). This shift can be seen when comparing Fall versions of lesson plans with whole group instruction to their Spring counterparts. AJ modified three of his four Fall lesson plans by adding plans to elicit during whole group instruction, which increased the discourse of the Spring versions. See Table 10.

Table 10
Overview of Shift in Responsiveness of Whole Group Lesson Plans from Fall to Spring

Lesson   Responsiveness of Fall version                   Responsiveness of Spring version
L4       Discourse – low; Attention – med; Role – low     Discourse – med; Attention – med; Role – med
L6*      Discourse – low; Attention – med; Role – low     Discourse – low; Attention – med; Role – low
L8**     Discourse – low; Attention – low; Role – low     Discourse – med; Attention – med; Role – uncertain
L11      Discourse – low; Attention – low; Role – low     Discourse – med; Attention – med; Role – med

Description of shift from Fall to Spring: Provided opportunity for students to ‘share out’ [increase in discourse responsiveness]. Support: AJ attributed to Fall experience.

Note. *The Spring version of L6-Fall was L7-Spring; **The Spring version of L8-Fall was L9-Spring.

Generally, AJ planned to have students ‘share out’ (planned elicitation with high student talk) at the beginning of the lesson. These ‘share-outs’ elicited students’ ideas or thinking from an earlier small group lesson (e.g., lab activity) that he was able to monitor and use to anticipate student responses. This made the Spring versions of these lessons medium in discourse since they included some segments heavy in student talk and some segments heavy in teacher talk. The ‘share out’ was then followed by a short lecture (high teacher talk) in which AJ drew students’ attention to disciplinary connections within the elicited ideas or thinking (planned response with attention to student ideas). Therefore, AJ shifted toward planning discussions (increased discourse) while maintaining his Fall increase in attention: planning to respond with attention to student ideas. Evidence of this shift in planning, or addition of plans to elicit to whole group instruction, can be seen when comparing L8-Fall to its Spring counterpart, L9-Spring. The topic of these lesson plans was Newton’s Second Law. In the L8-Fall plan, AJ used a student idea he elicited during the previous day’s lab-based, small group lesson (“As one goes up, the other goes down,” PM-F4) as the starting point for his whole group lecture.
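To make this disciplinary connection concrete, the following is a minimal worked illustration of the canonical relationship that the students’ novice idea approximates; the specific mapping is my gloss rather than AJ’s own formalization:

\[
F = ma \quad \Longleftrightarrow \quad a = \frac{F}{m}
\]

Read with the force held fixed, the equation captures the students’ inverse relationship: as mass (the formal counterpart of the students’ ‘weight/size’) goes up, acceleration goes down. Read with the mass held fixed, it predicts the proportionality students later identified in their lab data (doubling the force roughly doubles the acceleration) and implies that a plot of force against acceleration should be linear with slope m, which is what a linear regression of the students’ data would formalize.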
During the lecture, he “formalized” their ideas or explained the disciplinary connections between the student idea and Newton’s Second Law. Because it consisted of a lecture, the discourse responsiveness of this lesson was low. In the Spring version of this lesson (L9-Spring), rather than talking about the students’ ideas himself, AJ planned for students to ‘share out’ their ideas about the data collected the previous day during a lab-based, small group lesson. During the PM-S6 planning meeting, AJ described his plan after implementing it: I took one of their [students] sets of data and started the PowerPoint with it today. And we [AJ and the students] looked first for trends, and everyone came up with the trends…We did it more as a group together. It wasn’t like in individual groups, [or] that they [students] were sort of on their own. They were engaged and they were together, they paid attention, they answered questions and asked questions and talked to each other. Though the description is given after implementation, it suggests AJ planned to ask students to ‘share out’ their ideas about trends in the data, which allowed the level of student talk AJ described. During the planning meeting, AJ went on to describe the mathematical relationships students identified in their trends and his switch to the teacher-led segment of the lesson. He said: [Students found] some places where the force doubled, and the acceleration more or less doubled. And then another place where it doubled and it looks like it’s close to doubling. And so, that’s where we said, “Well, these look close, but is it good enough for a mathematical relationship? So, that’s where I did a…“here’s what a physicist would do, we would put it onto a graph, and do a linear regression…” (PM-S6). Here, AJ ‘formalized’ students’ ideas about trends in the data by showing them a canonical process for analyzing data and connecting the students’ mathematical trend ideas to the scientifically accepted trends. AJ said he told the students their trends “look[ed] close” and asked them to consider “is it [a trend] good enough.” While he never offered a direct interpretation of the students’ trend ideas, this quote does imply a partial-present-missing interpretation of the students’ ideas. To summarize, AJ planned to elicit student ideas with a ‘share out’ at the start of L9-Spring and then respond with a lecture in which he attended to disciplinary connections within students’ ideas. Compared to L8-Fall, his plans for L9-Spring reflect a shift toward planning discussions. In terms of responsiveness, the lesson plan was 1) medium for discourse, since a portion of the lesson plan was a ‘share out’ or planned elicitation in which student talk would be high; 2) medium for attention, as student ideas were elevated and used during instruction, in conjunction with canonical knowledge; and 3) medium for student role, as the elicitation prompt to “look for trends” implies that students did knowledge construction during the early part of the lesson, with the teacher taking a ‘knowledge-holder’ role for the remainder of the lesson. Changing Small Group Instruction – Adding Whole Group ‘Share Out’ Segment (4.B.3.b). When planning in the Spring, AJ also modified some of his plans for small group instruction in ways that represented a shift toward planning discussions. When considering whether and how to modify Fall versions of lesson plans with small group instruction, AJ occasionally added a whole group elicitation segment.
Therefore, what was once only a small group lesson that included plans to elicit (Fall versions) now included a whole group segment that also included plans to elicit (Spring versions). Similar to the shift for whole group instruction, the elicitation segment was planned as a student ‘share out.’ This represented a shift in the way AJ planned whole group instruction, since all of AJ’s Fall whole group instruction lessons were planned as lectures with low discourse responsiveness. While this was true for only two of AJ’s seven lesson plans with small group formats, they were the only whole group segments planned with high discourse responsiveness. See Table 11.

Table 11
Overview of Shift in Responsiveness of Small Group Lesson Plans from Fall to Spring

Lesson   Fall version (grouping format; responsiveness)      Spring version (grouping format; responsiveness)
L1       Small Group; Disc – high, Att – high, Role – high   Small Group; Disc – high, Att – high, Role – high
L2       Small Group; Disc – high, Att – high, Role – high   Small Group; Disc – high, Att – high, Role – high
L3       Small Group; Disc – high, Att – high, Role – high   Small Group; Disc – high, Att – high, Role – high
                                                             Whole Group; Disc – high, Att – high, Role – high
L5       Small Group; Disc – high, Att – low, Role – low     Small Group; Disc – high, Att – med, Role – med
                                                             Whole Group; Disc – high, Att – high, Role – high
L7       Small Group; Disc – high, Att – med, Role – low     Small Group; Disc – high, Att – med, Role – med
L9*      Small Group; Disc – high, Att – low, Role – low     Small Group; Disc – high, Att – med, Role – low
L10      Small Group; Disc – high, Att – med, Role – low     Small Group; Disc – high, Att – high, Role – med

Description of shift from Fall to Spring (L3 and L5): Added whole group ‘share out’ lesson segment [increase in whole group discourse responsiveness]. Support attributed by AJ to Fall experience.

Note. Disc = discourse; Att = attention; *The Spring version of L9-Fall was L6-Spring.

These additional whole group lesson plan segments added an entire day of instruction and were planned as student-led, teacher-facilitated (high-discourse) group discussions about the students’ thinking (high-attention). AJ did not plan to push students toward any particular conclusion or connect their ideas to any science topic during this segment of instruction (high-student role). Therefore, I consider the way AJ planned to elicit using a whole group instructional format to represent a shift toward planning discussions. Evidence of this shift can be found by comparing AJ’s L3-Fall and L3-Spring plans. L3-Fall was a planned response to ideas elicited from L2-Fall and was high in responsiveness across all three dimensions. AJ planned for students to work in small groups together (high-discourse) to design (high-student role) and carry out an investigation to test a set of elicited student ideas about the phenomenon (high-attention). AJ planned to have students ‘share out’ their findings across groups in a small group (jigsaw) format to do this. Since a similar set of student ideas was elicited with L2-Spring, AJ’s L3-Spring plan was identical to his L3-Fall plan except for the addition of a whole group ‘share out’ about students’ lab designs and findings. AJ described his plan for this whole group segment as a consensus discussion based on the two hypotheses students were testing with the L3-Spring investigation: that the wooden board in the phenomenon video was either slanted or horizontally flat.
When discussing the plans for the whole group segment of the lesson, AJ shared: I wanna see kind of where it goes from there to see whether they're [students are] gonna be more interested in... Well, first to see how they... Whether they come to a consensus on, they believe the board to be flat or they believe the board to be slanted. 'Cause I think if they do come to a consensus on one of those, then I think there's clear direction on which way they go. I think if they say, "Well, it had to be slanted," then I think we go gravity and friction. If they think it's flat, then I think we go into…Newton's First Law and work beyond that. So, if there's consensus, I think that's the direction that it takes us. If there's not consensus, I wanna hear what their arguments are and sort of decide from there. (PM-S1) In the excerpt above, AJ planned to elicit student thinking with the purpose of building student “consensus” in a whole group format about which student hypothesis they believe to be correct (high-attention). This implies a high level of planned discourse responsiveness. He does not appear to care about the outcome of the consensus and, therefore, is not planning to push the student consensus in one direction or another (high-student role). Additionally, AJ showed evidence of planning to use the elicited student thinking to guide his next planned response with attention to disciplinary connections, the approach developed during the Fall semester (See 4.B.2.b). In the above quote, AJ made a preliminary plan to choose gravity and friction as his next science topics if students decided the board was slanted and Newton’s First Law as the next science topic if the students believed the board was flat. In these preliminary plans, AJ demonstrated how he continued to look for disciplinary connections within students’ ideas to plan his responses during the Spring semester. Support (4.B.3.c). During the Spring semester, support appeared to come from two places. One likely source of support was AJ’s experience implementing lessons during the Fall. This experience appeared to support his increase in discourse responsiveness and shift toward planning discussions. The other potential source of support was suggestions offered during planning meetings. In the Spring, however, these appeared to help AJ maintain his new approaches in planning, rather than make shifts in planning as they had in the Fall. I provide evidence of the potential influence of these two supports on AJ’s Spring planning in the sections below. I begin with AJ’s identified support of his Fall experience implementing lesson plans and how that seemed to help AJ increase the discourse responsiveness of his Spring lesson plans. Then I discuss the support that may have helped AJ maintain new approaches to planning he had developed during the study. Support from Fall Experience (4.B.3.c.i). AJ seemed to initiate the increase in discourse responsiveness (toward planning discussions) on his own. I did not offer any suggestions during planning meetings aimed at pushing AJ in this way. However, he did offer insight into the support he felt helped him make this shift during a member-check discussion (2022-07-25). During the discussion, I shared my Lesson Plan Memos and described the shifts in planning I noticed through my analyses, including changes in the planned discourse structures between Fall and Spring versions of several lesson plans.
AJ confirmed my general finding that he began planning to elicit student ideas during whole group instruction in the Spring. When I asked what he attributed this shift to, he responded by saying: And I think maybe part of that was because I was more comfortable doing it. I could see what the path was gonna be, I could trust that the path was gonna be there, I could trust that it could all come together, that kind of thing, I think there was for sure a big comfort level on my part. I then checked my understanding by asking, “So, it’s like… ‘I know what the path looks like for this conversation because I did it last time [Fall semester], now I'm gonna allow you [students] to participate a little bit in this thing.”’ AJ nodded in response. AJ attributed his shift toward inviting students into the discourse of his whole group instruction to a gain in comfort from having implemented his lesson plans during the Fall semester. From this experience, he could see and trust that a pathway through the discussions would “all come together.” Maintenance Support (4.B.3.c.ii). While I did not offer any specific support during planning meetings to facilitate AJ’s increase in discourse responsiveness, I did offer support that may have helped him maintain his new planning approaches. During the Spring semester, AJ faced a moment of uncertainty about student thinking when planning his L5-Spring lesson. Though he had students’ work to interpret, he felt unsure as to whether the student responses indicated that they held particular ideas and, therefore, struggled to plan his response. For example, while reviewing students’ initial models (student work from L2-Spring) to look for disciplinary connections between student ideas and science ideas, AJ shared that he wanted to address the topic of ‘net force’ in his upcoming lesson (L5-Spring). However, he faced uncertainty about students’ thinking and whether ‘net force’ would be a good topic for his next lesson. During the PM-S2 planning meeting, AJ shared: Like I wanna go to net [force] first, I naturally wanna go there, but I still don't know that students have this idea that you know a table is exerting a force, and like if I push something, like I think they get that if I push something [an object], there's a force there…[but] I don't know that there's an idea that one object can exert a force on another object without a human pushing it, like I don't know that that's a thing that they believe yet. Though students had completed elicitation tasks (segmenting task of L1-Spring and initial modeling task of L2-Spring) at this point in the semester, the student responses left AJ with uncertainty about whether students thought forces could only be caused by an active agent, like a human. AJ then shifted from uncertainty about what students thought to believing students held the wrong idea about when forces exist. He shared, “I think some of them [students] think that a force is still this active thing that someone has to do, a person or an entity has to make happen” (PM-S2). Though he was trying to plan a response to elicited student ideas, AJ appeared to switch to anticipating students held the non-canonical idea in the absence of clarity about student thinking. I suggested he plan to elicit this specific thinking during his next lesson, rather than plan based on this anticipation. I suggested, “I'm wondering if maybe it's worth...
Sort of eliciting that question, like eliciting where they're at with that” and went on to describe how he could incorporate this suggestion into other planning ideas we had already discussed during the planning meeting (PM-S2). AJ took up this suggestion and planned ways to elicit students’ understanding about whether forces could exist without an active agent in his L5-Spring lesson plan. In this case, the suggestion seemed to help AJ maintain his new planning sequence of planning to elicit student hypotheses followed by planning to respond with attention to students’ ideas. This maintenance support stands in contrast to the support offered during the Fall semester that appeared to help AJ make shifts toward this sequence. Summary (4.B.4) Over the course of the study, AJ made small, but important, changes in the way he engaged in the responsive planning practices (anticipating, interpreting, planning to elicit and planning to respond) and the responsiveness of his planned instruction. First, AJ began planning to elicit students’ initial hypotheses about the unit’s anchoring phenomenon and increased the attention and student role responsiveness of his lesson plans. Next, AJ began planning to respond with attention to student ideas and increased the attention of his planned responses. This shift in planning to respond occurred in two ways: 1) for lab-based responding lessons, AJ began investigating student hypotheses and 2) for lecture-based responding lessons, AJ began attending to disciplinary connections within student ideas. Finally, AJ shifted toward planning discussions and increased the discourse of his whole group instruction. The shifts in AJ’s responsive planning demonstrate a shift away from his initial focus on correct knowledge and toward being responsive to students’ ideas and thinking. Findings for Research Question Three (4.C) In this final section of my Findings chapter, I discuss AJ’s use of the FMLP over the course of the study. I observed only a few instances of use during each planning meeting, if at all (Table 12). Some instances of use were only in passing. For example, AJ may have referenced an idea from the FMLP. Other instances of use were more in-depth. For example, AJ and I had several discussions about how he might incorporate an FMLP-aligned resource into his lesson plan. In the descriptions of use below, I am not suggesting that the FMLP was a tool AJ consistently relied on.

Table 12
Depth of LP Use During Planning Meetings

Planning Meeting   In Passing Use of FMLP   In Depth Use of FMLP
PM-F1              X                        X
PM-F2              X                        X
PM-F3              X
PM-F4              No evidence of use
PM-F5              X
PM-F6              X
PM-S1              X
PM-S2              X
PM-S3              X
PM-S4                                       X
PM-S5              X
PM-S6              No evidence of use
PM-S7              X

With variation in AJ’s depth of use, AJ used the FMLP as a list of two different things. First, AJ used the FMLP as a list of potential student ideas. He used the FMLP with both his initial (less responsive) and new (more responsive) planning practices. Second, AJ used the FMLP as a list of bridges between student ideas (or levels of the FMLP). While not directly present in my data, AJ reported this use of the FMLP with his new planning to respond practice. LP as a List of Potential Student Ideas (4.C.1) AJ used the FMLP as a list of potential student ideas with both his initial (less responsive) and his new (more responsive) planning practices.
In the sub-sections below, I provide evidence of AJ’s use of the FMLP as a list of potential student ideas with his initial way of engaging in responsive planning practices, new planning to elicit practice, new interpreting practice, and new planning to respond practice. With Initial Way Of Planning (4.C.1.a). During our first planning meeting (PM-F1), AJ referenced the LP only a few times when discussing his initial lesson plans. In his initial lesson plans, AJ interpreted the ideas he anticipated through a ‘gets it/doesn’t get it’ perspective and planned to elicit correct ideas or planned to respond to correct the things he anticipated students ‘wouldn’t get.’ When relying on this initial focus on correct knowledge, AJ appeared to use the FMLP as a list of potential student ideas that would need to be corrected. Specifically, AJ anticipated an idea from the FMLP (impetus) and interpreted the idea as wrong. Although AJ did not discuss impetus (or any other ideas from the FMLP) when sharing his initial plans to elicit, his initial plans to respond were intended to correct the impetus idea. For example, AJ referenced impetus as we discussed his plans for L4-initial (planned lecture on force topics) and L5-initial (planned elicitation of correct ideas from L4-initial). AJ described this lesson sequence as a point in the unit “where we can destroy the impetus idea” (PM-F1). Though made only in passing, this comment provides evidence of AJ’s use of the FMLP as a list of potential ideas in a way that aligns with his initial focus on correct knowledge. By describing it as something to be destroyed, AJ implied a wrong interpretation of the anticipated impetus idea. Additionally, he appears to have planned his response to this wrong idea, without indicating any plans to elicit the idea from students first. Another example came later in the same planning meeting. As I made suggestions of potential lesson plan ideas, AJ made another reference to the same idea from the FMLP in passing: “Especially, if the impetus thing rears its ugly head and needs to be addressed” (PM-F1). In this case, AJ described the impetus idea as “ugly,” implying an interpretation of the idea as wrong, as in the last example. AJ also described the FMLP idea as something that “needs to be addressed,” or corrected, which indicates a focus on responding to the anticipated idea. Again, there is no mention of plans to elicit the impetus idea; at this point in the study, AJ’s planning focused only on eliciting correct ideas. In both examples, AJ referenced the impetus idea from the FMLP. The goal of instruction was to correct this wrong idea. This is consistent with AJ’s initial way of engaging in responsive planning, which focused on correct knowledge. With New Planning To Elicit Practice (4.C.1.b). Early in the Fall semester, AJ began planning to elicit student hypotheses. This meant he planned to elicit a range of student ideas. On occasion, AJ used the FMLP as a list of potential student ideas, or hypotheses, to be elicited. This contrasts with his initial use of the FMLP as a list of ideas to be corrected. AJ used the FMLP as a list of potential ideas to elicit in two ways. First, AJ started considering whether his eliciting tasks would, in fact, elicit ideas from the FMLP. An example of this use of the FMLP occurred during the first planning meeting.
After I provided some support, reminding AJ that the purpose of eliciting is to uncover the ideas students hold, he shared a potential planning idea: Maybe a third iteration of the model, okay, "Create a new model that is just based on the forces that you see happening. So, create a model that only labels the forces that are occurring," or something like that and see what they do with that. I think that's where you'll see or at least be able to flesh out that impetus idea. (PM-F1) In this example, AJ’s focus was on planning an activity that would help him elicit whether students held the impetus idea, an idea highlighted on the list of potential student ideas from the FMLP. Previously, AJ had no plans to elicit the impetus idea at all, potentially because he initially planned to elicit correct ideas and interpreted impetus as a wrong idea. With a shift to planning to elicit a range of student ideas, AJ considered whether his eliciting task would elicit the impetus idea from the FMLP. AJ continued to consider whether, and ensure that, eliciting tasks would elicit FMLP ideas during the remainder of the study. For example, later in the Fall semester, AJ said the following about his plan to elicit by having students revise their initial models about the phenomenon (L5-Fall): We'll get some flesh out of why it [the object in the phenomenon] continues to move…And maybe that can... Maybe that'll help flush that out and sort of get us an idea of where their thinking is, as far as the progression and where their thinkings are in general. (PM-F2) In this example, AJ hoped to elicit student thinking in general and with respect to the FMLP. As an additional example from the Spring semester, AJ shared his thinking about a planning idea we were discussing: “'Cause I think if I give them something like this…I don't really know whether impetus [FMLP idea] is gonna be there or not” (PM-S2). Here again, as AJ planned to elicit, he was concerned with whether his eliciting tasks allowed him to see whether his students held an idea from the list of potential ones provided by the FMLP. The second way AJ used the FMLP as a list of potential student ideas with his new planning to elicit practice was to incorporate FMLP-aligned resources into his lesson plans. In the Fall semester, with my supportive suggestion, AJ included modified OMC items in his plans for the second day of his L2-Fall lesson. He planned to provide students with the modified OMC items, along with other prompts aligned to non-FMLP ideas, to discuss in their small groups. He hoped this plan would help students clarify ideas about the phenomenon as they worked in small groups to create their initial models. In the Spring semester, AJ used modified OMC items in his plans to elicit student thinking about why objects slow down. He then used the student responses to inform his plans to respond, which are discussed later (See 4.C.1.d below). With New Interpreting Practice (4.C.1.c). Starting with the shift described as planning to respond with attention to student ideas, AJ reduced his reliance on ‘gets it/doesn’t get it’ interpretations of student ideas. Instead, he offered more nuanced interpretations (e.g., partial-present) as he looked for disciplinary connections between student ideas and science ideas. When doing so, AJ used the FMLP as a list of potential student ideas to which he could compare elicited student responses. In other words, the FMLP acted as a reference list.
As an example, AJ shared his interpretation of student thinking using the FMLP after eliciting student thinking during his L4-Spring lesson. AJ and I had the following exchange during the planning meeting following the lesson (PM-S2): AJ: When they [students] said “external or an outside influence,” I was like, "Oh, wow. Okay."…I think this moves away from, it moves away from the level one [of the FMLP], I think. It moves away from this natural state of everything has its own [blah] and then it just is. JC: The natural state is rest. AJ: Yep. I think it moves away from that at least a little bit. In this example, AJ was impressed with a piece of the student response (“external or an outside influence”), implying at least a partial-present interpretation. Additionally, AJ also described the elicited student response as being better than the idea in level one of the FMLP. In this instance, AJ is using the FMLP to justify how impressed he was with the student thinking, as the student response represented better thinking than that found in the lowest level of the FMLP. Another example comes from AJ’s reflection on ideas elicited during the L3-Spring lesson, an investigation of the orientation of the wooden board in the anchoring phenomenon. AJ shared: The interesting part to me was this idea that, when they [students] made the connection with “big force meant big motion” or “just a bump meant big motion if it [the wooden board] was slanted.” I think that is a way that we can sort of talk about and hopefully dispel a little bit of the impetus idea. (PM-S2) In this case, AJ hoped that the elicited student ideas (e.g., “just a bump meant big motion if it [the wooden board] was slanted”) might help “dispel” the FMLP idea impetus, suggesting that the elicited idea is more correct than the impetus idea. Therefore, similar to the last example, AJ implied at least a partial-present interpretation of the student thinking relative to an FMLP idea. With New Planning to Respond Practice (4.C.1.d). As part of AJ’s planning to respond with attention to student ideas practice (See 4.B.2), AJ began using elicited ideas to inform his responding lesson plans. For lab-based responding lessons, this meant AJ planned to investigate student hypotheses with lab activities. In one instance, AJ used the FMLP as a list of potential student ideas, or hypotheses, when planning a lab-based responding lesson. This was in the planning of L6-Spring, a friction investigation. As mentioned earlier (See 4.C.1.b), AJ modified and used FMLP-aligned OMC items to elicit students’ ideas about why objects slow down (L6-Spring). He planned to have students answer the items individually and to use their responses (ideas) to inform his plans for the upcoming friction investigation (L6-Spring). He planned to place two students who had chosen the same OMC response (e.g., option A) in a group with two other students who had chosen a different shared response (e.g., option B). Therefore, in a group of four, each student would work with someone holding the same idea (or who picked the same response) as they did and two other students holding a different idea. Within their groups, students would discuss their ideas related to why things slow down based on their OMC item responses: I think the idea that I wanna roll with on Thursday [first day of L6-Spring] is,…if I get lucky, I can mix some groups together in terms of...
I thought about the homogenous group or a heterogeneous group, and I like the heterogeneous group… I'm gonna have them self-identify in private. I don't think I want them to publicly identify, but I want them to self-identify in private and then I will create the groups and put them in, and I'll be straight up and tell them probably, "Hey, two of you are this idea [from the OMC item] and two of you are that idea [from the OMC item], and I want you to sort of flesh…play around with that a little bit. (PM-S4) AJ’s plan was to then have the group make predictions about the results of the lab activity based on their ideas, or hypotheses, from the OMC items: …and asking [students] to predict the data that they're gonna look for... And then share it with each other in the group and see, "Okay, what are we gonna look for or see for each one [of the student ideas]," and then talk with each other and see if there are other pieces of data that they can look for to validate each of those. (PM-S4) To summarize, AJ used the student responses from an FMLP-aligned eliciting resource to inform the way he planned his lab-based responding lesson. First, he used student responses to put students into groups. Second, he had students discuss their ideas and predict the investigation outcomes based on these ideas, or hypotheses. In this example, AJ planned to investigate student hypotheses, with the hypotheses to be investigated coming from the FMLP, or list of potential student ideas. LP as a List of Bridges Between Ideas (4.C.2) In addition to his use of the FMLP as a list of potential student ideas, AJ used the FMLP as a list of bridges between ideas. This use demonstrates partial use of the hierarchical features of the FMLP. I did not directly observe this use as a list of bridges between ideas. Rather, AJ reported that the FMLP was useful when mapping out his next instructional steps, which suggests it informed his planning to respond with attention to student ideas. During the PM-F5 planning meeting, I asked AJ what was influencing his responding plans. AJ identified three factors: the students’ ideas, his own knowledge of the discipline, and the FMLP. When I asked if any one or two of these were higher in importance, he said “no” and then elaborated on his use of the FMLP: Because the progression and knowing the gaps specifically, the jumps from one level to the next [bridge], is as related...obviously, you need the ideas in the room [students’ ideas],…but then you also need that knowledge of the physics progression… at least the knowledge of the connections [bridges] to make that happen. So I think it's a triangle. I think there's three of those that are all sort of interacting and inter-playing with each other there. When I then tried to clarify how he was using the FMLP, we had the following exchange: JC: Okay. So…you're seeing the progression in terms of those gains [bridges]? AJ: Yeah, I'm seeing... Yes the gains [bridges], yup. JC: Okay, like I need to know what these things that are, like the bridges between the ideas? AJ: Yeah. JC: Not necessarily that I'm gonna take bridge one to two, and then two to three and three to four. But that I want those bridges to exist and I'm gonna put them in whatever order I think I need them to be. AJ: And whatever order the kids seem to need them to be.
(PM-F5) From the excerpt, it appears AJ considered the bridges students would need to take in order to get from one idea to another within the FMLP, but did not rely on the FMLP’s sequencing of the bridges (i.e., bridge from level 1 to level 2, bridge from level 2 to level 3, and so on) when planning his responses. Summary (4.C.3) During the study, AJ used the FMLP as a list of two different items: 1) potential student ideas and 2) bridges between potential student ideas. AJ used the FMLP as a list of potential ideas with both his initial (less responsive) and new (more responsive) responsive planning practices. AJ reported using the FMLP as a list of bridges between potential student ideas with his new planning to respond practice. CHAPTER FIVE: DISCUSSION The findings above present a case of how a teacher’s planning practices increased in responsiveness over time and how an LP was used during the change process. In the sections below, I first discuss notable themes within the findings and their connections to the literature. I then discuss implications and recommendations for those supporting teacher responsiveness. Notable Themes (5.A) In this subsection, I address the following notable themes from the findings: 1) how my codes captured the responsiveness of AJ’s planning, as a methodological contribution, and 2) the challenge of planning to respond and 3) the use of the FMLP during the study, as empirical contributions. I end with the limitations of the study. Capturing Responsiveness of Planning (5.A.1) Studies of responsive instruction (both planning and enactment) have relied on three dimensions for describing responsiveness: class discourse structures, attention to student ideas during instruction, and the roles given to students during instructional activities. Many of these studies (e.g., Gotwals et al., 2015; Pierson, 2008) have integrated some or all of these dimensions into a single measurement or code for the responsiveness of a teacher’s practice. For example, Pierson (2008) incorporated attention into her categorization of discourse. In other words, a teacher’s practice needed to meet both discourse and attention criteria to be considered low, medium or high in responsiveness. Additionally, some studies (e.g., Elby et al., 2020; Pierson, 2008) relied on a single dimension to measure or code for responsiveness. For example, Elby and colleagues used student role to describe the responsiveness of classroom instruction within their analytic tool. In my study, I included all three of these dimensions of responsiveness and considered them separately. This allowed me to capture how AJ’s planning practices shifted in ways that might not have been captured if integrated into a single responsiveness measure or if I only used one of these dimensions. For example, during the latter half of the Fall unit, AJ began lecturing about the connections he saw between students’ ideas and science ideas (See 4.B.2.b). This shift in responsiveness was captured by an increase (low to medium) in attention to student ideas. However, the discourse and the student role remained low in responsiveness. If, for example, planned discourse and attention had been integrated into a single descriptor or code, a shift in both discourse and attention would have been required to qualify as an increase in responsiveness. Therefore, I may not have noticed this shift in the way AJ was planning to respond during the Fall semester since he only changed the attention, and not discourse, of his planned instruction.
Additionally, I was able to capture a shift in the responsiveness of a planned lecture, an instructional context not typically thought of as responsive (e.g., Richards et al., 2020). As described earlier in this section, prior studies have contributed to our understanding of classroom responsiveness using the three dimensions of responsiveness individually and when integrated into a single variable. However, had I defined responsiveness in these ways, I would likely have missed some of the shifts in responsiveness AJ made during the study. Therefore, this study offers a new way of analyzing instruction, particularly planning, for responsiveness using three separate dimensions of responsiveness. The Challenge of Planning to Respond (5.A.2) Many scholars have acknowledged how challenging it can be for teachers to enact responsive instruction (e.g., Bennett, 2011), with some noting the responding practice as the most challenging (Ruiz-Primo & Furtak, 2007). The results of my study suggest a similar challenge for the practice of planning to respond based on the differences in AJ’s planned eliciting and planned responding lessons. First, AJ’s responding lesson plans shifted more gradually than his eliciting lesson plans. Second, AJ took up my initial suggestions (support) during the first planning meeting for his planning to elicit practice. However, before AJ made changes in his planning to respond practice, I offered multiple and modified suggestions (support) over several planning meetings. In the sub-sections below, I further describe these two differences in how AJ’s planning to elicit and planning to respond practices shifted during the study. I then offer one insight into why shifts in planning to respond may have been more challenging and how this insight is connected to the literature. Gradual Shift in Planning to Respond (5.A.2.a). AJ’s planned responding lessons increased in responsiveness more gradually than his planned eliciting lessons. By gradually, I mean that changes in AJ’s planned responding lessons generally occurred in a single dimension at a time, shifted from low to medium, then medium to high, and took place over the entire study. To elaborate, AJ’s planning to respond practice shifted in responsiveness twice during the study. First, during the Fall semester, AJ began planning to respond with attention to student ideas, mostly through a shift in the attention of his planned responses from low to medium. During the Spring semester, AJ moved toward planning discussions for whole group instruction, mostly observed through a shift in the discourse of his responding lessons from low to medium. In contrast, changes in AJ’s planned eliciting lessons generally occurred in two dimensions at a time, shifted directly from low to high, and occurred very early in the study. AJ’s planning to elicit practice shifted during our first planning meeting. AJ decided to stop planning to provide correct knowledge prior to eliciting (initial planning focus; low-attention, low-student role) and began planning to elicit initial hypotheses about the phenomenon (high-attention, high-student role). Less Uptake of Support for Planning to Respond (5.A.2.b). The second difference between AJ’s shifts in planned responding and eliciting lessons was in the uptake of support provided. Though I made multiple attempts to support a shift in the way AJ planned to respond during the first two planning meetings, the first uptake of support was in an email exchange following the second planning meeting.
In the email exchange, AJ began planning to investigate student hypotheses within his lab-based responding lesson plans (part of the overall shift to planning to respond with attention to student ideas). Additionally, it wasn’t until the third planning meeting that AJ’s lecture-based responding lessons showed a shift in the way they attended to disciplinary connections within student ideas (part of the overall shift to planning to respond with attention to student ideas). Over the course of these planning meetings, I shifted my support from suggesting ways to build on students’ ideas within a discussion, which focused on changing multiple dimensions of responsiveness, to providing support that focused on increasing just one dimension of responsiveness. First, I started describing student ideas as potential hypotheses to be investigated, which focused only on the attention AJ might give to student ideas during lab-based activities. Next, I started suggesting that AJ keep lecturing but add into his lectures the descriptions of the disciplinary connections he saw in students’ ideas, which focused on increasing the attention of his lecture-based responses. Given AJ’s uptake of the modified support offered, it seemed AJ needed a more scaffolded pathway for change in his planning to respond practice. In contrast, support for AJ’s planned eliciting lessons focused on changing multiple dimensions of responsiveness and was taken up when offered during our first planning meeting together (prior to the start of the Fall semester). Insights Into the Challenges of Planning to Respond (5.A.2.c). Taken together, the more gradual shift in and less uptake of support for AJ’s planning to respond practice suggest that AJ found changing his planning to respond practice more challenging than changing his planning to elicit practice. I offer one insight into why planning to respond may have been more challenging for AJ to shift than planning to elicit. AJ’s first shift in planning to respond (toward planning to respond with attention to student ideas) paralleled a shift in interpreting from evaluating student ideas using a ‘gets it/doesn’t get it’ perspective (Minstrell & van Zee, 2000; Otero & Nathan, 2004) to seeking to find the disciplinary connections between students’ ideas and science ideas (Robertson et al., 2016). AJ then used these disciplinary connections as part of his shift in planning lecture-based responses (toward attending to disciplinary connections). Part of the challenge of shifting his planning to respond practice may have been that AJ was still new to interpreting in terms of considering connections between student ideas and science ideas to inform his planning to respond practice. AJ had to shift multiple practices at once to modify the responsiveness of his planned responding lessons. In contrast, AJ could modify the responsiveness of his planned eliciting lessons while still relying on a ‘gets it/doesn’t get it’ perspective. For example, AJ could still interpret the student ideas he anticipated as wrong while still ensuring that his plans to elicit would, in fact, uncover whether students held the ideas or not. Therefore, AJ’s challenge in planning to respond may, in part, have been that he was also working to shift his interpreting practice at the same time. This hypothesis about why planning to respond appeared more challenging for AJ aligns with Heritage and colleagues’ (2009) description of interpreting as “pivotal” (p. 47)
to the success of responding when engaged in formative assessment. The hypothesis also aligns with findings from Furtak’s (2012) study of teachers’ formative assessment practices. While teachers in her study began eliciting a broader range of student ideas, they struggled to respond to these ideas in responsive ways during instruction. Additionally, they continued to demonstrate use of a ‘gets it/doesn’t get it’ interpreting lens throughout the study. In this case, teachers’ maintaining the same interpreting practice may have been part of the reason why their responding practice did not change. Use of the FMLP (5.A.3) During this study, AJ used the FMLP in two ways: as a list of potential student ideas and as a list of bridges between the student ideas. In the sub-sections below, I describe how these uses align with the literature on teachers’ use of LPs to inform instruction. LP Use as a List of Ideas (5.A.3.a). AJ’s primary use of the FMLP was as an unordered list of potential student ideas, which is consistent with the literature in three ways. First, use of the FMLP as a ‘list of ideas’ makes sense because the FMLP specifically highlights common student ideas within each of its four levels. Other forms of LPs focus less on students’ ideas (Jin et al., 2019), such as those highlighting common patterns in student thinking that can apply to multiple topics (e.g., Jin & Anderson, 2012) and those including a logical organization of canonical concepts (e.g., Furtak, 2012). Therefore, had AJ been using a different type of LP, I may not have observed as much use of it as a ‘list of ideas.’ Second, AJ used the FMLP as an unordered ‘list of ideas.’ Using ideas from an LP without attention to their level has been reported in several studies of teachers’ use of the LP. For example, both Alonzo and colleagues (2022) and Furtak (2012) reported teacher use of LP-aligned assessment items to elicit student ideas. AJ also used LP-aligned assessment items for the same purpose. In these cases, teachers simply wanted to know if students held ideas from the LP without necessarily attending to the LP level of the ideas. Third, AJ used the FMLP as a ‘list of ideas’ when engaging in responsive planning practices in less and more responsive ways. For example, when planning initially (less responsive), AJ aimed to “destroy the impetus idea” with his planned response. In this case, impetus was an idea highlighted in the FMLP at both levels 2 and 3. Despite its position in the middle levels of the FMLP, it was not treated as a valuable idea in any way. Others have observed similar treatment of the ideas highlighted by an LP. For example, teachers in Furtak (2012) described hoping to “squash” ideas highlighted in the study’s LP with their instruction (p. 1195). Furtak, therefore, worried that the LP may have reinforced the use of a ‘right/wrong’ perspective for interpreting. AJ continued to use the FMLP as a ‘list of ideas’ even when his planning practices shifted toward being more responsive. For example, when planning to elicit, AJ anticipated whether and worked to ensure that some of the ideas from the FMLP would be elicited if his students held the ideas. Furtak, Circa and Heredia (2018) similarly observed this more responsive use of an LP as a ‘list of ideas’ to inform planning to elicit.
In their longitudinal study of teachers’ use of an LP to inform instruction, Furtak and colleagues found that, over time, teachers came to ask more questions that elicited student thinking when aided by an LP and LP-related PD. AJ’s use of the FMLP as a ‘list of ideas’ for both more and less responsive ways of planning suggests that an LP may be helpful to teachers making this transition in practice, which Furtak and Heredia (2014) similarly hypothesized. LP Use as a List of Bridges (5.A.3.b). AJ also reported using the FMLP as a list of ‘bridges’ between the commonly held student ideas it contains. AJ reported this use when planning to respond with attention to student ideas. Using the ‘bridges between ideas’ within the FMLP aligns with a hypothesized use of an LP as an instructional pathway (Yin et al., 2014). Under this hypothesis, a teacher would plan instruction to address student ideas using an LP as a roadmap, following the sequence of ‘bridges’ between levels of ideas. Starting with ideas at the lowest level, teachers would try to ‘bridge’ the gap between level 1 and level 2 with their planned instruction. This would be followed by planned instruction informed by the ‘bridge’ between levels 2 and 3, and so on. During my study, AJ reported using the ‘bridges,’ but not the specific sequencing of the ‘bridges,’ found within the FMLP when planning to respond. In other words, AJ used the FMLP to develop his own instructional pathway, rather than using the specific roadmap through potential student ideas laid out in the FMLP as suggested in the literature. Limitations (5.A.4) There are limitations to this study in terms of its generalizability. These limitations are related to the design of the study and my positionality, as AJ’s trusted friend and as a researcher invested in support for teacher responsiveness. This study involves only one teacher (limit from design) who received PD support from a trusted friend (limit from my positionality) as he intentionally began to engage in responsive planning. The trust and respect in our relationship allowed me to push on his practice during planning meetings with less concern about preserving our relationship. However, there may have been moments when I did not push as hard for fear of being too critical of a friend. Therefore, this study offers one possible way that a teacher may shift his planning practices toward responsiveness under fairly ideal conditions. Since most teachers would not be receiving PD from a friend, the findings of this study may not generalize to responsive planning under different PD conditions. Additionally, AJ may have been more willing to try out new ways of planning because he wanted to be supportive of my dissertation. He may not have made the same shifts in planning had his PD support come from someone else, especially someone he trusted less. Further, I cannot make claims about shifts in AJ’s practice in units beyond F&M and without support. His practice may look different for a different topic of instruction or without any support. Nor can I make claims about whether changes in AJ’s responsive planning practices during the F&M unit persisted beyond the year of the study. Therefore, my findings cannot be generalized to AJ’s planning practice broadly. Instead of generalization, this study offers a ‘what’s possible’ contribution to the literature. My positionality offered both affordances and constraints for the trustworthiness of my analyses.
My relationship with AJ afforded me greater insight into his teaching practice, especially his initial engagement in responsive planning practices. However, I do have an interest in supporting teachers’ planning responsiveness, particularly for a teacher I know well, which may have biased my interpretation of the data. To account for this potential bias, fellow doctoral students assisted me in the coding process and in reviewing my interpretations of the data. Implications and Recommendations (5.B) This study provides evidence of how a teacher increased the responsiveness of his planning practices and of how he used an LP while doing so. Based on my findings and my experience being the PD provider for AJ during the study, I discuss several implications and recommendations for teacher educators and PD providers. I end with my goals for future research. Dimensions of Responsiveness Tool Recommendation (5.B.1) Based on the utility of the three separate dimensions of responsiveness (i.e., discourse, attention, and student role) for identifying subtle ways that AJ increased the responsiveness of his planning, it may be useful to incorporate these three dimensions of responsiveness into a tool for analyzing teacher lesson plans. Such a tool might look similar to the coding guide used in this study, but with examples of how other teachers (like AJ) made subtle adjustments (e.g., a shift from low to medium) in the responsiveness of their planned instruction. This tool might be used by teacher educators or PD providers to evaluate lesson plans and inform the feedback they provide (see 5.B.3). The tool may be useful for teachers as well. Teachers could use the tool to evaluate their own or sample lesson plans. Teachers could also use the tool as a guide during planning. For example, if teachers felt their lesson plans were low in a particular dimension, they could make use of the example shifts as potential ideas to use in their own planning. By focusing on one dimension at a time, teachers may feel the shifts are more manageable and, therefore, may be able to make small shifts toward responsiveness in their planning. Over time, these small shifts can add up to significant changes in the responsiveness of their planning, and, thus, their instructional practice. Using a tool with the three separate dimensions of responsiveness directly with teachers may be a form of support that can help focus teachers’ planning decisions on ways that produce responsive plans. LP Tool Recommendation (5.B.2) An LP may be a useful tool for teachers to use as they begin to engage in responsive planning. However, PD support may be needed to help teachers use this tool as they transition from less to more responsive planning. Sometimes the introduction of a new teaching tool may demand a shift in practice before the tool proves usable, but AJ was able to use the FMLP without much PD at the start of the study. However, his use was consistent with his original, less responsive planning practices. With PD on responsive instruction (anticipating, eliciting, interpreting and responding), AJ was able to use the FMLP in more responsive ways. Therefore, an LP may be useful to teachers making the transition from less to more responsive planning because the tool can transition with them. Lessons Learned and Recommendations from the PD Provider Perspective (5.B.3) A true analysis of the PD I provided to AJ was beyond the scope of this study.
However, in this section I share two lessons learned and recommendations from my experience as AJ's PD provider, with the hope that these are useful to other PD providers and teacher educators. It likely comes as no surprise that the overarching theme of my lessons and recommendations is that it is important to be responsive to a teacher's current and changing practice when providing support.
My first 'lesson learned' was that sometimes my suggestions for AJ were too far removed from his current practice. In other words, my suggestions were too big of a leap for AJ, given his current practice. For example, I was aiming for AJ to have discussions with his students when he was still providing lectures as his main responding strategy. Luckily, AJ was open about his struggles with shifting away from lecturing. Therefore, I could change my suggestions to something a little closer to his current practice, but still a shift toward responsiveness. Continuing the example above, I then suggested that AJ continue lecturing but include the disciplinary connections he saw in his students' ideas within the lecture, thereby increasing the attention to student thinking during instruction. In this case, my suggestions to AJ shifted from a focus on the end goal (analogous to AJ's initial focus on 'correct knowledge') to starting with AJ's current practice as the foundation from which to shift his practice toward responsiveness. Therefore, my recommendation is two-fold. First, consider that a lack of shift or uptake of suggestions may simply be evidence that the suggestions are asking too much of the teacher and may need to be broken into smaller steps toward the goal (as opposed to seeing this as a deficit of the teacher). Second, a tool like the dimensions of responsiveness tool described above may be useful in helping PD providers and teacher educators shape suggestions that are closer to a teacher's current practice but still push the teacher toward responsiveness. For example, if a teacher is currently low in all three dimensions of responsiveness, suggestions may focus on just one of the dimensions and aim to shift toward a medium level of responsiveness, rather than asking the teacher to shift two or three dimensions with a jump from low to high responsiveness. By doing so, the support offered attends to and builds on (i.e., is more responsive to) the teacher's current practice.
My second 'lesson learned' comes from the success of reframing student ideas as hypotheses as a way to support a shift in AJ's practice. AJ initially focused his plans to elicit on correct knowledge. It was almost as if AJ couldn't imagine a reason for eliciting 'incorrect' knowledge from students. However, as soon as I reframed students' ideas as hypotheses, eliciting student ideas that might be wrong had value. To AJ, hypotheses are not necessarily meant to be correct. They are just predictions based on current understanding that are meant to be tested. By drawing on AJ's understanding of 'hypotheses,' a construct that AJ was very familiar with as a science teacher, my support provided a purpose for eliciting that AJ couldn't imagine without the reframing of student ideas. Therefore, my recommendation, which is not new or unique, is to draw on ideas or constructs that teachers are very familiar with (e.g., hypotheses) as a way to shift their thinking about a construct they are struggling to view in a new way (e.g., student ideas).
In other words, analogies, similes, and metaphors may be very useful in helping teachers reframe the way they think about teaching and learning. Again, by accessing and building on what teachers already know, the support provided is responsive to teachers' current understandings.
Implications and Recommendation from my Relationship with AJ (5.B.4)
The PD I provided to AJ during the study played a critical role in how AJ made shifts in his planning practice. In some cases, the PD supported AJ in changing his practice (e.g., the shift in AJ's plans to elicit after I suggested that AJ think of student ideas as hypotheses). In other cases, the PD aimed to maintain AJ's new practices in the face of challenges (e.g., suggesting that AJ plan to elicit again when facing uncertainty about student thinking). My suggestions, though, were very much responsive (or tailored) to AJ, both to where his planning practice was at the time and to what I knew about AJ as a teacher and a person. Our shared history as colleagues and friends meant that we already had many common points of reference. Therefore, I could say things like 'what if you tried it like that one activity you used to do, but just changed part X' and AJ would be able to follow my train of thought. It also meant I already had a sense of what was important to AJ as a science teacher and could use this knowledge to gauge what kinds of suggestions AJ would likely find useful. My relationship with AJ, built up over years of working together and being friends, was critical to my ability to play the PD provider role during the study. Educational researchers often play the additional role of PD provider to participants, yet they may not have the same relationship with practicing teachers as I did with AJ. Because educational researchers are often considered outsiders to schools, their ability to build the level of trust needed for teachers to feel comfortable sharing their practice and seeking instructional guidance is limited. Additionally, teachers are often pulled in many different directions during the school year, leaving little time to build relationships with outside PD providers. Therefore, I believe it may be useful for educational researchers to turn their attention to building relationships with and supporting instructional coaches, rather than teachers, as they work to shift classroom practice. An instructional coach is a relatively new type of position; people in these positions are hired to provide ongoing PD to teachers in a district in a particular subject area (e.g., science, literacy). It will therefore be important for instructional coaches to build relationships with teachers in their district so they might provide the kind of responsive instructional support critical to shifting teacher practice. Educational researchers may offer a unique support for instructional coaches. Because the positions are relatively new, instructional coaches are less likely to be in community with each other (across districts or regions) to discuss and develop ways to support practicing teachers. Educational researchers, because of their experience as PD providers and teacher educators, may be well positioned to support a community of instructional coaches and, indirectly, influence change in teacher practice.
Future Research (5.B.5)
Absent from my findings in this study is an analysis of AJ's responsiveness through an equity lens. The students in AJ's classes were mostly middle-class and white.
However, paying attention to the experiences of the few students of color and students of lower socio-economic status is likely to reveal a lack of responsiveness to the lived experiences of these students. In my future work, I hope to turn an eye to cultural responsiveness and how this might align with and/or offer challenges to being responsive to students' emerging science ideas.
REFERENCES
Abd-El-Khalick, F., Boujaoude, S., Duschl, R., Lederman, N. G., Mamlok-Naaman, R., Hofstein, A., Niaz, M., Treagust, D., & Tuan, H.-L. (2004). Inquiry in science education: International perspectives. Science Education, 88(3), 397–419. https://doi.org/10.1002/sce.10118
Achieve, & NextGenStorylines. (2016). Using phenomena in NGSS-designed lessons and units. In STEM Teaching Tools Initiative, Institute for Science + Math Education. University of Washington. http://creativecommons.org/licenses/by/3.0/
Alonzo, A. C. (2012). Learning progressions: Significant promise, significant challenge. Zeitschrift für Erziehungswissenschaft, 15(1), 95–109. https://doi.org/10.1007/s11618-012-0253-4
Alonzo, A. C., & Elby, A. (2019). Beyond empirical adequacy: Learning progressions as models and their value for teachers. Cognition and Instruction, 37(1), 1–37. https://doi.org/10.1080/07370008.2018.1539735
Alonzo, A. C., & Steedle, J. T. (2009). Developing and assessing a force and motion learning progression. Science Education, 93(3), 389–421. https://doi.org/10.1002/sce.20303
Alonzo, A. C., Wooten, M. M., & Christensen, J. (2022). Learning progressions as a simplified model: Examining teachers' reported uses to inform classroom assessment practices. Science Education, 106, 852–889. https://doi.org/10.1002/sce.21713
Andersson, C., & Palm, T. (2016). The impact of formative assessment on student achievement: A study of the effects of changes to classroom practice after a comprehensive professional development programme. Learning and Instruction, 49, 92–102. https://doi.org/10.1016/j.learninstruc.2016.12.006
Ball, D. L. (1993). With an eye on the mathematical horizon: Dilemmas of teaching elementary school mathematics. The Elementary School Journal, 93(4), 373–397.
Bell, B., & Cowie, B. (2001). The characteristics of formative assessment in science education. Science Education, 85(5), 536–553.
Bennett, R. E. (2011). Formative assessment: A critical review. Assessment in Education: Principles, Policy & Practice, 18(1), 5–25. https://doi.org/10.1080/0969594X.2010.513678
Bergan, J. R., Sladeczek, I. E., Schwarz, R. D., & Smith, A. N. (1991). Effects of a measurement and planning system on kindergartners' cognitive development and educational programming. American Educational Research Journal, 28(3), 683–714.
Black, P., & Wiliam, D. (1998a). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7–74. https://doi.org/10.1080/0969595980050102
Black, P., & Wiliam, D. (1998b). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80(2), 139–148.
Borko, H., & Livingston, C. (1989). Cognition and improvisation: Differences in mathematics instruction by expert and novice teachers. American Educational Research Journal, 26(4), 473–498.
Borko, H., Livingston, C., & Shavelson, R. J. (1990). Teachers' thinking about instruction. Remedial and Special Education, 11(6), 40–49. https://doi.org/10.1177/074193259001100609
Briggs, D. C., Alonzo, A. C., Schwab, C., & Wilson, M. (2006). Diagnostic assessment with ordered multiple-choice items. Educational Assessment, 11(1), 33–63. https://doi.org/10.1207/s15326977ea1101_2
Campbell, T., Schwarz, C. V., & Windschitl, M. (2016). What we call misconceptions may be necessary stepping-stones toward making sense of the world. Science and Children, 53(7). https://doi.org/10.2505/4/sc16_053_07_28
Carpenter, T. P., Fennema, E., Peterson, P. L., Chiang, C.-P., & Loef, M. (1989). Using knowledge of children's mathematics thinking in classroom teaching: An experimental study. American Educational Research Journal, 26(4), 499–531.
Cartier, J. L., Smith, M. S., Stein, M. K., & Ross, D. K. (2013). 5 practices for orchestrating productive task-based discussions in science. NSTA Press.
Chazan, D., & Schnepp, M. (2002). Methods, goals, beliefs, commitments, and manner in teaching: Dialogue against a calculus backdrop. Social Constructivist Teaching, 9, 171–195.
Christensen, J., & Alonzo, A. C. (2018). Teachers' use of a learning progression to inform planned instruction. NARST Annual International Conference.
Clark, C. M., & Peterson, P. L. (1986). Teachers' thought processes. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 255–296). Macmillan.
Coffey, J. E., Hammer, D., Levin, D. M., & Grant, T. (2011). The missing disciplinary substance of formative assessment. Journal of Research in Science Teaching, 48(10), 1109–1136. https://doi.org/10.1002/tea.20440
Colson, M., & Colson, R. (2016). Planning NGSS-based instruction: Where do you start? Science and Children, 53(6), 51–53.
Dewey, J. (1916). Democracy and education: An introduction to the philosophy of education. Free Press.
Driver, R., Asoko, H., Leach, J., Scott, P., & Mortimer, E. (1994). Constructing scientific knowledge in the classroom. Educational Researcher, 23(7), 5–12. https://doi.org/10.3102/0013189X023007005
Duncan, R. G., & Hmelo-Silver, C. E. (2009). Learning progressions: Aligning curriculum, instruction, and assessment. Journal of Research in Science Teaching, 46(6), 606–609. https://doi.org/10.1002/tea.20316
Dyson, A. H., & Genishi, C. (2005). On the case: Approaches to language and literacy research. Teachers College Press.
Edmunds, L. (2011). The planning processes of teachers in high-achieving schools: Case studies of six tenth grade English teachers (Publication No. 3475513) [Doctoral dissertation, Azusa Pacific University]. UMI Dissertation Publishing.
Elby, A., Luna, M. J., Robertson, A. D., Levin, D. M., & Richards, J. (2020). Framing analysis lite: A tool for teacher educators. The Interdisciplinarity of the Learning Sciences, 14th International Conference of the Learning Sciences (ICLS), 4, 2085–2092.
Empson, S. B., & Jacobs, V. R. (2008). Learning to listen to children. In D. Tirosh & T. Wood (Eds.), Tools and processes in mathematics teacher education (pp. 257–281). Sense Publishers. https://doi.org/10.4324/9780203392621-11
Fennema, E., Carpenter, T. P., Franke, M. L., Levi, L., Jacobs, V. R., & Empson, S. B. (1996). A longitudinal study of learning to use children's thinking in mathematics instruction. Journal for Research in Mathematics Education, 27(4), 403–434.
Fennema, E., Franke, M. L., Carpenter, T. P., & Carey, D. A. (1993). Using children's knowledge in instruction. American Educational Research Journal, 30(3), 555–583.
Flyvbjerg, B. (2011). Case study. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (4th ed., pp. 301–316). Sage Publishing.
Furtak, E. M. (2012). Linking a learning progression for natural selection to teachers' enactment of formative assessment. Journal of Research in Science Teaching, 49(9), 1181–1210. https://doi.org/10.1002/tea.21054
Furtak, E. M., Bakeman, R., & Buell, J. Y. (2018). Developing knowledge-in-action with a learning progression: Sequential analysis of teachers' questions and responses to student ideas. Teaching and Teacher Education, 76, 267–282. https://doi.org/10.1016/j.tate.2018.06.001
Furtak, E. M., Circi, R., & Heredia, S. C. (2018). Exploring alignment among learning progressions, teacher-designed formative assessment tasks, and student growth: Results of a four-year study. Applied Measurement in Education, 31(2), 143–156. https://doi.org/10.1080/08957347.2017.1408624
Furtak, E. M., & Heredia, S. C. (2014). Exploring the influence of learning progressions in two teacher communities. Journal of Research in Science Teaching, 51(8), 982–1020. https://doi.org/10.1002/tea.21156
Gotwals, A. W., & Birmingham, D. (2016). Eliciting, identifying, interpreting, and responding to students' ideas: Teacher candidates' growth in formative assessment practices. Research in Science Education, 46(3), 365–388. https://doi.org/10.1007/s11165-015-9461-2
Gotwals, A. W., Philhower, J., Cisterna, D., & Bennett, S. (2015). Using video to examine formative assessment practices as measures of expertise for mathematics and science teachers. International Journal of Science and Mathematics Education, 13(2), 405–423. https://doi.org/10.1007/s10763-015-9623-8
Grant, T. J., Kline, K., Crumbaugh, C., Kim, O.-K., & Cengiz, N. (2009). How can curriculum materials support teachers in pursuing student thinking during whole-group discussions? In J. T. Remillard, B. A. Herbel-Eisenmann, & G. M. Lloyd (Eds.), Mathematics teachers at work: Connecting curriculum materials and classroom instruction (pp. 103–117). Routledge.
Gunckel, K. L., Covitt, B. A., & Salinas, I. (2018). Learning progressions as tools for supporting teacher content knowledge and pedagogical content knowledge about water in environmental systems. Journal of Research in Science Teaching, 55(9), 1339–1362. https://doi.org/10.1002/tea.21454
Hammer, D. (1997). Discovery learning and discovery teaching. Cognition and Instruction, 15(4), 485–529. https://doi.org/10.1207/s1532690xci1504_2
Hammer, D., & van Zee, E. H. (2006). Seeing the science in children's thinking: Case studies of student inquiry in physical science. Heinemann.
Heritage, M. (2007). Formative assessment: What do teachers need to know and do? Phi Delta Kappan, 89(2), 140–145.
Heritage, M., Kim, J., Vendlinski, T., & Herman, J. (2009). From evidence to action: A seamless process in formative assessment? Educational Measurement: Issues and Practice, 28(3), 24–31.
Howe, C., & Abedin, M. (2013). Classroom dialogue: A systematic review across four decades of research. Cambridge Journal of Education, 43(3), 325–356. https://doi.org/10.1080/0305764X.2013.786024
Jin, H., & Anderson, C. W. (2012). A learning progression for energy in socio-ecological systems. Journal of Research in Science Teaching, 49(9), 1149–1180. https://doi.org/10.1002/tea.21051
Jin, H., Mikeska, J. N., Hokayem, H., & Mavronikolas, E. (2019). Toward coherence in curriculum, instruction, and assessment: A review of learning progression literature. Science Education, 103(5), 1206–1234. https://doi.org/10.1002/sce.21525
Joyce, B. (1978). Toward a theory of information processing in teaching. Educational Research Quarterly, 3(4), 66–77.
Kademian, S. M., & Davis, E. A. (2018). Supporting beginning teacher planning of investigation-based science discussions. Journal of Science Teacher Education, 29(8), 712–740. https://doi.org/10.1080/1046560X.2018.1504266
Kang, H., & Anderson, C. W. (2015). Supporting preservice science teachers' ability to attend and respond to student thinking by design. Science Education, 99(5), 863–895. https://doi.org/10.1002/sce.21182
Katchevich, D., Hofstein, A., & Mamlok-Naaman, R. (2013). Argumentation in the chemistry laboratory: Inquiry and confirmatory experiments. Research in Science Education, 43(1), 317–345. https://doi.org/10.1007/s11165-011-9267-9
Kennedy, M. M. (2006). Knowledge and vision in teaching. Journal of Teacher Education, 57(3), 205–211. https://doi.org/10.1177/0022487105285639
Krajcik, J., Codere, S., Dahsah, C., Bayer, R., & Mun, K. (2014). Planning instruction to meet the intent of the Next Generation Science Standards. Journal of Science Teacher Education, 25(2), 157–175. https://doi.org/10.1007/s10972-014-9383-2
Lampert, M. (2001). Teaching problems and the problems of teaching. Yale University Press.
Larkin, D. (2012). Misconceptions about "misconceptions": Preservice secondary science teachers' views on the value and role of student ideas. Science Education, 96(5), 927–959. https://doi.org/10.1002/sce.21022
Lehrer, R., & Schauble, L. (2015). Learning progressions: The whole world is NOT a stage. Science Education, 99(3), 432–437. https://doi.org/10.1002/sce.21168
Lemke, J. L. (1990). Talking science: Language, learning and values. Ablex Publishing.
Levin, D. M., Hammer, D., & Coffey, J. E. (2009). Novice teachers' attention to student thinking. Journal of Teacher Education, 60(2), 142–154. https://doi.org/10.1177/0022487108330245
Levrini, O., Fantini, P., Tasquier, G., Pecori, B., & Levin, M. (2015). Defining and operationalizing appropriation for science learning. Journal of the Learning Sciences, 24(1), 93–136. https://doi.org/10.1080/10508406.2014.928215
Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. SAGE Publications.
Lineback, J. E. (2016). Methods to assess teacher responsiveness in situ. In A. D. Robertson, R. E. Scherr, & D. Hammer (Eds.), Responsive teaching in science and mathematics (pp. 203–226). Routledge.
Mangiante, E. S. (2018). Planning for reform-based science: Case studies of two urban elementary teachers. Research in Science Education, 48(1), 207–232. https://doi.org/10.1007/s11165-016-9566-2
Maskiewicz, A. C. (2016). Navigating the challenges of teaching responsively: An insider's perspective. In A. D. Robertson, R. E. Scherr, & D. Hammer (Eds.), Responsive teaching in science and mathematics (pp. 105–125). Routledge.
McManus, S. (2008). Attributes of effective formative assessment. Council of Chief State School Officers, Washington, DC, 1–6.
Mehan, H. (1979). Learning lessons: Social organization in the classroom. Harvard University Press.
Minstrell, J., Anderson, R., & Li, M. (2011). Building on learner thinking: A framework for assessment in instruction. Commissioned paper for the Committee on Highly Successful STEM Schools or Programs for K-12 STEM Education.
Minstrell, J., & van Zee, E. H. (Eds.). (2000). Inquiring into inquiry learning and teaching in science. American Association for the Advancement of Science.
National Research Council. (1999). How people learn: Brain, mind, experience and school (J. D. Bransford, A. L. Brown, & R. R. Cocking, Eds.). National Academy Press.
National Research Council. (2007). Taking science to school: Learning and teaching science in grades K-8. The National Academies Press. https://doi.org/10.17226/11625
National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. The National Academies Press.
National Research Council. (2015). Guide to implementing the Next Generation Science Standards. The National Academies Press. https://doi.org/10.17226/18802
NGSS Lead States. (2013). Next Generation Science Standards: For states, by states. The National Academies Press.
NSTA. (2017). About the Next Generation Science Standards. NGSS@NSTA. https://ngss.nsta.org/about.aspx
Otero, V. K., & Nathan, M. J. (2004). Elementary pre-service teachers' conceptions of student prior knowledge. Physics Education Research Conference Proceedings, 141–144. http://proceedings.aip.org/proceedings/cpcr.jsp
Otero, V. K., & Nathan, M. J. (2008). Preservice elementary teachers' views of their students' prior knowledge of science. Journal of Research in Science Teaching, 45(4), 497–523. https://doi.org/10.1002/tea.20229
Penuel, W. R., & Bell, P. (2016). Qualities of a good anchor phenomenon for a coherent sequence of science lessons. In STEM Teaching Tools Initiative, Institute for Science + Math Education. University of Washington.
Pierson, J. L. (2008). The relationship between patterns of classroom discourse and mathematics learning [Doctoral dissertation]. The University of Texas at Austin, Austin, TX.
Richards, J., Elby, A., Luna, M. J., Robertson, A. D., Levin, D. M., & Nyeggen, C. G. (2020). Reframing the responsiveness challenge: A framing-anchored explanatory framework to account for irregularity in novice teachers' attention and responsiveness to student thinking. Cognition and Instruction, 38(2), 116–152. https://doi.org/10.1080/07370008.2020.1729156
Richards, J., Johnson, A., & Gillespie Nyeggen, C. (2015). Inquiry-based science and the next generation science standards: A magnetic attraction. Science and Children, 52(6), 54–58. https://doi.org/10.2505/4/sc15_052_06_54
Richards, J., & Robertson, A. D. (2016). A review of the research on responsive teaching in science and mathematics. In A. D. Robertson, R. E. Scherr, & D. Hammer (Eds.), Responsive teaching in science and mathematics (pp. 36–55). Routledge.
Robertson, A. D., & Atkins Elliott, L. J. (2020). Truth, success, and faith: Novice teachers' perceptions of what's at risk in responsive teaching in science. Science Education, 104(4), 736–761. https://doi.org/10.1002/sce.21568
Robertson, A. D., Atkins, L. J., Levin, D. M., & Richards, J. (2016). What is responsive teaching? In A. D. Robertson, R. E. Scherr, & D. Hammer (Eds.), Responsive teaching in science and mathematics (pp. 1–35). Routledge.
Rop, C. J. (2002). The meaning of student inquiry questions: A teacher's beliefs and responses. International Journal of Science Education, 24(7), 717–737. https://doi.org/10.1080/09500690110095294
Ruiz-Primo, M. A., & Furtak, E. M. (2007). Exploring teachers' informal formative assessment practices and students' understanding in the context of scientific inquiry. Journal of Research in Science Teaching, 44(1), 57–84. https://doi.org/10.1002/tea.20163
Shavelson, R. J., & Kurpius, A. (2012). Reflections on learning progressions. In A. C. Alonzo & A. W. Gotwals (Eds.), Learning progressions in science: Current challenges and future directions. Sense Publishers.
Shavelson, R. J., Young, D. B., Ayala, C. C., Brandon, P. R., Furtak, E. M., Ruiz-Primo, M. A., Tomita, M. K., & Yin, Y. (2008). On the impact of curriculum-embedded formative assessment on learning: A collaboration between curriculum and assessment developers. Applied Measurement in Education, 21(4), 295–314. https://doi.org/10.1080/08957340802347647
Shepard, L. A. (2009). Commentary: Evaluating the validity of formative and interim assessment. Educational Measurement: Issues and Practice, 28(3), 32–37.
Smith, P. S. (2020). What does a national survey tell us about progress toward the vision of the NGSS? Journal of Science Teacher Education, 31(6), 601–609. https://doi.org/10.1080/1046560X.2020.1786261
Stahnke, R., Schueler, S., & Roesken-Winter, B. (2016). Teachers' perception, interpretation, and decision-making: A systematic review of empirical mathematics education research. ZDM, 48, 1–27. https://doi.org/10.1007/s11858-016-0775-y
Stake, R. E., & Trumbull, D. J. (1982). Naturalistic generalizations. Review Journal of Philosophy & Social Science, 7(1/2), 1–12.
Tang, X., Coffey, J. E., Elby, A., & Levin, D. M. (2009). The scientific method and scientific inquiry: Tensions in teaching and learning. Science Education, 94(1), 29–47. https://doi.org/10.1002/sce.20366
Thompson, J., Hagenah, S., Kang, H., Stroupe, D., Braaten, M., Colley, C., & Windschitl, M. (2016). Rigor and responsiveness in classroom activity. Teachers College Record, 118(5).
van Es, E. A., & Sherin, M. G. (2002). Learning to notice: Scaffolding new teachers' interpretations of classroom interactions. Journal of Technology and Teacher Education, 10(4), 571–596.
van Es, E. A., & Sherin, M. G. (2021). Expanding on prior conceptualizations of teacher noticing. ZDM - Mathematics Education, 53(1), 17–27. https://doi.org/10.1007/s11858-020-01211-4
von Aufschnaiter, C., & Alonzo, A. C. (2018). Foundations of formative assessment: Introducing a learning progression to guide preservice physics teachers' video-based interpretation of student thinking. Applied Measurement in Education, 31(2), 113–127. https://doi.org/10.1080/08957347.2017.1408629
Wiser, M., Smith, C. L., & Doubler, S. (2012). Learning progressions as tools for curriculum development. In A. C. Alonzo & A. W. Gotwals (Eds.), Learning progressions in science: Current challenges and future directions (pp. 359–403). Sense Publishers. https://doi.org/10.1007/978-94-6091-824-7_16
Wooten, M., Alonzo, A. C., & Christensen, J. (2019). What teachers find useful about learning progressions for classroom formative assessment practices. AERA Annual Conference.
Yin, Y., Tomita, M. K., & Shavelson, R. J. (2014). Using formal embedded formative assessments aligned with a short-term learning progression to promote conceptual change and achievement in science. International Journal of Science Education, 36(4), 531–552. https://doi.org/10.1080/09500693.2013.787556
Zhai, X., Li, M., & Guo, Y. (2018). Teachers' use of learning progression-based formative assessment to inform teachers' instructional adjustment: A case study of two physics teachers' instruction. International Journal of Science Education, 40(15), 1832–1856. https://doi.org/10.1080/09500693.2018.1512772
APPENDIX A: FORCE & MOTION LEARNING PROGRESSION (FMLP)
Table 13
Force and Motion Learning Progression (FMLP)

Level 4
Description:
● Net force applied to an object is proportional to its resulting acceleration (change in speed or direction); the net force may not be in the direction of motion.
Force: If there is a non-zero net force acting on an object, it will accelerate.
No Force: If there is no net force acting on an object, it will move with constant velocity (including zero velocity, i.e., at rest).
Motion: If an object is accelerating, a non-zero net force is acting on it. If an object is moving with constant velocity, no net force is acting on it.
No Motion: If an object is not moving, the net force acting on the object is zero – unless the object has a zero instantaneous velocity (as its velocity changes), in which case a net force is acting on the object (to change its velocity).

Level 3
Description:
● An object is stationary either because there are no forces acting on it or because there is no net force acting on it.
● An object's speed (rather than its acceleration) is proportional to the net force in the direction of its motion.
● Objects may be moving even when no forces are being applied; however, objects cannot continue moving indefinitely without an applied force.
● There may be forces acting on an object that are not in the direction of its motion; however, an object cannot be moving in a direction different from that of the net force.
Force: If there is a non-zero net force acting on an object, it will move with constant velocity. (3A): The force that put the object into motion initially contributes to the net force.
No Force: If there is no net force acting on an object, it is either slowing down or stopped (i.e., at rest). (3A): Zero net force could result from opposing forces coming into balance (e.g., through dissipation of the force that put the object into motion initially).
Motion: If an object is moving with a constant velocity, a non-zero net force is acting on it. If an object is slowing down, no net force is acting on it. (3A): The force that put the object into motion initially contributes to the net force.
No Motion: If an object is not moving, the net force acting on the object is zero. (3A): Zero net force could result from opposing forces coming into balance (e.g., through dissipation of the force that put the object into motion initially).

Level 2
Description:
● Motion implies a force in the direction of motion; non-motion implies no force.
● Force implies motion in the direction of the force.
Force: If a force is acting on an object, it is moving in the direction of the force. (2A): The force could be the force that put the object into motion initially (which is carried with the object and may dissipate over time).
No Force: If no force is acting on an object, it is not moving.
Motion: If an object is moving, a force is acting on it in the direction of its motion. (2A): The force could be the force that put the object into motion initially (which is carried with the object and may dissipate over time). The object may come to rest once this force has been used up.
No Motion: If an object is not moving, no force is acting on it.

Level 1
Description:
● Force is a push or a pull that may or may not involve motion.
● Force is an internal property of objects related to their weight.
● Forces prevent the natural movement of objects (e.g., gravity prevents objects from flying off into space).
Force: If a force is acting on an object, it is moving unless the object is too heavy to be moved.

Note. Adapted from Alonzo and Steedle (2009).
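For reference, the Level 4 row corresponds to the canonical Newtonian relationship between net force and motion. The following compact restatement, in standard notation, is my addition and is not part of the published FMLP:

\[
\vec{F}_{\text{net}} = m\vec{a}, \qquad \vec{a} = \frac{d\vec{v}}{dt}
\]

That is, an object accelerates if and only if \(\vec{F}_{\text{net}} \neq \vec{0}\), and it moves with constant velocity (including remaining at rest, \(\vec{v} = \vec{0}\)) if and only if \(\vec{F}_{\text{net}} = \vec{0}\).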
APPENDIX B: INTRODUCTION TO LEARNING PROGRESSIONS PRESENTATION SLIDES
[Presentation slides not reproduced in this text version.]

APPENDIX C: POST-UNIT REFLECTIVE CONVERSATION PROTOCOL
Post-Unit Interview Protocol - Semi-structured: Ask teacher to plan on walking through the unit together, so he can bring/make his plan-book available during the interview.

New Instructional Unit – Walk through together:
1. Can you walk me through this unit?
   a. Be sure to have teacher discuss the following for each lesson/day/activity:
      i. What would students be doing?
      ii. What would you be doing, as the teacher?
      iii. Can you tell me how you decided to break up the content in these ways? Teacher may respond in terms of 'time' (i.e., "It's about what I can fit into my 90-minute block"). If so, push teacher to think about how he knows what will 'fit' in the 90-min block.
   b. Make a note to ask for copies of the curricular materials of interest
   c. General differences…
2. What kinds of things guided your thinking when planning this unit?
   a. Which of these is the most important or salient for you?
      i. Edited due to:
         1. Friction
            a. much bigger focus because misconceptions seem tied to friction
            b. Putting friction into context, came up naturally
         2. Kinematics – dropped for now, not intending to do it
   b. Is this carrying through to the units that follow this one?
      i. Projectile motion – looked at video, no initial models…re-prompted
         1. Forces first…only gravity
         2. Did some kinematics in the vertical
         3. What does this mean for the horizontal direction?
      ii. Circular motion – no modeling, reflecting probably should have
         1. Similar order
      iii. Energy –
         1. Where does the energy come from for all that's happening in this video? Transformations
         2. Had an idea of energy conservation already
         3. Resisted an 'accounting'
      iv. Momentum –
         1. No modeling
         2. More buy-in that 'accounting' is a helpful tool
3. How would you describe the flow of content from the start of the unit to the end of the unit?
   a. If stuck:
      i. Do you think of starting with 'foundational' knowledge and moving to 'harder' or 'more advanced'?
      ii. Maybe it's better captured as starting with a 'review of ideas they've already learned' and then adding content that branches from that?
   b. Again, is that similarly true for your units that followed this one?
   c. If not addressed,
      i. How did the LP ideas factor into the planning of this unit?
         1. Awareness
         2. Didn't help with a progression through the unit
         3. Kids all over the map of the LP, challenging in a whole group setting
      ii. How did the ideas of your specific students factor into the unit?
         1. Guideposts/sign posts, content that has to eventually be covered (GPS), "recalculating"
         2. "Jump in a lake" – more in the practices than in the content
         3. Struggle with feeling like a failure
4. What kinds of goals did you have for your students in this unit?
   a. What did you want them to experience as part of the process?
   b. What kind of outcomes did you want for them?
5. And what kinds of things did you prioritize as you planned your instruction?
   a. Naturally knock down "bad" ideas
   b. Build up canonically correct experiences

Thinking on how people learn:
6. Can you describe how you saw students overcoming impetus (or learning another specific idea discussed earlier)?
7. If students struggled with this concept, what parts of the concept do you feel are difficult?
   a. …and why do you think that is?
   b. What kinds of things do you do to help them with this?

Thinking on reform efforts:
8. NGSS – phenomena
   a. Made the process possible, what does this tell us about...buy-in
9. NGSS – practices
10. Have your thoughts about the Next Generation Science Standards shifted since we started our work together?
   a. If not addressed in answer:
      i. Have there been shifts in the parts you like or don't like?
      ii. Do you feel you face the same roadblocks?
      iii. To what extent do you see NGSS as aligned or not with your views on how people learn?
11. So what does it mean to you to be responsive to student ideas?
   a. In what ways do you feel your teaching reflects this?
   b. How have your students responded to this? What effect do you think it's had on them?
   c. Are there things that you feel you've had to give up in order to teach in this way? Given the tradeoffs, why did you decide to shift in the way you did?
   d. Are there ways of being responsive you still hope to improve on? If so, can you describe these?
   e. Is there anything specific that you see as a roadblock or hindrance to this improvement?
   f. Add note from previous discussions…
12. What are your general impressions of the LPs? Are there things you like? Things you don't like?
   a. Is there anything about the LPs that was helpful to you being responsive, specifically?
   b. Were there any other ways that you found them useful? (Aside from responsiveness supports)
   c. Previously referred to the "gains" as the part that he used the most. If not addressed: You had mentioned in a previous conversation that the "gains" of the LP were part of a trio of things you thought about when planning.
   d. Can you give some specific examples of how these gains were used to plan the unit?
      i. Spurred ideas for unit planning (versus lesson planning)

What's it like to try this…
13. What were some of the challenges you faced during the unit?
   a. What do you think contributed to this?
   b. Are there ways to make this better the next time you teach the unit?
   c. Is this something that you think will just always be a challenge?
   d. You mentioned a few different things during previous conversations during the unit; do you still see XX (see below) as a challenge?
      i. Understanding how 'practice sheets' or 'the math' fits into the units
      ii. Assessment…depending on how it went
14. What were some of the successes you saw during the unit?
   a. What do you think contributed to this?
   b. Will you try to implement this in other units?
   c. You mentioned a few different things during previous conversations during the unit; do you still see XX (see below) as a success?
      i. Students being more engaged generally in the unit? (Were they engaged all the way through?)
         1. Students engaging well with the science journals? (Is this holding true still?)
      ii. Assessment…depending on how it went
15. Are there any other things from this unit that you'll carry forward into other units? These could be strategies, activities, etc.

APPENDIX D: LESSON PLAN MEMO EXAMPLE
Fall 1 - Segmentation Activity

Summary/Description of Lesson:
- Lesson: Students will break the Rube Goldberg (RG) video (phenomenon) into segments based on the motion of the objects and collate segments that are similar.
- Date: 9/5 & 9/6/2019
- Background: This is a task AJ & I did during our PD planning time to find portions of the RG's motion that were similar. Within a group of similar segments, we picked one segment to be used as an anchoring phenomenon and the other segments within the group as a set of "transfer tasks." AJ decided he wanted students to do this same activity.
Preliminary Noticings
- AJ's anticipating that students lack, &/or his interpreting that students are missing, a canonical understanding of motion terms (i.e., speed, velocity, acceleration) will cause their discussions during the small group work to be unfocused. AJ participated in a task similar to what he's asking students to do and feels his own content knowledge supported him in the completion of the activity. Together, these are fueling his desire to provide term/vocabulary instruction (less responsive framing) prior to the elicitation task of segmenting the video so that students will have the necessary knowledge to complete the task successfully.
- Additionally, while students will engage in small group discussions during the main portion/purpose of the lesson (i.e., discourse = more responsive), the two different launching options AJ is debating between shape the way the small group portion is framed in terms of attention & roles (one way = more responsive; one way = less responsive).

Overview of Codes:
- Planning to Elicit
  - Discourse - more responsive
  - Attention - more & less responsive (debate)
  - Student Role - more & less responsive (debate)
- TASK LAUNCH: Guided by-anticipated
  - Discourse - less responsive
  - Attention - less responsive
  - Student role - less responsive
- Anticipation - lack of canonical knowledge
- Interpreting - missing
  - Knowledge/Skill
- Support codes
  - Clarifying
  - Sharing perspective

Story from the Codes:
- The main portion of the lesson, the segmenting activity, is to have students work in small groups to discuss how they would segment the Rube Goldberg video into categories of similar types of motion. Therefore, this section of the lesson was coded as:
  - Planning to Elicit: Plans to have students work in small groups to segment video into similar clips based on their motion (i.e., eliciting ideas about motion)
  - Discourse for this portion of the lesson is more responsive because students will talk to each other in small groups (quote 1)
- The original plan (the one AJ & I created together during the summer PD) was to launch students into the task and have them use their own terms and understandings of motion to guide their conversation/decisions. Therefore:
  - Attention is more responsive - on students' ideas about motion
  - Role is more responsive - students contribute their own understanding to the initial models, participate in constructing knowledge
- HOWEVER, during one of our first planning meetings AJ debates between the original version that we planned together (described above) and first introducing students to motion vocabulary (i.e., speed, velocity, acceleration) prior to the small group work. The two different options for the launch actually frame the small group portion of the lesson in two different ways with regard to attention to student ideas & student roles: one way is more responsive & one way less responsive. This debate is being guided by what AJ anticipates about students' thinking.
Therefore this portion of the lesson was coded as:
  - Guided by - in this case, the anticipation of student thinking; with the following framing attached
    - Less responsive option: Provide students with definitions of motion terms (speed, velocity & acceleration) prior to initial modeling
      - Attention - implies students don't already have ideas about motion, or that because they may have differing ideas about motion, this will get in the way of/derail their conversations (see quote 1)
      - Role - students must receive information from teacher prior to an elicitation task; therefore, the role of the student is shifted to be less responsive, as the intention is for students to use vocab correctly to express their initial understanding (see quote 1)
- He anticipated that, because students lack a canonical understanding of the various ways motion can be described (i.e., speed, velocity, acceleration; quote 1 [implied]) and of what physics is (quote 3), they will need this understanding in order to be successful during the small group discussions. This was also coded as interpreting-missing.
- Additionally, AJ seems to be relying on his own knowledge of physics content in his reasoning. He basically says that he and I used our physics knowledge to support our discussions when we were segmenting the video ourselves; therefore students will need this canonical knowledge if they are to have productive discussions. (Quote 3)
- Eventually, JC is able to talk AJ out of vocab instruction (support codes):
  - JC reminds AJ that we didn't really use the ideas of constant speed/velocity or acceleration when we were segmenting the video. (Mostly we thought about horizontal v. vertical v. circular motion. And our term debate was actually about "motion" & "movement" because it was clear we were thinking differently about these terms) (Quote 5)
  - That the goal of eliciting is to find out what students actually think, because they do have ideas about motion, not whether they know canonical information/terms (Quote 4, 5 & 6)
  - Start with words kids (would) use (Quote 7)
- AJ ends up providing a kid-friendly word bank during the initial modeling activity, rather than the segmentation activity. He reflected on this during a spring semester planning meeting. (Q8)

Quotes/Evidence:
- Quote 1: "If we do the vocab instruction [on speed, velocity and acceleration] first, I think it can focus that video discussion a little bit and let them be more successful just talking to each other, and let the students be more successful talking to each other and trying to categorize the video and pick out specific similarities and differences." (2019-07-26)
- Quote 2: taking a couple of days of instruction on that and just getting that out of the way in terms of almost as a building a common vocabulary, is gonna be helpful, and just do that as its separate own little mini introduction and then hop into the cog video (2019-07-26)
- Quote 3: 'Cause I think when we did that, you and I looked at that with a pretty solid understanding of what motion is and what acceleration is and what's that... That kind of thing, and we still came up with a whole pile of categories and a whole interesting little segmentation that if they didn't have that sort of focus a little bit, they might go in 9000 different directions on it. If I'm thinking of doing it in literally day one or day two of class without any sort of basis of what is physics, what are we looking for, will they get there, I guess, was the question.
  (2019-07-26)
- Quote 4:
  Julie: But I think that the idea here literally day one or day two of class is not for what they do in this initial activity for them to be correct.
  AJ: Right. Yeah, that's true.
  Julie: And so it's more to see where they're at before you... This is just the elicitation of what they see and notice inherently. (2019-07-26)
- Quote 5:
  Julie: That would be... It does make things longer because you're doing that step first, but I think there's two things that make me push you to do the video first. 'Cause one is, they do have definitions in their brain of what motion is.
  AJ: That's true, yeah.
  Julie: And so whether they're saying speed, velocity or acceleration, we didn't do that much.
  AJ: That's right. Yeah.
  Julie: The only thing was, is it motion or is it movement? We went back and forth around that, but that didn't really help us categorize much because most of it was it going in a circle. It didn't matter if it was at a constant rate or whether it was accelerating or not, for most of it. So I think if they just go with motion and let them describe the kinds of motion that they're talking about, let them use their own words, I think the idea would be to listen for those words…
  AJ: And then…
  Julie: Especially in the beginning, and then sort of... And then do the instruction around the vocab connected to the words that they use.
  AJ: That makes sense. Yeah, alright, I'll buy that. (2019-07-26)
- Quote 6: Julie: Yeah. 'Cause acceleration's gonna be the hard one, and then that constant speed, acceleration, like those kinds of things. But most of the time, they can do things like slowing down, speeding up. And so it just gets a little funny once you start getting new vocabulary on that. (2019-07-26)
- Quote 7: It's great when they can say the words in their own... I'm sorry, say the ideas in their own words. (2019-07-26)
- Quote 8: Remember, we had to prompt with some words like giving them the word bank, almost of speeding up, slowing down, constant speed, staying the same, that kind of thing. Until we prompted that, my last [Fall] class never really dug into that [motion between cog bumps] or really cared. (2020-02-02)