ELEMENTARY SCIENCE TEACHER CANDIDATES' NOTICING AND RESPONDING TO STUDENT SENSE-MAKING THROUGH SCIENCE TALKS AND ASSESSMENTS IN A METHODS COURSE

By

Meenakshi Sharma

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

Curriculum, Teaching, and Educational Policy—Doctor of Philosophy

2018

ABSTRACT

ELEMENTARY SCIENCE TEACHER CANDIDATES' NOTICING AND RESPONDING TO STUDENT SENSE-MAKING THROUGH SCIENCE TALKS AND ASSESSMENTS IN A METHODS COURSE

By

Meenakshi Sharma

The NRC (2012) Framework and the ensuing Next Generation Science Standards (NGSS) have brought crucial shifts in the goals for students' science learning. Teachers bear the responsibility of translating the three-dimensional vision of student learning articulated in these documents into classroom instruction. Central to the idea of three-dimensional learning is the use of science phenomena for instruction and assessment, which bears the potential for engaging students in sense-making by using scientific practices and crosscutting concepts. The purpose of this study is to understand teacher candidates' noticing and responding to student sense-making in the context of phenomena-based science talks and assessments. This study involved twenty-three teacher candidates within one section of an elementary science methods course at a large Midwestern university. Data sources included audio recordings of science talks, assessment items, teacher candidates' written analysis and reflections around science talks, and students' written work in response to assessments. The data analysis focused on the nature and scope of opportunities provided for student sense-making using the phenomena-based science talks and assessments. The examination of data also focused on the nature of candidates' noticing and interpretation of students' sense-making and on their responding through suggested instructional adaptations, based on their subsequent analysis of science talks and students' written responses to assessments. The findings show that using phenomena in science talks and assessments allowed candidates to elicit and notice students' ideas regarding phenomena. However, teacher candidates struggled to design open-ended prompts aligned with the phenomena during talks and assessments. In the science talks, candidates often focused students primarily on providing descriptions of the phenomena rather than enabling students to explain the phenomena. Regarding noticing and responding with respect to assessments, findings showed a direct relationship between the nature of teacher noticing and interpretation of student sense-making and the nature of assessment prompts. In both science talks and assessments, most candidates did not probe or pay attention to students' mechanistic thinking around phenomena. Across the study, candidates struggled to respond to student sense-making and to offer suggestions that might support student sense-making of phenomena during future instruction. The findings can inform the design of targeted learning experiences for preparing teacher candidates to notice and respond to student sense-making around science phenomena, as per the vision of the Next Generation Science Standards (NGSS, 2013). The coding system developed in the dissertation can help diagnose where teacher candidates are in their practice and what experiences may help shift their attention to the desired forms of noticing and responding needed for supporting sense-making of phenomena.
To my grandparents, who sowed the seeds of education in the family

ACKNOWLEDGEMENTS

Thank you to my advisor, Dr. Christina Schwarz, for your support and guidance throughout this journey. You listened to my ideas and pushed me to think harder. I always left our meetings feeling motivated and in a positive mindset. I admire you as a scholar who is always open to sharing ideas and experiences. More than a great advisor, you are a wonderful human being with compassion and kindness. You were there for me whenever I felt down due to life circumstances. I can never thank you enough. Thank you, Amelia Gotwals, David Stroupe, and Corey Drake, for great insights and perspectives to guide my work. I am forever grateful to have worked with each one of you. Thank you to my family for the prayers and encouragement and for being an unshakable constant in my life. My father, mother, chacha and chachi, and brothers and sisters were always there to lend a hand of support in the toughest times of my life. You all gathered me when I was broken and helped me to be where I am in my life. I am fortunate to have each one of you in my life. Thank you, Urja and Ujjwala, for your sacrifices and patience while I was working. Your love gives me strength and courage each day. Thank you, Mukesh, for sharing my triumphs and tribulations, and the life and home we built together at Michigan State. It is set in time and space, forever! Thank you, my friends Elizabeth, Cori, Lora, Christa, Hima, and Vivek, for bringing the joy and support of friendship into my life. Thanks for the academic support, the study dates, and for being patient listeners. My special thanks to Vivek for being at my beck and call during the writing of this dissertation. Thank you to the course instructor and teacher candidates for allowing me to be a part of your classroom and work. I saw how each one of you strived to be good science teachers and a great teacher educator. You all inspired me for this study!

TABLE OF CONTENTS

LIST OF TABLES .......... viii
LIST OF FIGURES .......... ix
CHAPTER 1 INTRODUCTION .......... 1
Conceptual Framework .......... 5
Research Questions .......... 10
Part 1: Sense-making Conversations .......... 10
Part 2: Assessments .......... 11
Significance .......... 11
CHAPTER 2 LITERATURE REVIEW .......... 13
Attention to Student Sense-making .......... 13
Teacher Noticing and Responding: Practices to Support Student Sense-making .......... 17
Teachers' Conception and Elicitation of Student Ideas in Science Classrooms .......... 19
Teacher Candidates' Responding to Students' Ideas through Instruction .......... 23
CHAPTER 3 RESEARCH METHODOLOGY .......... 26
The Context of Teacher Preparation .......... 26
Study Participants .......... 27
Study Data .......... 29
Selecting Occasions for Teacher Candidates' Noticing and Responding .......... 30
Data Sources and Analysis .......... 31
Coding Categories .......... 33
Limitations .......... 34
CHAPTER 4 SENSE-MAKING CONVERSATIONS: TEACHER CANDIDATES' NOTICING AND RESPONDING TO SUPPORT STUDENT SENSE-MAKING OF SCIENCE PHENOMENA .......... 35
Study Data .......... 36
Sense-making Conversations .......... 36
Audio Recordings of the Conversations .......... 38
Written Analysis and Reflections after Each Conversation .......... 38
Data Analysis .......... 39
Coding of Audio-recorded Sense-making Conversations .......... 39
Coding Teacher Candidates' Written Analysis and Reflections around Sense-making Conversations .......... 40
Opportunities for Sense-making .......... 43
Driving questions .......... 46
Nature of Exchange During Conversations .......... 47
Teacher Candidates' Noticing of Student Sense-making .......... 50
Teacher Candidates' Responding to Student Sense-making .......... 54
Discussion .......... 55
Access to Students' Thinking .......... 55
Attention to Mechanistic Thinking .......... 56
The Ultimate Challenge: How to Use Student Ideas .......... 57
CHAPTER 5 HOW ELEMENTARY SCIENCE TEACHER CANDIDATES DESIGN FOR, NOTICE, AND INTERPRET STUDENT SCIENTIFIC SENSE-MAKING THROUGH WRITTEN ASSESSMENTS .......... 59
Framework for Assessment Analysis .......... 60
Context of Assessments .......... 60
Sources of Data .......... 63
Data Analysis .......... 63
Coding Assessments Tasks .......... 64
Coding Teacher Candidates' Analysis and Reflections of Student Assessment .......... 67
Patterns Across Teacher Candidates .......... 70
Assessment Tasks .......... 70
Noticing and Interpretation of Student Responses from the Assessments .......... 71
Suggesting Changes to Assessment .......... 73
Discussion of Illustrative Examples .......... 74
Example 1: Teacher Candidates with No Phenomena and Closed Assessment .......... 74
Example 2: Teacher Candidates with a Phenomenon Not Articulated with Assessment .......... 77
Example 3: Phenomenon Guided the Assessment but the Assessment Still Closed .......... 79
Example 4: Phenomenon-based Assessment and Open-ended .......... 82
Discussion .......... 86
Layers of Challenge .......... 86
Helping Teacher Candidates Pay Attention to Students' Mechanistic Thinking .......... 88
CHAPTER 6 DISCUSSION AND IMPLICATIONS .......... 90
Implications .......... 95
Future Directions .......... 97
REFERENCES .......... 99
APPENDICES .......... 106
APPENDIX A Focal Student Sense-making Science Talks and Analyses .......... 107
APPENDIX B Lesson Design and Analysis .......... 122

LIST OF TABLES

Table 3.1: Candidates' Grade Levels and Topics Across Sense-making Conversations .......... 28
Table 3.2: Aspects of Candidates' Practice under Study .......... 30
Table 3.3: Data Sources and Analysis .......... 32
Table 3.4: Opportunities for Student Sense-making .......... 33
Table 3.5: Noticing Student Sense-making .......... 34
Table 4.1: Main Prompts for the SM1, SM2, and SM3 Assignments .......... 37
Table 4.2: Data Sources and Their Coding Purpose .......... 39
Table 4.3: Coding Scheme for Sense-making Conversations .......... 42
Table 4.4: Intentions for Sense-making Conversations according to Discussion Plans .......... 44
Table 4.5: Focus of Sense-making Conversations as Chosen by Teacher Candidates .......... 45
Table 4.6: Nature of Driving Questions .......... 46
Table 4.7: Teacher Candidates' Exchange with Students .......... 48
Table 4.8: Aspects of Student Ideas Noticed by Teacher Candidates .......... 51
Table 4.9: Teacher Candidates' Interpretation of Student Sense-making around a Phenomenon .......... 53
Table 5.1: Examples and Categories of Types of Assessments .......... 66
Table 5.2: Coding Scheme for Analysis of Teacher Candidates' Assessments .......... 69
Table 5.3: Categories of Teacher Candidates Based on Phenomenon and Substance of Assessment .......... 71
Table B-1: Grading Criteria for Assessment Assignment .......... 125
Table B-2: Grading Criteria for Assessment Assignment .......... 129
Table B-3: Draft CAEP Rubric Structure .......... 134

LIST OF FIGURES

Figure 1.1: Framework of Teacher Responsiveness .......... 6
Figure 1.2: Context of Science Talks for Sense-making .......... 9
Figure 1.3: Context of Assessment for Student Sense-making .......... 10
Figure 2.1: Operationalizing of Scientific Practices for Sense-making of Phenomena .......... 14
Figure 4.1: Intentions for Sense-making Conversations according to Discussion Plans .......... 44
Figure 4.2: Focus of Sense-making Conversations as Chosen by Teacher Candidates .......... 45
Figure 4.3: Nature of Driving Questions .......... 47
Figure 4.4: Teacher Candidates' Exchange with Students .......... 49
Figure 4.5: Aspects of Student Ideas Noticed by Teacher Candidates .......... 52
Figure 4.6: Teacher Candidates' Interpretation of Student Sense-making around a Phenomenon .......... 53
Figure 5.1: Responsiveness through Assessments .......... 66
Figure 5.2: Learning Experiences for Teacher Candidates .......... 69
Figure 5.3: Assessment Item .......... 75
Figure 5.4: Samples of Student Work .......... 76
Figure 5.5: Assessment Item .......... 77
Figure 5.6: Samples of Student Work .......... 78
Figure 5.7: Assessment Item .......... 80
Figure 5.8: Samples of Student Work .......... 81
Figure 5.9: Samples of Student Work .......... 84

CHAPTER 1 INTRODUCTION

The current science education framework, A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas (National Research Council [NRC], 2012), and the ensuing Next Generation Science Standards (Next Generation Science Standards [NGSS], 2013) present a unique vision for students' science learning. Integral to this vision is an integrated three-dimensional approach to student learning, in which students engage with disciplinary core ideas (DCIs) while also participating in scientific and engineering practices (S&EPs) and crosscutting concepts (CCCs). The combination of these three dimensions enables students to make sense of the natural world around them. The integrated three-dimensional approach to student learning signifies a vision of research-based science teaching that possesses the potential for making science instruction meaningful and equitable for K-12 students. The cornerstone of the three-dimensional learning approach is attention and responsiveness to students' ideas while facilitating their sense-making of science phenomena. Schwarz, Passmore, and Reiser define sense-making as:

a conceptual process in which a learner actively engages with the natural or designed world; wonders about it; and develops, tests, and refines ideas with peers and the teacher. Sense-making is the proactive engagement in understanding the world by generating, using, and extending scientific knowledge within communities (2017, p. 6).

The current dissertation study foregrounds the stance that students engage in scientific inquiry when they are afforded an opportunity to make sense of science phenomena and engage in a set of learning opportunities that facilitate their understanding regarding why and how phenomena occur. It is crucial that sense-making opportunities support students to develop mechanistic causal scientific explanations for the phenomena (Reiser, 2013; Windschitl, Thompson, Braaten, & Stroupe, 2012). Sense-making of science phenomena is core to scientific inquiry and an important goal for students' learning of science (NRC, 2012). It is crucial for teachers to develop practices that can support students' disciplinary thinking through sense-making in science classrooms. Noticing and responding signify teaching practices that have the potential to support student sense-making of science phenomena (Hutchison & Hammer, 2010; Kang & Anderson, 2015; Levin, Grant, & Hammer, 2012). The research literature shows that teacher noticing of and attention to student ideas is critical for students' science learning, because students' approach to their own learning depends in part on how teachers notice and probe their ideas in science classrooms. For instance, teachers' noticing of mere fact-based information may shift students' attention to the classroom game of producing the correct answers.
However, if teachers notice and respond to students' ideas in a productive manner, students become encouraged to make sense of the natural world (Russ, Coffey, Hammer, & Hutchison, 2009). Students discontinue their rote, terminology-laden responses and engage in sense-making, including the presentation of scientific reasoning and the construction of causal explanations (Hutchison & Hammer, 2010; Russ et al., 2009). Therefore, preparing teachers to productively notice and respond to student sense-making of science phenomena is a crucial goal for realizing the vision of the NGSS and the NRC framework.

In the present dissertation I define productive noticing and responding to students' sense-making of science phenomena as: a) teachers asking open-ended, reasoning-based questions to allow students to offer mechanistic explanations, b) teachers moving away from noticing students' correct and incorrect answers and moving towards noticing students' disciplinary thinking in relation to the science phenomena, and c) teachers trying to understand gaps in students' understanding of the phenomena and planning instruction to address those gaps. I used this idea of productive noticing and responding as a guidepost to code and analyze the data within this dissertation. For example, I coded the instances where teacher candidates (TCs) engaged students in offering, or noticed, students' explanations about phenomena as occasions of sense-making. I coded the closed and description-eliciting "what" questioning by candidates as cases of limited sense-making opportunities provided to students. Scenarios in which candidates noticed and examined students' disciplinary thinking related to the phenomena and provided tangible examples in support were viewed as productive examples of noticing and interpreting.

Preparing teacher candidates for productive noticing and responding to support student sense-making as described above is a challenging goal for teacher preparation. First, science instruction in most science classrooms is still dominated by delivering science content and ensuring that students learn the correct science vocabulary and definitions (Schwartz, Sadler, Sonnert, & Tai, 2009; Van Driel, Beijaard, & Verloop, 2001). Such a dominant and limited view of science teaching and learning may take teacher candidates' attention away from supporting student sense-making. Second, the notion of inquiry is confusing for most teachers and students and is frequently viewed as doing fun, hands-on science activities (Osborne, 2014; Schwarz et al., 2017). The idea of using science phenomena as a core of inquiry instruction is still new for most classrooms in which teacher candidates may observe or enact science instruction. Third, issues such as curriculum demands, standards-based teaching, institutional expectations, and novice teachers' personal concerns about their teaching often divert their attention away from students' ideas. Finally, research shows that teachers often hold the perception that young children lack the reasoning abilities to make sense of the natural world around them (Hardy, Jonen, Möller, & Stern, 2006; McNeill, 2011). Such perceptions about students' science learning can act as possible obstructions to teachers' noticing and responding to student sense-making. Lack of teacher preparation for supporting students' scientific sense-making can possibly derail the goal of science instruction as envisioned within the NGSS.
There are rare examples within elementary science education that show how elementary science students are supported to construct evidence-based explanations for scientific phenomena (Beyer & Davis, 2008; Forbes, Biggers, & Zangori, 2013; Metz, 2009; Minogue et al., 2010). We need to address this issue by preparing teachers who can effectively elicit, notice, interpret, and respond to student thinking in support of the goal of students' sense-making in elementary grades. In the present study, I examine the current noticing and responding practices of 23 elementary science teacher candidates within a one-semester introductory elementary science methods course. I used sense-making conversations, or science talks, and classroom-based assessments, both phenomena-based, as contexts in which candidates had the goal of supporting sense-making by engaging in the pedagogical practices of noticing and responding. The elementary science methods course provided deliberate opportunities for candidates to plan and enact phenomena-based science talks and assessments in their placement classrooms. Studying teacher candidates' analysis and reflections around science talks and assessments provided an understanding regarding the nature of the noticing, interpretation, and responding they engage in as they support student sense-making.

Conceptual Framework

This study foregrounds the pedagogical practices of noticing and responding. Focusing on noticing and responding is important as they are components of actual teaching rather than indicators of aspects that may be related to teaching. This foregrounding contrasts with other studies that focus on and measure teacher knowledge itself, such as pedagogical content knowledge (PCK), or self-efficacy. While teacher knowledge and self-efficacy are important, they do not have a simple relationship with actual pedagogical practice and do not take into account the complex ways in which knowledge and efficacy are drawn upon in particular settings and in engagement with different people, including students. In foregrounding the pedagogical practices of noticing and responding rather than measuring PCK and self-efficacy, this dissertation takes a socio-cultural and disciplinary orientation regarding science and teaching rather than a cognitive psychology orientation. These pedagogical practices manifest themselves as candidates create and reflect on a learning context valuable for students. Student sense-making is one such significant context for student learning in which students are actively engaged in making sense of the natural world. Sense-making is meant to be participatory in nature, making students' disciplinary, culturally, and socially relevant ways of thinking explicit (Lemke, 2001). In this study, I am interested in how teachers notice and respond to students' disciplinary thinking during sense-making.

Kang and Anderson's (2015) framework provides a systematic sense of the aspects to focus on when examining teachers' noticing and responding practices (Figure 1.1). For instance, the framework considers opportunities an important facet of teacher noticing and responding, which, for this study, means looking into the opportunities that candidates provided for student sense-making using phenomena-based science talks and assessments. Similarly, noticing and interpreting students' disciplinary thinking and making changes in ongoing and future instruction are crucial facets that have been employed as a basic guideline to examine the science talks and assessment data in this dissertation.
Figure 1.1: Framework of Teacher Responsiveness

The analysis of teacher candidates' noticing and responding practices in this dissertation is influenced by my stance on student learning and how candidates should support such learning. It is imperative to understand the relationship between this desired view of student learning and the idealized aspects of teacher noticing and responding. The idealistic view of student science learning considered in this study is inspired by the NRC (2012) framework and advocates for using science phenomena as a core of instruction. Phenomena-based instruction bears the potential to create a discourse of student inquiry wherein students are not just passive learners of science content. Instead, students have the epistemic agency to figure out the science phenomena using scientific practices and crosscutting concepts (Reiser, 2013; Schwarz et al., 2017; Windschitl et al., 2012). I define sense-making as students' describing observations, offering reasoning, hypothesizing, predicting, and constructing mechanistic explanations for the phenomenon. To support student sense-making as a core goal for student learning, it is crucial that teacher candidates are prepared to notice, interpret, and respond to key aspects of student sense-making. For example, it is necessary for candidates to notice and interpret the nature of the observations and explanations students articulate regarding the phenomena to support their sense-making. It is also crucial that candidates notice what students identify as key factors underlying the phenomena or how they reason using these factors. In simple words, there is an idealistic view regarding how teachers may notice, interpret, and respond to support student sense-making. The way I code candidates' noticing, interpretation, and responding in this study is related to this idealistic view of how they should notice and respond to effectively support student sense-making. The goal is to ascertain and improve teacher candidates' current practices and prepare them within methods courses to meet this idealistic goal for their noticing and responding practices.

The noticing and responding framework (Kang & Anderson, 2015) helps to parse out the aspects that need to be studied to understand candidates' practices of noticing and responding for meeting the goal of student sense-making. The framework is grounded in research on teacher noticing. Researchers (Sherin & van Es, 2005; van Es & Sherin, 2002) from the field of mathematics describe teachers' noticing of classroom interactions as a very complex process comprising three components: 1) the teacher's ability to identify something significant amid classroom interactions to attend to, 2) the ability to recognize and utilize concepts within teaching and learning that underlie the identified situation, and 3) the ability to make sense of the situation based on their knowledge of the context, students, school, and subject matter (Sherin & van Es, 2005, p. 478). Various video studies (Luna, 2018; Sherin, 2007; Sherin & Han, 2004; Sherin & van Es, 2005; van Es & Sherin, 2002) from the field of mathematics reveal teachers' learning and thinking regarding the practice of noticing. These studies specifically focused on "what" teachers noticed and "how" they interpreted those events. Research on teachers' noticing and responding practices has also gained attention in the field of science education.
To begin with, the idea of "framing" has been used as an important and widespread construct for understanding teachers' noticing and responding practices. Levin, Hammer, and Coffey define "framing" as an individual or group making sense of "what is going on here?" (2009, p. 146). Based on their study, the authors argued that novice teachers' abilities to attend to the substance of student thinking depend on their framing of their teaching situation. In other words, teachers' in-the-moment meaning making about their teaching situations influences whether and what they notice and how they respond to student ideas. Factors such as curriculum demands, standards-based teaching, institutional expectations, and novice teachers' personal concerns about their teaching framed their attention away from students' ideas. Studies show that teachers often value canonical ideas and notice and interpret students' thinking in a limited manner (Barnhart & van Es, 2015; Gotwals & Birmingham, 2016; Kang & Anderson, 2015; Russ & Luna, 2013). Teachers' epistemological framing has also been found to have an influence on students' epistemological framing of their instructional goals. In other words, what teachers notice and how they evaluate students' responses and reasoning influences how students develop their thinking as well. If teachers frame their noticing in terms of fact-based information and canonical answers, then the students' attention inevitably shifts away from scientific sense-making (Hutchison & Hammer, 2010; Russ et al., 2009).

In this dissertation I use the noticing and responding framework (Figure 1.1) to guide my analysis of science talks and assessments as a context to study teacher candidates' current practices of noticing and responding to support student sense-making. To give a brief idea, sense-making conversations, or science talks, were designed to afford teacher candidates an opportunity to elicit, notice, interpret, and respond to students' ideas regarding science phenomena (Figure 1.2). Teacher candidates chose a science phenomenon to lead a discussion with a small group of students. The goal was to engage students in making sense of the phenomenon through discussion. Teacher candidates were expected to engage students in generating questions, formulating predictions, hypothesizing, and constructing mechanistic explanations for the phenomenon. Teacher candidates used these conversations as a learning opportunity to later analyze, reflect on, and respond to students' thinking around the phenomena.

Figure 1.2: Context of Science Talks for Sense-making (diagram components: eliciting initial student thinking; probing for explanations of how and why things happen, that is, mechanistic thinking; interpreting how students are thinking about the phenomenon; and responding by creating learning opportunities that can help develop ideas over time)

Similarly, phenomena-based assessments designed and implemented by teacher candidates at the end of their 2-day lesson instruction afforded students another opportunity to engage in sense-making. Students' written responses to the assessment items allowed teacher candidates to conduct an analysis, leading them to notice and interpret student sense-making in their written work (Figure 1.3).
Figure 1.3: Context of Assessment for Student Sense-making (diagram components: substance of assessment items, asking what opportunities exist for student sense-making; noticing and interpreting sense-making by interpreting student responses; and responding through pedagogical decisions by suggesting changes to the assessment and lesson plan)

Research Questions

The study of teacher candidates' noticing and responding practices in the context of phenomena-based science talks and assessments would reveal the nature of their current practices. The current noticing and responding practices among candidates provide a sense regarding if and how candidates support students' sense-making. Therefore, I ask the following questions based on the sense-making and assessment data sets.

Part 1: Sense-making Conversations
a) How do teacher candidates support student sense-making using sense-making conversations around science phenomena?
b) What opportunities do teacher candidates create as they engage students in sense-making conversations around science phenomena?
c) What is the nature of interactions between candidates and students during such conversations?
d) How do candidates notice and interpret student thinking around the phenomenon within these conversations?
e) How do they plan to respond to student thinking through instruction?

Part 2: Assessments
a) What is the substance of the assessments used by teacher candidates—in other words, how do TCs' assessments allow for opportunities for students' sense-making?
b) What do TCs notice and what do they interpret as evidence of students' sense-making within the students' responses to the assessments?
c) How do teacher candidates use their understanding of students' assessment responses to suggest adaptations to future instruction?

Significance

Teacher educators are striving to develop rigorous curricula that will support the development of practices within teacher candidates necessary for the implementation of three-dimensional teaching in K-12 science classrooms. Findings from this dissertation can potentially inform the curriculum and learning experiences within methods courses in ways that will more adequately prepare elementary science teacher candidates in the use of science talks and classroom assessments grounded in science phenomena to effectively support student sense-making. Examination of the actual interactions, coupled with the teacher candidates' own analysis of their science talks, provided insights into the extent and nature of the opportunities made available for student sense-making using science phenomena, the ideas that teacher candidates considered noteworthy, and how the candidates planned to respond through changes in future instruction. Similarly, the findings from teacher candidates' assessment practices also provided information on the ways in which candidates used phenomena to plan assessments and how they analyzed student responses to understand and notice student sense-making. Understanding teacher candidates' current practices of noticing and responding can help them prepare better for the actions and thinking required for supporting student sense-making using science phenomena. As mentioned in the beginning of the chapter, studies (Beyer & Davis, 2008; Forbes et al., 2013; Metz, 2009; Minogue et al., 2010) have shown that elementary science students lack rich opportunities to engage in sense-making about science and to construct evidence-based explanations for phenomena. At the same time, teacher candidates struggle to implement ideas from their methods courses into actual school placements (Feiman-Nemser & Buchmann, 1983; Kennedy, 2005).
Understanding how teacher candidates enact practices designed to support student sense-making can provide insight into how they conceptualize those practices, knowledge which can, in turn, be used to address the "problem of enactment" (Kennedy, 2005) within fieldwork experiences. The results from this dissertation will generate evidence of the strengths and struggles of TCs in their attempts to support student sense-making. Teacher educators can leverage this evidence to plan field experiences integrated with methods courses in order to better prepare teacher candidates for their professional work and improve the status of elementary science education.

CHAPTER 2 LITERATURE REVIEW

The present chapter discusses what we know, based on the research literature, about the significance of students' sense-making, teacher candidates' conceptions and modes of eliciting student ideas, and their noticing of and responding to students' ideas.

Attention to Student Sense-making

The current vision of K-12 science education increasingly emphasizes the use of science phenomena to support student sense-making, as outlined in A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas. As reflected in the title of the framework, the proposal's three-dimensional view of learning comprises three strands: scientific and engineering practices, crosscutting concepts, and core ideas from the physical sciences, life sciences, earth sciences, and engineering, technology, and applications of science. The framework and the latest NGSS science standards promote instruction grounded within science phenomena contextualized in real-life scenarios. The argument is that, for meaningful and effective science instruction, a student should have an opportunity to engage with and develop explanations for how and why phenomena happen. At the core of the instructional framework are content ideas, theories, and mechanisms that explain the reasoning underlying a phenomenon. The process of instruction should use phenomena as a context to operationalize scientific practices and crosscutting concepts as sense-making tools (Figure 2.1; NRC, 2012; Reiser, 2013; Windschitl et al., 2012).

Figure 2.1: Operationalizing Scientific Practices for Sense-making of Phenomena

The NRC (2012) committee argues that student involvement with scientific practices provides students with the opportunity to understand how scientific knowledge develops, thereby facilitating improved conceptual understanding of scientific ideas. Opportunities to engage in scientific practices nurture student curiosity and the students' sense of exploration by allowing them to experience the "doing" of science, similar to real scientists (Berland & Reiser, 2009; Lehrer & Schauble, 2006). Most importantly, the use of scientific practices allows students to pursue learning through sense-making, providing them with the epistemic agency to construct their own knowledge of the natural world. Consequently, these scientific practices represent an authentic way of exploring scientific phenomena. By utilizing real-world phenomena, teachers provide students with rich contexts and opportunities to integrate content with scientific practices in order to construct explanations (Krajcik, Codere, Dahsah, Bayer, & Mun, 2014).
As a result, the performance expectations (PEs) within the NGSS were written to present a three-dimensional view of learning that integrates scientific core ideas, practices, and crosscutting concepts to promote the learning of science through sense-making of science phenomena.

Theoretically, the idea of sense-making emphasizes the contextual and participatory nature of learning, in contrast to the dominant notion of learning as a merely individualistic process. It belongs under the umbrella of situated learning, where the process of learning is viewed as a collective act of sense-making. Sense-making within science classrooms is a sociocultural process encouraging the elicitation of diverse perspectives regarding the phenomena, while moving away from the canonical and dominant forms of textbook knowledge. This perspective holds that knowledge is not merely constructed in an individual's head; rather, it is created during interactions among individuals who actively and collaboratively engage with each other within an authentic learning context (Lave & Wenger, 1991). Students engaged in sense-making belong to a community of practice, a learning environment where they actively communicate about and engage in the skills required to understand phenomena (Brown, Collins, & Duguid, 1989; Wenger, 1998). The students possess varying levels of expertise and diversity of views; these views are valued within the community and taken as a contribution toward developing a shared vision of knowledge (Wenger, 1998). A science classroom that adopts a community of practice perspective opens up possibilities and opportunities to activate and share the students' funds of knowledge—their unique knowledge based on their backgrounds and cultural and social ways of knowing and doing science (Barton & Tan, 2009).

Sense-making underpins the epistemic goal of science education, that is, an understanding of how we know what we know and why we believe it (Duschl & Osborne, 2002). It unpacks the idea of "active engagement," explaining what it really means to be involved in the act of learning with the construction of knowledge as the ultimate end. Learning science through sense-making lies in opposition to the common practice of "doing" science as a set of isolated activities and experiments grounded in fragmented science content. Learning science through sense-making is grounded in the logic that children develop deep science understanding by being able to figure out for themselves how the natural world works (Krajcik et al., 2014).

Teaching science using phenomena to support student sense-making means not only shifting from traditional forms of teaching and learning but also rethinking the ways in which future science teachers are prepared. It also has implications for science teachers' work within their classrooms. Teachers now need to be able to involve students in productive conversations that empower them in their attempts to unpack and articulate the reasoning and explanations underpinning the phenomena. Approaching conversations with students in such a way can eventually help guide the instructional process. Research shows that K-12 teachers typically emphasize learning science by valuing canonical information (Anderson, 2002; Brown & Melear, 2006; Davis, 2003). However, teaching and learning science through sense-making necessitates a shift in science teachers' pedagogy. It requires that teachers recognize and create occasions for students to show and process their thinking. Teachers must develop an ability to "see student learning: to discern, differentiate, and describe the elements of that learning, to analyze the learning and to respond" (Rodgers, 2002, p. 231). Paying attention to students' ideas and engaging them in sophisticated and cognitively demanding thinking remains a challenge for many teachers (Davis, 2003; Lehrer & Schauble, 2006; Metz, 1995). Specifically, in relation to elementary science teaching, many teachers hold the belief that young children are not capable of expressing scientific reasoning. Evidence contradicts those beliefs, however, revealing that, when provided with rich contexts, young children show the ability to articulate their reasoning and to use evidence to construct explanations for scientific phenomena (Metz, 2004). It is, therefore, crucial that elementary science teachers be prepared to facilitate sense-making and meaningful science learning among young children.

Teacher Noticing and Responding: Practices to Support Student Sense-making

Support for student sense-making depends on teachers' noticing of students' ideas about the phenomena, as well as their ability to interpret those ideas for what they may mean for student understanding and to use that interpretation to plan and adjust instruction accordingly. As noted in Chapter 1, teachers often have trouble noticing student ideas during instruction. For example, in an analysis of teaching videos, teachers focused primarily on the teacher in the video, his actions, pedagogical issues, and response to students (Sherin & van Es, 2005). Researchers have also found that teacher candidates' observation skills are underdeveloped. Teacher noticing can be of a unique nature and have implications for student learning. As noted earlier, teachers may be more amenable to "ideas from authority," meaning the teachers value and give attention to student ideas originating from authoritative sources, such as a science book or teacher. Second, teachers may frame their noticing in terms of "ideas from experience," attending to students whose ideas derive from experiences gained through the use of their senses. Furthermore, the teacher's epistemological framing influences the students' epistemological framing of their instructional goals. Simply put, what teachers notice and how they evaluate students' responses and reasoning impacts the trajectory students take in their thinking as well (Russ & Luna, 2013). When teachers notice only canonical answers, students' attention shifts away from scientific sense-making (Hutchison & Hammer, 2010; Russ et al., 2009). Moreover, when teachers are unproductive in the framing of their noticing, it leads to unproductive responsiveness to student ideas.

Kang and Anderson (2015) investigated teacher candidates' noticing and responding to student ideas in the context of student assessments. High-quality assessments exemplified teachers' productive framing, meaning these assessments provided opportunities for student sense-making. In contrast, low-quality assessments signified teacher candidates' unproductive framing of the assessment task, which demanded factual knowledge and canonical information on students' part. The findings showed that the majority of teacher candidates who designed high-quality assessments responded to student ideas in a productive manner. These teachers suggested very specific and direct instructional changes, in alignment with students' conceptual difficulties.
On the other hand, candidates implementing low-quality assessments responded in an unproductive manner, suggesting format-based changes in instruction or general strategies to enhance student engagement. There is, however, encouraging evidence that teachers do show an ability to notice and respond to students' scientific ideas with strategies and scaffolding in this direction (Levin et al., 2009; Rosaen et al., 2008). Research has also provided examples where teacher candidates have been successful, to an extent, in creating opportunities to elicit and notice student ideas (Gotwals & Birmingham, 2016; Kang & Anderson, 2015). The development of a knowledge base that may guide effective design of learning opportunities that better prepare TCs for the work of responsive teaching is in its infancy (Gotwals & Birmingham, 2016; Kang & Anderson, 2015). The current study aims to contribute to this expanding knowledge base by exclusively studying the nature of teacher candidates' practices of noticing and responding in the context of phenomenon-based science talks and assessments.

Teachers' Conception and Elicitation of Student Ideas in Science Classrooms

Teacher candidates typically focus on the organizational aspects of the classroom and student behavior instead of paying attention to student thinking (Barnhart & van Es, 2015; Gotwals & Birmingham, 2016). When they do pay attention to student ideas, teachers struggle to interpret students' disciplinary thinking (Davis & Smithey, 2009). Teacher candidates, in particular, hold diverse opinions about student ideas that shape if and how they use these ideas in future instruction (Otero & Nathan, 2008). For instance, teacher candidates could not use students' ideas for any useful modifications when they interpreted them as emerging from their experiences or evaluated them for their level of correctness. According to a survey of preservice, novice, and expert science teachers, three main sources contribute to students' prior knowledge: previous instruction, informal learning experiences, and general life experiences (Meyer, 2004). The teacher candidates in the study asked students to recall content information from their previous instruction, using the responses to determine what to teach and "add on" to the information already known to students. Larkin (2012) also determined that teacher candidates viewed student ideas as a good reflection of their content knowledge from past instruction and valued them as an opportunity to obtain insight into the level of students' conceptual understanding. However, candidates showed limited knowledge regarding the value and role that student ideas could play during instruction. Many teacher candidates treated student ideas as misconceptions to be fixed in the course of their instruction, seeing misconceptions as barriers and obstacles to the understanding of new content. Teacher candidates viewed uncovering students' ideas as a hook to interest them in the lessons (Larkin, 2012; Meyer, 2004). Teacher candidates held the perception that affording students a chance to share their thinking had an affective quality because it gave students a feeling of comfort and self-worth in the classroom. Despite the challenges, candidates demonstrate the ability to pay attention to certain aspects of students' science thinking in a productive manner.
In a study by Levin et al. (2009), teacher candidates made the moves necessary in their classrooms to prompt their students to share their conceptual and everyday thinking around science phenomena. They probed students to explain their ideas, using those ideas as a focus for class discussions. When explicitly supported through video case analysis, teacher candidates provided detailed descriptions of student thinking and explained ways in which student ideas were drawn out through questioning and how they fostered students' sense-making (Barnhart & van Es, 2015). Expert teachers showed a more sophisticated understanding, accepting the explanations for phenomena that students offered based on their prior ideas. They saw these initial explanations as a leveraging point to build new understandings among students (Meyer, 2004).

Teachers often struggle to elicit students' thinking in science classrooms, both because of a lack of opportunities created to do so and because of a limiting style of questioning and interaction with students. Teachers need open-ended questions to tap into student thinking and elicit student ideas; however, they often engage in teacher-centered forms of questioning, particularly the initiation-response-evaluation (IRE) format. During IRE questioning, the teacher initiates a question, students offer a response, and the teacher evaluates it, moving forward without taking any substantial action to support student thinking (Cazden, 2001). Oftentimes, use of the IRE format fails to bring student ideas to the surface. Questions posed to students through this format do not encourage student sense-making, while the evaluation of the response is too narrow to allow students to build upon any ideas (Harris et al., 2012). Teacher candidates who conceive of students' ideas as indicative of their content knowledge have been found to use IRE interactions (Larkin, 2012). Also, when teacher candidates perceive themselves as occupying a position of authority, they tend to control the exchange with students using an IRE format (Gotwals & Birmingham, 2016). Candidates who evaluate students' ideas in terms of whether students "get it" or not generally asked students questions loaded with scientific vocabulary (Otero & Nathan, 2008). Content-specific and procedural-knowledge-focused questions ("what are you going to do next?") have also been found to be prevalent practices among teacher candidates (Gotwals & Birmingham, 2016; Harris et al., 2012; Larkin, 2012).

In addition to the challenges described above, teacher candidates often struggle to design productive assessments (reasoning-based, open-ended) that could enrich students' understanding. Unproductive assessments (fact-based, closed) lack epistemic framing—in other words, they do not provide students with opportunities to engage in sense-making. Unproductive assessments prompt students to produce factual information and display procedural skills and knowledge (Kang & Anderson, 2015). In fact, questioning designed to extract knowledge of procedural skills has been found to be very typical of teacher-structured discourse in science classrooms. Teachers mainly ask students procedural questions to ensure they are following along, taking correct steps, and using the right material during experimentation. In other words, the goal of asking procedural questions is mainly to confirm that students are following the task as structured by the teacher, "checking" on the students by asking "what are you doing?" or "what will you do next?" (Harris et al., 2012).
Teacher candidates use procedural/skill-based questions to lead students to the “correct” answers (Barnhart & van Es, 2015). Candidates’ use of tasks that are challenging for students to comprehend constrain them to gather students’ thinking about natural events (Meyer, 2004). Ambitious science teaching that argues to be much aligned with the NGSS (2013) the NRC (2012) advocates for a student-centered, phenomenon- based discourse of questioning. Ambitious science teaching stance believes that students explanations about the phenomenon change and becomes sophisticated over time through instruction that targets to develop mechanistic reasoning (Windschitl et al., 2012). Teachers who aim to develop conceptual understanding among students elicit student thinking by asking questions grounded in science phenomena. For instance, in a study by Van Zee, Iwasyk, Kurose, Simpson, and Wild (2001), teachers asked students questions related to the phases of moon, such as, “where can you see the moon?” or “when did you see it?” (p. 177). To develop students’ conceptual understanding, teachers also asked diagnostic and clarifying questions: “what other possibilities?” or “what experiences have you had to support your idea?” or “do we all make sense? or agree?”. Teachers in the study encouraged students to consider multiple viewpoints and fostered student sense- making of their ideas. Within a student-centered discourse, teachers ask “reflective toss” questions, in which they first themselves understand the meaning of the response made by student and then throw their response back to them and to the whole class to probe more thinking (Van Zee & Minstrell, 1997). Reflective toss questions also allow candidates to query the students at a level which requires higher cognitive skills, as well as permitting a range of student responses (Gotwals & Birmingham, 2016; Van Zee et al., 2001). They probe students in ways that allow for individual and collective sense-making of ideas. Candidates who engage students 22 with scientific practices have been found to be successful when using questioning that focused on facilitating students’ sense-making of the experiment (Barnhart & van Es, 2015). In conclusion, a teacher’s ability to stimulate student thinking has implications for the extent and depth of student thinking they may be able to access. Teachers often do not use rich tasks like science phenomenon to involve students in making sense of the world, and candidates often struggle to engage students in productive framing of their learning. Both deleteriously impact the students’ abilities to reason about the phenomenon. Therefore, it is important to study candidates use, noticing, and responding to student sense-making in the context of rich contexts for learning, such as science phenomena (Russ et al., 2009). Teacher Candidates’ Responding to Students Ideas through Instruction Candidates’ conceptions of students’ ideas and the ways in which they elicit them can impact if and how they incorporate the students’ disciplinary thinking into instruction. Teachers who interpret what students need to know in relation to content and academic vocabulary adapt their instruction according to what is missing rather than students’ understanding (Otero & Nathan, 2008). Teacher candidates view prior knowledge as an informational foundation upon which new information can be added. “[Prior Knowledge] is important in planning because you don’t want to start a lesson too far ahead of where the curve is. 
You kind of want to bring the curve along. So you have to start at what they last learned" (Meyer, 2004, p. 976). In general, teacher candidates lack knowledge about how to involve student ideas in instruction (Barnhart & van Es, 2015; Larkin, 2012; Meyer, 2004). They may collect information regarding students' thinking but not view that information in nuanced ways, which may constrain them from using it during instruction (Gotwals & Birmingham, 2016). Unproductive assessments have also been associated with teacher candidates' lack of responsiveness. Unproductive responsiveness may include generic or irrelevant reactions to student ideas, such as re-teaching information or assigning a few more labs (Kang & Anderson, 2015, p. 15). Even in cases where candidates did assess student thinking in a productive manner and provided opportunities for student sense-making that surfaced a broader repertoire of student ideas, their responding to these ideas stayed unproductive. One reason candidates responded poorly to student ideas could be that their analyses did not include rich interpretation of student thinking. In general, teacher candidates failed to precisely identify and describe the specific student ideas they hoped to address in their instruction, offering only very general ideas to facilitate student thinking. Their strategies for adjusting their instruction were often very vague. Even when candidates mentioned a specific student idea that they wanted to address in their lessons and offered evidence for it, their plans for modifying instruction remained unclear (Barnhart & van Es, 2015; Meyer, 2004). Thus, better analysis or interpretation of student responses did not always guarantee productive attention during instruction. Candidates approaching student learning in a more sophisticated manner, however, make explicit and logical connections between what they will modify in their instruction based on student ideas and how that may affect student learning. In Barnhart and van Es (2015), for the connection to be considered logical, candidates consistently supported their responses with evidence found in student thinking. To attend to student thinking in a responsive way, candidates needed to integrate ideas offered by students before modifying their lesson plans. Teachers may also show a potential to respond to student thinking by a) constantly making a public record of student ideas and bringing them to the forefront of class discussions, and b) using tools to organize student thinking. For instance, in the framework for ambitious science teaching (Windschitl, Thompson, & Braaten, 2008), teachers introduce a phenomenon for students to figure out, have them share their initial hypotheses with the rest of the class, build models to explain how and why the phenomenon happens, and draw multiple viewpoints into the discussion. In this way students are constantly sharing their ideas and explaining their thinking in light of new experiences, whereupon they revise their original stance. Teachers also respond in-the-moment by "re-voicing" student ideas, clarifying or rephrasing them in the moment to help students make better sense of things. Inviting students to clarify each other's thinking and provide feedback to one another is also a form of responding to student ideas in the moment.
Most importantly, such in-the-moment responses are facilitated when teachers anchor their instruction in the construction of explanations for phenomena that target big science ideas and can be developed through a coherent set of activities (Harris et al., 2012). Overall, responding to student ideas is a challenging notion for teacher candidates. Most of the literature discussed here presents how teachers and teacher candidates conceive of, elicit, notice, and respond to students' ideas in the context of everyday classroom discourse, activities, and assessments. Based on the recommendations of the NRC (2012) and NGSS (2013) documents, instruction within K-12 science classrooms and the preparation of teacher candidates must shift to an approach that focuses on sense-making of phenomena grounded in science standards. The creation of a knowledge base that specifically studies elementary science teacher candidates' noticing and responding practices in the context of using phenomena for instruction has begun but is in its early stages. Findings from the current study will, therefore, contribute to this knowledge base.
CHAPTER 3 RESEARCH METHODOLOGY
To investigate teacher candidates' noticing and responding practices to support students' sense-making, I collected data from teacher candidates within an elementary science methods course. It is a qualitative study analyzing candidates' planning and enactment of science talks and assessments in their placement classrooms as a requirement of the methods course. Details regarding each data type and the related data analysis are described in Chapters 4 and 5, as the analysis is unique to each type of data. In this chapter, I broadly discuss the research context, the study participants, the rationale behind selecting phenomena-based science talks and assessments to study teacher noticing and responding practices, and the basis of the coding scheme used in the study.
The Context of Teacher Preparation
The context of this study is an elementary science methods course within a five-year, reform-oriented elementary science teacher preparation program at a large Midwestern university. The teacher candidates in the program are required to take two science-focused methods courses: one in the fall of their senior year and the other in the spring of their internship year. The data for this analysis were collected during the senior-year methods course in fall 2016. The goal of the methods course was to begin preparing teacher candidates for NGSS-aligned instructional practices in the classroom. Therefore, the course included learning opportunities that targeted teacher candidates' understanding of science phenomena and involved them in unpacking the NGSS standards to clarify goals for students' learning and to design assessments aligned with learning objectives. During the senior-year methods course, teacher candidates were assigned to a school placement and worked with a mentor teacher. Candidates observed the classroom in which they had been placed for 4 hours each week. They also were given an opportunity to plan and implement a 2-day lesson in the classroom during the semester. The planning for the 2-day lesson was carried out as part of their assignment work within the science methods course. It is important to note that there was a great deal of variation in the extent to which candidates had an opportunity to observe science instruction within their placements.
Study Participants The participants for the study were elementary science teacher candidates in their senior year. All 23 participants belonged to the same cohort of the methods course. All candidates were placed in elementary classrooms in the area and were expected to design and implement a science lesson and course assignments within these classrooms. There was a range to the grade levels in which these candidates were placed (Table 3.1). 27 1 2 3 4 5 6 7 8 9 10 11 12 Kg Kg Kg Kg Kg 1st 1st 1st 1st 1st 1st Dead v. living plants Beluga whales Bird beaks Plant Moon phases Animal characteristics Physical characteristics of animals & their functions Kg Kg Kg 1st 1st Sunlight Weather Push/Pull Daylight & seasons Solids Observable properties of Kg Kg Kg 1st 1st 1st solid & 1st Growth of a bean plant Pollination Plant roots 1st 1st 1st 1st Weather 1st 13 1st Seasons 1st 14 15 16 1st Plant adaptations 1st 2nd Solid v. liquids 2nd 2nd Introduction solid & liquid 2nd 17 3rd Snow 3rd 1st 1st 1st 1st interactions with liquids Packing for severe weather Sound Properties of matter State of matter Relationship between daylight & seasons Properties of matter Water effects on land State of matter Benefits of travelling in a group Sunlight Weather Push/Pull Daylight & seasons Solids Observable properties of solid & interactions with liquids Preparing & responding to severe weather Sound Properties of matter State of matter Relationship between daylight & seasons Properties of Matter Water effects on land 2nd State of matter 3rd Benefits of travelling in a group Teacher candidates Sense-making 1 Sense-making 2 Sense-making 3 Grade Topic Grade Topic Grade Topic Beach Kg Weather & climate Kg Weather & climate Kg Sunlight Kg Sunlight Table 3.1: Candidates Grade Levels and Topics across Sense-making Conversations 28 Table 3.1 (cont’d) 18 19 20 21 22 23 3rd How ice cream is made 3rd Animal travel pattern 3rd Animal travel pattern 3rd Need of Plants 3rd Ecosystem 3rd Ecosystem 5th Animal features 5th 5th 5th Muscular & skeletal system How to classify animals 5th 5th Earth systems Matter & Sound Structure of properties of matter 5th 5th Flooding Circulatory system 5th Structure of properties of matter 5th Buoyancy 5th Water cycle 5th Water cycle Study Data Over the course of the semester, teacher candidates planned and enacted three sense- making conversations with a small group of students in their respective school placements. These were meant to be short conversations (almost 20 minutes duration, three times during the duration of the course) with a small diverse group of students grounded in a science phenomenon, explorable over candidates’ two-day lesson. The first sense-making conversation (SM1) can be thought of as a practice science talk because teacher candidates did not typically have the topics that they might be teaching in their lessons. The second and third sense-making (SM2 and SM3) conversations focused on the science phenomena around which candidates chose to structure their 2-day lessons. The nature of these conversations was intended to be open- ended and to probe students to make sense of the phenomena related to a driving question. The details regarding the nature of sense-making data are in Chapter 4. At the end of their two-day lesson, teacher candidates planned and implemented phenomena-based assessment tasks. They collected written student work based on their responses to the assessment and created rubrics to assess the work. 
Candidates each chose six student work samples, exemplifying high-, medium-, and low-quality student work based on their assessment task and rubric, for their reflections and analysis of student sense-making. The details regarding the assessment data are included in Chapter 5.
Selecting Occasions for Teacher Candidates' Noticing and Responding
This study focuses on two aspects of methods course work for analysis: planning and enacting science talks, and implementing assessments and analyzing student responses based on science phenomena. Both occasions deliberately engaged teacher candidates in noticing, interpreting, and responding to students' thinking around science phenomena in their placement contexts. Candidates had a chance to use science talks and assessments in authentic school contexts with their own students. After each enactment, candidates produced a documented analysis and reflection of each talk and assessment. Each occasion of a science talk or an assessment allowed teacher candidates to afford opportunities for student sense-making and to notice, interpret, and respond to student ideas (Table 3.2). Each of the actions listed in Table 3.2 was examined and analyzed according to methods outlined in detail in Chapters 4 and 5.
Aspects of Data Analysis
Science Talks: a) Introducing a science phenomenon for discussion; b) Plan for the discussion; c) Driving question; d) Probing questions; e) Nature of the discussion; f) TCs' noticing; g) TCs' interpretation of student responses; h) Responding through changes in instruction
Assessments: a) Substance of the assessment; b) Opportunities for student sense-making; c) Noticing and interpretation of student sense-making within assessment responses; d) Responding through changes in assessment and instruction
Table 3.2: Aspects of Candidates' Practice under Study
In total, data gathered from 23 teacher candidates were analyzed to identify patterns of teacher candidates' noticing and responding to student sense-making. Subsequently, I compared and tracked patterns backwards to understand relationships between various aspects of teacher candidates' practices. Understanding relationships, such as the one between assessment design and teacher candidates' noticing of sense-making, provided insight into the complexities of the process of supporting student sense-making. It also helped identify groups of teacher candidates who showed similar trends, from which representative examples could be drawn in order to discuss these relationships.
Data Sources and Analysis
Multiple data sources were used to analyze teacher candidates' practices (Table 3.3). I transcribed and analyzed the audio recordings of each of the three sense-making conversations conducted by each candidate across the semester, in addition to the teacher candidates' written post-analyses of and reflections on these talks. Using these two sources for each conversation provided me with a more comprehensive understanding of the context, goals, and structure of the talks. It also helped me to identify contrasting or confirming evidence about teacher candidates' noticing and responding to student sense-making. For the assessment data, I analyzed the substance of the assessments, examined teacher candidates' analyses of their students' assessment responses, and considered artifacts produced by students in the process of responding to the assessments as resources that collectively confirmed how teacher candidates used phenomenon-based assessments as a source for supporting student sense-making.
As stated, the main data source for analyzing teacher candidates' practice of sense-making conversations was the audio recordings of the conversations they implemented with their students three times during the semester. The teacher candidates' own analyses of and reflections on the sense-making conversations provided additional sources of data. For the assessment portion, primary data included the candidates' written analyses of students' responses to the assessments. Finally, video recordings of the methods course provided a record of the learning experiences candidates received as part of their university program instruction (Table 3.3).
Sense-making Conversations. Data available: audio recordings of science talks (three times a semester); TCs' written analysis and reflection of sense-making conversations (three times a semester); video recordings of the methods course. Unit of analysis/chunking the data: candidate-student exchanges during conversations; aspects of candidates' reflections aligned with the framework of analysis (a) phenomenon, b) driving question for the talk, c) discussion plan, d) artifact, e) prompts guiding candidates' reflection around the conversations). Analysis: coding exchanges/interactions; opportunities for sense-making; interactions during sense-making conversations; use of phenomenon and questioning; noticing, interpretation, and responding based on candidates' reflections on the three sense-making talks.
Assessments. Data available: assessment items implemented by candidates at the end of their unit plan; candidates' written analysis and reflection of six students' written responses to the assessment; artifacts produced by students; video recordings of the methods course. Unit of analysis/chunking the data: each assessment item (number of items = 23); teacher candidates' analysis of each student's work (number of work samples = 138). Analysis: analysis of assessment items used by TCs as opportunities for sense-making; examination of candidates' analysis of students' assessment responses and student artifacts to notice and interpret student sense-making (evidence of sense-making and evidence of students' learning); responding (planned changes in instruction and/or assessment items).
Table 3.3: Data Sources and Analysis
Coding Categories
The main coding categories within each data set were based on Kang and Anderson's (2015) noticing and responding framework (see Figure 1.1). For instance, I examined the sense-making conversations data for opportunities made available for student sense-making and for candidates' noticing of sense-making (deductive codes). The open codes developed during data analysis were influenced by my stance on student learning as sense-making of phenomena and by what matters most in relation to noticing and responding practices that support such sense-making, such as the nature of the driving question, which signals the potential of the opportunities made available to students for sense-making. I developed the following codes during the open coding of the driving questions: a) content-driven, b) elicitation of descriptions, c) probing about the phenomenon, and d) recall (Table 3.4). These codes exemplify teachers' pedagogical moves that afford students sense-making. While a question solely eliciting descriptions (see Table 3.4) regarding the phenomenon does not sufficiently afford opportunities for student sense-making, a question probing around the phenomenon is more aligned with the sense-making goal.
Code: Driving question. Sub-codes: content driven; elicitation of descriptions; probing about the phenomenon; recall.
Code: Use of artifact. Sub-codes: as a hook; eliciting descriptions.
Table 3.4: Opportunities for Student Sense-making
Another example of a code developed during open coding to characterize candidates' noticing practice is cause and effect (Table 3.5). The cause and effect code signaled instances when candidates noticed students offering explanations for the science phenomena, marking noticing practices in line with the sense-making goal. Other codes for teacher candidates' noticing, such as students' gestures or content knowledge, illustrate aspects of candidates' practice that were not fully conducive to the sense-making goal.
Noticing of students' sense-making about the phenomenon: emotions/gestures (displayed by students and related to sense-making); students' content knowledge; students' experiences (personal/school); student ideas about the phenomenon (cause and effect).
Table 3.5: Noticing Student Sense-making
Limitations
Several limitations to this study exist in terms of the participants selected and the data collection. First, because it was a convenience sample, the participants may or may not be representative of the population of TCs. These participants represent an entire section of teacher candidates enrolled in the methods course, recruited by me to capture trends in TCs' practices in general. Additionally, while the audio recordings of sense-making conversations did allow me to identify the moves made by TCs during conversations, it was difficult to ascertain the reasoning behind those moves because I was not able to interview the candidates about their decisions. Also, TCs had the opportunity to enact these sense-making conversations three times a semester within their school placements, giving them three deliberate occasions to rehearse the practice, but they did not receive feedback from their peers or from the course instructors about their actual enactments, only about the written analysis and reflection piece connected with a specific sense-making conversation. Consequently, the study was limited in capturing deeper reasoning for any trends in TCs' enactments of the practice over the duration of the whole semester.
CHAPTER 4 SENSE-MAKING CONVERSATIONS: TEACHER CANDIDATES' NOTICING AND RESPONDING TO SUPPORT STUDENT SENSE-MAKING OF SCIENCE PHENOMENA
This chapter discusses findings showing how teacher candidates used sense-making conversations as a way to notice and respond to student sense-making. As defined earlier, sense-making is a process in which students construct mechanistic explanations for science phenomena while generating questions and hypotheses, making predictions, and using theories and mechanisms to explain how and why things happen. Teacher candidates in this case employed science phenomena as a point of discussion with a small group of students in their placement classrooms. Over the course of the semester, candidates held such conversations three times, after which they completed written analyses and reflections based on the prompts of the assignments (see Appendices). The transcripts of the audio recordings of the small-group conversations and the teacher candidates' post-conversation reflections were the primary sources of data. I closely examined the patterns of interaction within the sense-making conversations to understand their substance.
Before discussing the findings, I will present in detail the nature of the data used in this study and how it was analyzed.
Study Data
During the data collection phase, the teacher candidates who comprised the study participants were enrolled in a five-year, reform-oriented elementary science teacher preparation program at a large Midwestern university in the USA. Programmatic requirements include two science-focused methods courses, one offered during the candidates' senior year and the other during their internship year. Data for this study were collected during the senior-year methods course in the fall semester of 2016. In total, 23 teacher candidates within the course consented to be part of the study. Data resulting from the science talks were derived from two sources: a) audio recordings of the teacher candidates' small-group conversations with their students, and b) teacher candidates' written post-analyses of and reflections on these conversations. Before discussing the data sources in further detail, it is important to understand the structure and goals of the sense-making conversations planned and enacted by teacher candidates within the methods course.
Sense-making Conversations
The sense-making assignment required teacher candidates to conduct short conversations (roughly 20 minutes each), three times over the semester-long methods course, with a small, diverse group of students. In this study, I have labeled the three consecutive sense-making assignments SM1, SM2, and SM3. Table 4.1 lists the main prompts from each assignment, which teacher candidates used to conduct their analysis and reflection at the conclusion of each sense-making conversation. The conversations were unique in that teacher candidates employed a science phenomenon as the center of the discussions. The first sense-making conversations were conducted at the beginning of the semester. At this point, a few, but not all, of the teacher candidates knew the standards they wanted to focus on for their 2-day lesson. The placement schedules assigned by the program, introductions with mentors, and other logistics around school placements can affect how early candidates know what they want to teach and what to design their talks around. Consequently, the topic of the first talk may have differed from the eventual focus of the 2-day lesson. By the second talk, however, the conversations encompassed the phenomenon selected for the lesson topic. The third sense-making conversation was generally an assessment talk or a "wrapping-up" kind of conversation around the phenomenon.
Primary prompts for the sense-making assignments (SM1/SM2 and SM3): phenomenon examined; driving question; discussion plan; key ideas; visual aids used; explain student thinking about their idea; how do students' ideas interfere with their understanding?; how do you want students to think of a concept or idea discussed in the group?; students' prior experience and culture/personal resources; detail some strategies that would benefit your students as learners in making sense of the world.
Table 4.1: Main Prompts for the SM1, SM2, and SM3 Assignments
Given the scope of the 2-day lesson, it was not possible for the candidates to choose a complex phenomenon that might require a lengthier timeframe. As a result, the phenomena selected tended to be more simplistic in nature.
Ideally, a phenomenon needed to be an observable process or natural event, explorable by students through coherent learning experiences that would support them in developing an explanation for the phenomenon over time (Reiser, 2013). However, for a 2-day lesson, the phenomenon selected by candidates was not expected to be something explorable over an extended period of time. The methods course focused on preparing teacher candidates for NGSS-aligned instruction, and thus candidates in the course had deliberately designed occasions to learn about the identification, significance, and use of science phenomena for instruction. The overall goal of the conversations was to engage students in making sense of the selected science phenomenon by talking about it in small groups. Prior to initiating the science talks, teacher candidates developed a written discussion plan illustrating their intentions for the conversation: the chosen phenomenon, the ideas they would like to introduce about the phenomenon, the questions designed for the process, and so on. After enacting the discussion, candidates analyzed and reflected on these conversations guided by the assignment prompts.
Audio Recordings of the Conversations
Teacher candidates audio-recorded each of the three science talks enacted with a small group of 5–6 students in their placement classrooms. Candidates were advised to choose a diverse group of learners, if possible, for these conversations. Each candidate submitted the audio recording of all three conversations as part of the assignment work required by the methods course. Each audio-recorded conversation was 6–10 minutes long. In all, I collected 69 audio recordings (23 participants x 3 talks), which were transcribed and analyzed to understand the nature of the interactions between students and teacher candidates. Teacher candidates later used these audio transcripts to analyze and reflect on the conversations. The details about the written analyses and reflections are as follows.
Written Analysis and Reflections after Each Conversation
Each written analysis and reflection was 4–6 pages in length and used the assignment prompts shown in Table 4.1. The goal of the assignment was to have candidates use the conversations and experiences to analyze and reflect on students' sense-making of the science phenomena. As they analyzed and reflected, candidates were guided by the prompts provided within the course assignment. It is possible that teacher candidates' analyses were limited by these prompts, and that the prompts may have shifted teacher candidates' attention to some aspects of the conversation more than others. The third sense-making (SM3) conversation was unique in that the assignment specifically asked teacher candidates to suggest adaptations and changes at the individual level for the students in their small groups, based on what the candidates had noticed and interpreted about students' sense-making of the phenomena.
Data Analysis
To analyze the recorded conversations and the written analyses and reflections, I leveraged Kang and Anderson's (2015) noticing and responding framework. Specifically, I examined the nature of the opportunities made available for student sense-making of the phenomenon. I coded the data for what teacher candidates noticed as they engaged students in sense-making, how they interpreted the substance of students' disciplinary thinking, and how they decided to respond.
Type of data source and purpose for coding:
Audio transcripts: opportunities for sense-making
SM1/SM2 analysis and reflections: opportunities, noticing, and interpretation of sense-making
SM3 analysis and reflections: responding to sense-making
Table 4.2: Data Sources and Their Coding Purpose
Coding of Audio-recorded Sense-making Conversations
The goal of analyzing the audio recordings was to understand the extent to which sense-making was happening in the conversations. Within the recordings, I identified 85 moments across the data, at least one from each teacher candidate, for further coding. The moments were chosen based on two considerations, as adapted from Russ and Luna (2013): a) how the idea was expressed by the student in the moment, and b) the role or purpose of the idea in that moment. Based on these considerations, I created three codes to capture the nature of the interaction: a) information exchange, when students were providing factual information to answer a "what" question or provide a definition; b) description, when students were describing something based on observations while manipulating an artifact or an experience related to a learning activity; and c) explanation, when students were articulating a cause and effect relationship underlying the phenomenon, grounding the discussion to explain how and why something was happening or to make predictions. See the coding scheme below in Table 4.3. The findings based on the analysis of the audio transcripts speak to the nature and extent of the in-the-moment opportunities that were made available to students for sense-making.
Coding Teacher Candidates' Written Analysis and Reflections around Sense-making Conversations
For this part of the study, I analyzed all reflections and sense-making conversations. I used initial categories from the noticing and responding framework (Kang & Anderson, 2015) to analyze all three conversations. Due to the nature of the prompts, I used the SM1, SM2, and SM3 conversations to code for sense-making opportunities. I coded SM1 and SM2 for noticing and interpretation of sense-making, and SM3 for candidates' responding (see Table 4.2). I used open coding to further characterize each category from the noticing and responding framework. For instance, the opportunities category comes from Kang and Anderson's (2015) framework. I open-coded for the nature of the opportunities provided by candidates for students' sense-making and developed the following codes (sub-codes): absence/presence of phenomena; intentions for sense-making based on discussion plans (information, descriptions, probing for phenomenon); and nature of driving questions (fact-based information, description, explanation, closed [yes/no or like/dislike, recall]). Similarly, I open-coded for the noticing, interpretation, and responding categories from Kang and Anderson (2015). The details of the coding scheme are in Table 4.3. Teacher candidates also reflected on their lessons and suggested what they might change or adapt in the future based on their analysis of the three sense-making conversations. I used two codes: a) specific changes, if the teacher candidate suggested specific changes addressing a certain aspect of student sense-making and how they could support it in the future, or b) generic changes, that is, largely logistical changes in sequencing, the structure of a lesson activity, worksheets, drawings, the procedure of the lesson, and/or adding or removing vocabulary content for future lessons (Table 4.3).
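Before turning to the full coding scheme in Table 4.3, it may help to picture the simple bookkeeping behind counts and percentages of the kind reported in the findings (for example, the proportion of discussion plans coded as probing around a phenomenon). The sketch below is purely illustrative: the coding in this study was done qualitatively by hand, and the record structure, code labels, and values shown here are hypothetical stand-ins, not the study's instruments or data.

    from collections import Counter

    # Hypothetical coded records: (conversation, code) pairs, one per coded
    # discussion plan. Labels mirror the coding scheme; values are made up.
    coded_plans = [
        ("SM1", "probing ideas about phenomenon"),
        ("SM1", "eliciting descriptions"),
        ("SM2", "seeking information"),
        ("SM3", "probing ideas about phenomenon"),
        ("SM3", "recalling facts"),
    ]

    # Tally each code overall and within each conversation.
    overall = Counter(code for _, code in coded_plans)
    per_conversation = Counter(coded_plans)

    total = sum(overall.values())
    for code, count in overall.most_common():
        print(f"{code}: {count} ({100 * count / total:.0f}% of coded plans)")

Such a tally is only the final arithmetic step; the substantive work lies in the qualitative judgments that assign each record its code.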
Category: Opportunities (eliciting and probing students' ideas/initial explanations about the phenomenon).
Code: Science phenomenon. Sub-codes: phenomenon/no phenomenon.
Code: Discussion plan. Sub-codes: information (factual); descriptions (based on observation); probing around phenomenon.
Code: Driving question. Sub-codes: fact-based information; descriptions; explanation; closed (yes/no, like/dislike, recall).
Code: Use of artifact. Sub-codes: hook; facilitate observations/explanations.
Code: Nature of exchange. Sub-codes: information sharing; elicitation of descriptions/observations; construction of explanations.
Category: Noticing (of students' sense-making about the phenomenon). Codes: emotions/gestures (displayed by students and related to sense-making); students' content knowledge; students' experiences (personal/school); student ideas about the phenomenon.
Category: Interpretation (of students' engagement in sense-making). Codes: prior knowledge; misconceptions by students.
Category: Responding (suggestion of changes in assessments and instruction). Code: changes in instruction (suggested by TCs). Sub-codes: task-based changes (linguistic, social, and logistical); conceptual changes (support for sense-making).
Table 4.3: Coding Scheme for Sense-making Conversations
Based on the above analysis, I discuss the findings from the study, revealing patterns in noticing and responding across the 23 teacher candidates and the three sense-making conversations. I organize the findings according to the research questions asked in the study. All questions are referred to by their designation as outlined in Chapter 1, Part 1: Sense-making Conversations; for example, question a is denoted 1a, question b 1b, and so on.
Opportunities for Sense-making
Research Question 1a: What opportunities do teacher candidates create as they engage students in sense-making conversations around science phenomena? To answer Question 1a, I examined teacher candidates' discussion plans for the conversations to code their intentions for the talks. I also coded the driving questions and the enactments (exchanges with students) of the sense-making conversations to identify the opportunities candidates provided for student sense-making by orchestrating these conversations.
a) Discussion plans
My analysis of the discussion plans indicates that at least 38 (55%) of the discussion plans intended to probe students for ideas around sense-making. Nevertheless, the other half of the candidates planned conversations that focused on gathering fact-based information about content topics and on describing observations not tied to a phenomenon (Table 4.4; Figure 4.1). Note that in Table 4.4 and Figure 4.1, several discussion plans were given more than one code (SM1 and SM2), so the total number of codes applied to the discussion plans exceeds the total number of conversations (23 x 3 = 69).
Teacher candidates' intentions for science talks (SM1, SM2, SM3; Total N):
Seeking information: 4, 12, 3 (19)
Eliciting descriptions: 7, 3, 3 (13)
Probing ideas about phenomenon: 13, 7, 12 (32)
Activating prior knowledge: 3, 2, 0 (5)
Recalling facts: 0, 0, 5 (5)
Total (N): 27, 24, 23 (74)
Table 4.4: Intentions for Sense-making Conversations according to Discussion Plans
Figure 4.1: Intentions for Sense-making Conversations according to Discussion Plans
b) Phenomena vs content idea for the discussion
Analysis of the sense-making conversations indicates that, in 38 (55%) out of 69 instances, teacher candidates were successful in articulating a phenomenon to guide the conversation (Table 4.5; Figure 4.2). At the same time, only one-third of the candidates aligned their driving questions to engage students in constructing explanations for the phenomenon. Almost half of the candidates designed their conversation around a content topic, for which they involved students in reproducing fact-based information. I tracked the 38 conversations in which teacher candidates had articulated a phenomenon for discussion and found that in at least 30 of them the candidates succeeded in having an exchange during the actual conversation that engaged students in constructing explanations about the phenomenon. Similarly, when I tracked the 31 conversations that teacher candidates structured around a content topic, 23 of them involved exchanges that looked more like an exchange of information.
Subject of sense-making conversations (SM1, SM2, SM3; Total N):
Science phenomenon: 11, 14, 13 (38). Examples: change in season; moon phases; change in landforms with water.
Content topic: 12, 9, 10 (31). Examples: things to do at the beach; classifying material by describing properties; things animals need to survive.
Table 4.5: Focus of Sense-making Conversations as Chosen by Teacher Candidates
Figure 4.2: Focus of Sense-making Conversations as Chosen by Teacher Candidates
Driving questions
From SM1 to SM3, teacher candidates' tendency to ask fact-based questions decreased, and they became progressively more successful in designing driving questions that engaged students in explaining the phenomenon (Table 4.6; Figure 4.3). Questions aligned with eliciting descriptions around the phenomenon increased. In such cases, candidates engaged students in describing observations around the phenomenon but did not involve them in analyzing those observations and probing the phenomenon. For instance, one candidate asked students to observe the weather and began with the driving question, "What have you noticed outside? Is it warmer? Colder?" when the intended goal for the conversation was to discuss the change in seasons.
Teacher candidates' driving questions for conversations:
Information (Total N = 25). Example: "How can we tell insects apart?"
Explanation (Total N = 26). Example: "Why do trees lose their leaves?"
Closed, like/dislike or yes/no (Total N = 7). Example: "How do you like the boat experiment today?"
Table 4.6: Nature of Driving Questions
Figure 4.3: Nature of Driving Questions
Nature of Exchange During Conversations
Research Question 1b: What is the nature of interactions between candidates and students during such conversations? To answer Research Question 1b, I analyzed teacher candidates' exchanges with students during the conversations. Of the 85 moments identified across the data, based on the nature and role of the ideas shared by the students, I identified 35 (41%) sense-making moments (Table 4.7; Figure 4.4). During sense-making moments, students engaged in thinking about the phenomenon, trying to offer some explanations for the phenomenon of focus.
When traced back to the driving question and phenomenon, the analysis indicates that most of the candidates who had successful sense-making moments, to varying extents, had a science phenomenon guiding the conversation. Such moments, however, lacked the probing that could lead to deeper and more meaningful discussions about the phenomenon, for example, asking how and why the things that led to the phenomenon were happening.
Teacher candidates' exchanges with students (SM1, SM2, SM3; Total N), with examples (TC: teacher candidate; S: student):
Information exchange: 15, 12, 6 (33). Example:
TC: We are going to talk about plants right now. Has anyone seen plants outside of their house? What kind of plants have you seen? What do they look like?
S: Purple green.
TC: Very good, what have you seen?
S: Desert.
TC: In Michigan?
S: There is cactus, because those are the kind of plants that grow in desert?
TC: Have you seen any plants in Michigan?
S: I have seen a flower, green yellow black.
TC: Can anyone tell me what plants need to grow?
S: They eat bugs, they suck the juice.
TC: What do you think then that plants need if they need to eat something?
Elicitation of observations: 7, 6, 4 (17). Example:
TC: So, you know when we filled out those lab notebooks and we were looking for patterns? Do you remember any patterns we found out?
S(a): Hardness.
TC: Hardness! Did the two objects [paperclip and popsicle stick] have the same hardness or different hardness?
All: Different.
S(b): Like I could break the, um, popsicle stick but I could not break paperclip. It’s too hard – I tried!
Forming explanations: 12, 11, 12 (35). Example:
S: Because the sun is hot, and we might get sunburned.
TC: How do you know the sun is hot?
S: Because sometimes when you stay in it too long you get burned.
TC: What do we need in order to protect us so we don’t get burned from the sun?
S: A house.
TC: Tell me more about how a house will protect us.
S: It has a roof and walls.
TC: *Shows picture of a house* Is this something that will protect us from the sun?
S: Yes, because it has a roof, four walls, and no open spots.
TC: *Shows picture of house with no roof* Do you think this would protect us from the sun?
S: No.
TC: Why do you think that this wouldn’t protect us from the sun?
S: Because it has no roof or windows or doors.
Table 4.7: Teacher Candidates' Exchange with Students
Figure 4.4: Teacher Candidates' Exchange with Students
For instance, consider the "forming explanations" interaction presented in Table 4.7. The conversation started as a good example of a cause and effect interaction. The candidate was trying to compare various structures that could provide sun protection.
The core idea to explore was that sunlight affects the Earth's surface, which the candidate had articulated in the plan. Some of the key ideas in play here are:
• Heat transfer: energy moves from one object to another when the two objects are at different temperatures.
• Energy moves out of higher-temperature objects and into lower-temperature ones, cooling the former and heating the latter. This transfer happens in three different ways: by conduction within solids, by the flow of liquid or gas (convection), and by radiation, which can travel across space.
• The processes underlying convection and conduction can be understood in terms of models of the possible motions of particles in matter (NRC, 2012).
The teacher candidate did not probe further for these ideas and moved on to a mode of gathering facts.
Teacher Candidates' Noticing of Student Sense-making
Research Question 1c: How do candidates notice and interpret student thinking around the phenomenon within these conversations? Many of the teacher candidates who did have a phenomenon planned for the unit were not successful in noticing student ideas related to the phenomenon. Teacher candidates' noticing remained limited to the content ideas and canonical information students were able to mention during the conversation (Table 4.8; Figure 4.5). Candidates analyzed the conversations with their attention mainly on what students did or did not know about the topic or phenomenon they had posed, rather than on how students were thinking about the phenomenon itself. As interns working in their placements, candidates' attention also remained on students' attitudes during the conversations and on the personal experiences students mentioned. It is possible that, because they were engaging with young children, candidates were inclined to pay attention to students' personal stories.
Teacher candidates' noticing (SM1, SM2; Total N), with examples:
Content knowledge: 15, 9 (23). Example: "They know that blood and the heart are related. They know the veins carry the blood throughout the body. They are unsure how oxygen is related. They are unsure about the physiology and anatomy of the heart."
Experiences: 2, 0 (2). Example: "I think they all have this understanding from books they have read or conversations in school. I feel like animals is a science topic that is often talked about with young children."
Student attitudes: 0, 5 (5). Example: "I came to get to know my focal students and their personalities. I learned that, for Ben and Jackson in particular, science is a subject that excites them and ignites in them the desire to share their ideas."
Student ideas about phenomenon: 6, 9 (15). Example: "During this sense-making, a student implied that they knew the wind blew in a certain direction and that the wind changes speed/strengths, but they never explicitly stated the pattern. They never said 'I learned that the wind blows in certain direction or that the wind always blows one way.'"
Table 4.8: Aspects of Student Ideas Noticed by Teacher Candidates
Figure 4.5: Aspects of Student Ideas Noticed by Teacher Candidates
Teacher candidates in the study struggled with the interpretation of student sense-making. In at least 30 instances, teacher candidates noticed a student idea related to student sense-making of the phenomenon.
However, despite identifying these ideas, teacher candidates did not interpret them at a deeper level, exploring what the ideas might mean for student sense-making of the phenomenon or what tangible evidence supported them. Most teacher candidates interpreted the student ideas about the phenomenon that they noticed as stemming from students' prior experiences, whether personal or school-based (Table 4.9; Figure 4.6). They did not always dig further to make connections between how these prior experiences might relate to students' ideas about the phenomenon. At other times, teacher candidates did make some superficial connections, relating the experiences to why students might think in a certain way but not extending this to how it might relate to their understanding.
Teacher candidates' interpretation (SM1, SM2; Total N):
Prior knowledge (personal experiences): 13, 18 (31)
Misconceptions: 2, 2 (4)
Table 4.9: Teacher Candidates' Interpretation of Student Sense-making around a Phenomenon
Figure 4.6: Teacher Candidates' Interpretation of Student Sense-making around a Phenomenon
Consider the following examples in which teacher candidates reflect on student ideas:
"Overall, I think that much of the knowledge these students hold regarding solids and liquids comes from their prior experiences in and outside of the class. I also think that these students were better able to discuss the differences of solids and liquids due to the very relatable picture I showed them, which consisted of ice cubes and a glass of water."
"I believe that my students are thinking about the warming effect of the sun on people and Earth's surface because they were basing their explanations off of the evidence they had gathered from their personal experiences of being in the sun, particularly at the beach, based on the pictures I was showing them."
The teacher candidates in the examples above are attributing students' ideas about the phenomena to what students may have seen in the media or in real life. It is valuable to understand students' everyday reasoning by making connections between how students think about science ideas and their related life experiences. However, we need to take a step further and go beyond this to analyze these ideas for what they mean for student understanding.
Teacher Candidates' Responding to Student Sense-making
Research Question 1d: How do they (TCs) plan to respond to student thinking through instruction? In the SM3 assignment, teacher candidates suggested changes to their lessons to better support student sense-making in the future. They suggested adaptations and strategies at the individual student level, for the 4-5 students in their group, and holistically for the whole experience of designing and implementing the sense-making conversations and lesson. All teacher responses were very generic in nature, pertaining to logistical changes in the activity, adding a worksheet, or clarifying vocabulary in the future. Teacher candidates' struggles to suggest changes that would support student sense-making of the science phenomena in the future were very noticeable in the study. Consider the following examples of candidates' suggested changes:
"I would have students write down their observations in a science notebook instead of on a worksheet so that they wouldn't lose track of it. I think implementing drawing would be very beneficial."
"If I taught this lesson again, I would allow students the opportunity to either explain in words and/or draw out their thinking process."
"It may be beneficial for (student) to have a visual aspect when learning. By including visual, auditory and kinesthetic elements to the lesson, Harry will be able to engage with the content in different ways."
The examples above indicate a pattern of generic responding among teacher candidates. The adaptations suggested in these examples are mainly structural in nature: writing down observations in a notebook instead of on a worksheet, having students show their thinking through drawings, and using visuals are changes in how the teacher candidate might approach the task in the future. Some generic changes can indeed be useful and valuable for future instruction around the science phenomena. However, at a specific level, we are looking for responding in which teacher candidates suggest changes that make the ideas or mechanisms related to the science phenomenon more accessible for students' learning. Such changes may effectively allow students an opportunity to figure out the phenomenon and enable them to construct a mechanistic explanation for it. Specific adaptations would include, for instance, the questions a teacher candidate might ask to support sense-making in the future, or how, based on what they now know about students' understanding of the phenomenon, they might clarify a certain idea through their teaching.
Discussion
Access to Students' Thinking
Teacher candidates in the study made efforts to ground their talks in a science phenomenon. Using a science phenomenon for the talk gave teacher candidates some success in shifting their attention to student thinking. Almost 50% of the teacher candidates in the study were successful in using phenomena for the conversations, and in 35 out of 85 moments they were successful in probing students for an explanation around the phenomena. Teacher candidates interpreted students' ideas about the phenomenon as originating from their prior knowledge. Such analysis also indicates candidates' attention to students' everyday reasoning. During sense-making moments, students presented their thinking about the phenomena and moved away from textbook language. Candidates in the study also tried to contextualize generic phenomena into real-life scenarios, for instance, seasons into "how we get fall colors," sunlight effects into "testing different structures that protect against sun," and sound and its effects by observing "how vibrations affect water and solids," and so forth. Teacher candidates had a range of success in using and contextualizing phenomena, but the use of phenomena did orient teacher candidates toward allowing students opportunities to describe observations and toward eliciting factors that play a role in the phenomena. At least 50% of the teacher candidates did struggle at various stages of planning and enacting sense-making conversations: in translating a content topic into a contextualized phenomenon, in having a driving question aligned with the phenomenon, and in engaging students in making sense of the phenomenon. These teacher candidates continued to involve students in information- and fact-based questions and conversations. The findings show a need for targeted learning opportunities to deepen teacher candidates' understanding of each step involved in the planning and implementation of sense-making conversations.
Attention to Mechanistic Thinking
One of the key goals of phenomena-based sense-making conversations is to engage students in articulating and clarifying their thinking about how and why phenomena happen. Constructing a complete explanation for the phenomenon is not the expected end product of a sense-making conversation; rather, sense-making conversations are meant to support the process of constructing explanations, wherein students unpack and engage in reasoning about the phenomena. Some teacher candidates in the study were successful in involving students in identifying cause and effect relationships within the phenomena, for instance, that sound makes matter move or that the sun causes objects to heat. However, teacher candidates in the study did not engage students in mechanistic reasoning around the phenomenon (Russ et al., 2009; Kuhn & Reiser, 2005). Various hypotheses may explain why teacher candidates did not attend to students' mechanistic reasoning around the science phenomena. One possible hypothesis is that teacher candidates perceive young children as not capable of such thinking. They limit them to phenomenon-based reasoning at a very naive level, consisting mainly of making observations about the world, either looking carefully at things or trying to see what happens (Newton, Driver, & Osborne, 1999). Second, as also noted by Metz (2011), typical science curricula give value to science "process skills" that have been deemed developmentally appropriate, like observation, measurement, and categorization. Such notions regarding what young learners can or cannot do in a science classroom, based on a view of developmental appropriateness, underestimate whether and to what extent they can engage in scientific inquiry. Studies show (e.g., Metz, 2004, 2011) that when provided with rich learning opportunities, young learners can also engage in scientific inquiry and successfully reason about phenomena. It is possible that teacher candidates in this study held similar perceptions about the ability of young learners to reason about phenomena, or perhaps they primarily think of science as involving these aspects. In either case, candidates' perceptions can have negative implications for the nature of the tasks designed for the talks. The talks may also be limited in the scope and depth of their probing around the phenomena. A third factor that may have affected teacher candidates' attention to students' mechanistic reasoning in this study could be their understanding of scientific explanation. It is important that candidates understand the various elements of a scientific explanation: How are these elements related? What might a partial versus a complete explanation of a phenomenon look like? How does one identify gaps in a scientific explanation? Understanding the key aspects of a scientific explanation may support teacher candidates in designing their sense-making conversations with clear goals and in understanding explanation as a process that develops over time. It is also crucial to modify teacher candidates' perceptions about young learners' abilities to engage in inquiry by presenting cases and examples that reveal the reasoning abilities of young learners.
The Ultimate Challenge: How to Use Student Ideas
Responding to student ideas in a manner where the suggested changes support and guide instruction toward effective sense-making opportunities remains a challenge for teacher candidates.
Studies show that even when teacher candidates notice and interpret student ideas, they often struggle to be responsive in their instruction (Gotwals & Birmingham, 2016; Kang & Anderson, 2015). Findings from this study support these conclusions from the research literature. Candidates in this study showed limited responsiveness even when they noticed and interpreted student ideas around the phenomenon. Teacher candidates in the study took on the ambitious task of planning and enacting sense-making conversations in their placements. In the whole process, responding was structured as the last step, which could have limited some teacher candidates in terms of the time available for the reflection and analysis of the conversations needed to respond effectively. Also, teacher candidates' interpretation of student ideas in this study was limited to students' everyday reasoning, which may not have given them a deep enough understanding of students' disciplinary thinking. As noted earlier, there is value in knowing students' everyday reasoning about why and how things happen; however, for productive responding, teacher candidates need to take a step further and think about how that everyday reasoning enlightens them about students' understanding of the phenomena and the existing gaps in their explanations and reasoning.
CHAPTER 5 HOW ELEMENTARY SCIENCE TEACHER CANDIDATES DESIGN FOR, NOTICE, AND INTERPRET STUDENT SCIENTIFIC SENSE-MAKING THROUGH WRITTEN ASSESSMENTS
The current chapter discusses findings that show how teacher candidates used assessments based on science phenomena as a means for supporting student sense-making. In particular, this chapter aims to address the following research questions: a) What is the substance of the assessments used by teacher candidates, and how did it allow opportunities for students' sense-making? b) What do teacher candidates notice and interpret as evidence of students' sense-making within students' responses to these assessments? c) What kinds of adaptations do TCs suggest for improving these assessments, and how do these changes relate to students' sense-making? d) Finally, if and how does teacher candidates' assessment design relate to how they notice and interpret student work in response to assessments? I analyzed 23 teacher candidates' assessments, their analyses of student work, and artifacts of student work. I present patterns in how teacher candidates used phenomenon-based assessments as a context for allowing students' sense-making. I discuss teacher candidates' noticing, interpretation, and responding to students' sense-making as they analyzed their students' assessment responses. I illustrate specific examples to describe these patterns in detail and discuss the relationship between assessment design and candidates' noticing and responding. Finally, this chapter discusses lessons learned regarding preparing teacher candidates for the productive design of assessments and the analysis of students' work to support sense-making. I specifically discuss the existing gaps within candidates' understanding based on the findings, and how we may address these gaps within methods courses.
Framework for Assessment Analysis
As discussed earlier, the following framework illustrates how I analyzed candidates' responsiveness in the context of the assessments. The assessment practice began with candidates planning an assessment item guided by a phenomenon. It is possible that some candidates chose assessment items suggested by their mentors.
Some of them may have chosen the item independently or adapted an already existing item based on program requirements. The next steps in the assessment practice included the candidate's analysis of and reflection on student work produced in response to the assessment. Taken together, these steps depict candidates' responsiveness toward student sense-making (adapted from Kang & Anderson, 2015).
Figure 5.1: Responsiveness through Assessments (substance of the assessment items: what opportunities exist for student sense-making; interpreting student responses: noticing and interpreting sense-making; suggesting changes in assessment and lesson plan: responding pedagogically)
Context of Assessments
This chapter presents analysis and findings related to teacher candidates' noticing and responding to student sense-making as they analyzed the student responses to the implemented assessment items. Most teacher candidates implemented assessments based on a science phenomenon at the end of their two-day unit plans. I also examined the substance of the assessments implemented by teacher candidates to determine the extent to which they allowed opportunities for student sense-making. The substance of the assessment had much to do with the presence of a phenomenon and the alignment of the assessment with it. Teacher candidates were provided opportunities within the methods course to learn about the use of phenomena for instruction and assessment and about formative assessment, and they participated in a three-hour workshop to unpack NGSS performance expectations into the three dimensions (disciplinary core ideas [DCI], scientific practices [SP], and crosscutting concepts [CCC]) in order to design a three-dimensional, NGSS-aligned assessment item for their unit (Figure 5.2). Teacher candidates read and analyzed examples of phenomena within the methods course. They identified NGSS performance expectations for their two-day lessons and narrowed them down to a lesson objective. In the assessment workshop, candidates examined these performance expectations across all three NGSS dimensions in small groups with their peers while receiving ongoing input from teacher educators. It is safe to say that designing assessments guided by phenomena was a program-specific effort; the mentors and curricula used in candidates' school placements may or may not have aligned with it, thus affording or constraining the whole process of assessment design and implementation.
Figure 5.2: Learning Experiences for Teacher Candidates
Sources of Data
There were two primary sources of data analyzed to address the research questions: a) the design of the 23 assessments implemented by teacher candidates at the end of their two-day lessons and b) teacher candidates' analyses of students' responses to these assessment items. Each candidate selected assessment responses from six students in their classroom to exemplify a range of student responses. Within their analysis, teacher candidates were prompted to notice and explain evidence of students' sense-making based on students' work in response to the assessment. Teacher candidates reflected on moments where, as they understood it, their students were engaged in sense-making. They further used evidence from student work to support their claims about student sense-making and to explain how and why they thought sense-making was happening. A supporting data source was the artifacts students produced as they responded to the assessment item.
In all, candidates conducted an in-depth analysis of students' sense-making based on the prompts provided within the methods course assignment.
Data Analysis
I analyzed the data in two stages guided by the responsiveness framework by Kang and Anderson (2015). I divided the data into 23 assessment episodes. Each episode comprises the assessment design implemented by a teacher candidate and that candidate's analysis of and reflections on six students' work (exemplifying a range of responses) in response to the assessment. As a result, I had in total 23 assessment episodes from 23 teacher candidates. I coded each assessment episode in two stages. First, I coded the design of the implemented assessments. Then, I analyzed teacher candidates' noticing, interpretations, and responding to student sense-making as documented in their analysis and reflections around students' responses to the assessment. Each assessment episode comprised six accounts in which a teacher candidate analyzed and reflected on a student's response to the assessment. As a result, I ended up with 138 (23 x 6) accounts of teacher candidates' analysis of student work. I treated each account as one unit of analysis to investigate teacher candidates' noticing. Teacher candidates suggested changes to instruction and/or assessment at the end of each assessment episode. Finally, I analyzed across assessment episodes, focusing on teacher candidates' noticing and responding, to examine the conditions under which some teacher candidates noticed, interpreted, and responded more successfully than others.
Coding Assessment Tasks
The first round of analysis included analyzing the assessment items implemented by teacher candidates. These items provided information regarding the potential for allowing student sense-making. I coded the assessment items for their substance and structure, which involved examining the following aspects: a) whether a phenomenon was articulated to guide the assessment, b) if and how the phenomenon was used to guide the assessment, and c) what students were involved in doing as they responded to the assessment. I defined a science phenomenon as an observable event that students could explore and construct explanations for, including explaining mechanisms for how and why the phenomenon happens (Penuel & Bell, 2016; Reiser, 2013). Keeping the scope of a two-day lesson in sight, I decided whether the phenomenon was at the core of the assessment based on whether candidates articulated a natural process or event that students could unpack through making observations and predictions, collecting and analyzing data, and constructing explanations using mechanistic reasoning. These assessments may or may not have integrated a scientific practice or crosscutting concept. Based on this analysis, the assessment tasks were coded as: 1) unproductive assessment with no phenomenon (closed, content driven, and limited to classification, describing information, or procedures such as labelling or circling the correct answer); 2) unproductive assessment in which a phenomenon was present but the assessment was not based on it (still closed, content driven, and limited to classification, description, or procedures); 3) phenomenon-based assessment that was still unproductive in nature; and 4) phenomenon-based productive assessment that prompted students to show reasoning, collect and interpret data, and construct a scientific explanation.
Table 5.1 below shows the abovementioned categories and related examples.
Table 5.1: Examples and Categories of Types of Assessments
Assessment Type | Phenomenon/Assessment Relationship | Assessment Tasks | Examples
Unproductive assessment | No phenomenon | Reproduce and recall fact-based information; classification; descriptions | "How can you describe two new solids based on the knowledge of the properties used to describe solids in previous lessons?"
Unproductive assessment | Phenomenon not aligned with assessment | Reproduce and recall fact-based information; classification | (example not recoverable from the original table)
Unproductive assessment | Phenomenon aligned with assessment | Descriptions; procedural responses | "Color in the picture that will offer you and your family the best protection from the sun and heat from the sun. Draw a structure that will offer protection to the dog below. Make sure that you include all of the essential components to your structure."
Productive assessment | Phenomenon aligned with assessment | Collecting observations; constructing explanations | PHENOMENON: Sunlight and its effects. "Students will draw what they observed on the playground outside in the morning and in the afternoon and color their drawing based on how they think the object felt related to the temperature of the object: blue = cold, green = cool, orange = warm, red = hot. Also, the students will indicate where they found the object by either coloring the ground gray if they found the object in the shade, drawing a sun if they found the object in the sun, or explaining where they found the object in words when asked individually. Thus, I will assess the students formatively by observing students as they conduct investigations to determine how sunlight affects the temperature of the objects that they touch."
Coding Teacher Candidates' Analysis and Reflections of Student Assessments
In the second stage of analysis, the goal was to investigate teacher candidates' noticing, interpretations, and responding to student sense-making. Recall here that I define student sense-making as a moment where students are engaged in making sense of the phenomenon—making predictions, asking questions, formulating hypotheses, constructing explanations, identifying cause and effect, and/or attending to the mechanisms that underlie the phenomenon. I used teacher candidates' analyses of each student's work as a unit and thus ended up with 138 (23 x 6) units for analysis. I coded for teacher candidates' noticing of the evidence of student sense-making. I also coded for how candidates articulated interpretations of why students responded the way they did. The following codes were developed for teacher noticing during the analysis process: a) students' use of prior knowledge, b) students' attitude (talking, being quiet, happy, excited, etc.), c) students' constructing explanations, and d) students' producing correct responses. Teacher candidates' responding entailed suggesting changes to the assessment at the assessment episode level. I coded candidates' suggestions as either generic changes (unproductive), meaning changes to the sequence of the lesson, adding or reducing content, additional scaffolds, etc., or specific changes (productive), meaning conceptual changes that targeted students' learning difficulties in some manner and/or created enhanced opportunities for student sense-making in the future, for instance, adding reasoning, alternative reasoning, or a phenomenon. Finally, I examined the relationship between assessment design and the nature of candidates' noticing, interpretation, and responsiveness. Table 5.2 details the codes, sub-codes, and related descriptions of codes that were developed based on my analysis.
These codes and sub-codes align with the research questions of the study. For instance, I used the code "substance of assessment" and sub-codes such as phenomenon, nature of assessment (open-ended/closed), mechanics of assessment, etc., to ascertain the opportunities teacher candidates provided for student sense-making. Based on the relationship between assessment design and candidates' noticing and responding, I categorized the teacher candidates in the study into four groups. I traced trends in candidates' noticing backward to see if and how these trends were related to assessment design. The categories represented if and how assessments used phenomena and what implications this had for candidates' noticing and interpretation of student sense-making. Analysis of the data indicates a range in candidates' assessment design and interpretation, from treating assessments as a way to assess correct or incorrect factual understanding, to noticing and repeating patterns in data, to looking for and noticing some aspects of students wrestling with ideas. Further, while teacher candidates were able to leverage phenomena for students' questions, they rarely asked students to engage in mechanistic reasoning about "how" or "why" phenomena occurred. I discuss the findings in the following section.
Table 5.2: Coding Scheme for Analysis of Teacher Candidates' Assessments
Category: Substance of the assessment (opportunities for eliciting and probing student ideas and initial explanations)
- Phenomenon: presence/absence of a phenomenon in the plan; if and how the assessment was grounded in the phenomenon
- Open-ended: asking for explanations and mechanisms underlying the phenomenon
- Closed: assessment centered on factual/canonical knowledge
- Scientific practice/crosscutting concept: engaging students in a scientific practice or crosscutting concept
- Procedural skill (mechanics of the assessment): engaging students in label/draw/circle responses
Category: Noticing and interpretation (analysis of student responses; noticing of when and how sense-making occurs)
- Sense-making: sense-making as the ability to reason, hypothesize, or construct causal explanations, as evidenced by analysis of responses
- Describing observations: sense-making interpreted as the ability to make and describe observations
- Interpreting prior experiences: students' leveraging of learning experiences cited as the source of sense-making; experience as the source of sense-making rather than evidence from analysis
- Inferencing: inferring and extrapolating student ideas based on students' work and responses
- Correct/incorrect response to assessment: sense-making as the ability to respond to the assessment partially or completely
Category: Responding (teacher candidate suggesting changes in assessment and instruction)
- Task-based changes: suggesting and addressing linguistic, social, and logistical changes in the assessment
- Conceptual need-based changes: suggesting changes in support of sense-making; addressing a conceptual idea for enhanced student sense-making through lesson adjustment
Patterns Across Teacher Candidates
I present the findings in four sections, reflecting on what I learned in analyzing the assessment tasks designed by teacher candidates and the ways in which teacher candidates noticed, interpreted, and responded to students' sense-making based on these assessments. In the last section, I present four illustrative examples of teacher candidates' noticing.
Assessment Tasks
Research Question 2a: What is the substance of the assessments used by teacher candidates; in other words, how do the assessments allow opportunities for students' sense-making?
About one third (7 out of 23) of teacher candidates had an assessment design that was based on a science phenomenon and open-ended enough to elicit student thinking in different ways (Table 5.3). Nine out of 23 teacher candidates did not have a phenomenon guiding the assessment design and thus ended up with an assessment that had no phenomenon and was unproductive in nature. Unproductive assessments mainly focused on recalling and reproducing content information or vocabulary discussed during the two-day lesson teaching. These unproductive assessments mainly asked students to label, draw, or circle the correct answers. Some (3) teacher candidates were successful in articulating a science phenomenon for the assessment but struggled to align their assessment with it. A few others (4) were successful in incorporating the phenomenon into the assessment, but only in a limited manner; therefore, the assessments designed by these teacher candidates remained unproductive in substance. Examples of assessments for each category are in Table 5.1.
Table 5.3: Categories of Teacher Candidates Based on Phenomenon and Substance of Assessment
TC group | TCs | Phenomenon | Phenomenon aligned to assessment | Substance of the assessment (open-ended/closed)
No phenomenon | JG, GK, EW, JH, MJ, NR, BL, LX, SC | no | no | unproductive
Phenomenon but not aligned to assessment | HR, KA, NW | yes | no | unproductive
Phenomenon aligned to assessment | RC, HL, AZ, MN | yes | yes | unproductive
Phenomenon present, assessment aligned (open-ended) | ST, SS, CE, AH, JK, AD, AR | yes | yes | productive
Noticing and Interpretation of Student Responses from the Assessments
Research Question 2b: What do teacher candidates notice and interpret as evidence of students' sense-making within students' responses to these assessments?
Recall that each of the 23 teacher candidates analyzed the work of six students in response to the assessment design they implemented in their classrooms. There was strong evidence that candidates' noticing and interpretation were closely connected with whether and how they used a phenomenon and with the extent to which they were able to use the phenomenon to guide the assessment. Most teacher candidates designed assessments that were content-focused and mainly engaged students in recalling and reproducing information and vocabulary related to science content. The structure of these assessments did not allow meaningful opportunities for students to show reasoning and construct mechanistic science explanations. The assessments mainly asked students to label, draw arrows, or follow a procedure. Teacher candidates who did not have a phenomenon guiding the assessment, and who used an unproductive assessment, mainly noticed student sense-making as a matter of behavior and attitude. They viewed students' talking, alertness, and ability to answer various parts of the assessment correctly as proxies for sense-making.
They repeatedly interpreted students' ability to engage in this form of sense-making as a matter of leveraging prior knowledge, whether from schooling or personal background. Teacher candidates engaged in limited interpretation because they could not gather many student ideas in the first place. Three teacher candidates had a science phenomenon guiding the assessment but continued to struggle to design an assessment that aligned with it. These candidates also ended up paying attention to students' attitudes; however, what was most characteristic of these candidates was their tendency to make extrapolated claims about students' understanding of the phenomenon based on students' responses. They frequently treated students' ability to follow procedures as evidence of sense-making. The assessments asked for classifications and descriptions, and success in doing so was treated as a process of student sense-making. Again, there were limited student ideas to notice and interpret. Some teacher candidates successfully used a phenomenon to guide the assessment; however, the assessment was still limited in its ability to elicit students' ideas regarding the phenomenon. Also characteristic of these candidates was their tendency to make extrapolated claims about students' understanding of the phenomenon based on students' responses, and they frequently treated students' ability to follow procedures as evidence of sense-making. Again, there were limited student ideas to notice and interpret. These assessments mainly used the phenomenon as a hook or an interesting scenario while still asking students to follow procedures such as drawing, circling pictures, or using arrows. Seven of the 23 teacher candidates in this study were able to use a science phenomenon to guide the assessment and to design an assessment productive enough to probe students to construct explanations, collect data and observations, and respond to part(s) of the assessment using those observations. Teacher candidates in this group noticed student ideas in relation to the phenomenon, which were mainly of a cause-and-effect nature. They engaged in richer analyses of student responses and provided evidence of student sense-making from students' work. Their interpretation involved discussing the learning opportunities, from the two-day lesson as well as within the context of the assessment, that led to supporting student sense-making.
Suggesting Changes to Assessment
Research Question 2c: How do teacher candidates use their understanding of students' assessment responses to suggest adaptations to future instruction?
Teacher candidates reflected on the design and structure of the assessment at the assessment episode level. Only three teacher candidates suggested changes to the assessment that were productive, in the sense that the change would potentially allow future sense-making opportunities for students. In most cases, teacher candidates struggled to suggest a productive response and could only offer generic adaptations, such as adding more content or vocabulary, or changing the sequence of the activity or the structure of the worksheets to ease transitions and/or comprehension. These were mostly structural changes that did not support students' sense-making.
Discussion of Illustrative Examples
In this section I discuss in detail a typical example illustrating each of the categories from the table above. These example cases demonstrate the results found in the larger study.
Example 1: Teacher Candidate with No Phenomenon and a Closed Assessment
The teacher candidate in this case designed an assessment for first-grade students. The two-day lesson intended to focus on the following NGSS performance expectation (PE): 2-PS1-1. Plan and conduct an investigation to describe and classify different kinds of materials by their observable properties. (Clarification Statement: Observations could include color, texture, hardness, and flexibility. Patterns could include the similar properties that different materials share.) For the enactment of the lesson, the candidate articulated the following emphasis. LESSON FOCUS: This lesson will focus on having students a) observe two new solids, conduct tests, and record observations and b) discuss and compare observations and test results. For the assessment, the teacher candidate did not articulate a phenomenon aligned with the PE. The implemented assessment mainly focused on classification and description of the materials provided (Figure 5.3).
Figure 5.3: Assessment Item
In relation to the opportunities for sense-making, the assessment was limited, as it was closed-ended as well as somewhat vague. The substance of the assessment did not allow students to engage in reasoning or in constructing explanations; it mostly involved observing and describing the characteristics of different objects. The part of the assessment concerning choosing a material to build a house was a little vague, and the mechanics of the assessment mainly asked students to fill in boxes. The evidence of sense-making noticed by the teacher candidate mainly included that students were able to follow procedures, observe, and categorize.
This student was engaged in sense making throughout the lesson. She used the given resources appropriately to successfully test the two new solids. She observed the solids, conducted the appropriate tests, recorded her observations, and had thoughts to add to the discussion about similarities and differences.
The candidate interpreted student responses to the assessment by mainly focusing on students' descriptions of objects and on what they could and could not answer.
He filled out the entire observation sheet with thoughtful and reasonable answers. For one box in the observation sheet, he said the paper clip was soft. I do not think this is an ideal answer, however comparatively to the block he may have concluded it was not as hard, so I still accept that answer as reasonable for showing understanding.
Figure 5.4 provides samples of student responses.
Figure 5.4: Samples of Student Work
The candidate suggested generic adaptations/changes to the assessment. Generic changes do not have the potential to enhance student sense-making in the future. For instance, the candidate suggested:
After reviewing all of the responses I got on my assessment there are a few things I may change to get a better picture of the students' progress towards mastering the learning goals. One thing would be to provide a picture or visual next to each of the properties on the observation chart as a scaffolding.
Example 2: Teacher Candidate with a Phenomenon Not Aligned with the Assessment
The second example illustrates a case where the teacher candidate had a phenomenon but could not use it to guide the assessment. In this example: PERFORMANCE EXPECTATION: K-PS2-2. Analyze data to determine if a design solution works as intended to change the speed or direction of an object with a push or a pull.
The teacher candidate articulated the following phenomenon and driving question. PHENOMENON: Bigger pushes/pulls make things go further and vice versa. DRIVING QUESTION: How do we move things? The phenomenon focused on engaging young learners in exploring how force relates to the distance moved by objects. The assessment, however, was closed and only asked students to circle the case in which it was easier to push or pull. The assessment was also somewhat vague in eliciting student responses (Figure 5.5).
Figure 5.5: Assessment Item
Figure 5.6 provides illustrations of students' work in response to the assessment.
Figure 5.6: Samples of Student Work
As evidence of student sense-making, the teacher candidate often discussed students' behaviors and attitudes and made unsubstantiated claims about students' understanding of the phenomenon.
[Student] was very engaged. She did not necessarily speak a lot, but she was attentive. During the second sense-making, she was a little bit more distracted.
He used the world "slide" instead of "pull," which led me to believe that he understood the basic idea, but simply wasn't use the vocabulary I taught.
The above quotes reveal the teacher candidate's attention to content, vocabulary, and student attitudes; there is no discussion of student ideas related to the phenomenon. At times when the teacher candidate did pay attention to student ideas within their drawings, they ended up making assumptions about what students did or did not understand about the phenomenon. The following examples show how the candidate interpreted student responses:
He seemed to understand the basic concept of a pull not happening when there are opposing forces but did not have the language yet to express that completely.
He started explaining that in the top left picture, the "fridge" (as he interpreted it) was very heavy and would therefore be more difficult to push. This showed me that he understood that there were better ways to push or pull things based on the situation.
In each of the prior quotes, the candidate made extrapolated claims about students' understanding of the phenomenon. The assessment did not elicit many ideas about student understanding, and the candidate inferred a great deal from students' drawings, which did contain some information but could also be a source of subjectivity. In relation to responding, the candidate could not suggest any specific adaptation to the assessment that would effectively leverage student understanding for a hypothetical next enactment.
I would try to create an assessment that would more clearly elicit evidence of the students' science understanding. I think my assessment confused some of my students and they got caught up in how to complete it, and therefore their responses did not truly show their understanding of the science content of my lesson.
The suggested adaptation was mainly to make the structure of the assessment less confusing next time, but the how and why (the rationale) for the intended change was not discussed, making it an unproductive response.
Example 3: Phenomenon Guided the Assessment but the Assessment Was Still Closed
The third example is representative of a group of teacher candidates who could articulate a core science phenomenon for the assessment but designed an assessment that was still closed-ended. In the case discussed here, the teacher candidate chose flooding as the phenomenon for the lesson.
The chosen phenomenon also aligned well with the following NGSS performance expectation selected for the lesson: 5-ESS2-1. Develop a model using an example to describe ways the geosphere, biosphere, hydrosphere, and/or atmosphere interact. [Clarification Statement: Examples could include the influence of the ocean on ecosystems, landform shape, and climate; the influence of the atmosphere on landforms and ecosystems through weather and climate; and the influence of mountain ranges on winds and clouds in the atmosphere. The geosphere, hydrosphere, atmosphere, and biosphere are each a system.] However, the assessment implemented (Figure 5.7) was closed and read mostly like a reading-comprehension exercise with closed prompts, which included mostly "what" questions.
Figure 5.7: Assessment Item
The candidate also asked students to draw a flooding scenario, based on the learning experiences during the lesson, as a part of the assessment. Artifacts showing flooding scenarios produced by students are shown in Figure 5.8.
Figure 5.8: Samples of Student Work
The candidate in this scenario did not provide opportunities for students to produce a causal explanation of the phenomenon of flooding. Therefore, the teacher candidate tried to notice and infer student understanding based on the drawings. The drawings were not accompanied by prompts for reasoning, and the teacher candidate made some unsubstantiated conclusions (inferencing) about students' sense-making based on them. For instance, the teacher candidate inferred:
Flood water seemingly flowing into house and carrying away people, this shows knowledge of how strong the water flow can be and recognition of damage that can occur.
The drawing did not provide strong evidence to support the candidate's conclusion that the student had knowledge about the force of water during flooding and how it could affect landforms. Also, similar to the teacher candidates in group 1, the teacher candidate in this case noticed attentiveness and the ability to ask questions as acts of sense-making.
The student was asking clarifying questions to other students at the table and was attentive in watching the demonstrations.
The teacher candidate interpreted students' sense-making as an ability to leverage personal experiences and the learning opportunities provided during the lesson. However, they did not specifically note the evidence and understandings students drew from those experiences to engage in sense-making in the context of the assessment.
This student seemed to be engaged in sense making through the worksheet and what he had read. When producing the drawing it was clear that he had utilized the worksheet and a fact that he had gained from it. The nature of his ideas seemed to stem from the video as well as how we had discussed living by a riverbank.
Example 4: Phenomenon-Based and Open-Ended Assessment
The fourth example illustrates the case of a teacher candidate who was successful in articulating a phenomenon and planning an assessment that provided a potential context for student sense-making of the science phenomenon. The teacher candidate presented here used the following NGSS performance expectation for the lesson: 1-PS4-1. Plan and conduct investigations to provide evidence that vibrating materials can make sound and that sound can make materials vibrate. The lesson primarily focused on students making predictions about what the waves they would see would look like and then recording what they actually saw.
The lesson and the assessment were grounded in the science phenomenon of how sound affects matter. The teacher candidate provided students with various experiences for observing sound waves traveling through a medium and prompted them to predict and then record their actual observations, sharing their thinking about how sound may affect matter. The teacher candidate provided concrete evidence of student sense-making by frequently referring to students' ideas about the phenomenon as expressed in their assessment responses. The teacher candidate consistently analyzed these ideas to draw conclusions regarding students' understanding of the grounding phenomenon.
This student was engaged in the sense-making activity because she was using the water bottles to show us what she had learned within the experiment and what she had did. She showed us how the water moved and how you could see and feel that the water bottle was moving when sound was applied.
This student was engaging during the sense-making because she took what she had learned from the lesson and applied it to what she would learn in the future. She made the question to say is there an easier way to see that things move in the air? So this makes me think that she is thinking outside of the box and that she is thinking about how to extend her knowledge.
I know that this student understands what happens when sound is applied to a state of matter because he said that that state of matter moves.
The above quotes from the teacher candidate's analysis of and reflection on individual students' responses to the assessment reveal their attention to students' ideas around the phenomenon. The teacher candidate explained how students were using the investigative experience completed in the classroom during instruction to make sense of the phenomenon and make their predictions. In this case, the teacher candidate also noticed students' ability to generate questions based on the learning experience as evidence of sense-making. Two students' responses to the assessment in this case are shown in Figure 5.9.
Figure 5.9: Samples of Student Work
Although the teacher candidate did allow opportunities for and noticed student ideas around the science phenomenon, the assessment did not probe or provide scaffolds for students to express their mechanistic thinking. Attention to mechanistic thinking, that is, reasoning about how and why things happened, was not foregrounded in the assessment item. Similar to other teacher candidates in the data, the candidate in this particular example also struggled to be productively responsive based on their noticing. The teacher candidate mentioned:
I will change my assessment, I would have the students fill out a worksheet with the same questions before the lesson to see what they know and then fill it out after to see if anything changes. I would do this so I could actually see if this is what students are learning from the lesson or if they are just filling out answers at the end just to be done.
The suggested adaptation was mostly generic, and it was not clear how the change would target any specific concern about students' understanding of the topic or support their sense-making if the same assessment were implemented with the suggested changes. The examples above present typical cases of teacher candidates' design and use of phenomenon-based assessments for noticing and responding to student sense-making.
There was a range in how teacher candidates were able to use assessments for creating opportunities for, noticing, and responding to student sense-making. To begin with, teacher candidates were able to use phenomena for assessments to varying extents; in some cases, assessments remained content-oriented and emphasized procedural knowledge, while in others, teacher candidates had a phenomenon but did not translate it productively into assessments that engaged students in making sense of the mechanisms underlying it. Teacher candidates who did not use phenomena at all, or who did not translate phenomena into an assessment, ended up with limited evidence of students' sense-making to notice and interpret. In these cases, teacher candidates were limited to noticing what was missing, what was included, and what was correct or incorrect within student responses. Also, in such cases teacher candidates frequently noticed student behavior—talking, being active, etc.—as a proxy for student sense-making. They offered generic (Kang & Anderson, 2015) suggestions for changing assessments in the future. Only rarely did teacher candidates use the phenomenon to effectively notice students' ideas, reasoning, and ability to construct explanations as evidence of sense-making, and even these candidates did not allow opportunities for students to figure out and explain the mechanisms underlying phenomena (Russ et al., 2009). Teacher candidates often struggled to use their analysis to suggest changes that might enhance student sense-making through assessments.
Discussion
This study examined 23 teacher candidates' noticing and responding to student sense-making as they engaged in assessments guided by science phenomena. Based on the findings, I discuss how we can potentially prepare well-started teacher candidates for noticing and responding to student sense-making. The first effort is to ensure that teacher candidates have opportunities to prepare productive phenomenon-based assessments. The second is to prepare teacher candidates to pay attention to students' mechanistic thinking and to interpret student ideas around phenomena.
Layers of Challenge
One of the most important findings of the study is that teacher candidates' noticing and interpretation of student responses were closely related to their assessment design. Candidates engaged in effective noticing and interpretation of student responses when they used productive phenomenon-based assessments. Teacher candidates in the study faced two layers of challenge: 1) grounding the assessment in a phenomenon, and 2) using the assessment as a means of learning about student sense-making of the phenomenon. Using a science phenomenon as the core of instruction and assessment is an ambitious step for teachers (Pellegrino, Wilson, Koenig, & Beatty, 2014; Reiser, 2013). We know that open-ended assessments (Furtak & Ruiz-Primo, 2008; Gotwals & Birmingham, 2016; Kang & Anderson, 2015) support access to a repertoire of student ideas and make those ideas available for analysis and interpretation of students' understanding. By engaging students with phenomena in the context of assessments, we can gather their understanding of how and why events happen. Such assessments can provide an alternate, richer context for students to apply what they learned during instruction (Windschitl et al., 2012).
The program in this study promoted and provided opportunities for teacher candidates to learn about phenomenon-anchored instruction, with an emphasis on planning science talks and assessments around phenomena. Most teacher candidates in the study struggled to articulate a phenomenon and use it to guide the assessment. The assessment design, in turn, limited their noticing and interpretation of sense-making to students' attentiveness, to observing events only to describe characteristics, and to extrapolated inferences that made unsupported claims about student learning. While some teacher candidates were successful to an extent in designing and using phenomenon-based assessments, they still struggled to achieve a design that elicited students' thinking about the mechanisms underlying the phenomenon. Several possible reasons may explain the range in teacher candidates' success and struggle with using assessments to notice and respond to student sense-making. First, candidates used these assessments in their placement classrooms, and there is a chance that the assessments were adopted from or influenced in design by their mentors. Second, teacher candidates may have held onto dominant notions of assessment as a way to evaluate what students "know or don't know." They may have encountered such notions during their "apprenticeship of observation." That most teacher candidates struggled to make productive suggestions for assessment design based on their noticing and interpretation supports this explanation. Teacher candidates often made suggestions that would enable them to get correct responses from students if implemented in the future. Only three teacher candidates in the study showed productive responsiveness. It is also possible that, for teacher candidates, these assessments were a reasonable start toward engaging students in sense-making, as more than half of them (14 out of 23) were successful in articulating a phenomenon that they intended to use to guide the assessment.
Helping Teacher Candidates Pay Attention to Students' Mechanistic Thinking
It was clear in the study that, through their assessment designs, teacher candidates created a range of opportunities for students to engage in sense-making. For instance, as detailed in Example 1, the teacher candidate only asked students to follow a procedure and sort different objects, while in Example 4, the teacher candidate used an experiment as the context for the assessment and probed students to show how matter behaved as sound traveled through it. However, the same teacher candidate did not probe students to explain or show what could be happening at the microscopic level, which would have elicited their thinking about mechanisms as sound passed through matter. All seven teacher candidates in the study who designed a productive phenomenon-based assessment elicited some aspects of cause-and-effect explanations from students but did not probe further for the mechanisms underlying the phenomena in question. One probable reason could be the common notion among teachers that young learners in the elementary grades cannot engage in scientific explanations. We know, however, that when provided with the opportunity, even young learners can engage in mechanistic thinking (NRC, 2007; Metz, 2011). It is important that teacher candidates overcome such traditional notions about the young learners they teach.
The findings of the study reveal that teacher candidates need scaffolding at various stages of assessment design: first, in articulating a phenomenon; second, in designing an assessment grounded in the phenomenon; and third, in attending to an assessment design that allows students to explain and engage with the phenomenon in depth, allowing for discussion of the reasoning and mechanisms underlying it. It is crucial to support teacher candidates' learning of these stages coherently, so that assessment and instruction align to truly achieve the NGSS goal of supporting student sense-making of science phenomena.
CHAPTER 6
DISCUSSION AND IMPLICATIONS
To meet the vision of student learning set forth in the NRC framework (2012) and the NGSS (2013), teachers should be prepared to notice, interpret, and respond to student thinking to promote students' sense-making of science phenomena (Duschl & Bybee, 2014; Reiser, 2013; Wilson, 2013). Teacher educators need guidelines and research that can help them design a curriculum to prepare such teachers. Research studies around science teacher noticing and responding are expanding, particularly at the secondary level (e.g., Barnhart & van Es, 2015; Gotwals & Birmingham, 2016; Kang & Anderson, 2015; Talanquer, Bolger, & Tomanek, 2015). While scarcer, studies of elementary science teacher candidates' noticing and responding practices are also growing (e.g., Benedict-Chambers & Aram, 2017; Luna, 2018; Russ & Luna, 2013). The findings from the current dissertation can inform the work of teacher educators who aim to prepare teacher candidates for the noticing and responding practices needed to support student sense-making. This study offers insight into teacher candidates' teaching practices in an authentic context. The candidates enacted science talks and implemented assessments with their students in actual school contexts. Therefore, the focus and context differ from many other teacher candidate studies that primarily focus on measuring teacher knowledge or self-efficacy. In this study, candidates had access to elementary students' thinking and ideas around science phenomena, which they later analyzed and reflected on, thus revealing their noticing, interpretation, and responding patterns. The nature of the data and its analysis also revealed candidates' understanding of the notion of student sense-making. For example, teacher candidates' noticing signified moments when they understood student sense-making to be happening. Many teacher candidates in the study noticed students' content knowledge regarding the phenomena, which may have meant that they believed students were sense-making when the students were, in fact, producing content ideas. To pursue the sense-making goal, candidates must move beyond a focus on students' recalling and producing content. It is critical that teacher candidates pay attention to how students understand, for example, how moon phases occur, and not just to whether students know the names of the moon phases. In other words, the study findings help us understand where teacher candidates are in their "doing" to promote student sense-making. The findings also help us see what is missing in teacher candidates' practices if we want them to support sense-making in line with the NGSS. The results from the current dissertation align with and add new perspectives to the existing research literature around teacher noticing and responding in science classrooms.
For example, the current study indicates that the nature of the task used for eliciting students' understanding of science phenomena (e.g., whether or not the task includes a contextualized phenomenon to "figure out") had implications for teachers' noticing and responding. Candidates who were successful in choosing a phenomenon for assessments and talks were often able to elicit students' explanations and produce evidence of their sense-making. The findings discussed here support those of other studies that have found a relationship between the opportunities created for eliciting students' thinking and teacher noticing in the same context (Barnhart & van Es, 2015; Gotwals & Birmingham, 2016; Kang & Anderson, 2015; Talanquer et al., 2015). The findings of this dissertation also illustrate a range of noticing and interpretation among teacher candidates, from simple forms such as noticing students' content knowledge to more complex forms such as noticing students' cause-and-effect explanations around the posed phenomenon. Other research studies discuss similarly simple and sophisticated forms of teacher noticing and interpretation of student ideas (Luna, 2018; Talanquer et al., 2015). However, it is important to acknowledge that views on simple and sophisticated forms of noticing and interpretation can differ among studies due to each study's stance on what matters most for students' learning and what teachers should do to support such learning. The current dissertation aligns with the NGSS perspective on student learning, which values student sense-making as a central goal of any scientific inquiry. Therefore, while other studies may consider, for instance, teachers' noticing of students' describing observations as a sophisticated form of noticing, in this study it is considered novice because it is still far from the ultimate aim of moving teachers' noticing toward students' reasoning and mechanistic explanations regarding the phenomena. In that sense, the range in teacher candidates' noticing and interpretation revealed in this study is unique and better suited to the work of teacher educators who want to prepare candidates for the noticing and responding that can support the aim of students' sense-making in science classrooms. In addition to the rich context for teacher noticing and the data illustrating the range of teacher noticing that this study provides, teacher educators can use the rubrics developed in this study to evaluate how teacher candidates notice and respond, what is missing in their practices, and how those aspects can be advanced to promote desirable practices among candidates. For example, the rubrics can allow teacher educators to diagnose the nature of teacher candidates' interpretation as they examine student ideas. The study revealed a range in candidates' interpretation, from not evaluating student ideas at all to examining them only for missing and correct ideas. Some candidates evaluated students' ideas as emerging from prior knowledge without elaborating on what this meant for student understanding. Other examples of interpretation involved teacher candidates inferring students' understanding from their work without producing tangible evidence. More sophisticated forms of interpretation involved analyzing and discussing examples from students' work.
While the current study reinforces the value of using science phenomena for student sense-making, it also illustrates that the presence of a phenomenon alone may not be sufficient to prompt sense-making. The use of science phenomena needs to be accompanied by effective reasoning-based questioning that can help students unpack the phenomena and uncover their mechanistic thinking. Studies (Krist, Schwarz, & Reiser, 2018; Larkin, 2017) have shown the value of using contextualized phenomena to effectively elicit students' disciplinary thinking and to draw on their cultural and everyday experiences. In their study with science teachers, Kang, Thompson, and Windschitl (2014) found the use of contextualized phenomena to be a critical scaffold for assessment tasks that engage students in scientific explanations. Along the same lines, candidates in this dissertation who were successful in choosing a phenomenon were often able to gather students' disciplinary thinking only if they also used questions that helped students think through the phenomenon-based task. There were examples in the data of candidates succeeding in choosing a phenomenon but struggling to present it with aligned, productive questioning, which constrained their ability to elicit students' sense-making and eventually limited their noticing and interpretation. It is well understood in the research literature that teachers struggle with designing productive questions (Chin, 2006; Van Zee et al., 2001; Whitby, 1992). Teachers in science classrooms often engage in unproductive or low-level questioning, such as asking students to recall information and facts or posing questions containing vocabulary inaccessible to students. At the same time, studies also show that productive or higher-order cognitive questions elicit students' ideas, stimulate their thinking, and are conducive to inquiry-based science lessons that support student sense-making (Almeida & Neri de Souza, 2010; Chin, 2006; Oliveira, 2010; Van Zee et al., 2001). This dissertation showed how teacher candidates struggled to design productive questioning aligned with science phenomena. As a result, teacher questioning influenced the nature and repertoire of students' ideas that teacher candidates were able to notice and interpret around the phenomena (Benedict-Chambers & Aram, 2017; Luna, 2018). Like many other studies (Hutchison & Hammer, 2010; Kang et al., 2014; Metz, 2009), the current dissertation reveals teacher candidates' struggle to involve students in constructing scientific explanations with a focus on mechanisms. Analysis of sense-making moments across both studies in this dissertation revealed that although some candidates were successful in encouraging students to share their explanations regarding the phenomenon, they did not attend to the mechanistic aspects of those scientific explanations. I hypothesize that most candidates lacked understanding of the meaning and significant aspects of a scientific explanation and of the role that mechanism plays in those explanations. In addition to struggles with attending to mechanisms, inadequate instructional time could also be a hurdle to teacher candidates engaging with students in exploring science phenomena at a deeper level. Finally, like other studies, this dissertation found that teacher candidates struggled to respond productively to students' sense-making (Barnhart & van Es, 2015; Gotwals & Birmingham, 2016; Kang & Anderson, 2015; Talanquer et al., 2015).
Teacher candidates engaged only with general strategies for responding, such as adding more content to a previous task or introducing a reading or writing scaffold. There can be various reasons for teacher candidates' struggle with responding to student sense-making. One possible reason may be that they possessed a limited repertoire of noticed and interpreted student ideas to which to respond. Also, responding to student ideas is traditionally considered the last pedagogical step, which probably makes it more challenging for candidates within a methods course to address.
Implications
Most of the implications discussed here relate to developing the practice of noticing and responding among elementary science teacher candidates within science methods courses to support the goal of student sense-making. A three-dimensional approach to student learning (NRC, 2012), with sense-making at its core, is still a very new idea for methods courses and teacher educators to grasp. It will require a rethinking and redesign of methods course curricula to make them more supportive of teacher candidates' learning to notice and respond to student sense-making. To begin with, teacher candidates need a deeper understanding of the concept of student sense-making. Many examples from this dissertation show that candidates take a limited view of what sense-making is and what it looks like when students are engaged in it. They often limit sense-making to student talking, communicating fact-based information, and describing what they observe. An opportunity to analyze examples of student sense-making in action can help orient candidates to the process, observable aspects, and outcomes of student sense-making. Seeing the influence of sense-making on various aspects of student learning may also address some of candidates' perceptions regarding the reasoning abilities of young students (Metz, 2009, 2011). For teacher candidates who are new to a methods course, the NGSS can be a complex document to use for planning standards-based instruction. The document can be a challenge because of its novel vision for student learning and the paradigm shifts it suggests, which set contemporary science teaching apart from traditional forms of science teaching. Further, there is limited instructional time allotted within the placement, along with other contextual constraints such as institutional goals for science teaching and learning and mentors' goals for their classrooms. Methods courses need to take account of these realities and help candidates find achievable and realistic goals for their instruction without losing sight of the basic tenets of the course. It is also essential to model, in the methods course, what one should notice in students' interactions with science phenomena and why, because candidates may have little or no chance of observing this in their placements. There is a growing literature that supports the use of rehearsals and tools to develop specific aspects of candidates' practice (Davis, Kloser, Wells, Windschitl, Carlson, & Marino, 2017; Kang et al., 2014; Larkin, 2017). Candidates need scaffolds to develop a line of open-ended questioning that elicits students' thinking about mechanistic aspects of the phenomena (Hammer & Van Zee, 2006).
The NRC (2012) framework and the NGSS (2013) require teachers to go beyond the general notion of causality (X causes Y) and instead involve students in thinking about the sequence or process that explains how X brings about Y (Russ et al., 2009). Teacher candidates must understand what to notice, listen for, and probe for as they engage students in constructing mechanistic accounts of science phenomena. Scaffolds can potentially guide candidates to design questions focused on "how" and "why" rather than simply using explicating questions that ask students to describe observations of the phenomenon (Benedict-Chambers & Aram, 2017). Contextualizing generic phenomena may also afford sense-making because it increases the accessibility of the task for students. Contextualized science phenomena can prompt students to express their thinking, and students may draw on everyday language and reasoning to explain them. Nonetheless, to support teacher candidates' attention to students' mechanistic thinking, it is essential to make such attention an explicit goal of their learning activities, reflections, and analyses. The assignment templates used in this study limited teacher candidates and did not scaffold their attention to students' mechanistic thinking during sense-making. To improve teacher candidates' noticing and responding, it is important to shift their attention to students' mechanistic explanations of phenomena. Teacher candidates should have learning opportunities to explore and critique examples of science phenomena and associated mechanistic scientific explanations. To improve teacher responding, teacher educators may use scaffolds to help teacher candidates evaluate and critique the various steps of their instructional planning and the related outcomes for student learning. For instance, candidates may analyze and critique their assessment design and examine its relationship to student responses. The idea is to help candidates understand the relationship between the planning and enactment of instruction and related aspects of student learning so that they can effectively identify what to respond to and how.
Future Directions
The current dissertation study provides valuable evidence regarding candidates' noticing and responding practices to support sense-making. It contributes to the research base around preparing elementary science teacher candidates for NGSS-aligned instruction (Hanuscin & Zangori, 2016; Luna, 2018; Reiser, 2014). It does so by illustrating the importance of phenomena-based assessments and talks in helping teacher candidates notice sense-making; it illustrates a range of teacher noticing regarding sense-making and points to some features of that range. Further, the study highlights some opportunities and struggles of teacher noticing, such as having teacher candidates follow up on their interactions with students to probe students' mechanism-related reasoning and explanations of phenomena. Finally, candidates in this study had very few substantive ideas for how to respond to students even when they were able to notice and probe deeper reasoning in their students. Given its importance to reform-based science instruction, it will be important to conduct future research on teachers' noticing and responding.
For example, such work should investigate the recommendations from this study, including providing teacher candidates with additional scaffolding regarding the nature of sense-making, the importance of phenomena and of mechanistic explanations of phenomena, planning, and opportunities to see examples of noticing, interpreting, and responding to mechanistic reasoning and explanations of phenomena. Additionally, longitudinal investigations that span from teacher preparation into the first few years of teaching could use the findings of this dissertation as an initial basis for determining how best to support teachers throughout their professional careers. Finally, studies of elementary science teacher candidates in methods courses that accumulate evidence of candidates' practices and how these change over time will aid ongoing efforts toward the reform-based vision of the NGSS. Doing so will help teacher educators and teacher candidates alike in meeting these critical goals.

REFERENCES

Almeida, P., & Neri de Souza, F. (2010). Questioning profiles in secondary science classrooms. International Journal of Learning and Change, 4(3), 237-251.
Anderson, R. D. (2002). Reforming science teaching: What research says about inquiry. Journal of Science Teacher Education, 13(1), 1-12.
Barnhart, T., & van Es, E. (2015). Studying teacher noticing: Examining the relationship among pre-service science teachers' ability to attend, analyze and respond to student thinking. Teaching and Teacher Education, 45, 83-93.
Barton, A. C., & Tan, E. (2009). Funds of knowledge and discourses and hybrid space. Journal of Research in Science Teaching, 46(1), 50-73.
Benedict-Chambers, A., & Aram, R. (2017). Tools for teacher noticing: Helping preservice teachers notice and analyze student thinking and scientific practice use. Journal of Science Teacher Education, 28(3), 294-318.
Berland, L. K., & Reiser, B. J. (2009). Making sense of argumentation and explanation. Science Education, 93(1), 26-55.
Beyer, C. J., & Davis, E. A. (2008). Fostering second graders' scientific explanations: A beginning elementary teacher's knowledge, beliefs, and practice. The Journal of the Learning Sciences, 17(3), 381-414.
Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32-42.
Brown, S. L., & Melear, C. T. (2006). Investigation of secondary science teachers' beliefs and practices after authentic inquiry-based experiences. Journal of Research in Science Teaching, 43(9), 938-962.
Cazden, C. B. (2001). Classroom discourse: The language of teaching and learning (2nd ed.). Portsmouth, NH: Heinemann.
Chin, C. (2006). Using self-questioning to promote pupils' process skills thinking. School Science Review, 87(321), 113-119.
Davis, E. A., & Smithey, J. (2009). Beginning teachers moving toward effective elementary science teaching. Science Education, 93(4), 745-770.
Davis, K. S. (2003). "Change is hard": What science teachers are telling us about reform and teacher learning of innovative practices. Science Education, 87(1), 3-30.
Davis, E. A., Kloser, M., Wells, A., Windschitl, M., Carlson, J., & Marino, J. C. (2017). Teaching the practice of leading sense-making discussions in science: Science teacher educators using rehearsals. Journal of Science Teacher Education, 28(3), 275-293.
Duschl, R. A., & Bybee, R. W. (2014).
Planning and carrying out investigations: An entry to learning and to teacher professional development around NGSS science and engineering practices. International Journal of STEM Education, 1(1), 12.
Duschl, R., & Osborne, J. (2002). Supporting and promoting argumentation discourse in science education. Studies in Science Education, 38, 39-72.
Feiman-Nemser, S., & Buchmann, M. (1983). Pitfalls of experience in teacher preparation (Occasional Paper No. 65). East Lansing, MI: The Institute for Research on Teaching, Michigan State University.
Forbes, C. T., Biggers, M., & Zangori, L. (2013). Investigating essential characteristics of scientific practices in elementary science learning environments: The practices of science observation protocol (P-SOP). School Science and Mathematics, 113(4), 180-190.
Furtak, E. M., & Ruiz-Primo, M. A. (2008). Making students' thinking explicit in writing and discussion: An analysis of formative assessment prompts. Science Education, 92(5), 799-824.
Gotwals, A. W., & Birmingham, D. (2016). Eliciting, identifying, interpreting, and responding to students' ideas: Teacher candidates' growth in formative assessment practices. Research in Science Education, 46(3), 365-388.
Hanuscin, D. L., & Zangori, L. (2016). Developing practical knowledge of the Next Generation Science Standards in elementary science teacher education. Journal of Science Teacher Education, 27(8), 799-818.
Hammer, D., & Van Zee, E. (2006). Seeing the science in children's thinking: Case studies of student inquiry in physical science. Portsmouth, NH: Heinemann.
Hardy, I., Jonen, A., Möller, K., & Stern, E. (2006). Effects of instructional support within constructivist learning environments for elementary school students' understanding of "floating and sinking." Journal of Educational Psychology, 98(2), 307.
Harris, C. J., Phillips, R. S., & Penuel, W. R. (2012). Examining teachers' instructional moves aimed at developing students' ideas and questions in learner-centered science classrooms. Journal of Science Teacher Education, 23(7), 769-788.
Hutchison, P., & Hammer, D. (2010). Attending to student epistemological framing in a science classroom. Science Education, 94(3), 506-524.
Kang, H., & Anderson, C. W. (2015). Supporting preservice science teachers' ability to attend and respond to student thinking by design. Science Education, 99(5), 863-895.
Kang, H., Thompson, J., & Windschitl, M. (2014). Creating opportunities for students to show what they know: The role of scaffolding in assessment tasks. Science Education, 98(4), 674-704.
Kennedy, M. (2005). Inside teaching. Cambridge, MA: Harvard University Press.
Krajcik, J., Codere, S., Dahsah, C., Bayer, R., & Mun, K. (2014). Planning instruction to meet the intent of the Next Generation Science Standards. Journal of Science Teacher Education, 25(2), 157-175.
Krist, C., Schwarz, C. V., & Reiser, B. J. (2018). Identifying essential epistemic heuristics for guiding mechanistic reasoning in science learning. Journal of the Learning Sciences, 1-46. DOI: 10.1080/10508406.2018.1510404
Kuhn, L., & Reiser, B. (2005, April). Students constructing and defending evidence-based scientific explanations. Paper presented at the annual meeting of the National Association for Research in Science Teaching, Dallas, TX.
Larkin, D. (2012). Misconceptions about "misconceptions": Preservice secondary science teachers' views on the value and role of student ideas. Science Education, 96(5), 927-959.
Larkin, D. (2017).
Planning for the elicitation of students' ideas: A lesson study approach with preservice science teachers. Journal of Science Teacher Education, 28(5), 425-443.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge, UK: Cambridge University Press.
Lehrer, R., & Schauble, L. (2006). Cultivating model-based reasoning in science education. Cambridge, UK: Cambridge University Press.
Levin, D. M., Grant, T., & Hammer, D. (2012). Attending and responding to student thinking in science. The American Biology Teacher, 74(3), 158-16.
Levin, D. M., Hammer, D., & Coffey, J. E. (2009). Novice teachers' attention to student thinking. Journal of Teacher Education, 60(2), 142-154.
Luna, M. J. (2018). What does it mean to notice my students' ideas in science today?: An investigation of elementary teachers' practice of noticing their students' thinking in science. Cognition and Instruction, 1-33. DOI: 10.1080/07370008.2018.1496919
McNeill, K. L. (2011). Elementary students' views of explanation, argumentation, and evidence, and their abilities to construct arguments over the school year. Journal of Research in Science Teaching, 48(7), 793-823.
Metz, K. (2009). Rethinking what is "developmentally appropriate" from a learning progression perspective: The power and the challenge. Review of Science, Mathematics and ICT Education, 3(1), 5-22.
Metz, K. E. (1995). Reassessment of developmental constraints on children's science instruction. Review of Educational Research, 65(2), 93-127.
Metz, K. E. (2004). Children's understanding of scientific inquiry: Their conceptualization of uncertainty in investigations of their own design. Cognition and Instruction, 22(2), 219-290.
Metz, K. E. (2011). Young children can be sophisticated scientists. Phi Delta Kappan, 92(8), 68-71.
Meyer, H. (2004). Novice and expert teachers' conceptions of learners' prior knowledge. Science Education, 88(6), 970-983.
Minogue, J., Madden, L., Bedward, J., Wiebe, E., & Carter, M. (2010). The cross-case analyses of elementary students' engagement in the strands of science proficiency. Journal of Science Teacher Education, 21(5), 559-587.
National Research Council (NRC). (2007). Taking science to school: Learning and teaching science in grades K-8. Washington, DC: The National Academies Press.
National Research Council (NRC). (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press.
Newton, P., Driver, R., & Osborne, J. (1999). The place of argumentation in the pedagogy of school science. International Journal of Science Education, 21(5), 553-576.
Next Generation Science Standards (NGSS). (2013). Next Generation Science Standards: For states, by states. Washington, DC: The National Academies Press.
Oliveira, A. W. (2010). Improving teacher questioning in science inquiry discussions through professional development. Journal of Research in Science Teaching, 47(4), 422-453.
Osborne, J. (2014). Scientific practices and inquiry in the science classroom. In N. G. Lederman & S. K. Abell (Eds.), Handbook of research on science education, Volume II (pp. 593-613). London: Routledge.
Otero, V. K., & Nathan, M. J. (2008). Preservice elementary teachers' views of their students' prior knowledge of science. Journal of Research in Science Teaching, 45(4), 497-523.
Pellegrino, J. W., Wilson, M. R., Koenig, J. A., & Beatty, A. S. (2014).
Developing assessments for the Next Generation Science Standards. Washington, DC: National Academies Press.
Penuel, W. R., & Bell, P. (2016). Qualities of a good anchor phenomenon for a coherent sequence of science lessons. Seattle, WA: University of Washington. Retrieved from http://stemteachingtools.org/brief/28
Reiser, B. J. (2013, September). What professional development strategies are needed for successful implementation of the Next Generation Science Standards. Paper presented at the Invitational Research Symposium on Science Assessment.
Reiser, B. J. (2014, April). Designing coherent storylines aligned with NGSS for the K-12 classroom. Paper presented at the National Science Education Leadership Association Meeting, Boston, MA.
Rodgers, C. (2002). Defining reflection: Another look at John Dewey and reflective thinking. Teachers College Record, 104(4), 842-866.
Rosaen, C. L., Lundeberg, M., Cooper, M., Fritzen, A., & Terpstra, M. (2008). Noticing noticing: How does investigation of video records change how teachers reflect on their experiences? Journal of Teacher Education, 59(4), 347-360.
Russ, R. S., Coffey, J. E., Hammer, D., & Hutchison, P. (2009). Making classroom assessment more accountable to scientific reasoning: A case for attending to mechanistic thinking. Science Education, 93(5), 875-891.
Russ, R. S., & Luna, M. J. (2013). Inferring teacher epistemological framing from local patterns in teacher noticing. Journal of Research in Science Teaching, 50(3), 284-314.
Schwarz, C. V., Passmore, C., & Reiser, B. J. (Eds.). (2017). Helping students make sense of the world using next generation science and engineering practices. Arlington, VA: NSTA Press, National Science Teachers Association.
Schwartz, M. S., Sadler, P. M., Sonnert, G., & Tai, R. H. (2009). Depth versus breadth: How content coverage in high school science courses relates to later success in college science coursework. Science Education, 93(5), 798-826.
Sherin, M. G. (2007). The development of teachers' professional vision in video clubs. In Video research in the learning sciences (pp. 383-395). Hillsdale, NJ: Erlbaum.
Sherin, M. G., & Han, S. Y. (2004). Teacher learning in the context of a video club. Teaching and Teacher Education, 20(2), 163-183.
Sherin, M., & van Es, E. (2005). Using video to support teachers' ability to notice classroom interactions. Journal of Technology and Teacher Education, 13(3), 475-491.
Talanquer, V., Bolger, M., & Tomanek, D. (2015). Exploring prospective teachers' assessment practices: Noticing and interpreting student understanding in the assessment of written work. Journal of Research in Science Teaching, 52(5), 585-609.
Van Driel, J. H., Beijaard, D., & Verloop, N. (2001). Professional development and reform in science education: The role of teachers' practical knowledge. Journal of Research in Science Teaching, 38(2), 137-158.
van Es, E. A., & Sherin, M. G. (2002). Learning to notice: Scaffolding new teachers' interpretations of classroom interactions. Journal of Technology and Teacher Education, 10(4), 571-596.
Van Zee, E. H., Iwasyk, M., Kurose, A., Simpson, D., & Wild, J. (2001). Student and teacher questioning during conversations about science. Journal of Research in Science Teaching, 38(2), 159-190.
Van Zee, E., & Minstrell, J. (1997). Using questioning to guide student thinking.
The Journal of the Learning Sciences, 6(2), 227-269.
Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge, UK: Cambridge University Press.
Whitby, V. (1992). Teacher questioning in primary science. Early Child Development and Care, 83(1), 109-114.
Wilson, S. M. (2013). Professional development for science teachers. Science, 340(6130), 310-313.
Windschitl, M., Thompson, J., & Braaten, M. (2008). Beyond the scientific method: Model-based inquiry as a new paradigm of preference for school science investigations. Science Education, 92(5), 941-967.
Windschitl, M., Thompson, J., Braaten, M., & Stroupe, D. (2012). Proposing a core set of instructional practices and tools for teachers of science. Science Education, 96(5), 878-903.

APPENDICES

APPENDIX A

Focal Student Sense-making Science Talks and Analyses

Focal Student Sense-making Science Talk and Analysis #1

Preparing for this assignment: You will identify a small group (approximately 5) consisting of a representative range of abilities and backgrounds to focus on during the semester in relationship to science teaching and learning. In this assignment you will plan, conduct and analyze a series of three brief ~20 minute (dependent on the age of the students) science-focused discussions with these students in order to gather as much information about them as you can related to their learning resources, prior knowledge around the topic of your lesson, learning challenges, learning styles, reasoning and sense-making processes, and anything else you can find out that will help you "know" these students at a deeper level so you can help them gain the most from your teaching.

1. Talking with your mentor teacher. In conjunction with your mentor teacher, select your small group of students to represent a range of different cultures, learning needs, engagement with science, as well as linguistic, social, physical and academic needs.

2. Gathering necessary equipment. You will need to audio-record the conversation with your students. (As part of the assignment, you will need to reference specific examples of students' talk as evidence of their resources for learning.) Therefore, you will need to arrange to have access to an audio recorder to record students' voices. (Using the recording programs on cell phones is most common.) Make sure to test your recording equipment and set-up prior to the conversation to ensure that you can hear and understand each student's voice. Be sure to ask your MT what permission is needed to audiotape students. Most schools will require that a permission form be sent home to parents. Some mentors may have already obtained this permission at the beginning of the school year. You will want to check with your MT regarding the permission status of each student in the classroom so that you can select students who have permission from their families to be recorded. (A permission letter to parents that you can send home with your information can be found on our D2L website.)

3. Planning to talk with students. You will need to carefully design a discussion that will allow you to understand what ideas and other resources your students bring to learning about your assigned lesson. (Multiple choice questions or questions asking students to define words are NOT rich opportunities for engaging with student understanding, experiences and sense-making.)
Prepare a brief description of your 108 questions and follow-up probes, including any props or tasks you plan to use as part of your discussion. You should plan to obtain two types of information from the students in your small group during the discussion. They are listed separately here to highlight the need to address both kinds of questions; however, you may find it easier to mix these questions together in your actual conversation with students. a) Students’ content related conceptions, ideas and thinking. In this part of the discussion, your goal should be to learn about students’ ideas and explanations with respect to the phenomenon you will be addressing. Some cautions: • You should be finding out about how students are thinking about your topic through exploring a specific instructionally productive phenomenon, what they’ve noticed about the phenomenon, and most importantly, what explanations they’ve developed for the phenomenon – not the vocabulary words or facts that they know. Open-ended questions (i.e., those with more than one acceptable answer) work best for this type of discussion. Avoid questions with specific right or wrong answers. • Start the conversation with an open-ended driving/essential question to which all students can respond. (A student may know a lot about science from his/her everyday life, but will “shut down” if she feels that she has nothing to contribute to the conversation.) • Use props and examples of phenomena – physical objects are a great way to engage students with your topic and will allow for a more concrete conversation about your topic. • An essential part of this conversation is eliciting what students may have learned about your topic through their out-of-school experiences. Avoid the trap of asking only about ideas you expect students to have learned in school. b) Students’ prior experiences and cultural/personal resources for science learning. In this part of the discussion, your goal should be to learn about the kinds of knowledge that your students have because of their experiences in the world. You will want to ask your students questions that will allow you to answer the following about them: • What sorts of experiences have they had with your topic? • Where and what have they learned about your topic (e.g., past school experiences, museums, camps, TV shows, books, older relatives)? • Do they or anyone in their family have a job or hobby related to the topic? • Have they traveled or lived anywhere that might have given them experiences or ideas related to your topic? *** Please note: You will need to write questions to probe for specific experiences. For 109 example, you would not ask students directly if they have any hobbies or interests related to the topic. However, if you were teaching a lesson on animal life cycles, you might find out about students’ pets or experiences on farms or observing animals in natural settings. This requires thinking carefully about your topic and how students may learn about it, particularly out of school. 4) Talk with students. Talk with your select group of students around the topic of a specific phenomenon to elicit their a) content-related conceptions, ideas, and thinking and b) prior experiences and cultural/personal resources for science learning. Be sure to audio record your discussions(s). Some cautions: • Listen carefully to what students are saying. a) It can be easy to assume that students know nothing about a topic because they provide an incorrect answer to one of your questions. 
Try to probe further, for example, by asking students to explain their answers. Students can often tell us a lot about their thinking if we can hear past an incorrect answer. b) Similarly, if students use “science-y” words, be sure to probe further. It can be easy to assume that students fully understand a phenomenon because they can use big words, but often students use these words without really understanding what they mean. Asking students to explain their contributions or asking another follow-up question can help you to gain a more accurate picture of students’ understanding. c) The students should be the center of your conversation. Try to talk only to further explore students’ ideas. This is not the time to be correcting students’ ideas or “teaching” about the topic. 110 Focal Student Sense-making Science Talk Assignment Template #1: Name: Science/Lesson Discussion Topic: Grade Level: Driving Question: Discussion Plan: Write a description of your plans for your discussion with students. Consider the following: • How will you begin? • What key ideas do you want to bring up, if your students don’t mention them during the discussion? • What visual aids will you supply to support your students in talking about their ideas? (e.g. things they can hold, touch, manipulate, observe, and examine in detail as they explain their thinking to you) 111 Exchange #1: Post-Discussion Analysis Select an exchange you had with a student during your discussion that provided insight into the way a student was reasoning about science ideas or how the student makes sense of science ideas during the discussion (e.g. Extended thoughts about how or why something works, the way that something occurs and why, etc…). Transcribe that exchange here by typing out exactly what you and the student (and other students, if they also added to the exchange) said and did during this part of the conversation. (See the example of a transcription on our course website). Don’t forget to include any follow-up questions or probes you might have asked the student. 1) What did you learn about this/these student(s) during this exchange that is helpful for you to know as to how they are thinking about the science idea being discussed? 2) Given what you know about your student(s), why do you think they are thinking about this idea in this way? 3) How do the students’ ideas interfere or cause difficulty for the student’s understanding about how or why something happens? 4) As the teacher, HOW (not what, but how) do you want your students to be thinking about this specific concept or idea they talked about during the discussion? (Keeping your expectations grade-level appropriate, what are you hoping they would say about the idea?) 112 Focal Student Sense-making Science Talk and Analysis #2 Preparing for this assignment: You will use the same small group (approximately 5) consisting of a representative range of abilities and backgrounds to focus on during the semester in relationship to science teaching and learning. 
In this assignment you will plan, conduct and analyze a brief ~20 minute (dependent on the age of the students) science-focused discussion with these students in order to gather as much information about them as you can related to their learning resources, prior knowledge around the topic of your lesson, learning challenges, learning styles, reasoning and sense- making processes, and anything else you can find out that will help you “know” these students at a deeper level so you can help them gain the most from your teaching. 4. Talking with focal students. This sense-making discussion will take place with the same focal students from Science Talk #1. 5. Gathering necessary equipment. You will need to audio-record the conversation with your students. (As part of the assignment, you will need to reference specific examples of students’ talk as evidence of their resources for learning.) Therefore, you will need to arrange to have access to an audio recorder to record students’ voices. (Using the recording programs on cell phones is most common.) Make sure to test your recording equipment and set-up prior to the conversation to ensure that you can hear and understand each student’s voice 6. Planning to talk with students. You will need to carefully design a discussion that will allow you to understand what ideas and other resources your students bring to learning about your assigned lesson. (Multiple choice questions or questions asking students to define words are NOT rich opportunities for engaging with student understanding, experiences and sense-making.) Prepare a brief description of your questions and follow-up probes, including any props or tasks you plan to use as part of your discussion. You should plan to obtain two types of information from the students in your small group during the discussion. They are listed separately here to highlight the need to address both kinds of questions; however, you may find it easier to mix these questions together in your actual conversation with students. 113 c) Students’ content related conceptions, ideas and thinking. In this part of the discussion, your goal should be to learn about students’ ideas and explanations with respect to the phenomenon you will be addressing. Some cautions: • You should be finding out about how students are thinking about your topic through exploring a specific instructionally productive phenomenon, what they’ve noticed about the phenomenon, and most importantly, what explanations they’ve developed for the phenomenon – not the vocabulary words or facts that they know. Open-ended questions (i.e., those with more than one acceptable answer) work best for this type of discussion. Avoid questions with specific right or wrong answers. • Start the conversation with an open-ended driving question to which all students can respond. (A student may know a lot about science from his/her everyday life, but will “shut down” if she feels that she has nothing to contribute to the conversation.) • Use props and examples of phenomena – physical objects are a great way to engage students with your topic and will allow for a more concrete conversation about your topic. • An essential part of this conversation is eliciting what students may have learned about your topic through their out-of-school experiences. Avoid the trap of asking only about ideas you expect students to have learned in school. d) Students’ prior experiences and cultural/personal resources for science learning. 
In this part of the discussion, your goal should be to learn about the kinds of knowledge that your students have because of their experiences in the world. You will want to ask your students questions that will allow you to answer the following about them: • What sorts of experiences have they had with your topic? • Where and what have they learned about your topic (e.g., past school experiences, museums, camps, TV shows, books, older relatives)? • Do they or anyone in their family have a job or hobby related to the topic? • Have they traveled or lived anywhere that might have given them experiences or ideas related to your topic? *** Please note: You will need to write questions to probe for specific experiences. For example, you would not ask students directly if they have any hobbies or interests related to the topic. However, if you were teaching a lesson on animal life cycles, you might find out about students’ pets or experiences on farms or observing animals in natural settings. This requires thinking carefully about your topic and how students may learn about it, particularly out of school. 4) Talk with students. Talk with your select group of students around the topic of a specific phenomenon to elicit their a) content-related conceptions, ideas, and thinking and b) prior experiences and cultural/personal resources for 114 science learning. Be sure to audio record your discussions(s). Some cautions: • Listen carefully to what students are saying. d) It can be easy to assume that students know nothing about a topic because they provide an incorrect answer to one of your questions. Try to probe further, for example, by asking students to explain their answers. Students can often tell us a lot about their thinking if we can hear past an incorrect answer. e) Similarly, if students use “science-y” words, be sure to probe further. It can be easy to assume that students fully understand a phenomenon because they can use big words, but often students use these words without really understanding what they mean. Asking students to explain their contributions or asking another follow-up question can help you to gain a more accurate picture of students’ understanding. f) The students should be the center of your conversation. Try to talk only to further explore students’ ideas. This is not the time to be correcting students’ ideas or “teaching” about the topic. 115 Focal Student Sense-making Science Talk Assignment Template #2: Name: Science/Lesson Discussion Topic: Grade Level: NGSS Performance Expectation: Driving Question: Discussion Plan: Write a description of your plans for your discussion with students. Consider the following: • How will you begin? • What key ideas do you want to bring up, if your students don’t mention them during the discussion? • What visual aids will you supply to support your students in talking about their ideas? (e.g. things they can hold, touch, manipulate, observe, and examine in detail as they explain their thinking to you) 116 Exchange #2: Post-Discussion Analysis Select an exchange you had with a student during your discussion that provided insight into the way a student was reasoning about science ideas or how the student makes sense of science ideas during the discussion (e.g. Extended thoughts about how or why something works, the way that something occurs and why, etc…). Transcribe that exchange here by typing out exactly what you and the student (and other students, if they also added to the exchange) said and did during this part of the conversation. 
(See the example of a transcription on our course website). Don’t forget to include any follow-up questions or probes you might have asked the student. 5) What did you learn about this/these student(s) during this exchange that is helpful for you to know as to how they are thinking about the science idea being discussed? 6) Given what you know about your student(s), why do you think they are thinking about this idea in this way? 7) How do the students’ ideas interfere or cause difficulty for the student’s understanding about how or why something happens? 8) As the teacher, HOW (not what, but how) do you want your students to be thinking about this specific concept or idea they talked about during the discussion? (Keeping your expectations grade-level appropriate, what are you hoping they would say about the idea?) 9) What are the differences between how you want them to be thinking about this idea and how they are thinking right now? 10) Share two instructional experiences you could provide in a science lesson to support the student(s) in moving towards thinking about the idea in a more accurate or sophisticated way. Some possibilities might include: • Experiences that help the student notice new patterns in their data 117 • Experiences that challenge the student to reconsider a hypothesis or claim they have made during the discussion in light of new data/observations • Experiences that introduce new phenomena to observe that the student has not seen before 118 Focal Student Sense-making Science Talk and Analysis #3 Preparing for this assignment: An important part of teaching science is noticing and responding to individual students’ and their sense-making. In this final Science Talk and Analysis, we will turn our attention to several focus students you have been following during the semester. The goal of this portion of the assignment is to interpret the information you learned, find out more about how you might better enhance your teaching for these students, and then write about those techniques. Directions: Conduct the Sense-Making #3 discussion as you have for discussions #1 and #2, however, the focus of this discussion is to probe for what big ideas the students have taken away from your science lesson. To do this, be sure to not directly ask the students what they learned, but design an essential question that will allow the students to share with you what they have learned. 7. Gathering necessary equipment. You will need to audio-record the conversation with your students. (As part of the assignment, you will need to reference specific examples of students’ talk as evidence of their resources for learning.) Therefore, you will need to arrange to have access to an audio recorder to record students’ voices. (Using the recording programs on cell phones is most common.) Make sure to test your recording equipment and set-up prior to the conversation to ensure that you can hear and understand each student’s voice. 8. Conduct the Sense-Making #3 Exchange. Record the audio and transcribe the portion that best demonstrates learning by the students. You may choose to include multiple portions of the conversation that may not be consecutive. ***Please make a note in the transcript if the discussion is not consecutive. 9. Complete the Focal Student Analysis #3 Template. 
119 Focal Student Sense-making Science Talk Assignment Template #3: Name: Science/Lesson Discussion Topic: Grade Level: NGSS Performance Expectation: Driving Question: Discussion Plan: Write a description of your plans for your discussion with students. Consider the following: • How will you begin? • What key ideas do you want to bring up, if your students don’t mention them during the discussion? • What visual aids will you supply to support your students in talking about their ideas? (e.g. things they can hold, touch, manipulate, observe, and examine in detail as they explain their thinking to you) 120 Exchange #3: Post-Discussion Analysis Select an exchange you had with a student during your discussion that provided insight into the way a student was reasoning about science ideas or how the student makes sense of science ideas during the discussion (e.g. Extended thoughts about how or why something works, the way that something occurs and why, etc…). Transcribe that exchange here by typing out exactly what you and the student (and other students, if they also added to the exchange) said and did during this part of the conversation. (See the example of a transcription on our course website). Don’t forget to include any follow-up questions or probes you might have asked the student. 1) Describe some strategies that would benefit your focal students as learners in making sense of the world. Choose 3 of your focal students to focus on for this. Using strategies from the literature relevant for the learners chosen, describe how these strategies would benefit your 3 focal students as learners if this lesson were to be taught again. ***Apply a different strategy for each focal student. (Three in all.) Strategies may include: accommodations for students with special needs, differentiation strategies (particularly with assessments), as well as extensions. Extensions may include engaging the students in art, music, poetry, forms of expression, engineering and design projects or other ways of enabling students to connect with science and engage them in different ways for making sense of the world. Consider technology as a set of tools that may play an important role in some of these strategies. 2) Describe how the focal students’ sense-making on the lesson you taught will impact future science lessons that you teach. 121 APPENDIX B Lesson Design and Analysis 122 Assessment and Data Collection Plan Lesson Design & Analysis Assignment Overview In the previous assignments you: a) identified a topic, as well as appropriate NGSS Performance Expectations, b) began framing your lesson in alignment with the NGSS and the Experiences, Patterns and Explanations model of teaching; and c) identified your students’ prior ideas and experiences (Sense-making #2) in relation to the science content you will be teaching. In this assignment, you will lay out specific plans for ASSESSING your students’ ability to meet the identified learning goals (NGSS) after teaching your lesson. Assignment Template and Explanation: Name(s): Grade Level: Targeted Learning Goals: Copy this section from your Framing assignment. (Lesson Identification and Learning Goal) Post Assessment Task Design ONE brief assessment task that will provide rich information about your students’ thinking and understanding for your unit learning goals. Include a copy of your assessment task in this assignment. Rich tasks should involve the students in creating a somewhat elaborate response, not just giving a one-word answer. 
It should involve the students carrying out the practices defined in your learning goal, not just recalling information. It should provide an opportunity to apply a main idea, not just recall or recognize it. Examples of rich tasks include performance assessments such as providing students with a variety of objects, asking them to use those objects to construct or do something and asking them to explain how the science ideas are important in their decisions to meet that goal. You can engage students in figuring things out, finding patterns, using their explanations to justify their decisions in written response items. You can use a variety of other assessments such as observing students as they work in groups, analyzing their drawings, labels and explanations in their science notebooks, or even a task that is already in your instructional materials. Here are some hints for designing a “rich” assessment task: 123 • Your assessment task should be closely aligned with your NGSS Performance expectation. • Your assessment task should engage students in meaningful and thoughtful work. They should be applying a big idea from your lesson and carrying out practices/cross-cutting concepts defined in your NGSS Unpacking and related knowledge & skills, not just recalling or listing information and ideas. • Students should provide an elaborate response, not a one-word answer. • Analysis of your students’ responses should provide you with information about their strengths and weaknesses with respect to your assessment objective. This should go beyond whether students “got” your assessment objective and whether they participated in your lesson and/or the task. • All students should be able to respond to your task, perhaps with varying degrees of quality. (If some students cannot respond at all, you miss the opportunity to find out what they do understand.) Post Assessment Task Rationale Write a brief statement explaining what this assessment task will allow you to learn about how much and how deeply your students understand your lesson NGSS Performance Expectation. What specific skills, ideas and practices are you trying to assess in this task? (Include how you are addressing your SEP/DCI/CCC in your assessment.) Scoring Guide for Analyzing Students’ Responses to the Post Assessment Task Next, you will need to determine how you will analyze and interpret the students’ responses to your task. Analyzing students’ responses can be done by identifying features in their responses that you can look for and document. You will create a scoring guide that thoroughly describes all of the desired features of students’ responses that would indicate the extent to which they have met your assessment objective. Your scoring guide should include the specific details you would look for in a student’s response that will let you know what aspects they know well, what aspects they struggled with, and how they were reasoning about your task. These features can be used to evaluate how much your students have learned the lesson content and how deeply they have understood it. The essential features represent the criteria you will use to analyze your students’ responses on the post assessment after your lead teaching. These features will provide the starting point for your analysis after the post assessment – but you may find that you’ll make some changes to these as a result of seeing the kinds of responses your students provide on the post assessment task. 
Note: If there are important aspects related to the learning goal (i.e., main ideas students should know, practices students should be able to do) that you cannot evaluate based on your 124 task, you may need to add to or change your task so that it will provide sufficient evidence to help you decide how well your students are meeting the learning goal. Desired Features Points • • • • • • • • • • The assessment objective matches the NGSS Performance Expectations. The assessment task engages students in opportunities to use knowledge gained from SEP/DCI/CCC for elaborated responses. The assessment objective describes a behavior that demonstrates a deep understanding of the learning goal. (not rote memorization, multiple choice, fill in the blank, etc.) The assessment task is likely to elicit rich information that will allow evaluation with respect to the assessment objective. The assessment task is accessible to students with a range of mastery (above and below expected levels of performance) of the assessment objective. The rationale clearly explains how the assessment task assesses the students’ understanding of the NGSS Performance Expectation. The rationale clearly explains what the assessment task is intended to show regarding students’ understanding of the NGSS Performance Expectation – including opportunities for illuminating possible misconceptions or advanced ideas. There is a clear plan for analyzing students’ responses to the assessment task, including the way in which results can be used to reflect upon students’ strengths and weaknesses (and not just whether they are “right” or “wrong”.) The scoring guide includes the specific details teachers should look for in a student’s response. The scoring guide provides students with an opportunity to give their explanations and reasoning related to the task. Table B-1: Grading Criteria for Assessment Assignment /5 /5 Post Assessment Task and Rationale Post Assessment Rubric/Scoring Guide 125 Analysis of Classroom Interactions, Student Learning, & Reflection Final Segment of Lesson Design & Analysis Assignment Overview This assignment is designed to support you in analyzing evidence from teaching your lesson in your field placement and in reflecting on your teaching. Preparing for the Assignment In order to successfully complete this assignment, you will need to collect a video or audio recording of your lesson and take detailed notes after teaching to have as much information about the nature of your lesson as possible. You will also need assessment responses or samples of student work from six students including the focal students in your placement classroom during the time that you teach your lesson. Your reflections should be detailed and specific, and should focus on the evidence from the recordings/notes and from student work. Assignment Directions There are several parts to this assignment. You will be providing a detailed response for each part that is well supported with specific examples from the recording of your lesson, your students’ work and your teaching notes. 1. 2. Analysis of Whole Class Interactions and Classroom Culture Carefully review your video/audio recording of your lesson and the detailed notes. Analyze and evaluate classroom community and interactions in the lesson using evidence from your recordings. Below, you will write a detailed, multi-paragraph analytical response for each of the following questions: What opportunities did students have to participate and engage in the lesson? How did they participate? 
How were students’ resources (e.g., funds of knowledge, ways of knowing) elicited and leveraged? How did students interact with each other and you as the teacher? Analysis of Individual Learning from Student Work Work with your instructor to decide how to choose sample student work. Carefully review evidence from identified focal and other students about student learning including their actions and talk as well as their work in the assessment. You will analyze student work using the assignment template (below), and write a detailed, multi-sentence analytical response for each of the following questions: In what ways did students engage in sense-making? In what ways did their work indicate they are not meeting, partially meeting, or meeting the learning goal? 126 3. 4. Reflections on Analysis and Teaching Review the analysis and findings from above regarding whole class interactions and student learning in addition to your notes from teaching. Then, you will write a detailed response to reflection questions about your overall impression of strengths and weaknesses of the lesson, how the lesson plan addressed diverse student learners, the strengths and limitations of the assessment, and how this experience impacted your teaching identity. Implications for Future Teaching Review the analysis and findings from above regarding whole class interactions and student learning in addition to your notes from teaching. Then, you will write a detailed response to the questions: Given the analysis of interactions and student learning, describe your written and oral feedback you would provide your focal and other students to advance their science learning. How would you teach this same lesson again to improve the lesson and why? Assignment Template The next part of this assignment is the assignment template to help guide you in your analysis and reflections Name(s): Lesson Topic and Grade Level: • PERFORMANCE EXPECTATION: • NARROWED LESSON FOCUS: • SCIENCE AND ENGINEERING PRACTICE: • CROSSCUTTING CONCEPT: 127 Phenomenon and Driving Question for Lesson: Identify a phenomenon and write a driving question designed to support students’ developing understanding of your learning goals. Your driving question should be directly aligned with the NGSS Performance Expectation, have a real-world context, and demonstrate a deep understanding of the learning goal when answered. See course slides for examples of how to identify a phenomenon and write a driving question. • PHENOMENON: • DRIVING QUESTION: 1. Analysis of Whole Class Interactions and Classroom Culture Write a detailed, multi-sentence analytical response for each of the following questions: a. What opportunities did students have to participate and engage in the lesson? Examples include talk, interactions with materials, etc. How did students participate? (e.g., who was doing the talking, what kind of language were they using?) b. How did you elicit and leverage students’ resources (e.g., funds of knowledge, ways of knowing)? c. How did students interact with each other and you as the teacher? (e.g., how were their ideas responded to, were they acknowledged, rejected or built on, whose ideas were taken up and whose were not?) 2. Analysis of Individual Learning from Student Work Assessment Objective: Desired Assessment Features/Scoring Guide: [list the features you identified in your LDA #1-2 assessment assignment for evaluating student work.] 128 Focal Student 1 Focal Student 1 Brief description for why you chose this student’s work. 
Description of the student’s interactions/engagement including their talk (e.g., what they said) during the lesson. Photo of student work sample(s): Focal Student 2 Brief description for why you chose this student’s work. Description of the student’s interactions/engagement including their talk (e.g., what they said) during the lesson. Photo of student work sample(s): Evidence of sense-making: Describe how this student was engaged in sense- making. What resources were they using? What was the nature of their ideas, reasoning, experiences, and how did they use those to address the lesson topic? Evidence from work sample of student learning: List features you have identified in your student work sample that indicate student understanding of the learning goal. Provide a claim for what this indicates about student understanding and a rationale of why this demonstrates that they are not meeting, partially meeting, or meeting your NGSS assessment objective. Focal Student 2 Evidence of sense-making: Describe how this student was engaged in sense- making. What resources were they using? What was the nature of their ideas, reasoning, experiences, and how did they use those to address the lesson topic? Evidence from work sample of student learning: List features you have identified in your student work sample that indicate student understanding of the learning goal. Provide a claim for what this indicates about student understanding and a rationale of why this demonstrates that they are not meeting, partially meeting, or meeting your NGSS assessment objective. Table B-2: Grading Criteria for Assessment Assignment 129 Table B-2 (cont’d) Focal Student 3 Brief description for why you chose this student’s work. Description of the student’s interactions/engagement including their talk (e.g., what they said) during the lesson. Photo of student work sample(s): (Focal) Student 4 Brief description for why you chose this student’s work. Description of the student’s interactions/engagement including their talk (e.g., what they said) during the lesson. Photo of student work sample(s): Focal Student 3 Evidence of sense-making: Describe how this student was engaged in sense- making. What resources were they using? What was the nature of their ideas, reasoning, experiences, and how did they use those to address the lesson topic? Evidence from work sample of student learning: List features you have identified in your student work sample that indicate student understanding of the learning goal. Provide a claim for what this indicates about student understanding and a rationale of why this demonstrates that they are not meeting, partially meeting, or meeting your NGSS assessment objective. (Focal) Student 4 Evidence of sense-making: Describe how this student was engaged in sense- making. What resources were they using? What was the nature of their ideas, reasoning, experiences, and how did they use those to address the lesson topic? Evidence from work sample of student learning: List features you have identified in your student work sample that indicate student understanding of the learning goal. Provide a claim for what this indicates about student understanding and a rationale of why this demonstrates that they are not meeting, partially meeting, or meeting your NGSS assessment objective. 130 Table B-2 (cont’d) (Focal) Student 5 Brief description for why you chose this student’s work. Description of the student’s interactions/engagement including their talk (e.g., what they said) during the lesson. 
Photo of student work sample(s): (Focal) Student 6 Brief description for why you chose this student’s work. Description of the student’s interactions/engagement including their talk (e.g., what they said) during the lesson. Photo of student work sample(s): 3. Reflections (Focal) Student 5 Evidence of sense-making: Describe how this student was engaged in sense- making. What resources were they using? What was the nature of their ideas, reasoning, experiences, and how did they use those to address the lesson topic? Evidence from work sample of student learning: List features you have identified in your student work sample that indicate student understanding of the learning goal. Provide a claim for what this indicates about student understanding and a rationale of why this demonstrates that they are not meeting, partially meeting, or meeting your NGSS assessment objective. (Focal) Student 6 Evidence of sense-making: Describe how this student was engaged in sense- making. What resources were they using? What was the nature of their ideas, reasoning, experiences, and how did they use those to address the lesson topic? Evidence from work sample of student learning: List features you have identified in your student work sample that indicate student understanding of the learning goal. Provide a claim for what this indicates about student understanding and a rationale of why this demonstrates that they are not meeting, partially meeting, or meeting your NGSS assessment objective. Write a detailed, multi-sentence analytical response for each of the following questions: Overall reflections (see tips for your reflections below): 1. What were some strengths of your lesson? Support your claims with evidence. 2. What were some weaknesses of your lesson? Support your claims with evidence. 3. How did your lesson support or not support student science learning? Support your claims with evidence. 131 Reflections on responsiveness to diverse students: 1. How did the lesson meet or not meet the needs of the students? 2. How did you adjust the lesson plan and teaching in response to students’ contributions and sense-making? Reflections on assessment: In addition to analyzing student responses to your assessment task for clear evidence of student understanding, you will also need to reflect upon the effectiveness of your assessment. 1. What were the strengths of the assessment you chose for providing evidence of student science understanding? Explain why. Include evidence (e.g., one example; overall class responses). 2. What were the limitations of the assessment you chose for providing evidence of student science understanding? Explain why. Include evidence (e.g., one example; overall class responses). 3. Based on your analysis of the responses, what changes would you make for this assessment task in order to get a more complete picture of all students’ progress towards mastering your science content NGSS learning goals? Why? Reflections on classroom culture: 1. How did the lesson conform or deviate from the established classroom culture from the mentor teacher? How might that have impacted student interactions and learning? Reflections on teacher identity: 1. How did teaching your lesson impact your own identity as a teacher and as a science learner? 4. Implications Write a detailed, multi-sentence analytical response for each of the following questions: 1. 2. 
If you were to give feedback to your six students whose work you analyzed, what would you write and say to help them learn and make better sense of the science? Provide specific text examples for each student and a rationale for the feedback. If you were to teach this same lesson again, what changes would you make to your lesson plan to better support your students’ science learning? Why? • • Tips for your reflections As you are working on your reflections, take time to review the themes from the course. Reference and use these ideas in your responses. As you are reflecting on your science teaching and student learning, remember that this reflection is not about behavior management or constraints out of your control. Instead, we are asking you to focus on your planning, your teaching, students’ engagement, and student learning. 132 • • Be sure to use evidence in your analyses and reflections to support the statements you are making. Even if your lesson was highly successful, challenge yourself to consider something on which you could make improvements in the future. This is an important skill to develop as a life-long learner. 133 Rubric for Lesson Design and Analysis Classroom Interactions, Student Learning, & Reflection Desired Features Nature and Quality Points Analysis of Whole Class Interactions and Classroom Culture Analysis of Individual Learning from Student Work An overall analysis and claim about classroom culture/community is presented based on specific evidence regarding: • • • opportunities for students to participate and engage (such as talk, interact with materials, etc.) in the lesson; how students participated (e.g., who was doing the talking, what kind of language were they using) the extent to which funds of knowledge are/are not leveraged in instruction (were students’ resources being elicited?) how students interacted with each other and you as the teacher, (e.g., how were their ideas responded to, were they acknowledged, rejected or built on, whose ideas were taken up and whose were not?) An analysis and claims about individual student learning are presented based on specific evidence from the work of focal students and those of others (such as the assessment), actions, and talk regarding: • • the ways in which students engaged in sense-making, paying attention to students’ resources: How does the lesson support students in drawing upon their knowledge, reasoning, experiences, interactions, funds of knowledge, etc.? the ways in which their work indicates they are not meeting, partially meeting, or meeting the learning goal • Examples for the analysis are included from conversations (recorded or transcribed) or detailed notes • Analysis is supported with evidence using examples of student talk and interactions • Claims focus on interactions around science • Examples for the analysis are included from focal students’ work • Analysis is supported with evidence using examples of students work, talk, etc. 
• Claims focus on the learning goals of the lesson as described in the NGSS performance expectation and lesson objective • Evaluations do not describe students’ understanding of the task, ability to finish the task, on/off task behavior or general engagement /4 /4 Table B-3: Draft CAEP Rubric structure 134 Table B-3 (cont’d) Reflections on Analysis and Teaching Teacher reflections are included regarding the analysis of classroom community, individual learning, and teaching regarding: • • • • • • • • (Overall): Overall impressions of the strengths and weaknesses of the lesson (Overall): How the lesson supported or did not support student science learning and engagement (Responsiveness to diverse students): How the lesson met or didn’t meet the needs of the focal students (Responsiveness to diverse students): How the lesson plan and teaching adjusted in response to students’ sense-making, engagement, and contributions (Assessment): Strengths and limitations of the assessment based on evidence. By assessment, we refer to informal, embedded, formative, and summative assessments. (Assessment): Improvements to the assessment with rationale (Classroom culture): How the lesson conformed or did not conform to the established classroom culture from the mentor teacher. How that may have impacted student interactions and learning. (Teacher identity): How teaching experience impacted one’s identity as a teacher and a learner of science Implications for the analysis and reflection on future teaching are included. They address: Implications for Future Teaching • Future directions to focal students for learning. Directions should include written and verbal feedback. A rationale for the feedback is provided. • How one could teach this same lesson again to improve the lesson and why. • 135 • Evidence is provided from the lesson, classroom interactions analysis or student work analysis to support these claims. • Reflections move beyond superficial claims about environment to nuanced claims about learning. /4 • Feedback to students considers findings from analysis of student learning. • Feedback to students is specific and addresses the strengths and needs of the student related to the learning objective • Feedback includes a /3 thoughtful rationale and strategy Ideas about how to teach the lesson focus on key areas to advance science learning. Not on pacing, repeating instruction or classroom management