FORMATIVE ASSESSMENT PRACTICES IN A LINGUISTICALLY AND CULTURALLY DIVERSE ELEMENTARY CLASSROOM: A CASE STUDY

By

Xuexue Yang

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Curriculum, Instruction and Teacher Education-Doctor of Philosophy

2021

ABSTRACT

FORMATIVE ASSESSMENT PRACTICES IN A LINGUISTICALLY AND CULTURALLY DIVERSE ELEMENTARY CLASSROOM: A CASE STUDY

By Xuexue Yang

The increased number of emergent bilinguals (EBs) in mainstream classrooms demands that teachers employ high-leverage practices for all students. One powerful teaching practice that holds promise for supporting all students is formative assessment. However, little attention has been given to the connection among teachers' assessment expertise, their formative assessment practices, and support for all students' learning. Drawing on a sociocultural perspective on formative assessment and Lyon's (2013a) Conceptualization Framework of Teachers' Assessment Expertise, this qualitative study adopts a single case study design (Yin, 2003), examining an elementary teacher's daily formative assessment practices in her fifth-grade mathematics classroom. The study aims to uncover the nature of formative assessment practices in the classroom, the teacher's beliefs about challenges that EBs may encounter, her corresponding support for EBs, and the alignment among the teacher's beliefs, her formative assessment practices, and EBs' perceptions of their learning needs. Multiple data sources were collected: classroom observations, interviews with the teacher (i.e., Mrs. G) and with EBs, and artifacts.

Findings in this study are organized by research question. The first findings chapter focuses on the nature of Mrs. G's formative assessment practices. Findings revealed that discourse is an essential part of formative assessment practices in Mrs. G's math class. In enacting discursive formative assessment, Mrs. G places great value on a classroom culture that prioritizes student ideas. Two practices emerged as she enacted discursive formative assessment: (1) communicating and clarifying learning targets and (2) eliciting and responding to student ideas. Yet there seemed to be a large range in how Mrs. G presented learning targets and elicited student thinking. The first findings chapter illustrates the relationships between higher- and lower-level formative assessment practices, and how these practices align with the value Mrs. G places on creating a classroom culture that foregrounds students' ideas.

The second findings chapter answers the second set of research questions. Findings revealed instances of both alignment and misalignment among Mrs. G's formative assessment practices, her beliefs, and EBs' perceptions of their learning needs. Three themes emerged with respect to alignment: (1) alignment between Mrs. G's beliefs about discourse in formative assessment and EBs' perceptions; (2) alignment of Mrs. G's beliefs and practices on scaffolding for EBs with EBs' perceptions; and (3) alignment between Mrs. G's beliefs about the barriers that EBs may encounter and EBs' perceptions. Concerning misalignment among Mrs. G's formative assessment practices, beliefs, and EBs' perceptions, two themes emerged: (1) misalignment between the supports that Mrs. G felt she provided to EBs and her actual implementation, and (2) a tension between Mrs. G's beliefs about instruction for all students and EBs' needs for differentiation. Instances of alignment and misalignment are presented. Drawing on the findings of this study, I discuss six themes and claims that I find significant and relevant to researchers and educators who are interested in formative assessment and in working with EBs.
This study can add to conversations about the implementation of formative assessment in linguistically and culturally diverse classrooms and about best practices for EBs.

Copyright by
XUEXUE YANG
2021

This dissertation is dedicated to my parents.

ACKNOWLEDGEMENTS

This work could not have been done without support from so many people around me. I first want to thank my advisor and dissertation committee chair, Dr. Amelia Gotwals. It is your inspiration that motivates me to keep exploring the field of assessment equity for EBs alongside teachers' assessment practices. It is your thought-provoking ideas and insightful feedback that deepened my understanding of formative assessment. It is your encouragement and care that accompanied me through the difficult times in my journey of completing this dissertation. Without your inspiration and encouragement, I would not be sitting here writing the acknowledgments at the final stage of my dissertation.

Thank you to the members of my dissertation committee. I want to thank Dr. Carrie Symons for introducing me to the school site so that I could build connections with the participants in my dissertation study. I want to thank Dr. Patricia Edwards for giving me sincere suggestions about where I should go during the difficult moments when I felt hesitant and anxious. I want to thank Dr. Tonya Bartell for generously being willing to serve on my committee when I reached out and for providing valuable feedback that moved my work forward. I would also like to thank Dr. Dongbo Zhang for supervising my work at an earlier stage. Your passion and rigorous attitude toward scholarship have influenced me so much during this long journey of doctoral study and will stay with me for the rest of my life.

I want to express my sincere thanks to the faculty who have supported my academic growth in the Department of Teacher Education. Specifically, I want to thank Dr. Sandro Barros, who gave me the trust and space to learn and to try when we taught together. I want to thank Dr. Lynn Paine and Dr. Nancy Romig, who brought valuable professional development opportunities to me in my journey of learning to be a teacher educator.

Next, I would like to thank the participants in this study. Thank you, Mrs. G, for welcoming me to observe your class and for being willing to participate in my study without hesitation. Thank you to all the students who participated in my study. It is your voices that make this study more meaningful. Without your participation and support, this study would not have been possible.

I also want to thank my friends and colleagues, especially Dr. Byungeun Pak, Dr. Abraham Ceballos-Zapata, Dr. Lisa Domke, Mingzhu Deng, Zhao Peng, Dr. Yue Bian, Dr. Wei Liao, Dr. Xuehong (Stella) He, and many others, for listening to me, sharing your advice, and lifting me up. I want to thank Dr. Jungmin Kwon and Dr. Wenyang Sun for introducing me to the international language and literacy scholar group; it has been so helpful to join the writing group and support one another. I also want to thank Ms. Terry Yuncker (Edwards) and Ms. Amy Peebles for your care and your smiles. Those enjoyable little chats in Erickson warmed my heart while I was studying abroad, far away from home. My special thanks also go to Dr. Xin Jin and to those who loved me but to whom I did not have a chance to express my gratitude.

Finally, I want to express my deepest gratitude to my family. Thank you, Mom and Dad. I know that you have spared no effort in supporting me. I know that you want me to live nearby but still let me go to pursue my dream of studying in the US. Thank you, my younger brother, for accompanying our parents when I am not around. Without your understanding and support, I would not have been able to concentrate on my study and complete my dissertation successfully.
TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES

CHAPTER 1: INTRODUCTION
  Research Questions
  Significance of The Study
  Define Terms

CHAPTER 2: LITERATURE REVIEW
  The Characteristics and Process of FA
    Where Are We Going? Articulating Learning Targets
    Where Are We Now? Gather Evidence About Student Learning
    How Do We Get There? Actions Based on The Evidence of Student Learning
  Sociocultural Perspective of FA
    Discursive FA
  FA within Linguistically and Culturally Diverse Classrooms
    Reducing Linguistic and Text Complexities of Assessment Tasks
    Providing EBs with Language-Enriched Math Learning Environment
  Relationships Among Teachers' Knowledge, Beliefs and FA Practices
  Reconceptualize Teachers' Assessment Expertise

CHAPTER 3: METHODOLOGY
  Context and Participant
    The School Site
    Context of the Math Class: EBs
    The Focal Teacher: Mrs. G
  Data Collection
    Class Observations and Teaching Videos
    Interviews
    Artifacts
  Data Analysis
    Section 1
    Section 2

CHAPTER 4: RESEARCH FINDINGS I
  Create A Classroom Culture of Valuing Student Ideas
  Enactment of Discursive FA
    Communicating and Clarifying Learning Targets
    Elicit Student Thinking Using Questioning Practices
    Lack of Wait Time: A Missed Opportunity in Eliciting Students' Thinking

CHAPTER 5: RESEARCH FINDINGS II
  Section 1: Alignment of Mrs. G FA Practice, Beliefs, and EBs' Perceptions
    Alignment between Mrs. G's Belief of Discourse in General and EBs' Perceptions
    Alignment of Mrs. G's Beliefs and Practices on Scaffolding for EBs and EBs' Perceptions
    Alignment between Mrs. G's Beliefs of Barriers That EBs May Encounter and EBs' Perceptions
  Section 2: Misalignment of Mrs. G FA Practices, Beliefs, and EBs' Perceptions
    Misalignment between Supports that Mrs. G Felt She Provided to EBs and Her Actual Implementation
    A Tension Between Mrs. G's Belief in Good Instruction for All Students and EBs' Needs in Differentiation
  A Closing Remark on Mrs. G's Assessment Expertise Levels in Equity

CHAPTER 6: DISCUSSION
  High Quality of FA: Increase Students' Role as Active Contributors
  Responsive FA: The Role of Language in EBs' Math Learning
    Assessment Access: Using Language Modalities Flexibly to Support EBs' Math Learning
    Assessment Access: Using "Direct Approach" to Support EBs' Language Development
    Assessment Fairness: Direct Language Supports and Accommodations to EBs
  A Tension Between the Teacher's Belief of Good Teaching for All Students and EBs' Needs in Differentiation
  What Counts as A Teacher's Assessment Expertise: A Gap Between Knowing and Doing
  An Insight into A Teacher's Knowledge Base of "Responsive" FA in Math Classrooms
    Teacher's Knowledge Base in PCK
    Teacher's Knowledge Base in Educational Linguistics and Second Language Learning
  Potential Influences of Math Curriculum on A Teacher's FA Practice

CHAPTER 7: CONCLUSION AND IMPLICATION
  Conclusion
  Implication
    Implication on Future Research
    Implication on Qualitative Case Study Design

APPENDICES
  APPENDIX A: A Lesson Observation Protocol
  APPENDIX B: Interview Protocol with Mrs. G
  APPENDIX C: Interview Protocol with Students
  APPENDIX D: A Spreadsheet to Show How Well Mrs. G Enacted FA Based on the FARROP Rubric

BIBLIOGRAPHY

LIST OF TABLES

Table 2.1 A Summary on Lyon's (2013a) Conceptualization Framework of Assessment Expertise
Table 3.1 EBs' Language Backgrounds
Table 3.2 Information About the Math Unit
Table 3.3 A Timeline of My Classroom Observations
Table 3.4 Dimensions of FA According to The FARROP Rubric (Wylie & Lyon, 2016)
Table 3.5 Excerpts from A Cross-Video Analysis Memo
Table 3.6 "Scaffolding Student Participation"-An Example of Open Coding
Table 3.7 An Example of A Video Episode
Table 3.8 Modified Assessment Expertise Rubric on the Dimension of Equity (Lyon, 2013a, p. 1229)
Table 4.1 Teachers' Assessment Expertise Levels in Clarifying Learning Targets
Table 4.2 Our Goal Is Going to Use Estimation to Solve Longer Division Problems
Table 4.3 I Believe That Someone Said Yesterday…
Table 4.4 What Is the Purpose of What We Did Today?
Table 4.5 Who Is in Charge of Your Learning?
Table 4.6 What Goes into My Thinking Bubble?
Table 4.7 Why Do You Think It's Wrap A Whole?
Table 4.8 Wait, 100…
Table 5.1 When You Are Dividing a Decimal…
Table 5.2 Lulua, Did I Rephrase All Right?
Table 5.3 Did You Scale It Down By 10 Or By 100?
Table 5.4 Moiz, Do You Want to Read Number 18?
Table 5.5 I Did Not Wait for Him

LIST OF FIGURES

Figure 3.1 Mapping Data Analysis for The First Research Question
Figure 3.2 Mark Mrs. G's Expertise Level on Presenting Learning Targets
Figure 3.3 Mapping Data Analysis for The Second Set of Research Questions
Figure 3.4 The "Wait Time" Use for EBs in FA
Figure 4.1 A Divide Task in Student Activity Book
Figure 5.1 The Math Task of 316.2 ÷ 6 on Student Activity Book
Figure 8.1 A Spreadsheet to Show How Well Mrs. G Enacted FA Based on the FARROP Rubric

CHAPTER 1: INTRODUCTION

In the United States, the number of emergent bilingual students (hereafter EBs) in public schools has greatly increased. According to the National Center for Education Statistics (NCES, 2020), the number of EBs (designated English language learners, ELLs, in official documents; however, I will use EBs throughout this study) in public schools has increased to five million, which means that EBs now make up 10.1 percent of the total student population. Although not among the states with the largest EB populations in the United States, Michigan has seen a surge of EBs in its public schools. For example, there were 97,838 EBs in Michigan public schools in the 2017-2018 school year, six percent of the state's K-12 student population (Sugarman & Geary, 2018). In addition, 8.4 percent of the total student population in Grades 3 to 5 are EBs, which is higher than the percentage of EBs in middle and high schools (Sugarman & Geary, 2018).
With the surge of EB student populations in our classrooms, conversations around high-leverage practices for all students, including EBs, have drawn increasing attention from researchers, policymakers, and practitioners. This is important in part because EBs face the challenge of developing both content knowledge and language at the same time in school (Bailey, Maher, & Wilkinson, 2018). In particular, there has been a call for research on equitable and responsive instruction and assessment that attends to EBs' language needs (Abedi, 2010; Schleppegrell, 2007; Martiniello, 2008; Lyon, 2012, 2013a).

Formative assessment (hereafter FA) is one teaching practice that holds promise for supporting all students, especially EBs (Alvarez, Ananda, Walqui, Sato, & Rabinowitz, 2014; Bailey, Maher, & Wilkinson, 2018). FA serves the purpose of assessment for learning. Different from summative assessment (i.e., assessment of learning), FA is "a planned, ongoing process used by all students and teachers during learning and teaching to elicit and use evidence of student learning to improve student understanding of intended disciplinary learning outcomes and support students to become self-directed learners" (CCSSO, 2017, p. 2). From a sociocultural perspective, FA recognizes the importance of social interaction as a vehicle for constructing student learning (Smith, Teemant, & Pinnegar, 2004; Black & Wiliam, 1998a, 2009; Gipps, 1999). Usually, high-quality FA involves the following stages: (1) clarifying learning targets (i.e., where are we going?); (2) eliciting student thinking and gathering evidence of student learning (i.e., where are we now?); and (3) acting on that evidence, such as by providing feedback to students and making instructional modifications (i.e., how can we get there?) (Bennett, 2011; CCSSO, 2017; Gotwals & Birmingham, 2016; Gotwals & Ezzo, 2018; Sadler, 1989).
FA, if used effectively, can improve students' achievement (Black & Wiliam, 1998a, 1998b) and "provide teachers and their students with the information they need to move learning forward" (Heritage, 2007, p. 140). Meanwhile, FA provides teachers not only with knowledge about "what a student says or writes in terms of mathematical or science content" but also with knowledge about "how a student is using language to express learning" (Bailey, Maher, & Wilkinson, 2018, p. 8). Researchers have found positive effects of using FA on student achievement in the context of math classrooms (Klute, Apthorp, Harlacher, & Reale, 2017; Silver & Smith, 2015). Both student-directed FA, which emphasizes involving students more actively, and teacher-directed FA during math instruction are effective for student learning: students who had access to FA performed better on mathematics assessments than students who were not exposed to the FA interventions (Klute, Apthorp, Harlacher, & Reale, 2017). In addition to enhancing students' learning of subject-matter concepts, using FA in classrooms supports EBs' long-term academic language development (García, Kleifgen, & Falchi, 2008; Alvarez, Ananda, Walqui, Sato, & Rabinowitz, 2014; Gibbons, 2015).

However, there seems to be a gap between research advocating the effectiveness of FA and research on how classroom teachers actually implement FA (Box, Skoog, & Dabbs, 2015). Compared with research investigating the effectiveness of FA, studies of how classroom teachers perform FA are still underdeveloped, due in part to the complexities of implementing FA (Philhower, 2018; Veon, 2016). Research shows that teacher knowledge and beliefs play a critical role in shaping teachers' FA practices (Box, Skoog, & Dabbs, 2015; Cross, 2009).
Unfortunately, little attention has been given to the connection among teachers' assessment expertise, their FA practices, and support for student learning, let alone to studies examining those connections within a linguistically and culturally diverse classroom setting. Therefore, the field still needs more information about how classroom teachers implement FA to support the learning of all students, including EBs. My study, as such, examines the nature of an experienced elementary teacher's FA practices in her fifth-grade math class in a public school located in a college town in the upper Midwestern United States. The study draws on multiple data sources: interviews with both the teacher and students, class observations with field notes and video recordings, and artifacts. The purpose of incorporating interviews with students is to better understand the "responsiveness" of the teacher's FA practices to EBs, as well as to triangulate data in this qualitative study.

Research Questions

I chose a fifth-grade math class given my research purpose and the linguistic diversity of EBs in the participating teacher's classroom. Research shows that EBs at the upper elementary levels can encounter more challenges in math problem solving, as math tasks become more linguistically and cognitively complex (Bunch, Walqui, & Pearson, 2014). In addition, the EBs in this fifth-grade class represented a wider range of home languages (e.g., Arabic, Farsi, and Vietnamese) than those in the other math classes that my focal teacher taught. The following research questions guide this study:

● What is the nature of the teacher's FA practices during a mathematics unit on division with whole numbers and decimals?

● How does the teacher perceive the challenges that EBs face during FA? What are the EBs' perceptions of their needs in learning mathematics? And to what extent are the teacher's FA practices, FA beliefs, and EBs' perceptions of their learning needs aligned?
Significance of The Study

Overall, FA can serve as a powerful tool in supporting all students' learning. FA supports student learning through practices such as clarifying learning targets, eliciting student thinking, and giving feedback. From a sociocultural perspective, FA emphasizes the importance of learners constructing knowledge as active contributors who interact with the teacher and their peers (Smith, Teemant, & Pinnegar, 2004; Black & Wiliam, 1998a, 2009; Gipps, 1999). By analyzing a mainstream teacher's FA practices and her interactions with all students, particularly EBs, this study can enhance our understanding of an elementary teacher's daily FA practices and thus inform us about the teacher's assessment expertise. Meanwhile, this study can contribute to the literature on equitable assessment by drawing attention to students' voices about their learning needs in math classrooms.

Following this chapter, I present a literature review that summarizes research on the definition and implementation of FA. A methods chapter then presents information about the participants, data sources, and data analysis. Two findings chapters follow. The first findings chapter responds to the first research question, with a focus on examining the nature of this teacher's FA practices, particularly how the teacher valued student ideas when she presented learning targets and elicited student thinking. The second findings chapter responds to the second set of research questions, with a focus on the teacher's interactions with EBs, particularly how multiple data sources inform us about the responsiveness and alignment among the teacher's beliefs about the challenges that EBs face, her FA practices, and EBs' perceptions of their learning needs in mathematics. Finally, I have a discussion and conclusion chapter where I make connections to the literature and describe the implications of my findings.
Define Terms

English Language Learners (ELLs): Students who come from non-English-speaking homes and communities and learn English as an additional language. Typically, ELL students are not yet fully proficient in English for communicating or learning effectively in school and thus need modified instruction (Genesee, Lindholm-Leary, Saunders, & Christian, 2005). ELLs are sometimes referred to as "English learners (ELs)," "language minorities," or "linguistic minorities" (LMs).

Emergent Bilinguals (EBs): Also known as English language learners (ELLs) or English learners (ELs) in the U.S. school system, EBs are students "who speak a language other than English and are acquiring English in school" (García, Kleifgen, & Falchi, 2008, p. 7). However, the definition of ELLs tends to focus on students' limited English proficiency, while their home languages and cultures are neglected or undervalued. Thus, to avoid potential deficit views of students who speak English as an additional language, researchers have proposed an alternative term: EBs.

Summative Assessment (SA): Also referred to as assessment of learning, summative assessment is a process of making a judgment based on standards, goals, and criteria at a given point (e.g., at the end of a unit) to gather evidence of student learning (Taras, 2005).

Formative Assessment (FA): Also referred to as assessment for learning, FA is a process of clarifying learning goals, gathering learning evidence, and closing the learning gap by providing feedback (CCSSO, 2017; Black & Wiliam, 1998a). FA highlights the role of feedback provided by teachers and student peers. Such feedback can reveal a gap between students' actual level of work and the required level and, at the same time, support closing that gap (Taras, 2005; CCSSO, 2017; Black & Wiliam, 1998a, 2009).
Assessment Expertise: This study adopts Lyon's (2013b) view and uses "assessment expertise" to emphasize "teacher growth toward more sophisticated and responsive assessment practices" (p. 443). A teacher's assessment expertise includes their knowledge of assessment (i.e., understanding assessment concepts) and their capability to translate that understanding and knowledge into daily assessment practices (i.e., facility with assessment practices) (Gearhart et al., 2006).

Linguistically Responsive Teaching (LRT): As an extension of the framework of culturally responsive teaching (Villegas & Lucas, 2002), LRT centers the role of language in discussions of teaching, especially when teachers work with EBs (Lucas & Villegas, 2013; Lucas, de Oliveira, & Villegas, 2014; Lucas, Villegas, & Freedson-Gonzalez, 2008). The knowledge and skills of linguistically responsive teachers include understanding EBs' language backgrounds, identifying the language demands of classroom tasks, scaffolding instruction for EBs, and applying key principles of second language learning (i.e., academic language proficiency, comprehensible input, first language transfer) in classrooms (Lucas, de Oliveira, & Villegas, 2014).

CHAPTER 2: LITERATURE REVIEW

In this chapter, I first review literature on the characteristics and process of formative assessment (hereafter FA) and on FA from sociocultural perspectives. Second, I review the literature on teaching and assessing EBs to uncover the challenges that EBs may encounter and how teachers can best support EBs during FA. Third, I discuss relationships among teachers' beliefs, knowledge, and FA practices. Finally, drawing on Lyon's (2013a) Conceptualization Framework of Assessment Expertise, I propose a need to rethink teachers' assessment expertise within the context of linguistically and culturally diverse classrooms.
The Characteristics and Process of FA

Assessment can be placed into two main categories based on its purpose: summative assessment and FA. Summative assessment refers to assessment of learning; its purpose is to evaluate what students know and do not know after learning has happened. In contrast, FA (i.e., assessment for learning) is a collaborative process of clarifying learning targets, gathering evidence, and providing feedback to inform student learning, as well as providing sources of evidence for teachers' instructional decision making (Popham, 2010; Stiggins, 2002). For a long time, researchers have suggested bringing assessment that can contribute to learner development into classrooms, rather than only measuring what students know through objective testing (Gipps, 1994; Crossouard, 2009). FA entails a paradigm shift in how students are assessed: from objective testing after a period of learning to ongoing assessment during the learning process (Sadler, 1989; Crossouard, 2009; Black & Wiliam, 1998a, 2009). To better understand FA, below I review relevant literature on the characteristics and process of FA.

Researchers have proposed that effective, high-quality FA consists of different stages. For example, Black and Wiliam (2009) argued that FA can be conceptualized as consisting of five stages: (1) clarifying and sharing learning intentions and criteria for success; (2) engineering effective classroom discussions and other learning tasks to elicit evidence of student understanding; (3) providing feedback that moves learners forward; (4) activating students as instructional resources for one another; and (5) activating students as owners of their learning.
Similarly, the Council of Chief State School Officers (CCSSO) (2017) highlighted that effective FA should embed the following five essential practices: (1) clarifying learning goals and success criteria; (2) eliciting and analyzing evidence of student thinking; (3) engaging in self-assessment and peer feedback; (4) providing actionable feedback; and (5) using the evidence and feedback to move student learning forward. Overall, these descriptions of the stages of FA can be summarized as questions teachers and students may ask themselves: "where are we going?", "where are we now?", and "how do we get there?" (Sadler, 1989; Hattie & Timperley, 2007; Gotwals & Ezzo, 2018; Gotwals & Birmingham, 2016). Below I explain each stage of FA in detail.

Where Are We Going? Articulating Learning Targets

Communicating with students about instructional goals and learning targets is essential in daily classroom instruction. Clear learning targets can guide both teaching and learning (Konrad, Keesey, Ressa, Alexeeff, Chan, & Peters, 2014) and "provide guidelines for assessing student learning" (Miller, Linn, & Gronlund, 2009, p.47). Similarly, the National Council of Teachers of Mathematics (NCTM, 2014) presents two benefits of communicating learning targets in classrooms: (a) guiding teachers' instructional decision making, and (b) promoting students' awareness of ownership of their learning so as to move their current learning forward. In classrooms, teachers need to explain both lesson and unit learning targets and the success criteria to their students (Veon, 2016). There are typically three ways of articulating learning targets and success criteria: (1) presenting learning targets to students (e.g., posting them on the whiteboard); (2) explaining learning targets to students verbally; and (3) introducing documents such as rubrics and standards (Chappuis, Chappuis, & Stiggins, 2009).
Without clear communication of learning targets, teachers are unlikely to assess students effectively and accurately (Chappuis, Chappuis, & Stiggins, 2009; Moss, Brookhart, & Long, 2011; Marzano, 2013).

Where Are We Now? Gathering Evidence About Student Learning

After clarifying "where to go," it is important for teachers to check how well students are learning and whether they have met the learning targets. When teachers provide students with opportunities to fully demonstrate what they have learned, the critical data that teachers gather can inform student-teacher interactions and thus help teachers and students pinpoint the gap between what students are expected to learn and what they currently understand (Heritage, Kim, Vendlinski, & Herman, 2009). Evidence of student learning can come from nearly everything students do: "conversing in groups, completing seatwork, answering and asking questions, working on projects, handing in homework assignments, even sitting silently and looking confused" are all potential sources "of information about how much they understand" (Leahy, Lyon, Thompson, & Wiliam, 2005, p.19). In mathematics class, there are various opportunities for a teacher to collect evidence of student learning. In particular, FA on a moment-by-moment basis allows teachers to collect real-time, rich, and flexible data documented via conversations and interactions (Heritage, Kim, Vendlinski, & Herman, 2009).

How Do We Get There? Actions Based on the Evidence of Student Learning

Teachers' actions based on the evidence of student learning are critical (Black & Wiliam, 1998a). There are two aspects of teachers' actions on this evidence: (1) feedback to students, and (2) instructional modification. Feedback to students can be one-on-one, group-based, or whole-class.
Usually, a teacher's response to a student's written work is one-on-one and often happens after the student has turned in the work; feedback during classroom discussion, by contrast, may be given to a larger group and can be delivered immediately (Black & Wiliam, 2009). To further clarify the forms and levels of feedback given to students, Hattie and Timperley (2007) reviewed literature on the effects of feedback on student learning and achievement. Their review presented four levels of feedback: (1) feedback not explicitly tied to assessment tasks (e.g., "good job"); (2) feedback about student task performance; (3) feedback about task processing; and (4) feedback for student self-regulation. Hattie and Timperley's (2007) review revealed a connection between student learning and achievement and the types of feedback students received. When students received feedback (e.g., at the process level) with "cues to directions for search and strategizing" or feedback that "leads to further engagement with or investigating further effort to the task," they perceived it as more powerful and beneficial (Hattie & Timperley, 2007, p.102). Findings from Hattie and Timperley's (2007) study suggest that teachers need to make sure their feedback is goal-oriented, tangible and transparent, actionable, user-friendly, timely, and consistent to ensure its effectiveness (Wiggins, 2012).

Teachers also "seek and learn from feedback" and use evidence of student learning to adjust their instruction (Hattie & Timperley, 2007, p.104). There are two types of instructional adjustments: remediation and enhancement (Sass-Henke, 2013; Veon, 2016). Remediation is "a set of corrective activities, such as peer tutoring, review games, and additional assistance from the teacher" (Veon, 2016, p.36), usually used when students have not met the learning targets. In contrast, enhancement activities are often added when students have exceeded the learning targets (Veon, 2016).
Yet making decisions about instructional modification is not easy for teachers. Elementary math teachers tend to struggle to decide how to modify their instruction based on students' performance on mathematics tasks (Heritage, Kim, Vendlinski, & Herman, 2009).

In a nutshell, FA, if used systematically and appropriately, has positive effects on student learning (Black & Wiliam, 1998a; Kingston & Nash, 2011; Bennett, 2011). For example, Black and Wiliam (1998a) conducted a meta-analysis reviewing 250 empirical studies on FA to illustrate evidence of its effects. The effect sizes for FA ranged from 0.4 to 0.7, meaning FA has much larger effects on student achievement than most other educational interventions or instructional strategies. Efforts to enact FA, if implemented effectively, could raise student performance from the 50th percentile to the 85th percentile (Black & Wiliam, 1998b). In particular, FA practices were found to benefit low achievers significantly (Black & Wiliam, 1998b). In addition, the use of FA can enhance student self-esteem and motivation (Nicol & Macfarlane-Dick, 2006; Weurlander, Söderberg, Scheja, Hult, & Wernerson, 2012). During FA, students have more chances to understand what they are expected to learn, to know the current stage of their learning, and to receive descriptive feedback from teachers and peers (Nicol & Macfarlane-Dick, 2006; Weurlander, Söderberg, Scheja, Hult, & Wernerson, 2012). The process of making sense of teachers' and peers' feedback thus allows students to improve their understanding of the content (Nicol & Macfarlane-Dick, 2006).

Sociocultural Perspective of FA

From a sociocultural perspective, assessments "recognize the importance of sociocultural activity as a vehicle for integrating these desired outcomes, and it anticipates the variability in performance that can occur across particular situations" (Smith, Teemant, & Pinnegar, 2004, p.40).
The notion of assessment for learning (i.e., FA) rests on an underlying assumption that learning is accomplished through social interactions in communities (Black & Wiliam, 1998a, 2009; Veon, 2016; Gipps, 1999; Crossouard, 2009). During FA, students have opportunities to receive scaffolding from teachers and more capable peers, and thus move their learning forward (Heritage, 2007; Torrance, 2012; Sardareh & Saad, 2013; Crossouard, 2009). For example, Box, Skoog, and Dabbs (2015) found that students' understanding was enhanced when they were paired with peers and teachers to engage in the learning and assessment process. Also, language and other social tools can be used to mediate student learning and promote learner development (Vygotsky, 1978; Gibbons, 2006; Smith, Teemant, & Pinnegar, 2004).

Another key feature of assessment for learning is student involvement (Bell & Cowie, 2001). Oftentimes, it is assessment experts or teachers who design, administer, and grade assessments, while students remain passive. FA, in contrast, highlights the importance of perceiving learners as active contributors to their own learning (Black & Wiliam, 1998a; Gipps, 1999; Ruiz-Primo, 2011). Seeing FA as sociocultural echoes Cross's (2008) work, which claims that effective mathematics instruction demands that teachers use multiple learner-oriented tasks and activities to support students in becoming powerful mathematical thinkers. To leverage students' participation experiences during FA, classroom teachers need to create a classroom culture characterized by openness and acceptance, where students feel safe and comfortable working with teachers and their peers (Box, Skoog, & Dabbs, 2015).

Discursive FA

Usually, there are two types of FA in classrooms: formal and informal FA.
According to Ruiz-Primo and Furtak (2007), formal FA usually starts from teacher-selected or teacher-designed activities (e.g., checklists for student self-assessment, questioning), while informal FA, oftentimes, is "more improvisational and can take place in any student-teacher interaction at the whole-class, small-group, or one-on-one level" (Ruiz-Primo & Furtak, 2007, p.59). Compared with formal FA, during informal FA teachers tend to gather information and react to it (e.g., to students' ideas) on the fly by recognizing a student's response (Ruiz-Primo & Furtak, 2007; Ruiz-Primo, 2011). Informal FA, oftentimes, is viewed as a "socially situated activity" (Moss, 2008; Ruiz-Primo, 2011, p.16). This ongoing informal FA process, which emphasizes teachers collecting information on a continuing basis, has been referred to as "assessment conversation" (Ruiz-Primo & Furtak, 2007; Ruiz-Primo, 2011). This study refers to this type of FA as discursive FA. The boundaries between discursive FA and instruction are often blurry because discursive FA is an essential part of instruction (Sadler, 1989). Using discursive FA allows teachers to gather reliable and solid evidence of students' learning by drawing on daily class activities; at the same time, it allows teachers to react to students' responses using flexible and multiple modes of feedback (e.g., oral text, written text, a combination of oral and written text, and visual supports) (Ruiz-Primo, 2011). In addition, discursive FA allows teachers to elicit students' thinking explicitly and to recognize students' language use in an unobtrusive manner (Ruiz-Primo & Furtak, 2007; Ruiz-Primo, 2011). Effective discursive FA is (a) learning goal-guided; (b) dialogic and interactive; (c) used as an instructional scaffolding tool; and (d) used as a supportive tool for social participation and cognition (Ruiz-Primo, 2011).
Given those features, it is clear that interaction is a core idea of discursive FA. However, traditional student-teacher interaction patterns, such as Initiation-Response-Evaluation/Feedback (IRE/IRF), do not provide much space for students to explain their ideas because students are led toward predetermined answers (Herbel-Eisenmann & Breyfogle, 2005). For performing discursive FA (i.e., assessment conversation), researchers proposed the concept of ESRU cycles, which comprises four discourse moves: the teacher Elicits, the Student responds, the teacher Recognizes the response, and the teacher Uses the information collected. The purpose of the ESRU cycle is to support students' involvement and participation. Specifically, a complete ESRU cycle consists of four steps: (a) the teacher asks a question, (b) students respond, (c) the teacher recognizes students' responses, and (d) the teacher uses the information collected to support student learning (Ruiz-Primo & Furtak, 2007). As these four moves indicate, listening to students' responses and reacting to students' ideas is very important. The core spirit of the ESRU cycle, to a large extent, echoes the higher-level questioning patterns that Herbel-Eisenmann and Breyfogle (2005) proposed. Based on the extent to which students have space and opportunities to explain and justify their thinking, there are typically three types of questioning patterns in student-teacher interaction in math classrooms: IRE/IRF, funneling, and focusing. Funneling "occurs when the teacher asks a series of questions that guide the student through a procedure to a desired end" (Herbel-Eisenmann & Breyfogle, 2005, p.485). In other words, students may have some space to explain and justify their thinking during the funneling pattern, but overall they are mainly directed toward the answer the teacher expects.
The focusing pattern, on the contrary, requires teachers not to rush toward the desired answer. Instead, teachers need not only to listen to their students' responses but also to provide students with enough space to explain their thinking. Students' thinking and contributions are highly valued in the focusing questioning pattern (Herbel-Eisenmann & Breyfogle, 2005). If teachers do not probe student thinking and foreground students' role in constructing knowledge, they miss opportunities to assess student knowledge (Box, Skoog, & Dabbs, 2015).

FA within Linguistically and Culturally Diverse Classrooms

With an increasing number of EBs in K-12 classrooms, schools are becoming more and more linguistically and culturally diverse. This diversification demands attention to the responsiveness of FA to all students' needs. However, what does FA look like in the context of linguistically and culturally diverse classrooms? How does a teacher differentiate and tailor instruction and assessment to meet EBs' needs? To answer these questions, Smith, Teemant, and Pinnegar (2012) proposed that assessment in a linguistically diverse classroom should be useful, meaningful, and equitable. The usefulness of assessment means that it should support learning and teaching via timely and actionable feedback. The meaningfulness of assessment refers to assessing relevantly and accurately; in other words, it is important to ensure that assessment reflects the key content of the curriculum and best elicits student learning. During FA, students can get prompt and helpful feedback in a meaningful context such as whole-class discussion. The nature of FA, especially discursive FA, demonstrates that it meets the criteria of usefulness and meaningfulness. Thus, FA per se can be used as a powerful tool for supporting all students', but especially EBs', math learning.
Another feature of responsive classroom assessment is equity, which emphasizes the importance of "fairly accommodating students' sociocultural, linguistic, and development needs" (Smith, Teemant, & Pinnegar, 2012, p.41). To meet the demands of equity in assessment, FA in linguistically and culturally diverse classrooms needs to "provide each student with appropriate opportunity to demonstrate what they know and can do" (Smith, Teemant, & Pinnegar, 2012, p.41). For the equity dimension, this study focuses on the influence of sociocultural factors, especially language, on EBs' assessment performance and content learning. Because research on classroom assessment for EBs is still underdeveloped, I included relevant studies examining the assessment of EBs beyond the subject area of mathematics. By reviewing those studies, I hope to uncover the challenges that EBs face in content-area learning and assessment, and to explore how teachers can best support EBs during FA. Specifically, I reviewed relevant literature from two perspectives to illustrate what equitable FA should look like within a linguistically and culturally diverse classroom setting: (a) reducing the linguistic and text complexities of assessment tasks in math classrooms; and (b) providing EBs with a language-enriched math learning environment.

Reducing Linguistic and Text Complexities of Assessment Tasks

Assessing EBs in equitable ways means maximizing the opportunities for EBs to demonstrate their math knowledge without being constrained by language complexities (Kersaint, Thompson, & Petkova, 2013; Clark-Gareca, 2016; Martiniello, 2009; Abedi & Lord, 2001; Abedi & Levine, 2013). In other words, it is significant to ensure that assessments are comprehensible for EBs both linguistically and culturally (Siegel, Wissehr, & Halverson, 2008). Beyond the necessary content knowledge, EBs also need a certain level of academic language proficiency to complete assessment tasks.
If assessment tasks, both written and oral, are not comprehensible and accessible for EBs, their content knowledge can be underestimated due to the language barrier. Linguistic features that are challenging for EBs include complex syntactic features (e.g., long sentences with relative clauses); low-frequency words, synonyms, and polysemous words (e.g., "raised"), which carry different meanings or connotations; and the context of the math tasks (e.g., a fund-raising-related word problem) (Martiniello, 2008; Schleppegrell, 2007; Martiniello & Wolf, 2012). Word problems in mathematics with multiple sentences and complex syntactic structures are likely to add cognitive load for EBs and place demands on working memory. Vocabulary with multiple meanings may hinder EBs' accurate understanding of math tasks, while a lack of cultural references or background knowledge may make EBs feel disconnected and thus increase their frustration in completing the task. In mathematics assessment, polysemous words and cultural references have been found to be particularly challenging for EBs in math story problems (Martiniello & Wolf, 2012; Martiniello, 2008). These linguistic and text complexities influence EBs' comprehension and problem-solving processes in math classrooms (Martiniello & Wolf, 2012). Thus, it is important to fairly accommodate EBs' language needs. Research shows that EBs who received accommodations matched to their linguistic and cultural needs outperformed those who did not receive such support in mathematics assessments (Kopriva, Emick, Hipolito-Delgado, & Cameron, 2007).

To reduce the linguistic complexities and challenges that EBs encounter during math assessment, teachers need to identify language demands, such as the semantic and syntactic complexities of classroom tasks (Lucas, Villegas, & Freedson-Gonzalez, 2008).
Martiniello and Wolf (2012) also pointed out several ways to alleviate the challenges that EBs may encounter: (a) ensuring EBs understand the particular meaning of a polysemous word; (b) exposing EBs to different connotations of polysemous words in classroom discourse; (c) explaining the meaning of words or phrases (e.g., "a total of," "at least") explicitly; (d) guiding EBs in unpacking the meaning of complex texts; and (e) facilitating and encouraging the use of students' native languages.

Providing EBs with a Language-Enriched Math Learning Environment

Maximizing learning opportunities for EBs also means providing them with accessible and rigorous content learning (Siegel, Wissehr, & Halverson, 2008; Goldenberg, 2008; Moschkovich, 2007; Walqui & Van Lier, 2010). This echoes the principles that Siegel and colleagues (2008) proposed for assessing EBs: (a) challenging students to think about difficult ideas, (b) eliciting student understanding, and (c) scaffolding EBs' use of language to support their learning. This study defines a language-enriched math classroom as one in which teachers hold high expectations for EBs and create opportunities for EBs to talk about their math thinking and understanding (Kersaint, Thompson, & Petkova, 2013; Moschkovich, 2013). A language-enriched math classroom also gives EBs multiple ways to work through math tasks (e.g., explaining their math thinking orally, or demonstrating their reasoning process in written texts) (Kersaint, Thompson, & Petkova, 2013). In other words, in a language-enriched math classroom, all students, including EBs, "are encouraged and expected to talk about mathematics and represent their understanding in a variety of ways-with words, pictures, and symbols" (Kersaint, Thompson, & Petkova, 2013, p.127). For example, math talk, according to Moschkovich (2018), provides teachers with more reliable and solid evidence of students' learning progress than written assessment.
The National Council of Teachers of Mathematics (NCTM, 2013) also advocated that teachers orchestrate classroom discussions to support EBs' learning. It argued that EBs should have opportunities to access multiple modes of math learning (e.g., speaking, writing, reading, and listening) with necessary language support (e.g., personal math dictionaries, bilingual dictionaries, and word walls) (NCTM, 2013; Kersaint, Thompson, & Petkova, 2013). To support EBs' participation in language-enriched math classrooms, it is important to use assessment flexibly and to incorporate multiple modes of expression into the math classroom (Moschkovich, 2018, 2007). For example, allowing EBs access to visuals and gestures, as well as oral and written texts, is essential in math classrooms. As Moschkovich (2007) claimed, "the shift to considering multiple modes of expression is particularly important to assess the competencies of students who are learning English" (p.348). It is also important for teachers to establish a learning environment that supports students' active engagement and, at the same time, to adapt their instruction when necessary to make the content more accessible for EBs (NCTM, 2013; Kersaint, Thompson, & Petkova, 2013). Teachers can provide students with extra wait time and scaffold students' conceptual understanding by breaking down math questions into manageable pieces for EBs in whole-group discussion (Cardimona, 2018).
Teachers can also consider bringing the following strategies into a linguistically diverse math classroom: (a) using extra-linguistic supports (e.g., visual tools) as a medium other than language for EBs; (b) supplementing and modifying written text (e.g., adapting the text to make the language more accessible without simplifying the content); (c) supplementing and modifying oral language (e.g., pausing, outlining, repeating, establishing routines); (d) giving clear and explicit instructions; (e) engaging EBs in interacting with peers to negotiate meanings (e.g., modifying student talk by asking how and why questions and facilitating their conversations); and (f) minimizing EBs' anxiety during classroom instruction and assessment (Kersaint, Thompson, & Petkova, 2013; Lucas, Villegas, & Freedson-Gonzalez, 2008).

Altogether, FA plays an important role in providing high-quality instruction (Black & Wiliam, 1998a). An instructional system that does not make explicit provisions for FA practices is not complete (Sadler, 1989; Heritage, Kim, & Vendlinski, 2009). If used appropriately, FA can be one of the most powerful tools for teachers' classroom teaching and decision making (Dorn, 2010). Therefore, teachers need to use FA to move student learning forward. FA is even more important in classrooms with EBs because these students are working on their language proficiency as well as content knowledge (Alvarez, Ananda, Walqui, Sato, & Rabinowitz, 2014). Providing appropriate accommodations and support during classroom assessment is one way to elicit better evidence of what EBs know and can do (Abedi, Lord, Hofstetter, & Baker, 2000; Siegel, 2007; Clark-Gareca, 2016). When teachers have better evidence, they are better able to provide accurate feedback and make instructional decisions during FA. Yet acknowledging the significance and benefits of FA for student learning does not guarantee its sound implementation in classrooms.
When implementing FA in classrooms, various factors, such as teachers' beliefs and knowledge, influence teachers' decision making during their FA practices. To better understand teachers' FA practices within a linguistically and culturally diverse classroom, it is essential to explore the relationships among teachers' beliefs, knowledge, and FA practices. The purpose of examining teachers' beliefs in this study is to illuminate their classroom practices and thereby reach a better understanding of the teacher's FA practices.

Relationships Among Teachers' Knowledge, Beliefs, and FA Practices

Teachers play an important role in the student learning process; as Cross (2009) stated, "teachers organize and shape the learning context and therefore have an enormous influence on what is being taught and learned" (p.325). In particular, teachers' knowledge plays a critically important role in their capability to put theories, such as their understanding of FA, into practice (Box, Skoog, & Dabbs, 2015; Schoenfeld, 2011). According to Abell and Siegel (2011), what a teacher knows and can do influences how they plan and implement assessment, as well as how they use evidence to inform teaching and student learning. In addition, teachers' belief systems have a significant influence on their instructional and assessment practices (Box, Skoog, & Dabbs, 2015; Guadu & Boersma, 2018; Cross, 2009). For example, researchers found that teachers who used student-centered assessment approaches (e.g., peer review, conducting labs together, group presentations) perceived students as having the potential to learn from their peers as they constructed knowledge (Box, Skoog, & Dabbs, 2015). On the contrary, teachers who perceived their role as "filling their open minds by explaining what they needed to know about science" tended to use teacher-centered assessment (Box, Skoog, & Dabbs, 2015, p.971).
Moreover, teachers' beliefs about the nature of a subject (e.g., mathematics) influence what types of classroom activities they design and the way they interact with students (Cross, 2009; Beswick, 2005; Stipek, Givvin, Salmon, & MacGyvers, 2001). For instance, teachers who perceived mathematics as computation tended to give lecture-based instruction with an initiate-respond-evaluate (IRE) questioning pattern (Cross, 2009). In contrast, teachers who perceived mathematics as a "thinking process" and "problem solving" focused on the role of the teacher as a coach and on students negotiating the meanings of mathematics with peers and teachers through discourse (Cross, 2009, p.332).

Despite the powerful influence of teachers' knowledge and beliefs on their assessment practices, the links between teachers' beliefs and practices are not always clear (Beswick, 2005). While some research shows consistency among teachers' beliefs, knowledge, and practice (e.g., Thompson, 1984; Cross, 2009; Stipek, Givvin, Salmon, & MacGyvers, 2001), other research shows a discrepancy between teachers' reported beliefs and knowledge and their classroom practices (Guadu & Boersma, 2018; Shield, 1999). For example, Guadu and Boersma (2018) found that teachers reported positive beliefs about the role of FA in supporting students' learning in the English as a Second Language classroom. Yet teachers' implementation of FA to monitor student learning, as well as their scaffolding when necessary during FA, fell short of expectations (Guadu & Boersma, 2018). Given these potentially inconsistent links between teachers' beliefs and their practices, researchers have proposed conducting studies that include evidence not only from teachers' verbal accounts but also from teachers' classroom practices (Anderson & Piazza, 1996). There is a need to further examine the extent to which teachers' knowledge and beliefs predict their classroom assessment practices.
Research has also found that contextual factors, such as the types of students taught (e.g., students with diverse backgrounds), influence teachers' classroom assessment practices (Gross, 2019; Box, Skoog, & Dabbs, 2015). Therefore, it is fundamental to examine the consistency of teachers' knowledge, beliefs, and classroom practices when they assess students with linguistically and culturally diverse backgrounds.

Reconceptualizing Teachers' Assessment Expertise

In this study, I consider teacher beliefs as part of teacher knowledge (Pajares, 1992), although most of the literature reviewed in the previous section treats teacher knowledge and beliefs as two separate concepts. Drawing on this broader definition of teacher "knowledge," I believe that teacher knowledge in assessment, namely assessment expertise, plays a significant role in shaping a teacher's FA practices within a linguistically and culturally diverse classroom. Research shows that, to provide students with effective assessment opportunities, it is important to unpack teachers' assessment expertise and explore what assessment knowledge and skills teachers need to develop (Abell & Siegel, 2011). In particular, the increasingly diverse student population in mainstream classrooms demands that teachers build not only their knowledge and skills in general assessment but also their expertise in teaching diverse learners (Darling-Hammond, 2006). In this context, Lyon (2013a) reconceptualized teachers' assessment expertise for the linguistically and culturally diverse classroom and pointed out that it should consist of three dimensions: (1) assessment design, (2) assessment use, and (3) assessment equity (see Table 2.1 for a summary of each dimension).
This study uses Lyon’s (2013a) Conceptualization Framework of Assessment Expertise as a lens to investigate the nature of a teacher’s enactment of FA within a linguistically and culturally diverse math classroom.

Table 2.1 A Summary of Lyon’s (2013a) Conceptualization Framework of Assessment Expertise

Assessment Design
● Alignment focuses on whether teachers consider different assessment tasks to elicit and engage student thinking.
● Cohesion focuses on whether teachers consider the cohesion among the assessment task, the specific learning objective, and specific criteria.

Assessment Use
● Curricular context refers to the extent to which teachers consider when and why they select specific assessment activities or tasks in unit teaching.
● The cycle of inquiry focuses on whether teachers consider assessment strategies (e.g., an entire cycle of inquiry of FA) to support student learning.

Assessment Equity
● Fairness represents the extent to which teachers consider and act on language and cultural influences on assessment.
● Access represents the extent to which teachers consider providing EBs with opportunities to engage in complex thinking, develop language, and fully participate.

The table shows that assessment design emphasizes the alignment and cohesion of assessment tasks, learning targets, and criteria. While this study acknowledges the importance of cohesion and alignment in assessment, the dimension of assessment design is not a focus here, because Mrs. G followed the curriculum strictly and did not modify or design assessment tasks. In addition, FA, especially discursive FA, often does not involve a teacher intentionally planning and designing assessment tasks. Therefore, this study focuses on the dimensions of assessment use and assessment equity. 
Regarding the dimension of assessment use, this study focuses on the subdimension of the cycle of inquiry, spotlighting how the teacher communicates learning targets with students, how she gathers learning evidence from students using different strategies (e.g., questioning), and how she responds to students’ ideas. For the dimension of assessment equity, this study considers sociocultural factors in assessing EBs, with a particular focus on language. There are two subdimensions of a teacher’s assessment expertise in equity: assessment access and assessment fairness. According to Lyon (2013a), assessment access refers to whether teachers consider providing EBs with rigorous content learning, such as supporting EBs’ full participation, developing EBs’ complex thinking and academic language, and providing EBs with tailored feedback. Assessment fairness refers to the extent to which teachers consider sociocultural (e.g., language and cultural) influences on assessment and the strategies that can be used to support EBs. I will revisit Lyon’s (2013a) assessment expertise in the dimension of equity in the methodology chapter. Even though Lyon’s (2013a) framework was developed in the context of linguistically and culturally diverse science classrooms, the characteristics and needs of EBs share many common features across subject areas. More importantly, this framework can conceptually guide teachers’ classroom assessment in a linguistically and culturally diverse math classroom; a useful framework can reach far beyond its original context and be applied across disciplines. Therefore, this study brings in Lyon’s (2013a) framework to guide the analysis of the teacher’s FA in an elementary math classroom. To sum up, FA has received increasing attention for its purpose of gathering evidence to inform student learning and teacher instruction. 
If used appropriately, FA can leverage student learning experiences and improve student achievement (Black & Wiliam, 1998a; Kingston & Nash, 2011). Effective FA usually includes three stages: (1) where are we going: articulating learning targets; (2) where are we now: eliciting and analyzing student thinking; and (3) how do we get there: taking action based on the evidence of student learning (Gotwals & Ezzo, 2018; Gotwals & Birmingham, 2016). Throughout these stages, a growing number of researchers and practitioners embrace engineering effective discussions to elicit evidence of student understanding (CCSSO, 2017; Black & Wiliam, 1998a, 2009). This advocacy of classroom discussion during FA, to a large extent, echoes the sociocultural perspective on assessment for learning. FA from a sociocultural perspective foregrounds the importance of interactions among students, teachers, and peers, as well as student involvement. As such, student-centered assessment approaches and feedback, such as self-assessment and peer feedback, are highlighted in discursive FA (Black & Wiliam, 2009; CCSSO, 2017). Using FA, especially discursive FA, allows teachers to gather reliable and solid evidence of student learning via questioning strategies while providing prompt feedback to learners. With the increasing population of EBs in math classrooms, teachers must attend to EBs’ needs in developing both content and language. To provide high-quality and equitable instruction to EBs, issues such as how to use FA as a promising way to support all students’ learning, and how to reduce language barriers during FA so that assessment tasks are comprehensible for EBs, need to be further articulated. 
A review of the current literature reveals two ways to ensure the responsiveness of FA: a) reducing the linguistic complexity of assessment tasks in math to ensure assessment fairness (Kersaint, Thompson & Petkova, 2013; Martiniello, 2008; Schleppegrell, 2007; Martiniello & Wolf, 2012) and b) providing EBs with a language-enriched math learning environment to ensure assessment access (Kersaint, Thompson & Petkova, 2013; Moschkovich, 2013; NCTM, 2013; Lucas, Villegas, & Freedson-Gonzalez, 2008; Lyon, 2013a). However, teachers’ understanding and knowledge about using FA to support all students’ learning do not equal their implementation of FA. When translating their theories about FA into classroom practices, discrepancies may arise (Guadu & Boersma, 2018; Shield, 1999) due to influences from teachers’ knowledge and beliefs and other contextual factors such as student characteristics (Box, Skoog, & Dabbs, 2015; Cross, 2009). Thus, researchers have pointed out that it is important to include evidence both from teachers’ verbal accounts (e.g., interviews) and from teachers’ actual implementation when conducting studies examining teachers’ classroom practices (Anderson & Piazza, 1996).

CHAPTER 3: METHODOLOGY

As a reminder, this study is guided by the following research questions:
● What is the nature of the teacher’s FA practices during a mathematics unit on division with whole numbers and decimals?
● How does the teacher perceive challenges that EBs face during FA? What are the EBs’ perceptions of their needs in learning mathematics? And to what extent are the teacher’s FA practices, FA beliefs, and EBs’ perceptions of their learning needs aligned?
To answer those questions, this study adopts a qualitative exploratory case-study design. According to Yin (2003), a case study is “an empirical inquiry that investigates a contemporary phenomenon within its real-life context” (p. 13). 
This study was conducted in a natural setting: a math class in a local public elementary school. Using a single case-study design allowed me to collect multiple sources of data and thus to develop an in-depth understanding of the teacher’s FA practices within one mathematics unit. I consider this study an exploratory case study, although it may also involve characteristics of a descriptive case study. An exploratory case study is often used to answer “what” and “how” questions and to “explore any phenomenon in the data which serves as a point of interest to the researcher” (Zainal, 2007). In this study, I considered a “typical” case, which focuses on a “representative case” (Seawright & Gerring, 2008, p. 299). Often, the typical case shows a stable and widely present phenomenon (Seawright & Gerring, 2008). Being “representative,” the case does not have to be “exemplary.” The purpose of doing so is to explore what FA practices look like in a “typical” linguistically and culturally diverse classroom, and how this teacher’s FA practices provide evidence of her assessment expertise. It is common for mainstream classrooms to have only a few EBs who nonetheless represent wide linguistic diversity. Therefore, it is important to better understand what a teacher’s FA practices look like in this typical classroom context so that we can better support teachers in using FA to support all students.

Context and Participant

The School Site

The school site, Rochester Elementary School (a pseudonym), is a public school serving students from preschool through fifth grade, located in a college town in the upper Midwestern United States. Forty percent of the student population at Rochester Elementary received free or reduced-price lunch. According to Mrs. G, the school has often enrolled EBs from immigrant families who work at a local university. However, the EB population has shifted since 2008; there are more and more EBs from refugee families. 
The EB population that has increased most comes from Arabic-speaking countries. Rochester Elementary School adopted a program called Math Expressions, an inquiry-based K-6 curriculum built on NSF-funded research, such as research on learning math through real-world situations and multiple ways to solve problems. The Math Expressions curriculum aims to help students “make sense of math by exploring, discussing, and demonstrating their understanding of key concepts” (Houghton Mifflin Harcourt, n.d.). Equipping students with a deeper understanding of math is the hallmark of the program. The school also provides pullout ESL support: EBs who have not exited English language services are pulled out of mainstream classrooms to receive explicit language instruction focused on English speaking, listening, reading, and writing. In addition, the school provides push-in services, in which an ESL paraprofessional works with EBs in mainstream classrooms during math instruction. However, the math class I observed did not receive the push-in service because Mrs. G was not the students’ homeroom teacher. I chose Mrs. G’s math class because of my research purpose and the class’s EB population. In this fifth-grade math class, EBs may need more accommodations and support, because EBs at the upper elementary levels may encounter more challenges in math problem solving as math tasks become more linguistically and cognitively complex (Bunch, Walqui & Pearson, 2014). In addition, the students in this class presented a wider linguistic diversity of home languages than the other math class Mrs. G taught. A close look at Mrs. G’s FA practices in this fifth-grade math class allowed me to better understand upper elementary EBs’ voices about their learning needs, as well as how Mrs. G responded to those needs. 
Context of the Math Class: EBs

In the fifth-grade math class that I observed, there were twenty-nine students in total. Mrs. G reported that six of them were EBs. According to my interview with Mrs. G, the EBs’ English proficiency was assessed using the World-Class Instructional Design and Assessment (WIDA) test and the local school’s screening test. Four EBs were identified as level 3 (developing), and two were identified as level 4 (expanding). “Level 3-developing” indicates that EBs have general and some specific language of the content areas and can use expanded sentences, but may make phonological, syntactic, or semantic errors in their oral and written language. “Level 4-expanding” indicates that EBs have specific and some technical language of the content areas, can use a variety of sentence lengths in oral discourse, and make minimal phonological, syntactic, or semantic errors that may impede the meaning of communication (WIDA, 2012). EBs identified at the developing level were required to work with an ESL teacher at Rochester Elementary a couple of days a week. EBs identified at the expanding level had exited ESL services but still needed to participate in the WIDA test. Regarding their home language backgrounds, four of the EBs speak Arabic as their home language, and two speak Farsi. In addition to the aforementioned EBs that Mrs. G reported in the math class, I include another student, Jasmine, in this study (all students’ names used in this study are pseudonyms). Mrs. G did not give me Jasmine’s name when I requested information about EBs in her class at the beginning of the unit. According to Mrs. G, Jasmine is proficient in English. However, during my interview with Jasmine, I learned that she is bilingual and is acquiring both English and Vietnamese; she speaks Vietnamese with her parents and grandparents at home. 
Thus, I consider Jasmine an EB learner and include her in this study. In total, I include seven EBs in this study to discover students’ voices regarding what support they may need during FA. The purpose of doing so is to better understand the “responsiveness” of Mrs. G’s FA. For information about the EBs’ language backgrounds, see Table 3.1 below.

Table 3.1 EBs’ Language Backgrounds

Student Name   English Proficiency Level   Home Language Spoken
Tasnime        Developing                  Arabic
Nisha          Developing                  Arabic
Salam          Developing                  Farsi
Moiz           Developing                  Farsi
Lulua          Expanding                   Arabic
Safia          Expanding                   Arabic
Jasmine        Proficient                  Vietnamese

The Focal Teacher: Mrs. G

Mrs. G (a pseudonym), the focal participant, is a female elementary teacher at Rochester Elementary School. At the time of this study, Mrs. G had been teaching at an elementary school and a middle school in the college town for over ten years. Before that, she worked in an urban public school district in a larger city for six and a half years; she moved from the larger city to this college town for family reasons. Mrs. G holds an undergraduate degree in Elementary Education with a double major in science and social studies. She was certified in the 1990s to teach all subject areas in elementary schools and to teach science and social studies in secondary schools. Later, Mrs. G obtained her first Master’s degree, in educational technology, which provided her with the knowledge and skills she needed while teaching in the larger city. After that, she obtained another Master’s degree, in Public Administration, because of her interest in becoming a school principal. Mrs. G speaks English as her native language. She has some foreign language learning experience, having studied Spanish in high school for three years, though she acknowledged that she has almost forgotten how to speak Spanish and mentioned that she wants to pick it up again after retirement. 
I got to know Mrs. G in 2016 through a colleague’s connection. I reached out to one of the professors on my dissertation committee and indicated my interest in observing how elementary teachers support EBs in mainstream classrooms. The professor emailed a school principal, who forwarded the email to all the school’s teachers to see who was willing to have a university doctoral student observe in their classes. Mrs. G responded and welcomed me to visit her class. After knowing Mrs. G for two years, I identified her as a good example of a typical case and a good research participant because of her extensive teaching experience and her reflectivity. Mrs. G had worked with diverse learners, including EBs, over the previous 15 years, and her school provides a good context for how EBs are integrated into mainstream classrooms. Another reason I chose Mrs. G is her reflectivity. In my interviews with her, she connected her teaching practices with current research and showed a strong commitment to self-improvement in learning to teach. She conducted action research to investigate gender equity issues in her math class, presented her research at conferences, and mentored intern teachers from the local university. Mrs. G also sought out professional development (PD) opportunities in teaching EBs. Although her initial preparation had not equipped her well to work with EBs, she participated in PD in the middle of her career to learn how to better support them. During one PD program, all elementary and middle school teachers were invited to get together and discuss strategies that were useful for all students, especially EBs; this PD lasted over two years, with teachers gathering for a half-day every few months to look into different strategies. Mrs. G had also been participating in an action research project focused on teacher discourse moves in math classrooms. 
This project was led by a math education professor at the local university.

Data Collection

I visited Mrs. G’s classroom for an entire unit of mathematics instruction from February to March 2019, spending 13 days observing her class during this time. The unit covered division with whole numbers and decimals. It was important to examine her practice over an extended period so that I could see the ways in which she communicated learning targets, elicited students’ thinking throughout the unit, and gave feedback to students. Gathering data from the entire unit allowed me to see her ongoing FA practices. In this study, the primary data sources included classroom observations with field notes and videotaped lessons, interviews, and artifacts. Collecting different sources of data allowed for triangulation, thus increasing the credibility and quality of the findings in this study (Yin, 2003; Flick, Garms-Homolova, Hermann, Kuck, & Rohnsch, 2012). As a reminder, I had visited Mrs. G’s fourth-grade math class once a week from 2017 to 2018. Although the class I observed for this study was different, those previous visits allowed me to build rapport with Mrs. G and become familiar with her teaching routine. In addition, before starting data collection, I visited Mrs. G’s fifth-grade math class once a week and volunteered in the class beginning in the fall semester of 2018. This allowed students to get used to having a “guest” in their class. Below I present details about how I collected these data.

Class Observations and Teaching Videos

When I observed Mrs. G’s class, I focused on her instructional practices, such as when and how she communicated learning targets with students, what questions she asked to elicit student thinking, and how she responded. For each classroom observation, I took field notes using a lesson observation protocol (see Appendix A). While visiting Mrs. G’s class, I used a “direct observation” approach (Yin, 2003, p. 92). 
I sat in the back of the classroom and did not participate in or intervene in her teaching, because I wanted to ensure that the lessons I observed were authentic and natural. Classroom observations allowed me to better understand the context and the ongoing process of her FA. They also provided me with clues about what probing questions to ask regarding Mrs. G’s specific FA practices. All classroom observations were video recorded. The math unit that I observed included 11 lessons in total. Table 3.2 presents detailed information about each lesson’s focus in Unit 5. In this table, “L” represents “lesson” within the unit.

Table 3.2 Information About the Math Unit
L1: Compare division methods
L2: Experiment with two-digit divisors & estimation
L3: Underestimating
L4: What to do with the remainder
L5: Practice dividing & solve division word problems
L6: Divide a decimal by a one-digit number
L7: Use money to see shift patterns, relate decimals to multiplication, & change decimal divisors to whole numbers
L8: Use money to see shift patterns, and change decimal divisors to whole numbers
L9: Divide mentally & solve division problems
L10: Whole number and decimal operations, make predictions, and mixed real-world applications
L11: Math and currency

It took me four weeks to complete the classroom observations. Due to uncontrollable factors such as snow days, school professional development meetings, and Mrs. G’s attendance at a conference, some of our scheduled class visits and interviews were postponed. I collected 12 teaching videos. Each class that I observed lasted around 65 minutes; in total, the videos lasted 11 hours and 55 minutes. The last lesson focused on a unit test, which primarily served a summative assessment purpose, and I did not seek a follow-up about how Mrs. G used evidence from this unit test to support student learning. 
Therefore, I excluded the last videotaped lesson from data analysis in this study. Table 3.3 shows a timeline of my classroom observations. I use “CO” to represent “classroom observation” and “Ls” to represent “lessons” of the math unit. In this table, I distinguish “CO” and “Ls” because one classroom observation did not always correspond to one complete lesson. Certain lessons (e.g., Lesson 1) took Mrs. G two class periods to finish, so I visited her class twice for Lesson 1.

Table 3.3 A Timeline of My Classroom Observations

CO:   CO1   CO2   CO3   CO4   CO5   CO6    CO7  CO8  CO9  CO10  CO11
Date: 2/13  2/14  2/19  2/20  2/22  2/26   3/4  3/5  3/6  3/7   3/8
Ls:   L1    L1    L2    L3    L4    L4&5   L6   L7   L8   L9    L10

Interviews

The case study interview is “open-ended” in nature (Yin, 2003). When conducting the case study interviews, I used semi-structured questions in an interview protocol and included spontaneous, friendly questions to keep the interviews conversational and open-ended (Yin, 2003). I collected interview data from both Mrs. G and students. In the following sections, I first present how I conducted interviews with Mrs. G and then how I gathered interview data from students.

Interview with Mrs. G

I conducted semi-structured interviews and informal interviews with Mrs. G. Specifically, I conducted three semi-structured interviews: one before Mrs. G began her unit teaching, one in the middle, and one at the end. An interview protocol (see Appendix B) guided the semi-structured interviews. These interviews focused on three aspects: (1) Mrs. G’s professional learning opportunities on assessment (e.g., courses, workshops); (2) Mrs. G’s understanding and implementation of FA; and (3) challenges that Mrs. G encountered while assessing EBs. In addition to the three semi-structured interviews, I also conducted modified stimulated recall interviews (Sherin & van Es, 2005) with Mrs. 
G to reflect on certain teaching moments and discuss specific instructional moves that she made. I call these “modified stimulated recall interviews” because I did not select video clips for us to watch together; instead, I asked questions related to her teaching moves at certain moments based on my observation notes. The teaching moves I selected focused on the teacher’s questioning practices and her interactions with EBs. I discuss why I chose the modified stimulated recall interview, and its potential limitations, in the conclusion chapter. Both the semi-structured and informal interviews were conducted in Mrs. G’s math classroom. According to the school schedule, there was a longer recess from 10:00 to 10:40 each morning, and we conducted these interviews during that time. Each interview lasted around 30 minutes and was audio recorded. I also included interview data that I collected from Mrs. G in 2017 in the data analysis.

Interview with Students

Conducting interviews with students allowed me to better understand how Mrs. G’s FA was responsive to EBs’ learning needs. Prioritizing students’ voices about their learning needs is critical when investigating a teacher’s classroom assessment practices. Before conducting interviews with students, I received nine signed consent forms: seven from EBs and two from non-EBs. After obtaining the consent forms, I first met with the school principal, who recommended a quiet conference room for the interviews. I then checked with the school administrative assistant to ensure that students knew where the conference room was and felt comfortable talking with me. After that, I talked with Mrs. G, and she sent the first student to the conference room. At the end of each interview, I asked the student who had just finished to bring the next student to the conference room. 
The purpose of doing so was to reduce the potential distraction of my interviews for the whole class. I chose class time to conduct my interviews because students said that they did not want their recess time to be taken. An interview protocol (see Appendix C) guided my semi-structured interviews with students. This protocol focused on EBs’ perspectives regarding challenges relevant to the math tasks and the supports they expected Mrs. G to provide during classroom assessments. One element not shown on the interview protocol was a “warm-up” section. During each interview, I first introduced myself to the student, including where I came from, which university I attended, my family, and my language background. After that, I asked students to talk about their families, favorite school subjects, and the languages they speak at home. The purpose of these “warm-up” questions was to reduce students’ anxiety and, at the same time, to gather information about students’ language backgrounds. All interviews were conducted in the conference room except my interview with Tasnime, because the conference room was reserved for a meeting that day. I audio recorded all of the interviews. My interview time with each student ranged from 20 to 50 minutes, with the majority lasting around 30 minutes.

Artifacts

The artifacts I collected include students’ worksheets, Mrs. G’s lesson slides, and the teacher’s guide for Unit 5. I also collected EBs’ journals for this math unit. Collecting these documents allowed me to gain an understanding of the math curriculum and thus to better understand the teacher’s FA practices. In addition, these documents can “corroborate and augment evidence from other resources” (Yin, 2003, p. 87).

Data Analysis

I used both inductive and deductive coding approaches for data analysis. 
The inductive approach (Strauss & Corbin, 1998) was used to find emergent themes by drawing on interviews and teaching videos. The deductive approach, also known as a top-down approach, “employs the ideas from a theoretical framework or other driving ideas” (Galman, 2016, p. 24). I conducted deductive coding to look for patterns in Mrs. G’s FA practices as well as the support she provided to EBs. To present how I used these two approaches to code and analyze data, I structured this section around the study’s research questions: the first section focuses on the first research question, and the second section focuses on the second set of research questions.

Section 1

In Section 1, the inductive and deductive coding aimed at responding to the first research question: What is the nature of the teacher’s FA practices during a particular mathematics unit? Figure 3.1 shows the relationships among the study’s first research question, data sources, and corresponding findings.

Figure 3.1 Mapping Data Analysis for the First Research Question

As Figure 3.1 presents, I used inductive and deductive coding approaches to analyze three sets of data sources: interviews with Mrs. G, teaching videos, and artifacts. Specifically, I first used the Formative Assessment Rubrics, Reflection, and Observation Protocol (FARROP) guide (Wylie & Lyon, 2016) to code and analyze the teaching videos deductively. The purpose of using the FARROP rubric was to examine Mrs. G’s FA practices and to look for patterns in the math class. The artifacts, in particular Mrs. G’s lesson slides, were used when analyzing Mrs. G’s FA practices in presenting learning targets. I then used an inductive approach, open coding, to code and analyze the interviews with Mrs. G and the teaching videos. Emerging themes from the inductive coding were used to supplement and make sense of the themes and patterns generated from the deductive coding.

Why the FARROP Rubric in Deductive Coding? 
Using the FARROP rubric (Wylie & Lyon, 2016) as an analytic tool enabled me to observe Mrs. G’s FA practices more explicitly and concretely based on its pre-developed dimensions. The data analysis using the FARROP rubric connects to Lyon’s (2013a) Teacher’s Assessment Expertise Conceptualization Framework, with a focus on the cycle of inquiry within the dimension of assessment use. According to the FARROP, ten dimensions of FA can be observed; see Table 3.4 for specific information.

Table 3.4 Dimensions of FA According to the FARROP Rubric (Wylie & Lyon, 2016)
1. Learning goals
2. Criteria for success
3. Tasks and activities that elicit evidence of student learning
4. Questioning strategies that elicit evidence of student learning
5. Self-assessment
6. Extended thinking during discourse
7. Descriptive feedback
8. Peer feedback
9. Using evidence to inform instruction
10. Collaborative culture of learning

According to the FARROP rubric (Wylie & Lyon, 2016), the ten dimensions of FA map onto different stages of FA: (1) where are we headed, (2) where are we now, and (3) how do we close the gap (p. 3). Specifically, the first two dimensions answer the question of where we are headed; dimensions 3, 4, and 5 answer the question of where we are now; and dimensions 6, 7, and 8 answer the question of how to close the gap. Dimensions 9 and 10 cut across each stage of FA: all of dimensions 1-8 involve using evidence to inform instruction and a collaborative culture of learning.

How Did I Code Data Using the FARROP?

Before starting to analyze the teaching videos, I discussed the notion of each dimension of FA in the FARROP guide with my advisor, an experienced scholar, to ensure that my understanding was consistent with hers. While using the FARROP to analyze Mrs. G’s FA practices, I centered on her FA practices in general, namely her interactions with students. 
Specifically, I focused on the interaction between Mrs. G and the whole class, because whole-class discussion was what I could best hear in the videos I collected. I first went through each video and noted instances of each dimension of FA. For example, I identified sentences such as “I can…,” “Today we will talk about…,” and “Our goal is…” as evidence of whether Mrs. G communicated learning targets with students explicitly in each lesson. Then I wrote notes about what happened; sometimes I transcribed the conversation between Mrs. G and the students. After that, I used the FARROP rubric to determine the level of her practice across lessons. Figure 3.2 presents an example of how I marked the level of Mrs. G’s practices in presenting learning targets, which included how Mrs. G presented learning goals and criteria for success, if at all. The video I used to mark Mrs. G’s expertise level in Figure 3.2 was recorded on February 22, 2019.

Figure 3.2 Marking Mrs. G’s Expertise Level in Presenting Learning Targets

Similarly, I marked Mrs. G’s expertise levels in the other dimensions of FA using the FARROP rubric. In light of the levels I marked for each teaching video, I looked across the lessons for patterns and themes. First, I created a spreadsheet (see Appendix D) to illustrate how well Mrs. G enacted FA throughout the unit. In the meantime, I wrote memos about my cross-video analysis to note my preliminary findings on Mrs. G’s FA practices, as Table 3.5 presents.

Table 3.5 Excerpts from a Cross-Video Analysis Memo
● There is no clear evidence showing that Mrs. G provided students with opportunities for self-assessment and peer feedback during the FA practices, although she let students work with table partners on math tasks. In addition, no explicit evidence of descriptive feedback was observed. 
“Descriptive feedback” refers to formal verbal or written feedback on an individual student’s specific piece of work (Wylie & Lyon, 2016). However, there is not much evidence of the teacher’s formal feedback on students’ specific work products. Instead, her feedback focuses on responses to students’ ideas during the whole-class discussion.
● In terms of the different stages of FA practices, it seemed that the levels of the teacher’s clarification and communication of learning targets with students varied. In other words, there is a wide range in the teacher’s practices of presenting learning targets. In addition to learning targets, another dimension with a relatively wide range in Mrs. G’s FA is using questioning strategies to elicit student thinking. There are also missed opportunities for Mrs. G to elicit or extend student thinking due to her insufficient use of wait time.
● Data showed relatively consistent use in Mrs. G’s FA in the following two dimensions: (1) tasks and activities to elicit evidence of learning, and (2) a collaborative culture of learning where teachers and students are partners in learning.

An overarching pattern that emerged from my cross-video analysis is that there was a wide range in Mrs. G’s expertise levels in implementing FA, particularly in the dimensions of presenting learning targets and eliciting student thinking using questioning strategies. To make sense of these variations and ranges, I conducted inductive coding after completing the deductive coding.

Inductive Coding

The value of inductive coding lies in its grounding theory in data rather than analyzing the data with a preconceived theory in mind (Strauss & Corbin, 1998). In other words, using inductive coding allowed me to discover Mrs. G’s teaching perceptions and practices as they emerged from the data.
To make sense of the patterns generated from deductive coding, I first conducted open coding (Strauss & Corbin, 1998) to look for emerging concepts and categories that might explain the range of Mrs. G’s expertise levels in implementing FA. To do this, I moved iteratively between the interview transcripts and the teaching videos. As Table 3.6 in the next section shows, I labeled concepts in the margins of the interview transcripts; I then grouped those concepts into categories based on the common purpose of the teacher’s talk moves. Using concepts that emerged from the open coding of interview data, I went back to the teaching videos and looked into interactions between Mrs. G and students in the whole-class discussion. I transcribed video clips where I found higher and lower levels of FA practices, such as using questions to elicit student thinking. After that, I used the video transcripts to analyze Mrs. G’s FA practices and make sense of her lower- and higher-level FA practices (see Table 3.7 for an example of a video clip with codes). In total, the inductive coding showed three emerging themes: (1) valuing students’ ideas; (2) a funneling questioning pattern; and (3) wait time (or lack thereof). I will explain the three themes in Findings Chapter I.

Section 2

This section focuses on how I used inductive and deductive coding approaches to look for themes that answer the second set of research questions: How does the teacher perceive challenges that EBs face during FA? What are the EBs’ perceptions of their needs in learning mathematics? And to what extent are the teacher’s FA practices, FA beliefs, and EBs’ perceptions of their learning needs aligned? I first explain how I coded data using open coding and axial coding to find emerging themes and patterns by drawing on three sets of data sources: interviews with Mrs. G, interviews with EBs, and teaching videos.
Building on the inductive coding, I then explain how I used Lyon’s (2013a) Assessment Expertise Rubric as an analytic tool to look into Mrs. G’s expertise in assessment equity deductively from two perspectives: assessment fairness and assessment access (see Table 3.8 for descriptions of a teacher’s assessment expertise levels). Figure 3.3 on the following page presents an illustration of my data analysis process.

Figure 3.3 Mapping Data Analysis for the Second Set of Research Questions

Inductive Coding

For the inductive coding in this section, I used open coding, axial coding, and a constant comparison approach (Strauss & Corbin, 1998; Charmaz, 2006). Using inductive coding allowed me to gain insights into Mrs. G’s FA practices in a linguistically and culturally diverse classroom setting without having a preconceived theory in mind. I first conducted open coding for the interview data, and then I performed open coding for the teaching videos. Next, I conducted axial coding to “uncover relationships among categories” (Strauss & Corbin, 1998, p.127) and to look for broader patterns.

Open Coding for Interview Data

For the open coding of the interview data, I conducted the following procedures: (1) labeling and naming concepts sentence by sentence; and (2) grouping concepts into categories. The purpose of grouping concepts is to “reduce the number of units” that I was working with and “to explain and predict” phenomena (Strauss & Corbin, 1998, p.113). In Table 3.6, I present an example to show how I labeled and grouped concepts using an interview excerpt.

Table 3.6 “Scaffolding Student Participation”: An Example of Open Coding
Transcripts (with codes; category: Scaffolding student participation):

You know one of the good decisions that I made is that I try hard at the beginning of the lesson to get my ELL students involved at a lower level to gauge where they are at their day, and then I do my best to return to them later.
[Concept 1] Scaffolding EBs’ participation

In the task I might give them, [I] might be I want you to listen to Snow’s response because when Snow finishes, I’m going to ask you what you heard. [Concept 2] Returning to elicit EBs’ thinking

Table 3.6 (cont’d)

There are times where I don’t do that also because if you let a child off the hook they are tuned in to being a learner. [Concept 3] Benefits of returning to students

The other thing that I’ve given them is there are times when I ask a question and their response is a question. So, they know that I don’t expect them to give me an answer every time if they need clarification. you know that is a valuable contribution. [Concept 4] Being flexible with EBs’ responses

It’s safe to say I need to know more about and then take charge of the conversation that way, you know, that’s a valuable contribution. [Concept 5] Valuing EBs’ contribution

According to the table, five concepts were named in this interview excerpt. I grouped these five concepts into the category of “scaffolding student participation” based on the common purpose of the emerging concepts. Similarly, I utilized the same open coding procedures to analyze the interview data with EBs. Analyzing the interview data with EBs allowed me to triangulate Mrs. G’s FA practices as well as look into students’ voices about their learning needs. All the concepts and categories generated from the interview data were used for axial coding. I discuss how I performed these coding procedures in the axial coding section.

Open Coding for Teaching Videos

Identifying Teaching Episodes

In order to have a closer look at how Mrs. G responded to EBs’ learning needs during FA, I first identified episodes with interactions between Mrs. G and EBs for data analysis. In this study, I define an episode as a whole conversation that Mrs. G and the students had while they were solving a math task.
Usually, I used her turns of talk, such as “now let’s move to the next question”, “go ahead for [question] number 3”, or “I would like you to do next is …”, as signals that she had completed discussing a math task with students and would start a new one. In total, I selected 13 episodes and transcribed them. The criteria I used to select these video episodes were: (1) the conversation is about elaborating how to solve a math task; and (2) EBs were engaged in the conversation and were asked to share their thinking. I mainly included video episodes, rather than the full-class videos, for data coding and analysis, because there was no need to transcribe a full teaching video to learn patterns in Mrs. G’s support of EBs and her expertise in assessment equity. In addition, some teaching moments within the lessons were either irrelevant to the objective of this study or not clearly audible on the recording (e.g., small-group discussion). While I did not transcribe the full teaching videos, I went back to the full class recordings to better understand the contexts of the episodes during my data analysis.

Open Coding Procedures with Teaching Videos

I began coding and analyzing teaching videos at a “microanalysis” level (Strauss & Corbin, 1998, p.109), which focuses on the teacher’s discourse moves and turns of talk. For example, Table 3.7 shows the whole class discussing whether to use multiplication or division to solve a math story problem. In this episode, the conversation started when Mrs. G initiated questions and invited Lulua to join the conversation; it ended with Mrs. G moving to the next math problem. In the transcripts, I used parentheses “()” for supplementary information that I added to support understanding of the video episodes. I used brackets “[]” to add utterance information, for example, a word that was not spoken or was misspoken.
I used the ellipsis “…” to represent teacher-student conversations that I intentionally cut out. Those omitted conversations are either not relevant to the teacher-student interaction or not very close to the research purpose in a long teaching episode. I used “< >” to represent missing words that could not be recorded or transcribed.

Table 3.7 An Example of a Video Episode

Mrs. G: Gus ran 3.6 miles. He took a sip of water every 0.9 miles. How many sips did he take? What is the situation equation for this? Lulua (an EB, a pseudonym), do you know what the situation equation is?
Lulua: It will be 3.6 divided by 0.9.
Mrs. G: Is that situation?
Ss: Oh, situation? Um, 0.9 times whatever number equals 3.6.
[Codes: Read aloud; Confirm student ideas]
Mrs. G: (Mrs. G wrote down 0.9×S=3.6 on the whiteboard) Awesome. To solve it, you did 3.6 divided by 0.9 to get the number of S (Mrs. G wrote down 3.6 ÷0.9=S). You said to solve [it], you need to divide. Notice our solution equation and situation equation on the last one was the same. Will the answer be greater or less than 3.6?
Ss: I think it’s greater
[Codes: Revoice student response and then check student’s understanding]
Mrs. G: You thought Greater? Why do you think that?
Ss: Um, < > (could not record and transcribe clearly)
Mrs. G: You were [not] here yesterday, so in your defense that you did not hear the rich conversation we had about that. What we take away from yesterday’s conversation related to dividing by a decimal, so when we were dividing something smaller than our dividend, we said? Simon (a non-EB, a pseudonym), what [did] we say?
Ss: Ignore the decimal and divide by < >.
[Codes: Probe student thinking; Connect to yesterday’s conversation]
Mrs. G: But that is not the question I asked. The question I asked was [that] will be the answer greater or less than 3.6?
Ss: Greater.
[Code: Reiterate question]
Mrs. G: And I am looking for the reason why.
Ss: If you move decimal out of 3.6, it is actually 36.
And if you could move decimal out of 0.9 [it is 9]. 36 divided by 9 equals 4, so it is greater than 3.6.
[Code: Probe student thinking]

Table 3.7 (cont’d)

Mrs. G: Okay. What confuses me about what you just said is that you did 36 divided by 9. If you did 36 divided by 9, 4 is not bigger than 36. Lulua, do you have a different take on it?
Lulua: I think why he says that is because you taught yesterday that [you] need to scale up to a whole number. So he scaled [up] 3.6 and 0.9. That is how I got it is greater.
Mrs. G: So add back the decimal?
Ss: Yeah, adding back.
Mrs. G: But I did not put it back when I put my answer here.
Ss: So when you put back the decimal, really what you need to do is putting back the decimal here in the 36.
Mrs. G: Okay. So even though the answer is 4, the dividend you are paying attention to is the original.
Ss: Ah
Mrs. G: Is that confusing to anyone else?
Ss: Kinda confusing.
Lulua: Yeah
Mrs. G: One of the things that I was thinking about was this idea of 3 times 5 equals 15, 3 times 50 equals 150. You have to scale up two of the three numbers, right, to just put the 0 in to do something with the place value. And in this case, if my 3.6 divided by 0.9 equals 4, the numbers I am scaling up are on the same side, but I still get the same answer. And that is confusing, right? Because if we were ignoring 0 here (pointed at 50) and put them on later (pointed at the quotient of 150), it is on the other side of the equal sign. And this scale (pointed out 3.6 ÷0.9=4; 36 ÷9=4) happened on the same side of the equal sign, and you do not have to do anything to the answer. So I think that is confusing.
Ss: Yeah.
[Codes: Pointed out confusing places and invited Lulua to share her ideas; Pointed out where the problem is; Check for students’ understanding; Give more examples to illustrate the confusing place]
Mrs. G: And it is the way it is, but it’s a place to get stopped. Go ahead for number 3.
[Code: Move to a new task]
Videotaped on March 8, 2019

In the margins of the video episode transcripts, I first labeled the purpose of each of Mrs. G’s turns of talk. How I named each turn was informed by the students’ responses. Then I began to categorize the concepts based on their shared characteristics. For example, to elicit students’ thinking, Mrs. G used multiple strategies such as revoicing, activating students’ previous math knowledge, and providing extra examples to clarify the point of confusion in the whole-class discussion. In fact, all these strategies were used to scaffold students’ mathematical conceptual understanding and participation. Thus, a core category that emerged here is “scaffolding”. Below I discuss how I connected the different categories and themes that I obtained from open coding during the axial coding stage.

Axial Coding

Following the open coding, I conducted axial coding to “uncover relationships among categories” (Strauss & Corbin, 1998, p.127) and to look for broader patterns. While doing axial coding, I included comparative analysis of the categories and concepts generated from open coding. After a few rounds of coding, I focused on comparisons to look for alignment and misalignment across the three data sources. For example, Figure 3.4 below shows how I compared emerging concepts and categories and the alignment or misalignment of “wait time” use across different data sources: interviews with Mrs. G, interviews with EBs, and class observations.

Figure 3.4 The “Wait Time” Use for EBs in FA

According to Figure 3.4, concepts and categories generated from an interview with Mrs. G showed that Mrs. G valued “wait time” use for all students. The benefits of additional “wait time” were echoed by EBs as well. However, the category that emerged from the open coding of teaching videos reveals that Mrs. G did not provide EBs with enough “wait time”. The three sets of data sources together speak to a “misalignment” pattern between Mrs.
G’s beliefs about “wait time” use, EBs’ perceptions, and her implementation of “wait time” during discursive FA.

Deductive Coding: Using Lyon’s (2013a) Assessment Expertise Rubric as an Analytic Tool

As a reminder, Lyon (2013a) conceptualized teachers’ assessment expertise from three perspectives: assessment design, assessment use, and assessment equity. This study centered on Mrs. G’s expertise in the dimensions of assessment use and assessment equity. First, the assessment design feature was not salient in this study because Mrs. G followed the math curriculum strictly. Second, according to Lyon (2013a), FA is primarily categorized under the dimension of assessment use. I have discussed how I used the FARROP rubric to code and analyze Mrs. G’s FA practices in general. In this section, I illustrate how I used Lyon’s (2013a) Assessment Expertise Rubric as an analytic tool to support data coding with a focus on Mrs. G’s interactions with EBs and on Mrs. G’s expertise in the dimension of assessment equity, which includes two sub-dimensions: assessment fairness and assessment access. As Table 3.8 below shows, a teacher’s expertise in equity can be categorized into four levels: not present, introducing, implementing, and elaborating. I made some modifications to Lyon’s (2013a) Assessment Expertise Rubric on the dimension of equity, for example, replacing the term “Language Minority (LM)” with EBs and replacing the context of “Science” with math, so that the rubric matches the context of math teaching.

Table 3.8 Modified Assessment Expertise Rubric on the Dimension of Equity (Lyon, 2013a, p.1229)

Equity: Fairness
● Not present: Does not consider the sociocultural influence on assessment.
● Introducing: Considers sociocultural influences, such as that (a) students come in with various backgrounds, (b) language/culture influence, (c) multiple forms of assessment should be used, or (d) assessment features (content and structure) should match the context of instruction.
● Implementing: Considers at least one strategy that draws attention to the influence of language and culture on assessments (e.g., modify the language, scaffold the language, modeling, differentiate assessments).
● Elaborating: Considers at least one strategy for contextualizing assessment or incorporating students’ language and culture in the design and use of assessment.

Equity: Access
● Not present: Does not consider opportunities for EBs to engage in complex thinking, to develop language, or to fully participate.
● Introducing: Considers assessments that allow students to talk mathematics, read or write authentic mathematics texts, or learn the language of mathematics, but does not explicitly link this to EBs’ needs.
● Implementing: Considers (explicitly) how assessment can help EBs fully participate in mathematics and develop complex thinking and academic language.
● Elaborating: Considers (explicitly) how assessment can help EBs fully participate in mathematics and develop complex thinking and academic language, AND provides feedback tailored to EBs’ needs.

Regarding assessment fairness, the level of “not present” refers to teachers not considering the sociocultural influences on assessments. The level of “introducing” refers to teachers considering students’ various backgrounds and language and cultural influences on students’ assessment performance, using multiple forms of assessment, and ensuring assessment features (content and structure) match instruction. The “implementing” level focuses on the specific strategies that teachers use: it refers to teachers considering at least one strategy that draws attention to the influence of language and culture on assessments, such as modifying language, scaffolding language, modeling, and differentiating assessments.
The highest level is “elaborating”, which emphasizes teachers contextualizing assessments or incorporating students’ language and culture when designing and using assessments. As with assessment fairness, there are four levels of teachers’ expertise in assessment access: not present, introducing, implementing, and elaborating. As its name indicates, the level of “not present” refers to teachers not considering opportunities for EBs to engage in complex thinking, language development, and full participation. The level of “introducing” focuses on teachers’ general sense of using discourse to support students’ content learning and language development, but without an explicit link to EBs’ needs. At the “implementing” level, teachers begin to explicitly attend to supporting EBs’ full participation, promoting their complex thinking and language development. The highest level, “elaborating”, builds on what teachers do at the “implementing” level; in addition, teachers need to consider providing tailored feedback based on EBs’ learning needs. Tailored feedback is the key indicator that distinguishes a teacher’s expertise in assessment access at the “implementing” versus the “elaborating” level.

Two stages were involved in the process of deductive coding using Lyon’s (2013a) rubric. I first categorized emerging concepts and themes from the inductive coding around assessment fairness and access. Building on this categorization, I then used Lyon’s (2013a) assessment expertise rubric to analyze Mrs. G’s expertise levels on the “equity” dimension. The constructs of assessment fairness and assessment access are used throughout the data analysis presented in Findings Chapter II.

To sum up, the case study research design allowed me to better understand Mrs. G’s FA in a real-life context. I collected different data sources with the purpose of triangulation.
The data sources included class observations with field notes and video recordings, interviews with both the teacher and students, and artifacts. I used both deductive and inductive coding approaches for data analysis. Specifically, I used the FARROP rubric (Wylie & Lyon, 2016) and Lyon’s (2013a) Assessment Expertise Rubric to look into Mrs. G’s FA practices, her support for EBs, and how the support that Mrs. G provided informs her expertise in assessment equity. In terms of the inductive coding approach, I engaged in open coding and axial coding (Strauss & Corbin, 1998) because they allowed me to identify themes and patterns that are grounded in the data. Of course, there can be an interplay between the processes of inductive and deductive coding (Strauss & Corbin, 1998). Even though I tried to analyze the data without having a preconceived theory in mind during my open coding process, it is difficult to avoid being influenced by the research questions residing in my subconscious. In other words, some deductive elements may have crept into my inductive coding process. Similarly, certain inductive components may have been involved in my deductive coding process.

CHAPTER 4: RESEARCH FINDINGS I

Findings in this chapter draw on data analysis guided by the FARROP rubric (Wylie & Lyon, 2016) and a sociocultural lens on formative assessment. I first present the formative assessment (hereafter FA) practices that Mrs. G usually conducted, namely discursive FA; then I present Mrs. G’s stated values in foregrounding students’ ideas and her enactment of discursive FA. Mrs. G reported that there are three ways she used FA to gather evidence of student learning. The first is getting information from students through discourse. The second is “lesson checks”, namely quick quizzes. And the third is “math discussions more formalized”, such as asking students to come to the front and share their thinking (interview with Mrs. G, March 26, 2019).
I consider the third way, formalized math discussion, an essential component of the “discourse” that she reported. Thus, there are two major ways that Mrs. G collected data: (1) discourse and (2) quick quizzes. In this findings chapter, I focus on her discursive FA practices in the whole-class discussion. Discourse is an essential part of FA practices in Mrs. G’s class, as she pointed out: “discourse in our classrooms is a huge part of that formative assessment…to drive instruction and make decisions for the next day” (interview with Mrs. G, March 13, 2019). Discursive practices provide great opportunities for Mrs. G to understand students’ learning in an “organic” way and, at the same time, for students to identify gaps in their learning. When I asked her thoughts about the importance of discourse in the math classroom, Mrs. G said,

I think for me one of the most important things is that [it] helps me understand where they are, and it helps them understand what they might not understand. Like listening to the conversation from that video from last Monday, for example, the children were saying “oh I understand this now” and that was completely organic. [Interview with Mrs. G, March 13, 2019]

According to Mrs. G, using discourse also allows her to develop students’ skills in “discourse capacity” (interview with Mrs. G, March 13, 2019), which, as a result, can benefit students’ future learning in middle school, where the math curriculum is primarily inquiry-based. In this chapter, I first briefly describe the ways in which Mrs. G’s responses to interview questions illustrate the value she sees in discursive FA; then I illustrate what her discursive FA practices looked like and how her practices aligned (or did not align) with her reported values. Specifically, I focus on how Mrs. G communicated learning targets and how she elicited student thinking and responded to student responses during her FA practices.
Creating a Classroom Culture of Valuing Student Ideas

One theme that emerged from my interviews with Mrs. G is that she intended to create a classroom culture of valuing student ideas. Mrs. G commented that only in a classroom where students feel that the teacher and their peers value their contributions would students feel safe and comfortable participating in discussions. When I asked her how she supported students to participate in class discussions, Mrs. G said,

The other thing that I’ve given them is [that] there are times when I ask questions and their response is a question. So they know that I do not expect them to give me an answer every time if they need clarification, it’s safe to say I need to know more about and then take charge of the conversation that way, you know, that’s a valuable contribution. As they know I value it, and their peers value it, so it is all about practice and helping them feel comfortable with that. [Interview with Mrs. G, March 13, 2019]

Mrs. G tried to create a classroom culture of valuing students’ ideas in different ways, for example, by allowing students to respond with questions, acknowledging students’ ideas and providing positive feedback, and bringing in particular students’ ideas from previous classes as a way to bridge the conversation when communicating learning targets with students. Mrs. G also believed that “wait time” plays an important role in supporting students to present their ideas in discursive FA. She felt that using wait time allowed students more time to process information and organize their thoughts, and thus to feel more comfortable participating in classroom discussion. For example, she said,

My wait time in instruction has taken years of work. Through my action research, I’ve recognized that it is a strength… When students know you won’t let them off the hook but will wait for them, they are more likely to respond with a shot [Interview with Mrs.
G, March 2017]

In a classroom culture of valuing students’ ideas, what does Mrs. G’s discursive FA practice look like? How does she foreground student ideas and use wait time as a strategy to leverage students’ learning experiences? How consistent are Mrs. G’s reported values and her FA practices? In the next section, I illustrate Mrs. G’s enactment of discursive FA in her math class.

Enactment of Discursive FA

In terms of the enactment of discursive FA, I explore two specific aspects of Mrs. G’s formative assessment: (1) communicating and clarifying learning targets, and (2) eliciting and responding to students’ ideas. These two practices emerged, through analysis, as the main ways in which she enacted discursive FA. In these descriptions, I point out the relationships between the higher levels of these practices and how she used these practices to create a classroom culture of valuing student ideas. Conversely, I provide instances of lower-level FA practices and show how they tended to align with practices that did not place as much value on creating a classroom culture that foregrounds students’ ideas.

Communicating and Clarifying Learning Targets

A cross-video analysis showed that there is a large range in how Mrs. G presented learning targets. According to the FARROP rubric (Wylie & Lyon, 2016), Mrs. G’s practices in clarifying learning targets with students ranged from “not presenting at all” to “extending”. Overall, however, the trend of Mrs. G’s practices in clarifying learning targets fell between “developing” and “progressing”, as Table 4.1 shows.

Table 4.1 Teacher’s Assessment Expertise Levels in Clarifying Learning Targets

Lesson 1 (February 13, 2019), Beginning: Mrs. G only posted success criteria (i.e., 5.5.1 I can divide multi-digit numbers by single-digit divisors) on the board.

Lesson 2 (February 14, 2019), Not observed: In the middle of this lesson, Mrs. G started to use her slides. The learning target on the first slide (lesson 1, 5.5.1) flashed. Students did not have time to notice those success criteria; in the meantime, Mrs. G did not express the success criteria posted on the slide. So I counted Mrs. G’s practice of presenting learning targets in this lesson as “not observed”.

Lesson 3 (February 19, 2019), Beginning: Mrs. G only posted success criteria (i.e., I can divide multi-digit numbers by multi-digit divisors) on the board. No further explanation was provided to deepen students’ understanding of the learning targets.

Lesson 4 (February 20, 2019), Between Developing and Progressing: Mrs. G presented the success criteria on a slide at the beginning of the lesson but without any further verbal or nonverbal explanation. Additionally, she presented the focus of the day’s lesson verbally, with references made to previous learning. Mrs. G said to students, “we’ll talk more today using estimation to solve larger problems. So our goal is to use estimation to solve larger division problems”. Mrs. G also provided an opportunity for students to engage with the learning target by asking students to connect to what they had learned before. She said, “Moiz, do you remember what trouble about estimation we talked about yesterday?” However, this engagement was not deep, and a larger sequence of learning was not explained.

Lesson 5 (February 22, 2019), Progressing: Mrs. G presented the focus of the day’s lesson after a warm-up activity (i.e., a class discussion). She said, “Awesome, fifth graders, you can also just ignore it [i.e., decimal]. Depends on what the situation is, right? This is what we talk about today”. Mrs. G provided students with an opportunity to think and warm up before she shared the learning focus of the day’s lesson. However, she did not explain how the current lesson fits within a larger sequence of learning, and no reference back to the learning targets was made at the end of the lesson.

Lesson 6 (February 26, 2019), Not observed: Mrs. G only presented tasks that students needed to complete. No learning targets were shown in any form.

Lesson 7 (March 4, 2019), Progressing: Mrs. G presented criteria for success (i.e., I can divide dividends with decimals by whole numbers) to students near the start of the lesson. She communicated the learning targets by letting a student read the success criteria aloud, followed by a discussion about key concepts (i.e., decimal, dividend) related to the learning targets.

Lesson 8 (March 5, 2019), Developing: In the middle of the lesson, after a whole-class discussion, Mrs. G presented the focus of the day’s lesson in accessible language. She said, “I saw some of you have finished page 212. It is okay if you have not finished it. Today we are going to talk about dividing by a decimal divisor”. However, Mrs. G did not explain how the current lesson fits within a larger sequence of learning, and no reference back to the learning targets was made at the end of the lesson.

Lesson 9 (March 6, 2019), Progressing: Mrs. G first reviewed what the class had discussed the day before. She made a connection to previous learning in a way that facilitated students’ understanding, as she said, “so let’s get back to yesterday’s lesson, and we were talking about how to describe what happens when you divide a whole number by a decimal. And I took it from the teacher manual and put it (on a whiteboard in front). Can you read that for me, please? When you see the ‘decimal part’, what does it mean to you?”. Mrs. G then transitioned to the day’s learning target (i.e., I can divide decimal numbers by decimal divisors) and said, “so today we are going to formalize what we know about dividing a decimal by a decimal”. After that, she started to discuss a math task in the student activity book (unit 5, lesson 8: use money to see shift patterns).

Lesson 10 (March 7, 2019), Extending: Mrs. G verbally communicated the focus of the day’s lesson in accessible language. She said, “Look at the top of page 223, why did they give us that information (circled 1715 divided by 35 equals 49), Lucas (a non-EB, a pseudonym)? …I can and you can right away put 49 on the top of all of these. And I believe that someone said yesterday that the strategy they used is to ignore the decimal and figure out where it goes later, so that is what we are going to practice today”. The content of the day’s learning targets was appropriate. In the middle and at the end of the lesson, Mrs. G invited students to discuss their role in taking responsibility for their learning and to reflect on what they had learned from the lesson.

Lesson 11 (March 8, 2019), Developing: Mrs. G presented the focus of the day’s lesson in accessible language. She said, “A goal for today is talking about identifying word problems that you are using, you have to decide whether multiplication is what you need to do or division is what you need to do. So that is going to be our goal”. The content of the learning targets was appropriate. However, few opportunities to internalize the learning target were provided to students, and no evidence showed that Mrs. G referred back to the learning targets.

This table illustrates the varying levels of Mrs. G’s practices in presenting learning targets to students. In some cases, Mrs. G demonstrated higher-level practices, where she foregrounded students’ ideas and provided opportunities for them to internalize the learning target. In some lessons, however, she demonstrated lower-level practices, where students did not get enough opportunity to express their ideas. Below I use specific examples and connect them to her reported value of creating a classroom culture that foregrounds student ideas.
Lower Level Practices of Presenting Learning Targets

At the lower level of presenting learning targets, students' ideas or reasoning were not foregrounded, and students did not have sufficient opportunities to internalize learning targets. Oftentimes, Mrs. G communicated learning targets with students by simply posting or stating the focus of lessons. For example, Mrs. G presented the focus of lesson 3 (using estimation to solve longer division problems) as shown in Table 4.2. As Table 4.2 presents, no further explanation, such as a connection to students' previous learning or to the larger scope of their future learning, was made. After discussing the lesson focus with students briefly, Mrs. G quickly transitioned to a division problem (i.e., 5185 ÷ 85) in the student activity book.

Table 4.2 Our Goal Is Going to Use Estimation to Solve Longer Division Problems

Mrs. G: We will talk more today about using estimation to solve larger problems. Tasnime, would you be able to help me? (managing student behavior). So our goal is going to use estimation to solve longer division problems. I want to start by looking at the top of page one ninety?
Ss: Nine
[Comment: Mrs. G presented the focus of today's lesson at the beginning of the lesson.]
Mrs. G: Ninety-nine. So if you'd like to have it close where you at on page 199. On page 199 it shows you heard me use the word forgiving (this word may not be recorded and transcribed accurately) to describe the expanded form algorithm you can use, right?
Ss: Uh-huh.
[Comment: Mrs. G transitioned to discussing a math task in a student exercise book. No further explanation about the learning target was made.]
Mrs. G: If we look at here, using the traditional algorithm and we have 93 left in our divisor as 85 (managing student misbehaviors), what's wrong here, Greg (non-EB, a pseudonym)?
Ss: Ur (paused)
Mrs. G: 85 times 5 is 425, right? That was where this came from.
Greg: Oh, he put the < > the wrong place.
[Comment: Mrs. G did not create an opportunity for students to review or internalize learning targets as the discussion went on.]
Mrs. G: No, it is okay. If we were using the expanded form algorithm that we have been using, it looks like this, right? But those used the digit-by-digit algorithms, if 85 is my divisor (Mrs. G wrote down 5185 ÷ 85), how many 85 in 51? (Mrs. G continued to discuss the division problem posted at the top of the student activity book).
Videotaped on February 20, 2019

In this excerpt, as she usually did, Mrs. G presented the learning target verbally near the start of the lesson using the sentence starter "our goal is...". However, Mrs. G did not clarify why estimation is important or what the connection is between this learning goal and student success. Additionally, there is no evidence showing that Mrs. G communicated the learning target with students by activating their prior knowledge or explaining the learning target in a larger scope. Overall, at the lower level of presenting learning targets, students seemed not to have enough opportunities to internalize (e.g., through discussion) the learning targets before Mrs. G transitioned to discussing the math division task of 5185 ÷ 85.

Higher Level Practices of Presenting Learning Targets

At the higher level of presenting learning targets, Mrs. G valued students' ideas and knowledge. Not only did she make connections to students' prior learning experiences, but she also provided students with opportunities to internalize learning targets. According to the FARROP (Wylie & Lyon, 2016) guide, opportunities for students to internalize learning targets include debriefing the purposes of a lesson and creating space for students to work with teachers to create learning targets. In this fifth-grade math class, Mrs.
G created opportunities for her students to internalize learning targets in the following three ways: (a) revisiting and reviewing what students learned at the end of lessons, (b) building students' awareness of self-regulated learning, and (c) activating students' prior knowledge and bringing in students' perspectives to ensure that students felt connected. Below I explain how Mrs. G valued students' ideas using three specific examples.

Example 1: Bringing Students' Ideas into Class Discussions

Before illustrating the conversation in Table 4.3, it is important to provide background information about the math task and Mrs. G's pedagogical moves. Mrs. G first asked students to open their activity books to page 223 and look at the top of the page, where it shows the information 1715 ÷ 35 = 49. Figure 4.1 shows what the math task looks like in the student activity book.

Figure 4.1 A Division Task in the Student Activity Book

Then she asked questions to let students warm up and recall the knowledge about estimation in math (i.e., ignoring the decimal and figuring out where it goes) that they had discussed the day before. Table 4.3 shows how Mrs. G built on this knowledge activation and then guided students to think about the learning target.

Table 4.3 I Believe That Someone Said Yesterday…

Mrs. G: Okay, we will take a look at page 223. At the top of page 223 why did they give us this information (circled 1715 ÷ 35 = 49), Lucas (non-EB, a pseudonym)?
Lucas: Because all of those are similar, (paused), they are [the] same numbers but put decimals in different places.
Mrs. G: So I can and you can right away put 49 on the top of them. And I believe that someone said yesterday that the strategy they used is to ignore the decimal and figure out where it goes later.
Ss: Um.
[Comment: Mrs. G brought in students' ideas that they had discussed the day before.]
Mrs. G: So that is what we're going to practice today. So if I am looking here (circled the divisor of 35) and 35 is my divisor and 17 is the whole part of my dividend, do I know how big the magnitude of my equation will going to be? Nisha, what would you know about 17 divided by 35?
Ss: It cannot divide.
[Comment: Drawing on the students' idea that they had discussed the day before, Mrs. G pointed out that this would be the focus of today's lesson.]
Mrs. G: You can divide. What do you know about that, [Nisha], I just wrote 17 divided 35 in a different form to help you.
Nisha: < >
Mrs. G: Say it again.
Nisha: 7 multiple 5 is 35.
Mrs. G: That is true. The 7 multiple 5 is 35. But I do not know whether that can help us to figure out how big our answer should be. I have 17 divided by 35. Tasnime, do you have ideas about that? What do you know about 17 divided by 35?
Tasnime: < >
Mrs. G: (Walked toward Tasnime trying to hear what she said) Maybe equal to 49? Okay, other thoughts about this, 17 divided by 35 (Mrs. G was pointing to the 17 in 17 ÷ 35), Franklin (non-EB, a pseudonym)?
Franklin: 17 divided by 35, then, um (paused).
Mrs. G: How many 35 in 17?
Franklin: Less than 1.
Mrs. G: Less than 1, right? How do we make the 49 less than 1 in decimal form? Tasnime, how to make the 49 less than 1? (Waits for a few seconds) Where do I put the decimal, after the 9, between the 4 and 9, or in front of 4?
Tasnime: In front of 4.
Mrs. G: In front of the 4. Does it make sense, Tasnime, that the seventeen thirty-fifths (i.e., 17/35) is about half (Mrs. G wrote down 17/35 ≈ ½)?
Ss: Yeah.
Mrs. G: It does, right? So forty-nine hundredths (i.e., 0.49) is our answer making sense. How about the next one?
Videotaped on March 7, 2019

In this example, Mrs. G intended to guide students to the lesson focus (i.e., the learning target). However, instead of posting the lesson's focus on the whiteboard or stating it directly, Mrs. G started by discussing a math task in the students' activity book with the whole class. Mrs.
G asked Lucas why they [the activity book authors] gave us that information (circled 1715 ÷ 35 = 49). After Lucas's response, Mrs. G went a step further based on Lucas's answer and then intentionally brought the students' idea (i.e., the strategy they used is to ignore the decimal and figure out where it goes later) into the discussion before she told students the lesson's learning target. By doing so, students could feel that Mrs. G valued their contributions because she remembered what they had said in the previous lesson. In the meantime, the way that Mrs. G foregrounded students' ideas and contributions could also play a role in hooking students' attention when communicating learning targets.

Example 2: Revisiting and Reviewing Learning Targets

In addition to foregrounding students' contributions, there were occasions when Mrs. G created opportunities for students to internalize learning targets by revisiting the learning target at the end of a lesson. As Table 4.4 shows, Mrs. G asked her students to share what they learned from the lesson.

Table 4.4 What Is the Purpose of What We Did Today?

Mrs. G: Before you put your pencil down, I am going to ask you what is the purpose of what we did today? What you get out of it. Lucy (non-EB, a pseudonym), what did you get out from it?
Lucy: I think what I get out from is how to use my decimal what (paused) what to do.
Mrs. G: It was what to do with them. We did not figure any problem out from the top to the bottom, right?
Ss: Yeah
Mrs. G: We used our?
Lucy: I use digit by digit.
Mrs. G: We use the way to figure out every digit with the decimal and some that we did not with decimal. So we did compare problems as you were saying.
……
Mrs. G: Awesome. Tasnime, what did you get out from what we did today?
Ss: Where to move [when] dividing by the decimal?
[Commentary: Mrs. G asked students to reflect on the lesson's purpose and what they had learned at the end of the day's lesson. Mrs. G guided students to think about one of the algorithm methods (digit by digit) that they practiced, which is an essential component of the math curriculum. Mrs. G invited another student, Tasnime, to share her takeaway.]
Mrs. G: So you know where to put the decimal?
Ss: Yes
Mrs. G: Awesome. Is there just one way, Greg (non-EB, a pseudonym)?
Ss: No
……
Mrs. G: We went back and forth, some [algorithm methods] worked better than some others. But they [the algorithm methods] work for all of them [the division problems].
[Commentary: Mrs. G invited Greg to attend to the idea that there are multiple ways to solve the division problem. Mrs. G gave a brief conclusion.]
Videotaped on March 7, 2019

The excerpt shows that Mrs. G guided students to reflect on what they learned and on the day's lesson focus. She first asked students explicitly to think about "what is the purpose of what we did today" and "what you get out of it". This move provided students with opportunities to revisit the lesson's learning target. Then Mrs. G guided students to think about the algorithm method (i.e., digit by digit) that they had learned that day. To further check students' thinking, she invited other students, such as Tasnime and Greg, to join the conversation. After hearing Tasnime's responses, Mrs. G first revoiced the student's idea; then she invited Greg to think about whether there is only one way to put the decimal. Mrs. G's purpose in doing so was to emphasize that students can use multiple ways (i.e., different algorithm methods) when solving division problems, as she concluded, "we went back and forth, some [algorithm methods] worked better than some others. But they [the algorithm methods] work for all of them [the division problems]".

Example 3: Supporting Student Learning Autonomy

Mrs. G's practice of presenting learning targets was not limited to focusing on mathematics content.
There were occasions when she also encouraged students to think about learning autonomy by asking them to reflect on who was in charge of their learning. In the conversation presented in Table 4.5, Mrs. G encouraged students to think about how to be responsible and independent "thinkers".

Table 4.5 Who Is in Charge of Your Learning?

Mrs. G: Fifth graders, what are we doing right now? What we are doing right now in this math class?
Ss: Dividing.
Mrs. G: We are dividing a decimal by decimal number, right? What part of the lesson? Who is in charge of your learning?
Ss: You.
S1: No, yourself.
S2: Oh.
Ss: Ourselves.
[Comment: Mrs. G invited students to think about their learning autonomy.]
Mrs. G: We just had an amazing discussion about a couple of them. It is your responsibility to attend to that discussion, right?
Ss: Yes.
[Comment: Mrs. G talked about how to be a student and explained to students the purpose of her asking.]
Mrs. G: You learn only if you are willing to think about it. I am not in charge of your learning right now. I am facilitating your learning. And I asked you questions trying to let you think. And I have been doing that since September. You probably do not remember what math was like before you came to Mrs. G's class.
Ss: It was bad.
Mrs. G: I would not say it was bad. I would not say that. It is just not feeling like this. We are talking about math right now. And if you are not listening, you are not talking about it either, right?
Ss: Yes.
Videotaped on March 7, 2019

As the excerpt shows, Mrs. G invited students to think about their learning autonomy by asking "what we're doing right now in this math class?" and "who is in charge of your learning?" After realizing that some students still perceived that the teacher was in charge of the learning, Mrs. G discussed how to be a student and pointed out that "it is your (i.e., the students') responsibility to attend to that discussion".
She went further, explaining her role as a facilitator and her purpose of prompting students to think by asking them questions.

Taken together, the levels of Mrs. G's practices in communicating and clarifying learning targets varied. She valued students' ideas by making connections to students' prior knowledge and learning experiences, as well as by allowing students to internalize learning targets. Yet not all of her FA practices foregrounded students' ideas when communicating learning targets. One possible reason for the wide range of her practices in presenting learning targets is a lack of sufficient instructional time, as she reported during our interview:

I think that it would be helpful to provide that [learning targets], you know, that it always comes down to time and how much time do I have to get things done. This year, we never have as many snow days as we have had. We have missed like 10 days of school, that is two weeks of instruction in the heart of our [instructional time]...... As an instructor, I am nervous about the impact that's going to have. [Informal Interview with Mrs. G, March 26, 2019]

Eliciting Student Thinking Using Questioning Practices

The previous section focused on Mrs. G's practices in communicating learning targets, namely "where are we going" in FA. This section attends to "where are we now", namely gathering evidence of student learning, and "how do we get there", which refers to acting on evidence of student learning. Specifically, this section focuses on how Mrs. G used questioning as a tool to elicit students' thinking. According to the analysis of the teaching videos and interviews with Mrs. G, questioning was an essential strategy that Mrs. G used to gather evidence of student thinking. The questioning practices allowed Mrs. G to "understand where they [students] are at their day", as well as to help students "understand that they might not understand" (interview with Mrs. G, March 13, 2019).
In this study, I consider the strategy of questioning to serve two purposes: (a) gathering evidence of student learning, and (b) providing feedback by drawing on students' responses. This is because it is difficult to separate which questioning moves are used to gather evidence and which are used to give feedback to students during discursive FA practices; oftentimes, they are intertwined. When analyzing the teacher's questioning practices, the questions that Mrs. G asked by drawing on students' responses can be counted as her feedback to students as well. Data analysis shows that a "funneling" type of questioning pattern dominated the classroom discourse during FA. The "funneling" questioning pattern refers to teachers guiding students to a desired end by asking a series of questions (Herbel-Eisenmann & Breyfogle, 2005). The funneling questioning patterns in Mrs. G's FA ranged from lower-level to higher-level funneling questioning practices based on the extent to which her questions built on students' ideas. Below I illustrate what Mrs. G's questioning practices looked like by drawing on specific examples.

Lower Level Practices of Funneling Questioning

At the lower level of the funneling questioning pattern, Mrs. G tended to ask students lower cognitive demand questions (e.g., factual questions). Usually, general or evaluative feedback was presented to students to get them to the right answer, rather than aiming at exploring students' thinking. The interaction between Mrs. G and students, overall, reveals that students did not have many opportunities to express their thinking and reasoning, although there were some occasions when Mrs. G asked students to explain their ideas using higher cognitive demand questions such as probing questions. Table 4.6 shows an example of a lower-level funneling questioning practice, during which Mrs. G walked students through a division task (i.e., 1533 ÷ 21) to the desired answer.
Table 4.6 What Goes into My Thinking Bubble?

Mrs. G: What goes into my thinking bubble, Tasnime? (Students and Mrs. G were working together to look for the answer to the math problem of 1533 ÷ 21.)
Tasnime: 20.
Mrs. G: 20, great job. So I am thinking 20, that is super close, right?
Tasnime: Yeah.
[Comment: Mrs. G initiated a factual question. Then Mrs. G provided general positive feedback.]
Mrs. G: And 15 is too small for 21, so this is (Mrs. G circled 153; the dividend is 1533) what we are looking at. We are looking for a three-digit subtraction problem, right? Tasnime, how many twos in fifteen? (Waits for Tasnime's response).
Tasnime: 6.
[Comment: Mrs. G explained the procedure, followed by another factual question.]
Mrs. G: Oh, that is 12. Can we get another one?
Tasnime: 7.
Mrs. G: 7, right? So we do 21 times 7, which is 147. Do you think that is close enough?
Tasnime: Yeah
[Comment: Instead of pointing out that her answer was wrong, Mrs. G guided the student to give another answer. Mrs. G revoiced the student's response and led Tasnime to reflect on her answer.]
Mrs. G: But that is 7, right? We need to scale up by 10 (Mrs. G added a 0 behind 147). Would we have 70 upper here (the quotient place)? Is that look good? Then we are going to subtract, 13 minus 7?
Tasnime: 6.
Mrs. G: Thank you. Now Moiz, what is next?
Moiz: Ah, 21 times 3.
Mrs. G: Oh, tell me how you got that, Moiz?
Moiz: Yeah. 3 times 1 equals 3; 3 times 2 equals 6 (doing the multiplication of 21 times 3).
[Comment: Mrs. G explained why they were doing subtraction at this moment, with a follow-up factual question. Mrs. G invited Moiz to participate in the conversation. Then Mrs. G asked Moiz how he got his answer, a slightly higher-level question.]
Mrs. G: Awesome. Moiz, can I use my estimation skill too? How many 2s in 6?
Moiz: 3.
Mrs. G: 3, awesome. So that is 3 [added 3 at the quotient place]. So what is our quotient, Moiz?
Moiz: 73.
Mrs. G: So smart.
[Comment: Mrs. G asked a factual question about the quotient. Then Mrs. G offered evaluative feedback that was not relevant to the math task.]
Videotaped on March 5, 2019

In this dialogue, Mrs. G posed her initial question (i.e., what goes into my thinking bubble?) while drawing a thinking bubble on the whiteboard in front of the whole class. She then offered general positive feedback (i.e., great job) in response to the student's answer. Following the feedback, Mrs. G asked a series of factual questions. Mrs. G did not stop at providing students with general feedback (e.g., awesome, so smart); she also explained to students why and implicitly guided them to think about the rationale. When walking students through finding the desired answer to this math task, Mrs. G provided a rationale for each specific procedure that the class was doing (e.g., so I am thinking 20, that is super close, right). This conversation also shows that Mrs. G used a slightly higher-level pattern when she asked Moiz how he got his answer. Yet it seems that Mrs. G constantly took the role of "leading" the discussion rather than "facilitating" it when interacting with students. In the meantime, Mrs. G provided very limited opportunities for her students to express their thinking.

Higher Level Practices of Funneling Questioning

In addition to the lower-level funneling questioning pattern, Mrs. G used higher-level "funneling" questioning practices, in which she often asked questions drawing on students' ideas. During this type of interaction, Mrs. G listened to students' responses intentionally and provided students with opportunities to express their thinking. I consider Mrs. G's questioning practices here to show a higher-level "funneling" questioning pattern rather than a "focusing" questioning pattern because there were still missed opportunities to probe students' deeper thinking during Mrs. G's FA practices.
As a reminder, a "funneling" pattern emphasizes the teacher directing students to a desired answer during class discourse, with students having insufficient opportunity to explain their thinking; a "focusing" pattern, in contrast, emphasizes the teacher listening to students' responses, giving feedback that draws on student ideas, and providing students with enough space to explain their mathematical thinking and reasoning (Herbel-Eisenmann & Breyfogle, 2005). Table 4.7 shows how Mrs. G created opportunities to draw out students' thinking when the class was discussing the math division task of 1715 ÷ 3.5. This task includes a clue of 1715 ÷ 35 = 49 in the student activity book.

Table 4.7 Why Do You Think It's Wrap A Whole?

Mrs. G: How about number 5? If we are thinking about the whole number 17 divided by 3, what would that be about, Kevin (non-EB, a pseudonym)? 17 is close to 18, right? And 18 divided by 3 is 6, right? How could I make this close to 6? Am I going down or wrap it [490] whole? (Mrs. G and the students were talking about whether to round 490 up to 500 or down.)
Ss: Wrap a whole
Mrs. G: Why do you think it is wrap[ping] a whole?
Kevin: Because it is not close to the actual answer.
Mrs. G: I would disagree that it is not close to the actual answer.
Kevin: I need a hundred away, a hundred ten away.
Mrs. G: So you want me to do this? Is that better?
S1: Yeah.
Mrs. G: But is it useful for reasoning for what the answer is?
S1: No.
S2: Just to leave the number.
S3: 18 is too big.
Ss: Yeah.
[Comment: Mrs. G initiated questions. Mrs. G asked students to explain why they think it is wrapping a whole. Then Mrs. G provided negative feedback showing she disagreed with students' ideas. Mrs. G verified the student's ideas. Then Mrs. G led the student to reflect on whether his answer was useful.]
Mrs. G: What I did that (Mrs. G wrote down 1700 ÷ 3), I would still say (Mrs. G wrote down 1700 ÷ 3 estimated to 600). Do you like that better?
Ss: Yes.
Mrs. G: So Lucy (non-EB, a pseudonym), if that is the case, does 490 check out?
Ss: No
[Comment: Mrs. G invited another student, Lucy, to join the conversation.]
Mrs. G: (Mrs. G wrote down the multiplication of 490 × 35 = 1715.0) Could I do this (Mrs. G wrote down 4 × 5 =2.000) to save myself time?
Ss: Yes.
[Comment: Then Mrs. G guided students' reasoning from a "saving time" perspective.]
Mrs. G: We knew that 49 times 35 was 1715. So we're not distributing what our numbers are, it is the magnitude of our numbers, how large the number is with the power of 10, and how small it is with the power of 10. Alright, how about number 6.
Videotaped on March 7, 2019

In this example, Mrs. G first asked two more specific questions ("how could I make this close to 6?" and "am I going down or wrapping it whole?") to unpack her first question (i.e., "what would that be about"). After the student's response, Mrs. G asked a follow-up question (i.e., why do you think it is wrapping a whole?) to let students explain their thinking. After Kevin responded to the "why" question, Mrs. G did not provide him with an extra opportunity to explain why he thought it was not close to the actual answer. Instead, she gave negative feedback (i.e., I would disagree that it is not close to the actual answer). After Kevin provided another idea, Mrs. G confirmed his thinking (i.e., so you want me to do this? Is that better?); then she asked the whole class, including Kevin, to reflect on whether it was useful for finding the answer. After noticing that Lucy said yes, Mrs. G invited Lucy to explain her ideas. This conversation shows that Mrs. G created multiple opportunities to invite students' contributions to the class discussion. Compared with her lower-level funneling questioning practices, in which Mrs. G tended to use lower cognitive demand questions and leave students insufficient opportunities to express their ideas, Mrs. G foregrounded students' ideas and used higher cognitive demand questions more often in her higher-level funneling questioning practices when eliciting student thinking. According to Mrs. G, the use of probing questions is an essential strategy for supporting students' engagement and participation, as she stated:

I think the key to a strong lesson and student engagement is knowing what questions to ask them that you do not give them the answer…..a lot of the practices that you see in my classroom have developed into being very strong because of that work, the ideas that you can ask probing questions of students, not leading questions but probing questions to get them to connect what they need to connect to be successful with math. [Informal Interview with Mrs. G, February 22, 2019]

Lack of Wait Time: A Missed Opportunity in Eliciting Students' Thinking

While Mrs. G attempted to foreground students' contributions in her discursive FA practices, data analysis shows a pattern of missed opportunities for eliciting students' thinking due to a lack of wait time. As a result, Mrs. G sometimes answered her own questions before students responded. Table 4.8 reveals how Mrs. G might have missed opportunities to elicit Anna's mathematical understanding. Before the conversation in Table 4.8, Mrs. G had been discussing a math story problem with her students, focusing on a tricky part about decimal places and powers of ten. Here is the math problem: A rectangular garden has an area of 882 square meters, and the long side of the garden has a length of 35 meters. How long is the short side?

Table 4.8 Wait, 100…

Mrs. G: Is that what we said? Explain what I just showed you, please? Explain what I just showed to you, please, Anna (non-EB, a pseudonym).
Anna: You (did not finish).
[Comment: Mrs. G asked Anna to explain the instruction that she had just given.]
Mrs. G: How did I get the decimal number without converting to the fractions?
Anna: Wait, 100…
Mrs. G: How did I get the decimal number with doing 882 divided by 35 without worrying about fractions?
Anna: You (paused).
[Comment: Mrs. G started to ask another question when Anna had not finished her response. Anna said "wait", which suggests that she might have needed more time at this moment.]
Mrs. G: What is zero there? We cannot see but we did need it before now.
Anna: Yeah, then.
Mrs. G: And I said how many 35 into 70? And that was 2.
Ss: Oh (an aha moment).
[Comment: Mrs. G realized that Anna might need help, so she narrowed down her question.]
Videotaped on February 22, 2019

In this excerpt, Mrs. G asked Anna to recollect what she (i.e., Mrs. G) had just said. However, there were three moments when Anna was not given sufficient wait time to explain her ideas. First, Anna started to say "you…" and was then interrupted by Mrs. G's follow-up question (i.e., how did I get the decimal number without converting to the fractions?). Second, Anna attempted to answer Mrs. G's question, but it seems she was not ready yet, so she said "wait, 100..."; Mrs. G then posed a similar follow-up question (i.e., how did I get the decimal number with doing 882 without fractions?) before Anna could finish. Third, after noticing a pause in Anna's response, Mrs. G narrowed down her question to help Anna; Anna seemed to understand the narrowed-down question and said "Yeah, then" but did not complete her response before Mrs. G offered her own answer (i.e., and I said how many 35 into 70? And that was 2). The whole interaction reveals that Mrs. G attempted to guide Anna to a desired answer but did not provide Anna enough wait time to express her reasoning.

To sum up, this chapter aims to answer the first research question: What is the nature of the teacher's FA practices during a particular mathematics unit?
Drawing on the FARROP rubric (Wylie & Lyon, 2016) and a sociocultural lens on FA, the findings in this chapter illustrate how Mrs. G's discursive FA practices were aligned (or unaligned) with her value of foregrounding students' ideas in the math classroom. Discursive FA in Mrs. G's fifth-grade math classroom connects to the literature on assessment conversation, which is also referred to as informal FA (Ruiz-Primo & Furtak, 2007; Ruiz-Primo, 2011). Preliminary findings showed that Mrs. G incorporated many class discussions, especially whole-class discussions, into FA. Class discussions allow students to develop their capability for math talk and "discourse capability" through interacting with the teacher and their peers. My interviews and observation notes showed that Mrs. G valued student ideas and attempted to create a classroom culture that values student involvement and contribution in FA. However, the extent to which Mrs. G put her value of creating a classroom culture that foregrounds student ideas into practice was quite variable. Drawing upon the predeveloped dimensions of FA in the FARROP rubric (Wylie & Lyon, 2016), findings showed a large range in Mrs. G's FA practices along the dimensions of presenting learning targets and eliciting and responding to student thinking. Throughout her FA practices, a pattern of lower- and higher-level funneling questioning was found. In the lower-level funneling questioning practices, Mrs. G tended to direct students to the desired answer. Sometimes she did not give students sufficient wait time and tended to answer her own questions. As such, students did not have enough opportunities to express their ideas or to internalize learning targets, and there might have been missed learning opportunities for students because of Mrs. G's lower-level funneling questioning practices. In the higher-level funneling questioning practices, by contrast, Mrs. G valued students' ideas and contributions.
When presenting learning targets, Mrs. G oftentimes created opportunities for students to internalize learning targets, such as revisiting the targets and making connections to students' previous learning experiences. When eliciting student thinking through questioning strategies, Mrs. G also asked questions drawing on students' responses with the purpose of probing and prompting student thinking.

CHAPTER 5: RESEARCH FINDINGS II

This chapter focuses on Mrs. G's beliefs and practices around supporting EBs during her discursive formative assessment (hereafter FA). Similar to the previous findings chapter, this chapter uses evidence from class observations and interviews with Mrs. G. In addition, it introduces evidence from interviews with EBs in order to prioritize and centralize students' voices. This findings chapter aims to answer the second set of research questions: How does the teacher perceive challenges that EBs face during FA? What are the EBs' perceptions of their needs in learning mathematics? And to what extent are the teacher's FA practices, beliefs, and EBs' perceptions of their learning needs aligned? Alignment and misalignment are two constructs that emerged from data analysis across multiple data sources. Drawing upon Lyon's (2013a) conceptualization framework of teachers' assessment expertise, this chapter presents instances of alignment and misalignment around three evidence sources: interviews with Mrs. G, classroom observations, and interviews with EBs. In particular, this chapter focuses on the dimension of assessment equity (Lyon, 2013a). As described in the literature review, there are two sub-dimensions of assessment equity: access and fairness. Assessment access emphasizes teachers using productive discourse and providing EBs with opportunities for language development and participation.
Assessment fairness, in turn, focuses on the ways in which teachers address the potential language and cultural influences on EBs' assessment performances. I use the sub-dimension of assessment access to guide my analysis of the alignment and misalignment between Mrs. G's beliefs and practices around using discursive FA to support EBs' math learning and language development. I use the sub-dimension of assessment fairness to guide my analysis of the alignment and misalignment among Mrs. G's beliefs of barriers that EBs may encounter, the corresponding support Mrs. G provided to EBs, and EBs' perceptions of their learning needs. Since this study focuses on Mrs. G's discursive FA, it may be difficult to separate assessment access and fairness in Mrs. G's scaffolding practices. Thus, both assessment access and fairness are involved in my analysis of Mrs. G's scaffolding practices for EBs. Below I first present research findings around the alignment among Mrs. G's beliefs, practices, and EBs' perceptions. I then present research findings around the misalignment among Mrs. G's FA practices, beliefs, and EBs' perceptions.

Section 1: Alignment of Mrs. G's FA Practices, Beliefs, and EBs' Perceptions

In this section, I present the alignment among Mrs. G's FA practices, beliefs, and EBs' perceptions from three aspects: (1) alignment between Mrs. G's beliefs of discourse in FA and EBs' perceptions of discourse; (2) alignment of Mrs. G's beliefs and practices on scaffolding for EBs and EBs' perceptions of the supports they received; and (3) alignment between Mrs. G's perceptions of the barriers that EBs may encounter and EBs' perceptions of the challenges they face during FA.

Alignment between Mrs. G's Beliefs of Discourse in General and EBs' Perceptions

My observation notes showed that Mrs. G's typical routine in her classes included large swaths of time for math discussions, which is aligned with her beliefs about the critical role of discourse in supporting all students' learning.
In addition, Mrs. G believed that discourse plays an important role in supporting EBs' language development and content learning. The emphasis and implementation of discourse in Mrs. G's math class are consistent with Lyon's (2013a) conceptualization framework of assessment expertise, where the presence of supportive discourse during FA is foregrounded as a high-quality practice in the sub-dimension of assessment access. Mrs. G mentioned two ways that she valued using discourse to support EBs' learning: developing EBs' academic language in large-group discussion and building EBs' math ideas via "common language" in small-group discussion. She stated,

There are several key things that are part of math discourse in the classroom. One of them is the, I want to say it's called the language register. The idea that when two students are talking about math, the language they use you know might be this and that, or it's very informal when a smaller group comes together. The level of language, the larger the group the more important having academic languages. [Interview with Mrs. G, March 13, 2019]

In this interview excerpt, Mrs. G mentioned the concept of "language register". She felt that it is important to use academic language in large groups. However, she also saw the importance of allowing EBs to use language that they feel comfortable with to talk about math and to build their math ideas in small groups. During my interview with Mrs. G, she also said,

One of the things that is really important, especially with ELLs, is making sure that common language I talk about-- so not the academic language, but the common language that we might use to describe length and width, for example, is established for them. So when they're doing small group work, helping the students that are working with them to establish-- so they know that they're talking about the same thing, because that frustration can come in early in understanding. [Interview with Mrs.
G, March 28, 2019]

Based on this interview excerpt, Mrs. G acknowledged that language plays an important role in students' math learning. Mrs. G highlighted the significance of the co-existence of academic language and "common language", both essential components of language use in her math class. Specifically, Mrs. G valued developing students' math ideas by allowing students to use "common language" in small-group discussions such as peer talk, and encouraging students to use academic language in larger-group discussion. According to Mrs. G, small-group discussion supported the development of students' math ideas. The math ideas that students developed, in turn, built a foundation for larger-group discussion using academic language. Mrs. G's emphasis on discourse in her math class is aligned with EBs' attitudes toward whole-class discussion and peer talk. EBs felt that class discussion was helpful because it provided them with opportunities to better understand how to solve math problems that they were struggling with, as the interview excerpt below illustrates,

I think discussion is more helpful. For a written paper task, there is something you cannot know. For discussion, they will tell you how to do that so that you will do better at where you are struggling at. While a piece of paper cannot talk to you and tell you what to do. [Interview with Lulua, April 26, 2019]

In addition, EBs mentioned how "pair talk" could support their math learning. For example,

I kind of like partner work because partners can explain what they were thinking but at the same time I do not have to be socially awkward. …I mean pair work, I like it, and also it makes [process] faster. [Interview with Jasmine, April 24, 2019]

I have personally my best friend Clara (a pseudonym), she is the best person that I understand when she talks, …Clara says in a different way than Mrs. G said. Clara says that in English that it just makes more sense [to me] in her own ways. She helps me a lot.
[Interview with Tasnime, April 30, 2019]

According to these interview excerpts, EBs felt that peer talk helps because sometimes the ways their peers explain math are more comprehensible than Mrs. G's. In addition, EBs reported that peer talk helped reduce social anxiety. The comments from Jasmine and Tasnime echoed Mrs. G's views on the importance of developing students' math ideas using "common language" within small groups. They also brought up a perspective that Mrs. G did not mention: reducing social anxiety.

Alignment of Mrs. G's Beliefs and Practices on Scaffolding for EBs and EBs' Perceptions

As mentioned above, it is hard to separate assessment access and fairness in Mrs. G's scaffolding practices, so I use both sub-dimensions to guide my analysis of Mrs. G's beliefs and scaffolding practices for EBs. First of all, using math discourse to support EBs' content learning and language development was an essential way that Mrs. G scaffolded EBs; her scaffolding during FA thus shows a connection with assessment access. In addition, through scaffolding in discursive FA, Mrs. G was able to provide equitable opportunities for EBs to demonstrate their learning; thus, I also drew on the sub-dimension of assessment fairness in my analysis. The strategies that Mrs. G used to scaffold EBs' content and language development included transcribing verbal discussions, revoicing, and asking scaffolding questions. Below I first present Mrs. G's beliefs about the different scaffolding strategies, drawing on interview evidence. Then I present Mrs. G's practices and how the scaffolds may or may not have supported EBs' math learning, drawing on teaching videos and interviews. After that, I illustrate EBs' perspectives on whether they felt the scaffolding strategies they received from Mrs. G were helpful.

Transcribing Verbal Discussions

According to Mrs.
G, transcribing students' talk during classroom discussions provided both verbal and visual channels for EBs to understand what the whole class was discussing and thus be on the same page. For example, Mrs. G made the following comment when I asked her what supports she provided to EBs,

I try to support my verbal feedback with something visual, so you'll see me write on the pad of paper. You'll see me like in the video, I tried to transcribe what the students said on the board. So then the words were there. So they heard the students say it, they heard me say it as I read [and] voiced what they said. And they had access to what the student said on the board as well. So there were three ways for that profound statement to get in. So I do that a lot. I'm always writing something because that's how I learn. [Interview with Mrs. G, March 26, 2019]

According to Mrs. G, transcribing verbal discussions provided EBs with multiple opportunities to understand the math talk. Students could hear the same information from both their peers and Mrs. G. The transcribed text on the pad of paper or board served as visual support for students. In doing so, students were able to receive language input through various channels. However, transcribing verbal discussions does not mean that Mrs. G would always write down exactly what a student said during class discussions. Oftentimes, she would first process students' responses and then extract key ideas. To better visualize interactions between Mrs. G and EBs such as Safia and Nisha, I pulled out a segment of the conversation, presented in Table 5.1. In this transcript, I use "Ss" to represent the whole class.

Table 5.1 When You Are Dividing a Decimal…

Mrs. G: So fifth graders, when we divide a number by a decimal number less than one, why is the quotient greater than the original number? Why is our answer bigger than we started, Greg (non-EB, a pseudonym)?
Greg: It is adding a 100?
Oh, no, it is adding a 100.
Mrs. G: No, we are not adding.
Greg: Timing, multiplying 10 or 1.
Ss: Or dividing.
Mrs. G: (Mrs. G smiled.) Oh, lord, impulse control. Safia (EB, a pseudonym), dear?
Safia: So when you are multiplying by 10 or 100, it is clearly like you, I mean dividing by 0.1 or 0.01, it is like you multiply by 10 or 100.
Mrs. G: (Mrs. G wrote down "when you are dividing by a decimal, you are…" while Safia was responding.) Other ideas? When you are dividing by a decimal (paused), use this stamp to see whether you can finish it for me. What do you think, Nisha? When you are dividing by a decimal, your groups are? The divisor should be groups. What can be said about the groups?
Comment: Mrs. G transcribed Safia's response as written text on the whiteboard. She also orally presented Safia's response in front of the class.
……
Mrs. G: So when you are dividing by a decimal, your groups are smaller than a whole, each whole in your dividends makes ten or a hundred or a thousand groups. (Mrs. G wrote this sentence down on the whiteboard.)
Comment: Mrs. G finished and presented the sentence starting with "when you are dividing by a decimal, your groups are…" on the whiteboard.

Videotaped on March 5, 2019

In this excerpt, Mrs. G first asked Safia to continue to answer her question of "why is our answer bigger than we started". Safia explained her thinking: when you multiply by 10 or 100, it is clearly like you, I mean when dividing by 0.1 or 0.01, it is like you multiply by 10 or 100. However, Safia's explanation was not clear, so Mrs. G chose to paraphrase her response. She wrote down "when you are dividing by a decimal, you are…" on the board in front of the class, and then orally presented it. This discourse move provided students opportunities to understand Safia's thinking three times: by listening to Safia's talk, by listening to Mrs. G's talk, and by reading the written text on the whiteboard.
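The arithmetic relationship that Safia was reaching for can be written out compactly. As a brief sketch of the underlying mathematics (my illustration, not part of the classroom record):

```latex
% Dividing by a decimal less than one is equivalent to
% multiplying by the corresponding power of ten:
\[
a \div 0.1 = a \div \tfrac{1}{10} = a \times 10,
\qquad
a \div 0.01 = a \div \tfrac{1}{100} = a \times 100 .
\]
% Hence, for example, 1 divided by 0.01 is 100,
% so the quotient is greater than the original dividend.
```

This is the idea the class later formalized in the sentence stamp "when you are dividing by a decimal, your groups are smaller than a whole."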
This practice demonstrates aspects of both the dimension of assessment access, in supporting EBs' math ideas and participation, and the dimension of assessment fairness, in supporting EBs to better explain their math thinking with the assistance of visual support (i.e., the written text). However, how do EBs perceive the strategy of transcribing verbal discussion? Do they feel it is helpful to their math learning? According to my interviews with EBs, the majority of them felt this instructional strategy was helpful. For example, when I asked Salam's opinion regarding Mrs. G's support in translating math solutions into written text on the whiteboard with the whole class, Salam said,

Yes, it [written answer] is helpful because first, we do it together, and then we get the answer and write the answer on our book. All of our class were paying attention to Mrs. G and doing with her, that is helpful. [Interview with Salam, April 26, 2019]

According to this interview excerpt, Salam found the strategy helpful, which was aligned with Mrs. G's comments. Yet Lulua expressed mixed feelings about the strategy. She said,

Sometimes it is [helpful] when you do not get it and she showed you how to do it, you like Ah. But sometimes it is not helpful cause I already do that or when you want to do [the task] by yourself, but she kind of already wrote on the board, so that is some kinds of opportunities I lose. She just wrote down the answer on the board. [Interview with Lulua, April 26, 2019]

According to Lulua, she felt that she might lose the opportunity to demonstrate her math learning independently when Mrs. G wrote down the answers on the board with the whole class. For the most part, however, EBs felt that the process of solving math tasks while Mrs. G transcribed student talk was helpful because it kept them focused.

Revoicing

Another discourse move that Mrs.
G used is revoicing students' responses, where she rephrased, restated, or elaborated student responses. Based on my field notes, she revoiced students' responses multiple times each class period. Although Mrs. G did not use the term revoicing directly, she mentioned using "restate" and "paraphrase" when talking with students. For example, Mrs. G explicitly mentioned that she was going to restate Dylan's (non-EB, a pseudonym) response during the whole-class discussion,

I am going to restate what you just said. Tell me whether it is what you mean. Dylan said the divisor is the thing you divide something by entering the number of pieces in a group or the number of groups, depending on what your citation is. [Classroom observation, Feb. 13, 2019]

To better understand how the discourse move of revoicing supported EBs' content and language development, below I present two episodes showing the interaction between Mrs. G and EBs in Table 5.2 and Table 5.3.

Table 5.2 Lulua, Did I Rephrase All Right?

Mrs. G: Alright, move to number 7. Who can tell me why they know the 4.9 is the right answer? How do you know that 4.9 is the right answer, Lewis (non-EB, a pseudonym)?
Lewis: < >
Mrs. G: You got to be with me this time, right? We are talking about number 7.
Lewis: I didn't, I didn't feel is the number 7.
Mrs. G: Fifth graders, what we are doing right now? What we are doing right now in the math class?
Ss: Dividing.
Mrs. G: We are dividing a decimal by decimal number, right? What part of the lesson? Who [is] in charge of your learning?
Ss: You.
S1: No, yourself.
S2: Oh…
Ss: Ourselves.
Mrs. G: We just had an amazing discussion about a couple of them. It is your responsibility to attend to that discussion, right?
Ss: Yes.
Mrs. G: You learn only if you are willing to think about it. I am not in charge of your learning right now. I am facilitating your learning. And I asked you questions trying to let you think.
And I have been doing that since September. You probably do not remember what math was like before you came to Mrs. G's class.
Ss: It was bad.
Mrs. G: I would not say it was bad. I would not say that. It is just [does] not feel like this. We are talking about math right now. And if you are not listening, you are not talking about it either, right?
Ss: Yes.
Mrs. G: Lulua, are you comfortable being with 4.9 being the answer?
Lulua: Yes.
Mrs. G: Can you tell me why?
Lulua: Because since both the factors have decimal and tenth places, and if you did that it will go to the place of the quotient, the decimal number of the place. That is why 4.9 is the right.
Mrs. G: Okay. My gosh, that is actually brilliant. She uses the related number sentences. (Drawing on Lulua's response, Mrs. G wrote down 3.5 × 4.9 = 17.15.) She knows that 3 times 4 is 12, and it is gonna be a whole-digit answer. She also knows that she is going to scale up by 10 for each of those factors. And she has to put it back again. So if we were just using 35 times 49 equals 1715 without any decimals. She knows there is a decimal here [between 3 and 5] and a decimal here [between 4 and 9], that is why she needs one [decimal] as well [in the quotient between 17 and 15]. Lulua, did I rephrase what you said all right?
Lulua: Yes.
Mrs. G: That is brilliant. Does anyone else have different thinking here, Franko (non-EB, a pseudonym)?
Franko: Well, how I thought through is the one number after the decimal point and the divisor, it times the dividend by 10.
Comment: Mrs. G revoiced Lulua's response. Mrs. G also checked with Lulua to see whether her idea had been correctly revoiced.
Mrs. G: Okay. You know that you have to scale up your divisor (i.e., 3.5) by 10, you need to scale up your dividend (i.e., 17.15) by 10, too. And the decimal will right be here [between the 171 and 5 in the quotient].
And Lucy’s (non-EBs, a pseudonym) theory that where the digits are makes sense in this one only if I move the decimal over, right, Lucas (non-EBs, a pseudonym)? So you see the 49 is here, and I would put decimal right there [in front of 4 in the quotient place]. You will get the wrong answer, right. So we have two ways that number was explained. How about number 8. Videotaped on March 7, 2019 The episode shows how Mrs. G revoiced and elaborated Lulua’s response during the whole class discussion. In particular, text in italics in this excerpt shows how Mrs. G revoiced Lulua’s response. By revoicing Lulua’s ideas, it helps the whole class to better understand Lulua’s math reasoning process; at the same time, it allows students to be aware that Mrs. G valued Lulua’s ideas and contribution. Another example in Table 5.3 shows that the discourse moves of revoicing supported Lulua’s awareness of language use such as the collocations, which refers to habitual juxtaposition with two or more words going together, such as, “scale up” or “scale down”. When 95 using the vocabulary “scale” to describe making a number larger in its size or amount, it is important for ESL learners to know that they need to add the preposition “up”, for example, “scale it up”. Similarly, when describing making a number smaller in its size or amount, it is important for ESL learners to know that they need to add the preposition “down” following the word “scale”. Table 5. 3 Did You Scale It Down By 10 Or By 100? Transcripts Comments Mrs. G: Look at number 3. Lulua, can you tell me your thinking on solving this one? Lulua: First, I did 6 times 5 equals 30. After you have that, then you did 6 times 2; then 6 times 4. Mrs. G: Um, 6 times 4 would be 24, honey. This is 42. How many 6 in 42? Lulua: 7. Mrs. G: 7, right? Where does the decimal go? Lulua: Between 52 and 7. Mrs. G: How do you know that? Lulua: Because (paused) I scaled it by (paused)10. Mrs. G: Did you scale it down by 10 or by 100? 
Lulua: (silent).
Mrs. G: If you ignore the decimal, you scale it up by 10 and by 10 more, right? So that's 100. So you scaled it up by 100 to make this 9865 (task #4), so [for] your answer, you have to come back and scale it down by equal amounts.
Comment: Lulua's response (i.e., because I scaled it by 10) shows that the preposition "down" or "up" was missing here.
……
Mrs. G: Awesome. So we scaled it down by the 10th, Lulua, not a hundred…so my caution to you, friends, about just putting the decimal places in the same place that it was in, you have to make sure that you do not add zeros or do more to change the places of the decimal. And the other thing that you've noticed use this line here, this 5 is always above the 7, this 9 is always above the 6, this 6 is always above the 8. You have to keep your quotient lined up. Keeping things lined up is very important. (Lulua raised her hand.) Lulua?
Lulua: I scaled it down by 10.
Comment: Lulua did a self-correction through interacting with Mrs. G and used the correct form of "scale it down" compared with her previous language use (i.e., because I scaled it by 10).

Videotaped on March 4, 2019

In this conversation, Mrs. G was discussing the math problem of 316.2 ÷ 6. To support a better understanding of the conversation, below I present a picture showing the math task (i.e., number 3).

Figure 5.1 The Math Task of 316.2 ÷ 6 on Student Activity Book

In this example, Lulua intended to say "because I scaled it down (or up) by 10"; however, she missed the preposition "down" or "up". When revoicing Lulua's response, Mrs. G asked a closed question (i.e., did you scale it down by 10 or by 100?) to confirm Lulua's thinking; in the meantime, she used the correct form, "scale it down", to check whether this was what Lulua intended to say.
At this moment, it seemed that Lulua noticed that when she used the word "scale" to describe her specific problem-solving procedure, such as making a number smaller, she needed to add the preposition "down" after "scale". So, she said "I scaled it down by 10" after the discussion with the teacher. Of course, Mrs. G's main purpose here might not have been addressing the linguistic issue. However, an indirect impact of revoicing the student's response was that it drew her attention to her language use. The benefit of revoicing was reported by some EBs as well. One of the EBs felt that the strategy of "kind of like restate" was helpful for her to better understand Mrs. G's instructional talk. For example, during my interview with Safia, when I asked her what support she would hope Mrs. G provided during FA, Safia said,

Sometimes Mrs. G restate what she said [is helpful] to make it more clearer, like the volume thing, what you suppose to do to find the volume...People who did not get the first time they can get the second time [by using] different words…Just like kind of explain clearly to the class, as I told you, kinda of like restate, cause sometimes I do not really get what she was talking, sometimes she thinks in her head, she does not really say it to the class. [Interview with Safia, April 29, 2019]

According to Safia, the discourse move of restating was helpful because it offered students additional opportunities to listen to Mrs. G and thus better understand her instructional talk. Mrs. G's revoicing suggests that this scaffolding practice not only supported EBs' understanding of the math talk but may also have supported EBs' awareness of language use. These benefits for EBs' content and language development, to a large extent, echoed the dimension of assessment access in Lyon's (2013a) framework, where rigorous content learning, including academic language development, is emphasized.
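The scaling bookkeeping that Mrs. G and Lulua were negotiating in task #3 (316.2 ÷ 6) can be written out explicitly. As a brief sketch of the arithmetic (my illustration, not part of the classroom record):

```latex
% Task #3: ignoring the decimal scales the dividend up by 10,
% so the quotient must be scaled back down by the same factor.
\[
316.2 \div 6
= (3162 \div 10) \div 6
= (3162 \div 6) \div 10
= 527 \div 10
= 52.7
\]
% This matches Lulua's placement of the decimal "between 52 and 7".
```

In other words, "scale it up by 10" and "scale it down by 10" name the two halves of one inverse operation, which is why the missing preposition mattered mathematically as well as linguistically.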
Scaffolding Questions

Another support that Mrs. G felt was important for EBs is using scaffolding questions during FA. The functions of scaffolding questions, according to Mrs. G, include confirming students' thinking and illustrating the problems that students were stuck on. She explained during our interview,

So I might say do you mean the dividend or the divisor, or I may need to point out the parts of the problem, you know. It is a lot of scaffolding that I need to do depending on where I think the students are. [Interview with Mrs. G on March 13, 2019]

According to this interview excerpt, Mrs. G felt it was important to scaffold EBs' understanding by confirming their responses or pointing out where they might get stuck. Mrs. G believed that her decisions on scaffolding questions were associated with the evidence that she gathered about student learning, as she said, "depending on where I think the students are". In addition, Mrs. G mentioned the importance of starting with lower-level questions to guide EBs' participation during the whole-class discussion, as she stated,

You know one of the good decisions that I made is that I try really hard at the beginning of the lesson to get my ELL students involved at a lower level to gauge where they are at [during] their day. And then I do my best to return to them later. In the task I might give them, I might [say] I want you to listen to Snow's response because when Snow finishes, I am going to ask you what you heard. [Interview with Mrs. G, March 13, 2019]

According to Mrs. G, the benefits of using lower-level questions are twofold: providing EBs with participation opportunities and helping her gather evidence of EBs' learning during discursive FA. She also mentioned the significance of returning to students to elicit their thinking. Mrs.
G's perception of the purposes of using scaffolding questions, to a large extent, echoes the core idea of assessment access, where EBs' participation opportunities are valued. In the meantime, it also echoes the core idea of assessment fairness, where scaffolding questions provide EBs with opportunities to better explain their math thinking and reasoning. To better understand how Mrs. G supported EBs using scaffolding questions, below I present an episode of a whole-class discussion in Table 5.4. The whole episode is very long, so I did not include all the conversations that Mrs. G had with students. Instead, I selected the segment focusing on Mrs. G's interactions with the EB students Moiz, Salam, and Nisha.

Table 5.4 Moiz, Do You Want to Read Number 18?

Mrs. G: Moiz, do you want to read number 18?
Moiz: The Morenos made a kil? (paused).
Mrs. G: A kiloliter.
Moiz: The Morenos made a kiloliter of fruit punch for a large party. They will pour it into punch bowls that each hold 0.01 kiloliter. How many bowls will they need?
Comment: Moiz had difficulty in reading the word "kiloliter".
Mrs. G: Okay. So how many hundredths are there in 1 whole, Moiz?
Moiz: A hundred.
Mrs. G: A hundred. So one divided by one hundredth is one hundred. And it is the same as?
Moiz: 1 times 100.
Comment: Mrs. G asked Moiz lower-level questions (i.e., how many hundredths are there in 1 whole) to check his current understanding and draw him into the conversation.
Mrs. G: 1 times 100. So fifth graders, when we divide a number by a decimal number less than one, why is the quotient greater than the original number? Why is our answer bigger than what we started, Greg (non-EB, a pseudonym)?
Ss: It is adding a 100? Oh, no, it is adding a 100.
……
Mrs. G: What do you think, Nisha? When you are dividing by a decimal, your groups are? The divisor should be groups. What could be said about the groups?
Nisha: Bigger?
Comment: Mrs.
G invited Nisha to answer her question.
Mrs. G: The groups are bigger? So in this question (Mrs. G pointed to question #18), 1 divided by 0.01 equals 100 (Mrs. G pointed to 1 ÷ 0.01 = 100). The dividend is smaller than the quotient. What made that happen?
Ss: (Silent).
Comment: Mrs. G cited an example (i.e., question #18) to support Nisha's understanding.
Mrs. G: Hardest part for today. Nisha, think about the liter that we broke to tenth pieces is ten, right? What can be said 10 compared to 1?
Nisha: A tenth equals 10.
Comment: Mrs. G gave another example about the "liter".
Mrs. G: A tenth equals 10? Salam, do you want to try and help her out?
Salam: It stays the same.
Comment: Mrs. G invited Salam to help Nisha.
Mrs. G: What stays the same?
Salam: If you're doing 0.1 divided by 1, it stays the same.
Mrs. G: Okay, that's true. But why is it when 1 [is] divided by 0.1, the answer is 10, or when I divide 1 by 0.01 the answer is 100?
Salam: Because, it's going to say, because (paused) one (paused)
Comment: Mrs. G invited Lulua to respond.
Mrs. G: I know it is hard. Finding words is tricky sometimes. Lulua?
Lulua: So when you, when 1 times 100 [or] 10, it is the same because it times by itself, it's a 100 [or] it's 10.
Mrs. G: Do you agree with that?
Ss: Yeah.
Mrs. G: (Mrs. G wrote down 2 ÷ 0.01 = 200 on the whiteboard.) I think we are getting stuck on the fact that 0.1, 0.01, and 0.001 has a one in it all the time. And I want you to think differently about it. Why do I have 200 there? 200 is bigger than 2.
Ss: Because we are doing division. [The divisor is] 0.01 (paused) is like timing, um, 100.
Mrs. G: So it is the same as 2 times 100 equals 200. (Moiz raised hands.) Moiz, do you want to give it a try?
Moiz: < >
Comment: Mrs. G returned to Moiz to check his understanding.
Mrs. G: Yes. That is what Bob (non-EB, a pseudonym) just said. But I am looking for words to describe because that is not a reason why it's a procedure. When you [are] dividing by a decimal, your groups are (Mrs.
G pointed to the sentence stamp "when you're dividing by a decimal, your group" on the whiteboard)? Are they larger or smaller than a whole?
Ss: Smaller.
Mrs. G: So when you are dividing by a decimal, your groups are smaller than a whole, each whole in your dividends makes ten or a hundred or a thousand groups. (Mrs. G also wrote this sentence down on the whiteboard.) What do I mean by that? Tell me what do I mean by that, Lucas (non-EB, a pseudonym)?
Comment: Mrs. G wrote down the answer to her "why" question on the whiteboard.
Lucas: You mean that if it is gonna be 0.001, then it will be 1000 groups; if it is 0.01, it will be 100 groups; and if it is 0.1, it will be 10 groups.

Videotaped Lesson on March 5, 2019

This example shows how Mrs. G talked through the story problem about how many punch bowls the Morenos would need with the whole class. At the beginning of the conversation, Mrs. G built an easy entry for Moiz by letting him answer two lower-cognitive-level questions (e.g., how many hundredths are there in 1 whole?). By doing so, Mrs. G was able to "gauge" Moiz's understanding and allow him access to the conversation. Then, instead of asking Moiz to respond to her higher-level question, Mrs. G switched to inviting a non-EB student (i.e., Greg) to respond to her probing question (i.e., why the quotient is greater than the original number when dividing a number by a decimal number less than one). In the middle of the conversation, Mrs. G used scaffolding questions to support Nisha's understanding of whether the groups are bigger or smaller when dividing by a decimal. She first asked Nisha "what can be said about the groups"; at the same time, she articulated that "the divisor should be groups" so that Nisha was clear about what "groups" meant in Mrs. G's question. However, Nisha seemed unsure about her answer; she paused a little and said "bigger". After realizing that Nisha did not give the right answer, Mrs.
G attempted to guide Nisha’s thinking by drawing on a math division problem that they just discussed, and she said, “so in this question 1 divided by 0.01 equals 100. The dividend is smaller than the quotient. What made that happen?” However, it seemed that Nisha was stuck there. So Mrs. G invited Salam to help Nisha out. After several rounds of discussion, Mrs. G then returned to Moiz and asked him to try to explain why the quotient is greater than the original number. This move is aligned with her reported practices of returning to EBs to articulate their thinking. However, after Moiz explained his thoughts, Mrs. G did not follow up and ask Moiz why he had a similar explanation with Bob. Instead, Mrs. G said, “that is what Bob just said. But I am looking for words to describe because that is not a reason why it’s a procedure”. It seems that Mrs. G intended to funnel Moiz to the desired answer, rather than asking follow-up questions by drawing on his ideas. Then how do EBs perceive Mrs. G’s practices of scaffolding questions? My interviews with students, overall, revealed positive feedback and they felt Mrs. G’s scaffolding questions were helpful. In particular, EBs reported that teacher modeling, which is about walking through math questions step by step, was able to support their math learning. Moiz said, “at the start of the multiple fractions, I asked her about it and she did it step by step and I learned it. It is helpful.” Additionally, when I asked EBs whether there is any change or modification they would like Mrs. G to do, Jasmine expressed her desire for Mrs. G to walk through the solving process of a math task. Jasmine said, “she already does that for us. I like that she can walk through the questions to us.” 103 Jasmine and Moiz’s comments suggest an alignment between Mrs. G’s reported scaffolding question use and her implementation of the scaffolding questions during FA. 
Their comments also bring up the significance of teacher modeling (i.e., walking through the procedure) in supporting EBs’ math learning and participation during discursive FA.

Alignment between Mrs. G’s Beliefs of Barriers That EBs May Encounter and EBs’ Perceptions

Data analysis in this study reveals an alignment between the barriers that Mrs. G identified and EBs’ perceived challenges in their math learning. It seems Mrs. G had a good awareness of the challenges that EBs may encounter. She was able to consider the influence of language and culture on EBs’ assessment performance, which echoed Lyon’s (2013a) description of a teacher’s expertise in assessment fairness. During my interviews with Mrs. G, she identified two challenges that EBs may encounter: (a) the linguistic complexity of academic language, and (b) a lack of “common language” and familiarity with the contexts of math problems. Below I present each barrier that Mrs. G identified, followed by EBs’ perceptions of their needs during discursive FA.

Linguistic Complexities of Mathematics Assessment Tasks

Mrs. G felt that EBs might encounter difficulties in reading comprehension due to the text complexity of assessment tasks, which are written in academic language, as she said,

Reading comprehension is a big part of it. Yes, you know, we cannot get to the math until they can interpret the question. And as they get older and the text gets more complex it gets in their way, so they need to practice reading in the math classroom also. [Interview with Mrs. G, February 22, 2019]

According to Mrs. G, solving math problems successfully requires reading skills. However, the increasing text complexity of math tasks may lead to EBs’ incomplete or inaccurate interpretation. As a result, text complexity may become a barrier that prevents EBs from fully demonstrating their math knowledge and skills. In addition to text complexity, Mrs. G identified synonyms and disciplinary vocabulary as barriers to EBs’ math learning. She said,

One of the things that I see prevalent in mathematics is the idea we call the same thing by different names. There are a lot of synonyms in math. And I think that makes it difficult for ELL learners. That [vocabulary] we may call “dividing” “factoring” for example, and not having that interchangeable language. And I think that if I were to identify one particular part of the language of mathematics that’s probably the hardest part. So you have to break down the academic language and build it back up. [Semi-structured member check interview with Mrs. G, December 1, 2017]1

Mrs. G felt that synonyms in math were challenging for EBs, and that the most difficult part was the disciplinary vocabulary (e.g., “dividing”). According to Mrs. G, vocabulary for abstract concepts was “the hardest part” because there is no interchangeable language to use. Additionally, Mrs. G believes that it is important to “break down” the academic language and “build it back up” in order to better support EBs’ math learning.

1 The interview data was collected in an earlier study before my dissertation. However, the participant is the same in both studies, which is why I went back and added part of the previous study’s data into my dissertation data analysis.

The challenges of linguistic complexity that Mrs. G perceived are, to a large extent, aligned with EBs’ perceptions. During my interviews with EBs, Jasmine reported that vocabulary for abstract concepts (e.g., concave, denominator) was challenging for her. In addition, Jasmine felt that words with multiple meanings (i.e., polysemy) or with less frequent use in daily life were difficult for her. During our interview, she said,

In the student activity book, there is a box showing the vocabularies, but I do not know what they really mean.
Some words like “concave” that I did not see or use often, I do not know its meaning. Sometimes I thought of the word “table” [as] a literal table, not the table in math context. And sometimes I forgot what grams and kilograms is. Sometimes I just need to guess. [Interview with Jasmine, April 24, 2019]

According to Jasmine, she had difficulty knowing the meaning of disciplinary vocabulary that she was not familiar with. Additionally, Jasmine brought up another challenge caused by polysemy, when a word (e.g., “table”) takes on a different meaning in the math register. Jasmine also expressed her concern about feeling awkward when she did not use math terms appropriately in her explanations during class discussions, as she illustrated,

Maybe I need to work on my vocabulary because sometimes I do not know what they are called, sometimes I forget the denominator and numerator, which one [is which]. It is hard to speak that vocabulary. And that is weird. My explanation does not make a lot [of] sense because I am using different kinds of vocabulary. [Interview with Jasmine, April 24, 2019]

According to Jasmine, she wants to work on academic vocabulary in math. Due to her unfamiliarity with academic vocabulary, Jasmine found it difficult to fully show her math thinking through oral explanation. As a result, this situation made Jasmine uncomfortable.

It is worth noting that in addition to the linguistic complexity of written text that Mrs. G reported, EBs reported linguistic complexity of oral text during discursive FA. Academic vocabulary for abstract concepts may affect EBs’ understanding of Mrs. G’s instructional talk, as Tasnime said,

I usually know what to do [regarding learning objectives]. But sometimes Mrs. G says big words of English such as “common denominator”, and I do not really understand. So I am the first one to raise my hand and ask that. And I do not like that.
[Interview with Tasnime, April 30, 2019]

According to Tasnime, “big words” such as “common denominator” made it harder for her to understand Mrs. G’s instructional talk about lesson learning objectives. Tasnime’s experience also reveals the importance of the strategies that Mrs. G reported, such as using “interchangeable language” and “breaking academic language down and then building it back up”, during discursive FA.

Lack of “Common Language” and Familiarity with the Contexts of Math Problems

In addition to academic language, Mrs. G identified two other challenges that EBs encountered during discursive FA: a lack of “common language” and a lack of familiarity with the contexts of math problems. These two challenges were not brought up by EBs. This may be related to my data collection, because I did not ask questions about “common language” or the context of math tasks during my interviews with EBs. Thus, there is no explicit evidence about the alignment or misalignment between the barrier Mrs. G identified concerning the context of math problems and EBs’ perceptions of the challenges it may bring to them.

Drawing on my interview data with Mrs. G, below I explain why she felt that a lack of “common language” and unfamiliar contexts of math problems may hinder EBs’ class participation. During my interview with Mrs. G, she discussed the role of “common language” in supporting EBs in building math understanding, as she stated,

A lot of time we use common language to describe mathematical ideas and the academic language comes later. And we do not have the same common language to build their understanding of, so I think it is hard for them. I think it is wonderful for their language development skills. But I think that because they do not have the common language, they are first trying to figure out what the words mean that we are using as common language.
And then they have to attach that academic language to that foundation that they are building. So it is definitely accelerated for them. [Interview, March 28, 2019]

Mrs. G reported that having a “common language” was essential for EBs to build conceptual understanding when talking with peers in the math class. She pointed out that using “common language” could accelerate EBs’ academic language development: according to Mrs. G, the math ideas that EBs build using common language allow them to make connections to academic language. Unfortunately, Mrs. G found that EBs often did not have enough common language for communicating and sharing math ideas.

Mrs. G was also concerned about the context of a math problem, which may influence EBs’ math problem-solving. For example, when I asked about one idea she had discussed previously, saying “I remember that last year you mentioned some words might be challenging for EBs to understand the story problems in math”, Mrs. G clarified her ideas and said,

It is not a word, a single word of problem like coming across to where they are not familiar with, and [it is] more about the problem of, the money problem for example, understanding the context of the problem, had a money value and their answer should be reflected in the same value even though it was not a decimal division problem. [Interview with Mrs. G, February 22, 2020]

Mrs. G believed that understanding the context of a math story problem was often more critical than understanding any single vocabulary word. According to Mrs. G, the context of math problems matters because it impacts the accuracy of EBs’ responses to math problems; for instance, EBs need to keep the same value when solving a money problem.

Section 2: Misalignment of Mrs. G’s FA Practices, Beliefs, and EBs’ Perceptions

In this section, I present patterns of misalignment among Mrs. G’s FA practices, beliefs, and EBs’ perceptions from two perspectives: (1) misalignment between supports that Mrs. G felt she provided to EBs and her actual implementation during FA; and (2) a tension between Mrs. G’s belief in instruction for all and EBs’ needs in differentiation. The reason to examine these misalignments is that they may cause missed opportunities to learn for EBs; for instance, EBs may not have sufficient chances to explain their math thinking, or may not receive differentiated accommodations. I examine these misalignments using the lens of both assessment fairness and access in Lyon’s (2013a) framework.

Misalignment between Supports that Mrs. G Felt She Provided to EBs and Her Actual Implementation

Using “Interchangeable Language”

As a reminder, Mrs. G identified linguistic complexities, which include text complexity and disciplinary vocabulary, as barriers to EBs’ math learning. Although Mrs. G had a good awareness of the language barriers that EBs may encounter, it seems that there was a gap between the support she felt she provided to EBs and her actual implementation of those supports within this unit. For example, Mrs. G reported that she used synonyms or “interchangeable language” to support EBs’ math learning during class discussions, as she said,

So I use the language, but I also use synonyms for the language as we have the discourse, so that if you hear the exponent or power of 10 in a situation, you know that they mean the same thing. It is like calling the answer the quotient that the same idea or saying the thing that you are dividing by is the divisor. When they hear me say those things they become interchangeable. And we have had so much practice at this point. I think the more practice they have, the more they internalize it. [Interview with Mrs. G, March 13, 2019]

The strategy of using synonyms or “interchangeable language” is consistent with EBs’ reported needs. For example, when asked whether there was anything she felt Mrs. G could do to support her understanding of math story problems, Tasnime said,

You know how math has other words of confusing, like words. So maybe she [Mrs. G] could change the word to the words that she says that are not confusing. Like, if she can change the word as to like “add”, or like “divide the [pause]”, you know, there’s a specific [name], I don’t know what the name is. But like, add or advise the things in the parentheses first. There’s a name for that and when we first started this, I didn’t know that. Why couldn’t you just put this say the prep was added at the parentheses. [Interview with Tasnime, April 30, 2019]

According to Tasnime, certain math vocabulary is confusing to her. She suggested that Mrs. G change confusing vocabulary to words that are easier to understand, or add extra information in parentheses to explain confusing words. While Tasnime said it would be helpful to have these supports, she did not mention that Mrs. G had offered the support of “interchangeable language”. My classroom observations did not show clear evidence of Mrs. G using “interchangeable language” within this unit.

Using Wait Time

Another example of misalignment concerns the use of wait time. Mrs. G mentioned wait time more than once during our interviews. During my interview with her in 2017, I asked her what supports she usually provided to EBs; one of the key strategies she mentioned was providing EBs with additional wait time. She believed that students would likely try their best to respond to her questions if they were given extra time, as she stated,

My wait time in instruction has taken years to work. Through my Action Research, I’ve recognized that it is a strength. Wait time is student specific. While ELLs often need additional processing time, other students with unique needs need that time as well. When students know you won’t let them off the hook but will wait for them, they are more likely to give responding a shot.
[Interview with Mrs. G, March 22, 2017]

According to Mrs. G, she has many years of experience using wait time in her class, and she also identified its benefits through her Action Research. Mrs. G felt that students would be motivated to participate in class discussion if they were given enough wait time. In light of these benefits, Mrs. G did not limit wait time to EBs only; instead, she used wait time for all students.

Below I present a segment of a conversation between Mrs. G and Moiz in Table 5.5 to illustrate how wait time may not have been sufficiently provided. In this episode, Mrs. G and the whole class were discussing a math task. This task requires students to use the fact that 1715 ÷ 35 = 49 to solve the following problem: 0.1715 ÷ 0.35.

Table 5.5 I Did Not Wait for Him

Mrs. G: How about the next one? 0.35 is our divisor, and 0.17 is the start of the dividend. What do you think, Nick (non-EBs, a pseudonym)?
Ss: So it could be 49 because it can go many times, neither of them [the divisor and dividend] are whole numbers.
Mrs. G: Okay. Neither of them are whole numbers. So you are expecting your answer could be bigger.
Ss: Yes.
Mrs. G: Interesting. Moiz, do you have thoughts about this one (0.1715 ÷ 0.35)?
Moiz: < >
Mrs. G asked Moiz to share his thoughts.
Mrs. G: Does this help you? (Mrs. G wrote down 0.17 ÷ 0.35) I am not sure. We kind of got the past point that we know we have decimals as numerators and denominators, right?
Ss: Yeah.
Mrs. G provided additional information to Moiz to support his understanding.
Mrs. G: Simon (non-EBs, a pseudonym), do you have thought about this one?
Moiz: 0.
Mrs. G: Oh, I am sorry. Moiz was starting. I did not wait for him.
Mrs. G realized that she did not give Moiz enough wait time.
Moiz: 0.049.
Mrs. G: You are thinking something should be like that (wrote down 0.049)? Why are you thinking that way?
Then Mrs. G asked Moiz to explain why.
Moiz: Because when I looked at number one (the division problem of 17.15 ÷ 35 = 0.49), it [the quotient] starts from one “0”.
Mrs. G: Okay. And this (0.1715 ÷ 0.35) up here appears smaller than that? (Mrs. G waited for a few seconds). I like that thinking. That is a good way to explain it. Simon, what’s your thought about it?
Mrs. G paused a while intentionally to wait for Moiz’s response.
Videotaped on March 7, 2019

In this conversation, Mrs. G asked Moiz to share his thoughts about the division problem 0.1715 ÷ 0.35. Moiz responded to her; however, it seemed his response was not clear. So Mrs. G provided him with visual support (i.e., writing down 0.17 ÷ 0.35) and briefly mentioned the relationships between decimals, numerators, and denominators. Mrs. G waited for a while; however, just as she switched to asking Simon the same question, Moiz started to give his answer. This reveals that Moiz needed more wait time to process Mrs. G’s question than she expected. At this moment, Mrs. G realized that she had not given Moiz enough wait time before he started to share his answer, so she said, “oh, I am sorry. Moiz was starting. I did not wait for him.” The second time, when confirming Moiz’s thinking, Mrs. G intentionally paused for a while to wait for his response after asking him “and this [0.1715 ÷ 0.35] up here appears smaller than that?”

The conversation between Mrs. G and Moiz shows there were instances when Mrs. G did not give EBs enough wait time when eliciting their thinking during FA. When reflecting on her teaching during our informal conversations, Mrs. G also acknowledged the moment when she did not give Moiz sufficient wait time. She said,

In that same video I asked a question of Moiz, I had a very long wait [and] he didn’t appear to be processing. I called on another student, and he responded.
So I just didn’t give him enough time. And I think that that does happen sometimes. But it’s a really good example of a place where ELL learners need it, he needed more time. And it’s a good example of him processing the math and processing the English because in that particular example that I’m thinking of he had the wrong answer. [Interview with Mrs. G, March 26, 2019]

Mrs. G felt that she did not give Moiz enough wait time during the discursive FA, even though she felt she had already waited a long time. It turns out that Moiz may have needed more wait time than Mrs. G expected because, as she posited, he was processing both the math and the English at the same time. This misalignment regarding wait time is consistent with my observations.

During my interview with Mrs. G, she further discussed her concerns about the EB student Moiz and why she believes that Moiz needed more wait time. She said,

I probably worry about Moiz most because he probably is a very deep thinker. And I often don’t give him enough time. He is very quiet, that is another piece to it. I think he is quiet because he needs more time. But he also shares very brilliant thinking. [Interview with Mrs. G, February 22, 2019]

Mrs. G pointed out that Moiz was a deep thinker; as a result, he needed more time when completing a math task. When given more time to think, Moiz sometimes shared great ideas. Yet Mrs. G acknowledged that she did not give him enough time. This may be because Moiz was quiet while processing his thoughts, and this quiet signal might have led Mrs. G to misinterpret his silence as not knowing the answers to her questions.

The importance of having enough wait time during FA was acknowledged by EBs. During my interview with Moiz, he said,

Yeah. I feel like I know I had [need] a little bit more time because sometimes, like, I’m trying to think about what the question is, and then she [Mrs. G] calls on me, and then I had to improvise really quickly because I can’t just sit there to be quiet…I want to answer to her. So usually like half of the time I just still think and then give her an answer. But sometimes they just have to see something like, oh, I’m not sure. [Interview with Moiz, May 1, 2019]

Moiz reflected that he needed more time to think about what he was being asked to do and to respond to Mrs. G’s questions during FA. However, Moiz sometimes had to react very quickly, even though he seemed not to have enough time to process Mrs. G’s questions, as he said, “I had to improvise really quickly because I can’t just sit there to be quiet”. Moiz’s statement implies that it is fundamental to ensure that EBs have additional wait time when teachers elicit their thinking during FA.

A Tension Between Mrs. G’s Belief in Good Instruction for All Students and EBs’ Needs in Differentiation

Another misalignment is that there seems to be a tension between Mrs. G’s belief in good instruction for all students and the specific accommodations and differentiated instruction that EBs need. Mrs. G believed that good instruction is good instruction for every student, as she said,

As an instructor, one of the things that I strive for is accommodations that can be used by all. Good teaching is good teaching for everyone. And if I’m making an effort to accommodate something and more than my ELLs or more than my special education students, if there are others that would like to use that accommodation, it needs to be available to them. [Interview with Mrs. G, March 28, 2019]

Mrs. G felt that accommodations could be used by all students. She did not limit accommodations to EBs only; instead, all students could access the same accommodations in her class as long as they needed them. Mrs. G’s comment reveals that she cares about learning opportunities for all students. However, having all students in mind should not mean overlooking EBs’ specific learning needs.
Based on my interviews with Mrs. G, I found that she always grouped EBs together with other groups of students when talking about how she supported EBs. This suggests that Mrs. G might not have considered EBs’ needs separately or differentiated their unique language needs when planning her instruction and assessments. For example, when Mrs. G was talking about how she supported EBs through questioning practices, she mixed EBs, special education students, and female students into one big group of “marginalized student populations”. She said,

For the marginalized students in the classroom, the special education students, the ELL students, females, I would look to see purposefully going into the next class and which questions I wanted to ask them. And if it was someone that may be challenged, you know, what was I going to build in front of their questions so they could be successful at the higher-level question. [Interview with Mrs. G, March 13, 2019]

It seems that Mrs. G viewed EBs’ needs as similar to those of female students and students with special education needs, yet these groups likely needed different types of support. This undifferentiated, general grouping, as well as the belief in good instruction for all, seems inconsistent with EBs’ voices. My interviews with EBs reveal that EBs needed accommodations provided based on their language and math proficiency levels. For example, Lulua felt it would be helpful for Mrs. G to differentiate the math tasks assigned to students based on students’ math levels:

I think something for her [Mrs. G] to change is not to do the same thing for everyone. Sometimes the paper is easy for one person but super difficult for another. She should have different types of papers according to student levels. [Interview with Lulua, April 26, 2019]

According to this interview excerpt, Lulua felt that one instructional modification Mrs. G could make is to provide differentiated math tasks to students. In addition, Lulua discussed the importance of tailored feedback for her math learning. She said,

I think she should see how you would work with your math, like certain things you would get like. Pretend you’re really advanced at math, maybe she can write something that’s like more advanced on a sticky note. Then maybe like you’re not the best at math, she could write something that makes more sense to you. Please ensure sticky notes that would make sense to one person individually... Some people like one person in math class can be better at literacy and then one person that’s really advanced at math can be not very good at literacy. [Interview with Lulua, April 26, 2019]

According to Lulua, only feedback that draws on the individual student’s math and literacy levels is meaningful. EBs’ need for tailored feedback echoes Lyon’s (2013a) description of teachers’ highest level (i.e., the elaborating level) of expertise in assessment access. According to Lyon (2013a), the “elaborating” level refers to teachers explicitly considering how assessment can support EBs’ participation and promote their complex thinking, and how they provide feedback tailored to EBs. Tailored feedback is one important indicator distinguishing a teacher’s expertise in assessment access at the “implementing” versus the “elaborating” level. I will revisit Mrs. G’s overall expertise level in assessment fairness and access in the summary section.

EBs also mentioned their need for explicit language support. For example, Jasmine expressed her desire for a vocabulary glossary in Mrs. G’s class:

I think there should be some sort of section like list the vocabularies and put [information about] what it means. I think that will work better instead of just putting them there… Maybe she can make a vocabulary glossary for each unit, like in the unit of Geometry, she can put “parallel” mean this.
[Interview with Jasmine, April 24, 2019]

Jasmine felt it would be helpful if Mrs. G could list the math vocabulary and its meanings in each unit. Overall, it is worth noting that the interview excerpts from EBs reveal a need for explicit language support, such as vocabulary glossaries. EBs can obtain ample opportunities to develop their academic language during math discourse; yet it is also critically important for teachers to teach or model academic vocabulary explicitly in the math context. In the meantime, providing direct language support such as English or bilingual glossaries for EBs seems essential, because EBs can use these language resources as tools to support their math learning and class participation.

A Closing Remark on Mrs. G’s Assessment Expertise Levels in Equity

In summary, there was both alignment and misalignment among Mrs. G’s FA practices, beliefs, and EBs’ perceptions. Findings show that Mrs. G values using discourse to support EBs’ math learning. Aligned with her values, there were large amounts of math discussion in her typical lesson routines. During discursive FA, Mrs. G used multiple discourse moves, such as transcribing verbal discussion, revoicing, and scaffolding questions, to support EBs’ conceptual understanding and participation. The benefits of those discourse moves, overall, echoed what EBs voiced about their learning needs.

In addition, there is an alignment between Mrs. G’s perception of the barriers that EBs may encounter and EBs’ voices on the challenges they had, such as the linguistic complexities of assessment tasks. Aside from the linguistic complexities (e.g., disciplinary vocabulary) of written text, EBs also brought up potential challenges they faced due to the linguistic complexities of oral text (e.g., polysemy). Both EBs’ voices and Mrs. G’s perceptions suggest a need for providing direct and explicit language support, such as the use of a glossary and explicit vocabulary teaching, for EBs.
Findings in this study also reveal some misalignment between Mrs. G’s FA practices, beliefs, and EBs’ perceptions of their learning needs. Specifically, the misalignment occurred in two main areas. First, there was inconsistency between supports that Mrs. G felt she provided to EBs, such as the use of “wait time”, and her actual implementation of those strategies. Second, there was a tension between Mrs. G’s belief in good instruction for all students and EBs’ voices about their learning needs. For example, Mrs. G tended to group EBs with special education students and female students when talking about how she used questioning practices to support “marginalized students.” This grouping suggests that she might not have considered differentiating her instruction to meet EBs’ unique needs. My interviews with EBs, in contrast, reveal their desire for differentiated tasks and tailored feedback, such as assigning tasks according to students’ literacy and math levels. This misalignment may lead to missed learning opportunities for EBs. As a result, it may affect the fairness and access of FA in various ways, such as hindering opportunities for EBs to better demonstrate their math knowledge or to move their learning forward through tailored feedback.

Drawing on the findings in this chapter and the aforementioned summary of Mrs. G’s support for EBs, it seems that her assessment expertise in equity, overall, falls within the level of “implementing.” As a reminder, according to Lyon’s (2013a) Assessment Expertise Rubric, there are four levels (i.e., not present, introducing, implementing, and elaborating) for describing a teacher’s expertise in equity.

In light of the barriers Mrs. G reported that EBs may encounter, such as linguistic complexities and the contexts of math problems, it seems that Mrs. G was aware of the influence of language and culture on EBs’ assessment performance.
She also used different teaching strategies, such as transcribing verbal discussion, revoicing, and scaffolding questions, to support EBs’ conceptual understanding and participation during FA. Yet, while Mrs. G used discursive FA as a tool to support EBs in a meaningful context (i.e., class discussion), there was no clear evidence showing that she incorporated EBs’ language and culture, especially their home language and culture, into her FA design and implementation. Thus, I consider Mrs. G’s expertise in assessment fairness to be at the “implementing” level.

Findings also show that Mrs. G values using discourse to support EBs’ math learning. Through math discourse, she used multiple scaffolding strategies, such as transcribing verbal discussion and providing wait time, to support EBs’ math learning. Thus, I consider Mrs. G’s expertise in assessment access to be at the “implementing” level as well.

Although I consider Mrs. G’s expertise in assessment equity, overall, to be at the “implementing” level, this does not mean that Mrs. G fully differentiated EBs’ unique needs and provided them with specific accommodations. Similarly, it does not mean that Mrs. G supported EBs’ full participation in math. The insufficient wait time, her funneling questioning patterns, and the lack of explicit language scaffolding and resources during FA may have led to missed opportunities for EBs to learn. Therefore, there is room for improvement in supporting EBs in Mrs. G’s math class. Furthermore, there were misalignments between what Mrs. G intended to do and the support she actually provided to EBs during FA. This finding suggests a gap between Mrs. G’s knowing and doing, which is connected to a discussion of the indicators of teachers’ assessment expertise. I will further explain this point in the discussion chapter.
CHAPTER 6: DISCUSSION

In this chapter, I synthesize my findings and discuss them in relation to the existing literature. Specifically, I discuss six major themes and claims based on findings about Mrs. G’s beliefs, FA practices, and support for EBs. These themes and claims include: (1) high-quality FA: increasing students’ role as active contributors in FA; (2) responsive FA: the role of language in EBs’ math learning; (3) a tension between a teacher’s belief in good teaching for all students and EBs’ needs for specific accommodations; (4) what counts as a teacher’s assessment expertise: a gap between knowing and doing; (5) an insight into a teacher’s knowledge base for “responsive” FA within linguistically and culturally diverse math classrooms; and (6) potential influences of mathematics content and curriculum on a teacher’s FA practices. I chose these themes and claims because I find them significant and relevant to researchers and educators who are interested in FA and in working with EBs.

High Quality of FA: Increase Students’ Role as Active Contributors

My findings in this study showed that in some instances Mrs. G provided rich opportunities for students (EBs and non-EBs) to be involved in discursive FA practices. However, at other times Mrs. G used more teacher-directed approaches to FA. In this section, I discuss instances when Mrs. G supported student engagement as well as when she hindered it, and I make connections to research in this area. Regarding instances when Mrs. G supported student engagement in discursive FA, my observations showed that she supported students’ talk during whole-class discussions by connecting to students’ ideas. Mrs. G also gave students the flexibility to respond with clarification questions. Mrs. G’s high-level FA practices, according to the FARROP rubric, foregrounded students’ role as active contributors. For example, Mrs. G integrated discussion into all the stages of FA.
She asked questions drawing on students’ ideas and encouraged pair talk in her math class with the purpose of eliciting and promoting student thinking during discourse. Mrs. G’s FA practices connect to research on assessment from sociocultural perspectives, which highlights the importance of students constructing knowledge through interaction and scaffolding (Smith, Teemant, & Pinnegar, 2004; Sardareh & Saad, 2012; Gipps, 2002; Shepard, 2000; Black & Wiliam, 1998a). In addition, Mrs. G provided opportunities for students to participate in math talk during FA. This also connects to Moschkovich’s (2018) work, which highlights students’ talk in the math class. According to Moschkovich (2018), discussions with students provide more accurate assessment data than written assessments. Without knowing student thinking, a teacher would not be able to provide feedback that meets students’ needs or make informed instructional decisions in FA (Heritage, 2007; Gotwals & Ezzo, 2018). Yet, not all of Mrs. G’s FA practices foregrounded students’ role as active contributors. As mentioned above, there were some instances where Mrs. G hindered student engagement. For example, I found that she often used a funneling questioning pattern to guide students to a desired end or answer. As a result, students may not have had enough opportunities to explain their thinking. This connects to research from Herbel-Eisenmann and Breyfogle (2005), who argued that funneling questions can “limit what students are able to contribute because it directs their thinking in a predetermined path based only on how the teacher would have solved the problem” (p. 486). While acknowledging the teacher’s role in guiding students’ talk, this study suggests that, in order to better support student learning, Mrs. G could work on shifting her questioning toward a focusing-interaction pattern (Herbel-Eisenmann & Breyfogle, 2005).
This focusing questioning pattern involves teachers listening to students’ responses and providing corresponding feedback that draws on students’ ideas. In the meantime, to value students’ active role in FA, findings in this study suggest that it is crucial for Mrs. G to avoid answering her own questions and to strive to provide students with enough wait time. In addition to the interaction between teachers and students, assessment approaches that are centered on students, such as self- and peer assessment, have received increasing attention (Butler, 2016; Edwards, 2014; Black & Harrison, 2001; Panadero, Jonsson, & Strijbos, 2016; Willey & Gardner, 2010). As a result, high-quality FA emphasizes incorporating self- and peer assessment to foreground students’ participation (Black & Wiliam, 1998a; Wylie & Lyon, 2016; Lynch, McNamara, & Serry, 2012; Sadler, 1998). Through self- and peer assessment and feedback, learners can become more aware of their learning progress and of how to close their learning gaps (Black & Harrison, 2001). This awareness, in turn, can lead learners to take ownership of their learning and participate actively in the assessment process (Panadero, Jonsson, & Strijbos, 2016; Nicol & Macfarlane-Dick, 2006). Yet, findings in this study showed that Mrs. G did not incorporate self- and peer assessment into her FA during this unit. This mirrors findings from Volante and Beckett (2011), who revealed that elementary teachers tended to struggle with integrating self- and peer assessment and with giving students opportunities to reflect on and take ownership of their learning during FA. Part of the reason is that those teachers worried about the objectivity and reliability of peer assessment (Volante & Beckett, 2011). While the purpose of this study is not to generalize findings, Mrs. G’s FA practice, to some extent, points to a need to illustrate how to better support elementary teachers’ use of self- and peer assessment when developing teachers’ expertise in FA.
Responsive FA: The Role of Language in EBs’ Math Learning

As evidenced by existing literature, it is critically important to address EBs’ language needs in content-area assessment (Moschkovich, 2013; Bailey, Maher, & Wilkinson, 2018; Lyon, 2012, 2013a, 2013b; Clark-Gareca, 2016; Abedi & Lord, 2001; Abedi, Hofstetter, & Lord, 2004; Wolf & Leon, 2009). However, the role of language in content-area instruction and assessment can be complicated and confusing given research findings and suggestions from various fields (e.g., ESL, content learning, and assessment). In particular, there may be conflicting views on the best practices for EBs. For example, researchers and educators in content areas advocate supporting EBs’ language and content development at the same time through meaningful discourse. Yet, the emphasis on discourse may carry a potential risk of losing sight of EBs’ needs for explicit language support. In light of this situation, this section aims to illustrate a clearer picture of the role of language in EBs’ math learning during FA. Specifically, drawing on the concept of “assessment equity” (Lyon, 2013a), I discuss how to meet EBs’ language needs during FA from three aspects: (1) assessment access: using language modalities flexibly to support EBs’ math learning; (2) assessment access: using a “direct approach” to support EBs’ language development; and (3) assessment fairness: direct language supports and accommodations for EBs. By doing so, I hope that this study can add to the existing literature and, in the meantime, enrich researchers’ and educators’ understanding of EBs’ language needs in elementary math classroom assessment and instruction.

Assessment Access: Using Language Modalities Flexibly to Support EBs’ Math Learning

As presented in this study, findings suggest that Mrs. G emphasized the importance of using “common language” to build students’ math ideas in small groups. The “common language” that Mrs.
G mentioned can be understood as everyday language. According to Mrs. G, using “common language” supported EBs in developing math ideas, while the math ideas that students developed, in turn, would benefit EBs’ academic language development in whole-class discussions. Mrs. G’s statement about “common language” use is, to a large extent, aligned with Moschkovich’s (2013, 2018) advocacy of using language modalities flexibly. According to Moschkovich (2018), while acknowledging the importance of developing EBs’ academic language in math talk, everyday language should not be perceived as an obstacle in math classrooms. Instead, everyday and academic language registers can be used for different purposes in class discussions (Moschkovich, 2018). Mrs. G’s practices echoed current research that highlights the importance of allowing EBs to use informal language when exploring a new math concept, developing ideas with peers in small groups, and engaging in mathematical practices (Moschkovich, 2018; Goldenberg, 2013). As such, in order to ensure the equity and responsiveness of FA, it is essential to incorporate discursive FA to support all students’ learning. In addition, findings in this study suggest that there is a need to reconsider the relationship between everyday and academic language registers, and to use language modalities flexibly in different types of formative assessment activities, such as whole-class and small-group discussions.

Assessment Access: Using a “Direct Approach” to Support EBs’ Language Development

My interviews with EBs showed that EBs desired explicit language support. This aligns with Goldenberg’s (2008) advocacy of incorporating a direct approach to support EBs’ learning needs in language. According to Goldenberg (2008), there are two approaches to best supporting EBs in content-area learning: an “interactive approach” and a “direct approach.”
The interactive approach refers to “instruction with giving and taking between learners and the teacher, where the teacher is actively promoting students’ progress by encouraging higher levels of thinking, speaking, and reading at their instructional levels” (Goldenberg, 2008, p. 18). The direct approach emphasizes “explicit and direct teaching of language skills or knowledge, for example, letter-sound associations, spelling patterns, vocabulary words, or mathematical algorithms” (Goldenberg, 2008, p. 18). In this section, I refer to the “direct approach” as explicit and direct teaching of language knowledge and skills, such as direct vocabulary instruction (Gibbons, 2006; Symons, 2021; August & Shanahan, 2006; Kinsella, 2005), explicitly scaffolding language use (Gibbons, 2006, 2015), and communicating language-specific learning objectives. The importance of explicit attention to linguistic form and function has been supported by researchers in second language learning and linguistically responsive teacher education (Lucas, Villegas, & Freedson-Gonzalez, 2008; Schleppegrell, 2004; Gass, 1997; Swain, 1995). As evidenced by findings in this study, Mrs. G often used the interactive approach (i.e., class discussions) to communicate learning targets and to elicit students’ thinking by drawing on their ideas. Yet, Mrs. G rarely used a direct approach to support EBs’ needs in math learning and language development. Based on my observation notes, Mrs. G’s explicit language support for EBs was very limited. This limited use of direct language support seemed to contrast with EBs’ voices and with the challenges to EBs that Mrs. G herself identified. Findings showed that Mrs. G and EBs identified three types of vocabulary that were challenging: vocabulary associated with abstract concepts, polysemy, and synonyms. For example, EBs expressed difficulty in discerning the meanings of polysemous mathematical words (e.g., “table”).
These identified challenges call for explicit attention to language, particularly teaching academic vocabulary explicitly to EBs (Kersaint, Thompson, & Petkova, 2013; Symons, 2021). In addition, findings in this study showed that Mrs. G revoiced students’ responses during her FA. Although it is unclear whether Mrs. G used revoicing intentionally to scaffold EBs’ language use, this discourse move supported EBs’ academic language development via self-correction. For example, an EB student, Lulua, was able to correct her use of “scale it up/down” when she explained her thinking after several rounds of interaction with Mrs. G. This move connects to Gibbons’s (2015) work, which emphasizes the importance of scaffolding EBs’ language use by recasting students’ responses with correct grammar and vocabulary. In addition, research suggests that high-quality, responsive FA in linguistically and culturally diverse classrooms requires teachers to post not only content learning targets but also clear language learning outcomes or targets (Gibbons, 2015). By doing so, EBs can get more opportunities for explicit language instruction in the math classroom. On the contrary, a lack of FA practices around explicit language learning targets may lead to missed learning opportunities for EBs and thus influence the fairness of assessment. In this study, my observations showed that Mrs. G rarely posted language-specific learning outcomes or targets in her lessons throughout the whole unit, although EBs seemed to have opportunities to develop their academic language through Mrs. G’s intentional (or unintentional) scaffolding. Mrs. G is not alone in rarely posting language learning targets.
This finding, to some extent, echoes Gibbons’s (2015) study, which found that “although most teachers acknowledge the importance of language in the classroom, it is often not explicitly planned for across the curriculum, so it becomes the ‘hidden curriculum’ of schooling” (p. 214).

Assessment Fairness: Direct Language Supports and Accommodations for EBs

Findings in this study also showed that Mrs. G and EBs identified language-related challenges and barriers to EBs’ math learning. Specifically, both Mrs. G and EBs mentioned that linguistic complexities, such as vocabulary for abstract math concepts, polysemy, and synonyms, hinder EBs’ participation during FA. These identified challenges are connected to the existing literature on assessment fairness for EBs in content areas (Abedi & Levine, 2013; Siegel, 2007, 2008; Clark-Gareca, 2016; Martiniello & Wolf, 2012). For instance, Martiniello and Wolf (2012) found that EBs encountered difficulties in “tackling word problems in mathematics assessments” due to the linguistic complexities of assessment tasks (p. 152). Given this linguistic complexity, EBs expressed their needs for explicit language support. For example, comments from Jasmine suggested that having a vocabulary glossary would help her as a reference whenever she could not remember the meaning of a math vocabulary word or did not know how to say it when responding to Mrs. G’s questions. The desire for a vocabulary glossary in math class connects to the existing literature advocating the use of direct language supports and accommodations (e.g., glossaries, reading aloud, linguistic modifications, translanguaging) to increase assessment fairness for EBs (Abedi & Levine, 2013; Willner, Rivera, & Acosta, 2009; Siegel, 2007; Clark-Gareca, 2016; NCTM, 2013).
In addition, EBs’ voices on direct language supports, such as the “vocabulary glossary” and “interchangeable language,” raised the importance of tackling oral text complexities. According to Jasmine and Tasnime, having vocabulary glossaries and interchangeable language would allow them to better comprehend Mrs. G’s instructional talk and to better explain their thinking and math reasoning during FA. Unfortunately, the majority of the literature on assessment accommodations focuses on written assessments, such as unit tests, quizzes, and large-scale assessments in mathematics (Clark-Gareca, 2016). The oral language complexities involved when assessing EBs formatively in mathematics, in contrast, are understudied. This may be associated with the common assumption that EBs can get moment-to-moment support in a meaningful context (e.g., class discussion). However, the fact that a meaningful context enriches EBs’ learning opportunities does not mean that EBs do not need extra direct language support. During discursive FA, Mrs. G could increase her use of direct language support to scaffold EBs’ language and math learning.

A Tension Between the Teacher’s Belief About Good Teaching for All Students and EBs’ Needs for Differentiation

Findings in this study showed that there was a tension between Mrs. G’s belief about good teaching for all students and EBs’ needs for differentiation. Mrs. G believed that “good teaching is good teaching for everyone.” This belief seems to echo the core spirit of inclusive education and universal design for learning (Stanford & Reeves, 2009). There is no doubt that it is important for a teacher to attend to all students’ needs. However, “good teaching for all” does not mean there is no need to differentiate for EBs’ learning needs. EBs expressed a strong desire to receive differentiated support from Mrs. G. My interviews with EBs revealed that they wanted and needed differentiated support during FA, such as differentiated math tasks and feedback.
According to Lulua, it would be helpful if Mrs. G could provide differentiated feedback according to students’ math and literacy levels. In addition, Jasmine expressed her need for explicit language support such as vocabulary glossaries. Yet, in Mrs. G’s math class, it seems that she may not have always differentiated for EBs’ specific needs when planning her instruction and assessments, judging from how she discussed her use of scaffolding questions for marginalized students. In that statement, Mrs. G considered supporting “marginalized students” such as females, special education students, and EBs together, without an emphasis on EBs’ unique learning needs. The tension might be caused by the possibility that Mrs. G conflates her idea of “good teaching for all” with the concept of “just good teaching” (De Jong & Harper, 2005). “Just good teaching” (JGT) refers to applying good strategies and practices (e.g., cooperative learning and connections to students’ prior knowledge) developed for a diverse group of native English speakers (De Jong & Harper, 2005). However, JGT is not enough when working with EBs. While acknowledging the rationality and relevance of JGT for EBs, it is critical for Mrs. G to differentiate for EBs’ language learning needs in classroom instruction and assessments (De Jong & Harper, 2005; Lyon, 2013a; Lucas, Villegas, & Freedson-Gonzalez, 2008). In addition, the tension might be caused by Mrs. G’s perception of her role in supporting EBs. When discussing direct language accommodations provided to EBs during assessments, Mrs. G said that if EBs “got stuck on the language, she (the ESL teacher) can do that.” Her comments suggest that Mrs. G may perceive herself mainly as a content teacher supporting all students’ math learning, while viewing it as the ESL teacher’s major responsibility to support EBs’ language needs directly during assessments. Mrs. G is not the only teacher who might hold this belief.
Walker and colleagues (2004) found that mainstream teachers often felt that their teaching responsibilities did not include EBs. The fact is that teaching EBs is not only ESL teachers’ responsibility; mainstream teachers also play an important role in supporting EBs’ language needs during classroom instruction and assessment (Yoon, 2008; Lucas, Villegas, & Freedson-Gonzalez, 2008).

What Counts as a Teacher’s Assessment Expertise: A Gap Between Knowing and Doing

My findings showed that there is a gap between what Mrs. G said in the interviews and her FA practices in the math class. In other words, there seemed to be a gap between Mrs. G’s knowing and doing. In this section, I illustrate this gap by drawing on different instances where these misalignments appeared. First, there seems to be a misalignment between Mrs. G’s claim of valuing students’ ideas and the extent to which she consistently translated that value into her FA practices. During my interview with Mrs. G, she reported the importance of incorporating students’ ideas and contributions into discursive FA. Yet, as the section above discussed, Mrs. G was not always able to draw on students’ ideas during discussion to fully support students’ engagement. Findings showed a wide range in the extent to which Mrs. G foregrounded students’ ideas during her FA practices. Second, there is a misalignment between the supports that Mrs. G felt she provided to EBs and her enactment of those strategies (e.g., wait time and interchangeable language). There was also a gap between the learning needs of EBs as Mrs. G perceived them and the actual support that she provided to meet those needs during FA. These inconsistencies and misalignments between Mrs. G’s beliefs and practices revealed a potential gap between what she knew about supporting EBs and what she did during FA practices. Another example of this gap is about Mrs.
G’s knowledge of what to do and her ability to put that knowledge into practice when analyzing her support for EBs. Mrs. G’s “awareness” and “consideration” of the influence of language and culture on EBs’ assessment performance did not necessarily translate into good implementation of this type of support for EBs. The gap between Mrs. G’s knowing and doing raises a question about what counts as a teacher’s assessment expertise. Many researchers have proposed that teachers need to develop knowledge of what to assess, why to assess, and how to assess (Lyon, 2013b; Wylie & Lyon, 2016; Abell & Siegel, 2011). More specifically, Abell and Siegel (2011) proposed four components of a teacher’s knowledge base of assessment: knowledge of assessment purposes, knowledge of what to assess, knowledge of assessment strategies, and knowledge of interpretation and action-taking. Yet, good knowledge of assessment does not equal a teacher’s ability to put that knowledge into practice. To distinguish knowledge of what to do from the capability of putting that knowledge into classroom practice, Gearhart and colleagues (2006) categorized assessment expertise in teachers’ assessment-related professional development into two categories: “understanding assessment concepts” (i.e., knowing) and “facility with assessment practices” (i.e., doing) (p. 241). In addition to Gearhart and colleagues (2006), Herman and colleagues (2015) also noted the importance of teachers translating assessment knowledge into practice. They defined FA as part of a teacher’s “content and pedagogical knowledge in action” (p. 345). Findings about the gap between Mrs. G’s knowledge of what to do in FA and her ability to put that knowledge into practice have implications for the conceptual framework on which this study draws. Specifically, they may suggest a need to reconsider the descriptions and indicators of teachers’ assessment expertise in Lyon’s (2013a) rubric.
This is because Lyon’s (2013a) rubric describes a teacher’s expertise in equity from the perspective of whether the teacher considers the use of certain strategies for EBs, which focuses solely on a teacher’s knowledge of what to do in assessment. However, a teacher’s capability to put that knowledge into practice is also critically important. As Kennedy (2008) proposed, teacher qualities include not only personal resources (e.g., knowledge, beliefs, credentials) and effectiveness (e.g., fostering and motivating student learning), but also teacher performance, which refers to what teachers “actually do” in “their daily practices” (p. 60). Thus, when defining what counts as a teacher’s assessment expertise, it is essential to include a teacher’s ability to translate their knowledge of assessment into practice. In other words, both “knowing” and “doing” are equally important when defining assessment expertise and its relevant professional development.

An Insight into a Teacher’s Knowledge Base of “Responsive” FA in Math Classrooms

Teacher’s Knowledge Base in PCK

Conducting effective assessment cannot be separated from a teacher’s knowledge of content and pedagogy, as Herman and colleagues (2015) claimed in their work: “formative assessment clearly makes demands on teachers’ content and pedagogical knowledge” (p. 345). They perceive FA as the teacher’s “content and pedagogical knowledge in action” (p. 345). In addition, to conduct effective FA, it is essential for teachers to know how to best elicit students’ understanding of a particular topic (Siegel & Wissehr, 2011), which suggests the need for a teacher’s pedagogical content knowledge.
To avoid potential confusion between “pedagogical content knowledge” (PCK) (Shulman, 1986, 1987) and content-pedagogy knowledge (Herman, Osmundson, Dai, Ringstaff, & Timms, 2015), this study uses the concept of “pedagogical content knowledge (PCK)” to represent a teacher’s knowledge of content, of pedagogy, and of the overlap between the two. According to Shulman (1986), PCK includes “the ways of representing and formulating the subject to make it comprehensible to others” and “an understanding of what makes the learning of specific topics easy or difficult” (p. 9). Specifically, in math classrooms, a teacher’s PCK includes not only their understanding of math structures, ideas, and representations, but also their understanding of students’ challenges in developing math ideas, knowing how to best support students’ learning, and using specific assessment strategies to check students’ learning progress (Herman, Osmundson, Dai, Ringstaff, & Timms, 2015; Shulman, 1986, 1987). As evidenced by this study, Mrs. G used different questioning strategies, such as higher-cognitive-level questions (i.e., “why” and “how” questions) and lower-cognitive-level questions (i.e., factual questions), to elicit student thinking during FA. It is the combination of Mrs. G’s knowledge of math content and pedagogy that led to her practice of using different questioning strategies to elicit student thinking. In contrast, Mrs. G was not able to provide varied strategies to differentiate for EBs’ learning needs in content and language during her FA practices. The insufficient support for EBs may be caused by Mrs. G’s limited understanding of her EBs’ characteristics (e.g., their home language backgrounds and English proficiency levels). Knowledge of students is one of the seven categories in the knowledge base for teaching that Shulman (1986, 1987) proposed.
The National Board for Professional Teaching Standards (NBPTS, 2016) has emphasized teachers’ knowledge of students, including EBs, when articulating what teachers should know and be able to do. When I asked Mrs. G about her EBs’ English language proficiency levels, she said that she needed to talk with the ESL teacher to know the exact levels. This suggests that Mrs. G may lack sufficient knowledge about the EBs in her math class. As such, it is critical for Mrs. G to develop her knowledge of all students. In addition, Mrs. G’s practices in differentiation may suggest a lack of knowledge of specific strategies (e.g., accommodations) used for EBs. According to Lucas and Villegas (2013), the “pedagogical knowledge and skills of linguistically responsive teachers” include four essential categories: (a) a repertoire of strategies for learning about the linguistic and academic backgrounds of EBs in English and in their native languages, (b) an understanding of and ability to apply key principles of second language learning (e.g., sociolinguistics), (c) the ability to identify the language demands of classroom tasks, and (d) a repertoire of strategies for scaffolding instruction for EBs. However, during my interviews, Mrs. G shared her lack of professional development (PD) experiences in developing pedagogical knowledge and skills for teaching EBs. Mrs. G said that she felt she was not well prepared to work with EBs when she graduated. After she became a classroom teacher, there were not many professional development opportunities focusing on teaching EBs, let alone on assessment for EBs. Insufficient and inconsistent PD for in-service teachers may partly account for classroom teachers’ lack of knowledge and skills for teaching EBs. Thus, it is important to put the development of consistent and systematic PD for classroom teachers on the agenda.
Overall, a teacher’s knowledge base in PCK plays an important role when working with diverse student populations. In the sub-section below, I go further and discuss what knowledge and skills a teacher may need in order to conduct responsive FA when working with EBs.

Teacher’s Knowledge Base in Educational Linguistics and Second Language Learning

Researchers have argued that it is critically important to extend teachers’ knowledge base in second language learning and educational linguistics, although teachers may perceive themselves as content-area teachers only (Fillmore & Snow, 2000; Lucas, Villegas, & Freedson-Gonzalez, 2008; Lucas & Villegas, 2013). Knowing basic theories and principles of linguistics and second language learning can deepen teachers’ understanding of the role of language in working with EBs. In the meantime, it can also help teachers better understand and predict the difficulties and challenges that EBs may encounter during classroom instruction and assessment (Flores, 2016; Schleppegrell, 2007). Fillmore and Snow (2000) examined what teachers should know about linguistics. According to Fillmore and Snow (2000), teachers’ knowledge of linguistics should include (a) language and linguistics (e.g., language structure, language use in educational settings), (b) language and cultural diversity, (c) sociolinguistics, such as language policies and politics, (d) language development, such as academic language development in school, (e) second language learning and teaching, such as second language acquisition, (f) the language of academic discourse (e.g., informal vs. academic language registers), and (g) text analysis and language understanding. At first sight, this comprehensive model may make teachers feel overwhelmed. I am not arguing for adopting the exact same suggestions from Fillmore and Snow (2000).
However, their proposal provides some insight into what knowledge about second language learning and educational linguistics mainstream teachers are expected to have. The advocacy for preparing linguistically responsive teachers and developing teachers’ knowledge of second language learning is also evident in Lucas and colleagues’ (2008) work, where they proposed six principles that teachers need to know when working with EBs. Specifically, the six principles are: (1) be aware of the fundamental differences between academic language proficiency and conversational language proficiency; (2) second language learners must have access to comprehensible input; (3) social interaction in which EBs actively participate fosters their conversational and academic English development; (4) EBs with strong native language skills are more likely to achieve parity with native-English-speaking peers; (5) a safe, welcoming classroom environment with minimal anxiety is essential for EBs; and (6) explicit attention to linguistic form and function is essential to second language learning. In Mrs. G’s math class, she valued students’ contributions and attempted to create a safe and welcoming learning environment for all students. Mrs. G was aware of the concept of “language register” and used “common language” to support EBs’ math ideas and academic language development. In addition, she used different strategies, such as “transcribing student talk,” “revoicing,” and “scaffolding questions,” to increase EBs’ class participation. Of course, there is still room for Mrs. G to improve in the following aspects: providing EBs with comprehensible input, providing native language support, and increasing explicit language teaching and support (e.g., academic vocabulary and linguistic forms/functions). The areas where Mrs.
G needs to improve, in turn, echo the need for and importance of developing the teacher’s knowledge base in second language learning and educational linguistics in order to provide EBs with responsive FA.

Potential Influences of the Math Curriculum on a Teacher’s FA Practices

In addition to the teacher’s knowledge, the curriculum, which in this study refers to the math content topics for students to learn, may play another important role in shaping Mrs. G’s FA practices. FA, especially discursive FA, is a situated social activity within the classroom and school (Moss, 2008; Ruiz-Primo, 2011). When Mrs. G implemented FA practices, her teaching was constrained by various contextual factors, including the math curriculum. As a result, Mrs. G’s FA practices may be influenced directly or indirectly by the math content and topics that she was teaching as well as the adopted curriculum resources she used. For example, the math curriculum that Mrs. G’s school adopted focuses, overall, on inquiry-based math learning. The focus of this curriculum seems consistent with Mrs. G’s beliefs in her math teaching, where she values discourse and students’ ideas in FA. Thus, there is a possibility that Mrs. G held beliefs about integrating discourse into her math classroom teaching before this curriculum was adopted by the school. Another possibility is that this inquiry-based math curriculum may have influenced Mrs. G’s perceptions and implementation of discursive FA. In addition, the characteristics of math topics and corresponding learning goals and types of math tasks may influence the ways in which Mrs. G implemented specific FA practices.
According to the teacher’s guide (i.e., the curriculum), within this unit students are expected to (a) connect the methods for whole numbers to computing with decimals, (b) explain patterns in the number of zeros of the product when dividing by powers of 10, (c) decompose factors into base-ten units, (d) use strategies based on place values, the properties of operations, and/or relationships between multiplication and division, and (e) illustrate and explain the calculations by using equations, rectangular arrays, and area models. In particular, a challenging aspect of this unit is that students must be able to divide by a decimal less than 1. Those learning goals reveal that procedural knowledge (i.e., division with whole numbers and decimals) is an essential part of this unit. To follow the curriculum, Mrs. G needed to provide opportunities for students to practice within the math classroom before they were able to apply that procedural knowledge to real-life math problems. These practices of computing division problems and equations could be observed within each lesson. Given the characteristics and goals of this math unit, there is a possibility that Mrs. G tended to use funneling questions to direct students to a predetermined answer when she and her students worked on math tasks focusing on procedural knowledge. In addition, Mrs. G might tend to use funneling questions and lower cognitive demand questions (e.g., factual questions) more often when walking through math tasks that required students’ procedural knowledge than when she and her students worked through math tasks that required students’ math reasoning and problem-solving skills. The potential role of this math unit (i.e., Unit 5, on division with whole numbers and decimals) in shaping Mrs. G’s FA practices, if any, informs us about the importance of including the curriculum when investigating a teacher’s FA practices within a discipline.
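To make concrete the kind of procedural knowledge this unit targets, the following is a hypothetical worked example (not drawn from Mrs. G’s observed lessons) of the place-value strategy the curriculum names, applied to dividing by a decimal less than 1:

```latex
% Hypothetical illustration (not from the observed lessons):
% rewrite the division so the divisor becomes a whole number by
% multiplying both dividend and divisor by the same power of 10.
\[
  2.4 \div 0.6 \;=\; \frac{2.4 \times 10}{0.6 \times 10} \;=\; 24 \div 6 \;=\; 4
\]
% The same reasoning underlies the pattern in zeros when dividing
% by powers of 10, e.g., 240 \div 10 = 24 and 240 \div 100 = 2.4.
```

Note that the quotient (4) is larger than the dividend (2.4) whenever the divisor is less than 1, which is part of what makes such items conceptually demanding for fifth graders.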
Different content topics may shape a teacher’s FA practices in different ways. Therefore, to confirm the consistency of the nature of Mrs. G’s FA practice, this study suggests a need to look into Mrs. G’s FA practices across different math units in the future.

CHAPTER 7: CONCLUSION AND IMPLICATION

Conclusion

In conclusion, drawing on FA from sociocultural perspectives and Lyon’s (2013a) conceptualization framework on teacher’s assessment expertise, this study investigated how an elementary teacher, Mrs. G, implemented FA in her math class, as well as how she addressed EBs’ learning needs in both content and language to ensure the responsiveness of her FA practices. This study can provide insights into the teacher’s expertise in high-quality and responsive FA from two perspectives: (1) Mrs. G’s enactment of and rationale for discursive FA, and (2) alignment and misalignment among Mrs. G’s FA practices, beliefs, and EBs’ perceptions. First of all, regarding Mrs. G’s enactment of FA, findings showed that discourse is an important part of Mrs. G’s FA practices. This is why I structured my first findings chapter around Mrs. G’s discursive FA, specifically with a focus on her questioning and responding practices, given the nature of discursive FA. Findings suggest that Mrs. G valued creating a classroom culture of valuing student ideas throughout FA. However, when enacting FA, there was variation in the extent to which Mrs. G prioritized students’ ideas and contributions. This study revealed a funneling questioning pattern in the interaction between Mrs. G and her students when eliciting students’ thinking. This study also found that there were both lower-level and higher-level practices regarding Mrs. G’s FA practices in communicating learning targets with students and eliciting student thinking using questioning strategies. At the lower level of FA practices, students’ ideas and contributions, overall, were not prioritized.
In the meantime, there were missed learning opportunities because Mrs. G did not always provide enough wait time to students in her lower-level practices. In her higher-level practices, by contrast, Mrs. G valued students’ ideas by connecting to students’ prior learning experiences and providing students with opportunities to internalize the learning targets. For example, Mrs. G brought students’ ideas into the math classroom and revisited the lesson focus at the end of class. In her higher-level questioning practices, Mrs. G also built students’ awareness of self-regulated learning in the middle of the lesson and asked questions by drawing on students’ responses. Secondly, regarding the “responsiveness” of Mrs. G’s FA, this study examined Mrs. G’s support around the dimension of assessment equity (Lyon, 2013a). Findings in this study showed a pattern of alignment and misalignment among Mrs. G’s FA practices, beliefs, and EBs’ perceptions of their learning needs and the support that they received from Mrs. G. Specifically, both Mrs. G and EBs acknowledged the importance of bringing discourse into math classrooms. Aligned with this value, Mrs. G used discursive FA and integrated class discussions to support student talk and elicit student thinking. To meet EBs’ needs in language and content, Mrs. G used multiple strategies, such as transcribing verbal discussion, revoicing, and using scaffolding questions, to develop students’ math ideas and academic language. The use of those strategies, overall, received positive feedback from EBs. For example, one EB student mentioned that the discourse move of “restate” could help her better understand Mrs. G’s instructional talk. EBs also mentioned that teacher modeling (i.e., walking through a math question step by step) supported their math learning. However, there were misalignments between the support that Mrs. G felt she provided to EBs and her actual implementation.
In other words, findings in this study revealed a gap between Mrs. G’s knowing and doing in supporting EBs. For example, Mrs. G reported that she had years of experience in giving students wait time. In actual implementation, however, it seemed that Mrs. G missed some opportunities to elicit EBs’ thinking due to insufficient wait time. In the meantime, there was a tension between Mrs. G’s belief about instruction for all students and EBs’ needs in differentiation. The misalignment and tension suggest a need to increase explicit language support for EBs in order to provide EBs with meaningful and responsive FA experiences. To achieve this goal, it is critical for Mrs. G to develop her knowledge of EBs, including how EBs acquire English as an additional language and how language plays a complicated role in EBs’ math learning.

Implication

In this section, I discuss the methodological and theoretical implications of this study from two aspects: (a) implications for future research, and (b) implications for qualitative case study design, which include potential limitations of this study.

Implications for Future Research

First of all, findings in this study reveal the importance of prioritizing students’ voices and student involvement when investigating a teacher’s FA practices and assessment expertise in linguistically and culturally diverse classrooms (Black & Wiliam, 1998a; Volante & Becket, 2011). Incorporating EBs’ voices enables us to better understand which supports are “responsive” to EBs’ needs and which are not during FA. Incorporating students’ voices into the research also enables us to further understand the best practices for EBs (Bartell, Wager, Edwards, Battey, Foote, & Spencer, 2017). In particular, it enables us to visualize EBs’ needs in language and thus have a clearer idea of how to promote teachers’ expertise in language-responsive teaching (Prediger, 2019). Yet, this study only included EBs’ voices in examining Mrs.
G’s FA practices. Future research can include interview data from both EBs and non-EBs to compare similarities and differences in those students’ needs. The comparison may allow researchers and teachers to see which supports apply to both EBs and non-EBs, and which supports need to be provided specifically to EBs. To better understand how to implement high-quality FA, it is also critical for future research to examine how teachers value students’ active role and contributions during FA, especially how teachers gather learning evidence and engage students through self-assessment and peer assessment. In this study, it seems that Mrs. G rarely incorporated self-assessment and peer feedback in her FA practices, which may have led to missed learning opportunities for students. Future research can investigate how teachers can incorporate self-assessment and peer feedback effectively in FA and what that may look like for EBs. Another direction for future research is to investigate teachers’ moment-to-moment decision-making to further understand how teachers use the learning evidence that they gather from students to inform instruction. By uncovering what happens within the black box of decision-making, we can better understand whether the teacher has considered EBs’ specific learning needs during instruction and assessment (McMillan, 2003; Schoenfeld, 2008). Last but not least, findings in this study raise a question about supporting EBs in the long run, especially those EBs who have been identified as proficient in English. Oftentimes, practitioners may think that EBs who have been identified as English proficient will exit the English service program and thus no longer need English support. However, findings in this study revealed a different view.
During my interviews with EBs, one EB student, Jasmine, who speaks Vietnamese and English at home, expressed her concern about not being able to tell the meaning of polysemous math terms or recall appropriate math terms (e.g., denominator) when responding to Mrs. G’s questions. She said that it would be helpful if Mrs. G could provide a “list of vocabularies” with definitions during class discussions. The voice of this EB student, to some extent, informs both practitioners and policymakers about supporting EBs’ language needs from a sustainable perspective.

Implications for Qualitative Case Study Design

This study has implications for qualitative case study design. First of all, findings on the gaps between the teacher’s “knowing” and “doing” suggest the importance of collecting multiple data sources (e.g., teaching videos, interviews, and artifacts) when examining a teacher’s assessment practices. The purpose of doing so is twofold. First, collecting multiple data sources allows for the triangulation of data analysis (Mathison, 1988). Second, it enables a more comprehensive and reliable understanding of a teacher’s instruction and assessments. For example, without a cross-comparison of three data sources (i.e., videotaped lessons, interviews with Mrs. G, and interviews with EBs), I would not have been able to fully understand the alignments and misalignments between Mrs. G’s FA practices, beliefs, and EBs’ perceptions. The credibility and reliability of data analysis is another important factor in qualitative case study design. To increase the accuracy of data analysis and interpretation, researchers have recommended conducting member-check interviews with participants (Harper & Cole, 2012). In this study, I did not have my preliminary findings until a year after my data collection, during the pandemic. Given the long time since I had collected data in Mrs.
G’s class, conducting a follow-up member-check interview after I obtained my findings would not have been as effective as one conducted right after completing data collection. Therefore, a member-check interview was not included in this study, which is a potential limitation. However, I did conduct a member check with Mrs. G after my first round of data collection in 2017, when time allowed. This experience taught me a lesson about carefully planning the timeline of data collection and about conducting member-check interviews during an intensive data collection stage. This study also suggests the necessity of using video clips to stimulate a teacher’s recollection of their teaching practices. Video clips have been used as a useful tool to support pre-service and in-service teachers’ reflection on their teaching practices (Sherin & van Es, 2005). They enable participants to reflect on their teaching in deeper and more detailed ways (Sherin & van Es, 2005). In this study, I relied on observation notes to generate stimulated-recall interview questions, because my intensive data collection schedule did not allow me to watch each videotaped lesson and then prepare selected video clips for us to sit together and discuss. However, without using video clips to stimulate our conversations, I might have missed opportunities to illustrate the detailed interactions between Mrs. G and her students and her rationale for them. Thus, it is critical for future case studies focusing on teaching and classroom practices, which collect teaching videos as major data sources, to use video clips as tools to understand teachers’ FA practices. Overall, this study adopted a qualitative case study design to investigate Mrs. G’s discursive FA because it is situated in a real-life context. While acknowledging the importance of exploring a teacher’s FA practices in a real-life context, this study also acknowledges the complexities of classroom teaching.
Even though I focus on a specific aspect of teaching (i.e., FA), Mrs. G’s FA practices can be influenced by various factors, such as the curriculum, student behaviors, and instructional time. Similarly, Mrs. G’s decision-making during FA can be influenced by multiple factors as well. As Mrs. G mentioned, one of the reasons that she did not expand a discussion on learning targets during FA was her concern about a lack of instructional time due to snow days. In addition, because of the complexities of teaching, Mrs. G’s knowledge and theory of FA may not translate into her teaching practices one hundred percent of the time, even if she had the best intentions. For example, this study revealed that Mrs. G aimed to create a classroom culture that valued students’ ideas and contributions; in the meantime, she also valued the use of wait time during FA. However, various factors such as class size, student behaviors, and limited instructional time may not have allowed her to use adequate wait time in every moment of her teaching. Moreover, when Mrs. G reported certain supports (e.g., “interchangeable language”) that she used for EBs, she may have been talking about the support she provided to EBs in general, rather than the support she specifically provided within this math unit. Thus, there is a possibility that a math unit containing more math concepts would reveal stronger evidence of how Mrs. G supports EBs using the strategy of “interchangeable language” during FA.

APPENDICES

APPENDIX A: A Lesson Observation Protocol

Observation Date:
Unit/Lessons:
Learning targets/objectives:
o When to clarify & how to clarify

Classroom Assessment Practices
Assessment Activities
1. Assessment tasks/activities in the mathematics classroom
o When to assess
o How to assess
2. Feedback to students
o Written and/or verbal
o Types of feedback
3. Instructional adjustment
4.
Differentiation for EBs:
o Linguistic modification (e.g., using modified or simplified tasks)
o Visual support (e.g., providing pictures and graphics to EBs)
o Read aloud
o Native language use
o Extra time
o Small group work
o Scaffolded language use
o Explicit vocabulary instruction
o English dictionaries/glossaries
o Bilingual dictionaries/glossaries
o Other

APPENDIX B: Interview Protocol with Mrs. G

Selected Sample Questions:
1. How many years have you been teaching? During those years, how long have you worked with diverse learners, especially emergent bilinguals?
2. Did you have any undergraduate/graduate courses focusing on classroom assessments in math? Did the courses you took involve assessment for diverse learners, including emergent bilinguals?
3. Since you started working in this elementary school, have you attended workshops or any other PD sessions on classroom assessment or teaching emergent bilinguals?
   a. If yes, did the PD sessions cover specific content on assessment for emergent bilinguals?
4. What assessment methods do you currently use for fifth graders in this math class?
5. How do you define FA? What is the purpose of FA?
6. To what extent are you confident in using formative assessment to assess students’ learning in math?
7. In what ways do you collect evidence of student learning within the FA process? Do you let students do self- and peer assessments in the fifth-grade math class?
8. For FA, what types of assessment tasks do you often use?
   a. Where do the assessment tasks come from? Do you make them?
      i. If you made the assessment tasks, when did you usually make them, and for what purpose?
      ii. Any modifications/accommodations for emergent bilinguals?
9. How do you set up and communicate the unit or lesson learning targets with students?
10. Do you use any criteria when grading/evaluating student assessment work?
   a. Any modifications or other considerations when evaluating emergent bilinguals’ work?
11.
What do you usually do to ensure the alignment of learning targets, assessment tasks, and curriculum criteria/standards?
12. How do you select and use each type of assessment task?
13. What types of feedback do you provide to students?
   a. Is there any difference in the feedback provided to emergent bilinguals and non-emergent bilinguals?
14. What instructional adjustments/modifications, if any, do you make based on the data of student learning and feedback?
15. What kinds of challenges have you encountered, or do you perceive, during formative assessment in a linguistically and culturally diverse classroom?

APPENDIX C: Interview Protocol with Students

Selected Sample Questions:
1. How do you feel when Mrs. G asks you to write in math? Is there anything you find challenging about the mathematics tasks?
2. Is there anything you would like Mrs. G to change about the mathematics tasks to make them easier to understand?
3. What types of feedback (e.g., oral vs. written feedback) do you find most useful?
4. What kind of help do you expect Mrs. G to provide in your math learning?

APPENDIX D: A Spreadsheet to Show How Well Mrs. G Enacted FA Based on the FARROP Rubric

Figure 8.1. A Spreadsheet to Show How Well Mrs.
G Enacted FA Based on the FARROP Rubric

Ratings: NO = Not observed; B = Beginning; D = Developing; P = Progressing; E = Extending; a range (e.g., D-P) = between the two levels.

Dimensions of the FARROP                               | Feb. 13 | Feb. 14 | Feb. 19 | Feb. 20 | Feb. 22 | Feb. 26
Learning goals (including success criteria)            | B       | NO      | B       | D-P     | P       | NO
Tasks and activities to elicit evidence of learning    | E       | E       | E       | E       | E       | P-E
Questioning strategies to elicit evidence of learning  | P       | P       | P       | P       | D-P     | D
Extending thinking during discourse                    | D-P     | P       | P-E     | P       | D       | D
Descriptive feedback                                   | NO      | NO      | NO      | NO      | NO      | NO
Peer feedback                                          | NO      | NO      | NO      | NO      | NO      | NO
Self-assessment                                        | NO      | NO      | NO      | NO      | NO      | NO
Collaborative culture of learning                      | P-E     | P-E     | P       | P       | P       | P
Use of evidence to inform instruction                  | P-E     | D-P     | P       | D-P     | D       | D

Figure 8.
2. (cont’d)

Ratings abbreviated as above: NO = Not observed; D = Developing; P = Progressing; E = Extending; a range (e.g., D-P) = between the two levels.

Dimensions of the FARROP                               | March 4 | March 5 | March 6 | March 7 | March 8
Learning goals (including success criteria)            | P       | D       | P       | E       | D
Tasks and activities to elicit evidence of learning    | P       | P       | E       | E       | D-P
Questioning strategies to elicit evidence of learning  | P       | D-P     | D-P     | P-E     | P-E
Extending thinking during discourse                    | P       | D       | D       | P       | P-E
Descriptive feedback                                   | NO      | NO      | NO      | NO      | NO
Peer feedback                                          | NO      | NO      | NO      | NO      | NO
Self-assessment                                        | NO      | NO      | NO      | NO      | NO
Collaborative culture of learning                      | P       | P       | D-P     | P       | P
Use of evidence to inform instruction                  | P       | P       | D-P     | P       | P

BIBLIOGRAPHY

Abedi, J. (2010). Research and recommendations for formative assessment with English language learners. In H. L. Andrade & G. J. Cizek (Eds.), Handbook of formative assessment (pp. 181-197). New York, NY: Routledge.
Abedi, J., Hofstetter, C. H., & Lord, C. (2004). Assessment accommodations for English language learners: Implications for policy-based empirical research. Review of Educational Research, 74(1), 1-28.
Abedi, J., & Levine, H. G. (2013). Fairness in assessment of English learners. Leadership, 42(3), 26.
Abedi, J., Lord, C., Hofstetter, C., & Baker, E. (2000). Impact of accommodation strategies on English language learners' test performance. Educational Measurement: Issues and Practice, 19(3), 16-26.
Abedi, J., & Lord, C. (2001). The language factor in mathematics tests. Applied Measurement in Education, 14(3), 219-234.
Abell, S. K., & Siegel, M. A. (2011). Assessment literacy: What science teachers need to know and be able to do. In D. Corrigan, J. Dillon, & R. Gunstone (Eds.), The professional knowledge base of science teaching (pp. 205-221). Dordrecht, The Netherlands: Springer.
Alvarez, L., Ananda, S., Walqui, A., Sato, E., & Rabinowitz, S. (2014).
Focusing formative assessment on the needs of English language learners. San Francisco, CA: WestEd. Retrieved from https://www.wested.org/wp-content/uploads/2016/11/1391626953FormativeAssessment_report5-3.pdf
Anderson, D. S., & Piazza, J. A. (1996). Changing beliefs: Teaching and learning mathematics in constructivist preservice classrooms. Action in Teacher Education, 18(2), 51-62.
Bailey, A. L., Maher, C. A., & Wilkinson, L. C. (2018). Language, literacy, and learning in the STEM disciplines. In A. L. Bailey, C. A. Maher, & L. C. Wilkinson (Eds.), Language, literacy, and learning in the STEM disciplines: How language counts for English learners (pp. 1-10). New York, NY: Routledge.
Bartell, T., Wager, A., Edwards, A., Battey, D., Foote, M., & Spencer, J. (2017). Toward a framework for research linking equitable teaching with the standards for mathematical practice. Journal for Research in Mathematics Education, 48(1), 7-21.
Bell, B., & Cowie, B. (2001). The characteristics of formative assessment in science education. Science Education, 85(5), 536-553.
Bennett, R. E. (2011). Formative assessment: A critical review. Assessment in Education: Principles, Policy & Practice, 18(1), 5-25.
Beswick, K. (2005). The beliefs/practice connection in broadly defined contexts. Mathematics Education Research Journal, 17(2), 39-68.
Black, P., & Wiliam, D. (1998a). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7-74.
Black, P., & Wiliam, D. (1998b). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80(2), 139-148.
Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability (formerly: Journal of Personnel Evaluation in Education), 21(1), 5.
Box, C., Skoog, G., & Dabbs, J. M. (2015). A case study of teacher personal practice assessment theories and complexities of implementing formative assessment.
American Educational Research Journal, 52(5), 956-983.
Bunch, G. C., Walqui, A., & Pearson, P. D. (2014). Complex text and new common standards in the United States: Pedagogical implications for English learners. TESOL Quarterly, 48(3), 533-559.
Cardimona, K. (2018). Differentiating mathematics instruction for secondary-level English language learners in the mainstream classroom. TESOL Journal, 9(1), 17-57.
Chappuis, S., Chappuis, J., & Stiggins, R. (2009). Keys to quality. Quest, 67(3), 14-19.
Charmaz, K. (2006). Constructing grounded theory: A practical guide through qualitative analysis. London, UK: SAGE Publications Ltd.
Clark-Gareca, B. (2016). Classroom assessment and English language learners: Teachers' accommodations implementation on routine math and science tests. Teaching and Teacher Education, 54, 139-148.
Council of Chief State School Officers (CCSSO) (2017). Revising the definition of formative assessment. Washington, D.C.: Council of Chief State School Officers. Retrieved from https://ccsso.org/sites/default/files/2018-06/Revising%20the%20Definition%20of%20Formative%20Assessment.pdf
Cross, D. I. (2009). Alignment, cohesion, and change: Examining mathematics teachers' belief structures and their influence on instructional practices. Journal of Mathematics Teacher Education, 12(5), 325-346.
Crossouard, B. (2009). A sociocultural reflection on formative assessment and collaborative challenges in the states of Jersey. Research Papers in Education, 24(1), 77-93.
Darling-Hammond, L. (2006). Constructing 21st-century teacher education. Journal of Teacher Education, 57(3), 300-314.
De Jong, E. J., & Harper, C. A. (2005). Preparing mainstream teachers for English-language learners: Is being a good teacher good enough? Teacher Education Quarterly, 32(2), 101-124.
Dorn, S. (2010). The political dilemmas of formative assessment. Exceptional Children, 76(3), 325-337.
Fillmore, L. W., & Snow, C. E. (2000).
What teachers need to know about language (Opinion paper). Washington, DC: ERIC Clearinghouse on Languages and Linguistics. Retrieved from https://files.eric.ed.gov/fulltext/ED444379.pdf
Flick, U., Garms-Homolova, V., Herrmann, W. J., Kuck, J., & Röhnsch, G. (2012). "I can't prescribe something just because someone asks for it": Using mixed methods in the framework of triangulation. Journal of Mixed Methods Research, 6(2), 97-110.
Flores, G. S. (2016). Assessing English language learners: Theory and practice. London, UK: Routledge.
Galman, S. C. (2016). The good, the bad, and the data: Shane the lone ethnographer's basic guide to qualitative data analysis. New York, NY: Routledge.
García, O., Kleifgen, J. A., & Falchi, L. (2008). From English language learners to emergent bilinguals (Equity Matters: Research Review No. 1). New York, NY: Teachers College, Columbia University.
Gearhart, M., Nagashima, S., Pfotenhauer, J., Clark, S., Schwab, C., Vendlinski, T., ... & Bernbaum, D. J. (2006). Developing expertise with classroom assessment in K-12 science: Learning to interpret student work. Interim findings from a 2-year study. Educational Assessment, 11(3-4), 237-263.
Genesee, F., Lindholm-Leary, K., Saunders, W., & Christian, D. (2005). English language learners in US schools: An overview of research findings. Journal of Education for Students Placed at Risk, 10(4), 363-385.
Gibbons, P. (2006). Bridging discourses in the ESL classroom: Students, teachers and researchers. New York, NY: Continuum.
Gibbons, P. (2015). Scaffolding language, scaffolding learning. Portsmouth, NH: Heinemann.
Gipps, C. (1999). Chapter 10: Socio-cultural aspects of assessment. Review of Research in Education, 24(1), 355-392.
Gotwals, A. W., & Birmingham, D. (2016). Eliciting, identifying, interpreting, and responding to students' ideas: Teacher candidates' growth in formative assessment practices. Research in Science Education, 46(3), 365-388.
Gotwals, A., & Ezzo, D. (2018).
Formative assessment: Science and language with English language learners. In A. L. Bailey, C. A. Maher, & L. C. Wilkinson (Eds.), Language, literacy, and learning in the STEM disciplines: How language counts for English learners (pp. 169-186). New York, NY: Routledge.
Goldenberg, C. (2008). Teaching English language learners: What the research does — and does not — say. American Educator. Retrieved from http://www.aft.org/pdfs/americaneducator/summer2008/goldenberg.pdf
Guadu, Z. B., & Boersma, E. J. (2018). EFL instructors' beliefs and practices of formative assessment in teaching writing. Journal of Language Teaching and Research, 9(1), 42-50.
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112.
Harper, M., & Cole, P. (2012). Member checking: Can benefits be gained similar to group therapy? The Qualitative Report, 17(2), 510-517.
Heritage, M. (2007). Formative assessment: What do teachers need to know and do? Phi Delta Kappan, 89(2), 140-145.
Heritage, M., Kim, J., Vendlinski, T., & Herman, J. (2009). From evidence to action: A seamless process in formative assessment? Educational Measurement: Issues and Practice, 28(3), 24-31.
Herbel-Eisenmann, B. A., & Breyfogle, M. L. (2005). Questioning our patterns of questioning. Mathematics Teaching in the Middle School, 10(9), 484-489.
Herman, J., Osmundson, E., Dai, Y., Ringstaff, C., & Timms, M. (2015). Investigating the dynamics of formative assessment: Relationships between teacher knowledge, assessment practice and learning. Assessment in Education: Principles, Policy & Practice, 22(3), 344-367.
Kennedy, M. M. (2008). Sorting out teacher quality. Phi Delta Kappan, 90(1), 59-63.
Kingston, N., & Nash, B. (2011). Formative assessment: A meta-analysis and a call for research. Educational Measurement: Issues and Practice, 30(4), 28-37.
Klute, M., Apthorp, H., Harlacher, J., & Reale, M. (2017).
Formative assessment and elementary school student academic achievement: A review of the evidence (REL 2017-259). National Center for Education Evaluation and Regional Assistance. Retrieved from https://files.eric.ed.gov/fulltext/ED572929.pdf
Kersaint, G., Thompson, D. R., & Petkova, M. (2013). Teaching mathematics to English language learners. New York, NY: Routledge.
Konrad, M., Keesey, S., Ressa, V. A., Alexeeff, M., Chan, P. E., & Peters, M. T. (2014). Setting clear learning targets to guide instruction for all students. Intervention in School and Clinic, 50(2), 76-85.
Kopriva, R. J., Emick, J. E., Hipolito-Delgado, C. P., & Cameron, C. A. (2007). Do proper accommodation assignments make a difference? Examining the impact of improved decision making on scores for English language learners. Educational Measurement: Issues and Practice, 26(3), 11-20.
Houghton Mifflin Harcourt (2021). Math Expressions: Build deep understanding with this essential inquiry-based mathematics curriculum. Retrieved from https://www.hmhco.com/programs/math-expressions
Leahy, S., Lyon, C., Thompson, M., & Wiliam, D. (2005). Classroom assessment: Minute by minute, day by day. Educational Leadership, 63(3), 18-24.
Lynch, R., McNamara, P. M., & Seery, N. (2012). Promoting deep learning in a teacher education programme through self- and peer-assessment and feedback. European Journal of Teacher Education, 35(2), 179-197.
Lyon, E. G., Bunch, G. C., & Shaw, J. M. (2012). Navigating the language demands of an inquiry-based science performance assessment: Classroom challenges and opportunities for English learners. Science Education, 96(4), 631-651.
Lyon, E. G. (2013b). Learning to assess science in linguistically diverse classrooms: Tracking growth in secondary science preservice teachers' assessment expertise. Science Education, 97(3), 442-467.
Lyon, E. G. (2013a). Conceptualizing and exemplifying science teachers' assessment expertise.
International Journal of Science Education, 35(7), 1208-1229. Lucas, T., Villegas, A. M., & Freedson-Gonzalez, M. (2008). Linguistically responsive teacher education: Preparing classroom teachers to teach English language learners. Journal of Teacher Education, 59(4), 361-373. Lucas, T., & Villegas, A. M. (2013). Preparing linguistically responsive teachers: Laying the foundation in preservice teacher education. Theory Into Practice, 52(2), 98-109. Lucas, T., de Oliveira, L. C., & Villegas, A. M. (2014). Preparing linguistically responsive teachers in multilingual contexts. In A. Mahboob & L. Barratt (Eds.), Englishes in multilingual contexts (pp. 219-230). Dordrecht, Netherlands: Springer. Marzano, R. J. (2013). Targets, objectives, standards: How do they fit? Educational Leadership, 70, 82-83. Mathison, S. (1988). Why triangulate? Educational Researcher, 17(2), 13-17. Martiniello, M. (2009). Linguistic complexity, schematic representations, and differential item functioning for English language learners in math tests. Educational Assessment, 14(3-4), 160-179. Martiniello, M. (2008). Language and the performance of English-language learners in math word problems. Harvard Educational Review, 78(2), 333-368. Martiniello, M., & Wolf, M. K. (2012). Cases of practices: Assessing ELLs in mathematics. In S. Celedón-Pattichis & N. G. Ramirez (Eds.), Beyond good teaching: Advancing mathematics education for ELLs (pp. 139-162). Reston, VA: National Council of Teachers of Mathematics. Miller, M. D., Linn, R. L., & Gronlund, N. E. (2009). Instructional goals and objectives: Foundation for assessment. In Measurement and assessment in teaching (10th ed., pp. 47-69). Upper Saddle River, NJ: Pearson Education. Moss, P. A. (2008). Sociocultural implications for the practice of assessment I: Classroom assessment. In P. A. Moss, D. Pullin, J. P. Gee, E. H. Haertel, & L. J. Young (Eds.), Assessment, equity, and opportunity to learn (pp.
222-258). New York, NY: Cambridge University Press. Moschkovich, J. N. (2007). Beyond words to mathematical content: Assessing English learners in the mathematics classroom. In A. Schoenfeld (Ed.), Assessing mathematical proficiency (pp. 345-352). New York, NY: Cambridge University Press. Moschkovich, J. (2013). Principles and guidelines for equitable mathematics teaching practices and materials for English language learners. Journal of Urban Mathematics Education, 6(1), 45-57. Moschkovich, J. (2018). Talking to learn mathematics with understanding: Supporting academic literacy in mathematics for English learners. In A. L. Bailey, C. A. Maher, & L. C. Wilkinson (Eds.), Language, literacy, and learning in the STEM disciplines: How language counts for English learners (pp. 13-34). New York, NY: Routledge. National Center for Education Statistics (2020, May). English language learners in public schools. Retrieved from https://nces.ed.gov/programs/coe/indicator_cgf.asp National Council of Teachers of Mathematics (2014). Principles to actions: Ensuring mathematical success for all. Reston, VA: National Council of Teachers of Mathematics. National Council of Teachers of Mathematics (2013). Teaching mathematics to English language learners (Position statement). Retrieved from http://www.nctm.org/ELLMathematics/ National Board for Professional Teaching Standards (2016). What teachers should know and be able to do. Retrieved from http://accomplishedteacher.org/ Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218. Panadero, E., Jonsson, A., & Strijbos, J. W. (2016). Scaffolding self-regulated learning through self-assessment and peer assessment: Guidelines for classroom implementation. In D. Laveault & L. Allal (Eds.), Assessment for learning: Meeting the challenge of implementation (pp. 311-326). Dordrecht, Netherlands: Springer. Pajares, M.
F. (1992). Teachers’ beliefs and educational research: Cleaning up a messy construct. Review of Educational Research, 62(3), 307-332. Philhower, J. (2018). Investigating high school mathematics teachers’ formative assessment practices (Doctoral dissertation). Retrieved from ProQuest (Accession No. 10815272). Popham, W. J. (2010). Classroom assessment: What teachers need to know (6th ed.). Boston, MA: Pearson. Prediger, S. (2019). Investigating and promoting teachers’ expertise for language-responsive mathematics teaching. Mathematics Education Research Journal, 31(4), 367-392. Ruiz-Primo, M. A., & Furtak, E. M. (2007). Exploring teachers’ informal formative assessment practices and students’ understanding in the context of scientific inquiry. Journal of Research in Science Teaching, 44(1), 57-84. Ruiz-Primo, M. A. (2011). Informal formative assessment: The role of instructional dialogues in assessing students’ learning. Studies in Educational Evaluation, 37(1), 15-24. Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119-144. Sardareh, S. A., & Saad, M. R. M. (2012). A sociocultural perspective on assessment for learning: The case of a Malaysian primary school ESL context. Procedia-Social and Behavioral Sciences, 66, 343-353. Sass-Henke, A. M. (2013). Living and learning: Formative assessment in a middle level classroom. Voices from the Middle, 21(2), 43. Seawright, J., & Gerring, J. (2008). Case selection techniques in case study research: A menu of qualitative and quantitative options. Political Research Quarterly, 61(2), 294-308. Schleppegrell, M. (2012). Linguistic tools for exploring issues of equity. In B. Herbel-Eisenmann, J. Choppin, D. Wagner, & D. Pimm (Eds.), Equity in discourse for mathematics education (pp. 109-124). Dordrecht, Netherlands: Springer. Shepard, L. A. (2000). The role of assessment in a learning culture. Educational Researcher, 29(7), 4-14. Sherin, M., & van Es, E.
(2005). Using video to support teachers’ ability to notice classroom interactions. Journal of Technology and Teacher Education, 13(3), 475-491. Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4-14. Shulman, L. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57(1), 1-23. Schleppegrell, M. J. (2007). The linguistic challenges of mathematics teaching and learning: A research review. Reading & Writing Quarterly, 23(2), 139-159. Shield, M. (1999). The conflict between teachers’ beliefs and classroom practices. In J. M. Truran & K. M. Truran (Eds.), Making the difference: Proceedings of the 22nd Annual Conference of the Mathematics Education Research Group of Australasia (pp. 439-445). Adelaide: MERGA. Siegel, M. A. (2007). Striving for equitable classroom assessments for linguistic minorities: Strategies for and effects of revising life science items. Journal of Research in Science Teaching, 44(6), 864-881. Siegel, M. A., Wissehr, C., & Halverson, K. (2008). Sounds like success: A framework for equitable assessment. The Science Teacher, 75(3), 43. Siegel, M. A., & Wissehr, C. (2011). Preparing for the plunge: Preservice teachers’ assessment literacy. Journal of Science Teacher Education, 22(4), 371-391. Silver, E., & Smith, M. S. (2015). Integrating powerful practices: Formative assessment and cognitively demanding mathematics tasks. In C. Suurtamm & NCTM (Eds.), Assessment to enhance teaching and learning (pp. 5-12). Reston, VA: National Council of Teachers of Mathematics. Smith, M. E., Teemant, A., & Pinnegar, S. (2004). Principles and practices of sociocultural assessment: Foundations for effective strategies for linguistically diverse classrooms. Multicultural Perspectives, 6(2), 38-46. Stanford, B., & Reeves, S. (2009). Making it happen: Using differentiated instruction, retrofit framework, and universal design for learning.
Teaching Exceptional Children Plus, 5(6), n6. Stiggins, R. J. (2002). Assessment crisis: The absence of assessment for learning. Phi Delta Kappan, 83(10), 758-765. Stipek, D. J., Givvin, K. B., Salmon, J. M., & MacGyvers, V. L. (2001). Teachers’ beliefs and practices related to mathematics instruction. Teaching and Teacher Education, 17(2), 213-226. Strauss, A., & Corbin, J. (1998). Basics of qualitative research. Thousand Oaks, CA: Sage Publications. Sugarman, J., & Geary, C. (2018). English learners in Michigan: Demographics, outcomes, and state accountability policies. Migration Policy Institute. Retrieved from https://www.migrationpolicy.org/sites/default/files/publications/EL-factsheet2018-Michigan_Final.pdf Symons, C. (2021). Instructional practices for scaffolding emergent bilinguals’ comprehension of informational science texts. Pedagogies: An International Journal, 16(1), 62-80. Taras, M. (2005). Assessment – summative and formative – some theoretical reflections. British Journal of Educational Studies, 53(4), 466-478. Thompson, A. G. (1984). The relationship of teachers’ conceptions of mathematics and mathematics teaching to instructional practice. Educational Studies in Mathematics, 15(2), 105-127. Torrance, H. (2012). Formative assessment at the crossroads: Conformative, deformative and transformative assessment. Oxford Review of Education, 38(3), 323-342. Veon, K. (2016). A case study of teachers’ practices using formative assessment for fifth grade mathematics students (Doctoral dissertation). Retrieved from ProQuest (Accession No. 10032423). Villegas, A. M., & Lucas, T. (2002). Preparing culturally responsive teachers: Rethinking the curriculum. Journal of Teacher Education, 53(1), 20-32. Volante, L., & Beckett, D. (2011). Formative assessment and the contemporary classroom: Synergies and tensions between research and practice. Canadian Journal of Education, 34(2), 239-255. Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes.
Cambridge, MA: Harvard University Press. Walker, A., Shafer, J., & Iiams, M. (2004). “Not in my classroom”: Teacher attitudes towards English language learners in the mainstream classroom. NABE Journal of Research and Practice, 2(1), 130-160. Walqui, A., & Van Lier, L. (2010). Scaffolding the academic success of adolescent English language learners: A pedagogy of promise. San Francisco, CA: WestEd. Weurlander, M., Söderberg, M., Scheja, M., Hult, H., & Wernerson, A. (2012). Exploring formative assessment as a tool for learning: Students’ experiences of different methods of formative assessment. Assessment & Evaluation in Higher Education, 37(6), 747-760. World-Class Instructional Design and Assessment (2012). 2012 amplification of the WIDA English language development standards. Retrieved from https://wida.wisc.edu/resources/2012-amplification-wida-english-language-development-standards Wiggins, G. (2012). Seven keys to effective feedback. Educational Leadership, 70(1), 10-16. Willner, L. S., Rivera, C., & Acosta, B. D. (2009). Ensuring accommodations used in content assessments are responsive to English-language learners. The Reading Teacher, 62(8), 696-698. Wolf, M. K., & Leon, S. (2009). An investigation of the language demands in content assessments for English language learners. Educational Assessment, 14(3-4), 139-159. Wylie, C., & Lyon, C. (2016). Using the formative assessment rubrics, reflection and observation tools to support professional reflection on practice (Revised). Washington, D.C.: Council of Chief State School Officers (CCSSO). Retrieved from https://ccsso.confex.com/ccsso/2017/webprogram/Handout/Session4829/FARROP2016.pdf Yin, R. K. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage Publications. Yoon, B. (2008). Uninvited guests: The influence of teachers’ roles and pedagogies on the positioning of English language learners in the regular classroom. American Educational Research Journal, 45(2), 495-522.
Zainal, Z. (2007). Case study as a research method. Jurnal Kemanusiaan, (9), 1-6.