TEACHERS’ USE OF SCAFFOLDING DURING COGNITIVELY DEMANDING TASKS By Kathryn Soo Kyeong Appenzeller A DISSERTATION Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Mathematics Education–Doctor of Philosophy 2019 TEACHERS’ USE OF SCAFFOLDING DURING COGNITIVELY DEMANDING TASKS ABSTRACT By Kathryn Soo Kyeong Appenzeller Researchers argue that scaffolding students’ mathematical thinking is an effective teaching practice (Calder, 2015; Mercer & Littleton, 2007; Roll, Holmes, Day, & Bonn, 2012) that supports students when they need additional assistance to complete a task (Wood, Bruner, & Ross, 1976). Others demonstrate the importance of enacting cognitively demanding tasks because they help build students’ capacity for mathematical reasoning (Stein, Grover, & Henningsen, 1996) and are critical to students’ achievement and learning (Charalambous, 2008; Hiebert & Wearne, 1993; Stein & Lane, 1996). Recent literature identifies a connection-and at times a tension-between scaffolding and cognitive demand (Sullivan & Mornane, 2014). For this dissertation, I conducted an empirical study that included classroom observations and semi-structured teacher interviews to address a gap in research by asking the following research questions within the context of cognitively demanding tasks: 1) How do teachers think about the relationship between scaffolding and maintaining cognitive demand?, 2) In what ways do teachers provide scaffolding through discourse? and 3) What are students’ experiences and in what ways do teachers respond to students’ experiences? I analyzed teachers’ discourse, specifically teacher questions and prompts, during the teachers’ enactment of the Connected Mathematics Project (CMP) curriculum and teacher interviews. In this work, teachers voiced their struggle of providing appropriate scaffolding without reducing the cognitive demand of tasks. I used curriculum (lesson phases and focus questions), student uncertainty, and student frustration as contexts to illustrate how the participating teachers mediated tension between scaffolding and cognitive demand. Drawing from teachers’ strategies, largely teacher discourse, I developed a teacher question framework that described the nature of unfolding teacher questions that encouraged student exploration and conceptual learning. Instead of fading, teacher scaffolding changed to reflect the contexts mentioned above while maintaining cognitive demand. My findings provide insights for professional development on enacting cognitively demanding tasks. Copyright by KATHRYN SOO KYEONG APPENZELLER 2019 TABLE OF CONTENTS LIST OF TABLES .................................................................................................................. viii LIST OF FIGURES .................................................................................................................. ix CHAPTER 1 INTRODUCTION AND CONCEPTUAL FRAMEWORK .............................................. 1 Working Definitions .................................................................................................................. 3 Scaffolding ............................................................................................................................ 3 Cognitive demand ................................................................................................................. 5 Potential relationship between scaffolding and cognitive demand ................................ 
10 Discourse ............................................................................................................................. 10 Conceptual Framework .......................................................................................................... 12 CHAPTER 2 LITERATURE REVIEW ......................................................................................... 16 Scaffolding Related to Written Tasks ..................................................................................... 17 Written tasks as independent ............................................................................................. 17 Written tasks as dependent ................................................................................................ 19 Scaffolding in written tasks ................................................................................................. 21 Scaffolding During the Enactment Phase ............................................................................... 23 The importance of the enactment phase ........................................................................... 23 Teacher behavior during enactment that supports scaffolding .......................................... 24 Cognitively demanding discussion as a part of enacting cognitively demanding tasks ...... 26 Vagueness of Scaffolding Within Cognitive Demand Research .............................................. 27 CHAPTER 3 CONTEXT OF STUDY AND OVERVIEW OF METHODS ........................................... 29 Context of the Study ............................................................................................................... 29 Curriculum .......................................................................................................................... 32 Participant selection ........................................................................................................... 34 Data sources ....................................................................................................................... 35 Data Analysis .......................................................................................................................... 46 Identifying teacher questions ............................................................................................. 47 Codes for teacher questions ............................................................................................... 54 Coding teacher questions ................................................................................................... 65 Inter-rater reliability of teacher questions coding .............................................................. 67 Identifying teacher ambiguous responses to student uncertainty ..................................... 69 Identifying salient student episodes ................................................................................... 72 Note About Findings Chapters ................................................................................................ 73 CHAPTER 4 TEACHERS PROVIDED SCAFFOLDING BY ASKING QUESTIONS ............................. 76 Claim 1: Trends Identified in How Teacher Questions Unfolded ............................................ 76 Written curriculum context clues influenced teachers’ questioning trends ....................... 83 v Launch phase ................................................................................................................... 
84 Explore phase .................................................................................................................. 84 Summarize phase ............................................................................................................ 85 The nature of unfolding Launch, Explore, and Summarize suggested questions ............ 85 Focus question ................................................................................................................ 86 Teacher questions during enacted curriculum ................................................................... 86 Connections between curriculum and teacher use ......................................................... 87 Connections between teacher questions and discourse moves ..................................... 87 Claim 2: Teachers Posed More Questions than Statements .................................................. 88 Teachers encouraged student discussions .......................................................................... 90 Teachers emphasized mathematical connections .............................................................. 92 Teachers balanced scaffolding with time to explore .......................................................... 93 Discussion of Teacher Question Codes and Frequencies ....................................................... 95 Teachers responded to student needs ............................................................................... 95 Teachers used mathematical conversations to support student learning .......................... 96 Teachers grappled with balancing scaffolding and exploration .......................................... 96 Teacher questions mediated tension between scaffolding and cognitive demand ........... 97 Claim 3: The Types of Questions Were Unevenly Distributed Across Phases ........................ 98 Time constraints influenced teachers’ decisions to lead .................................................... 98 Focus question and emphasis on prediction influenced teacher scaffolds ....................... 100 Curriculum Content and In-the-Moment Decision Making Contributed to an Uneven Distribution of Teacher Questions ............................................................................................ 101 Teacher Questions as Discourse Moves Mediated Tension Between Scaffolding and Cognitive Demand ..................................................................................................................................... 102 CHAPTER 5 TEACHER AMBIGUOUS RESPONSES TO STUDENT UNCERTAINTY AS A POTENTIAL FORM OF SCAFFOLDING .................................................................................................... 106 Resolution Type 1: Student Uncertainty Resolved by Students ............................................ 106 Teachers Supported Students to Resolve Uncertainty About Pre- and Co-Requisite Skills ... 108 Resolution Type 2: Teachers’ Increased Scaffolding Facilitated Students to Resolve Student Uncertainty ............................................................................................................................... 109 Increased Teacher Scaffolding Occurred on a Continuum .................................................... 113 Resolution Type 3: Student Uncertainty Not Resolved ......................................................... 114 Unresolved Student Uncertainty Provided Opportunities for Student Conjectures ............. 
118 Curricular Considerations for Uncertainty During Enacted Phases of a Lesson .................... 120 Launch phase ..................................................................................................................... 120 Explore phase ..................................................................................................................... 120 Summarize phase ............................................................................................................... 121 Teachers Used Discourse to Mediate Tension Within the Context of Student Uncertainty . 121 CHAPTER 6 SALIENT STUDENT EXPERIENCES AND TEACHER RESPONSES ............................ 123 Salient Episode 1 with Madison ............................................................................................. 124 Salient Episode 2 with Tate ................................................................................................... 127 No black chips on Tate’s chip board .................................................................................. 127 vi Adding black chips without changing the value of Tate’s chip board ................................ 128 Group member contributed an explanation ...................................................................... 129 Tate objected to six red chips on his chip board ................................................................ 129 Relationship between number of chips and the value of Tate’s chip board ..................... 130 Tate wavered in confidence ............................................................................................... 131 Tate followed Sophie’s increased scaffolding .................................................................... 132 Student Resistance to Engaging with a Cognitively Demanding Task ................................... 134 Salient Episode 3 with Leonore ............................................................................................. 135 Salient Episode 4 with Evelyn ................................................................................................ 136 The Role of Emotions in Students’ Experiences and Teachers’ Responses ........................... 139 Teachers’ Need to Mediate Tension Within the Context of Student Frustration was Necessary for Student Perseverance .......................................................................................................... 140 CHAPTER 7 DISCUSSION AND IMPLICATIONS OF SCAFFOLDING AND COGNITIVE DEMAND RELATIONSHIP ................................................................................................................... 143 Teachers’ Perceived and Experienced Tension Influenced Scaffolding Choices .................... 143 Teachers asked questions with an obvious answer ........................................................... 143 Teachers led students through or eliminated portions of a written task .......................... 144 Teachers Thought of Questions as a Bridge Between Scaffolding and Cognitive Demand ... 144 Teacher question framework mediated tension ............................................................... 145 Role of context in how teachers mediated tension between scaffolding and cognitive demand ..................................................................................................................................... 145 Instead of Fading, Varied Contexts Illustrated Evolving Scaffolds ......................................... 
148 Teacher and Student Voices Detailed Salient Experiences Pertinent to Enacting Cognitively Demanding Tasks and Professional Development .................................................................... 149 APPENDICES ...................................................................................................................... 152 APPENDIX A Task Analysis Guide ........................................................................................... 153 APPENDIX B Teacher Consent Form for Research Participant ............................................... 154 APPENDIX C Parent & Student Information and Participant Form for Research ................... 156 APPENDIX D Classroom Observation Form ........................................................................... 158 APPENDIX E Pre-Observation Semi-Structured Interview Protocol ....................................... 159 APPENDIX F Post-Observation Semi-Structured Interview Protocol ..................................... 168 APPENDIX G Teacher Questions Spreadsheet ....................................................................... 170 APPENDIX H Explore Phase Percentages for Every Lesson .................................................... 171 APPENDIX I Student Uncertainty Resolution Type 3 Example Complete Transcript ............. 172 APPENDIX J Salient Episode 1 Complete Transcript .............................................................. 173 REFERENCES ...................................................................................................................... 175 vii LIST OF TABLES Table 1 General Classroom Observation Information for Each Teacher .................................... 29 Table 2 Primary Data Source for Research Questions ................................................................ 36 Table 3 Comparison of Launch to Summarize Within a Code Category ..................................... 77 Table 4 The Number of Questions Asked During a Lesson ........................................................ 89 Table 5 Explore Phase Percentages ............................................................................................ 90 Table 6 Task Analysis Guide ...................................................................................................... 153 Table 7 Explore Phase Percentages for Every Lesson ............................................................... 171 viii LIST OF FIGURES Figure 1 Conceptual Framework ................................................................................................ 15 Figure 2 Multiple Questions Within a Single Task ...................................................................... 18 Figure 3 Goal Task ...................................................................................................................... 20 Figure 4 Interim Task .................................................................................................................. 20 Figure 5 Observed Task Information .......................................................................................... 37 Figure 6 Suggested Teacher Questions from CMP Written Curriculum ..................................... 39 Figure 7 Tailored Questions Related to CMP Unit ...................................................................... 44 Figure 8 Teacher Questions Related to Focus Question ............................................................ 
48 Figure 9 Decontextualized Questions Not Related to the Focus Question ................ 51 Figure 10 Challenges to Transcribe One Teacher Question into a Spreadsheet ........ 53 Figure 11 Teacher Poster of Professional Standards for Teaching Mathematics ....... 54 Figure 12 Initial Teacher Question Codes 1–5 ............ 57 Figure 13 Code 1 Sub-Codes with Teacher and Student Talk ..... 58 Figure 14 Code 2 Sub-Codes with Teacher and Student Talk ..... 60 Figure 15 Code 3 Sub-Codes with Teacher and Student Talk ..... 62 Figure 16 Code 4 Sub-Codes with Teacher and Student Talk ..... 63 Figure 17 Code 5 Sub-Codes with Teacher and Student Talk ..... 64 Figure 18 Final Teacher Question Codes 1–5 ...... 65 Figure 19 Example of Heather’s Teacher Questions Unfolding and Code Patterns Emerging ... 80 Figure 20 Occurrences of When the Most Teacher Questions Were in a Particular Phase ....... 94 Figure 21 Heather Supported Student to Resolve Student Uncertainty ...... 111 Figure 22 Examples of Teacher Talk When Student Uncertainty Was Not Resolved ..... 115 Figure 23 Tate Described His Conundrum of Having No Black Chips on His Chip Board .......... 127 Figure 24 Tate Described How to Add Black Chips Without Changing the Value ..... 128 Figure 25 Group Member Contributed an Explanation ......... 129 Figure 26 Tate’s Objection to Six Red Chips on His Chip Board ..... 130 Figure 27 Tate Showed Frustration and Sophie Highlighted Relationship Between the Number of Chips and the Value of Tate’s Chip Board ....... 130 Figure 28 Tate’s Confidence Wavered ..... 132 Figure 29 Tate Followed Sophie’s Increased Scaffolding ..... 132 Figure 30 Leonore Confused When Graphing Function on Calculator ..... 136 Figure 31 Evelyn Frustrated and Disengaged from the Task ..... 136

CHAPTER 1
INTRODUCTION AND CONCEPTUAL FRAMEWORK

Researchers argue that scaffolding students’ mathematical thinking is an effective teaching practice (Calder, 2015; Mercer & Littleton, 2007; Roll, Holmes, Day, & Bonn, 2012) that supports students when they need additional assistance to complete a task (Wood, Bruner, & Ross, 1976). Others demonstrate the importance of enacting cognitively demanding tasks because they help build students’ capacity for mathematical reasoning (Stein, Grover, & Henningsen, 1996) and are critical to students’ achievement and learning (Charalambous, 2008; Hiebert & Wearne, 1993; Stein & Lane, 1996). How teachers provide scaffolding, however, may conflict with how they enact cognitively demanding tasks.
If both scaffolding and cognitively demanding tasks are positive teaching practices, then how can teachers provide scaffolding while enacting a cognitively demanding task without lowering the cognitive demand of the task? Recent literature identified a connection, and at times a tension, between scaffolding and cognitive demand (Sullivan & Mornane, 2014). In this work, teachers voiced their struggle to provide appropriate scaffolding without reducing the cognitive demand of tasks. In other words, teachers sought to use discourse moves to ask questions, answer requests, or provide prompts without lowering cognitive demand. Researchers noted that providing too much of something (e.g., directions or demonstrations), or asking certain kinds of questions (e.g., questions that focus on learning procedures), may lower the cognitive demand of a given task (Hiebert & Wearne, 1993; Lloyd & Wilson, 1998; Stein, Smith, Henningsen, & Silver, 2000).

Empirical research that focuses on the relationship between scaffolding and cognitive demand is minimal, particularly research that includes teachers’ voices about this tension. For example, what are teachers’ strategies for providing scaffolding that maintains the cognitive demand of a task, and what are students’ experiences of and reactions to that scaffolding? Including teacher and student perspectives on, and interactions during, cognitively demanding tasks may give insight into potential challenges and strategies for mitigating these tensions. Therefore, understanding how teachers provide effective scaffolding while enacting cognitively demanding tasks (and maintaining their cognitive demand) is important for mathematics education. My study seeks to address a gap in research by asking the following research questions within the context of cognitively demanding tasks:

1. How do teachers think about the relationship between scaffolding and maintaining cognitive demand?
2. In what ways do teachers provide scaffolding through discourse?
3. What are students’ reactions and experiences?
   a. In what ways do teachers respond to students’ reactions and experiences?

The remainder of Chapter 1 establishes working definitions to clearly convey what theories I draw from and how I use these terms in my work. Chapter 2 synthesizes research that highlights the ways scaffolding is explicitly and implicitly present in cognitive demand research. My discussion of the nuanced ways in which scaffolding appears in cognitive demand literature supports my argument that the relationship between scaffolding and cognitive demand is left unclear and requires further study. Chapter 3 articulates my method for data collection and analysis. For this study, the organization of the findings and discussion chapters deviates from the conventional format. Chapters 4–6 present and discuss findings related to the varying instructional contexts, including phases of a lesson and students’ experiences, in which teachers address the tension between scaffolding and cognitive demand. Specifically, Chapter 4 pertains to teacher questions as a form of scaffolding (RQ2), Chapter 5 highlights how teachers use uncertainty as a way to scaffold (RQ2), and Chapter 6 describes salient student and teacher experiences and reactions (RQ3). For each findings chapter, discussion of the specific finding immediately follows the data. Note that RQ1 cuts across several of my findings.
Therefore, Chapter 7 zooms out to review how the study adds to the cognitive demand literature through an exploration of teachers’ thinking about the relationship between scaffolding and cognitive demand, as revealed through their discourse move choices and in-the-moment decision making (RQ1). To conclude, Chapter 7 describes implications related to scaffolding and opportunities for further cognitive demand research.

Working Definitions

In order to explicitly address my perspective in this study, I establish working definitions of scaffolding, cognitive demand, and discourse. Ryve (2011) stressed the importance of conceptual accountability by quoting Sfard (2008): conceptual accountability is “being explicit about how we use the keywords and how our uses relate to those of other interlocutors” (p. 42).

Scaffolding. Bakker, Smit, and Wegerif (2015) provided a complex history of scaffolding definitions and uses in educational research, with origins in the work of Vygotsky (1978)1, Bruner (1975a, 1975b), and Wood et al. (1976). Wood et al. (1976) were not the first group of researchers to coin the term “scaffolding”; however, their study was considered the first to include an extensive discussion of the metaphor with empirical data. Scaffolding, as defined by Wood et al. (1976), is “the process that enables a child or novice to solve a problem, carry out a task, or achieve a goal which would be beyond his (sic) unassisted efforts” (p. 90). Wood and colleagues characterized scaffolding as:

an interactive system of exchange in which the tutor operates with an implicit theory of the learner’s acts in order to recruit his attention, reduces degrees of freedom in the task to manageable limits, maintains ‘direction’ in the problem solving, marks critical features, controls frustration and demonstrates solutions when the learner can recognize them. (p. 99)

1 Vygotsky used the phrase “zone of proximal development,” which helps in understanding when and how scaffolding can support learning.

Wood and colleagues’ (1976) definition characterized scaffolding as a process and interactive system of exchanges that help a student successfully complete a task or learning goal. I interpret one example of an interactive exchange as the interactions a teacher and a student or small group of students may engage in during classroom instruction. A student or small group of students who are working to complete a mathematical task may seek help from a teacher when they are confused or unable to proceed. Without the help of a teacher, the student or students may become frustrated or unable to finish the task. Teachers’ scaffolding is the help they offer through questions or statements that enables the student or students to proceed in their mathematical work. For example, a teacher question or statement may prompt students to explain their mathematical work, draw attention to a particular feature of the task, or encourage perseverance with a student strategy.

While the original conceptualization of scaffolding focused on tutors and tutees, the concept has subsequently been applied in many different contexts, with different actors providing scaffolds, including teachers and peers. In this study, I focus on the scaffolds teachers provide. I draw on Edson (2017) and Wood et al. (1976) to define scaffolding as an interaction between individuals (e.g., teacher and students) that enables a student to complete a task.
In other words, scaffolding is an “active and responsive role required to support the complexities of student learning in classrooms” (Edson, 2017, p. 736) by helping students go beyond that which they could do alone.

Cognitive demand. Doyle (1983) considered the curriculum as a collection of academic tasks that students experience as they progress through school. The types of tasks students experience largely impact what the students learn (Hiebert & Wearne, 1993). This early work of Doyle (1983), along with Schoenfeld (1992), influenced Silver and Stein’s (1996) work with The QUASAR Project2. Silver and Stein (1996) studied the development and implementation of reform mathematics instruction in urban middle schools located in economically disadvantaged communities. The reformed mathematics tasks emphasized reasoning, thinking, understanding, and explaining instead of memorization and replication, improving students’ mathematical thinking. Additionally, teaching strategies that involved more student engagement and group activity were implemented. Evidence from district-mandated, standardized test scores and a cognitive assessment instrument supported the alternative model of instruction. These results supported Stein, Grover, and Henningsen’s (1996) claim that “mathematical tasks [are] important vehicles for building student capacity for mathematical thinking and reasoning” (p. 455).

2 For additional information about the QUASAR (Qualitative Understanding: Amplifying Student Achievement and Reasoning) Project, see Silver and Stein (1996).

Work on the QUASAR Project laid the foundation for Stein and colleagues (1996) to develop the Mathematical Tasks Framework (MTF) and, later, the Task Analysis Guide3 (TAG) (Smith & Stein, 1998). Curriculum theorists decades before, and now the MTF, drew attention to the differences between written, enacted, and learned curriculum. Smith and Stein (1998) developed the TAG by paying particular attention to specific differences among written tasks (for the complete TAG, see Appendix A) and then expanded on how a specific written task evolved in the course of a lesson through phases that emphasized the selection of tasks, set-up, and implementation.

3 The Task Analysis Guide is presented in multiple sources that discuss teaching with cognitively demanding tasks (e.g., Smith & Stein, 1998; Stein et al., 2000).

The TAG allows instructional tasks to be analyzed with a focus on the differing levels of mathematical thinking involved. Some tasks may require students to perform memorized skills whereas others have students draw connections to other mathematical concepts to make meaning of the task. Each task provides an opportunity for student learning, and cumulatively, students implicitly develop ideas about the nature of mathematics (Stein et al., 2000). The opportunity for student learning may vary since “all tasks are not created equal” and require different cognitive demands (Stein et al., 2000, p. 3). The TAG differentiates the cognitive demands of mathematical tasks along four levels: (a) memorization; (b) procedures without connections to understanding, meaning, or concepts (hereafter simply referred to as procedures without connections); (c) procedures with connections to understanding, meaning, or concepts (hereafter simply referred to as procedures with connections); and (d) doing mathematics.
The four levels are more generally divided into lower-level cognitive demand (memorization and procedures without connections) and higher-level cognitive demand (procedures with connections and doing mathematics).

Tasks classified as memorization could include repeating a fact or definition, or involve reproducing a previously learned procedure that requires a fact, rule, formula, or definition. In these tasks, students are required to commit information to memory for later use. These tasks have no connection to concepts or meaning and typically take a short time to complete because they are not ambiguous or multistep. Math facts (e.g., addition, subtraction, multiplication, and division facts with integers) or formulas (e.g., the Quadratic Formula) are examples of tasks considered to have a cognitive demand of memorization. Students typically memorize basic math facts to enable quick recall and eliminate the need to complete a process to find an answer. Tasks built around formulas, like the Quadratic Formula, take students a short amount of time to complete because the formulas are established and not ambiguous.

The other lower-level cognitive demand category is procedures without connections, where students focus on a learned procedure or algorithm. Students still are not required to make connections to the concepts or meanings that underlie the procedure. Students focus on producing a correct answer that requires no explanation beyond describing the procedure they used. For example, prior to memorizing integer addition, subtraction, multiplication, and division facts, students may engage with a procedure or algorithm to find the solution. Algorithms provide a set of directions students may follow to find a solution. In tasks with a cognitive demand of procedures without connections, students may follow a set of directions without providing explanations about how they arrived at the solution (beyond the algorithm) or how they know the solution is correct.

In contrast, both kinds of higher-level cognitive demand tasks require additional student thinking, with solution paths connected to underlying mathematical concepts. For procedures with connections tasks, students’ attention is on the procedures, but the purpose is linked to the underlying mathematical concepts. Multiple representations, such as diagrams, manipulatives, symbols, and contextualized problem situations, aid in making mathematical connections and developing meaning for underlying concepts. The tasks suggest solution paths, either implicitly or explicitly, but those paths are more general and broader than an algorithm. Although direction is given toward a suggested pathway, students cannot mindlessly follow a procedure and must interact with connected mathematical concepts. For example, prior to knowing an algorithm for integer addition and subtraction, students engage in a task with a cognitive demand of procedures with connections when they explore how to visually represent symbolic expressions (e.g., with a chip board model) using manipulatives (e.g., black and red chips) to arrive at a solution. During the task, students use manipulatives to make connections between multiple representations (e.g., symbols and the chip board model) and develop an understanding about adding and subtracting integers (without an algorithm).

Lastly, the most cognitively demanding tasks (doing mathematics) are complex, non-algorithmic, and require students to explore mathematical concepts, processes, and procedures.
There are no clear or predictable approaches explicitly or implicitly suggested, so students must actively self-monitor their own thinking. Students must access relevant mathematical concepts and use them appropriately throughout the task. Since these higher-level tasks are considerably more cognitively demanding, some anxiety is anticipated for the students. Consider the following task context, in which the Super Brains have a score of -500 points during a game of Math Fever4 and are determining how many points to wager on the final question. If they answer correctly, the wagered points are added to their current score. If they answer incorrectly, the wagered points are subtracted from their current score.

How can they predict whether the sum of two integers (their current score and their potential last answer score) is 0, positive, or negative? Given any possible score, how could a team predict whether the sum of two integers is 0, positive, or negative? (Lappan, Phillips, Fey, & Friel, 2014, pp. 124–125)

The task does not implicitly or explicitly provide directions about how to proceed. Students must access related information about subtracting integers and further explore subtraction of integers to determine relevant patterns and arrive at an answer to the question. Multiple solution strategies exist since students may use number lines, chip board models, or symbols to explore the context and mathematics of the task. With multiple solution strategies, no clear directions, and the “unpredictable nature of the solution process” (Smith & Stein, 1998, p. 348), students may need to persevere through anxiety to complete the task.

For the purpose of this paper, cognitive demand refers to the types of mathematical thinking students employ during a task. More specifically, cognitively demanding tasks are tasks where students engage with underlying mathematical concepts and cannot mindlessly follow a previously seen procedure.

4 Math Fever is a game students can play in the Connected Mathematics Project 7th grade unit, Accentuate the Negative, to introduce how numbers (i.e., group scores) can be either positive or negative. Students answer mathematics content questions to either earn or lose points, while simultaneously familiarizing themselves with the Math Fever context. Students refer to a fictional Math Fever context for their first task in the unit.

Potential relationship between scaffolding and cognitive demand. Individuals who provide scaffolding engage in a process with a student to provide assistance with a task. During this process, the individual providing assistance may reduce “degrees of freedom in the task to manageable limits” or demonstrate “solutions when the learner can recognize them” (Wood et al., 1976, p. 99, emphasis mine). The emphasis, indicated by the bolded text, specifically relates to the TAG Smith and Stein (1998) created. Characteristics of a task, and support during the task, such as reducing degrees of freedom and demonstrating solutions, may impact the cognitive demand of the task. For example, when a teacher provides scaffolding by modeling the subtraction problem 2 – 4 on a chip board so a student may proceed and develop an understanding of integer subtraction, the teacher has now modeled a procedure that a student may follow for subsequent tasks and potentially lowered the cognitive demand of the task.
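To make this example concrete, the following is a minimal sketch of what such a modeled chip-board procedure might look like; the steps are my own illustration rather than a transcript from the data, and they assume the usual convention that each black chip represents +1 and each red chip represents -1, so that a black–red pair is a zero pair:

\[
2 - 4 \;=\; \underbrace{2 + (1 - 1) + (1 - 1)}_{\text{add two zero pairs: 4 black, 2 red}} \; - \; 4 \;=\; \underbrace{(4 - 4)}_{\text{remove 4 black}} \; - \; 2 \;=\; -2
\]

Once a teacher has demonstrated these steps, the zero-pair move is available to students as a procedure to reproduce on later problems, which is how a well-intentioned scaffold can shift a task from procedures with connections toward procedures without connections.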
Similarly, when a teacher reduces degrees of freedom by directing a student’s attention to a particular pattern in a table, the teacher may also impact the cognitive demand by suggesting a solution pathway. Specific connections to the cognitive demand framework are discussed in more detail in a later section.

Discourse. The term discourse has multiple meanings within research (Ryve, 2011). I do not attempt an exhaustive summary of the discourse literature or a simple description of all the ways the term has been defined within research. Instead, I provide the theoretical assumptions underlying my definition of discourse.

“The study of discourse is the study of human communication; the most unique of this communication is language in use” (Ryve, 2011, p. 169). Communication occurs between individuals, making it a social interaction. During social interactions, humans may communicate in ways other than spoken language; context clues (Gee, 2014), intonation (Schiffrin, 1994), and gestures (Parks & Schmeichel, 2014), for example, may all contribute to meaning making during an interaction. In other words, not only is what is being said important, but also how it is being said. Discourse is language-in-use5. For the purposes of this paper, I use the term discourse when referring to spoken or written word as well as the other aspects (i.e., context clues, intonation, and gestures). I make this choice because language is part of discourse and humans use language purposefully to accomplish a goal.

5 Speech act theory researchers use the term language-in-use instead of discourse. Refer to the original work by John Austin or John Searle for more information (e.g., Austin, 1962; Searle, 1975, 1979, 1989).

I, along with other researchers, assume discourse is used for different functions (Bloom, Carter, Christian, Otto, & Faris, 2010; Boyd & Markarian, 2011; Gee, 2014), such as requesting information, promising a future act, or declaring a truth (Schiffrin, 1994). In school, teachers typically ask questions to which they already know the answer. For example, a mathematics teacher may ask, “How likely is it that a chocolate chip will land on the flat side after being tossed in the air?” (Jones & Tarr, 2007, p. 13). The function of such questions in school is not for the student to learn something previously unknown; instead, the function of the teacher’s question is to learn what the student knows. Discourse helps reify students’ mathematical thinking, which was previously unseen. Therefore, I assume discourse is not just about saying but prompts action (Bloom et al., 2010).

Jaworski and Coupland (2005b) stated, “language ceases to be a neutral medium for the transmission and reception of pre-existing knowledge. It is the key ingredient in the very constitution of knowledge” (p. 3). Power dynamics between individuals, and what or whose knowledge counts, influence the transmission and reception of knowledge for a teacher and student. The social interaction between teacher and student modifies information, resulting in both the teacher and student co-constructing knowledge (Ryve, 2011). Whether language is not neutral because of power dynamics (Jaworski & Coupland, 2005b) or because of the process of modification during interactions (Ryve, 2011), information is not passed seamlessly and intact from one individual to the other. In this study, I analyze teacher discourse to focus on social interactions and how they may modify information through what and how someone communicates.
Drawing on these theoretical assumptions, I define discourse as language-in-use that prompts action and co-constructs knowledge.

Conceptual Framework

Cognitive demand, scaffolding, and discourse are the three main constructs in this study, and they are important because they are interrelated. Figure 1 below is my proposed model of how cognitive demand, scaffolding, and discourse are related, and how analysis of discourse6 was applied to reveal teachers’ scaffolding.

6 Speech act theory uses the term analysis of language-in-use instead of discourse analysis, for the same reason discussed above for the term discourse.

During this study, students engaged with cognitively demanding tasks. The oval represents the cognitively demanding task and signifies the space in which all individuals worked. Cognitively demanding tasks emphasize reasoning, understanding, explaining, and justifying. Without explicit directions for how to solve the problem in the written task, students make mathematical connections with underlying concepts and explore possible solution approaches. With an emphasis beyond following procedures, cognitively demanding tasks require additional student thinking and monitoring of their own thinking as students explore, explain, and justify their individual approaches. Increased critical thinking, multiple solution pathways, and ambiguity may result in student anxiety and requests for help.

Scaffolding provides a teaching strategy to support student learning when students need extra help. Teachers make in-the-moment decisions about how to respond to students’ requests for help while attending to characteristics of cognitively demanding tasks. Teachers listen to students’ explanations, consider the task goal (e.g., Task Focus Question or Learning Goal), and respond to provide help. Teachers may respond to student requests with questions or uncertainty as a way to provide help that also encourages exploration and explanations of various solution possibilities. Since teachers’ responses depend on students’ current mathematical understanding and the explanations provided, the teacher responses are likely to vary. A teacher who provides individualized scaffolding to a student during an interaction may ask different questions than during an interaction with a subsequent student. Although variation is likely, similarities, such as teachers asking the same types of questions or students becoming confused about the same portion of the task, may exist since all students are working on the same task.

Students’ explanations provide insights into their mathematical thinking. Through active interactions with a teacher, students explain their current understanding, and both students and teachers ask questions to clarify their mathematical thinking. Clarification, along with students’ ability to make connections, conjectures, explanations, and justifications, refines their discourse. In addition to spoken or written language, I pay attention to intonation and gestures.
Through various interactions, teachers take an active and responsive role to support student learning and help students accomplish a task they may not be able to do alone. Analysis of discourse is a way for me to explore and describe the interactions I see to understand how teachers think about the relationship between scaffolding and cognitive demand, how teachers provide scaffolding during cognitively demanding tasks, and how students’ discourse (i.e., context clues, intonation, and gestures) impacts teacher scaffolding. In a sense, I see the three constructs-cognitive demand, scaffolding, and discourse-nested within one another. 14 Figure 1 Conceptual Framework Interpretation of Students’ Mathematical Thinking Relation of Student Thinking to Focus Question Cognitively Demanding Task Discourse Scaffolding: Teacher Responses Response: Student Experiences Students’ Mathematical Thinking Interpretation of Teacher Scaffolding T e a c h e r Analysis: Teacher Questions Teacher Ambiguous Answers Student Experiences 15 CHAPTER 2 LITERATURE REVIEW I draw on key pieces of literature from cognitive demand research to argue: 1) Recent literature on written tasks marks a departure from an assumption of independent tasks, providing additional opportunities to consider scaffolding for cognitively demanding tasks, 2) Several articles on the enactment of cognitively demanding tasks explicitly use the term scaffolding; however, the researchers’ definitions and descriptions of scaffolding remain elusive, and 3) Existing empirical research is largely about whether cognitive demand of a task is maintained (an end result). I provide a brief preview of my main points in the remainder of this section. In the later sections of the literature review, I discuss evidence to support my claims. Researchers make assumptions (without always explicitly stating them) about the role of scaffolding within written tasks. The researchers’ assumptions may impact the framework (e.g., TAG) teachers use to determine the cognitive demand of a task and select cognitively demanding tasks for a lesson. Additionally, one researcher’s assumption may seem contradictory to a different researcher’s assumption. For example, Stein et al. (2000) assumed the cognitive demand of a task was determined by analyzing characteristics of a purposefully selected stand-alone task to accomplish a goal, without considering the previous or subsequent task. Alternatively, Sullivan and colleagues (2006, 2014, 2015) explicitly described sequences of tasks that worked together to reach a goal. Key pieces of literature (e.g., Boston & Smith, 2011; Henningsen & Stein, 1997; Stein et al., 1996) included explicit statements suggesting that scaffolding was a factor in maintaining high-level cognitive demand. For example, Silver and Stein (1996) described the use of “scaffolding students’ thinking so that they were able to perform at complex levels” (p. 513). 16 This suggests that the use of scaffolding impacts the level of cognitive demand that students experience; however, literature is vague about what scaffolding means in particular studies. Even with the large corpus of literature about cognitive demand that includes references to scaffolding, details about how to scaffold and teachers’ perspectives about providing scaffolding during cognitively demanding tasks are limited. What researchers do acknowledge are teachers’ voicing of challenges and tensions they experience (Sullivan & Mornane, 2014) without providing details, explanations, or resolutions. 
Scaffolding Related to Written Tasks

For students to experience cognitively demanding tasks, a teacher must first select tasks that contain key characteristics, such as multiple solution paths, that engage students in the underlying mathematical content. In this section, I begin with an analysis of how researchers have viewed tasks as discrete units (i.e., as independent of one another), how recent researchers viewed tasks as a collection (i.e., as co-dependent), and how researchers approached research on written tasks and scaffolding.

Written tasks as independent. Prior research on cognitive demand, beginning with the analysis of written tasks, has asserted that tasks must be seen as independent (e.g., Jones & Tarr, 2007; Stein et al., 2000). This perspective is based on the assumption that written curriculum does not always dictate actual classroom experience (Jones & Tarr, 2007; Zhu & Fan, 2006). In other words, the prior appearance of a task within a textbook does not mean a student knows that mathematical content or experienced that task prior to their engagement with the particular task of interest.

Stein et al. (2000) provided a task sorting activity to help individuals familiarize themselves with determining the cognitive demand of a task. To complete this activity, participants were instructed to decide at what level of cognitive demand they would code each task. These tasks differed in content and therefore were not a sequence of tasks that would be found grouped together in a written curriculum. Although Stein and colleagues did not explicitly articulate the perspective that tasks were independent during the task sorting activity, elsewhere they suggested that, in order to determine the cognitive demand of a task, an individual use the TAG as a “judgement template (a kind of scoring rubric) that permits a ‘rating’ of the task based on the kind of thinking it demands of students” (p. 15). The focus was on the cognitive demand characteristics that were present in the single written task.

Similarly, Jones and Tarr’s (2007) curriculum analysis of a written textbook applied this same perspective of tasks as independent. They first identified the individual tasks that included the desired mathematical content (statistics and probability) to be coded. The content that preceded and followed the task was never included. In some instances, an individual task included multiple questions, as seen in Figure 2. When this occurred, the researchers coded the whole task, not the individual questions, according to the levels of cognitive demand outlined in Stein et al.’s (2000) work.

Figure 2 Multiple Questions Within a Single Task

How likely is it that a chocolate chip will land on the flat side after being tossed in the air? Perform the following experiment and answer these questions to help formulate your answer to this question.
1. What are the possible outcomes for the landing position of a chocolate chip?
2. With your partner, toss 50 chocolate chips and record the landing position. How many chips landed on the flat side?
How do you account for any differences? (cid:1) 6. Which of these experimental probabilities do you believe to be closest to the theoretical probability? Why? How could you obtain a better estimate of the theoretical probability? (cid:1) These studies illustrate how an individual could operate under the assumption that tasks are independent when determining the cognitive demand of a task, specifically when using the TAG. Additionally, the researchers’ methods largely pertain to analyzing written tasks to determine the cognitive demand of a task or describe the cognitive demand distribution within a curriculum. Written tasks as dependent. In contrast, recent literature provides examples where researchers view tasks as related, or as a collection, that students and teachers experienced in a sequence (Sullivan & Mornane, 2014; Sullivan, Mousley, & Zevenbergen, 2006; Sullivan et al., 2015; Ursula de Araujo, 2012). Sullivan and colleagues’ research (2006, 2014, 2015) introduced the terms “interim” and “goal” tasks to begin exploring how teachers selected sequences of tasks during lesson planning. Explicit definitions of these terms were not given; although, my interpretation of these terms came from Sullivan and colleagues’ use of examples for each type. As the name suggests, interim tasks are tasks that precede the goal task. Sullivan and Mornane (2014) viewed this “strategy of an interim or scaffolding task as a lead in to the challenging task” (p. 211). 19 Additionally, interim tasks may introduce key mathematical content or require fewer steps, but essentially be the same type of task as the goal task. Below, in Figure 3, I provide an example of a goal task. Students completing the task were given the following prompt: “I did a multiplication problem correctly for homework, but my printer ran out of ink” (Sullivan et al., 2015, p. 129-130). Students were given an equation that contained missing numerical digits and asked to fill in as many solutions as possible to the scenario to make a correct statement. The goal task invited students to think of two-digit numbers that multiply to a three-digit number ending in zero. Through this task, students should recognize a pattern of multiplication that focused on identifying numbers whose product ended in a zero. Figure 3 Goal Task 2 __ X 3 __ = __ __ 0 What might be the digits that did not print? (give as many answers as you can) The interim task given below in Figure 4 was suggested if the goal task challenged a student was challenged. Notice the interim task is the same style as the goal task but contains fewer missing digits and fewer constraints. Students were still asked to recognize patterns of multiplication and identify numbers whose product ended in a zero. Figure 4 Interim Task __ X __= __0 What might be the digits that did not print? (give as many answers as you can) Interim tasks provide the bridge, and in this case mark critical features of the goal task, by having students notice multiplication patterns. Additionally, interim tasks provide 20 opportunities for teachers to check in with students by asking probing questions (Sullivan & Mornane, 2014) or reviewing student work before introducing the next task (Sullivan et al., 2006). The goal task is the task that occurs at the end of the sequence, the point teachers hope all students reach. 
Sullivan, Mousley, and Zevenbergen (2006) argued that, when planning a lesson, teachers must consider:

(1) a set of tasks, at least some of which are open-ended, especially the goal task, carefully sequenced to ensure some students have the necessary experience at each state, (2) prompts that can be used to support students experiencing difficulty with the tasks, and (3) tasks that could be posed to those students who complete the original task to extend their thinking in a productive and, hopefully, interesting way. (p. 121)

Sullivan and colleagues’ (2006) suggested goal tasks are open-ended tasks where students engage in exploring multiple solution paths, communicate with peers, and connect mathematical concepts with a focus on mathematical thinking. Sullivan and colleagues’ definition of goal tasks contained features that are consistent with the definition of high cognitive demand tasks presented within seminal pieces of cognitive demand research (e.g., Silver & Stein, 1996; Smith, Bill, & Hughes, 2008; Smith, Hughes, Engle, & Stein, 2009; Stein & Lane, 1996).

Scaffolding in written tasks. I assert that the shift in focus from tasks as independent to tasks in a sequence provides a potentially larger role for scaffolding. More importantly, the view of tasks as a sequence provides a way to think about the relationship between scaffolding and cognitive demand, specifically when selecting written tasks. Sullivan and colleagues’ (2006, 2014) use of interim tasks and goal tasks assumes tasks occur in a set, not individually, with interim tasks representing manageable smaller tasks that build up to the goal task. The interim tasks have reduced degrees of freedom and break the goal task into manageable limits, reflecting my definition of scaffolding. Since interim tasks may be more directed and teachers purposefully scaffold interim tasks to lead up to a goal task, does this scaffolding impact the cognitive demand of the goal task? Does the goal task still maintain the key characteristics of a cognitively demanding task?

My interpretation of Jones and Tarr’s (2007) discussion of multiple questions within a task is that the sequence of sub-questions breaks the task into manageable pieces that lead to the end goal and is a form of scaffolding. Figure 2 provides an example of a task where the first question helps direct students’ actions to a first step in performing an experiment to answer the task question. To answer the question of how likely a chocolate chip is to land on the flat side after being tossed in the air, a student could perform an experiment and consider the possible landing outcomes. Even in the instance of coding a single written task, Jones and Tarr (2007) seem to encounter scaffolding. There are multiple solution paths for the proposed task, so does the sequence of questions suggest a particular solution path? Would students struggle more with completing the task if the sub-questions were removed? When and how should teachers intervene to help students? How would teachers’ scaffolding impact students’ mathematical thinking?

Though not explicitly identified as such, Jones and Tarr (2007) provided an example of a task, instead of a sequence of tasks, where multiple questions within the task may represent scaffolding. The sub-questions pertain to my previously articulated definition of scaffolding, as their inclusion may aid in students’ successful completion of the task.
Although Stein and colleagues (2000), Jones and Tarr (2007), and Sullivan and colleagues (2006, 2014, 2015) largely left out explicit discussions of scaffolding, a review of their work highlights the increased opportunities for considering scaffolding within a written task.

Scaffolding During the Enactment Phase

Previously I discussed how selecting and sequencing tasks and/or questions within single tasks may be forms of scaffolding. I now consider how cognitive demand literature discusses scaffolding during the enactment of the written tasks.

The importance of the enactment phase. The enactment phase includes both teachers' set-up and implementation of tasks (Stein et al., 2000) and is a particular challenge for teachers (Stein et al., 1996). It is during the enactment phase that teachers encounter the greatest difficulty in maintaining the cognitive demand (Boston & Smith, 2011; Silver & Stein, 1996; Stein et al., 2000; Ursula de Araujo, 2012). The cognitive demand of tasks may lower because the teacher provides information during the set-up phase that guides students systematically step-by-step through to a solution. Providing key formulas that suggest a solution path or focusing students' attention on a procedure for reaching a solution may also result in the lowering of cognitive demand (Charalambous, 2008; Stein et al., 2000). Similarly, during the implementation phase the teacher may answer students' questions by telling students how to proceed instead of allowing the students to struggle and work collaboratively to reach a solution (Stein et al., 2000). Teachers may feel pressure from students to reduce task complexity or further specify the procedures for completing the task.

The examples of interactions during the enactment of a task highlight opportunities for students and teachers to engage in an exchange that pertains to key characteristics of scaffolding. In addition to the enactment phase being a focus of this study, research on enacting written tasks is salient, especially since teachers have difficulty with the enactment phase (Henningsen & Stein, 1997).

Teacher behavior during enactment that supports scaffolding. Henningsen and Stein (1997) referred to scaffolding as a teacher behavior by stating "another key factor in the students' successful implementation of the task was the scaffolding behavior of the teacher" (p. 540). Researchers have noted these behaviors to include a teacher providing background (i.e., prior mathematical content needed for a subsequent task) or scaffolding to aid students' learning of newer concepts (Stein & Kaufman, 2010). Teachers may also change the type of problem or provide scaffolding when a student does not understand how to proceed with a task (Fennema, Carpenter, Franke, Levi, Jacobs, & Empson, 1996). Lastly, a major component of successfully enacting a cognitively demanding task is scaffolding students' thinking and reasoning (Boston & Smith, 2011; Stein et al., 2000) during whole-class discussion. When students do not know how to proceed with a cognitively demanding task, they may press teachers for additional help and direction. As a response to this pressure, teachers may mark critical features of a task to reduce the degrees of freedom and control frustration (Wood et al., 1976), resulting in the number of possible solution paths decreasing, which lowers the cognitive demand.
When considering classroom discussion, discussions between students and the teacher are rarely "clean exchanges of unambiguous questions and responses" (Hiebert & Wearne, 1993, p. 403). Extant literature uses the words "orchestrating" and "productive" when describing the role of the teacher during discussion (e.g., Jackson, Garrison, Wilson, Gibbons, & Shahan, 2013; Smith et al., 2009; Stein, Engle, Smith, & Hughes, 2008). The teacher facilitates a conversation between students that furthers the learning of all by pressing "students to explain and justify their solutions, evaluate their peers' solutions, and make connections between different solutions" (Jackson et al., 2013, p. 648). Expecting teachers to perform so many actions during conversations may help explain why teachers experience challenges in facilitating discussions during the enactment of cognitively demanding tasks. It may also explain why a plethora of literature exists on conducting discussions (e.g., Hiebert & Wearne, 1993; Jackson et al., 2013; Reinhart, 2000; Smith et al., 2008; Smith et al., 2009). The complexity of orchestrating classroom discussions seems to support literature which suggests teachers experience challenges as part of the enactment of a cognitively demanding task (Boston & Smith, 2011). Given this complexity and the literature documenting the challenges teachers face during enactment, would Stein and colleagues' (2000) example be a teacher scaffolding behavior that supports the maintenance of a cognitively demanding task? Teachers' ability to scaffold students' mathematical thinking is a behavior (Stein et al., 2000). Is this the type of scaffolding behavior Henningsen and Stein (1997) meant when they stated, "another key factor in the students' successful implementation of the task was the scaffolding behavior of the teacher" (p. 540)? Seminal pieces of cognitive demand research on the enactment of cognitively demanding tasks explicitly mention scaffolding; however, a clear description of the teacher scaffolding behaviors is missing, making it difficult to understand if the studies are holding the same assumptions about the nature of scaffolding that occurred. Additionally, past studies include scaffolding as a positive teacher behavior to support cognitively demanding tasks during enactment (Henningsen & Stein, 1997) while more recent literature notes a potential tension between scaffolding and enacting cognitively demanding tasks (Sullivan & Mornane, 2014). Analysis of discourse may offer a way to understand and clarify what research means with terms like teacher behavior and how teachers' use of language impacts the enactment of cognitively demanding tasks.

Cognitively demanding discussion as a part of enacting cognitively demanding tasks. I do not claim that cognitively demanding discussions are synonymous with enacting cognitively demanding tasks. Instead, I note that teachers are expected to orchestrate discussions as part of enacting cognitively demanding tasks (Jackson et al., 2013). Within this expectation, researchers include aspects of scaffolding that may direct students' mathematical thinking in a particular way (Boston & Smith, 2011; Hiebert et al., 1997; Jackson et al., 2013). A teacher listens to conversations and students' mathematical thinking to either ask prompting questions when students struggle or to select and sequence student solutions for class discussion (Jackson et al., 2013; Stein et al., 2008).
For exploration and discussion, teachers employ scaffolding practices in questioning and classroom discourse that either maintain or lower the cognitive demand of tasks (Fennema et al., 1996; Hiebert & Wearne, 1993). Asking probing questions and sequencing student solutions are specific teacher scaffolding behaviors. Researchers provide explicit examples of teacher scaffolding behavior, but what is still missing is a clear account of how teachers' scaffolding behaviors support cognitively demanding tasks. This is problematic since literature documents teachers' challenges, describes extensive research about conducting discussions, and acknowledges possible tensions between scaffolding and cognitive demand.

Vagueness of Scaffolding Within Cognitive Demand Research

Tasks are a focus in cognitive demand literature because they "impact students' perceptions of and opportunities to learn mathematics; therefore, selecting instructional tasks is among the most important decisions a teacher makes" (Boston & Smith, 2011, p. 966). However, most research and resources provided to teachers consider tasks as independent (e.g., Jones & Tarr, 2007; Stein et al., 2000) with a recent departure to viewing tasks as a sequence (Sullivan & Mornane, 2014; Sullivan et al., 2006; Sullivan et al., 2015). Students do not experience tasks as isolated math problems without any influence from prior tasks, student interactions, or exchanges with the teacher (Hiebert & Wearne, 1993). Whether tasks are viewed as independent or related, studies may include the term scaffolding or an alternative term. For example, Henningsen and Stein (1997) investigated classroom factors that inhibit or support the enactment of cognitively demanding tasks. Their study specifically considers scaffolding, defined as "teachers or more capable students simplifying the task so that it could be solved while maintaining task complexity" (p. 532). Yet, they only provide the result that, in 73% of the tasks, scaffolding was a key factor in maintaining students' engagement in the cognitively demanding task. Similarly, Sullivan, Mousley, and Zevenbergen (2006) investigated teacher behaviors without explicitly using the term scaffolding. They investigated enabling prompts, a teacher behavior, defined as "supports offered to students who experience difficulty along the way" (Sullivan et al., 2006, p. 123). Researchers include scaffolding as part of enacting cognitively demanding tasks without helping to clarify how teachers (or students) simplified the task to provide scaffolding while maintaining task complexity (cognitive demand). Whether the research includes the term scaffolding (e.g., Boston & Smith, 2011; Fennema et al., 1996; Henningsen & Stein, 1997; Stein & Kaufman, 2010; Sullivan & Mornane, 2014) or alternative names, the majority of studies focused on determining the cognitive demand of a task or determining if the cognitive demand of a task was lowered or maintained. When studies such as Henningsen and Stein (1997), Sullivan, Mousley, and Zevenbergen (2006), and Stein and colleagues (2000) investigated teachers' actions (including discourse), the scope of the study was concerned with larger aspects of lesson planning.
They desired "to investigate whether it is possible, from a teacher's perspective, to plan and teach lessons that include four specific aspects of lesson planning" (Sullivan et al., 2006, p. 126), present case studies of teachers' enactment to compare whether cognitive demand was maintained (Stein et al., 2000), or understand other aspects of teacher actions during enactment. In each of these instances, the way scaffolding is discussed and enacted stays on the periphery of research on cognitive demand. Researchers continue to posit that teachers should provide both scaffolding and cognitively demanding tasks; however, the question still remains about how teachers can do both. Indeed, recent literature documents teachers' challenges and the tension between appropriate scaffolding and the maintenance of cognitively demanding tasks (Sullivan & Mornane, 2014), yet how teachers navigate that tension remains largely unexamined. Specific attention to teachers' views on the relationship between scaffolding and cognitive demand may illuminate reasons why they experienced challenges and tension. Additionally, attention to the ways in which teachers provide scaffolding during the enactment of a cognitively demanding task is needed. A closer analysis of teachers' in-class experiences and discourse may provide insight into understanding when and how to provide scaffolding and its connection to cognitive demand.

CHAPTER 3 CONTEXT OF STUDY AND OVERVIEW OF METHODS

The following research questions guided my study:
1. How do teachers think about the relationship between scaffolding and maintaining cognitive demand?
2. In what ways do teachers provide scaffolding through discourse?
3. What are students' reactions and experiences?
   a. In what ways do teachers respond to students' reactions and experiences?

To address the research questions, I conducted an empirical study that included classroom observations and semi-structured teacher interviews. Analysis of discourse and a qualitative approach were applied to teacher and student interactions to understand how teachers provided scaffolding and responded to students' reactions and experiences.

Context of the Study

I observed three middle school teachers' classrooms, Sophie, Austin, and Heather (all names are pseudonyms), in two different school districts. For each teacher, Table 1 details the grade level, number of class periods, number of days, length of each class period, and the approximate number of students I observed. The number of students is listed twice for Sophie because I observed two of her classrooms for the same number of days.

Table 1 General Classroom Observation Information for Each Teacher

Information                               Sophie    Austin    Heather
Grade                                     7th       8th       7th
Number of Class Periods Observed          2         1         1
Number of Days in Classroom               4         3         3
Length of Each Class Period (minutes)     55        55        53
Number of Students                        24, 29    25        30

Note. Two values are listed for the number of students in Sophie's class because I observed two of her class periods. One class had 24 students while the other class had 29 students.

The two contexts varied in many ways; however, three differences were potentially significant to how teachers thought about scaffolding and cognitive demand or how they provided scaffolding for students. The first significant difference was at the district level. Sophie and Austin taught in a district where a subset of students received additional mathematics support.
Each day, the school district allocated one period for these students to engage in activities, such as homework support and early introduction to key mathematics for subsequent lessons, with a teacher. This was not the case for Heather’s school. The inclusion of one period, each day, prior to regular mathematics instruction may have impacted what I observed. For example, a prior activity that introduces a concept is a form of scaffolding; however, it occurs outside the scope of the study. Teachers’ questioning and students’ mathematical thinking during these activities may not be verbalized when the students work on a similar task at a later time. In terms of cognitive demand, students seeing a similar activity prior to the observed class period challenges assumptions about features of high cognitive demand tasks. For example, for a task to be considered high cognitive demand, specifically doing mathematics, students are required to explore mathematical concepts, processes, and procedures when no clear or predictable approaches are suggested. Students’ prior experiences could provide familiarity with the later task and therefore could suggest a solution path. When a subset of students is integrated into groups during regular mathematics classroom instruction, the way student groups approach the task may be 30 influenced. The way in which student groups explore the task may differ if all students see the task for the first time, at the same time. The second notable difference was at the grade level. Austin taught 8th grade where a portion of high achieving mathematics students were encouraged to enroll in an algebra course. Austin taught 8th grade mathematics for students who were not enrolled in algebra. This was not the case for Sophie and Heather as their classrooms included a heterogeneous mixture of students from their particular grade. The differences in grade level, specifically heterogeneous classes, relate to the organization of student groups to support teacher scaffolding. Teachers may organize students into small groups by thinking about various characteristics, such as mathematical knowledge, personality, and work ethic. For example, a group of four may include students who typically excel in mathematics along with students who typically need additional support. That way, if a student needs help, another group member could provide a suggestion. Additionally, to encourage student discussion and collaboration in the pursuit of a mathematical solution, teachers may attend to personality characteristics and work ethic. Groups composed in such a manner encourage student talk and collaboration and provide opportunities for teacher scaffolding. A teacher may listen to student discussion and step in when the whole group needs help. A teacher may ask a question instead of providing an explanation with the hope that one of the group members will be willing to answer. Cognitively demanding tasks benefit all students, no matter their previous experience or success with mathematics, and removing students who excel in mathematics may introduce challenges to teacher scaffolding. 31 The third difference was at the classroom level. Sophie and Heather taught a heterogeneous mixture of students with a varied level of special education support. For Sophie, the special education teacher was not present during regular mathematics classroom instruction. Instead, particular students in her classroom worked with the special education teacher outside the regular mathematics classroom instruction. 
Alternatively, the special education teacher at Heather's school was in the mathematics classroom I observed. A subset of Sophie's students may have worked with the special education teacher outside the regular mathematics classroom instruction; however, I was not aware of the details. An in-class special education teacher may provide scaffolding to particular students, decreasing the interactions between the classroom teacher and those students. Similar to previous discussion, outside work or support on similar mathematical tasks may challenge assumptions of high cognitive demand.

Curriculum. The Connected Mathematics Project curriculum (CMP) is a problem-centered, middle school mathematics curriculum that promotes an inquiry-based teaching and learning classroom environment. Contextualized mathematical ideas are sequenced in a set of tasks and explored in depth to allow students and teachers to develop mathematical knowledge, understanding, and skills. The design criteria for task creation include some or all of the following features: 1) Embeds important, useful mathematics, 2) Promotes conceptual and procedural knowledge, 3) Builds on and connects to other important mathematical ideas, 4) Requires higher-level thinking, reasoning, and problem solving, 5) Provides multiple access points for students, 6) Engages students and promotes classroom discourse, 7) Allows for various solution strategies, and 8) Creates an opportunity for teachers to assess student thinking. Thus, the tasks as written are of high cognitive demand, building "awareness of the rich connections among mathematical strands and between mathematics and other disciplines" (CMP and Overarching Goals of Connected Mathematics, n.d.). In order to enact tasks that support inquiry-based teaching and learning, CMP curriculum provides suggested mathematical questions that teachers may ask to encourage student exploration, communication of their understanding, and multiple solution pathways. Examples of suggested teacher questions are included in Figure 6 in the Data Sources section.

A typical CMP classroom has students and teachers actively working together in a three-phase instructional model (i.e., Launch, Explore, Summarize) when pursuing a challenging mathematical task. The teacher provides the challenge (i.e., task) and context of the task during the Launch phase. Then, during the Explore phase, CMP curriculum writers state:

Students work together to solve the [task] as the teacher moves around the classroom observing, prompting, redirecting, questioning, and encouraging. Students are engaged in the challenge, working together to find a solution, making conjectures, validating conjectures, considering alternative strategies, questioning each other, and communicating their findings. (CMP and Explore, n.d.)

Together, students and the teacher unpack the embedded mathematics of the task during the Explore phase. Then, during the Summarize phase, the teacher and students engage in a whole-class discussion to share, solidify, clarify, validate, generalize, connect, and extend their understanding. Each CMP curriculum grade is composed of units, with each unit containing 2-4 investigations. Each investigation contains 2-4 math problems. Each problem is designed for one to two 40-minute sessions.

Participant selection. At the time of the study, Sophie, Austin, and Heather taught middle school mathematics using CMP curriculum. Sophie had been teaching for 10 years.
For the previous eight years, she had been teaching 7th grade using CMP materials. She was first introduced to CMP during her student teaching internship year. She attended CMP workshops about problem-centered teaching and continued to engage with ongoing CMP research. Additionally, she participated in district-level discourse-related professional development (PD) experiences. Along with other members of the district mathematics department, she focused on the use of questioning as a teaching tool. Austin had been teaching for eight years, with the last six years teaching 8th grade using CMP materials. Austin had also attended similar CMP workshops and the same district-level discourse-related PD experiences. Sophie and Austin taught in the same school district, a school district serving about 3,300 PK-12 students. Students in grades 6, 7, and 8 attended a single middle school located within the district. Heather had a prior career before she began teaching. She had taught using CMP materials for all four years of her teaching career. For the previous two years, she had taught 7th grade in a school district serving about 3,600 PK-12 students, with about 760 students in grades 6, 7, and 8 located in a separate building. Heather had attended CMP workshops and participated in discourse-related PD. Facilitated by a university professor, Heather and area teachers met to discuss discourse as it related to their teaching. In particular, Heather was working on teacher discourse moves, such as creating opportunities for students to engage with another's thinking, asking students to revoice another student's idea, or encouraging students to make sense of their own mathematical work. For example, Heather attended to who was saying what, how others took it up in discussion, and what counted as knowledge.

I sent an email outlining my study and expectations for teacher participants to middle school teachers in two school districts. Teacher participants were selected based on the following criteria: 1) Taught using CMP curriculum and 2) Returned a signed consent form (see Appendix B). Once I selected a teacher, the teacher and I found available time in our schedules. The available time dictated the class period(s) I observed. Students in Sophie, Austin, and Heather's selected class periods were potential participants for the study. Each teacher distributed an information form (see Appendix C) to each student in the selected class. A student could opt out of the study by returning the information form with the indicated choice. I interpreted any unreturned form as agreement to participate in the study.

Data sources. Data sources included classroom observations and semi-structured teacher interviews to address the three research questions (see Table 2). Each teacher completed one pre- and several post-observation interviews. Pre-observation interviews were completed before beginning any classroom observations. Post-observation interviews were conducted immediately after each classroom observation. Classroom observations provided data about the ways teachers provided scaffolding through discourse (RQ2), students' reactions and experiences (RQ3), and how teachers responded to students' reactions and experiences (RQ3). Teacher interviews mainly provided data about how teachers thought about the relationship between scaffolding and maintaining cognitive demand (RQ1) while also supplementing classroom observations data for RQ2 and RQ3.
Table 2 Primary Data Source for Research Questions

Research Question    Data Sources
1                    Teacher Interviews
2                    Classroom Observations
3                    Classroom Observations

The tasks observed were selected out of convenience. Teacher availability dictated the dates I visited classrooms and therefore, the tasks I observed. For 7th grade classroom observations, I observed tasks in the units Accentuate the Negative and Stretching and Shrinking. In Accentuate the Negative, curriculum writers intend for students to explore integers and rational numbers, including addition, subtraction, multiplication and division of rational numbers, absolute value, opposites, order of operations, and the distributive property. In Stretching and Shrinking, curriculum writers intend for students to explore similarity with activities such as enlarging figures. Coordinate rules are one way that students create figures to investigate similarity and the connections between symbolic and graph representations. Students describe effects of scale factors on perimeter and area, ratios between and within similar figures, and using similarity to find measures. For 8th grade, I observed tasks from the unit Growing, Growing, Growing. In Growing, Growing, Growing, the written curriculum encourages students to use multiple representations (e.g., tables, graphs, and equations) to explore and show exponential growth and decay. Rules for exponents, scientific notation, growth/decay factors, and growth/decay rates help students evaluate functions and describe connections among the representations.

Figure 5 describes the specific CMP tasks I observed, the goal of each task, who taught the lesson, and what day I observed the lesson. The description of each problem outlines the purpose of the problems within the larger unit and in relation to the other unit tasks. For example, the Accentuate the Negative unit is about exploring integers and rational numbers. During my observations, students were introduced to addition and subtraction with integers using a chip model in Task 1.4. In the subsequent observed task, they used their chip models to write an algorithm for addition. Subsequent Accentuate the Negative tasks intend to build on their knowledge to explore a subtraction algorithm, multiplication and division of rational numbers, absolute value, opposites, order of operations, and the distributive property.

Figure 5 Observed Task Information

7th Grade, Accentuate the Negative (Sophie; observed 10.23.18, 10.24.18, 10.26.18, 10.29.18)

1.4 In the Chips: Students model addition and subtraction on chip board displays. Students explore relationships between addition and subtraction as well. One chip color indicates positive values, and the other chip color indicates negative values.

2.1 Extending Addition to Rational Numbers: Using chip models, students are introduced to the idea of an algorithm and challenged to develop an algorithm for addition of rational numbers. The goal is to engage the students in concrete ways of representing and solving the problems. Students move from concrete representations to addition algorithms.

7th Grade, Stretching and Shrinking (Heather; observed 12.11.18, 12.12.18, 12.13.18)

1.2 Scaling Up and Down: Using the context of a copy machine, students work with figures of various sizes that are similar to an original figure. They will work with some scale drawings that are smaller than the original figure and some that are larger. They will use percents to compare similar figures.
They will compare side lengths and angles of similar figures and notice that lengths of sides may differ, but the measures of angles do not. Students learn how to make similar and nonsimilar shapes using a coordinate system. Students graph members of the Wump family, plus other figures that claim to be in the Wump family. Students compare the shapes, side lengths, and angles of the figures they draw as they contrast similar figures with nonsimilar ones. 37 2.1 Drawing Wumps Figure 5 (cont’d) 8th Grade Growing, Growing, Growing 2.1 Killer Plant Strikes Lake Victoria 2.2 Growing Mold 2.3 Studying Snake Populations 10.26.18 10.29.18 10.30.18 Students read about a real situation in which a non- native plant spread rapidly and began to cover Lake Victoria in Africa. They then solve a task about a similar situation. In the task, the area of the plant doubles each month, and the starting value is greater than 1. Students create and look for patterns in a table, graph, and equation representing the growth pattern. Exponential data for the growth of mold is presented in the form of an equation that represents an exponential function. Students find and interpret the y-intercept and growth factor for the function from the equation. They also use the equation to answer questions about the situation. Students are given a graph from a real-world exponential growth relationship for a snake population. This is the first time that they find and interpret the y-intercept and growth factor from a graph of an exponential function. They use this information to write an equation that represents the relationship and then use the equation to answer questions about the relationship. Suggested teacher questions are provided in CMP teacher guides. Figure 6 shows examples of suggested teacher questions for the Launch, Explore, and Summarize phases of each task. For suggested scaffolds in the form of statements, I rewrote them in question form. For example, for Task 1.4 In the Chips, a suggestion is to “Discuss the context of describing finances as ‘in the black’ or ‘in the red’” (Lappan et al., 2014, p. 64). I wrote the suggested teacher scaffold in question form as “What does ‘in the black’ or ‘in the red’ mean when describing finances?” Both the question and statement serve the same purpose; students are prompted to discuss what they know about the context and to prepare them for using red and black chips in the task. I provide these suggested teacher questions to give additional problem context and show what supports are provided in the curriculum materials. I also return to these suggested 38 questions in later examples and discussions. For example, I highlight suggested questions for Task 1.4 that appear in Sophie’s teacher questions when I explain how I identified teacher questions in my Methods section. The teacher questions in the various phases and how that unfolds in the classroom is a discussion I return to in a findings chapter. Figure 6 Suggested Teacher Questions from CMP Written Curriculum Launch Task 7th Grade Accentuate the Negative 1.4 In the Chips Explore Summarize • What does “in the black” • Question A parts (2) • How are the number line and (3) have the same total value, but the combination of chips are different. Why is this true? • How can you take away 3 red chips when only 2 are on the board? • What happens to the total value on the chip board if you add 1 black chip and 1 red chip? • How can you simplify the expressions in Group A? • How can you simplify the expressions in Group B? 
• How do you know if the solution will be positive or negative? model and chip model alike? • What patterns do students notice when a chip board has the same total value, but the combination of chips are different? • How do you solve addition problems in which both signs are the same? • How do you solve addition problems in which the signs of the numbers are different? • What is the measurement of the base of the triangle in the smaller figure? • In I want to enlarge a figure by 25%, will the image be larger or smaller than the original? Why is this true? 39 or “in the red” mean when describing finances? In Julia’s expression, -6 + (+4) = -2, What does -6 represent? • • What does +4 represent? • How does Julia’s chip board relate to her expression? • What are some other ways Julia can show a total value of -2? 2.1 Extending Addition to Rational Numbers • How could you model this problem using chips? • How is this problem different from the one before? 7th Grade Stretching and Shrinking 1.2 Scaling Up and Down • If you wanted to make a very accurate enlargement of a figure, would you use a rubber- band stretcher? • What other ways do you know of to make a larger copy of something? Figure 6 (cont’d) 2.1 Drawing Wumps • The points for Zug are found from the points of Mug. The rule is (2x, 2y). • What do you think this rule tells us to do to a Mug point to get a Zug point? 8th Grade Growing, Growing, Growing 2.1 Killer Plant Strikes Lake Victoria • In Problem 1.2, some students came up with two equations for the situation. Are both equations correct? • • What is the y-intercept for this relationship? Is the area of the plant growing exponentially? How do you know? 2.2 Growing Mold • How much mold is there at the end of day 1? At the end of day 2? At the end of day 3? • Do you see any similarities between the pattern of change in this situation and the patterns of change in some of the Problems in the last Investigation and in Problem 2.1? • Do your new characters • generally look like Mug? If your characters generally do not look like Mug, what could you check? • How would you describe to a friend the growth of the figures that you drew? • Which figures seem to belong to the Wump family and which do not? • Are Lug and Glug related? • Did they grow into the same shape? In earlier Units of CMP, we learned that both the angles and the lengths of edges help determine the shape of the figure. How do the corresponding angles of the five figures compare? • Does it make sense to connect the dots on the graph? • How would the equation change if the initial area covered was 1,500 square feet? • How did you find the number of months it will take the plant to completely cover the lake? • How does the mold grow from one day to the next? • • What does each part of your equation tell you about the growth of the mold? Suppose you started with 25 mm2 of mold and it grew in the same way that it did in the Problem. How would the equation change? How would the graph change? • What is the starting value, or y-intercept? • What is the growth factor? • What information do you need to write an equation? • What did you do to find the solution? • What is the initial value? • What is the base of the exponent? • What value is raised to the exponent? Why? • Did you follow the Order of Operations? 40 Figure 6 (cont’d) 2.3 Studying Snake Populations different from the last two Problems we’ve look at? • What does the graph tell us about the snake population? • to read? 
If we assume this same growth pattern for years 0 to n, what is the population in year 1? • How is this Problem • Which points are easy • What were the ways you found your equations? • What does the y- intercept (0, 1) mean in the context of the problem? How is that possible? During each classroom observation, I placed three video cameras around the classroom; one to capture a wide angle of the classroom and two to capture two small groups of students working together on a task. Three audio recorders were placed in each classroom; one to record the teacher audio (lapel microphone) and two to capture two small groups of students working together on a task. Either the participating teacher or I selected the two recorded small groups. Student small groups were selected if the teacher anticipated students in the group were likely to: 1) Talk with one another, 2) Articulate multiple solution pathways, 3) Struggle with a mathematical concept, or 4) Share mathematical thinking that may help with future lesson planning. I chose to ask for participating teachers’ suggestions for two reasons. First, the participating teachers were more familiar with the students in the classroom. I believed their familiarity with students would provide additional insights about students in small groups who would engage in mathematical discussions. This could mean students who were more likely to ask questions if they did not understand, offer mathematical conjectures, or think about a concept in a different way. Second, I wanted to collaborate with teachers and provide ways to improve their classroom practice. I accomplished this by asking for teacher input, specifically about which students they wanted recorded. If the teacher discussed a student or small group during a post-observation interview, I chose to record that individual student or small group on 41 the following day. That meant a student or small group could be recorded for one day or multiple days. There were also individual students or small groups not selected for recordings. I recorded field notes (see Appendix D) during my observations to discuss with the teachers during post-observation interviews. I included what task students were working on, the date, class period, and questions I wanted to ask during our post-observation interview. For example, if a teacher made an in-the-moment decision to change an instructional plan, I made a note to ask about the teacher’s reasoning. For each phase of the task (i.e., Launch, Explore, Summarize) I made a note of any interaction that caught my attention and my current thinking about why it caught my attention. Examples of interactions that caught my attention included students who asked the teacher for help multiple times or showed emotional reactions (e.g., frustration, excitement, or confusion). If the teacher asked a particular question repeatedly or provided an explanation instead of asking a question, I recorded it in my field notes. The field notes also marked places to review on the video and audio recordings. Sophie, Austin, and Heather each participated in one video recorded interview (See Appendix E) before I initiated any observation. These pre-observation interviews lasted 18 minutes 57 seconds, 39 minutes 8 seconds, and 37 minutes 53 seconds, respectively. The interviews started with general questions that asked teachers to describe their overall approach to teaching mathematics, how they provided scaffolding during instruction, and how they used cognitively demanding tasks in their classroom. 
These general questions provided context to their teaching and what I might see during my observations. Teachers also described how they provide scaffolding when answering the questions. For example, teachers’ descriptions of their overall teaching approach included their emphasis on the connectedness of mathematical 42 concepts, perceived power of discovery, and value of exploration during instruction. Complementary to their teaching approach, all three teachers used questioning as their main scaffolding tool. The next set of general questions began to explore RQ1. These questions confirmed the existence of a relationship between scaffolding and cognitive demand (RQ1). Teachers responded to the following questions or prompts: 1) What relationship do you see (if any) between providing scaffolding and maintaining cognitive demand? If any, explain., 2) Do you see any conflict between providing scaffolding and maintaining cognitive demand?, and 3) Describe a time when you experienced tension or a challenge to provide scaffolding during a cognitively demanding task. The first question confirmed teachers saw a relationship and allowed them to describe their perspective. Then, follow-up questions confirmed they saw conflict, if they had not already brought it up in their previous answer. Lastly, their examples of experienced tension or challenge included their perception of student reactions and feelings during the task. The examples also suggested student experiences and reactions (RQ3) that I might notice during my classroom observations. Each pre-observation interview protocol included a portion tailored to the CMP unit I observed. The tailored questions helped situate the task I observed and identified portions of the task where students may struggle. In general, I asked teachers to think across the investigation by asking: 1) What is your learning goal for the investigation? and 2) How does each problem (specific problems inserted) help you with the goal for the investigation? These questions were getting at the connectedness of mathematics and how teachers planned to support students’ exploration of tasks in a way that helped them accomplish one goal. 43 Questions such as: 1) During which problems(s) (if any) did you or do you expect students to struggle?, 2) During which part(s) of the problem (if any) did you or do you expect students to struggle?, and 3) How do you plan to support their learning during the struggle? asked how teachers planned to scaffold if students struggled. Unlike the previous questions about general mathematics teaching, these questions were situated within a specific mathematical task. After thinking across the CMP unit, I asked teachers to zoom in on one task. I asked Sophie questions that pertained to Accentuate the Negative problems, whereas those questions were replaced with other unit-specific questions for Austin and Heather (See Figure 7). Figure 7 Tailored Questions Related to CMP Unit Sophie Austin [Now, let’s dig into 2.1.] 1. In the previous investigation, students explored subtraction of positive and negative values. Students learned both the number line and chip model. a. Can you show me on a number line how you did plan or would plan to present the launch of problem 2.1? Specifically: i. +8 + -5 with a chip model ii. +10 + -25 and +10 – -15 with a number line 1. -25 is the change; -15 is start, result is the change for subtraction problem iii. +9 – -4 with a chip model b. What model are you most comfortable with and why? c. 
How do you provide scaffolding for students who use the chip model compared to the students who use the number line?

Austin
[Now, let's dig into 2.3.]
1. For example, problem 2.3, students look at a graph representing exponential growth. What connections do you see with the graphical representation, equation, and table?
a. What do you draw from in the previous problems (2.1 and 2.2) to help students understand the graphical representation?
b. To what extent do you have students analyze and make sense of the graph?
i. Do you consider where the parts of the equation y = a(b^x) appear in the graph?

Figure 7 (cont'd)

Heather
[Now, let's dig into 2.1.]
1. For example, problem 2.1, students begin to draw shapes on the coordinate plane. What prior knowledge do you identify students need for this problem?
a. How do you help students who may not understand the coordinate grid (x-axis and y-axis orientation)?
b. How do you help students who do not understand how to plot points on the coordinate grid?
c. How does the development of these prior skills relate to later problems where students use or write rules that create a similar (or not) figure?

In addition to preparing me to observe the particular tasks, these questions let the teachers express their feelings about and understanding of the particular tasks. I observed their enactment to understand if their understanding or preferences influenced the way they scaffolded or presented solution pathways in class. For example, if Sophie was more comfortable with chip models, did that appear in her questioning when students could use either model? For Austin, would connections he saw between graphical representations or tasks appear in discussions? Finally, if Heather had to help a student understand how to plot points on a coordinate grid, how would she do that? The tailored questions investigated teacher understanding, scaffolding, and cognitive demand for particular mathematical contexts.

Immediately after each classroom observation, I conducted another interview. The teacher participated in one video-recorded interview, reflecting on the observed lesson (hereafter referred to as a post-observation interview). Each post-observation interview typically lasted about 15 minutes, with a few as short as 10 minutes or as long as 40 minutes (see Appendix F). Post-observation interview questions were semi-structured and followed up on our pre-observation interview or reflected on the observed lesson. Questions asked included: 1) What was your general reaction to the enactment of the task?, 2) What went well?, 3) What was surprising?, and 4) What, if anything, would you do differently next time? These first four questions were designed to let the teachers talk about instances during class that were salient to them. What the teachers chose to talk about gave me a point of reference for continuing our conversation. Then, I asked specific questions about scaffolding during those instances. Questions such as: 1) In what ways did you provide scaffolding?, 2) Did you struggle in any way to provide scaffolding without reducing the cognitive demand?, and 3) How did students respond to your scaffolding? allowed me to dig deeper into an episode the teacher previously highlighted while simultaneously connecting it back to our pre-observation discussion.

Data Analysis

Video recorded pre- and post-observation interviews and teacher audio recordings from observations were transcribed using a software program.
Heather’s lapel microphone did not properly record teacher audio recording for December 5, 2018 classroom observation and therefore, was not included in data analysis. Then, while listening and comparing the software- generated transcripts to the audio recordings, I corrected errors in language. When needed, I added small group student talk to existing teacher audio transcripts for specific portions, such as student uncertainty and salient student episodes. A couple of reasons existed for my decision to transcribe selected portions of the small group audio. I encountered difficulties discerning student talk because of the general audio quality. With audio recorders placed on a desk in the middle of the small group, the microphones recorded any paper rustling, pencil tapping, or other movements. Overlapping talk, varied distances from the audio recorder, and 46 quiet speaking volume of some students made it difficult to create a meaningful student audio transcript of all the small group audio. Instead, I used the small group student audio as supplementary data. I created spreadsheets for each lesson where each line was a teacher question (see Appendix G). As the teachers described in their pre-observation interviews and then I observed during the enactment of tasks, their main tool for providing scaffolding was questioning. Teacher questioning and student responses provided a set of data I could compile and further analyze for similarities and differences to understand how teachers provided scaffolding (RQ2). I added lines of teacher talk that indicated a change in lesson phase to provide context for the questioning. Later in this section, I outline my teacher question coding scheme. Instances of student uncertainty and frustration emerged from my analysis of teacher talk, classroom observations, video recordings (classroom and small group), and audio recordings (teacher and small group). In the following sections I discuss how I identified and coded teacher questions followed by how I identified and coded student uncertainty and frustration. Identifying teacher questions. My analysis focused on teacher questions related to the content focus question because I thought that a closer analysis of what teachers asked would provide a nuanced understanding of how the teachers scaffolded. Teachers had already identified questioning as a way to scaffold; however, questions could be asked in various forms with different purposes. Thus, instances of teacher talk which served the purpose of asking a question and included mathematical content were identified as a teacher question. Questions including an interrogative word, such as what, how, why, and who, or statements with rising intonation were considered teacher questions because both served the purpose of asking a 47 question. Any questions that were mathematical in nature, some specifically related to the focus question, some broadly related to the mathematics of a task, I identified as teacher questions. I excluded questions that did not pertain to mathematical content or the focus question from my analysis. For example, in order to understand if a teacher question addressed the focus question, “How can you use a chip model to represent addition and represent subtraction?”, I listened to the teacher and student discourse and transcribed a list of questions in the order they occurred (see Figure 8). I also listened for questions that were similar to CMP suggested questions. 
For example, CMP included, "How does Julia's chip board relate to her expression?" as a suggested question. This question prompted students to connect -6 with 6 red chips and that Julia's brother, Tate, owed $6. This suggested question was similar to when Sophie asked, "What does a red or black chip represent?" (personal communication, October 23, 2018). A student responded with, "Red is negative and black is positive." Students also responded with, "How much money [the brother] owed," when describing what the red chips represented in the story. Although Sophie did not ask the exact suggested question, their intended purposes were similar and evoked a similar student response. Similar teacher questions were part of the CMP supports to help students achieve the task and were, therefore, teacher questions that addressed the focus question.

Figure 8 Teacher Questions Related to Focus Question, Recording 4

Focus Question: How can you use a chip model to represent addition and represent subtraction?
1. What does a red or black chip represent?
2. Why is the combination/value of zero important and how do you represent it with chips?
3. How do you determine the value of a chip board?
4. Can you find different combinations for certain values?
5. How is starting value similar to the start of the number sentence?
6. What is the starting value?
7. How are the actions of adding or subtracting represented in a chip board?
8. Can you determine the answer?
9. Do you see a pattern?
10. How would you generalize addition or subtraction?

For students to use a chip model to represent addition and subtraction, they needed to know what the different colored chips represented, how to determine the total value of a group of red and black chips, how to place chips on a chip board to represent a particular value, how to add or take away chips, and how to determine a final value. The focus question pushed students to see patterns and generalize addition and subtraction. Therefore, if a teacher asked questions about or related to any question previously listed, I considered it a teacher question. The question, "How can I add chips without changing the value of the chip board?" is not specifically listed above; however, the question is related to how students place chips on a chip board to represent a particular value. The understanding that multiple solutions existed for a particular chip board value became more important when students began subtracting. Students became confused when they did not have chips of a particular color to subtract. The questions identified key information and discourse related to the focus question.
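To make the chip-model ideas these questions target concrete, the sketch below models a chip board as counts of black (positive) and red (negative) chips. It is my own illustration, not CMP material or data from the observed lessons; the class name ChipBoard and its methods are hypothetical. The key move it shows is the one students found confusing: adding a black-red "zero pair" does not change the board's value, which is what makes it possible to take away chips of a color that is not on the board.

```python
# Minimal illustration of the chip model (my own sketch, not CMP material).
# Black chips count +1 each, red chips count -1 each; a black-red pair is a
# "zero pair" and does not change the value of the board.
class ChipBoard:
    def __init__(self, black: int = 0, red: int = 0):
        self.black = black
        self.red = red

    def value(self) -> int:
        """Total value of the board: black chips minus red chips."""
        return self.black - self.red

    def add_zero_pairs(self, pairs: int) -> None:
        """Add black-red pairs; the board's value stays the same."""
        self.black += pairs
        self.red += pairs

    def take_away(self, black: int = 0, red: int = 0) -> None:
        """Remove chips, first adding zero pairs if a color runs short."""
        self.add_zero_pairs(max(0, black - self.black, red - self.red))
        self.black -= black
        self.red -= red

# Example: start at +4 (4 black chips) and take away 3 red chips, i.e. 4 - (-3).
board = ChipBoard(black=4, red=0)
board.take_away(red=3)   # forces 3 zero pairs onto the board first
print(board.value())     # prints 7, matching 4 - (-3) = 7
```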
Teacher questions associated with a number talk were not included because the number talks were self-contained and decontextualized. Heather asked the question, “How could I write the relationship between three and eight or three-eighths as a percent?” (personal communication, December 13, 2018) to begin a number talk. On the previous day, students had begun working with Task 2.1, Drawing Wumps, and would continue their exploration of Task 2.1 after the number talk. The self-contained and decontextualized number talk occurred in the middle of an Explore phase. Although subsequent CMP tasks potentially highlighted relationships between side lengths or percent increase or decrease of similar figures, the number talk lacked explicit connections to potential payoffs. Previous, current, or future mathematical connections were implicit without a context to embed understanding. Since the teacher question was missing key features of a CMP task, I chose to exclude the teacher questions during the number talk from my analysis. Figure 9 provides examples of teacher questions and student responses during the number talk that were mathematical in nature, yet not related to the current lesson’s focus question, “How can you determine if two shapes are similar by looking at the rule?”. 50 Figure 9 Decontextualized Questions Not Related to the Focus Question Teacher Talk How could I write the relationship between three and eight or three-eighths as a percent? Would anyone like to share, first off, did anybody come up with an answer? Just give me an answer, don’t talk about the strategy yet. If you have an answer. If you don’t, that’s okay. Do you have an answer? [Student], do you have an answer? Anybody else have an answer? Okay, now let’s go back and think about a strategy so we can talk about these ideas that have been shared so far. [Student], what is one way you thought about this? Okay, so how did you know that each eighth was 12.5? Okay, so you said ¼ is 25% and so how did, again, how did you get here (points to the white board to show the move to 1/8 is 12.5. Student Response(s) Students silently thought about Heather’s question. Some students indicated how many strategies they had found by placing their hand on their chest and holding up a finger for each strategy. Students raised their hands and Heather called on one of the students. 37.5 % 38% No student offered an answer. I thought of it as each eight is 12.5 because that’s one-eighth of 100 and the percent is out of 100. And then I just did 12.5 times 3 which was 37.5. One-fourth is 25 and one-half of 25 is 12.5. ¼ is 25%, then ½ of ¼ is 1/8, and so ½ of 25% is 12.5. Students had explored what it means to be similar and generated an informal definition without using the term scale factor. Students had discussed general noticings, such as corresponding lengths of an image being twice as long as the original and corresponding angles having the same measure. Students had explored enlarging an image using a copy machine for a certain percent. 51 During the number talk, the teacher and students discussed strategies to write the relationship between three and eight as a percentage. The teacher questions prompted students to share their answers and describe the process of the strategy. Teacher questions and student responses such as “Okay, so how did you know that each eighth was 12.5?” and “One- fourth is 25 and one-half of 25 is 12.5” encouraged students to connect underlying mathematical concepts but were decontextualized. 
Although questions like, “How could I write the relationship between three and eight or three-eighths as a percent?” may have related to scale factor and percent increase or decrease, the number talk questions were not explicitly related to a mathematical task. In particular, the CMP contexts which embed the mathematics were left out of the question. Similarly, the questions, “Okay, so how did you know that each eighth was 12.5?” may have helped students think about the relationship between corresponding sides of a shape, potential ways to convert a ratio to a percent, and ultimately similarity, but they were decontextualized and separate from the task. Neither the teacher nor students made explicit connections to the CMP mathematical task. In general, I considered a teacher question for analysis if the teacher asked a question related to mathematical content during the Launch, Explore, or Summarize phases and the question directly related to the focus question. Once I identified teacher questions for consideration, I organized the potential teacher questions into a spreadsheet. I transcribed each teacher question as a separate line of data that I later coded. Teachers usually posed a single question prior to a student’s response, so each line contained one teacher question. When creating the spreadsheet, I experienced a challenge if more than 52 one teacher question occurred before a student response. Figure 10 compares three general cases I observed. Figure 10 Challenges to Transcribe One Teacher Question into a Spreadsheet Teacher Talk Who can respond to her idea? (pause) Student 8, what do you think about Student 3’s idea? Student 27 just said you always have three more black than you do red color. Yeah? You have the same number of red and black? Remind us again, what do we, what, what'd we say initial means? Context A student provided an explanation. The teacher asked students to respond to another student’s mathematical thinking. After the first question, there was no response. The teacher asked the second question. A student provided an explanation. The teacher summarized what the student said and then immediately asked the two questions. Students discussed the starting point (starting area) of mold growing for an experiment modeled by exponential growth. First, teacher questions were not always followed by a student response. As a result, the teachers asked a different question. In the example given, a student had provided an explanation during a whole-class discussion. Heather asked students, “Who can respond to her idea?” (personal communication, December 11, 2018). After a pause and no student response, the teacher asked the next question, “Student 8, what do you think about Student 3’s idea?”. I interpreted the pause in between the questions as a way to separate the two question. In this case, I transcribed each question into a different line of the spreadsheet. Second, when teachers asked several questions, without pause, I interpreted the student response as the answer to the group of questions and having the same purpose. I interpreted both questions “Yeah?” and “You have the same number of red and black?” (personal communication, December 24, 2018) as Sophie asking student if they agreed or not. For those teacher 53 questions, I grouped them together as one line in my spreadsheet. Third, teachers changed their questioning mid-question. The last example, Austin began to ask a question with the words, “what do we,” and then started a new question. 
Austin asked the complete question, “what’d we say initial means?” (personal communication, December 29, 2018). I was unsure if the teacher’s initial question agreed with his final question. In the spreadsheet, I grouped the questions together (as voiced by the teacher) as one line and I coded the complete question. Codes for teacher questions. While observing one of my teacher participants, I noticed a document posted on her classroom bulletin board (see Figure 11). Figure 11 Teacher Poster of Professional Standards for Teaching Mathematics The Professional Standards for Teaching Mathematics document from the National Council of Teachers of Mathematics (1991) includes questions that one of the school districts used as a PD 54 resource. A pair of documents, the Professional Standards for Teaching Mathematics and the Curriculum and Evaluation Standards for School Mathematics, were designed by a commission composed of classroom teachers, supervisors, educational researchers, mathematics teacher educators, and university mathematicians to guide school mathematics reform. The Professional Standards for Teaching Mathematics (1991) outlines mathematics teaching standards “to support the changes in curriculum set out in the Curriculum and Evaluation Standards” (p. vii). Central to both documents is the development of all students’ mathematical power, defined as: The power to explore, conjecture, and reason logically; to solve nonroutine problems; to communicate about and through mathematics; and to connect ideas within mathematics and between mathematics and other intellectual activity…Students’ flexibility, perseverance, interest, curiosity, and inventiveness also affect the realization of mathematical power. (emphasis mine, p. 1) The bolded words share language with cognitively demanding task features and CMP curriculum materials. Central ideas, such as mathematical power, share important features with CMP and cognitively demanding tasks. Both mathematical power and CMP tasks emphasize collaboration and exploration. While engaging in CMP tasks, students collaboratively explore and persevere, find a solution, make conjectures, and consider alternative strategies to a nonroutine problem. Mathematical power, CMP, and cognitively demanding tasks build awareness of mathematical connections among mathematical strands and between mathematics and other disciplines or intellectual activity. Professional Standards for Teaching Mathematics (1991) describes a shift toward: 55 1) classrooms as mathematical communities – away from classrooms as simply a collection of individuals, 2) logic and mathematical evidence as verification – away from the teacher as the sole authority for right answers, 3) mathematical reasoning – away from merely memorizing procedures, 4) conjecturing, inventing, and problem solving – away from an emphasis on mechanistic answer-finding, and 5) connecting mathematics, its ideas, and its applications – away from treating mathematics as a body of isolated concepts and procedures. (p. 3) To address the shift of mathematics classroom expectations and develop mathematical power, teachers’ ability to “orchestrate classroom discussions in ways that promote the investigation and growth of mathematical ideas” becomes more important (p. 1). Discourse and teacher discursive moves are integral for supporting the reimagined mathematics classroom. The document fit the aims of my study and became the first draft of my codes (Codes 1- 5) for teacher questions. 
These initial codes suggested a differentiation in questioning and a way for me to approach making sense of how teacher questions differed. RQ2 asks, “In what ways do teachers provide scaffolding?” In their own words, teacher questions were a way they provided scaffolding. This document goes further, suggesting a way to break down what a teacher question meant, specifically in terms of mathematics. Teachers could provide scaffolding by asking students to explain their mathematical thinking or work, respond to one another, offer alternative solutions, or critique work that helps a student who struggles. Although the specific questions differed, they all functioned as a way for the teacher to engage and help students without providing answers themselves. Figure 12 displays the document as my initial Codes 1-5. 56 Figure 12 Initial Teacher Question Codes 1-5 Initial Codes 1-5 1. To make sense of mathematics… • Do you agree (disagree) with this? • Does anyone have the same answer but a different way to explain it? • What do you think about what she just said? • Do you understand what he said? • Can you convince us that that makes sense? • Would you ask that to the rest of the class? 2. To rely on themselves to determine correctness… • Why do you think that? • Why is that true? • How did you reach that conclusion? • Does that make sense? • Can you make a model to show that? Is that true for all cases? 3. To learn to reason mathematically… • • Can you think of a counterexample? • Does that always work? • Can you prove that? How? • What assumptions are you making? 4. To learn to conjecture, invent and solve problems… • What would happen if? What if not? • Do you see a pattern? • How did you think about this? • What is the same and what is different about your methods? • What can you predict from this? The next one? The last one? • What are some possibilities here? • What decision do you think he should make? 5. To connect mathematics and its applications… • Have we ever solved a problem like this before? • How does this relate to…? • What have we learned before that is useful in solving this problem? • Can you give me an example of...? • What uses of mathematics did you find in (a specific activity)? 57 To refine the initial codes, I applied these codes to a portion of my data, listened to teacher and student talk on audio and video recordings, added notes, and edited the code questions to better reflect what teachers asked (i.e., the specific words the teacher used for a posed question) and the purpose of the teacher question (e.g., What posed questions encouraged students “to make sense of mathematics?”). I analyzed student answers to understand the teacher question purpose. For example, Code 2 had the question, “Can you make a model to show that?”. Number lines and chip boards were relevant models to Accentuate the Negative tasks. To add clarification and make a more descriptive code, I added additional questions that paralleled the initial question. I added questions, such as “Can you show me with a number line or chip board?” to the code. Code 1 and Code 2 seemed similar; however, teacher questions coded as “to make sense of mathematics” (Code 1) related to students describing what they did or thought, whereas Code 2, “to rely on themselves to determine correctness,” related to how or why a student solution was correct. When coding the teacher questions, I paid attention to student responses. I assumed teachers used discourse to accomplish a goal. 
Therefore, what the student said was important to what the teacher accomplished with the posed question. The student responses for Code 1 were generally short or described student work. To further explain Code 1, I provide an example of teacher and student discourse in Figure 13.

Figure 13 Code 1 Sub-Codes with Teacher and Student Talk

Code 1a — Teacher Talk: So first of all, do we agree with the fact that it is increasing? Student Response(s): Yes.
Code 1b — Teacher Talk: Who remembers what's another way? Student Response(s): If the x is going up and the y is going down?
Code 1c — Teacher Talk: What do you think about what Student 14 is saying, Student 32? Student Response(s): That's Zug. Yup.
Code 1d — Teacher Talk: Do you want to do the three and four? Student Response(s): I did this at the start of the equation and this at the end of the equation. And then drew a box and that was the change between the two numbers.
Code 1e — Teacher Talk: Check in with your table partner. Student Response(s): This question was given to the class. There was no distinct student talk; however, students began discussing with their table partner before asking the teacher a question.

For example, when I applied Code 1a, students typically stated if they agreed or not, without an explanation about why or why not. Longer explanations occurred, such as the example for Code 1d, when a student described their mathematical work. I interpreted the student's response as a description of her mathematical work because she shared what she did at the start of the equation without explaining why or how, continuing in this way throughout her entire explanation. For Code 1b, the student shared what he thought was a different way to tell if a table represented an inverse relationship. I interpreted his explanation as Code 1b because he failed to compare the two ways or move to a more conceptual understanding. I found a few of the sub-codes harder to differentiate, such as Codes 1a and 1c. For example, the student response, "That's Zug," represented what the student thought. The student probably agreed with the previous student; however, the response was not explicit. In general, when I applied Code 1 to a teacher question, the student response described what a student thought without expanding upon why or how they reached their conclusion. Additionally, as I listened to and interpreted Code 1 teacher questions and student responses, I heard teacher discourse moves that created opportunities for students to engage with another's thinking or asked students to revoice another's thinking. Discourse moves like these were the focus of Heather's PD. For all five codes, I tried to differentiate the sub-codes to reflect the slightly different student response; however, my later analysis only paid attention to the larger codes (Codes 1-5). The sub-codes reflected the overall code (Codes 1-5) and those overall codes were used for analysis. Similar explanations for Codes 2-5 follow. Figure 14 displays examples of Code 2 when student talk moved beyond describing what they did and to why or how their solution was correct.

Figure 14 Code 2 Sub-Codes with Teacher and Student Talk

Code 2a — Teacher Talk: Why did you choose those numbers to look at? Student Response(s): I chose 5 and 7 because they don't have decimals so I think they would be easier to divide and add and stuff.
Code 2b — Teacher Talk: How do you know that? Student Response(s): Because there's a 5 right there and in centimeters and on the other it's like twice as big at 7 centimeters.
Code 2c — Teacher Talk: Well, we could do this, right? Student Response(s): Yeah.
Code 2d — Teacher Talk: Is Lug a line? Student Response(s): No. (Students turned to the coordinate plane with an image of the Lug figure.)
Code 2a was applied when students described how they reached a solution. A teacher asked, "Why did you choose those numbers to look at?" (personal communication, December 12, 2018) and a student responded, "I chose 5 and 7 because they don't have decimals so I think they would be easier to divide and add and stuff." The student provided an explanation as to why he chose those specific values. Code 2b described how a student reached a conclusion beyond describing the procedure or steps. The student response for Code 2b described how she knew the measurement of the image doubled. Even though the student gave an incorrect mathematical explanation, the teacher question prompted a student response which included an explanation of how the student reached their answer. Code 2c may seem similar to Code 1a; however, the context of previous teacher questions and student responses suggested discourse was about how the student arrived at a solution. The teacher and students were engaged in conversation to make sense of and determine a correct solution. When the teacher asked the question, "Well, we could do this, right?" (personal communication, December 12, 2018) the student response indicated that the student understood, and the explanations made sense. Code 2d incorporated a model or visual image that students used to check their answer to determine if they were correct. Students in the various classes used chip boards, number lines, tables, graphs, coordinate planes, and figures to check their solutions. In the provided example, the teacher helped students understand how to graph an enlarged figure on a coordinate plane using a symbolic rule. Students were unable to determine and locate the correct points. To check the correctness of their previous answer, that Lug was a line, they turned to a page with an image of Lug. Looking at the figure Lug, they corrected their previous answer. In general, Code 2 was about students making sense of a student answer. Typically, when a student responded to a how or why question, they made sense of mathematics and moved beyond describing mathematical work. Code 3 (Figure 15) extended beyond a single student solution. Students began to think about if there were other examples that supported the solution or counterexamples. I saw Code 2 as determining correctness of a student response whereas Code 3 was thinking more abstractly to say something about a broader mathematical concept. When I applied Code 3a to teacher and student talk, the students commented on a broader mathematical topic.

Figure 15 Code 3 Sub-Codes with Teacher and Student Talk

Code 3a — Teacher Talk: Did anybody not get the same answer for two and three when you did the chips? Student Response(s): Absence of student indication of a counterexample. Prior to this question, a student had presented an idea. The student noticed that adding black chips was the same as removing red chips.
Code 3b — Teacher Talk: So would that work? Student Response(s): Yes. Can you tell me more about that? So, when you subtract them, how do you know what number is first?
Code 3c — Teacher Talk: How can we test that idea out? Student Response(s): We could put this in a copy machine. We could measure another side.
Code 3d — Teacher Talk: Oh, where they eliminate or just cancel each other out you're talking? Student Response(s): Eliminate where there is a black and red chip.

For example, a student presented the mathematical idea that adding black chips was the same as removing red chips. Following up on the possible pattern, the teacher asked students for a counterexample, specifically for the completed class problem examples.
I indicated the absence of a verbal counterexample as a student response. For Codes 3b and 3c, I included several student responses because more than one student response followed the teacher question. When the teacher asked, "So would that work?", the teacher asked students to respond to an alternative solution path. Follow-up student conversations determined if the suggested solution path would work in the current and additional examples. Students did not just evaluate one answer or compare two strategies; instead, they tried to determine if the proposed solution pathway was valid for all cases. Lastly, Code 3d was applied when students referred to an assumption. For example, as students were learning how to add and subtract integers with a chip board, they referred to the assumption and mathematical concept of zero when talking about eliminating or canceling chips. Specifically, they noted one positive and one negative equaled zero. In general, I applied Code 3 when the teacher and students talked about several examples (beyond one or two student answers) to determine if their thinking applied to multiple cases. In these conversations, patterns or general algorithms were not yet discussed. I applied Code 4 (Figure 16) when students noticed patterns, generalized, and predicted. Teachers and students discussed patterns, general claims, or algorithms to develop conceptual understanding. Bullet points from the original document were collapsed because I thought they overlapped. For example, when students tried to determine an algorithm for adding integers, they engaged in prediction as a way to make conjectures and check their algorithms. I use the word conjecture like Mason (1998), Lampert (2001), and Brodie (2007), who argue conjectures are all contributions to the conversation which are open to investigation, discussion, critique, and revision. Student predictions also fit this description (seen in Brodie, 2007). Students wrestled with the larger mathematical concept of integer addition beyond a specific example or two. Students invented their own algorithm for integer addition. Teacher and students looked at patterns, determined why they may get the same answers for different methods, and made generalized algorithms.

Figure 16 Code 4 Sub-Codes with Teacher and Student Talk

Code 4a — Teacher Talk: So who thinks they can explain to the whole class what the pattern is and how you predict what will be your sum? Student Response(s): All positive numbers plus positive numbers equal positive numbers. Negative numbers plus negative numbers equal negative numbers.
Code 4b — Teacher Talk: So if you had to tell [someone], he's going to walk in here and you had to tell him how to use chips to subtract, how would you word that? Student Response(s): You'd probably tell him first that if they're paired together, two positives or two negatives, they're [inaudible], then you'd add zeros so you have extra positives or extra negatives.

Figure 17 provides examples of Code 5 when the teacher question and student responses made explicit mathematical connections to students' experiences with previous tasks or strategies. Typically, this meant students were comparing similarities or differences with previous tasks.

Figure 17 Code 5 Sub-Codes with Teacher and Student Talk

Code 5a — Teacher Talk: How does this graph compare to the graphs of the exponential functions and Investigation 1? Student Response(s): They are similar because they are using multiplication and they are all increasing by a different amount.
Code 5b — Teacher Talk: But Student 38, you're saying now it might be different? Student Response(s): You take away two numbers because you don't always have to take away the smaller number from the bigger.
In the first example, a student compared the similarities of a current task to one in the previous investigation. Students also related different tools, such as money, number lines, and chip boards to current strategies. Teachers and students also recalled previous strategies, such as subtraction, and compared how they used it in the past to how they used it for the current task. In the example, the student clarified a previous student response that subtraction with whole numbers used to be taking away a smaller number from a larger number but was not the case with integers. In general, Code 5 highlights a connection between tools or strategies and not just the ability to solve a problem with a strategy or tool. What emerged were the codes in Figure 18 that described my interpretation of the purposes teacher questions served: 1) To make sense of students' mathematical work, 2) To rely on themselves to determine correctness, 3) To learn to reason mathematically, 4) To learn to conjecture, invent, and solve problems, and 5) To connect mathematics and its applications. I used key questions to identify each of these five categories and describe the types of questions teachers asked to help students and teachers make sense of students' work, determine correctness, learn to reason mathematically, learn to conjecture, or connect mathematics.

Figure 18 Final Teacher Question Codes 1-5

1. To make sense of students' mathematical work (What)
a. Do you agree (disagree) with this?
b. Does anyone have the same answer but a different way to explain it?
c. What do you think about what she just said/did?
d. Can you convince us that that makes sense? Explaining what is on their chip board. What is the answer? What did you say/do? What is the value? Questions about student work/response. Where is…
e. Would you ask/tell that to the rest of the group/class?
2. To rely on themselves to determine correctness (Why or How)
a. Why did you do that?
b. How did you reach that conclusion?
c. Does that make sense?
d. Can you make a model to show that with a chip board, number line, table, graph, or equation?
3. To learn to reason mathematically
a. Can you think of a counterexample?
b. Would that work?
c. How can you test that?
d. What assumptions are you making?
4. To learn to conjecture, invent, and solve problems
a. Do you see a pattern?
b. What decision do you think he should make?
5. To connect mathematics and its application
a. Have you ever solved a problem like this before?
b. What strategy/tools would you have to use…?

Coding teacher questions. Once I determined the teacher questions and codes, I listened to audio recordings from the teachers' lapel microphones to code each question. The lapel microphone recorded student talk when the teacher interacted with an individual or a small group of students. For whole-class discussion, if I was unable to hear student talk on the lapel microphone, I watched video recordings to hear student responses. When needed, I used additional small group audio and video recordings. At times, I was unable to hear student explanations on any audio or video recording. In those instances, I interpreted the teacher question as what I thought the teacher intended. I paid attention to both the teacher question and student responses to best capture the interaction; however, I interpreted both teacher and student talk.
What the teacher intended or heard and what the student intended and heard were not included in my data. For each teacher question, I applied one code; the only exception was Code 5, which could be applied as a second code. In these instances, a teacher could ask a student to determine correctness by drawing from previously learned knowledge. For example, students worked with number lines and chip models prior to articulating addition and subtraction algorithms for integers. When a student asked about the correctness of their algorithm, the teacher asked how to check using a number line or chip board model. I coded the teacher question as both Code 2 (to rely on themselves for correctness) and Code 5 (to connect mathematics and its applications). I also thought of the first four codes (Codes 1-4) as serving different purposes, such as a more procedural or conceptual understanding. I ordered the codes in this way because the teachers tended to ask questions about student work as a starting point, not because such questions were the least important. If a question asked students to make sense of students' mathematical work (Code 1) and referred to the assumption about adding zeros (Code 3), then I only coded for the assumption about adding zeros (Code 3). In general, I thought of the first categories as more procedural and the latter as more conceptual; however, this was not a simple or distinct difference. Therefore, if a question reached a Code 3 or 4, then I used that code. Again, the overall code (Codes 1-5), and not the sub-codes, were tallied and analyzed. Inter-rater reliability of teacher questions coding. In order to check the validity of my codes, I asked a colleague to code a subset of my data to establish inter-rater reliability (IRR). When revising my codes, I tried to describe and refine them so another individual could apply the codes on a set of teacher questions. For IRR coding, I selected a classroom observation and a set of teacher questions which exemplified all five of the code categories. During my pre-IRR coding, this particular classroom observation included codes from all categories and the lesson included all the phases of enactment. During the observation, the teacher launched the task, allowed time for students to explore, and facilitated a summary. For some observations, the teacher ran out of time to engage with students in all three phases. Most commonly, the teacher did not have time to complete the Summarize phase during an observation and began the following day with a summary of the previous day's lesson. My colleague, a fellow mathematics education Ph.D. candidate who is familiar with scaffolding, cognitive demand, teaching, observing instructors, and mathematics education research practices, coded every teacher question I identified during the selected classroom observation. Our initial IRR was 56%. Because our agreement was only 56%, we met to discuss how each of us had coded specific examples. When comparing our initial coding, we noticed a teacher question about making sense of mathematics (Code 1) could also describe the other codes. For example, when a teacher asked a question about determining correctness (Code 2), my colleague interpreted that as making sense of mathematics (Code 1). Our discussion centered around the distinctions between Codes 1 and 2. We determined that when we applied Code 1, the teacher questions asked students to explain their mathematical work.
To reflect this distinction, I changed Code 1 to "to make sense of students' mathematical work." At times, students explained their mathematical work in order to answer a Code 2, 3, 4, or 5 question. We discussed the purposes of codes so that if a teacher question had several codes, Codes 2-4 took precedence. After our discussion, we agreed 94% of the time. The instances where we did not find consensus exemplified a main challenge for coding, which I detail in the next section. We both encountered a challenge of how to know if a teacher asked students to explain their mathematical work or to determine correctness. All of the coded items that we disagreed about related to this challenge. Specifically, of the 11 disagreed upon codes, 10 (91%) of them occurred when my colleague coded the questions as determining correctness and I coded them as asking about students' thinking. When a teacher asked a question that seemed to desire a response of agree/disagree, my colleague used the code "to determine correctness." We thought that if a student agreed or disagreed, then their reasoning followed a course of evaluating the prior statement or question for correctness. When a student included a justification, we found no disagreement and we both coded the teacher question as evaluating for correctness. If the student only responded agree/disagree without a reason for why, then I did not assume the student determined correctness before answering. I could not rule out that other factors, such as pressure to agree or a desire to avoid explaining or defending their reasoning, contributed to their answer of agree/disagree. Without the students' explicit explanation of why they agreed or disagreed, I did not know the reason for their response. Therefore, I decided to code questions which yielded responses of agree or disagree without justification as Code 1 and not Code 2. The one other disagreed upon code (1 of 11, 9%) occurred for the question, "Do you guys agree that the red and black, those two and two, that is zero?" My colleague coded the question as determining correctness (Code 2) and I coded the question as asking about a key assumption (Code 3). In this case, even though we confronted the same challenge as described above with the other 10 disagreements, this question related to the assumption that the same number of positive chips and negative chips have an overall value of zero. And as stated above, I used Codes 3 and 4 if they were reached instead of Codes 1 or 2. Identifying teacher ambiguous responses to student uncertainty. During the initial analyses of teacher questions, I noticed that teachers responded to student uncertainty with what I interpreted as ambiguous responses, such as "I don't know" (Lessons 2, 5, 7, and 12), "That's not a question for me" (Lesson 12), or "Interesting" (Lessons 1, 2, 5, 7, and 12). Written CMP tasks encourage students to explore, discover, and conjecture about mathematics. When enacted, students who explored, discovered, or conjectured also encountered uncertainty and asked for help. Teachers' ambiguous responses were another way they scaffolded when students asked for help (RQ2). I searched the transcripts for the phrases above to identify and mark instances as "uncertainty" when the teacher responded with one of these ambiguous responses. I returned to recordings and transcripts and listened to and read the interactions prior to and after the marked "uncertainty." Specifically, I read the student talk to understand the context of the interaction.
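As a rough illustration of this identification step, the search amounts to scanning each teacher turn for one of the ambiguous phrases and keeping the surrounding turns so the context can be read. The sketch below is a minimal, hypothetical version: the transcript structure, speaker labels, and example turns are illustrative stand-ins rather than the study's actual files or complete phrase list.

```python
# Minimal sketch (not the study's actual tooling): mark teacher turns that
# contain an ambiguous phrase and keep nearby turns for reading the context.
AMBIGUOUS_PHRASES = ["i don't know", "that's not a question for me", "interesting"]

def mark_uncertainty(turns, window=3):
    """turns: ordered list of (speaker, utterance) pairs from one lesson transcript."""
    marked = []
    for i, (speaker, utterance) in enumerate(turns):
        if speaker == "Teacher" and any(p in utterance.lower() for p in AMBIGUOUS_PHRASES):
            context = turns[max(0, i - window): i + window + 1]
            marked.append({"turn_index": i, "teacher_turn": utterance, "context": context})
    return marked

# Hypothetical example turns, invented for illustration only.
example_turns = [
    ("Student", "Is the answer always going to be negative, right?"),
    ("Teacher", "Interesting. Huh? I don't know."),
    ("Student", "Let me try it on the chip board."),
]
print(mark_uncertainty(example_turns))
```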
I looked for an outcome, whether the student uncertainty was resolved or not. Then, once the marked instances were separated into two categories, I reviewed the context of each instance to look for patterns. When student uncertainty was resolved, analysis suggested teacher involvement occurred on a continuum. Looking at instances when student uncertainty was resolved, the teacher supported students to resolve the uncertainty. In some instances, the ambiguous 69 teacher responses acted like questions to push students to explore additional examples and agree on a solution. Once a student shared an additional example or strategy, students agreed relatively quickly on a solution. For example, during a whole-class discussion, a teacher asked students if they expected all the chip boards representing a given value to look the same. Students answered, “yes” and “no.” The teacher responded, “Maybe, maybe not” (personal communication, October 24, 2018). The teacher was not responding to a student question, instead it revoiced students’ responses in a form that acted like a question. The teacher’s phrasing reframed “yes” as a “maybe” and “no” to “maybe not,” and made the answers less certain. The teacher acknowledged and questioned both answers. The students responded by sharing their chip boards to show chip boards representing a given value did not need to look the same. Other instances were similar where the teacher asked questions so students would explain more of their mathematical thinking and agree on a solution. For example, students shared explanations of their chip board value to Sophie; however, the students knew their value was incorrect. To maintain cognitive demand, Sophie minimized her involvement by refraining from offering an explanation that would focus on the production of a correct answer (Smith & Stein, 1998). Instead of clarifying the students’ uncertainty, she responded with, “Oh shoot. We have to have a starting value of negative four” (personal communication, October 26, 2018). Sophie interpreted the students’ uncertainty to be about how to get a starting value of negative four. Her response confirmed they needed to reconsider their starting value and provided students with the opportunity to explore and develop a mathematical understanding. Instead of Sophie resolving the uncertainty, another student in the group explained a way to 70 show negative four on the chip board. In cases like these, a student explanation resolved the uncertainty with minimal teacher involvement. At other times, teachers spent extensive time questioning students or offering hints until students reached a resolution. The student voiced the resolution but only with increased teacher support. For example, a student attempting to measure the side length of a figure had trouble reading the indicated measurement on the ruler. The teacher said, “Let’s look at that. Let’s use the ruler to help us.” (personal communication, December 11, 2018). The teacher’s response served the same purpose as a question by providing an opportunity for the student to re-examine his answer that two-fifths was halfway between two and three. The teacher offered directions (e.g., “Show me where halfway between.”), asked a question (e.g., “So it is just half?”), and provided an explanation (e.g., “This is half to me.”) while the student took out a ruler to show what he thought. By the end of the interaction, the student resolved his uncertainty and continued to work. 
And lastly, there were instances when neither students nor the teacher resolved uncertainty. In these instances, students were typically left with unconfirmed conjectures. Teachers provided responses like, “Interesting. Huh? I don’t know” (personal communication, October 23, 2018). In one particular example, students made a conjecture and then added the word “right” at the end of the sentence with an upward inflection. The inflection suggested students were unsure of their response and wanted verification. Instead, the teacher asked the students to prove it with the chip boards, thereby providing a response without resolving the students’ uncertainty. Similarly, another teacher responded to a student by saying, “So let’s see how we follow along with the class discussion today and see if you had that same idea or [a] 71 different one at the end of class” (personal communication, December 12, 2018). In both cases, the teacher neither verified nor denied the students’ conjectures, yet provided them help for a possible next step. My findings chapter on student uncertainty and ambiguous teacher responses delves deeper into specific examples to highlight details such as teacher involvement, mathematical context, and a possible purpose of ambiguous teacher talk. I did not use IRR for teacher ambiguous responses to student uncertainty because I selected specific examples to describe a phenomenon of scaffolding during cognitively demanding tasks instead of identifying and coding every occurrence. Identifying salient student episodes. Salient student episodes during cognitively demanding tasks highlighted how teachers mediated frustration and continually balanced cognitive demand and scaffolding to encourage student perseverance. These episodes provided examples of student reactions and experiences during cognitively demanding tasks (RQ 3). During observations, I kept a journal about what I noticed in real time. While observing students, I noted times when a student voiced frustration while interacting with the teacher. I watched and listened for students who stated frustration, resistance, or disinterest (e.g., I’m frustrated, I don’t understand, I don’t know, etc.) or showed nonverbal gestures indicating frustration (e.g., hands on their foreheads with shaking heads, tears, etc.) while interacting with the teacher. Additionally, I identified specific students the teachers spoke about during post- observation interviews. Teachers talked about students who had grasped how to do the task the previous day yet were confused the next day. Teachers tended to reflect about their choices of when to stay or leave and what to say or not to say when a student was struggling to 72 the point of showing visible signs of frustration. If a student met either criteria, I considered it a possible salient episode and went back to the recordings and transcripts. For each episode, I transcribed teacher and student talk. Then, I grouped the episodes by student in chronological order. After compiling complete transcripts for each student, I selected episodes where students expressed frustration, either with the specific words or with their inflection and gestures. I then looked for ways teachers consistently responded during their interactions with particular students (RQ3). For similar reasons as teacher ambiguous responses, I did not use IRR for student salient episodes. 
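A minimal sketch of the grouping step described above, with invented record fields and values standing in for the observation notes, shows one possible way to collect candidate episodes per student in chronological order.

```python
# Minimal sketch: group candidate episodes by student, in chronological order.
# Field names and sample values are invented placeholders, not study data.
from collections import defaultdict
from datetime import date

episodes = [
    {"student": "Student A", "date": date(2018, 10, 24), "note": "voiced frustration during Explore"},
    {"student": "Student B", "date": date(2018, 10, 23), "note": "nonverbal signs while teacher present"},
    {"student": "Student A", "date": date(2018, 10, 23), "note": "said 'I don't understand' to teacher"},
]

by_student = defaultdict(list)
for episode in episodes:
    by_student[episode["student"]].append(episode)

for student, student_episodes in by_student.items():
    student_episodes.sort(key=lambda e: e["date"])  # chronological order within each student
    print(student, [e["date"].isoformat() for e in student_episodes])
```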
Note About Findings Chapters The following three research questions guided my study: 1) How do teachers think about the relationship between scaffolding and maintaining cognitive demand?, 2) In what ways do teachers provide scaffolding through discourse?, and 3) What are students’ reactions and experiences? As I collected and analyzed data, what emerged were different contexts where teachers provided scaffolding during cognitively demanding tasks in ways that mediated tension between scaffolding and cognitive demand. To answer these research questions, I begin with RQ2 by describing two ways in which teachers provided scaffolding through discourse during different phases of a lesson. The first and second findings chapter on teacher questions and ambiguous teacher responses, respectively, share classroom observation data supplemented by teacher interviews to report on teachers’ scaffolding trends. In Chapter 4, I argue different phases of a lesson provided different contexts. I begin with the Launch and Summarize phase to show the nature of teacher questions started with Code 1 and moved to Code 4. Then, I focus on the Explore phase when 73 teachers spent extensive time responding to students. In Chapter 5, I use the context of student uncertainty to show how teachers mediated tensions between scaffolding and cognitive demand. I answer RQ3 in the third findings chapters by sharing mainly classroom observations of specific student salient episodes to illustrate student responses and experiences. To include teacher reflections about these student salient episodes, I supplemented classroom observation data with teacher interviews. In Chapter 6, I argue teachers mediated tensions between scaffolding and cognitive demand in the context of student frustration. RQ1 cuts across all three findings chapters and relies heavily on teacher interview data. The ways in which teachers scaffolded through teacher questions and ambiguous answers and interacted with students who struggled spoke to how they thought about the relationship between scaffolding and maintaining cognitive demand. Issues relating to the findings and how they viewed the relationship were integrated into the classroom observations and teacher interviews. Therefore, I return to RQ1 at the beginning of my conclusion. For Chapter 4, on my analysis of teacher questions, Sophie dominated the data because all of the teachers’ questions were included within the data set. I observed Sophie for the most classes, so she appeared the most in my data. The unequal amount of observations influenced all of my findings chapters; however, I included Heather and Austin when I found a particular example from one of their classroom observations or interviews that represented a general trend. For Chapter 5, Heather discussed the role of prediction explicitly during her interviews and classroom observations, so I included her examples as they relate to uncertainty and foreshadowing of upcoming content. For Chapter 6, on student salient episodes, I used audio 74 and video recordings to select student episodes that represented a frustrated student. Madison and Tate were students in Sophie’s class and represented the most extreme and least common representation of student frustration. These two examples were the only observations where I witnessed students crying. Austin appeared less in earlier findings, so I included Leonore and Evelyn from Austin’s classroom who represented student reactions in all three teachers’ classrooms. 
Students who were disinterested or resistant to participation were more common and usually represented a few students per classroom.

CHAPTER 4 TEACHERS PROVIDED SCAFFOLDING BY ASKING QUESTIONS

This section reports on findings relative to RQ2. Classroom video contained observations of the enactment of CMP tasks in middle grades classrooms. Teacher scaffolds included teacher questions. I examined teacher questions by creating a spreadsheet of teacher questions with contextual markers, such as focus questions and lesson phases. My analysis of discourse focused on the nature of the questions, the number of questions compared to the number of statements, and how many questions occurred during each phase. Patterns were examined using my coding scheme (Codes 1-4) and frequencies. Findings revealed that: 1) The nature of the questions that teachers posed to students tended to begin with Code 1 and move toward Code 4, 2) Teachers posed more questions than statements to students as they explored mathematics tasks in their small groups, and 3) The nature of questions asked was unequally distributed among the phases.

Claim 1: Trends Identified in How Teacher Questions Unfolded

My first result compared the number of questions within a code in a particular phase to the total number of questions in that phase. Specifically, I examined the Launch and Summarize phases because the teacher asked questions to the whole class during these phases. Differing from the other two phases, during the Explore phase, a teacher moved from group to group and could ask the same question to different groups. I could not tell if questions were repeated to multiple groups. Comparing the distribution of questions within a code during the Explore phase did not give a distribution of questions within this code in the same way as the other two phases. In Table 3, percentages for a given question code are reported for the Launch and Summarize phases for every lesson. Sophie's data is given for Lessons 1–8, Austin's data is given for Lessons 9–10, and Heather's data is given for Lessons 11–12 (this remains true for all Lesson data references).

Table 3 Comparison of Launch to Summarize Within a Code Category
(Table 3 reports, for each lesson, the percentage of Launch-phase questions and the percentage of Summarize-phase questions that received Code 1, Code 2, Code 3, and Code 4. Percentages for Lessons 7 and 8 that ran counter to the overall trend are marked with an asterisk for Code 1 and a double asterisk for Code 4. Note. Two lessons did not follow the same pattern as the other lessons and are explained in subsequent text.)

Looking at Table 3, the percentage of questions pertaining to the code "to make sense of students' mathematical work" tended to decrease from the Launch phase to the Summarize phase. Alternatively, the percentage of questions pertaining to the categories "to learn to reason mathematically" and "to learn to conjecture, invent, and solve problems" tended to increase from the Launch phase to the Summarize phase. Although this trend applied to many of the lessons, it did not hold for all lessons; two were of particular interest.
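The percentages reported in Table 3 amount to a per-phase tally over the coded spreadsheet rows. A minimal sketch of that tabulation follows; the rows and field names are invented placeholders rather than the study's spreadsheet, and only the counting logic is illustrated.

```python
# Minimal sketch (invented rows, hypothetical field names): for each lesson and
# phase, report the share of questions that received each code.
from collections import Counter, defaultdict

rows = [
    {"lesson": 1, "phase": "Launch", "code": 1},
    {"lesson": 1, "phase": "Launch", "code": 2},
    {"lesson": 1, "phase": "Summarize", "code": 4},
    {"lesson": 1, "phase": "Summarize", "code": 1},
]

counts = defaultdict(Counter)
for row in rows:
    counts[(row["lesson"], row["phase"])][row["code"]] += 1

for (lesson, phase), code_counts in sorted(counts.items()):
    total = sum(code_counts.values())
    for code, n in sorted(code_counts.items()):
        print(f"Lesson {lesson} {phase}: Code {code} = {100 * n / total:.0f}% of {total} questions")
```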
In Table 3, for Lessons 7 and 8, the percentages increased from Launch to Summarize phases for “to make sense of students’ mathematical work,” denoted by an asterisk. The percentages for “to learn to conjecture, invent, and solve problems” decreased, denoted by a double asterisk. For some individuals, this result may be troublesome because teachers asked questions about conjecturing before students explored mathematical ideas; however, the context in which discourse occurred was important. Perhaps, the teacher had prior knowledge about the students and thought they had sufficient knowledge to conjecture before exploring. Student responses could inform the teacher’s questioning during the Explore phase. Or, the difference in questioning may be related to a lesson’s focus question (Vale, Widjaja, Doig, & Groves, 2019). I chose to focus on the relationship between the focus question and teacher questions. For each class, Sophie used the focus question as the “big idea” and a way to introduce the lesson. The two lessons of interest gave, “How can you predict whether the result of an addition problem is going to be positive, negative, or zero?” as the focus question. Sophie’s data in Table 3 shows an increase in questions about students’ mathematical work from Launch to Summarize. For Sophie, I coded 28% of the Launch phase questions in Lesson 7 as asking about students’ mathematical work, which increased to 56% of the Summarize phase. Similarly, for the second instance for Sophie (Lesson 8), I coded 6% of the Launch phase questions as asking about students’ mathematical work, which increased to 54% of the Summarize phase. Along with the increase for that particular teacher question code from Launch phase to Summarize phase, the reported percentages for teacher questions about 78 conjecturing decreased, 39% to 20% and 81% to 18%, respectively. The trend in how the percentages changed from the Launch phase to Summarize phase is important. In these instances, Sophie taught the same lesson back-to-back and she acknowledged that she had slight changes between how she enacted the tasks in the two classrooms. For example, when the first class had difficulty with a certain aspect of the lesson, she changed or spent more time providing scaffolding to the whole class. Sophie’s questioning changed for those two lessons compared to the other lessons I observed in her classroom. As part of contextualizing the task during the Launch phase, the teacher and students spent more time thinking about predicting than in other lessons. During an interview, Sophie also discussed her challenges for that lesson and, in particular, her struggle to help students understand and see the pattern being presented in the curriculum to help them make predictions. “I know that trying to get them to figure out what the group problems have in common because then they go and solve them and think it’s all because, oh, they’re going up by tens or the numbers, what are they, like two, two, and eight and two” (personal communication, October 29, 2018). Her increased amount of time spent with whole-class discussion in the second class represented a slight change from one class to another. She provided more guidance to get students started on observing the pattern in the first group of expressions before letting them work in groups during the Explore phase because “that’s a struggle every year…What about the problems? What did they have in common? 
That always seems to be a struggle.” Sophie asked more questions about potential patterns students saw and what that may help them predict before letting them explore in their small groups. During her interview, she stated: 79 I was like, who sees something to kind of get the ball rolling in that direction to kind of divert from all the, oh, they’re all even…I’ll have them predict without using chips or number lines, what you know, but then they still are on the, oh well I got a positive or negative. They’re still not seeing that, what does the group have in common? (personal communication, October 29, 2018) Sophie tried to alleviate the challenge of identifying a group’s common characteristic by asking more questions during the Launch phase with the emphasis on prediction. I found similar patterns about the nature of teacher questions posed during whole-class discussions. While engaging with students about proportional reasoning during (personal communication, December 11, 2018), Heather asked the questions shown in Figure 19. Figure 19 Example of Heather’s Teacher Questions Unfolding and Code Pattern Emerging Teacher Question This group right here, give me one way you talked about comparing four times larger? Can you explain how you got 400%? Student 2, what did your group actually, Student 10, share out what you guys had, something different? What do you guys think about that? Student Response(s) 400% Key Phrase What is the answer? Code 1 2 1 2 How did you reach your conclusion? What is the answer? What do you think about what they said? 100% is the entire thing. So 400% is 4 of the entire thing. If you multiply the image by 0.25, you get the original. I agree because if you make like quarters, like 25 cents, one quarter is one- fourth of a dollar. So then if you multiplied a quarter by 4 it would be 4 times bigger. And if you divided by 4 it would be [inaudible]. 80 Figure 19 (cont’d) Okay, so how come, how does that number that she's giving us here that decimal, 0.25 or 25 hundredths is different than four? How does that still make sense? So how can we say 4 times larger and then one-fourth of it? Do those mean the same thing? So what's, what, what are we doing here that's making those two statements still make sense? What are we comparing when I say 4 times larger? When we talk about one-fourth, what are we comparing? 0.25 is one-fourth of like a full 100 which is 1. So you just basically divided it by 4. Would that work? Why are you getting the same answer? No audible student response. No audible student response When you’re taking the 4 times larger and you’re and multiplying by one- fourth like one-fourth of this. When you’re multiplying the image, you’re not multiplying by a fraction. So it’s getting larger. The image to the original. What is the pattern of comparison? The original to the image. What is the pattern of comparison? 3 3 3 4 4 4 Analyzing Heather’s discourse, I noticed the posed teacher questions began with Code 1 and moved toward Code 4. Heather’s first posed question elicited a student answer. Once the student stated, “400%”, Heather asked a follow-up question to encourage the student’s explanation of their answer. Before discussing the student’s answer, Heather elicited another student to share their answer. Heather’s beginning questions, both about eliciting and pressing student contributions, resulted in students sharing answers and explanations. Her discourse moves introduced two student answers and explanations into the whole-class discussion. 
Once answers were given and a student voiced a way to determine correctness, Heather invited 81 other students to engage with two students’ mathematical ideas. Heather did not confirm or deny whether the students’ explanations were correct. Instead, she maintained both student answers and explanations by keeping them in the “public realm for further consideration” (Brodie, 2011, p. 180). Another student responded with agreement and an explanation. The student’s agreement validated both answers yet did not quite explain why or how both were correct in the context of the task. Heather pushed students to connect their thinking to the original figure and image when she asked, “What are we doing here that’s making those two statements still make sense?” Heather inserted or “[added] something in response to the learner’s contributions” (Brodie, 2011, p. 180). Heather’s inserted information pushed the student to specify how previous student talk of one-fourth and four times the size related to the starting and ending figure. Heather confirmed the students’ shared mathematical thinking and used their language of one-fourth and four times larger. Student responses to Heather’s last two questions involved the whole classroom. One student began the conversation with an answer, several contributed through the discussion, and the whole class had the opportunity to engage with the larger mathematical concept of scaling up and down. I selected this example to highlight a pattern I observed in multiple instances. All three teachers valued student talk to make sense of mathematics and used questions to encourage student talk. Sophie and Austin echoed Heather’s stance that she talked “a lot about [how] it’s okay if you don’t know the right answer. That is probably the least important part of the process, but it’s just trying to make sense of an idea or a process” (personal communication, December 5, 2018). Heather emphasized the sense making of an idea or process that a student put forth for consideration. Sophie added emphasis that students took 82 an active role in sense making when she said, “So even like when they are presenting, I’m not here to critique it, you guys, it’s your job” (personal communication, October 23, 2018). During an observation, Austin shared his perspective with students on the importance of perseverance and reasoning when he said, “You grow from struggling a little bit and then persevering through it…Even if we don’t know the answer right off the bat. Okay. Give it your best effort. As long as you have something, you can back it up with reasoning” (personal communication, October 26, 2018). Later in the observation when a student did not think they could provide an answer, Austin responded, “Do you want [to] help us get started and maybe we can work it out together as a class?” Teachers and students collaboratively made sense of presented mathematical ideas. The teachers’ questioning approach resulted in discussions beginning with a student offering a mathematical idea or solution. The teachers tended to ask if students agreed or disagreed with the student’s work. As students discussed to critique the suggested solution, students raised questions about students’ work or offered alternative solutions. To wrap up discussions, teachers asked about observed patterns, how the different approaches compared, and why something was true. 
This general pattern of these types of questions across the Launch, Explore, and Summarize phases of the lesson were present within other lessons and across multiple teachers. Written curriculum context clues influenced teachers’ questioning trends. The written curriculum provided context clues to teachers. Specifically, the nature of suggested questions for the particular lesson phase and the focus question. A pattern emerged within the written curriculum of a lesson when I looked at suggested teacher questions, beginning with the Launch phase and ending with the Summarize phase (see Figure 6). 83 Launch phase. The Launch phase serves as a brief introduction to the problem. Questions such as, “What does ‘in the black’ or ‘in the red’ mean when describing finances?”, “What other ways do you know of to make a larger copy of something?”, or “Is the area of the plant growing exponentially?” introduce the context in which students will work (e.g., red and black chip boards, percent increase using a copier, and exponential growth of mold, respectively). During the Launch, the teacher and students contextualize the problem; however, the goal of the curriculum is not to unpack the mathematics, connect to prior knowledge, or relate to future knowledge. In other words, the Launch is meant to set the scene. These suggested teacher questions are most similar to Code 1 when students make sense of mathematical work. The written curriculum did not yet suggest students to determine correctness (Code 2), reason mathematically about multiple solutions (Code 3), or solve the problem (Code 4). Explore phase. During the Explore phase, students engage further with the task and additional scenarios. Suggested teacher questions encourage students to record and report their work while engaging in discussion with group members. Questions such as “How can you take away 3 chips when only 2 are on the board?”, “If your characters generally do not look like Mug, what could you check?”, or “What value is raised to the exponent? Why?” moves beyond describing the context or mathematical concepts. These suggested teacher questions engage students with different scenarios and push them toward determining if they are correct (Code 2) and why they are correct (Code 3). While students explore, the written curriculum encourages teachers to walk around and engage with small groups of students. During this time, teachers may stop to support students 84 who have questions. Teachers are encouraged to notice student work to ask clarifying and follow-up questions. Additionally, teachers note student work to present during the Summarize phase. Interactions during the Explore phase bridge the Launch and Summarize phases, occurring as students unpack the mathematics, reason mathematically, and move toward solving the problem. Teacher questions that encourage these student actions are similar to Codes 1-4. Summarize phase. Lastly, the Summarize phase allows students to share solutions and strategies, engage with another student’s thinking, discuss patterns, and make conjectures. Suggested questions include “What patterns do students notice when a chip board has the same total value, but the combination of chips are different?”, “How would you describe to a friend the growth of the figures that you drew?”, and “Suppose you started with 25 mm2 of mold and it grew in the same way that it did in the Problem. How would the equation change? How would the graph change?”. 
The suggested teacher questions are intended to help students enhance their conceptual understanding by asking students to make connections and refine generalizable problem-solving techniques and algorithms. These suggested teacher questions are similar to Code 5 which encourage students to solve problems and make generalized mathematical conjectures. The nature of unfolding Launch, Explore, and Summarize suggested questions. Applying Codes 1-4 across the table of suggested teacher questions, a trend of questioning emerged within the written curriculum. In general, the suggested Launch questions emphasize Code 1, Explore phase questions emphasize Codes 2 and 3, and Summarize questions 85 emphasize Codes 3 and 4. In addition to the lesson phases, the written curriculum includes a focus question as context. Focus question. The focus question serves as context to the written problem and may suggest the nature of questions included in the Launch phase (Vale et al., 2019). The Launch phase is the brief introduction when students are contextualizing the problem. For Sophie’s lessons, the focus question, “How can you predict whether the difference of two integers is 0, positive, or negative?” emphasizes prediction. As part of the Launch phase, students identify important mathematical concepts and key vocabulary, such as difference, signs (e.g., zero, positive, and negative), and prediction. With prediction as an integral part of the written curriculum context, teacher questions about prediction during the Launch phase are more likely. Teacher questions during enacted curriculum. The written curriculum reflects the intent of curriculum writers and not necessarily the reality of what happens during the enactment phase. Looking at the Launch phase of the enacted curriculum, teachers asked questions that encouraged students to make sense of mathematical work and introduced the context (Code 1). The nature of teacher questions posed changed throughout the of enactment of the lesson. Table 3 illustrates a general change from Launch to Summarize while Heather’s example shows how the nature of teacher questions moved from Code 1 to Code 4. I noted a difference when Sophie’s focus question pertained to prediction. Consequently, with more questions about prediction and conjectures allocated to the Launch phase, I saw a decrease in the Summarize phase. The enacted curriculum examples shed light on curricular considerations and how they impacted the posing of teacher questions throughout a lesson. The context clues 86 provided in the written curriculum influenced teachers’ questioning by suggesting questions of a particular nature for a given phase or focus question. Connections between curriculum and teacher use. I expected the written and enacted curriculum to differ because teachers adapt written curriculum to meet the needs of their students (Cirillo, Drake, Herbel-Eisenmann, & Hirsch, 2009), yet, I was also not surprised they shared similarities since teachers use written curriculum as a resource and guide that influences teachers’ instructional decisions (Robitaille & Travers, 1992). CMP is a problem-based curriculum that embeds sequences of contextualized problem situations that may make the mathematical understandings and reasoning less obvious (Edson, Phillips, Slanger-Grant, & Stewart, 2019). Teachers who have little experience as a learner with problem-based instruction may experience additional challenges for enacting the curriculum. 
To support teachers’ enactment of the curriculum, CMP intentionally includes suggested teacher questions as educative features. Since the suggested teacher questions occurred in the context of lesson phases and over the development of a task, it seems reasonable that teacher questions unfolded in particular ways, specifically moving from Code 1 to Code 4.

Connections between teacher questions and discourse moves. Similarities arose between the discourse moves and patterns I observed. Brodie’s (2011) findings compared teachers’ discourse moves as they related to reform practices. Heather’s example illustrates how Brodie’s (2011) discourse moves may also relate to the unfolding of questions. The teachers’ stance to make sense of mathematics together through a process focused on presenting students’ answers, critiquing one another’s mathematical work, and coming to a solution shares features with Brodie’s (2011) discourse moves, specifically elicit, press, maintain, and insert. When describing elicit, Brodie (2011) states, “while following up on a contribution, the teacher tries to get something from the learner. [The teacher] elicits something else to work on learner’s idea” (p. 180). Once a student shared their solution, the question Heather posed, “Student 2, what did your group actually, Student 10, share out what you guys had, something different?” elicited another student’s answer. When a student offered a solution, Heather asked a follow-up question that resulted in an explanation. This is similar to Brodie’s discourse move, press. What followed were teacher questions that maintained student contributions and inserted information into the discussion.

My teacher question trends in the Launch and Summarize phases also relate to Franke and colleagues’ (2009) finding that teachers struggled to ask follow-up questions after the initial questions that elicited students’ mathematical thinking. Sophie, Heather, and Austin used questions to elicit multiple solutions, maintain several student solutions, critique suggested solutions, and find agreement among students. The teacher question trends also addressed the variability of teacher questions after the initial question by looking at how they fit with Brodie’s (2011) framework and my coding scheme. Patterns emerged that suggested teachers asked questions for a specific purpose (e.g., elicit, press, maintain, and insert) that facilitated student discussion moving from a procedural to a conceptual mathematical understanding (e.g., Codes 1-4).

Claim 2: Teachers Posed More Questions than Statements

The previous section focused on the Launch and Summarize phases, or when teachers were engaged with the whole class. For my second claim, I focus on the Explore phase. Teachers posed more questions than statements, particularly during the Explore phase. Using my data set of coded teacher questions, I report low estimations in Table 4. The number of teacher questions for the 12 lessons ranged from 125 to 404 with a mean of 200.75 and a median of 197. A few reasons explain why the reported numbers are low estimations. First, the numbers represent the coded teacher questions, specifically those related to the focus question and mathematical content for the lesson. Still related to mathematical content, a portion of the class may have included discussion of the prior day’s homework. This issue mostly pertained to the first observation day with each of the three teachers.
Additionally, at times the teachers asked students about their comfort with mathematics or if they could complete a problem. Often, these questions were general and not included in my coding. And then there were times when the teachers engaged with students on a more personal level. With teacher questions dominating the classroom discourse, it was worth exploring the nature of questions in more detail.

Table 4
The Number of Questions Asked During a Lesson

Lesson   Teacher   Number of Questions Asked
1        Sophie    157
2        Sophie    201
3        Sophie    135
4        Sophie    200
5        Sophie    194
6        Sophie    192
7        Sophie    125
8        Sophie    204
9        Austin    212
10       Austin    171
11       Heather   404
12       Heather   214

With that goal in mind, my second result examined the relative frequency with which a code was applied during a phase relative to the lesson. In other words, I examined how often a code was applied in a particular phase compared to the lesson as a whole. For example, in Table 5, the phase in which the most questions occurred is reported for all the teachers. Of the 58 phases reported, I found 35 (60%) occurred during the Explore phase. (See Appendix H for a table with all the Explore phase percentages.) Overall, teachers asked the most questions during the Explore phase. This finding suggests that teachers were very interactive with students during the Explore phase.

Table 5
Explore Phase Percentages

Lesson   Code 1      Code 2    Code 3      Code 4      Code 5
1        Explore     Explore   NA          Explore     Launch
2        Launch      Launch    Summarize   Explore     Launch
3        Explore     Explore   Explore     Summarize   Explore
4        Explore     Explore   Explore     Summarize   Explore
5        Explore     Explore   Explore     Summarize   Launch
6        Explore     Explore   Explore     Summarize   Explore
7        Summarize   Explore   Summarize   Explore     Explore
8        Explore     Explore   Explore     Explore     Summarize
9        Explore     Explore   NA          Explore     Summarize
10       Summarize   Explore   Summarize   Summarize   Summarize
11       Launch      Launch    Launch      Explore     Summarize
12       Explore     Explore   Summarize   Explore     Explore

Students were not left to explore mathematics alone, but instead, the teachers used this time as an opportunity to see student work, listen to students explain and discuss mathematics, and interact with students who may have needed additional support. For example, during the Explore phase, teachers asked questions that emphasized student discussion and mathematical connections.

Teachers encouraged student discussions. During a post-observation interview, Heather shared strategies for engaging with students and encouraging student discussion. To help a student who had a question, Heather first asked if the student had discussed the question with their table partner or group. To engage more students and to emphasize students’ mathematical thinking, Heather explained her general approach for interacting with a table partner or group: “[student’s name] just came over to me with a question. What was the student’s question?” (personal communication, December 5, 2018). If the table partner or group members could not articulate the question, then Heather responded:

Oh, it sounds like [student’s name] has a question that maybe you guys should all talk about first. If you are still uncertain about how to move forward after you’ve discussed it together, then call me over. It’s all about what you guys think, I learned just as much from you as you will from me. (personal communication, December 5, 2018)

At other times, Heather took a different approach.
She described her approach in the following way: You walk by and you're trying to look at some of their thinking on paper because sometimes I think it's easier for them to be willing to show something on paper that I can sort of use as a launching point and have a conversation. (personal communication, December 12, 2018) Alternatively, for students who did not necessarily have a question but found a solution, Heather stressed “that even if somebody thinks they understand the problem really well,” she asked, “Can you think [of] another way that might also work?” (personal communication, December 5, 2018). All three examples illustrate how Heather engaged with students during the Explore phase in ways that encouraged students to discuss multiple student driven solutions. Heather tried: to push students that seem to work through something quickly. Okay, you’re not done 91 with this. Now, is there another way that we could have explored this idea? Is there another way that we could think about it? Or even when they were going over their work from…yesterday in a small group…I went back and said, okay, so let’s think about…I was intentional, I didn’t want to say what part was hard for you because I didn’t want the answer, ‘Well, none of it.’ Right. So I said if a student were to struggle, what do you think the most challenging part of this work would be and why?...So trying to direct them into digging deeper because they had the time while other students were finishing up. (personal communication, December 12, 2018) Heather’s description provides an example of how and why teachers spent considerable time questioning during the Explore phase. Teachers emphasized mathematical connections. Perhaps, a group was unable to make progress and called the teacher back for additional support. As shared during interviews, Sophie asked questions related to students’ mathematical thinking. A few examples were, “Can you show me?”, “What did you do before?”, and “Can you tell me what this (referring to students’ mathematical work) means?” (personal communication, October 23, 2018). As the teacher and students began their conversation, the teacher asked additional questions that drew on mathematical connections. Sophie’s students were learning about subtracting integers. When faced with difficulty understanding the concept, Sophie asked, “Think about this. If you owe me $3 and you make $5, do you still owe me $2?” (personal communication, October 26, 2018). Sophie reflected about why she introduced the connection of money into discussions: Up until then we’ve talked a lot about money, or they played the Math Fever game. Kids 92 connect with money pretty well. Hey, you owe me $20, you go make $30, what do you get? Things like that. So, asking them those questions to get them going. (personal communication, October 23, 2018) Sophie identified a related mathematical concept the student understood to connect it with the current concept of subtracting integers. Sophie also shared about a student who had tears of frustration during the Explore phase. Sophie said: I had the group [who] went back to the number lines. I was like, that’s awesome. Go for it. And one gentleman had tears of frustration at the end. I said, well what if we thought about money? You will make $4 and you take $2 from me as well, then I owe you $6. And he could do it there. So, the chips, some kids just cannot handle the chips…He knew it when I said it with money. 
He’s still, he goes, ‘You’re pulling, you’re pulling number out of thin air.’ I didn’t do anything. Your group said, well let’s add two red and two black to get the black on the board. When I said it as money as a whole, it’s negative six. Then why are you telling me it’s negative two when you were doing that with the chips? (personal communication, October 24, 2018)

Sophie asked questions to the whole group that encouraged mathematical connections between number lines, money, and chip models. Even when Sophie used student responses as explanations and made mathematical connections between the chip model and the student’s understanding of integer subtraction with money and number lines, he remained unconvinced.

Teachers balanced scaffolding with time to explore. Table 5 represents three teachers, each of whom I observed a different number of times. Keeping this in mind, I considered the teachers separately. Figure 20 provides a more detailed understanding of each teacher’s questioning during a lesson. For example, “Explore” appears 26 out of 39 times in Figure 20 for Sophie. So, the value 26 is reported in Figure 20 under “Frequency” in the “Explore” column. Both Sophie and Heather asked more questions during the Explore phase than either the Launch or Summarize phase, with 26 out of 39 and 5 out of 10, respectively. A slight difference becomes apparent when distinguishing Austin’s questioning from the whole group.

Figure 20
Occurrences of When the Most Teacher Questions Were in a Particular Phase

          Total Number of         Explore                   Summarize                 Launch
Teacher   Reported Percentages    Frequency   Percentage    Frequency   Percentage    Frequency   Percentage
Sophie    39                      26          67%           8           21%           5           19%
Heather   10                      5           50%           2           20%           3           30%
Austin    9                       4           44%           5           56%           0           0%

Unlike the other two teachers, Austin seemed to ask fewer questions during the Explore phase. In four of the nine reported cases, the phase in which the most questions for a code were asked was the Explore phase. So, by a small margin, he asked the most questions during a different phase, specifically the Summarize phase. When asked during a post-observation interview, Austin highlighted an important factor which impacted his choices during the Explore phase. Austin expressed tension between providing scaffolding through questioning while at the same time allowing students to explore when he said:

Yeah, I mean that’s definitely a struggle because the biggest thing I come down with this, you know, the time. Lots of my students, they get going on it, but they don’t always get to the end result and we have to, I have to guide them as a class and I can’t guide, I don’t have enough time to talk about every single person’s strategy. (personal communication, October 23, 2018)

Austin noted an important factor which influenced his choices of scaffolding and questioning during the Explore phase: providing more scaffolding to the class as a whole in the form of mini-summaries. Fellow teacher participants echoed Austin’s discussion about time being the biggest factor and using mini-summaries to provide scaffolding to the whole class.

Discussion of Teacher Question Codes and Frequencies

The Explore phase context varies from the Launch and Summarize phases because teachers have time to help individual students or small groups. A focus on the Explore phase provides insight into the extensive time and interactions teachers had with students.
As stated above, students engage further with the task and additional scenarios while teachers walk around and support student learning. Heather and Sophie’s examples illustrate how teachers used questions in different ways to address student needs while Austin pointed out a particular challenge for this phase. Teachers responded to student needs. Both Heather and Sophie shared strategies for responding to a student question. For Heather, she reflected in general about how she responded when a student has a question. Her teacher question, “What was the student’s question?” helped her identify if the small group had discussed the question prior to asking her for help. Her follow-up question encouraged students to work together and move forward. If the small group of students were unable to proceed, she could insert helpful information. Perhaps a student did not ask a question, but the teacher noticed student work while walking around the classroom. Heather described the role of walking around and noticing student work. She looked at student work, in addition to what students said, as a way to 95 understand their current thinking. Then, she could use the students’ work to begin a conversation. Teachers used mathematical conversations to support student learning. All teachers emphasized mathematical conversations to support student learning. Whether the conversation began because of a student asking a question or the teacher noticing student work, the teachers drew group members into the discussion by using student work as the launching point or pushing students to make mathematical connections. Heather illustrated this when she asked students to identify additional solutions when they had already found one possibility or worked through the task quickly. The teachers valued student discourse and making sense of mathematics. Heather pushed students to dig deeper and make connections between multiple solution pathways. Sophie provided specific examples of this with her conversations with students about integer addition and subtraction using number lines, chip board models, and money. Heather and Sophie’s examples highlighted various contexts present in all three teachers’ classrooms during the Explore phase when teachers provided support to students. Teachers grappled with balancing scaffolding and exploration. The amount of time it took to have conversations created by the teachers’ discourse moves is a concern for teachers (Herbel-Eisenmann, Steele, & Cirillo, 2013). All three teachers shared this concern and experienced a challenge during the Explore phase to balance teacher questions that engaged students with progressing in the lesson. Austin identified tension between asking questions which guided students to a particular point and allowing time for students to explore. Although the teachers walked around the room to work with individual students or small groups, they 96 were unable to engage with every single person. Even if they did work with a student or small group, they had to make decisions about how much to guide or allow students to explore. The student Sophie worked with about subtracting integers with chip board models remained unconvinced at the end of their discussion. Her questions resulted in group member sharing explanations and possible solutions; however, when the student did not agree with the other students, she had to make a decision about whether to continue guiding or allow the student more time to work with the concept. 
Within all of the contexts discussed, tension between providing scaffolding and time to explore persisted. Teacher questions mediated tension between scaffolding and cognitive demand. These examples also illustrated how teachers’ use of questioning mediated tension between scaffolding and maintaining cognitive demand. Teachers’ discourse moves to elicit student solutions allowed the teacher to identify students’ current understanding and where they needed help. Their emphasis on student discussion engaged group members into conversations and alleviated the need for the teacher to provide explanations to student questions or misunderstandings. Teachers focused more on students’ ability to talk about multiple solutions than quickly finding a solution. Even if a general procedure could be followed, the teachers’ focus about explanations discouraged students from following them mindlessly. The other significant way teachers’ questioning mediated tension was by attending to the cognitive demand assumption of multiple solution paths. The teacher questions provided opportunities for students to develop a deeper level of understanding by encouraging mathematical connections. For example, Sophie encouraged students to investigate multiple representations (e.g., number lines, chip board models, and money) and Heather asked 97 students to identify and understand multiple solutions. Teachers’ questions asked students to explore and understand the nature of mathematical relationships. Claim 3: The Types of Questions Were Unevenly Distributed Across Phases In prior paragraphs I have discussed potential reasons for variances within the data, whether that be a teacher’s struggle to provide scaffolding during tasks, decisions about changes from the enactment of the same lesson from one class to another, or the particular focus question of the investigation. In order to describe the uneven distribution across phases for the different types of questions, I attend to two aspects: 1) Differences seen across teachers’ data and 2) Differences seen within teachers’ data. Not surprising, I found different percentages for Sophie, Austin, and Heather, as well as for the observed lessons. The fact is, the teachers had different students in different classes, and they were being responsive to students. With this being the case, how does the data reported provide any understanding to the discussion? Given the expected reasons for different percentages, I focus on teacher scaffolding for students during tasks. Time constraints influenced teachers’ decisions to lead. Here is where I want to revisit the struggle of providing scaffolding while also allowing students to explore. I want to add Sophie’s and Heather’s perspectives, which echoed Austin’s. That is not to say that Austin shared his perspective first, but, that they all three shared the same struggle that seemed to impact their questioning and scaffolding. Heather described her perspective by saying: Time is my biggest enemy. Oftentimes I want to make sure that they’re understanding a big idea and I can find myself sometimes maybe leading them a little bit more than I want to because I want them to be able to accomplish something by the end of the 98 hour. So that to me is my biggest struggle. (personal communication, December 5, 2018) Sophie discussed constraints of time when she said: I think it’s still important for them to have A and B just to, to see that before they get going into the groups. 
That takes up, you know, 10, 15 minutes and we only have 55 minutes to start. So that’s kind of where we’re at. (personal communication, October 30, 2018) Sophie described her struggle to manage time so students could complete all parts of a problem (in this example, parts A, B, and C) in a given period of time. She ended up doing part A quickly to save time in the end. With all three teacher participants, they adamantly expressed a tension between scaffolding – providing help during a task or deciding what portions of a task to include – and maintaining cognitive demand, specifically because of time. Time contributed to teachers’ struggle to maintain cognitive demand and provide scaffolding because teachers wanted students to do the intellectual work and be efficient with time. Additional tensions, although less explicit during interviews, were the constant choices and adaptations teacher made to the written curriculum during instruction. The three teachers used CMP curriculum which curriculum writers designed for a generic student. They enacted the curriculum in a classroom where they were trying to be responsive to individual students. And lastly, teachers discussed balancing what a student wanted with what the teacher thought was best for the student. Yes, teachers acknowledged the fact that students wanted the teachers to provide answers instead of asking questions. Sophie anticipated this balance when she described interactions with 99 students, “Why don’t you just answer the question. I’m pushing you and there’s some frustration with the beginning of the year especially. They don’t want me to keep asking questions” (personal communication, October 23, 2018). Not only in the beginning of the year, Sophie reflected about the same pressure throughout the year saying, “Kids get so sick of me. Quit asking me questions. I want to know” (personal communication, October 30, 2018). Although students provided pressure on teachers, all three teachers voiced comfort with continuing with their questioning. The teachers struggled to know how much to help students and worried that they guided them too much in order to accomplish certain goals by the end of the class period. Additionally, all three teachers wished they had enough time to have individual conversations with each student during the Explore phase of lessons. Heather stated: So of course time is always an issue…I’m really allowed maybe a couple of class periods to explore this idea more because I want to give them time to make connections because if I don’t hook them right now, then they’re not going to be invested in future investigation. (personal communication, December 5, 2018) Again, the teacher voiced a challenge to balance time, exploration, and student learning. As expected with these tensions, differences existed between teachers as they made choices about their questioning. Focus question and emphasis on prediction influenced teacher scaffolds. Zooming in on one specific code, “to learn to conjecture, invent, and solve problems,” the way each teacher viewed prediction and making conjectures may have contributed to the differences evident in the tables. When I asked Heather what role she thought prediction played in the understanding of mathematics, she responded with: 100 It’s huge. Huge. Yesterday when we were actually looking at the percent, let’s just think about stepping back from the math and try to make sense of what you’re seeing. A big idea instead of just getting bogged down in computation. 
Because if that’s all you’re focused on, then when you come up with an answer, you have no idea whether it’s right or not…What should my answer look like? I mean, a prediction is almost, you know, you’re making an estimate, right? I mean, you’re thinking about what is this question really asking? So, what do you notice first? Well, they can all say, I noticed that the figure got larger or it got smaller. All right, what do you think that might tell us about the percent? So, we did a lot of that questioning yesterday as well. (personal communication, December 12, 2018)

Heather valued prediction and conjectures within her teaching, which I heard throughout my observations. Contrast that with Sophie’s statement that she really wanted students to use chips to prove the answers, so they were not misled by what they thought was the answer. Throughout my observations, I heard Sophie repeatedly say, “Use the chips” or “I know you think you know, but show me” or “Don’t try to do this on your own yet”. Students, in a sense, were predicting what the answer would be when they completed the problems without a model (e.g., number line or chip board). Sophie wanted students to wait to make predictions until after they had more experience exploring the concept of adding and subtracting integers.

Curriculum Content and In-the-Moment Decision Making Contributed to an Uneven Distribution of Teacher Questions

These views may be closely related to the content rather than to the teachers’ general teaching views; unfortunately, I cannot be sure. In either case, and independent of the two teachers I used in these examples, teachers’ beliefs about the role of prediction or the connection to specific content may have influenced how teachers question students. As I said in the opening of this finding, differences also existed within the data reported for an individual teacher. The prior discussions about curriculum, such as the specific focus questions, the role of prediction within a lesson, and time, can also explain the differences apparent in the tables. Each teacher had different values reported within their own data. If I consider what they shared in their interviews and classrooms, the tension between scaffolding and cognitive demand had them questioning when and how to provide help. That ongoing tension resulted in back-and-forth, in-the-moment decision making that changed pre-planned lessons.

Teacher Questions as Discourse Moves Mediated Tension Between Scaffolding and Cognitive Demand

In general, and to answer RQ2, teachers posed questions as a way of providing scaffolding through discourse. In particular, my analysis of teacher questions described the ways in which the teacher questions were relational (Fennema et al., 1996; Franke, Kazemi, & Battey, 2007), instantiations of discourse moves (Brodie, 2011), and supportive of the assumptions of cognitively demanding tasks (e.g., multiple solution paths, cannot follow procedures mindlessly). Teacher questions as scaffolding were related to contextual clues, such as the different lesson phases or focus questions. The context clues influenced the nature and number of posed teacher questions. Teacher questions as discourse moves allowed teachers to elicit, press, and maintain student solutions in ways that supported student learning and the assumptions of cognitively demanding tasks. As a result, teacher questions in context mediated tension between scaffolding and cognitive demand.
My findings show how teacher questioning “work[ed], for whom, and when” (Franke et al., 2007, p. 228). The questions teachers posed provided opportunities for all students to engage in and benefit from mathematical argumentation and conceptual explanations (Chapin & O’Connor, 2004; Chapin, O’Connor, & Anderson, 2009). The trends in teacher questions showed how it worked, specifically how the questions teachers posed went beyond the initial question to facilitate student learning and orchestrate student discussion. Herbel-Eisenmann, Steele, and Cirillo (2013) suggest that teachers developed strategic and spontaneous incorporation of discourse moves through professional development. My analysis of teacher questions suggests that strategic incorporation of discourse moves was also associated with context. Teacher questions were contextual because they were influenced by phases of a lesson, curriculum, and student responses.

My first claim, that the teachers tended to begin by asking questions about students’ mathematical work and moved to questions about generalizations to encourage student conjectures, detailed teacher questioning during the Launch and Summarize phases. My second claim, that the teachers posed more questions than statements, particularly during the Explore phase, focused on the Explore phase. During the Explore phase, teachers used students’ verbal responses or written work as a springboard for scaffolding. These first two claims respond to RQ2 by showing how teachers provided scaffolding during cognitively demanding tasks. Furthermore, the teacher examples elevated connections between context clues, the ways teachers scaffolded, and the importance of dependent tasks.

The intentional instructional sequencing of curriculum is meaningful for students and teachers (Stylianides, 2007c). CMP curriculum is sequenced in a particular way to help students develop a contextualized and connected understanding of the concepts. Drawing on the way CMP curriculum sequences tasks, all three teachers encouraged students to connect prior learning with current concepts, try multiple solution strategies to develop a deeper understanding, and predict what may come in future lessons. For example, Sophie made explicit connections between various tasks that investigated integer operations using different models (e.g., number lines and chip models) and contexts (e.g., the Math Fever game and money). When learning about chip models, Sophie asked about, or students independently referred to, number lines as a way to make sense of the current task. Heather highlighted the connectedness of investigations when she said, “…if I don’t hook them right now, then they’re not going to be invested in future investigations” (personal communication, December 5, 2018). The intentional sequencing of curriculum and the way in which teachers encouraged mathematical connections suggest written tasks are dependent.

My third claim, that the nature of questions was unevenly distributed across phases, begins to highlight how teachers thought about the relationship between scaffolding and cognitive demand. All three teachers talked explicitly about time constraints and how time influenced their decisions about scaffolding. They modified their questioning or the written curriculum in ways that they felt led students. Teachers identified time as a constraint that pushed them to lead student exploration, with the potential to decrease cognitive demand.
Teachers rushed through or skipped particular portions of a task or engaged in mini-summaries to move the whole class through a task. Even with questions as their main tool for scaffolding, the teachers struggled with how to provide scaffolding and maintain cognitive demand. All the teachers agreed that cognitively demanding tasks and scaffolding are important; however, the enactment within a classroom and the relationship between the two constructs were challenging.

CHAPTER 5
TEACHER AMBIGUOUS RESPONSES TO STUDENT UNCERTAINTY AS A POTENTIAL FORM OF SCAFFOLDING

This section reports on the finding relative to RQ2. Teacher questions and statements in middle school classrooms, compiled from classroom videos and teacher audio, were reviewed to identify and mark ambiguous teacher responses as “uncertainty.” Analysis focused on the context of student uncertainty. Patterns were examined to identify whether student uncertainty was resolved and the ways in which it was resolved. In other words, uncertainty pertained to a pre-requisite or co-requisite skill, directions, mathematical content of the task, or foreshadowing of mathematical content. Variation in the teachers’ involvement in addressing student uncertainty was examined. Findings suggest that the resolution of student uncertainty was related to specific contexts and that teacher involvement in resolving uncertainty occurred on a continuum. At times, teachers resolved student uncertainty or supported the student to resolve it. The teacher was more directive in some instances than others, possibly staying with the student until they reached a resolution; however, it seemed like the student was the one who reached the resolution. In addition, instances where students remained uncertain were also present.

Resolution Type 1: Student Uncertainty Resolved by Students

When teacher ambiguous responses to student uncertainty pertained to a pre-requisite or co-requisite skill, the teacher responded by asking questions until the students resolved the uncertainty. While summarizing from a previous day’s discussion, a student asked, “Can we have negative centimeters?” (personal communication, December 12, 2018). The student’s question connected to a pre-requisite skill about measurement and distance. The current focus question, “How can you determine if two shapes are similar by looking at the rule?”, did not require students to measure side lengths of shapes as in previous lessons. Instead, students were using what they previously learned to understand symbolic equations and similarity. Heather provided an opportunity for students to offer solutions by saying, “That’s a question for the class, not me” (personal communication, December 12, 2018). As students provided possible solutions, Heather’s only words were, “Interesting. Interesting.” After a few responses from students, Heather connected the current conversation with a possible real-life experience. “Can we have negative centimeters? Do we measure a negative number? Think back to a tape measure. Do you see negative numbers on a tape measure? I wonder why?” interjected Heather as she added to the ongoing conversation. Multiple students responded “no” to her first three questions and an individual student offered an explanation to her “why” question. Heather facilitated student responses to the previously posed question. During a similar situation, students voiced uncertainty about a co-requisite skill.
In this example, students were trying to answer the focus question, “How can you predict whether the result of an addition problem is going to be positive, negative, or zero?” when Sophie asked, “How do you get two numbers to add to zero?” (personal communication, October 29, 2018). A student responded, “You would have the same number except for one positive and one negative.” Using a discourse move to support students’ use of the mathematical register and precise mathematical language (Herbel-Eisenmann et al., 2013), Sophie asked, “What do we call those?” “Opposites,” answered the student. Now, using the mathematical term, Sophie asked, “If I add two opposites, will I always get zero?” Students looked unsure and a few murmured “yes” and “no.” Sophie repeated what she heard, saying, “No. Yes.” Like Heather, instead of providing a solution, Sophie asked students follow-up questions about additional examples. Linking to students’ previous discussion about absolute value, Sophie asked students, “So, if I have a negative two plus a positive two, which one is further from zero?” After the students considered their additional examples, they confidently answered “yes” to Sophie’s last question, “Will it always get you zero?” Students’ understanding of opposites was an integral part of the focus question and a question they discussed at the end of the lesson.

Teachers Supported Students to Resolve Uncertainty About Pre- and Co-Requisite Skills

The student uncertainty resolved in each of the prior instances related to a pre-requisite skill, the focus questions, or mathematical content students should understand or were working to understand by the conclusion of the lesson. For both of the examples, the teachers’ responses facilitated students to resolve the uncertainty by providing explanations to teacher questions. The teachers’ willingness to work within a context of uncertainty resulted in minimal teacher involvement in voicing mathematical connections or explanations. In Heather’s example, she facilitated students’ discussion and offered one suggestion to think of a familiar visual. Similar to Heather, Sophie asked students to think about additional examples to solidify a student’s initial explanation. Students in Sophie’s class provided an explanation to Sophie’s question while using precise mathematical language. Both teachers inserted information for students to consider without explicitly offering directions or explanations that resolved the initial student question.

Heather mediated tension between scaffolding and cognitive demand by allowing students to explore possible explanations about the existence of negative centimeters. Once students offered explanations and responded to one another, Heather inserted information that asked students to access relevant experiences. She asked students to consider their experiences with measuring to make use of them in the present task. Similarly, Sophie used the context of uncertainty to pose additional examples and support students’ noticing of patterns (Vale et al., 2019). Sophie waited while students considered how the broad concepts of opposites and distance were closely connected to the focus question. Both teachers used a lesser-known form of wait time, waiting after a student response, allowing students time to think about a teacher’s question and their responses (Herbel-Eisenmann et al., 2013).
Sophie waited after asking the question, “So, if I have a negative two plus a positive two, which one is further from zero?” She allowed students to think about the question and prior responses before answering. Researchers go on to state that, “when this second form of waiting occurs, students’ responses can become more complex (Rowe, 1986), and students may be more likely to respond to their peers’ contributions” (Herbel-Eisenmann et al., 2013, p. 183). In Heather’s example, students responded to one another instead of the teacher and explored a deeper understanding of measurement. Students’ explanations connected measurement and negative values. There were other times when the teacher provided more extensive and directed support until students resolved uncertainty; however, these instances shared a common characteristic, as I describe in the next section. Resolution Type 2: Teachers’ Increased Scaffolding Facilitated Students to Resolve Student Uncertainty For this type of resolution, I argue teachers increased their support to students in two ways. In certain situations, the teachers provided an explicit answer to a student question. I considered this increased support because the teacher was the individual who resolved the 109 student uncertainty. In other situations, the teacher engaged in extended interactions with a student that involved more teacher discourse. The teacher did not voice the answer; however, the teacher asked more questions or inserted more information than in the previous example. When students were uncertain about specific directions, teachers responded by resolving their uncertainty. This is in contrast with the examples in which the uncertainty related to a focus question or a pre-requisite skill. For example, students working to show addition of integers during the Explore phase asked Sophie, “Do you word it or just draw it?” The students were asking if they should describe in words or draw with pictures to show how they were adding two integers. Similar cases arose when students were uncertain if they must utilize a specific strategy. In these instances, teachers provided a clear answer to their uncertainty. For Sophie, she responded, “Yes. You word it. Describe it in words” (personal communication, October 24, 2018). With student questions about applying specific strategies, all three teachers responded with a direct answer, usually that students could use multiple strategies. In fact, the teachers encouraged students to solve using multiple strategies. Another example of teachers increasing their involvement to resolve uncertainty occurred when students needed clarification in order to continue mathematically. Perhaps students knew they were completing a problem correctly but were confused by a different strategy or an unexpected answer. Or, students knew their answers were incorrect and did not have a solution strategy. Consider the first option where a student worked with a chip board to complete the problem -4 – 2. In order to take two black chips away, she needed to get two black chips on the board. The student added two red chips and two black chips. With the new chips added, the student had two black chips to take away. Earlier in the class, students had 110 been working to understand the concept of zero. Students placed chips on their boards to show an overall value of zero. And by looking at all the students’ examples, they saw more than one solution existed. 
Now, the student asked, “What happens if you add more?” Meaning, what if the student added more pairs of red and black chips? The teacher responded, “Try it and keep adding them” (personal communication, October 24, 2018). After doing a few more examples, the student exclaimed, “Oh!”, like a light bulb turning on. Sophie responded, “Oh, I saw the light. Crazy. I like it.” Although Sophie did not provide an answer to the student’s initial question, she participated in several interactions through questioning and ultimately confirmed the student’s “ah ha” moment.

A similar example is provided in the transcript below (see Figure 21) when a student tried to understand percent change from an original figure to its image. While investigating percent change, fractions and decimals entered students’ explanations. A student was uncertain about the value two-and-a-half.

Figure 21
Heather Supported Student to Resolve Student Uncertainty

Heather: What would be exactly halfway between two and three?
Student: Two-fifths.
Heather: Is that more than one or less than one?
Heather: Let’s look at that. Let’s use the ruler to help us. Show me where halfway between, what two numbers were we looking at?
[Student pointed to show on the ruler.]
Heather: We can treat the ruler just like a number line, right? So what do you notice about that where you just pointed? So is it just half?
[Inaudible response from student.]
Heather: Okay. This is half to me. How are they different? What is this? Did you point here?
Student: I pointed right here.
Heather: So if this is half, what is this measurement right here between my two fingers? If this half an inch. What’s this measurement right here?
Student: One inch.
Heather: That’s one inch. Okay. What’s this one?
Student: That’s half, but one-and-a-half.
Heather: What’s this one?
Student: Two.
Heather: What’s this?
Student: Two-and-a-half. Oh! So, it’s two-and-a-half.
Heather: So you think that might be two and a half? Okay. Go ahead and write that down. And then how could you check your idea?
Student: We already measured it so…
Heather: You did, did you measure this one yet? I thought you were just measuring this one? We’re making a prediction on this one. And you predicted it would be what? You just talked to me about that. What do you think you’re going to get?
Student: Two-and-a-half.
Heather: Okay, let’s see if you’re right. Did it work?
Student: Yeah.
Heather: Nice job.

Heather asked the student for the value exactly between two and three. Her question allowed the student to share their understanding, which Heather used as a starting point. When the student responded, “two-fifths,” Heather neither confirmed nor denied the answer. Instead, she asked a follow-up question to clarify how the student understood the value two-fifths. When the student was unable to answer whether two-fifths was more or less than one, Heather incorporated a visual manipulative. With the assistance of the ruler, Heather continued her questioning until the student arrived at a solution saying, “Oh. So, it’s two-and-a-half.” Although Heather assisted, the student was the individual who stated the solution, tested the solution, and concluded it was correct.

Increased Teacher Scaffolding Occurred on a Continuum

In only one type of situation did the teacher provide a clear and direct answer to a student question. The teachers’ answers were not about mathematical content but about how the students should show their understanding.
Even though the teachers’ answers were explicit, they mediated the tension between scaffolding and cognitive demand by refraining from telling how the specific example should be presented. Sophie directed students to use words to represent their answer without stating the actual answer. In general, the teachers responded directly to student inquiries about using a specific solution strategy. The teachers’ responses maintained cognitive demand by supporting students’ use of multiple strategies and making connections between the representations (Smith & Stein, 1998). In the cases when teachers facilitated students resolving uncertainty, typically occurring in the Explore phase, the teacher spent more time and interacted repeatedly with the person or group who had a question. Unlike the two examples when teachers supported students to resolve uncertainty in a whole-class discussion, these two examples occurred during an interaction between the teacher and an individual student or group. The teacher asked follow- up questions and clarified when needed. In the second example when the student wanted to know, “What happens if you add more?”, the teacher’s involvement was between the teacher resolving the uncertainty and the teacher supporting the student to resolve it. Although the student ultimately resolved the uncertainty, the teacher was more directive than in the previous section. Instead of asking follow-up questions, she explicitly stated, “Test it.” And in the last example, the teacher 113 provided even more support for the student to arrive at an understanding of what the value two-and-a-half meant. In the last example about the value halfway between two and three, the teacher’s follow-up questions were more explicit about mathematical connections necessary to make sense of the solution. Additionally, in the last example, instead of confirming a student’s initial explanation with additional examples, the teacher facilitated the student’s exploration until he voiced an explanation. Still, the teacher did not answer directly but engaged students in discussing mathematics until a student voiced a solution. Researchers posit that scaffolding may change over time or in different contexts (Davis & Miyake, 2004). The three examples show how scaffolding changed in different contexts as teachers’ involvement occurred on a continuum. With all three examples, teachers mediated tension between scaffolding and cognitive demand by using student uncertainty as an opportunity to encourage student discourse and exploration. The teachers elicited student answers (Brodie, 2011), allowed time for student discussion (Herbel-Eisenmann et al., 2013), and inserted information related to the initial question until the students voiced a solution. Resolution Type 3: Student Uncertainty Not Resolved Unlike the prior examples where student uncertainty was resolved, there existed instances when student uncertainty was not resolved (see Figure 22). Sophie stated the following examples during an observation (personal communication, October 23, 2018). 114 Figure 22 Examples of Teacher Talk When Student Uncertainty Was Not Resolved Oh, I like the thought. Let's, let's keep thinking about it. Okay, so what I'm taking away that negative, they actually got a positive three. I know. Crazy. Huh? We're “un-canceling it.” We'll see. We're going to keep playing around with it in a little bit. I know, mind blown. Right? So [they’re] like, wait a second. So we took away negatives and somehow we got a positive. Okay. 
I know right now your heads hurt. Some of you, and we're gonna work with these chips throughout today. In Sophie’s class, the students were beginning to subtract integers. They had just completed a problem where subtracting a negative integer resulted in a positive three. For this first example, the students were uncertain, yet Sophie did not confirm or deny if subtracting a negative integer would always result in a positive answer. She also did not explain why or suggest any more examples. She left the students feeling uncertain. The second example occurred during the same lesson when students in the class voiced conjectures after Sophie resisted resolving their initial voicing of uncertainty. Students suggested, “It depends on if it is positive or negative.” Students considered the individual values; however, their conjecture remained unclear. And again, Sophie left students uncertain by saying, “Oh, I like the thought. Let’s, let’s keep thinking about it.” Before students began the Explore phase, the class reviewed 115 the example again. After looking at the example for a second time, students exclaimed, “What?!” “I know, mind blown. Right?...We’re going to work with these chips throughout today,” responded Sophie. Students continued to model addition and subtraction of integers with their chip boards. In a similar way, Heather used uncertainty during a Summarize phase to foreshadow upcoming lessons. Her students were summarizing how they knew if a shape was similar or not. If the shapes were similar, then they were part of the Mug family. If they were not, then they were called imposters. In addition to determining if a symbolic rule created an imposter, they also summarized how perimeter and area of the original shape related to the perimeter and area of its image. At the end of the discussion, Heather gave a more challenging symbolic rule for students to consider. Instead of asking for a solution, she stated, “Come with an answer tomorrow” (personal communication, December 13, 2018). Any student uncertainty was left unresolved. After class ended, a student approached Heather to ask what happened to the area of an image if she reduced the original shape. The student understood what happened when she enlarged a shape; however, they had not investigated what happened when they reduced a shape. Heather responded, “That’s a great question. Let’s talk about that one tomorrow and I’m not going to hold you accountable for that” (personal communication, December 13, 2018). Again, instead of answering, Heather used the student’s uncertainty as an opportunity to foreshadow and connect future lessons. In addition to foreshadowing the upcoming lesson, unresolved uncertainty occurred when teachers encouraged students to explain their mathematical thinking, join a conversation, 116 make a conjecture, and test a conjecture. Revisiting a previously highlighted interaction illustrates how a teacher’s response encouraged each of these student actions. Recall in a previously described interaction, a student encountered uncertainty when trying to complete the problem -4 – 2. Her uncertainty motivated her to explore additional scenarios where the same number of red and black chips were added to a chip board. Now, I share the beginning portion of the larger conversation between the teacher and small group. Sophie approached the group when the girl asked, “How can you take away two black when there is no black?” (personal communication, October 24, 2018). 
This question began their interaction when the student voiced her initial uncertainty. Sophie responded, “That is a great question. There’s no black.” Instead of Sophie explaining what to do, the girl said, “…you add them.” Simultaneously, another boy in the group responded, “I’m trying to show you.” Sophie encouraged the boy to show it by saying, “Let’s see it.” and then waited as the boy explained and showed with the chips how he arrived at the answer negative six. After his explanation, she said, “Show me again.” When he started adding four black chips, Sophie asked, “You have $4 in debt, right?” She confirmed what he had just stated in his story about owing $4. And then, without resolving the uncertainty for the students, Sophie asked, “Why are you adding 4 black chips?” After a slight pause and silence from the students, she continued, “Is that what it says to do?” The students made new suggestions about what they must do. Sophie encouraged them to test their suggestions, saying, “Try it. Let’s try it.” The students’ suggestions related to the concept of the same amount of red and black chips equaling zero and where the previous example began. For a transcript of the entire interaction, see Appendix I. 117 Students also made and tested conjectures. One group of students conjectured that “adding black chips is the same as taking away red chips, and vice versa” early in their work with adding and subtracting integers with chip models. Sophie responded, “Really? That worked on both?” She went on to say, “Test it” (personal communication, October 23, 2018). Students noticed patterns, made conjectures, and tested their conjectures. Unresolved Student Uncertainty Provided Opportunities for Student Conjectures Students were engaged in some aspect of conjecture that related directly to the focus question. This could mean they were making a conjecture that answered the focus question or pushing beyond the focus question. At times, students’ conjectures seemed to foreshadow upcoming mathematical content. In addition, teachers provided opportunities to explain their mathematical thinking, join a conversation, make conjectures, and test conjectures when teachers left uncertainty unresolved. When the girl asked how she could take away black chips when there were none on her chip board, Sophie responded by verifying her question. Sophie confirmed that the girl asked an important question; however, Sophie did not provide a solution. Instead, another boy in the group offered an explanation. Sophie’s discourse move resulted in another student joining the conversation, ultimately expanding the conversation to include the whole group. Although the boy’s initial explanation was incorrect, Sophie still connected her questions to his explanation of debt as representing a negative value. Her inclusion of multiple ways to think about a task deterred students from mindlessly following a procedure (Smith & Stein, 1998). Again, Sophie did not deny the student’s thinking. Sophie asked additional questions for students to verify and test a conjecture multiple times until the students were satisfied with 118 their solution of negative six. Without Sophie’s response to leave uncertainty unresolved, at least for the immediate moment, students may have missed out on important conversations with each other where students were the experts who made and tested conjectures. 
Sophie maintained cognitive demand by emphasizing students’ mathematical thinking that made connections and engaged with underlying concepts instead of her own explanations (Smith & Stein, 1998). Perhaps responses like, “test it” illustrated the link to making and testing conjectures. Whether the conjectures occurred during the Launch, Explore, or Summarize phases, students’ conjectures seemed to foreshadow upcoming mathematical content. Heather made clear statements about what went beyond the focus question when she said, “That’s a great question. Let’s talk about that one tomorrow and I’m not going to hold you accountable for that.” Heather removed pressure from students to return the next day with a solution and verified the question held merit so the students could continue to think about and push their mathematical thinking. The student’s discussion about enlarging an image related to shrinking an image even if they did not have time to discuss the connections in class. The student who asked the question at the end made these connections without Heather’s leading. This question provided an opportunity for the student who asked, and the rest of the class, to delve deeper into and make more mathematical connections. Teachers’ responses to leave uncertainty unresolved matters because it allows students to explore and discover answers for themselves. Student uncertainty is a chance to organically show mathematical connections by letting students’ curiosity take a larger role in mathematical learning. 119 Curricular Considerations for Uncertainty During Enacted Phases of the Lesson Launch phase. When I looked more closely at when the uncertainty occurred and how the teachers responded, I saw another pattern in how the teachers chose to respond. There were several instances during the Launch phase when the teacher and students introduced an idea. The context of the situation was given, and the focus question was introduced. Even as early as the Launch, some students voiced conjectures about the focus question. The teachers responded with phrases similar to, “Okay, so when I'm taking away that negative, they actually got a positive three. I know. Crazy. Huh? We're un-canceling it. We'll see. We're going to keep playing around with it and a little bit.” The teachers used students’ uncertainty as a way to transition into the Explore phase. At times, the teachers asked students to keep the conjectures in mind as they explored during the task. The students who had a conjecture, they could then test their conjectures during the Explore phase. The teachers chose not to provide a judgement about what students noticed and conjectured in the Launch phase but encouraged them to investigate themselves. Explore phase. When I looked at the Explore phase, I saw different groups respond with similar conjectures and in turn, the teachers responded without verifying or denying the students’ thinking. Instead, the teachers pushed the students to test their conjecture and explore multiple solution paths. Several of the examples included in this section highlight how teachers increased support may result in more time spent with a student or group of students. 
Each CMP problem contains shared characteristics, such as observing patterns and relationships in a situation, recognizing and employing prior understandings to conjecture and test, justifying their reasoning, discovering salient mathematical concepts, making judgements 120 about operations or representation that are helpful, and communicating their understanding with their group. Therefore, it follows that the teachers spend more time supporting students to answer questions, exploring additional solutions, or resolving any differences in student thinking. Summarize phase. And lastly, during summary, if the student uncertainty related to the focus question or pre-requisite skills, then the students or teacher resolved uncertainty. Questions about pre-requisite skills were resolved by other students or the teachers asking follow-up questions to help them arrive at a conclusion without telling them how to do something. Teachers’ decision to not resolve student uncertainty in the Summarize phase seemed to occur when the students were extending and pushing beyond the focus question or it pertained to later content. In general, student questions during the Summarize phase were left unresolved. Brodie (2011) talks about how the teacher discourse move, maintain, can keep student contributions in a discussion. These teachers showed an additional way a teacher maintained student contributions. The teachers used the student inquiry as a way to connect sequences of tasks, encourage students to continue thinking about the mathematical concept, and begin the following class period with student discourse. Teachers Used Discourse to Mediate Tension Within the Context of Student Uncertainty Returning to RQ2, teachers continued to use discourse moves in the context of student uncertainty in ways that mediated tension between scaffolding and cognitive demand. Instead of providing direct statements or explanations as scaffolding that resolved student uncertainty, teachers facilitated conversations until another student suggested a solution. Teachers’ discourse moves increase discourse within small groups and have potential to positively impact 121 student achievement (Franke et al., 2007). In general, teachers’ use of uncertainty as scaffolding allows students to: 1) explain their thinking, 2) join conversations, 3) make conjectures, 4) test conjectures (sometimes show thinking using a model), and 5) foreshadow mathematical content. In this way, the teacher provided opportunities for students to participate in aspects of reasoning and proving, such as identifying patterns, making and investigating conjectures, developing and providing non-proof arguments (National Council of Teachers of Mathematics, 2000; Stylianides, 2009), which are necessary for maintaining cognitive demand. Even though the teacher desires for students to offer suggestions (whether correct or not), there are times when teachers need to use prompts to elicit student responses (Vale et al., 2019). Teachers used these prompts as scaffolding to provide direction when students needed additional help. These interactions occurred when the teachers spent more time with the students and the teacher was unable to help the students in prior attempts. The teacher stayed involved in the conversation as students explained their thinking or others joined in, especially about a pre-requisite skill. These examples illustrate the tension teachers face when enacting these cognitively demanding tasks with students. 
How teachers responded to uncertainty differed depending on the situation. And when they did resolve uncertainty, they took more time and asked questions directly relating to the students’ current understanding. CHAPTER 6 SALIENT STUDENT EXPERIENCES AND TEACHER RESPONSES And then they get to e and the wheels fall off. And I had kids crying in the last two days, and that's like, holy cow. –Sophie, Post-Observation Interview And the one gentleman, [he] had tears of frustration third hour, at the end. –Sophie, Post-Observation Interview This section reports on findings related to RQ3. Classroom video, field notes, and post-observation interviews of the enacted CMP tasks in the middle school classrooms tell a story of student experiences and reactions to a mathematical task. I examined salient episodes of students exhibiting frustration with and disengagement from a cognitively demanding task by watching video to describe non-verbal cues and listening to recorded audio for voice inflection and word choice. Analysis focused on students’ experiences and teacher responses. Patterns were examined and findings emerged to highlight teachers’ scaffolding that mediated tension between frustration and maintaining cognitive demand. Cognitively demanding tasks are discussed in the literature as positive experiences; that is, student achievement is shown to improve when teachers use cognitively demanding tasks. Researchers suggest using these non-algorithmic, open tasks to encourage multiple solution paths; at times, even implicitly suggesting a solution path is discouraged. Although the extant literature shows positive student outcomes and instructional incentives, what seems to be less apparent are the potential challenges to teachers and students during these tasks. Take, for example, the teacher’s comments above that described her experience. Students were working with chip boards to model addition and subtraction of positive and negative integers. Instead of providing algorithms to the students, the teacher asked students to explore how they could use chip boards with red and black chips to represent addition and subtraction. The students noticed patterns and determined a set of algorithms for addition and subtraction. Perhaps the benefits of students’ exploration and creation of algorithms overshadow the potential struggles. I agree with the use of cognitively demanding tasks; however, what are students’ experiences and how do they influence learning? Are students expected to face challenges? If so, how do teachers respond? This chapter focuses on students who struggled and needed to persist. The first two episodes, Madison and Tate, focus on students who experienced confusion that led to frustration, while the second two episodes, Leonore and Evelyn, highlight students who were disengaged from the task. These represent two broad, related types of student experiences that have the potential to negatively influence student learning (Di Leo et al., 2019). Salient Episode 1 with Madison The first student, Madison, worked in a group of three. Each student had their own chip board to show the problem -4 – 2. Madison was confused about what the starting value represented but another student provided an explanation to resolve Madison’s confusion. When Sophie returned to discuss the same problem a second time, Madison remained confused about the starting value.
Madison said to start with “negative four and positive two.” Sophie asked again, “What does number two say to start with?” In the midst of Madison’s response, “negative four and,” Sophie interrupted with, “Show me a chip board that has negative four.” Sophie recognized that Madison was still confused about the starting value. To identify if Madison knew how to show a value on the chip board, Sophie asked her to do the first part. Madison successfully showed Sophie her understanding of creating a given value with chips. Once Madison created the original chip board, Sophie asked questions about how to complete the subtraction when there were no black chips on the board. Madison was unsure and began cradling her forehead in her hands with each hand pressed again a temple and her gaze down toward her paper. Madison continued to stare at her paper, only looking up when she wanted to verify if she answered correctly. Sophie continued to ask questions. This was the first time Sophie discussed subtraction after Madison created the original chip board. So, instead of providing answers, Sophie asked questions like, “What does it want you to do?” when Madison asked, “Is that wrong?”. Or, Sophie said, “Try it, let’s see.” when Madison asked, “No, so we have to add positive two and positive red. I mean negative red, right?”. At the end of the interaction, Madison exclaimed, “Negative six. Oh!” which suggested surprise. With Madison’s realization, Sophie walked away because Madison continued independently, working to explain the surprising event (Di Leo et al., 2019). Madison’s surprise was short-lived (Baker, Rodrigo, & Xolocotzin, 2007) and quickly transitioned into a different emotion (Di Leo et al., 2019). When Sophie returned to the group and asked, “How are the other ones going over here?”, Madison immediately picked her head up from the desk and said, “I just don’t understand.” When Sophie asked for clarification about 125 why Madison was confused, Madison responded, “I know how to do, but when we start with it, do we start with the regular chip board and make…”. She continued to stare down at her paper, rub her hand over her temple until it rested on her cheek, slightly covering and pulling back her slightly turned down mouth. This discussion was the third time Madison asked about the starting value of the chip board. During Sophie’s third visit, she provided the explanation: You make that chip board to have a value of this. I don’t care what combination. Remember when we did the chip boards and I said, make a value of positive 12? You can use as many or as little chips as you want, but the value has to be that. Does that make more sense? (personal communication, October 26, 2018) Madison attempted additional examples. Sophie’s final visit to the group for the day occurred after Madison raised her hand to ask, “Are these right?” Madison’s question implied that she remained unsure of her understanding. When Madison shared her answer, Sophie decided to model the problem. “Watch. Watch this. Yeah. Are you adding positive two?”, asked Sophie. “You’re subtracting positive two,” answered Madison. With the distinction of positive two versus negative two, Madison raised another concern, “Does it matter if it’s a negative or a positive?” Sophie and Madison were specifically discussing the second value, the value being added or subtracted. 
Through additional questioning and review of previous examples, Madison arrived at the following conclusion, “It matters but I don’t feel like the negative and positive matters. On this, when you did it. Oh yeah, it did matter.” Sophie’s fourth visit to Madison and her small group resulted in Sophie modeling the problem and Madison asking about why or how the second 126 number changes the final value. For a complete transcript of the interactions between Sophie and her student, see Appendix J. Salient Episode 2 with Tate The second student, Tate, worked in a group of four. Each student had their own chip board but worked together while sharing the chips to complete the addition or subtraction problems. Sophie noticed the students working asynchronously with varying recorded answers. Instead of pointing out incorrect answers, she asked Tate to show her how he arrived at his answer for the problem -4 – 2. “Show me this, show me number two with chips,” said Sophie. Sophie’s prompt grabbed the attention of all the students at the table as they listened to her question and watched Tate show how to complete the problem on his chip board. As Tate pulled chips toward him, four red chips with one hand and two red with the other, Sophie asked, “Wait, wait, what are you starting with?” Tate confidently replied, “Negative four.” No black chips on Tate’s chip board. Noting the two red chips in Tate’s one hand, Sophie began the interaction shown in the portions of transcript below (see Figure 23). Figure 23 Tate Transcript Described His Conundrum of Having No Black Chips on His Chip Board Sophie Tate Sophie Tate Sophie Tate Sophie Tate Group Member Sophie Tate Sophie Oh, you are starting with two positives, too? If you start with that, is that negative four? No, this is negative four. And this is positive two and we’re subtracting or taking. Okay, but does it say to add two positives? No, it says to subtract. Then why are you adding two positives? That’s negative four, right? Yeah, minus two. It takes away two. Takes away two what? It’s a positive and a negative. If you subtract a positive from a negative. It says subtract, it means take away from the negative four. But this is negative four. Do you have two positives to take away? No. Where? Are there two? That’s your start, right? That’s negative four? 127 Figure 23 (cont’d) Tate Sophie Group Member Tate Yeah, that’s the start. Do you have two? No, we don’t have two positives to take away. These are the two positives. Tate explained what the black and red chips represented, he explained how to create and determine the value of a chip board, and he explained that he must have two black chips since subtraction means removing the chips. His confusion arose when there were no black chips on his starting value of negative four, made with four red chips. Sophie identified this misconception and asked about adding the two black chips. Adding black chips without changing the value of Tate’s chip board. Sophie continued with her questioning (see Figure 24) by asking, “But are they on your board?” She drew attention to the relationship between Tate’s modeling and explanation of needing, but not having, two black chips to subtract. Tate confidently answered, “No, not yet.” He knew the chips were not on his chip board to subtract. So, he added two black chips because he knew they had to be on his chip board to remove. 
Sophie articulated this challenge when asking, “So how are we, how are we going to get them on the board without changing the value?” She asked Tate to identify how to accomplish what he knew needed to be done, without telling him how. Figure 24 Tate Described How to Add Black Chips Without Changing the Value Sophie Tate Sophie Tate Sophie But are they on your board? No, not yet. Okay, not yet. I like that. So how are we, how are we going to get them on the board without changing the value? Because you still have to start with a value of negative four. This is your starting value. Negative four. So it’s minus from the four. So if you do this, is it still negative four? 128 Figure 24 (cont’d) Tate Sophie No. So you can’t just (Teacher did not finish sentence.) Group member contributed an explanation. By this time, all the group members were listening (see Figure 25). Another student in the group had an “ah ha” moment. While placing two black chips along with two red chips on the board, she said, “Oh yeah.” The following interaction incorporated another student into the conversation to highlight how adding the same number of red and black chips did not change the overall value. Every time Tate’s group members added two red chips to compensate for the two black chips, Tate moved them away. Figure 25 Group Member Contributed an Explanation Group Member Sophie Sophie Sophie Tate Sophie Tate Sophie Tate Oh, yeah. Oh, what did [she] just do? So put your black ones back there. What did you just do? Group member added the same number of red and black chips. But why did you do that? [She] just put these on here, why did you put two more reds? If you add these then these ones can take it away and the value would still be a negative four. If you add two red and two black, what are you adding, essentially? Zero. Okay. So it’s still negative four, right? If we leave all of this right here, still negative four? Yeah. Tate objected to six red chips on his chip board. Depicted in Figure 26, now with two black chips on the board, Sophie asked, “Okay, can you take away that two black now?” Although she asked about completing subtraction, Tate remained unconvinced that adding two red chips was correct. He noted that the subtraction problem included the value -4, 129 represented by four red chips; however, there were six red chips now on the board. Another student provided an explanation of why six red chips were okay since they had also added two black chips. Figure 26 Tate’s Objection to Six Red Chips on His Chip Board Sophie Sophie Tate Group Member Okay, can you take away that two black now? Wait (Said at the same time as Tate’s question.) Why are we adding two of these now? There isn’t six red in the question. Because you don’t have enough to take away positive two, so you have to add two negative on and then you’d be able to take away your positive two. Relationship between the number of chips and the value of Tate’s chip board. Throughout the conversation, Tate had listened and responded to the teacher’s questions and the other students’ responses. He clearly had some understanding about how to add and subtract integers; however, at that moment he seemed to be beyond productive frustration (see Figure 27). Tate had been rubbing his hands on his forehead, shaking his head in disagreement, and moving red chips off the board when other group members had tried to suggest a solution. While others tried to explain why the same number of red and black chips needed to be added, Tate frowned. 
So, what did Sophie do? Sophie acknowledged his frustration and asked the student to “stay with” her. She asked the student to persevere even though he experienced difficulty in that moment. Figure 27 Tate Showed Frustration and Sophie Highlighted Relationship Between the Number of Chips and the Value of Tate’s Chip Board Sophie When we’re looking at the start. I know, right now, I know you’re at that breaking point. 130 Figure 27 (cont’d) Sophie Stay with me. Hold on. Let the chips do the work. Let the chips prove this to you. So if we add zero, Tate. Look at this chip board. This first number, is that just the amount of chips on there or is it the value of the board? Let’s go back to the front. This first number, that was always your value that you started out at, right? Did the teacher waver on her stance to keep asking questions? How did she balance the tension of supporting Tate while also keeping the task open? After another student explained why they were adding more red chips when the value was negative four (Tate thought he only needed four red chips), the teacher continued with her stance of questioning and asked a more directed question. Through the discussion it become clearer that Tate conflated the starting value of the chip board with the number of chips necessary on the chip board. Yes, the starting value was negative four and that could mean four red chips; however, there were other possibilities. Sophie reminded the student about the opening class activity where the whole class discussed multiple possibilities to represent the same starting value. Tate wavered in confidence. Even though the teacher still asked questions, she inserted a few hints to help Tate. Sophie stayed with Tate, not leaving him when he needed support. When Sophie returned to the beginning of their conversation, Tate understood how to determine the value of his chip board; however, he no longer understood if he could take away two black chips (see Figure 28). 131 Figure 28 Tate’s Confidence Wavered Tate Sophie Tate Sophie Sophie Tate Yes. Is this a value of negative four still? Yes. We’re all set with the starting value, right? Can you take away two black ones now? I don’t know. Tate and Sophie had been discussing this same point several times and for several minutes. Tate followed Sophie’s increased scaffolding. Eventually Sophie gave more pointed directions along with her questions (see Figure 29). Figure 29 Tate Followed Sophie’s Increased Scaffolding Sophie Tate Sophie Tate Sophie Tate Sophie Look, look at your chip board. Can you take away two black ones? I don’t know. Do you have two black ones on your chip board? Look, right here. Do you have two black chips right here? Yes. Then take them away. What’s left? Wait, did you take just two black ones? What’s left? Six. Negative six, right? I need, I need you to not get so frustrated. Okay? Ask me a question. I know, subtracting theses negatives are frustrating, right? Because usually things come pretty quick to you, right? I need, take a deep breath and I’m going to come back in a second. Thank. (Teacher ended conversation with group and moved to summary of lesson.) Sophie continued to press Tate to come to an understanding without providing all the answers. She chose to mediate frustration and encouraged Tate to continue. This particular student and teacher returned the following day and discussed the same assumption. Tate was still unclear about the chip board and did not fully understand the representation, even by the conclusion of the day. 
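To make the mathematics of this episode concrete for readers unfamiliar with the chip model, the following minimal sketch, written in Python and entirely my own illustration rather than part of the study or the CMP materials, encodes the reasoning Sophie was pressing Tate toward: a board’s value is the number of black chips minus the number of red chips, adding a red-black pair (a zero pair) leaves the value unchanged, and black chips can only be taken away once enough of them are on the board.

# Illustrative sketch only; red chips are worth -1 each, black chips +1 each.
class ChipBoard:
    def __init__(self, red=0, black=0):
        self.red, self.black = red, black

    def value(self):
        # The value of the board, not the number of chips on it.
        return self.black - self.red

    def add_zero_pair(self):
        # One red and one black chip together add zero, so the value is unchanged.
        self.red += 1
        self.black += 1

    def subtract_black(self, n):
        # To take away n black chips, first add zero pairs until n black chips exist.
        while self.black < n:
            self.add_zero_pair()
        self.black -= n
        return self.value()

# Tate's problem, -4 - 2: start with four red chips (value -4) and subtract positive two.
board = ChipBoard(red=4)
print(board.value())            # -4
print(board.subtract_black(2))  # -6, with six red chips left on the board

Run on -4 – 2, the sketch adds two zero pairs, removes the two black chips, and leaves six red chips, the value of negative six that Tate’s group eventually reached; the six red chips on the board are exactly the feature Tate objected to.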
While reflecting during her post-observation interview, Sophie commented that Tate had stated, “You’re pulling, you’re pulling numbers out of thin air.” It was Tate’s group, not Sophie, that had suggested adding two red and two black chips. Two days later, Sophie revisited Tate’s group to ask, “Okay. You’re done with the first one? So you’re doing number two? This is kind of where we left off Wednesday and our heads were hurting, right? Alright. Let’s see, what is going on. What’s your value?” During their conversation, Tate said, “I get this now” and articulated his understanding of using a chip board to model addition and subtraction of integers. Madison’s and Tate’s experiences and Sophie’s responses shared several characteristics. Considering the students’ experiences, both Madison and Tate became confused about a similar concept. When trying to complete the problem -4 – 2, Madison and Tate correctly created a chip board with the starting value of -4 by placing four red chips on their board. When Sophie began conversing with the students, they confidently breezed past the starting value to focus on the subtraction and became confused about the subtraction. They both noticed and were confused about the absence of black chips to take away. After several interactions and discussing the same concept, Madison and Tate visibly showed signs of frustration, such as bringing their hands to their forehead, placing their head on the desk, or frowning. Once the students experienced frustration as described, their confidence wavered. Madison and Tate no longer knew how to answer Sophie’s questions or what to suggest as the next step in the problem. Sophie’s responses to Madison and Tate also shared common characteristics. First, Sophie persisted with her questioning. For Madison, Sophie waited until her third visit, and for Tate, she waited until she had asked a variation of the same question several times, before providing an explanation about starting value. Sophie drew on the opening activity and classmates’ examples of creating several different chip boards that represented the same value to help Madison and Tate. When Madison and Tate showed increased signs of frustration and still struggled, Sophie’s final step was to model the problem for the students. Slight variations existed in Sophie’s decision making when interacting with Madison and Tate. Sophie’s interactions with Madison occurred over four separate visits. When Sophie felt Madison could proceed independently (i.e., Madison accepted a group member’s answer to her question or Madison exclaimed “Negative six, oh!”), Sophie left the group. Unlike with Madison, Sophie’s interaction with Tate occurred in a single extended visit. Even when a group member provided a strategy or explanation, Tate either moved the chips away or voiced disagreement. With Tate less likely to proceed independently, Sophie did not leave the group until after she modeled the problem. Once she modeled the problem, she asked Tate to try a few independently and checked in with him during a subsequent class. Student Resistance to Engaging with a Cognitively Demanding Task I consider the next two student experiences, those of Leonore and Evelyn, to represent a different category of emotions (e.g., positive activating, positive deactivating, negative activating, and negative deactivating) and to be less extreme in outwardly demonstrated emotion than the previous two examples.
Instead of confusion and frustration, students may respond to a cognitively demanding task by disengaging or negatively deactivating (Di Leo et al., 2019). A student who is disengaged or non-persistent during a task may exhibit disinterest, lack enthusiasm, fail to follow instructions, avoid social interactions, and have difficulty completing a task (Dumdumaya, Banawan, & Rodrigo, 2018). During an interview, Austin shared his concern about students shutting down (i.e., becoming disengaged or non-persistent) (personal communication, October 23, 2018). The next salient episode began with Austin managing student expectations during the Launch phase when he said, “It’s meant for you guys to struggle a bit. You grow from struggling a bit and then persevering through it. Even if you don’t quite get the right answer, you might spark an idea for somebody else” (personal communication, October 26, 2018). He acknowledged potential student struggle associated with problem solving (Stylianides & Stylianides, 2014) and maintained cognitive demand by deemphasizing a focus on producing correct answers (Smith & Stein, 1998) while simultaneously giving students guidance and encouragement before they began the Explore phase. Amidst the Explore phase, Austin engaged students in mini-summaries as a way to scaffold. It is during these mini-summaries and the Explore phase that the following interactions occurred. Salient Episode 3 with Leonore Austin asked students how they could make a graph using their equation. Students suggested an x- and y-coordinate table using a calculator. Austin asked a student, “Do you think you can take us through that process?” The student responded, “Probably not.” Instead of moving on and having another student respond, he asked, “Do you want at least to help us get started and maybe we can work it out together as a class?” The student proceeded to explain how to use the slope and y-intercept to write an equation in the calculator. Once Austin completed the mini-summary, he asked students to work in their groups to graph the equation and use it to answer the next portion of the task (e.g., How does this graph compare to the graphs of the exponential functions in Investigation 1?, How much of the lake surface will be covered at the end of the year by the plant?, and How many months will it take for the plant to completely cover the surface of the lake?). While students were working in their groups, Austin noticed a student who did not have a graph of the equation. The following interactions occurred. Figure 30 Leonore Confused When Graphing Function on Calculator Austin Leonore Austin Leonore Austin What’s up? How come you don’t have the graph done? I got confused on the calculator. You got confused. All right. Did you ask your group members to help you out? I bet they’d be more than happy to help you out. She looks like she’s got good answers. Will you let her help you out? Student shrugged shoulders. Don’t do it for her. Show her, help her. Okay? Teacher walked away to help other groups and the students proceeded with graphing the equation. Austin had just helped students graph the equation during a mini-summary, but Leonore had disengaged from the task. Her reason was that she had become confused with the calculator. Austin encouraged her to ask a group member for help and left once the group continued together. Salient Episode 4 with Evelyn After leaving, he walked to another group where the next interaction occurred.
Figure 31 Evelyn Frustrated and Disengaged from the Task Austin Evelyn Austin Evelyn Austin Evelyn What’s going on? I don’t know what to do! (Student said in a raised and higher pitched tone.) Did you ask your group members to help you? I said help me and they said nothing. (Student still had a raised and higher pitched tone.) Now you need to say, I really need your help please. [They] won’t (Student did not finish the sentence.) 136 Figure 31 (cont’d) Austin Austin Evelyn Austin Evelyn Austin Evelyn You need to take ownership. You need to, okay, [Student’s name], you’re missing the point here. You’re missing the point. (Student talked over the teacher about how the other group members were copying and not helping.) You need to take ownership. I’m having a bad day. Okay? I’m sorry you’re having a bad day. I don’t have great days either, okay? Well, don’t yell at me because I’m having a bad day. I’m not yelling at you. I’m telling you, you gotta get your work done. Okay. I interpreted Evelyn’s head shaking back and forth (i.e, shaking her head no) along with her raised and higher pitched tone as frustration. Again, Austin encouraged her to ask group members who could provide suggestions. Through their exchange, Evelyn persisted that the other group members were unwilling to help. Austin remained adamant that Evelyn also needed to take responsibility for her learning and persevere. Evelyn expressed she was having a bad day and did not appreciate Austin yelling at her. Austin reassured her that he was not yelling but still expected her to engage in the mathematical task. Once Evelyn and Austin found agreement, Austin walked away and Evelyn began working with her group members. Austin’s experiences were less extreme than Sophie’s in the sense that students were not brought to tears. Austin’s students were older and tended to disengage before persevering mathematically to the point of tears. I was less certain of the specific mathematical content which frustrated Leonore and Evelyn; however, their frustration was still present. Austin’s decision to intervene with Leonore and Evelyn occurred when they showed signs that they were disengaged with the task. After the initial student description of a possible student 137 solution, Leonore and Evelyn were unable to graph the equation. Austin’s discourse moves were also similar to Sophie’s as he asked questions which encouraged group members to become involved rather than providing the solution. Once he established a group member could help and Lenore and Evelyn were willing to collaborate, Austin left the group. Adding to Sophie and Austin’s observation data, I share Heather’s perspectives about students who struggled with concepts in their classes. Like Sophie, Heather made decisions about when to stay and when to leave a student or group. Heather shared her challenge to know when to leave. She said: That’s hard for me because I don’t want to leave them hanging. I want to keep pushing on their thinking because if, if they’re not really sure where to go next and I walk away, I do not believe that they’re going to try to continue to make sense of it. (personal communication, December 13, 2018) Heather’s reflection alluded to her desire to help students through an impasse until they had a possible solution strategy and could proceed. Intervening when students face impasses has educational implications for student perseverance, learning strategies, and achievement (Di Leo et al., 2019). 
In general, all three teachers intervened to provide support, so students had a possible solution strategy before leaving the individual student or small group. At times, teachers had to make decisions about when to let a student think about a concept until a subsequent class period. Heather reflected during a post-observation interview about her prior experiences with students’ struggles to understand addition and subtraction of integers with chip models and algorithms. She described students’ desire to memorize algorithms without being able to verify the validity with an alternative model. In particular, she 138 was not sure about the expectation for all students to come up with a concept at the same time. She discussed how some students may take more time to make sense of addition and subtraction and struggled to move onto multiplication and division. As a closing comment, she stated, “I mean learning is messy. That I do know very well and every hour, every group of kids it looks different and I guess that’s what keeps it exciting” (personal communication, December 13, 2018). Although every student is different, the teachers provided strategies for helping students that mediated tension with cognitive demand. The Role of Emotions in Students’ Experiences and Teachers’ Responses These episodes address an important and more recent focus on the role of emotion during learning (Di Leo et al., 2019) and add a dimension to previous literature about cognitively demanding tasks. Di Leo, Muis, Singh, and Psaradellis (2019) described learning “as an emotionally charged experience” when “emotions are fundamental and influential in students’ learning, motivation, self-regulated learning, and academic achievement” (p. 121). Student experiences with confusion and frustration, like the ones described in this chapter, can hinder student learning. My description of students’ emotions over time as they evolved showed that emotions were not isolated and transitioned during the problem-solving process (Harley, 2016, Di Leo et al., 2019). Madison and Tate grew increasingly confused, which transitioned into frustration, with Tate evolving further toward disengagement. Transcripts of Sophie’s interactions with Madison and Tate followed a pattern when interacting with students who were frustrated. Post-observation interviews with Austin and Heather shared similar decision making of how to respond to a student or small group who needed help. 139 All three teachers wrestled with decisions of when to leave or stay with an individual or group. Their decisions relied on if they felt an individual or group would proceed without their help. Sophie’s examples illustrated how an individual student was still confused when she decided to leave; however, the group as a whole helped one another and discussed ideas about how to proceed. Tate was confused about adding the same number of black and red chips even when Sophie decided to leave; however, he and his group members worked together and tried more examples to advance their understanding. Austin said during a post-observation interview that drawing students together in conversation exposed students to those higher-level thinking conversations and connections, even if they did not yet understand. Teachers’ Need to Mediate Tension Within the Context of Student Frustration was Necessary for Student Perseverance These episodes emphasize two main points. 
First, teachers encounter tension when enacting cognitively demanding tasks, especially when students struggle to the point of frustration. During each interaction, the teacher balanced when to stay and when to walk away, when to ask questions and when to provide a hint, and what to say and what not to say. As one teacher shared, she tried to stay until the student had a suggestion about how to proceed. Sophie illustrated this several times when she would walk away after a student had an “ah ha” moment or another conjecture to test. Austin provided an alternative context when students became frustrated; however, he still walked away when the interactions suggested the student would proceed without him. The context implied students were confused about the mathematics even though the discourse focused on the students’ actions and their halted work. The decisions teachers made balanced students’ productive exploration with their frustration. Second, these two episodes highlight students’ experiences during these tasks. Yes, students were exploring, discussing, conjecturing, and justifying their thinking; however, what was also necessary to acknowledge was their potential frustration. Persistence is key to completing challenging tasks in the face of frustration (Dumdumaya et al., 2018). Without acknowledging the relationship between frustration and persistence, teachers may be unprepared to mediate students’ frustration and may inadvertently impede student perseverance during the cognitively demanding task. By thinking about and identifying these general patterns in how a teacher mediates students’ frustration with exploratory learning, students and teachers may have better experiences. In all the cases, the teacher still pushed students to explore additional examples and be able to explain the mathematical concepts. In addition to holding students accountable for their own learning, when necessary, the teachers provided support through scaffolding to assist them in their learning. Teachers need to be prepared to address student frustration because persistence is not always productive or positive (Dumdumaya et al., 2018). Madison and Tate kept attempting the task without success, showing outward signs of reduced motivation and increased frustration. These students illustrated that students do not learn at the same pace and can get stuck in a wheel-spinning phenomenon (Dumdumaya et al., 2018). Whether a student readily grasped the concept on the first day or, in the case of Tate, understood after a few days, Sophie used funneling discourse moves to assist Tate in completing an interim task so he could attempt a similar task and, eventually, the goal task (Sullivan et al., 2006; Sullivan & Mornane, 2014; Wood et al., 1976). Brodie (2011) described sequences of elicit moves in relation to Bauersfeld’s (1980) funneling and Franke et al.’s (2009) leading questions as teacher discourse moves that suggested particular answers. Wood (1998) made a distinction between funneling student responses and focusing students’ mathematical thinking (seen in Franke et al., 2007) by contrasting who (e.g., teachers or students) had the intellectual burden. The teacher responses to student examples illustrated how teachers did more intellectual work during the sequences of elicit moves related to the interim task while still placing intellectual burden on students for the focus question.
In the context of student frustration, Sophie assisted Tate with an interim task that he was unable to complete independently, without answering the focus question, “How can we use a chipboard to represent subtraction?”, or the broader integer subtraction algorithm concept, and thus without potentially reducing the cognitive demand. These episodes illustrated how teachers anticipate students’ struggles and can have a model of how to proceed (Vale et al., 2019) while responding to individual student needs. As Heather commented in a concluding remark, she knew very well that learning was messy and with every hour, group of students, or individual student, the learning looked different. She saw anticipating and embracing the messiness of learning as an exciting piece of her job. Sharing students’ experiences that impacted teachers’ experiences may help prepare teachers for the messiness of learning so they too can find excitement. CHAPTER 7 DISCUSSION AND IMPLICATIONS OF SCAFFOLDING AND COGNITIVE DEMAND RELATIONSHIP To begin my conclusion, I return to RQ1: How do teachers think about the relationship between scaffolding and maintaining cognitive demand? Teacher interviews described teachers’ perceived and experienced tension between scaffolding and maintaining cognitive demand. My analysis of classroom observations, specifically teacher discourse, showed how teachers thought about questions as a bridge between scaffolding and maintaining cognitive demand. Teachers’ Perceived and Experienced Tension Influenced Scaffolding Choices Adding to recent literature about tension between scaffolding and cognitive demand (e.g., Sullivan & Mornane, 2014), Sophie, Austin, and Heather all described time as a major constraint (Foley, Khoshaim, Alsaeed, & Nihan, 2012). More importantly, all three teachers identified the ways time influenced how they scaffolded during their instruction and potentially decreased cognitive demand. Although the teachers emphasized the importance of exploration and discovery, they also acknowledged the expectation to adhere to schedules. For example, they wondered if their motivation to reach a certain point in a lesson influenced them to provide more directed help. Teachers asked questions with an obvious answer. Austin expressed this tension when discussing the challenge of asking questions or guiding students without making it really obvious. Austin reflected about the students’ attempts to write exponential equations and his challenge to ask questions without making an idea or solution obvious (Charalambous, 2008; Stein et al., 2000). Similarly, Heather discussed her struggle to provide help without too much direction as students learned about proportionality. Teachers led students through or eliminated portions of a written task. Time constraints also influenced in-the-moment decisions to change written tasks. Sophie explicitly described a time when she asked students to skip portions of the written task to ensure the students completed the “meat and potatoes” of the problem. Another example was when students analyzed two tables to determine a pattern of adding and subtracting integers with the goal of writing an algorithm. The written task was broken into smaller portions (i.e., interim tasks) to build up to the portion that allowed students to answer the focus question (i.e., goal task). Time constraints impacted the enactment of the task. The teacher encouraged students to move quickly through, or completely skip, a portion of the task to be efficient with time.
Time constraints contributed to increased tension between scaffolding – providing help during a task or deciding what portions of a task to include – and maintaining cognitive demand. All three teachers reflected about challenges to maintain cognitive demand, specifically related to the enactment phase (e.g., when and how to provide scaffolding that responded to specific student needs in different contexts) (Boston & Smith, 2011; Silver & Stein, 1996; Stein et al., 2000; Ursula de Araujo, 2012). Teachers Thought of Questions as a Bridge Between Scaffolding and Cognitive Demand Teacher questions dominated teacher discourse and provided a way for teachers to scaffold while maintaining cognitive demand. Teachers’ questions described a specific scaffolding behavior that aided teachers to successfully implement a task (Henningsen & Stein, 1997). Teacher questions intervened in ways that elicited, pressed, and maintained students’ 144 mathematical thinking (Brodie, 2011) and inserted information to aid students’ successful completion of the current task and learning of new concepts (Stein & Kaufman, 2010). Teachers thought of questions as examples of teacher discourse that connected scaffolding and cognitive demand. Additionally, teacher questions as teacher discourse seemed particularly important for mediating tension. Teacher question framework mediated tension. The teacher question framework I developed by drawing on pieces of work allowed me to understand how discourse moves that specifically related to mathematical content mediated tension. Existing literature identified teachers’ ability to ask an initial question (Franke et al., 2009); however, teaching is “not just about listening to students and asking them to describe their thinking” (Franke et al., 2007, p. 226). The teacher question framework implements discourse moves, pushes past an initial question or description of student thinking, and values multiple student voices to facilitate student reasoning and problem solving. Sophie’s chip board model example illustrated how the framework related to a specific focus question and allowed me to see how suggested teacher questions related to the focus question and were used for anticipating students’ reasoning (Vale et al., 2019). Role of context in how teachers mediated tension between scaffolding and cognitive demand. Equally important to understanding the complex role of teacher discourse are student responses and curricular considerations (Fennema et al., 1996; Franke et al., 2007). Teachers mediated tension between scaffolding and cognitive demand in the contexts of student uncertainty, student frustration, and lesson phases. 145 My analysis of teacher responses during student uncertainty highlighted how they mediated tension by using student uncertainty as opportunities to increase small group discourse, encourage student conjectures, and motivate student questions. Teachers supported and responded to student uncertainty in ways that placed intellectual burden on students (Franke et al., 2007) and maintained cognitive demand. Specific examples were discussed to show instances when teachers gave direct answers to student questions; however, these instances were specific and maintained the cognitive demand by leaving the focus question (i.e., goal task) unanswered. A similar context which focused on student experiences was student frustration. Students experience confusion that may lead to frustration and disengagement during problem solving tasks (Di Leo et al., 2019). 
My attention to student frustration placed value on an important student emotional context that relates to student achievement and learning. Teachers addressed potential hindrances to perseverance by identifying when a student transitioned into a counter-productive state (e.g., frustration, disengagement, wheel spinning) and providing help during these salient episodes. They mediated tension between scaffolding and maintaining cognitive demand by continuing to ask questions that drew on the small group discourse, leaving once students agreed upon a possible strategy, and delaying any moves that may lower cognitive demand (e.g., asking leading questions, modeling a problem, suggesting a solution path). At times they recognized students needed additional help and engaged in asking leading questions, modeling a problem, and suggesting a solution; however, teachers engaged in these moves only when students were unable to persist in a task. The teachers also mediated tension by providing increased scaffolding, specifically during interim tasks, while maintaining the cognitive demand of the goal task. Persistence is key to completing challenging tasks in the face of frustration (Dumdumaya et al., 2018) and students needed to persist in order to understand the goal task. I have discussed how lesson phases provided context clues and how suggested teacher questions supported teacher scaffolding in ways that increased the potential to maintain cognitive demand. Now, I turn to the focus question of a lesson to show how teachers mediated tension. For each lesson, teachers were provided with a focus question that summarized the overall goal of the task. Each of the interim tasks was sequenced and built to encourage students’ understanding of the focus question (i.e., goal task). Within a lesson, interim tasks break the larger task into manageable pieces (Jones & Tarr, 2007), provide increased scaffolding for an interim task, and help students develop understanding for the goal task. Interim and goal tasks provide a way to think about scaffolding and cognitive demand within the context of a lesson. I also argue an interim and goal task perspective applies to sequences of lessons and supports the view that tasks are dependent. Just as the interim tasks are sequenced and built up to a goal task (Sullivan & Mornane, 2014; Sullivan et al., 2006), the sequence of focus questions can be thought of as interim tasks that build up to the subsequent focus question. In essence, the prior focus questions break the later focus question into manageable pieces and become interim tasks and scaffolding for the later focus questions (Davis & Miyake, 2004; Jones & Tarr, 2007). This pattern continues throughout the intentional sequencing of the curriculum since the mathematical tasks are intentionally sequenced to develop deep mathematical understanding (Franke et al., 2007; CMP and Philosophy, n.d.). Similar to teachers’ voices, students’ perspectives are less apparent in extant research. Understanding students’ experiences, particularly their frustration during a lesson, contributes to a fuller description of interactions during cognitively demanding tasks. When students faced frustration, the teachers felt pressured to stop asking questions, provide an answer, or model a solution. Teachers made decisions about when to provide support and to what degree. The teachers returned several times to an individual or group of students, each time providing more explicit scaffolding.
A similar trend surfaced in my analysis of teachers’ questions. Although researchers have identified instances when the teacher lowered the cognitive demand of a task, usually by providing too much support or directed instructions, my analysis described possible student experiences and the tension teachers experienced prior to the lowering of the cognitive demand. Instead of Fading, Varied Contexts Illustrated Evolving Scaffolds Contrary to existing scaffolding research which suggested scaffolding fades (Davis & Miyake, 2004; Brown, Collins, & Duguid, 1989; Collins, Brown, & Holum, 1991; Stone, 1998), my discussion of student experiences and curricular considerations suggests teacher scaffolding is always present. In fact, teacher scaffolding is necessary to address frequent and important student needs, such as frustration and disengagement, during cognitively demanding tasks that may have academic implications (Di Leo et al., 2019). My teacher question framework and attention to various contexts address a need to think about scaffolding in complex ways by attending to scaffolding practices for individuals and groups of students and how scaffolding may change over time as students become more competent (Davis & Miyake, 2004). I described how teachers provided scaffolding to individuals that related to small group work and the ways in which the teacher questions framework identified similarities between teacher scaffolding of individuals, small groups, and whole-class discussions. In addition, I illustrated how the teacher question framework and contexts related to changes in teacher scaffolding during student uncertainty and frustration. Curriculum is a collection of academic tasks (Doyle, 1983), and the intentional instructional sequencing of curriculum is meaningful for students and teachers (Stylianides, 2007c, p. 209). Taking a dependent task perspective acknowledges the complexity of teaching, where teacher questions, student experiences, and a collection of tasks as written curriculum converge to influence in-the-moment teacher decisions and the enacted curriculum. Heather summarized this by stating, “Learning is messy” (personal communication, December 13, 2018), with messy and ambiguous interactions between students and teachers (Hiebert & Wearne, 1993). Teacher and Student Voices Detailed Salient Experiences Pertinent to Enacting Cognitively Demanding Tasks and Professional Development The emphasis on teachers’ implementation of the two constructs adds teachers’ voices to the existing body of literature; however, more work can be done to understand teacher and student interpretations of messy and ambiguous interactions. I faced challenges when coding teacher and student intentions and interpretations of discourse. Discourse prompts action during a social interaction between a teacher and a student who co-construct knowledge. I interpreted that to mean a teacher asked a question for a specific reason to prompt a student action and vice versa. Together, through their questions and responses, they came to a shared understanding. Instead of interpreting what a teacher said and the purpose of their talk from interview and classroom observation video and audio recordings, teachers’ own descriptions and explanations of what they did or said could be included. Similarly, student reflections and descriptions of their own experiences during the enactment of cognitively demanding tasks could be explored.
Both teachers’ and students’ perspectives add important details to ongoing research to support students’ exploration and development of mathematical thinking. Teachers who want to begin using cognitively demanding tasks must also be aware of likely shared and persistent experiences, ways to address tension and student experiences, and implications for their intervention. Literature already identifies and expands upon the importance of both scaffolding and cognitive demand; however, possible challenges and tensions teachers face are important aspects of the conversation. My analysis of Sophie’s, Heather’s, and Austin’s interviews and classroom observations highlighted shared experiences and struggles they faced during the enactment phase. More importantly, the teachers provided strategies that all teachers may implement when enacting cognitively demanding tasks. Documenting prompts and questions when planning and enacting problem-solving tasks helps teachers elicit and share expressions of generality during whole-class discussions (Vale et al., 2018). Additionally, Vale and colleagues (2018) called for thinking about how student responses satisfy a focus question, unfold in a sequence to provide a trajectory for students’ learning during whole-class discussion, and go beyond identifying a range of isolated possible student responses and teacher prompts. My teacher questions framework, which draws from their strategies, does just that. I described how particular teacher prompts and student responses related to a focus question and unfolded in a sequence (Codes 1-4) to develop students’ mathematical understanding. In addition, I showed how teachers used the framework in more contexts than just a whole-class discussion and supported student learning while maintaining cognitive demand. My teacher questions framework and attention to context cues (e.g., phases of a lesson, focus question, student uncertainty, and student frustration) demonstrated how teachers mediated tension in various contexts and contributes important insights to professional development.

APPENDICES

APPENDIX A Task Analysis Guide

Table 6 Task Analysis Guide

Lower-level demands

Memorization
• Involve either reproducing previously learned facts, rules, formulas, or definitions or committing facts, rules, formulas, or definitions to memory
• Cannot be solved by using procedures, because a procedure does not exist or because the time frame in which the task is being completed is too short to use a procedure
• Are not ambiguous. Such tasks involve exact reproduction of previously seen material, and what is to be reproduced is clearly and directly stated.
• Have no connections to the concepts or meaning that underlies the facts, rules, formulas, or definitions being learned or reproduced

Procedures without connections
• Are algorithmic. Use of the procedure is either specifically called for or is evident from prior instruction, experience, or placement of the task
• Require limited cognitive demand for successful completion. Little ambiguity exists about what needs to be done or how to do it.
• Have no connection to the concepts or meaning that underlies the procedure being used
• Are focused on producing correct answers instead of on developing mathematical understanding
• Require no explanations or explanations that focus solely on describing the procedure that was used

Higher-level demands

Procedures with connections
• Focus students’ attention on the use of procedures for the purpose of developing deeper levels of understanding of mathematical concepts and ideas
• Suggest, explicitly or implicitly, pathways to follow that are broad general procedures that have close connections to underlying conceptual ideas as opposed to narrow algorithms that are opaque with respect to underlying concepts
• Usually are represented in multiple ways, such as visual diagrams, manipulatives, symbols, and problem situations. Making connections among multiple representations helps develop meaning.
• Require some degree of cognitive effort. Although general procedures may be followed, they cannot be followed mindlessly. Students need to engage with conceptual ideas that underlie the procedures to complete the task successfully and that develop understanding.

Doing mathematics
• Require complex and nonalgorithmic thinking–a predictable, well-rehearsed approach or pathway is not explicitly suggested by the task, task instructions, or a worked-out example
• Require students to explore and understand the nature of mathematical concepts, processes, or relationships
• Demand self-monitoring or self-regulation of one’s own cognitive processes
• Require students to access relevant knowledge and experiences and make appropriate use of them in working through the task
• Require students to analyze the task and actively examine task constraints that may limit possible solution strategies and solutions
• Require considerable cognitive effort and may involve some level of anxiety for the student because of the unpredictable nature of the solution process required

APPENDIX B Teacher Consent Form for Research Participant Study Title: Teachers’ Use of Scaffolding During Cognitively Demanding Tasks Principal Investigator: Kathryn Appenzeller Department: Mathematical Sciences Contact Information: appenze2@msu.edu You are invited to participate in a research study about teaching mathematics by using scaffolding during cognitively demanding tasks. If you agree to be part of the research study, you will be asked to complete two interviews for each observed task. One prior to the teaching of a CMP task, and one following the teaching of a CMP task. The pre-observation interview will ask you to share about your lesson planning, specifically about how you may provide scaffolding during the task. The post-observation interview will ask you to reflect on the completed lesson, sharing your thoughts about specific episodes. In addition, classroom video and audio recording of the lesson will be collected. The recordings will include you, as well as two student groups. Participating in this research gives me permission to include your data in this study. Data will include my observations/notes and recordings (audio and video) of your enacted teaching and audio and video recordings of individual interview with you. Your data and contributions in the study will be communicated without identifying you in any way. Your privacy will be protected. Your confidentiality as a participant in this study will remain secure. I will never use your actual name in any reports. Data collected will be kept in confidence. Your participation in this study is completely voluntary. You have the right to say no. Even if you decide to participate now, you may change your mind and stop at any time. You may choose not to finish for any reason, without giving a reason and with no negative consequences. I will use what I learn through this research to support teachers and I may share what I am learning in publications and presentations.
If you have questions or concerns about this research study or your rights as a study participant, are dissatisfied at any time with any aspect of this study, or are concerned about a potential conflict of interest, you may contact Kathryn Appenzeller, doctoral candidate, at Michigan State University (appenze2@msu.edu). If you have questions or concerns about your role and rights as a research participant, would like to obtain information or offer input, or would like to register a complaint about this study, you may contact, anonymously if you wish, the Michigan State University Human Research Protection Program at 517-355-2180, fax 517-432-4503, email irb@msu.edu, or regular mail at 4000 Collins Rd, Suite 136, Lansing, MI 48910.

By continuing to the research procedures, I acknowledge that I am at least 18 years old, have read the above information, and agree to participate. I grant permission for the researcher to use all information collected for research and educational purposes. I understand that all information will remain confidential and that individual identities of participants will not be revealed in any study reports.

Printed Name: ________________________________
Signature: ________________________________
Date: ________________________________

In sharing what we are finding from this work or in future work with educators, we may want to use videos and/or photos collected during our study. We may include these in presentations and publications. Actual names will NOT be used with the photos or videos.

I give my consent for photos and videos to be used for educational purposes. I understand that real names will NOT be used with the photos and videos.

Printed Name: ________________________________
Signature: ________________________________
Date: ________________________________
Email: ______________________________

APPENDIX C

Parent & Student Information and Participant Form for Research

Dear Parent or Guardian,

This semester your child’s math teacher is working with Kathryn Appenzeller, a doctoral candidate at Michigan State University, to learn more about ways teachers provide help to students during instruction. This research will not interrupt the normal classroom instruction and will not affect your child’s interactions with the teacher. Your child will not be asked to participate in any activities that are different from the normal daily activities.

Several of the activities around this study may include the research team video recording your child’s teacher and classroom. The focus of the video recording will be on your child’s teacher, but your child might be in the video, too. No personal identifying information will be included in the videos. Our research team will analyze the videos for research purposes. We might talk about this study in classes, meetings, and/or conferences. We might also communicate the results in publications and/or presentations. In these cases, we will always keep your child’s information private.

It will not cost your child anything to be in a classroom that is involved in this study. The videos will be used to help improve teaching and learning for future teachers. All videos will be kept securely by our team at MSU. You reserve the right to withdraw consent at any time. If you opt out of the study and choose to have your child NOT participate, we will not intentionally videotape your child and will edit any footage containing your child.
Your child will still be able to participate fully in classroom mathematics instruction, and his/her grades will not be affected in any way by your decision to opt out. If you have any questions or concerns regarding your rights as a study participant, are dissatisfied at any time with any aspect of this study, or are concerned about a conflict of interest, you may contact Kathryn Appenzeller, doctoral candidate at Michigan State University (appenze2@msu.edu). Thank you for your consideration.

Please fill out and return the information below if you decide NOT to participate in the project.

Printed Name of Student: _________________________________________________________
Signature of Student: ____________________________________________________________

I am the parent/legal guardian of the child named above. I have received and read your letter regarding the MSU research team in my child’s classroom and do NOT give permission for my child to appear on a video recording and understand he/she will be seated outside of the recorded activities or edited from video recordings.

Printed Name of Parent/Guardian: _________________________________________________
Signature of Parent/Guardian: _____________________________________________________

APPENDIX D

Classroom Observation Form

Teacher: Date: Class Period: Class Time: 7:15-7:35 Video Recording Started at:
Task: Day:
General Notes: Questions to Ask:
Time (Clock) | What Happened | Who Said | Why Interesting
Launch
Explore
Summarize

APPENDIX E

Pre-Observation Semi-Structured Interview Protocol

I am joined today, _________, with _Sophie_. The purpose of this interview is to elicit teachers’ thinking as they comment on or relate their personal beliefs and experiences to the potential relationship between scaffolding and cognitive demand. Specifically, to understand how teachers think about scaffolding and cognitive demand when planning tasks. Thank you for taking time to discuss scaffolding and cognitive demand. I have some questions for you to think about and respond to. There are no right or wrong answers. Feel free to take the time to think about your answers. Your perspective is helpful to the study.

[First, we’ll start with some easy questions to get more comfortable. :) ]

2. What is your name?
3. What classes do you teach?

[Now, I’ll ask you a few general questions about your teaching.]

4. Can you describe your overall approach to teaching mathematics?
5. For this study, I view scaffolding as supports teachers provide to help students accomplish a task they otherwise would not be able to do independently. The supports could include guiding questions, prompts, examples, or other ways that you interact with students throughout class that facilitate their learning. Can you give me some examples of how you use scaffolding in your teaching practice? [We will return to a specific example at a later question so that I can probe this idea further.]
6. For this study, I view cognitive demand as a framework that helps describe students’ mathematical thinking during a task. Specifically, I’m considering cognitively demanding tasks, which may include aspects of non-algorithmic thinking where students engage with underlying mathematical concepts. When enacting cognitively demanding tasks, a goal is to maintain the cognitive demand, keeping aspects of the tasks as discussed previously. For example, students have not previously seen a known procedure which they can follow mindlessly. Can you provide examples of how you use cognitively demanding tasks?
a. During cognitively demanding tasks, what do you perceive to be your students’ experiences?
7. What relationship do you see (if any) between providing scaffolding and maintaining cognitive demand? If so, please explain.
8. Do you see any conflict between providing scaffolding and maintaining cognitive demand?
9. Describe a time when you experienced tension or a challenge to provide scaffolding during a cognitively demanding task.
a. What made the task cognitively demanding?
b. How did the students react to the task?
c. What kind(s) of scaffolding did you want to provide?
d. What feelings (e.g., frustration, excitement, challenge, etc.) did you (the teacher) experience during the example?

[We’ve talked more generally about your teaching practice in regard to scaffolding and cognitive demand. Let’s look at a specific example within the CMP curriculum. We’ll start with talking about the investigation as a whole, thinking across the investigation. Please look at Investigation 2 of Accentuate the Negative. I’d like to discuss how you would provide scaffolding for a particular task in the investigation. You can think hypothetically if you were planning a future class using the task. Or, you may reflect on a time when you used the task in the past.]

10. What is your learning goal for the investigation?
a. How does your goal compare to CMP3 curriculum materials?
11. How does each problem (2.1, 2.2, 2.3, 2.4) help you with the goal for the investigation? [Think about how the relationship of algorithms and conceptual understanding connects to the teacher’s learning goals.]
a. What role does the focus question have in your goal for the problem?
12. During which problem(s) (if any) did you or do you expect students to struggle?
13. During which part(s) of the problem (if any) did you or do you expect students to struggle?
a. How do you plan to support their learning during the struggle?
14. In the beginning of the unit, students explore addition and subtraction without algorithms. What happens once students learn about the algorithms?
a. Previously, we discussed scaffolding to use the various models instead of algorithms. How do you remove the scaffolding for the models so students can do it on their own? [In other words, do you check for understanding about the models, or do you stay with algorithms once students learn them? Are algorithms the ultimate goal?]
15. What types of scaffolding do you provide when students are learning addition and subtraction algorithms?
16. In what ways do you remove scaffolding so students may complete tasks independently?

[Questions to ask if there is time.]

17. In general, in what ways do you provide scaffolding to your students during the Launch phase?
18. In general, in what ways do you provide scaffolding to your students during the Explore phase?

[Thinking about specific tasks within the investigation: I wasn’t in your classroom for the previous investigation, but you recently taught it. Some of the following questions will have you think about Investigation 1 and its relationship to Investigation 2.]

19. Tell me briefly about how you approach your lesson for problem 2.1.

[Now, let’s dig into 2.1.]

20. For example, problem 2.1, part A presents two groups based on adding integers with the same sign or with different signs. Do you present the groups as stated in the written task, or do you group in an alternative way, such as the sum of the two integers being 0, positive, or negative?
21. In the previous investigation, students explored subtraction of positive and negative values.
Students learned both the number line and chip model.
a. Can you show me on a number line how you did plan or would plan to present the launch of problem 2.1? Specifically:
i. +8 + (-5) with a chip model
ii. +10 + (-25) and +10 − (-15) with a number line
iii. +9 − (-4) with a chip model
[-25 is the change; -15 is the start; the result is the change for subtraction.]
b. What model are you most comfortable with, and why?
c. How do you provide scaffolding for students who use the chip model compared to the students who use the number line?

[The last example we’ll discuss is 2.3, which relates subtraction to addition.]

22. Once you reach this problem, how do you talk about subtraction with your students? [Do you always change subtraction to addition, or think of subtraction as its own operation?]
a. In other words, how do you approach subtraction conceptually?
b. What are the ways your students might talk about subtraction?

Thank you for participating in this interview. Your perspective is very helpful for the study. If, by any chance, I have follow-up questions, may I contact you at a later date?

I am joined today, _________, with _Austin_. The purpose of this interview is to elicit teachers’ thinking as they comment on or relate their personal beliefs and experiences to the potential relationship between scaffolding and cognitive demand. Specifically, to understand how teachers think about scaffolding and cognitive demand when planning tasks. Thank you for taking time to discuss scaffolding and cognitive demand. I have some questions for you to think about and respond to. There are no right or wrong answers. Feel free to take the time to think about your answers. Your perspective is helpful to the study.

[First, we’ll start with some easy questions to get more comfortable. :) ]

1. What is your name?
2. What classes do you teach?

[Now, I’ll ask you a few general questions about your teaching.]

3. Can you describe your overall approach to teaching mathematics?
4. For this study, I view scaffolding as supports teachers provide to help students accomplish a task they otherwise would not be able to do independently. The supports could include guiding questions, prompts, examples, or other ways that you interact with students throughout class that facilitate their learning. Can you give me some examples of how you use scaffolding in your teaching practice? [We will return to a specific example at a later question so that I can probe this idea further.]
5. For this study, I view cognitive demand as a framework that helps describe students’ mathematical thinking during a task. Specifically, I’m considering cognitively demanding tasks, which may include aspects of non-algorithmic thinking where students engage with underlying mathematical concepts. When enacting cognitively demanding tasks, a goal is to maintain the cognitive demand, keeping aspects of the tasks as discussed previously. For example, students have not previously seen a known procedure which they can follow mindlessly. Can you provide examples of how you use cognitively demanding tasks?
a. During cognitively demanding tasks, what do you perceive to be your students’ experiences?
6. What relationship do you see (if any) between providing scaffolding and maintaining cognitive demand? If so, please explain.
7. Do you see any conflict between providing scaffolding and maintaining cognitive demand?
8. Describe a time when you experienced tension or a challenge to provide scaffolding during a cognitively demanding task.
a. What made the task cognitively demanding?
b. How did the students react to the task?
c. What kind(s) of scaffolding did you want to provide?
d. What feelings (e.g., frustration, excitement, challenge, etc.) did you (the teacher) experience during the example?

[We’ve talked more generally about your teaching practice in regard to scaffolding and cognitive demand. Let’s look at a specific example within the CMP curriculum. We’ll start with talking about the unit Growing, Growing, Growing.]

9. What investigations do you teach? [For this unit, many teachers pick and choose the investigations. What they choose may be connected with the Common Core State Standards.]
a. How much do you emphasize the different investigations?
i. Investigation 4 is exponential decay, and Investigation 5 is patterns with exponents. Does each of these investigations receive less or more time for instruction? [Teachers may deemphasize or skip Investigation 4 because it is not in CCSS. Teachers may emphasize Investigation 5.]
10. How do you prepare students in Investigation 2 for what will come later in the unit?
a. For example, growth rates are discussed in Investigation 3. How do you use growth factors to help prepare students to learn about growth rates?

[Please look at Investigation 2 of Growing, Growing, Growing. I’ll be asking questions that have us thinking across the investigation. I’d like to discuss how you would provide scaffolding for a particular task in the investigation. You can think hypothetically if you were planning a future class using the task. Or, you may reflect on a time when you used the task in the past.]

11. What is your learning goal for the investigation?
a. How does your goal compare to CMP3 curriculum materials?
12. How does each problem (2.1, 2.2, 2.3) help you with the goal for the investigation? [Think about how the different representations in Investigation 2 help students understand linear and exponential relationships. How do the patterns used to come up with an equation, the equation itself, and the graph relate to one another?]
13. How do you see the three problems (2.1, 2.2, 2.3) relating to one another?
a. To what extent do you tie the three representations together? For example, in the equation y = a(b^x), where does a show up in the table, equation, and graph? Where does b show up in the table, equation, and graph? Where is the growth factor in the table, equation, and graph? [See the brief note on y = a(b^x) at the end of this appendix.]
b. How would you provide scaffolding for students who struggle with understanding the equation of an exponential relationship?
c. How would you provide scaffolding for students who struggle with understanding the graphical representation of an exponential relationship?
14. During which problem(s) (if any) did you or do you expect students to struggle?
15. During which part(s) of the problem (if any) did you or do you expect students to struggle?
a. How do you plan to support their learning during the struggle?
16. In what ways do you remove scaffolding so students may complete tasks independently?

[Questions to ask if there is time.]

17. In general, in what ways do you provide scaffolding to your students during the Launch phase?
18. In general, in what ways do you provide scaffolding to your students during the Explore phase?

[Thinking about specific tasks within the investigation.]

19. Tell me briefly about how you approach your lesson for problem 2.3.

[Now, let’s dig into 2.3.]

20. For example, in problem 2.3, students look at a graph representing exponential growth.
What connections do you see with the graphical representation, equation, and table?
a. What do you draw from in the previous problems (2.1 and 2.2) to help students understand the graphical representation?
i. Do you consider where the parts of the equation y = a(b^x) appear in the graph?
b. To what extent do you have students analyze and make sense of the graph?

Thank you for participating in this interview. Your perspective is very helpful for the study. If, by any chance, I have follow-up questions, may I contact you at a later date?

I am joined today, _________, with _Heather_. The purpose of this interview is to elicit teachers’ thinking as they comment on or relate their personal beliefs and experiences to the potential relationship between scaffolding and cognitive demand. Specifically, to understand how teachers think about scaffolding and cognitive demand when planning tasks. Thank you for taking time to discuss scaffolding and cognitive demand. I have some questions for you to think about and respond to. There are no right or wrong answers. Feel free to take the time to think about your answers. Your perspective is helpful to the study.

[First, we’ll start with some easy questions to get more comfortable. :) ]

1. What is your name?
2. What classes do you teach?

[Now, I’ll ask you a few general questions about your teaching.]

3. Can you describe your overall approach to teaching mathematics?
4. For this study, I view scaffolding as supports teachers provide to help students accomplish a task they otherwise would not be able to do independently. The supports could include guiding questions, prompts, examples, or other ways that you interact with students throughout class that facilitate their learning. Can you give me some examples of how you use scaffolding in your teaching practice? [We will return to a specific example at a later question so that I can probe this idea further.]
5. For this study, I view cognitive demand as a framework that helps describe students’ mathematical thinking during a task. Specifically, I’m considering cognitively demanding tasks, which may include aspects of non-algorithmic thinking where students engage with underlying mathematical concepts. When enacting cognitively demanding tasks, a goal is to maintain the cognitive demand, keeping aspects of the tasks as discussed previously. For example, students have not previously seen a known procedure which they can follow mindlessly. Can you provide examples of how you use cognitively demanding tasks?
a. During cognitively demanding tasks, what do you perceive to be your students’ experiences?
6. What relationship do you see (if any) between providing scaffolding and maintaining cognitive demand? If so, please explain.
7. Do you see any conflict between providing scaffolding and maintaining cognitive demand?
8. Describe a time when you experienced tension or a challenge to provide scaffolding during a cognitively demanding task.
a. What made the task cognitively demanding?
b. How did the students react to the task?
c. What kind(s) of scaffolding did you want to provide?
d. What feelings (e.g., frustration, excitement, challenge, etc.) did you (the teacher) experience during the example?

[We’ve talked more generally about your teaching practice in regard to scaffolding and cognitive demand. Let’s look at a specific example within the CMP curriculum. We’ll start with talking about the investigation as a whole, thinking across the investigation. Please look at Investigations 1 and 2 of Stretching and Shrinking.
I’d like to discuss how you would provide scaffolding for a particular task in the investigation. You can think hypothetically if you were planning a future class using the task. Or, you may reflect on a time when you used the task in the past.]

9. What is your learning goal for the investigation?
a. How does your goal compare to CMP3 curriculum materials?
10. How does each problem (1.1, 1.2, 2.1, 2.2, 2.3) help you with the goal for the investigation? [Think about how the mucking around of the first two investigations relates to later learning goals.]
a. What role does the focus question have in your goal for the problem?
11. During which problem(s) (if any) did you or do you expect students to struggle?
12. During which part(s) of the problem (if any) did you or do you expect students to struggle?
a. How do you plan to support their learning during the struggle?
13. In the beginning of units, students explore similarity without knowing a mathematical definition. What kind of guidance do you provide during this exploration? (How do you decide when and if you should intervene?)
a. During the first two investigations, students are talking about similarity, comparisons, and measurements in more general ways. At times, finding the exact measurement (the hypotenuse of a triangle) is not something students know yet. What is your approach to teaching during these instances?
14. What types of scaffolding do you provide when students are learning about similar figures?
15. In what ways do you remove scaffolding so students may complete tasks independently?

[Questions to ask if there is time.]

16. In general, in what ways do you provide scaffolding to your students during the Launch phase?
17. In general, in what ways do you provide scaffolding to your students during the Explore phase?

[Thinking about specific tasks within the investigation: I wasn’t in your classroom for the previous investigation, but you recently taught it. Some of the following questions will have you think about Investigation 1 and its relationship to Investigation 2.]

18. Tell me briefly about how you approach your lesson for problem 1.1.
a. Are you using the rubber band activity to introduce the idea of similarity?
i. Can you share your thinking behind your decision?
19. Tell me briefly about how you approach your lesson for problem 2.1.

[Now, let’s dig into 2.1.]

20. For example, in problem 2.1, students begin to draw shapes on the coordinate plane. What prior knowledge do you identify that students need for this problem?
a. How do you help students who may not understand the coordinate grid (x-axis and y-axis orientation)?
b. How do you help students who do not understand how to plot points on the coordinate grid?
c. How does the development of these prior skills relate to later problems where students use or write rules that create a similar (or not) figure?

[The last example we’ll discuss is 2.2.]

21. In problem 2.2, students have an option to predict how a rule may change a figure. Do you see this prediction exercise tying to previous problems?
22. In what ways do you draw students’ attention back to previous units?
a. Do you think this way of looking back is a way to scaffold? Why or why not?
b. Do you think looking forward and starting conversations to foreshadow what will come later is scaffolding? Why or why not?

Thank you for participating in this interview. Your perspective is very helpful for the study. If, by any chance, I have follow-up questions, may I contact you at a later date?
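Note on the exponential form referenced in the Growing, Growing, Growing questions above. The following is a minimal sketch, assuming the CMP3 convention that a denotes the initial value and b the growth factor in y = a(b^x); it simply restates where each parameter can be seen in a table and a graph:

\[
y = a\,(b^{x}), \qquad a = y(0)\ \text{(the table entry at } x = 0 \text{; the graph’s } y\text{-intercept)}, \qquad b = \frac{y(x+1)}{y(x)}\ \text{(the constant ratio between consecutive table entries)}.
\]

In a table of (x, y) values, b appears as the common multiplier from one row to the next; on the graph, it governs how quickly the curve rises.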
167 Post-Observation Semi-Structured Interview Protocol APPENDIX F I am joined today, _________, with _______. The purpose of this interview is to elicit teachers’ thinking as they reflect back on recent teaching. Specifically, to understand how teachers think about scaffolding and cognitive demand after teaching a lesson or lessons. Thank you for taking time to discuss scaffolding and cognitive demand. I have some questions for you to think about and respond to. There are no right or wrong answers. Feel free to take the time to think about your answers. Your perspective is helpful to the study. 1. Please give your general reaction to the enactment of the task. What went well? What was surprising? What, if anything, would you do differently next time? 2. How much scaffolding do you think you provided? a. Is this level of scaffolding typical of your teaching? Why or why not? b. What kinds of scaffolding did you provide during the launch to help students access the problem and get started? c. What kinds of scaffolding did you provide during the exploration of the problem? d. What kinds of scaffolding did you provide during the summarize portion? 3. In what ways did you provide scaffolding? (Please give an explicit example from the lesson). a. What did you notice that suggested that students needed scaffolding? b. Did you struggle in any way to provide scaffolding without reducing cognitive demand? c. How did students respond to your scaffolding? 4. In the future, how do you plan to remove the scaffolding so students may try the problem, or similar problems, independently? 5. In a particular moment, I noticed you did this… (have teachers watch the short video clip of the episode). a. What was your thinking behind the decision to say/do/show… 6. What relationship do you see (if any) between providing scaffolding and maintaining cognitive demand? 168 a. Has your thinking about this relationship changed since the pre-observation interview? demand? interview? 7. Do you see any conflict between providing scaffolding and maintaining cognitive a. Has your thinking about potential conflict changed since the pre-observation 169 APPENDIX G Teacher Questions Spreadsheet Notes Code 1 Code 2 Code 3 Code 4 Code 5 170 File Focus Question Footnote Number Teacher Question Explore Phase Percentages for Every Lesson APPENDIX H Table 7 Explore Phase Percentages for Every Lesson Recording To make sense of students’ mathematical work? To rely on themselves to determine correctness? To learn to reason mathematically? To learn to conjecture, invent, and solve problems? To connect mathematics and its applications? NA 33% 60% 74% 48% 56% 44% 80% NA 0% 15% 17% 67% 40% 38% 28% 9% 13% 60% 67% 56% 0% 43% 48% 28% 23% 100% 57% 10% 38% 60% 57% 38% 35% 24% 60% 1 (Oct. 23) 3rd Hour 2 (Oct.23) 2nd Hour 3 (Oct. 24) 2nd Hour 4 (Oct. 24) 3rd Hour 5 (Oct. 26) 2nd Hour 6 (Oct. 26) 3rd Hour 7 (Oct. 29) 2nd Hour 8 (Oct 29) 3rd Hour 9 (Oct. 26) 4th Hour 10 (Oct 29) 4th Hour 11 (Nov. 11) 12 (Nov. 12) 71% 43% 70% 75% 47% 62% 74% 73% 39% 48% 17% 42% 42% 29% 60% 59% 50% 64% 46% 58% 56% 31% 25% 42% 171 APPENDIX I Student Uncertainty Resolution Type 3 Example Complete Transcript Which one? How can you take away two black when there is no black? That is a great question. There’s no black. I’m trying to show you. … you add them. Let’s see it. Let’s say you’re $4 in debt. Okay. And you clean the house and you get $2. Since you have $2, you have to instantly put it in, so you’d have $2 left in debt. But… Okay, hold on. Show that again. 
You have $4 in debt, right? Why are you adding 4 black chips? (pause) Is that what it says to do? No. No, it says to take them away. Do I have any black chips here to take away? No. That is our problem. How could I get black chips on there without changing the value? Add black and red. Try it. Let’s try it. Add a black and a red. Is that value still negative four? Yes. Because we’re adding zero, right? So when we take them away, the two away, it’s negative six. (girl) Negative six. What do you think about that? You were adding black chips. What if you added more? 172 Sophie Girl Sophie Boy Girl Sophie Boy Sophie Boy Sophie Sophie Boy Girl Sophie Girl Sophie Girl Sophie Girl Sophie Girl Sophie Sophie Girl APPENDIX J Salient Episode 1 Complete Transcript Student 37, what are we doing? Okay, so then what’s wrong with it? But you got to answer of four? Show me, show me with the chips, how the answer is four. Make the original chip board. Help her get the original chip board back. (Teacher was talking to a student in the group.) And what is number two say to do? So now, figure out what the value of that. And that’s how we got the answer of negative one. Is the value counting all the chips? Is that what the value was? Student 17, how did you get a number of a value of one? Show her what you did. I went like this. Okay This really goes down here. And then I looked at both and there’s one left here. So, that row cancels, right? All those are zeros. And then you’ve got one left, right? It should be positive one. So, go back to the original. Do the next one. What does number two say to start with? Start with negative four and positive two. I’m just confused on. (Student did not finish her sentence.) What does number two say to start with? Negative four and. (Teacher interrupts student before she finished her sentence.) Show me a chip board that has negative four. Is that a negative four chip board? This is a negative four chip board. Okay. That’s a negative chip board. I agree. What do they want you to do? Subtract, no. Whoa, but what are you doing? Is that wrong? What does it want you to do? Subtract positive two. Do you have positive two? No, so we have to add positive two and positive red. I mean negative red, right? Try it. Let’s see. What are you really adding? Zero Is that okay? Is that still negative four? Yeah. No. (Students within the group voiced different responses.) Okay. Now can you do the next part? Yeah. What’s your answer? So, then we subtract this. 173 Teacher Teacher Teacher Student Teacher Student 17 Teacher Student Student 17 Teacher Teacher Student Student Teacher Student Teacher Student Teacher Student Teacher Student Teacher Student Teacher Student Teacher Student Teacher Students Teacher Student Teacher Student Student Teacher Student Teacher Student Teacher Student Teacher Student Teacher Student Student Teacher Student Teacher Student Student Teacher Student Teacher Negative six. Oh! Teacher walked away, helped other small groups of students, and returned for the next conversation. How are the other ones going over here? I just don’t understand. Which one? I know how to do, but when we start with it, do we start with the regular chip board and make. You make that chip board to have a value of this. I don’t care what combination. Remember when we did the chip boards and I said, make a value of positive 12? You can use as many or as little chips as you want, but the value has to be that. Does that make more sense? 
Teacher walked away, helped other small groups of students, and returned for the next conversation. Are these right? What did you get for number two? Negative six. Watch. Watch this. Yeah. Are you adding positive two? (The teacher modeled the problem for the student.) You’re subtracting positive two. Does it matter if it’s a negative or a positive? That makes a big difference, doesn’t it? On the second number? The second number does that. Does that make a difference? (This line of questions was slightly different from previous questions. The previous questions asked about how to complete the addition or subtraction to get an answer. This question was about why or how it changed if the second number changed.) I feel like it doesn’t It’s just like this one [inaudible]. Does the second number matter? It matters but I don’t feel like the negative and positive matters. On this, when you did it. Oh yeah, it did matter. It depends on what color you’re taking away or you’re adding. 174 REFERENCES 175 REFERENCES Baker, R. S., Rodrigo, M. M. T., & Xolocotzin, U. E. (2007, September). The dynamics of affective transitions in simulation problem-solving environments. In International Conference on Affective Computing and Intelligent Interaction (pp. 666-677). Springer, Berlin, Heidelberg. Bakker, A., Smit, J., & Wegerif, R. (2015). Scaffolding and dialogic teaching in mathematics education: Introduction and review. ZDM, 47, 1045-1065. Bauersfeld, H. (1980). Hidden dimensions in the so-called reality of a mathematics classroom. Educational studies in mathematics, 11(1), 23-41. Bloom, D., Carter, S., Christian, B., Otto, S., & Faris, N. (2010). Discourse analysis and the study of classroom language and literacy events: A microethnographic perspective. New Jersey: Lawrence Erlbaum Associates. Boston, M. D., & Smith, M. S. (2011). A ‘task-centric approach to professional development: enhancing and sustaining mathematics teachers’ ability to implement cognitively challenging mathematical tasks. ZDM, 43(6-7), 965-977. Boyd, M., & Markarian, W., (2011). Dialogic teaching: Talk in service of a dialogic stance. Brodie, K. (2007). Teaching with conversations: Beginnings and endings. For the learning of Language and Education, 25(6), 515-534. Mathematics, 27(1), 17. Brodie, K. (2011). Working with learners’ mathematical thinking: Towards a language of description for changing pedagogy. Teaching and Teacher Education, 27(1), 174-186. Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Bruner, J. S. (1975a). A communication to language – A psychological perspective. Cognition, Educational Researcher, 18(1), 32–41. 3(3), 255-287. Bruner, J. S. (1975b). The ontogenesis of speech acts. Journal of Child Language, 2(1), 1-19. Calder, N. (2015). Student wonderings: Scaffolding student understanding within student- centered inquiry learning. Chapin, S. H., & O’Connor, M. C. (2004). Project challenge: Identifying and developing talent in 176 mathematics within low-income urban schools (Research Report). Boston, MA: Boston University. Chapin, S. H., O’Connor, M. C., & Anderson, N. C. (2009). Classroom discussions: Using math talk To help students learn (2nd ed.). Sausalito, CA: Math Solutions Publications. Charlambous, C. Y. (2008). Mathematical knowledge for teaching and the unfolding of tasks in mathematics lessons: Integrating two lines of research. In Proceedings of the 32nd Conference of the International Group for the Psychology of Mathematics Education (Vol. 2, pp. 
281-288). Cirillo, M., Drake, C., Eisenmann, B. H., & Hirsch, C. (2009). Curriculum vision and coherence: Adapting curriculum to focus on authentic mathematics. Mathematics teacher, 103(1), 70-75. Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive apprenticeship: Making thinking visible. American Educator, 15(3), 6–11, 38–39. Connected Mathematics Project. (n.d.). Retrieved from https://connectedmath.msu.edu/curriculum-design/philosphy/ Connected Mathematics Project. (n.d.). Retrieved from https://connectedmath.msu.edu/curriculum-design/philosophy/explore/ Davis, E. A. & Miyake, N. (2004). Explorations of scaffolding in complex classroom systems. The Journal of the Learning Sciences, 13(3), 265-272, DOI: 10.1207/ s15327809jls1303_1 Di Leo, I., Muis, K. R., Singh, C. A., & Psaradellis, C. (2019). Curiosity… Confusion? Frustration! The role and sequencing of emotions during mathematics problem solving. Contemporary Educational Psychology, 58, 121-137. Doyle, W. (1983). Academic Work. Review of Educational Research, 53(2), 159-199. http://doi:10.2307/1170383 Dumdumaya, C. E., Banawan, M. P., & Rodrigo, M. M. T. (2018, July). Identifying Students' Persistence Profiles in Problem Solving Task. In Adjunct Publication of the 26th Conference on User Modeling, Adaptation and Personalization (pp. 281-286). ACM. Edson, A. J., (2017). Learner-controlled scaffolding linked to open-ended problems in a digital learning environment. ZDM Mathematics Education. doi: 10.1007/S11858-017-0873-5 Edson, A. J., Phillips, E., Slanger-Grant, Y., & Stewart, J. (2019). The Arc of Learning framework: An ergonomic resource for design and enactment of problem-based curriculum. International Journal of Educational Research. Retrieved from 177 https://doi.org/10.1016/j.ijer.2018.09.020 Fennema, E., Carpenter, T. P., Franke, M. L., Levi, L., Jacobs, R., & Empson, S. B. (1996). A longitudinal study of learning to use children’s thinking in mathematics instruction. Journal for Research in Mathematics Education, 27(4), 403–434. Foley, G. D., Khoshaim, H. B., Alsaeed, M., & Nihan Er, S. (2012). Professional development in statistics, technology, and cognitively demanding tasks: classroom implementation and obstacles. International Journal of Mathematical Education in Science and Technology, 43(2), 177-196. Franke, M. L., Kazemi, E., & Battey, D. (2007). Mathematics teaching and classroom practice. Second handbook of research on mathematics teaching and learning, 1(1), 225- 256. Franke, M. L., Webb, N. M., Chan, A. G., Ing, M., Freund, D., & Battey, D. (2009). Teacher questioning to elicit students’ mathematical thinking in elementary school classrooms. Journal of Teacher Education, 60(4), 380-392. Gee, J. (2014). Social linguistics and literacies: Ideology in discourses. Routledge. Harley, J. M. (2016). Measuring emotions: a survey of cutting edge methodologies used in computer-based learning environment research. In Emotions, technology, design, and learning (pp. 89-114). Academic Press. Henningsen, M., & Stein, M. K. (1997). Mathematical tasks and student cognition: Classroom- based factors that support and inhibit high-level mathematical thinking and reasoning. Journal for Research in Mathematics Education, 28(5), 524–549. http://doi.org/10.2307/749690 Herbel-Eisenmann, B. A., Steele, M. D., & Cirillo, M. (2013). (Developing) teacher discourse moves: A framework for professional development. Mathematics Teacher Educator, 1(2), 181-196. Hiebert, J., & Wearne, D. (1993). 
Instructional tasks, classroom discourse, and students’ learning in second-grade arithmetic. American Educational Research Journal, 30(2), 393– 425. Jackson, K., Garrison, A., Wilson, J., Gibbons, L., & Shahan, E. (2013). Exploring relationships between setting up complex tasks and opportunities to learn in concluding whole-class discussions in middle-grades mathematics instruction. Journal for Research in Mathematics Education, 44(4), 646–682. http://doi.org/10.5951/jresematheduc.44.4.0646 178 Jaworski, A., & Coupland, N. (2005b). Introduction: Perspectives on discourse analysis. In A. Jaworski & N. Coupland (Eds.), The discourse reader (2nd ed., pp. 1-37). London: Routledge. Jones, D. L., & Tarr, J. E. (2007). An examination of the levels of cognitive demand required by probability tasks in middle grades mathematics textbooks. Statistics Education Research Journal, 6(2), 4-27. Lampert, M. (2001). Teaching problems and the problems of teaching. Yale University Press. Lappan, G., Phillips, E. D., Fey, J. T., & Friel, S. N. (2014). Accentuate the Negative: Integers and Rational Numbers. In Lappan, Phillips, Fey, & Friel, Connected Mathematics 3. Boston: Pearson Education, Inc. Lappan, G., Phillips, E. D., Fey, J. T., & Friel, S. N. (2014). Growing, Growing, Growing: Exponential Functions. In Lappan, Phillips, Fey, & Friel, Connected Mathematics 3. Boston: Pearson Education, Inc. Lappan, G., Phillips, E. D., Fey, J. T., & Friel, S. N. (2014). Stretching and Shrinking: Understanding Similarity. In Lappan, Phillips, Fey, & Friel, Connected Mathematics 3. Boston: Pearson Education, Inc. Lloyd, G., & Wilson, M. (1998). Supporting innovation: The impact of a teacher's conceptions of functions on his implementation of a reform curriculum, Journal for Research in Mathematics Education, 29(3), 248-274. Mason, J. (1998). Talking while learning mathematics versus learning the art of Mathematical conversation. For the Learning of Mathematics, 18(1), 41-51. Mercer, N., & Littleton, K. (2007). Dialogue and the development of children's thinking: A sociocultural approach. London: Routledge. National Council of Teachers of Mathematics. 2000. Principles and standards for school mathematics, Reston, VA: National Council of Teachers of Mathematics. Parks, A. N., & Schmeichel, M. (2014). Children, mathematics, and videotape: Using multimodal analysis to bring bodies into early childhood assessment interviews. American Educational Research Journal, 51(3), 505-537. Reinhart, S. C. (2000). Never say anything a kid can say! Mathematics Teaching in the Middle School, 5(8), 478–483. http://doi.org/http://dx.doi.org/10.1108/17506200710779521 Robitaille, D. F., & Travers, K. J. (1992). International studies of achievement in mathematics. In 179 D. A. Grouws (Ed.), Handbook of research on mathematics teaching and learning: A project of the National Council of Teachers of Mathematics (pp. 687-709). New York, NY: Macmillan Publishing Co., Inc. Roll, I., Homes, N. B., Day, J., & Bonn, D. (2012). Evaluating metacognitive scaffolding in guided invention activities. Instructional Science, 40(4), 691-710. Ryve, A., (2011). Discourse research in mathematics education: A critical evaluation of 108 journal articles. Journal for Research in Mathematics Education, 42(2), 167-199. Schiffrin, D. (1994). Approaches to discourse: Language as social interaction. Oxford: Blackwell. Schoenfeld, A. H. (1992). Learning to think mathematically: Problem solving, metacognition, and sense making in mathematics. In D. 
Grouws (Ed.), Handbook for research on mathematics teaching and learning (pp. 334-370). New York: Macmillan. Sfard, A. (2008). Thinking as communicating: Human development, the growth of discourses, and mathematizing. Cambridge: Cambridge University Press. Silver, E. A., & Stein, M. K. (1996). The QUASAR Project: The “Revolution of the Possible” in mathematics instructional reform in urban middle schools. Urban Education, 30(4), 476– 521. Smith, M. S., Bill, V., & Hughes, E. K. (2008). Thinking through a Lesson : Successfully implementing high-level tasks. Mathematics Teaching in the Middle School, 14(3), 132– 138. Smith, M. S., Hughes, E. K., Engle, R. A., & Stein, M. K. (2009). Orchestrating Discussions 5 Practices Class handout.pdf. Mathematics Teaching in Middle School. Smith, M. S., & Stein, M. K. (1998). Selecting and creating mathematical tasks: From research to practice. Teaching Mathematics in the Middle School, 3(5), 344-350. Stein, M. K., Engle, R. A., Smith, M. S., & Hughes, E. K. (2008). Orchestrating productive mathematical discussions: Five practices for helping teachers move beyond show and tell. Mathematical Thinking and Learning, 10(4), 313-340. doi:10.1080/10986060802229675 Stein, M. K., Grover, B. W., & Henningsen, M. (1996). Building student capacity for mathematical thinking and reasoning: An analysis of mathematical tasks used in reform classrooms. American Educational Research Journal, 33(2), 455–488. http://doi.org/10.3102/00028312033002455 Stein, M. K., & Kaufman, J. H. (2010). Selecting and supporting the use of mathematics curricula 180 at scale. American Educational Research Journal, 47(3), 663-693. Stein, M. K., & Lane, S. (1996). Instructional tasks and the development of student capacity to think and reason: An analysis of the relationship between teaching and learning in a reform mathematics project. Educational Research and Evaluation, 2(1), 50-80. Stein, M. K., Smith, M. S., Henningsen, M. A., & Edward, A. Silver. 2000. Implementing Standards-Based Mathematics Instruction: A Casebook for Professional Development. Stone, C. A. (1998). The metaphor of scaffolding: Its utility for the field of learning disabilities. Journal of Learning Disabilities, 31, 344–364. Stylianides, G. J. (2007c). Investigating the guidance offered to teachers in curriculum materials: The case of proof in mathematics. International Journal of Science and Mathematics Education, 6(1), 191-215. Stylianides, G. J. (2009). Reasoning-and-proving in school mathematics textbooks. Mathematical thinking and learning, 11(4), 258-288. Stylianides, A. J., & Stylianides, G. J. (2014). Impacting positively on students’ mathematical problem solving beliefs: An instructional intervention of short duration. The Journal of Mathematical Behavior, 33, 8-29. Sullivan, P., Askew, M., Cheeseman, D. C., Mornane, A., Roche, A., & Walker, N. (2015). Supporting teachers in structuring mathematics lessons involving challenging tasks. Journal of Mathematics Teacher Education, 18, 123-140. Sullivan, P., & Mornane, A. (2014). Exploring teachers’ use of, and students’ reactions to, challenging mathematics tasks. Mathematics Education Research Journal, 26(2), 193– 213. http://doi.org/10.1007/s13394-013-0089-0 Sullivan, P., Mousley, J., & Zevenbergen, R. (2006). Teacher actions to maximize mathematics learning opportunities in heterogeneous classrooms. International Journal of Science and Mathematics Education, 4, 117-143. Ursula de Araujo, Z. (2012). 
Transferring demand: Secondary teachers' selection and enactment of mathematics tasks for English language learners (Unpublished doctoral dissertation). The University of Georgia, Athens, Georgia. Vale, C., Widjaja, W., Doig, B., & Groves, S. (2019). Anticipating students’ reasoning and planning prompts in structured problem-solving lessons. Mathematics Education Research Journal, 31(1), 1-25. Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge: Harvard University Press. Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem solving. Journal of Child Psychology and Psychiatry, 17, 89-100. Zhu, Y., & Fan, L. (2006). Focus on the representation of problem types in intended curriculum: A comparison of selected mathematics textbooks from Mainland China and the United States. International Journal of Science and Mathematics Education, 4(4), 609-626.