SUPPORTING STUDENTS' SCIENTIFIC EXPLANATIONS: A CASE STUDY INVESTIGATING THE SYNERGY FOCUSING ON A TEACHER'S PRACTICES WHEN PROVIDING INSTRUCTION AND USING MOBILE DEVICES

By

Ibrahim Delen

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Curriculum, Instruction, and Teacher Education—Doctor of Philosophy

2014

ABSTRACT

SUPPORTING STUDENTS' SCIENTIFIC EXPLANATIONS: A CASE STUDY INVESTIGATING THE SYNERGY FOCUSING ON A TEACHER'S PRACTICES WHEN PROVIDING INSTRUCTION AND USING MOBILE DEVICES

By

Ibrahim Delen

Engaging students in scientific practices is a critical component of science instruction. Therefore, a number of researchers have developed software programs to help students and teachers with this difficult task. The Zydeco group designed a mobile application, Zydeco, that enables students to collect data inside and outside the classroom and then use the data to create scientific explanations with the claim-evidence-reasoning framework. Previous technologies designed to support scientific explanations focused on how these programs improve students' scientific explanations, but they ignored how scientific explanation technologies can support teacher practices. Thus, to increase our knowledge of how different scaffolds can work together, this study aimed to portray the synergy between a teacher's instructional practices (part 1) and the use of supports within a mobile device (part 2) to support students in constructing explanations. Synergy can be thought of as generic and content-specific scaffolds working together to enable students to accomplish challenging tasks, such as creating explanations, that they would not normally be able to do without the scaffolds working together. Providing instruction (part 1) focused on understanding how the teacher scaffolds students' initial understanding of the claim-evidence-reasoning (CER) framework. The second component of examining synergy (part 2: using mobile devices) investigated how this teacher used mobile devices to provide feedback when students created explanations. The synergy between providing instruction and using mobile devices was investigated by analyzing a middle school teacher's practices in two different units (plants and water quality). Next, this study focused on describing how the level of synergy influenced the quality of students' scientific explanations. Finally, I investigated the role of focused teaching intervention sessions in informing the teacher about students' performance. In conclusion, the findings of this study showed that the decrease in the teacher's support for claims did not affect the quality of the students' claims. On the other hand, the quality of students' reasoning was linked with the teacher's practices. This suggests that when supporting students' explanations, focusing on the components that students find challenging would benefit students' construction of explanations. To achieve synergy in this process, the collaboration between the teacher's practices, the focused teaching intervention sessions, and the scaffolds designed to support teachers played a crucial role in aiding students in creating explanations.

Copyright by
IBRAHIM DELEN
2014

ACKNOWLEDGEMENTS

When graduating from college, I submitted a scholarship application that started a long journey. I am grateful to the Turkish Ministry of Education for supporting the majority of this process, starting with the language school.
I would have never imagined attending graduate school in another country without this scholarship. My adviser, Prof. Joe Krajcik, was always by my side during the entire process, helping me focus on what I am interested in. I thank you once again for being not just the chair of my dissertation committee, but a great friend who patiently supported me in becoming a scholar. When I transferred from the other school down the road, Dr. Peter Youngs made all the little details work. Without his assistance, I could never have transferred from the blue shirts to the green ones. I was also really lucky to have a great dissertation committee that supported me right after my transfer. I am grateful to Drs. Amelia Gotwals, Michelle Williams, and Gail Richmond for their continuous support in the last two years. I also thank Drs. Joe Krajcik, Bob Geier, and Angie Calabrese Barton for being great mentors and making me part of the Journal of Research in Science Teaching editorial team. Finally, I am grateful to Sue Carpenter for making sure everything was appropriately documented.

In this process, I had two unofficial advisers who completed their studies before me and provided valuable tips. Dr. Nirit Glazer helped me shape my ideas in numerous meetings in my early years. Then Dr. Dante Cisterna made sure that I did not miss any details when completing my degree at MSU.

Completing this journey would not have been possible without great friends. Consuelo Morales has been a great colleague and friend for all these years. I am so sorry that we could not walk together in the graduation ceremony at the Big House, but I am thankful that you kept supporting me until the last day. I started the Ph.D. program with Drs. Ingrid Sanchez and Rohit Setty, but I was a little behind at the end. Following in your footsteps, I thank you again, Ingrid, for always devoting time to me whenever I needed support. And Rohit, thanks for making room for us in your house when we watched Sunday night football.

In addition, I would like to thank the Zydeco team members for the great research experience and the National Science Foundation for supporting us with NSF Grant DRL 1020027. When I expressed my willingness to focus on technology, Dr. Chris Quintana created room for me in the project, and I am grateful for his support during the process. Dr. Alex Kuhn has been a wonderful friend who taught me how mobile devices can be valuable contributors to education. I also thank Alissa Ampezzan, Clara Cahill, Wan-Tzu Lo, Brenna McNally, and Dr. Shannon Schmoll for teaching me different aspects of the design process.

The teacher in this case study deserves huge credit. She has been a great friend for the last three years. I thank her once again for making sure that all the little details we would like to see were implemented.

I also thank several Turkish colleagues for the time we spent together: Sedat Gumus, Mehmet Sukru Bellibas, Ridvan Demirkol, Burak Tufekcioglu, Volkan Mujdat Tiryaki, Ertugrul Dalkic, Selcuk Bucak, Serafetting Gedik, Ridvan Eksi, Mehmet Can, Seckin Senlik, Alpaslan Bayrak, Abdulkadir Elmas, Hasan Ilhan, Sedat Sen, and Feyzullah Gulpinar.

Finally, special credit goes to all my family members. I am really sorry that I spent the last six years away from you. When I finished the master's degree in a year, you all thought I was coming back soon, and I apologize for being 5,000 miles away during all the important celebrations. I could never have finished this journey if you had stopped being by my side.
TABLE OF CONTENTS

LIST OF TABLES ..... xi
LIST OF FIGURES ..... xiii

CHAPTER 1
INTRODUCTION ..... 1
  Overview of the Dissertation ..... 1
  Overview of Remaining Chapters ..... 7

CHAPTER 2
LITERATURE REVIEW ..... 8
  Role of Providing Instruction and Understanding Teachers' Challenges ..... 8
  Providing Professional Development to Support Instruction ..... 10
  Technology Scaffolds ..... 12
  Addressing the Gaps Summarized in My Study ..... 14
  My Study ..... 16

CHAPTER 3
THEORETICAL FRAMEWORK ..... 19
  Technology Scaffolds ..... 20
  Scaffolds Designed to Prompt Students to Start Creating Explanations ..... 21
  Defining Synergy ..... 23

CHAPTER 4
CLASSROOM INTERVENTION ..... 28
  Features of the Mobile Application ..... 28
    Constructing Questions ..... 29
    Data Collection ..... 29
    Creating Explanations ..... 31
  Summary of the Units and Focused Teaching Intervention Sessions ..... 32
    Unit 1- Plants (Science Fair Projects) ..... 32
      Day 1 to Day 6 ..... 34
      Day 7 to Day 9 ..... 34
      Day 10 to Day 20 ..... 34
      Day 21 & Day 22 ..... 35
    Focused Teaching Intervention ..... 35
      First Focused Teaching Intervention Session- Day 1 ..... 36
      Second Focused Teaching Intervention Session- Day 2 ..... 37
    Unit 2- Water Quality ..... 38
      Day 1 ..... 38
      Day 2 & Day 3 ..... 39
      Day 4 ..... 40
      Day 5 ..... 40
      Day 6 ..... 40
      Day 7 & Day 8 ..... 41

CHAPTER 5
METHODS ..... 43
  Participants ..... 43
  Data Analysis ..... 44
  Analyzing Data Sources Separately ..... 47
    Students' Scientific Explanations ..... 47
      Coding Students' Explanations ..... 49
        Step 1- Analyzing Claims (Categorizing Data) ..... 50
        Step 2- Analyzing Evidence (Categorizing Data) ..... 50
        Step 3- Analyzing Reasoning Statements (Categorizing Data) ..... 50
        Step 4- Creating Patterns (Axial Coding) ..... 52
    Teacher's Practices and Defining Levels of Synergy ..... 52
      Step 1- Coding Teacher's Practices: Analyzing CER Separately (Categorizing Data) ..... 54
      Step 2- Classifying Data by Level of Support (Categorizing Data) ..... 58
      Step 3- Examining the Level of Support for Each Activity and Each Day (Categorizing Data) ..... 60
      Step 4- Examining the Level of Support for Each Part (Categorizing Data) ..... 61
      Step 5- Defining Level of Synergy in Each Unit (Categorizing Data) ..... 62
      Creating Patterns- Comparing Synergy Across Units (Axial Coding) ..... 63
    Focused Teaching Intervention Sessions ..... 64
      Step 1- Coding Focused Teaching Intervention Data (Categorizing Data) ..... 65
      Step 2- Creating Patterns (Axial Coding) ..... 65
    Teacher Interviews ..... 66
      Step 1- Coding Interview Data (Categorizing Data) ..... 66
      Step 2- Creating Patterns (Axial Coding) ..... 67
  Creating the Case Study (Selective Coding) ..... 69
    First Mini-Case Study: Synergy in Unit 1 ..... 70
    Second Mini-Case Study: Focused Teaching Intervention & Synergy ..... 72
    Synergy Comparison ..... 72

CHAPTER 6
RESULTS ..... 74
  Mini-Case Study 1: Synergy in Plants Unit ..... 74
    Pre-Interview ..... 75
    Examining Synergy in Unit 1 ..... 78
      Part 1- Providing Instruction ..... 78
        Activity 1- Making Connections with Previous Activities ..... 78
        Activity 2- Modeling CER ..... 79
        Activity 3- Practicing Zydeco & Critiquing Explanations ..... 83
        Activity 4- Making Connections with Everyday Explanations ..... 85
        Unit 1- Part 1 Summary ..... 87
      Part 2- Using Mobile Devices ..... 87
        Day 1- Using Mobile Devices ..... 88
        Day 2- Using Mobile Devices ..... 90
        Unit 1- Part 2 Summary ..... 91
      Defining Level of Synergy in Unit 1 ..... 92
    Quality of Students' Explanations ..... 93
    Unit 1 Interview ..... 96
    Unit 1 Summary ..... 97
  Mini-Case Study 2: Focused Teaching Intervention & Synergy in Water Quality Unit ..... 99
    Focused Teaching Intervention Sessions ..... 100
      Focused Teaching Intervention- Day 1 ..... 101
      Focused Teaching Intervention- Day 2 ..... 103
      Focused Teaching Summary ..... 105
    Examining Synergy in Unit 2 ..... 106
      Part 1- Providing Instruction ..... 106
        Activity 1- Modeling CER ..... 106
        Activity 2- Critiquing Explanations ..... 107
        Unit 2- Part 1 Summary ..... 109
      Part 2- Using Mobile Devices ..... 110
        Day 1- Using Mobile Devices ..... 110
        Day 2- Using Mobile Devices ..... 111
        Day 3- Using Mobile Devices ..... 112
        Unit 2- Part 2 Summary ..... 114
      Defining Level of Synergy in Unit 2 ..... 114
    Quality of Students' Explanations ..... 115
    Unit 2 Interview ..... 117
    Unit 2 Summary ..... 119
  Cross-Case Synthesis ..... 119

CHAPTER 7
DISCUSSION AND IMPLICATIONS ..... 124
  Discussion ..... 124
    How does the teacher use instructional strategies to support students' understanding of the claim-evidence-reasoning framework? ..... 125
    How do the teacher's scaffolds and the scaffolds in the Zydeco work together to support student learning? ..... 127
    How does the level of synergy between providing instructional supports and using the supports in mobile devices aid in improving the quality of students' explanations? ..... 129
    What is the role of focused teaching intervention in informing the teacher about her practices? ..... 131
    What does this teacher think about providing instruction, using mobile devices, and students' ability to create explanations before and after the intervention? ..... 134
  Implications ..... 135
  Limitations ..... 137
  Conclusions ..... 139

APPENDICES ..... 141
  Appendix A: Unit 2- Day 1 Worksheet ..... 142
  Appendix B: Unit 2- Indicators Worksheet ..... 145
  Appendix C: Unit 2- CER Worksheet ..... 147
  Appendix D: Unit 2- Testing Indicators ..... 149
  Appendix E: Teacher Interviews ..... 152
  Appendix F: Explanations Coded in Second Focused Teaching Intervention Session ..... 155
  Appendix G: Explanations Coded in First Focused Teaching Intervention Session ..... 161
  Appendix H: Creating Hypothesis Worksheet- Unit 1 ..... 164
  Appendix I: Science Fair Questions ..... 166
  Appendix J: Procedure Worksheet ..... 168
  Appendix K: Defining Variables ..... 170
  Appendix L: Plants Modeling Activity ..... 172
  Appendix M: Plants CER Activity ..... 174
  Appendix N: Summary of Coding for Plants Unit (Unit 1) ..... 176
  Appendix O: Quality of Students' Explanations in Unit 1 ..... 183
  Appendix P: Summary of Coding for Water Quality Unit (Unit 2) ..... 186
  Appendix Q: Quality of Students' Explanations in Unit 2 ..... 192

REFERENCES ..... 196

LIST OF TABLES

Table 1 Zydeco Scaffolds and Scaffolding Guidelines ..... 21
Table 2 Activity Sequence in Plants Unit ..... 33
Table 3 Research Questions and Data Collection Methods ..... 45
Table 4 iPad Claim-Evidence-Reasoning Rubric ..... 48
Table 5 Coding the Quality of Claims ..... 50
Table 6 Coding the Quality of Students' Reasoning ..... 51
Table 7 Rubric for Evaluating Teacher's Practices ..... 55
Table 8 Sample Coding- Teacher Practices ..... 57
Table 9 Thresholds for Classifying Level of Support ..... 59
Table 10 Defining the Level of Support for Each Part ..... 61
Table 11 Defining Levels of Synergy ..... 63
Table 12 Codes Developed for Analyzing Focused Teaching Intervention Sessions ..... 64
Table 13 Sample Coding- Focused Teaching Intervention Sessions ..... 65
Table 14 Codes Developed to Analyze Interviews ..... 67
Table 15 Sample Coding- Interview Sessions ..... 68
Table 16 Questions Investigated in Each Mini Case Study ..... 71
Table 17 Examining the Level of Support in Unit 1- Part 1 ..... 87
Table 18 Examining the Level of Support in Unit 1- Part 2 ..... 92
Table 19 Examining Level of Synergy in Unit 1 ..... 93
Table 20 Quality of Students' Explanations in Unit 1 ..... 95
Table 21 Examining the Level of Support in Unit 2- Part 1 ..... 110
Table 22 Examining the Level of Support in Unit 2- Part 2 ..... 114
Table 23 Examining Level of Synergy in Unit 2 ..... 115
Table 24 Quality of Students' Explanations in Unit 2 ..... 116
Table 25 Comparing the Level of Synergy in Unit 1 and Unit 2 ..... 121
Table 26 Quality of Students' Explanations in Unit 1 and Unit 2 ..... 122
Table 27 Indicators Worksheet ..... 146
Table 28 Testing Indicators ..... 150
Table 29 Interview Questions Designed for Pre-Interview ..... 153
Table 30 Interview Questions Designed for Post-Interviews after Unit 1 and Unit 2 ..... 154
Table 31 Summary of Teacher Practices in Part 1-Activity 1 (Unit 1) ..... 177
Table 32 Summary of Teacher Practices in Part 1-Activity 2 (Unit 1) ..... 178
Table 33 Summary of Teacher Practices in Part 1-Activity 3 (Unit 1) ..... 179
Table 34 Summary of Teacher Practices in Part 1-Activity 4 (Unit 1) ..... 180
Table 35 Summary of Teacher Practices in Part 2-Day 1 (Unit 1) ..... 181
Table 36 Summary of Teacher Practices in Part 2-Day 2 (Unit 1) ..... 182
Table 37 Quality of Students' Explanations in Unit 1 ..... 184
Table 38 Summary of Teacher Practices in Part 1-Activity 1 (Unit 2) ..... 187
Table 39 Summary of Teacher Practices in Part 1-Activity 2 (Unit 2) ..... 188
Table 40 Summary of Teacher Practices in Part 2-Day 1 (Unit 2) ..... 189
Table 41 Summary of Teacher Practices in Part 2-Day 2 (Unit 2) ..... 190
Table 42 Summary of Teacher Practices in Part 2-Day 3 (Unit 2) ..... 191
Table 43 Quality of Students' Explanations in Unit 2 ..... 193

LIST OF FIGURES

Figure 1. Structure of My Study ..... 17
Figure 2. Add New Claim Scaffold ..... 22
Figure 3. Adding Evidence to Claim ..... 22
Figure 4. Removing Evidence from Claim ..... 22
Figure 5. Reasoning Scaffold ..... 23
Figure 6. Synergy in My Study ..... 27
Figure 7. Planning Page: Reviewing Questions and Creating Questions ..... 29
Figure 8. Data Collection Page ..... 30
Figure 9. Labeling the Data ..... 30
Figure 10. Creating Claims by Reviewing Questions ..... 31
Figure 11. Data Collection Under the Review Section ..... 31
Figure 12. Student Data from Day 1- Plants Unit ..... 35
Figure 13. Teacher's Explanation ..... 37
Figure 14. Reviewing Student Progress ..... 38
Figure 15. Sample Claim-Evidence-Reasoning ..... 41
Figure 16. Reviewing 2012 Data ..... 42
Figure 17. Defining the Synergy ..... 54
Figure 18. Oil Spill Problem ..... 144
Figure 19. Living Organisms ..... 151
Figure 20. Coding Explanations- Explanation 1 ..... 156
Figure 21. Coding Explanations- Explanation 2 ..... 156
Figure 22. Coding Explanations- Explanation 3 ..... 157
Figure 23. Coding Explanations- Explanation 4 ..... 157
Figure 24. Coding Explanations- Explanation 5 ..... 158
Figure 25. Coding Explanations- Explanation 6 ..... 158
Figure 26. Coding Explanations- Explanation 7 ..... 159
Figure 27. Coding Explanations- Explanation 8 ..... 159
Figure 28. Coding Explanations- Explanation 9 ..... 160
Figure 29. Explanations Coded from Unit 1 (Explanations 1-5) ..... 162
Figure 30. Explanations Coded from Unit 1 (Explanations 6-9) ..... 163
Figure 31. Plants Modeling Activity ..... 173

CHAPTER 1: INTRODUCTION

This chapter provides a short summary of the dissertation by briefly discussing the gaps in the literature and the design of the study. The chapter also presents the research questions and provides an overview of the remaining chapters.

Overview of the Dissertation

Inquiry has been a crucial element in science education for decades (National Science Teachers Association, 1987; National Research Council, 1996; National Research Council, 2000; Linn, Davis, & Bell, 2004; Bybee, 2010; National Research Council, 2012). For instance, the National Science Teachers Association (NSTA, 1987) reported that effective teachers value the role of inquiry, design inquiry-oriented learning environments, and use inquiry in instruction with different kinds of methods, including discussion, investigation, and debate. The National Research Council (NRC, 2000) noted that inquiry occurs at all grade levels through the use of different strategies, such as learner-centered, knowledge-centered, community-centered, and assessment-centered approaches. The latest NRC report (2012) underlined the importance of developing student understanding across time and further refined what is meant by scientific inquiry as engaging in scientific practices. The Framework for K-12 Science Education (NRC, 2012) specified scientific and engineering practices as one of the three dimensions of the framework. The scientific and engineering practices include asking questions and defining problems; developing and using models; planning and carrying out investigations; analyzing and interpreting data; using mathematics and computational thinking; developing explanations and designing solutions; engaging in argument from evidence; and obtaining, evaluating, and communicating information (NRC, 2012).

Three of these practices -- analyzing and interpreting data, developing explanations, and engaging in argument from evidence -- are directly related to the scientific explanations aspect of inquiry, defined by the NRC (2012) as the connection between scientific theory and observations. Scientific explanations are responses to questions about phenomena that provide a justification for why phenomena occur (NRC, 2012).
McNeill, Lizotte, Krajcik, and Marx (2006) created a new framework derived from Toulmin's (1958) argumentation model to support students in constructing scientific explanations, which provides a useful structure of claims, evidence, and reasoning (CER) for students. In another study, Gotwals, Songer, and Bullard (2012) described scientific explanations as "evidence-based explanations" (p. 186), while also using the claim-evidence-reasoning structure to support students. Common to these studies is the description of a claim as a statement that answers the question; evidence as the data that support the claim; and reasoning as the link between the data and the claim (McNeill et al., 2006; Gotwals et al., 2012).

The NRC report (2012) underlines the features of argumentation as "appraisal of data quality, modeling of theories, development of new testable questions from those models, and modification of theories and models as evidence indicates they are needed" (p. 27). Argumentation focuses on justifying claims with data and also emphasizes creating rebuttals (Erduran, Simon, & Osborne, 2004). Several studies defined scientific explanations using the claim-evidence-reasoning framework because they focused on learners explaining phenomena by emphasizing the importance of supporting claims with data (McNeill & Krajcik, 2008a; Gotwals et al., 2012; Novak & Treagust, 2013). Songer (2006) defined this process as complex reasoning that involves (a) making predictions, (b) analyzing data, and (c) justifying evidence.

Argumentation is broader than explanation and can include arguing for one's question, data analysis technique, model, or research design. Explanation is about explaining why a phenomenon occurs. The claim-evidence-reasoning framework is a scaffold to support students in constructing explanations (see the illustrative sketch below). Argumentation and explanation studies both focused on engaging students in responding to scientific questions to promote justifying evidence to construct explanations (NRC, 1996; NRC, 2000; Duschl, Schweingruber, & Shouse, 2007; McNeill & Krajcik, 2008a; Kuhn, 2010; NRC, 2012).

Despite its importance, Krajcik, Blumenfeld, Marx, Bass, and Fredericks (1998) found that creating evidence-based explanations in science is challenging for middle school students. Further, Sandoval and Millwood (2005) found that high school students continue to struggle to link sufficient evidence to support their claims. McNeill and Krajcik (2007) similarly noted that middle school students exhibited the same struggle. Other studies noted that in addition to challenges in using evidence to support their claims, students also struggle to provide reasoning statements to justify their claims (Songer, 2006; McNeill & Krajcik, 2008a; Gotwals & Songer, 2010).

Although the previous literature primarily focused on understanding students' challenges when creating scientific explanations, students are not alone in facing this challenge. Some studies noted that pre-service teachers (Crawford, Zembal-Saul, Munford, & Friedrichsen, 2005; Zembal-Saul, 2009) and in-service teachers (Osborne, Erduran, & Simon, 2004; McNeill & Krajcik, 2008a; McNeill & Knight, 2013) face challenges when supporting explanations. Teachers' challenges can be summarized as: 1) supporting students in the reasoning component of the explanation framework (McNeill & Krajcik, 2008a; McNeill & Knight, 2013), and 2) providing feedback to students as they are creating explanations (Simon, Erduran, & Osborne, 2006; McNeill & Knight, 2013).
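To make the CER structure referenced above concrete, the following minimal sketch models an explanation as a simple data record. It is purely illustrative; it is not Zydeco's implementation or any cited study's instrument, and all names (ScientificExplanation, is_complete) and the sample content are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ScientificExplanation:
    """Illustrative model of the claim-evidence-reasoning (CER) framework."""
    question: str      # the scientific question being answered
    claim: str         # a statement that answers the question
    evidence: List[str] = field(default_factory=list)  # data that support the claim
    reasoning: str = ""  # scientific principle linking the evidence to the claim

    def is_complete(self) -> bool:
        # A complete CER explanation needs a claim, at least one piece of
        # evidence, and a reasoning statement tying them together.
        return bool(self.claim and self.evidence and self.reasoning)

# Hypothetical example in the spirit of a water quality investigation
explanation = ScientificExplanation(
    question="Is the stream behind our school healthy?",
    claim="The stream is healthy.",
    evidence=["Dissolved oxygen measured at 8 mg/L",
              "Many pollution-intolerant mayfly larvae observed"],
    reasoning=("Mayfly larvae cannot survive in polluted water, so their presence, "
               "together with high dissolved oxygen, indicates good water quality."),
)
print(explanation.is_complete())  # True
```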
To support students and teachers in this process, previous studies focused on improving teachers' use of various instructional strategies (Simon et al., 2006; McNeill & Krajcik, 2008a; McNeill & Knight, 2013) and on designing technology scaffolds (Sandoval & Reiser, 2004; Songer, 2006; Maldonado & Pea, 2010; Kuhn et al., 2012; Laru, Järvelä, & Clariana, 2012) to support students in constructing scientific explanations. However, very few studies investigated the connection between teachers' practices and scaffolds designed to support students' explanations. One such study by Tabak (2004) focused on defining how teachers' practices and technology scaffolds support student learning. Several other studies (McNeill et al., 2006; McNeill & Krajcik, 2009) also highlighted that teachers' use of written scaffolds plays a crucial role in supporting students' understanding of the explanation framework.

My study aims to portray the synergy between a teacher's practices when providing instruction to support students' understanding of the claim-evidence-reasoning (CER) framework (McNeill & Krajcik, 2008b) and when using mobile devices while students are engaged in creating their own explanations. In this study, synergy is defined as various scaffolds working together to enable students to accomplish a challenging task, such as creating explanations, that they would not be able to accomplish without the scaffolds. When investigating technology scaffolds, I tracked a middle school science teacher in two different units (plants and water quality) and focused on whether a mobile application (Zydeco) supports or hinders her practices when she provides feedback to students who are analyzing data and developing explanations. Previously, Tabak (2004) noted that "synergy between the software scaffolds and the teacher's scaffolding" (p. 324) would support students' learning experience. My study adds to this definition by investigating the synergistic effect of providing instruction and using mobile devices on the quality of students' explanations in the two units.

The primary focus was on examining synergy, but I also investigated the role of focused teaching intervention by organizing two sessions after the first unit to support the teacher by discussing the students' explanations. The term focused intervention is commonly used in medical studies (Mahoney, O'Sullivan, & Dennebaum, 1990; Wechsberg, Lam, Zule, & Bobashev, 2004; Vidovich, Lautenschlager, Flicker, Clare, & Almeida, 2013) when measuring the role of different health-related factors. In this dissertation, the focused teaching intervention aimed to support the quality of students' reasoning and the teacher's practices before she started the second unit. Borko (2004) noted the importance of discussing practices from classrooms to improve the quality of teaching. Consistent with this idea, several scholars (Simon et al., 2006; McNeill & Knight, 2013) supported this strategy of having teachers analyze student explanations. This study therefore investigated how the focused teaching intervention sessions informed the teacher (Borko, 2004). To achieve this goal, in the first session the teacher coded explanations from the first unit to investigate students' performance. In the second session, the teacher and I created an activity for the second unit to help her become familiar with the coding rubric.
When tracking a middle school science teacher to investigate the synergistic effect on the quality of students' explanations in two different units (plants and water quality), I conducted a mixed methods case study (Yin, 2014) and collected the following data sources: (a) teacher interviews (a pre-interview and two post-interviews); (b) students' written scientific explanations created with the iPad version of Zydeco after each unit; (c) video and audio data that tracked the teacher during the two units; and (d) video and audio data of the focused teaching intervention sessions. These different data sources helped me to investigate: (a) how a teacher implements technology scaffolds designed to support students in constructing scientific explanations; (b) the level of synergy between providing instruction and using mobile devices; (c) patterns in how changes in the level of synergy relate to changes in students' explanations (students' claim, evidence, and reasoning statements); and (d) the role and importance of focused teaching intervention in informing the teacher about students' performance.

In summary, my study aims to add to the synergy definition provided by earlier studies (Tabak, 2004; McNeill et al., 2006; McNeill & Krajcik, 2009) by portraying a teacher's practices when providing instruction (part 1) and when using mobile devices (part 2). In this process, I examined how various scaffolds worked together with the teacher's practices in both parts. My overarching research question is: "How does the level of synergy between providing instructional supports and using the supports in mobile devices aid in improving the quality of students' explanations?" My more specific research questions are:

• Research Question (RQ)-1: What does the teacher think about providing instruction, using mobile devices, and the students' ability to create explanations before and after the intervention?
• RQ-2: How do students' claim, evidence, and reasoning scores change during the two units (plants and water quality)?
• RQ-3: How does the teacher use instructional strategies to support students' understanding of the claim-evidence-reasoning framework?
• RQ-4: How do the teacher's scaffolds and the scaffolds in Zydeco work together to support students' explanations?
• RQ-5: What is the role of focused teaching intervention in informing the teacher about her practices?

Overview of Remaining Chapters

The following chapters elaborate on the points raised in this chapter when investigating these research questions. Chapter 2 (Literature Review) examines studies that focused on understanding teachers' challenges and how these studies helped teachers and students overcome them; the chapter then discusses how my study addresses the gaps it underlines. Chapter 3 (Theoretical Framework) defines the synergy in my study by discussing the role of scaffolding, with an emphasis on (a) how the teacher scaffolds students' understanding of the explanation framework and (b) the scaffolds in the mobile application. The chapter also examines how the scaffolds designed in the mobile application prompted students to create claims and add evidence and reasoning statements. Chapter 4 (Classroom Intervention) presents the complete features of the mobile application and provides a summary of both units and the focused teaching intervention sessions.
Chapter 5 (Methods) discusses the participants, how the levels of synergy were defined, and the data analysis process. Chapter 6 (Results) organizes the findings in three sections: (a) findings from unit 1, (b) findings from unit 2, and (c) a cross-case analysis between unit 1 and unit 2. Chapter 7 (Discussion and Implications) examines the contributions of this study by making connections with the existing body of literature. The chapter also presents implications for practice and future studies.

CHAPTER 2: LITERATURE REVIEW

Chapter 1 provided a short summary to present the research questions of this dissertation. Chapter 2 discusses the gaps in the literature by reviewing studies that examined the role of providing instruction, professional development (PD) sessions, and technology scaffolds in supporting students and teachers in creating scientific explanations. Finally, I introduce how my study is connected to the gaps found in this section.

Role of Providing Instruction and Understanding Teachers' Challenges

McNeill and Krajcik (2008a) studied thirteen middle school teachers as they engaged students in constructing explanations while implementing the same unit, in order to investigate how using various instructional strategies supports students in constructing scientific explanations. The instructional strategies included modeling explanations, providing a rationale for explanations, defining explanations, and making connections with everyday explanations. The researchers found that providing a rationale and explaining the components of the explanation framework (claim-evidence-reasoning) support students in constructing written explanations. In addition, McNeill and Krajcik (2008a) noted that when teachers left out the rationale, defining the components of the framework had the opposite effect on students' explanations. In this study, few teachers implemented making connections with everyday explanations, and the researchers noted that this strategy had little impact on supporting students' explanations. Finally, the researchers noted that the majority of the participants successfully implemented the practice of scientific explanation, despite also finding that the implementation of these strategies varied across teachers. Although providing instruction with various instructional strategies supported students in constructing explanations, teachers struggled with implementing the reasoning part of the framework more than the claim or evidence parts (McNeill & Krajcik, 2008a). In another study, McNeill (2011) tracked 5th grade students in a yearlong study that also underlined the need to investigate teachers' practices by discussing the importance of using different instructional strategies (e.g., peer critique, discussing the explanation framework, modeling explanations) to support students in constructing explanations.

In a similar study, McNeill (2009) focused on six middle school science teachers' implementation of a unit designed around the driving question "How can I make new stuff from old stuff?" The researcher examined how teachers used the following strategies: (a) defining explanations; (b) modeling explanations; (c) providing the rationale for explanation; (d) making connections with everyday explanations; (e) giving feedback; (f) focusing on students' existing ideas; and (g) emphasizing science content. McNeill found that all six teachers defined and modeled explanations, but the implementation of these strategies varied across teachers.
When creating explanations, students had the most difficulty constructing reasoning statements. The researcher also added that writing a quality explanation depends on understanding both the content and the explanation framework; missing either component would lead to a poor explanation (McNeill, 2009).

When investigating teachers' practices, the two studies summarized above (McNeill & Krajcik, 2008a; McNeill, 2009) used the CER framework. Other studies cited in this section used this framework but with an emphasis on argumentation. Osborne and colleagues (2004) studied twelve teachers who implemented an activity discussing the affordances and limitations of zoos in two consecutive years, and they focused on developing tools to analyze classroom practices with an emphasis on argumentation. The researchers found that there was an improvement in teachers' practices in the second year, and that the changes varied across teachers. Although teachers struggled to support higher-level arguments (e.g., providing rebuttals), Osborne and colleagues (2004) noted that teachers can adopt argumentation into their classroom practices.

The studies summarized in this sub-section noted that various instructional supports have the potential to enhance students' construction of scientific explanations or arguments. These studies also added that it is challenging for teachers to support students in constructing explanations (Osborne et al., 2004; McNeill & Krajcik, 2008a; McNeill, 2009). To help teachers overcome this challenge, several studies focused on the role of providing professional development to support teachers in using instructional strategies (Simon et al., 2006; McNeill & Knight, 2013).

Providing Professional Development to Support Instruction

Simon, Erduran, and Osborne (2006) studied twelve teachers in two consecutive years to measure how the teachers' participation in professional development (PD) focused on argumentation affected their teaching of an argumentation unit. The researchers studied how to support the teachers' instructional practices with regard to argumentation by understanding how teachers implemented various instructional strategies. These instructional strategies were grouped into several categories: (a) teachers' support for listening and talking during discussions, (b) teachers' definition of argument, (c) teachers' role when students are creating ideas, (d) support to include evidence, (e) teachers' aid when students are constructing arguments, (f) support to critique arguments, (g) support to create counter-arguments, and (h) discussing the process of argumentation (p. 248). Similar to McNeill and Krajcik (2008a), Simon and colleagues (2006) noted that the implementation of instructional strategies varied across teachers. Although some teachers showed no improvement in using instructional strategies after participating in PD, in general the PD intervention showed a positive effect in improving teachers' instructional practices (Simon et al., 2006).

McNeill and Knight (2013) noted a lack of emphasis on professional development in relation to scientific practices and provided PD for seventy elementary, middle, and high school teachers in their study. Previously, several studies (Osborne et al., 2004; Simon et al., 2006; McNeill & Krajcik, 2008a; McNeill, 2009) had focused on teachers' implementation of the same unit or activity.
McNeill and Knight (2013) concentrated on supporting teachers' understanding of how to design lesson plans with an emphasis on argumentation across different content areas. The researchers organized three PD sessions to support teachers by engaging them in analyzing student work and pre-recorded classroom videos. Teachers participating in these sessions also designed various lessons that incorporated argumentation using the claim, evidence, and reasoning framework. McNeill and Knight found that participating in workshops supported teachers' understanding of the claim-evidence-reasoning model; however, they also reported that teachers struggled with analyzing students' discussions and implementing the reasoning component of the framework.

In summary, scientific explanation provides support for engaging students in the inquiry process, but implementation can pose a challenge to students (Krajcik et al., 1998; Sandoval & Millwood, 2005; McNeill & Krajcik, 2007; Duschl et al., 2007; Gotwals & Songer, 2010) and teachers (Osborne et al., 2004; Crawford et al., 2005; Simon et al., 2006; McNeill & Krajcik, 2008a; McNeill, 2009; Zembal-Saul, 2009; McNeill & Knight, 2013). Instead of focusing on why students struggle with scientific explanations, the NRC Framework (2012) reported on the paucity of opportunities provided to students to engage with scientific explanations. The next section discusses how previous studies designed technology scaffolds to support students when creating explanations.

Technology Scaffolds

Gotwals and Songer (2010) concluded that students find it challenging to create explanations supported by appropriate evidence and reasoning. The researchers also emphasized that "either pieces of the content or the structure of the scientific explanation are not fully accurate" (Gotwals & Songer, 2010, p. 276) in students' written explanations. Content understanding is indispensable when creating scientific explanations (Sandoval & Millwood, 2005; McNeill et al., 2006; Simon et al., 2006; Songer, 2006; McNeill & Krajcik, 2008a; Gotwals & Songer, 2010; NRC, 2012), because students need to use scientific principles to justify why data support a claim. But some studies (McNeill, 2009; Gotwals & Songer, 2010) also stressed that students' understanding of the explanation framework plays an important role in this process. Connected with these studies, a number of researchers have developed software programs that focus on investigating specific content to help teachers and students use the scientific explanation process. As suggested by Krajcik, Blumenfeld, Marx, and Soloway (2000), inquiry can be facilitated through the use of technology.

In an early study, Sandoval and Reiser (2004) used the ExplanationConstructor software to support students in formulating explanations. ExplanationConstructor provided a computer-based environment that focused on investigating biological topics, such as the natural selection of finches in the Galapagos Islands or the ecology of panthers in North Africa. The software scaffolds students' explanations by providing prompts about the explanation process for understanding evolution and natural selection; as an electronic journal, ExplanationConstructor also helps students examine the relationship between questions, explanations, and evidence. Sandoval and Reiser (2004) found that
Sandoval and Reiser (2004) found that   12     ExplanationConstructor helped students create scientific explanations about natural selection by supporting the construction of claims using the data provided by the software. Williams, Montgomery and Manokore (2012) studied how using the Web-based Inquiry Science- Environment (WISE) promotes science learning in seventh-grade students. Students participating in the study used the online curriculum to observe traits, cell growth and division, cell differentiation, and sexual and asexual reproduction. In this process, WISE provided several simulations and visualizations in relation to genetic inheritance. When measuring students’ understanding, researchers focused on understanding how students included scientific ideas in their explanations. Williams and colleagues (2012) found that using the online curriculum with embedded prompts supported students’ content understanding and helped students improve the quality of their explanations, but discussing the products of mitosis and meiosis was still challenging for students after the intervention. Along with using computer-based tools, the use of mobile devices has increased rapidly in the last decade and the demand for their use in educational settings is increasing (Norris, Hossain, & Soloway, 2011; McCaffrey, 2011). A number of researchers have used mobile devices to explore science learning. Songer (2006) used CyberTracker on PDAs during the BioKIDS project. In BioKIDS, students explored a question focusing on biodiversity, collected data by observing the physical characteristics of organisms, and finally used the data to explain characteristics of different organisms. During this study, students analyzed the combined data coming from other students; the experimental group increased its ability to build scientific explanations about biodiversity (Songer, 2006). Maldonado and Pea’s (2010) Let’s GO project enabled students to collect pH, temperature, and dissolved oxygen data with latitude (GPS co-ordinates) data with using mobile devices.   13     After engaging in data collection, the students were able to provide more scientific information in the post-questionnaire compared to the pre-questionnaire over the same information (Maldonado & Pea, 2010). Kuhn and colleagues (2012) focused on creating a data pool that combined all the data collected by students to enable them to select others’ data or their own data when creating explanations with using a mobile application. Students who participated in that study primarily focused on including data coming from their peers in their explanations (Kuhn et al., 2012). Laru and colleagues (2012) designed a mobile application for Nokia phones, known as Flyer, that prompted students to create a claim (e.g. a woodpecker has made the traces), provide a ground for this explanation (e.g., there are holes on the tree), add warrants (e.g., woodpeckers knock trees) and a piece of data to support their ideas (p. 113). Although this interaction engaged students in creating arguments, some students struggled to create messages that represented a higher level of content understanding (Laru et al., 2012). In summary, several studies designed scaffolds to aid students in collecting data and creating explanations; however, none of these focused on understanding how teachers would implement these tools. 
In the next section, I will summarize the gaps presented in this literature and delineate how this dissertation focuses on addressing these gaps when providing instruction and using technology scaffolds. Addressing the Gaps Summarized in My Study Providing feedback to students when they are developing explanations is noted as an important strategy (Simon et al., 2006; McNeill & Krajcik, 2008b; McNeill & Knight, 2013). To highlight the importance of feedback, Pellegrino, Chudowsky, and Glaser (2001) focused on the role of students monitoring their own learning experience. But, an important gap that exists in the   14     literature is how to provide students with feedback. Several studies (Simon et al., 2006; McNeill and Knight, 2013) found that teachers faced challenges when providing feedback to support students’ explanations. To address this gap, McNeill and Knight (2013) suggested finding ways to “support teachers in noticing aspects of student argumentation in classroom practice” (p. 965). In this process, putting an emphasis on reasoning is critical because several studies noted that implementing the reasoning portion of the explanation framework is challenging for teachers (McNeill & Krajcik, 2008a; McNeill & Knight, 2013). To address these challenges, my study focused on investigating how a middle school science teacher supported students’ understanding of the claim-evidence-reasoning framework as “part 1: providing instruction” (RQ-3: How does the teacher use instructional strategies to support students’ understanding of the claim-evidence-reasoning framework?). Then I focused on how this teacher used mobile devices to provide feedback to students when they are engaged in writing explanations. The second part will be referred to as “using mobile devices” (RQ-4: How do the teacher’s scaffolds and the scaffolds in Zydeco work together to support students’ explanations?). Besides putting an emphasis on the challenges teachers face when supporting explanations, my study also addresses another gap by focusing on the connections between a teacher’s practices when providing instruction and using mobile devices. As introduced in chapter 1, this connection was defined as synergy (Tabak, 2004). Previous studies focused on how to support instruction when teachers implement instruction that foregrounds scientific explanations (Simon et al., 2006; McNeill & Krajcik, 2008a; McNeill, 2009; McNeill & Knight, 2013) and how to use technology scaffolds to support students when creating explanations (Sandoval & Reiser, 2004; Songer, 2006; Maldonado & Pea, 2010; Kuhn et al., 2012; Laru et al., 2012; Williams et al.,   15     2012). But few studies discussed the synergy between teachers’ practices and using technology scaffolds when supporting explanations (Tabak, 2004). My study adds to this definition by focusing on how the level of synergy between providing instruction and using mobile devices supports quality of students’ explanations. This chapter only presented the gap for synergy and chapter 3 will closely examine the construct of synergy by discussing how my definition of synergy differs from previous studies. My Study As noted by Eisenhardt (1989) and Yin (2014), constructing a case study would be the appropriate method when there is not much known about a topic. 
By extending Tabak's (2004) definition of synergy, my case study empirically investigates the level of synergy between providing instruction and using mobile devices and how that synergy supports students in constructing explanations. As I will discuss in the methods chapter, I used the grounded theory method (Strauss & Corbin, 1998) to define the level of synergy among instructional supports. Since previous studies (Tabak, 2004; McNeill et al., 2006) did not provide "a preconceived theory" (Strauss & Corbin, 1998, p. 12) for examining synergy, the levels of synergy in this study "emerge from the data" (Strauss & Corbin, 1998, p. 12). Chapter 4 provides a step-by-step analysis describing how analyzing the teacher's practices led to creating "building blocks" (Strauss & Corbin, 1998, p. 13) for the levels of synergy. I tracked the teacher in two different units to examine the synergistic effect on the quality of students' explanations (RQ-2: How do students' claim, evidence, and reasoning scores change during the two units?). After the first unit, I organized two focused teaching intervention sessions to discuss students' performance (RQ-5: What is the role of focused teaching intervention in informing the teacher about her practices?). The rationale for having these sessions after the first unit was our existing collaboration with the teacher, who had used the application previously and was confident in using it to support students' scientific explanations. After unit 1, I found that many students struggled with creating reasoning statements when using the mobile application, and I organized two sessions to inform the teacher. Figure 1 presents the structure of my study with an emphasis on the data collection process. When organizing these sessions, I focused on creating activities to evaluate students' performance (Garet, Porter, Desimone, Birman, & Yoon, 2001; Desimone, Porter, Garet, Yoon, & Birman, 2002; Fishman, Marx, Best, & Tal, 2003; Borko, 2004; Richmond & Manokore, 2011). This prominent theme was also addressed in two studies discussed in the literature review (Simon et al., 2006; McNeill & Knight, 2013) as the value of supporting teachers in analyzing student work to improve teacher practices.

Figure 1. Structure of My Study (First unit: providing instruction and using mobile devices; PD: analyzing student work and planning an activity for the second unit; Second unit: providing instruction and using mobile devices)

Finally, I also examined patterns of change in the teacher's practices in supporting students in the claim-evidence-reasoning framework in relation to changes in students' explanations in both units (RQ-2: How do students' claim, evidence and reasoning scores change during the two units?). In this process, I also added the teacher's insights by interviewing the teacher before the first unit and after completing each unit (RQ-1: What does this teacher think about providing instruction, using mobile devices, and students' ability to create explanations before and after the intervention?).

CHAPTER 3: THEORETICAL FRAMEWORK

In the previous chapter, I divided the teacher's practices into two parts: (a) providing instruction, and (b) using mobile devices, both of which focus on how the teacher scaffolds students' understanding of scientific explanations. In this section, I will examine how the scaffolds in each part helped me to define synergy.
As discussed earlier, several studies focused on different types of supports (e.g., improving instructional strategies and providing technology scaffolds) to aid students in creating explanations. Improving instructional strategies focused on how teachers scaffold the explanation process for students (Simon et al., 2006; McNeill & Krajcik, 2008a; McNeill & Knight, 2013). Technology scaffolds focused on supporting students when collecting data (Songer, 2006; Maldonado & Pea, 2010; Kuhn et al., 2012; Laru et al., 2012) and when creating explanations (Sandoval & Reiser, 2004; Kuhn et al., 2012; Laru et al., 2012). The underlying idea behind scaffolding is to provide some kind of structure, support, and guidance for students to learn a task they would not otherwise be able to master on their own. This is consistent with Vygotsky's theory of the Zone of Proximal Development (ZPD). Vygotsky (1978) emphasized that assistance can help students reach a higher-level understanding; in this process, teachers play a key role. In this study, the teacher played a crucial role when providing assistance to students. However, many additional scaffolds were designed into the mobile application to help students and the teacher in this process. In the following sub-section, I will examine the technology scaffolds in the mobile application before discussing the role of synergy.

Technology Scaffolds

Reiser (2004) defined the goal of a scaffold: "… the intention is that the support not only assists learners in accomplishing tasks but also enables them to learn from the experience" (p. 275). Quintana and colleagues (2004) also took up the idea of scaffolding by proposing a scaffolding design framework around three essential processes: sense making, process management, and articulation and reflection. First, sense making is the process of constructing explanations, interpretations, and hypotheses through inquiry (Quintana et al., 2004). Second, Quintana and colleagues (2004) explain process management as helping students manage all of the steps of inquiry and understand which steps come next in an investigation. Finally, Quintana and colleagues, similar to Reiser (2004), believe that it is important for students to be able to reflect back on and reevaluate their learning; in addition, students need to be able to articulate what they have learned. Consistent with Quintana and colleagues' (2004) guidelines, the complexity of the task is structured by describing the process in four steps: plan, collect, analyze, and explain. Zydeco also supported the sense-making process through analyzing data and constructing explanations, and it supported the reflection process through annotating data and creating explanations. A complete list of how the scaffolds in Zydeco match the scaffolding guidelines discussed by Quintana and colleagues (2004) can be found in Table 1. This section will only present the scaffolds that are designed to prompt students to create claims, add evidence, and add reasoning. Additional scaffolds will be examined in the next chapter.

Table 1
Zydeco Scaffolds and Scaffolding Guidelines

Guideline (Quintana et al., 2004): Organize tools and artifacts around the semantics of the discipline.
How Zydeco supports it:
- Zydeco guides students to the claim-evidence-reasoning framework (see Figure 15).
- Zydeco prompts students to review the data (see Figure 11); students then create claims by reviewing their questions (see Figure 10), add the data to claims (see Figure 3), and finally add their reasoning (see Figure 5).
- Zydeco supports teachers in reviewing students' explanations by using the claim-evidence-reasoning framework (see Figure 15).

Guideline (Quintana et al., 2004): Provide structure for complex tasks and functionality.
How Zydeco supports it:
- Zydeco provides four modes: plan, collect, review, and analyze. Students can easily navigate between these modes (see Figures 7, 8, 11, and 15).

Guideline (Quintana et al., 2004): Automatically handle non-salient routine tasks.
How Zydeco supports it:
- The claim-evidence-reasoning framework helps students to organize the work products (see Figure 15).

Guideline (Quintana et al., 2004): Provide reminders and guidance to facilitate articulation during sense making.
How Zydeco supports it:
- Zydeco supports students in creating questions in the planning phase (see Figure 7).
- The numbers after the helper questions show the amount of data collected under each one, which provides guidance during data collection (see Figure 8).
- Zydeco guides students to annotate the data they collected (see Figure 9).
- Zydeco reminds students to add claims (see Figure 2), add evidence (see Figure 3), and add reasoning (see Figure 5).
- Zydeco provides reminders to create claims by reviewing the data collection questions (see Figure 10).

Scaffolds Designed to Prompt Students to Start Creating Explanations

When students are creating explanations, the mobile application provides several scaffolds designed to serve as reminders during sense making (Quintana et al., 2004) to create claims, add evidence, and add reasoning. As presented in Figure 2, the mobile application prompts students to create a claim as the first step of the explanation. When creating claims, another scaffold reminds students to review the driving question and the data collection questions (see Figure 10).

Figure 2. Add New Claim Scaffold

Under the evidence section, the mobile application provides several scaffolds. After a claim is added, evidence scaffolds remind students of the need to add evidence and reasoning (see Figure 5). When reviewing the data after creating claims, students can add any data collected by themselves or their peers to their explanations. As presented in Figure 3, if students create multiple claims, Zydeco helps them link their data to a claim by selecting that claim. Besides adding evidence to claims, students can also remove evidence from a claim (see Figure 4).

Figure 3. Adding Evidence to Claim

Figure 4. Removing Evidence from Claim

After creating claims and adding evidence to the claims, students discuss the link between claim and evidence under the reasoning section. As shown in Figure 5, the mobile application provides another scaffold to remind students of the need to create a reasoning statement.

Figure 5. Reasoning Scaffold

In summary, the scaffolds discussed in this section only provide reminders for creating claims, adding evidence, and adding reasoning. As noted by several studies (Reiser, 2004; Quintana et al., 2004; McNeill et al., 2006), scaffolds engage learners in doing complex tasks. Scaffolds in Zydeco also support students when creating their own questions and when collecting and reviewing data. The remaining scaffolds will be presented in chapter 4.
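To make the structure that these scaffolds prompt students to complete more concrete, the sketch below models a Zydeco-style explanation record in Python. This is a hypothetical illustration rather than the application's actual code; the names (DataItem, Explanation, missing_parts) are my own.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical model of the claim-evidence-reasoning structure that the
# scaffolds above prompt students to complete; not Zydeco's actual code.

@dataclass
class DataItem:
    """One piece of data (photo, audio, video, or text) in the shared pool."""
    collector: str     # student who collected it (own data or a peer's)
    kind: str          # "photo", "audio", "video", or "text"
    labels: List[str]  # annotations added during data collection
    question: str      # helper question the data was collected under

@dataclass
class Explanation:
    claim: str                                               # statement answering the question
    evidence: List[DataItem] = field(default_factory=list)   # selected from the data pool
    reasoning: str = ""                                      # links the evidence to the claim

    def missing_parts(self) -> List[str]:
        """Mirror the reminder scaffolds: report which components are absent."""
        missing = []
        if not self.claim:
            missing.append("claim")
        if not self.evidence:
            missing.append("evidence")
        if not self.reasoning:
            missing.append("reasoning")
        return missing

# Example: a claim with one piece of evidence but no reasoning yet,
# so a reminder for "reasoning" would be triggered.
photo = DataItem("student_a", "photo", ["day 5", "bean plant"],
                 "How do plant structures function to support growth?")
explanation = Explanation(claim="The amount of water affects plant growth",
                          evidence=[photo])
print(explanation.missing_parts())  # ['reasoning']
```

Representing evidence as references into the shared data pool, rather than as free text, mirrors the distinction drawn by the coding rubric in chapter 5 between data values typed into a statement and evidence actually linked from the pool.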
Defining Synergy

McNeill and colleagues (2006) found that providing scaffolds that support the claim-evidence-reasoning framework aided students when creating explanations. Connected with this idea, several studies focused on understanding how teachers scaffold explanations for students (Osborne et al., 2004; Simon et al., 2006; McNeill & Krajcik, 2008a). Other studies examined the role of technology in supporting this process (Sandoval & Reiser, 2004; Songer, 2006; Maldonado & Pea, 2010; Kuhn et al., 2012; Laru et al., 2012; Williams et al., 2012). But as noted by Tabak (2004), creating an explanation is a challenging process, and technology scaffolds alone are not enough to support it. Tabak (2004) also underlined the need for additional supports that work together to promote student learning. To support students in this process, Tabak (2004) illustrated the need for synergy between a teacher's practices and technology scaffolds. Synergy can be thought of as various scaffolds working together to enable students to accomplish challenging tasks. Tabak (2004) described this process as "student performance is facilitated through the combined contribution of the teacher and the software working in a system" (p. 319). In another study, Pasnik, Strother, Schindel, Penuel, and Llorente (2007) examined media synergy by focusing on the effect of different multimedia tools supporting students' literacy skills. McNeill and Krajcik (2009) defined synergy as "congruency between the teacher instructional support and the curricular scaffolds" (p. 449). In that study, the researchers tested the influence of context-specific and generic scaffolds in a unit designed to engage students with several explanations. Generic scaffolds are designed to support students in completing a task regardless of the specific content. Context-specific scaffolds, on the other hand, provide "content- and task-specific hints" (McNeill & Krajcik, 2009, p. 428). In the case of supporting explanations, generic scaffolds support the structure of the claim-evidence-reasoning framework, while content-specific scaffolds provide specific hints around each component of the framework. For instance, generic scaffolds prompted students to include evidence, while context-specific scaffolds asked students to discuss specific evidence instead of simply presenting the CER framework. McNeill and Krajcik (2009) found that context-specific scaffolds were more effective in supporting students' explanations when they worked synergistically with the teacher's practices. In summary, several studies highlighted the importance of synergy between teachers' practices and scaffolds designed to support students when creating explanations (Tabak, 2004; McNeill et al., 2006; McNeill & Krajcik, 2009). These studies shed light on scaffolding design and noted how synergy among scaffolds would support the quality of students' written explanations. My work differs because I put the spotlight on the level of synergy between providing instruction (part 1: providing instruction to scaffold students' understanding of the CER framework) and using mobile devices (part 2: technology scaffolds and the teacher's scaffolding to provide feedback when using mobile devices). Providing instruction (part 1) focuses on understanding how the teacher scaffolds students' initial understanding of the CER framework. Some studies discussed in this chapter examined teachers' practices when providing instruction and also investigated scaffolding design when students were creating explanations (McNeill et al., 2006; McNeill & Krajcik, 2009).
After scaffolding students' understanding of scientific explanations, the teacher engaged students in the following steps underlined by the Next Generation Science Standards for constructing scientific explanations: (a) generating data, (b) analyzing and interpreting the data, and (c) creating explanations from evidence (Achieve, 2013). McNeill and Krajcik (2008b) underscored the importance of providing feedback as students are creating their own explanations. Thus, the second component of examining synergy (part 2: using mobile devices) investigates how this teacher used mobile devices to provide feedback when students were creating explanations. As discussed earlier in the chapter, these scaffolds primarily serve as reminders, and I investigated how this teacher's scaffolds worked together with the scaffolds designed into the mobile application to support students in constructing written explanations.

The scaffolds designed in Zydeco provided generic support. Davis (2003) noted the value of generic scaffolds (e.g., "our ideas right now are…") to promote reflection when compared with direct prompts (e.g., "our evidence critiques will be useful later because…"). Although Davis (2003) found that generic scaffolds prompted students to include more scientific principles, McNeill and colleagues (2006) suggested providing content-specific scaffolds together with generic scaffolds when engaging students in creating explanations. In a more recent study, McNeill and Krajcik (2009) underlined that content-specific scaffolds promoted explanations better than generic scaffolds. Connected with the importance of merging generic and content-level scaffolds, Gotwals and Songer (2010) noted that the depth of an explanation depends on both the level of content and the quality of reasoning. The use of scientific ideas is central when creating explanations (Sandoval & Millwood, 2005; McNeill et al., 2006; Simon et al., 2006; Songer, 2006; McNeill & Krajcik, 2008a; Gotwals & Songer, 2010; NRC, 2012). Thus, providing content-specific scaffolds is crucial when supporting explanations (McNeill et al., 2006; McNeill & Krajcik, 2009). In this study, I investigated the teacher's role in providing generic and content-specific scaffolds; Zydeco did not provide content-specific scaffolds. I examined how this teacher supported the inclusion of specific scientific principles when supporting students' understanding of CER in part 1, and I also focused on how the teacher supported students' discussion of specific content when examining the teacher's support in part 2. Tabak (2004) focused on synergistic support when students are engaged in creating explanations, but did not examine teachers' practices when supporting students' understanding of the explanation framework. Figure 6 shows how the construct of synergy is used in this dissertation and how focused teaching intervention can be used to promote synergy among scaffolds.

Figure 6. Synergy in My Study (Synergy in unit 1 and in unit 2 each combine Part 1: Providing Instruction — the synergistic support examined by McNeill [McNeill et al., 2006; McNeill & Krajcik, 2009] — with Part 2: Using Mobile Devices with Teacher's Support — the synergistic support examined by Tabak [2004]; the focused teaching intervention occurs before unit 2)
In summary, when investigating synergy, my dissertation adds another dimension by portraying the interplay between how the teacher scaffolds students' understanding of the claim-evidence-reasoning framework and how she supports students when they are creating their own claim-evidence-reasoning statements using mobile devices. When investigating this synergy (the interplay between part 1 and part 2), my main goal is to explore how the level of synergy across introducing explanations to students and using the technology to build explanations changes the quality of students' explanations. The level of synergy will be defined in chapter 6 when discussing the analysis process. Besides portraying how the level of synergy supports students' explanations, my study also emphasizes the role of focused teaching intervention in informing the teacher about her practices.

CHAPTER 4: CLASSROOM INTERVENTION

In the previous chapter, I focused on examining the scaffolds designed in the mobile application to prompt students to include claims, evidence, and reasoning. In this chapter, I will first describe the list of scaffolds discussed in Table 1 by focusing on the features of the mobile application: (a) constructing questions, (b) students' data collection, and (c) students creating explanations. The teacher used these features of the mobile application during both units. After describing the mobile application as a whole, I will then provide summaries of each unit (plants and water quality) and of how the focused teaching intervention sessions were designed to help the teacher improve her practices to support students' understanding of the CER framework.

Features of the Mobile Application

My study focuses on two units in which the students engaged in various scientific practices. To support students' inquiry, the teacher in this study used a mobile application called Zydeco. In the first unit, the teacher supported students in using the mobile application when creating their science fair projects about plant growth. In the second unit, the teacher focused on investigating water quality. Zydeco was developed to support teachers and students in creating units around a driving question. After designing the driving question, the teacher added several data collection questions (helper questions) under each driving question to support students in the data collection process. Students create their own questions after reviewing the questions created by the teacher (first step of Zydeco), then collect their own data by using iPads to take pictures and record videos and audio notes (second step of Zydeco). Finally, teachers prompt students to use the data to answer the driving question by creating scientific explanations (third step of Zydeco).

Constructing Questions

The planning of the unit starts with the creation of the driving question (DQ). A good driving question is related to a feasible real-world problem, and it directs students to a diverse set of activities that help them develop scientific understanding (Krajcik, Blumenfeld, Marx & Soloway, 1994; Singer, Marx, Krajcik, & Chambers, 2000; Zhang & Quintana, 2012). The classroom teacher created two questions collaboratively with the Zydeco team:
• Driving question for the first unit: "How do plants stay alive?"
• Driving question for the second unit: "How do we determine the health of the water in our community?"

After creating the driving question, the teacher created several additional data collection questions (helper questions) to guide students' data collection; during the unit, students added their own questions to help them answer the driving question (see Figure 7).

Figure 7. Planning Page: Reviewing Questions and Creating Questions

Data Collection

After creating their own questions, students collected data (audio notes, videos, and photos) guided by the helper questions (see Figure 8). During the science fair project, students collected data daily in relation to their projects. Due to time limitations, when examining water quality, students tested several indicators on just one day. The observations serve as evidence for answering the helper questions, which in turn help students answer the DQ. During data collection, students selected a helper question and then recorded audio notes, took videos, and captured photos (see Figure 8). The numbers before the helper questions show the amount of evidence collected under that question (see Figure 8).

Figure 8. Data Collection Page

After collecting a piece of data, scaffolds in Zydeco support students in labeling the data. In this process, students provide a title for the data and add labels that help them reflect on their data collection (see Figure 9).

Figure 9. Labeling the Data

Creating Explanations

After students completed data collection, they used the data collected from the field trip or experiment to answer the driving question by writing a scientific explanation. In this process, the teacher supported students in reflecting on what they had collected by leading students to create claims (see Figure 10), support their claims by selecting evidence from the data pool (see Figure 11), and finally add reasoning to justify their claims (see Figure 15).

Figure 10. Creating Claims by Reviewing Questions

When adding evidence to their claims, students can filter the evidence by collector (own data or others' data), type (photo, audio, text, or video), labels, and questions (see Figure 11).

Figure 11. Data Collection Under the Review Section

Summary of the Units and Focused Teaching Intervention Sessions

The teacher participating in this study used Zydeco in two different units and participated in two focused teaching intervention sessions between the first and second units. This sub-section provides an overview of each unit and of both focused teaching intervention sessions.

Unit 1- Plants (Science Fair Projects)

Unit 1 was designed around the following driving question: "How do plants stay alive?" Ms. Robinson initially designed two helper questions to underline the key concepts in the unit: (a) How do plant structures function to support growth? (b) How do plants create energy? These questions are related to the goals described by the NRC (2012), which noted that by the end of 8th grade, students need to know that (p. 146):

Plants reproduce in a variety of ways, sometimes depending on animal behavior and specialized features (such as attractively colored flowers) for reproduction. Plant growth can continue throughout the plant's life through production of plant matter in photosynthesis.
Genetic factors as well as local conditions affect the size of the adult plant.

In this study, the teacher associated the plants unit with science fair projects. A short list of activities carried out in the plants unit is shown in Table 2. Since the data collection process took more than two weeks, unit 1 (22 days) is longer than unit 2 (8 days). When reporting teacher practices in both units, I will closely analyze the days on which the teacher engaged in supporting students to create explanations. During unit 1, the teacher spent five days scaffolding students in creating explanations by using several instructional strategies. During those days, Ms. Robinson created four different activities: (a) making connections with previous activities and defining CER, (b) modeling CER, (c) critiquing explanations, and (d) making connections with everyday explanations. When the students completed data collection, they engaged in creating their own scientific explanations by using the mobile application for two days. In summary, of the twenty-two days spent on this unit, I will analyze seven days to investigate the synergy by focusing on how this teacher supported students' understanding of CER and how she supported students when they were creating their own explanations. Below I will provide snippets to summarize the activity flow for the remaining fifteen days.

Table 2
Activity Sequence in Plants Unit (Days 1-22, in order)
- Introduction to Zydeco and Pre-Survey
- Using Zydeco to observe and record the parts of monocot and dicot seeds
- Taking pre-test
- Observe germination, parts of plants and their functions & Practice Zydeco
- Parts of the plants continued & Practice Zydeco
- Parts of the plants continued & Practice Zydeco
- Create hypothesis for science fair projects & Data review
- Research background information for science fair projects
- Complete planning for science fair projects
- Discuss: What are variables in a controlled experiment?
- Complete planning for science fair projects & Start experiments
- Students finalized their own questions to investigate in their science fair projects
- Photosynthesis & Collect data on plant growth
- Photosynthesis continued & Collect data from experiments
- Respiration & Photosynthesis & Collect data from experiments
- Defining CER & Making connections with previous activities (Providing Instruction- Day 1)
- Collect data from experiments
- Modeling CER (Providing Instruction- Day 2) & Collect data from experiments
- Practice Zydeco CER feature & Collect data from experiments
- Critiquing CER (Providing Instruction- Day 3)
- Critiquing CER (Providing Instruction- Day 4) & Collect data from experiments
- Fertilization & Collect data from experiments
- Fertilization continued & Collect data from experiments
- Making connections with everyday explanations (Providing Instruction- Day 5)
- Creating Explanations (Using Mobile Devices- Day 1)
- Creating Explanations (Using Mobile Devices- Day 2)

Day 1 to Day 6. A graduate student from the Zydeco team introduced the mobile application using a document camera on day 1. By following the instructions provided by the graduate student, the students explored the various phases in the application: plan, collect, review, and explain. In the following days, the teacher focused on exploring (a) dicot and monocot seeds, (b) germination, and (c) parts of the plants. Students continued to use Zydeco on a daily basis in various activities (e.g., observing seeds, parts of plants, the germination process).
Day 7 to Day 9. During these three days, the teacher supported students as they finalized their ideas for their science fair projects by having individual discussions with each student. In this process, she first asked the students to select their science fair question (see Appendix I); these ten questions were prepared based on the materials the teacher could provide to the students. Then students created their hypotheses (see Appendix H), defined the procedure (see Appendix J), and identified variables for their projects (see Appendix K).

Day 10 to Day 20. Previous data collection days had focused on the process in which seeds were sprouting; the goal of collecting data on a daily basis before starting the science fair projects was to familiarize students with the Zydeco application and the data-labeling process. On day 10, students completed the setup and collected the first data for their science fair projects. In this process, the teacher labeled students' plants with their initials (see Figure 12) so that they could continuously collect data from the same experiment. Figure 12 presents sample data from the first day, during which one of the students linked data with the question and also added some notes. From day 11 to day 20, the teacher focused on reminding the students about important content ideas that were covered in previous years by exploring photosynthesis, respiration, and fertilization. Besides putting an emphasis on the content, Ms. Robinson used several scaffolding strategies to support students' understanding of the claim-evidence-reasoning (CER) framework; she spent five days focusing on CER.

Figure 12. Student Data from Day 1-Plants Unit

Day 21 & Day 22. At the end of the unit, Ms. Robinson devoted two days to writing explanations. On day 21, she asked her students to create an explanation related to their own science fair projects. The goal of the following day (day 22) was to analyze another student's scientific explanation. Like the five days on which she provided instruction, these two days will be closely investigated in the findings chapter.

Focused Teaching Intervention

Ms. Robinson was an integral part of the project. She had used an explanation framework in her classes for years, and she had also used the CER application in the previous year. Given her previous experiences, I initially believed that additional support would be unnecessary. However, following the plants unit, two days of focused teaching intervention sessions were conducted with the teacher to support the enactment of the second unit. The reason for having these sessions was to inform the teacher about students' performance during unit 1, since very few students included scientific principles and almost half of them could not create reasoning statements. Two major activities were carried out during focused teaching intervention to accomplish this goal: (a) discussing students' performance in the first unit, and (b) completing preparations for the second unit. In addition, I also introduced a new feature of the application called "instant review"; due to technical difficulties, this feature had not been implemented during the first unit. To discuss students' performance, plan the second unit, and introduce the instant review feature, I organized two focused teaching intervention sessions.
During these sessions, my aim was to create an environment in which the teacher learned more about how to support students in constructing explanations. Analyzing students' work is highlighted as an important activity for improving the quality of teaching practices (Garet et al., 2001; Desimone et al., 2002; Fishman et al., 2003; Borko, 2004; Simon et al., 2006; Richmond & Manokore, 2011; McNeill & Knight, 2013). Consistent with that idea, I focused on examining student work instead of analyzing the teacher's practices, since I only had one participant. A short summary of the activities in the sessions can be found below; these two days will be closely analyzed in the results chapter.

First Focused Teaching Intervention Session- Day 1. In the first session, I asked Ms. Robinson to code nine explanations (see Appendix G) from the plants unit, and then we discussed what was missing in those explanations and how the teacher could improve students' explanations. Finally, I asked her to create an explanation by using the data students had collected (see Figure 13). The quality of the teacher's explanation will be evaluated in the results chapter.

Figure 13. Teacher's Explanation

Second Focused Teaching Intervention Session- Day 2. The goals of the second day were to create an activity that Ms. Robinson could use for scaffolding students in constructing explanations during the second unit and to familiarize the teacher with the coding rubric. To help the teacher in this process, the chair of my dissertation committee and I selected nine explanations created in another school after investigating water quality (see Appendix F). During the second day, Ms. Robinson and I coded and ranked these explanations. During unit 2, Ms. Robinson used these nine explanations to support critiquing and debating student explanations (McNeill & Krajcik, 2011). During the second session, I also introduced the instant review feature (see Figure 14) in the Zydeco application, which allowed the teacher to review students' progress in constructing explanations as the students worked on them. The second focused teaching intervention session was also designed to help the teacher become familiar with the scoring rubric when using the instant review feature of the application. This new scaffold was added to provide additional support for the teacher when providing feedback to students as they were engaged in creating explanations (McNeill & Knight, 2013). From her device, the teacher could review student progress regarding the data added, claims, and reasoning statements.

Figure 14. Reviewing Student Progress

Unit 2- Water Quality

Unit 2 is shorter than unit 1 because the students' data collection took only one day. In unit 2, the teacher spent a day and a half on two activities to support the students' understanding of CER. When creating explanations, Ms. Robinson asked her students to complete two explanations by investigating the data collected in 2012 and 2013. In this section, I will provide a short summary of each day by discussing the important activities that occurred. I will investigate four days (a day and a half for providing instruction, two and a half days for using mobile devices) to examine synergy in the findings chapter.

Day 1. Day one focused on reminding students of several key ideas (erosion and deposition) in relation to water quality (see Appendix A). At the beginning of the class, Ms. Robinson asked the students to define deposition, erosion, and watershed.
After the key terms were defined, Ms. Robinson showed a video that discussed the effects of nonpoint source pollution. Then the teacher conducted a watershed activity in which students discussed the direction of water flow (see Appendix A). As a final activity, the teacher presented the driving question, "How do we determine the health of our river?" and asked students to create two questions to answer the driving question. As a group, the students created two questions:
• What organisms live in the water?
• How do we find the health of the river using water quality indicators?
Ms. Robinson asked students to log in to Zydeco and add these two questions. Finally, each student included an additional question that he or she would like to explore during the unit.

Day 2 & Day 3. Ms. Robinson started the second day by asking students to list the things that would help them determine water quality (see Appendix B). When discussing how far pollutants can travel, she provided an example by informing the students about a soccer ball lost during the Japanese tsunami that traveled to Alaska. Next, Ms. Robinson focused on investigating the water quality indicators: dissolved oxygen (DO), temperature, phosphate, biochemical oxygen demand (BOD), fecal coliform, pH, turbidity, microorganisms, and macroorganisms. During this instruction, students took notes on their worksheets (see Appendix B). When examining the indicators, the teacher made several connections with the plants unit. One of the science fair questions that investigated the role of pH dealt with acid rain. In the example provided below, she created a link between dissolved oxygen and pH by discussing photosynthesis:

Ms. Robinson: So, all living things can only live within a certain pH range. If it's too far to the basic or acid side, we have problems either way. What happens with acid rain, when the rain we have is too acidic, what happens to plants?
Student(s): They die.
Ms. Robinson: They die. Plants die, they can't do photosynthesis, right? If the water is too acidic, or too basic, are there going to be plants in the water?
Student(s): No.
Ms. Robinson: No, so there's not going to be plants in the water undergoing photosynthesis, and producing dissolved oxygen, got it? All right, bacteria are probably the most tolerant.

Day 4. The activity flow for day 4 was as follows: (a) writing an explanation (see Appendix C) to determine the health of the pond at the school, (b) testing the water quality, and (c) reviewing explanations created at another school (see Appendix F). This last review activity was created during the focused teaching intervention sessions. Like the previous CER scaffolding activity days, day 4 will be analyzed in the results chapter. The data collection focused on testing the tap water in the classroom and took a short time (about 10-15 minutes), since students tested the indicators that can be tested quickly (e.g., phosphate, DO, turbidity). The goal of this process was to help students use Zydeco and become familiar with testing and labeling indicators. On day 4, the teacher also started a critiquing activity by reviewing several explanations selected during the second focused teaching intervention session.

Day 5. Due to weather conditions, students could not visit the river. To test the water quality of the Rouge River, Ms.
Robinson brought buckets of water from the river that runs through the backyard of the school: "It's only 4 degrees Celsius out, I went and collected water. This week we're going to do the water testing here." Students collected data using the following water quality indicators: phosphate, temperature, dissolved oxygen (DO), pH, and turbidity (see Appendix D). The BOD and fecal coliform tests were conducted by the teacher and uploaded to Zydeco, since they require waiting several days for accurate results.

Day 6. In unit 1, Ms. Robinson had spent the last day of instruction on an activity to support students' understanding of CER. In unit 2, she completed the critiquing activity before the students started writing explanations. On day 6, she spent half of the class session critiquing explanations and then provided about twenty minutes for the students to start writing their own explanations.

Day 7 & Day 8. Students completed two explanations during these days. In the first explanation, they analyzed the data collected on day 5. In this process, Ms. Robinson asked students to review their data; then she asked students to create claims about water quality. Students selected data from the pool to support their claims, and finally they completed their scientific explanations by providing reasoning to show why the data served as evidence (see Figure 15).

Figure 15. Sample Claim-Evidence-Reasoning

In the second explanation, students determined the health of the Rouge River by comparing the data collected in 2012 with the data collected in 2013. Not being able to visit the river influenced the data collection in relation to examining human activities and organisms; students could only analyze human activity, microorganism, and macroorganism data from the previous year (see Figure 16). By searching data with "2012," students could access all the data from the previous year and analyze the water quality by looking at the organisms found (see Appendix D). The three days that focus on creating explanations (day 6 to day 8) will be closely analyzed in the results chapter.

Figure 16. Reviewing 2012 Data

CHAPTER 5: METHODS

This chapter discusses the participants and the data analysis process. It also presents how the levels of synergy were created when analyzing the data.

Participants

When examining synergy, I focused on investigating the practices of an experienced middle school science teacher in an urban middle school in the Detroit Public School system. The primary data source was an audio recorder that the teacher, given the pseudonym Ms. Robinson, carried during the two units. Ms. Robinson participated with two of her eighth-grade classes (n=54). All student conversations were transcribed anonymously as "student(s)." Ms. Robinson holds a bachelor's degree in nursing and a master's degree in K-12 teaching and has taught for over twenty years. Ms. Robinson has participated in several research projects using technology in the last ten years, joining the Zydeco project two years ago. In previous years, she organized four field trips that used the Zydeco application. On the first three field trips, she only used the data collection features of Zydeco; during the last field trip, she was introduced to the claim-evidence-reasoning feature. Besides using Zydeco in the classroom, Ms. Robinson also participated in design meetings to provide feedback when designing the scientific explanation feature in the mobile application.
When testing an earlier version of Zydeco with an emphasis on water quality, Ms. Robinson participated with several 7th-grade classes in 2012. At the beginning of this year, Ms. Robinson's school was merged with another school in the same district; because of this, some of Ms. Robinson's 8th-grade students had used Zydeco with her previously. During the intervention, the Zydeco project provided thirty iPads, which the teacher kept for the entire year. The mobile devices were primarily used during the plants and water quality units. Ms. Robinson's participation in numerous projects to design technology scaffolds and her goal of embedding the creation of scientific explanations in her teaching made her a good candidate for the case study. Although the need for support when implementing a new technology is a prevalent theme in the literature (Baylor & Ritchie, 2002; Windschitl & Sahl, 2002; Russell, Bebell, O'Dwyer, & O'Connor, 2003; Zhao & Bryant, 2006; Ertmer & Leftwich, 2010; Gray, Thomas, & Lewis, 2010; Gerard, Varma, Corliss & Linn, 2011; Norris & Soloway, 2011), Ms. Robinson did not need additional support due to her previous experiences in using various technologies and the CER (claim-evidence-reasoning) framework.

Data Analysis

To explore my research question investigating the level of synergy between providing instruction and using mobile devices, I used multiple sources of data to frame a case study. Yin (2014) noted the importance of having various data sources because "any case study finding or conclusion is likely to be more convincing and accurate if it is based on several different sources of information" (p. 120). Similarly, Reinking and Bradley (2008) underlined the importance of evidence supported by different data sources. Connected with this idea, I utilized and analyzed multiple data sources: (a) teacher interviews (one pre-interview and two post-interviews with the teacher); (b) students' scientific explanations using the iPad version of Zydeco, written after each unit; (c) video and audio data that tracked the teacher during the two units; and (d) video and audio data of the focused teaching intervention sessions. Table 3 summarizes the relationship between my research questions and the data collection sources for the case study. When analyzing these data sources, three major steps identified by Strauss and Corbin (1998) guided the analysis. Below I will describe the process briefly; examples for each data source will be provided later in the chapter.

Table 3
Research Questions and Data Collection Methods

Overarching research question: How does the level of synergy between providing instructional supports and using supports in mobile devices aid in improving the quality of students' explanations?

Research Question (RQ) 1: What does this teacher think about providing instruction, using mobile devices, and students' ability to create explanations before and after the intervention?
Data collection tool: Interview. When: before the first unit and after each unit. How many times/days: once before the first unit and twice after the units.

RQ-2: How do students' claim, evidence and reasoning scores change during the two units?
Data collection tool: iPad CER explanations. When: after each unit. How many times/days: four explanations (two after each unit).

RQ-3: How does the teacher use instructional strategies to support students' understanding of the claim-evidence-reasoning framework?
RQ-4: How do the teacher's scaffolds and the scaffolds in Zydeco work together to support student learning?
Data collection tool (RQ-3 and RQ-4): Videos and audiotapes. When: both classrooms were videotaped in the two units, and the teacher was audio recorded in all of the lessons. How many times/days: teacher practices (plants: 7 days; water quality: 4 days).

RQ-5: What is the role of focused teaching intervention in informing the teacher about her practices?
Data collection tools: Videos and audiotapes; teacher's explanation. When: during both sessions. How many times/days: 2 days.

1. Open Coding: As noted by Strauss and Corbin (1998), this step helps the researcher in "grouping similar items according to some defined properties" (p. 121) that would, in turn, help categorize the data. Strauss and Corbin pointed out that one potential way to use open coding is to develop codes "when the researcher already has several categories and wants to code specifically in relation to them" (p. 120). Consistent with that idea, I created preliminary codes to analyze each data source separately. These codes were then finalized for each data source in several research meetings with the chair of the dissertation committee. Later in this chapter, step 1 of the data analysis will be represented as "categorizing data," and I will describe how the finalized categories were used to analyze each data source. When coding the data, two researchers coded each data source to establish inter-rater reliability.

2. Axial Coding: The aim of this step is to "uncover relationships among categories" (Strauss & Corbin, 1998, p. 127). In my study, this step focuses on creating patterns for each data source. For instance, when analyzing the quality of students' explanations, step 1 focused on categorizing data with an emphasis on claim, evidence, and reasoning in the plants and water quality units. Axial coding (step 2) examined the patterns across these categories by focusing on how the quality of claim, evidence, and reasoning changed across units. In this chapter, I first describe how each data source was categorized and which patterns were created; later, I investigate comparisons across data sources.

3. Selective Coding: The last step focused on "integrating and refining categories" (Strauss & Corbin, 1998, p. 127). In this process, "central categories" are created from the "list of existing categories" (Strauss & Corbin, 1998, p. 146). After examining the patterns within data sources, I created four main categories: (a) the teacher's ideas and practices about providing instruction, (b) the teacher's ideas and practices about using mobile devices, (c) the quality of students' performance and what this teacher thinks about students' performance, and (d) how the level of synergy affects the quality of students' explanations. The final step of analysis focused on examining how patterns varied across data sources (Strauss & Corbin, 1998) when creating the case study.

Richmond and Manokore (2011) used grounded theory to categorize the data and uncover the relationships between categories when analyzing teacher interviews after participation in professional learning community meetings. Similarly, in this chapter I first discuss how step 1 and step 2 (categorizing data and axial coding) helped me to analyze and categorize each data source separately. Then, I describe the last step of analysis (selective coding) by examining how I used the main categories to create the case study in order to investigate the synergistic effect of the scaffolds on the quality of students' explanations.
During this process, I also focused on including the teacher's ideas about supporting the components of synergy and students' performance. When comparing across data sources, several studies discussed the importance of examining the patterns that are similar across participants and the ideas that differentiate participants (Eisenhardt, 1989; Boeije, 2002). But as noted by Keen and Packwood (1995), the nature of the case study may change the type of comparisons. When creating the case study, I focused on depicting Ms. Robinson's practices by analyzing the patterns across the four main categories listed under selective coding.

Analyzing Data Sources Separately

As noted earlier, the first phases of the analysis (categorizing data and axial coding) focused on analyzing each data source separately.

Students' Scientific Explanations

McNeill and Krajcik (2008a) noted that change in a teacher's practices leads to change in students' scientific explanation scores. Thus, students' scientific explanations allowed for exploration of RQ-2 (How do students' claim, evidence and reasoning scores change during the two units?). To code students' scientific explanations for each unit, I used a modified rubric (see Table 4) designed by McNeill and colleagues (2006). I used the rubric to score students' explanations separately for claim, evidence, and reasoning (Krajcik, McNeill & Reiser, 2008).

Table 4
iPad Claim-Evidence-Reasoning Rubric

Claim (Unit 1 and Unit 2)
0: Does not make a claim, or makes an inaccurate/inappropriate claim.
1: Makes an accurate and complete claim to answer the question.

Evidence (Unit 1 and Unit 2)
0: Does not provide evidence, or only provides inappropriate evidence (evidence that does not support the claim).
1: Provides the evidence as text within the reasoning. (Students cannot type in the evidence part, but some students add data values into claim and reasoning statements without linking them with a piece of data from the pool.)
2: Provides one piece of evidence by selecting it from the data pool (one photo, video, or text item).
3: Provides multiple pieces of evidence.

Reasoning (Unit 1)
0: Does not provide reasoning, or only provides reasoning that does not link evidence to claim.
1: Links the claim and evidence, but fails to include supportive details (scientific principles).
2: Provides some accurate but incomplete scientific principles. At this stage students may provide some inaccurate scientific principles.
3: Reasoning with complete scientific principles. Provides reasoning that links evidence to claim and includes appropriate and sufficient scientific principles.

Reasoning (Unit 2)
0: Does not provide reasoning, or only provides reasoning that does not link evidence to claim.
1: Links the claim and evidence, but fails to include supportive details (indicator rankings or scientific principles).
1a: Links the claim and evidence, and provides judgment on the rank of the quality by using indicator rankings.
2: Provides some accurate scientific principles by discussing what the indicator means for water quality, with some inaccurate principles for some indicators.
3: Provides appropriate and sufficient scientific principles by discussing what the indicator means for water quality.

As depicted in Table 4, the claim section focuses on whether students created an accurate and complete claim, and the evidence section identifies the data students selected from the data pool.
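As a minimal illustration of how this rubric assigns separate codes to each component, the sketch below encodes the Table 4 levels as simple lookups and renders the codes for one hypothetical explanation. Assigning codes to real explanations required researcher judgment; nothing here is part of the actual coding procedure, and the sample scores are invented.

```python
# Hypothetical encoding of the Table 4 rubric levels; coding real
# explanations required researcher judgment, not automated matching.
CLAIM_LEVELS = {0: "no/inaccurate claim", 1: "accurate and complete claim"}
EVIDENCE_LEVELS = {0: "none/inappropriate", 1: "typed into text only",
                   2: "one piece from data pool", 3: "multiple pieces"}
REASONING_LEVELS_UNIT2 = {0: "no link", 1: "link only",
                          "1a": "indicator ranking only",
                          2: "partial/mixed principles", 3: "complete principles"}

def describe(scores: dict) -> str:
    """Render one explanation's claim/evidence/reasoning codes."""
    return (f"claim={scores['claim']} ({CLAIM_LEVELS[scores['claim']]}), "
            f"evidence={scores['evidence']} ({EVIDENCE_LEVELS[scores['evidence']]}), "
            f"reasoning={scores['reasoning']} "
            f"({REASONING_LEVELS_UNIT2[scores['reasoning']]})")

# A hypothetical unit 2 explanation: complete claim, multiple pieces of
# pooled evidence, and reasoning that only ranks an indicator (level 1a).
print(describe({"claim": 1, "evidence": 3, "reasoning": "1a"}))
```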
The most detailed analysis was employed when analyzing reasoning statements. The reasoning section emphasized how students linked the claim and evidence, and how students included scientific principles when discussing this link. When examining scientific principles, I focused on analyzing different content areas. In unit 1, I investigated how students discussed scientific principles with an emphasis on plant growth (e.g., how the level of water affects the rate of photosynthesis). In the second unit, I analyzed how students discussed scientific principles related to water quality (e.g., how the level of phosphate affects the water quality). After scoring students' explanations, I looked for changes in claim-evidence-reasoning performance between unit 1 and unit 2. After using the rubric to score students' explanations, I found that the claim and evidence sections remained the same for both units. In unit 2, students used the indicator chart (see Appendix D), and some students focused on ranking the indicators without discussing scientific principles. While not as sophisticated, use of the indicators was analyzed as a new category when analyzing reasoning performance in unit 2 (see level 1a in the reasoning for water quality section).

Coding Students' Explanations. After the plants unit, two researchers (the external evaluator of the Zydeco project and I) separately coded 20% of the data to determine inter-rater reliability. The inter-rater reliability score for the plants investigation was 93%. The researchers held discussion meetings to resolve disagreements, and the remaining data were then coded independently by one of the two researchers. When analyzing the water quality data, the researchers coded all of the data and then compared their scores for inter-rater reliability, since both researchers had coded water quality data from Ms. Robinson's class with a similar rubric in the previous year. After coding all the data, the inter-rater reliability score was 92%. The researchers organized discussion meetings to resolve the disagreements.

Step 1- Analyzing Claims (Categorizing Data). Analyzing students' explanations started by examining whether the student had a complete and accurate claim. Table 5 provides examples from both units.

Step 2- Analyzing Evidence (Categorizing Data). The amount of evidence students added to their explanations was used to analyze the quality of the evidence. When coding the evidence section, I focused on whether students included evidence from the data pool or whether they just noted the evidence without providing any support from the data pool (see Table 4). The example presented in Figure 15 shows several pieces of evidence supporting a student's explanation, with the student receiving a rating of "3" for the evidence score (see Table 4). When students did not add any evidence to their explanations, they received a "0."

Table 5
Coding the Quality of Claims

Student's claim: "The type of soil does affect plant growth."
Coding: Claim level 1- Student had a complete and accurate claim.

Student's claim: "How does the amount of water affect plant growth?"
Coding: Claim level 0- Student did not have a claim.

Student's claim: "DO"
Coding: Claim level 0- Student did not have a complete claim.

Student's claim: "I believe that the Rouge River from 2012 is not healthier than the Rouge River in 2013."
Coding: Claim level 1- Student had a complete and accurate claim.

Step 3- Analyzing Reasoning Statements (Categorizing Data).
After scoring the claim and evidence components, I focused on the quality of the reasoning statements by evaluating how students justified the evidence they added to their claims (McNeill et al., 2006). In this process, I gave credit to the details students added (see Table 4) when discussing their evidence (e.g., indicator rankings, scientific principles). Table 6 provides examples of all levels in the coding rubric.

Table 6
Coding the Quality of Students' Reasoning

Student's reasoning statement: "Temperature does affect plant growth because plants can grow in certain types of conditions. In my experiment the plant that was in the cold grew the longest."
Coding: Reasoning level 0- Wrong reasoning. Student created an incorrect link between evidence and claim.

Student's reasoning statement: "The amount of water does affect plant growth. That is obvious by the different lengths of the plants they grew more and more each day."
Coding: Reasoning level 1- Discussing the claim and evidence without including scientific principles.

Student's reasoning statement: "This years dissolved oxygen water level was 8 ppm which was a number 3 meaning it has a good ranking, and last years dissolved oxygen water level was 4 ppm which was a number 1 meaning that the water was in poor condition."
Coding: Reasoning level 1a- Discussing ranking without including scientific principles. Student ranks the DO level without discussing what it means for water quality.

Student's reasoning statement: "The level of phosphate is 2ppm, which is good. The more ppm of phosphate is not that good because it increase the plant growth and plant growth decreases the amount of DO ... DO is needed very much in lakes and ponds. If it is low, then lots of aquatic animals can die and the dead organisms make the water even more unhealthy. The level of the DO was 0ppm and that not so good."
Coding: Reasoning level 2- Student had both accurate and inaccurate principles. Student included a correct explanation for DO, but added an incorrect link between plant growth and DO.

Student's reasoning statement: "The amount of water affects plant growth because plants need water to carry out bodily functions. Plants use water to grow and reproduce."
Coding: Reasoning level 2- Student started discussing principles, but the principles are not complete since the student did not make connections with photosynthesis.

Student's reasoning statement: "The DO is at 8ppm which is 88% saturation, if the water is saturated then it can hold more DO the more oxygen the healthier the water and organisms living in it. The organisms in the water need DO to survive."
Coding: Reasoning level 3- Student included accurate scientific principles by discussing the link between DO and water quality.

Step 4- Creating Patterns (Axial Coding). After coding students' explanations, the final step of the analysis focused on describing the quality of students' explanations in both units by comparing the claim-evidence-reasoning scores in unit 1 to those from unit 2. This process was guided by the following question: How did students' claim, evidence, and reasoning performance change between unit 1 and unit 2? Table 17 in the results chapter examines how the claim-evidence-reasoning scores changed between unit 1 and unit 2. When creating patterns, I investigated: (a) the percentage of students that successfully created claims, (b) the evidence added to these claims, by focusing on the percentage of students that selected data from the data pool, and (c) the reasoning statements, by comparing the percentage of students that added supportive details (e.g., indicator rankings, scientific principles) in both units.
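The quantitative bookkeeping behind coding and pattern creation reduces to two simple calculations: percent agreement between the two coders (the 93% and 92% figures above are scores of this kind) and the percentage of students at each rubric level in each unit. The sketch below shows both on made-up score lists; the numbers are hypothetical, not the study's data.

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Simple percent agreement between two coders scoring the same
    explanations (the study reports 93% and 92% scores of this kind)."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100 * matches / len(coder_a)

def level_percentages(scores):
    """Percentage of students at each rubric level (step 4's unit comparison)."""
    counts = Counter(scores)
    return {level: 100 * n / len(scores) for level, n in sorted(counts.items())}

# Hypothetical reasoning scores for illustration only.
unit1_reasoning = [0, 1, 1, 2, 0, 1, 3, 2]
unit2_reasoning = [1, 2, 2, 3, 1, 2, 3, 2]

print(percent_agreement([1, 2, 2, 0, 3], [1, 2, 1, 0, 3]))  # 80.0
print(level_percentages(unit1_reasoning))  # e.g., {0: 25.0, 1: 37.5, ...}
print(level_percentages(unit2_reasoning))
```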
Teacher's Practices and Defining Levels of Synergy

The primary focus of this study is exploring the synergy between the instructional strategies the teacher used to scaffold students' initial understanding of the claim-evidence-reasoning framework (part 1: providing instruction) and the teacher's instructional practices when students used the mobile application to create explanations with the CER framework (part 2: using mobile devices). Previous studies examined the synergy when supporting students' understanding of the CER framework (McNeill et al., 2006; McNeill & Krajcik, 2009) and discussed the synergy between technology scaffolds and teachers' practices (Tabak, 2004). When combining these different definitions to examine synergy, the levels of synergy emerged from the data (Strauss & Corbin, 1998).

To analyze data with an emphasis on the level of synergy, I first examined the practices of the teacher when providing instruction using various activities about constructing explanations (RQ-3: How does the teacher use instructional strategies to support students' understanding of the claim-evidence-reasoning framework?), and then her practices when students used the mobile application, analyzed separately (RQ-4: How do the teacher's scaffolds and the scaffolds in the mobile application work together to support students' explanations?). The first step of analysis focused on investigating the quality of teacher practices when supporting the claim-evidence-reasoning framework (separately for providing instruction and using mobile devices). Part 1 examined the quality of support when the teacher used different instructional activities to scaffold the students' understanding of the claim-evidence-reasoning framework. Part 2 emphasized understanding how the teacher provided feedback to students when they were engaged in writing explanations using the mobile application.

In the second step of analysis, I examined the synergy between part 1 and part 2 (providing instruction and using mobile devices). As depicted in Figure 17, synergy was defined by comparing the quality of the support the teacher provided when giving instruction about constructing explanations with the support she provided when students used mobile devices to construct explanations, and by examining how the two worked together.

In summary, the goal of examining the teacher's instructional practices was to investigate: (a) how this teacher supported the students' understanding of the CER framework (part 1), (b) how she supported students when they were creating their explanations by using the CER framework (part 2), and (c) the level of synergy between part 1 and part 2. Teacher practices were analyzed through five steps: analyzing CER separately in both parts, classifying data with an emphasis on the level of support in both parts, examining the support level for each activity in part 1 and for each day in part 2, examining the support level for each part, and defining synergy by comparing support levels between part 1 and part 2 in each unit. Focusing on each part through the lens of the CER framework helped me to investigate and compare the support provided by the teacher in both parts.

[Figure 17 depicts part 1 (providing instruction for explanations) and part 2 (using mobile devices), each examined for claim-evidence-reasoning support, feeding into defining the synergy: comparing support levels between providing instruction for explanations and using mobile devices.]

Figure 17. Defining the Synergy
Step 1- Coding Teacher's Practices: Analyzing CER Separately (Categorizing Data). In the first step of analysis, I focused on coding the teacher's practices by using the rubric provided in Table 7, emphasizing the claim-evidence-reasoning framework. In this process, I modified a rubric created by McNeill and colleagues (2006) and used this new rubric to code part 1 and part 2. This, in turn, helped me identify and compare the teacher's practices through the following steps. Table 8 displays a sample analysis of the teacher's practices with an emphasis on CER.

Part 1 analyzed the teacher's practices when providing instruction to support students' understanding of CER, with an emphasis on the generic and content-specific scaffolds provided by the teacher. In analyzing how the teacher supported the students when creating claims, I focused on examining the teacher's definition of claim in both units. Since the content was different in the two units, I examined how the teacher supported content-specific claims. For supporting students in the use of evidence, I examined how the teacher helped students to discuss data and include data values. Students measured the factors influencing plant growth in unit 1 and the factors influencing water quality in unit 2. Since the measurements differed (e.g., measuring how much plants grew in centimeters versus the percent of dissolved oxygen), this part also focused on how the teacher provided content-specific scaffolds by discussing measurements accurately.

Table 7
Rubric for Evaluating Teacher's Practices

Claim
0 (no support): Teacher does not help students to create claims.
1 (reminding to create claims): Teacher reminds students of the need to create a claim.
2 (claim as a statement): Teacher supports students to write a statement in relation to the question.

Evidence
0 (no support): Teacher does not provide feedback for including evidence, or guides students to inappropriate evidence.
1 (the need to include data): Teacher discusses the importance of including data (no discussion of data values).
2 (discussing data): Teacher discusses why the student added the data or reminds students to include accurate data values.

Reasoning
0 (no support): Teacher does not guide students to link claims and evidence, or guides students to inappropriate reasoning.
1 (the need to add reasoning): Teacher only notes the importance of adding a reasoning statement.
2 (connection with claim and evidence): Teacher helps students to provide reasoning that links the claim and evidence.
3 (including scientific principles): Teacher guides students to include scientific principles by providing content-specific scaffolds (e.g., discussing how temperature affected plant growth in unit 1; examining the role of pH in defining water quality in unit 2).

Finally, for supporting students' understanding of reasoning, I focused on whether the teacher only discussed the role of linking evidence and claim (generic scaffold), or prompted students to include supportive details (e.g., indicator rankings, scientific principles). In my study, the scientific principles focused on examining how the teacher provided content-specific scaffolds in different content areas in both units. For instance, the plants unit examined the content-specific scaffolds with an emphasis on the factors influencing plant growth (e.g., temperature, amount of water, and type of soil). The water quality unit focused on content-specific scaffolds by examining how the teacher supported students analyzing different factors related to determining water quality (e.g., phosphate, dissolved oxygen, and turbidity).
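For readers who prefer a compact view, the rubric in Table 7 above can be represented as a small lookup structure. The sketch below is illustrative only, with hypothetical names; it is not code used in the study.

```python
# Illustrative encoding of the Table 7 rubric (level labels paraphrased).
TEACHER_RUBRIC = {
    "claim": {
        0: "no support",
        1: "reminding to create claims",
        2: "claim as a statement",
    },
    "evidence": {
        0: "no support",
        1: "the need to include data",
        2: "discussing data",
    },
    "reasoning": {
        0: "no support",
        1: "the need to add reasoning",
        2: "connection with claim and evidence",
        3: "including scientific principles",
    },
}

def describe(component, level):
    """Translate a coded teacher practice into its rubric label."""
    return f"{component} level {level}: {TEACHER_RUBRIC[component][level]}"

print(describe("reasoning", 3))  # reasoning level 3: including scientific principles
```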
Part 2 investigated how this teacher supported students as they created explanations using the mobile application. In part 2, the mobile application provided several scaffolds that were designed to remind students of key aspects of the explanation process. Using the rubric provided in Table 4, the support coming from the scaffolds in Zydeco can be coded as: (a) the need to add a claim, (b) the need to add evidence to claims, and (c) the need to add reasoning. The scaffolds in Zydeco provided generic support (McNeill & Krajcik, 2009). When examining the teacher's practices, I investigated how this teacher provided generic and content-specific scaffolds in both units.

The reasoning scaffold in Zydeco reminded students to add a reasoning statement. When analyzing the teacher's practices to support reasoning, I looked for both content-specific and generic scaffolds. I focused on understanding how the teacher supported students in discussing the link between claim and evidence and in adding scientific principles to justify why the data serve as evidence to support the claim. In each unit the driving questions focused on different content areas, and I investigated how the teacher supported students to create claims after reviewing the questions under the claim section. Evidence scaffolds were designed to remind students that they needed to add evidence to their claims by selecting pictures and videos from the data pool (see Figure 3), but it is also important for the teacher to support students to do more than pull data from the pool. In this process, I investigated how the teacher supported students in adding content-specific details to the data selected from the pool by questioning why students selected that particular data and by discussing data values.

Table 8
Sample Coding- Teacher Practices

Quote:
Ms. Robinson: Did you make revisions? You still need to work on that one (reasoning statement). Because you don't have any science principles related to that. You just talk about your data.
Coding:
Reasoning- Level 3: Teacher supported students to include scientific principles. In this example, there is no content discussion since the student did not add any specific details. The teacher is reminding the student to include scientific principles.

Quote:
Ms. Robinson: So you shouldn't have been playing, should you? So the question was, is it a plant? So is a tomato a plant?
Student(s): Yes.
Ms. Robinson: So that should have been your claim. Ronald, let's hear yours.
Student(s): My claim is ::unclear:: is a plant. My evidence is that it has roots ::unclear::
Ms. Robinson: So did he have a claim?
Student(s): Yes.
Ms. Robinson: Did he have 3 pieces of evidence?
Student(s): Yes.
Ms. Robinson: Did he have reasoning?
Student(s): Yes.
Coding:
Claim- Level 1: In this example, the teacher is leading a whole-class discussion in which she reviewed one of the students' explanations. When discussing the claim, she did not review the question or the data. She only highlighted the need for the claim.
Evidence- Level 1: When reviewing students' evidence, she only focused on the number of pieces of evidence. She did not ask for data values. Here the teacher only focused on the need to have evidence.
Reasoning- Level 1: Similar to claim and evidence, the teacher did not take the reasoning discussion further. In summary, she focuses on illustrating the need to have claim, evidence, and reasoning to complete an explanation without taking students' understanding further.
Quote:
Ms. Robinson: Yesterday we started scientific investigations, you wrote a claim evidence reasoning statement about what a plant was, and we also did one with the data table, correct? So, just to review, what is a claim?
Student(s): A statement.
Ms. Robinson: A statement about what?
Student(s): What you believe.
Coding:
Claim- Level 2: In this example, the student defined the claim as a statement, and the teacher supported that idea.

In summary, the generic scaffolds (McNeill & Krajcik, 2009) in the mobile application were designed to help students complete the various parts of the explanation. In part 2 of the study, I focused on understanding how the teacher's practices worked synergistically with these scaffolds to support the quality of explanations, with an emphasis on content-specific scaffolds such as questions that prompted students to use scientific principles.

Another graduate student, who had recently completed his Ph.D. in a Research, Evaluation, Measurement & Statistics program, and I coded the teacher practices. First, we coded a sample lesson from the plants investigation and found an inter-rater reliability of 70%. After coding the first day, we used discussion meetings to resolve the differences in coding and proceeded to code two more days, selecting one day from the plants investigation and another day from the water quality investigation. In this phase, the inter-rater reliability was 90%, and we again held several meetings to discuss the differences in coding. After this step, we shared the data, and the remaining data were coded by one of the two coders.

Step 2- Classifying Data by Level of Support (Categorizing Data). The first step of analysis focused on coding teacher practices with an emphasis on the claim-evidence-reasoning framework. The second step of analysis emphasized the quality of the teacher's practices by specifying the level of support through: (a) classifying the level of support, (b) defining support levels for each day, and (c) defining support levels for each part.

In order to classify the level of support, thresholds for each component of the claim-evidence-reasoning framework were defined by examining the definition of CER provided in the literature. As noted earlier, several studies defined claim as a statement that answers the question; evidence as the data to support the claim; and reasoning as the link that justifies why the data can be used as evidence to support the claim (McNeill et al., 2006; Gotwals et al., 2012). Connected with these definitions, Table 9 presents how levels of support were classified for each component of the CER framework (no support, low-level support, and high-level support). Similar to Table 4, Table 9 was also used to analyze teacher practices in both parts.
If the teacher did not provide any support, this resulted in a classification of "no support." When the support provided by the teacher was not aligned with the definitions of claim, evidence, and reasoning, this was considered "low-level support." When students were creating CER statements, if the teacher only focused on reminding them to add claims, evidence, and reasoning, this also resulted in "low-level support."

Table 9
Thresholds for Classifying Level of Support

No support: Claim 0 (no support); Evidence 0 (no support); Reasoning 0 (no support)
Low-level support: Claim 1 (reminding to create claims); Evidence 1 (the need to include data); Reasoning 1 (the need to add reasoning)
High-level support: Claim 2 (claim as a statement); Evidence 2 (discussing data); Reasoning 2 (connection with claim and evidence) or 3 (including scientific principles)

In order for high-quality support to occur and aid students in developing quality explanations, the support coming from the teacher should be consistent with the definitions provided in the literature (McNeill et al., 2006; Gotwals et al., 2012), and it also needs to take students' understanding further, as in the following example: (a) the teacher defines the claim as a statement; (b) when students are adding evidence, she supports them to add data values and to link the evidence to the claim; and (c) finally, under the reasoning section, the teacher focuses on discussing the link between claim and evidence and on including scientific principles to explain the link.

Step 3- Examining the Level of Support for Each Activity and Each Day (Categorizing Data). When analyzing the level of support in each part, I first examined the level of support for each activity when providing instruction and for each day when using mobile devices. For instance, when supporting students' understanding of CER, Ms. Robinson created four activities to provide instruction on how to construct explanations in unit 1, and she created two activities in unit 2. In each unit, I classified the teacher's support on an activity basis (unit 1 was analyzed as: part 1-activity 1, part 1-activity 2, part 1-activity 3, and part 1-activity 4; unit 2 was analyzed as: part 1-activity 1 and part 1-activity 2). When students were using mobile devices, she spent two days on creating explanations in unit 1 and three days in unit 2, and I investigated the level of support for each day in each unit (unit 1 was analyzed as: part 2-day 1 and part 2-day 2; unit 2 was analyzed as: part 2-day 1, part 2-day 2, and part 2-day 3).

When defining the level of support for each activity in part 1 and for each day in part 2, I focused on whether the majority of the instructional instances fit the definition of high-level support for that activity or day. For instance, if the teacher had twenty opportunities to support creating claims in part 1-activity 1, and the majority of these instances could be defined as high-level support (e.g., twelve instances fitting the high-level support definition in Table 9 and eight fitting the low-level support definition), I concluded that the teacher's practices with regard to the claim in part 1-activity 1 provided high-level support (for a sample summary see Table 17 and Table 18 in the results chapter). Although the conclusion would note the existence of high-level support for claims in part 1-activity 1, the results chapter provides a detailed analysis to illuminate the complete practices. After defining support levels for each activity in part 1 and for each day in part 2, the next level of analysis focused on defining the level of support for each part.
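A minimal sketch of this majority rule is shown below, assuming each instructional instance has already been coded against Table 9 as high- or low-level. The function name and the instance codes are hypothetical.

```python
def activity_support_level(instance_codes):
    """Classify one activity/day as 'high' if the majority of coded instances are high-level."""
    high = sum(1 for code in instance_codes if code == "high")
    return "high" if high > len(instance_codes) / 2 else "low"

# Hypothetical: twenty claim-support opportunities in part 1-activity 1,
# twelve coded as high-level and eight as low-level (cf. the example above).
codes = ["high"] * 12 + ["low"] * 8
print(activity_support_level(codes))  # high
```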
Step 4- Examining the Level of Support for Each Part (Categorizing Data). After examining the level of support for each activity in part 1 and for each day in part 2, step 4 focused on summarizing the level of support for each part (e.g., the unit 1 summary was created by examining the level of support in part 1 and part 2). In this process, I focused on capturing how the teacher's support varied, with an emphasis on the level of support for each part of the explanation framework in each unit. For instance, when examining the support levels in part 2-unit 1, the support level was first examined for each day under step 3 (unit 1-part 2 was analyzed as: part 2-day 1 and part 2-day 2). Step 4 takes this one step further by summarizing the level of support across both days separately for claim, evidence, and reasoning under each part (see Table 10).

Table 10
Defining the Level of Support for Each Part

Support Level for Each Activity/Day | Summary of the Support Level for Each Part
75% or more of the activities/days provide high support | High support
50% to 74% of the activities/days provide high support | Moderate support
25% to 49% of the activities/days provide high support | Mixed support
Less than 25% of the activities/days provide high support | Low support

As depicted in Table 10, if 75% or more of the activities/days provided high-level support, this resulted in high support for that part (e.g., the teacher provided high-level support in three activities and low-level support in one activity in the plants unit; this resulted in high support for claims in part 1-unit 1). Moderate support was defined when 50% to 74% of the days or activities provided high support (e.g., the teacher provided high support on one day and low support on the other day when students were using mobile devices in the plants unit; this resulted in moderate support for claims in part 2-unit 1). On the other hand, if the majority of the teacher's practices provided low support, this resulted in low or mixed support (see Table 10).
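The thresholds in Table 10 can be expressed as a short function; the sketch below is an illustrative reading of the table with hypothetical names, not code used in the study.

```python
def part_support_summary(activity_levels):
    """Summarize a part per Table 10 from per-activity/day levels ('high' or 'low')."""
    share = activity_levels.count("high") / len(activity_levels)
    if share >= 0.75:
        return "high"
    if share >= 0.50:
        return "moderate"
    if share >= 0.25:
        return "mixed"
    return "low"

# Hypothetical: three high-support activities and one low-support activity in part 1.
print(part_support_summary(["high", "high", "high", "low"]))  # high (75%)
# Hypothetical: one high-support and one low-support day in part 2.
print(part_support_summary(["high", "low"]))                  # moderate (50%)
```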
Step 5- Defining Level of Synergy in Each Unit (Categorizing Data). The previous steps of analysis focused on analyzing teacher practices with an emphasis on CER, classifying the level of support, defining the level of support for each activity or day, and defining the level of support for each part in each unit. As presented in Figure 17, synergy was defined by comparing the level of support in both parts separately for claim, evidence, and reasoning. Tabak (2004) underlined the importance of coherence for synergy and noted that the existence of synergy would support the quality of the explanations students constructed. Connected with this idea, I define synergy as occurring when the teacher supports students' understanding of the claim-evidence-reasoning framework in both parts and the supports align with and complement each other. When classifying teacher practices with an emphasis on synergy, reaching higher levels would provide better opportunities to support students' understanding of CER.

Table 10 defined the different levels of support for each part. Since synergy focuses on the coherence of support between part 1 (providing instruction) and part 2 (using mobile devices), the resulting levels of synergy are presented in Table 11. The levels of synergy, described in collaboration between the chair of the dissertation committee and me, are: (a) synergy, (b) moderate synergy, (c) mixed synergy, and (d) low synergy. Low synergy occurs when the level of support stays at a low level in one of the parts. Similarly, mixed synergy occurs when the support level is mixed in one of the parts. The teacher providing moderate support in one part and high- or moderate-level support in the other part defines moderate synergy. In order for synergy to be evident, the support should be consistent and of high quality in both parts. For instance, the teacher provided high support for reasoning in both parts in unit 2, and this was considered as providing synergy for supporting reasoning in unit 2.

Table 11
Defining Levels of Synergy

Level of Support in Part 1 | Level of Support in Part 2 | Level of Synergy
High support | High support | Synergy
Moderate support | High or moderate support | Moderate synergy
High or moderate support | Moderate support | Moderate synergy
Mixed support | High, moderate, mixed, or low support | Mixed synergy
High, moderate, mixed, or low support | Mixed support | Mixed synergy
Low support | High, moderate, mixed, or low support | Low synergy
High, moderate, mixed, or low support | Low support | Low synergy
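The decision logic in Table 11 can be summarized in a short function. The sketch below is an illustrative reading of the table, assuming low support takes precedence over mixed support when both occur; the function name is hypothetical.

```python
def synergy_level(part1, part2):
    """Map two part-level support summaries (Table 10) to a synergy level (Table 11)."""
    levels = {part1, part2}
    if "low" in levels:
        return "low synergy"
    if "mixed" in levels:
        return "mixed synergy"
    if levels == {"high"}:
        return "synergy"
    return "moderate synergy"  # at least one part moderate, the other high or moderate

print(synergy_level("high", "high"))      # synergy
print(synergy_level("high", "moderate"))  # moderate synergy
print(synergy_level("mixed", "high"))     # mixed synergy
```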
Creating Patterns- Comparing Synergy Across Units (Axial Coding). After examining the level of synergy in both units by applying the five steps of analysis (analyzing CER separately, classifying data by level of support, examining the level of support for each activity in part 1 and for each day in part 2, examining the level of support for each part, and defining synergy by comparing the level of support in each part for both units), the final step of the analysis focused on comparing the synergy across units to determine how the changes in synergy affected students' explanations. Questions investigated in this step were:

• How did the synergy level for supporting claims change between units?
• How did the synergy level for supporting evidence change between units?
• How did the synergy level for supporting reasoning change between units?

Focused Teaching Intervention Sessions

In the first session, the teacher coded students' explanations from the plants unit, and in the next session she coded explanations developed in another classroom focusing on water quality. During this process, I also discussed the coding with the teacher. In addition, I examined the quality of the explanations created by the teacher after she analyzed the data students collected during the plants unit. When investigating the quality of the explanation created by the teacher (see Figure 13), I used the rubric provided in Table 4. A professional transcriber transcribed the audio recordings of both sessions.

Table 12
Codes Developed for Analyzing Focused Teaching Intervention Sessions

Coding Agreement: The teacher agrees with the codes provided in the rubric
Coding Disagreement: The teacher disagrees with the codes provided in the rubric
Codes Claim: Teacher codes the claim without support
Struggles with Claim: Teacher struggles to code the claim
Instant Review: Teacher or researcher discusses the role of the instant review feature
Struggles with Reasoning: Teacher struggles when coding reasoning statements
Discussing Evidence: Teacher discusses evidence when coding reasoning statements
Discussing Scientific Principles: Teacher discusses scientific principles when coding reasoning statements
Reviewing Quality: The teacher or the researcher discusses the overall quality of explanations
Researcher Support: Researcher intervenes when discussing the quality of students' explanations
Teacher's Expectations: Teacher's expectations about the quality of students' explanations

To analyze the focused teaching intervention data, I developed the codes presented in Table 12. These codes helped me to investigate: (a) how the teacher analyzed students' explanations created in unit 1, (b) her disagreements with the coding rubric, (c) when she needed the researcher's support, (d) her initial ideas in relation to the instant review feature, and (e) what she thought about students' performance after coding their explanations from the plants investigation.

Step 1- Coding Focused Teaching Intervention Data (Categorizing Data). To code the audio data, another graduate student in the Curriculum, Instruction, and Teacher Education program at Michigan State University and I coded the entire data set and resolved differences in discussion meetings. Table 13 provides several examples of how the recordings were coded.

Table 13
Sample Coding- Focused Teaching Intervention Sessions

Quote:
Ms. Robinson: But most of our kids consistently found the ones they microwaved the most grew the most.
Researcher: Really?
Ms. Robinson: That was consistent with their data.
Researcher: OK.
Ms. Robinson: It doesn't make any sense.
Code:
Discussing Evidence: In this quote, the teacher discusses the evidence but does not include the scientific principles.

Quote:
Ms. Robinson: This is a 1, not even a 2?
Researcher: No, because they don't have any scientific principles, do they? They just mention the data and rankings.
Ms. Robinson: They just regurgitate that data, ok.
Researcher: So, it's good they have ideas but-
Ms. Robinson: But they don't explain it.
Code:
Coding Disagreement & Researcher Support: In this example, the teacher is not sure how to code, and the researcher intervenes to help her with the coding.

Step 2- Creating Patterns (Axial Coding). In order to construct patterns after analyzing the recordings of the focused teaching intervention sessions, I explored the following questions:

• How did the teacher analyze the students' explanations on day 1 and day 2?
• What disagreements did she have with the coding rubric?
• When did she need researcher support?
• What ideas did she have about the instant review feature in Zydeco?

On day 1, the teacher coded explanations created in unit 1. The goal of this activity was to familiarize the teacher with the coding rubric and to examine the level of students' performance in unit 1. On day 2, she coded explanations from another school district, because students there had created higher-quality explanations that included scientific principles when using Zydeco.
Because the teacher was not optimistic about students' performance after unit 1, the goal of day 2 was to increase the teacher's familiarity with the coding rubric and to present higher-quality explanations developed with Zydeco.

Teacher Interviews

To further explore the teacher's views about using the mobile devices and instructional strategies, I completed several semi-structured interviews (Glesne, 2011). Yin (2014) noted the importance of interviews as an "essential source of case study evidence" (p. 113). Yin (2014) also stressed that interviews cannot be the single data source; rather, they support the information coming from the various data sources. In this study, I primarily examined the relationship between teacher practices and students' scientific explanations (McNeill & Krajcik, 2008a), and I used interview data to investigate the teacher's position on using instructional strategies and the mobile devices to support students in constructing explanations during the two units. Table 14 presents the codes developed to analyze the interviews.

Step 1- Coding Interview Data (Categorizing Data). The chair of the dissertation committee and I designed the interviews (see Appendix E). I coded the teacher interviews with an emphasis on answering the following questions: (a) what does the teacher think about students' ability to create explanations before and after the intervention, (b) what are her ideas about providing instruction using various instructional strategies, and (c) what are her ideas about using the mobile application to support students in creating explanations.

Table 14
Codes Developed to Analyze Interviews

Previous technology use: Teacher discusses her previous experiences when using technology
Teacher's goal: Teacher discusses her goal in relation to the unit
Definition of explanation: Teacher's definition of explanation
Example of explanation: Teacher's CER example
Challenges of explanations: Teacher's challenges when supporting explanations
Quality of students' explanations: What the teacher thinks about the quality of students' explanations
Previous Zydeco experience: Teacher's ideas in relation to her previous Zydeco experience
Expectations from Zydeco: Teacher's expectations when using Zydeco
Challenges when using Zydeco: Teacher's challenges when using Zydeco
Previous science fair projects: Teacher summarizes experiences during previous science fair projects
Zydeco design: What the teacher thinks about the new Zydeco design
Involvement in the project: How the teacher explains her involvement in the project
Instructional strategies: How the teacher summarizes her use of instructional strategies
Zydeco experience: How the teacher summarizes the role of using Zydeco
Role of focused teaching intervention: Teacher's ideas in relation to the focused teaching intervention sessions
Behavioral issues: How the teacher discusses discipline issues

I analyzed all three interviews using this coding scheme. Later, the chair of the dissertation committee coded the interviews, and we compared our coding and resolved disagreements. Table 15 provides information about the coding process.

Step 2- Creating Patterns (Axial Coding). I interviewed the teacher before the first unit and at the end of each unit (see Appendix E). The questions guiding the analysis of these three interviews when creating patterns were:

• What does this teacher think about students' ability to create explanations before and after the intervention?
• What does the teacher think about using various instructional strategies to support students' understanding of the claim-evidence-reasoning framework before and after the intervention?
• What does the teacher think about using mobile devices before and after the intervention?

Table 15
Sample Coding- Interview Sessions

Quote:
Ms. Robinson: I really like the platform now (application design); I'm hoping that it lives up to my expectations. It looks very intuitive. I think the kids are going to be able to navigate it quite readily.
Code:
Zydeco design: Teacher was involved in the design process and believes the current design of the app is "intuitive."

Quote:
Ms. Robinson: Um, collecting data is not usually a problem, though. They don't always- organizing the data is difficult for some kids. Doing the experiments, not everybody likes to do that. But then making sense out of all of it later, drawing connections, and using that evidence to write explanations is challenging. Getting kids to write is challenging.
Code:
Challenges of explanations: Based on the teacher's experiences, students do not struggle when collecting data, but organizing and making sense of the data are challenging.

Quote:
Researcher: So how did the focused teaching intervention help you?
Ms. Robinson: You know, talking about it with someone else, looking at the examples of good and bad, and understanding the rating system myself, and seeing how you were rating them, and how you guys were looking at them. Um it gave me an understanding what we were missing.
Code:
Role of focused teaching intervention: Teacher acknowledged the role of the focused teaching intervention in supporting her practices.

The first interview focused on the teacher's previous experiences in using mobile devices and instructional strategies to support students in constructing explanations. I also investigated what she thought about the students' ability to create explanations based on her previous experiences. The interviews that occurred after units 1 and 2 examined the teacher's ideas about the students' performance while constructing explanations, and about the use of various instructional strategies along with the mobile application to support students in constructing explanations. When analyzing these interviews, I focused on defining the teacher's initial ideas in the pre-interview and how these ideas changed during the intervention.

Creating the Case Study (Selective Coding)

The reason for analyzing various sources of data was to create a rich description (Maxwell, 2005; Creswell, 2007; Yin, 2014) of the synergy that occurred when the teacher provided instructional support to aid students in constructing explanations (part 1: providing instruction) and when the students used the mobile devices to construct explanations (part 2: using mobile devices). After analyzing each data source separately (categorizing data and axial coding), four main categories were created (selective coding) to guide the comparisons across data sources: (a) the teacher's ideas and practices about providing instruction, (b) the teacher's ideas and practices about using mobile devices, (c) the quality of students' explanations and what this teacher thinks about how the students performed, and (d) how the level of synergy affects the quality of students' explanations. To investigate the synergy effect using these categories in both units, I created two smaller case studies. Yin (2014) defined these smaller stories as "case studies within a case study" (p. 167).
In this dissertation, the first mini-case study primarily focused on defining the influence of synergy on the quality of students' explanations in unit 1. The second mini-case study investigated the link between synergy and the students' explanations after the teacher participated in the focused teaching intervention sessions. In addition to exploring the relationship between synergy and the students' explanations in each mini-case study, I also analyzed this teacher's ideas, with a particular lens on what she thinks about her students' ability to create explanations and her ideas about using instructional strategies and mobile devices in both units.

After creating the two small case studies to examine the synergy and the quality of the students' explanations in unit 1 and unit 2, I focused on comparing how the changes in synergy affected the quality of students' explanations across the two units. Yin (2014) named this step "cross-case syntheses" (p. 167). To examine the synergistic effect of the teacher providing instruction about constructing explanations together with how she supported students when they were constructing explanations using the mobile application, I investigated the change in the quality of the students' explanations in both units. In this step, I first analyzed the quality of the students' claims, evidence, and reasoning; I then compared the students' scores with the synergy level for claim, evidence, and reasoning in both units. Connected with the research questions, the questions investigated under each mini-case study and the cross-case synthesis are presented in Table 16.

First Mini-Case Study: Synergy in Unit 1

To answer the questions listed in Table 16, I focused on four main categories (selective coding): (a) the teacher's ideas and practices about providing instruction, (b) the teacher's ideas and practices about using mobile devices, (c) the quality of students' explanations and what this teacher thinks about how the students performed, and (d) how the level of synergy affects the quality of students' explanations.

Table 16
Questions Investigated in Each Mini Case Study

Mini Case Study 1 Questions:
(1) What does this teacher think about providing instruction, using mobile devices, and students' ability to create explanations before unit 1?
(2) What does this teacher think about providing instruction, using mobile devices, and students' ability to create explanations after unit 1?
(3) How did students perform in unit 1?
(4) How does the teacher use instructional strategies to support students' understanding of the claim-evidence-reasoning framework in unit 1?
(5) How do the teacher's scaffolds and the scaffolds in Zydeco work together to support students in constructing explanations in unit 1?
(6) What is the level of synergy between providing instructional supports and using mobile devices in unit 1?

Mini Case Study 2 Questions:
(1) What does this teacher think about providing instruction, using mobile devices, and students' ability to create explanations after unit 2?
(2) How did students perform in unit 2?
(3) How does the teacher use instructional strategies to support students' understanding of the claim-evidence-reasoning framework in unit 2?
(4) How do the teacher's scaffolds and the scaffolds in Zydeco work together to support students in constructing explanations in unit 2?
(5) What is the level of synergy between providing instructional supports and using mobile devices in unit 2?
(6) What is the role of the focused teaching intervention in informing the teacher about her practices?

Cross Case Synthesis:
• How did the synergy level for supporting claims change between units?
• How did the synergy level for supporting evidence change between units?
• How did the synergy level for supporting reasoning change between units?
• How did students' claim, evidence, and reasoning scores change between unit 1 and unit 2?
When creating the first mini-case study, I made four comparisons: (a) examining what this teacher thought about students' ability to create explanations before and after unit 1; (b) focusing on how this teacher's ideas changed in relation to supporting the components of synergy (providing instruction and using mobile devices) by comparing her initial ideas with her ideas after completing unit 1; (c) comparing the teacher's ideas about supporting the components of synergy with how her practices (when providing instruction and using mobile devices) supported the components of synergy in unit 1; and (d) examining how the level of synergy affected the quality of the students' explanations.

Second Mini-Case Study: Focused Teaching Intervention & Synergy

The comparisons in the second mini-case study focused on three areas: (a) what this teacher thought about the students' ability to create explanations after coding students' explanations in the focused teaching intervention sessions, and how her ideas changed after unit 2; (b) examining this teacher's ideas in relation to providing instruction and using mobile devices, and comparing her ideas about supporting the components of synergy with her practices; and (c) how the level of synergy affected the quality of students' explanation statements.

Synergy Comparison

The two mini-case studies focused on examining how the teacher's ideas changed during each unit, and on the alignment of her instructional strategies. The final step of analysis focused on making comparisons across the two units with an emphasis on synergy. Tabak (2004) discussed that in order for synergy to happen, teachers' practices and technology scaffolds should work collaboratively; however, she also added that this will not necessarily create synergy unless the teacher makes synergy "an explicit goal for the enactment of the curriculum" (p. 329). Ms. Robinson was an ideal candidate since she had participated in various projects to design technology scaffolds, and she made creating explanations an integral part of her teaching. The last step of the analysis investigated the comparisons across cases (Boeije, 2002) by looking for patterns in the changes in the teacher's instructional practices in supporting students in the claim-evidence-reasoning framework in relation to changes in students' explanations (students' claim, evidence, and reasoning statements). For instance, after analyzing students' explanation scores (claim, evidence, and reasoning scores) in unit 1 and unit 2, I then focused on understanding the level of synergy that supported the students' explanations in unit 1 and unit 2 (see Table 16). This comparison helped me to analyze how the synergy level influenced the quality of the students' explanations. To find out how the changes in synergy affected the quality of students' explanations, this step compared:

• The synergy level for supporting claims in both units and the quality of students' claims in both units.
• The synergy level for supporting evidence in both units and the quality of students' evidence in both units.
• The synergy level for supporting reasoning in both units and the quality of students' reasoning in both units.

CHAPTER 6: RESULTS

The results chapter is organized into three sub-chapters: (a) mini-case study 1 explores synergistic scaffolding in the plants unit, (b) mini-case study 2 explores the level of synergy after the teacher participated in focused teaching intervention sessions, and (c) the cross-case analysis examines how the changes in the level of synergy affected the quality of students' explanations. This process was guided by making comparisons across four main categories (selective coding): (a) the teacher's ideas and practices about providing instruction, (b) the teacher's ideas and practices about using mobile devices, (c) the quality of students' performance and what this teacher thinks about students' performance, and (d) how the level of synergy affects the quality of students' explanations.

Mini-case study 1 explores the findings from the plants unit and reports on analyses derived from the pre-interview before unit 1, the teacher's practices in unit 1, the quality of the students' explanations in unit 1, and the post-interview after unit 1. Mini-case study 2 draws on similar sources, with the addition of the focused teaching intervention sessions. The cross-case analysis investigates how the changes in the level of synergy influenced the quality of students' explanations.

Mini-Case Study 1: Synergy in Plants Unit

Mini-case study 1 presents findings for unit 1 by investigating three main categories: the teacher's ideas about and practices when providing instruction, the teacher's ideas about and practices when using mobile devices, and the quality of the students' performance and what this teacher thinks about the students' performance. This process was guided by examining the questions listed below:

• What does this teacher think about providing instruction, using mobile devices, and students' ability to create explanations before and after the intervention?
• What does this teacher think about supporting the components of synergy and students' ability to create explanations after unit 1?
• How did students perform in unit 1?
• How does the teacher use instructional strategies to support students' understanding of the claim-evidence-reasoning framework in unit 1?
• How do the teacher's scaffolds and the scaffolds in Zydeco work together to support students in constructing explanations in unit 1?
• What is the level of synergy between providing instruction and using mobile devices in unit 1?

Pre-Interview

The first data source in this section investigated the following question: "What does this teacher think about providing instruction, using mobile devices, and students' ability to create explanations before and after the intervention?" When examining this question, my goal was to determine the teacher's goals in general, her previous experiences in supporting students' understanding of CER, and her experiences with Zydeco and other technology tools designed to support the explanation process. Ms. Robinson defined her goal as "to expose kids to scientific phenomena … conducting investigations and collecting data. And making connections to real world things, making those science connections." She also added that creating scientific explanations is "embedded" in her teaching "all the time."
When asked to define how she supported the explanation process, she made connections to the claim-evidence-reasoning framework. She noted an example in which students were engaged in creating explanations to determine physical and chemical change. The teacher supported students as they collected various types of data before creating their own explanations: "When we looked at physical and chemical change, we collect data about whether in the observations of whether there is a new substance, by collecting data on melting points, physical characteristics, chemical characteristics, to determine if something new has happened in a chemical change or physical change has occurred."

In the second part of the interview, the teacher was asked to describe her previous experiences when supporting explanations. Ms. Robinson reported being confident when supporting explanations, and she noted that she did not need any additional support related to improving her instructional practices. On the other hand, she was also willing to be introduced to current literature. For instance, she read one of the chapters from McNeill and Krajcik's (2012) book that focused on examining how to incorporate teaching strategies when implementing scientific explanations.

When describing student challenges, Ms. Robinson noted that students do not struggle when collecting data, but that organizing the data and making sense of the data are challenging for them. The teacher also added that reasoning is the most challenging part of writing explanations for students. When describing the quality of students' explanations during the previous science fair project, she emphasized that only a handful of students from a group of 150 completed explanations. She summarized the gap as: "Their explanations were really poor. They didn't cite the data. If they cited the data, they couldn't make connections in coherent sentences...They don't pull it all together. And don't see how they fit into a conclusion." She also added, "Even though they [students] had the framework and we had worked on it in class on paper, it was either too much work or they didn't understand it." A similar conclusion about students' experience with scientific explanations was noted by Gotwals and Songer (2010).

Before using Zydeco, the teacher had experience using laptops and personal digital assistants (PDAs). She summarized the affordances of these devices as annotating data and analyzing other students' data. The teacher had used the mobile application in the previous year, and she expected students to create better explanations after using Zydeco this year. As noted earlier, she participated in design meetings; she was also happy with the new version of the app that "streamlines everything" for the teacher by guiding data collection, data analysis, and creating explanations. When describing her role in the Zydeco project, she noted that she was an "integral part" of the project.

Despite the value of supporting the explanation process, Ms. Robinson also noted that having new students who had never used Zydeco could be a challenge. Besides the challenge of using Zydeco for the first time, the teacher also underlined that some students struggle when designing an experiment. But she added that having students work on similar experiments and share data would help them overcome this challenge.
Ms. Robinson provided ten experiment ideas (see Appendix I) for students to choose from during unit 1, and several students tested the same idea. The majority of the students focused on testing how the amount of light, the color of light, and the amount of water affect plant growth.

In summary, before the intervention Ms. Robinson was confident in her ability to provide instruction to support students in constructing explanations and in using mobile devices. She did not report any challenges of her own when supporting students' explanations, but she noted students' struggles in this process. When discussing the quality of students' explanations in the previous science fair project, she found that very few students had completed their explanations.

Examining Synergy in Unit 1

To investigate the synergy between providing instruction and using mobile devices, I first reported the teacher's practices when providing instruction; I then focused on her practices when students were creating their own explanations using mobile devices. After discussing the components of synergy, I examined the level of synergy in unit 1. Appendix N presents the summary of the coding when analyzing the teacher's practices in the plants unit.

Part 1- Providing Instruction. When scaffolding students' understanding of the CER framework, Ms. Robinson spent five partial days providing instruction as students continued to collect data related to their science fair projects. During these five days, Ms. Robinson created four different activities: (a) making connections with previous activities and defining CER, (b) modeling CER, (c) critiquing explanations, and (d) making connections with everyday explanations. In this sub-section, I reported on her practices separately for each activity and analyzed the following question: "How does the teacher use instructional strategies to support students' understanding of the claim-evidence-reasoning framework in unit 1?"

Activity 1- Making Connections with Previous Activities. In the first activity for scaffolding students' understanding of the claim-evidence-reasoning framework, the teacher focused on providing a definition of the CER framework by discussing the activities she had conducted previously. For example, she defined claim as a position and a statement, evidence as the data, and reasoning as the connection between the two: "When we make a claim, we take a position, and then what do we do with that claim…Evidence is generally what we know or what we're collecting as data… What was the reasoning? How did we tie that all together?" When discussing evidence, the teacher noted the connection with claim and the importance of having specific data values. Later on, she also underlined the role of including scientific principles in reasoning statements: "The reasoning is the science we understand behind it. The rationale, all right?"

When defining CER in a whole-class discussion, she also made connections to previous activities by creating several claims about them: "So the problem was, how can we get Fred in a life jacket if he's on top of, under the boat, and the life jacket is on top of the boat? All right, so if we said, we can save Fred, that's a claim, right? If you say 8-2 is the best homeroom, that's a claim, correct?" After discussing several pieces of evidence, she noted the connection between claim and evidence under reasoning: "What was the reasoning? How did we tie that all together?"
The teacher provided high-level support when defining claims and reasoning by describing the claim as a statement and underlining the role of scientific principles when creating reasoning statements. Under the evidence section, she only focused on reminding students of the need to add evidence. When she was making connections with previous activities, she again provided example statements as claims and noted the importance of making connections between claim and evidence under the reasoning section. Once again, when discussing evidence, she just focused on listing evidence.

In summary, the connections made by Ms. Robinson primarily focused on creating everyday explanations. These explanations did not make links with the plants unit. In activity 1, the teacher provided high-level support for claims and reasoning by primarily reviewing everyday explanations. However, the evidence section stayed at a low level, as she just reminded students about the need to add evidence.

Activity 2- Modeling CER. After collecting data in relation to the science fair projects, Ms. Robinson introduced the activity provided in Appendix L. This activity focused on examining plant structure. At the beginning of the activity, she read the question and defined the claim as a statement: "Our question is, is it a plant? So the first thing you do is, when you have a question like that, how do you change that to a claim? You change that to an affirmative statement, right? So is it a plant? How can we change that?"

Then Ms. Robinson modeled CER when discussing the activity. In the example provided below, she created her claim as "A fern is a plant." Then she added several pieces of evidence to support the claim, and she also discussed the scientific ideas related to the evidence. She focused on adding three pieces of evidence and explained the scientific principles by providing content-specific scaffolds: (a) roots absorb water and nutrients, and anchor the plant; (b) the stem transports water and nutrients; and (c) leaves perform photosynthesis. She stated the following:

Ms. Robinson: You'll have a worksheet of your own; there are several different possibilities. So I'm going to select fern. How many of you know what a fern is? A fern is a small plant that looks kind of like this. And it has very fine little leaves.
Student(s): Oh I know what that is.
Ms. Robinson: A lot of people grow them on their porches in the summer. They kind of have these big fronds that come off and hang down. Each of them has a stem and a lot of leaves. So a fern is a plant. So my claim is a fern, because I'm going to select this one, a fern is a plant. So what kind of evidence would I use that something is a plant? Say I want to take one of yours and say this is a plant. What would I use as evidence?
Student(s): The roots.
Ms. Robinson: The roots. So it has roots. So my evidence is it has roots. And what do the roots do?
Student(s): ::unclear::
Ms. Robinson: Anchor the plant.
Student(s): Absorb water.
Ms. Robinson: Anchor the plant and absorb water, water and nutrients, right. How many pieces of evidence are we going to use? 3. So, what's another piece of evidence we have?
Student(s): Leaves.
Ms. Robinson: It has leaves, ok. And what color are the leaves?
Student(s): Green.
Ms. Robinson: Green leaves. All right, what is a third piece of evidence?
Student(s): Stem.
Ms. Robinson: It's got a stem. So what do the leaves do?
Student(s): ::unclear::
Ms. Robinson: Ok they do photosynthesis right? And it has a stem, what does a stem do? … Transfers water, and ok, so it's a transport system so it has a stem … What else?
And it has a stem, what does a stem do? … Transfers water, and ok, so it’s a transport system so it has a stem … What else?   80     Student(s): Nutrients. Ms. Robinson: Food, right? It carries the sugars, too. We have our claim, we have evidence, and so what’s the reasoning? We go back to our claim. A fern is a plant because what? Student(s): Because it has leaves, roots, stems. Ms. Robinson: And because it has leaves it’s capable of doing photosynthesis. So, a fern is a plant because it’s capable of doing photosynthesis. It has roots and it has leaves. Therefore it’s a plant. That’s the reasoning … Ok so a fern is a plant because it has leaves that do photosynthesis. After modeling CER as a whole class activity, she summarized her explanations once again and then noted that the activity structure will follow the same order: So the claim is, a fern is a plant. The evidence is that we can see leaves, we can see a stem, we can see roots. The leaves are green, and we know our reasoning is that because it’s green we know it’s capable of photosynthesis and can transport materials. Therefore, it’s a plant. So you’re going to do the same thing, you’re going to select something other than fern, and you’re going to write a CER statement and we’re going to share them. Consistent with her model, she continued to support students writing claims as a statement (e.g. A tree is a plant) that answers a question. When discussing the evidence, the teacher primarily focused on reminding students to include data values. The teacher defined this process as including specific data. When discussing evidence, she also made connections with reasoning. As an example, the quote below shows the teacher’s discussion of one piece of evidence. Here the teacher supported student connection-making by examining the functions of roots and then asked students to add another piece of evidence: Ms. Robinson: What would be the second piece of evidence we know about grass? Student(s): It grows out of the ground. Ms. Robinson: So how is it related to the ground? How does it interact with the ground? Student(s): With the soil. Ms. Robinson: It has soil, ok. But what else? It has roots. So grass has roots. And the roots do what? Student(s): Store food. Ms. Robinson: Store food. Student(s): Maintain stability. Ms. Robinson: Maintain stability. And what else do they do? … Do they send nutrients or do they absorb nutrients? Student(s): Absorb.   81     …. Ms. Robinson: All right, what would be the 3rd piece of evidence? What else do we know about grass? At the end of the class, Ms. Robinson focused on critiquing the students’ explanations. In this process, if the students were missing details, she asked them to include more. For instance, one of the students made the following comment about evidence: “Evidence is that vines grow on brick walls sometimes.” Ms. Robison noted that the student had evidence but that was not enough. On the other hand, if the students had a complete explanation, she did not provide any additional support. One example (provided below) from a whole class activity, in which the teacher was critiquing the quality of the explanation, illustrates low-level support for claim, evidence and reasoning. This example also does not show the teacher providing content-specific scaffolds: Student(s): My claim is that a tree is a plant. My evidence is that a tree has roots in the ground, leaves for photosynthesis … My reasoning is a tree is a plant because it grows in the ground, and has leaves for photosynthesis. Ms. 
Ms. Robinson: Did she have a claim?
Student(s): Yes.
Ms. Robinson: Did she have evidence?
Student(s): Yes.
Ms. Robinson: Did she have reasoning?
Student(s): Yes.
Ms. Robinson: Good, so you’re next. So stand up and read yours.

Although Ms. Robinson provided low-level support for evidence in the previous activity, she focused more closely on discussing the evidence in the second activity. When critiquing explanations, if the students were missing pieces of the explanation, she highlighted them; but if the students had a complete explanation, the teacher did not provide any additional support to discuss why that was a good explanation. Overall, the teacher modeled claim as a statement that answers a question, focused on the value of including evidence and specific data measurements, and emphasized the role of including scientific principles in reasoning. With an emphasis on CER, the majority of her practices in activity 2 provided high-level support for claim, evidence and reasoning.

Activity 3- Practicing Zydeco & Critiquing Explanations. Unit 1 was the first time some students practiced using Zydeco. Several students had used the mobile application in the previous year, but some had no previous exposure. To help those students become familiar with creating explanations using the mobile tool, Ms. Robinson supported students in creating a sample explanation to answer the following question: “Is a seed alive?” During this process, she also used written scaffolds (see Appendix M). The teacher again defined claim as a statement based on scientific data, evidence as the data collected, and reasoning as the connection between claim and evidence. She also underlined the importance of including scientific principles:

Ms. Robinson: What is a claim?
Student(s): A statement.
Ms. Robinson: A statement about what?
Student(s): What you believe.
Ms. Robinson: What you believe about based on scientific evidence right? The evidence is the data you collect, and what is the reasoning? … You include the science that you understand, right? So you tie it all together, go back to your claim, cite your evidence, and you put the science in.

Once students completed their practice explanations, Ms. Robinson focused on critiquing the quality of the students’ explanations. To do this, she asked several students to read their explanations, and then she discussed these explanations with the students in the class. All the students had a complete claim during the explanation critique. Ms. Robinson primarily focused her critique on whether the students included three pieces of evidence and how they discussed it. In the example provided below, after listening to the students’ explanations, Ms. Robinson discussed the quality of the reasoning with the class. The question she asked checked for the existence of the claim; however, it did not prompt students to discuss the quality of the claim. Rather, she focused on the importance of having specific evidence and linking claim to evidence when discussing reasoning statements:

Ms. Robinson: (One of the students read her explanation.) Did she have a claim?
Student(s): Yes.
Ms. Robinson: Did she have specific evidence?
Student(s): Yes. She was talking about the earth.
Ms. Robinson: Talking about the earth. Ok. Did she have reasoning? And did it come back to her claim?
Student(s): Not the earth stuff.

In this example, students noted a gap in the reasoning when critiquing the explanation.
This was connected with Ms. Robinson’s definition of specific evidence, which focused on discussing the evidence instead of just listing evidence: “What is your evidence? It needs to be stated. You can’t just say show my evidence. You need to write a complete statement.” Students were paying attention to how the evidence was discussed when linking evidence and claim.

Although the teacher emphasized discussing evidence, there were some cases in which the teacher focused on key aspects of writing explanations and did not discuss evidence with the students. In the next example, Ms. Robinson focused on completing the checklist for the explanation: (a) having a claim, (b) including three pieces of evidence, and (c) discussing evidence using scientific principles. Because the students’ explanation included a discussion of content, the teacher did not provide content-specific scaffolds and only focused on reminding students of the link between evidence and scientific principles:

Student(s): Yes, a seed is alive. My evidence is a seed can develop into a plant, a seed can move around in the soil, and it needs water and nutrients to grow. My reasoning is that it’s alive because it can grow into a plant …
Ms. Robinson: Ok, did she have a claim?
Student(s): Yes.
Ms. Robinson: Did she have at least 3 pieces of evidence?
Student(s): Yes.
Ms. Robinson: Did she have reasoning that included the evidence and science?
Student(s): Yes.

When students were creating claims, the teacher supported the ideas of answering the question and creating a claim as a statement: “That’s the question again, right? You need to go back and make that a statement.” Since students did not have any issues with creating claims, the teacher did not provide additional support when critiquing explanations. Overall, the majority of the instruction focused on checking the existence of claims, so the level of support was low for claims. On the other hand, the teacher continued to highlight the importance of having data measurements and including scientific principles. A majority of her instructional practices provided high-level support for evidence and reasoning by noting the importance of discussing evidence and the connection between claim and evidence: “So then the evidence is your data that you collect to support that claim, right? … You have the evidence, but in the reasoning you have to go back and restate it. There has to be a rationale in there.”

In summary, during the first four days, Ms. Robinson focused on defining, modeling, and critiquing explanations. Critiquing explanations took place on two different days. Although the teacher devoted more time to the activity on the first day, she spent roughly ten minutes critiquing a couple of explanations on the second day. During the critiquing activity, she focused on helping students practice using Zydeco and then critiqued explanations. Later in the unit, she decided to create another activity to make connections with everyday explanations.

Activity 4- Making Connections with Everyday Explanations. To support students’ understanding of the CER framework, the last activity the teacher conducted was making connections with everyday explanations. In this activity, the teacher asked students to make five statements and pick two of them to create explanations: “You’re going to create 5 claims at your table that you’re all going to agree on.
Then each of you will pick two of those and write a claim, evidence, reasoning statement.” When creating claims, the teacher again focused on defining a claim as a statement that answers a question, and created a sample claim: “Detroit Pistons are the best NBA team.” When discussing the role of evidence, she noted the importance of having data values, but she did not provide detailed support for reasoning: “Now, evidence needs to be something like a record. Whether a team has a Heisman trophy winner. Whether there is a Cy Young award winner. A batting average. Number of interceptions thrown. Do you understand what I’m saying? It should be data. Not an opinion.” When discussing the evidence, the teacher also focused on the link between claim and evidence: “So what would be evidence to back up that Barack Obama is our best president? What makes you think he’s the best president?”

When making connections with everyday explanations, the teacher provided very few supports for reasoning, focusing primarily on the sentence count rather than the quality of the explanation: “Your reasoning should include 5 to 6 sentences.” Later in the class, she noted that students needed to discuss the link between claim and evidence in these sentences: “The reasoning, 5 to 6 sentences that you’re going to talk about your evidence, you’re going to come back to your claim, and you’re going to tie the rationale behind it.” Although she noted the link, her practices did not really support discussing the link:

Ms. Robinson: How many sentences do you need as part of your reasoning statement?
Student(s): 2.
Ms. Robinson: 2? No. How about a minimum of 5 or 6?

In summary, when making connections with everyday explanations, Ms. Robinson provided high-level support for claims and evidence. But for reasoning, she focused on the sentence count instead of underlining the connections between claim and evidence. Although there was high-level support for claim and evidence, her practices stayed at a low level of support for reasoning.

Unit 1- Part 1 Summary. As depicted in Table 17, when supporting students’ understanding of CER in unit 1, a majority of the teacher’s practices provided high-level support for claim, evidence, and reasoning (see Table 10 – 75% or more of the activities provided high support for each component). Although three activities provided high-level support for reasoning, Ms. Robinson’s support for reasoning decreased to a low level in the last activity in which she supported students’ understanding of CER. As a result, I concluded that the support level stayed moderate for reasoning. When the support level was low for one component, the support for the other components was at a high level. For instance, in activity 4, when making connections with everyday explanations, Ms. Robinson provided high-level support for claim and evidence, but the support stayed at a low level for reasoning. On the other hand, in activity 3, support for claim was low, but support for the other components (evidence and reasoning) was high.

Table 17
Examining the Level of Support in Unit 1- Part 1

                  Claim          Evidence       Reasoning
Activity 1        High-Level     Low-Level      High-Level
Activity 2        High-Level     High-Level     High-Level
Activity 3        Low-Level      High-Level     High-Level
Activity 4        High-Level     High-Level     Low-Level
Part 1 Summary    High Support   High Support   Moderate Support

Part 2- Using Mobile Devices. After completing unit 1, the students had two weeks to complete their science fair boards, which asked students to summarize their findings from unit 1.
When students returned, the teacher provided two days for creating explanations: students analyzed their own data on day 1 and another student’s science fair project data on day 2. On the second day of using mobile devices, Ms. Robinson noted this connection as follows: “Your claim, evidence, reasoning statement that you started yesterday (the explanation students created by analyzing their own data) is essentially your conclusion for your science fair project.” In this process, the teacher provided feedback to students while they were engaged in creating explanations with the mobile application. Part 2 investigated the following question: “How do the teacher’s scaffolds and the scaffolds in the Zydeco work together to support students in constructing explanations in unit 1?”

Day 1- Using Mobile Devices. On the first day of creating explanations, Ms. Robinson asked students to review their own data and create an explanation. When supporting students to create claims, she focused on reviewing the students’ science fair questions and helping them create claims: “Start by turning your question into a point, a statement. Is that your question? That was the big question, but what was your investigation question?” The driving question for unit 1 was “How do plants stay alive?” When creating explanations, Ms. Robinson guided students to their own science fair questions to create an accurate claim related to their data collection. Connected with her practices in part 1, she continued to remind students to include specific data: “By specific what do I mean? Details, numbers, days, number of days, actual measurements, that’s why we collected them as numeric data.”

When discussing the evidence students added, Ms. Robinson underlined the importance of having more data points instead of discussing only the first and last day: “What happened between Day 1 and Day 17? You don’t have any data for anything in between … It should be in chronological order. They stayed the same height? How much? What height were they?” In addition to noting the importance of having multiple data entries, she also focused on supporting students to add data measurements: “So, is this first plant the one that got the most fertilizer? Yeah, plant 1? So explain to me that it grew 15 centimeters and received how much fertilizer? And then plant 2 grew this in how much fertilizer it got, ok?”

Under the reasoning section, the teacher first reminded the students about the link between claim and evidence: “Make sure that when you get to the reasoning that you’re using complete ideas, that you’re referring to all of your data, and tying it back to your claim.” Connected with this idea, her individual discussions with students continued to focus on discussing evidence:

But the one that did grow, how much did it grow? And at what point did it die? And which one got which amount of acid, of pH? You had 3 plants. You need to talk about plants 1, 2 and 3 and how much what the pH of the water was for them, and how much water they got.

Later, she also added the importance of having scientific principles and the link between claim and evidence:

You’re going to look for evidence to support that. OK? In the form of your plant growing, maybe the one in the middle and the one that didn’t grow at all. And then your reasoning would explain that. OK? All right. The reasoning is why your evidence supports your claim and the science behind it.
This was the only instance in which the teacher noted the importance of scientific principles without providing content-specific scaffolds while students were creating explanations using the mobile application. When students were creating reasoning statements, the teacher also focused on the sentence count and the details in relation to the data: “And your reasoning statement should be a minimum of 8 to 10 sentences, grammatically correct, that are very specific. That means they are including the specific data you collected. Measurements, days, number of days, amount of water, etc. all right?”

Overall, the teacher reviewed students’ questions when creating claims; her main focus was on supporting students to include data values when discussing evidence. In this process, there were several instances in which she discussed specific data: “That’s how you do the reasoning. You just explain that, that the room temperature plant grew X amount over so many days, the one in hot temperature grew this much, and the cold temperature grew this much.” When discussing reasoning, there was only one instance during which she reminded students to include scientific principles without discussing specific principles. Since the teacher supported creating claims, discussing data, and including the link between claim and evidence, her practices provided high-level support for claim, evidence and reasoning on day 1. The support for reasoning was high, but it did not go beyond discussing the link between claims and evidence. Gotwals and Songer (2010) noted that the quality of reasoning is connected with the quality of the content and the quality of the explanation. On day 1, the teacher supported the quality of the reasoning by using generic scaffolds; she did not provide content-specific scaffolds to guide students in discussing the content.

Day 2- Using Mobile Devices. On day 2, the teacher asked students to finish the explanation they had started the previous day and write another explanation by reviewing another student’s science fair project. During this activity, Ms. Robinson’s support for claim shifted to reminding students about the need to start their second explanation: “Today’s assignment is to take someone else’s question, a question of your choice, and write a claim-evidence-reasoning statement for that one.” When focusing on evidence, she continued to remind students to include more data: “That’s not specific enough and you haven’t talked about the data … You talked about one day. We had 18, and you had 3 colors, so that is not sufficient.” Besides reminding students to include more data, she also continued to highlight the importance of having data measurements: “You need to include at minimum of 2 more days and 2 more measurements.”

When students were creating the explanations using other students’ questions and data, Ms. Robinson focused primarily on reminding students to complete the assignment: “You need to get your claim-evidence- and reasoning statements complete.” In this process, she continued to remind students to include specific data measurements: “Reasoning you need to be very specific, on what days, what was the height of the plant?” This example clearly indicates that the teacher put an emphasis on evidence without supporting the quality of the students’ explanations through content-specific scaffolds.
On day 2, the teacher focused on discussing data values; she did not support the students in making connections between claim and evidence, and she did not emphasize the importance of scientific principles. She noted that she would be checking the quality of explanations, but this reminder did not extend to supporting students through individual discussions: “I’ll be walking around and looking at your work, and I suggest you do something to your work, I expect it to be done … I’m going to expect them to be highly detailed.” In this process, her primary focus was on reminding students that the explanations they created were the summary for their science fair boards: “Your claim-evidence-reasoning statement that you started yesterday is essentially your conclusion for your science fair project.” Compared to day 1, Ms. Robinson’s practices when students were creating claims and reasoning statements shifted to primarily providing reminders, which resulted in low-level support. But she continued to provide high-level support for evidence, since she focused on the importance of including content-specific data measurements.

Unit 1- Part 2 Summary. When using mobile devices to create claims, the teacher supported students in reviewing their science fair questions and creating a claim as a statement on day 1, but she only provided reminders on day 2. As summarized in Table 18, her practices regarding claims provided high-level support only on day 1. Similar to supporting claims, the level of support for reasoning shifted to providing reminders on day 2 (see Table 18). On day 1, her practices for reasoning primarily addressed creating connections between evidence and claims, which constituted high-level support. The teacher supported students in discussing data values on both days, and her instructional practices about evidence stayed at a high level on both days (see Table 18).

Table 18
Examining the Level of Support in Unit 1- Part 2

                  Claim              Evidence       Reasoning
Day 1             High-Level         High-Level     High-Level
Day 2             Low-Level          High-Level     Low-Level
Part 2 Summary    Moderate Support   High Support   Moderate Support

Defining Level of Synergy in Unit 1. After examining the level of support in both parts, this sub-section focuses on defining the synergy in unit 1 by examining the following question: “What is the level of synergy between providing instructional support and using mobile devices in unit 1?” As depicted in Table 19, when providing instruction, a majority of the teacher’s practices provided high-level support for students’ understanding of claim, evidence, and reasoning, but the teacher provided low-level support for reasoning in the last activity. When students were creating their own claims using mobile devices, Ms. Robinson’s practices on day 1 provided high-level support, but this shifted to low-level support on day 2, when she only reminded students to include claims. Since one of the parts remained at a moderate support level, the level of synergy for supporting claims was moderate (see Table 19).

When supporting students’ understanding of the evidence component, the teacher’s instructional support continued at a high level when students were using mobile devices to create their own explanations. This showed strong synergy in the supports for evidence. Ms. Robinson supported students in discussing specific data values on both days. Although the support level decreased for claim and reasoning (see Table 19), it stayed at a high level for the evidence component.
When students were creating their own explanations, Ms. Robinson primarily supported connections between claim and evidence, but this shifted to providing reminders on day 2. Her support for reasoning stayed at a moderate level, creating moderate synergy for supporting reasoning in unit 1.

Table 19
Examining Level of Synergy in Unit 1

             Activity 1   Activity 2   Activity 3   Activity 4   Day 1      Day 2      Level of
             (Part 1)     (Part 1)     (Part 1)     (Part 1)     (Part 2)   (Part 2)   Synergy
Claim        HLS          HLS          LLS          HLS          HLS        LLS        Moderate-Synergy
Evidence     LLS          HLS          HLS          HLS          HLS        HLS        Synergy
Reasoning    HLS          HLS          HLS          LLS          HLS        LLS        Moderate-Synergy

Note. HLS: High-Level Support; LLS: Low-Level Support.

Quality of Students’ Explanations

The quality of students’ claims, evidence, and reasoning was analyzed to answer the following question: “How did students perform in unit 1?” A summary of the coding for students’ explanations developed in the plants unit can be found in Appendix O. As presented in Table 20, when creating claims, almost all explanations (50 explanations) included a complete and accurate claim. Only eight explanations did not include an accurate claim. Of these eight, seven repeated their question (e.g. Does the color of light affect how a plant will grow?), and one created an inaccurate claim (e.g. No the color of light didn’t affect the plant growth.). After creating claims, only one explanation did not include any evidence. Fifty-seven explanations included more than one piece of evidence (see Table 20).

Students scored lowest on reasoning. Almost half of the explanations (twenty-five explanations) did not contain a reasoning statement that discussed scientific principles or the link between claim and evidence (see Table 20). Of these twenty-five, three did not include any statements. Since the teacher’s support primarily focused on including data values, some students just discussed data measurements without adding links between claim and evidence. The following quote illustrates this point: “The potting soil grew the longest. The dirt did not grow at all and the sand grew just a little. All three of the seeds were watered with 3 drops. The sand grew to 1.7 and the potting soil grew to 14.9.”

Besides missing the link between claim and evidence, some students included an inaccurate link between claim and evidence. In the example provided below, the student reached an inaccurate statement by noting that the plant in the cold temperature grew the tallest. In this example, the use of inaccurate evidence led to a wrong connection between claim and reasoning:

Temperature does affect plant growth because plants can grow in certain types of conditions. In my experiment the plant that was in the cold grew the longest and the plant in the warm area grew also. The only plant that died was the one that was in room temperature. The plant in cold temperature grew the tallest. The plant in the hot temperature was the second tallest. So I thought that the temperature affected the hot and cold temperatures.

Almost half of the explanations (twenty-seven explanations) focused on the link between claim and evidence. In the previous example, the student reached an inaccurate conclusion, whereas the following example presents an explanation with the correct link between cold temperature (no sunlight) and hot temperature (sunlight):

In the end, the amount of sunlight can affect plant growth. The plants with sunlight grew faster and longer… On the final day of measurement one plant was 20cm tall. Another plant had grew 18cm tall. Then my last plant with no sunlight was 15cm tall.
Of these fifty-eight explanations, only six included scientific principles, though none of them were complete. These six explanations did not tie the scientific principles to photosynthesis. One of these explanations noted the importance of water with an emphasis on plant functions but never discussed the role of water in photosynthesis: “The amount of water affects plant growth because plants need water to carry out bodily functions. Plants use water to grow and reproduce. If plants don’t have water they will die of dehydration and shrivel up until they die.”

Overall, the students created claims and included evidence in their explanations, but the reasoning was incomplete and, as such, was a challenge for them. One possible reason for this is that the teacher focused on sentence count and including data measurements without providing content-specific scaffolds to discuss scientific principles. As such, almost half of the explanations failed to include a reasoning statement, and only six explanations included scientific principles.

One final point is the total number of explanations students created. Thirty-eight students completed fifty-eight explanations. Although Ms. Robinson asked students to create two explanations, only nineteen students completed this task. Of these nineteen, one created three explanations. The remaining nineteen students created only one explanation (18 × 2 + 1 × 3 + 19 × 1 = 58).

Table 20
Quality of Students’ Explanations in Unit 1

Score   Claim   Evidence   Reasoning
0       8       1          25
1       50      --         27
2       --      --         6
3       --      57         --
        N=58    N=58       N=58

Unit 1 Interview

Immediately after unit 1, I interviewed the teacher to determine (a) what she thought about students’ ability to create explanations, (b) her ideas in relation to using instructional strategies, and (c) her ideas in relation to using the mobile application. This section focused on answering: “What does this teacher think about providing instruction, using mobile devices, and students’ ability to create explanations after unit 1?”

Ms. Robinson summarized her goal during the first unit as paying particular attention to improving the content when creating science fair investigations: “To learn about what things are involved in plant growth. So, they (students) learn the content as we went through different investigations in class, we captured data they could annotate, to help them answer questions, big questions.” During unit 1, she believed that she provided numerous opportunities to scaffold students’ understanding of CER when providing instruction. She also added that students had prior experiences before unit 1: “We had done it [writing claim-evidence-reasoning statements] at the beginning of the school year, and about half of my kids would spend a lot of time on it last year. So they had a lot of practice, prior practice.”

When students were creating their own explanations, the teacher noted that providing feedback was essential for supporting students: “They [students] just wanted to say, ‘Look at my pictures.’ Some of them hadn’t collected data the way they were asked to so they didn’t have the data to refer to. And I think they always struggled with the reasoning, pulling it back together.
It’s hard for a lot of kids.” Although she noted students’ struggles with reasoning, the teacher thought that students understood the content: “They [students] did learn a lot about plants and plant structure and what plants need to survive.”

During this process, the teacher noted that the mobile application supported students in collecting and organizing data during the science fair projects: “I really like using Zydeco with this in the classroom and collecting data. We could do a complete project from start to finish, they could translate it to a science fair board and use electronic device to do it. We could keep all the data and go back to it. I think it was a really nice way of organizing it and helping the kids organize it.” The teacher also added that students were more successful in their science fair projects than in the previous year’s projects due to the support coming from the mobile devices. In the pre-interview, she recalled that only a couple of students had completed explanations when creating their science fair boards in the previous year, and she noted that there were more complete projects this year: “Overall the conclusions are better.” Although there was an improvement, the teacher was not happy with the quality of the explanations on the science fair boards.

Unit 1 Summary

Before and after unit 1, the teacher noted that creating reasoning statements would pose a challenge for students. Before the unit, she stressed that she did not need support for providing instruction about explanations, and she was comfortable with using mobile devices. During unit 1, she provided multiple opportunities to support students’ understanding of the CER framework. As depicted in Table 19, the majority of the teacher’s instructional practices provided high-level support during part 1. Although the teacher was aware of the challenge in reasoning, her support for reasoning only created moderate synergy in part 2 (see Table 19), which in turn influenced the quality of students’ explanations. Moreover, when students were creating their explanations, the teacher primarily focused on supporting the link between claim and evidence on the first day of using mobile devices. She noted the importance of including scientific principles only once over the two days, and she did not provide content-specific scaffolds (e.g. how temperature affects photosynthesis). On the second day of using mobile devices, she shifted her focus to reminding students to include reasoning statements. The lack of support for reasoning is connected to students’ products. Almost half of the explanations (twenty-five explanations) did not include a reasoning statement that linked claim and evidence. When explanations did create this link (thirty-three explanations), only six included scientific principles. The teacher was aware that students struggled with reasoning when writing explanations before and after unit 1, but she did not provide extra support for the students to incorporate reasoning into their explanations on either day. Her instructional practices were at a high level when supporting students’ understanding of CER, but when students were creating explanations, the teacher provided high-level support for reasoning on only one day. The lack of emphasis on supporting students while they were creating explanations is evident in the following exchange.
In this example, the teacher was discussing a student’s science fair board, which summarized the student’s science fair project. She reminded the student that she had supported them in writing explanations, rather than continuing to support the student: “Isn’t this the claim-evidence- and reasoning which is the conclusion? So why wouldn’t you put it on your board … Why wouldn’t you? We’re doing it in class and I’m structuring it or scaffolding it, so I can actually help you. Why wouldn’t you put it on there?” Similar to her comments during the interview after unit 1, the teacher believed she had provided enough opportunities that the students should be able to complete the explanation process.

Although the level of support for reasoning was linked to the quality of the students’ products, it did not influence the quality of claims. Ms. Robinson’s support for claim also decreased, but almost all explanations (fifty) included high-quality claims. In this process, there was also support coming from the mobile devices beyond reminding students of the driving question and science fair questions (see Figure 10). Ten years earlier, when examining teacher practices, Lizotte, McNeill and Krajcik (2004) reached a similar conclusion, noting that teacher practices did not play a significant role in the quality of students’ claims.

Finally, there is another important point to note in unit 1 related to attendance. Only thirty-eight students out of a sample of fifty-four completed an explanation. After scoring these students’ explanations from unit 1 and finding a huge gap in the students’ reasoning statements, I decided to help the teacher understand what was missing in the students’ explanations by organizing two focused teaching intervention sessions before starting unit 2. Due to Ms. Robinson’s previous experience in supporting explanations and using mobile devices, I did not provide any additional support before the intervention (unit 1).

Mini-Case Study 2: Focused Teaching Intervention & Synergy in Water Quality Unit

The second mini-case study examines the level of synergy after participating in focused teaching intervention sessions by focusing on several main categories: (a) the teacher’s ideas about and practices regarding providing instruction, (b) the teacher’s ideas about and practices with using mobile devices, and (c) the quality of students’ explanations and what this teacher thinks about the students’ performance. With the addition of the focused teaching intervention, my primary focus was on examining what the teacher thought about students’ performance after analyzing explanations from unit 1. Analyzing the focused teaching intervention data provided additional valuable information (e.g. the teacher’s disagreement with the coding rubric, her ideas in relation to the instant review feature), but only one category (what this teacher thinks about how the students performed) was used to make comparisons across data sources. The second mini-case investigated the following questions, designed in conjunction with the research questions:

• What does this teacher think about providing instruction, using mobile devices, and students’ ability to create explanations after unit 2?
• How did students perform in unit 2?
• How does the teacher use instructional strategies to support students’ understanding of the claim-evidence-reasoning framework in unit 2?
• How do the teacher’s scaffolds and the scaffolds in the Zydeco work together to support students in constructing explanations in unit 2?
• What is the level of synergy between providing instruction and using mobile devices in unit 2?
• What is the role of focused teaching intervention in informing the teacher about her practices?

Focused Teaching Intervention Sessions

The goal of the focused teaching intervention sessions was to investigate: (a) how the teacher analyzed the students’ explanations created in unit 1, (b) her disagreements with the coding rubric, (c) in which areas she needed the researcher’s support, (d) her initial ideas in relation to the instant review feature of the digital application, and (e) her ideas about the students’ performance after coding their explanations from the plants investigation. This sub-section discusses findings from each day of the focused teaching intervention separately and examines the following question: “What is the role of focused teaching intervention in informing the teacher about her practices?”

Focused Teaching Intervention- Day 1. When coding the nine explanations provided in Appendix G, the teacher did not have any problems analyzing claims. She coded claims in agreement with the coding rubric. However, when coding reasoning statements, there were some instances where the teacher disagreed with the researcher’s coding. In the example provided below, the student noted that the amount of light affected the plant growth, and that the plants staying at the windowsills were competing for sunlight:

My plant on the windowsill in class had to compete for sunlight while the bigger plants took all the sunlight. My plant on the shelf was getting the faintness of light from the fluorescent lights gave little support to my plant … The plant in the closet when it was here in the photos I seen that it has stayed the same size.

When analyzing this explanation, Ms. Robinson was not sure how to code the reasoning statements, and the researcher supported her in reconsidering her coding. Although the student provided adequate evidence to discuss how the amount of light affects plant growth, the evidence was never tied to a scientific principle (for example, through a discussion of the effect light would have on photosynthesis):

Ms. Robinson: I would give that a 2.
Researcher: Why?
Ms. Robinson: Because it talks about competing for sunlight and the amount of sunlight. And that once they change circumstances, and they weren’t competing for light anymore, they got it. And the fact that the one in the closet… how did you guys code it?
Researcher: We gave it a 1, actually.
Ms. Robinson: That would be what I would say, that it has some reasoning.
Researcher: There is competition, but here…
Ms. Robinson: Well but if it’s being blocked by other… So you’re just saying that there is reasoning without principles?
Researcher: So here, actually, the reason we gave it a 1 was because they don’t really have the scientific principles. They don’t really say what light means for the plant growth.
Ms. Robinson: OK.
Researcher: They have some ideas, but…
Ms. Robinson: But they don’t explain the importance of it.
Researcher: Yes.
Ms. Robinson: All right.
Researcher: The scientific principle about light is absent.

Although there were several nuances in the coding of scientific principles, Ms. Robinson did a good job identifying them, and she expected students to include complete scientific principles.
In one of the instances, the researcher noted that when coding explanations, students were credited when they included some scientific ideas: “If it has some scientific principles, so we usually give credit for that.” Ms. Robinson noted that she would not give any credit for that: “All right. I wouldn’t, but OK.” After coding all the explanations, the researcher summarized the students’ progress as follows: “They were able to create claims … Look at this one. There is a paragraph. They are writing paragraphs, really long sentences, but they don’t really represent their thinking.” When summarizing this gap, the teacher was aware that students were creating explanations but that there were pieces missing: “But they’re real close, I mean, they’re really close.”

Besides investigating what this teacher thought about students’ explanations and the teacher’s disagreements with the coding rubric, I also analyzed an explanation created by the teacher during the first focused teaching intervention session and shared this analysis with the teacher for discussion. The teacher’s explanation, presented in Figure 14 (see chapter 4), had a complete claim (the color of light does affect plant growth), and she included three pieces of data. When creating the reasoning statement, she added scientific principles: “Plants require the sun to perform photosynthesis to produce food for the plant to grow. Sunlight contains the entire spectrum of colored light; however, red and blue light are particularly usefully for chlorophyll in the leaves to perform photosynthesis.” It is good to add the scientific idea that chlorophyll absorbs red and blue light because of their wavelengths. However, this explanation is missing a connection between these principles and the data the teacher cited: “Looking at my data the plant in green light grew the least at a total of 26 cm in 18 days, and the plant in blue light grew in 24.5 cm in 18 days and the plant in red light grew a total of 29 cm in 18 days.” The teacher referred to regular daylight as the green light; as can be seen in the pictures, one of the plants was covered with a colorless cover. In the teacher’s explanation, there is a missing link between the scientific principles and the data. The data suggest that the plant in blue light grew less than the plant left under regular daylight, but the scientific principles discussed by the teacher suggest that blue light should support plant growth more. The gap in the teacher’s explanation is connected with her disagreements with the coding rubric. When analyzing one of the students’ explanations, Ms. Robinson did not analyze the explanation as a whole; she was searching for scientific principles, and the researcher suggested that she analyze the explanation as a whole by also evaluating the connections made with the evidence added by the students.

Focused Teaching Intervention- Day 2. Similar to day 1, the teacher did not have challenges when coding claims. On the second day, there were examples of fine-tuning with the rubric to assess students’ explanations, in which the teacher showed a better understanding of the coding rubric and paid more attention to the connections between claim and evidence. (The explanations coded by the teacher can be found in Appendix F.) These examples were created by students in another school who used Zydeco to test water quality. The chair of the dissertation committee and I selected nine examples, ranging from poor to high quality, for practice.
In the fine-tuning example provided below, the researcher supported the teacher in reviewing her coding. In this example, the student rated water quality by focusing on DO and invertebrate data, but did not include any scientific principles:

The DO is 8ppm, and the temperature is about 22 degrees Celsius. This means that the saturation is 92%, which is the saturation rate of a 4 (excellent). Also, someone from our class hour found some macro-invertebrates. These were Mayfly and Caddisfly larvae. These bugs when we looked at our sheet are good bugs to have in water and if you have them in water, your water is of good quality.

When coding this example, the teacher initially noted that the students used incomplete principles. After discussing this point with the researcher, the teacher concluded that there were no scientific principles; the student just repeated the data by finding the level of saturation. To differentiate ranking indicators from scientific principles, rankings were described as including ideas in relation to the water quality (e.g. excellent, poor):

Researcher: Let’s see what they do under the reasoning.
Ms. Robinson: So they talk about the macro invertebrates, but I don’t see those there. I would give this a 2 because they don’t complete the idea of the principle behind why these are good.
Researcher: But does it have a scientific principle, or just an idea?
Ms. Robinson: There is an idea, but no scientific principle. Well they start out with it, it’s here, it’s part of the oxygen saturation, but…
Researcher: But here like with the data we discussed, the principle or they just make finding saturation level?
Ms. Robinson: They’re just finding the saturation level. There is no principle.
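As a clarifying aside on the arithmetic behind “finding the saturation level” (a sketch on my part; the solubility figure below is a standard reference value for fresh water at sea level, not a value reported in this study): dissolved-oxygen saturation is the measured DO concentration divided by the temperature-dependent solubility of oxygen in water, which is roughly 8.7 mg/L at 22 °C. Taking 8 ppm as approximately 8 mg/L,

\[
\text{saturation} = \frac{\text{measured DO}}{\text{DO}_{\text{sat}}(T)} \times 100\% \approx \frac{8\ \text{mg/L}}{8.7\ \text{mg/L}} \times 100\% \approx 92\%
\]

This reproduces the 92% the student reported; the rating of “4 (excellent)” would then come from the class’s rating sheet rather than from a scientific principle, which is consistent with the teacher’s final coding.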
When analyzing another example, the teacher continued to discuss connections between evidence and scientific principles: “And they kind of allude to it here. But they don’t go in and talk about if they had said without dissolved oxygen in the water; there isn’t oxygen for fish or plants. Then that would be a principle.” In the examples provided in the first focused teaching intervention session, the researcher was more involved in helping the teacher understand the importance of linking the reasoning to the data. In the second session, the researcher supported the teacher, but this time the teacher was more active. She analyzed the entire explanation to find out how students ranked indicators to describe the water quality and checked for the use of scientific principles to explain the indicators.

Similar to the previous session, there was another instance in which the teacher noted her high expectations for the students to include scientific principles. In the following discussion, the researcher again noted that it would be hard to expect students to discuss all the scientific principles related to all the indicators they listed in their explanations; however, the teacher expected students to explain all of their data:

Researcher: We really didn’t expect them to code everything in great detail, because here they’re talking about four different indicators and it’s so hard to expect them all.
Ms. Robinson: But you know that’s what we have been pushing for, what I’ve been pushing for, is for them to explain all their data.

At the end of the second session, Ms. Robinson was introduced to the instant review feature of the mobile application. The researcher noted that by clicking on the students’ names, the teacher could monitor the progress of their explanations. Ms. Robinson used this feature to review a couple of sample explanations created by the research team, noting that this feature would “be helpful” in the second unit.

Focused Teaching Intervention Summary. The first focused teaching intervention session helped the teacher realize that students were creating long reasoning statements that did not discuss scientific principles or the link between claims and evidence. When coding explanations in the first session, the teacher had several disagreements with the coding rubric, but this switched to fine-tuning with the rubric in the second session. The explanation created by the teacher presented missing links between data and reasoning; in the second session, Ms. Robinson was actively searching for these features and connections. Finally, the teacher was pleased with the addition of the instant review feature in the Zydeco application; both focused teaching intervention sessions also revealed the teacher’s high expectations, which were not discussed during the interviews.

Examining Synergy in Unit 2

Similar to unit 1, I first analyzed the teacher’s practices when providing instruction and then reported on her instructional practices when students used the mobile devices. Finally, I investigated the level of synergy in unit 2. A summary of the coding analysis of the teacher’s practices during the water quality unit can be found in Appendix P.

Part 1- Providing Instruction. This section focused on the following question: “How does the teacher use instructional strategies to support students’ understanding of the claim-evidence-reasoning framework in unit 2?” In the second unit, Ms. Robinson created two activities to support the students’ understanding of CER: modeling CER and critiquing explanations.

Activity 1- Modeling CER. When the teacher started the water quality unit, the ducks in the school pond were dying; the teacher took advantage of this unplanned circumstance to contextualize the modeling of CER. She started the activity by noting that the ducks were starting to die. After creating a claim in the discussion, she presented several possible explanations, primarily focusing on the negative effect of fertilizers. In this process, the teacher provided specific data and noted the importance of connecting the claim and evidence when creating reasoning statements:

Ms. Robinson: We do know that we had 10 ducklings at the beginning of the week. Today is Friday. There is 1 duckling left. 8 of them died yesterday. They were kind of, one right after the other. We were out walking around after lunch so after I had seen you, and we were looking and he says, “you know, this looks like fertilizer to me”. So there are those fine, kind of whitish little particles all over in the grass that are probably fertilizer that somebody else said, “Yeah, I think somebody else probably fertilized the courtyard by my room.” So, fertilizer was probably put down on the grass and a lot of times, fertilizer also has a pesticide in it. Pesticide to kill insects. So what do we know? What is our evidence?
Student(s): We have the fact that each day more ducklings died.
Ms. Robinson: We have the fact that ducklings were dying. We have the fact that I cleaned the pond out, right? Sort of? We’ve also had rain, and we have little granular things on the ground, and people say “yep they came and fertilized.” So make a claim, and then reasoning. Who can explain to me what the reasoning is?
Student(s): It’s what ties the claim and evidence together.
Ms. Robinson: Exactly, it’s what ties the claim and evidence together. It explains why you use the evidence you did in order to support your claim. All right?

When modeling CER, Ms. Robinson asked students to complete a sample explanation by using the activity provided in Appendix C. When introducing the activity, she defined a claim as a statement: “A claim would be whether you think the pond is healthy or not.” She reminded students to add three pieces of evidence, and highlighted the importance of discussing evidence using scientific principles:

You need to have the scientific principles that connect your claim to the evidence, why you use that evidence. Okay? So what’s the science, the reason, what’s your thinking, your justification? You know what the word justification means? How you rationalize your decision to use that data to support your claim. So that’s the reasoning.

In summary, when discussing evidence, the teacher noted the importance of including pieces of specific evidence, which resulted in high-level support. Similarly, her support for claim and reasoning stayed at a high level through content-specific scaffolds. When supporting claims, she continued to define a claim as a statement that answers a question. Under reasoning, she focused on the connection between claim and evidence and also noted the importance of including scientific principles.

Activity 2- Critiquing Explanations. When critiquing explanations, the teacher used the explanations coded during the second focused teaching intervention session. In this process, she distributed a copy of the explanations to each student and then focused on critiquing the explanations.

When defining the critiquing process, Ms. Robinson noted the importance of having a claim, explaining the evidence, and including scientific principles to connect the evidence to the claim: “You’re looking for a claim. You’re looking for an explanation of the evidence, and the rationale, the science principles that explain why you selected that.” Connected with this definition, when discussing claims, Ms. Robinson primarily focused on reminding students of the need to have a claim. After reading a claim, she asked the group, “Is that a claim?” without defining a claim. When discussing the evidence and reasoning, she focused on reminding students about the importance of having specific data measurements and highlighting the importance of using scientific principles to justify the use of the evidence: “We’re looking for specifics about their data, and we’re also seeing justification for why they use it, the science principles.”

Putting an emphasis on discussing the data and including scientific principles increased students’ awareness of the importance of justifying their data with scientific principles. In the example provided below, students noted that the sample explanation discussed the data using scientific principles, and Ms. Robinson then used generic scaffolds to push students to take their explanations to the next level:

Ms. Robinson: They have 6 pieces of data, right? And they have specific tests. So they have their evidence. Here is the reasoning: “I believe my claim to be true because the D.O. is 4 parts per million and the temperature is around 23 degrees Celsius. So the saturation level is above 47%, which is poor quality for water. The river has no traces of nitrates.
The phosphate level was 2 parts per million, which is good, but ours is excellent. For the pH, our part shows that there is about 7 parts per million, which is excellent for water. There is trash in the water, and the water is hard to see through.” So, what have they done in the reasoning? What is there?
Student(s): They explained their data.
Ms. Robinson: They explained their data. Excellent. They really explained their data. But did they take it to the next step and say “I am using this data to support my claim because…” Do they make that connection?
Student(s): No.
Ms. Robinson: No, they did a really good job of explaining why they selected their data. But they didn’t take it to the next level. You guys are for the most part right here. You make good claims, you collect and select the right data, you even explain your data, but you just don’t explain why you selected that data for your claim.

In the next example, when discussing the reasoning statement, students were able to identify the principles by discussing the role of acidity. Ms. Robinson then helped the students see the use of other scientific ideas by pointing to the discussion of phosphate, providing content-specific scaffolds:

Ms. Robinson: “So phosphate is a nutrient, needed for plant and animal growth. The phosphate in West Park’s water was a fair amount so plants and animals can survive. pH is the measure of acidity that is in the water, so a large amount of pH is bad for fish, which means the water is very acidic and that it might kill all of the fish.” So have they talked about their data? What else have they provided us here?
Student(s): They told us the science principles.
Ms. Robinson: They told us the principles. What would be an example of the principle? Look at that. Tell us what the principles are here.
Student(s): They said that a large amount of pH is bad for the fish.
Ms. Robinson: There is one. There is one more in that sentence, too. The first sentence. Phosphate is a nutrient needed for plant and animal growth. That also is the science behind this, it’s a science principle, correct?
Student(s): Yes.

In summary, Ms. Robinson’s practices for the claim provided only low-level support, since she primarily focused on checking the existence of claims. She supported discussing data and including scientific principles, which resulted in high-level support for evidence and reasoning.

Unit 2- Part 1 Summary. Table 21 presents a summary of the instructional supports provided by the teacher in unit 2 to support students’ understanding of CER. Ms. Robinson provided high-level support for claims in activity 1, but this shifted to checking the existence of claims during the critiquing activity. Since the examples included claims (see Appendix F), the teacher did not provide any additional support for claims in activity 2. The level of support for evidence and reasoning stayed at a high level in both activities through her emphasis on discussing data and including scientific principles.

Table 21
Examining the Level of Support in Unit 2- Part 1

                  Claim              Evidence       Reasoning
Activity 1        High-Level         High-Level     High-Level
Activity 2        Low-Level          High-Level     High-Level
Part 1 Summary    Moderate Support   High Support   High Support

Part 2- Using Mobile Devices. In unit 2, when using mobile devices to create explanations, the teacher devoted a little more time (an additional half class) to writing explanations compared to unit 1.
Part 2 focused on answering: “How do the teacher’s scaffolds and the scaffolds in Zydeco work together to support students in constructing explanations in unit 2?”

Day 1- Using Mobile Devices. In the first half of the day, Ms. Robinson critiqued explanations created in another school, using this part of the period to provide additional instruction. During the second half of the period, she asked students to review the data they had collected and create a scientific explanation. In this process, she did not provide individual support; instead, she used the instant review feature to provide feedback during whole-class discussions, reviewing several explanations. In the example provided below, she again focused on reminding students of the need to have a claim, to discuss the data (e.g. saturation level, temperature), and to use scientific ideas to justify the role of oxygen and phosphate:

Ms. Robinson: Did they have a claim?
Student(s): Yes.
Ms. Robinson: Did they have evidence? Is the reasoning complete? Did they explain their evidence and why they were important?
Student(s): No.
Ms. Robinson: No, what do they need to add? … So they talked about the oxygen saturation, and why the animals need oxygen. They talk about the phosphate and why phosphate needed. Temperature, and why it’s important. And turbidity, and why it’s important.

When using the instant review feature, Ms. Robinson focused on leading whole-class discussion. Throughout this process, her support for claims was at a low level, while her support for evidence and reasoning was at a high level. After reviewing the students’ explanations in the focused teaching intervention sessions, Ms. Robinson shifted her focus to reasoning when students created explanations, providing content-specific scaffolds.

Day 2- Using Mobile Devices. On day 2, the teacher started using the instant review feature to review individual explanations. When reviewing students’ claims, there were very few instances in which the teacher supported students in creating their claims. Since students were able to create claims, the teacher primarily focused on moving students forward to adding evidence and reasoning: “There is your claim and you need to support that.” During individual meetings, the teacher continued to remind students to have a claim: “I don’t even see a complete claim for today’s question.”

Similar to the previous day, the teacher reviewed an explanation with the whole class. When reviewing one of the explanations, the teacher primarily focused on noting the need to discuss the data values and what the indicator rankings mean for water quality. She stated: “Does he address the data specifically? Did he talk about dissolved oxygen, phosphate, turbidity?” During the individual meetings, she continued to remind students to discuss what the data values meant, providing content-specific scaffolds: “You don’t have a complete reasoning statement. I believe the Rouge River behind the school is healthy. The DO level is 81% and it is good. Why is it good?” During these individual meetings, the teacher also focused on discussing the data values with students: “Fecal coliform. It was negative.”

When students completed their first explanation, Ms. Robinson asked them to create another claim by reviewing the previous year’s data: “You’re going to add a new claim. Then your claim is going to be about health of last year’s River compared to this year’s.
Has it gotten better or worse?" In this process, she encouraged students by noting the improvements in their explanations: "Quality of your explanations have gotten much, much better. You just need that extra little tweak to push them over the edge." She also helped students to create their claims by making comparisons with the previous year's data: "Your claim, it is not healthier now. Does it mean that it is at the same level of health as last year? Is the water healthier now? So it is at the same level of health? And it got healthier or less healthy?"
In summary, when supporting claims, the primary emphasis was on reminding the students of the need to create claims, which constituted a low level of support. The teacher discussed evidence, providing high-level support for this component as she discussed specific data values. Finally, under the reasoning component, the teacher was actively engaged in reading how students reported data values and supporting them to include scientific principles: "You don't have any science principles related to that. You just talk about your data." Similar to the evidence part, the support for reasoning was at a high level.
Finally, there was a notable technical problem on day 2 that forced the teacher to spend extra time resolving it: the application suddenly started copying students' claims. None of the explanations were lost. Unfortunately, this made the teacher lose some momentum during the individual meetings, since she needed to find the original explanation among the copied ones.
Day 3- Using Mobile Devices. When the students were completing the second explanation, the teacher was pleased with their progress and pushed students to make one final effort to improve their explanations: "Today is like our final push with this, so I have complete confidence we can get there." On day 3, the technical problem was less prominent since the bug was partially fixed. The problem had also frustrated students, who worried that their initial explanations might be lost, but the teacher pushed them to create the second claim and reminded them that no explanations were lost: "Just start on the second claim. It's not lost, you just can't see it." When reviewing evidence, the teacher made several reminders to push students to discuss their findings by providing content-specific scaffolds: "You say I believe that the Rouge River is healthier than the last time we visited. Okay. So your reasoning, you just restated it. Now I want you to talk about that evidence." Besides discussing data, Ms. Robinson also paid close attention to how students discussed the data. For instance, the result for the fecal coliform came out negative. When Ms. Robinson found some students discussing fecal coliform incorrectly, she reminded them to search for the right data and improve their reasoning: "That's fecal coliform. And that's a positive result. But we didn't find a positive result … The pond was positive ... so you're looking for the river one." Ms. Robinson continued to remind students to add claims and discuss their data. The majority of the instances on day 3 focused on discussing the reasoning by using content-specific scaffolds. In this process, the teacher asked several questions to prompt students to discuss scientific principles after reviewing the data and pressed the reasoning further: "What is the science principle behind that?
Why it's important?" In the example provided below, the teacher read the explanation first and noted that the student discussed the data but missed scientific principles. She asked the student to discuss the scientific ideas in relation to dissolved oxygen, pH, and phosphate:
I don't see specific things about the dissolved oxygen, the pH, and the phosphate. Yeah. And I wanta see that. The rest of this is excellent but just talk about the data you have here. Okay. All right. Why do you want to know why the acid is in there? What does it tell you about what can live in there? But the rest of it is excellent. Just explain, it tells you about the amount of acid. However, why is that important? What's the science principle behind it? What is the science principle? What is the science principle? Why do you want to know about the pH?
In summary, the support for claim was low-level, but the teacher continued to provide high-level support for evidence and reasoning on day 3, pressing students to clarify their reasoning. As on the previous two days, the main emphasis was on supporting evidence and reasoning after the focused teaching intervention sessions.
Unit 2- Part 2 Summary. As presented in Table 22, Ms. Robinson provided low-level support for claims during the three days when students were using mobile devices to create explanations in unit 2. She provided high-level support for evidence and reasoning through content-specific scaffolds addressing the data values and scientific principles during part 2. One possible reason for the low level of support for claims was that the teacher judged that students were doing fine with creating claims, but needed more support for evidence and reasoning.

Table 22
Examining the Level of Support in Unit 2- Part 2

            Day 1        Day 2        Day 3        Part 2 Summary
Claim       Low-Level    Low-Level    Low-Level    Low Support
Evidence    High-Level   High-Level   High-Level   High Support
Reasoning   High-Level   High-Level   High-Level   High Support

Defining Level of Synergy in Unit 2. By comparing the level of support in part 1 and part 2, this section answered the following question: "What is the level of synergy between providing instructional supports and using mobile devices in unit 2?" As depicted in Table 23, Ms. Robinson's focus in unit 2 shifted to reasoning. Her support for claim in both part 1 and part 2 was low, which resulted in low synergy; however, this is most likely because Ms. Robinson believed that the students did not need as much support for writing claims as they did for evidence and reasoning.

Table 23
Examining Level of Synergy in Unit 2

            Activity 1  Activity 2  Day 1     Day 2     Day 3     Level of Synergy
            (Part 1)    (Part 1)    (Part 2)  (Part 2)  (Part 2)
Claim       HLS         LLS         LLS       LLS       LLS       Low-Synergy
Evidence    HLS         HLS         HLS       HLS       HLS       Synergy
Reasoning   HLS         HLS         HLS       HLS       HLS       Synergy
Note. HLS: High-Level Support; LLS: Low-Level Support.

Support for claims decreased compared to unit 1, but the level of support for reasoning increased in unit 2, particularly when students created their explanations using the mobile devices. This decrease in supporting claims and increase in supporting reasoning is appropriate, since reasoning, not claims, was where students struggled. The results show that after participating in focused teaching intervention sessions, the teacher provided high-level support for reasoning both when supporting students' understanding of the CER framework and when students were using mobile devices to create their own explanations (see Table 23).
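The decision rule behind these synergy labels is never stated as an algorithm, but it can be read off the tables. The sketch below is my reconstruction of that reading, with the majority threshold as an assumption; it simply reproduces the labels in Tables 21-23.

```python
# Hypothetical reconstruction of the synergy classification implied by the
# support tables; not a procedure reported in the study itself.

def mostly_high(levels):
    """True when more than half of the coded episodes offered high-level support."""
    return sum(1 for level in levels if level == "high") > len(levels) / 2

def synergy_level(part1, part2):
    """Combine part 1 (providing instruction) and part 2 (using mobile devices)."""
    p1, p2 = mostly_high(part1), mostly_high(part2)
    if p1 and p2:
        return "Synergy"
    if p1 or p2:
        return "Moderate-Synergy"
    return "Low-Synergy"

# Unit 2, read off Tables 21 and 22:
print(synergy_level(["high", "low"], ["low", "low", "low"]))      # Low-Synergy (claim)
print(synergy_level(["high", "high"], ["high", "high", "high"]))  # Synergy (evidence)
print(synergy_level(["high", "high"], ["high", "high", "high"]))  # Synergy (reasoning)
```

Under this reading, the moderate synergy reported for claims and reasoning in the plants unit corresponds to one part being mostly high-level while the other was not.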
Quality of Students' Explanations
Fifty-two students completed explanations in relation to water quality. Appendix Q presents the summary of the coding for explanations developed after the water quality unit. This section investigated the following question: "How did students perform in unit 2?" Of these fifty-two students, only four created one explanation. Forty-six students created two explanations, and two students created three explanations. As depicted in Table 24, only one student did not create a claim, instead noting "DO" as the claim. Similarly, almost all students included multiple pieces of evidence from the data pool. When creating reasoning statements, six explanations did not include any reasoning. Of these six, there was only one student who created two different claims and included evidence but could not support the claims with any reasoning. Three of these students created another explanation that supported their claims with reasoning.
One third of the explanations (see Table 24) focused only on ranking the indicators without discussing scientific principles, such as: "The dissolved oxygen is good because it is 8 ppm and the saturation percent is 78% and that's good. The pH level was good too. It was 7ppm which is rated excellent." Although several explanations (three explanations) included some incorrect scientific ideas, almost half of the students discussed what the ranking meant for water quality by using appropriate scientific principles. In the example provided below, the student created a claim stating that the river is healthy. After adding multiple pieces of evidence to the explanation, the student discussed what DO and pH mean for water quality:
I believe that the water is healthy because the DO has 8 ppm that means 88% saturation. The DO is important because if there is no oxygen in the water fish can't breathe and if they can't breathe they'll die causing the river to become unhealthy. The pH is between 7 and 8 so it's good. pH is important because it measures the acidity and if acidity is high then the river would be unhealthy causing [its organisms] to begin dying so having a pH between 7 and 8 is actually pretty good.

Table 24
Quality of Students' Explanations in Unit 2

Score   Claim    Evidence   Reasoning
0       1        --         6
1       101      --         12
1a      --       --         34
2       --       1          3
3       --       101        47
        N=102    N=102      N=102

Unit 2 Interview
The teacher interview conducted after unit 2 investigated "What does this teacher think about providing instruction, using mobile devices, and students' ability to create explanations after unit 2?" In the previous unit, the teacher's goal focused on improving content understanding in relation to plants. This goal switched to improving the quality of explanations in unit 2: "To significantly improve their claim, evidence, and reasoning statements, and have them make the connection to the science principles. To understand water quality testing and [what] the indicators were, and also really step up the claim evidence, the scientific investigations." During the interview, the teacher acknowledged the role of the focused teaching intervention by noting the value of the coding process. When coding explanations, Ms. Robinson had some disagreements in the first session but was more comfortable when coding explanations in the second session.
Coding explanations from unit 1 helped her to understand what was missing: "You know, talking about it with someone else, looking at the examples of good and bad, and understanding the rating system myself, and seeing how you were rating them, and how you guys were looking at them. Um it gave me an understanding what we were missing." Although reviewing explanations from another school that performed much better when using Zydeco made the teacher feel as if her students should do better, it also helped her realize that students are "capable" of creating good explanations: "I was concerned reading the ones that came out of West Park, and it was embarrassing. But no, it just confirmed to me again what my kids are capable of doing. And they apparently need a whole lot more experience with it to get it. But, it's possible."
Before starting the second unit, Ms. Robinson was expecting one more challenge, described by her as "one more mess." She summarized the problem as the reasoning: "It (reasoning) was a huge problem." When describing the role of providing instruction, the teacher noted that the activities were connected with activities in unit 1, but that having the connection with the school pond and modeling the poor and good examples made the difference:
I don't honestly think I was explaining it any differently, but I think with the examples, in the format the kids were using, and then when the ducks died, that physical connection to it, I think made a world of difference. And just the physical phenomena, and then being able to tie it back into it, helped a lot … Modeling it differently instead of talking about and modeling it and then seeing some good and some poor examples, over a couple of days. Because we looked at them several times. And we looked at different examples each time.
When describing students' performance, the teacher noted that students had "a dramatic improvement" and also added that "It took a while but everybody got it. Or 99% of everybody got it." Later in the interview, Ms. Robinson also compared the quality of explanations with those students had created in the previous year, since she had used a similar question focusing on water quality. When comparing the improvements with the previous year, she noted that students were able to critique the data and took more ownership: "Last year to this year with the same kids, is huge. And the fact that they could actually look at data they collected last year and say 'this is wrong' is huge…I think they took ownership and a little more pride in their work, too. Which I don't see a lot of."
As illustrated in the quote above, Ms. Robinson noted the value of critiquing explanations as a modeling activity. She also noted that focusing on good and bad explanations helped the students. When discussing the use of the mobile devices, she credited the instant review feature. When describing how she used it, she noted that being able to access students' explanations changed the quality of their explanations. She did not focus on how her practices changed compared to unit 1 when using mobile devices:
I think Zydeco was cool. I like the visual part, that they can see it. It's all in one screen. Um, they can share them with each other.
The fact that I could actually look at it at the same time was really powerful … With the kids, the kids could see that I was interested, that I could actually see their work, I could share their work, and that while I was walking around looking around with them, when I walked away, I no longer had access to it. And the fact that I had access from anywhere in the room meant that I was watching. And I think they did better work because they thought I was watching … They did some work but the minute you walked away, they were like, "oh ok, I don't really need to be working again until she comes back."
Despite the affordances of the mobile application, there were some challenges, and having an experienced teacher made the difference in this process. Ms. Robinson was aware that the mobile application was still in the design process and that there could be problems: "There are always software problems. As a teacher, you're always going to have problems, you're going to have a plan B regardless."
Unit 2 Summary
Similar to earlier interviews, the teacher noted students' challenges with reasoning after unit 2. Although she was aware of the challenge, the focused teaching intervention sessions revealed that she held very high expectations even though she had observed that the students were not performing well. After completing unit 2, she was pleased to see the improvements in the quality of students' reasoning in their explanations. She credited the role of PD, the critiquing activity, and the instant review feature when discussing this change. She noted that the instant review feature made students understand that the teacher was observing them; however, she did not discuss how using the instant review feature changed her practices with an emphasis on supporting claim, evidence and reasoning. During unit 2, the teacher pressed students to include and state scientific principles appropriately to justify the use of data to support claims.
Cross-Case Synthesis
As noted in the previous chapter, the final step of analysis focused on understanding how the level of synergy influenced the quality of students' explanations in both units. This section also focuses on examining the following main category, "how the level of synergy affects the quality of students' explanations," by investigating the level of synergy before and after the focused teaching intervention sessions, with an emphasis on the following questions:
• How did the synergy level for supporting claims change between units?
• How did the synergy level for supporting evidence change between units?
• How did the synergy level for supporting reasoning change between units?
• How did students' claim, evidence and reasoning scores change between unit 1 and unit 2?
When discussing the improvements in unit 2, Ms. Robinson credited the role of the mobile application by noting that, "Zydeco definitely helped what would be grouped as the poor writers, write much better scientific investigations." In this process, she also acknowledged that critiquing activities in unit 1 and unit 2 supported students' understanding of CER by providing good and poor examples. When we compare the quality of support provided in both critiquing activities (the critiquing activity in unit 1 and the critiquing activity in unit 2), the support level was the same for claim, evidence, and reasoning. In both activities, the teacher provided low-level support for claims by primarily checking the existence of claims. When focusing on evidence and reasoning, Ms.
Robinson discussed data and scientific principles, which resulted in high-level support. In addition, the support levels the teacher provided when providing instruction were essentially the same across units. She provided high-level support for evidence and reasoning in both units (see Table 25). This finding is consistent with the teacher's summary during the post-water quality unit interview, in which she noted that she did not change her instructional practices: "I don't honestly think I was explaining it any differently."
When discussing the improvements in the water quality unit, the teacher acknowledged the mobile application and the critiquing activity. But the biggest change was in her practices to support reasoning when students were using mobile devices. As Table 25 shows, her support for students using evidence provided synergistic support for this component of the explanation framework in both units, but she only provided synergistic support for reasoning in unit 2. After analyzing the students' explanations in the first focused teaching intervention session, the teacher shifted her focus to providing more careful and thorough support for including reasoning.

Table 25
Comparing the Level of Synergy in Unit 1 and Unit 2

            Level of Synergy in Unit 1   Level of Synergy in Unit 2
Claim       Moderate-Synergy             Low-Synergy
Evidence    Synergy                      Synergy
Reasoning   Moderate-Synergy             Synergy

McNeill and Krajcik (2008a) noted a direct link between teachers' practices and the quality of students' explanations. In unit 1, Ms. Robinson primarily focused on discussing the claim and evidence components with students, and she did not specifically address any scientific principles when students were creating their explanations. In unit 2, she shifted her focus to include discussing the importance of using scientific principles to justify the use of data as evidence. Ms. Robinson's practices with respect to reasoning were connected with the quality of students' reasoning statements. In unit 1, only 10% of the explanations (six explanations) included scientific principles (see Table 26). This number increased to 49% (fifty explanations) in unit 2. In unit 1, 43% of the explanations (twenty-five explanations) did not have a reasoning statement that linked claim and evidence. In unit 2, only six percent (six explanations) missed the link between claim and evidence (see Table 26).
On the other hand, the decrease in the teacher's support for claims did not have a negative impact on the quality of explanations. After the focused teaching intervention sessions, Ms. Robinson realized the main gap was in reasoning and that the students did not struggle when creating claims. She provided low-level support for claims on all days when students were creating explanations in unit 2. The decrease in supporting claims (see Table 25) did not influence the quality of students' claims (see Table 26). This finding is consistent with Lizotte and colleagues (2004), who also found that teacher practices did not significantly influence the quality of students' claims.

Table 26
Quality of Students' Explanations in Unit 1 and Unit 2

        Claim     Claim     Evidence   Evidence   Reasoning   Reasoning
Score   Unit 1    Unit 2    Unit 1     Unit 2     Unit 1      Unit 2
0       8         1         1          --         25          6
1       50        101       --         --         27          12
1a      --        --        --         --         --          34
2       --        --        --         1          6           3
3       --        --        57         101        --          47
        N=58      N=102     N=58       N=102      N=58        N=102

As presented in Table 26, there is another substantial difference in the total number of explanations developed in the two units. Only thirty-eight students completed explanations in unit 1.
This number increased to fifty-two in the second unit. When discussing this change with the teacher after completing the intervention, she recalled poor attendance while the students were creating explanations in unit 1. In both units, the task was creating two explanations. In unit 1, only nineteen students completed this task; this number increased to forty-six in the second unit. In unit 2, the teacher was checking the number of explanations by using the instant review feature, and this is one possible reason for the increase in the number of students who completed the task.

CHAPTER 7: DISCUSSION AND IMPLICATIONS

This chapter focuses on making connections with the literature to discuss the results of the dissertation and show how this dissertation connects to and extends the literature. Finally, the chapter presents the limitations and implications of the study.
Discussion
Inquiry is an important aspect of science education; in the last three decades many studies have reported the value of using inquiry (NSTA, 1987; National Research Council, 1996; NRC, 2000; Linn et al., 2004; Bybee, 2010; NRC, 2012). One aspect of inquiry is engaging students in scientific explanations (NRC, 1996; NRC, 2000; McNeill & Krajcik, 2008a; Kuhn, 2010; NRC, 2012; Achieve, 2013). The Framework for K-12 Science Education (NRC, 2012) has identified developing scientific explanations as one of eight key scientific practices in which students should engage. The Next Generation Science Standards use developing scientific explanations in a number of the performance expectations (Achieve, 2013). Constructing scientific explanations is an integral part of inquiry, since the goal of science education is creating scientifically literate students (AAAS, 1993). Despite its importance, developing scientific explanations challenges students (Krajcik et al., 1998; Sandoval & Millwood, 2005; McNeill & Krajcik, 2007; Duschl et al., 2007), and implementing scientific explanation challenges teachers (Crawford et al., 2005; Erduran et al., 2004; Simon et al., 2006; McNeill & Krajcik, 2008a; Zembal-Saul, 2009; McNeill & Knight, 2013). More specifically, several studies noted that the reasoning component of explanations challenges pre-service (Zembal-Saul, 2009) and in-service teachers (McNeill & Krajcik, 2008a; McNeill & Knight, 2013). Teachers also struggle to monitor students' progress when creating explanations (Simon et al., 2006; McNeill & Knight, 2013). Because developing scientific explanations is so vital for classroom teaching, it is critical to understand how to better implement scientific explanation. The insights from this dissertation can contribute to supporting teachers in scaffolding instruction for middle school students in scientific explanations, with an emphasis on synergy.
Previously, some scholars (McNeill et al., 2006; McNeill & Krajcik, 2009) examined synergy when teachers used written scaffolds to support students' understanding of CER. Tabak (2004) underlined the importance of synergy between teacher practices and technology scaffolds when students are engaged in creating explanations. In this study, I examined synergy with an emphasis on the teacher's practices when providing instruction and when using technology scaffolds. I found that the synergistic support improved the quality of students' reasoning statements.
This is consistent with Tabak's (2004) idea of providing continuous support over time using various activities to create synergy: "Synergy can occur between different material supports and over a sequence of interactions between different activities" (p. 328). In this process, organizing focused teaching intervention sessions and adding the instant review scaffold played a vital role. When discussing how synergistic support affected the quality of students' explanations, this section focuses on answering the overarching research question and the five sub-questions designed to investigate it.
How does the teacher use instructional strategies to support students' understanding of the claim-evidence-reasoning framework? When focusing on teacher practices, previous studies investigated teachers implementing the same unit or activity. Osborne and colleagues (2004) examined twelve teachers teaching the same activities in two consecutive years. Similarly, McNeill and Krajcik's (2008a) study involved several teachers teaching the same unit. Each of these studies (Osborne et al., 2004; McNeill & Krajcik, 2008a) underlined the need for a greater emphasis on teacher practices, and McNeill and Krajcik (2008a) specifically noted the need to track change in teacher practices and how this change influences student learning. When I tracked Ms. Robinson across two different units, I found that she developed and used modeling and critiquing activities in both units and used content-specific scaffolds in these activities. In this process, she used written scaffolds in both modeling activities (see Appendix C and Appendix M). Several studies (McNeill et al., 2006; McNeill & Krajcik, 2009) examined synergy when teachers provided instruction to support students' understanding of CER using written scaffolds. When modeling CER using written scaffolds, Ms. Robinson's practices supported synergy by providing high-level support for claim, evidence and reasoning. In this process, she defined claim as a statement, discussed the importance of including data values, and underlined scientific principles.
In both units, she critiqued explanations developed using the mobile application. In the plants unit, she critiqued explanations developed when practicing with the mobile application. During this process, she also used written scaffolds (see Appendix M), since the activity was designed to familiarize students with the mobile application. In unit 2, she critiqued explanations developed in another school (see Appendix F). Both critiquing activities (activity 3 in the plants unit and activity 2 in the water quality unit) provided similar support. Novak, McNeill and Krajcik (2009) highlighted the importance of this technique, stating that critiquing would help students understand what is missing in their explanations. In this process, Ms. Robinson provided high-level support for evidence and reasoning, but the level of support for claims stayed at a low level in both activities since the explanations used in critiquing had a complete claim. The teacher focused on discussing the gaps in both activities by illustrating specifics about data values, adding justification for explaining how data served as evidence, and discussing the scientific principles.
Besides the support she provided with written scaffolds and critiquing explanations developed with Zydeco, Ms. Robinson created two additional activities focused on making connections with everyday explanations in the plants unit.
McNeill and Krajcik (2008b) noted that discussing everyday experiences (i.e., what is the best music band?) helps students engage in the claim-evidence-reasoning process; however, teachers should provide support for students to create explanations for their everyday statements using evidence and reasoning (McNeill & Krajcik, 2008b). Unfortunately, Ms. Robinson's main focus was on creating everyday claims and including evidence. When tracking the teacher, I found that she provided similar support when modeling and critiquing explanations. In both units, she provided high-level support for evidence and reasoning (see Tables 17 and 21) by using content-specific scaffolds in each unit. Since McNeill and Krajcik (2008a) found that making connections with everyday explanations had little impact on improving students' explanations, I primarily focused on examining her support for modeling and critiquing.
How do the teacher's scaffolds and the scaffolds in Zydeco work together to support student learning? Tabak (2004) discussed the importance of synergy between technology scaffolds and teacher practices. Ms. Robinson's support for the evidence component was similar in both units when using mobile devices. As such, her practices were synergistic in both units for evidence. When using mobile devices in the plants unit, the teacher struggled to support the reasoning component of the framework (McNeill & Krajcik, 2008a; McNeill & Knight, 2013), specifically when providing feedback to support students' reasoning statements as they engaged in creating explanations (Simon et al., 2006; McNeill & Knight, 2013). In unit 2, there was an additional scaffold designed to support the teacher when providing feedback.
Recent developments in technology have provided new ways to monitor students, which was highlighted as an important need in recent reports (Office of Educational Technology, 2004; Office of Educational Technology, 2010). Despite its importance, there are not many examples in the literature. One of the few studies using mobile devices as monitoring tools focused on designing a curriculum using mobile devices in Singapore (Zhang et al., 2010). In that study, the control group used textbooks and workbooks when learning about fungi. With the addition of mobile devices, the experimental group watched videos and participated in informal activities through which they collected data to observe fungi. The performance of students in the experimental group was better than that of students in the control group. Using mobile devices connected students with the learning environment and provided a more student-centered approach. Besides the increased student performance and engagement, the researchers also noted the value of providing quick feedback on student work when integrating mobile devices. Monitoring students' progress helped teachers examine students' challenges during assignments (Zhang et al., 2010). Similar to Zhang and colleagues (2010), using the instant review feature helped the teacher in my study monitor student progress and provide feedback to students as they were creating explanations in the water quality unit. During the plants unit, the teacher had very few instances in which she had individual discussions to support students' reasoning statements, and she did not provide any content-specific scaffolds when students were using mobile devices.
In the water quality unit, the teacher discussed the students' explanations using the instant review feature in Zydeco and focused on investigating how students included scientific principles by providing content-specific scaffolds. In the final interview, she noted that using instant review made students feel that the teacher was tracking them and that the students were no longer anonymous.
How does the level of synergy between providing instructional supports and using the supports in mobile devices aid in improving the quality of students' explanations? This section focused on discussing how the changes in synergy influenced the quality of students' explanations and also investigated the second research question: "How do students' claim, evidence and reasoning scores change during the two units?" When using instructional strategies to scaffold students' understanding of CER, the teacher provided similar support during modeling and critiquing activities by discussing both the quality of the reasoning and the use of science ideas. On the other hand, in the water quality unit, her practices changed when using mobile devices: she shifted toward providing content-specific scaffolds while students were using mobile devices. In both units, the teacher provided high-level support for evidence. The level of synergy for claims was at a moderate level (see Table 19) in the plants unit. This decreased to a low level of synergy in the water quality unit (see Table 23). The synergy level for reasoning was moderate in the plants unit (see Table 19), but the scaffolds were synergistic in the water quality unit because the teacher provided high-level support when providing instruction and when using mobile devices (see Table 23).
McNeill and Krajcik (2008a) found that teachers' practices are linked to the quality of students' explanations. In this dissertation, I found the changes in teacher practices influenced the quality of reasoning but did not affect the quality of claims. In the plants unit, the high-level support focused on creating links between claim and evidence without supporting students to discuss and use content. There was only one instance in which the teacher asked students to include scientific principles. In unit 1, only 10% of the explanations (six explanations) included scientific principles. Connected with the teacher's practices, almost half of the explanations created in unit 1 (47% of the explanations) only created links between claims and evidence (these percentages are checked in the short calculation below).
Findings from the plants unit match what is currently reported in the literature. Supporting the reasoning component of the framework was challenging for the teacher (McNeill & Krajcik, 2008a; McNeill & Knight, 2013), and the teacher did not provide quality feedback to students when they were engaged in creating explanations (Simon et al., 2006; McNeill & Knight, 2013). With this level of support, the students struggled when creating reasoning statements (Sandoval & Millwood, 2005; Songer, 2006; McNeill & Krajcik, 2008a; Gotwals & Songer, 2010). Ms. Robinson was an expert teacher who did not have any challenges when implementing the technology. As she stated in the pre-interview, she was confident in providing instruction to support explanations. After the analysis revealed a large gap in the students' explanations, I organized two focused teaching intervention sessions with the teacher before the water quality unit.
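As a quick check, the reasoning percentages reported for the two units follow directly from the counts in Table 26; the mapping of rubric scores to the categories named above is my reading of Tables 24 and 26.

```python
# Arithmetic check of the reported reasoning percentages against Table 26.
# Assumed mapping: unit 1 "included scientific principles" = score 2 (6 of 58);
# unit 2 "included scientific principles" = scores 2 and 3 (3 + 47 of 102).
unit1_n, unit2_n = 58, 102

print(round(100 * 6 / unit1_n))         # 10 -> scientific principles, unit 1
print(round(100 * 27 / unit1_n))        # 47 -> only linked claim and evidence, unit 1
print(round(100 * 25 / unit1_n))        # 43 -> no reasoning statement, unit 1
print(round(100 * (3 + 47) / unit2_n))  # 49 -> scientific principles, unit 2
print(round(100 * 6 / unit2_n))         # 6  -> no reasoning statement, unit 2
```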
In addition to the focused teaching intervention sessions, an additional scaffold was added to the intervention to help her provide instant feedback to students in the water quality unit. After participating in the focused teaching intervention sessions and adding the instant review feature, the level of synergy increased for reasoning, but it decreased for supporting claims (see Table 25). This synergistic effect could account for the improvement in the students' written explanations. Several studies noted the importance of synergy when providing instruction (McNeill et al., 2006; McNeill & Krajcik, 2009), and Tabak (2004) discussed synergy when teachers use technology scaffolds to support students' explanations. In this dissertation, when the teacher provided synergistic support for reasoning while providing instruction and using mobile devices, the quality of students' reasoning improved.
The teacher's practices affected the quality of the reasoning statements the students wrote, but the decrease in the level of synergy did not affect the quality of claims. In both units, almost all explanations had a complete claim. Similarly, Lizotte and colleagues (2004) did not find a correlation between the quality of students' claims and teacher practices. Connected with this idea, studies that found gaps in students' reasoning did not discuss challenges when creating claims (Sandoval & Millwood, 2005; Songer, 2006; McNeill & Krajcik, 2008a; Gotwals & Songer, 2010). It is important to realize that the teacher most likely backed off on her level of support for claims because she realized students were not having difficulty writing claims. This change can be connected with Vygotsky's (1978) zone of proximal development, which suggests supporting students by emphasizing the areas in which they need support.
In conclusion, the change in teacher practices supported the quality of students' written reasoning statements (McNeill & Krajcik, 2008a) and, as such, the overall quality of their explanations. But this change did not occur on its own. In the water quality unit, the instant review feature helped the teacher provide feedback to the students. But the instant review feature only presented the students' claim, evidence and reasoning (see Figure 14). To help the teacher improve the feedback and understand what was missing in the students' explanations after the plants unit, I organized two focused teaching intervention sessions.
What is the role of focused teaching intervention in informing the teacher about her practices? Besides putting emphasis on synergy, my study also investigated the role of focused intervention (Desimone et al., 2002) in informing the teacher about her instructional practices. The focused teaching intervention sessions were key to making the teacher's instructional supports work together with the technology in the feedback she provided as students created their written explanations. The goals of these sessions were to (a) understand what was missing in the students' explanations in the plants unit, (b) become familiar with the coding rubric, (c) support her when creating a critiquing activity for the water quality unit, and (d) introduce the instant review feature. In this process, making the teacher an active participant was key. Desimone and colleagues (2002) highlighted this idea as "active learning" and defined it as "opportunities for teachers to become actively engaged in the meaningful analysis of teaching and learning" (p. 83).
Since I only had one participant, I decided to focus on analyzing student work by coding nine sample explanations with the teacher in the first session instead of evaluating teaching practices. Analyzing student work to improve the quality of teaching practices is an important strategy noted by several scholars (Garet et al., 2001; Desimone et al., 2002; Fishman et al., 2003; Borko, 2004; Simon et al., 2006; Richmond & Manokore, 2011; McNeill & Knight, 2013). Previous studies also underlined the importance of the support provided to teachers when implementing technology (Baylor & Ritchie, 2002; Windschitl & Sahl, 2002; Russell et al., 2003; Zhao & Bryant, 2006; Ertmer & Leftwich, 2010; Gerard et al., 2011; Norris & Soloway, 2011). Due to Ms. Robinson's previous role in the project, the teacher did not need any additional support when using mobile devices, but helping her realize the need for supporting students with the reasoning component of the explanation practices was important.
The first focused teaching intervention session was designed to help the teacher understand what was missing in students' explanations in the plants unit. The second focused teaching intervention session was designed to provide Ms. Robinson with support when creating a critiquing activity for the water quality unit, and to enhance the feedback she provided when using the instant review feature by increasing her familiarity with the mobile application. Several scholars have previously underlined this need (Simon et al., 2006; McNeill & Knight, 2013), and technology scaffolds played an important role in supporting the teacher in this process.
The results of this study show how the focused intervention (Desimone et al., 2002) supported the teacher in changing her practices, particularly around supporting students with reasoning. As discussed earlier, this resulted in the students creating explanations with improved reasoning components. Sherin and van Es (2009) ran monthly video club meetings with teachers for two years to improve the quality of teaching practices. After organizing the monthly meetings, the researchers found individual differences across teachers and concluded that these meetings helped teachers analyze students' ideas. When supporting teachers in professional development sessions, previous studies (Simon et al., 2006; McNeill & Knight, 2013) also noted individual differences but highlighted the improvements in teaching practices to support students' explanations. Those studies organized meetings over several months. Two sessions might seem too short a time span to support synergy, but it is important to note that both focused teaching intervention sessions had a single emphasized goal: supporting the teacher to improve the quality of students' reasoning statements. The explanation created by the teacher included evidence and scientific principles, but she did not accurately discuss the connections between the two. Participating in the focused teaching intervention sessions helped the teacher discuss and evaluate connections between evidence and scientific principles in her teaching practices. The change in the teacher's practices in the water quality unit led to improvements in student outcomes (Guskey, 2002).
What does this teacher think about providing instruction, using mobile devices, and students' ability to create explanations before and after the intervention?
As noted by Yin (2014), the interviews in this study were not the main data source but were designed to support other data sources. In both units, I used the interview data to track the teacher's ideas during the intervention. In the pre-interview and in the post-interview after unit 1, the teacher was aware that the students were challenged when creating reasoning. She was confident in using instructional strategies and mobile devices. The focused teaching intervention sessions revealed her high expectations of students. Guskey (2002) noted that change in teacher practices can lead to changes in students' learning outcomes, which in turn helps teachers change their ideas. After the focused teaching intervention sessions, one of my goals was to investigate the change in the teacher's practices and ideas. The post-interview after the water quality unit revealed that the focused teaching intervention sessions helped the teacher become familiar with the coding rubric after discussing several explanations focusing on plants and water quality. This, in turn, changed the feedback she provided students when using mobile devices. Besides helping the teacher become familiar with the coding rubric to enhance the feedback, coding explanations from another school (noted as West Park) helped Ms. Robinson understand that students can create quality explanations. In the post-interview after the water quality unit, Ms. Robinson noted that she had been expecting another mess. The focused teaching intervention sessions helped the teacher change her ideas (Guskey, 2002) about student performance, since the teacher noted that her students were capable of creating quality explanations after the water quality unit.
Implications
Findings of this study showed that Ms. Robinson's support for student claims decreased in unit 2, but this did not affect the quality of the students' claims. This suggests that when supporting students' explanations, focusing on components that students find challenging would benefit students' construction of explanations. In this process, organizing focused teaching intervention sessions (Desimone et al., 2002) that investigate the gaps in students' understanding would help teachers make changes to their practices.
Since the literature predominantly points out the gap in reasoning, creating scaffolds that primarily support reasoning would be beneficial for students and teachers. This dissertation also supports this position. In unit 1, technology scaffolds were designed to support the explanation process for students. In unit 2, we implemented another scaffold for the teacher, which played an important role when the teacher was providing feedback to students. Previous studies designed scaffolds to support students' explanations (Sandoval & Reiser, 2004; Songer, 2006; Maldonado & Pea, 2010; Kuhn et al., 2012; Laru et al., 2012; Quintana, 2012). In addition, several scholars (Reiser et al., 2004; Quintana et al., 2004; Tabak, 2004; McNeill et al., 2006) highlighted the importance of scaffolds working collaboratively with teachers. Besides the connection between the teacher's practices and technology scaffolds, my study suggests that creating scaffolds specifically designed to help teachers can improve student learning. Scaffolds reduce the complexity of the task (Reiser, 2004; Quintana et al., 2004; McNeill et al., 2006), and as a result, more focus needs to be placed on supporting teachers in providing feedback to students as they are creating explanations.
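To make this implication concrete, the sketch below shows what a small teacher-facing scaffold in the spirit of the instant review feature could look like: it flags the same gaps Ms. Robinson probed for when reviewing explanations. The data shape, heuristics, and prompts are hypothetical illustrations, not Zydeco's actual design or API.

```python
# A minimal, hypothetical sketch of a teacher-facing feedback scaffold; the
# checks are crude illustrative heuristics, not Zydeco's implementation.
from dataclasses import dataclass, field

@dataclass
class Explanation:
    claim: str
    evidence: list = field(default_factory=list)  # selected data, e.g. "DO: 8 ppm"
    reasoning: str = ""

def feedback_prompts(explanation):
    """Suggest prompts mirroring the gaps the teacher probed for in unit 2."""
    prompts = []
    if not explanation.claim.strip():
        prompts.append("Do you have a claim that answers the question?")
    if not explanation.evidence:
        prompts.append("Which data did you select to support your claim?")
    if not any(ch.isdigit() for ch in explanation.reasoning):
        prompts.append("Talk about your data values: what does each ranking mean?")
    if "because" not in explanation.reasoning.lower():
        prompts.append("What is the science principle behind that? Why is it important?")
    return prompts

sample = Explanation("I believe the river is healthy.",
                     ["DO: 8 ppm (88% saturation)"],
                     "The DO level is good.")
print(feedback_prompts(sample))  # the data-value and science-principle prompts fire
```

A tool of this kind would not replace the teacher's judgment; in this study, the focused teaching intervention sessions were what enabled the teacher to turn such flags into content-specific feedback.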
Previously, numerous studies focused on designing technology scaffolds to support students' explanations, but Tabak (2004) noted that technology scaffolds alone would not be enough to support this process. When Ms. Robinson tested water quality with 119 seventh grade students in the previous year, students created explanations in groups (thirty-eight groups in total) due to device limitations. Of these thirty-eight groups, thirty-one had a complete claim and thirty-two provided evidence, but only four (11% of the explanations) included complete scientific principles (Lo et al., 2013). The results from the previous year (Lo et al., 2013) were similar to the results in the plants unit. When these students created explanations focusing on water quality in the eighth grade, the quality of students' reasoning scores improved. This provides evidence that the teacher's scaffolds worked synergistically with the scaffolds in Zydeco to improve student learning.
In order to have synergy, the different aspects involved in the process need to work collaboratively. Tabak (2004) noted that when there is synergy among scaffolds, the sum of the support is greater than the individual supports, and it is important to examine "how this interaction can come into play" (p. 308). Synergy in this study happened when different components worked to support the same goal. Tabak (2004) defined this idea as "coherence between the features of the materials and the teacher's conceptions" (p. 329), but it is also important that teachers make synergy "an explicit goal for the enactment of the curriculum" (p. 329). Connected with these points, teachers need to make sure that written scaffolds, technology scaffolds, other resources, and their own practices work collaboratively to support students' explanations.
Finally, it is important to point out that students' performance was associated with how the teacher used generic and content-specific scaffolds together. When the teacher provided only generic scaffolds without content-specific scaffolds, students could not create high-quality reasoning statements in the plants unit. The teacher shifted her practices in the water quality unit by providing generic and content-specific scaffolds, which in turn supported the quality of students' explanations. Connected with previous studies, my study also underlines the importance of having both generic and content-specific scaffolds to support explanations (McNeill et al., 2006; McNeill & Krajcik, 2009; Gotwals & Songer, 2010).
Limitations
In this dissertation I worked with only one experienced middle school science teacher; thus the conclusions I drew are limited in how generalizable they are. However, this dissertation does provide some generalizations that can be used with caution. To better understand how changes in the level of synergy affect the quality of students' explanations, future studies need to explore the level of synergy in different grades with multiple teachers. In addition, I did not dictate any procedures when providing instruction and using mobile devices. This led the teacher to provide an unequal number of activities (instances) when providing instruction and to spend an additional day using mobile devices in unit 2. Establishing procedures for the intervention (deciding the number of activities and days for creating explanations) would help make better comparisons when investigating the level of synergy.
However, this study represents what typically occurs in classrooms: teachers modify their instruction based on their students' needs. The teacher participating in the study did not have any challenges when using technology and, as such, the focused teaching intervention sessions focused on supporting her practices with an emphasis on the reasoning component of the explanation framework. Ms. Robinson was an integral part of the design team, and we were also testing the application during the intervention. In this process, the teacher faced several challenges, such as the mobile application duplicating students' explanations when using the instant review feature. To understand how teachers would implement technology scaffolds designed to support explanations, future studies need to also consider supporting teachers' use of technology in focused teaching intervention sessions after the design process for technology scaffolds is complete.
I primarily linked the improvements in the quality of students' reasoning with the changes in the teacher's practices, but there are three important factors that I should consider for future studies: (a) the role of content, (b) students' familiarity with the CER framework, and (c) students' familiarity with the mobile application. The second unit was the second time students used the new version of Zydeco (the third time for some students), and students created explanations in two different content areas (plants and water quality). In this study, I did not see improvements from the previous year's water quality unit (Lo et al., 2013) to the plants unit. However, several studies emphasized that the role of content is critical when creating explanations, since students need to create a link between their evidence and the scientific principles (Sandoval & Millwood, 2005; McNeill et al., 2006; Simon et al., 2006; Songer, 2006; McNeill & Krajcik, 2008a; Gotwals & Songer, 2010; NRC, 2012). Students created explanations in two different content areas, and this might have had an impact on the change observed in their explanations. Some scholars also noted that students' challenges might result from their lack of understanding of the explanation framework (McNeill, 2009; Gotwals & Songer, 2010). In this study, I did not measure students' understanding of the CER framework, but the teacher noted several times that she had presented plenty of opportunities to engage students with the framework before the intervention. Besides the opportunities provided to familiarize students with the CER framework, Ms. Robinson also supported students to create a sample explanation using Zydeco during the plants unit. Neither the teacher nor the students expressed any challenges when using the mobile application for collecting data and creating explanations. Measuring students' familiarity with the CER framework and their use of mobile devices in controlled settings would be important factors to consider in future studies.
Conclusions
In summary, the level of synergy had an impact on the quality of students' reasoning when constructing explanations, but it did not affect the quality of students' claims. One of the key ideas when designing scaffolds is to fade them. McNeill and colleagues (2006) found that faded scaffolds supported the quality of students' explanations better than continuous scaffolds. Vygotsky (1978) highlighted this idea by emphasizing support for the areas that are challenging for learners. In this study, fading the support for claims did not influence the quality of claims.
However, I could not examine how scaffolds can be faded for reasoning, since the quality of students' reasoning statements was high only in unit 2. Unit 1 replicated the findings from the existing body of literature by noting students' challenges when creating reasoning statements (Sandoval & Millwood, 2005; Songer, 2006; McNeill & Krajcik, 2008a; Gotwals & Songer, 2010). Mini-case study 1 pointed out struggles the teacher had in supporting the reasoning component of the framework (McNeill & Krajcik, 2008a; McNeill & Knight, 2013) and providing feedback when students were engaged in creating explanations (Simon et al., 2006; McNeill & Knight, 2013). In this process, McNeill and Knight (2013) suggested finding ways to "support teachers in noticing aspects of student argumentation in classroom practice" (p. 965). Connected with this idea, I organized two focused teaching intervention sessions to help the teacher (a) understand what was missing in students' explanations in the plants unit, (b) become familiar with the coding rubric, and (c) introduce the instant review feature of the technology. In the second unit, adding an additional scaffold for the teacher (instant review) and organizing focused teaching intervention sessions shifted the teacher's focus to supporting the reasoning component of constructing explanations in unit 2. It is also important to note that, when the teacher shifted her focus to reasoning, she emphasized improving the quality of students' explanations by providing both generic scaffolds and content-specific scaffolds (Gotwals & Songer, 2010). This in turn supported the quality of students' reasoning statements and the overall quality of their scientific explanations.

APPENDICES

APPENDIX A
Unit 2- Day 1 Worksheet

Water Quality of Rouge River Watershed
Do now: Please write down your definition of these terms:
• Deposition
• Erosion
• Watershed

ACTIVITY: Oil spill problem
Figure 18. Oil spill problem
Based on this picture, if there was an oil spill on the St. Mary's River, which of the Great Lakes would be affected? Where would the oil end up? Why?

APPENDIX B
Unit 2- Indicators Worksheet

Table 27
Indicators Worksheet

Water Quality Indicators          Definition    How do we determine the water quality?
pH                                              When pH changes,
DO                                              When DO decreases,
BOD                                             When BOD increases,
Turbidity                                       When turbidity increases,
Phosphate                                       When phosphate increases,
Temperature                                     When temperature changes,
Fecal Coliform                                  When fecal coliform increases,
(Bio-indicator)                                 When water is getting polluted,
Microorganism/Macroorganism
  Sensitive Benthos:
  Moderately Tolerant Benthos:
(Bio-indicator)
Microorganism/Macroorganism
  Pollution Tolerant Benthos:

APPENDIX C
Unit 2- CER Worksheet

Claim-Evidence-Reasoning
Claim-Evidence-Reasoning is a framework to help you write up a scientific explanation.
Do now: Write a scientific explanation that answers the question: Is the water in the school pond drinkable?
Claim (Write a sentence that states if water in the school pond is drinkable.)
Evidence (Provide data that supports your claim.)
Reasoning (Write a statement that connects your evidence to your claim.)

APPENDIX D
Unit 2- Testing Indicators

Table 28
Testing Indicators

Indicator: DO (we use two types of units to measure DO)
How do we determine the water quality?
1- Record initial temperature.
2- Fill the short tube; put in two tablets of DO and wait for 5 minutes.
3- Compare the color of the sample with the Dissolved Oxygen color chart. Record the result as ppm.
4- Match the water temperature with the ppm you found. You can find the saturation chart below. For instance, if the water temperature is 20 °C and DO is 8 ppm, saturation is 88%.
5- Based on the saturation level you found by matching ppm and temperature, you can find out how good the water quality is:
91-110% saturation → rank: 4 (excellent)
71-90% saturation → rank: 3 (good)
51-70% saturation → rank: 2 (fair)
less than 50% saturation → rank: 1 (poor)

Indicator: Turbidity
1- Look at the Secchi Disk sticker below the bucket and compare the JTU numbers.
0 JTU → rank: 4 (excellent)
0 JTU to 40 JTU → rank: 3 (good)
40 JTU to 100 JTU → rank: 2 (fair)
higher than 100 JTU → rank: 1 (poor)

Indicator: Phosphate
1- Fill the long tube to 10 ml; add one nitrate tablet. Wait for 5 minutes.
2- Compare with the color chart and find out how good the water quality is:
1 ppm → rank: 4 (excellent)
2 ppm → rank: 3 (good)
4 ppm → rank: 2 (fair)

How to find living organisms?
Figure 19. Living Organisms

APPENDIX E
Teacher Interviews

Table 29
Interview Questions Designed for Pre-Interview (Before Unit 1)

Interview Question: What do you hope to accomplish in your science teaching? (Probe to see if improving scientific explanations is a learning goal or not)
Goal of the Question: To find out where she places scientific explanations in her goals

Interview Question: How do you define scientific explanations? Is there a framework you use in this process?
Goal of the Question: To find out her familiarity with the claim-evidence-reasoning framework

Interview Question: Could you describe a situation in your classroom in which your students were involved in doing an investigation to create scientific explanations?
Goal of the Question: To find out her previous experiences for connecting investigations and scientific explanations

Interview Question: Which challenges did you face when you engaged your students with scientific explanations?
Goal of the Question: To find out her previous challenges related to scientific practices

Interview Question: How did your students perform in the science project last year? Were they able to complete scientific explanations?
Goal of the Question: To discuss the quality of student work in the previous science fair project

Interview Question: Tell me about the technologies you used in your teaching for improving scientific explanations besides Zydeco. (If the teacher mentions a technology for the previous question: Why did you use it? What was the value of using it? What were some challenges of using it?)
Goal of the Question: To find out her previous relationship with scientific explanation technologies

Interview Question: How would Zydeco enhance scientific explanations? Why?
Goal of the Question: To explore her initial ideas about Zydeco

Interview Question: Do you need any support when designing instructional strategies and materials to support students' understanding of CER?
Goal of the Question: To find out whether the teacher needs any support for implementing different strategies to support students' understanding of CER
APPENDIX F
Explanations Coded in Second Focused Teaching Intervention Session

Figure 20. Coding Explanations - Explanation 1
Figure 21. Coding Explanations - Explanation 2
Figure 22. Coding Explanations - Explanation 3
Figure 23. Coding Explanations - Explanation 4
Figure 24. Coding Explanations - Explanation 5
Figure 25. Coding Explanations - Explanation 6
Figure 26. Coding Explanations - Explanation 7
Figure 27. Coding Explanations - Explanation 8
Figure 28. Coding Explanations - Explanation 9

APPENDIX G
Explanations Coded in First Focused Teaching Intervention Session

Figure 29. Explanations Coded from Unit 1 (Explanations 1-5)
Figure 30. Explanations Coded from Unit 1 (Explanations 6-9)

APPENDIX H
Creating Hypothesis Worksheet - Unit 1

1. My project is ……………..
2. My hypothesis is …………… because ………………..
3. List 5 pieces of information you found and their sources to support your hypothesis.
4. Materials I need for this investigation are…….

APPENDIX I
Science Fair Questions

Questions provided by the teacher were:
1. Does the amount of light affect plant growth?
2. Does the color of light affect plant growth?
3. Does the amount of fertilizer affect plant growth?
4. How does acid rain affect plant growth?
5. Does water pH affect plant growth?
6. Does temperature affect plant growth?
7. Does microwave radiation affect plant growth?
8. Does the size of the seed affect the rate and/or speed of germination?
9. Does the type of soil affect plant growth?
10. Does the amount of water affect plant growth?

APPENDIX J
Procedure Worksheet

My procedure is……
1.
2.
3.
4.
5.
6.
7.
8.
9.
10.

APPENDIX K
Defining Variables

Variables
1. Manipulated or Independent variable (the "you change it" variable)
2. Responding or Dependent variable (the "it changed" variable, the one you measure)
3. Control variables (variables you keep constant in the experiment)

APPENDIX L
Plants Modeling Activity

Figure 31. Plants Modeling Activity

APPENDIX M
Plants CER Activity

Activity #3: Is a seed alive? Write a claim-evidence-reasoning statement for this question.
Claim:
Evidence:
Reasoning:

Activity #4: Is a seed alive? Complete a Claim, Evidence, and Reasoning statement using data on the iPad.

APPENDIX N
Summary of Coding for Plants Unit (Unit 1)

Table 31
Summary of Teacher Practices in Part 1 - Activity 1 (Unit 1)

Activity 1 - Claim: 2 (claim as a statement) ×6.
Claim Summary: All instances provided high-level support, resulting in high support for claims in activity 1.

Activity 1 - Evidence: 1 (the need to include data) ×4; 2 (discussing data) ×2.
Evidence Summary: Majority of the instances provided low-level support, resulting in low support for evidence in activity 1.

Activity 1 - Reasoning: 2 (connection with claim and evidence) ×1; 3 (scientific principles) ×2.
Reasoning Summary: All instances provided high-level support, resulting in high support for reasoning in activity 1.
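Each summary row in Tables 31-36 (and in Tables 38-42 of Appendix P) compresses a column of coded instances into a judgment of high or low support. A minimal sketch of that tallying logic follows, assuming a simple majority rule that mirrors the wording of the summaries; the numeric codes come from the tables, but the function and the threshold are my illustration, not the dissertation's analysis procedure.

# Codes as they appear in the appendix tables: 0 = no support,
# 1 = low-level support (e.g., "reminding to create claims"),
# 2 or 3 = high-level support (e.g., "claim as a statement",
# "scientific principles"). The majority rule is an assumption.
HIGH_LEVEL = {2, 3}

def summarize(component, codes):
    high = sum(1 for c in codes if c in HIGH_LEVEL)
    level = "high" if high > len(codes) / 2 else "low"
    return f"{component}: {high}/{len(codes)} high-level -> {level} support"

# Table 31, Activity 1 (Unit 1):
print(summarize("claim", [2, 2, 2, 2, 2, 2]))     # -> high support
print(summarize("evidence", [1, 2, 1, 2, 1, 1]))  # -> low support
print(summarize("reasoning", [2, 3, 3]))          # -> high support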
Table 32
Summary of Teacher Practices in Part 1 - Activity 2 (Unit 1)

Activity 2 - Claim: 2 (claim as a statement) ×8; 1 (reminding to create claims) ×7.
Claim Summary: Majority of the instances provided high-level support, resulting in high support for claims in activity 2.

Activity 2 - Evidence: 2 (discussing data) ×7; 1 (the need to include data) ×6.
Evidence Summary: Majority of the instances provided high-level support, resulting in high support for evidence in activity 2.

Activity 2 - Reasoning: 3 (scientific principles) ×3; 2 (connection with claim and evidence) ×3; 1 (the need to add reasoning) ×5.
Reasoning Summary: Majority of the instances provided high-level support, resulting in high support for reasoning in activity 2.

Table 33
Summary of Teacher Practices in Part 1 - Activity 3 (Unit 1)

Activity 3 - Claim: 2 (claim as a statement) ×8; 2 (notes claim as statement) ×1; 1 (reminding to create claims) ×11.
Claim Summary: Majority of the instances provided low-level support, resulting in low support for claims in activity 3.

Activity 3 - Evidence: 2 (discussing data) ×10; 1 (the need to include data) ×7.
Evidence Summary: Majority of the instances provided high-level support, resulting in high support for evidence in activity 3.

Activity 3 - Reasoning: 3 (scientific principles) ×5; 2 (connection with claim and evidence) ×13; 2 (linking claim and evidence) ×1; 1 (the need to add reasoning) ×2.
Reasoning Summary: Almost all instances provided high-level support, resulting in high support for reasoning in activity 3.
Table 34
Summary of Teacher Practices in Part 1 - Activity 4 (Unit 1)

Activity 4 - Claim: 2 (claim as a statement) ×17.
Claim Summary: All instances provided high-level support, resulting in high support in activity 4.

Activity 4 - Evidence: 2 (discussing data) ×17; 1 (the need to include data) ×9.
Evidence Summary: Majority of the instances provided high-level support, resulting in high support for evidence in activity 4.

Activity 4 - Reasoning: 1 (the need to add reasoning) ×4; 2 (connection with claim and evidence) ×1.
Reasoning Summary: Only one instance provided high-level support, resulting in low support for reasoning in activity 4.

Table 35
Summary of Teacher Practices in Part 2 - Day 1 (Unit 1)

Day 1 - Claim: 2 (claim as a statement) ×10; 1 (reminding to create claims) ×2.
Claim Summary: Majority of the instances provided high-level support, resulting in high support for claims in day 1.

Day 1 - Evidence: 2 (discussing data) ×14; 1 (the need to include data) ×4.
Evidence Summary: Majority of the instances provided high-level support, resulting in high support for evidence in day 1.

Day 1 - Reasoning: 2 (connection with claim and evidence) ×12; 1 (the need to add reasoning) ×1; 3 (scientific principles) ×1.
Reasoning Summary: Almost all instances provided high-level support, resulting in high support for reasoning in day 1.
Table 36
Summary of Teacher Practices in Part 2 - Day 2 (Unit 1)

Day 2 - Claim: 2 (claim as a statement) ×6; 1 (reminding to create claims) ×9.
Claim Summary: Of these fifteen instances, only six provided high-level support. This resulted in low support for claims in day 2.

Day 2 - Evidence: 2 (discussing data) ×10; 1 (only discusses including data) ×4.
Evidence Summary: Of these fourteen instances, ten focused on discussing data. This resulted in high-level support for evidence in day 2.

Day 2 - Reasoning: 1 (the need to add reasoning) ×5; 0 (no support) ×3.
Reasoning Summary: None of these instances provided high-level support, which resulted in low-level support for reasoning in day 2.
APPENDIX O
Quality of Students' Explanations in Unit 1

Table 37
Quality of Students' Explanations in Unit 1 (consecutive explanations with identical scores are grouped)

Explanations #1-#2: claim 0, evidence 3, reasoning 0
Explanations #3-#22: claim 1, evidence 3, reasoning 0
Explanation #23: claim 1, evidence 0, reasoning 1
Explanations #24-#47: claim 1, evidence 3, reasoning 1
Explanations #48-#52: claim 1, evidence 3, reasoning 2
Explanations #53-#55: claim 0, evidence 3, reasoning 0
Explanations #56-#57: claim 0, evidence 3, reasoning 1
Explanation #58: claim 0, evidence 3, reasoning 2
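Because long runs of consecutive explanations received identical scores, Table 37 is grouped above. Expanding the runs back into per-explanation records recovers the unit 1 distributions (mostly 1 for claims, 3 for evidence, and 0 or 1 for reasoning). The snippet below is illustrative; the run data are transcribed from the table.

from collections import Counter

# (claim, evidence, reasoning) scores for runs of consecutive
# explanations, transcribed from Table 37 (unit 1, n = 58).
RUNS = [
    (2, (0, 3, 0)),   # explanations 1-2
    (20, (1, 3, 0)),  # 3-22
    (1, (1, 0, 1)),   # 23
    (24, (1, 3, 1)),  # 24-47
    (5, (1, 3, 2)),   # 48-52
    (3, (0, 3, 0)),   # 53-55
    (2, (0, 3, 1)),   # 56-57
    (1, (0, 3, 2)),   # 58
]

scores = [triple for count, triple in RUNS for _ in range(count)]
for i, name in enumerate(["claim", "evidence", "reasoning"]):
    print(name, Counter(s[i] for s in scores))
# claim: {1: 50, 0: 8}; evidence: {3: 57, 0: 1};
# reasoning: {1: 27, 0: 25, 2: 6}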
APPENDIX P
Summary of Coding for Water Quality Unit (Unit 2)

Table 38
Summary of Teacher Practices in Part 1 - Activity 1 (Unit 2)

Activity 1 - Claim: 2 (claim as a statement) ×3; 1 (reminding to create claims) ×2.
Claim Summary: Majority of the instances provided high-level support, resulting in high support for claims in activity 1.

Activity 1 - Evidence: 2 (discussing data) ×2; 1 (the need to include data) ×1.
Evidence Summary: Majority of the instances provided high-level support, resulting in high support for evidence in activity 1.

Activity 1 - Reasoning: 3 (scientific principles) ×5; 2 (connection with claim and evidence) ×2.
Reasoning Summary: All instances provided high-level support, resulting in high support for reasoning in activity 1.

Table 39
Summary of Teacher Practices in Part 1 - Activity 2 (Unit 2)

Activity 2 - Claim: 1 (reminding to create claims) ×9; 2 (claim as a statement) ×3.
Claim Summary: Majority of the instances provided low-level support, resulting in low support for claims in activity 2.

Activity 2 - Evidence: 2 (discussing data) ×8; 1 (the need to include data) ×5.
Evidence Summary: Majority of the instances provided high-level support, resulting in high support for evidence in activity 2.

Activity 2 - Reasoning: 3 (scientific principles) ×6; 2 (connection with claim and evidence) ×6; 1 (the need to add reasoning) ×1.
Reasoning Summary: Almost all instances provided high-level support, resulting in high support for reasoning in activity 2.

Table 40
Summary of Teacher Practices in Part 2 - Day 1 (Unit 2)

Day 1 - Claim: 1 (the need to include data) ×2.
Claim Summary: The teacher devoted almost 20 minutes to using mobile devices. In this process, she discussed the assignment and had very few discussions. Her discussions of claims only provided reminders, which resulted in low support for claims in day 1.

Day 1 - Evidence: 2 (discussing data) ×2.
Evidence Summary: Focused on discussing data, which resulted in high support for evidence in day 1.

Day 1 - Reasoning: 3 (scientific principles) ×4.
Reasoning Summary: All of Ms. Robinson's discussions emphasized scientific principles. This resulted in high support for reasoning in day 1.

Table 41
Summary of Teacher Practices in Part 2 - Day 2 (Unit 2)

Day 2 - Claim: 1 (reminding to create claims) ×7; 2 (claim as a statement) ×5.
Claim Summary: Seven instances focused on providing reminders; only five supported creating the claim as a statement. This resulted in low support for claims in day 2.

Day 2 - Evidence: 2 (discussing data) ×12; 1 (the need to include data) ×5.
Evidence Summary: Majority of the instances provided high-level support, resulting in high support for evidence in day 2.

Day 2 - Reasoning: 3 (scientific principles) ×4.
Reasoning Summary: Compared to claim and evidence, there were fewer instances focusing on reasoning. All of them focused on discussing scientific principles. This resulted in high support for reasoning in day 2.
Table 42
Summary of Teacher Practices in Part 2 - Day 3 (Unit 2)

Day 3 - Claim: 1 (reminding to create claims) ×5; 2 (claim as a statement) ×2.
Claim Summary: Majority of the instances focused on reminding the need for adding claims. This resulted in low support for claims in day 3.

Day 3 - Evidence: 2 (discussing data) ×24; 1 (the need to include data) ×3.
Evidence Summary: Almost all of the instances discussed data. This resulted in high support for evidence in day 3.

Day 3 - Reasoning: 3 (scientific principles) ×21; 2 (connection with claim and evidence) ×2.
Reasoning Summary: Almost all of the instances provided high-level support. This resulted in high support for reasoning in day 3.
APPENDIX Q
Quality of Students' Explanations in Unit 2

Table 43
Quality of Students' Explanations in Unit 2 (consecutive explanations with identical scores are grouped; the reasoning code 1a is reproduced as it appears in the original table)

Explanations #1-#6: claim 1, evidence 3, reasoning 0
Explanations #7-#18: claim 1, evidence 3, reasoning 1
Explanations #19-#51: claim 1, evidence 3, reasoning 1a
Explanations #52-#53: claim 1, evidence 3, reasoning 3
Explanation #54: claim 1, evidence 2, reasoning 3
Explanations #55-#95: claim 1, evidence 3, reasoning 3
Explanation #96: claim 0, evidence 3, reasoning 3
Explanations #97-#98: claim 1, evidence 3, reasoning 3
Explanation #99: claim 1, evidence 3, reasoning 1a
Explanations #100-#102: claim 1, evidence 3, reasoning 2
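Read against Table 37, this table quantifies the shift reported in the discussion: claim and evidence scores are essentially unchanged across the two units, while reasoning moves from mostly 0 or 1 to mostly 3. A small comparison sketch follows, with the tallies transcribed from the two tables; counting the 1a code as a level-1 score is my assumption, since the footnote defining it is not reproduced above.

# Reasoning-score tallies transcribed from Table 37 (unit 1, n = 58)
# and Table 43 (unit 2, n = 102); "1a" is counted with the level-1 scores.
REASONING = {
    "unit 1": {0: 25, 1: 27, 2: 6, 3: 0},
    "unit 2": {0: 6, 1: 46, 2: 3, 3: 47},
}

for unit, counts in REASONING.items():
    n = sum(counts.values())
    high = counts[2] + counts[3]  # reasoning scored 2 or higher
    print(f"{unit}: {high}/{n} explanations scored 2+ on reasoning "
          f"({100 * high / n:.0f}%)")
# unit 1: 6/58 (10%); unit 2: 50/102 (49%)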
REFERENCES

Achieve. (2013). Next Generation Science Standards. Washington, DC: National Research Council. Retrieved from http://www.nextgenscience.org/next-generation-science-standards

American Association for the Advancement of Science. (1993). Benchmarks for science literacy: A Project 2061 report. New York: Oxford University Press.

Baylor, A. L., & Ritchie, D. (2002). What factors facilitate teacher skill, teacher morale, and perceived student learning in technology-using classrooms? Computers & Education, 39(4), 395-414.

Boeije, H. (2002). A purposeful approach to the constant comparative method in the analysis of qualitative interviews. Quality and Quantity, 36(4), 391-409.

Borko, H. (2004). Professional development and teacher learning: Mapping the terrain. Educational Researcher, 33(8), 3-15.

Bybee, R. (2010). The teaching of science: 21st century perspectives. Arlington, VA: National Science Teachers Association Press.

Crawford, B. A., Zembal-Saul, C., Munford, D., & Friedrichsen, P. (2005). Confronting prospective teachers' ideas of evolution and scientific inquiry using technology and inquiry-based tasks. Journal of Research in Science Teaching, 42(6), 613-637.

Creswell, J. (2007). Qualitative inquiry and research design (2nd ed.). Thousand Oaks, CA: Sage.

Davis, E. A. (2003). Prompting middle school science students for productive reflection: Generic and directed prompts. The Journal of the Learning Sciences, 12(1), 91-142.

Desimone, L. M., Porter, A. C., Garet, M. S., Yoon, K. S., & Birman, B. F. (2002). Effects of professional development on teachers' instruction: Results from a three-year longitudinal study. Educational Evaluation and Policy Analysis, 24(2), 81-112.

Duschl, R. A., Schweingruber, H. A., & Shouse, A. W. (Eds.). (2007). Taking science to school: Learning and teaching science in grades K-8. Washington, DC: National Academies Press.

Eisenhardt, K. M. (1989). Building theories from case study research. Academy of Management Review, 14(4), 532-550.

Erduran, S., Simon, S., & Osborne, J. (2004). TAPping into argumentation: Developments in the application of Toulmin's argument pattern for studying science discourse. Science Education, 88(6), 915-933.

Ertmer, P. A., & Ottenbreit-Leftwich, A. T. (2010). Teacher technology change: How knowledge, confidence, beliefs, and culture intersect. Journal of Research on Technology in Education, 42(3), 255-284.

Fishman, B. J., Marx, R. W., Best, S., & Tal, R. T. (2003). Linking teacher and student learning to improve professional development in systemic reform. Teaching and Teacher Education, 19(6), 643-658.

Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., & Yoon, K. S. (2001). What makes professional development effective? Results from a national sample of teachers. American Educational Research Journal, 38(4), 915-945.

Gerard, L. F., Varma, K., Corliss, S. B., & Linn, M. C. (2011). Professional development for technology-enhanced inquiry science. Review of Educational Research, 81(3), 408-448.

Glesne, C. (2011). Becoming qualitative researchers: An introduction (4th ed.). Boston, MA: Pearson.

Gotwals, A. W., & Songer, N. B. (2010). Reasoning up and down a food chain: Using an assessment framework to investigate students' middle knowledge. Science Education, 94(2), 259-281.

Gotwals, A. W., Songer, N. B., & Bullard, L. (2012). Assessing students' progressing abilities to construct scientific explanations. In A. C. Alonzo & A. W. Gotwals (Eds.), Learning progressions in science: Current challenges and future directions. Rotterdam: Sense Publishers.

Gray, L., Thomas, N., & Lewis, L. (2010). Teachers' use of educational technology in U.S. public schools: 2009. First Look (NCES 2010-040). Washington, DC: National Center for Education Statistics.

Guskey, T. R. (2002). Professional development and teacher change. Teachers and Teaching: Theory and Practice, 8(3), 381-391.

Keen, J., & Packwood, T. (1995). Case study evaluation. BMJ: British Medical Journal, 311(7002), 444.

Krajcik, J., Blumenfeld, P. C., Marx, R. W., Bass, K. M., & Fredericks, J. (1998). Inquiry in project-based science classrooms: Initial attempts by middle school students. Journal of the Learning Sciences, 7, 317-337.

Krajcik, J. S., Blumenfeld, P. C., Marx, R. W., & Soloway, E. (1994). A collaborative model for helping middle grade science teachers learn project-based instruction. The Elementary School Journal, 94(5), 483-497.
Krajcik, J., Blumenfeld, P., Marx, R., & Soloway, E. (2000). Instructional, curricular, and technological supports for inquiry in science classrooms. In J. Minstrell & E. Van Zee (Eds.), Inquiring into inquiry learning and teaching in science (pp. 283-315). Washington, DC: American Association for the Advancement of Science.

Krajcik, J., McNeill, K. L., & Reiser, B. J. (2008). Learning-goals-driven design model: Developing curriculum materials that align with national standards and incorporate project-based pedagogy. Science Education, 92(1), 1-32.

Kuhn, D. (2010). Teaching and learning science as argument. Science Education, 94(5), 810-824.

Kuhn, A., McNally, B., Schmoll, S., Cahill, C., Lo, W. T., Quintana, C., & Delen, I. (2012, May). How students find, evaluate and utilize peer-collected annotated multimedia data in science inquiry with Zydeco. In Proceedings of the 2012 ACM Annual Conference on Human Factors in Computing Systems (pp. 3061-3070). New York: ACM.

Laru, J., Järvelä, S., & Clariana, R. (2012). Supporting collaborative inquiry during a biology field trip with mobile peer-to-peer tools for learning: A case study with K-12 learners. Interactive Learning Environments, 20(2), 103-117. doi:10.1080/10494821003771350

Linn, M. C., Davis, E. A., & Bell, P. (2004). Inquiry and technology. In M. C. Linn, E. A. Davis, & P. Bell (Eds.), Internet environments for science education (pp. 3-27). Mahwah, NJ: Lawrence Erlbaum.

Lizotte, D. J., McNeill, K. L., & Krajcik, J. (2004, June). Teacher practices that support students' construction of scientific explanations in middle school classrooms. In Proceedings of the 6th International Conference on Learning Sciences (pp. 310-317). International Society of the Learning Sciences.

Lo, W. T., Delen, I., Kuhn, A., Duck, J., McGee, S., Quintana, C., & Krajcik, J. (2013, April). Zydeco: A mobile-based inquiry learning system to support project-based learning. Paper presented at the annual meeting of the American Educational Research Association (AERA), April 27-May 1, 2013, San Francisco, CA.

Mahoney, G., O'Sullivan, P., & Dennebaum, J. (1990). Maternal perceptions of early intervention services: A scale for assessing family-focused intervention. Topics in Early Childhood Special Education, 10(1), 1-15.

Maldonado, H., & Pea, R. D. (2010, April). LET's GO! to the creek: Co-design of water quality inquiry using mobile science collaboratories. In Wireless, Mobile and Ubiquitous Technologies in Education (WMUTE) (pp. 81-87). IEEE.

Maxwell, J. (2005). Qualitative research design: An interactive approach (2nd ed.). Thousand Oaks, CA: Sage.

McCaffrey, M. (2011). Why mobile is a must. T.H.E. Journal, 38(2), 21-22.

McNeill, K. L. (2009). Teachers' use of curriculum to support students in writing scientific arguments to explain phenomena. Science Education, 93(2), 233-268.

McNeill, K. L. (2011). Elementary students' views of explanation, argumentation, and evidence, and their abilities to construct arguments over the school year. Journal of Research in Science Teaching, 48(7), 793-823.

McNeill, K. L., & Knight, A. M. (2013). Teachers' pedagogical content knowledge of scientific argumentation: The impact of professional development on K-12 teachers. Science Education, 97(6), 936-972.

McNeill, K. L., & Krajcik, J. (2007). Middle school students' use of appropriate and inappropriate evidence in writing scientific explanations. In M. Lovett & P. Shah (Eds.), Thinking with data: Proceedings of the 33rd Carnegie Symposium on Cognition (pp. 233-265). New York: Taylor & Francis.
McNeill, K. L., & Krajcik, J. (2008a). Scientific explanations: Characterizing and evaluating the effects of teachers' instructional practices on student learning. Journal of Research in Science Teaching, 45(1), 53-78.

McNeill, K. L., & Krajcik, J. (2008b). Inquiry and scientific explanations: Helping students use evidence and reasoning. In J. Luft, R. Bell, & J. Gess-Newsome (Eds.), Science as inquiry in the secondary setting (pp. 121-134). Arlington, VA: National Science Teachers Association Press.

McNeill, K. L., & Krajcik, J. (2009). Synergy between teacher practices and curricular scaffolds to support students in using domain-specific and domain-general knowledge in writing arguments to explain phenomena. The Journal of the Learning Sciences, 18(3), 416-460.

McNeill, K. L., & Krajcik, J. (2012). Supporting grade 5-8 students in constructing explanations in science: The claim, evidence and reasoning framework for talk and writing. New York: Pearson Allyn & Bacon.

McNeill, K. L., Lizotte, D. J., Krajcik, J., & Marx, R. W. (2006). Supporting students' construction of scientific explanations by fading scaffolds in instructional materials. The Journal of the Learning Sciences, 15(2), 153-191.

National Research Council. (1996). National Science Education Standards. Washington, DC: National Academy Press.

National Research Council. (2000). Inquiry and the National Science Education Standards: A guide for teaching and learning. Washington, DC: National Academy Press.

National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Committee on a Conceptual Framework for New K-12 Science Education Standards. Board on Science Education, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.

National Science Teachers Association. (1987). Criteria for excellence. An NSTA science compact (Report No. ISBN-0-87355-063-3). Retrieved from http://www.eric.ed.gov/PDFS/ED280739.pdf

Norris, C., Hossain, A., & Soloway, E. (2011). Using smartphones as essential tools for learning. Educational Technology, 18.

Norris, C. A., & Soloway, E. (2011). Learning and schooling in the age of mobilism. Educational Technology, 51(6), 3.

Novak, A. M., McNeill, K. L., & Krajcik, J. S. (2009). Helping students write scientific explanations. Science Scope, 33(1), 54-56.

Novak, A. M., & Treagust, D. F. (2013, April). Adjusting claims as new evidence emerges: Do students incorporate new information into their scientific explanations? Paper presented at the annual international conference of the National Association for Research in Science Teaching (NARST), April 5-9, 2013, Puerto Rico.

Osborne, J., Erduran, S., & Simon, S. (2004). Enhancing the quality of argumentation in school science. Journal of Research in Science Teaching, 41(10), 994-1020.

Pasnik, S., Strother, S., Schindel, J., Penuel, W. R., & Llorente, C. (2007). Review of research on media and young children's literacy. New York, NY and Menlo Park, CA: Education Development Center and SRI International.

Pellegrino, J. W., Chudowsky, N., & Glaser, R. (Eds.). (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academies Press.

Quintana, C. (2012). Pervasive science: Using mobile devices and the cloud to support science education. Interactions, 19(4), 76-80.

Quintana, C., Reiser, B. J., Davis, E. A., Krajcik, J., Fretz, E., Duncan, R. G., Kyza, E., Edelson, D., & Soloway, E. (2004). A scaffolding design framework for software to support science inquiry. The Journal of the Learning Sciences, 13(3), 337-386.
Reinking, D., & Bradley, B. A. (2008). Formative and design experiments: Approaches to language and literacy research. New York: Teachers College Press.

Reiser, B. J. (2004). Scaffolding complex learning: The mechanisms of structuring and problematizing. The Journal of the Learning Sciences, 13(3), 273-304.

Richmond, G., & Manokore, V. (2011). Identifying elements critical for functional and sustainable professional learning communities. Science Education, 95(3), 543-570.

Russell, M., Bebell, D., O'Dwyer, L., & O'Connor, K. (2003). Examining teacher technology use: Implications for preservice and inservice teacher preparation. Journal of Teacher Education, 54(4), 297-310.

Sandoval, W. A., & Millwood, K. A. (2005). The quality of students' use of evidence in written scientific explanations. Cognition and Instruction, 23(1), 23-55.

Sandoval, W. A., & Reiser, B. J. (2004). Explanation-driven inquiry: Integrating conceptual and epistemic supports for science inquiry. Science Education, 88, 345-372.

Sherin, M. G., & van Es, E. A. (2009). Effects of video club participation on teachers' professional vision. Journal of Teacher Education, 60(1), 20-37.

Simon, S., Erduran, S., & Osborne, J. (2006). Learning to teach argumentation: Research and development in the science classroom. International Journal of Science Education, 28(2-3), 235-260.

Singer, J., Marx, R. W., Krajcik, J., & Clay Chambers, J. (2000). Constructing extended inquiry projects: Curriculum materials for science education reform. Educational Psychologist, 35(3), 165-178.

Songer, N. B. (2006). BioKIDS: An animated conversation on the development of curricular activity structures for inquiry science. In R. K. Sawyer (Ed.), Cambridge handbook of the learning sciences (pp. 355-369). New York: Cambridge University Press.

Strauss, A., & Corbin, J. (1998). Basics of qualitative research (2nd ed.). Thousand Oaks, CA: Sage.

Tabak, I. (2004). Synergy: A complement to emerging patterns of distributed scaffolding. The Journal of the Learning Sciences, 13(3), 305-335.

Toulmin, S. (1958). The uses of argument. Cambridge, UK: Cambridge University Press.

U.S. Department of Education, Office of Educational Technology. (2004). Toward a new golden age in American education: How the Internet, the law and today's students are revolutionizing expectations. Washington, DC: U.S. Department of Education.

U.S. Department of Education, Office of Educational Technology. (2010). Transforming American education: Learning powered by technology. Washington, DC: U.S. Department of Education. Retrieved from http://www.ed.gov/sites/default/files/netp2010.pdf

Vidovich, M. R., Lautenschlager, N. T., Flicker, L., Clare, L., & Almeida, O. P. (2013). Treatment fidelity and acceptability of a cognition-focused intervention for older adults with mild cognitive impairment (MCI). International Psychogeriatrics, 25(5), 815-823.

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.

Wechsberg, W. M., Lam, W. K., Zule, W. A., & Bobashev, G. (2004). Efficacy of a woman-focused intervention to reduce HIV risk and increase self-sufficiency among African American crack abusers. American Journal of Public Health, 94(7), 1165-1173.

Williams, M., Montgomery, B. L., & Manokore, V. (2012). From phenotype to genotype: Exploring middle school students' understanding of genetic inheritance in a web-based environment. The American Biology Teacher, 74(1), 35-40.

Windschitl, M., & Sahl, K. (2002). Tracing teachers' use of technology in a laptop computer school: The interplay of teacher beliefs, social dynamics, and institutional culture. American Educational Research Journal, 39(1), 165-205.
(2002). Tracing teachers’ use of technology in a laptop computer school: The interplay of teacher beliefs, social dynamics, and institutional culture. American Educational Research Journal, 39(1), 165-205. Yin, R. K. (2014). Case study research: Design and methods (5th ed.). Thousand Oaks, CA: SAGE. Zembal, Saul, C. (2009). Learning to teach elementary school science as argument. Science Education, 93(4), 687-719. Zhang, B., Looi, C. K., Seow, P., Chia, G., Wong, L. H., Chen, W. & Norris, C. (2010). Deconstructing and reconstructing: Transforming primary science learning via a mobilized curriculum. Computers & Education, 55(4), 1504-1523. Zhang, M., & Quintana, C. (2012). Scaffolding strategies for supporting middle school students’ online inquiry processes. Computers & Education, 58(1), 181-196. Zhao, Y., & Bryant, F. L. (2006). Can teacher technology integration training alone lead to high levels of technology integration? A qualitative look at teachers’ technology integration after state mandated technology training. Electronic Journal for the Integration of Technology in Education, 5(1), 53-62.   203