TRACING FIFTH-GRADE STUDENTS’ EPISTEMOLOGIES IN MODELING THROUGH THEIR PARTICIPATION IN A MODEL-BASED CURRICULUM UNIT

By

Hamin Baek

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Curriculum, Instruction, and Teacher Education – Doctor of Philosophy

2013

ABSTRACT

TRACING FIFTH-GRADE STUDENTS’ EPISTEMOLOGIES IN MODELING THROUGH THEIR PARTICIPATION IN A MODEL-BASED CURRICULUM UNIT

By Hamin Baek

In the past decade, there has been a growing interest in scientific practices as a reform focus in K-12 science education in the United States. In this context, scientific practices refer to practices that have a family resemblance to scientists’ professional practices and are simultaneously pedagogically accessible and useful to students. In this study, I propose the development of students’ epistemic agency as an overarching goal for this practice-based approach to science learning. In particular, I argue that students’ epistemologies, one dimension of epistemic agency, should develop as a result of participating in practice-based science learning.

The research within this dissertation focuses on the practice of scientific modeling. There is a body of prior studies on students’ epistemological understandings about models and modeling, but none have examined how students’ epistemologies about modeling change over time and why they change the way they do. This research aims to contribute to that body of work by investigating how three elementary students’ epistemologies as deployed in their modeling practice, or their epistemologies in modeling (EIMs), changed over time as a class of fifth-grade students (N=24), their teacher, Mrs. M, and an intern teacher, Ms. H, enacted a model-based curriculum unit about evaporation and condensation, and the ways in which some of the curriculum events influenced those changes. To achieve these goals, I conducted a microgenetic analysis of the three focus students’ EIMs based on the models, utterances, and notes they produced in nine modeling activities during their curriculum enactment, and I analyzed ideas about modeling in the modeling-related curriculum events that preceded each modeling activity, using a coding scheme I developed from prior analytical frameworks and the data.

The analysis indicates that the students attended to three main model features: communicative features (e.g., labels, sentences, key, colors), microscopic/theoretical entities (e.g., water particles), and empirical data (e.g., percentage humidity), with varying epistemic ideas about modeling throughout the unit. The students began with nascent epistemic ideas that emphasized clarity and the inclusion of many details, but as they gained more experience with modeling, they developed more advanced epistemic ideas related to providing a scientific explanation (mechanism) and making a model accurate and persuasive. The curriculum materials, the teachers’ instruction and scaffolding, and students’ interactions played important roles in the development of the focus students’ EIMs.

These findings provide insights into elementary students’ epistemologies about modeling that can contribute to learning progression research for scientific modeling. First, this study suggests that, in developing a learning progression for modeling, we need to attend to the epistemic ideas that elementary students hold in common as a result of sharing a fairly homogeneous, historically established sociocultural world.
Second, by showing an intermediary state that elementary students passed through as they developed their epistemologies about modeling, this study provides insight into the trajectory or mechanism by which students’ epistemologies about modeling become increasingly sophisticated. I hope that this work contributes to the larger effort to help students become more active and capable epistemic agents by learning science through engaging in scientific practices, both for their present science learning and for their future lives as citizens in societies that will be increasingly populated with complicated, controversial socioscientific issues.

To the Triune God, the source of life, strength, and hope

ACKNOWLEDGEMENTS

I want to express my special gratitude to my advisor, Dr. Christina Schwarz. Without the support and encouragement she has given me along this challenging journey, I would not have completed this dissertation. I am also grateful to Dr. Charles (Andy) Anderson, Dr. Angela Calabrese-Barton, and Dr. Mary Juzwik for their guidance and encouragement. Many thanks to the teachers and students I worked with in the MoDeLS project for their invaluable time and cooperation in this study. In addition, I am thankful to my colleagues, Hayat Hokayem, Li Zhan, Jing Chen, and Mete Akcaoglu, for their work, jokes, and kind words. I also thank my brothers and sisters in Christ at Lansing Korean United Methodist Church, including Rev. Borin Cho, Koom Jang, Eunil Lee, Oksoon Kim, Jungjin Kang, Youngmi Bae, and Haewon Jang. Their love cannot be overemphasized. I am also grateful to Changwan Cho, my former colleague at Ansan Dongsan Christian High School and brother in Christ, whose consistent support, friendship, and prayer for our family meant a lot to me. I am always indebted to my parents and parents-in-law for their unending love and trust. I know how much they missed our family during the years we were in the United States. Eubin and Eujin, my dearest son and daughter, have been a source of love and joy, and a reminder of things more important than work. Finally, I want to express my special and deep gratitude to my better half, Mikyung Kim, who has supported me in immeasurable ways and has shared joy, hope, stresses, and uncertainty with me along this otherwise lonely journey. Thank you all!

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
CHAPTER 1. INTRODUCTION
1.1. Background
1.1.1. A practice turn in K-12 science education
1.1.1.1. Key concepts and arguments about a practice-based approach
1.1.1.2. Research on students’ engagement in scientific practices
1.1.1.3. Issues and challenges in the practice-based approach
1.1.1.4. A general goal of practice-based science education: To foster epistemic agency
1.1.2. Research on students’ epistemologies about models and modeling
1.2. Goal and Research Questions
1.3. Significance of the Study
1.4. Overview of the Dissertation
CHAPTER 2. CONCEPTUAL UNDERPINNINGS
2.1. Prior Work on Scientific Models and Modeling in Education
2.1.1. Research on scientific models and modeling in education
2.1.2. Research on students’ understandings of models/modeling
2.1.2.1. Research on students’ understandings of models/modeling without interventions
2.1.2.2. Research on the impact of interventions on students’ understanding of models/modeling
2.1.2.3. Research on a learning progression for scientific modeling
2.2. Key Terms and Constructs
2.2.1. Scientific model
2.2.2. Scientific modeling
2.2.3. Epistemologies in modeling
2.2.4. Ideas about modeling
CHAPTER 3. METHODS
3.1. Context: A Research Project Modeling Designs for Learning Science (MoDeLS)
3.2. Research Site and Participants
3.3. A Model-based Curriculum Unit of Evaporation and Condensation
3.3.1. Model-based instructional sequence
3.3.2. A model-based curriculum unit of evaporation and condensation
3.4. Data Collection and Data Sources
3.5. Data Analysis
3.5.1. Microgenetic analysis
3.5.2. Preparation of data files
3.5.3. Development of a coding scheme for analyzing students’ EIMs and ideas about modeling present in curriculum events
3.5.3.1. CONTENT
3.5.3.2. EXPLANATION
3.5.3.3. ACCURACY
3.5.3.4. COMMUNICATION
3.5.4. Analysis of three students’ EIMs
3.5.5. Analysis of curriculum events
CHAPTER 4. ANALYSES OF IDEAS ABOUT MODELING FROM CURRICULUM EVENTS AND OF THREE STUDENTS’ EPISTEMOLOGIES IN MODELING OVER TIME
4.1. Overview of How Mrs. M’s Class Enacted the Model-Based Unit of Evaporation and Condensation
4.2. Tracing Three Students’ EIMs and Ideas about Modeling in Curriculum Events over Time
4.2.1. The activity of constructing an initial model of evaporation (M1) and its preceding curriculum events
4.2.1.1. Introducing scientific models / Instructing how to construct a model (Class 2)
4.2.1.2. The focus students’ construction of their initial models of evaporation (Class 2)
4.2.2. The activity of constructing a second model of evaporation (by evaluating and revising the prior model) (M2) and its preceding curriculum events
4.2.2.1. Introducing empirical evidence as a criterion for evaluating models (Class 3)
4.2.2.2. Linking empirical evidence to models during empirical investigations on evaporation (Class 3~Class 5)
4.2.2.3. “What do you now understand about evaporation as a result of doing empirical investigations?” (Class 6)
4.2.2.4. The focus students’ construction of their second models of evaporation (Class 6~Class 7)
4.2.3. The activity of evaluating one another’s second models of evaporation (M3) and its preceding curriculum event
4.2.3.1. Instructing how to evaluate one another’s models (Class 7)
4.2.3.2. The focus students’ evaluations of others’ second models of evaporation (Class 7~Class 8)
4.2.4. The activity of constructing a third model of evaporation (by evaluating and revising the prior model) (M4) and its preceding curriculum events
4.2.4.1. Computer simulations about state changes / Students’ collective performance as water molecules in the playground (Class 9)
4.2.4.2. The focus students’ construction of their third models of evaporation (Class 10)
4.2.5. The activity of constructing a group consensus model of evaporation (M5) and its preceding curriculum event
4.2.5.1. Introducing a consensus model and guiding how to construct it (Class 10~Class 11)
4.2.5.2. The focus students’ construction of their group consensus model of evaporation (Class 11)
4.2.6. The activity of evaluating each group’s consensus model of evaporation
4.2.7. The activity of constructing an initial model of condensation (M6) and its preceding curriculum event
4.2.7.1. “How did water drops appear on the surface of a cold bottle?” (Class 14)
4.2.7.2. The focus students’ construction of their initial models of condensation (Class 15~Class 16)
4.2.8. The activity of constructing a second model of condensation (by evaluating and revising the prior model) (M7) and its preceding curriculum events
4.2.8.1. Linking empirical evidence to models during empirical investigations about condensation (Class 17~Class 19)
4.2.8.2. Introducing and guiding the activity of constructing a second model of condensation (Class 19)
4.2.8.3. The focus students’ construction of their second models of condensation (Class 19)
4.2.9. The activity of constructing a group consensus model of condensation (M8) and its preceding curriculum events
4.2.9.1. Guiding the activities of evaluating peers’ second models of condensation and of constructing consensus models of condensation (Class 20)
4.2.9.2. Evaluating others’ second models of condensation (Class 20)
4.2.9.3. The focus students’ construction of their group consensus model of condensation (Class 21)
4.2.10. The activity of evaluating other groups’ consensus models of condensation (M9)
4.2.10.1. The focus students’ evaluations of other groups’ consensus models of condensation (Class 22)
4.3. Summary
4.3.1. How did the focus students’ EIMs change over time?
4.3.1.1. CONTENT
4.3.1.2. EXPLANATION
4.3.1.3. ACCURACY
4.3.1.4. COMMUNICATION
4.3.2. In what ways did some of the curriculum events influence the change of the focus students’ EIMs?
4.3.2.1. Communicative features
4.3.2.2. Microscopic/theoretical entities
4.3.2.3. Empirical data
CHAPTER 5. DISCUSSION AND CONCLUSION
5.1. Discussion
5.1.1. Did the focus students’ EIMs develop over time?
5.1.2. Roles of the curriculum and instruction
5.2. Contributions, limitations, and suggestions for future work
5.3. A concluding remark
APPENDICES
APPENDIX A. Additional Analysis of Ideas about Modeling from Curriculum Events
A.1. “What (do you think) are scientific models?” (Before the unit began)
A.2. Explaining evaporation (Class 3)
A.3. “What makes a good model?” (Class 6, Class 7)
A.4. Guiding the activity of constructing a third model of evaporation (Class 10)
A.5. The class conversations about the changes they made to their previous models of evaporation and about criteria for constructing a consensus model (Class 10)
A.6. The class evaluations of each group’s consensus model of evaporation (Class 12~Class 14)
APPENDIX B. Summaries of how the focus students’ EIMs changed over time
B.1. A summary of how Brian’s EIM changed over time (Figure 4.29)
B.2. A summary of how Joon’s EIM changed over time (Figure 4.30)
B.3. A summary of how Mana’s EIM changed over time (Figure 4.31)
REFERENCES

LIST OF TABLES

Table 3.1. Student demographics at the research site in the 2008-2009 school year
Table 3.2. A coding scheme for analyzing students’ EIMs in the category of CONTENT
Table 3.3. A coding scheme for analyzing students’ EIMs in the category of EXPLANATION
Table 3.4. A coding scheme for analyzing students’ EIMs in the category of ACCURACY
Table 3.5. A coding scheme for analyzing students’ EIMs in the category of COMMUNICATION
Table 4.1. Overview of Mrs. M’s class’s enactment of the unit
Table 4.2. Analysis of the ideas about modeling from the passages from the curriculum materials that introduce scientific models and instruct how to construct a model
Table 4.3. Analysis of Brian’s EIM in constructing his initial model of evaporation
Table 4.4. Analysis of Joon’s EIM in constructing his initial model of evaporation
Table 4.5. Analysis of Mana’s EIM in constructing her initial model of evaporation
Table 4.6. Analysis of the ideas about modeling from the curriculum event of introducing empirical evidence as a criterion for evaluating models
Table 4.7. Analysis of the ideas about modeling from the curriculum material and Mrs. M’s scaffolding that tried to link empirical evidence to models
Table 4.8. Analysis of the ideas about modeling from the conversation of “What do you now understand about evaporation as a result of doing empirical investigations?”
Table 4.9. Analysis of Brian’s EIM in constructing his second model of evaporation
Table 4.10. Analysis of Joon’s EIM in constructing his second model of evaporation
Table 4.11. Analysis of Mana’s EIM in constructing her second model of evaporation
Table 4.12. Analysis of the ideas about modeling from Mrs. M’s instruction of how to evaluate one another’s models
Table 4.13. The participation structure and order in which the focus students evaluated others’ second models of evaporation (Class 8)
Table 4.14. Brian’s evaluations of the other focus students’ second models of evaporation
Table 4.15. Analysis of Brian’s EIM in evaluating the other focus students’ second models of evaporation
Table 4.16. Joon’s evaluations of the other focus students’ second models of evaporation
Table 4.17. Analysis of Joon’s EIM in evaluating the other focus students’ second models of evaporation
Table 4.18. Mana’s evaluations of the other focus students’ second models of evaporation
Table 4.19. Analysis of Mana’s EIM in evaluating the other focus students’ second models of evaporation
Table 4.20. Analysis of the ideas about modeling from the computer simulations about state changes and students’ collective performance as water molecules in the playground
Table 4.21. Analysis of Brian’s EIM in constructing his third model of evaporation
Table 4.22. Analysis of Joon’s EIM in constructing his third model of evaporation
Table 4.23. Analysis of Mana’s EIM in constructing her third model of evaporation
Table 4.24. Analysis of the ideas about modeling from Ms. H’s and Mrs. M’s utterances guiding the activity of constructing a consensus model
Table 4.25. The focus students’ conversations in constructing their consensus model of evaporation (Class 11)
Table 4.26. Analysis of the focus students’ group EIM in constructing their group consensus model of evaporation
Table 4.27. Analysis of the ideas about modeling from the class conversation about “specific evidence” in the focus students’ group consensus model of evaporation
Table 4.28. Analysis of the ideas about modeling from Mrs. M and three students’ performance of water particles collecting on a cold bottle and Mrs. M’s ensuing comments
Table 4.29. Analysis of Brian’s EIM in constructing his initial model of condensation
Table 4.30. Analysis of Joon’s EIM in constructing his initial model of condensation
Table 4.31. Analysis of Mana’s EIM in constructing her initial model of condensation
Table 4.32. Analysis of the ideas about modeling from Mrs. M’s scaffolding and handout given to help students link empirical evidence to their models
Table 4.33. Analysis of the ideas about modeling from the passages in the student notebook, Ms. H’s comments, and Mrs. M’s comments that introduce and guide the activity of constructing a second model of condensation
Table 4.34. Analysis of Brian’s EIM in constructing his second model of condensation
Table 4.35. Analysis of Joon’s EIM in constructing his second model of condensation
Table 4.36. Analysis of Mana’s EIM in constructing her second model of condensation
Table 4.37. Analysis of the ideas about modeling from the curriculum event of introducing empirical evidence as a criterion for evaluating models
Table 4.38. The focus students’ conversation in constructing their consensus model of condensation (Class 21)
Table 4.39. Analysis of the focus students’ group EIM in constructing their group consensus model of condensation
Table 4.40. Brian’s evaluations of other groups’ consensus models of condensation
Table 4.41. Analysis of Brian’s EIM from his evaluations of other groups’ consensus models of condensation
Table 4.42. Joon’s evaluations of other groups’ consensus models of condensation
Table 4.43. Analysis of Joon’s EIM from his evaluations of other groups’ consensus models of condensation
Table 4.44. Mana’s evaluations of other groups’ consensus models of condensation
Table 4.45. Analysis of Mana’s EIM from her evaluations of other groups’ consensus models of condensation
Table 4.46. Main model features that the focus students attended to and influential curriculum events

LIST OF FIGURES

Figure 3.1. A model-based instructional sequence
Figure 3.2. A model-based curriculum unit of evaporation and condensation
Figure 4.1. Brian’s initial model of evaporation
Figure 4.2. Joon’s initial model of evaporation
Figure 4.3. Mana’s initial model of evaporation
Figure 4.4. Analyses of the focus students’ EIMs in constructing their initial models of evaporation. For interpretation of the references to color in this and all other figures, the reader is referred to the electronic version of this dissertation.
Figure 4.5. Brian’s second model of evaporation
Figure 4.6. Joon’s second model of evaporation
Figure 4.7. Mana’s second model of evaporation
Figure 4.8. Analyses of the focus students’ EIMs in constructing their second models of evaporation
Figure 4.9. Analyses of the focus students’ EIMs in evaluating others’ second models of evaporation
Figure 4.10. A computer simulation about phase change (The Concord Consortium, 2013)
Figure 4.11. Brian’s third model of evaporation
Figure 4.12. Joon’s third model of evaporation
Figure 4.13. Mana’s third model of evaporation
Figure 4.14. Analyses of the focus students’ EIMs in constructing their third models of evaporation
Figure 4.15. The focus students’ group consensus model of evaporation
Figure 4.16. Analysis of the focus students’ group EIM in constructing their group consensus model of evaporation
Figure 4.17. Mrs. M and three students’ performance of water particles collecting on a cold bottle (Class 14)
Figure 4.18. Brian’s initial model of condensation
Figure 4.19. Joon’s initial model of condensation
Figure 4.20. Mana’s initial model of condensation
Figure 4.21. Analyses of the focus students’ EIMs in constructing their initial models of condensation
Figure 4.22. Brian’s second model of condensation
Figure 4.23. Joon’s second model of condensation
Figure 4.24. Mana’s second model of condensation
Figure 4.25. Analyses of the focus students’ EIMs in constructing their second models of condensation
Figure 4.26. The focus students’ group consensus model of condensation
Figure 4.27. Analysis of the focus students’ group EIM in constructing their group consensus model of condensation
Figure 4.28. Analyses of the focus students’ EIMs in evaluating other groups’ consensus models of condensation (Class 22)
Figure 4.29. The change of Brian’s EIM over time
Figure 4.30. The change of Joon’s EIM over time
Figure 4.31. The change of Mana’s EIM over time
Figure 5.1. Development of the focus students’ EIMs in a learning progression framework for scientific modeling (frame of modeling)

CHAPTER 1. INTRODUCTION

Within the past decade, reform efforts in K-12 science education in the United States have focused on engaging students in scientific practices (Duschl, Schweingruber, & Shouse, 2007; National Research Council, 2012). At the same time, there is a growing body of research that focuses on better understanding or supporting students’ engagement in scientific practices. However, because of the recency of this reform emphasis, the practice-based approach to science teaching and learning still needs further conceptual refinement and empirical support if it is to benefit science learners and teachers. In particular, the education community needs a better understanding of how and why such scientific practices can develop over time and be supported if they are to be advocated successfully. The present study offers empirical work that contributes to the development of this new approach to K-12 science education. More specifically, this study aims to investigate how elementary students’ epistemologies, as deployed in their engagement in the practice of scientific modeling, changed over time as they implemented a model-based curriculum unit. The focus on students’ epistemologies is important in determining how practices develop and can be supported, because those epistemic aspects are what guide students’ meaning-making and engagement in the practices over time. The goal of my research is to better understand how students’ epistemologies develop with a model-based unit in order to inform students’ engagement in scientific practices (scientific modeling) and the reform-based practice efforts in general.

To address these goals, this chapter begins with an overview of the reform movement as a general background of the present study. I then review the literature on students’ epistemological understandings of models and modeling that has been conducted with or without relation to the recent emphasis on scientific practices. In the final section of this chapter, I outline the goal, research questions, and significance of this study.

1.1. Background

1.1.1. A practice turn in K-12 science education

Traditional ways of teaching and learning science in K-12 schools have long been the target of criticisms and reform efforts. Throughout history, educational reformers have proposed alternative approaches to science education with different emphases drawn from various theoretical traditions. Within the past decade, a new reform agenda has emerged that focuses on engaging science learners in scientific practices that have a family resemblance to professional scientists’ practices and are at the same time pedagogically accessible and useful (Duschl et al., 2007). This practice-based approach to science education has since drawn increasing attention from a range of stakeholders in the science education community. In a recent K-12 science education standards document, scientific practices, along with engineering practices, were explicitly highlighted as one of the major foci (National Research Council, 2012).

Several factors have been involved in the emergence of this reform agenda. First, the previous reform agenda, centered on scientific inquiry, has been losing its impetus.
One reason has to do with the lack of consensus on the key constituents of the process of scientific inquiry (National Research Council, 2012). Some noted that although inquiry had been a central goal of scientific literacy in the 1996 national science education standards (National Research Council, 1996), it had not been specifically defined, leading to divergent operationalizations (Abd-El-Khalick et al., 2004; Barrow, 2006). In addition, others were concerned about its scant empirical grounds for success. Kirschner et al. (2006), for example, showed that there is little empirical evidence supporting the benefits of an inquiry-based approach with minimal guidance. In this milieu, science educators looked for a better reform focus. It was the scholarship of historians, philosophers, sociologists, and anthropologists of science—or “science studies” in general—that provided that new focus. Despite the variety in their conceptual and methodological approaches to science as the subject of their investigation, they generally agreed on a portrait of science as a human activity: unlike the common, idealized (or misrepresented) image of science that overemphasizes the rationality and linearity of the process of producing scientific knowledge, these students of science showed that science is a collective human activity consisting of multiple distinctive yet interconnected practices that in turn entail complex, dialectical interactions between human agents, ideational and material tools, disciplinary norms, and the larger society (Latour & Woolgar, 1979; Pickering, 1995). This new image of science provided science education reformers with a new approach to science learning and teaching. To them, helping students engage in processes close to what scientists do to produce scientific knowledge was certainly superior to the traditional approach.

Still another factor that helped scientific practices become attractive to the science education community was the advancement of sociocultural theories of cognition and learning in the 1980s and 1990s. Dissatisfied with the then-dominant educational psychology and learning theory that focused exclusively on what is going on in the individual’s mind, an assemblage of educational psychologists and learning theorists came up with alternative theories and perspectives. Although they leveraged various frameworks and constructs, including “situated cognition” (Brown, Collins, & Duguid, 1989; Greeno, 1998), “distributed cognition” (Hutchins, 1991; Salomon, 1993), “apprenticeship” (Collins, Brown, & Holum, 1991; Rogoff, 1990), “communities of practice” (Lave & Wenger, 1991; Wenger, 1998), “activity system” (Engeström, 1987), and “figured world” (Holland, Lachicotte, Skinner, & Cain, 1998), they all conceived of human cognition and learning as essentially inseparable from such features as symbolic, ideational, and material tools and interpersonal interactions. These approaches offered science educators lenses for seeing students’ participation in the practices of science as science learning, not just their internalization of scientific knowledge and facts in abstract form.

1.1.1.1. Key concepts and arguments about a practice-based approach

What are the key concepts and arguments of the practice-based approach? To begin with, the construct of practice, while touched on above, should be presented in more detail. Perhaps the most common misconception of practice is to see it in contrast with theory, thinking, or reasoning.
This is understandable considering the lasting and pervasive dualism between mind and body, or between theory and practice, in Western thought. The idea of practice embraced in the new science education approach is far more comprehensive. Although the term practice appears to foreground its performative dimension, it entails other dimensions as well: participants’ identities and cognitions, technological and ideational (semiotic, symbolic) tools, social norms, division of roles, social interactions, and a community of practice (Engeström, 1987; Holland et al., 1998; Wenger, 1998). One educational significance of the multidimensionality of practice is that as learners engage in a practice, their knowledge and skills coordinate and develop dialectically.

Scientific practice is a reasonably distinctive practice that scientists frequently engage in so as to investigate their targets and produce scientific knowledge. A Framework for K-12 Science Education (National Research Council, 2012) identifies eight scientific practices as key elements in the K-12 science curriculum:

1. Asking questions
2. Developing and using models
3. Planning and carrying out investigations
4. Analyzing and interpreting data
5. Using mathematics and computational thinking
6. Constructing explanations
7. Engaging in argument from evidence
8. Obtaining, evaluating, and communicating information

One thing to note about these scientific practices is that scientists conduct them not in a linear, sequential way but in an integrated and iterative manner. As needs arise, they do several practices at the same time or revisit the same practice multiple times. Therefore, scientific practices should be viewed as components that dynamically constitute scientific inquiry rather than as steps to be taken to conduct scientific inquiry.

Proponents of practice-based science education point to multiple benefits and potential in this approach. First, it is argued that engaging students in scientific practices helps them to understand both the scientific epistemic process and crucial scientific knowledge better than a traditional approach does. It helps them move from memorizing and understanding scientific knowledge superficially and out of context to understanding the process in which scientific knowledge is constructed, communicated, evaluated, and developed. As they go through this whole process, they may additionally appreciate the fact that there is a wide range of approaches to investigating natural phenomena besides the “scientific method.” Furthermore, they can grasp crosscutting concepts and ideas of science better in this approach than in a conventional one (National Research Council, 2012).

Second, some science educators maintain that this approach has more potential to benefit nonmainstream students than a traditional approach, on the basis of findings that there is some congruence between the cultural and linguistic practices that students bring from their homes and communities and some scientific practices. Since the late 1980s, researchers of the Chèche Konnen Project have investigated how low-income students from historically marginalized ethnic backgrounds (African American, Haitian, and Latino) talk and learn science.
They have found evidence that there is some continuity and complementarity between these students’ everyday ways of knowing and talking and scientific epistemology and discourse (Ballenger, 1997; Rosebery, Warren, & Conant, 1992; Warren, Ballenger, Ogonowski, Rosebery, & Hudicourt-Barnes, 2001). These findings suggest that some aspects of the practice-based approach may contribute to educational equity.

1.1.1.2. Research on students’ engagement in scientific practices

As scientific practices became a new reform focus and gained prominence, a number of science education researchers began to investigate them. Some of the scientific practices that have been studied are designing and conducting empirical investigations (Crawford, Krajcik, & Marx, 1999; Metz, 2004), scientific explanation (McNeill, Lizotte, Krajcik, & Marx, 2006; Moje et al., 2004), argumentation (Berland & Reiser, 2009; Jiménez-Aleixandre, Rodríguez, & Duschl, 2000), and modeling (Lehrer & Schauble, 2004; Schwarz et al., 2009; B. Y. White & Frederiksen, 1998).

Methodologically, these researchers deployed different approaches and strategies to examine student engagement in scientific practices. Some researchers made no intervention regarding instruction and investigated a classroom community’s discourses and practices as they occurred in their usual settings (Jiménez-Aleixandre et al., 2000). Others, on the other hand, set up a “practice field” (Barab & Duffy, 2000)—a learning environment that simulates scientists’ social-professional space—in the classroom by introducing some features of a scientific practice and investigated how students participated in that practice and learned scientific content. Particular strategies used in designing such learning environments include providing particular types of scaffolds (McNeill et al., 2006), assigning students intellectual roles (Herrenkohl & Guerra, 1998), and having them engage with computer software (B. Y. White & Frederiksen, 1998). Still others provided students opportunities to experience scientific practices by having them contact professional scientists in authentic contexts (O'Neill, 2001; Schwartz, Lederman, & Crawford, 2004).

These studies generated mixed findings. On the positive side, they showed that students are capable of participating in scientific practices successfully under proper conditions. For example, Metz’s study (2004) suggests that even early elementary students can design and conduct empirical investigations with awareness of some of the uncertainties inherent in the process. Engle and Conant (2002) also report that fifth graders were able to generate questions, develop arguments, and use evidence in scholarly ways when they engaged in a controversy over a species’ classification. In contrast to these results, other studies revealed that differences and tensions between preexisting classroom norms and scientists’ disciplinary norms made students’ engagement in scientific practices challenging. Hogan and Corey (2001), for instance, provided a vignette showing that fifth graders’ evaluations of one another’s ideas turned into negative criticisms.
Berland and Reiser (2009) also found that, in their engagement in scientific explanation and argumentation, middle school students were consistently able to use evidence to make sense of their target phenomena and articulate their understandings, whereas they paid inconsistent attention to persuading others of their understandings, because persuasion requires social interactions that are very different from traditional classroom interactions.

Some researchers explored the conditions under which science learners can engage in scientific practices meaningfully and productively. For example, Engle and Conant (2002) synthesized the findings of previous studies to identify four characteristics of a learning environment that support students’ “productive disciplinary engagement,” a concept they constructed to refer to student engagement that leads to students’ progression in scientific practices: such learning environments problematize content, give students authority, hold them accountable to others and to disciplinary norms, and provide them with resources.

1.1.1.3. Issues and challenges in the practice-based approach

While proponents have highlighted much potential in the practice-based approach, there are some challenges and critiques to be attended to. First, some point to challenges involved in engaging students in scientific practices, particularly due to the gaps between the culture of preexisting school science and the culture of the scientific community. Berland (2008), for example, reviewed the literature to identify two kinds of challenges—epistemic and social—in fostering scientific argumentation in middle school classrooms. She argued that differences between the scientific community and classrooms in epistemic commitments, criteria for constructing and evaluating knowledge, and interaction patterns may make students’ full engagement in scientific argumentation difficult. Any attempt to help science learners participate in scientific practices, therefore, needs to address challenges that arise from cultural dissonances between the two sociocultural spaces.

Second, there are cautions about the fact that various efforts to promote this reform agenda have relied more often on rhetorical means than on empirical evidence. Ford and Forman (2006) stated, “Although this mandate to incorporate authentic disciplinary practices into classroom instruction has become a standard rhetorical move in journal articles and policy documents, it is rarely elaborated or adequately supported by evidence about those practices” (p. 1). The brief review presented above supports their observation. But given the short span in which the practice-based approach to science education emerged and started to gain impetus, this relatively weak empirical grounding is hardly surprising. It is, however, probable that ardent vision and rhetoric alone, without a solid empirical foundation, cannot sustain the present movement.

Finally, some theorists level more fundamental criticisms at the goals embraced by this reform agenda. In particular, they commonly problematize the notion of authenticity in authentic scientific practices (Roth & Calabrese-Barton, 2004; Sherman, 2004; van Eijck, Hsu, & Roth, 2009).
Consistent with a postcolonial perspective, some scholars criticize the goal of scientific literacy associated with the emphasis on authentic scientific practices for often emphasizing “pushing students into the world of scientists rather than a way of helping them cope with their own life worlds” (Roth & Calabrese-Barton, 2004, p. 22). Others challenge the assumption underlying the reform efforts to engage students in scientific practices: the dichotomous idea that school science is an inauthentic human activity whereas professional science is an authentic one. Sherman (2004) argues that this idea comes from a mistaken understanding that sees authenticity in human activities as a conditional concept. She further argues that because every human activity is locally organized and contingently constructed, and thus authentic, we should not look to professional scientists’ practices as a standard for school science but study school science as a version of science in its own right. These concerns and issues should not be taken as reasons to give up the vision of practice-based science education. Rather, they should be seen as relevant problems to be addressed in the process of developing and specifying the practice-based approach.

1.1.1.4. A general goal of practice-based science education: To foster epistemic agency

Considering the potential benefits and challenges of the practice-based approach outlined above, I use this part of the chapter to articulate a general goal for practice-based science education: by engaging in scientific practices, students should become better epistemic agents, that is, develop more advanced epistemic agency. Although the term epistemic agency was introduced to the community of educational researchers in a more general context and without direct relation to scientific practices (Scardamalia, 2000), I think that it still offers a useful goal for the practice-based approach to science education.

I start with Scardamalia’s (2000) argument that students should be prepared to be active members of a knowledge society. To that end, acquiring and understanding some sets of concepts is not enough. Because knowledge is dynamically and rapidly generated, distributed, changed, and replaced in a knowledge society, students need more powerful and general kinds of knowledge, tools, and habits than a body of particular information. Therefore, in schools, they need to participate in and understand the various processes in which knowledge is produced, communicated, and used, and, in so doing, to acquire the knowledge and skills necessary to engage actively and meaningfully in these processes. I believe this general learning goal can be succinctly captured by epistemic agency. Building on prior work on epistemic agency (Damşa, Kirschner, Andriessen, Erkens, & Sins, 2010; Reed, 2001; Scardamalia & Bereiter, 2006), I define epistemic agency as the knowledge, skills, and dispositions individuals need to plan and conduct an action, individually or collectively, to construct, communicate, develop, and employ knowledge. Damşa and colleagues (2010) provided a somewhat detailed, albeit not exhaustive, picture of epistemic agency. Reviewing several studies focused on actions related to epistemic agency, they identified several common categories of such actions, which they placed under two broader groups. The first group (“knowledge-related”) includes such actions as searching for information, sharing ideas, structuring ideas, and producing ideas.
Under the second group (“process-related”), they put three kinds of actions: projective (e.g., setting and pursuing goals), regulative (e.g., monitoring and coordinating collaborative efforts), and relational (e.g., negotiating future courses of action). In their portrait of epistemic agency, we can note that to carry out these various actions, participants need proper knowledge, epistemology, metacognition, skills, motives, and intentions. But, at the same time, as they conduct these actions, they likely develop their epistemic agency along these dimensions.

Developing epistemic agency as a goal is very well aligned with, though not limited to, the practice-based approach. Advocates of this approach argue, as mentioned earlier, that scientific practices provide a learning environment in which students can experience and understand the whole process of creating and developing scientific knowledge and, in the process, can acquire the epistemology and skills needed for future participation in scientific practices. At the same time, this goal does not limit students’ development to the domain of science; it encourages students to critically and creatively appropriate what they learn from their participation in scientific practices, whether it be knowledge, skills, or dispositions, for other epistemic practices in their present and future lifeworlds.

One particular aspect of epistemic agency is epistemology. Here, I use epistemology as an overarching term that refers to explicit knowledge and tacit beliefs about a variety of interrelated topics of knowledge and knowing, such as the nature, structure, sources, and justification of knowledge (cf. Chinn, Buckland, & Samarapungavan, 2011). Various terms that have been proposed to signify it include “personal epistemology” (Hofer & Pintrich, 2002), “epistemological beliefs” (Schommer, 1990), “folk epistemology” (Kitchener, 2002), and “epistemic cognition” (Greene, Azevedo, & Torney-Purta, 2008).

Recently, some researchers have paid attention to students’ epistemologies as deployed in their inquiry practices or scientific practices. For example, Sandoval (2005) proposed focusing on students’ epistemologies employed in their inquiry practices, or what he called “practical epistemologies.” Researchers working in the project Supporting Scientific Practices in Elementary and Middle School Classrooms (henceforth SciPractices) have likewise focused on elementary and middle school students’ epistemologies that guide their engagement in scientific modeling, explanation, and argumentation. To better capture the embeddedness of students’ epistemologies in their practice, these researchers recently coined the term “epistemologies in practice” (Berland, Schwarz, Kenyon, & Reiser, 2013). Consistent with these efforts, the present study targets, as its primary object of investigation, students’ epistemologies that guide their scientific modeling. As an important goal of practice-based science teaching and learning, I argue that students’ epistemologies that guide their scientific practices should become more sophisticated and productive as a result of participating in scientific practices. With this goal in mind, in this study I pay particular attention to how elementary school students’ epistemologies as used in modeling developed over time as a result of participating in scientific modeling.
1.1.2. Research on students’ epistemologies about models and modeling

While scientific modeling as a core scientific practice has received renewed attention in the recent practice-based reform movement, scientific models and modeling have been a consistent focus for a group of science education researchers for more than two decades, independent of the reform agenda (Gilbert, 1991; Halloun & Hestenes, 1987; Harrison & Treagust, 2000; Lehrer & Schauble, 2006; Mellar, Bliss, Boohan, Ogborn, & Tompsett, 1994; B. Y. White & Frederiksen, 1998). However, of this body of research, only a handful of studies have targeted students’ cognition about models and modeling (Gobert & Pallant, 2004; Grosslight, Unger, Jay, & Smith, 1991; Pluta, Chinn, & Duncan, 2011; Schwarz et al., 2009; Schwarz & White, 2005; Treagust, Chittleborough, & Mamiala, 2002). As a whole, these studies showed that science learners generally have ideas about models and modeling that are less sophisticated than scientists’ cognition of models and modeling (although some of their ideas, such as their perceived criteria for model evaluation, are similar to scientists’ notions) and that students’ cognition of models improves as a result of participating in the enactment of a model-enhanced curriculum unit. However, none of these studies examined carefully how the cognition of modeling that students deploy evolves over time as they engage in scientific modeling. A couple of interrelated reasons can be considered. First, these researchers appear to have assumed that the cognition of modeling students deploy while they are conducting modeling activities is identical with, or at least comparable to, their cognition of modeling captured when they reflect on scientific models and modeling in general. But those who embrace a knowledge-in-pieces view (diSessa, 1988, 1993; diSessa & Wagner, 2005; Hammer & Elby, 2002) challenge this assumption by pointing to evidence indicating that students activate different conceptual and epistemological resources in different contexts. Second, methodologically, tracing learners’ epistemologies used in modeling as they go through one modeling activity after another is challenging to researchers because it requires in-depth analysis of their discourse and artifacts. Despite these challenges, empirical research investigating how students’ epistemologies that guide their modeling practice evolve over time as they engage in scientific modeling is much needed to supply evidence about the effect of the practice-based approach on students’ epistemologies. This study aims to contribute to this line of research by providing one such analysis.

1.2. Goal and Research Questions

My general goal for the present study is twofold: first, to document how elementary students’ epistemologies that guided their engagement in scientific modeling, or their epistemologies in modeling, changed over time as they enacted a model-based curriculum unit; second, to investigate the ways in which some of the curriculum events influenced students’ epistemologies in modeling. To achieve these goals, I formulate the following research questions that I aim to address in the present study: Given a class of elementary students and their teachers (Mrs. M and her intern teacher, Ms. H) enacting a model-based curriculum unit,

(1) How did three focus students’ epistemologies in modeling change over time?

(2) In what ways did some of the curriculum events influence the changes of their epistemologies in modeling?
In an attempt to address these questions, I have several foci or steps in my investigation. To address the first research question, I will document the “microdevelopment” (Granott & Parziale, 2002) of the three focus students’ epistemologies employed in their multiple modeling activities. I will also document how teachers and students constructed and distributed ideas about modeling over time as they carried out various events of a model-based curriculum unit. To address the second research question, I will connect ideas about modeling found in some of the curriculum events to the focus students’ epistemologies in modeling.
1.3. Significance of the Study
As I noted above, little research has been conducted to investigate how students’ epistemologies that guide their modeling practice change over time as a result of participating in scientific modeling. The present study aims to contribute to this research by providing an empirical analysis of this kind.
This study also contributes to recent research efforts to construct “learning progressions” (Alonzo & Gotwals, 2012; Smith, Wiser, Anderson, & Krajcik, 2006), or descriptions that represent increasingly sophisticated levels in important knowledge, epistemology, and practices. Some science education researchers have been developing a learning progression for scientific modeling (Schwarz, et al., 2009). Although their work has provided insights into different levels in scientific modeling by assessing elementary and middle school students’ performance and cognition of modeling, their empirical analysis does not provide a mechanism by which students’ modeling shifts from one level to a higher level, or the role of a model-based curriculum in that shift. The present study aims to make some contribution to the literature on learning progressions by providing one such analysis.
1.4. Overview of the Dissertation
To address the research questions of this study, I organize the remainder of this dissertation in the following way. In Chapter 2, I provide a review of research on scientific models and modeling in education in general and, more particularly, research on students’ cognition about models and modeling, to situate my work in an ongoing research strand. Next, I articulate key terms, including epistemologies in modeling, used in this study. Chapter 3 provides information about the project Modeling Designs for Learning Science from which this study emerges, the research site and participants, the curriculum unit they implemented, and finally how I collected and analyzed data to address the research questions. Chapter 4 is the key chapter of the present study. In this chapter, I present the results of my analysis of three students’ epistemologies in modeling and of ideas about modeling found in some curriculum events in two ways. First, I provide these analysis results together with a documentation of how the focus students’ modeling activities and some curriculum events took place in chronological order. This is to give readers a sense of the complex and dialectical ways that the three students’ epistemologies in modeling and some of the curriculum events evolved over time. Next, I offer a summary of these analysis results to directly address the research questions. In the final chapter, I use the analysis results presented in Chapter 4 to theorize how the students’ epistemologies in modeling developed and discuss the roles of curriculum and instruction in supporting such development.
Finally, I discuss the contributions and limitations of the present study based on its findings and make suggestions for future work.
CHAPTER 2. CONCEPTUAL UNDERPINNINGS
In Chapter 1, I provided an overview of the recent practice-based reform movement as a general background of the present study. In the present chapter, I present another, conceptual, background of this study. In the first section of the chapter (2.1), I review prior studies that targeted scientific models and modeling in education and, among them, research that specifically investigated students’ cognition about models and modeling. In the following section (2.2), I define and discuss key terms and constructs used in this study.
2.1. Prior Work on Scientific Models and Modeling in Education
Although scientific modeling has drawn renewed attention due to the recent practice turn in K-12 science education, educational researchers had studied scientific models and modeling for decades before this new reform effort began. In this section of the chapter, I first review the general literature on scientific models and modeling in education. Next, I review prior work on learners’ understanding of models and modeling, since my study builds on and advances this specific line of research.
2.1.1. Research on scientific models and modeling in education
Historians and philosophers of science have written about the crucial roles of models and modeling in science for decades (see Frigg & Hartmann, 2012 for a review). Only in the late 1980s did they become a focus in educational research. This attention to models and modeling in the 1980s was reflected in the National Science Education Standards (National Research Council, 1996). In this document, scientific models and modeling were emphasized in the course of discussing the importance of understanding the nature of science and the process by which scientists construct and test scientific knowledge. Standards documents like this further triggered science education researchers’ interest in this subject.
Early researchers who investigated models and modeling in science education had two main foci. As technological development allowed scientists to construct, test, and revise scientific models much more easily than ever, some researchers attended to the affordances and utility of computer simulations that incorporated scientific models for science learning (e.g., Feurzeig & Roberts, 1999; Stewart, Hafner, Johnson, & Finkel, 1992; B. Y. White, 1993). Others, aligned with those who researched students’ understandings of the nature of science (Carey & Smith, 1993), were interested in investigating students’ understandings of scientific models as part of the nature of science (e.g., Abell & Roth, 1995; Grosslight, et al., 1991).
After this work around models and modeling in the 1980s and early 1990s, other researchers began to emphasize “model-based reasoning,” drawing on work in the philosophy of science (Magnani, Nersessian, & Thagard, 1999), and focused on the inquiry process specific to scientific modeling and its implications and value for science teaching and learning. Lehrer and Schauble (2000), for example, investigated how elementary students’ model-based reasoning emerged as they participated in modeling scientific data. Windschitl and colleagues (Windschitl, Thompson, & Braaten, 2008a, 2008b) used “model-based inquiry” to distinguish it from what they call “the scientific method,” a commonly held notion of scientific inquiry that is, however, far from what scientists actually do.
Clement (2008a) also pointed to a similar process in his own terms, “model evolution.” Several science educators used this idea to develop instructional sequences and/or model-based curricula to facilitate model-based teaching and learning (e.g., Schwarz & Gwekwerere, 2007; Steinberg, 2008).
2.1.2. Research on students’ understandings of models/modeling
Students’ notions of models and modeling have drawn a few researchers’ attention both in the United States and in other countries. Researchers have generally used two approaches to studying this topic. One set of studies examined students’ notions of models/modeling without interventions. Another body of research focused on evaluating the effect of a model-enriched curriculum unit or instructional intervention on students’ understanding of models and modeling. After reviewing these two bodies of research, I also introduce a recently emerging learning progression study that targeted students’ understanding of and engagement in scientific modeling.
2.1.2.1. Research on students’ understandings of models/modeling without interventions
An influential pioneering work in this strand is Grosslight and colleagues’ (1991) study. They conducted clinical interviews with 32 mixed-ability 7th-grade students, 22 honors 11th-grade students, and 4 adult experts in the United States. The interview questions they asked include kinds of models (What comes to mind when you hear the word 'model'? Are there different kinds of models?), purposes of models (What are models for? Can you use models in science?), designing and creating models (What do you have to think about when making a model?), multiple models (Do you think scientists would ever have more than one model for the same thing?), and changing a model (Would a scientist ever change a model?). Analysis of the participants’ responses to these questions allowed them to identify three general levels of understanding of models. Those with a level-1 understanding think of models as either toys or simple copies of reality. They also locate the utility of models in the fact that they provide copies of actual objects or actions. Some of them note that not all aspects of the things modeled must be represented in models, yet without a sensible rationale. Those with a level-2 understanding see that models have specific purposes that affect how they are constructed. They know that the real things being modeled are changed to some extent to be represented in models and that in this process model creators’ ideas play an important part. Their focus, however, is still more on the relationship between models and the reality being modeled than on modelers’ ideas. Those with a level-3 understanding place a stress on modelers’ ideas. They think that models go through a cyclic process of being constructed and evaluated in the service of developing and testing ideas, and they highlight modelers’ active role in this process. With these categories, the researchers found that the majority of 7th graders had a level-1 understanding, with the rest having understandings at levels 1/2 and 2; that honors 11th graders were divided almost evenly into level 1 (23%), level 1/2 (36%), and level 2 (36%) scores; and that all four experts were at level 3.
It is worth noting that while this paper was highly influential in indicating that students have a very poor understanding of modeling, much of the study was inherently flawed by the nature of its abstract questions (not contextualized in students’ lives or work in the classroom), which used vocabulary that was largely unfamiliar to students. While it drew attention to this area of research, it was unclear what the implications could or should be for research and interventions.
Treagust and colleagues (Treagust, et al., 2002) conducted a study with a similar goal: to measure secondary students’ understandings of scientific models in Australia. They first developed an assessment instrument called Students’ Understanding of Models in Science (SUMS). This was a Likert-type, paper-and-pencil questionnaire that consisted of 27 items. These items were all abstracted statements (e.g., “Many models represent different versions of the phenomenon” (item 2), “Models show a smaller scale size of something” (item 16)). 228 students took the assessment and chose their level of agreement with each statement. Analysis identified the following five themes.
(1) Scientific models as multiple representations: In contrast to Grosslight et al.’s (1991) finding, they found that most students appreciate the fact that multiple models can be used to show different ideas and perspectives of the same target.
(2) Models as exact replicas: 43% of the students subscribed to this idea, and the majority of students agreed that a model should be close to the real thing in every way except size.
(3) Models as explanatory tools: The majority of the students agreed that ‘models are used to physically or visually represent something,’ that ‘a model shows what the real thing does and what it looks like,’ and that ‘many models show different sides or shapes of an object,’ suggesting that they are aware of various explanatory functions of scientific models.
(4) The use of models: The students were split nearly half and half on the idea that models are used for making predictions, formulating theories, and showing how information is used.
(5) The changing nature of models: The majority of the students thought that a model can change ‘if there are new findings’ or ‘if new theories and evidence prove otherwise.’
These results indicate that there is some inconsistency among the multiple ideas that students have about scientific models in terms of the degree to which they are close to scientists’ ideas about modeling.
Of the various elements of students’ understanding of models, Pluta, Chinn, and Duncan (2011) have recently paid particular attention to epistemic criteria for evaluating scientific models. Participants were 324 seventh-grade students in the United States. The students first engaged in a series of model evaluation tasks in a 40- to 50-minute orienting activity: choosing what they thought were models from 12 representations of volcanoes and comparing two models for the same phenomenon (seven times). Next, they were asked to generate six important characteristics of good models. The researchers analyzed the students’ responses by comparing them to expert criteria and to the findings of prior work on students’ understanding of models and modeling. They found, unexpectedly, that the students collectively proposed many criteria that are similar to those used by scientists.
Regarding the criteria that philosophers of science deem to be primary, the majority of the participating classes attended to the explanatory function of models, the value of appropriate details and complexity, the role of evidence in supporting models, the quantity of explanation, description, and information, and accuracy. Additionally, they also noted communicative and constituent features of models. However, the number of individual students who attended to these criteria varied across criteria.
2.1.2.2. Research on the impact of interventions on students’ understanding of models/modeling
Unlike the studies focused on assessing students’ notions of models or modeling, another group of studies designed curriculum units that included modeling features and, after the units were enacted, evaluated how they affected students’ understanding of models and modeling.
Saari (2003) assessed whether a teaching sequence in which Finnish 13-year-olds were taught a general understanding of models in the course of learning about the states of matter had a positive impact on their notions of models and modeling, using semi-structured pre- and post-interviews, and how stable the improvement, if any, remained afterward, using a written assessment. First, from analysis of the students’ responses in interviews, three categories of notions about models were constructed. Category A includes ideas that a model is a thing or act to be copied, has to be accurate, and can change if there is any mistake or its creator wishes so, and that the fitness of a model depends on who constructs the model. Ideas that constitute Category B include: a model is a representation of something known or unknown; the main purpose of a model is for learning and teaching; the fitness of a model depends on the nature of the model; and a model can change depending on a researcher or research. Finally, Category C includes ideas that: a model represents something either known or unknown; the purpose of a model is to provide information about its target; the fitness of a model depends on how it is used; and a model’s change is associated with research. Comparison between the students’ pre- and post-interview data showed that almost all the students’ notions of models fell mainly into Category A in the pre-interview, whereas most students improved their notions to Categories B and C. Analysis of a post-questionnaire indicated that the stability of this improvement depended on whether models and modeling were included in subsequent teaching after the modeling intervention.
Schwarz and White (2005) examined the effect of a model-based inquiry physics curriculum on seventh-grade students’ knowledge about modeling, or what they called metamodeling knowledge (MMK). They designed the Model-Enhanced ThinkerTools (METT) curriculum to help middle school students learn about the nature of models and engage in modeling simultaneously. As the curriculum was implemented, students used its various modeling-related features to construct computer models based on evidence from their everyday investigations, read and reflect in pairs on passages about the nature of models and modeling, engage in whole-class discussions about their computer models, and evaluate their models using such criteria as accuracy and plausibility. To examine how the students’ MMK changed, the researchers used two instruments: pre- and post-instruction written modeling assessments and Schwarz’s modeling interview with 12 students.
Analysis of the data from these two assessments indicates that METT had a positive effect on the students’ understanding of the nature and purpose of models. Regarding the nature of models, the students increasingly identified abstract models, saw a model as a representation that explains and predicts, recognized the possibility of multiple models for the same phenomenon and of incorrect models, and understood that models are estimates of the physical world. Additionally, most students came to appreciate various purposes of models, including visualization, theory testing, prediction, helping others’ understanding, and conducting investigations. On the other hand, these assessments provided no evidence that METT was effective in promoting the students’ understanding about the practice of modeling, in particular ideas about the nature and value of constructing, evaluating, and revising models.
Still another study that took a similar approach is Gobert and Pallant’s (2004) work. As several middle and high school classrooms in the United States implemented an earth-science curriculum unit, “What’s on your plate?”, in the Web-based Inquiry Science Environment (WISE), a virtual learning environment that aims to integrate science content, scientific inquiry skills, and epistemology, the researchers administered a pencil-and-paper assessment before and after the unit to measure the gains in the 1,100 participating students’ understandings of the nature of models. The curriculum unit, designed under the principles of making thinking visible and of helping students learn from one another, had several features related to modeling. Students were asked to build their models and explain a given phenomenon; evaluate and critique their learning partners’ models; revise their models and justify their changes; and visit dynamic models that depict different aspects of their subject of investigation. To measure students’ epistemologies about models, the researchers used items adapted from Grosslight and colleagues’ (1991) questionnaire. These items asked about students’ understanding of the nature of models, the use of models, the relationship between a model and the real thing, the constituents of a model, multiple models for the same target, and the change of a model. Gobert and Pallant used a scoring system that was also adapted from Grosslight et al.’s (1991) coding scheme. They gave scores, ranging from 0 to 3, to students’ responses to each of these items. Their analysis showed that the WISE unit had an effect on the development of students’ understanding of models and modeling.
2.1.2.3. Research on a learning progression for scientific modeling
There is still another research program that has focused on scientific modeling: that of researchers who have worked on developing a learning progression for scientific modeling (Schwarz, Reiser, Acher, Kenyon, & Fortus, 2012; Schwarz, et al., 2009). These researchers have developed or used curriculum units that include modeling as a key feature, studied their enactments in multiple elementary and middle school classrooms, and measured the levels of students’ engagement in and metaknowledge about modeling using written assessments and interviews. As a result, they have proposed a learning progression for scientific modeling (Schwarz, et al., 2012; Schwarz, et al., 2009) and found evidence that model-based curriculum units had some effect on students’ performance and knowledge about modeling (Baek, Schwarz, Chen, Hokayem, & Zhan, 2011; Bamberger & Davis, 2011).
Taken together, these bodies of research on students’ notions of models and modeling provide some insights into the subject. First, students have ideas about models and modeling that are different from how scientists understand them. However, some variance exists among these studies regarding the degree of coherence. Some studies indicate that students’ ideas about models are coherently naïve (Grosslight, et al., 1991; Saari, 2003), whereas others suggest that students have both naïve and fairly sophisticated ideas about models and modeling (Pluta, et al., 2011; Treagust, et al., 2002). Second, the second and third bodies of literature both indicate that a model-enhanced curriculum unit helps improve students’ understanding of models/modeling.
This literature has limitations. First, most of the studies reviewed above investigated middle and high school students’ understanding of models and modeling, and only a few included elementary school students’ understanding of models/modeling in their investigations (Schwarz, et al., 2009). Second, none of them looked into the process by which students’ understanding of models and modeling evolves class by class as they participate in the implementation of a model-based curriculum unit. Therefore, although we know that a model-based curriculum unit has some positive impact on students’ notions of models and modeling, we do not know very well exactly how this happens or which features of the curriculum enactment play an important role in the improvement of students’ modeling understanding. This is critical if, as a field, we are to advocate for and better understand the effects of a practice-based science education reform agenda. Finally, in relation to the second point, all these studies employed written assessments and interviews as instruments and analyzed students’ responses to those instruments to construct students’ cognition of models and modeling. However, we cannot assume that this is identical to the epistemologies that guide students’ modeling activities. In other words, none of the above studies investigated students’ in-practice epistemologies about models and modeling. The present study aims to contribute to this literature by providing a microgenetic analysis of how elementary students’ epistemologies of modeling changed over time as they enacted a model-based inquiry unit and of how some of the curriculum events influenced those changes.
2.2. Key Terms and Constructs
The present study lies at the intersection of research on practice-based science learning and research on students’ understandings about models and modeling. As such, this study appropriates multiple conceptual resources from each research tradition. In this section of the chapter, I define and discuss key terms and constructs used in this study.
2.2.1. Scientific model
In this research, I build on prior conceptualizations of a scientific model (Gobert & Buckley, 2000; Harrison & Treagust, 2000; Ingham & Gilbert, 1991; Lehrer & Schauble, 2006; Schwarz, et al., 2009) to define it as a representation of a system in some way analogous to its target phenomenon, which scientists develop and use to explain or predict the phenomenon and to communicate their understandings of it. A couple of notes may provide further clarification.
First, this definition shows my focus on external models, also called “expressed models” (Gobert & Buckley, 2000) or “conceptual models” (National Research Council, 2012), as contrasted with internal, mental models. Second, I emphasize the difference between a system and a phenomenon being modeled (Gobert & Buckley, 2000; Ingham & Gilbert, 1991). A scientific model does not just simplify its target phenomenon by focusing on some features of the phenomenon. It also enriches the phenomenon by incorporating theoretical features. To subsume these various relationships between a system and a phenomenon, I generally and loosely state that the former is in some way analogous to the latter (Lehrer & Schauble, 2006; National Research Council, 2012). Finally, I include in this definition what scientists do with scientific models and their purposes to reflect my conviction that what makes a model scientific is basically the various practices scientists undertake with it (e.g., scientific modeling, scientific explanation, scientific ways of communication).
2.2.2. Scientific modeling
Scientific modeling generally refers to a scientific practice that scientists engage in with scientific models as a central artifact. This practice consists of multiple identifiable activities, which I henceforth call modeling activities. Here, I adopted the idea from activity theory that a human activity consists of multiple concrete actions (Leont'ev, 1978), although I did not adopt those terms. Although various modeling activities can be identified, this study, in agreement with prior work (Clement, 2008b; Schwarz, et al., 2009), focuses on the following as core modeling activities that as a whole constitute the process in which a scientific model and scientific knowledge evolve together.
- Constructing a model: Scientists construct a model using prior knowledge (e.g., evidence, scientific principles) to study a phenomenon.
- Evaluating a model: Scientists evaluate their own and others’ models using criteria such as explanatory power and consistency with empirical evidence.
- Revising a model: Scientists revise a model to increase its explanatory and predictive power and empirical validity.
- Constructing a consensus model: Scientists compare multiple models for the same phenomenon and construct a consensus model that integrates the best features of each model.
- Using a model: Scientists use a model to explain and predict a phenomenon and to communicate their understanding about it.
Note that these activities do not occur precisely in this order; scientists may dynamically and iteratively carry out these activities. Further, it is important to emphasize that these modeling activities are interlocked with other scientific practices. For example, a model is constructed or used to explain its target phenomenon or other phenomena. Also, when scientists evaluate one another’s models or construct a consensus model, they typically engage in argumentation as well. From students’ point of view, some of these activities (e.g., evaluating others’ models, constructing a consensus model) are challenging while others are relatively doable. Despite this variance in the level of difficulty, I argue for engaging students in all these activities because these activities as a whole provide an authentic context that fosters students’ epistemic agency. Therefore, all these activities need to be made pedagogically accessible and useful for learners by various means, including curriculum and technology.
The practice of scientific modeling can be analyzed not only vertically (as shown above) but horizontally as well. By the latter I mean that multiple dimensions of scientific modeling, for example, cognitive, performative, and social dimensions, can be identified analytically. Although investigation of multiple dimensions of modeling would provide a rich understanding of scientific modeling, researchers can focus on only one or two dimensions of it depending on their conceptual frameworks or research goals. In this study, I limit my main focus to the cognitive dimension of scientific modeling because this study aims to contribute to research on students’ understandings of models/modeling, reviewed earlier. It is important to emphasize that the concepts of scientific models and modeling outlined here are formal and not necessarily embraced by students. As research on students’ understandings of models and modeling has shown, students may view models and modeling in ways different from how scientists understand them.
2.2.3. Epistemologies in modeling
In Chapter 1, I introduced epistemology as an important dimension of epistemic agency. Of the various frameworks and models for the notion of epistemology, I regard Chinn, Buckland, and Samarapungavan’s (2011) framework as a good platform for introducing and discussing more specific concepts within epistemology because it conceptualizes epistemology broadly by combining both philosophical and psychological literature on the subject. Using the term “epistemic cognition” as an umbrella term to refer to “all kinds of explicit or tacit cognitions related to epistemic or epistemological matters” (p. 141), Chinn and colleagues presented an extensive framework of epistemology. In this expanded framework, epistemology consists of at least five distinctive components: (1) epistemic aims and epistemic value, (2) the structure of epistemic achievements (e.g., knowledge, understanding, beliefs), (3) the sources and justification of epistemic achievements, and related epistemic stances, (4) epistemic virtues and vices, and (5) processes for achieving epistemic aims. It should be emphasized, however, that not all these components of epistemic cognition are analytically identifiable in every context related to epistemic matters.
Using this general framework of epistemology, we can think about students’ epistemologies related to modeling. Students have explicit knowledge or tacit commitments about models and modeling, which can be conceptualized and analyzed using some of the five components of epistemology introduced above. Students’ epistemologies related to modeling can be further distinguished into two categories depending on the contexts in which they deploy those epistemologies. First, students have explicit understandings of models and modeling as subjects of reflection. These are manifested usually in contexts in which students are asked to express their thoughts about models and modeling in general. Most of the studies that I reviewed earlier (2.1.2) examined students’ epistemologies about modeling that fall into this category. Second, students deploy their epistemological understandings or beliefs about models and modeling as closely integrated with their engagement in modeling as a practice. Researchers who developed a learning progression for scientific modeling (Schwarz, et al., 2009) attended to students’ epistemologies about modeling in this category.
By distinguishing these two categories of epistemologies related to modeling, I clarify my assumption that students’ epistemologies are not necessarily coherent across different contexts (cf. Sandoval, 2005). The present study focuses on students’ epistemologies related to modeling that fall into the second category. Aligned with Berland and colleagues’ (Berland, et al., 2013) concept of “epistemologies in practice,” I call the epistemologies that students deploy in their engagement in the practice of modeling their epistemologies in modeling (henceforth, EIMs (plural) or EIM (singular)). Individuals’ EIMs evolve in dialectical relations with their engagement in scientific modeling and with others’ EIMs. First, individuals’ EIMs guide how they engage in modeling practice, and their accumulated experiences of modeling affect their EIMs. Second, when a group of individuals engage in modeling collaboratively, their EIMs help shape one another and co-evolve over time. In this study, I pay attention to these two ways in which individual students’ EIMs develop.
2.2.4. Ideas about modeling
The second goal of the present study is to investigate ways in which some curriculum events influenced three students’ EIMs. I draw on a social semiotic framework (Halliday, 1978; Hodge & Kress, 1988; van Leeuwen, 2005) to elaborate the concept of ideas about modeling. I begin with the concept of semiotic resources, defined as “the actions and artefacts we use to communicate, whether they are produced physiologically—with our vocal apparatus; with the muscles we use to create facial expressions and gestures, etc.—or by means of technologies—with pen, ink and paper; with computer hardware and software; with fabrics, scissors and sewing machines, etc.” (van Leeuwen, 2005, p. 3). Semiotic resources were traditionally called “signs” in semiotics, but social semiotic theorists prefer the term semiotic resources because they want to emphasize that meanings are not fixed by the system of a “sign” but are determined in a social context (that is, socially situated). According to this definition, anything can be a semiotic resource as long as it can be used or interpreted to convey some meaning. There are different types of meaning that a semiotic resource potentially communicates. Theorists have proposed different typologies of types of meaning (Halliday, 1978; Hodge & Kress, 1988; van Leeuwen, 2005). Despite some differences, they commonly acknowledge one type of meaning: representational meanings. In this study, I focus on representational meanings, which I simply call ideas, about modeling.
When a class of students and their teacher engage in implementing a model-based curriculum unit, various things can function as semiotic resources that convey ideas about modeling. These include passages and images in the student notebook; a teacher’s utterances (e.g., instructions, framings, scaffolding comments), notes, drawings on a board, and gestures; visual representations (e.g., diagrams, pictures, computer simulations); students’ utterances, notes, drawings, and gestures; and conversations and activities that a teacher and students (a group or a whole class) construct together. Ideas about modeling that these diverse semiotic resources convey need to be interpreted in the local context of a class of students and teachers enacting a model-based curriculum unit.
In particular, because I am interested in examining ways in which ideas about modeling from these sources influenced three students’ EIMs, those ideas about modeling will be analyzed from these students’ point of view.
CHAPTER 3. METHODS
In the preceding chapter, I reviewed the research traditions that the present study follows and explicated key constructs and terms used in this study. In the first half of this chapter (3.1~3.3), I provide various pieces of information about how I conducted the empirical investigation. This includes information about the project Modeling Designs for Learning Science, the research site and participants, and the model-based curriculum unit that the participants implemented. In the second half of the chapter (3.4~3.5), I describe what kinds of data were collected from what sources and how I analyzed this body of data. Here, I introduce a coding scheme that I used to analyze both the focus students’ EIMs and ideas about modeling from curriculum events.
3.1. Context: A Research Project Modeling Designs for Learning Science (MoDeLS)
This study is situated in a research project, Modeling Designs for Learning Science (MoDeLS, henceforth). The MoDeLS project aimed at developing a “learning progression” (Smith, et al., 2006) that represents increasingly sophisticated levels of engagement in and epistemological understanding of scientific modeling in order to make this scientific practice meaningful and accessible for elementary and middle school students. To achieve this goal, the MoDeLS researchers developed or chose curriculum units designed to engage science learners in scientific modeling and had them implemented in multiple upper elementary and middle school science classrooms. To assist participating teachers and students in their enactment of these curriculum units, the MoDeLS researchers provided them with curriculum materials (e.g., teacher guides, student notebooks), accompanying tools (e.g., experiment equipment), and professional development.
Methodologically, the MoDeLS researchers adopted a “design experiment” (Cobb, Confrey, diSessa, Lehrer, & Schauble, 2003) approach, in which researchers design and experiment with a particular form of learning in an actual learning context (e.g., a classroom) and systematically investigate how that form of learning takes place in the context to gain information for further development. Accordingly, the MoDeLS researchers went through the following iterative process in multiple rounds: first, they constructed an initial learning progression framework for modeling on the basis of preexisting research and theoretical conjectures; second, they designed instructional interventions and had them enacted in classrooms; finally, using empirical data generated from those instructional interventions, they evaluated and revised the prior learning progression framework to construct a new framework (see Schwarz et al., 2009 for more information).
Within the larger context of the MoDeLS project, a research team led by Dr. Christina Schwarz, in which I worked as a research assistant, focused its work on fifth-grade students’ engagement in scientific modeling. First, Dr. Schwarz and I developed a six- to eight-week curriculum unit on evaporation and condensation that included scientific modeling as a central feature.
To develop the curriculum unit, we used a general model-based instructional framework, developed on the basis of previous studies on curriculum design specifically related to scientific modeling (Schwarz & Gwekwerere, 2007; Schwarz & White, 2005; B. Y. White & Schwarz, 1999). This instructional sequence was developed with the aim of incorporating modeling practice in multiple content areas (Kenyon, Schwarz, & Hug, 2008). We incorporated the sequence into a unit on evaporation and condensation for fifth-grade students (for more information about the unit, see Baek, et al., 2011; Kenyon, et al., 2008). The participating teachers and students implemented this curriculum unit in their science classes. Although the curriculum materials specified instructional and learning steps in some detail, a significant amount of teacher input was also incorporated when the unit was enacted and the curriculum materials were modified.
3.2. Research Site and Participants
In the school year of 2008-2009, two fifth-grade teachers and their students (two classes for each teacher) from two public schools located in a Midwestern state participated in our research. The present study analyzes data collected when one of the teachers, Mrs. M, her intern teacher, Ms. H, and a class of students (N=24) implemented the model-based unit on evaporation and condensation in the fall semester of 2008. (All names that appear in this dissertation are pseudonyms.) I do not present an analysis of the other teacher’s students’ EIMs because the focus students in that class did not engage productively in some important modeling activities, such as evaluating one another’s models, and thus did not generate extensive enough data to capture how their EIMs changed over time.
Mrs. M, Ms. H, and their students enacted the model-based unit at a suburban school that served a somewhat diverse student population in terms of ethnicity (see Table 3.1).
Table 3.1. Student demographics at the research site in the school year of 2008-2009
Category                           Number of students (percentage)
Total enrollment                   243
Ethnicity
  White                            145 (60%)
  Asian/Pacific Islander           68 (28%)
  Black                            14 (6%)
  Hispanic                         12 (5%)
  American Indian/Alaskan Native   4 (2%)
Mrs. M, a White female teacher, had taught for six years when she began to participate in the research. She also had a master’s degree in science. She enjoyed interacting with students and motivated them to engage in school work by various means, such as rewards (e.g., candy) and her cheerful personality. In implementing designed curriculum units, she tended to be relatively flexible: when she found pedagogical value in some curriculum event, she would spend more time on it than planned, and she also would improvise some activities.
We did not collect much information about the participating students. From casually collected information, it appeared that their ethnicities reflected fairly well the student demographics of their school. During the teaching experiment, the students worked in six groups most of the time. Among them, we focused on one group of students (the “focus students”) and videotaped them during all group work times. Selected by Mrs. M, this group consisted of four students. Adrianna was a Caucasian female student and the leader of the group. According to Mrs. M, her academic performance level was middle to high. During the curriculum enactment, she was an active participant both in class conversation and in group work.
In particular, partly because she was the group leader, she mentioned time constraints more frequently than the other focus students did and thus tried to get group work done within the given time. (I do not present analyses of Adrianna’s models in this dissertation because, unfortunately, her notebook containing her models could not be found; I only describe what she said in class and group conversations and activities.) Brian was Caucasian and male. Mrs. M informed us that his academic level was high. Perhaps for that reason, his ideas drew attention from both teachers and students. At one moment during the curriculum enactment, Mrs. M recognized him as the author of a certain idea (including an experiment device). However, he would be playful at other times, regardless of Mrs. M’s presence. Joon was male and a Korean exchange student, and thus an English learner. He did not communicate in English as fluently as the other focus students did. According to Mrs. M, his academic performance level was middle. Finally, Mana was an Indian-American female student. Though she was a native speaker, she did not talk much or speak clearly. Mrs. M informed us that her academic performance level was middle.
3.3. A Model-based Curriculum Unit of Evaporation and Condensation
This section of the chapter provides a description of the model-based curriculum unit on evaporation and condensation that Dr. Schwarz and I developed. As mentioned above, we first designed a model-based instructional sequence and embedded it in the curriculum unit. Because this instructional framework was designed to support students’ engagement in and epistemological understanding of scientific modeling, I begin by specifying the sequence below.
3.3.1. Model-based instructional sequence
Scientists engage in scientific modeling in a dynamic and iterative way, and the ways in which they do modeling vary across different traditions of scientific inquiry and even across different research projects. It is neither possible nor useful to reflect the full complexity and diversity of scientists’ engagement in modeling in any pedagogical approach that targets student engagement in scientific modeling. Therefore, several attempts have been made to organize key features of scientific modeling and other activities into a general instructional framework (Clement & Rea-Ramirez, 2008; Schwarz & Gwekwerere, 2007; B. Y. White & Schwarz, 1999; Windschitl, et al., 2008a). We built on these previous studies to develop a model-based instructional framework in an attempt to help teachers and students engage in this challenging scientific practice more effectively. Below, I provide a diagram that represents this instructional sequence (Figure 3.1) and describe what students are expected to do in each constituent activity.
An anchoring phenomenon and central questions about it: An anchoring phenomenon is introduced at the outset. Good candidates for an anchoring phenomenon are phenomena that can be found in students’ everyday life and are thus intriguing as well as relevant to them. Next, students are asked key challenging and intriguing questions about the anchoring phenomenon. Finally, students are given an opportunity to come up with their ideas or hypotheses to answer those questions.
Figure 3.1.
A model-based instructional sequence
[The figure presents the instructional sequence as a flowchart with two strands. Modeling activities: constructing an initial model; evaluating and revising the initial model based on empirical evidence to construct a second model; evaluating one another’s second models; evaluating and revising the second model based on new ideas to construct a third model; constructing a consensus model; using the consensus model to explain or predict other related phenomena. Other activities: an anchoring phenomenon and central questions about it; conducting empirical investigations; scientific ideas and scientific models about key concepts.]
Constructing an initial model: Individually, students construct their initial models that encompass their prior knowledge, ideas, and hypotheses about the anchoring phenomenon. Initial models should be constructed in ways that help them think of investigations they need to conduct in order to improve their models.
Conducting empirical investigations: Students conduct a set of empirical investigations about the phenomenon. As a result of this activity, they garner a body of empirical evidence about the phenomenon.
Evaluating and revising the initial model based on empirical evidence to construct a second model: Students individually evaluate their initial models based on the empirical evidence collected. When they find any discrepancies between their initial models and the empirical evidence, they revise the models into their second models to reconcile the discrepancies.
Evaluating one another’s models: In groups or as a class, students discuss and determine criteria for evaluating models. Next, in groups, students present their second models to others and evaluate one another’s models by the criteria determined.
Scientific ideas and scientific models about key concepts: Students may lack understanding of fundamental scientific concepts that are necessary for making good models. At this step, scientific ideas and models about those concepts are given to help students improve their models.
Evaluating and revising the second model based on new ideas to construct a third model: Students evaluate their second models on the basis of new ideas from others, scientific ideas, and scientific models, and incorporate those ideas into the models to construct their third models.
Constructing a consensus model: As a group or class, students compare multiple individual models and combine the best features or parts of each model to construct a group or class consensus model.
Using the consensus model to explain or predict other related phenomena: Students use the consensus model to explain or predict other related phenomena. They determine strengths and limitations of their model for further revision. In particular, they may see the need and utility of making their consensus model more generic so that it applies to multiple phenomena.
When this instructional framework was developed, particular attention was given to several of its features in order to make some important dimensions of scientific modeling accessible to teachers and students (for more information, see Baek, et al., 2011). First, we encouraged students to generate and develop defendable explanations, which often involve unobservable or theoretical properties, structures, or processes, for a target phenomenon from the outset and throughout the sequence.
In the inquiry sequence frequently used in traditional science classrooms, called “the scientific method” (Windschitl, et al., 2008a), students are only required to predict the result of an experiment, draw a conclusion that summarizes a pattern in the data of the experiment, and test their prediction by comparing it to their conclusion. In this sequence, they do not need to develop an explanation of why the target phenomenon generates such a pattern. In contrast, in the model-based instructional sequence we developed, students are encouraged to construct their explanations in their initial models and improve them as they go along the sequence.
Second, we placed an emphasis on empirical evidence as an important source of scientific knowledge. One of the characteristics of scientific inquiry that makes it distinct from other forms of inquiry is what Pickering (1995) called “material agency”: after scientists construct theories to explain things in the world, they test their theories using empirical evidence collected by various means (e.g., experiment, observation) and, when inconsistencies between the two are found, revise them. In our instructional framework, we went beyond the “simple experiments” or “simple observations” (Chinn & Malhotra, 2002) frequently found in traditional school science and highlighted the epistemic relation between models and empirical evidence in the hope that students could better experience and appreciate this evidence-based epistemic process.
Finally, we emphasized the social dimension of scientific modeling. How scientists interact and communicate with one another is quite different from the ways of social interaction and communication in other social sectors. Students need to experience scientific ways of interacting with one another to get a better grasp of scientific modeling. Although the model-based instructional sequence we designed generally emphasizes this social dimension, we paid special attention to it in two modeling activities: evaluating one another’s models and constructing a consensus model. By helping students engage in these activities, we expected them to increasingly understand the social norms that govern scientists’ interactions and communication. However, because these activities are fairly alien to students, we recognized the need to give students a good amount of instruction and scaffolding to support their undertaking of these activities when the sequence would be embodied in a curriculum unit.
3.3.2. A model-based curriculum unit of evaporation and condensation
After developing the instructional sequence introduced above, we incorporated it into a unit on evaporation and condensation for 5th-grade students. Below, I present a flowchart that represents how the unit proceeds (Figure 3.2) and outline each step.
A main anchoring phenomenon and driving questions: The unit begins by introducing a solar still, a device that collects potable water from humid soil or sea water through evaporation driven by the heat of the sun and condensation caused by relatively cool ambient air. After students see water collected in a small cup placed in a solar still, they are given two questions: “Would you drink the liquid in the bottle cap that came from this dirty water? Do you know what that liquid is and how it got there?” In this unit, two main phenomena that occur in a solar still are investigated: evaporation and condensation. Students begin investigating evaporation first.
Figure 3.2.
A model-based curriculum unit of evaporation and condensation
[The figure presents the unit as a flowchart: the anchoring phenomenon and questions (the solar still: “Would you drink the liquid in the bottle cap that came from this dirty water? Do you know what that liquid is and how it got there?”) lead to two parallel strands, one for evaporation (water shrinking on a plate/in a humidifier: “What happened to the water on the plate/in the humidifier? Where did it go? How? Why?”) and one for condensation (water drops forming on plastic wrap/a cold bottle: “What are the things on the plastic wrap/the cold bottle? Where did they come from? How did they get there?”). Each strand proceeds through constructing an initial model, conducting empirical investigations, evaluating/revising the initial model to construct a second model, evaluating one another’s models, scientific ideas and models about key concepts, evaluating/revising the second model to construct a third model, constructing a consensus model, and using the model for other phenomena, before returning to the solar still.]
Anchoring phenomena about evaporation and central questions: When the evaporation section starts, students are introduced to two phenomena of evaporation and questions that will guide their investigations of it. One phenomenon involves water on a plate shrinking over time, and the other involves water in a humidifier shrinking over time. Students are asked, “What happened to the water on the plate or in the humidifier? Where did it go? How? Why?”
Constructing an initial model of evaporation: To explain one of the phenomena about evaporation and, more particularly, to address the questions about it, students use their prior knowledge, ideas or hypotheses, or other resources to construct their initial models of evaporation.
Conducting empirical investigations about evaporation: Students conduct multiple sets of experiments about evaporation and, in the process, collect a body of empirical evidence about evaporation. To guide their engagement in this activity, some instructions are provided in the student notebook.
Evaluating/revising the initial model of evaporation to construct a second model of evaporation: Students evaluate their initial models of evaporation based on the empirical evidence they have collected and, when discrepancies are found, revise them into their second models of evaporation.
Evaluating one another’s models of evaporation: As a group or class, students talk about and determine a set of criteria for evaluating models. In groups, they use those criteria to evaluate one another’s models of evaporation. To help them engage in this unfamiliar activity, we gave specific intellectual roles to students.
These roles are largely divided into presenting one’s own model and evaluating others’ models. The latter is in turn broken into three roles: evaluating others’ models with a focus on how well they explain evaporation, how consistent they are with empirical evidence, and how clear they are.
Scientific ideas and models about key concepts of evaporation: Students are presented with fundamental scientific ideas about key concepts of evaporation, such as “very tiny water bits” (water molecules), and watch scientific models (computer simulations) that explain how the state of matter changes.
Evaluating/revising the second model of evaporation to construct a third model of evaporation: Using the ideas they have learned from their teacher’s presentation and the computer simulations, students evaluate their second models of evaporation and revise them into their third models of evaporation.
Constructing a consensus model of evaporation: Using the same criteria, students combine the best features of multiple individual models of evaporation to construct a consensus model of evaporation. This activity can be done in groups or as a class.
Using the model to explain or predict other phenomena of evaporation: Finally, students use the consensus model of evaporation to explain or predict other phenomena involving evaporation that they can observe in their daily life. By so doing, they notice the need to make the model generic. This activity ends students’ investigation of evaporation. Next, students go through the same sequence again with condensation as their second target phenomenon.
Back to the solar still: After completing their investigations of evaporation and condensation using models, students return to the main anchoring phenomenon of the solar still and the central questions. Combining what they have learned about evaporation and condensation, students construct a model of the solar still to answer the central questions.
3.4. Data Collection and Data Sources
To interrogate how the participating students’ engagement in and understanding of scientific modeling changed as a result of enacting the above curriculum unit, the research team led by Dr. Schwarz collected data from several sources. First, we collected the written assessments the students took before and after their curriculum implementation. These assessments were designed to examine their modeling practice as well as their understanding about scientific modeling. Second, we videotaped two classes’ (one class for each teacher) conversations and activities over the whole period of their curriculum implementation. As we videotaped each class session, we also made field notes. Third, we conducted clinical interviews with six students from each class. All the audiotaped interviews were transcribed. Finally, we collected all the participating students’ notebooks and science journals. We created the student notebooks to guide students’ engagement in this unit and to collect students’ models and notes. Science journals were their personal notebooks; Mrs. M had students take additional notes, such as learning objectives, in their science journals.
For the present study, I only used data collected from the class led by Mrs. M. I first had all the videotapes of their class sessions, except one that was missing, transcribed (I used a company’s transcription service). In total, I had 21 files of transcripts. I also used the research team’s field notes.
These are indispensable bodies of data for this study because I focus on how the three focus students' epistemologies about modeling, as used in their modeling activities, changed over time, as well as on how some of the curriculum events affected those changes. Because I did not assume that students' EIMs were identical with the epistemologies about modeling they displayed in written assessments and interviews, I used the data from written assessments and interviews only secondarily, for example, in constructing a coding scheme for analyzing students' EIMs. Another crucial body of data came from the focus students' notebooks and science journals. One of the four students' notebooks was missing, so I focused my analysis on the remaining three students' data. I also used other students' notebooks secondarily, for example, in examining how many students used particular features in their models over time. The following list summarizes the data sources I used for the present study.

Primary data sources:
- 21 transcripts of videotaped classroom conversations and activities
- The research team's field notes
- 3 focus students' notebooks and science journals

Secondary data sources:
- 3 focus students' scanned pre/post written modeling assessments
- 3 focus students' transcribed interviews
- Other students' notebooks and science journals

3.5. Data Analysis

3.5.1. Microgenetic analysis

As a group of methods that aim at examining how learning or developmental change occurs, microgenetic methods have three essential properties (Siegler & Crowley, 1991). First, a target that changes is observed throughout the entire period of its change. Second, observations of the target are frequent relative to the rate of its change. Third, observations are subjected to intensive analysis with the goal of reconstructing the process or mechanism that gives rise to them. Because this study aims at providing an account of how the focus students' EIMs changed over time throughout the unit and why their EIMs changed the way they did, I adopt a microgenetic analysis. Therefore, in accordance with the three characteristics of microgenetic methods above, I conduct an intensive analysis of the models, utterances, and notes that the three elementary students made fairly frequently (in nine modeling activities) throughout the unit, using a coding scheme I developed. In what follows, I explain in detail how I conducted a microgenetic analysis of students' EIMs.

3.5.2. Preparation of data files

To address the two research questions outlined earlier, I created two major data (Excel) files. In one file I compiled the three focus students' models, notes, and utterances made when engaging in multiple modeling activities. Their models and notes came from their student notebooks and science journals. I copied their utterances from the transcripts of classroom data. In the other file, I compiled the 21 transcripts of videotaped classroom conversations and activities and one field note to supplement the missing videotape for one class session. I did additional work on the second file to make it more amenable to analysis. First, I enriched the transcripts, which initially included only the utterances made by Mrs. M, Ms. H (Mrs. M's intern teacher), and students and thus did not fully represent what had happened and what had been talked about in class.
Watching and listening to the video clips digitally converted from the videotapes, I further transcribed some parts not done previously; inserted still shots (captured from the video clips) representing the contexts of classroom events, the teachers' notes written on a board or a standing pad, images drawn on a board or presented through an overhead projector, and the teachers' and students' facial expressions and gestures (cf. Flewitt, Hampel, Hauck, & Lancaster, 2009); and added my own observational comments. This enrichment helped me properly describe and interpret the classroom conversations and activities. Second, I divided the data into multiple curriculum events and labeled each curriculum event. By curriculum event, I generally refer to an identifiable conversation or activity that takes place around a focus as a curriculum unit is enacted. In doing so, I identified curriculum events that had potentially influenced students' EIMs.

3.5.3. Development of a coding scheme for analyzing students' EIMs and ideas about modeling present in curriculum events

To analyze students' EIMs from their models and discourse data and to analyze ideas about modeling from some curriculum events, I developed a coding scheme, which I henceforth call the "EIM coding scheme," by building on other analytical frameworks. To develop this coding scheme, I went through multiple iterative cycles of constructing a coding scheme, applying it to the data, and revising it into a new version.

I drew on several analytical frameworks to develop the EIM coding scheme. First, because my work uses part of the data collected for the MoDeLS project, I consulted the coding scheme of the project, or what the MoDeLS researchers called a construct map for students' scientific modeling. They focused their analysis on whether students attend to salient and general features, explanatory features, sources of knowledge, and audience and clarity of communication (Schwarz et al., 2012; Schwarz et al., 2009). Second, I have also benefited from ongoing collaborative efforts to develop a coding scheme in the SciPractices project. This project expands the MoDeLS project to target scientific modeling, explanation, and scientific argumentation as key scientific practices and aims at constructing a learning progression of elementary and middle school students' epistemologies that guide their engagement in these practices. The SciPractices researchers currently focus on four epistemic commitments: students' attention to generality, mechanism, evidence, and communication and persuasion. Finally, I also found a general framework in social semiotics (Halliday, 1978; van Leeuwen, 2005) helpful both conceptually and analytically, though it was not as influential as the above two coding schemes. One benefit of this approach is that it enabled me to see models, utterances, notes, and other data as "multimodal texts and communicative events" (van Leeuwen, 2005) and students' EIMs as their explicit or implicit meanings about models and modeling.

Although I posit that students' EIMs can be variously conceptualized, analyses of Mrs. M's class discourse and students' models led me to conceptualize their EIMs in the following way. I argue that students' EIMs can be conceived of as consisting of two components or steps. First, students had certain ideas or beliefs about good models. In other words, they thought or believed that good models have such and such characteristics.
Second, in their various modeling activities, they utilized their beliefs about good models. When they constructed a model, they tried to make a good model according to their image of good models. When they evaluated their own or others' models, they compared them to what they thought were good models. When they found gaps between their models and what they thought were good models, they revised their models to make them closer to their ideal models. When a group of students constructed a consensus model, they made the best model out of their individual models and, in the process, negotiated their individual images of good models. In doing all these and other modeling activities, students attended to what they thought were the characteristics of good models and checked whether and to what extent their models had such characteristics. It is important to emphasize that these two components or steps were hardly separable in practice.

Careful examination of multiple data sources, including classroom conversations, models, written assessments, and interviews, allowed me to notice that the teachers and students in this class used four main categories of ideas about good models throughout the unit. I identified other kinds of ideas as well (e.g., a model that most people consider to be good is a good model), but they were not deployed frequently or consistently enough to be used as analytical lenses. Next, through multiple rounds of an iterative process of developing a coding scheme, applying it to a set of data, and refining it, the final EIM coding scheme emerged. I briefly introduce the four EIM categories here; more detailed descriptions of the EIM coding scheme follow below.

- CONTENT: One way students evaluate the quality of a model is to think about the features that constitute the model. They attend to what kinds of features a model contains and how extensively it contains such features. Some students attend only minimally to these aspects. Others view a good model as a good repository and thus believe that a good model contains all that they have learned about a target phenomenon, regardless of what kinds of features they are. Still others believe that a good model contains only scientifically necessary features in a parsimonious manner. With these varying notions, they attend to model features in multiple modeling activities.

- EXPLANATION: Students all believe that a good model provides a good explanation of its target phenomenon. However, they have various ideas about explanation. By explanation, some students refer to any kind of information about a target phenomenon. Others focus on macroscopic, physical causes or factors involved in a phenomenon but do not attend to hidden mechanisms. Still others attend to hidden mechanisms when they talk about explanation. With these various ideas about explanation, they attend to how a model explains its target phenomenon in various modeling activities.

- ACCURACY: Students believe that a good model provides accurate information about its target phenomenon. Note that I use "accurate" as a general descriptor here. Though students all attend to this general idea, their levels of attention diverge on two specific issues about accuracy: first, what kinds of sources of accuracy do they attend to? Second, how sophisticated is their attention to the relation between sources of accuracy and model features? Some students' attention to accuracy is limited.
They attend only to the most obvious sources of accuracy, such as information given in the current context, and to simply representing it in a model. Others are not different from the first group on the second issue, but they attend to other sources of accuracy, such as information they have acquired in their previous experiences. Finally, still others attend to empirical evidence as a source of accuracy and to the relation between empirical evidence and model features. With these diverse ideas, they attend to how accurately a model provides information about its target phenomenon in various modeling activities.

- COMMUNICATION: Students believe that a good model is communicatively effective, broadly construed. Some students are minimally aware of a general communicative purpose of models. Others believe that a good model conveys the ideas in it clearly to others. Still others think that a good model is not only clear but also persuasive. With these different ideas, students attend to how effectively a model communicates information about its target phenomenon in various modeling activities.

Before more detailed descriptions of the EIM coding scheme are provided, several notes need to be made here. First, students rarely expressed the four ideas about good models per se. They focused on particular model features or components in their models, utterances, and notes, and the four ideas were constructed to make sense of their attention to such model features. The distinction and relationship between specific model features and the epistemic ideas behind students' attention to such features is important for the main argument of this study. I focus my analysis on the latter in Chapter 4 and discuss the relation between model features and students' epistemic ideas further in Chapters 4 and 5.

A second note, related to the first, is that the four EIM categories outlined above are not entirely independent of one another. They may be related to one another in complex ways. One reason is that students may see the same feature in association with different ideas or foci. For example, as a student incorporates microscopic/theoretical entities such as molecules into her model, she may attend to the particular kind of feature (CONTENT) and to the explanatory power of that feature (EXPLANATION) at the same time. Also, when a student includes empirical evidence in his model, he may want to make his model both accurate (ACCURACY) and persuasive (COMMUNICATION). Still another example is communicative features. Students may see them both as necessary model features (CONTENT) and as features that make a model clear (COMMUNICATION). This view is conceptually consistent with a connectionist account of cognition and is supported by my empirical data.

Thirdly and finally, in the EIM coding scheme, I divided the multiple modeling activities into two broad groups: constructing/revising a model and talking/writing about a model. Constructing/revising a model includes the activities of constructing an initial model, of evaluating/revising a model to construct a new model, and of constructing a consensus model. Talking/writing about a model includes the activities of presenting one's model to others and of evaluating others' models. The main reason that I made a distinction between these two groups of activities is that they differ in the nature of the activities: students produce models in the first group, whereas they reflect on models through oral and written discourse in the second. Because this difference generates different kinds of data, I decided to reflect the distinction between the two groups of modeling activities in the EIM coding scheme.
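Before turning to the category-by-category descriptions, the following sketch summarizes the overall structure of the EIM coding scheme: four categories, each with ordered levels, applied across the two groups of modeling activities. It is offered only as an illustrative, condensed rendering in Python form; the level descriptors are abbreviated paraphrases of the fuller descriptions in Tables 3.2 through 3.5, and the representation itself is not part of the analysis.

```python
# Illustrative sketch only: a condensed view of the EIM coding scheme's structure
# (four categories, ordered levels, applied to two groups of modeling activities).
# Level descriptors are abbreviated paraphrases of Tables 3.2-3.5, not verbatim text.

ACTIVITY_GROUPS = ["constructing/revising a model", "talking/writing about a model"]

EIM_CODING_SCHEME = {
    "CONTENT": {
        1: "minimal attention to what kinds of features a model contains",
        2: "a good model stores all information learned about the phenomenon",
        3: "a good model contains only scientifically essential features",
    },
    "EXPLANATION": {
        1: "explanation as general (not distinctively scientific) information",
        2: "attention to macroscopic physical causes (e.g., heat, the air)",
        3: "explanation via the behavior of microscopic/theoretical entities",
    },
    "ACCURACY": {
        1: "accuracy judged against immediate, given information",
        2: "other sources used, but empirical evidence not distinguished",
        2.5: "empirical data included as a source of accuracy in the model",
        3: "explanatory ideas made consistent with empirical evidence",
    },
    "COMMUNICATION": {
        1: "general awareness of a communicative purpose of models",
        2: "attention to clarity via labels, sentences, key, colors",
        2.5: "empirical data also included to make the model persuasive",
        3: "persuasion by showing consistency with empirical evidence",
    },
}
```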
3.5.3.1. CONTENT

This EIM category captures students' diverse ideas about good models that are based on their concerns with what kinds of features a model includes and how extensively it includes them. Descriptions of this category are presented in Table 3.2. Below, I highlight some of them.

Level 1: Students at this level pay minimal attention to these aspects. Therefore, in constructing/revising their models, students often do not include necessary features (e.g., communicative features) sufficiently or, on the contrary, include extra features (e.g., human figures, background objects) in their models. In commenting on models, they do not attend to model features.

Level 2: Students at this level see a model as a kind of schooling artifact and attend to two aspects. First, they believe that a good model stores all the information that they have acquired about a target phenomenon. Second, they think that a good model is clear. In constructing/revising their models, students at this level include all kinds of information that they have learned about a target in their models. Therefore, they often include scientifically unnecessary features. In Mrs. M's class's curriculum enactment, one such feature that students at this level frequently attended to was empirical data. Although empirical data is part of what they learned about evaporation and condensation, scientific models do not include empirical data itself; instead, they reflect the empirical evidence constructed from such data by updating the explanations in the models. In talking/writing about a model, students at this level often emphasize including details or all that they have learned in it. One thing to note about this level is that students may begin to notice some sort of parsimony as a characteristic of good models, but for simple reasons such as clarity.

Level 3: Students at this level are aware that a good model contains only the features needed to describe and explain a target phenomenon scientifically. These features include microscopic/theoretical entities as an explanatory feature and communicative features (e.g., labels, sentences). In constructing/revising models, students at this level include these features only and do not include extra features in their models. One indicator that distinguishes level 3 from level 2 is empirical data. Students at this level do not include empirical data in their models. Instead, they make their (scientific) explanation consistent with empirical evidence. In talking/writing about models, they attend to these aspects.

Table 3.2. A coding scheme for analyzing students' EIMs in the category of CONTENT

Level 1
Description: They attend minimally to what kinds of model features a model contains and how extensively it contains such features.
Indicators:
CONSTRUCTING/REVISING A MODEL:
- They do not attend much to what kinds of features they include in their model. This minimal attention is evidenced by the inclusion of irrelevant features (e.g., a human figure, background objects) in a model or the lack of communicative features of a model.
TALKING/WRITING ABOUT A MODEL:
- They do not talk about the kinds of features contained or to be contained in a model.
Level 2
Description: They attend to whether a model contains all information they have acquired about its target phenomenon and communicative features. Optionally, they may avoid including too specific or redundant information.
Indicators:
CONSTRUCTING/REVISING A MODEL:
- They include nearly all information they have learned about a target phenomenon and communicative features in a model. They see empirical data as part of what they have learned although it is not an essential feature of a scientific model. So, inclusion of empirical data in a model is an indicator of this level.
TALKING/WRITING ABOUT A MODEL:
- They refer to "details" as what makes a good model. They evaluate a model based on whether it has "details."
- They evaluate a model based on whether it includes sufficiently what they have learned.
- They state that too many details make a model confusing.

Level 3
Description: They attend to whether a model contains all scientifically essential information about its target phenomenon and communicative features. Optionally, they may begin to attend to the generality of model features.
Indicators:
CONSTRUCTING/REVISING A MODEL:
- They include all scientifically essential features (e.g., microscopic/theoretical entities) and communicative features, and no extra features (e.g., empirical data, background objects).
TALKING/WRITING ABOUT A MODEL:
- They evaluate a model based on whether it has all scientifically essential features and communicative features.

3.5.3.2. EXPLANATION

Another criterion by which students evaluate a model is how well it explains its target phenomenon. Most students know that a purpose of a scientific model is to provide an explanation of a phenomenon (see Appendix A.1 for some evidence). However, what they mean by explanation differs from student to student. Depending on students' understanding of and attention to explanation, different levels can be constructed within this category as follows. Descriptions of this category are presented in Table 3.3.

Table 3.3. A coding scheme for analyzing students' EIMs in the category of EXPLANATION

Level 1
Description: They attend to whether a model provides general (not distinctively scientific) information about its target phenomenon.
Indicators:
CONSTRUCTING/REVISING A MODEL:
- In their model, they show or state how their target phenomenon changes over time on a macroscopic scale.
- In their model, they refer to "evaporation" or "condensation" as a term that explains what they observed (e.g., water shrinking over time, water drops forming on a cold surface) rather than as a phenomenon to be explained.
TALKING/WRITING ABOUT A MODEL:
- They evaluate a model based on whether it has time elapse or "before, during, and after."

Level 2
Description: They attend to whether a model shows or talks about physical causes (e.g., heat, light, the air). But they do not attend to microscopic/theoretical entities (e.g., water molecules, kinetic energy).
Indicators:
CONSTRUCTING/REVISING A MODEL:
- They show or state that a light or heat source (e.g., the sun) causes water to evaporate.
- They show or write about "the air" in their model.
TALKING/WRITING ABOUT A MODEL:
- They evaluate a model based on whether it shows a light or heat source.
- They evaluate a model based on whether it represents "the air" accurately.

Level 3
Description: They attend to whether a model explains its target phenomenon by the behavior of microscopic/theoretical entities (e.g., water molecules, kinetic energy).
Indicators:
CONSTRUCTING/REVISING A MODEL:
- They show and talk about water particles (e.g., "water vapor," "water molecules") in their model.
TALKING/WRITING ABOUT A MODEL:
- They state that water particles spread out (for evaporation) or clump together (for condensation).
- They evaluate a model based on whether it has water particles (e.g., "water vapor," "water molecules").

Level 1: Students at this level think of explanation as providing general information and do not attend to scientific explanation. In constructing/revising their models, students at this level do not include a scientific explanation. One indicator of level 1 in this category is that they describe a phenomenon macroscopically, whether or not they show how the phenomenon changes over time. Likewise, when they comment on models, they focus on the same feature.

Level 2: Students at this level attend to physical causes or factors (e.g., the sun, the air) involved in a phenomenon but do not attend to microscopic/theoretical entities (e.g., water molecules) in both constructing/revising and talking/writing about models.

Level 3: Students at this level attend to how microscopic/theoretical entities (e.g., water molecules) behave over time as an explanatory feature in both constructing/revising and talking/writing about a model. I see this feature as an indicator that students understand that scientific explanation involves providing a hidden mechanism for an observable phenomenon.

3.5.3.3. ACCURACY

When students evaluate a model, they also attend to how accurate the model is. However, students vary in how they secure the accuracy (validity) of a model. To identify different levels in this category and construct descriptions of those levels (Table 3.4), I considered two elements: first, what kinds of sources of accuracy do they attend to? Second, how sophisticated is their attention to the relation between sources of accuracy and model features?

Level 1: Students at this level attend to immediate or given information as a source of accuracy and attend to accurately representing it in a model. For example, when they construct models to explain how water on a plate shrinks over time, they attend to this information (water on a plate shrinking over time) and represent it accurately in their models. When they collect new information that contradicts their models, they neither find the inconsistency between them nor revise their models to make them consistent with the new information. In evaluating a model, they show the same concern discussed above or attend to its accuracy for idiosyncratic reasons.

Level 2: In terms of sources of accuracy, students turn to sources other than given information. These sources include prior learning, experiences, and empirical investigations. However, they do not distinguish empirical evidence from other sources of knowledge. In addition, they do not have a sophisticated understanding of the relationship between sources of accuracy and model features. In constructing/revising their models, they simply represent information from the sources mentioned above in their models without reflecting on the accuracy of the information or its sources. They may also use empirical evidence, but focus on their invalid interpretation of empirical evidence or on information other than empirical evidence from an experiment (e.g., the target situation: evaporation of hot water and cold water). Likewise, in talking/writing about a model, students at this level neither refer to sources of accuracy nor articulate the relation between the sources and model features.

Level 2.5: I identified this level while analyzing the data for this study.
Students at this level use empirical evidence for their models in a particular way: they include specific empirical data (e.g., data of percentage humidity, data of weight) in their models and make their explanations consistent with such data. This indicates that students at this level begin to see empirical data as an important source of accurate knowledge for models, important enough to show explicitly in their models.

Level 3: Students at this level attend not only to empirical evidence as a source of accurate (valid) knowledge but also to the relationship between empirical evidence and a feature or idea in a model. In revising their models, students at this level change their explanations to be consistent with empirical evidence that refuted those explanations. This indicates that they know a relatively sophisticated way of securing the accuracy (validity) of a model. In talking/writing about a model, students at this level refer to empirical evidence and articulate how it supports or refutes an explanatory idea.

Table 3.4. A coding scheme for analyzing students' EIMs in the category of ACCURACY

Level 1
Description: They attend minimally to the accuracy of a model.
Indicators:
CONSTRUCTING/REVISING A MODEL:
- In their model, they only describe their target phenomenon accurately, according to immediate (given) information (e.g., a model of evaporation shows accurately that water shrinks as some of it evaporates over time).
- They do not notice the inconsistency between a feature of their model and new information (e.g., teacher's comments, curriculum passages, empirical evidence, etc.). This is evidenced by the fact that they do not reconcile the inconsistency in their next model.
TALKING/WRITING ABOUT A MODEL:
- They evaluate a model based on whether it describes its target phenomenon accurately, according to immediate information or without providing any rationale.
- They argue that a model feature is accurate or inaccurate, but their rationale is somewhat idiosyncratic (e.g., the model's scientific appearance or their personal preference).

Level 2
Description: They do not distinguish empirical evidence from other kinds of information, and they take for granted the accuracy of all information.
Indicators:
CONSTRUCTING/REVISING A MODEL:
- In their model, they represent information other than empirical evidence (e.g., prior knowledge of a target phenomenon) without critically reflecting on the accuracy of the information.
- In their model, they represent their (invalid) interpretation of empirical data.
- In their model, they represent information other than empirical evidence from an empirical investigation.
TALKING/WRITING ABOUT A MODEL:
- They evaluate a model based on whether it is consistent with their (invalid) interpretation of empirical evidence, without referring to empirical evidence or articulating the relation between empirical evidence and the model feature.
- They evaluate a model based on whether it is consistent with information other than empirical evidence from an empirical investigation.

Level 2.5
Description: They begin to note the epistemic authority of empirical evidence.
Indicators:
CONSTRUCTING A MODEL:
- They include empirical data (e.g., percentage humidity) as a source of accuracy in their model, and make an explanatory feature of their model consistent with it.
TALKING/WRITING ABOUT A MODEL:
- They evaluate a model based on whether it has empirical data (e.g., percentage humidity).
- They evaluate a model based on whether it is consistent with empirical data.
Level 3
Description: They attend to the epistemic authority of empirical evidence and the epistemic relationship between empirical evidence and an explanatory idea of a model.
Indicators:
CONSTRUCTING/REVISING A MODEL:
- They make an explanatory idea of their model consistent with empirical evidence (and do not include empirical data).
TALKING/WRITING ABOUT A MODEL:
- To argue for the accuracy or inaccuracy of an explanatory idea of a model, they refer to empirical evidence and articulate how it supports or refutes the explanatory idea.

3.5.3.4. COMMUNICATION

The fourth criterion by which students judge the quality of models is how effectively the models communicate the ideas in them. The range of levels in this category is described below and in Table 3.5.

Level 1: Students at this level do not attend to how to make a model communicatively effective (in constructing/revising a model) or simply mention the communicative purpose of a model (in talking/writing about a model).

Level 2: Students at this level attend to communicative features such as labels, sentences, a key, and colors that contribute to the clarity of models in both constructing/revising and talking/writing about a model.

Level 2.5: Like level 2.5 in ACCURACY, I identified this level in this category while analyzing the data for this study. Students at this level include empirical data not only as a source of accuracy of a model but also to make the model persuasive. I consider this limited attention to the persuasive efficacy of models an important step toward the next higher level.

Level 3: Students at this level know a sophisticated way of making a model persuasive. In both constructing/revising and talking/writing about a model, they attend to making a model feature or idea consistent with empirical evidence to make the model persuasive.

Table 3.5. A coding scheme for analyzing students' EIMs in the category of COMMUNICATION

Level 1
Description: They are generally aware of the communicative purpose or function of models or of an audience.
Indicators:
CONSTRUCTING/REVISING A MODEL:
- They show a target phenomenon visually but do not use such communicative features as labels, sentences, and a key. So, their model lacks clarity.
TALKING/WRITING ABOUT A MODEL:
- They mention generally a communicative purpose of models.

Level 2
Description: They attend to whether a model conveys its idea clearly, but not to whether it conveys its idea persuasively.
Indicators:
CONSTRUCTING/REVISING A MODEL:
- In their model, they represent a target phenomenon both visually and using such communicative features as labels, sentences, colors, and a key.
TALKING/WRITING ABOUT A MODEL:
- They evaluate a model based on whether it has such communicative features as labels, sentences, colors, and a key.

Level 2.5
Description: They attend to whether a model conveys its idea clearly and persuasively. But their understanding of how a model conveys its idea persuasively is limited.
Indicators:
CONSTRUCTING/REVISING A MODEL:
- In their model, they represent a target phenomenon both visually and using such communicative features as labels, sentences, colors, and a key. In addition, they include empirical data (e.g., percentage humidity) in it to make it persuasive.
TALKING/WRITING ABOUT A MODEL:
- They evaluate a model based on whether it has empirical data but do not articulate how the empirical data makes the model persuasive.

Level 3
Description: They attend to whether a model conveys its idea clearly and persuasively. And their understanding of how a model conveys its idea persuasively is sophisticated.
Indicators:
CONSTRUCTING/REVISING A MODEL:
- They make an explanatory idea of their model consistent with empirical evidence (and do not include empirical data).
TALKING/WRITING ABOUT A MODEL:
- They argue that an explanatory idea in a model is accurate (or not accurate) by showing how it is supported (or not supported) by empirical evidence. In this process, they may compare this idea to an alternative idea.

3.5.4. Analysis of three students' EIMs

Using the EIM coding scheme, I analyzed the three focus students' EIMs in their various modeling activities. The following nine modeling activities were subject to analysis.

- M1: Constructing an initial model of evaporation
- M2: Constructing a second model of evaporation (by evaluating/revising the prior model)
- M3: Evaluating peers' second models of evaporation
- M4: Constructing a third model of evaporation (by evaluating/revising the prior model)
- M5: Constructing a group consensus model of evaporation
- M6: Constructing an initial model of condensation
- M7: Constructing a second model of condensation (by evaluating/revising the prior model)
- M8: Constructing a group consensus model of condensation
- M9: Evaluating other groups' consensus models of condensation

Regarding the kinds of data, I analyzed models in M1, M2, M4, M6, and M7; utterances and notes in M3 and M9; and models and utterances in M5 and M8. As noted above, I compiled these different kinds of data in one (Excel) file. Analysis of students' EIMs from these kinds of data requires more than mechanistic application of the EIM coding scheme to the data; it requires a high level of interpretation on the part of the analyst. For this interpretation to be trustworthy, the coder needs to be familiar with the whole process of the class's curriculum implementation as well as with how the class made meaning of scientific modeling over time. To meet this condition, I familiarized myself with the data about the focus students' modeling activities as well as with other bodies of data, such as the classroom discourse data and interview data.

3.5.5. Analysis of curriculum events

For each modeling activity that the focus students engaged in, I looked at the curriculum events that preceded that modeling activity and identified those that potentially influenced the students' EIMs. Although there are various factors that might have affected their EIMs, I found that preceding curriculum events played a relatively large role in helping shape the students' modeling activities and EIMs in several ways. First, because students did not have particular ideas about modeling activities when they began the unit (they only had general understandings of models and modeling), they had to depend on the instructions and framing about modeling activities given by the curriculum materials and their teachers. Second, although students raised various ideas about modeling, they used their teachers' evaluations of those ideas to decide whether to discard, revise, or keep those ideas. Next, I analyzed the ideas about modeling found in those curriculum events using the EIM coding scheme. One difference between the analysis of students' EIMs and the analysis of ideas about modeling detected in curriculum events is that, because curriculum events aimed at achieving particular objectives, the ideas about modeling found in them are not always analyzable with respect to all four categories. For example, when a teacher emphasized adding what students had learned about evaporation to their models of evaporation, the main idea about modeling found in this scaffolding comment is subject to analysis mainly with regard to the EIM category of CONTENT.
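As a schematic companion to the procedure described above, the sketch below shows one way the coded records underlying this analysis could be organized: for each focus student and modeling activity (M1 through M9), a level in each applicable EIM category, together with the preceding curriculum events considered. This is an illustrative rendering only; the actual coding was kept in the Excel files described earlier. The single filled-in entry reproduces the codes reported for Brian's initial model of evaporation in Chapter 4, and the remaining activities are deliberately left uncoded here.

```python
# Illustrative sketch of how coded EIM records could be organized for the
# microgenetic analysis. The actual coding was done in Excel files; this is
# only a schematic rendering of that bookkeeping, not the analysis itself.

MODELING_ACTIVITIES = ["M1", "M2", "M3", "M4", "M5", "M6", "M7", "M8", "M9"]
EIM_CATEGORIES = ["CONTENT", "EXPLANATION", "ACCURACY", "COMMUNICATION"]

# One coded observation: a student's EIM levels in one modeling activity,
# together with the preceding curriculum events considered in the analysis.
# The M1 entry reproduces the codes reported for Brian's initial model of
# evaporation (Chapter 4); later activities are left uncoded in this sketch.
coded_records = {
    ("Brian", "M1"): {
        "levels": {"CONTENT": 1, "EXPLANATION": 2, "ACCURACY": 2, "COMMUNICATION": 1},
        "preceding_events": ["Introducing scientific models",
                             "Instructing how to construct a model"],
    },
    # ("Brian", "M2"): {...}, and so on for the remaining activities and students.
}

def trace(student, category):
    """Return the student's level in one EIM category across M1-M9 (None if uncoded)."""
    return [coded_records.get((student, m), {}).get("levels", {}).get(category)
            for m in MODELING_ACTIVITIES]

print(trace("Brian", "EXPLANATION"))  # [2, None, None, None, None, None, None, None, None]
```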
CHAPTER 4. ANALYSES OF IDEAS ABOUT MODELING FROM CURRICULUM EVENTS AND OF THREE STUDENTS' EPISTEMOLOGIES IN MODELING OVER TIME

In the previous chapter, I introduced the coding scheme used to analyze the three focus students' EIMs deployed in their various modeling activities and the ideas about modeling present in the curriculum events that preceded each modeling activity. This chapter presents the results of those analyses. The chapter is organized in three sections. In the first section, I outline how Mrs. M's class implemented the model-based unit of evaporation and condensation introduced in the previous chapter. This overview allows readers to see the general context in which the various curriculum events took place and the focus students' EIMs developed. In this first overview section (4.1), I show which curriculum events I focused on for the analysis in the rest of the chapter. The second section (4.2) offers a microgenetic analysis of the three focus students' EIMs used in their multiple modeling activities and an analysis of the ideas about modeling detected in the curriculum events that preceded and thus potentially influenced those modeling activities. In this second section, I provide the results of these two analyses in chronological order. Lastly, in the third and final section (4.3), I provide a summary of the analysis results presented in the second section. In this way, I attempt to address the two research questions of this study: (1) how did the three focus students' EIMs change over time, and (2) what curriculum events influenced the change of their EIMs, and in what ways?

4.1. Overview of How Mrs. M's Class Enacted the Model-Based Unit of Evaporation and Condensation

Table 4.1 outlines the main activities and curriculum events that Mrs. M's class engaged in as they implemented the model-based unit of evaporation and condensation. Note that the curriculum events and modeling activities that are analyzed later are underlined and in bold type, respectively. For the modeling activities, codes are given for convenient reference.

Table 4.1. Overview of Mrs. M's class's enactment of the unit

Before the unit began (Class 1)
- Mrs. M set up a two-cup experiment: she placed two cups—one open, the other covered with plastic—containing the same amount of water on a table for students to observe over the next several days. This was one of the experiments about evaporation.

The unit anchoring phenomenon and questions (Class 1)
- A solar still was introduced to students. After they observed the solar still placed on each table, they were asked two questions: "Would you drink the liquid in the bottle cap?" and "Do you know what that liquid is and how it got there?" They talked about what they would need to know and do to answer these questions.

Anchoring phenomena about evaporation and questions (Class 2)
- Students were introduced to two phenomena—water shrinking over time on a plate and in a humidifier—and were asked questions: "What happened to the water? Where did it go? How? Why?"

Construct an initial model of evaporation (Class 2)
- Students read the passages in their student notebooks that introduce scientific models and instruct how to construct a model.
- Individually, students constructed their initial models of evaporation [M1].
- In groups, students shared their initial models of evaporation with others.

Empirical investigations of evaporation (Class 3 ~ Class 7)
- Mrs. M asked students to explain "evaporation," and several students came up with their ideas.
- Students were introduced to empirical evidence as a criterion for evaluating models (Class 3).
- Students conducted the following four sets of experiments about evaporation (Class 3~Class 5). As students conducted these experiments, both the passages in the student notebook and Mrs. M helped them link empirical evidence to their initial models of evaporation.
  - Experiment 1: Two cups revisited
  - Experiment 2: Cobalt chloride strips
  - Experiment 3: Humidity detector
  - Experiment 4: Hot vs. cold water
- Mrs. M and the focus students had a conversation about what they understood about evaporation as a result of doing the empirical investigations about evaporation.

Evaluate/revise the initial model of evaporation (Class 7)
- As a class, students came up with what they thought makes a good model.
- Individually, students constructed their second models of evaporation [M2].

Evaluate one another's second models of evaporation (Class 7 ~ Class 8)
- Using the handout she had made on the basis of the student notebook, Mrs. M instructed students how to evaluate others' models (Class 7).
- In groups, students evaluated one another's second models of evaporation [M3].

Scientific ideas and models (Class 9)
- Students watched several computer simulations about state changes.
- At recess, students went out to the playground and acted out being water molecules: they bumped into one another, spread out (at a high temperature), and clumped together (at a cold temperature) at Mrs. M's prompts.

Evaluate/revise the second model of evaporation (Class 10)
- Using questions written on a board, Ms. H guided students in constructing their third models of evaporation.
- Individually, students constructed their third models of evaporation [M4].

Construct a consensus model of evaporation (Class 10 ~ Class 11)
- Students shared with the class the changes they had made to their second models of evaporation (Class 10).
- Ms. H introduced what a consensus model is and how to construct it (Class 10).
- Students talked about what to consider in constructing a consensus model (Class 10).
- Mrs. M introduced a consensus model and instructed how to construct it (Class 11).
- In groups, students constructed their group consensus models of evaporation [M5].

Evaluate each group's consensus model of evaporation (Class 12 ~ Class 14)
- As a class, students evaluated each group's consensus model of evaporation. Mrs. M played an important role in this process.

Anchoring phenomena about condensation and questions (Class 14)
- Students were introduced to two phenomena that both involve water drops appearing on a cold object—water drops forming under plastic wrap that covers a cup containing water and on a cold bottle—and were asked questions: "What do you observe appearing on it? Where did it come from? How did it get there?" At that time, Mrs. M and three students acted out how water particles stick to a cold bottle. She also reminded students of the computer simulations and the playground performance.

Construct an initial model of condensation (Class 15 ~ Class 16)
- Using the passages in the student notebook, Mrs. M reminded students of how to construct a model (Class 15).
- Individually, students constructed their initial models of condensation [M6].
- Students moved around the classroom to present their initial models of condensation to others and to get feedback from them (Class 16).

Empirical investigations of condensation (Class 17 ~ Class 19)
- Students conducted the following four sets of experiments about condensation. As students conducted these experiments, Mrs. M helped them link empirical evidence to their initial models of condensation using the handout she had created.
  - Experiment 1: Soda can & ice pack
  - Experiment 2: Mirror
  - Experiment 3: Humidity in a container
  - Experiment 4: Weighing an ice pack over time

Evaluate/revise the initial model of condensation (Class 19)
- Ms. H and Mrs. M gave scaffolding comments to help students construct their second models of condensation.
- Individually, students constructed their second models of condensation based on the empirical evidence about condensation [M7].

Evaluate one another's second models of condensation / Construct a consensus model of condensation (Class 20 ~ Class 21)
- Mrs. M gave a brief instruction on how to evaluate one another's second models of condensation and construct a group consensus model of condensation.
- In groups, students evaluated one another's second models of condensation.
- In groups, students constructed their group consensus models of condensation [M8].

Evaluate one another's group consensus models of condensation (Class 22)
- Groups were paired and evaluated their partner group's consensus model of condensation. The focus group and another group (Group 3) evaluated each other's consensus model of condensation. Then, the two groups evaluated Group 4's consensus model of condensation [M9].

Comparing this enacted curriculum with the designed curriculum (outlined in Chapter 3) reveals several characteristics of Mrs. M's class's enactment of the unit. First, Mrs. M's students spent a fair amount of time evaluating others' models. According to the original curriculum, this activity was planned to take place only twice—once for evaporation and once for condensation. Mrs. M had students engage in this modeling activity two additional times. After they constructed group consensus models of evaporation, Mrs. M presented the group consensus models of evaporation one after another to the class, and students as a class evaluated them over nearly three class sessions (Class 12~Class 14). After making group consensus models of condensation, they undertook this activity again. At that time, they had a different participation structure: every two groups formed a dyad and evaluated each other's consensus model of condensation. This extended engagement gave them more opportunities to experience the social milieu of critiquing one another's models, and thus potentially helped them develop in other scientific practices such as presentation and argumentation.

Second, students in Mrs. M's class did not use models to explain or predict other cases of the same phenomenon. According to the original design of the curriculum materials, students were meant to use their consensus models of evaporation and condensation for other cases involving these phenomena. These activities were designed to support students' attention to the generality of models and modeling, one of the criteria highly regarded in the scientific community.
Increased attention to this criterion in turn requires that students focus on a model's essential features and know how to make them more abstract. Given this underlying purpose, the fact that they did not engage in these activities might have affected their attention to the criterion of generality.

Finally, Mrs. M combined the activities of evaluating one another's second models of condensation and of constructing a consensus model of condensation, and skipped the scientific ideas and models about key concepts of condensation as well as the activity of constructing a third model of condensation. Mrs. M seems to have made this decision because of time constraints, as students had spent more time than planned in prior activities. At the same time, condensing the end of the unit may have indicated that students were familiar with the model-based sequence. However, this way of undertaking the series of modeling activities did not allow students to think about their modeling activities more deeply and critically. As a result, students did not have a chance to develop their EIMs further.

4.2. Tracing Three Students' EIMs and Ideas about Modeling in Curriculum Events over Time

To address the research questions for this study, this section of the chapter presents the results of the analyses of the three focus students' EIMs and of the ideas about modeling from the curriculum events that preceded each modeling activity, in chronological order. I chose this way of presenting the analysis results because it can provide a rich documentation of the complex and dialectical ways in which curriculum events and these students' EIMs evolved over time, which are otherwise hard to capture.

4.2.1. The activity of constructing an initial model of evaporation (M1) and its preceding curriculum events

On the second day after the curriculum was introduced (Class 2), Mrs. M's students conducted their first modeling activity. After being introduced to and discussing two situations that both involve water shrinking over time, students were asked to construct models to explain one of these physical phenomena. To facilitate this process, Mrs. M used the passages in the student notebook to give students general information about scientific models and instructions on how to construct a model. In this part of the chapter, I first analyze these two short curriculum events in combination, with a focus on the ideas about modeling they conveyed. Then, I analyze the three focus students' EIMs from their initial models of evaporation using the EIM coding scheme to show how they attended to the four EIM categories in constructing these models.

4.2.1.1. Introducing scientific models / Instructing how to construct a model (Class 2)

Before students constructed their initial models of evaporation, they read two passages in their student notebooks. First, following Mrs. M's direction, they read paragraphs that introduce scientific models. Below are some of the paragraphs from the curriculum materials that describe scientific models.

- Excerpt 1: A scientific model is a representation (like a diagram, simulation, equation) that simplifies a system (like the solar system) or a phenomenon (like evaporation), to highlight its main parts.
- Excerpt 2: Scientific models include those main parts, relationships between those parts, and rules for how the model runs.
Scientists often use their scientific models to make sense of a system or phenomenon, to communicate their understandings to others, and to generate explanations and predictions for a new system or phenomenon.

Subsequently, Mrs. M had students read the following passages that instruct how to construct an initial model. (Words in parentheses are part of the excerpts.)

- Excerpt 3: You could draw a 'before – during – after' picture of evaporation (like a comic strip or cartoon). In other words, show how it happens over time.
- Excerpt 4: Your model should capture not just "what happens to the water" (description) but "why or how it happens" (explanation or mechanism)

Note that because Mrs. M neither elaborated on these passages nor had the class discuss them, these passages were almost the only resource that students could use to construct their initial models of evaporation. I assume, therefore, that the ideas about modeling from these passages were potentially influential on the students' EIMs that guided their first modeling activity.

These excerpts as a whole potentially influenced students' EIMs in several ways. First, they emphasized that a model is not a replica but a simplified representation of its target. What this idea implied to students with respect to modeling was that they needed to choose some aspects of the phenomenon being modeled. Regarding what aspects of a phenomenon need to be represented, "main parts, relationships between those parts, and rules for how the model runs" (excerpt 2) and "description…explanation or mechanism" (excerpt 4) were mentioned. These ideas potentially allowed students to see that a model should contain certain kinds of features (CONTENT) and that a model should not only describe its target but also explain it somehow (EXPLANATION). However, the kinds of features to be included in a model and the nature of explanation were not specified. For example, what the "main parts" (excerpt 2) of a phenomenon are remained vague. Also, though some ideas were given about explanation (e.g., "why or how it happens," "mechanism") (excerpt 4), these ideas possibly invoked a range of ideas among students, from narrative-based explanation to mechanistic explanation. Second, excerpt 2 touched on a communicative purpose of models. This statement potentially helped students attend to making a model clear (COMMUNICATION). But, considering that how to achieve this goal was not explained in any of these excerpts, we can think of different ways in which this statement influenced students' EIMs in COMMUNICATION. An analysis of the ideas about modeling from the excerpts using the EIM coding scheme is presented in Table 4.2.

Table 4.2. Analysis of the ideas about modeling from the passages in the curriculum materials that introduce scientific models and instruct how to construct a model

CONTENT (Level 2~3)
- A model was introduced not as a copy of its target but as a simplified representation of it (excerpt 1).
- Regarding model constituents, "main parts, relationships between those parts" (excerpt 2) and "description" and "explanation or mechanism" (excerpt 4) were mentioned. But what they mean remained general.
- The communicative purpose of a model was mentioned (excerpt 2).

EXPLANATION (Level 1~3)
- Description and explanation were distinguished.
- Regarding how a model explains its target, "main parts, relationships between those parts" (excerpt 2) and "explanation or mechanism" (excerpt 4) were mentioned. But what they mean remained general.
ACCURACY (Level N/A)
- N/A

COMMUNICATION (Level 1)
- Only a communicative purpose of a model was mentioned (excerpt 2).

4.2.1.2. The focus students' construction of their initial models of evaporation (Class 2)

After reading the passages in their student notebooks described above, students individually created their initial models of evaporation. Next, they shared their initial models of evaporation with others in groups. It is important to emphasize that the activity of sharing one's model with others was the first social modeling activity that took place in this classroom. However, because the focus students did not talk much in this activity, I do not analyze the activity separately but use what they said in the activity as additional data in my analysis of their initial models of evaporation.

Below, I present the three focus students' initial models of evaporation and an analysis of their EIMs. To better capture their EIMs, I use two kinds of additional data in my analysis. First, I include the notes the focus students wrote about the target phenomenon in their notebooks. They made these notes when they were introduced to a phenomenon in which water on a plate shrinks over time and were asked, "What do you think has happened to the water on the plate? Where do you think it has gone? How? Why?" Because students targeted this phenomenon for their initial models of evaporation, the focus students' notes provide additional information about how they explained the target phenomenon in their models. Second, as noted above, I also analyze the comments they made about their models when they presented their models in their groups after constructing their initial models of evaporation.

Brian's initial model of evaporation

Figure 4.1. Brian's initial model of evaporation

- Brian's notes: I think the water on the plate evaporated. It went into the air. It went into the air by evaporating into the air.
- Brian's utterances: Okay, here is the water, (lab) water in Mr. (?) 's experiment. This is the second day when there's less water (???) water. [Pointing to a figure] Oh, this is the guy who's checking on it every day. (Hereafter, words in parentheses signify utterances for which I have some doubt, and quotation marks in parentheses signify inaudible utterances; the number of quotation marks indicates the length of the inaudible utterance.)

Analysis of Brian's EIM from these data is summarized in Table 4.3.

Table 4.3. Analysis of Brian's EIM in constructing his initial model of evaporation

CONTENT (Level 1)
- Represented the phenomenon mainly using images and did not use such communicative features as labels and sentences, except "day 1, day 2, day 3."
- Included an extra feature (e.g., a human figure).

EXPLANATION (Level 2)
- Explained how water on a plate shrinks over time by showing that water evaporates into the air.

ACCURACY (Level 2)
- Used the idea of water evaporating into the air from an external source (e.g., a lesson on the water cycle) without attending to the accuracy of the idea.

COMMUNICATION (Level 1)
- Represented the phenomenon mainly using images and did not use such communicative features as labels and sentences, except "day 1, day 2, day 3."

Joon's initial model of evaporation

Figure 4.2. Joon's initial model of evaporation

- Joon's note: I think the water evaporated and the water goes up because water evaporated.

Analysis of Joon's EIM from these data is presented in Table 4.4.
Table 4.4. Analysis of Joon's EIM in constructing his initial model of evaporation

CONTENT (Level 1)
- Did not include such communicative features as sentences.

EXPLANATION (Level 1)
- Explained how water on a plate shrinks over time by showing that some of the water evaporates into the air.

ACCURACY (Level 2)
- Used the idea of water evaporating into the air from an external source (e.g., a lesson on the water cycle) without attending to the accuracy of the idea.

COMMUNICATION (Level 1)
- Did not use such communicative features as sentences to describe that water on a plate shrinks over time.

Mana's initial model of evaporation

Figure 4.3. Mana's initial model of evaporation. Note: The words in this figure are as follows (verbatim). A. Before; B. 3 days later; C. After; D. The arrows are the water evaporation; E. Those little dots are the water; F. The water is full to the top; G. All of the water evaporated; H. Those dots are the water that was on the dish; I. When water evaporated the water goes into the clouds; J. There is one little drop of water left.

- Mana's notes (verbatim): I think the water evaporated. I think it evaporated because the water in the dish before was filled move. I think it happen like this; you left the water out for a few day and it dissapered (sic). (It evaporated.)

Table 4.5 presents a summary of the analysis of Mana's EIM from these data.

Table 4.5. Analysis of Mana's EIM in constructing her initial model of evaporation

CONTENT (Level 3)
- Included communicative features such as sentences.
- Described the phenomenon by showing that water on a plate shrinks over time.
- Explained the phenomenon by showing/writing that water particles go up into clouds.
- Included no extra features.

EXPLANATION (Level 3)
- Explained how water on a plate shrinks over time by showing/writing that water particles go up into clouds.

ACCURACY (Level 2)
- Used the idea of water particles going up into clouds from a previous source (e.g., a lesson on the water cycle) without attending to the accuracy of the idea.

COMMUNICATION (Level 2)
- Included many sentences to clearly communicate her idea.

The analyses of the focus students' EIMs as deployed in constructing their initial models of evaporation are summarized in Figure 4.4.

Figure 4.4. Analyses of the focus students' EIMs in constructing their initial models of evaporation (levels 0–3 in CONTENT, EXPLANATION, ACCURACY, and COMMUNICATION for Brian, Joon, and Mana). For interpretation of the references to color in this and all other figures, the reader is referred to the electronic version of this dissertation.

Generally, Brian's EIM and Joon's EIM were similar, whereas Mana had a more advanced EIM at this time. Below, I discuss the similarities and differences between their EIMs and how the preceding curriculum events affected their EIMs in each category.

CONTENT: The difference between the two boys' EIMs and Mana's EIM in this category came mainly from their different levels of attention to communicative features. Brian and Joon attended to including such features in their initial models of evaporation only minimally, whereas Mana gave a higher level of attention to them, as evidenced by the fact that she used many sentences to clearly describe and explain her target phenomenon. I discuss this further below under COMMUNICATION.

EXPLANATION: All three students showed their target phenomenon at multiple time points.
I argue that this act was influenced directly by the instruction of how to construct a model in the student notebook which explicitly emphasized showing change of a phenomenon over time. Another commonality across the three students’ initial models of evaporation regarding this dimension is that they all provided both a description and an explanation for 77 their phenomenon. It appears that all three students followed one of the instructions in the student notebook that emphasized these two distinctively (See Table 4.2). However, the two boys and Mana paid different levels of attention to how to explain their target phenomenon: Brian and Joon only showed that water goes up using arrows whereas Mana explained how water on a plate shrinks over time by showing and writing that small water particles, a microscopic/theoretical feature, go to clouds. This difference likely arose from the difference in their prior knowledge or in their attention to using their prior knowledge. ACCURACY: Despite the three students’ different levels of ideas, one thing was common regarding this category: they all used explanatory ideas (e.g., water going into the air, water going up into clouds) from their previous source of knowledge without thinking of the accuracy of those ideas. This result is not surprising considering the fact that no clear instruction as to how to make a model accurate (valid) was given in the preceding curriculum events. COMMUNICATION: In the passages from the curriculum material that introduce models and instruct how to construct a model, this dimension of modeling was not very emphasized; only a communicative purpose of models was mentioned (See Table 4.2). This was reflected in Brian’s and Joon’s first models. They both paid little attention to making their ideas clear in their models. But, Mana used a lot of sentences to clearly communicate her ideas in her model. She could activate this epistemic idea despite the lack of emphasis of the curriculum material on it. 4.2.2. The activity of constructing a second model of evaporation (by evaluating and revising the prior model) (M2) and its preceding curriculum events In the next part of the unit, Mrs. M introduced students to a scientific way of validating ideas in models using empirical evidence. Empirical evidence occupies a particularly important epistemic status in scientific inquiry; scientists place high value on empirical evidence as an epis- 78 temic criterion for securing the validity of scientific knowledge. Aware of this, the curriculum designers planned to have students engage in collecting empirical evidence and using them to improve their models. To this end, the students were introduced to the notion of empirical evidence as a criterion for evaluating models (Class 3) and then conducted four sets of experiments on the phenomenon of evaporation to collect a body of empirical evidence about it (Class 3~Class 5). During this time, Mrs. M offered a good deal of scaffolding to help students understand empirical investigations not simply as a hands-on activity but as a scientific epistemic practice. In addition, she gave students time to discuss and make sense of their collected empirical evidence in groups (Class 6). Below, I describe and analyze these curriculum events with a focus on how this classroom community constructed and communicated certain ideas about modeling through them. Then, I analyze the focus students’ EIMs from their second models of evaporation. 4.2.2.1. 
Introducing empirical evidence as a criterion for evaluating models (Class 3)

After they shared their ideas about evaporation, Mrs. M's students entered into the activity of conducting empirical investigations about evaporation. The purpose of this activity was for the students to collect empirical evidence about evaporation and, based on it, to evaluate and revise their initial models of evaporation. Use of empirical evidence to support or refute one's knowledge or other epistemic products (e.g., explanations, models) is a crucial process in the establishment and development of scientific knowledge. But, elementary school students are likely to find this practice alien and challenging. Therefore, they need a good deal of support from curriculum materials and their teachers. Now, I look at what Mrs. M did to help her students figure out this epistemologically important process.

First, Mrs. M introduced empirical investigations and empirical evidence as a means to evaluate the quality of models using passages in the student notebook. She began by reading the following paragraph in the student notebook:

Now we have various initial ideas of models of evaporation. Do you think they are all equally good? If not, based on what, based on what can we say that some models are good while others are not as good? Specifically how can we identify the aspects of a certain model that need to be revised?

This paragraph suggests that because the qualities of models vary, some ways for evaluating and revising models are needed. Mrs. M then used subsequent passages in the student notebook to introduce empirical evidence, as the following excerpt shows.

1 Mrs. M: [Reading the student notebook] "One important way to judge the quality…of models is to test it with a set of empirical evidence. Suppose here are 10 pieces of evidence about evaporation. We can find in our daily life and in more systematic experiments. If model A explains 9 of them and model B only 3 of them, which one can we say is better?" Which one do you think is better? Lassie?
2 Lassie: Model A.
3 Mrs. M: Why?
4 Lassie: It has, it has more examples.
5 Mrs. M: It has more examples. Remember what I drew? Just a big circle? And I said, "This is the globe, this is the earth, this is a model of the earth." And you guys are like, "No, it's not." And I said, "Yeah, it is. This is the model of the earth." And you said, "We know but it's not good." Right? We had to add some different things to that, that's kind of what we are talking about here. Sure, [Reading the student notebook] "you can say that model A is better. It is exactly what we're going to do this time. We will conduct a series of experiments about evaporation to improve our model. What you need to do is collect a set of empirical evidence…What you need to do is to collect empirical evidence to test your model with them." Circle "test your model," circle it, put a star next to it. This is exactly what your objective is for the day, alright?

In this excerpt, two quite different ideas about how to use empirical evidence for evaluating models can be detected. First, the passages in the student notebook that Mrs. M read (line 1) not only introduced empirical evidence as a criterion for evaluating models but, more specifically, described how empirical evidence is used to distinguish a better model from a not-as-good one: if one model explains more pieces of empirical evidence than its competitor does, the former is evaluated to be the better model of the two.
In essence, this is an advanced idea of model evaluation. But, it was provided in a rather ambiguous and simplistic manner. First, the two verbs used to show the relationship between empirical evidence and models—a model "explains" empirical evidence (line 1) and empirical evidence "tests" a model (line 5)—were perhaps hard to reconcile in the students' minds. Second, discussion of other related topics was absent. Neither the passages in the student notebook nor Mrs. M provided any explanation of such issues as what empirical evidence is, how it is obtained, and why it is important for evaluating models. (They had brief conversations about what empirical means in empirical investigations before and after this curriculum event. But, the conversations were not substantial and were spent mostly on connecting the notion of empirical to some of what the students had formerly learned about in another teacher's classroom ("the seven da Vincian principles").) Without such information, the students might have found it difficult to grasp what empirical evidence is about. This was manifested when Lassie used a more familiar word, "examples," instead of "evidence" (line 4).

A second idea can be identified in Mrs. M's utterance (line 5) following Lassie's response. Before discussing it, I would like to emphasize that Mrs. M's question, "Why?" (line 3), was a pedagogically powerful teacher move that could have led the class to discuss deeper ideas about modeling, including the ones mentioned above. Why is a model that explains more pieces of empirical evidence better than others? To answer this question properly, all other epistemological issues need to be discussed as well. It appears, however, that Mrs. M did not want to go further down that route but took an easier one at this moment. Following Lassie's response (line 4), Mrs. M referred to the previous conversation they had had about the globe as a model of the earth and restated the conclusion of that conversation: a simple globe is still a model of the earth, but, to be a good one, "[w]e had to add some different things to that." Here, we can detect a relatively naïve idea that a good model contains many "different things." Furthermore, this utterance implied that empirical evidence is one kind of information to be added to a model to make it better. In Table 4.6, I provide an analysis of these two ideas using the EIM coding scheme.

Table 4.6. Analysis of the ideas about modeling from the curriculum event of introducing empirical evidence as a criterion for evaluating models
CONTENT (Level 2): Mrs. M introduced empirical evidence as a kind of information to be included in a model (level 2).
EXPLANATION: N/A
ACCURACY (Level 2~3): In the curriculum material, empirical evidence was introduced as a criterion for evaluating models. In particular, it was stated that a model that can explain more pieces of evidence is a better model (level 3). Mrs. M introduced empirical evidence as a kind of information to be included in a model (level 2).
COMMUNICATION (Level 2.5~3): Empirical evidence was introduced as something that makes a model persuasive. However, how empirical evidence is used to meet the goal was not specified.

4.2.2.2. Linking empirical evidence to models during empirical investigations on evaporation (Class 3~Class 5)

After the general introduction of empirical evidence, they conducted four experiments to empirically investigate various aspects of evaporation from Class 3 to Class 5.
Below, the experiments and the results are summarized: - Experiment 1: Two Cups Revisited. In Class 3, they compared the two cups (an open cup and a cup covered with plastic) they had set up before starting the unit. At the time the experiment was set up, both cups contained the same amount. Comparing the two cups, they found that the water level in both cups went down but that the water level in the open cup went down more than that in the covered cup. 82 - Experiment 2: Cobalt Chloride Strips. In Class 4, they placed three blue (dried) cobalt chloride strips in front of a humidifier at different positions—one at 5 centimeters, another at 10 centimeters, and the third at 15 centimeters from the humidifier—and measured the time it took each strip to turn pink. They found that the farther a strip was away from the humidifier, the longer it took to change colors. - Experiment 3: Humidity Detector. In Class 5, they placed a cup of water in a hood and measured humidity inside the hood with a humidity detector for some time. They found that humidity in the hood increased over the time. - Experiment 4: Hot vs. Cold Water. In Class 5, they set up two sets of Experiment 3 with the only difference being that they used hot water for one set and cold water for the other, and measured humidity inside each hood for some time. They found that humidity in both sets increased over the time but that the humidity for the hot water went up faster than that for the cold water. As they went through these experiments, the student notebook and Mrs. M helped them figure out how to use empirical evidence to improve their models. First, the passages in the student notebook about Experiment 1 exemplified how to do it. The passages asked the students to predict what would have happened to the water level in the open and covered cups if water had seeped through the bottom of the cups. Next, they were asked to compare this prediction and the actual result. (Because they had set up this experiment several days ago, they could see the result at this time.) Last, the passages helped the students to reason from the difference between their prediction and the empirical evidence that the idea of water seeping through the cup was not valid. Mrs. M assisted the students in taking these steps. These passages had several interrelated ideas about modeling. First, it showed clearly that empirical evidence is a source by which the 83 accuracy (validity) of an idea is evaluated. Second, it illustrated one scientific way to use evidence for model improvement: when an idea in a model is not supported by empirical evidence, it should be discarded. Second, Mrs. M used a particular sequence in addition to the prompts in the student notebook to guide the students to undertake the next three experiments. As a modified rendition of the well-known Predict-Observe-Explain framework (R. T. White & Gunstone, 1992), this sequence consisted of Predict, Share, Observe, and Explain. Note that this was introduced by Mrs. M, not by the student notebook. The student notebook generally took the following sequence: providing basic information about each experiment and tools to be used (e.g., a humidity detector); having students conduct an experiment (e.g., measuring humidity in a hood covering a cup of water with a humidity detector), collect data, and interpret the result; and finally asking them, “How may this evidence improve your model?” Mrs. 
M used the Predict-Share-Observe-Explain sequence and the general sequence of the student notebook in combination: she had students (1) Predict the result of each experiment, (2) Share their predictions with others, (3) Observe what happens as they conducted each experiment and collect data, (4) Explain or interpret the result, and (5) write their ideas of improving their models using this empirical evidence in their notebooks. Although pedagogically helpful, this hybrid sequence was limited in helping the students understand a scientific way of using evidence to improve models. First, in Predict, Mrs. M asked the students to make a prediction for a given situation but not on the basis of their model. Therefore, it was possible that even if the students’ predictions were refuted by empirical evidence, their models remained intact. Second, they were asked to “Explain” or “interpret” the result of an experiment. Both the verbs encouraged the students to come up with diverse and sometimes even 84 invalid ideas. Finally, the question of “How may this evidence improve your model?” also opened up a door through which various ideas of model change could come in. Analysis of these two efforts to link empirical evidence to models is presented in Table 4.7. Table 4.7. Analysis of the ideas about modeling from the curriculum material and Mrs. M’s scaffolding that tried to link empirical evidence to models Category Indicators Level CONTENT N/A N/A EXPLANATION N/A N/A ACCURACY - The passages about Experiment 1 showed how empirical evi2~3 dence is used to refute an invalid idea. This is a sophisticated way of securing accuracy (validity) of a model (level 3). - Mrs. M’s guide of how to use empirical evidence to improve a model was open to various ways of using empirical evidence for model improvement (level 2~3). COMMUNICATION - The passages about Experiment 1 showed how empirical evi- 2.5~3 dence is used to refute an invalid idea. This is a sophisticated way of making a model persuasive (level 3). - Mrs. M’s guide of how to use empirical evidence to improve a model was open to various ways of using empirical evidence to make a model persuasive (level 2.5~3). 4.2.2.3. “What do you now understand about evaporation as a result of doing empirical investigations?” (Class 6) As they wrapped up their empirical investigations about evaporation, Mrs. M read the following paragraph in the student notebook: Look at the findings that we collected so far. We know that water does not seep through the plate. We know that water does not disappear when it evaporates. It's somehow in the air. We just can’t see it though we can sometimes feel it, and our detectors can detect it. We also know the evaporation happens faster when water is hotter or has a larger surface area. The purposes of this paragraph were to summarize their findings (empirical evidence) about 85 evaporation and simultaneously to prepare the students to evaluate and revise their first models of evaporation based on them. Subsequently, Mrs. M gave the students additional time (about 13 minutes) to articulate their findings in groups. At the beginning of this event, she asked this question: What do you now understand about evaporation as a result of doing empirical investigations? It should be noted here that she used a typical school science language that focuses on students’ understanding (“What do you understand…?”) rather than a more technical language (e.g., “What empirical evidence about evaporation do you have now…?”). 
The general question turned out to invite various ideas the students had come to acquire from their empirical investigations. In this event, Mrs. M helped the focus students' group most of the time. She asked each focus student to share what she or he understood about evaporation as a result of doing empirical investigations. Moreover, she assisted them in articulating the terms or ideas they presented. For example, when Brian said that the humidity detector showed that when water evaporates, it is humid, she asked, "What does humid mean?" With her scaffolding, Brian came to realize that being humid in the air means that there is water vapor in the air. Similarly, when Adrianna and Brian agreed with the idea that water dissolves into the air, she asked, "What does it mean to dissolve?" To this question, Adrianna, Brian, and Mana constructively responded that water dissolving means water becoming smaller particles. Still another significant instance was when Mrs. M asked where water goes. These various scaffolding comments given by Mrs. M added up to allow the focus students to see the need to have clear and accurate explanatory ideas. In particular, in discussing "humid" and "dissolving," they came to note the importance of the concept of water particles (called "water vapor") in explaining evaporation.

Analysis of the ideas about modeling from this conversation in general and Mrs. M's approach in particular is presented in Table 4.8.

Table 4.8. Analysis of the ideas about modeling from the conversation of "What do you now understand about evaporation as a result of doing empirical investigations?"
CONTENT: N/A
EXPLANATION (Level 3): Mrs. M helped students shift their attention from somewhat nascent ideas of "humid" and "water dissolving" to microscopic/theoretical entities ("water vapor") as an explanatory feature.
ACCURACY: N/A
COMMUNICATION: N/A

4.2.2.4. The focus students' construction of their second models of evaporation (Class 6~Class 7)

After the students conducted the empirical investigations, Mrs. M asked them to revise their evaporation models. I turn now to the focus students' second models of evaporation. Below, I present their models first and then analyze them.

Brian's second model of evaporation

Figure 4.5. Brian's second model of evaporation
Note: The words in this figure are as follows (verbatim). A. Key / Red=Hot / blue=cold / dots=molecules B. Hot water C. How water evaporates faster D. humidity 77 / humidity 98 / humidity 100 E. 1 / 2 / 3 F. Cold water G. Cold water evaporates slow H. 60 / 65 / 71 I. 1 / 2 / 3

Brian's EIM from his second model of evaporation is analyzed as follows.

Table 4.9. Analysis of Brian's EIM in constructing his second model of evaporation
CONTENT (Level 2): Included such communicative features as a key, colors, and sentences. Described the phenomenon by showing how hot water evaporates faster than cold water. Explained the phenomenon by showing that the rate at which water molecules spread out is higher for hot water than for cold water. Included empirical data of humidity.
EXPLANATION (Level 3): Explained how hot water evaporates faster than cold water by showing that the rate at which water molecules spread out is higher for hot water than for cold water.
ACCURACY (Level 2.5): Included empirical data (showed how percentage humidity changes over time for hot and cold water) and made his explanation (see above in EXPLANATION) consistent with them.
COMMUNICATION (Level 2.5): Included such communicative features as a key, colors, and sentences. Included empirical data of humidity as a way of making this model persuasive.

Joon's second model of evaporation

Figure 4.6. Joon's second model of evaporation
Note: The words in this figure are as follows (verbatim). A. Air B. little water C. evaporated D. water E. …in the air there is little water so the air evaporated with water, and the air will spread out F. water G. there is little water left

Analysis of Joon's EIM from his second model of evaporation is summarized in the next table.

Table 4.10. Analysis of Joon's EIM in constructing his second model of evaporation
CONTENT (Level 3): Included such communicative features as labels, phrases, and sentences. Described the phenomenon by showing/writing that water on a plate shrinks over time. Explained the phenomenon by showing/writing that water-air particles spread out in the air and go somewhere. Included no extra features (e.g., empirical data).
EXPLANATION (Level 3): Explained how water on a plate shrinks over time by showing/writing that water-air particles spread out in the air and go somewhere.
ACCURACY (Level 2): Used an idea of water-air particles spreading out and going somewhere even though the idea is not supported by any empirical evidence.
COMMUNICATION (Level 2): Included such communicative features as labels, phrases, and sentences.

Mana's second model of evaporation

Figure 4.7. Mana's second model of evaporation
Note: The words in this figure are as follows (verbatim). A. Before B. The water will evaporate C. There are moister in the air. D. The little dots are the water vapor and water that is evaporating E. The smaller circuls are water vapor F. After G. Those little dots are the water that is going to be evaporation H. The smaller dots are water vapor I. There is only a little drop on there J. There is also moisture in the air.

We can analyze Mana's EIM from these data as follows.

Table 4.11. Analysis of Mana's EIM in constructing her second model of evaporation
CONTENT (Level 3): Included such features as sentences, labels, and colors. Described the phenomenon by showing that water on a plate shrinks over time. Explained the phenomenon by showing that water particles spread out in the air. Included no extra feature.
EXPLANATION (Level 3): Explained how water on a plate shrinks over time by showing that water particles spread out in the air.
ACCURACY (Level 3): Replaced the idea of water particles going into clouds she had used in her prior model with an idea of water particles spreading out in the air to make this model consistent with some of the empirical evidence collected about evaporation.
COMMUNICATION (Level 2): Included such features as sentences, labels, and colors.

A summary of the analyses of the focus students' EIMs from their second models of evaporation is presented in Figure 4.8.

Figure 4.8. Analyses of the focus students' EIMs in constructing their second models of evaporation

Comparing this result with the result for their previous modeling activity (Figure 4.4) reveals that both Brian's and Joon's EIMs developed evidently in most of the EIM categories whereas Mana's EIM progressed only in ACCURACY. Also, there are some differences between Brian's EIM and Joon's EIM. I comment on these differences below in each category.
CONTENT: Both Brian and Joon included a lot more features in their second models of evaporation than in their prior models. Commonly, they added explanatory features and communicative features. But, the difference between the two lies is that Brian included the data of percentage humidity in his new model while Joon did not. Neither did Mana include this feature in her model. This difference reflects the difference between Brian’s idea and the others’ ideas of using empirical evidence to improve a model. Brian chose to include in this model empirical data from an experiment about evaporation (Experiment 4) in addition to the features and ideas he had learned while making sense of empirical evidence whereas Joon and Mana only included the latter. The emergence of these differing ways of using empirical evidence was in part due to the fact that neither the curriculum material nor Mrs. M’s instruction specified what exactly empirical evidence is and how to use it to improve models, as discussed above (See Table 4.6 and Ta- 94 ble 4.7). EXPLANATION: Mana continued to show how microscopic/theoretical entities (water particles) behave in her new model to explain evaporation. In contrast, both Brian and Joon newly incorporated such feature in their models. This shift was made possible by one preceding curriculum event. Brian and Joon came to attend to microscopic/theoretical entities (e.g., water molecules, water-air particles) as explanatory features when Mrs. M helped the focus students to move from somewhat vague notions of “humid” and “dissolving” to water particles (See Table 4.8). ACCURACY: Although Joon included the feature of small water-air particles in his model, he did not go so far as to reflect that feature in consistency with empirical evidence. This was clearly manifested when he did not show any such particles in the “after” of his model. By contrast, Mana not only kept using the feature but also showed where water particles are (spreading out in the air) to accurately reflect empirical evidence. Brian took the same approach as Mana did, except that he further included empirical data (percentage humidity) in his model as a source of accuracy (validity) for that idea. Again, the diversity in these students’ attention to empirical evidence was cultivated by the general and ambiguous ways in which empirical evidence and using empirical evidence to improve a model were introduced (See Table 4.6 and Table 4.7). COMMUNICATION: In constructing a model, it is hard to show its persuasive efficacy unless the model creator explicates in oral or written discourse how an idea in the model is supported by empirical evidence. That is why both Joon’s EIM and Mana’s EIM in this category are at level 2. Brian, however, showed his intention to make his model persuasive by including empirical data as a source of accuracy (validity) for his model. 95 4.2.3. The activity of evaluating one another’s second models of evaporation (M3) and its preceding curriculum event The activity of evaluating peers’ models by epistemic criteria is perhaps one of the most challenging modeling activities to elementary school students. The social and cultural norms embedded in this activity can be hardly found in any domain of students’ lifeworlds. In traditional classrooms, evaluation is barely a students’ task. Further, individualism is another factor that curbs emergence of this activity in traditional classrooms. Therefore, to help the students engage in this activity, both the curriculum designers and Mrs. 
M provided an explicit instruction. In what follows, I document how this activity was introduced and how the focus students engaged in it, with a focus on ideas and epistemologies about modeling.

4.2.3.1. Instructing how to evaluate one another's models (Class 7)

To support their engagement in this challenging activity, the curriculum designers included in the student notebook an explicit instruction of what each student in a group does and in what sequence they undertake this activity: a student presents his or her model to the others in a group and the others evaluate it by three different criteria (Does it make sense? Is it consistent with evidence? Does it communicate its idea clearly?); then they rotate these roles. However, Mrs. M did not use this instruction in the student notebook but created a handout in which she described the roles a little differently (I assigned numbers for the purpose of reference):

Sense-making:
- Excerpt 1: How does the model help you make sense of the scientific ideas? How does the creator demonstrate scientific understanding?
- Excerpt 2: Please give some specific examples that will help the model make more scientific sense.
Evidence:
- Excerpt 3: How is the model supported by evidence from the investigations? How well does the model capture what we've been learning? Please give examples.
- Excerpt 4: Please give some suggestions relating to how this model can be better supported with evidence. Which experiments have we done that could be included in the creator's model?
Communication:
- Excerpt 5: How well do the parts (arrows, dots, colors, lines, etc.) of the model help you make sense of the scientific concepts? Do you understand what the creator is trying to communicate in their model?
- Excerpt 6: What suggestions can you make to help make this model more understandable?

In Class 7, Mrs. M mainly read the above passages of the handout herself or had students read them. In the meantime, she provided clarifications or specifications about some of the passages. For example, in describing how to evaluate a model in terms of evidence, she said, "For example, if you want specific data, maybe from the humidity detector—that specific data, those numbers, the percent maybe might help you make sense of it. So you would add that." Note that she placed stress here on including empirical data in a model. It is a way of using empirical evidence to make a model better, which Brian had adopted for his second model of evaporation. Analysis of ideas about modeling from these passages and Mrs. M's utterance is given in Table 4.12.

Table 4.12. Analysis of the ideas about modeling from Mrs. M's instruction of how to evaluate one another's models
CONTENT (Level 2~3): Excerpt 5 emphasized such communicative features as arrows, dots, colors, and lines. Excerpts 1, 2, and 5 emphasized "scientific ideas," "scientific sense," and "scientific concepts" (level 2~3). Excerpt 3 emphasized "what we've been learning" (level 2). Mrs. M emphasized including empirical data in a model (level 2).
EXPLANATION (Level 2~3): Excerpts 1, 2, and 5 emphasized scientific explanation.
ACCURACY (Level 2~3): Excerpts 3 and 4 provided an idea that a model needs to be supported by evidence to be accurate (level 3). Excerpt 3 also provided an idea that a model needs to include "what we've been learning" without thinking about the accuracy of it (level 2). Mrs. M's utterance emphasized including "specific data" in a model as a source of accuracy (validity) for a model (level 2.5).
COMMUNICATION (Level 2.5~3): Excerpt 5 emphasized such communicative features as arrows, dots, colors, and lines. Various ways of using empirical evidence to make a model persuasive were presented, from including empirical data in a model (level 2.5) to a model being supported by empirical evidence (level 3).

4.2.3.2. The focus students' evaluations of others' second models of evaporation (Class 7~Class 8)

Mrs. M's students engaged in the activity of evaluating others' models in Class 7 and Class 8. The focus students' first attempt (Class 7) proved that this modeling activity is indeed challenging. They spent a significant amount of time just figuring out what the task was about and how to do it. In Class 8, however, they managed to engage in the activity more substantially. Their evaluations of peers' second models of evaporation were made in two phases. First, they evaluated one another's models verbally. Second, they wrote their feedback on others' models in the handouts that Mrs. M had distributed to them. Below, I describe how they undertook this activity in Class 8 and analyze their verbal and written evaluations together.

Class 8 started with the whole class's brief review of the roles introduced in Class 7. As Brian's response to Mrs. M's question about the role of the presenter—"The presenter presents stuff."—illustrates, the students still had only a vague understanding of this activity. For the same reason, as the focus students' group entered into the activity, they had a brief discussion on how to undertake it. And then their engagement in the activity proceeded in the following way (summarized in Table 4.13).

Table 4.13. The participation structure and order in which the focus students evaluated others' second models of evaporation (Class 8)
First round: Brian (presenter), Adrianna (sense-making evaluator), Mana (evidence evaluator), Joon (communication evaluator)
Second round: Joon (presenter), Brian (sense-making evaluator), Adrianna (evidence evaluator), Mana (communication evaluator)
Third round: Mana (presenter), Joon (sense-making evaluator), Brian (evidence evaluator), Adrianna (communication evaluator)
Fourth round: Adrianna (presenter), Mana (sense-making evaluator), Joon (evidence evaluator), Brian (communication evaluator)

The process was broken into four rounds of presentation-evaluation. And in each round, the order of participation was: (1) The presenter presented his or her second model of evaporation. (2) The "sense-making" evaluator evaluated the presenter's model. (3) The "evidence" evaluator evaluated the presenter's model. (4) The "communication" evaluator evaluated the presenter's model. Apparently, they followed the participation structure given by the instruction in the student notebook. But, looking more closely at what they actually said as they took these roles reveals that they followed the instruction somewhat loosely. For example, when they presented their second models of evaporation, their primary aim was to clearly describe the features contained in their models rather than to explicate their explanatory ideas about their target phenomena and their rationale. Additionally, as shown below, they often blurred the distinctions between the three evaluation roles.

Despite these limitations, this way of engaging in the activity was potentially beneficial to these students. First, this participation structure allowed them to enjoy relatively equal participation in the activity. As a result, all the focus students, including Joon, who was an English learner, and Mana, who was reticent at other times, participated in this activity substantially. Second, it is important to note that the roles they performed were epistemic roles that scientists ordinarily take but are hard to find in elementary school classrooms.
Therefore, these students’ engagement in these roles afforded them an opportunity to improve their epistemic agency. Finally, as they took these roles, they might be able to see how to make their models better. For example, when they found more sophisticated ideas or concerns in the others’ evaluations, they were likely to incorporate those ideas into their next models. I now offer three focus students’ verbatim verbal and written evaluations as well as an analysis of their EIMs found in these data. Brian’s evaluations of the other focus students’ second models of evaporation Table 4.14. Brian’s evaluations of the other focus students’ second models of evaporation Models Evaluations Adrianna’s se- Communication (utterance): cond model of Your model shows a lot of details, like, this is a picture and it shows symbols evaporation like arrows. It has good words and it has a key so you can tell what all the colors mean. So, like you can tell that gray is moisture in the air and black is the plate and blue is the water and green is the water vapor. 100 Joon’s second model of evaporation Table 4.14 (cont’d) Sense-making (utterance): Your model has a lot of words in it, which makes it easy to explain. And it has pictures and arrows to show time lapse and that it's evaporating. And it shows that it's not just floating into the air as a big bubbly mass of water, but going in as little particles of water. And that there's not very much water left later. And it makes a lot of sense. Sense-making (notes in his notebook): [Compliments] It shows arrows. It shows water vapor. [Wishes] Hot and cold water. Doesn't show humidity Mana’s second model of evaporation Sense-making (notes in his handout): [Compliments] It shows evaporation, time lapse, and has words. It shows pictures, symbols, and words. Evidence (utterance): The [model] has evidence from the humidifier and the water in the plate. If you look at the words, I think you can see other evidence like…from the two cups how it evaporates out of (?) Analysis of Brian’s EIM from his evaluations of the other students’ second models of evaporation is summarized in the following table. Table 4.15. Analysis of Brian’s EIM in evaluating the other focus students’ second models of evaporation Category Indicators Level CONTENT - Attended to communicative features such as pictures, sym2 bols, words, colors, and a key. - Attended to explanatory features such as "time lapse," "going in as little particles of water," and "words." - Attended to empirical data. EXPLANATION - Attended to microscopic/theoretical entities such as “little par- 3 ticles of water" as an explanatory feature. ACCURACY - Attended to empirical data as a source of accuracy (validity) 2.5 of a model. COMMUNICATION - Attended to communicative features such as pictures, sym2.5 bols, words, colors, and a key. - Attended to empirical data as a feature that makes a model persuasive. Joon’s evaluations of the other focus students’ second models of evaporation 101 Table 4.16. Joon’s evaluations of the other focus students’ second models of evaporation Models Evaluations Adrianna’s se- Evidence (utterance): cond model of I like the way that you put keyword and moisture in the air and you put color to evaporation show more, to show up. Brian’s second Communication (utterance): model of evap- I like the water (coming) little dots…And I like that you put little color on it oration and (showed) hot water (in red) cold water (in blue). 
Mana’s second model of evaporation Communication (notes in his notebook): [Compliments] I like the way that you put color to show it is hot water or cold water [Wishes] You draw model to small it is hard to see it Sense-making (utterance): (?) color, these arrows that show where it's going, where the water is going (?) lot of sentences (?) makes more sense Analysis of Joon’s EIM from this data is presented in Table 4.17. Here, I would like to discuss one analytical issue. When Joon evaluated Brian’s second model of evaporation, he did not give feedback on the fact that Brian’s model had empirical data of humidity because Joon evaluated Brian’s model by the criterion of communication at that time. What if, then, his role had been the “evidence” evaluator instead? Would he have commented on Brian’s empirical data? Though aware of this uncertainty, I decided to analyze his EIM based on the data available. Table 4.17. Analysis of Joon’s EIM in evaluating the other focus students’ second models of evaporation Category Indicators Level CONTENT - Attended to communicative features such as a key, color, sen- 3 tences, and a model’s size. - Attended to explanatory features such as water particles (represented by dots) and arrows showing where water particles go. EXPLANATION - Attended to water particles (represented by dots) as an ex3 planatory feature. ACCURACY - Attended to Adrianna's "moisture in the air," a feature from 2 empirical evidence about evaporation, but did not refer to empirical evidence as a source of this feature or articulate how this feature and empirical evidence are linked. COMMUNICATION - Attended to communicative features such as a key, color, sen- 2 tences, and a model’s size. 102 Mana’s evaluations of the other focus students’ second models of evaporation Table 4.18. Mana’s evaluations of the other focus students’ second models of evaporation Models Evaluations Adrianna’s se- Sense-making (utterance): cond model of (???) key (???) dots (???????) evaporation Brian’s second Evidence (utterance): model of evap- (???) evaporating and the air would (???????) hot water (eventually) moves oration faster (????) up Joon’s second model of evaporation Evidence (notes in her notebook): [Compliments] The model had evidence from the percentage you have and you said that warm water evaporates faster than cold water [Wishes] It would be better if you could put in some more words. Communication (utterance): (?) air (???) words (???) different days (???) water (??) Evidence (notes in her notebook): [Compliments] It had evidence from water on plate and it shows a lot of arrows wich is good. [Wishes] The model could have how the water evaporated. Mana’s EIM can be analyzed from the above data as follows. Table 4.19. Analysis of Mana’s EIM in evaluating the other focus students’ second models of evaporation Category Indicators Level CONTENT - Attended to communicative features such as a key and words. 2 - Attended to water particles (represented by dots) as an explanatory feature. - Attended to empirical data ("evidence from the percentage") in Brian’s model EXPLANATION - Attended to water particles (represented by dots) as an ex3 planatory feature. ACCURACY - Attended to empirical data ("evidence from the percentage") 2.5 in Brian’s model as a source of accuracy (validity) of the model. COMMUNICATION - Attended to communicative features such as a key and words. 2.5 - Attended to empirical data ("evidence from the percentage") in Brian’s model as a feature that makes the model persuasive. 
A summary of the analyses of the focus students' EIMs is presented in Figure 4.9.

Figure 4.9. Analyses of the focus students' EIMs in evaluating others' second models of evaporation

Comparison between this and the result for their prior modeling activity (Figure 4.8) shows that Brian's and Joon's EIMs remained the same whereas Mana's EIM changed in three EIM categories. Below, I focus my discussion on the changes in Mana's EIM and also mention how the students' EIMs were influenced by the preceding curriculum event.

CONTENT: Mana's EIM in this category changed from level 3 to level 2. This change was due to their evaluations of a particular feature in Brian's second model of evaporation—empirical data of percentage humidity. Recall that Mana did not include this feature in her own second model of evaporation. But, when Mana found this feature in Brian's model, she gave positive feedback on it, indicating that she acknowledged inclusion of this feature in a model as an acceptable way of using empirical evidence for a model. This change can be attributed to, or at least was encouraged by, Mrs. M's positive comment on including empirical data when she instructed how to evaluate one another's models (Table 4.12).

EXPLANATION: They all continued to attend to such mechanistic features as small water particles. Significant here is that they retained this attention across different kinds of modeling activities (from constructing a model to evaluating others' models), indicating that this particular concern began to become solid.

ACCURACY: Overall, their evaluations were all focused on whether or not certain features exist in a model; they did not articulate how a certain idea in a model is valid or not. Only Mana's EIM in this category changed, from level 3 to level 2.5. The reason was that she commented positively on the fact that Brian had included percentage humidity in his model. This indicates that she began to see the value of empirical evidence as a source that provides accuracy to a model. Again, Mrs. M's utterance in the preceding event (Table 4.12) played a part in this change.

COMMUNICATION: For the same reason just mentioned, Mana's EIM in this category changed from level 2 to level 2.5.

4.2.4. The activity of constructing a third model of evaporation (by evaluating and revising the prior model) (M4) and its preceding curriculum events

The previous modeling activity—evaluating one another's second models of evaporation—provided a context in which students could learn from one another. However, not every idea that students provided one another was scientifically sensible. Students needed scientific knowledge about key concepts of evaporation to improve their models further. In Class 10, to gain assistance in this regard, students watched computer simulations that showed how microscopic/theoretical entities of matter behave as the state of matter changes. After the class session ended, Mrs. M brought students to a playground and had them act as water molecules to gain an embodied understanding of how water molecules behave microscopically. Then, in Class 11, they constructed their third models of evaporation based on new ideas from these curriculum events. Below, I report on and analyze these curriculum events and how students were guided to the activity of constructing a third model of evaporation. Then, I provide an analysis of the focus students' EIMs from their third models of evaporation. 4.2.4.1.
Computer simulations about state changes / Students’ collective performance as water molecules in the playground (Class 9) Since the videotape of this session (Class 9) is missing, I present a brief overview of what happened in this session here on the basis of a field note I made on the spot. In Class 9, they en7 gaged in two major events. First, they watched several interactive computer simulations—three of them about states of matter and two of them about state changes—as written in the student notebook. These simulations all showed how matter behaves both microscopically and macroscopically. For example, one simulation showed how water changes its state in a closed test tube in different temperatures macroscopically in the left window and microscopically (how water molecules behave) in the right window (See Figure 4.10). As students watched these interactive simulations, they became excited as evidenced by their high-level verbal participation (e.g., asking Mrs. M to click on a certain button in a simulation to see what would happen) and active gestures (e.g., pointing their fingers to the simulations). 7 According to the designed curriculum, Mrs. M was supposed to present a mini-lecture about key concepts about evaporation. But, this was not recorded on my field note. Thus, I do not focus on this curriculum event here. 106 Figure 4.10. A computer simulation about phase change (The Concord Consortium, 2013) A Note: The words in this figure are as follows. A. Changing Phase: Create a Solid, Liquid, and Gas o B. Dry Ice -80 C C. In this window above, you will see what happens on a microscopic scale as the water in the test tube is heated and cooled. D. Describe what happens to the water in the test tube when you heat the test tube to a high temperature. C B D The second crucial event took place in an unexpected area—a school playground. There, students collectively performed the change of states that they had seen in some of the simulations by individual students acting out as water molecules. Responding to Mrs. M’s prompts of differ- 107 ent temperatures (“hot,” “cold”), they behaved differently. At a high temperature, they moved fast and spread out; at a cold temperature, they moved slowly and gathered together. As they moved, they bumped into each other like the molecules they had seen in some of the simulations. These two events, combined together, gave students several vivid and embodied ideas of how water behaves microscopically: (1) Water consists of water molecules. (2) As temperature goes up, they move faster, and as temperature goes down, they move more slowly. (3) Water molecules spread out at a high temperature and clump together at a cold temperature. These are very sophisticated mechanistic ideas. One thing to note is that some students already deployed some of these mechanistic features in their previous models. But, through these two events, these ideas and features became as official, thus accurate, resources for modeling to students. Analysis of ideas about modeling from these two events is provided in Table 4.20. Note that because the two events were exclusively focused on mechanism, I did not analyze them regarding the other categories of modeling. Table 4.20. Analysis of the ideas about modeling from the computer simulations about state changes and students’ collective performance as water molecules in the playground Category Indicators Level CONTENT N/A N/A EXPLANATION - How water molecules move was emphasized as an explanato- 3 ry feature. 
ACCURACY: N/A
COMMUNICATION: N/A

4.2.4.2. The focus students' construction of their third models of evaporation (Class 10)

I turn to the focus students' third models of evaporation to analyze their EIMs from the models.

Brian's third model of evaporation

Figure 4.11. Brian's third model of evaporation
Note: The words in this figure are as follows (verbatim). A. Hot water B. day 1 C. humidity 72% D. water vapor E. plate F. water G. water has just started to evaporate H. day 2 I. humidity 83% J. has made a lot of progress K. day 3 L. humidity 100% M. water is almost done evaporating N. Cold water O. Day 1 P. Just starting out Q. humidity 72% R. water vapor S. water T. plate U. Day 2 V. humidity 77% W. very little progress X. Day 3 Y. humidity 82% Z. It would take at least a week at this rate.

Analysis of Brian's EIM from this model is presented in Table 4.21.

Table 4.21. Analysis of Brian's EIM in constructing his third model of evaporation
CONTENT (Level 2): Included communicative features such as labels, phrases, and sentences. Described the phenomenon by showing/writing how water on a plate shrinks over time. Explained the phenomenon by showing how water particles spread out in the air. Included empirical data of percentage humidity, a relevant but not essential feature.
EXPLANATION (Level 3): Explained how hot water evaporates faster than cold water by showing that the rate at which water molecules spread out is higher for hot water than for cold water.
ACCURACY (Level 2.5): Included empirical data of percentage humidity as a source of accuracy of the model.
COMMUNICATION (Level 2.5): Included communicative features such as labels, phrases, and sentences. Included empirical data of percentage humidity to make the model persuasive.

Joon's third model of evaporation

Figure 4.12. Joon's third model of evaporation
Note: The words in this figure are as follows (verbatim). A. Water evaporad with the air, and when air goes up, the air will spread out everywhere. B. BEFORE C. there is little water left. D. MIDDLE E. the water is gone, but at the bottom of the plate got little wet. F. AFTER G. Key Words [small circle] water [cloud-like shape] air [upward arrow] evaporate

Joon's EIM can be analyzed from his third model of evaporation as follows.

Table 4.22. Analysis of Joon's EIM in constructing his third model of evaporation
CONTENT (Level 3): Included such communicative features as a key, labels, and sentences. Described the phenomenon by showing/writing how water on a plate shrinks over time. Explained the phenomenon by showing/writing how water-air particles go up and spread out in the air. Did not include extra features such as empirical data.
EXPLANATION (Level 3): Explained the phenomenon by showing/writing how water-air particles go up and spread out in the air.
ACCURACY (Level 2): Continued to use water-air particles, a feature not supported by empirical evidence or scientific ideas from the computer simulations. However, he replaced the idea of water-air particles going somewhere (used in his second model of evaporation) with an idea of them spreading out in the air to make his explanation partly consistent with empirical evidence and scientific ideas from the computer simulations.
COMMUNICATION (Level 2): Included such communicative features as a key, labels, and sentences.

Mana's third model of evaporation

Figure 4.13.
Mana's third model of evaporation
Note: The words in this figure are as follows (verbatim). A. Before B. The more little dots are the little water vapor in the air. C. Those bigger dots are the water that is evaporating D. The green in the air is the moister in the air. E. The water is evaporating and the little water vapor is coming into the water on the plate. F. After G. There is more moister in the air H. The water is still evaporating I. There is still same water vapor J. Now the water is gone exapt for a little drop and there is more moister in the air. K. Key [blue lines] - moister in air [upward arrow] - evaporating [red arrow] - Information about the model. [a bigger circle] - The water that is evaporating [a smaller circle] - water vapor

Analysis of Mana's EIM from these data is summarized in the following table.

Table 4.23. Analysis of Mana's EIM in constructing her third model of evaporation
CONTENT (Level 3): Included many sentences to clearly communicate her ideas. Described the phenomenon by showing how water on a plate shrinks over time. Explained the phenomenon by showing/writing how water particles spread out in the air. Did not include extra features such as empirical data.
EXPLANATION (Level 3): Explained how water on a plate shrinks over time by showing/writing how water particles spread out in the air.
ACCURACY (Level 3): Continued to use the idea of water particles spreading out in the air and wrote that moisture in the air increases over time to make this model consistent with some of the empirical evidence collected about evaporation and scientific ideas from the computer simulations.
COMMUNICATION (Level 2): Included many sentences to clearly communicate her ideas.

The analyses of the focus students' EIMs from their third models of evaporation are summarized in Figure 4.14.

Figure 4.14. Analyses of the focus students' EIMs in constructing their third models of evaporation

Both Brian's and Joon's EIMs remained the same while Mana's EIM changed in most categories. One interesting thing to note is that Mana's EIM here is identical to the EIM she deployed in constructing her second model of evaporation (Figure 4.8). What does this mean? How were the preceding curriculum events related to the present result? I focus my discussion on these issues below.

CONTENT: Mana's EIM in this category changed from level 2 to level 3. This was because she did not include in her third model of evaporation empirical data, the feature on which she had given positive feedback in Brian's second model of evaporation. This indicates that Mana thought of both their way of using evidence (updating an explanation to be consistent with empirical evidence) and Brian's way of using evidence (including empirical data in a model) as equally legitimate. I also conjecture that the evident emphasis on microscopic/theoretical entities as explanatory features given by the computer simulations and the class performance of water molecules (Table 4.20) had a constructive effect on this decision.

EXPLANATION: All three students' EIMs remained the same in this category because they continued to show how microscopic/theoretical entities (e.g., water particles, water-air particles) move over time as a central explanatory feature in their models.

ACCURACY: Joon's EIM in this category did not change in terms of the level.
But, there was some evidence indicating his increased awareness in this category. His EIM level in this category remained the same because he kept using an invalid idea of water-air particles. But, he made a noticeable change: he showed in this model that water-air particles are not gone (like in his second model of evaporation) but still in the air in the “after.” This indicates his intention to make his explanation consistent with empirical evidence (e.g., the results of Experiments 2, 3, and 4) as well as scientific ideas from the computer simulations about state changes. Next, Mana’s EIM in this category changed from level 2.5 to level 3. This change reflects the fact that she went back to her previous way of using empirical evidence (including an explanatory feature that is consistent with empirical evidence) after attending to empirical data when evaluating Brian’s second model of evaporation. COMMUNICATION: For the same reason discussed in ACCURACY (not including empirical data), Mana’s EIM in this category changed from level 2.5 back to level 2. 4.2.5. The activity of constructing a group consensus model of evaporation (M5) and its preceding curriculum event Consensus model building is another modeling activity that elementary school students 116 engage in rarely. The term consensus itself is not part of their ordinary vocabulary; they may think of consensus building as an activity for adults. Accordingly, the passages in the student notebook and the teachers—Ms. H in Class 10 and Mrs. M in Class 11—provided students guides and scaffolding before students engaged in this activity. In what follows, I document and analyze the ideas about consensus model building provided by the passages in the student notebook and two teachers’ utterances first. Next, I zoom in on the focus students’ consensus model of evaporation as well as on the process in which they constructed it to analyze their group EIM. 4.2.5.1. Introducing a consensus model and guiding how to construct it (Class 10~Class 11) At the beginning of this modeling activity (Class 10), Ms. H talked about what a consensus model is and how to construct it both by reading the passages in the student notebook and by adding her comments. In Class 11, Mrs. M introduced them in a similar fashion. Below, I examine the ways in which the passages in the student notebook and both teachers’ comments introduced this activity with a focus on ideas about modeling found in them. To begin with, Ms. H read the following passages in the student notebook that introduces a consensus model: - After coming up with all various models to explain a phenomenon, scientists try to construct a consensus model that integrates all the best parts of individual models. - Before you do this, you need to think about criteria by which to select the best parts of individual models. These paragraphs, albeit short, represent the process of consensus model building fairly well. Comparison between these descriptions and a recent report of how different groups of scientists developed a consensus model (Hauschild et al., 2008) reveals that these descriptions correctly identify most of the essential steps scientists take to build a consensus model. Of particular sig- 117 nificance is their emphasis on criteria. According to these passages, construction of a consensus model is a social process, but it is undertaken not on the basis of some individuals’ influences or unprincipled compromise but on the basis of criteria shared by all. After reading the above paragraphs, Ms. 
Ms. H referred to the previous conversation the class had had about the changes they had made to their third models of evaporation: "So as a class, we have some ideas we've just talked about, right? All the things we used to change. Do we think we want to say that all our models have to have those?" Interestingly, several students (Dalia, Adrianna, Emily) replied in the negative. It seems that these students had begun to see that including all of those features is not necessary, and perhaps not even helpful, for making a good model. To their responses, Ms. H said, "But do all of these things make our models better?" (emphasis added), favoring including all the features they had talked about in their consensus models. Later, while they were talking about things to include in their consensus models, Ms. H described a consensus model as "the best model we can do to show how much we've learned" (emphasis added). Here, we can identify the idea that a good model shows how much they have learned.

In Class 11, when Mrs. M went over and continued the activity, she read the same passages introducing consensus model building, shown above, and added her comments:

[T]he more little things that you have, the more information that you have on there, ask yourself if it is relating scientifically to your task…so what I want you guys to do is if you have people explaining things, if you have characters, do you still need those characters? Do you need those or is that taking away from you explaining it scientifically? Can you explain it differently rather than using cartoon characters?

She was suggesting that they take out irrelevant features such as human characters because they have nothing to do with scientific explanation. These comments pushed students toward some sense of parsimony.

In conclusion, the passages in the student notebook, Ms. H's guidance, and Mrs. M's guidance were slightly different in their emphases. The passages in the student notebook characterized consensus model building as a criteria-based social process. Ms. H primarily emphasized amassing in a consensus model everything one has learned. Finally, Mrs. M's stress on scientific explanation sent students the message that they would need to take out nonscientific features. Analysis of the ideas about modeling found in Ms. H's and Mrs. M's utterances guiding this activity is presented in Table 4.24.

Table 4.24. Analysis of the ideas about modeling from Ms. H's and Mrs. M's utterances guiding the activity of constructing a consensus model

CONTENT (Level 2~3)
- Ms. H emphasized including all students had learned about evaporation in their consensus models of evaporation (level 2).
- Mrs. M emphasized taking out nonscientific features such as human figures (level 2~3).

EXPLANATION: N/A

ACCURACY: N/A

COMMUNICATION: N/A

4.2.5.2. The focus students' construction of their group consensus model of evaporation (Class 11)

Students undertook consensus model building in Class 11, when Mrs. M led the class session. After Mrs. M reviewed the previous session and (re-)introduced the activity, students, in groups, invested a good amount of time (about 30 minutes) and effort in constructing their group consensus models of evaporation. Below, I focus on the focus students' engagement in this activity. I first describe how they carried out the activity and then present an analysis of their group epistemology as deployed in constructing their consensus model of evaporation.
The four students (Adrianna, Brian, Joon, and Mana) carried out the task mainly in two phases. First, they looked at all of their third models of evaporation together and decided which features to include in their consensus model of evaporation. Adrianna took on a logistical leadership role to facilitate the process. Brian and Joon participated in the conversation. Mana, however, took up the role of making a list of the features they chose and did not participate verbally in the conversation. On completing this list, they then actually began to construct the model. In this second phase, they had another round of conversation about more specific issues such as where to put certain features and how to represent them (e.g., size, color).

Before looking carefully at their conversations and model, I briefly discuss how each focus student participated in the task. To begin with, their participation was neither equal nor similar. Analysis of the frequency of each student's verbal participation in terms of on-task turns to speak identifies Adrianna as the most frequent participant (45.6%), Brian as the second most active participant (33.3%), Joon as the next most frequent participant (21.0%), and Mana as the least active participant (0.0%). This result shows that Adrianna played a dominant role. Given that she was the group leader, this is not surprising. A closer examination, however, reveals that she did not only assume logistical leadership; she participated quite actively in the conversation in other ways, too: she presented ideas, challenged others' ideas, and told the others what features to represent in what ways in constructing the model. Brian was the second most active participant among them. He presented several good ideas. In addition, he was the only one who, on a par with Adrianna, engaged in negotiations with her. In particular, as shown in the notes for E and F of Table 4.25, when Adrianna was not convinced of the ideas about sentences and humidity, he attempted to persuade her. But their negotiations were not particularly based on scientific criteria. The nature of their engagement was well expressed when Brian said in the process, "We're like politicians." Joon did participate in these conversations, but not as often and not as significantly as the first two students. This had to do with the fact that he was an English learner, speaking English slowly and in short sentences. Finally, Mana did not make any audible utterance at all in the two conversations. Though a native speaker, she was generally reserved. Instead of participating verbally, she did other tasks such as making notes (in the first conversation) and writing and drawing (in constructing the model).

Now, I focus on their conversations and model. The features they talked about in the conversations and ultimately included in their consensus model are summarized in Table 4.25, and their consensus model of evaporation is presented in Figure 4.15.

Table 4.25. The focus students' conversations in constructing their consensus model of evaporation (Class 11)

A. Colors
- Adrianna proposed it by appealing to the majority rule as a rationale for her proposition.

B. Showing evaporation for hot and cold water
- Adrianna proposed it.

C. Time elapse
- Adrianna proposed "before, during, and after".

D. Key
- Adrianna said to Mana, "Not labels, but a key," indicating her avoidance of redundancy.
- Later, when placing this feature, they were concerned with its location to keep the model organized.
E. Sentences
- To this proposal made by Joon, Brian assented and Adrianna dissented. But Brian persuaded Adrianna by saying, "I'd say we have them as far as it's good."

F. Data of percentage humidity vs. "faster and slower"
- Brian proposed it "to show that hot water is more humid." To this proposal, Joon assented, but Adrianna proposed to replace it with "faster and slower." Brian tried to persuade Adrianna by saying, "It's just more detail." A little later, Adrianna challenged Brian when he was about to make up data of percentage humidity. In response, Brian proposed to have both features. Finally, Adrianna assented.
- Later, when including the data of percentage humidity, they tried to make them (hot water: 58% → 68% → 78%; cold water: 55% → 60% → 68%) consistent with their understanding that hot water evaporates faster than cold water.

G. Hot air and cold air above hot and cold water respectively
- Adrianna proposed this feature and justified it by appealing to their common experience that their hand feels cold over a glass of cold water and hot over a hot stove.
- Later, when representing this, they attended to the number of red and blue streaks that represent hot air and cold air respectively (hot air: 5 → 5 → 4; cold air: 4 → 4 → 3). But they did not have a clear ground for this decision.

H. Data of temperature for hot and cold water (not taken)
- Brian proposed this feature, which was rejected by the others because they thought that putting "hot and cold" was enough.

I. Amount of water
- They showed that hot water shrinks more than cold water on a plate, consistent with their understanding that hot water evaporates faster than cold water.
- Adrianna insisted that the amount of water in "during" be almost the same as that in "before" despite some questions from Brian and Joon.

J. Water particles ("water vapor") in the air
- They showed that the number of dots (representing the amount of water vapor) in the air increases over time for both hot and cold water, in consistency with the humidity percentages.

Figure 4.15. The focus students' group consensus model of evaporation

Figure 4.15 (cont'd) Note: The words in this figure are as follows (verbatim). A. Before B. The water is beggining to evaporate. C. humidity 58% D. During E. The water is evaporating faster. F. humidity 68% G. After H. Faster I. humidity 78% J. Before K. The water is starting to evaporate slowly. L. humidity 55% M. During N. The water is still evaporating slowly. O. humidity 60% P. After Q. Slower R. humidity 68% S. Key [blue] = Cold Air [black] = Plate [red] = Hot Air [green] = Water Vapor [brown] = Humidity

Analysis of their group EIM identified in these data is presented in Table 4.26 and Figure 4.16.

Table 4.26. Analysis of the focus students' group EIM in constructing their group consensus model of evaporation

CONTENT (Level 2)
- Included such communicative features as labels, phrases, sentences, and a key.
- Described the phenomenon by showing/writing how water on a plate shrinks over time.
- Explained the phenomenon by showing how water particles spread out in the air.
- Included empirical data of percentage humidity, which is a relevant but extra feature.

EXPLANATION (Level 3)
- Explained how water on a plate shrinks over time by showing how water particles spread out in the air.

ACCURACY (Level 2.5)
- Included empirical data of percentage humidity as a source of accuracy (validity) of the model.
COMMUNICATION (Level 2.5)
- Included such communicative features as labels, phrases, sentences, and a key.
- Included empirical data of percentage humidity to make the model persuasive.

Figure 4.16. Analysis of the focus students' group EIM in constructing their group consensus model of evaporation
[Bar graph of group EIM levels (0 to 3) in CONTENT, EXPLANATION, ACCURACY, and COMMUNICATION for Brian, Joon, and Mana as a group.]

Comparing this with the three students' individual EIMs in their prior modeling activity (Figure 4.14) reveals that Brian's EIM remained the same while Joon's and Mana's EIMs changed, becoming identical with Brian's EIM. I discuss this change below in each category.

CONTENT: Both Joon's and Mana's EIMs in this category changed from level 3 to level 2. This change was due to the fact that they included empirical data of percentage humidity. Note that Joon and Mana did not include this feature in their third models of evaporation. But in constructing their consensus model of evaporation, they acquiesced to Brian's proposal to include it. Adrianna was the only one who challenged him, arguing that just putting "faster and slower" would be enough (Table 4.25, F). However, Brian ended up persuading her (and perhaps the others as well) by saying, "It's just more detail." Two aspects of this process need to be highlighted. First, I suppose that this decision was made possible in part by their different patterns of participation. Joon's and Mana's relatively passive participation seems to have played a role in the inclusion of this relevant yet scientifically extra feature in the model. Second, this decision process and, in particular, Brian's utterance pointedly show that the majority of the students in this class deployed the idea that including more details makes a better model at this time, so much so that Brian could appeal to it to persuade Adrianna. Regarding this point, we detected the same idea in the comments Ms. H made to guide the activity of constructing a consensus model (Table 4.24).

EXPLANATION: Their EIMs did not change in this category; they continued to show how water particles (which they called "water vapor") spread out in the air over time in explaining evaporation. I think that Mrs. M's emphasis on scientific explanation was of some assistance in this regard (Table 4.24).

ACCURACY: Interestingly, they gave a lot of attention to accurately representing various features, including the numerical data of percentage humidity and the number of water particles, based on their general understanding or memory of their actual empirical evidence. This newly emergent attention has to do with the social aspect of the activity of constructing a consensus model. In evaluating others' models, another social modeling activity, their engagement with one another had been limited. But in the present activity, they not only spent a lot of time but also interacted with one another actively (though Adrianna and Brian were more active than Joon and Mana). I argue that this context allowed them to give more attention to these more specific aspects of a model. In addition, as the group accepted Brian's proposal to include the data of percentage humidity in their consensus model of evaporation, they came to adopt his EIM in this category as well. This was partly assisted by Ms. H's emphasis on including all that students had learned about evaporation in their consensus models of evaporation (Table 4.24).
COMMUNICATION: Again, as the group decided to include the data of percentage humidity in this model, they collectively became more aware of the need to make the model persuasive.

4.2.6. The activity of evaluating each group's consensus model of evaporation

From Class 12 to Class 14, students as a class evaluated each group's consensus model of evaporation. It should be noted that this activity, initiated by Mrs. M, was not a part of the designed curriculum unit; it replaced a planned activity of using their consensus models of evaporation to explain or predict other examples of evaporation, which was intended to help students note the need to make a model general. Below, I document how Mrs. M and students evaluated each group's consensus model of evaporation generally (for a more detailed analysis of this whole activity, see Appendix A.6) and focus on a particular conversation that turned out to be influential for the focus students' ensuing modeling activities.

When this event began in Class 12, Mrs. M neither articulated the purpose of the activity nor explained how to do it. She only said, "We're going to go over each group consensus model." In the course of the event, however, she interspersed some scaffolding comments on the activity as the need arose. Group consensus models were presented and evaluated in the following order.

- Class 12: Group 3's, Group 6's, and Group 5 (the focus group)'s
- Class 13: Group 5's (continued), Group 2's, Group 1's, Group 4's
- Class 14: Group 4's (continued)

For each model, Mrs. M showed it to the class using a transparency copy of it and an overhead projector, asked students to write down their compliments and wishes, and had them give their evaluations verbally. While compliments were also given, they spent more time on critiques. When a challenge was given to a model, Mrs. M would ask the group who had created the model to clarify or defend their thinking. When competing ideas emerged, she would ask students to vote by a show of hands. In this way, she situated the conversations in an argumentative mode, which in turn helped make students' often vague ideas public and explicit.

Although she assumed an important role throughout the whole process of this activity, I would like to report on one particular instance in which Mrs. M had a significant impact on students' EIMs. It took place when Mrs. M presented the focus group's consensus model of evaporation to the class in Class 12. As soon as she began to show this model to the class, she noticed something significant in the model and said:

This group burst out using a ton of evidence from our investigations. Please tell me right now. What is the most specific evidence you see on there? The most specific evidence…What's the most specific evidence you see?

Lassie tried to answer this question by pointing to "faster" written in the model. Mrs. M also read "faster" and "slower" in the model but probed further by asking, "How do you know the rate at which it's evaporating?" When Jonas replied, "Less humidity," she asked again, "How do you know there is less humidity?" Jin talked about "little dots." At this moment, she repeated her question emphatically:

All right, there's little dots here. Then there's more dots and there's more dots. You're talking about dots. But what is the specific evidence? THE SPECIFIC EVIDENCE?

Finally, when Finley referred to percent humidity, she emphatically confirmed it. She then asked who had come up with this idea and learned that it was Brian's idea.
She also asked "from what investigation" he had taken this idea. When he indicated the humidity detectors, she gave him positive feedback. Analysis of how this short conversation potentially influenced students' EIMs using the EIM coding scheme is presented below.

Table 4.27. Analysis of the ideas about modeling from the class conversation about "specific evidence" in the focus students' group consensus model of evaporation

CONTENT (Level 2)
- Mrs. M emphasized including empirical data, a relevant but extra feature, in a model.

EXPLANATION: N/A

ACCURACY (Level 2.5)
- Mrs. M emphasized including empirical data in a model as a source of its accuracy.

COMMUNICATION (Level 2.5)
- Mrs. M emphasized including empirical data in a model to make it persuasive.

Her emphasis on "specific evidence," or specific empirical data, turned out to have a big impact on students' modeling activities and EIMs. Before the above short conversation, only two groups of students (including the focus group), that is, 8 of 24 students (33.3%), had included the feature in their consensus models of evaporation. In contrast, after Mrs. M's emphasis on empirical data, many more students attended to the feature. For example, in constructing their second models of condensation, when empirical data were available, 17 of 24 students (70.8%, including the focus students) included such data. Most interestingly, in constructing their initial models of condensation, a modeling activity they engaged in right after the present activity, 10 of 24 students (41.7%, not including the focus students) included empirical data in their models despite the fact that they had no empirical data at that time. This indicates that after Mrs. M's emphasis on this feature, more students came to see it as a feature that makes a model look accurate and persuasive.

4.2.7. The activity of constructing an initial model of condensation (M6) and its preceding curriculum event

From Class 15 to the end of the unit (Class 22), Mrs. M's class investigated the second target phenomenon of this unit: condensation. In this second phase, Mrs. M and students went through almost the same modeling activities in the same order as they had when investigating evaporation. Because they were already familiar with these modeling activities from their previous engagement in them, they did not spend as much time making sense of what the activities were about and how to do them in this second phase. In what follows, I examine one curriculum event that took place before students constructed their initial models of condensation, and I analyze the focus students' epistemologies used in doing this activity.

4.2.7.1. "How did water drops appear on the surface of a cold bottle?" (Class 14)

In Class 14, students were introduced to two phenomena that both involve water drops forming over time: one on a plastic wrap that covers a cup containing water and the other on the surface of a cold bottle. They were asked three questions for both phenomena: What do you observe on it? Where did it come from? How did it get there? When they talked about the cold bottle, Mrs. M asked Adrianna to share what she had written in response to these questions. Following her talk (not fully audible), Mrs. M acted out Adrianna's ideas about the last two questions with three volunteering students. She pretended to be a cold bottle, and the three students acted as water particles. She asked the participating student, "What are you going to do?
What happens if you're out floating, if you're out in warm air?...I'm cold!" And she asked the class, "What is going to happen with these guys?" In response, students shouted, "Clump together! Clump together!" Accordingly, the three students moved near Mrs. M (See Figure 4.17). (Mrs. M also had students read some passages from the student notebook, but they were identical with what they had read before constructing their initial models of evaporation, and she neither added any comment on them nor had students talk about them. Therefore, I assume that this event did not have much effect on students' EIMs at that time and do not focus on it here.)

Figure 4.17. Mrs. M and three students' performance of water particles collecting on a cold bottle (Class 14)

After this short, engaging performance, Mrs. M referred to the two prior curriculum events: the computer simulations about state changes and students' collective performance representing the behaviors of water molecules in the playground (Class 9). In this way, Mrs. M helped students note that when it is cold, water particles in the air get bundled on a surface. Mrs. M's emphasis on water particles as an explanatory feature potentially allowed students to see the utility of this feature in constructing their initial models of condensation. Analysis of this idea using the EIM coding scheme is presented below.

Table 4.28. Analysis of the ideas about modeling from Mrs. M and three students' performance of water particles collecting on a cold bottle and Mrs. M's ensuing comments

CONTENT: N/A

EXPLANATION (Level 3)
- Mrs. M emphasized how water particles behave as an explanatory feature.

ACCURACY: N/A

COMMUNICATION: N/A

4.2.7.2. The focus students' construction of their initial models of condensation (Class 15~Class 16)

Students were able to construct their initial models of condensation using some of the knowledge and skills about modeling that they had accumulated and refined thus far. Below, I focus my analysis on the focus students' initial models of condensation.

One note must be given at this point. This modeling activity was quite different from the preceding and ensuing modeling activities in that students had no empirical evidence at that time. Accordingly, the focus students did not, or could not, include any feature related to empirical evidence in their initial models of condensation. For example, Brian did not include empirical data (e.g., percentage humidity) in this model, although he had done so in his prior model and would do so in his next model. How should this change be interpreted? Did he change his epistemic idea about empirical evidence only at this time? Or was it merely due to the unique situation of this activity (i.e., no empirical evidence)? Obviously, there is some ambiguity here, and unfortunately I do not have data that can settle it. My decision is to analyze the focus students' initial models of condensation as they appear at this moment.

Brian's initial model of condensation

Figure 4.18. Brian's initial model of condensation

Figure 4.18 (cont'd) Note: The words in this figure are as follows (verbatim). A. 1 B. water vapor C. water drop D. cold bottle E. The cold bottle is getting a few drops on it. F. 2 G. water drop H. water vapor that is turning into water drop I. There are a lot of drops. J. 3 K. There are a ton of drops on the bottle. The water vapor is warm and the bottle is cold so if the water vapor gets to close, it becomes a water drop.
In Class 16, Mrs. M asked Brian to explicate this model to her. When Mrs. M asked where the water droplets on the bottle come from, Brian said that they come from the water vapor in the air. Also, with Mrs. M's constructive comments, he said that as water vapor in the warm air comes close to the cold bottle, it turns into water drops. Intrigued by the fact that Brian seemed quite confident about his explanation, Mrs. M asked, "How do you know it didn't come from inside the bottle?" He replied, "Because water doesn't just go through solid objects." Analysis of these two data sources using the EIM coding scheme is presented below.

Table 4.29. Analysis of Brian's EIM in constructing his initial model of condensation

CONTENT (Level 3)
- Included such communicative features as labels, sentences, and colors.
- Described the phenomenon by showing that the number of water drops on a cold bottle increases over time.
- Explained the phenomenon by showing/writing that as water particles in the air come close to the bottle, they turn into water drops.
- Included no extra feature.

EXPLANATION (Level 3)
- Explained how an increasing number of water drops form on a cold bottle by showing/writing that as water particles in the air come close to the bottle, they turn into water drops.

ACCURACY (Level 2)
- Used the idea of water particles moving in the air and turning into water droplets when it is cold from a previous source (e.g., the computer simulations about state changes) but did not refer to the source or articulate how the idea comes from the source.

COMMUNICATION (Level 2)
- Included such communicative features as labels, sentences, and colors.

Joon's initial model of condensation

Figure 4.19. Joon's initial model of condensation

Figure 4.19 (cont'd) Note: The words in this figure are as follows (verbatim). A. The bottle is starting to get water drop B. water vapor C. The water vapor going to bottle because bottle is cold D. 1 Before E. The bottle is cold so when water vapor goes to bottle it will be change to water drop F. water drop G. 2 During H. The bottle got more water drop I. Need more J. 3 After

Joon's EIM can be analyzed from his initial model of condensation as follows.

Table 4.30. Analysis of Joon's EIM in constructing his initial model of condensation

CONTENT (Level 3)
- Included such communicative features as labels, sentences, and colors.
- Described the phenomenon by showing that the number of water drops on a cold bottle increases over time.
- Explained the phenomenon by showing/writing that as water particles in the air come close to the bottle, they turn into water drops.
- Included no extra feature.

EXPLANATION (Level 3)
- Explained how an increasing number of water drops form on a cold bottle by showing/writing that as water particles in the air come close to the bottle, they turn into water drops.

ACCURACY (Level 2)
- Used the idea of water particles moving in the air and turning into water droplets when it is cold from a previous source (e.g., the computer simulations about state changes) but did not refer to the source or articulate how the idea comes from the source.

COMMUNICATION (Level 2)
- Included such communicative features as labels, sentences, and colors.

Mana's initial model of condensation

Figure 4.20. Mana's initial model of condensation

Figure 4.20 (cont'd) Note: The words in this figure are as follows (verbatim). A. Before B. The gree dots are water vapor. There are not many water vapor here.
C. This bottle has just been taken out of the fridge D. During E. There is more water vapor and there is some on the bottle too. F. There is more water vapor and there is some on there too. G. After H. There is more water vapor on the bottle. I. There is more water vapor on the bottle. J. Key [light blue dot] = water vapor [arrow] = information [blue dot] = droplets K. I think that the water vapor from the air came to the cold bottle and it turned into water droplets.

Analysis of Mana's EIM from her initial model of condensation is summarized in Table 4.31.

Table 4.31. Analysis of Mana's EIM in constructing her initial model of condensation

CONTENT (Level 3)
- Included such communicative features as sentences, a key, and colors.
- Described the phenomenon by showing that the number of water droplets on a cold bottle increases over time.
- Explained the phenomenon by showing/writing that as water particles in the air come close to the bottle, they turn into water droplets.
- Included no extra feature.

EXPLANATION (Level 3)
- Explained the phenomenon by showing/writing that as water particles in the air come close to the bottle, they turn into water droplets.

ACCURACY (Level 2)
- Used the idea of water particles moving in the air and turning into water droplets when it is cold from a previous source (e.g., the computer simulations about state changes) but did not refer to the source or articulate how the idea comes from the source.

COMMUNICATION (Level 2)
- Included such communicative features as sentences, a key, and colors.

Figure 4.21 displays the general portrait of the focus students' EIMs as manifested in this modeling activity.

Figure 4.21. Analyses of the focus students' EIMs in constructing their initial models of condensation
[Bar graph of EIM levels (0 to 3) in CONTENT, EXPLANATION, ACCURACY, and COMMUNICATION for Brian, Joon, and Mana.]

Given the analytical issue discussed above, this result should be taken to indicate the focus students' EIMs only as evidenced in their initial models of condensation. Without additional determinative evidence, we do not know confidently what EIMs they held at this time. With this difficulty in mind, I now compare this result with their group EIM deployed in constructing their consensus model of evaporation (Figure 4.16).

Comparison of the two results reveals that all three students' EIMs changed in CONTENT, ACCURACY, and COMMUNICATION. The changes in these categories are all related to the unique context of the present activity, in which no empirical evidence was available. Their EIMs in EXPLANATION, however, continued to be at level 3. This is a significant result considering that the two models targeted different phenomena (evaporation and condensation). It indicates that all these students carried over the explanatory idea and feature of how water particles behave to explain the new phenomenon of condensation. Although evaporation and condensation are closely related phenomena from a scientific perspective, elementary students may not see them as such. Furthermore, specific cases of these phenomena, such as water on a plate shrinking over time (evaporation) and water drops appearing on a cold bottle (condensation), may look very different to them. Regarding the latter case, there is evidence that some students believe that water inside the bottle somehow comes out of the bottle (in another teacher's classroom where the same curriculum unit was enacted, several students deployed this idea). Given that, it is noteworthy that these students used the same explanatory features they had used before to explain how water drops form on a cold bottle.
I argue that this retention was made possible, or at least reinforced, by Mrs. M's effort shown above (Table 4.28). Because she not only reminded students of the computer simulations about state changes and their collective performance in the playground but also provided, with three students, an embodied explanation of how condensation happens, many students, including the focus students, likely decided to show it in their initial models of condensation.

4.2.8. The activity of constructing a second model of condensation (by evaluating and revising the prior model) (M7) and its preceding curriculum events

According to the designed curriculum, students proceeded to the activity of conducting empirical investigations of condensation after completing their initial models of condensation. The purpose of this activity was to collect empirical evidence about condensation. Next, they evaluated and revised their initial models of condensation based on the evidence and constructed their second models of condensation. Below, I investigate how Mrs. M and Ms. H instructed and guided students to do these activities, with a focus on their potential influence on students' EIMs. Then, an analysis of the focus students' epistemologies used in constructing their second models of condensation is provided.

4.2.8.1. Linking empirical evidence to models during empirical investigations about condensation (Class 17~Class 19)

After creating their initial models of condensation, Mrs. M's students conducted four sets of experiments about condensation from Class 17 to Class 19. Below is a summary of the experiments they conducted and the results.

- Experiment 1: Soda Can & Ice Pack. In groups, students observed a soda can and an ice pack just taken out of a freezer. They observed the surfaces of the two objects becoming foggy. The focus students did this experiment in Class 17 but did not finish it.
- Experiment 2: Mirror. In groups, students observed the surface of a mirror just taken out of a freezer. They found its surface becoming foggy. The focus students did this experiment in Class 17. When Mrs. M pushed them to explain it with several questions, they could articulate that water vapor coming from the air forms "the fog" on the surface of the mirror.
- Experiment 3: Humidity in a Container. In groups, students placed an ice pack in a hood and measured humidity in the hood over time. In most groups, they found humidity decreasing over time. The focus students undertook this experiment twice, first in Class 17 and again in Class 19. In Class 17, before they started it, most of them predicted that the humidity would rise because they thought that water in the hood would evaporate over time. Mrs. M showed that the humidity actually went down and guided them to note that the water vapor came from the air and stuck to the surface of the ice pack. In Class 19, they did this experiment again. Chael came from another group and joined the focus group at this time. He predicted that the humidity would go down and then begin to rise. When they measured the humidity in the hood for an extended time, they found that Chael's prediction was right. This was a significant enough moment for them to change their models to be consistent with this result.
- Experiment 4: Weigh an Ice Pack over Time. In Class 18, Mrs. M demonstrated this experiment.
She placed an ice pack on a scale at the front of the class and measured its weight over time. Students found that its weight went up over time.

Recall that Mrs. M used a hybrid sequence combining the Predict-Share-Observe-Explain framework and the general sequence in the student notebook to help students conduct empirical investigations about evaporation. For students' engagement in empirical investigations about condensation, she refined her previous hybrid sequence into a more formal sequence and created a handout based on it. This new sequence consisted of the following five steps. After each step, I give the typical instructions found in the handout.

- Predict: "Predict what will happen…"
- Share: "Share with your partners."
- Observe: "Observe what is happening to…" and "List observations below."
- Explain: "Explain the data. How can you interpret the things you observed? What does this data tell you about condensation?"
- Reflect: "Compare your observation with your prediction & reflect on how this evidence may help you improve your initial model. What do you wish to change in model as a result of doing this investigation?"

Now, I analyze this Predict-Share-Observe-Explain-Reflect sequence and Mrs. M's scaffolding shown above with respect to the ideas about modeling they supported.

Table 4.32. Analysis of the ideas about modeling from Mrs. M's scaffolding and handout given to help students link empirical evidence to their models

CONTENT: N/A

EXPLANATION: N/A

ACCURACY (Level 2~3)
- Mrs. M exemplified how to use empirical evidence to refute an invalid idea and thus to secure accuracy (validity) of a model (level 3).
- In the handout, empirical evidence was emphasized but how to use it was not specified (level 2~3).

COMMUNICATION (Level 2.5~3)
- Mrs. M exemplified a sophisticated way of using empirical evidence to make a model persuasive (level 3).
- In the handout, empirical evidence was emphasized as a feature that makes a model persuasive but how to use it was not specified (level 2.5~3).

4.2.8.2. Introducing and guiding the activity of constructing a second model of condensation (Class 19)

Immediately after students finished the empirical investigations about condensation, Mrs. M had them construct their second models of condensation on the basis of the empirical evidence they had available. Recall that, after the experiments about evaporation, they had spent a significant amount of time summarizing and making sense of the empirical evidence. By contrast, at this time, she did not assign much time to such activity. It may be that she assumed students had a good enough understanding of the empirical evidence about condensation they had collected and of how to evaluate and revise their initial models based on it. Mrs. M even had Ms. H take charge of the activity. Ms. H mostly read passages in the student notebook that direct the activity and added some comments as follows. Note that the quoted parts below are the passages in the student notebook she was reading.

It says, "Evaluate and revise the model." "Look at the findings we have collected so far." They are all in your student notebooks, all the evidence collected. "We know the water on the bottle did not come from the inside. We know that the water did not come from the clouds in the sky. We know that the water came from the air. And we know that condensation occurs better as the temperature difference between the can and the air is larger.
Go back to your initial model," the first model of condensation. "Evaluate and revise it to take account of the findings." You need to think about all the things we've found, all the evidence we've found from the experiments and how you can apply that evidence to your new model. All right? So, just like we did before with evaporation we are going to do the same thing and revise your model and include all the stuff you have learned so far.

Emphasis on empirical evidence was clear here. However, how to use it for their next models was not. The text she read asked students' new models to "take account of" the evidence. But in Ms. H's following comments, she told students to "apply that evidence" to their new models. Finally, she advised them to "include all the stuff you have learned so far." Overall, it is obvious that in this local situation Ms. H viewed the empirical evidence as information students had learned about condensation that was to be included in their second models of condensation.

Shortly after the activity started, Mrs. M walked around to help individual students. At one moment, she told the whole class:

…this is condensation so if you were to show like a first grader, second grader, fourth grader your model, could your model be used to explain condensation, okay? Do you need all that extra stuff? Dalia was including evaporation and all sorts of other stuff. Make sure you decide whether or not you need all the other stuff. Okay?

Here, she highlighted the simplicity and parsimony of models. To this end, she placed the activity in an imaginary situation in which students explain condensation to younger students. The reason for invoking this image was to emphasize that their models should be focused on condensation and simple. She did so because she had found that Dalia and, as we will see shortly, some other students had included "evaporation and all sorts of other stuff" in their new models. The ideas about modeling that can be detected in these utterances are analyzed as follows.

Table 4.33. Analysis of the ideas about modeling from the passages in the student notebook, Ms. H's comments, and Mrs. M's comments that introduce and guide the activity of constructing a second model of condensation

CONTENT (Level 2~3)
- Ms. H emphasized including "all the stuff you have learned so far" (level 2).
- Mrs. M emphasized taking out extra features not related to condensation (level 2~3).

EXPLANATION: N/A

ACCURACY (Level 2~3)
- The passages in the student notebook ("Evaluate and revise it to take account of the findings.") and Ms. H's comments as a whole were open to various ways of using empirical evidence to make a model accurate (valid).

COMMUNICATION (Level 2.5~3)
- The passages in the student notebook ("Evaluate and revise it to take account of the findings.") and Ms. H's comments as a whole were open to various ways of using empirical evidence to make a model persuasive.

4.2.8.3. The focus students' construction of their second models of condensation (Class 19)

Brian's second model of condensation

Figure 4.22. Brian's second model of condensation

Figure 4.22 (cont'd) Note: The words in this figure are as follows (verbatim). A. Before B. Average humidity C. humidity 52% D. weight 204.08 grams E. Key [dot] - condensation [light blue square] - ice pack [dome] - container [small circle] - water vapor F. During G. humidity 44% H. weight 204.17 grams I. less water vapor less humidity more condensation more weight J. After K. humidity 47% L. weight 204.13 grams M.
The water vapor became drops of water and stuck to the cold ice pack, causing less humidity and more weight on the ice pack. Afterwards, the ice pack heats up, and the water drops evaporate causing normal humidity and normal weight on the ice pack.

Brian's EIM can be analyzed from this data as follows.

Table 4.34. Analysis of Brian's EIM in constructing his second model of condensation

CONTENT (Level 2)
- Included such communicative features as labels, sentences, a key, and colors.
- Described the phenomenon by showing/writing how the number of water drops on a bottle, humidity, and weight change over time.
- Explained the phenomenon by showing/writing how water particles turn into water drops and vice versa.
- Included empirical data, a relevant but not essential feature.

EXPLANATION (Level 3)
- Explained the phenomenon by showing/writing how water particles turn into water drops and vice versa.

ACCURACY (Level 2.5)
- Included two kinds of empirical data (percentage humidity, gram weight) as a source of accuracy of the model.

COMMUNICATION (Level 2.5)
- Included such communicative features as labels, sentences, a key, and colors.
- Included two kinds of empirical data (percentage humidity, gram weight) to make the model persuasive.

Joon's second model of condensation

Figure 4.23. Joon's second model of condensation

Analysis of Joon's EIM from his second model of condensation is presented in the following table.

Table 4.35. Analysis of Joon's EIM in constructing his second model of condensation

CONTENT (Level 2)
- Included such communicative features as a key, sentences, and colors.
- Described the phenomenon by showing/writing how the number of water drops on a bottle, humidity, and weight change over time.
- Explained the phenomenon by showing/writing how water particles turn into water drops.
- Included empirical data, a relevant but not essential feature.

EXPLANATION (Level 3)
- Explained the phenomenon by showing/writing how water particles turn into water drops.

ACCURACY (Level 2.5)
- Included two kinds of empirical data (percentage humidity, gram weight) as a source of accuracy of the model.

COMMUNICATION (Level 2.5)
- Included such communicative features as a key, sentences, and colors.
- Included two kinds of empirical data (percentage humidity, gram weight) to make the model persuasive.

Mana's second model of condensation

Figure 4.24. Mana's second model of condensation

Figure 4.24 (cont'd) Note: The words in this figure are as follows (verbatim). A. Before B. TIME 10:00 C. There is water vapor in the air. D. The cold bottle has been just placed on the table E. Humidity 45% F. WEIGHT 203g G. During H. TIME 10:05 I. There is water vapor on the bottle. J. There is water vapor on the bottle. There is some in the air too. K. weight 204g L. HUMIDITY 40% M. After N. TIME 10:10 O. There is water droplets on the bottle. There is still water vapor in the air. P. There is droplets on the bottle. There is still some water vapor in the air. Q. HUMIDITY 35% R. WEIGHT 206g S. Key [green] - Water vapor [red] - Table [pink] - Bottle [blue] - Water droplets

Mana's EIM can be analyzed from this data as follows.

Table 4.36. Analysis of Mana's EIM in constructing her second model of condensation

CONTENT (Level 2)
- Included such communicative features as sentences, a key, and labels.
- Described the phenomenon by showing/writing how the number of water drops on a bottle, humidity, and weight change over time.
- Explained the phenomenon by showing/writing how water particles turn into water drops.
- Included empirical data, a relevant but not essential feature.

EXPLANATION (Level 3)
- Explained the phenomenon by showing/writing how water particles turn into water drops.

ACCURACY (Level 2.5)
- Included two kinds of empirical data (percentage humidity, gram weight) as a source of accuracy of the model.

COMMUNICATION (Level 2.5)
- Included such communicative features as sentences, a key, and labels.
- Included two kinds of empirical data (percentage humidity, gram weight) to make the model persuasive.

Analyses of the focus students' EIMs from their second models of condensation are summarized in Figure 4.25.

Figure 4.25. Analyses of the focus students' EIMs in constructing their second models of condensation
[Bar graph of EIM levels (0 to 3) in CONTENT, EXPLANATION, ACCURACY, and COMMUNICATION for Brian, Joon, and Mana.]

Comparing this result with the results for the focus students' previous two modeling activities (Figure 4.16, Figure 4.21), we can see that their EIMs in this activity differ from the EIMs analyzed from their initial models of condensation and are identical with the EIMs they used in constructing their consensus model of evaporation. Keeping in mind the analytical difficulty involved in analyzing their EIMs from their initial models of condensation, I comment on what this means and how it is related to the two preceding curriculum events.

CONTENT: The levels of all three students' EIMs in this category changed from 2 (constructing a consensus model of evaporation) through 3 (constructing an initial model of condensation) to 2 (constructing a second model of condensation). These different levels mainly reflect whether or not they included empirical data (e.g., percentage humidity, weight) in their models. Based on the evidence, I do not interpret this change as indicating that their EIMs in this category actually fluctuated during these three activities. We need to consider that nine students (other than the focus students) included such a feature in their initial models of condensation even when they did not have any empirical data, while only one group of students (other than the focus group) had included such a feature in their consensus model of evaporation. As discussed earlier, this is evidence that they were influenced by Mrs. M's emphatic confirmation of the percentage humidity in the focus students' consensus model of evaporation in Class 12 (See Table 4.27). Considering this influence, it does not seem reasonable to think that the focus students withdrew their attention from this feature.

EXPLANATION: The levels of the focus students' EIMs in this category did not change. They continued to use the mechanistic feature of water particles in their second models of condensation.

ACCURACY: Their EIMs in this category all changed from level 2.5 (constructing a consensus model of evaporation) to level 2 (constructing an initial model of condensation) to level 2.5 (constructing a second model of condensation). This is because they all included two sets of empirical data, of humidity and of weight, in their second models of condensation. As noted above in CONTENT, it appears that they had maintained their commitment to this feature since their consensus model of evaporation. One thing to note here is that Brian's data pattern was different from both Joon's and Mana's.
While Joon and Mana showed decreasing humidity in the air and increasing weight of the bottle over time in their models, in Brian's model humidity decreased and then increased, and weight increased and then decreased, over time. Brian's data pattern came from the result of Experiment 3 when the focus students conducted it for the second time (Class 19). Here, we can see his increased attention to empirical data as a source that provides accuracy (validity) to a model. The two curriculum events analyzed above do not seem to have had much effect on this once Mrs. M's emphasis on specific evidence had specified how to use empirical evidence (See Table 4.27).

COMMUNICATION: Their EIMs in this category all changed from level 2.5 (constructing a consensus model of evaporation) to level 2 (constructing an initial model of condensation) to level 2.5 (constructing a second model of condensation), again because of their inclusion of empirical data.

4.2.9. The activity of constructing a group consensus model of condensation (M8) and its preceding curriculum events

After students finished constructing their second models of condensation, Mrs. M had them engage in two ensuing modeling activities at a stretch. These were the activities of evaluating others' second models of condensation and of constructing a group consensus model of condensation. She also did not give them substantial instruction or scaffolding for these activities. Thus, in what follows, I briefly document the two curriculum events and then analyze the focus students' group EIM used in constructing their group consensus model of condensation.

4.2.9.1. Guiding the activities of evaluating peers' second models of condensation and of constructing consensus models of condensation (Class 20)

In Class 20, Mrs. M gave students a short instruction for two connected modeling activities: evaluation of others' second models of condensation and construction of a group consensus model of condensation. It seems that her combining the two activities and giving only a short guide had to do with the fact that she had spent more time than planned on the implementation of the unit up to this point; she likely wanted to speed up the enactment of the rest of the unit. This was possible also because students already had experience with and understanding of these two modeling activities. Hence, Mrs. M did not even read the instructions printed in the student notebook and only emphasized one thing about a consensus model, as follows.

Group consensus models should not be a whole combination of every single thing that you think condensation is to represent. Group consensus models shouldn't be hard to follow…Some of the things you might choose to leave out. Let us know why.

Mrs. M stated that group consensus models do not have to have every single thing about condensation and that students might choose to leave some things out. She touched on parsimony here. But she did not elaborate why she favored such parsimony; she only alluded to clarity or comprehensibility. Instead of presenting reasons, she urged students to come up with their own reasons.

Table 4.37. Analysis of the ideas about modeling from the curriculum event of introducing empirical evidence as a criterion for evaluating models

CONTENT (Level 2~3)
- Mrs. M emphasized taking out some of the things from a model.

EXPLANATION: N/A

ACCURACY: N/A

COMMUNICATION (Level 2)
- Mrs. M emphasized taking out some of the things from a model to make it clear.
4.2.9.2. Evaluating others' second models of condensation (Class 20)

After this guiding comment, Mrs. M talked with students about the procedure for engaging in these two activities. She adopted a student's suggestion to use note cards for peer evaluations. This procedural decision had some impact not only on the way they evaluated one another's models but also on the content of their evaluations. Because students focused on the written evaluations, they did not engage in verbal, interactive evaluations as actively as they had done previously. This in turn constrained the amount and specificity of their evaluations; they tended to be succinct in writing. This was evident in the focus students' evaluations of one another's second models of condensation. Because of this difficulty, I chose not to analyze their evaluations systematically.

There is, however, one instance I would like to highlight from the focus students' engagement in this activity. When Brian and Adrianna paired up to evaluate each other's second models of condensation, Adrianna asked why the weight of the bottle in the "during" is larger than that in the "after." To this question, Brian explained:

I know. This is when it starts to get warmer and then evaporates. Then [reading the sentences he wrote in his model] "the water vapor became drops of water and stuck to the cold ice pack, causing less humidity and more weight on the ice pack. Afterwards, the ice pack heats up, and the water drops evaporate causing normal humidity and normal weight on the ice pack."

Adrianna embraced this idea immediately ("Yeah, perfect!"), presumably because she remembered their second run of Experiment 3. It appears that Adrianna also accepted Brian's commitment to empirical data as a source of accuracy (validity) of a model at that time. Although Mrs. M had advised students not to include evaporation in their models of condensation (Table 4.33), Adrianna does not seem to have attended to the advice at that moment. As we will see shortly, Joon and Mana did the same as Adrianna did at this time.

4.2.9.3. The focus students' construction of their group consensus model of condensation (Class 21)

In Class 21, the focus students built their group consensus model of condensation for most of the class period (about thirty minutes). Unlike when they had constructed their consensus model of evaporation, they talked about what features to include in the model and constructed the model at the same time. Their participation patterns did not change significantly. Analysis of their on-task turns to speak shows that Adrianna (46.2%) was the most active participant, with Joon (28.9%), Brian (22.2%), and Mana (2.7%) being the second, the third, and the least active participants respectively. The following table summarizes the conversation they had about model features as they constructed their consensus model of condensation.

Table 4.38. The focus students' conversation in constructing their consensus model of condensation (Class 21)

A. Target object (a bottle → a bottle in a container)
- Regarding their target object, they first thought of only a bottle, but Brian convinced them to change it into a bottle in a container. The latter is closer to the experiment they conducted (Experiment 3: an ice pack in a hood) and thus made it possible to represent percentage humidity accurately.

B. Condensation on a hot bottle and a cold bottle (not taken)
- Adrianna proposed to have this feature, but Brian said that it did not matter, to which Adrianna agreed.
This indicates that they perceived a comparative presentation of the two cases as too much detail.

C. Organization
- They were concerned a lot with how to organize the various features (or where to put each feature) in the model.

D. Before, during, after
- Adrianna proposed it.

E. Key
- Joon proposed it.

F. Weight
- Adrianna proposed it.

G. Decimal points for weight measures
- Brian proposed this, which the others accepted, for the space issue. It was, however, mainly to represent the measures more accurately.

H. Humidity
- They reflected the result of an experiment (Experiment 3) that the humidity around an ice pack in a hood had decreased and then increased over time.

I. Sentences
- When Mana wrote sentences, Adrianna emphasized accurately representing the amount of water vapor and water drops in the sentences.

J. The number of water drops
- They reflected the result of an experiment (Experiment 3) that the humidity around an ice pack in a hood had decreased and then increased over time.

K. Colors
- They all liked using different colors to represent different things in the model.

L. The number of dots (representing "water vapor") in the air
- They reflected the result of an experiment (Experiment 3) that the humidity around an ice pack in a hood had decreased and then increased over time.

Their group consensus model of condensation is presented in Figure 4.26.

Figure 4.26. The focus students' group consensus model of condensation

Figure 4.26 (cont'd) Note: The words in this figure are as follows (verbatim). A. Before B. 52% C. 205.08g D. There is a lot of water vapor in the air right now. There is a little bit of condensation happening. E. During F. 48% G. 205.16g H. There is still same amount of water vapor in the air and there is more condensation happening. I. After J. 50% K. 205.11g L. There is now more water vapor in the air because the same amount of condensation evaporated. M. [green] = water vapor [brown] = root beer [light blue] = condensation [blue] = humidity detector g = grams [dome] = container [blank] = air

Analysis of both their consensus model of condensation and the conversation they had in constructing it is summarized in Table 4.39 and Figure 4.27.

Table 4.39. Analysis of the focus students' group EIM in constructing their group consensus model of condensation

CONTENT (Level 2)
- Included communicative features such as sentences, a key, and colors.
- Included a humidity detector and empirical data (e.g., data of humidity, data of weight), features that they had learned about condensation but that are not necessary to explain condensation.

EXPLANATION (Level 3)
- Explained how the percentage humidity in the air and the weight under the container change over time using water particles.

ACCURACY (Level 2.5)
- Included empirical data (i.e., how the data of humidity in the air and the data of weight change over time) and made their explanation (i.e., how the number of water particles changes over time) consistent with the empirical data.

COMMUNICATION (Level 2.5)
- Included communicative features such as sentences, a key, and colors.
- Included empirical data (e.g., data of humidity, data of weight) to make the model persuasive.

Figure 4.27. Analysis of the focus students' group EIM in constructing their group consensus model of condensation
[Bar graph of group EIM levels (0 to 3) in CONTENT, EXPLANATION, ACCURACY, and COMMUNICATION for Brian, Joon, and Mana as a group.]

The focus students' EIMs continued to be the same.
However, comparing their consensus model of condensation with their individual second models of condensation reveals something interesting. Comparison between Brian’s second model of condensation and the consensus model shows that the consensus model incorporated two features from Brian’s model. First, it represented an experimental setup in detail. In particular, it showed a humidity detector measuring percentage humidity in a hood in which a cold bottle is placed on a scale. Evidently, Brian created this setup by combining the two experiments they had conducted (Experiments 3 and 4). In contrast, Joon and Mana did not pay as much attention to this feature: Joon included only the data of humidity and of weight in his second model of condensation, and Mana showed the setting of Experiment 4 only. Second, the focus students’ consensus model of condensation showed that humidity in a hood goes down and then goes up and that the weight of a bottle increases and then decreases, another feature of Brian’s second model of condensation. Neither Joon nor Mana reflected it in their second models of condensation. That the other focus students incorporated these two features in their consensus model of condensation indicates that Brian’s ideas underlying those features were influential and propagated to the other students.

4.2.10. The activity of evaluating other groups’ consensus models of condensation (M9)

Class 22 was fully dedicated to the activity of evaluating other groups’ consensus models of condensation. It consisted of two interrelated events. First, two groups exchanged their consensus models of condensation, and students in each group individually wrote their evaluations of the other group’s model in their student notebooks. As occasions arose, they shared their evaluation ideas within their group. Second, the two groups sat together and, on the basis of their written evaluations, evaluated each other’s consensus model of condensation verbally and interactively. I would like to highlight the fact that this activity was not part of the designed curriculum; Mrs. M introduced it at her discretion. Though not planned, this activity met two crucial conditions that the curriculum designers had considered in developing this curriculum unit. First, it bore resemblance to scientists’ modeling activity. Second, it was pedagogically accessible and useful. Therefore, this activity can be regarded as a valuable event that fostered students’ engagement in scientific modeling. Unfortunately, I have no data that captured how Mrs. M introduced the activity. Thus, we do not know whether there was any utterance or text that provided a new idea about modeling at this time.

4.2.10.1. The focus students’ evaluations of other groups’ consensus models of condensation (Class 22)

In this activity, the focus students’ group and another group (Group 3) engaged in mutual evaluations. Then, they teamed up to evaluate a third group (Group 4)’s consensus model of condensation. In what follows, I focus on the focus students’ evaluations of Group 3’s and Group 4’s consensus models of condensation. I present their written and verbal evaluations in combination below. I edited them minimally to remove some redundancy.

Brian’s evaluations of other groups’ consensus models of condensation
Table 4.40. Brian’s evaluations of other groups’ consensus models of condensation
Group 3’s consensus model of condensation
Compliments:
- I like how it shows humidity percentage.
- I like how it shows how the humidity lowers and rises.
- It has air.
Wishes:
- You should have put the water drops on your key.
- In last one it looks like there is more water drops even though it is evaporating.
- It should have weight.
Group 4’s consensus model of condensation
Compliments:
- It has pretty colors.
- It has a scale, which shows that the water has weight so it's not just the same amount of weight.
- It has humidity percentages.
- There's water vapor.
Wishes:
- It has a bad grammar.
- The cold area thingy grows. It doesn't need to grow. It doesn't have magical growing force field.
- It has no falling then rising humidity…Because it warms up.

Analysis of Brian’s EIM from his evaluations of other groups’ consensus models of condensation is presented in the following table.

Table 4.41. Analysis of Brian’s EIM from his evaluations of other groups’ consensus models of condensation
CONTENT (Level 2): Attended to communicative features such as a key, colors, and the grammar of sentences. Attended to microscopic/theoretical entities (e.g., water particles) as an explanatory feature. Attended to empirical data (e.g., percentage humidity and gram weight), a relevant but not necessary feature.
EXPLANATION (Level 3): Attended to microscopic/theoretical entities (e.g., water particles) as an explanatory feature.
ACCURACY (Level 2.5): Attended to empirical data (e.g., percentage humidity and gram weight) as a source of accuracy of a model.
COMMUNICATION (Level 2.5): Attended to communicative features such as a key, colors, and the grammar of sentences. Attended to empirical data (e.g., percentage humidity and gram weight) as a feature that makes a model persuasive.

Joon’s evaluations of other groups’ consensus models of condensation

Table 4.42 shows how Joon evaluated the two other groups’ consensus models of condensation in notes and utterances.

Table 4.42. Joon’s evaluations of other groups’ consensus models of condensation
Group 3’s consensus model of condensation
Compliments:
- I like the way that they put humidity and clear key.
- I like how they put color to make clear.
- I like how they put a direction.
Wishes:
- I wish you put water drop in the key.
- I wish you put air everywhere.
- I wish you put weight.
Group 4’s consensus model of condensation
Compliments:
- It is awesome drawing.
- I like the way that the time (?)
- I like the way that they put arrows.
- They (?) detail, but they (?).
Wishes:
- I don't get why they had more cold area.
- I wish they had more sentences.

Below is the result of analysis of Joon’s EIM captured in these evaluations.

Table 4.43. Analysis of Joon’s EIM from his evaluations of other groups’ consensus models of condensation
CONTENT (Level 2): Attended to communicative features such as a key, sentences, and colors. Attended to the air as an explanatory feature. Attended to empirical evidence (e.g., data of humidity and of weight) as a feature that makes a model persuasive.
EXPLANATION (Level 2): Attended to the air as an explanatory feature.
ACCURACY (Level 2.5): Attended to empirical evidence (e.g., data of humidity and of weight) as a source of accuracy of a model.
COMMUNICATION (Level 2.5): Attended to communicative features such as a key, sentences, and colors. Attended to empirical evidence (e.g., data of humidity and of weight) as a feature that makes a model persuasive.

Mana’s evaluations of other groups’ consensus models of condensation
Table 4.44. Mana’s evaluations of other groups’ consensus models of condensation
Group 3’s consensus model of condensation
Compliments:
- I like how they have a clear key.
- I like how they put a lot of color into their model.
- I like how they have neat sentences.
Wishes:
- I wish that there were water droplets in the key.
- I wish that they would add more air to their model.
- I wish that they would put weight into their model.
- I wish they didn’t put that much humidity in the after part because the percentage should be higher.
Group 4’s consensus model of condensation
Compliments:
- I like how they had time.
- I like how they had detail.
- I like how they had humidity (??)
- I like how they had scale label.
Wishes:
- I wish they labeled when the condensation was happening.
- I wish they put more time in between there.

Analysis of Mana’s EIM from these data is given in Table 4.45.

Table 4.45. Analysis of Mana’s EIM from her evaluations of other groups’ consensus models of condensation
CONTENT (Level 2): Attended to communicative features such as a key, labels, sentences, and colors. Attended to empirical data (“humidity” and “weight”), a relevant but not necessary feature.
EXPLANATION (Level 2): Attended to the air as an explanatory feature.
ACCURACY (Level 2.5): Attended to empirical data (“humidity” and “weight”) as a source of accuracy of a model.
COMMUNICATION (Level 2.5): Attended to communicative features such as a key, labels, sentences, and colors. Attended to empirical data (“humidity” and “weight”) as a feature that makes a model persuasive.

The analysis results for the focus students’ EIMs are summarized in the following chart.

Figure 4.28. Analyses of the focus students’ EIMs in evaluating other groups’ consensus models of condensation (Class 22) (chart: levels 0–3 in CONTENT, EXPLANATION, ACCURACY, and COMMUNICATION for Brian, Joon, and Mana)

Comparing this result with the previous two results (Figure 4.25, Figure 4.27) reveals that Brian’s EIM remained the same in all four categories, while both Joon’s and Mana’s EIMs in CONTENT changed from level 3 to level 2. These changes reflect the fact that neither Joon nor Mana commented on mechanistic features (e.g., water particles) in their evaluations of other groups’ consensus models of condensation. I doubt, however, that this means they no longer recognized the importance of including these features in a model. Rather, I suspect that as the mechanistic feature of water particles (“water vapor”) became pervasive, they began to attend to more specific features such as the amount of the air.

4.3. Summary

In the previous section of the chapter, I provided (1) a microgenetic analysis of the focus students’ EIMs and (2) an analysis of the ideas about modeling from some curriculum events preceding each modeling activity. I also discussed how these two were related for each modeling activity. In the present section, I summarize these findings in order to address my research questions. First, I provide a summary of how the three focus students’ EIMs changed over time with charts and a description. Then, I discuss in what ways some of the curriculum events affected the changes of the focus students’ EIMs.

4.3.1. How did the focus students’ EIMs change over time?

Figure 4.29, Figure 4.30, and Figure 4.31, presented below, summarize how the three focus students’ EIMs changed over time. In these charts, the following codes are used to signify the modeling activities analyzed in this study.
- M1: Constructing an initial model of evaporation
- M2: Constructing a second model of evaporation (by evaluating/revising the prior model)
- M3: Evaluating peers’ second models of evaporation
- M4: Constructing a third model of evaporation (by evaluating/revising the prior model)
- M5: Constructing a group consensus model of evaporation
- M6: Constructing an initial model of condensation
- M7: Constructing a second model of condensation (by evaluating/revising the prior model)
- M8: Constructing a group consensus model of condensation
- M9: Evaluating other groups’ consensus models of condensation

Figure 4.29. The change of Brian’s EIM over time (chart: levels 0–3 across M1–M9 for CONTENT, EXPLANATION, ACCURACY, and COMMUNICATION)

Figure 4.30. The change of Joon’s EIM over time (chart: levels 0–3 across M1–M9 for CONTENT, EXPLANATION, ACCURACY, and COMMUNICATION)

Figure 4.31. The change of Mana’s EIM over time (chart: levels 0–3 across M1–M9 for CONTENT, EXPLANATION, ACCURACY, and COMMUNICATION)

To highlight general patterns, I set aside the data from two modeling activities: constructing an initial model of condensation (M6) and evaluating other groups’ consensus models of condensation (M9). M6 was quite different from the modeling activities that preceded and followed it in that students did not have empirical data at that time. That unique situation generated distinct levels in M6. In M9, the last modeling activity, the focus students did not attend to the microscopic/theoretical entities that they had consistently attended to. I do not consider this a significant change because it is not reasonable to think that they suddenly lost their lasting attention to the feature without any apparent reason. Now, I describe and explain some general patterns in each category below.

4.3.1.1. CONTENT

Brian’s EIM in this category was at level 1 in M1 and then stayed at level 2 from M2 on. In M1, he did not attend to issues such as what kinds of features are to be included in a model and how extensively a model contains such features. But from M2 on, he kept deploying an epistemic idea that a good model accumulates what one has learned about its target phenomenon. This result was mainly due to his consistent attention to empirical data (e.g., percentage humidity, weight). Although this feature was relevant information from a school learning point of view, it was not a necessary feature of a scientific model.

Joon’s EIM was at level 1 in M1, at level 3 in the next three modeling activities (M2~M4), and at level 2 in most of the remaining modeling activities. Like Brian, Joon did not attend much to kinds of model features at the beginning. But from M2 to M4, he was aware that a good model includes exclusively those features necessary to describe and explain a target phenomenon. Of significance is that even when and after he saw that Brian had included empirical data in his second model of evaporation, Joon did not attend to the feature in his own modeling activities (M3, M4). But when Joon agreed to Brian’s proposal to include empirical data in their group consensus model because it was “just more detail” (M5), he came to have an epistemic idea of a model as a repository. He then continued to hold that idea to the end of the unit.

Mana’s EIM level fluctuated in the first four modeling activities (M1~M4) and then remained at 2 from M5 on. Mana started with an epistemic idea that a good model includes exclusively those features necessary to describe and explain a target phenomenon.
But when she saw that Brian had included empirical data in his second model of evaporation, she also acknowledged that idea of Brian’s as well as another epistemic idea that a good model contains many details. From M5 on, Mana shifted to the second idea as she acquiesced to Brian’s proposal to include empirical data in their consensus model. We can see that both Joon’s and Mana’s EIMs regressed in this category after they accepted Brian’s idea to include empirical data in a model. This is because the idea was also connected to other kinds of epistemic ideas, as we will see below in ACCURACY and COMMUNICATION.

4.3.1.2. EXPLANATION

All three students’ EIMs in this category were at level 3 most of the time. Very early on, they came to attend to how microscopic/theoretical entities behave over time as an explanatory feature. Behind this attention was their epistemic idea that a good model provides a scientific explanation, that is, an explanation that shows a hidden mechanism of how and why a phenomenon occurs.

4.3.1.3. ACCURACY

Brian’s EIM in this category was at level 2 in M1 but at level 2.5 from M2 on. The progression was largely due to his attention to empirical data as a source of accurate knowledge for a model. Behind this new concern was his epistemic idea that a good model is supported by empirical evidence, although he did not have a sophisticated understanding of what empirical evidence is and how to use it to secure the accuracy (validity) of a model.

Joon’s EIM in this category was at level 2 in the first half (M1~M4) and at level 2.5 in the second half (M5~M9, except M6). This progression took place when Joon accepted Brian’s idea to include empirical data in a model in M5. At that time, Joon also accepted Brian’s epistemic idea that a good model is supported by empirical evidence, along with Brian’s lack of a sophisticated understanding of what empirical evidence is and how to use it to secure the accuracy (validity) of a model.

Mana’s EIM level in this category was 2 in M1, fluctuated between 3 and 2.5 from M2 to M4, and was 2.5 from M5 to M9 (except M6). The shift of her EIM from level 2 (M1) to level 3 (M2) occurred because she revised her explanation to be consistent with empirical evidence. At that time, she came to have an epistemic idea that a good model is supported by empirical evidence, and she used empirical evidence in a sophisticated way. But she also acknowledged Brian’s idea of including empirical data in a model as a way of using empirical evidence (M3). From M5 on, however, she shifted from her more sophisticated way of using empirical evidence to Brian’s less sophisticated way of using empirical evidence for model accuracy.

4.3.1.4. COMMUNICATION

Brian’s EIM in this category was at level 1 in M1 but at level 2.5 from M2 on. Two factors were involved in this progression. First, from M2 on, Brian attended to communicative features (e.g., labels, sentences, key, colors), indicating an epistemic idea that a good model is clear. Second, Brian attended to including empirical data to make a model persuasive. Behind this attention was an epistemic idea that a good model is persuasive, although he did not know a more sophisticated way of making a model persuasive.

Joon’s EIM in this category was at level 1 in M1, at level 2 from M2 to M4, and at level 2.5 from M5 to the end (except M6). The first progression was due to Joon’s increased attention to communicative features, which indicates an epistemic idea that a good model is clear.
The second progression took place when Joon accepted Brian’s idea of including empirical data to make a model persuasive, as well as his epistemic idea that a good model is persuasive. At that time, like Brian, Joon also lacked knowledge of a more sophisticated way of making a model persuasive.

Mana’s EIM level in this category fluctuated between 2 and 2.5 from M1 to M4 and then stayed at 2.5 from M5 on (except M6). Note that she attended to communicative features from the beginning. In the first phase she only acknowledged Brian’s idea of including empirical data to make a model persuasive, as well as his epistemic idea that a good model is persuasive, but she later became committed to the feature as well as to Brian’s epistemic idea. At that time, Mana also lacked knowledge of a more sophisticated way of making a model persuasive.

4.3.2. In what ways did some of the curriculum events influence the change of the focus students’ EIMs?

From the charts and descriptions presented above, it becomes evident that the levels of the focus students’ EIMs were determined primarily by how they attended to three model features: communicative features (e.g., labels, sentences, colors, key), microscopic/theoretical entities (e.g., water particles, water-air particles), and empirical data (e.g., data of humidity, data of weight). Therefore, I focus my discussion of the relationship between some curriculum events and the focus students’ EIMs on these three features. I note to readers that the curriculum events mentioned below played one part in the changes of the focus students’ EIMs. I do not claim that they were the only factor involved in the ways the students’ EIMs changed over time.

4.3.2.1. Communicative features

Communicative features are related to the categories of CONTENT and COMMUNICATION. For students’ EIMs to be at level 2 or higher in both CONTENT and COMMUNICATION, they need to attend to these features at least. The charts and descriptions above show that both Brian and Joon began to attend to them in their second models of evaporation (M2), whereas Mana did so in her initial model of evaporation (M1).

The difference between the two boys’ and Mana’s levels of attention to communicative features in their initial models of evaporation can be partly explained by the fact that no emphasis was placed on these features, and only the communicative purpose of a model was touched on, when scientific models and how to construct a model were introduced (Class 2) (See Table 4.2).

What then helped Brian and Joon attend more to communicative features in their second models (M2)? In terms of ideas about modeling, no curriculum event that took place before they constructed their second models of evaporation explicitly highlighted communicative features. I suppose, though, that one event likely had an impact on this shift: sharing one’s initial model of evaporation with others (Class 2) (See 4.2.1.2). It is likely that when they presented their initial models of evaporation to others and saw others’ models after constructing their own, they became increasingly aware of the need to make their models clear so that others could understand them better. Furthermore, the social context in which they engaged in subsequent modeling activities likely established and reinforced their attention to communicative features.

4.3.2.2. Microscopic/theoretical entities

Students need to attend to microscopic/theoretical entities as an explanatory feature in a model for their EIMs to be at level 3 in EXPLANATION.
This feature is also related to the category of CONTENT. If a student includes this feature in her model and does not include any extra features, her EIM is at level 3 in CONTENT. But if she includes microscopic/theoretical entities and extra features in her model, her EIM in this category is at level 2.

The charts and descriptions presented above show that Mana displayed attention to the explanatory feature of microscopic/theoretical entities (e.g., water particles) from the beginning, whereas Brian and Joon did not attend to that feature until their second modeling activities. Given that this feature was not mentioned when scientific models and how to construct a model were presented in the student notebook (Class 2) (See Table 4.2) before their first modeling activity, Mana’s early attention to microscopic/theoretical entities indicates that she already had an idea that using this feature makes a good model.

How then did Brian and Joon come to pay attention to this feature in their second modeling activities? I would argue that their increased attention to microscopic/theoretical entities was made possible largely by Mrs. M’s active scaffolding of the focus students as they made sense of their empirical evidence about evaporation (Class 6) (See Table 4.8). More specifically, when Mrs. M helped Brian and Adrianna connect their ideas of “humid” and “dissolving” to the concept of water particles, it is likely that Brian and Joon became more aware of the need to show such microscopic/theoretical entities in their models to explain their target phenomenon. They continued to attend to this explanatory feature from their second models of evaporation onward, but it was when students watched computer simulations about state changes and collectively acted out water molecules in the school playground (Class 9) (See Table 4.20) that microscopic/theoretical entities officially became an explanatory feature.

Another interesting aspect of microscopic/theoretical entities as an explanatory feature is that all three students carried over this feature when they constructed their initial models of condensation (M6). While evaporation and condensation are closely related phenomena scientifically, it is not easy for elementary students to make use of the conceptual or epistemic resources they have about evaporation to explain condensation. Given that, it is remarkable that the focus students used the same explanatory feature (i.e., water particles) they had used to explain evaporation to explain a new phenomenon, namely, condensation. Mrs. M played a large role, among other factors, in helping this transfer take place. When students talked about how water drops form on a cold bottle (Class 14) (See Table 4.28), she, with three students, acted out how water particles in the air come to a cold bottle and stick to it. In addition, she reminded students of the computer simulations about state changes and their collective performance as water molecules, both of which showed how water molecules behave in order to explain how states change.

4.3.2.3. Empirical data

The feature of empirical data is related to the categories of CONTENT, ACCURACY, and COMMUNICATION. Epistemologically, empirical data plays an important role in generating valid scientific knowledge. From empirical data emerges some pattern (empirical evidence), and by empirical evidence explanatory ideas are tested. In this sense, it can be said that empirical data is an ultimate source of validity for scientific knowledge.
However, from a scientific perspective, empirical data is not an essential feature of a model. Because it is a very specific kind of information, it makes a model harder to use for multiple phenomena. Nevertheless, some students attended to this feature because they thought that empirical data, as a source of accurate (valid) knowledge, would make a model persuasive. For this reason, when students included empirical data, along with an explanation consistent with it, in their models, their EIMs were given level 2 in CONTENT and level 2.5 in both ACCURACY and COMMUNICATION.

To begin with, it should be emphasized that it was Brian who authored the idea, or act, of including empirical data in a model. Thus, I start by discussing how he came to attend to empirical data. Figure 4.29 shows that Brian paid no attention to the feature in constructing his initial model (M1) but attended to it from his second modeling activity (M2) onward, except in constructing his initial model of condensation (M6), when no such data was available. Which curriculum events were related to the emergence of his attention to empirical data? We need to note, again, that the idea of including empirical data in a model was uniquely Brian’s.

I would argue, however, that this new idea could emerge in part because the preceding curriculum events allowed such an idea to emerge. When empirical evidence was introduced (Class 3) (See Table 4.6), two ideas coexisted. One idea, provided by some passages in the student notebook, was that empirical evidence is something that tests an idea. The other, an idea Mrs. M provided, was that empirical evidence is information to be added to a model. Further, when students conducted experiments about evaporation (Class 3~Class 5), neither the student notebook nor Mrs. M clarified or specified what empirical evidence is and how to use it to improve a model (See Table 4.7). Some passages of the student notebook presented the idea that empirical evidence is used to reject a wrong idea. However, other ideas were possible as well. In particular, the student notebook seems to have tacitly advocated including students’ interpretations of empirical data in their models. In this ambiguity, Brian’s idiosyncratic idea of including empirical data itself in a model became possible. Not only was it welcomed; later, this idea became highly valued in this classroom. I discuss that below when I talk about Joon’s and Mana’s EIMs.

Figure 4.30 and Figure 4.31 show, respectively, that Joon did not acknowledge Brian’s idea of including empirical data in a model and that Mana took an ambiguous position toward the idea in their earlier modeling activities. I argue that their noncommittal position in this phase had to do with the ambiguity and openness with which empirical evidence was introduced (Class 3) (See Table 4.6) and with which Mrs. M and the student notebook guided how to use empirical evidence to improve a model (Class 3~5) (See Table 4.7), as noted above.

But in later modeling activities, both Joon and Mana became committed to the idea of including empirical data in a model. What helped this conversion happen? To be sure, Brian’s influence should not be overlooked. When the focus students constructed their consensus model of evaporation (M5), Brian persuaded Adrianna to include specific data of humidity by saying, “It's just more detail.” At that time, Joon and Mana likely became persuaded, too.
More importantly, though, Mrs. M had a large impact on the change in Joon’s and Mana’s positions toward this idea. When Mrs. M complimented, with added emphasis, the fact that the focus students had included “specific data” of humidity in their consensus model of evaporation (Class 12) (See Table 4.27), this way of using empirical evidence became officially approved. Considering that many students other than the focus students increasingly included this feature in their later models, it is reasonable to think that Joon and Mana, too, became committed to the idea of including empirical data at this moment.

Table 4.46 summarizes the various ways in which some of the curriculum events influenced the focus students’ attention to the three model features.

Table 4.46. Main model features that the focus students attended to and influential curriculum events
Communicative features:
- When scientific models were introduced, a communicative purpose of models was mentioned (Class 2).
- Students shared their initial models of evaporation with others after they constructed them (Class 2).
Microscopic/theoretical entities:
- Mrs. M helped the focus students shift from their vague ideas of “humid” and “dissolving” to the concept of water particles (Class 6).
- Students watched computer simulations about state changes and, as a class, acted out water molecules in the school playground (Class 9).
- Mrs. M showed how water particles in the air come and stick to a cold bottle by acting it out with students, and she reminded students of the computer simulations of state changes and the class’s collective performance of how water molecules behave that they had experienced before (Class 14).
Empirical data:
- When the student notebook and Mrs. M introduced empirical evidence (Class 3) and guided how to use it to improve a model (Class 3~Class 5), they allowed various ideas of empirical evidence and of using it for model improvement to occur. In this context, Brian created the idea of including empirical data as a way of using empirical evidence to improve a model.
- In evaluating the focus students’ group consensus model of evaporation (Class 12), Mrs. M emphatically confirmed the idea of including empirical data in a model as a way of using empirical evidence to improve a model.

CHAPTER 5. DISCUSSION AND CONCLUSION

In the previous chapter, I showed how the focus students’ EIMs changed over time and the ways in which some of the curriculum events affected those changes. In the first section of the present chapter (5.1), I discuss these findings against the goal of advancing students’ epistemologies outlined in Chapter 1. I discuss whether and how the focus students’ EIMs developed over time and what roles the curriculum and instruction played in this process. Following this discussion, I also discuss contributions as well as limitations of the present study and make some suggestions for future work in the second section of the chapter (5.2).

5.1. Discussion

In this section of the chapter, I use the two findings presented in Chapter 4 to discuss the development of students’ EIMs and the roles of curriculum and instruction in it.

5.1.1. Did the focus students’ EIMs develop over time?

In Chapter 1, I argued that an overarching goal for a practice-based approach to K-12 science education should be the advancement of science learners’ epistemic agency, and I emphasized the development of their epistemologies as a particular objective.
I now use this objective to discuss the findings about the three focus students’ EIMs presented in the prior chapter. Here is the question that guides the following discussion: did their EIMs become more advanced as a result of engaging in scientific modeling? I address this question below by discussing four epistemic ideas that students had about modeling, with a focus on the three main model features mentioned in the previous chapter.

First, the analyses presented in the previous chapter show that students continued to attend to making a model clear throughout the unit implementation. In terms of specific model components, they paid persistent attention to communicative features such as labels, sentences, and key. These features cannot be said to be particularly scientific. Nor is students’ attention to clarity an indicator of advanced epistemology. Students attend to this aspect in many everyday and school tasks. In relation to school work, I posit that their attention to clarity is associated with their interest in receiving positive evaluations from teachers and other students. Students learn through their experiences that they need to communicate their ideas clearly in various school contexts (e.g., in a test, when answering a teacher’s question in class, in presenting their work to a whole class) to gain compliments or good scores. For this reason, I conclude that students’ epistemic ideas regarding communicative features did not develop noticeably.

Second, analyses of the focus students’ EIMs and the class-shared EIM (see Appendix A: A.3 and A.5) indicate that Mrs. M’s students had a particular idea about modeling early on and continued to hold it to the end of the unit: one needs to include all the relevant information that he or she has acquired about a target phenomenon in a model to make a good model. Although the analysis results for Brian’s and Joon’s EIMs show that they began to activate this idea from their second modeling activity, I hypothesize that this is an idea that elementary students, including Brian and Joon, often utilize in creating school artifacts subject to assessment. In doing so, students seek to prove to their teacher that they have learned a lot about a given subject in order to get a good score. Epistemologically, this idea values specificity of information. Useful as this emphasis may be in typical school science, it is not always consistent with what scientists value in their modeling activities. While scientists attend to specificity in the key scientific ideas to be included in a model, they also try to make a model parsimonious and generic. Underlying this tendency is scientists’ shared goal of producing scientific knowledge that can be applied broadly. Given this explanation, the analysis result that the focus students’ attention to including “details” in a model remained the same cannot be taken to indicate that their EIMs in this category developed.

The idea that including details makes a good model seems to have allowed students to attend to two features: microscopic/theoretical entities and empirical data. From the students’ point of view, these features were part of what they had learned about evaporation and condensation during the implementation of the unit. But I would argue that students viewed these features not merely as “details” but in association with other epistemic ideas as well.

How then did students view microscopic/theoretical entities, other than as part of the details to be included in a model?
Students have a general idea that one purpose of a scientific model is to explain something. But they do not know what it means to explain something scientifically. For this reason, both Brian and Joon showed simple explanatory ideas in their initial models of evaporation. But when Mrs. M helped the focus students replace their vague notions of “humid” and “dissolving” with the more scientifically meaningful idea of very small particles of water, they not only learned of this new feature (in fact, they did not learn substantially about it at this time) but also came to note that using such a feature is an essential part of scientific explanation. Moreover, their understanding of microscopic/theoretical entities as an explanatory feature became increasingly sophisticated as they connected these entities to empirical evidence. For example, in constructing their second models of evaporation, both Joon and Mana changed their initial models of evaporation by showing that water-air particles (Joon) or water particles (Mana) spread out in the air after evaporation happens, in order to reflect the empirical evidence they had at that time. Additionally, when the focus students all became committed to including empirical data in a model, they attended to making the way the number of water particles in the air changes over time consistent with the empirical data. All these instances indicate that their epistemic ideas about microscopic/theoretical entities as an explanatory feature became increasingly sophisticated.

Next, empirical data is another feature about which students had multiple epistemic ideas. First of all, as discussed earlier, they saw this feature as part of what they had learned. This idea was clearly manifested when Brian described it as “just more detail.” However, that was not the only idea with which students associated empirical data epistemologically. Several instances indicate that they thought of empirical data as a feature that makes a model accurate (valid) and persuasive. For instance, when Mana included the data of humidity and weight in her second model of condensation, she used these features as an authoritative source for her new understanding that water particles in the air do not increase (her explanation before conducting the experiments about condensation) but decrease over time. Brian also showed his intention to base his second model of condensation on the results of the experiments about condensation. Unlike the others, and despite Mrs. M’s caution not to make a model too complicated, he showed in this model how humidity in the air decreases and then increases and how the weight of a bottle increases and then decreases, as if to say, “This is what really happened.” In addition, the other focus students and, later, another group of students accepted this idea quickly. All of this shows that students saw and used empirical data as a feature that provides a model with accuracy and persuasive efficacy. To be sure, this is not a very sophisticated epistemic idea from a scientific perspective. Students did not go as far as to attend to the epistemic relation between empirical evidence and explanatory ideas: they had vague understandings of what empirical evidence is and how to use it to make their explanations valid and persuasive. Nor did they enter into the practice of argumentation to critique or defend their ideas based on empirical evidence.
However, using empirical data to make a model accurate and persuasive is not an idea or act easily found in typical science classrooms. I therefore argue that students’ epistemic ideas about empirical data developed, albeit modestly, in that they became increasingly attentive to the accuracy (validity) and persuasive efficacy of a model.

From these discussions, I conclude that, overall, the three focus students’ EIMs made modest progress epistemologically. Figure 5.1 shows this visually. Before explaining the diagram in it, some concepts need to be introduced briefly. For this conceptualization, I draw on the concepts of epistemological resource and epistemological frame (Hammer & Elby, 2002; Hammer, Elby, Scherr, & Redish, 2005; Redish, 2004). As an alternative framework for students’ epistemologies, Hammer and Elby (2002) proposed a resources-based view. They first challenged previous views of students’ epistemologies (“epistemological theory” and “personality traits”) because their common assumption that each student has a unitary and consistent epistemology cannot explain variance in the same student’s epistemology across different domains and contexts. Hammer and Elby then argued that their alternative view, in which students’ epistemologies consist of multiple finer-grained epistemological resources, provides a better account of such variance. Later, Hammer et al. (2005) expanded the resources framework by incorporating Redish’s (2004) concept of “epistemological frame” to refer to a locally coherent set of epistemological resources.

Using “epistemological resource” and “epistemological frame,” I now explicate the diagram in Figure 5.1. I argued above that the three main model features that the focus students attended to (communicative features, microscopic/theoretical entities, and empirical data) were connected to different epistemic ideas about modeling. To summarize what I discussed above:

- The students’ attention to communicative features was related to an epistemic idea that (1) a good model is clear.
- The students’ attention to microscopic/theoretical entities was related to two epistemic ideas: (2) a good model contains many details and (3) a good model provides a scientific explanation.
- The students’ attention to empirical evidence was related to two epistemic ideas: (2) a good model contains many details and (4) a good model is valid and persuasive.

From a resources-based perspective, the four epistemic ideas mentioned here are considered epistemological resources about modeling. I theorize that epistemic ideas (1) and (2) are components of an epistemic frame about modeling (Frame 1) that sees modeling as accumulating information that one has learned (“details”) about a subject in a model. On the other hand, epistemic ideas (1), (3), and (4) constitute a different epistemic frame about modeling (Frame 2). In this frame, modeling is viewed as constructing a valid scientific explanation of a target phenomenon using a model. What I argued above in addressing the question raised earlier (did the focus students’ EIMs become more advanced as a result of engaging in scientific modeling?) is that the focus students generally began with and continued to hold epistemic ideas about modeling that constitute a typical schooling frame of modeling (Frame 1), but soon developed new epistemic ideas about modeling that constitute a more authentic and productive frame of modeling (Frame 2). This can be taken to indicate that their EIMs became increasingly sophisticated.
I view Frame 2 as less sophisticated than Frame 3, in which modeling is seen as generating general scientific knowledge using a model and which consists of an advanced epistemic idea that (5) a good model is generic and parsimonious, in addition to ideas (1), (3), and (4). I argue, however, that given Mrs. M’s relatively high competence in teaching science and the three focus students’ high levels of engagement in modeling and middle to high academic performance, it is a reasonable and worthwhile goal for fifth-grade students to develop epistemic ideas about modeling that constitute Frame 2, that is, to be able to construct a scientific explanation supported by empirical data in a model as a result of participating in a model-based unit.

Figure 5.1. Development of the focus students’ EIMs in a learning progression framework for scientific modeling (frame of modeling)
The diagram arranges three frames of modeling along a progression and lists the epistemic ideas that constitute each frame; the focus students’ EIMs, organized around the three model features (communicative features, microscopic/theoretical entities, empirical data), are placed within this progression, and the lettered notes (A~E) indicate the events through which the associated epistemic ideas were established.
- Frame 1: Modeling is to accumulate information in a model. Ideas: (1) a good model is clear; (2) a good model contains many details.
- Frame 2: Modeling is to construct a valid scientific explanation using a model. Ideas: (1) a good model is clear; (3) a good model provides a scientific explanation; (4) a good model is valid and persuasive.
- Frame 3: Modeling is to generate general scientific knowledge using a model. Ideas: (1) a good model is clear; (3) a good model provides a scientific explanation; (4) a good model is valid and persuasive; (5) a good model is generic and parsimonious.
Notes:
A. By default, students had this idea. As they shared their initial models of evaporation with others (Class 2), this idea became established.
B. By default, students had this idea. Mrs. M and Ms. H emphasized this throughout the unit.
C. Mrs. M introduced empirical evidence as something to be included in a model (Class 3). Brian activated this idea when he began to include empirical data in his model (Class 6~7).
D. Mrs. M helped the focus students attend to the concept of water particles (Class 6). Students watched computer simulations about state changes and collectively performed water molecules in the playground (Class 9). Mrs. M showed how water particles in the air come and stick to a cold bottle by acting it out with students (Class 14).
E. The student notebook and Mrs. M allowed various ideas of empirical evidence and of using it for model improvement (Class 3~5). Mrs. M encouraged including empirical data in a model (Class 7). In evaluating the focus students’ group consensus model of evaporation (Class 12), Mrs. M emphatically confirmed the idea of including empirical data in a model as a way of using empirical evidence to improve a model.

Based on this analysis, I propose a general mechanism by which students’ EIMs evolve over time, as follows. When students begin to engage in scientific modeling, they may attend to particular model features in association with preexisting, nascent epistemic ideas about modeling. But as they further experience scientific modeling that provides epistemologically more advanced contexts in which those features are used and viewed, they begin to associate those features with more advanced epistemic ideas about modeling. As students’ engagement in scientific modeling accumulates, the new association becomes increasingly strong while the old one weakens, to the point where only the new association remains.
Finally, they integrate various advanced epistemic ideas about modeling into an epistemologically developed frame of modeling.

5.1.2. Roles of the curriculum and instruction

In the previous chapter, I showed the various ways that some curriculum events affected the changes in the focus students’ EIMs (Table 4.46). To relate this result to the discussion above, Mrs. M and some components of the designed curriculum played particular roles in helping the focus students’ EIMs become epistemologically more sophisticated. Mrs. M’s instruction and the computer simulations about state changes seem to have had a direct influence on the students’ increased attention to microscopic/theoretical entities as an explanatory feature. By contrast, the role of Mrs. M’s instruction and the student notebook in the focus students’ growing attention to empirical data seemed somewhat indirect. When Brian crafted the idea of including empirical data in a model to make it better, the student notebook and Mrs. M only provided a context in which such a unique idea could emerge. Later, Mrs. M used her authoritative status as a teacher to help this idea gain public recognition and thus be distributed among students.

Although the curriculum and instruction assisted in the development of the focus students’ EIMs to some extent, there were some shortcomings in their roles as well. First of all, both Mrs. M and Ms. H, her intern teacher, frequently emphasized an epistemic idea that belongs to the traditional schoolwork frame of modeling (Frame 1): including details makes a good model. When empirical evidence was introduced (Class 3), Mrs. M described empirical evidence as a kind of information to be included in a model. Also, in introducing the activity of evaluating other models, she emphasized including what students had learned and features like specific data (Class 7). She later distanced herself from this idea, however, as evidenced by her later utterances about taking nonscientific or irrelevant features out of a model (Class 11, Class 15, Class 19, and Class 20). In contrast, Ms. H continued to stress including what students had learned in their models (Class 10, Class 19). The teachers’ emphasis on including details in a model simultaneously reveals that this idea was already in place in this classroom and explains, in part, why the focus students continued to hold this epistemic idea throughout the unit.

Second, and more importantly, when Mrs. M did help the focus students’ EIMs become more sophisticated, the way she provided such assistance was limited in that it did not allow the focus students to critically reflect on and understand features such as microscopic/theoretical entities and empirical data. For example, when Mrs. M assisted the focus students in making sense of their empirical evidence about evaporation, she helped them move from their vague notions of humidness and dissolving to the more scientific concept of water particles. But at that time she did not offer them an opportunity to think about why using water particles is a better explanation than saying that it is humid or that water dissolves. Such discussion could also have led to a deeper discussion of, for example, what kinds of explanations are valued in science and why that is the case.
Likewise, when Brian’s idea of including empirical data in a model surfaced, it could have been more instructive to ask him for his rationale and to lead the class to discuss larger issues such as how the accuracy (validity) of a model and the ideas in it can be secured scientifically and what role empirical evidence plays in that. One of the reasons I propose this way of approaching features like microscopic/theoretical entities and empirical data has to do with my premise that, to learn a new social practice effectively, one needs to carry out multiple activities that constitute that practice and to have timely meta-level discussions with experienced members of a community of that practice in order to reflect on and make sense of the activities one has conducted. Issues such as when and how often such discussions should take place remain to be further investigated. But based on the present research, I suppose that when multiple scientific practices (e.g., modeling and explanation, modeling and argumentation) are connected, such meta-level discussions would help students make sense of and engage in those practices.

5.2. Contributions, limitations, and suggestions for future work

There are some ways in which this study contributes to research on students’ epistemologies about modeling (Gobert & Pallant, 2004; Saari, 2003; Schwarz & White, 2005). In what follows, I discuss some of these contributions. After that, I also comment on some limitations of my work and suggestions for future research.

First, this study provides some insight into elementary students’ epistemologies about modeling. Most previous studies on students’ understandings of models and modeling targeted middle and high school students as subjects of investigation. The findings provided by this study can expand our understanding of K-12 students’ epistemologies about modeling by adding an analysis of elementary students’ epistemologies about modeling.

Second, and more importantly, the work presented here addresses some conceptual as well as methodological shortcomings of prior studies on students’ understandings of models and modeling (Gobert & Pallant, 2004; Saari, 2003; Schwarz & White, 2005). All of these studies assumed that students’ epistemologies are coherent across different contexts; that is, the epistemologies students manifest when they reflect on the practice of scientific modeling in written assessments or interviews and the epistemologies that guide their modeling are essentially the same. But this assumption has been challenged by a body of theoretical and empirical studies that argue for the dependency of students’ epistemologies on contexts (Hammer & Elby, 2002; Leach, Millar, Ryder, & Séré, 2000; Sandoval & Morrison, 2003). Aware of this, the present study aimed at providing findings about the epistemologies that guide students’ modeling.

One of the findings of this study is that when elementary students engage in scientific modeling in class, they start with some of the epistemic ideas that they have used in doing typical school work. I identified two such ideas in this study: accumulating all the information they have acquired about a subject in a school artifact and communicating information clearly in such an artifact. I conjecture that this pattern can be found not only in the class I investigated but across various elementary classrooms, because I believe such classrooms share a fairly similar historically established sociocultural world, or what Holland and colleagues call a “figured world” (Holland, et al., 1998).
The present study also showed how elementary students’ epistemologies about modeling develop over time as they participate in a model-based inquiry unit. More particularly, I showed that students increasingly attended to two more advanced epistemic ideas, namely providing a scientific (mechanistic) explanation in a model and making a model accurate (valid) and persuasive, although their previous epistemic ideas were still in place.

The findings provided by this study can be used as information about younger students’ epistemologies about modeling in developing a learning progression for K-12 students’ modeling practice (cf. Schwarz, et al., 2009). First, the finding mentioned above offers an idea about younger students’ common epistemic ideas about modeling. Learning progression research is interested less in variance between individual students’ thinking and practices than in common patterns that can be found across their thinking and practices. The finding of this study that elementary students began with previously shared epistemic ideas suggests that, in developing a learning progression for scientific modeling, we need to pay more attention to the concepts, epistemologies, and practices elementary students share as a result of sharing a homogeneous figured world. Second, the second finding of the present study provides insight into a mechanism by which elementary students’ epistemologies about modeling progress from one level to a higher level. Learning progression researchers have noted the need to gather empirical evidence of the intermediary states through which students’ knowledge, epistemology, and practices progress from one level to the next higher level (Gotwals & Songer, 2009). The present study showed an intermediary state that elementary students passed through as they developed their epistemologies about modeling. Though this is just one case, it provides insight into the trajectory by which students’ epistemologies about modeling become increasingly sophisticated and proposes a mechanism for how this progression happens.

In addition to these contributions, the present study also contributes to research on students’ epistemologies about modeling by providing some ideas about curriculum development and instruction for model-based science learning. Consistent with prior work (Gobert & Pallant, 2004; Saari, 2003; Schwarz & White, 2005), the present study found a model-based inquiry curriculum unit to be effective, to some extent, in developing students’ epistemologies about modeling. More specifically, it showed how three students developed two epistemologically more sophisticated ideas about modeling, that is, epistemic ideas of explaining a phenomenon scientifically using microscopic/theoretical entities and of making a model valid and persuasive using empirical evidence. An additional contribution of this study is that it showed more carefully the ways in which some features of the curriculum and instruction helped foster the emergence of these epistemic ideas. The students’ increased attention to microscopic/theoretical entities as an explanatory feature owed much to Mrs. M’s repeated scaffolding, although computer simulations and the class experience played a part as well. The students’ growing attention to including empirical data in a model as a feature that makes it valid and persuasive arose in the confluence of the open and ambiguous introduction of empirical evidence by the curriculum and Mrs. M; Brian’s proposal of this idea; and
Mrs. M’s public confirmation of the idea. This finding highlights the importance of both the predictable and the contingent dimensions of curriculum implementation and of the teacher’s role in the process.

There are some limitations to the present study. First, this study has some methodological limitations. I focused my analysis on the three students’ models and not as much on their discourse. One reason is simply that I did not have a sufficient amount of discourse data that could be used to address my research questions. This in turn is related to additional factors. For one, neither the curriculum unit nor the teachers paid sufficient attention to probing students’ thinking about their own modeling activities. In particular, when students constructed their models individually (five modeling activities), their rationale behind their modeling activities was barely inquired about. Moreover, even in the modeling activities intended to provide social contexts and thereby elicit students’ ideas about modeling (evaluating others’ models, building a consensus model), the focus students’ utterances were terse, superficial, and focused on procedural aspects of those activities. For these reasons, I had to focus on their models more than on their discourse. This situation required me to make a high level of interpretation when analyzing the students’ models. This does not mean that my interpretations are necessarily invalid. To increase the validity of my interpretations, I looked at secondary data such as the focus students’ interviews. Nevertheless, it would be better to think, at the stage of curriculum development, about ways to collect students’ ideas about their modeling activities as they do them.

Second, the finding of this study about the role of the curriculum and instruction allows us to see some areas that should be further considered in curriculum development and instruction for this kind of intervention research aimed at fostering students’ epistemologies. One of them is that a teacher’s role as an authoritative figure in class was more emphasized than disciplinary norms in the development of the students’ epistemologies about modeling. For future work, I propose placing more emphasis on a teacher’s role as a guide or “culture broker” (Aikenhead, 1996) who helps students shift their thinking, epistemologies, and practices from those of conventional school science to those of a scientifically more rigorous and productive school science. In particular, I argue that teachers need to help students articulate the thoughts and rationale underlying their engagement in scientific practices and see the differences between their thoughts and scientific epistemic ideas, as well as the reasons why the latter are valued in scientific practices. This should be considered in developing a model-based (or practice-based) curriculum unit and in providing professional development to teachers. Teachers and students, first of all, will benefit from such an approach. But additionally, as I discussed earlier, this approach will help researchers understand students’ practical epistemologies more clearly, that is, what students think about the scientific practices they engage in and how that thinking changes over time.

This study provides some suggestions for future work. First, it raises questions about student diversity and equity in students’ engagement in scientific modeling and other scientific practices. In this study, I only touched on some aspects of this issue in passing.
In particular, when I documented how the focus students constructed their consensus models of evaporation and condensation, I noted that Joon, a Korean and English-learning boy, and Mana, a reserved Indian-American girl, did not participate in the processes as actively as Adrianna and Brian, both Caucasian, did. I suspect that their uneven participation, along with other factors, is one reason that Brian's ideas of including empirical data in a model (for their consensus model of evaporation) and of showing that humidity decreases and increases and that weight decreases and increases (for their consensus model of condensation) were taken up by the group. Although there are some studies that focused on students' unequal participation in the contexts of scientific inquiry (Cornelius & Herrenkohl, 2004; Kurth, Anderson, & Palincsar, 2002), I argue that more attention needs to be given to this issue in future research on students' epistemologies about modeling, and more generally in research on the practice-based approach to science learning. Another suggestion for future research on the practice-based approach to science education is related to the goal of fostering epistemic agency that I outlined for practice-based science education in Chapter 1. For students to become capable and productive epistemic agents for their present and future lifeworlds, they need to be prepared in various areas. The present study focused on only one dimension—epistemology—of epistemic agency and showed that the three focus students made modest progress in this dimension. But what about the other dimensions of epistemic agency? For example, did they become more sophisticated in working with others to produce scientific knowledge? Was there any change in the roles they took and in the ways they took those roles over time? Were they increasingly aware of the goal of engaging in scientific modeling? Did they use more advanced metacognition to manage the knowledge-generating process? To address questions like these, I argue that coordination among diverse conceptual frameworks and research programs is necessary in future research. In this way, we can develop richer understandings of how students' epistemic agency develops as they engage in scientific practices and suggest ways in which practice-based science education can be more beneficial to students.

5.3. A concluding remark

In this study, I presented a microgenetic analysis of how three elementary students' EIMs changed over time and of how curriculum events influenced those changes, with the aims of providing an empirical analysis that supports the currently active reform agenda focused on student engagement in scientific practices and of suggesting some ideas to improve such an approach in the areas of research, curriculum development, and instruction. Of course, more work needs to be done to make this new agenda provide substantial benefits to students both for their present science learning and for the future lives they will live as citizens in societies that will be increasingly populated with complicated, controversial socioscientific issues. I only hope that this work contributes to the larger effort to help students become more active and capable epistemic agents by learning science from engaging in scientific practices.

APPENDICES

APPENDIX A. Additional Analysis of Ideas about Modeling From Curriculum Events

In this section, I present my documentation and analysis of additional curriculum events.
Although I think that these events were related in some ways to the focus students' EIMs, I set them apart here because their influence was minimal or indirect compared to the influence of the curriculum events included in Chapter 4.

A.1. "What (do you think) are scientific models?" (Before the unit began)

Before the curriculum unit was launched, the class had a session in which students brought up what they thought were examples of scientific models, discussed some of those examples as a class, and individually wrote down what they thought scientific models are. (This session was not videotaped, but almost all the students wrote down pretty detailed notes in their science journals; the following discussion is based on analysis of these notes.) Analysis of their ideas about scientific models from their notes generates the following result.

Table A.1. Top five dominant ideas about models found in students' notes about what scientific models are (N=22)

1. Models provide some information (e.g., "Models explain, show, or tell…"; "Models help us know, understand, learn, and study...") — 18 (81.8%); prominent idea (*) in two dimensions.
   With respect to the kinds of information about their targets:
   - Unspecified — 13 (59.1%)
   - Some aspects of their targets — 9 (40.9%)
     - What they are, what they look like — 5 (22.7%)
     - How they work — 4 (18.2%)
     - What they do, how they are used — 3 (13.6%)
     - What they are made of — 3 (13.6%)
2. Models' targets are real things or things in the real world — 12 (54.5%)
3. Models are smaller or larger versions of their targets — 10 (45.5%); tacit idea ((*))
4. Models' targets are invisible or hardly visible things — 8 (36.4%); prominent idea (*)

Notes: [1] CN: CONTENT, EX: EXPLANATION, AC: ACCURACY, CM: COMMUNICATION. [2] *: prominent idea(s); (*): tacit idea(s).

It is important to remember that these ideas about scientific models were generated in the context of reflecting on scientific models generally ("What are scientific models?") and not in the context of engaging in modeling. It is no wonder, then, that their common ideas about models here are not the same as their common metamodeling ideas captured in their modeling practice, as will be shown later. Given the dependency of their EIMs on contexts, it is still interesting that these students' ideas about models were not completely idiosyncratic. First, it is noticeable that the majority of students viewed models as an authentic epistemic tool rather than as a typical schooling tool. Combining #1 and #2, we can note that they thought that a primary function of models is to provide new information or generate knowledge about something out there in the world. Considering further #3 and #4, a number of students believed that they cannot easily come by information of this sort in ordinary ways, implying that scientific models are associated with particular (e.g., scientific) epistemic processes. Second, most of them did not specify what kinds of information models provide about their targets. In addition, some students' ideas about the kinds of information show that they did not pay particular attention to scientific kinds of information such as mechanism and physical causes.

A.2. Explaining evaporation (Class 3)

One of the curriculum events that influenced the students' EIMs was the class conversation, led by Mrs. M, to explain evaporation. This conversation took place, rather contingently, as Mrs. M went over with the students what they had done in the previous session.
Specifically, it was triggered when she asked a question: "What happens to the water when it comes in contact with the air?" When a student (Iona) said that it evaporates, she pushed the whole class to go beyond such a scientific-sounding but "cookie-cutter" term and to elaborate what it means. This invited a range of different ideas, and the conversation lasted for some time (about 6 minutes). The following excerpt shows the main part of the conversation.

1  Pallavi:  Umm, it's when water goes (up in the air).
2  Mrs. M:   It's when water just kind of goes up in the air? Who really likes to say it's, "Well, ahhh I don't know."…Is it when water goes up in the air? Do you agree with Pallavi? Yes or no?
…
3  Adrianna: Because it's in the air and it dissolves into little, maybe, it like evaporates into the air…it kind of like dissolves into the air and smaller, it's like, smaller pieces like, small drops of water, very small.
4  Mrs. M:   Okay, smaller drops of water, very small? Oh, look at him, look at Mr. Scientist back there. Jonas!
5  Jonas:    Well when it evaporates, it turns into water vapor. And when it turns into water vapor when it gets hot and molecules want to spread out…to cool off…into the air… but when it's cold, it bunches up to warm.
…
6  Mrs. M:   …Alright! Would you please make sure you write these ideas of yours down?
…
7  Glynn:    I was gonna say (?) you said but the water molecules split into hydro, one hydrogen and two oxygens.
8  ?:        How does he know?
9  Mrs. M:   [Echoing but more loudly] But how do we KNOW?
…
10 Brian:    I think the water kind of, when they, it dissolves into a gaseous form, water vapor and since it's warmer than they're colder it's kind of like pushed up into the air… and the cold air pushes it up 'cause cold air sinks…It condensates into a cloud or something.

The conversation provided several ideas about modeling that potentially influenced students' EIMs. To analyze the ideas about modeling properly, it is important to notice Mrs. M's subtly different attitudes toward individual students' responses, because the students likely weighed her attitudes when they decoded the messages about modeling this conversation communicated. When Pallavi came up with the idea of water going up in the air (line 1), Mrs. M asked four different questions that as a whole signaled her dissatisfaction with the idea. Glynn's scientific-sounding yet excessively detailed account, too, faced her suspicion when Mrs. M echoed another student's challenging question (lines 8, 9). By contrast, her positive attitude toward Jonas and his ideas became apparent when she called him "Mr. Scientist" (line 4) and encouraged him to write down his ideas (line 6). To the remaining two students' (Adrianna, Brian) ideas, she was neutral. Given that, the ideas about modeling can be analyzed as follows.

Table A.2. Analysis of the ideas about modeling from the conversation on explaining evaporation

CONTENT (Level 2~3): Mrs. M pushed students to include more detailed ideas about evaporation than just writing "evaporation" or showing that water goes up in the air. Given that "evaporation" or water going up in the air is itself a scientific explanation, more detailed ideas than that would be scientific explanations as well.
EXPLANATION (Level 3): Adrianna, Jonas, and Brian all explained evaporation using small water particles (called variously "smaller pieces," "water vapor," and water "molecules"), indicating their attention to mechanism.
ACCURACY (Level 2): Students carried over ideas from previous sources. For example, the ideas of water going up (Pallavi) and of water going into clouds (Brian) may have come from the unit on the water cycle. The idea of dissolving (Adrianna, Brian) likely came from the unit on mixtures and solutions that Mrs. M and the students had finished just prior to the current unit. When they imported these ideas directly from their prior sources, they did not think about the accuracy of those ideas.
COMMUNICATION: N/A

A.3. "What makes a good model?" (Class 6, Class 7)

Before the students constructed their second models of evaporation, Mrs. M asked the whole class a question: "What makes a good model?" This was a timely and relevant question. It situated their next activity—construction of a second model of evaporation—within a larger process of making a good model. It helped the students engage in modeling with increased and expanded metamodeling awareness. By this time, their focus had been on empirical evidence as a criterion for evaluating and making a good model. The above question, however, pushed them to think about other criteria and ultimately to consider them when building their next models. The conversation following this question provides an insight into the contour of the epistemology about models and modeling that these students generally had at this point in time. In Class 7, Mrs. M had them go over what they had said in Class 6 about what makes a good model. At this time, however, she asked the question differently (emphasis added):

Who did not yet finish? Who still knows that there are things they need to add to it? I would say all of you. Why don't you go ahead and take five minutes to finish adding your details and what it is that you think belongs to your second model. Remember, what was some of the things that you said you'd include in the model?

As the words I emphasized ("add," "adding your details," and "include") illustrate, Mrs. M framed model revision as the addition of new features to the model. In the EIM coding scheme, this is a level-2 idea in CONTENT. Below are the ideas some of the students came up with in Class 6 and in Class 7. (As I combined them, I modified some of them minimally to make them clearer.)

- Showing what, how, why
- Details
- Clear labeling
- Symbols
- Sentences and phrases
- Organization
- Arrows
- Examples
- Pictures
- Color

These ideas show the epistemologies about modeling that the students as a class shared at this time. Analysis of them using the EIM coding scheme is presented in Table A.3. Two things are worth noting about these responses. First, most of them are features to include in a model rather than more general epistemic criteria. Second, none of them are explicitly related to the dimension of accuracy in general or, more specifically, to empirical evidence. Considering that the students had just finished conducting and discussing several empirical investigations about evaporation that had spanned four class sessions, this absence is striking and indicates that they did not think of empirical evidence as an epistemic criterion that secures the validity of a model.

Table A.3. Analysis of the ideas about modeling from the conversation on what makes a good model

CONTENT (Level 2): Students attended to including various features in a model. This was characterized by "details." No exclusive attention to scientifically essential features is found.
EXPLANATION (Level 1~3): The responses related to this dimension include "showing what, how, why," "sentences and phrases," and "arrows." But none of these specified the nature of explanation.
ACCURACY: N/A
COMMUNICATION (Level 2): Many features (e.g., clear labeling, symbols, organization, arrows, pictures, and color) belong to this category, indicating that the students had a lot of interest in this dimension of modeling. But they did not show interest in the persuasive efficacy of a model.

A.4. Guiding the activity of constructing a third model of evaporation (Class 10)

Class 10 was led by Mrs. M's intern, Ms. H. The first task Ms. H had students do was to evaluate their own second models of evaporation using a list of evaluative questions. Students then moved on to construct their third models of evaporation on the basis of their self-evaluations of their prior models. As students began to evaluate and revise their second models of evaporation, Ms. H read the following questions written on a board as guidelines for this activity. (I assigned numbers for quick reference.)

(1) Does it make sense?
(2) Does it explain what happened to water on the plate?
(3) Does it show where the water went?
(4) What kind of evidence from the other experiments are you using?
(5) Is it clear?
(6) Does it include all of these things that we have listed?

As a whole, these questions became an important epistemic resource that students could use in this activity. Analysis of these questions using the EIM coding scheme generates the following result.

Table A.4. Analysis of the ideas about modeling from questions that guided the activity of constructing a third model of evaporation

CONTENT (Level 2): Question (6) characteristically shows that these instructions as a whole emphasize the inclusion of various features. The features mentioned here are generally relevant.
EXPLANATION (Level 2~3): The first three questions can be considered to fall in this category. Of them, question (3) drew students' attention to the movement of water and therefore invokes scientific explanation (mechanism included).
ACCURACY (Level 2~3): Question (4) explicitly highlighted empirical evidence. But it did not specify what empirical evidence is or how to use it for a model.
COMMUNICATION (Level 2.5~3): Question (5) invoked students' attention to the communicative efficacy of a model. Question (4) had students attend to the persuasive efficacy of a model, although no particular way to secure it was specified.

A.5. The class conversations about the changes they made to their previous models of evaporation and about criteria for constructing a consensus model (Class 10)

After most of the students finished constructing their third models of evaporation and before they moved to consensus model building, Ms. H asked them to debrief the changes they had made to their previous models. The ensuing conversation became particularly rich as data due to the intervention of Mrs. Schwarz (a researcher). After a few ideas were shared, she interrupted to tell the class, "I'm wondering if you can tell us the reason you had for making those changes. If you added color, what was important about adding color or the labels?" This intervention turned out to be useful; Ms. H immediately began to ask this question of those who talked about their changes, and those students provided rationales that they had rarely expressed previously.
Their articulated reasons for their model changes provided a window through which we can examine the EIMs they had at this time. A similar conversation took place a little later—after Ms. H introduced consensus models and the activity of consensus model building. She asked them to share their ideas about things they would want to consider in constructing their consensus models. Although the questions that prompted the two conversations were different, neither Ms. H nor the students treated them as distinct. Therefore, I analyze the two conversations in combination below. But before I present the analysis, I first comment on the utterances Ms. H made in this process. It is true that in the above conversation it was the students who played the major role. But Ms. H did contribute to this conversation by maintaining it logistically, by commenting on some ideas, and at times by providing what she thought were reasons for students' model changes instead of asking for theirs. And some of her utterances conveyed ideas about modeling. There were a couple of ideas about modeling repeatedly found in her comments. First, she made numerous utterances about adding certain features to models. Certainly, she was not the only or main person who highlighted this. Many students also said that they had added such and such features and, as she reiterated their utterances, she used the word again. But later she began to use it more on her own initiative; for example, she asked, "What did someone else add?...What else did you add?" Most saliently, when she wrapped up this conversation, she summarized the nature and purpose of model change in the form of a question: "Okay, so was it important that we added all these new details and all these new things we learned and changed our model to show what we learned?" Analyzed using the EIM coding scheme, these ideas correspond to level 2 in CONTENT. Second, she kept mentioning the fact that many students had included the same features. For example, when color was presented as a newly added feature, she said, "Color is something that I've noticed a lot of people have added since their first model." Likewise, when a student talked about humidity for hot and cold water, she also stated, "That was something I saw a lot of you add because it was really good." Regardless of her intention, the repeated emphasis on the high number of students who included a certain feature might possibly send out a message that features shared by many models are good features and thus need to be included in a consensus model. A possible pitfall with this idea, however, is that it might promote compliance with the social norm or agreement without reasonable grounds. I now examine these two class conversations in combination to see the general contour of the students' EIMs. Table A.5 summarizes the features they came up with and their additional comments. (I modified their comments minimally for clarity.)

Table A.5. The class conversations about the changes students made to their previous models to construct their third models of evaporation and about the criteria they considered for constructing a consensus model

A. Labels
  a. They help you know what's what. (Iona)
  b. They make the model a little clearer to understand. (Ms. H)
  c. Without them, the model will confuse whoever's looking at it. (Dalia)
  d. They describe what you're showing, exactly what you're talking about. (Kyungho, Ms. H)
B. Color (for hot and cold water)
  a. We learned about it. (Meryl)
  b. If I see that one's red and one's blue, I'm going to know quickly that you're showing me hot water and cold water. (Ms. H)
  c. It would really catch peoples' eye so they would want to look at it. (Dalia)
  d. It shows more detail. (Brian)
  e. It is descriptive. (Iona)
C. Humidity (for hot and cold water)
  a. It is important because we learned about it. (Meryl, Glynn, Ms. H)
  b. That was something we talked about when we talked about the evidence. (Ms. H)
  c. They showed me how much humidity was in the hot and cold and why that was different. (Ms. H)
D. Moisture in the air
  a. It tells you that the water evaporated and now it's in the air. (Adrianna)
E. Words or sentences
  a. I put them to describe what's happening instead of having to have people guess what's going on. (Dalia)
  b. You can summarize in a quick sentence what's going on so they can look at it, understand it and just read one sentence that tells the most important parts in that picture. (Ms. H)
  c. They help the person studying your model understand it. (Emily)
F. Molecules (for hot and cold water)
  a. They show the difference between hot and cold (Kyungho, Ms. H)…that when it's hot, water molecules spread out and when it's cold, they get close together. (Lassie, Iona, Ms. H)
  b. They show not only that it evaporates but how it evaporates. (Glynn)
  c. They help show things. (Pallavi)
G. Time elapse
  a. It shows what happened over time. (Pallavi, Adrianna, Jonas)
H. Key
  a. It says what everything represents. (Dalia)
  b. By seeing your key, I will know exactly what you're showing me. (Ms. H)
  c. If you already have labels, you don't have to have a key. (Glynn, Dalia)
I. More details
  a. By adding more details, I explained better. (Lassie)

These conversations allow one to look into the EIM that Mrs. M's students as a class held at this time. Analysis of them provides the following result.

Table A.6. Analysis of the ideas about modeling from the class conversations about the changes they made to their previous models of evaporation and about criteria for constructing a consensus model

CONTENT (Level 2~3): Regarding features to be included in a model, some students highlighted "details" (B-d, I-a) and learning (B-a, C-a). This position was evidenced by their agreement to include empirical evidence such as humidity in a model (C). On the other hand, some students stressed "molecules," indicating their attention to scientifically essential features. Finally, many communicative features were attended to (A, B, E, H).
EXPLANATION (Level 3): Some students attended to showing how (water) molecules behave.
ACCURACY (Level 2~2.5): Some students advocated including humidity in a model. What they meant by humidity, however, may be some physical entity similar to moisture and different from water particles or the data of percentage humidity. (This feature was found in multiple groups' consensus models of evaporation.)
COMMUNICATION (Level 2.5): They had clear attention to the clarity of models. Although this concern can be detected in other ideas as well, it was explicitly expressed when they talked about clear communicative features such as labels (A), descriptive sentences (E), and a key (H). In addition, some students emphasized including humidity in a model.

A.6. The class evaluations of each group's consensus model of evaporation (Class 12~Class 14)

Next, I examine the students' evaluations of each group's consensus model of evaporation.
First, I present a summary of their evaluations in the following table. Note that I did not include the evaluations Mrs. M made on her own, but I did include the evaluations that Mrs. M and the students constructed together.

Table A.7. The class evaluations of each group's consensus model of evaporation (Class 12~Class 14)

A. "Humidity" or moisture in the air
  - A group drew yellow cloud-like figures (which they called "humidity") in the "during" scene. Glynn and Mrs. M argued that it should be represented accurately, consistent with the fact that as water evaporates, humidity in the air increases.
B. "Condensation" or water drops on the ceiling
  - A group drew water drops (which they called "condensation") on the ceiling of the room in "before" and "during," which Dalia and Mrs. M critiqued for being confusing.
C. Water particles in the air
  - Some students argued that the number of dots (representing water particles) should be represented accurately, consistent with the facts that as water evaporates humidity in the air increases and that hot water evaporates faster than cold water.
D. Arrows
  - Jin wished a group had arrows to show the direction where water goes.
  - Some students and Mrs. M argued that the number of arrows (representing the rate of evaporation) should be represented accurately, consistent with the fact that hot water evaporates faster than cold water.
E. Heat
  - A group included this feature for both hot and cold water. Some students argued that it is an incorrect idea.
F. Showing evaporation of hot and cold water
  - Pallavi complimented a group's model for this, which she framed as "things we've learned."
G. Key
  - A student and Mrs. M wished that their key were bolder and thus clearer.
H. Detail
  - Kyungho wished a group had more detail.
I. Sentences
  - Some students commented on this.
J. Before, during, and after
  - Adrianna gave a compliment on this.
K. "H2O"
  - Hien and Mrs. M complimented a group's model for having this scientific term.
L. Pictures
  - Yair complimented a group's model for having a lot of pictures in it.
M. Color
  - Britt critiqued a group's model for not having it.
N. Organization
  - A group critiqued their own model for not having it.
O. Labels
  - A group critiqued their own model for not having it.
P. Size
  - Lewis critiqued a group's model because they had made it small.
Q. Time
  - Mana complimented a group's model for having actual time as data (e.g., 9:00AM, 9:10AM, 9:30AM). But Mrs. M and some students pointed out that the time differences (e.g., 9:00AM to 9:10AM) are too short for the change in percent humidity the model showed (45% to 58%).
R. Window
  - Emily critiqued a group's model for having this, arguing that it could lead others to believe that it is a necessary feature, which it is not.
S. Irrelevant pictures
  - Brian critiqued a group's model for having such irrelevant pictures as an ice cube.

Analysis of these evaluations provides the following result.

Table A.8. Analysis of the ideas about modeling from the class evaluations of each group's consensus model of evaporation

CONTENT (Level 2~3): Students attended to various features. Some students framed them as details (H) or "things we've learned" (F) (level 2). Others attended to mechanistic features (level 3).
EXPLANATION (Level 3): Students assumed that the behavior of water particles is an important explanatory feature when they talked about the accuracy of the number of water particles.
ACCURACY (Level 2): Students attended to whether moisture in the air was represented accurately (A), whether water drops form on the ceiling of the room (B), whether the number of dots (representing "water vapor" or water molecules) was represented accurately (C), and whether the number of arrows (representing the rate of evaporation) was represented accurately (D). They neither referred to empirical evidence as a source of accuracy nor articulated the relationship between it and their knowledge.
COMMUNICATION (Level 2.5): Students attended to various communicative features such as a key (G), sentences (I), color (M), and labels (O). Students also attended to whether some features of a model accurately reflected empirical evidence or what they had learned from empirical investigations (A, F, Q).

APPENDIX B. Summaries of how the focus students' EIMs changed over time

This section provides summaries of how the three focus students' EIMs changed over time. This body of text as a whole provides more detailed information about Figure 4.29, Figure 4.30, and Figure 4.31 under the section "4.3. Summary" in Chapter 4.

B.1. A summary of how Brian's EIM changed over time (Figure 4.29)

CONTENT: He did not attend much to what kinds of features to include in his first model (M1), but he began to include communicative features (e.g., labels, key, sentences, colors), explanatory features (e.g., water particles), and empirical data (e.g., percentage humidity) in his second model of evaporation (M2) and continued to include these features to the end of the unit, except when constructing his initial model of condensation (M6).

EXPLANATION: In his initial model of evaporation (M1), Brian's explanation (upward movement of water) was general (level 1). Then, from his second modeling activity (M2) to his final modeling activity (M9) in the unit, Brian used water particles as an explanatory feature to explain the target phenomena (level 3).

ACCURACY: When he used the idea of upward movement of water for his first model (M1), Brian simply took for granted the accuracy of the idea (level 2). From his second modeling activity (M2) on, he attended to whether a model includes empirical data (e.g., data of humidity or of weight) as a source of accuracy (validity) (level 2.5), except in constructing his initial model of condensation (M6), when there was no available empirical data to include in that model (level 2).

COMMUNICATION: In constructing his first model of evaporation (M1), Brian displayed no intention to make his model clear or persuasive (level 1). However, when he constructed his second model of evaporation (M2), he paid attention to both aspects by including communicative features (e.g., labels, key, sentences, colors) and empirical data (e.g., data of humidity) in that model (level 2.5). Then he persistently showed his attention to these two features to the end, except in constructing his initial model of condensation (M6), when he did not include empirical data in the model because it was not available (level 2).

B.2. A summary of how Joon's EIM changed over time (Figure 4.30)

CONTENT: Joon's first model of evaporation (M1) was characteristically simple (level 1). When he constructed his second model of evaporation (M2), however, he included a mechanistic feature (e.g., water-air particles) and communicative features (e.g., labels, sentences) in the model, and did not include any extra feature (level 3). His attention to these two types of features continued to the end of the unit.
When he evaluated the other focus students' second models of evaporation (M3), he talked about mechanistic and communicative features. But he did not critique the data of humidity that Brian had included in his second model of evaporation (level 2), indicating his acknowledgement of Brian's way of using empirical evidence for a model, namely, including empirical data in a model. However, he was not fully committed to that idea around that time, as evidenced by the fact that he did not include the feature of empirical data in his third model of evaporation (M4, level 3). When the focus students constructed their consensus model of evaporation (M5), he agreed to Brian's proposal to include data of humidity in the model (level 2). After this modeling activity, Joon became committed to the idea of including empirical data in a model. Except in constructing his initial model of condensation (M6), when no empirical data was available, he continued to attend to empirical data.

EXPLANATION: He showed and wrote that water evaporates in his initial model of evaporation (M1), which was not a distinctively scientific explanation (level 1). In his second model of evaporation (M2), however, he used the mechanistic feature of water-air particles to explain evaporation (level 3). Since then, his attention to mechanistic features remained the same except in his last modeling activity (M9), when he did not comment on it for some unknown reason (level 2). One change, however, can be identified. In his second and third models of evaporation (M2, M4), he used water-air particles as a main explanatory feature. He replaced this feature with water particles in his later models. Most likely, he made this change when he worked with the other focus students to construct their consensus model of evaporation (M5), and especially when he found that the others had used water particles in their third models of evaporation.

ACCURACY: When Joon used the idea of water moving up for his initial model of evaporation (M1), he did not critically reflect on whether the idea or the source of the idea was accurate (level 2). In the next three modeling activities, he used the empirical evidence collected from experiments about evaporation, yet in a naïve fashion. In his second and third models of evaporation (M2, M4), he showed water-air particles. In evaluating others' second models of evaporation (M3), he talked about "moisture in the air." These features are related to, but not well supported by, the empirical evidence the students had (level 2). However, when the focus students constructed their consensus model of evaporation (M5), Joon agreed to Brian's idea to include data of humidity as a source of accuracy (validity) for the description and explanation of the model (level 2.5). After that, Joon was persistently attentive to empirical data as a source of accuracy (validity) for a model (level 2.5), except in constructing his initial model of condensation (M6), when no empirical evidence was available.

COMMUNICATION: He did not attend to the communicative or persuasive efficacy of a model when he constructed his initial model of evaporation (M1, level 1). But he began to attend to communicative features of a model in his next modeling activity (M2), and this attention lasted to his final modeling activity (M9). However, he did not attend to including persuasive features in a model until he and the other focus students constructed their consensus model of evaporation (M5).
As noted above, in this activity, he accepted Brian's idea of showing empirical data in a model to make the model persuasive to others (level 2.5). After that, his attention to empirical data as a feature contributing to the persuasive efficacy of a model was evident in the following modeling activities, except in his initial model of condensation (M6), when no empirical data was available.

B.3. A summary of how Mana's EIM changed over time (Figure 4.31)

CONTENT: In both her initial and second models of evaporation (M1, M2), Mana explained evaporation using mechanistic features such as water particles, made her model clear to understand, and included no extra features (level 3). When she evaluated the other focus students' second models of evaporation (M3), she attended to all the features she had included in her previous two models. In addition, however, she evaluated Brian's second model of evaporation positively for having data of percentage humidity, a feature that may be relevant but does not need to be in a model from a scientific perspective. This is why the level of her EIM in this dimension was downgraded to 2 in this modeling activity. But when she constructed her third model of evaporation (M4), she did not include data of humidity as Brian had done (level 3), indicating that she only acknowledged Brian's way of using empirical evidence for model improvement but did not embrace it for her own model. Although the focus students included data of humidity in their group consensus model of evaporation (M5, level 2), we do not know whether Mana indeed became more committed to the feature, because she remained silent throughout the process of constructing the model. However, she began to include empirical data in her second model of condensation (M7) and continued to attend to that feature to the end of the unit, except in constructing her initial model of condensation (M6), where no empirical data was available.

EXPLANATION: Mana continued to attend to water particles (called "water vapor") as an explanatory feature in all her modeling activities (level 3) except in evaluating other groups' consensus models of condensation (M9), when she did not mention it.

ACCURACY: In her initial model of evaporation (M1), she showed that water particles go into clouds, an idea she carried over from a prior source (e.g., a lesson on the water cycle). At that time, she took for granted the accuracy of this idea (level 2). But in her second model of evaporation (M2), she abandoned the idea of water particles going into clouds and instead incorporated a new idea that water particles spread out in the air. This change indicates that she used some of the empirical evidence collected from experiments about evaporation to improve her model in a sophisticated way: when she found that her previous idea was not supported by the evidence, she replaced it with a new idea that was consistent with the evidence (level 3). This sophisticated use of empirical evidence was also found in her third model of evaporation (M4, level 3). However, between these two modeling activities, when she evaluated others' second models of evaporation (M3), she did not display this sophisticated understanding of how to use evidence for model improvement. She commented only briefly on possible relations between some model features, some of the experiments conducted earlier, and empirical data (level 2.5).
In later modeling activities (M5~M9), she became more attentive to empirical evidence as a source that provides accuracy (validity) to a model, except in constructing her initial model of condensation (M6), when no such data was available.

COMMUNICATION: First of all, she continued to be attentive to the communicative efficacy of a model throughout all the modeling activities. Regarding the persuasiveness of a model, her attention fluctuated over time. In her first two modeling activities (M1, M2), she did not pay attention to that aspect (level 2). Although she showed some attention to the data of humidity Brian had included in his second model of evaporation as a feature to persuade others of the accuracy of his model (M3, level 2.5), she did not include that feature in her third model of evaporation (M4, level 2). Then, from the time when she and the other focus students constructed their consensus model of evaporation (M5) onward, she was consistently attentive to empirical data as a feature that makes a model persuasive, both in her models and in others' models (level 2.5), except when she constructed her initial model of condensation (M6, level 2), when she had no empirical evidence.

REFERENCES

Abd-El-Khalick, F., BouJaoude, S., Duschl, R., Lederman, N. G., Mamlok-Naaman, R., Hofstein, A., . . . Tuan, H.-l. (2004). Inquiry in science education: International perspectives. Science Education, 88(3), 397-419.
Abell, S. K. & Roth, M. (1995). Reflections on a fifth-grade life science lesson: Making sense of children's understanding of scientific models. International Journal of Science Education, 17(1), 59-74.
Aikenhead, G. S. (1996). Science education: Border crossing into the subculture of science. Studies in Science Education, 27, 1-52.
Alonzo, A. C. & Gotwals, A. W. (Eds.). (2012). Learning progressions in science: Current challenges and future directions. Boston, MA: Sense.
Baek, H., Schwarz, C. V., Chen, J., Hokayem, H. F., & Zhan, L. (2011). Engaging elementary students in scientific modeling: The MoDeLS 5th grade approach and findings. In M. S. Khine & I. M. Saleh (Eds.), Models and modeling: Cognitive tools for scientific enquiry (pp. 195-218). Dordrecht, The Netherlands: Springer.
Ballenger, C. (1997). Social identities, moral narratives, scientific argumentation: Science talk in a bilingual classroom. Language and Education, 11(1), 1-14. doi: 10.1080/09500789708666715
Bamberger, Y. M. & Davis, E. A. (2011). Middle-school science students' scientific modelling performances across content areas and within a learning progression. International Journal of Science Education, 1-26. doi: 10.1080/09500693.2011.624133
Barab, S. A. & Duffy, T. M. (2000). From practice fields to communities of practice. In D. H. Jonassen & S. M. Land (Eds.), Theoretical foundations of learning environments (pp. 25-55). Mahwah, NJ: L. Erlbaum Associates.
Barrow, L. (2006). A brief history of inquiry: From Dewey to standards. Journal of Science Teacher Education, 17(3), 265-278. doi: 10.1007/s10972-006-9008-5
Berland, L. K. (2008). Understanding the composite practice that forms when classrooms take up the practice of scientific argumentation (Doctoral dissertation). Retrieved from ProQuest Dissertations & Theses Database. (UMI No. 3331113)
Berland, L. K. & Reiser, B. J. (2009). Making sense of argumentation and explanation. Science Education, 93(1), 26-55.
Berland, L. K., Schwarz, C. V., Kenyon, L., & Reiser, B. J. (2013). Epistemologies in practice: Making scientific practices meaningful for students. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA.
Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32-42.
Carey, S. & Smith, C. (1993). On understanding the nature of scientific knowledge. Educational Psychologist, 28(3), 235-251. doi: 10.1207/s15326985ep2803_4
Chinn, C. A., Buckland, L. A., & Samarapungavan, A. L. A. (2011). Expanding the dimensions of epistemic cognition: Arguments from philosophy and psychology. Educational Psychologist, 46(3), 141-167. doi: 10.1080/00461520.2011.587722
Chinn, C. A. & Malhotra, B. A. (2002). Epistemologically authentic inquiry in schools: A theoretical framework for evaluating inquiry tasks. Science Education, 86(2), 175-218. doi: 10.1002/sce.10001
Clement, J. J. (2008a). Six levels of organization for curriculum design and teaching. In J. J. Clement & M. A. Rea-Ramirez (Eds.), Model based learning and instruction in science (pp. 255-272). Dordrecht, The Netherlands: Springer.
Clement, J. J. (2008b). Student/teacher co-construction of visualizable models in large group discussion. In J. J. Clement & M. A. Rea-Ramirez (Eds.), Model based learning and instruction in science (pp. 11-22). Dordrecht, The Netherlands: Springer.
Clement, J. J. & Rea-Ramirez, M. A. (Eds.). (2008). Model based learning and instruction in science. Dordrecht, The Netherlands: Springer.
Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9-13.
Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive apprenticeship: Making thinking visible. American Educator, Winter 1991, 6-46.
Cornelius, L. L. & Herrenkohl, L. R. (2004). Power in the classroom: How the classroom environment shapes students' relationships with each other and with concepts. Cognition and Instruction, 22(4), 467-498.
Crawford, B. A., Krajcik, J. S., & Marx, R. W. (1999). Elements of a community of learners in a middle school science classroom. Science Education, 83(6), 701-723. doi: 10.1002/(sici)1098-237x(199911)83:6<701::aid-sce4>3.0.co;2-2
Damşa, C. I., Kirschner, P. A., Andriessen, J. E. B., Erkens, G., & Sins, P. H. M. (2010). Shared epistemic agency: An empirical study of an emergent construct. Journal of the Learning Sciences, 19(2), 143-186. doi: 10.1080/10508401003708381
diSessa, A. A. (1988). Knowledge in pieces. In G. E. Forman & P. B. Pufall (Eds.), Constructivism in the computer age (pp. 49-70). Hillsdale, NJ: L. Erlbaum.
diSessa, A. A. (1993). Toward an epistemology of physics. Cognition and Instruction, 10(2-3), 105-225. doi: 10.1080/07370008.1985.9649008
diSessa, A. A. & Wagner, J. F. (2005). What coordination has to say about transfer. In J. P. Mestre (Ed.), Transfer of learning from a modern multidisciplinary perspective (pp. 121-154). Greenwich, CT: Information Age Publishing.
Duschl, R. A., Schweingruber, H. A., & Shouse, A. W. (Eds.). (2007). Taking science to school: Learning and teaching science in grades K-8. Washington, D.C.: National Academies Press.
Engeström, Y. (1987). Learning by expanding: An activity-theoretical approach to developmental research. Helsinki, Finland: Orienta-Konsultit Oy.
Engle, R. A. & Conant, F. R. (2002). Guiding principles for fostering productive disciplinary engagement: Explaining an emergent argument in a community of learners classroom. Cognition and Instruction, 20(4), 399-483.
Feurzeig, W. & Roberts, N. (Eds.). (1999). Modeling and simulation in science and mathematics education. New York, NY: Springer.
Flewitt, R., Hampel, R., Hauck, M., & Lancaster, L. (2009). What are multimodal data and transcription? In C. Jewitt (Ed.), The Routledge handbook of multimodal analysis (pp. 40-53). New York: Routledge.
Ford, M. J. & Forman, E. A. (2006). Redefining disciplinary learning in classroom contexts. Review of Research in Education, 30(1), 1-32.
Frigg, R. & Hartmann, S. (2012). Models in science. Stanford Encyclopedia of Philosophy (Fall 2012 edition). Retrieved March 15, 2011, from http://plato.stanford.edu/entries/models-science/
Gilbert, S. W. (1991). Model building and a definition of science. Journal of Research in Science Teaching, 28(1), 73-79. doi: 10.1002/tea.3660280107
Gobert, J. D. & Buckley, B. C. (2000). Introduction to model-based teaching and learning in science education. International Journal of Science Education, 22(9), 891-894.
Gobert, J. D. & Pallant, A. (2004). Fostering students' epistemologies of models via authentic model-based tasks. Journal of Science Education and Technology, 13(1), 7-22. doi: 10.1023/B:JOST.0000019635.70068.6f
Gotwals, A. W. & Songer, N. B. (2009). Reasoning up and down a food chain: Using an assessment framework to investigate students' middle knowledge. Science Education, 94(2), 259-281.
Granott, N. & Parziale, J. (2002). Microdevelopment: A process-oriented perspective for studying development and learning. In N. Granott & J. Parziale (Eds.), Microdevelopment: Transition processes in development and learning (pp. 1-28). New York, NY: Cambridge University Press.
Greene, J. A., Azevedo, R., & Torney-Purta, J. (2008). Modeling epistemic and ontological cognition: Philosophical perspectives and methodological directions. Educational Psychologist, 43(3), 142-160. doi: 10.1080/00461520802178458
Greeno, J. G. (1998). The situativity of knowing, learning, and research. American Psychologist, 53(1), 5-26.
Grosslight, L., Unger, C., Jay, E., & Smith, C. L. (1991). Understanding models and their use in science: Conceptions of middle and high school students and experts. Journal of Research in Science Teaching, 28(9), 799-822.
Halliday, M. A. K. (1978). Language as social semiotic: The social interpretation of language and meaning. Baltimore, MD: University Park Press.
Halloun, I. A. & Hestenes, D. (1987). Modeling instruction in mechanics. American Journal of Physics, 55(5), 455-462.
Hammer, D. & Elby, A. (2002). On the form of a personal epistemology. In B. K. Hofer & P. R. Pintrich (Eds.), Personal epistemology: The psychology of beliefs about knowledge and knowing (pp. 171-192). Mahwah, NJ: L. Erlbaum.
Hammer, D., Elby, A., Scherr, R. E., & Redish, E. F. (2005). Resources, framing, and transfer. In J. P. Mestre (Ed.), Transfer of learning from a modern multidisciplinary perspective (pp. 89-120). Greenwich, CT: Information Age Publishing.
Harrison, A. G. & Treagust, D. F. (2000). A typology of school science models. International Journal of Science Education, 22(9), 1011-1026.
Hauschild, M. Z., Huijbregts, M., Jolliet, O., Macleod, M., Margni, M., van de Meent, D., . . . McKone, T. E. (2008). Building a model based on scientific consensus for life cycle impact assessment of chemicals: The search for harmony and parsimony. Environmental Science & Technology, 42(19), 7032-7037. doi: 10.1021/es703145t
Herrenkohl, L. R. & Guerra, M. R. (1998). Participant structures, scientific discourse, and student engagement in fourth grade. Cognition and Instruction, 16(4), 431-473.
Hodge, B. & Kress, G. R. (1988). Social semiotics. Ithaca, NY: Cornell University Press.
Hofer, B. K. & Pintrich, P. R. (Eds.). (2002). Personal epistemology: The psychology of beliefs about knowledge and knowing. Mahwah, NJ: L. Erlbaum.
Hogan, K. & Corey, C. (2001). Viewing classrooms as cultural contexts for fostering scientific literacy. Anthropology & Education Quarterly, 32(2), 214-243.
Holland, D. C., Lachicotte, W., Skinner, D., & Cain, C. (1998). Identity and agency in cultural worlds. Cambridge, MA: Harvard University Press.
Hutchins, E. (1991). The social organization of distributed cognition. In L. B. Resnick, J. M. Levine & S. D. Teasley (Eds.), Perspectives on socially shared cognition (1st ed., pp. 283-307). Washington, DC: American Psychological Association.
Ingham, A. M. & Gilbert, J. K. (1991). The use of analogue models by students of chemistry at higher education level. International Journal of Science Education, 13(2), 193-202. doi: 10.1080/0950069910130206
Jiménez-Aleixandre, M. P., Rodríguez, A. B., & Duschl, R. A. (2000). "Doing the lesson" or "doing science": Argument in high school genetics. Science Education, 84(6), 757-792.
Kenyon, L., Schwarz, C. V., & Hug, B. (2008). The benefits of scientific modeling: Constructing, using, evaluating, and revising scientific models helps students advance their scientific ideas, learn to think critically, and understand the nature of science. Science and Children, 46(2), 40-44.
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75-86. doi: 10.1207/s15326985ep4102_1
Kitchener, R. F. (2002). Folk epistemology: An introduction. New Ideas in Psychology, 20(2–3), 89-105. doi: http://dx.doi.org/10.1016/S0732-118X(02)00003-X
Kurth, L. A., Anderson, C. W., & Palincsar, A. S. (2002). The case of Carla: Dilemmas of helping all students to understand science. Science Education, 86(3), 287-313. doi: 10.1002/sce.10009
Latour, B. & Woolgar, S. (1979). Laboratory life: The social construction of scientific facts. Beverly Hills, CA: Sage.
Lave, J. & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. New York: Cambridge University Press.
Leach, J., Millar, R., Ryder, J., & Séré, M.-G. (2000). Epistemological understanding in science learning: The consistency of representations across contexts. Learning and Instruction, 10(6), 497-527. doi: http://dx.doi.org/10.1016/S0959-4752(00)00013-X
Lehrer, R. & Schauble, L. (2000). Modeling in mathematics and science. In R. Glaser (Ed.), Advances in instructional psychology (Vol. 5, pp. 101-159). Hillsdale, NJ: Lawrence Erlbaum Associates.
Lehrer, R. & Schauble, L. (2004). Modeling natural variation through distribution. American Educational Research Journal, 41(3), 635-679.
Lehrer, R. & Schauble, L. (2006). Cultivating model-based reasoning in science education. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 371-387). Cambridge, MA: Cambridge University Press.
Leont'ev, A. N. (1978). Activity, consciousness, and personality (M. J. Hall, Trans.). Englewood Cliffs, NJ: Prentice-Hall.
Magnani, L., Nersessian, N. J., & Thagard, P. (Eds.). (1999). Model-based reasoning in scientific discovery. New York, NY: Kluwer Academic/Plenum Publishers.
McNeill, K. L., Lizotte, D. J., Krajcik, J., & Marx, R. W. (2006). Supporting students' construction of scientific explanations by fading scaffolds in instructional materials. Journal of the Learning Sciences, 15(2), 153-191.
Mellar, H., Bliss, J., Boohan, R., Ogborn, J., & Tompsett, C. (Eds.). (1994). Learning with artificial worlds: Computer-based modelling in the curriculum. Washington, D.C.: Falmer Press.
Metz, K. E. (2004). Children's understanding of scientific inquiry: Their conceptualization of uncertainty in investigations of their own design. Cognition and Instruction, 22(2), 219-290.
Moje, E. B., Peek-Brown, D., Sutherland, L. M., Marx, R. W., Blumenfeld, P., & Krajcik, J. (2004). Explaining explanation: Developing scientific literacy in middle school project-based science reforms. In D. S. Strickland & D. E. Alvermann (Eds.), Bridging the literacy achievement gap, grades 4-12 (pp. 227-251). New York: Teachers College Press.
National Research Council. (1996). National science education standards: Observe, interact, change, learn. Washington, D.C.: National Academy Press.
National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Committee on a Conceptual Framework for New K-12 Science Education Standards. Board on Science Education, Division of Behavioral and Social Sciences and Education. Washington, D.C.: The National Academies Press.
O'Neill, D. K. (2001). Knowing when you've brought them in: Scientific genre knowledge and communities of practice. Journal of the Learning Sciences, 10(3), 223-264.
Pickering, A. (1995). The mangle of practice: Time, agency, and science. Chicago, IL: University of Chicago Press.
Pluta, W. J., Chinn, C. A., & Duncan, R. G. (2011). Learners' epistemic criteria for good scientific models. Journal of Research in Science Teaching, 48(5), 486-511. doi: 10.1002/tea.20415
Redish, E. F. (2004). A theoretical framework for physics education research: Modeling student thinking. In E. F. Redish & M. Vicentini (Eds.), Proceedings of the Enrico Fermi summer school, course CLVI. Bologna, Italy: Italian Physical Society.
Reed, B. (2001). Epistemic agency and the intellectual virtues. The Southern Journal of Philosophy, 39(4), 507-526. doi: 10.1111/j.2041-6962.2001.tb01831.x
Rogoff, B. (1990). Apprenticeship in thinking: Cognitive development in social context. New York: Oxford University Press.
Rosebery, A. S., Warren, B., & Conant, F. R. (1992). Appropriating scientific discourse: Findings from language minority classrooms. Journal of the Learning Sciences, 2(1), 61-94.
Roth, W.-M. & Calabrese-Barton, A. (2004). Rethinking scientific literacy. New York: RoutledgeFalmer.
Saari, H. (2003). A research-based teaching sequence for teaching the concept of modelling to seventh-grade students. International Journal of Science Education, 25(11), 1333-1352.
Salomon, G. (Ed.). (1993). Distributed cognitions: Psychological and educational considerations. New York: Cambridge University Press.
Sandoval, W. A. (2005). Understanding students' practical epistemologies and their influence on learning through inquiry. Science Education, 89(4), 634-656. doi: 10.1002/sce.20065
Sandoval, W. A. & Morrison, K. (2003). High school students' ideas about theories and theory change after a biological inquiry unit. Journal of Research in Science Teaching, 40(4), 369-392. doi: 10.1002/tea.10081
Scardamalia, M. (2000). Can schools enter a knowledge society? In M. Selinger & J. Wynn (Eds.), Educational technology and the impact on teaching and learning (pp. 6-10). Abingdon, The United Kingdom: Research Machines.
Scardamalia, M. & Bereiter, C. (2006). Knowledge building: Theory, pedagogy, and technology. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 97-115). New York: Cambridge University Press.
Schommer, M. (1990). Effects of beliefs about the nature of knowledge on comprehension. Journal of Educational Psychology, 82(3), 498-504.
Schwartz, R. S., Lederman, N. G., & Crawford, B. A. (2004). Developing views of nature of science in an authentic context: An explicit approach to bridging the gap between nature of science and scientific inquiry. Science Education, 88(4), 610-645. doi: 10.1002/sce.10128
Schwarz, C. V. & Gwekwerere, Y. N. (2007). Using a guided inquiry and modeling instructional framework (EIMA) to support preservice K-8 science teaching. Science Education, 91(1), 158-186.
Schwarz, C. V., Reiser, B. J., Acher, A., Kenyon, L., & Fortus, D. (2012). MoDeLS: Challenges in defining a learning progression for scientific modeling. In A. C. Alonzo & A. W. Gotwals (Eds.), Learning progressions in science: Current challenges and future directions (pp. 101-137). Boston, MA: Sense.
Schwarz, C. V., Reiser, B. J., Davis, E. A., Kenyon, L., Acher, A., Fortus, D., . . . Krajcik, J. (2009). Developing a learning progression for scientific modeling: Making scientific modeling accessible and meaningful for learners. Journal of Research in Science Teaching, 46(6), 632-654.
Schwarz, C. V. & White, B. Y. (2005). Metamodeling knowledge: Developing students' understanding of scientific modeling. Cognition and Instruction, 23(2), 165-205.
Sherman, W. (2004). Science studies, situatedness, and instructional design in science education: A summary and critique of the promise. Canadian Journal of Science, Mathematics and Technology Education, 4(4), 443-465. doi: 10.1080/14926150409556627
Siegler, R. S. & Crowley, K. (1991). The microgenetic method: A direct means for studying cognitive development. American Psychologist, 46(6), 606-620. doi: 10.1037/0003-066X.46.6.606
Smith, C. L., Wiser, M., Anderson, C. W., & Krajcik, J. (2006). Implications of research on children's learning for standards and assessment: A proposed learning progression for matter and the atomic-molecular theory. Measurement: Interdisciplinary Research & Perspective, 4(1), 1-98.
Steinberg, M. S. (2008). Applying modeling theory to curriculum development: From electric circuits to electromagnetic fields. In J. J. Clement & M. A. Rea-Ramirez (Eds.), Model based learning and instruction in science (pp. 151-171). Dordrecht, The Netherlands: Springer.
Stewart, J., Hafner, R., Johnson, S., & Finkel, E. (1992). Science as model building: Computers and high-school genetics. Educational Psychologist, 27(3), 317-336. doi: 10.1207/s15326985ep2703_4
The Concord Consortium. (2013). Phase change (a multi-page activity): Interactive, scaffolded model. MOLO: Molecular logic. Retrieved April 22, 2013, from http://molo.concord.org/database/activities/180.html
Treagust, D. F., Chittleborough, G., & Mamiala, T. L. (2002). Students' understanding of the role of scientific models in learning science. International Journal of Science Education, 24(4), 357-368.
van Eijck, M., Hsu, P.-L., & Roth, W.-M. (2009). Translations of scientific practice to "students' images of science". Science Education, 93(4), 611-634. doi: 10.1002/sce.20322
van Leeuwen, T. (2005). Introducing social semiotics. New York: Routledge.
Warren, B., Ballenger, C., Ogonowski, M., Rosebery, A. S., & Hudicourt-Barnes, J. (2001). Rethinking diversity in learning science: The logic of everyday sense-making. Journal of Research in Science Teaching, 38(5), 529-552. doi: 10.1002/tea.1017
Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. New York: Cambridge University Press.
White, B. Y. (1993). ThinkerTools: Causal models, conceptual change, and science education. Cognition and Instruction, 10(1), 1-100.
White, B. Y. & Frederiksen, J. R. (1998). Inquiry, modeling, and metacognition: Making science accessible to all students. Cognition and Instruction, 16(1), 3-118.
White, B. Y. & Schwarz, C. V. (1999). Alternative approaches to using modeling and simulation tools for teaching science. In W. Feurzeig & N. Roberts (Eds.), Modeling and simulation in science and mathematics education (pp. 226-256). New York: Springer.
White, R. T. & Gunstone, R. F. (1992). Probing understanding. New York, NY: Falmer.
Windschitl, M., Thompson, J., & Braaten, M. (2008a). Beyond the scientific method: Model-based inquiry as a new paradigm of preference for school science investigations. Science Education, 92(5), 941-967.
Windschitl, M., Thompson, J., & Braaten, M. (2008b). How novice science teachers appropriate epistemic discourses around model-based inquiry for use in classrooms. Cognition and Instruction, 26(3), 310-378.