TEACHING AND LEARNING WITH DIGITAL EVOLUTION: FACTORS INFLUENCING IMPLEMENTATION AND STUDENT OUTCOMES

By

Amy M Lark

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

Curriculum, Instruction, and Teacher Education – Doctor of Philosophy

2014

ABSTRACT

Science literacy for all Americans has been the rallying cry of science education in the United States for decades. Nevertheless, Americans continue to fall short when it comes to keeping pace with other developed nations on international science education assessments. To combat this problem, recent national reforms have reinvigorated the discussion of what and how we should teach science, advocating for the integration of disciplinary core ideas, crosscutting concepts, and science practices. In the biological sciences, teaching the core idea of evolution in ways consistent with reforms is fraught with challenges. Not only is it difficult to observe biological evolution in action, it is nearly impossible to engage students in authentic science practices in the context of evolution. One way to overcome these challenges is through the use of evolving populations of digital organisms. Avida-ED is digital evolution software for education that allows for the integration of science practice and content related to evolution. The purpose of this study was to investigate the effects of Avida-ED on teaching and learning evolution and the nature of science. To accomplish this I conducted a nationwide, multiple-case study, documenting how instructors at various institutions were using Avida-ED in their classrooms, factors influencing implementation decisions, and effects on student outcomes.
I found that all of the participating instructors held views on teaching and learning that were well aligned with reform-based pedagogy, and although instructors used Avida-ED in a variety of ways, all adopted learner-centered pedagogical strategies that focused on the use of inquiry. After implementation, all of the instructors indicated that Avida-ED had allowed them to teach evolution and the nature of science in ways consistent with their personal teaching philosophies. In terms of assessment outcomes, students in lower-division courses significantly improved both their understanding and acceptance of evolution after using Avida-ED, and learning of content was positively associated with increased acceptance. Although student learning outcomes and instructor familiarity with Avida-ED were not associated with student affective response to the program, instructor familiarity was highly influential with regard to both how Avida-ED was implemented and student affective response, particularly student interest, enjoyment, and self-efficacy. The results of this dissertation provide strong evidence that Avida-ED is a promising tool for teaching and learning about evolution in reform-based ways, and suggest that improving instructor pedagogical content knowledge with regard to research-based tools like Avida-ED may help to generate student interest in STEM.

I dedicate this work:

To my husband, Zachary,
For all of your patience, support,
Love, and understanding.
I could not have come this far
Without you.

And

To my daughter, Máire,
Who has been my light and my way.
My inspiration.
I love you.

ACKNOWLEDGMENTS

I would like to thank my advisor, Gail Richmond, for her unfailing support and encouragement over the years. Thank you for always believing in me. To Rob Pennock, my dissertation co-chair, you have always been confident in my abilities; thank you for helping me to believe in myself.
To Jim Smith and Amelia Gotwals: Thank you for serving on my committee and always offering great advice; you’re the best. To Louise Mead: You are an inspiration. To Wendy Johnson: You are the best darned collaborator and travel buddy ever. Thank you to Emily Weigel, who helped with inter-rater reliability and is an amazing friend. And to my friends who have supported me: Rosa, Zack, Kathy, Linda, Samina, Amal, May, and Kristen. Thank you.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES

Chapter 1. Introduction
    Research goals and theoretical framing
        Goals of the research
        Theoretical framework
            Learning about the nature and practices of science
            Instructor goals and beliefs about teaching and learning and influences on curricular implementation
            Affective factors and influence on science learning
    Research questions
    Hypotheses and predictions
    Overview of chapters

Chapter 2. Methods
    Overview
    Methodology: Justification of the multiple-case approach
    Data sources and analyses
        Pre-/post-implementation interviews with instructors
        Pre-/post-assessment of students
            Development of the assessment instrument
            Scoring open-ended responses
            Inter-rater reliability
            Scoring forced-choice responses
            Content vs. acceptance
        Post-implementation survey of students
        Course materials
        Cross-case analysis
        Characterization of implementations

Chapter 3. Case and implementation descriptions
    Overview of cases
    Case descriptions
        Case A_APBio: Advanced Placement Biology
            Institutional context
            Course context
            Instructor context
            Description of implementation
            Summary
        Case B_300Evo: Upper-division Evolution
            Institutional context
            Course context
            Instructor context
            Description of implementation
            Summary
        Case C_400Evo: Upper-division Evolution
            Institutional context
            Course context
            Instructor context
            Description of implementation
            Summary
        Case D_100Evo: Freshman Seminar in Evolution
            Institutional context
            Course context
            Instructor context
            Description of implementation
            Summary
        Case E_200Bio_HC: Honors Introductory Biology for Non-majors
            Institutional context
            Course context
            Instructor context
            Description of implementation
            Summary
        Case F_400Evo: Upper-division Evolution
            Institutional context
            Course context
            Instructor context
            Description of implementation
            Summary
        Institution G
            Institutional context
        Cases G_100BioLabA & G_100BioLabB: Introductory Biology Laboratory
            Course context
            Instructor contexts
            Description of implementation
            Summary
        Case G_100BioRes: Introductory Biology (Residential College)
            Course context
            Instructor context
            Description of implementation
            Summary
        Case H_100CompBio: Introduction to Computational Biology
            Institutional context
            Course context
            Instructor context
            Description of implementation
            Summary

Chapter 4. Instructor experience with Avida-ED
    Overview
    Findings
        Instructor goals: Science
            Science practices
            The nature of science
            Scientific habits of mind
            Knowledge transfer and application
            Prior experience
            Scientific reasoning
            Interest and motivation
        Instructor goals: Evolution
        Instructor views on assessment
        Instructor views on pedagogy
        Instructor views on their role in student learning
        Instructor views on Avida-ED

Chapter 5. Student learning outcomes
    Overview
    Student learning of foundational evolution concepts
    Student acceptance of evolution
    Understanding and acceptance of evolution

Chapter 6. Student affective response to Avida-ED
    Overview
    Findings

Chapter 7. Discussion
    Overview
    Discussion of findings
        Implementation of Avida-ED
        Instructor beliefs
        Student affective response
    Implications for science education
    Limitations of the study
    Directions for future research
    Conclusions

APPENDIX
LITERATURE CITED

LIST OF TABLES

Table 1. Examples of student constructed responses and scoring against rubric.

Table 2. Summary of inter-rater reliability outcomes. Pr(a) is equivalent to percent agreement between coders.

Table 3. Student acceptance of evolution was assessed with the following questions. Items were scored according to a 5-point Likert scale. Some questions were borrowed or modified from the Measure of the Acceptance of Evolutionary Theory (MATE) instrument (Rutledge & Warden, 1999). Other questions from an unpublished survey by Mead and Libarkin were written in the style of the MATE.

Table 4. Interpretation of acceptance scores.

Table 5. List of participating institutions, characterized by size (estimated student population) and Carnegie classification data (if applicable).

Table 6. Case summaries. Case codes are designated by institution code (see Table 5) and course level/type. Class levels are designated as Lower (AP, 100-, or 200-level) or Upper (300- or 400-level). Only students taking both the pre- and post-assessment were included in data analyses; therefore, the number of students enrolled in each course may actually be greater than what is reported here.

Table 7. Instructor profiles.

Table 8. Experimental parameters for Lab 1 of the Computational Evolution course.

Table 9. Summary of average content scores by case. Pre- and post-test content scores and pre/post change are reported as the average percentage of the ideal response. Significance values marked with an asterisk were calculated using non-parametric analyses (Mann-Whitney U test). Effect sizes were determined using Cohen’s d and are interpreted accordingly: 0.2 = small, 0.5 = moderate, 0.8 = large.

Table 10. Summary of average acceptance scores by case. Significance values marked with an asterisk were calculated using non-parametric analyses (Mann-Whitney U test). Effect sizes were determined using Cohen’s d and are interpreted accordingly: 0.2 = small, 0.5 = moderate, 0.8 = large.

Table 11. Summary of survey data for each of seven cases. Survey items were based on a 5-point Likert scale (1 = low; 5 = high); item mean response is reported for each case.

Table 12. Correlation matrix showing associations between survey items, instructor familiarity with Avida-ED, and assessment data (average normalized student gains in content and acceptance scores). Cells shaded in yellow are significant at the 0.05 level while cells shaded in orange are significant at the 0.01 level.

Table 13. Correlations between survey items dealing with affect and the degree to which students found various resources helpful. Cells shaded in yellow are significant at the 0.05 level while cells shaded in orange are significant at the 0.01 level.

LIST OF FIGURES

Figure 1. The genome of a digital organism in Avida-ED consists of a circular instruction sequence 50 commands in length. A different letter of the alphabet denotes each command, with a total of 26 commands.

Figure 2. Populations of digital organisms grow and evolve on a virtual Petri dish, the size of which is determined by the user. Each cell of the grid represents a single individual organism. The different colors represent individual fitness relative to the average fitness of the population.

Figure 3. Disciplinary subject matter knowledge consists of substantive and syntactic structures. The syntactic structure can be further subdivided into epistemology and practices.

Figure 4. Average pre/post student content scores for each of the ten cases. Error bars are based on the standard error of the mean. Possible scores range from 0% - 100% of the ideal response. Pre/post differences with a significance level of p < 0.05 are denoted by a single asterisk; significance levels of p < 0.01 are denoted by double asterisks. Mean differences were tested for significance using one-tailed paired Student’s t-tests for cases with normal distributions, and Mann-Whitney U tests for those with non-normal distributions.

Figure 5. Average pre/post student acceptance scores for each of the ten cases. Error bars are based on the standard error of the mean. Possible scores range from 20 (complete non-acceptance) to 100 (complete acceptance). Pre/post differences with a significance level of p < 0.05 are denoted by a single asterisk; significance levels of p < 0.01 are denoted by double asterisks. Mean differences were tested for significance using one-tailed paired Student’s t-tests for cases with normal distributions, and Mann-Whitney U tests for those with non-normal distributions.

Figure 6. A Pearson correlation between pre/post change in average acceptance score and average content score across the ten cases reveals a significant, positive relationship (r = 0.73; p < 0.01).

Figure 7. A Pearson correlation between average student normalized gains in both content and acceptance scores across the ten cases reveals a significant, positive relationship (r = 0.60; p < 0.05).

Chapter 1.
Introduction

Science literacy for all Americans has been the primary goal of science education in the United States since the term was coined by science education reformer Paul Hurd in the late 1950s (Hurd, 1958), and it continues to be the main objective of reform efforts at all levels of education (American Association for the Advancement of Science, 1989, 1993; Brewer & Smith, 2011; National Research Council, 1996, 2011). Although definitions of what it means to be scientifically literate vary greatly throughout the science education literature (Feinstein, 2011), most reforms at the national level frame science literacy as a fundamental understanding of scientific knowledge, philosophy, and methodologies that allows for engagement in meaningful discourse around science-related issues that affect our lives and communities (American Association for the Advancement of Science, 1989, 1993; Brewer & Smith, 2011; National Research Council, 1996, 2011). However, despite these reforms and other government-instituted initiatives, we are as a nation still far from reaching the goal of science literacy for all Americans, and the United States continues to lag substantially behind other developed countries on internationally administered tests, particularly in the fields of math and science (Organization for Economic Co-operation and Development, 2007).

The problem extends beyond basic science literacy and K-12 science education. According to recent statements by the President’s Council of Advisors on Science and Technology (S. Olson & Riordan, 2012) and the Association of American Universities (2011), the nation’s need for college graduates in STEM (Science, Technology, Engineering and Mathematics) fields has increased, and will only continue to do so as we become increasingly immersed in a culture reliant on technology.
However, more than 40% of college students who initially declare majors in STEM fields switch to non-STEM majors by the time they graduate, and college attrition rates are higher for STEM majors than for non-STEM majors. Many of these students leave STEM fields after their first year or two of college, suggesting that the problem may lie in the introductory STEM courses, many of which have large enrollments and are primarily lecture-based. Indeed, many students report dissatisfaction with instruction in these courses, describing them as too large, too difficult, and uninteresting (Association of American Universities, 2011; Seymour & Hewitt, 1994).

In September 2011, the Association of American Universities (AAU) announced its five-year STEM education initiative aimed at improving college STEM teaching and learning by encouraging college professors to adopt and implement reform-based pedagogical practices. The goal is to move away from traditional pedagogical approaches toward teaching that incorporates more active, cooperative learning and engagement in authentic science practices, as these have been associated with improved student outcomes (Singer, Nielsen, & Schweingruber, 2012). The most recent national reform efforts in K-16 science education reflect this goal, focusing on core disciplinary ideas as well as crosscutting concepts and authentic science practices (Achieve, 2013; Brewer & Smith, 2011; College Board, 2011; National Research Council, 2011). Although engaging students in the process of inquiry has long been a part of reform-based science education (National Research Council, 2000), the newest frameworks emphasize that science practices are not to be decontextualized or divorced from content, but rather integrated with it; students are therefore to learn about science concepts, as well as the nature of science, by way of their engagement in science practices.
Achieving such integration entails the design of classroom activities that allow students to ask questions, generate hypotheses, make observations, design and conduct experiments, collect, analyze, and represent data, and create arguments from evidence, all within the context of a particular content area. The new reforms have also addressed the perennial “mile-wide, inch-deep” problem of content coverage by limiting what students are expected to learn to four or five core concepts per discipline.

In biology, evolution features prominently among these core concepts. For example, Vision and Change, written for undergraduate biology, lists evolution as the first of five core concepts (Brewer & Smith, 2011); the new AP Biology Curriculum Framework lists it as the first of four big ideas (College Board, 2011); and the Next Generation Science Standards (NGSS) list it as the fifth of five core ideas in biology (Achieve, 2013). The explicit emphasis on evolution in these standards is significant. Less than half of the adult population in the US accepts evolution as true (Miller, Scott, & Okamoto, 2006; Newport, 2012). Indeed, the Next Generation Science Standards have faced challenges, including lawsuits, in several states by groups attempting to delay, prevent, or overturn their adoption; the impetus in the majority of these cases has been religious and/or political opposition to the inclusion of evolution and climate change in the standards (Klein, 2013, 2014; National Center for Science Education, 2013).

Despite resistance from detractors, the inclusion of evolution among the most important concepts in biology is of course warranted and necessary. Evolution is the single unifying principle of all biological disciplines; it forms the basis for everything we understand about the diversity and history of life on Earth (Dobzhansky, 1973).
Moreover, the fundamental elements of the evolutionary mechanism (inheritance, variation, and selection) interact to produce change even in non-organic systems (Dennett, 1995), and engineers have been applying the basic principles of evolution by natural selection to solve design problems and produce other technological innovations (Pennock, 2005); that is to say, evolution works. Among topics in the biological sciences, evolutionary biology is particularly representative of the nature of science (Pennock, 2005), as an understanding of evolution depends a great deal on understanding what science is and how it is done (Akyol, Tekkaya, Sungur, & Traynor, 2012; Lombrozo, Thanukos, & Weisberg, 2008; Rutledge & Warden, 2000). Evolution is, quite simply, a non-negotiable part of any biology curriculum worth its salt (Mead & Mates, 2009). Ironically, despite the status of evolution as the unifying theory of the biological sciences, many people do not have a solid understanding of its underlying mechanisms (Gregory, 2009). In addition, few topics in contemporary science education are as fraught with sociopolitical contention as the teaching of evolutionary theory. In the infamous 1925 court case, John Scopes was tried and fined for teaching human evolution in a public school, bringing evolution education to the forefront of America’s collective consciousness. While teaching evolution today is unlikely to result in a misdemeanor charge, the theory is often at the center of education debates, especially between the political left and right, making it a touchy subject that some educators are reluctant to broach. Indeed, many K-16 science teachers refrain from discussing evolution at all in an effort to avoid potential controversy or upset to their students (Meadows, Doster, & Jackson, 2000).
Teachers have reported being under considerable pressure from parents and even school administrators to include nonscientific ideas, such as intelligent design creationism, alongside evolution as part of the biology curriculum, or to omit the concept of evolution entirely (National Science Teachers Association, 2005). In addition, teachers sometimes struggle with reconciling the science with their own personal beliefs (Meadows et al., 2000). Even biology instructors who report using evolution as the central theme in their courses spend very little time addressing human evolution (Berkman, Pacheco, & Plutzer, 2008). The teaching of evolution is heavily politicized due to the theory’s perception by some religious and/or political groups as incompatible with certain beliefs, particularly those relating to the origins of humans. Studies and polls have consistently shown that the percentage of adults in the United States who accept the reality of evolution is low (Miller et al., 2006; Newport, 2012). This lack of acceptance has been variously attributed to education level, religious belief, political affiliation, lack of understanding of the theory specifically, and lack of understanding of the nature of science more generally (Akyol et al., 2012; Farber, 2003; Johnson & Peeples, 1987; Lombrozo et al., 2008; Rutledge & Mitchell, 2002). Science educators disagree about the appropriate goals for science education where controversial issues such as evolution are concerned. Some argue that belief and acceptance are important for understanding scientific theories (Alters, 1997; Cobern, 1994; McKeachie, Lin, & Strayer, 2002), while others maintain that these are entirely separate constructs (Ha, Haury, & Nehm, 2012; Sinatra, Southerland, McConaughy, & Demastes, 2003; Smith & Siegel, 2004), and that the responsibility of science educators lies in teaching for understanding rather than belief (Smith & Siegel, 2004).
Regardless, there is little doubt that understanding and acceptance play important roles when it comes to teaching evolution, reflected in the number of studies that focus on these two constructs and how they interact (reviewed in Lloyd-Strovas & Bernal, 2012). Assessing conceptual understanding of evolution has been the focus of many science education studies, and there are several instruments available that claim to measure levels of evolution knowledge (Anderson, Fisher, & Norman, 2002; Bishop & Anderson, 1990; Hawley, Short, McCune, Osman, & Little, 2011; Nadelson & Southerland, 2009; Nehm, Beggrow, Opfer, & Ha, 2012; Rutledge & Warden, 1999). Misconceptions about evolution and its associated mechanisms abound, and these are well documented in the literature (Gregory, 2009). In addition, there are many resources available for instructors to aid in teaching evolution (Alberts & Labov, 2004; National Academy of Sciences, 1998; National Research Council, 2012; Nelson, 2008; Wuerth, 2004). Even so, understanding of evolution and its mechanisms remains quite low. It seems intuitively apparent that low levels of conceptual knowledge about evolution should account for much of the lack of acceptance of evolutionary theory among Americans. In fact, studies examining the relationship between understanding and acceptance have proved largely inconclusive (Lloyd-Strovas & Bernal, 2012). A few studies report positive relationships between levels of understanding and acceptance (Akyol et al., 2012; Rice, Olson, & Colbert, 2011), while others have found no association (Bishop & Anderson, 1990; Brem, Ranney, & Schindel, 2003; Demastes, Settlage, & Good, 1995; Ingram & Nelson, 2006; Sinatra et al., 2003), and at least one study found that acceptance decreased with increasing levels of understanding (Bailey, Han, Wright, & Graves, 2011).
The issues that make teaching evolution so challenging serve to highlight the importance of the new reforms, particularly with regard to the integration of science content and practice. Using evolution as the context for engaging students in science practices may be an effective way to overcome some of these challenges (National Research Council, 2012; O'Brien, Wilson, & Hawley, 2009; Passmore, Stewart, & Zoellner, 2005; Pennock, 2005, 2007a; Robbins & Roy, 2007; Rutledge & Mitchell, 2002; Sandoval & Morrison, 2003). Simulations of evolution and natural selection are widely used by educators, but there are fewer opportunities for teachers to incorporate authentic research experiences on evolution in the classroom. One way of accomplishing this is to conduct artificial selection experiments with biological model organisms such as bacteria (especially E. coli), plants (e.g., Wisconsin fast plants), or animals (typically Drosophila). These experiences are powerful in that they allow students to observe changes in populations over time, perform hands-on experiments, and collect data to see patterns and answer questions about natural selection and genetics. However, evolution in biological organisms is slow, even in bacteria, and it can take weeks or months for students to see appreciable changes in populations. Moreover, it can be difficult for students to understand what is happening at the level of an organism’s genome that accounts for population-level changes. In addition, using these organisms as models may require potentially costly resources (especially time, space, and materials), and such experiments are subject to errors and accidents. An option that overcomes most or all of the limitations posed by biological model organisms is digital evolution. Populations of digital organisms – mini-programs similar to computer viruses that are capable of self-replication – evolve in minutes and can produce large quantities of data in a short amount of time.
An example of digital evolution software is Avida (“A” for “artificial”; “vida” is Spanish for “life”), a research platform initially developed by Charles Ofria and Christoph Adami at Caltech; both are now at Michigan State University, which has become the home for Avida-based evolutionary research. Avida was developed to test the power and substrate neutrality of the evolutionary algorithm, the idea that as long as there is variation, inheritance, and selection in a system, evolution will occur (Dennett, 1995). Avida also allows biologists to ask evolutionary questions that are difficult or impossible to test in organic systems (Adami, 2006), and is currently being used to investigate many different questions, including the evolution of complex features, altruistic behavior, sex, intelligence, foraging behavior, and more, resulting in dozens of publications (a comprehensive list of research and publications can be found at Dr. Ofria’s website: http://www.ofria.com/).

There are many advantages to using Avida to study evolutionary processes, but chief among these is that it constitutes a true instance of evolution rather than merely a simulation of it (Pennock, 2007b). Although programmers developed the software and researchers set the input parameters, the outcomes are not pre-determined. Digital organisms in Avida (aka “Avidians”) replicate, mutate, and compete with other organisms for resources in their computational environment. The system possesses all of the requirements necessary for evolution by natural selection to occur, and it does.

Recognizing the potential of Avida as a powerful tool for teaching about evolutionary processes and the nature of scientific reasoning and inquiry, Pennock developed an educational version of the software called Avida-ED (Pennock, 2007a). The program features a more user-friendly interface and is less complex than Avida. It uses a bacterial metaphor to make the digital organisms and environment less abstract for students: individual Avidians are visually represented by a circular instruction set (their “genome”) that consists of a sequence of 50 computer commands, each denoted by a letter of the alphabet (for a total of 26 different commands; Fig. 1).

Figure 1. The genome of a digital organism in Avida-ED consists of a circular instruction sequence 50 commands in length. A different letter of the alphabet denotes each command, with a total of 26 commands.

Like organic prokaryotic organisms, Avidians are haploid and asexual, reproducing by making copies of their genome. Populations of Avidians grow in a virtual Petri dish consisting of a grid, the size of which is defined by the user (Fig. 2). Each cell on the grid is a potential space for one individual.

Figure 2. Populations of digital organisms grow and evolve on a virtual Petri dish, the size of which is determined by the user. Each cell of the grid represents a single individual organism. The different colors represent individual fitness relative to the average fitness of the population.

The user also determines a number of other parameters in the settings. Per-site mutation rate, the probability that any one locus in the genome will change during a given replication event, can range from zero to 100%. In Avida, types of mutation include insertions and deletions as well as substitutions, but for the sake of simplicity only substitutions are allowed in Avida-ED; therefore, the size of the Avidian genome remains a static 50 instructions in length. Offspring placement can be set as near the parent (assigned randomly to one of the eight cells adjacent to the parent) or located randomly in the dish, giving the user the ability to test the effects of different distribution patterns. Finally, the user determines which of nine resources are available in the environment.
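The per-site substitution model just described can be sketched in a few lines of Python. This is an illustrative toy under stated assumptions, not Avida-ED's actual code: the lowercase alphabet and the all-"a" ancestor genome are hypothetical stand-ins for the program's 26 letter-coded instructions.

```python
import random
import string

ALPHABET = string.ascii_lowercase  # 26 possible instructions, one per letter
GENOME_LENGTH = 50                 # Avidian genomes are a static 50 instructions

def replicate(genome, mutation_rate, rng):
    """Copy a genome, applying per-site substitution mutations.

    Each locus independently changes with probability `mutation_rate`.
    Insertions and deletions are disallowed, so the offspring genome is
    always the same length as the parent's.
    """
    offspring = []
    for instruction in genome:
        if rng.random() < mutation_rate:
            offspring.append(rng.choice(ALPHABET))  # substitute this locus
        else:
            offspring.append(instruction)           # faithful copy
    return "".join(offspring)

rng = random.Random(0)
ancestor = "a" * GENOME_LENGTH
offspring = replicate(ancestor, mutation_rate=0.02, rng=rng)
assert len(offspring) == GENOME_LENGTH  # genome size never changes
```

With a per-site rate of 2%, roughly one of the 50 loci is substituted per replication on average; a rate of zero yields a perfect copy.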
The resources are analogous to sugars that would be part of an actual bacterial growth medium, and therefore, keeping with the metaphor, end in “-ose” (e.g., notose, nanose, orose). These resources correspond to various computational logic operations or functions that the digital organisms can evolve to perform (e.g., NOT, NAND, OR). During an experiment, substitution mutations occur that alter the genomic sequence of the ancestor organism (the default ancestor organism in Avida-ED is capable only of replication and cannot perform any of the logic functions). Sometimes these mutations accumulate to produce, by chance, a sequence of commands that enables the organism to perform one of these logic functions, such as inputting two strings of numbers from the environment, comparing them, and outputting whether or not they are the same (i.e., the function EQU). If the organism evolves the ability to perform a function corresponding to an available environmental resource (“equose” in this example), the organism will receive an “energy boost” in the form of increased processing power, which will allow that organism to replicate faster. For this reason, these operations are commonly referred to as metabolic functions. A biological example of the same process would be the evolution of an enzyme that allows for the metabolism of a particular sugar in an organism’s environment, one that was previously unavailable as a resource; this very situation has been documented in E. coli (Blount, Borland, & Lenski, 2008). Because resources are uniformly distributed and unlimited in the computational environment, organisms in Avida-ED essentially compete for space. 
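Adding a reward for performing a function turns this copying process into selection. The sketch below is again a hedged toy, not Avida-ED's implementation: the motif `"eq"` is a hypothetical stand-in for a genome sequence enabling a rewarded function, and the doubled fitness stands in for the "energy boost" in processing power. Parents are chosen in fitness-proportionate fashion, and each offspring overwrites a randomly chosen cell, so organisms compete for space rather than for resources.

```python
import random

GENOME_LENGTH = 50
MOTIF = "eq"  # hypothetical stand-in for instructions enabling a rewarded function

def fitness(genome):
    # Baseline fitness 1.0; organisms whose genome encodes the rewarded
    # function replicate twice as fast (the "energy boost").
    return 2.0 if MOTIF in genome else 1.0

def step(population, rng):
    # Fitter organisms are proportionally more likely to replicate next...
    weights = [fitness(g) for g in population]
    parent = rng.choices(range(len(population)), weights=weights, k=1)[0]
    # ...and the offspring claims a randomly chosen cell, displacing its
    # occupant: with unlimited resources, space is what organisms compete for.
    population[rng.randrange(len(population))] = population[parent]

rng = random.Random(0)
# 90 organisms lacking the function, 10 already carrying the motif.
plain = "a" * GENOME_LENGTH
carrier = "a" * 24 + MOTIF + "a" * 24
population = [plain] * 90 + [carrier] * 10
for _ in range(2000):
    step(population, rng)
share = sum(MOTIF in g for g in population) / len(population)
# `share` typically climbs well above the initial 10% as the fitter
# genotype spreads, though any single stochastic run can vary.
```

Mutation is omitted here for brevity; in Avida-ED itself, motifs like the one rewarded above arise by chance through the substitution process rather than being seeded into the population.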
Avidians possessing functions that decrease replication time will tend to produce relatively more offspring than organisms lacking those functions – they will be more fit – and the frequency of beneficial mutations that led to the expression of those fitter phenotypes will increase in the population – which is to say, the population will evolve. Although Avida-ED lacks the full functionality of Avida, it still allows the user considerable latitude to experiment by manipulating various parameters and observing the effects on an organism or population. Given these features, Avida-ED is potentially ideal for engaging students in authentic science practices as a means of learning about evolution and the nature of science, making it exceptionally well aligned with recent reform recommendations.

Research goals and theoretical framing

Goals of the research

Avida-ED has been publicly available to instructors since June 2007, and data tracking reveals that the software has been downloaded hundreds of times from the project website alone. Despite this, and despite its potential as a teaching tool that allows for the integration of science content and practices, there have been no formal studies documenting how instructors are using Avida-ED, their perceptions of the tool, or its influences on student learning. Therefore, the primary purpose of this dissertation is to investigate, for the first time, the effectiveness of Avida-ED as a tool for facilitating teaching and learning about evolution and the nature of science.
The specific goals driving the current study are to:
• Document the ways that instructors are implementing Avida-ED, both for future development of the program and to serve as a resource for instructors who wish to use it in their classrooms;
• Obtain feedback from instructors who have used Avida-ED;
• Assess the extent to which the program meets instructor needs and facilitates reform-oriented teaching practices;
• Identify conditions (e.g., best practices) that are potentially necessary for achieving significant learning gains while using Avida-ED; and
• Assess the extent to which Avida-ED facilitates student learning and document the nature of these outcomes.

Theoretical framework

There are many factors to consider when assessing the influences of curricular materials and instructional innovations on teaching and learning. These include, among other factors, the nature and structure of disciplinary knowledge and how that knowledge can be accessed and understood by learners, the goals and beliefs of teachers and how they influence instructional decisions, and learner affective response to instruction. The literature addressing these factors and how they intersect spans a wide range of disciplines, from philosophy of science to developmental educational psychology, making an in-depth review of each impractical. Here I provide a brief overview of the literature that is relevant to this research.

Figure 3. Disciplinary subject matter knowledge consists of substantive and syntactic structures. The syntactic structure can be further subdivided into epistemology and practices.
Learning about the nature and practices of science

Studies in discipline-based education research (DBER) have shown that best practices for teaching science include those that are learner-centered (National Research Council, 2005; Weimer, 2002), for example interactive lectures that encourage student participation, involving students in collaborative activities, and inquiry-based lab and field experiences (Singer et al., 2012). These findings serve, in part, to justify current reform recommendations at the national level, which aim to engage students in authentic science practices and foster critical thinking and scientific reasoning (Brewer & Smith, 2011; College Board, 2011; National Research Council, 2011). The results of DBER also support the idea that there is more to understanding science than mastery of content, which accounts for only part of subject matter knowledge, considered by some to be composed of two distinct, though closely related, structures: substantive and syntactic (Grossman, Wilson, & Shulman, 1989; Schwab, 1964a; Fig. 3). The substantive structure of a discipline consists of the conceptual frameworks on which all knowledge in the discipline is based. In science, this would include organizing concepts such as the conservation of matter, forces, or evolution, as well as the myriad principles, facts, and relationships that are generally thought of as science content and which, until very recently, formed the entire basis for standards and curriculum. The syntactic structure, in contrast, is composed of the rules and norms for how new knowledge in the discipline is constructed, accepted, and canonized. In science, the syntactic structure can be broken down into two distinct, though mutually reinforcing, kinds of knowledge: nature of science (NOS) and scientific inquiry.
The nature of science consists primarily of scientific epistemology and habits of mind, while inquiry is the process by which scientific knowledge is generated, involving the actual procedures of questioning, data gathering (e.g., observations, experimentation), analysis of data to reveal patterns, and invention and development of models and theories that explain those patterns. The distinction between NOS and inquiry is important because there is evidence, supported by theories of learning, that different approaches are necessary for acquiring these different types of syntactic knowledge. These approaches tend to fall into one of three categories: explicit-reflective, implicit-apprenticeship, and integrated. The main premise underlying the implicit-apprenticeship approach is that learners will come to understand the syntax of science by directly engaging in authentic science practices, generally under the mentorship of a practicing scientist, but without being explicitly or formally introduced to any particular NOS ideas; instead, they will “pick up” aspects of NOS as a consequence of their participation rather than through direct instruction. The implicit approach is motivated by sociocultural perspectives on learning, which hold that learning is contextually situated (Lave & Wenger, 1991) and that knowledge is not created in the isolation of an individual’s mind but is instead a social construction and collaborative enterprise (Case, 1996; Rogoff, 2003). Sadler, Burgin, McKinney and Ponjuan (2010) critically reviewed the literature on this approach in science education at all levels (K-12, undergraduate, and pre- and in-service teachers). They reported on the efficacy of the approach for improving different aspects of science knowledge, including, among others, inquiry and NOS.
They found that these experiences had positive effects on many different factors such as understanding research processes, communication, and technical skills, but produced only modest increases in participant understandings of NOS, and these tended to be somewhat superficial (e.g., increased appreciation for science as collaborative, painstaking, and messy); deeper understandings of the more esoteric aspects of NOS (e.g., the evolution of scientific thought over time, or what Schwab (1964b) referred to as “fluid enquiry”) did not generally increase via the implicit approach. Unlike the implicit approach, the explicit-reflective approach to teaching about the nature and practices of science focuses on systematically addressing NOS concepts, especially those that have been identified in the literature as having many associated misconceptions (e.g., McComas, 1996). This is usually done in the context of a course, such as science methods or philosophy of science. The approach, which is influenced primarily by cognitivist theories of learning such as conceptual change (Posner, Strike, Hewson, & Gertzog, 1982), seeks to create cognitive perturbations and dissatisfaction with naïve views of science, modifying existing cognitive structures so that they are better aligned with expert conceptions (von Glasersfeld, 1989). Studies using this approach have reported success in improving learners’ understanding of deep NOS concepts such as the tentativeness of science, the distinction between observation and inference, and the relationship between theories and laws (Akerson, Abd-El-Khalick, & Lederman, 2000), shifting from product- to process-oriented views of science (Gess-Newsome, 2002), and improved ability to articulate complex NOS ideas (Abd-El-Khalick, 2005). However, an explicit approach may fall short in providing learners with practical experience in the process of inquiry and therefore the associated technical and procedural knowledge.
As indicated by the name, the integrated approach to learning about science incorporates elements of both the implicit-apprenticeship and explicit-reflective approaches. In their review of the literature on the implicit-apprenticeship approach, Sadler et al. (2010) reported that of the studies having some positive influence on NOS ideas, only one (Schwartz, Lederman, & Crawford, 2004) incorporated an explicit approach and, incidentally, this was the only study to document gains in deeper NOS concepts, such as tentativeness and observation vs. inference. The preceding literature has important implications for learning in science, because learning outcomes may be greatly influenced by the approach taken by an instructor. For example, if a primarily implicit approach is adopted (e.g., setting students to work on a laboratory project with little or no explicit guidance) one might expect to see gains in learning about the process of science, but not necessarily in NOS concepts or even specific content, as these may require more explicit treatment.

Instructor goals and beliefs about teaching and learning and influences on curricular implementation

Science teacher beliefs about teaching and learning influence their pedagogical decisions (Brickhouse, 1990; Cronin-Jones, 1991; Roehrig & Kruse, 2005). These beliefs, about how and what to teach and the reasons why, are shaped by many factors, including instructor content knowledge (Brickhouse, 1990; Roehrig & Luft, 2004), views on what subject matter knowledge students need to know (Cronin-Jones, 1991; Tobin & McRobbie, 1997), beliefs about how students learn and the teacher’s role in their learning (P. L. Brown, Abell, Demir, & Schmidt, 2006; Cronin-Jones, 1991; Tobin & McRobbie, 1997; Trautmann, MaKinster, & Avery, 2004; Volkmann, Abell, & Zgagacz, 2005), beliefs about student capabilities (P. L.
Brown et al., 2006; Cronin-Jones, 1991; Roehrig & Luft, 2004; Trautmann et al., 2004), and pedagogical content knowledge (Grossman et al., 1989; Shulman, 1986, 1987). Teachers ultimately make final pedagogical decisions in the classroom, and are prone to translating intended curricula and materials into practices that align with their beliefs (J. Olson, 1981). Roehrig and Kruse (2005) found that teachers whose beliefs were aligned with reform-based pedagogy were significantly more likely to engage in reform-based teaching practices. Therefore, teacher educators and developers of curriculum must take the reality of teacher beliefs into account when preparing new teachers and engaging practicing teachers in professional development. Teacher knowledge and beliefs also play key roles in both the adoption and successful implementation of reform-oriented curriculum materials (Czerniak & Lumpe, 1996; Gess-Newsome, Southerland, Johnston, & Woodbury, 2003; Powell & Anderson, 2002; Roehrig & Kruse, 2005; Southerland, Gess-Newsome, & Johnston, 2003; Sunal et al., 2001). Materials such as these encourage a constructivist and learner-centered approach to learning. However, even teachers whose beliefs are closely aligned with reforms may struggle with the implementation of curricular materials if they lack the knowledge of how to do so effectively. In his writings on pedagogical content knowledge (PCK), Shulman referred to this as curricular content knowledge: knowledge of curriculum and materials appropriate for illustrating particular concepts, as well as the knowledge of how to use such materials effectively (Shulman, 1986, 1987). Therefore, reform-based curricular materials require long-term professional development for sustainable implementation (Powell & Anderson, 2002; Sunal et al., 2001).
It has been reported widely in the literature on social and educational reform that the effectiveness of innovations is determined to a great extent by how they are implemented by adopters (reviewed in Durlak & DuPre, 2008). Henderson and Dancy (2011) explain that many educational innovations fail due to what they call inappropriate assimilation—the act by which innovations are adopted but then implemented in ways that align with adopters’ traditional beliefs about teaching and learning instead of the way they were intended (J. Olson, 1981). Inappropriate assimilation may result in diminished outcomes and lead adopters to believe that the innovation is ineffective. For this reason it is important for reformers to document and evaluate the implementation of an innovation as it is happening to determine the degree to which the actual enactment aligns with the theoretical bases of its development and the developers’ original vision (Durlak & DuPre, 2008). Several factors influence the outcomes of a given implementation, including fidelity and adaptation, duration, and overall quality. Fidelity is defined in the literature as the degree to which an innovation has been faithfully reproduced (i.e., as intended), while adaptation is the degree to which an adopter has modified or reinvented the innovation to better fit the context in which it is to be implemented. In general, higher fidelity of implementation is associated with more positive outcomes. In some cases, however, adaptation may lead to improved results as long as the innovation is not inappropriately assimilated (Durlak & DuPre, 2008). The duration of an implementation can also have significant effects on outcomes. For example, teachers who had participated in long-term science apprenticeships (greater than ten weeks) learned significantly more about the nature and practices of science than those who had only attended seminars or research experiences lasting only a few weeks (Sadler et al., 2010).
Longer experiences give participants greater opportunity to engage in a wider range of practices and more time to observe and reflect. All of these factors contribute to learning outcomes and thus the efficacy of an intervention. Finally, quality is an overall estimate of how well the implementation was conducted, and this varies depending on the nature of the innovation. For educational innovations, this may be dependent on the degree to which implementation aligns with reform goals. For example, it may depend on the degree to which implementation is teacher- or learner-centered (Weimer, 2002), and, for science education, on the level of inquiry (Fay, Grove, Towns, & Lowery, 2007; National Research Council, 2000). Observing how instructors translate and implement curricular innovations can help not only to identify the instructors’ strengths but also to pinpoint problems that might be addressed with targeted professional development.

Affective factors and influence on science learning

Anyone who has taught will likely attest that there is much more to teaching than purely cognitive outcomes. Although the primary goal of education is ostensibly to facilitate student conceptual understanding of subject matter, teachers often have other, parallel goals. These may include helping students understand the nature of the discipline under study, developing practical and critical thinking skills, or catching hold of student interest and cultivating a lifelong love of the subject and of learning itself. Teachers often put an enormous amount of effort into developing lesson materials and experiences that will be interesting to their students. In this way, educators can influence both cognitive and affective student outcomes. Student interest has been a prolific area of educational psychology research for decades. The birth of the field is often attributed to the writings of John Dewey, who first explored the topic in his 1913 book Interest and effort in education.
After a 50-year period during which very little research was conducted on interest, the field was reinvigorated in the late 1980s, primarily by the work of researchers studying text-based learning (Schraw & Lehman, 2001), and has since expanded to produce valuable insights via hundreds of studies. In his book, Dewey (1913) developed a working definition of interest: “Genuine interest is the accompaniment of the identification, through action, of the self with some object or idea, because of the necessity of that object or idea for the maintenance of a self-initiated activity” (p. 14). Defined as such, interest is connected to ideas such as motivation and self-concept. Indeed, studies have shown that interest can result in increased motivation and achievement in students (Bergin, 1999; Brophy, 1999; Mitchell, 1993; Schraw & Lehman, 2001; Singh, Granville, & Dika, 2002; Subramaniam, 2009), and may influence student decisions of future educational or vocational pursuits (Ainley & Ainley, 2011; Hidi & Renninger, 2006). Interest is also closely associated with several other affective factors, including enjoyment, attention, and self-efficacy, all of which have in turn been associated with learning, achievement, and intent (Ainley & Ainley, 2011; Hidi, Renninger, & Krapp, 2004; Linnenbrink & Pintrich, 2003; Schraw & Lehman, 2001). Educational psychologists working in the field have identified two distinct kinds of interest: individual or personal interest, and situational interest. As defined by Mitchell (1993), “A personal interest refers to an interest that people bring to some environment or context… On the other hand, situational interest refers to an interest that people acquire by participating in an environment or context” (p. 425). Teachers generally have little control over students’ individual interests that have been developed over time outside of the classroom.
However, teachers have considerable control over situational interest, and can engender such interest by developing various classroom activities and experiences in which to engage students (Brophy, 1999; Mitchell, 1993). Characteristics of activities that create situational interest include active (i.e., hands-on) engagement of students, student choice, opportunities for students to work in social contexts (e.g., groups, cooperative learning), and meaningful content (Bergin, 1999; Rotgans & Schmidt, 2014; Schraw, Flowerday, & Lehman, 2001; Subramaniam, 2009). Students also perceive as interesting activities that elicit enjoyment, provide opportunities for exploration, demand high levels of cognitive engagement, and fill perceived gaps in knowledge (Rotgans & Schmidt, 2014). In terms of motivational theories, these kinds of activities also carry value for students as well as an expectation that the student will be successful (Brophy, 1983, 1999). Considering the implications of this research for issues of academic success and persistence in STEM fields, a substantial body of work has been devoted to identifying factors that increase student interest and engagement in STEM. In alignment with the findings of DBER, these studies show that pedagogy engaging students in authentic practices that afford students agency (e.g., by allowing them to pursue questions of interest) has positive influences on student affect (Bergin, 1999; Rotgans & Schmidt, 2014; Schraw et al., 2001; Subramaniam, 2009). Science teacher pedagogical decisions, in addition to affecting learning outcomes, may therefore indirectly influence student persistence in STEM.

Research questions

In order to investigate the effects of teaching and learning with digital evolution, I conducted a nationwide, multiple-case study that pursued the following questions:

1. How are biology instructors at different institutions across the United States using Avida-ED in their courses?
2. What are instructors’ educational goals and beliefs about teaching and learning science, and how do these goals and beliefs influence implementation decisions?
3. To what degree is each instructor’s implementation of Avida-ED aligned with reform-oriented pedagogical strategies?
4. To what degree does Avida-ED allow instructors to teach in ways consistent with both their personal teaching philosophies and national science education reform recommendations?
5. How does learning with Avida-ED influence student outcomes with regard to understanding and acceptance of evolution?
6. How do students respond affectively to instruction with Avida-ED, and what factors might influence this response?

Hypotheses and predictions

Avida-ED was designed to align with evidence-based best teaching practices and reform recommendations, with the intention of integrating content and authentic science practice. If implemented as intended, it could potentially be a very powerful tool for teaching evolution and the nature of science. However, student outcomes – both cognitive and affective – are likely dependent on how instructors decide to use the program, and these pedagogical decisions may in turn be influenced by instructor goals for and beliefs about student learning. If these premises are true, then the degree to which student outcomes are influenced by instruction with Avida-ED may depend upon the nature of its implementation, specifically the degree to which instructors use Avida-ED to engage students authentically in a full range of science practices, just as its developers intended.

Overview of chapters

In Chapter 2, I provide an overview of the methods used in this study. Chapter 3 addresses the question of how instructors used Avida-ED in their courses by providing detailed accounts of each implementation and descriptions of the various contexts in which they took place.
Next, in Chapter 4, I explore the influences of instructor goals and beliefs on their implementation decisions through pre- and post-implementation interviews. Chapter 5 deals with the results of an assessment given to students before and after they used Avida-ED. I report on average gains in understanding and acceptance of evolution for each case and discuss the relationship between these factors. Chapter 6 pertains to student affective response to instruction with Avida-ED, measured by their responses to a survey on their experience with using the program. I discuss the potential influence of various factors on their responses, and look for relationships between affect and assessment outcomes. In Chapter 7, I synthesize the results of each chapter, summarize what I have learned about teaching and learning with Avida-ED, and discuss the implications of my findings for reform-based teaching and learning in the sciences.

Chapter 2. Methods

Overview.
To investigate how instructors implemented Avida-ED in their courses and reveal potential influences on student learning outcomes, I used a multiple-case approach drawing from both qualitative and quantitative data sources. Each case was carefully characterized, and cross-case analyses were used to reveal emergent patterns. In this chapter, I first justify the use of my chosen approach. Next, I describe each of the data sources used in the study, providing information specific to particular chapters as necessary.

Methodology: Justification of the multiple-case approach.
I implemented a multiple-case study design (Yin, 2009) involving eleven volunteer biology instructors, teaching ten courses at eight institutions, who had agreed to use Avida-ED in their courses during the 2012 – 2013 academic year. I collected both quantitative and qualitative data from each instructor and the students in his or her course.
Using Yin’s (2009) criteria, the multiple-case approach is appropriate given the nature of the study questions and the conditions they entail: I wanted to know how Avida-ED was being implemented in classrooms and how implementation influenced student learning of fundamental concepts; I had no control over the behaviors observed in the study (what was taught and how); and I was focusing on a “contemporary phenomenon” in a “real-life context” (p. 18). In addition, I relied on a number of different data sources, including interviews, assessments, surveys, and course materials, to build each case and triangulate evidence (another characteristic common to case study research; Yin, 2009). These data sources, along with the rationale for their inclusion and methods of analysis, are described below. Finally, the goals of this study were part exploratory, part descriptive, and part explanatory. A case study design is particularly well suited to serving all of these purposes simultaneously (Yin, 2009).

Data sources and analyses.

Pre-/post-implementation interviews with instructors
I interviewed instructors using a semi-structured protocol on two occasions, before and after they implemented Avida-ED in their courses (see Appendix for interview protocols). The purposes of the initial interview were to build a friendly working relationship with each participating instructor and to learn about the case context; the instructor’s goals (including what challenges he or she typically faced when teaching science generally and evolution specifically); views on several aspects of teaching and learning; and reasons for choosing to use Avida-ED. This initial interview also focused on finalizing logistical details, such as how assessment questions would be administered to students and when the follow-up interview would take place.
In the post-implementation interview, I asked instructors to describe what they had done (in the event that the actual implementation differed from what they had planned), provide a personal assessment of success (e.g., the extent to which they were able to meet objectives and overcome challenges, perceptions of student learning outcomes and the bases for these), and share their views on the major affordances and limitations of using Avida-ED as a tool for teaching and learning about evolution and the nature of science in their particular teaching situation. In addition, instructors were asked what they had learned from the implementation and how it might affect future practice (e.g., what they would change were they to use Avida-ED in the future). Prior to the post-implementation interviews, I summarized student assessment outcomes and shared them with instructors; these also served as a point of discussion. All interviews occurred in person whenever possible, or else by phone or teleconference (e.g., Skype). All were audio-recorded, and relevant portions were transcribed. Participating instructors’ responses to semi-structured interview questions were analyzed following a grounded methodology (Glaser & Strauss, 1967) to identify emergent patterns and themes. These data were used in conjunction with other data sources (especially course materials) to characterize each course and implementation of Avida-ED (Chapter 3). I also used interview responses to address the research question of how instructor goals and beliefs about teaching and learning might have influenced implementation decisions (Chapter 4).

Pre-/post-assessment of students
I asked participating instructors to administer a common assessment (see Appendix) to their students immediately before and after any lesson sequence involving Avida-ED, so that any changes in student responses could be reasonably associated with lessons relating to the software.
In order to maintain consistency across cases, instructors agreed to refrain from altering or omitting any questions from the assessment; however, they were free to include additional items of interest or to include my questions as part of a regular assessment. Some faculty chose to administer the assessment questions digitally; this facilitated data collection, especially for large-enrollment courses. The goal of the assessment was not to show the effectiveness of Avida-ED as compared to other teaching tools, but instead to document the direction and magnitude of changes in student outcomes, and to determine which items or concepts appeared to be most influenced by instruction with Avida-ED within each particular case.

Development of the assessment instrument
The question set consisted of two components. The first component included two¹ open-ended, constructed-response items that dealt with conceptual issues. The concepts assessed – origins of genetic variation and the basic mechanism of natural selection – were chosen because they are 1) key to understanding how adaptive evolution occurs and yet associated with a number of common misconceptions (Bishop & Anderson, 1990; Gregory, 2009), 2) particularly easy to observe in Avida-ED, and 3) universal (i.e., applicable to biological and digital organisms alike). The second component of the assessment consisted of ten forced-choice items that addressed student acceptance of evolution. These questions were borrowed or modified from two instruments: the widely used MATE (Measure of Acceptance of the Theory of Evolution; Rutledge & Warden, 1999) and an unpublished questionnaire created for internal assessment of education goals at BEACON, an NSF-funded science and technology center that focuses on the study of evolution in action.
The questions I chose from these two instruments focused on three main premises: 1) evolution is a real phenomenon that has happened and continues to happen, 2) evolutionary theory provides a good explanation for the diversity of life on earth, and 3) evolution is evidence-based science. In order to reduce variability within cases and make cross-case patterns easier to see, the student data for each case were reduced to a single average content score and a single average acceptance score. These scores were then used for statistical comparisons of pre- and post-test averages for each case.

¹ The assessment had originally contained an additional two open-ended items pertaining to the nature and practices of science. However, these were problematic for a number of reasons and were ultimately omitted from the final analysis.

Scoring open-ended responses
I compared each student-constructed response to an ideal response synthesized from the relevant literature (Sadava, Hillis, Heller, & Berenbaum, 2012; University of California Museum of Paleontology, 2014; Zimmer & Emlen, 2013), and assessed each for degree of accuracy and completeness. For each ideal response, I identified one or more critical components that were applicable to evolution in both digital and biological organisms. For example, on the question of the origins of variation, there were two critical components: “mutations” and “random”. (In Avida-ED, random mutations are the only source of genomic variation. Because Avidians are haploid and asexual, it would not have made sense to include other factors, such as meiosis or fertilization, that contribute to variation in populations of diploid, sexually reproducing organisms; student learning gains with regard to those factors could not be attributed to Avida-ED.)
Student responses were assigned scores based on the accuracy and completeness of each critical component. Responses with critical components that were judged accurate and complete (i.e., well aligned with the ideal response) were given 2 points, the maximum possible score. Responses with critical components that were mostly accurate but incomplete (i.e., emerging understanding) were given 1 point. Responses that were ambiguous, incorrect, or missing the relevant critical component were given 0 points. Student scores were interpreted as a percentage of the ideal response, calculated as the ratio of points assigned to points possible. Student scores for each question were pooled (total points assigned divided by 10, the maximum number of points possible across both questions), and mean scores were calculated to arrive at the average content score for each case. These were compared statistically from pre- to post-assessment using paired Student’s t-tests.

Below are the questions and ideal responses; also included are examples of student responses to illustrate how they were scored against the rubric (Table 1; for the complete scoring rubric, see the Appendix):

Question 1. Explain how variation arises in a population.
Ideal response: All variation at the population level ultimately arises from random mutations caused by errors during genome replication in individual organisms.

Question 2. Imagine that a new life form was just discovered on another planet. It is not made up of cells nor does it contain DNA. What characteristics of this life form would be necessary in order for it to evolve? Explain your reasoning.
Ideal response: In order to evolve, the organism must satisfy three criteria: it must have some form of code or information that is copied/replicated and passed to offspring; variation in the population must be caused by random mutations or changes to the code during replication; and selection must act on individuals so that those possessing certain traits are able to survive and reproduce better than competitors in a given environment.

Inter-rater reliability
Two hundred constructed-response items (twenty per case; one hundred pre- and one hundred post-test) were dual-coded by two independent raters (I was one). I trained a colleague to code items according to the rubric. I then determined inter-rater reliability for each of the five critical components (mutation, random, inheritance, variation, and selection) using both percent agreement and Cohen’s kappa (Cohen, 1960). Initially, reliability for “inheritance” was low; we discussed the rubric for clarification and re-coded these items. Ultimately, all five critical components had a percent agreement of over 80%, and kappa values fell within the “substantial” or “almost perfect” ranges, using the benchmarks for interpretation proposed by Landis and Koch (1977). The inter-rater reliability outcomes are summarized in Table 2.

Table 1. Examples of student constructed responses and scoring against rubric.

Question 1: Explain how variation arises in a population.

Student Response | Mutation | Random | Total | % Ideal Response
"Variation arises due to the appearance of recessive traits when the right genetic makeup is achieved." | 0 | 0 | 0 | 0%
"Genetic mutations occur in an individual which then may or may not pass on to offspring." | 2 | 0 | 2 | 50%
"Variation arises through random mutations or natural selection." | 2 | 2 | 4 | 100%

Question 2: Imagine that a new life form was just discovered on another planet. It is not made up of cells nor does it contain DNA. What characteristics of this life form would be necessary in order for it to evolve? Explain your reasoning.

Student Response | Inheritance | Variation | Selection | Total | % Ideal Response
"The life form would have to be made up of some kind of matter. If the matter has the ability to change, the life form can evolve." | 0 | 0 | 0 | 0 | 0%
"Reproduction that allows for a mixing of genetic material so that weak and strong characteristics can emerge and natural selection can occur" | 1 | 0 | 1 | 2 | 33%
"Genetic information that is passed on. Reproduces. Mutations occur. Respond to environment." | 2 | 1 | 1 | 4 | 67%
"It would need a replication system that isn't perfect, or random changes in whatever code it does use; this would allow for variation. It would also need to have some forms more fit for the environment than others in order to select for good or bad mutations. These two things would be enough for evolution: changes occur that make some things better/worse, and the environment is harsh and weeds out the worse ones." | 2 | 2 | 2 | 6 | 100%

Table 2. Summary of inter-rater reliability outcomes. Pr(a) is equivalent to percent agreement between coders.

Critical Component | Pr(a) | Cohen's Kappa | Interpretation
Mutation | 0.96 | 0.89 | Almost Perfect
Random | 0.92 | 0.80 | Substantial
Inheritance | 0.82 | 0.73 | Substantial
Variation | 0.88 | 0.78 | Substantial
Selection | 0.81 | 0.64 | Substantial

Scoring forced-choice responses
The ten forced-choice items are listed in Table 3. Student acceptance scores were calculated in the manner of the MATE instrument (Rutledge & Warden, 1999): response categories were assigned a numeric value (Strongly Disagree = 1; Strongly Agree = 5). The scale was inverted for the negatively worded items (questions 3, 4, and 6) for consistency in scoring. The sum across all ten items was calculated and doubled to produce an acceptance score that ranged from 20 to 100. The average student acceptance score was then calculated for each case.
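The forced-choice scoring procedure just described can be sketched in code. This is an illustrative sketch, not the instrument's official implementation; the function names, and the assumption that responses arrive as integer Likert codes (1–5) in item order 1–10, are mine. The interpretation bands follow the Rutledge (1996) criteria reproduced later in Table 4.

```python
# Sketch of the MATE-style acceptance-score calculation described above.
# Assumption (mine): responses are integer Likert codes 1-5
# (Strongly Disagree = 1 ... Strongly Agree = 5), given in item order 1-10.

NEGATIVELY_WORDED = {3, 4, 6}  # items whose scale is inverted before summing


def acceptance_score(responses):
    """Return an acceptance score in the range 20-100 for one student."""
    if len(responses) != 10:
        raise ValueError("expected responses to all ten items")
    total = 0
    for item, value in enumerate(responses, start=1):
        if not 1 <= value <= 5:
            raise ValueError("Likert codes must be integers 1-5")
        if item in NEGATIVELY_WORDED:
            value = 6 - value  # invert the scale: 5 -> 1, ..., 1 -> 5
        total += value
    return total * 2  # raw sum spans 10-50; doubling yields 20-100


def interpret(score):
    """Map a score onto the interpretation bands from Rutledge (1996)."""
    if score >= 89:
        return "Very High"
    if score >= 77:
        return "High"
    if score >= 65:
        return "Moderate"
    if score >= 53:
        return "Low"
    return "Very Low"
```

For example, a student who strongly agrees with every positively worded item and strongly disagrees with items 3, 4, and 6 receives the maximum score of 100 ("Very High").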
Rutledge (1996) developed the following criteria for interpreting scores on the MATE instrument, and I used these as a very general guide to interpreting the student acceptance scores in this study (Table 4). Pre- and post-implementation mean acceptance scores for each case were then compared using paired Student’s t-tests.

Content vs. acceptance
To determine whether there was a relationship between student learning of content and acceptance outcomes, normalized gains (g-avg; Hake, 2002) were calculated from each student’s pre- and post-assessment scores and then averaged for each case. Pearson’s correlation coefficient was then calculated to determine the degree of linear correlation between these two variables.

Table 3. Student acceptance of evolution was assessed with the following questions. Items were scored according to a 5-point Likert scale. Some questions were borrowed or modified from the Measure of Acceptance of the Theory of Evolution (MATE) instrument (Rutledge & Warden, 1999). Other questions, from an unpublished survey by Mead and Libarkin, were written in the style of the MATE.

Question | Source | Major Concept(s) Addressed
1. Organisms existing today are the result of evolutionary processes that have occurred over millions of years. | Rutledge & Warden (MATE) | Evolution happens; Evolution is explanatory
2. Evolution is a process that is happening right now. | Mead & Libarkin (unpublished survey) | Evolution happens
3. Evolution cannot ever be observed because it happens over very long periods of time. | Mead & Libarkin | Evolution happens; Evolution is evidence-based
4. Evolutionary biology generally does not investigate testable ideas about the natural world. | Mead & Libarkin | Evolution is evidence-based
5. Evolutionary biology relies on evidence to make claims about the natural world. | Mead & Libarkin | Evolution is evidence-based
6. The available data are ambiguous (unclear) as to whether evolution actually occurs. | Rutledge & Warden | Evolution happens; Evolution is evidence-based
7. Evolution can explain changes in populations of species over time. | Mead & Libarkin | Evolution is explanatory
8. Evolutionary theory is supported by factual, historical and laboratory data. | Rutledge & Warden | Evolution is evidence-based
9. Computer programs can create instances of evolution (within a computational environment). | Mead & Libarkin | Evolution happens
10. Evolution is a scientifically valid theory. | Rutledge & Warden | Evolution is explanatory; Evolution is evidence-based

Table 4. Interpretation of acceptance scores.

Acceptance Score | Interpretation
89 – 100 | Very High
77 – 88 | High
65 – 76 | Moderate
53 – 64 | Low
20 – 52 | Very Low

Post-implementation survey of students
In addition to the post-assessment, I asked instructors to administer a short survey to their students subsequent to lessons involving Avida-ED to elicit student feedback on their experiences with using the tool (see Appendix). All survey items were measured on a five-point Likert scale, with the low end of the scale corresponding to negative impressions and the high end to positive or favorable impressions. Items addressed student attentiveness and participation during activities involving Avida-ED, as well as interest and enjoyment, the degree to which Avida-ED helped increase student understanding of evolution and of the nature of science, the degree to which students felt comfortable discussing the subject of evolution, and intentions regarding continued use of the program or sharing with others. There were also a number of items asking students to rate how useful they found various instructional resources made available to them (a video tutorial on YouTube, lessons and projects, in-class demonstrations, conversations with peers and instructors, and the Avida-ED user’s manual).
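The "Content vs. acceptance" computation described above can be sketched as follows. This is a minimal sketch under stated assumptions: the per-student gain formula g = (post − pre) / (max − pre) is one common form of the normalized gain associated with Hake; the helper names, the default maximum score of 100, and the convention of treating a perfect pre-score as zero gain are mine, not the dissertation's.

```python
def normalized_gain(pre, post, max_score=100.0):
    """Normalized gain g = (post - pre) / (max_score - pre) for one student."""
    if pre >= max_score:
        return 0.0  # no room to improve; a convention, not from the source
    return (post - pre) / (max_score - pre)


def case_average_gain(pre_scores, post_scores):
    """Average the per-student gains to get one value per case."""
    gains = [normalized_gain(pre, post)
             for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)


def pearson_r(xs, ys):
    """Pearson's correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)
```

With one average content gain and one average acceptance gain per case, `pearson_r` applied to the two lists of case averages yields the linear correlation between them.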
These data were intended to help gauge student interest and engagement with the program (participant responsiveness; Durlak & DuPre, 2008), and to provide useful feedback to the Avida-ED Curriculum Development team for improving the software and associated curriculum materials. For this reason, the survey initially also included four open-ended items that asked students to comment on what aspects they liked most and least about Avida-ED, what they found most difficult, and what they felt they learned from using the program; these open-ended responses were not included in the final analysis. Surveys were returned for seven of the ten cases. Like the student assessment data, survey data were summarized and shared with instructors prior to exit interviews. In a manner similar to the forced-choice items on the assessment, I converted the Likert-scale items on the survey to numeric scores ranging from 1 (most negative response) to 5 (most positive response), and calculated mean scores for each item. These values were then used to create a correlation matrix showing associations between survey items and other variables, including instructor familiarity with Avida-ED and assessment data (Chapter 6).

Course materials
I asked each instructor to volunteer any course materials they felt were relevant, including syllabi, assignments, lecture notes and presentations, instructor plans, worksheets or handouts, and assessments. Some instructors additionally provided artifacts produced by the students during their interaction with Avida-ED (e.g., papers, presentations, homework, research notebooks). I used these materials to provide additional detail with regard to the nature of the courses, including where Avida-ED fit into the overall context of each class and how it was being implemented (Chapter 3). The amount of material provided by instructors varied tremendously from case to case; at minimum, instructors provided a syllabus for the course.
Cross-case analysis
As mentioned previously, the degree of variability from case to case was very high. Cases differed in institution size and type; course size, type, and level; and instructor teaching experience and familiarity with Avida-ED. Therefore, cases could not be compared directly, nor could data from individual cases be pooled. However, by using cross-case analytic techniques I was able to look for patterns and emergent themes across cases (Yin, 2009), for example, by comparing trends in assessment and survey outcomes across different case groupings (e.g., lower- versus upper-division courses).

Characterization of implementations
To accurately characterize each case, I drew from several data sources, including semi-structured instructor interviews and other exchanges (e.g., over e-mail), course syllabi, and other course materials, including lecture presentations and assignments. These data were used in each case to establish the various contexts (institutional, course, and instructor) in which the implementation occurred, and to develop a detailed description, or vignette, of the implementation itself. After the cases had been described, I characterized each implementation using a set of objective criteria, adapted from the research on implementation, to reveal potential patterns that might be associated with student outcomes. These criteria are described below, and the implementation rubric worksheet can be found in the Appendix.

Durlak and DuPre (2008) define fidelity as the degree to which an innovation has been faithfully implemented as its creators intended. Avida-ED was created with the intention of bringing evolution in action into the classroom while simultaneously integrating science content and practices (Pennock, 2007a). Because it is an instance of evolution rather than a simulation, Avida-ED allows students to observe the evolution of populations directly (Pennock, 2007b).
In addition, its original incarnation as a research platform means that students use Avida-ED as a digital laboratory space to engage in authentic science practices. If the program is being implemented faithfully, these two features must, at bare minimum, be present. That is, students must observe evolution in action and engage in research practices as they learn evolution content. With any implementation of an innovation, there is a risk of inappropriate assimilation – of using the innovation in ways that do not reflect the developers’ vision (Henderson & Dancy, 2011). Avida-ED would be inappropriately assimilated if instructors reduced or eliminated the ability for students to engage in inquiry by over-directing (i.e., creating “cookbook” or confirmatory exercises). Instructors who adapted the tool in this way would not be remaining faithful to its original intention. A faithful implementation of Avida-ED would involve engaging students in a wide range of authentic practices that were primarily learner-driven. To assess the range of inquiry practices in each implementation, I identified which of the eight science and engineering practices listed in the Next Generation Science Standards were included as part of each implementation.

The duration of an implementation influences its effectiveness. This is likely to hold true for Avida-ED as well. As with any new tool, there is a learning curve. Students who spend more time working with Avida-ED are likely to develop a better understanding of the software and will more easily engage with it in substantive ways. In addition, longer durations likely allow for participation in a greater range of inquiry practices. However, it is important to note that the duration of an implementation and the amount of exposure students have to an innovation are not necessarily the same, as ultimately the latter will depend on how much time students actually spend working with the tool.
In addition to duration, it is important to note the degree to which instructors support their students’ learning during an implementation. Instructional support in the context of Avida-ED might include activities, demonstrations, and tutorials meant to facilitate understanding and use of the tool. It may also include access to additional resources, such as the Avida-ED website and user’s manual, instructional videos, readings, and anything else that might help students as they interact with Avida-ED. Though not directly related to duration per se, the amount of support provided by the instructor is likely to be positively associated with the overall amount of time students spend working with Avida-ED, either directly or indirectly, and thus with the richness of the implementation.

Chapter 3. Case and implementation descriptions

Overview of cases.
The ten cases examined in this study cover a broad range of institution sizes and types, from small, private liberal arts colleges to very large, research-intensive universities (Table 5). Each case consists of one course taught during a single semester (fall 2012 or spring 2013). The courses differ in size, type, and level – i.e., lower division (100-200 level, serving primarily undergraduate freshmen and sophomores; due to the nature of course content, AP Biology is included within this designation) and upper division (300-400 level, serving primarily juniors and seniors; Table 6). Cases also differed in the amount of instructor experience and expertise, both generally and with regard to Avida-ED specifically (Table 7).

Table 5. List of participating institutions, characterized by size (estimated student population²) and Carnegie classification data (if applicable).

Institution Code | Level | Control | Student Pop'n (est.) | Carnegie Classification (Size and Setting) | Carnegie Classification (Basic)
A | High school | Private | 500 | N/A | N/A
B | 4 year | Private | 1,650 | S4/HR: Small four-year, highly residential | Bac/A&S: Baccalaureate Colleges–Arts & Sciences
C | 4 year | Public | 10,500 | M4/R: Medium four-year, primarily residential | DRU: Doctoral/Research Universities
D | 4 year | Public | 30,500 | L4/NR: Large four-year, primarily nonresidential | RU/H: Research Universities (high research activity)
E | 4 year | Public | 34,750 | L4/R: Large four-year, primarily residential | RU/VH: Research Universities (very high research activity)
F | 4 year | Public | 43,000 | L4/NR: Large four-year, primarily nonresidential | RU/VH: Research Universities (very high research activity)
G | 4 year | Public | 48,000 | L4/R: Large four-year, primarily residential | RU/VH: Research Universities (very high research activity)
H | 4 year | Public | 51,000 | L4/NR: Large four-year, primarily nonresidential | RU/VH: Research Universities (very high research activity)

² Rounded values are based on average student enrollments for the 2012-2013 academic year, determined from information made publicly available on institution websites.

Table 6. Case summaries. Case codes are designated by institution code (see Table 5) and course level/type. Class levels are designated as Lower (AP, 100-, or 200-level) or Upper (300- or 400-level). Only students taking both the pre- and post-assessment were included in data analyses; therefore, the number of students enrolled in each course may actually be greater than what is reported here.

Case Code | Class level | Course type | Lecture/Lab | Major/Non-Major | N (matched pre/post)
A_APBio | Lower | Advanced Placement Biology (High School) | Combined | N/A | 17
B_300Evo | Upper | Evolution | Lecture | Major | 9
C_400Evo | Upper | Evolution | Lecture | Major | 15
D_100Evo | Lower | Evolution | Lecture | Major | 12
E_200Bio_HC | Lower | Biology (Honors College) | Combined | Non-Major | 30
F_400Evo | Upper | Evolution | Combined | Major | 33
G_100BioLabA | Lower | Biology | Lab | Major | 153
G_100BioLabB | Lower | Biology | Lab | Major | 234
G_100BioRes | Lower | Biology (Residential College) | Combined | Major | 101
H_100CompBio | Lower | Computational Biology | Combined | Major | 24

Table 7. Instructor profiles.

Case Code | Instructor Code | Position | Familiarity with Avida-ED
A_APBio | YN | HS Teacher | Expert
B_300Evo | NN | Professor | Experienced
C_400Evo | TN | Assistant Professor | Novice
D_100Evo | JO | Senior Lecturer (tenured) | Novice
E_200BioHC | HD | Postdoctoral Fellow | Experienced
F_400Evo | NR | Associate Professor | Expert
G_100BioLabA | KA | Visiting Assistant Professor | Experienced
G_100BioLabB | AE | Coordinator | Experienced
G_100BioRes | RY/AR | Postdoctoral Fellows | Novice
H_100CompBio | TT | Instructor (non-tenure) | Expert

Case descriptions.
Each of the following case descriptions provides a brief summary of how instructors implemented Avida-ED in their classrooms, including the contexts (institutional, course, and instructor) in which it was introduced. Due to variation in the materials received from instructors, the descriptions vary slightly in depth of detail.

Case A_APBio: Advanced Placement Biology

Institutional context
Institution A is a private, co-ed Catholic high school situated in an urban neighborhood in a Midwestern capital city. The total enrollment at the school is approximately 500 students and includes grades 9 – 12. Students come from primarily affluent families; parents pay tuition, and the school does not offer a reduced or free lunch program. However, 30% of families receive financial assistance in the form of grants and scholarships.
The curriculum is college preparatory, with the vast majority of students graduating and continuing on to college.

Course context
Advanced Placement courses are offered to high-achieving high school students. The course content for AP classes is not determined by the individual teachers, but is instead established by the College Board to reflect the topics typically covered in introductory college courses. As such, the AP Biology course (College Board, 2011) is designed to be the equivalent of a two-semester college introductory biology course usually taken by biology majors during their first year. Students take a standardized assessment at the end of the year, and universities grant course credit to those with qualifying scores, essentially waiving introductory course requirements. AP Biology is offered to students at Institution A in their junior year. It is the first biology class they have taken in high school, having completed physical science and chemistry in their freshman and sophomore years, respectively. The major goals of the course are to develop critical thinking and problem-solving skills, appreciate science as a process and engage in science practices, and become familiar with key biological concepts. During the 2012-2013 academic year, eighteen students were enrolled in AP Biology, and all but one completed the pre- and post-assessment and user’s survey.

Instructor context
The teacher, YN, had been employed at the school for six years, teaching the regular and AP biology courses each year. YN is a very reform-minded teacher and has been active in taking up and implementing the new AP Biology standards, including becoming a peer leader through the AP Biology Leadership Academy, run jointly by the Biological Sciences Curriculum Study (BSCS) and the National Association of Biology Teachers (NABT). In addition, she participated in a Research Experiences for Teachers (RET) program for two consecutive summers prior to this study.
During these experiences, she became a member of the Avida-ED Curriculum Development team and developed curricular materials for use with Avida-ED that she has since used in her own classroom. These materials include the lesson she used during this study; this lesson also served as the basis for her master’s thesis and as the pilot for the current study. She has presented this work at several conferences (local, national, and international). She recently left her position at Institution A to pursue a doctoral degree in science education at a large Midwestern university. Because YN has used Avida-ED extensively through her own research and teaching, she is considered an expert user.

Description of implementation

YN taught her Advanced Placement high school biology class during the 2012-2013 school year. As this was her sixth year teaching the course at Institution A, she knew the material well and already had a well-developed plan for the entire year. This plan had changed over the years, especially recently, as the AP Biology Framework had just been revised (College Board, 2011). YN was very familiar with the changes to the Framework through her participation in the AP Biology Leadership Academy. This would be YN’s second time implementing Avida-ED in her classroom. She had used it in both her AP and regular junior-year biology courses the previous year, after having spent the preceding summer developing curricular materials with the Avida-ED Curriculum Development team for a Research Experience for Teachers program. She had had great success during that implementation and used the results from her assessment study as her master’s thesis in science education. She plans to submit that study for publication. YN’s goals for her students included providing them with a historical context for the theory of evolution so that they could see how scientists’ thinking has changed over time in light of new evidence.
She also wanted students to appreciate evolution as the unifying theme in biology, and as such introduced it very early (in the third week) and continuously framed concepts from an evolutionary perspective. She started by introducing students to writings by Paley, Lamarck, and Darwin to show how people have accounted for complex biological features and the Earth’s biodiversity in the past. She also used these readings as an opportunity to introduce the students to the concepts of observation and inference in science, the nature of empirical evidence, and what it means to build an argument based on evidence in order to distinguish between scientific and non-scientific explanations for natural phenomena. To show students firsthand how populations of organisms change over time and to relate these observations to Darwin’s discoveries, YN engaged her students in an artificial selection experiment involving a non-virulent strain of E. coli. In this investigation, students soaked small circles cut from filter paper in various antibiotic substances, then placed these in Petri dishes containing growth medium that had been inoculated with bacteria. Each day the students examined their plates for signs of bacterial growth and measured the size of the “zone of inhibition,” the clear area around the antibiotic disc that was apparently free of bacterial colonies. They would then swab this zone and use the bacteria closest to it to inoculate a fresh Petri dish with the same kind of antibiotic disc. Over the course of a month, the students observed that the zones of inhibition became smaller as increasingly resistant bacteria were serially selected and transferred to new dishes, where they were allowed to reproduce. Once the students had completed these experiments, they wrote a lab report to document their results.
YN gave them an assessment that included questions on what was happening in the bacteria that allowed them to grow closer to the antibiotic disc over time. This assessment served as a pretest that would be used as the basis for comparison to a post-test that students would take after they had experimented with Avida-ED (it also contained the questions from the assessment instrument used in this study). The Avida-ED lab began in mid-February, several months after the students had completed the bacterial artificial selection lab. The students spent a total of two weeks working with Avida-ED. During the first week (five class sessions, Monday through Friday, 50 minutes each), they learned how to use the software while simultaneously addressing concepts related to microevolution, following a protocol that YN had designed. The Avida-ED tutorial was explicitly meant to teach the students how to use the software, engage them in authentic science practices, and teach fundamental evolutionary concepts. The tutorial began with the reading of an article from Discover magazine on Avida, the research version of the software (Zimmer, 2005), to introduce students to the general idea of digital evolution and allow them to see how the software they would use is also used by scientists in research settings. The students then began working through the tutorial, which posed questions to guide their thinking as they engaged in experiments of increasing technical and conceptual complexity. For example, the first activity was simply a tour of the various view panes (Organism, Population, and Analysis); students learned about dragging saved organisms from the “freezer” into the experimental area, then watched the organism divide to produce a population of Avidians. The next activity had students work in the Organism Viewer to learn about the Avidian genome and experiment with replicating organisms at different mutation rates.
Other activities involved growing populations of organisms under various environmental parameters, examining the genomes of evolved organisms, and competing organisms against one another. Each activity was guided by a central big idea and walked students through experimental procedures so that they became accustomed to using the software and gained a sense of its capabilities. After spending a week working through the tutorial in class, the students were ready to pursue a more independent project. YN assigned an open-ended artificial selection experiment that she had helped to design with the Avida-ED Curriculum Development team. In this activity, the students were to develop a protocol for evolving organisms that could efficiently utilize a particular environmental resource. The lesson involved a realistic problem to show more concretely how digital systems can be used to model authentic real-world scenarios. From the lesson handout:

Design Challenge Scenario

Your school wants to buy an adjacent piece of property to build new athletic facilities. The property includes a large warehouse that has been used for various industrial purposes over the last fifty years. During the site inspection it was found that the soil and water around the warehouse have been contaminated by trichloroethylene (TCE), a hazardous chemical used as a spot remover in dry cleaning and as a degreaser for metal parts. The school board has asked our class to team up with an environmental consulting company to help clean up the TCE so that our school can move ahead with purchasing and using the land. The environmental consulting company has informed us that current methods to remove TCE are expensive and require that contaminated soil be removed and disposed of in hazardous waste dumps. The company is interested in spiking the soil with bacteria that biodegrade (break down) TCE.
The class’s goal is to use a model system (Avida-ED) to evolve a bacterial strain that can biodegrade TCE and present a protocol to the company. Your goal is to convince the company (using data) that your protocol is likely to lead to an efficient means of evolving TCE-degrading bacteria. They can then use your recommendations to produce the bacteria that will allow the soil to be cleaned up on-site instead of dumped into a hazardous waste landfill.

For this experiment, the students were to work in teams to first hypothesize a protocol for evolving organisms that could efficiently perform the “oro” function (an analogue for the hypothetical enzyme that can break down TCE), as measured by population fitness in an environment containing only the resource “orose,” and then design an experiment to test that hypothesis. Once the teams had conducted their experiments and collected data, they wrote up their results in a lab report and presented them to the class. The students were given one week to complete this activity in their teams outside of class. After the students had finished the project with Avida-ED, YN administered the same assessment that she had given earlier in the year, after the bacterial experiment. She had purposefully avoided mentioning the bacterial experiment during the time the students had been using Avida-ED, so that any gains the students made with regard to the questions on what was happening in the bacteria might be attributed to their engagement with the digital system. The assessments revealed that the students had made tremendous gains (see Chapter 5), and YN concluded that using Avida-ED allowed students to understand the mechanisms that accounted for the population-level changes they had observed with the bacteria. That is, working with the digital system helped the students to understand what was happening in the biological system.
Both YN and the students repeatedly referenced Avida-ED during discussions over the remainder of the school year.

Summary

YN’s implementation of Avida-ED was extensive and multi-faceted. She engaged the students in authentic science practices while they learned about a number of concepts. In addition, she connected the digital evolution experiment to a similar experiment with biological organisms. Her implementation of Avida-ED was therefore faithful to the developers’ original vision. The implementation lasted two weeks, with 250 minutes of class time devoted to teaching students how the system worked and another week in which the students worked outside of class on their projects. The projects themselves were of very high quality; although YN established the problem, the students were free to develop and test their own hypotheses using Avida-ED. The instructional supports YN provided, which included the bacterial experiment, the Zimmer reading, in-class demonstrations, and the tutorial, were appropriate given the level of her students and helped prepare them for the independent project.

Case B_300Evo: Upper-division Evolution

Institutional context

Institution B is a private liberal arts college in the Midwestern United States. With an average population of about 1,650 students, the college is small but quite prestigious and known for its rigorous academics. For the past three decades, U.S. News and World Report has consistently ranked it among the top 25 liberal arts colleges in the United States.

Course context

The 300-level evolution course is offered to students who are majoring in biology and have completed a one-semester introduction to biological inquiry course and two semesters of introductory biology.
The course focuses on mechanisms of evolutionary change at micro- and macroevolutionary scales, investigating topics such as the origin and maintenance of genetic variation, population structure and speciation, systematic methods, adaptation, molecular evolution, and macroevolution, as well as considering applications of evolutionary theory to conservation biology, medicine, and human diversity. All nine students enrolled in the course during the spring 2013 semester completed the pre- and post-assessment and the user’s survey.

Instructor context

NN has been part of the biology faculty at Institution B since 1995 and was promoted to full Professor in 2008. His faculty duties consist primarily of teaching, and he has instructed several courses at the college, including Introductory Biology, Animal Behavior, and History of Biological Thought. He has taught the evolution course for nearly 20 years. NN began using Avida-ED in the course about two years ago and is considered an experienced user.

Description of implementation

Because NN had been teaching the evolution course for so long, he had a clear plan for meeting his well-defined goals, which included using Avida-ED to ease students into the practices of science. NN preferred to use Avida-ED in his undergraduate evolution course as an opportunity both to illustrate the basics of evolutionary theory (particularly those related to microevolutionary mechanisms and the role of mutations in population-level variation) and to introduce students to the kinds of self-directed activities he expected them to engage in during later projects (e.g., a several-week-long project on insect DNA microsatellites that culminated in a journal-style paper, and an open-ended selection experiment on bean beetles whose data were used as the basis for a mock grant proposal that accounted for the bulk of the students’ course grade).
As this was an upper-level course that would deal primarily with higher-order conceptual issues in evolution, he introduced Avida-ED early, in the third week of class. The class consisted of a lecture that met three times weekly (MWF) for 50 minutes and a laboratory section that met once a week for two hours and fifty minutes. Unlike some courses in which the concurrent laboratory section is not closely synchronized with lecture, NN structured the course so that the laboratory would allow students to practice major themes discussed in lecture each week. All work with Avida-ED was done during the laboratory period. Students worked in groups of two or three. Avida-ED was introduced and NN gave a brief demonstration based on the tutorial materials he found at the Avida-ED website. He also made available as a link on the course website a YouTube tutorial (created by the Avida-ED Curriculum Development team) so that students could refer back to it as they worked on their projects outside of class. Students then worked with their groups to familiarize themselves with Avida-ED, and by the end of class each group had to propose a potential question that it would spend the next two weeks investigating outside of class. The next week in laboratory, each student group presented its idea for investigation with Avida-ED. NN’s instructions to the students read as follows:

This will be an informal (ungraded) presentation; you should introduce (1) the question your study will address (and why it is interesting/relevant), (2) your design for the experiment, and (3) how you will analyze the data. You will probably want to do some “test runs” to make sure your plans are feasible. Formal presentations of the project will occur next week during lab.

The following week, students presented their work to one another in their groups. NN video-recorded each group presentation in order to provide detailed feedback later on.
Each group produced a PowerPoint presentation and followed a typical conference talk format. Student presentations were required to have an introductory slide with some background on the problem, a well-formed hypothesis, a brief description of their experimental protocol, a few slides showing results (including appropriate graphical representations of data and statistical analyses), and a discussion. Examples of student projects included the effects of mutation rate and environmental variability on fitness, and the effect of bottleneck events on fitness. The oral presentation was worth 15% of their grade for the course.

Summary

NN’s implementation was true to vision: his students learned concepts that would prepare them for the higher-order conceptual thinking that would take place later in the upper-division course, and they engaged in a full inquiry cycle, which included collaborating in a group, generating questions and hypotheses, developing a protocol for testing them with Avida-ED, collecting and analyzing data, and communicating their results both visually and orally. NN devoted a fair amount of class time to activities involving Avida-ED, including an entire three-hour lab period in which he demonstrated Avida-ED and had students work through a tutorial, as well as time during subsequent lab periods for the groups to present their proposals and results. In addition, the students spent two weeks outside of class working on their projects. NN provided his students with additional resources, including access to the video tutorial of Avida-ED that they could reference as they worked on their projects.

Case C_400Evo: Upper-division Evolution

Institutional context

Institution C is a medium-sized public university located in the Southeastern United States. It is recognized as a historically black university (HBCU) and is one of the nation’s most important producers of minorities with degrees in STEM fields.
Course context

The 400-level course in evolutionary biology is offered to students majoring in biology. Prerequisite courses include a semester each of general biology, zoology, and genetics. The course covers the basics of evolutionary theory, including the origins of genetic variation and the role of inheritance and natural selection in producing biological diversity. Students have the option of taking a more advanced course in evolutionary concepts once they have completed this introductory course. Of the 22 students enrolled in the course during the spring semester of 2013, 15 completed the pre- and post-assessment. The instructor opted not to administer the user’s survey to his students.

Instructor context

TN is a new Assistant Professor in biology, and this was his first time teaching an evolution course. Prior to this position, he spent two years as a Lecturer in the School of Medicine at an urban, public research university in the Midwest, and before that spent two years as a Postdoctoral Fellow, first researching biostatistics computation and later cancer proteomics. He holds advanced degrees in Computer Science and in Microbiology and Molecular Genetics. This was TN’s first time using Avida-ED. He is considered a novice user.

Description of implementation

TN had never before used Avida-ED, so when two members of the Curriculum Development team (a colleague and I) offered to visit the university to present a campus-wide seminar on digital evolution and to introduce Avida-ED to his class, he gladly accepted. We worked together to develop a lesson plan based on his goals for his students. TN decided to use materials that had been developed by the Curriculum Development team. Our plan involved his administering the pre-test to his students and having them download Avida-ED onto their computers (or onto the Biology Department’s computers) prior to our arrival.
The students were also to have watched the introductory YouTube video on Avida-ED prior to our visit. TN administered the pre-test assessment to his students on March 12, 2013. My colleague and I travelled to the university from March 13-15, presenting the introduction to TN’s class on the afternoon of March 14 and giving our lecture on digital evolution that evening. During our visit to TN’s classroom, my colleague gave a ten-minute introductory talk about digital evolution. I then led the students in several guided-inquiry activities designed to illustrate some key, fundamental concepts while simultaneously familiarizing the students with Avida-ED. The first activity involved having the students replicate individual Avidians in the Organism Viewer at different mutation rates, first at 0% and then at 10%. Following the Predict-Observe-Explain (POE) model, the students were asked to predict what would happen at each mutation rate, then perform several replicates at each of the two mutation rates and explain their observations. The goal of this lesson was to have all students observe that mutations are random, and that a given mutation rate is defined as the probability that any given locus in the genome will change during a replication event. A second exercise explored the mechanism of natural selection by investigating the relationships between variation, inheritance, and selection. The exercise consisted of three mini-experiments with different parameters that would allow students to make inferences about selective pressures. In the first experiment, students started a new population by dragging the ancestor organism into the virtual Petri dish and replicating at a 0% mutation rate with all resources available. This set of parameters results in a completely homogeneous population, illustrating that there can be no variation without mutation (and thus no evolution) regardless of environmental conditions.
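The per-locus sense of “mutation rate” in the first activity, and the homogeneous population produced by the 0% run, can both be illustrated with a short simulation. This is a minimal sketch under stated assumptions: the 26-letter alphabet, the genome length, and the copy routine are stand-ins of my own, not Avida-ED’s actual instruction set or replication machinery.

```python
import random

ALPHABET = "abcdefghijklmnopqrstuvwxyz"  # arbitrary stand-in for an instruction set

def replicate(genome, mu, rng):
    """Copy a genome, rewriting each locus with probability mu
    (mu plays the role of the per-site mutation rate in the activity)."""
    return "".join(
        rng.choice(ALPHABET) if rng.random() < mu else locus
        for locus in genome
    )

rng = random.Random(1)
ancestor = "a" * 50

# At a 0% mutation rate every offspring is an exact copy, so the
# population stays homogeneous: no variation, hence no evolution.
offspring_0 = [replicate(ancestor, 0.0, rng) for _ in range(100)]
print(len(set(offspring_0)))  # prints 1: a single genotype

# At 10%, each of the 50 loci is rewritten with probability 0.1, so a
# copy differs from the ancestor at roughly five loci on average
# (slightly fewer, since a rewrite can draw the same letter back).
offspring_10 = [replicate(ancestor, 0.10, rng) for _ in range(100)]
diffs = [sum(a != b for a, b in zip(ancestor, g)) for g in offspring_10]
print(sum(diffs) / len(diffs))
```

The randomness the students were meant to observe shows up in the spread of `diffs`: each copy differs at a different, unpredictable set of loci, even though the average number of changes is predictable from the rate.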
In the second experiment, the mutation rate was to be increased to 2% and the resources limited to show the effect of environmental resource availability as a selection pressure. For the final experiment, the mutation rate was to remain at 2% with all environmental resources turned off. This would have allowed students to observe a different kind of selection pressure, on gestation time rather than metabolic rate. Unfortunately, due to technical difficulties, and because many students had not yet downloaded the program, I was only able to complete the first activity (on mutation rates) with the students. Several weeks later (April 9), wanting to be sure that the students had been properly introduced to Avida-ED, TN repeated the inquiry activities with them. He prepared a PowerPoint presentation and walked them through the activities step by step, using screen-captured images to point out very explicitly the different features of the program and what parameters to enter. He asked the students to share their results, and the class briefly discussed the outcomes together. After this second overview of Avida-ED, TN introduced a third activity, also created by the Curriculum Development team, for the students to complete as homework on their own outside of class. This exercise involved investigating the effects of increasing mutation rates on the genomes of individual organisms. Known colloquially as “the butterfly lesson” (Lark, Richmond, & Pennock, 2014), it has students use Avida-ED to model a study conducted on irradiated butterflies in Japan (Hiyama et al., 2012). The Great East Japan Earthquake of 2011 caused an enormous tsunami that damaged the Fukushima Daiichi Nuclear Power Plant complex, resulting in the meltdown of several reactors and the release of radioactive particles.
This accident gave researchers the unique opportunity to investigate the effects of irradiation on organisms by collecting butterflies at increasing distances from the damaged reactors. They found that ground radiation dose (which decreased with distance from the reactors and served as a proxy for mutation rate) was positively correlated with the frequency of phenotypic abnormalities in the butterflies. The students used Avida-ED, again following a POE strategy, to test the authors’ claim that the increase in abnormality rate was caused by greater numbers of random mutations due to radiation exposure. The students worked in groups to conduct their experiments, and afterward all students were required to submit a project report. For most students, the butterfly lesson was their last interaction with Avida-ED. TN administered the post-assessment to the entire class after the project reports had been submitted. A few students chose to use Avida-ED for one of three required independent class projects. These students designed their own studies in Avida-ED to pursue questions of their choosing. They turned in their reports on the last day of class, and TN asked them to take the assessment once more. However, these students left much of the assessment blank, possibly due to test fatigue, and as a result these data were not included in any analysis.

Summary

TN faced several challenges in his implementation of Avida-ED, perhaps related primarily to his status as a novice (as a new faculty member teaching a new class, in addition to his unfamiliarity with Avida-ED). Although he used the software to introduce students to fundamental evolutionary concepts via engagement in practice, his introduction was highly guided, consisting of a step-by-step walk-through of the lesson materials.
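As an aside on the butterfly lesson itself: the trend it asks students to test, that a higher mutation rate produces a higher frequency of abnormal phenotypes, can be sketched with a toy model. This is an illustration under my own assumptions only: the fixed set of hypothetical “development-critical” loci and all parameter values are inventions for the sketch, not part of the lesson, of Avida-ED, or of the Hiyama et al. analysis.

```python
import random

def abnormality_frequency(mu, n_organisms=1000, critical_loci=10, seed=0):
    """Fraction of offspring carrying at least one mutation in a set of
    hypothetical development-critical loci, at per-site mutation rate mu."""
    rng = random.Random(seed)
    abnormal = sum(
        any(rng.random() < mu for _ in range(critical_loci))
        for _ in range(n_organisms)
    )
    return abnormal / n_organisms

# Abnormality frequency rises with mutation rate, mirroring the claim
# students tested: more radiation -> more random mutations -> more
# phenotypic abnormalities.
for mu in (0.0, 0.01, 0.05, 0.10):
    print(mu, abnormality_frequency(mu))
```

At a 0% rate no organism is abnormal; as the rate grows, the chance that at least one critical locus mutates grows with it, which is the qualitative dose-response relationship the students probed in Avida-ED.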
TN devoted at least two 75-minute class periods to working with Avida-ED (though these were separated by several weeks), and the students had one week on their own outside of class to work on the butterfly exercise, which is itself highly guided. A handful of students used Avida-ED to conduct independent projects, but for most students the authenticity of the science practices was rather low.

Case D_100Evo: Freshman Seminar in Evolution

Institutional context

Institution D is a large, urban, public research university in the Midwestern United States. The university is the second largest in the state, with just over 30,500 students. It features an active research program with high output.

Course context

The introductory evolution course was offered during the fall semester of 2012 as a first-year seminar. Open only to freshmen, the first-year seminar is one of the first college courses that students take and therefore has no prerequisite courses. The topics of these seminar courses change each semester. Enrollment is limited to about 20 students, and the primary goal is to acclimate students to college life. As a seminar course, the expectation is that students will be largely responsible for their own learning. The goals of the evolution seminar were to introduce students to fundamental evolutionary concepts and familiarize them with the process of science. Twelve of the sixteen students enrolled in the course completed the pre- and post-assessment. Due to an unfortunate miscommunication between myself and the instructor, no students completed the user’s survey.

Instructor context

JO is a tenured Senior Lecturer with joint appointments in the departments of Biological Sciences and Kinesiology and the School of Education. Virtually all of his appointment is devoted to teaching, with a small amount spent on service (mostly serving on graduate committees).
He has regularly taught human gross anatomy and physiology since arriving at the university in 2004. In addition, he is involved with teacher professional development, most recently helping to prepare elementary teachers to adopt the Next Generation Science Standards and incorporate evolution into the elementary biology curriculum. JO serves on the Board of Directors of the National Center for Science Education and passionately supports evolution education. The fall semester of 2012 was the first time he had taught the freshman evolution seminar. It was also the first time he had ever used Avida-ED, and as such he is considered a novice user.

Description of implementation

JO’s commitment to reform-based teaching was among the highest of the instructors participating in this study, and he eagerly adopted Avida-ED after hearing about its potential for engaging students in authentic science practices just weeks before the term began. JO introduced Avida-ED to his students early in the semester by giving an interactive demonstration during class and encouraging students to ask questions. At their suggestion, he changed different parameters so that they could observe as a group how the outcomes were affected, and he used this as the basis for a class discussion. As it was his first time using Avida-ED, he opted to use the ready-made materials developed by the Curriculum Development team. He used Avida-ED for a series of exercises beginning the third week of class. The first exercise was the same Avida-ED tutorial described above [A_APBio]. The students completed the handout as a homework assignment outside of class. The next two exercises were used as labs and completed in class, and consisted of guided inquiry activities. The first introduced the concepts of mutations and mutation rates, while the second extended these concepts to investigate the effects of different mutation rates on population size and average population fitness.
These labs were used as a point of reference to guide an in-class discussion focused on how genetic change works and its effects on the characteristics of organisms. Because Avidians in the version of Avida-ED he used are haploid and asexual, JO could not use Avida-ED for experiments with genotype changes in sexually reproducing organisms as he had initially planned; he therefore designed his own lesson on the sickle cell trait. He based the lesson on journal articles about mortality and the sickle cell trait in African populations, then had students discuss these articles and make conceptual connections to the activities they had done with Avida-ED. With this lesson, he was able to address fitness, population biology, and how changes in genotype frequencies relate to phenotype frequencies, and to relate these concepts to population growth.

Summary

JO used Avida-ED early in the semester as a way to introduce students to fundamental concepts that would be built upon as the course progressed. He used guided activities that engaged students in practices, but in a limited way (i.e., they followed a handout rather than investigating their own questions). He supported the students by giving an interactive demonstration of Avida-ED and allowing them to do the majority of the work with Avida-ED as labs in class (two 75-minute lecture periods), in addition to assigning the tutorial as an out-of-class homework exercise. He then used the students’ experiences with Avida-ED as the basis for the discussion of the sickle cell trait.

Case E_200Bio_HC: Honors Introductory Biology for Non-majors

Institutional context

Institution E is a large, public research university. Located in the Midwestern United States, it is the largest university in the state with just under 35,000 students and is considered one of the top-tier research universities in the nation.
Course context

This course is the second of a two-semester science sequence offered to non-major, lower-division students in the Honors College. The intent of the sequence is to introduce students to fundamental scientific concepts as well as the nature and practices of science. Because they integrate physics, chemistry, and biology, both courses are interdepartmental and are team-taught by faculty from the physical and biological sciences. Both courses consist of a lecture and a lab component. During the spring semester of 2013, thirty-four students were enrolled in the course. Of these, 30 completed the pre- and post-assessment as well as the user’s survey.

Instructor context

HD received her PhD in Biology from Institution E in 2003. She served as a senior research technician for two years in her mentor’s lab at the same institution and has stayed on as a Postdoctoral Fellow in the Division of Biological Sciences since 2005. Her responsibilities are split evenly between research and teaching. She has coordinated the science sequence for the Honors College since 2004. As coordinator, she is responsible for all aspects of course organization, including designing and implementing all laboratory activities, training and supervising teaching assistants, and identifying and organizing several lecturers from departments across campus; she also delivers about a quarter of the lectures herself. HD has been using Avida-ED in this course for several years and is therefore considered an experienced user.

Description of implementation

HD implemented Avida-ED quite differently from the other instructors in this study. In her introductory biology course, students used Avida-ED as a virtual lab space to design and conduct their own research study over the course of the 15-week semester.
This project was almost entirely self-guided, with students checking in at designated times to ensure that they were progressing and to give them opportunities to share their work and obtain peer feedback. In terms of authenticity, this implementation was the closest to actual research, as the students engaged in a full inquiry cycle driven by their own interests and ingenuity. It was also by far the lengthiest implementation, taking up the entire semester. Early in the semester, HD introduced the students to Avida-ED by assigning several homework exercises. These were intended to break the semester project into manageable pieces and to help students become familiar with the software and understand its capabilities. In part one, completed individually, students were asked to read an introductory article on Avida (Zimmer, 2005) and use the information to answer a few questions. They were then given directions to download Avida-ED to their personal computers and perform some basic tasks, such as starting a new population from the ancestor organism and exporting data. Finally, each student was asked to brainstorm and propose two research questions that could be investigated using Avida-ED, including a simple methodology. In part two, students were instructed to work through a series of demonstrations as a group. The exercise was organized in three parts. Part 1 was intended to familiarize students with the Avida-ED interface by walking them through an experimental run and bringing their attention to different aspects of the program, such as the various viewers and output screens. Part 2 continued this process of familiarization in the context of running an experiment on natural selection. Students were given a problem, asked to predict an outcome, and then asked to conduct the experiment to test their prediction.
In Part 3, students were tasked with discussing each of the research questions they had proposed individually in order to determine which one they would investigate as a group. They were then asked to write two hypotheses for their chosen question. Next, each group was asked to write a proposal based on its chosen research question and associated hypotheses. The proposals were to contain the following information, adhering to the provided specifications (from the handout):

1. Title & authors
2. Project Summary: In one paragraph, summarize the question, the methods, and what you hope to find.
3. Objectives: Provide background information about your topic (using citations), and address what specific question you hope to answer with your work, your proposed hypothesis, and how it is justified by what is already known (using citations).
4. Expected Significance: Describe the impact (benefit) of your new information. Outline your clear hypothesis and cite previous results, models, theories, or laws that make you think this is the answer to your question. This is the big-picture part of your proposal.
5. Proposed Methods: Explain how you will go about your work. What variables will you manipulate? What sorts of measurements will you take? How will you analyze data?
6. Expected Results: Describe what you believe will happen; consider sketching a graph of your data and/or listing predictions. Restate your hypothesis and cite previous results, models, theories, or laws that make you think this is what is going to happen (use citations).
7. Citations: Provide at least two citations from each member of the group (at least one of which should be from a peer-reviewed science journal). Follow formatting guidelines in the course packet.

HD provided the students with a grading rubric so that they would understand her expectations and the point values associated with each graded component of the proposal.
After receiving feedback on their proposals, the students were given several weeks to pilot their experimental procedures. Each group then orally presented its preliminary findings to the rest of the class in order to receive feedback and suggestions from peers. This oral presentation occurred approximately one third of the way through the semester. After the oral presentation, each group completed its experiments, incorporating feedback from their peers. The results were then written up in a standard journal-article format. First drafts of the paper were due the second week of April. HD provided each group with feedback, at which point they were required to revise the original draft. The final paper was due the last week of class (the first week of May). The final product of the semester research project was a poster, which each group presented during the last week of class. HD gave the students guidelines for creating a professional-quality poster and advice on how they ought to present it. In order to make the poster presentations as authentic as possible, the session was open to the public and advertised in advance. Students from other departments also presented their work at this session. This gave the students an opportunity to explain their projects to people who were not familiar with the subjects of their research. Finally, the students were asked to evaluate their group members with regard to how well each communicated during the project and the degree to which each member contributed. This evaluation was factored into the students’ final project grades.

Summary

HD took full advantage of Avida-ED’s strengths by allowing students to use it as a virtual lab space to design their own study, pursuing questions of their choice. This was a full, authentic inquiry cycle, beginning with the conception of a question and ending with public presentation of the findings.
The implementation lasted the entire 15-week semester, although it is not possible to estimate how much of that time students actually spent working with Avida-ED, aside from the designated class periods when they presented their work to the class. HD prepared the students to engage in their projects by having them read the Zimmer article on Avida and work through two introductory exercises. She made her expectations of students clear by providing them with rubrics for each step of the process. Support also came in the form of peer feedback when students presented their proposals and progress to the class. This was a very well executed implementation of Avida-ED that resulted in impressive student outcomes (see Chapter 5), and it could serve as a model for other instructors.

Case F_400Evo: Upper-division Evolution

Institutional context

Institution F is a large, public research university in the Pacific Northwest. The university, one of the oldest on the west coast, enjoys a great amount of prestige, particularly due to its status as one of the most highly funded research universities in the nation.

Course context

The course is designed to give upper-division undergraduates and beginning graduate students hands-on experience in the field of experimental evolutionary ecology. Students are required to have taken introductory biology before enrolling in this course. The course is composed of lectures and labs. The lectures introduce some of the current "big questions" in ecology and evolution that are experimentally testable. Students read primary scientific literature in order to gain a better understanding of how experimental approaches have been used to explore ecological and evolutionary phenomena. The labs are devoted to experimental design, data collection, and analysis.
Students work in small groups to conduct experiments and investigate wide-ranging issues such as the evolution of bacterial antibiotic resistance, bacterial tradeoffs and competition, and coevolution of pathogens and their hosts. During the fall semester of 2012, thirty-three students completed the pre- and post-assessment. No students completed the user’s survey.

Instructor context

NR is an Associate Professor in the Department of Biology at Institution F, where he has served on the faculty since 2005. His professional responsibilities are split evenly between teaching and research. In addition to the upper-division evolutionary ecology course, he also teaches a large-enrollment introductory biology course for majors; he has taught both courses for six years. He has used Avida-ED in the upper-division course for three years. He has also coauthored several journal articles on Avida. Due to his deep understanding of digital evolution and of Avida specifically, he is considered an expert user.

Description of implementation

NR’s senior-level course in Experimental Evolutionary Ecology included both a lecture, which met for 80 minutes twice weekly, and a once-weekly, two-hour lab. Exercises involving Avida-ED were conducted primarily during the lab period, beginning around the middle of the fall 2012 semester. For these exercises, students worked together in groups of four or five. For the Avida lab exercises, NR provided the students with a detailed set of instructions on the course wiki, including background information, procedures, and relevant dates. Four lab periods were devoted to Avida-ED. During the first, students used Avida-ED to recreate with digital organisms a lab they had done earlier in the semester with bacteria.
During that lab, students had performed a classic experiment, first published by Luria and Delbrück in 1943, showing that mutations occur by chance rather than in response to particular environmental selection pressures; that is, mutations are spontaneous and random rather than directed. In the first Avida lab, the students recreated the experiment and found the same patterns. The purpose of this lab was to illustrate that the digital system represents an actual instance of evolution and serves as an accurate model for biological evolution, and to demonstrate some of the capabilities of Avida-ED so that students could later develop their own questions and hypotheses to test with the software. At the end of this lab, each group developed a question to test using Avida-ED and completed a handout that served as a short proposal of what they intended to do for their project. On the handout, students were instructed to record their question, list one or more possible explanatory hypotheses, describe a detailed methodological approach, and state a set of predictions extending from each hypothesis. The next lab involved refinement of hypotheses and experimental designs. Each group carried out its experimental protocol. Afterward, they completed a second handout intended to help them think critically about their preliminary results and revisit their study design. They then proposed a new experiment to test their new and/or revised hypotheses. The students ran the new experiments outside of class, and the resulting data were discussed during the third lab period. Based on these data, the students again devised new hypotheses and planned new experiments; some students redirected their research entirely. They once again executed these experiments outside of class.
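The logic of the Luria–Delbrück fluctuation test that the students recreated in the first lab can be sketched as a toy simulation. This is an illustrative sketch only, not the students' actual protocol or the Avida system; all parameter values are hypothetical.

```python
# Toy fluctuation-test simulation (illustration only; parameters are
# hypothetical). If mutations arise spontaneously during growth, an early
# mutation founds a large "jackpot" clone, so mutant counts across parallel
# cultures have variance far exceeding the mean. If mutations were instead
# induced only when selection is applied, counts would be Poisson-like
# (variance roughly equal to the mean).
import random
import statistics

random.seed(42)
GENERATIONS = 12       # each culture grows from 1 cell to 2**12 cells
MUTATION_RATE = 1e-3   # per-daughter-cell mutation probability
CULTURES = 100         # number of parallel cultures

def spontaneous_culture():
    """Mutations occur at random during growth (Luria-Delbrueck model)."""
    normal, mutant = 1, 0
    for _ in range(GENERATIONS):
        daughters = 2 * normal
        new_mutants = sum(random.random() < MUTATION_RATE
                          for _ in range(daughters))
        normal = daughters - new_mutants
        mutant = 2 * mutant + new_mutants   # mutant clones also double
    return mutant

def directed_culture():
    """Null model: every final cell mutates independently at selection."""
    final_size = 2 ** GENERATIONS
    return sum(random.random() < MUTATION_RATE for _ in range(final_size))

spont = [spontaneous_culture() for _ in range(CULTURES)]
direct = [directed_culture() for _ in range(CULTURES)]

# Spontaneous mutation predicts variance >> mean; directed predicts ~= mean.
spont_ratio = statistics.variance(spont) / statistics.mean(spont)
direct_ratio = statistics.variance(direct) / statistics.mean(direct)
print(f"variance/mean -- spontaneous: {spont_ratio:.1f}, "
      f"directed: {direct_ratio:.1f}")
```

The overdispersion of mutant counts under the spontaneous model is the pattern Luria and Delbrück observed in bacteria, and the pattern the students reproduced with digital organisms.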
During the fourth lab period, the students analyzed all of their data and discussed the broader context of their motivating question; that is, what their experiments in Avida-ED might say about organic systems and how they might design experiments in organic systems to explore related themes. The students then wrote a report in the format of a scientific journal article, which was due the last week of class. In addition, some students chose to use their Avida-ED experiment as their individual class project and presented the results of these projects to the class.

Summary

NR made extensive use of Avida-ED in his class. Like YN, he explicitly linked the digital system to a biological example by having students recreate the Luria-Delbrück lab that they had completed with bacteria earlier in the semester. In addition, students spent an enormous amount of in-class time (over 8 hours total) working on their own investigations. A unique aspect of NR’s implementation was its focus on the revisionary nature of science, as students repeatedly designed protocols, tested them, and refined their questions and hypotheses.

Institution G

Institutional context

Institution G is a very large, public research university in the Midwestern United States. With nearly 48,000 students, it is among the ten largest universities in the US by student body. This university is highly regarded for its active research program.

Cases G_100BioLabA & G_100BioLabB: Introductory Biology Laboratory

Course context

The large-enrollment introductory biology laboratory course is offered to lower-division students majoring in the biological sciences. It is offered during both the fall and spring semesters of each year and focuses on content related to organisms and populations.
The course has recently been transitioning to a reformed format that provides students with opportunities to engage in authentic science practices, including maintaining a laboratory notebook and conducting a group research project that culminates in a poster presentation. The course consists of a recitation, or “lab lecture,” that meets once per week for one hour and a lab component that meets once per week for three hours. Each lab section accommodates about 30 students. The instructor of record is responsible for the recitation, while labs are taught primarily by graduate and undergraduate teaching assistants. During the fall semester of 2012 (G_100BioLabA), 154 of 202 students enrolled in the course completed the pre- and post-assessment and the user’s survey; during the spring semester of 2013 (G_100BioLabB), 234 of 355 students did so.

Instructor contexts

AE was the instructor of record for the course during the fall semester of 2012 (G_100BioLabA). Her appointment is split 90/10 between teaching and administrative duties. She is an Academic Specialist and the Coordinator for all laboratories under the purview of the Biosciences program and has served in this position for more than 20 years. As Coordinator, her responsibilities include scheduling and ensuring that all of the laboratories are fully stocked at all times. She is also responsible for hiring, training, and supervising all teaching assistants for the Biosciences program. AE is responsible for the Organisms and Populations laboratory course each fall and the Cells and Molecules laboratory course each spring. In addition to teaching the lab lectures for each of these courses, she oversees all of the lab meetings, held each Friday afternoon during the semester, where she debriefs with the TAs from the previous week and leads them through the lab activities for the next week.
She has used Avida-ED in this lab before and is considered an experienced user. KA was the instructor of record for the course during the spring semester of 2013 (G_100BioLabB). He became a Visiting Assistant Professor with a joint appointment in a residential science college at the university and the Biosciences program in 1996. Since that time, he has taught a wide variety of courses, including an honors biology laboratory, the Organisms and Populations lecture and lab, a professional development course for teachers, and a biodiversity study abroad course in Panama. His faculty appointment, which had until recently been devoted entirely to teaching, is now split 25/75 between teaching and curriculum development, in which he helps to reform the introductory biology courses. In addition to teaching the above courses, KA is also responsible for training the TAs who instruct the labs. He has used Avida-ED before in the Organisms and Populations lab course and is considered an experienced user.

Description of implementation

Although there were two separate instructors, these were two offerings of the same course (G_100BioLabA ran in the fall semester of 2012, while G_100BioLabB ran during the spring semester of 2013). AE and KA planned the course together, so the curriculum and implementation were essentially the same. The only major difference between the two offerings was that different graduate and undergraduate teaching assistants oversaw students as they worked in the lab; however, the TAs were trained and supervised by the course instructors. Therefore, for the purposes of describing the implementation of Avida-ED, I have chosen to group the courses together (the data from these courses are considered separately in subsequent chapters). The instructors designed a lesson that aimed to 1) address a suite of common misconceptions about evolution, and 2) prepare the students for conducting their own small project using Avida-ED.
Students spent two three-hour lab periods working with the program, in addition to time spent outside of class working on the project. The first lab was spent working through a handout in pairs, with the aid of teaching assistants. The students used the second lab to design an experiment to test an idea of their own, the data collection for which would be completed outside of class. Avida-ED was implemented halfway through the semester, after completion of the unit on genetics. For the first lab on evolution, the students were given a lengthy packet to work through. The packet included introductory and background information on Avida-ED and experimental evolution, information on the assignments, and data sheets. Students were expected to read the Discover magazine article by Zimmer (2005) before coming to class. They were given a very brief introduction and demonstration by the lead graduate TA and then commenced working with their partners.

The handout listed five objectives for the Avida-ED labs:

1. Become acquainted with Avida-ED, a research platform developed by evolutionary biologists at Michigan State University for studying a variety of evolutionary processes,
2. Use Avida-ED as a model of evolution by natural selection,
3. Apply the principles of random genetic mutations, phenotypic variation, heredity, and fitness to explain how Avidian populations change over time,
4. Use Avida-ED to explore simple questions about evolution, and
5. Design an original experiment to test a question about evolution using the Avida-ED software.

To achieve the first objective, the lesson handout contained two introductory exercises intended to familiarize students with Avida-ED. This was followed by the exercise on testing misconceptions. Four common misconceptions were the focus of this activity (from the handout):

1. Mutations always reduce the fitness of organisms.
2. The presence of a selective agent causes advantageous mutations to occur.
3. Because mutations are random, they cannot lead to the orderly progress that is evolution.
4. Complex features (for example, an eye) cannot evolve because either they are too complex to arise from one mutation or, if they evolved in several intermediate steps, there would be no advantage for the intermediates, so these intermediates would never evolve.

To test these misconceptions, the handout guided students through four experiments. Each followed a predict-observe-explain (POE) format: students were asked to predict outcomes, given a step-by-step protocol to test the misconception (observe), and asked follow-up questions prompting them to explain their observations. Students spent the rest of the lab period working through the exercises with their partners. Before leaving lab that first day, students were required to brainstorm a question that they would test using Avida-ED and share this question with the TAs. The next week, students used the lab period to refine their hypotheses and begin collecting data. The following week, they turned in a one-page lab report describing what they had done along with their findings.

Summary

Unlike in most of the other courses, although the instructors designed this implementation, it was enacted primarily by teaching assistants who were, with few exceptions, novices when it came to Avida-ED. However, the activities were well documented and explicitly guided students through different features of the program. Students spent at least six hours working with Avida-ED in class, in addition to the time they spent working on their projects outside of class. The exercise on testing misconceptions prepared them to conduct their own study, and they received additional support from the teaching assistants, who circulated and helped as needed.

Case G_100BioRes: Introductory Biology (Residential College)

Course context

This is an introductory biology lecture course offered to lower-division majors in a residential college within the larger university.
The goal of the College is to provide a bridge between science and the humanities, especially history, the social sciences, and philosophy, creating a curriculum that focuses on “liberal science.” Smaller class sizes provide opportunities for more interaction between students and professors, generating a small-college feel at an otherwise very large institution. The course covers content similar to that of the introductory biology laboratories in Biosciences; however, it combines lab and lecture and is offered only to students in the residential college. Like the reformed course, its primary goal is to engage students in authentic science practices as they learn content related to organisms and populations. During the fall 2012 semester, 121 students were enrolled in the course. Of these, 101 completed the pre- and post-assessment and user’s survey.

Instructor context

This course was team-taught by two Postdoctoral Fellows. RY is a plant ecologist with a postdoc position in teacher education. She studies how K-16 students learn about ecology and evolution and is using this information to develop learning progressions on biodiversity and carbon-cycling processes. AR is an evolutionary ecologist with a postdoc in biology education research. She studies courses designed by other postdoctoral fellows and investigates how they align their goals, activities, and assessments with their teaching practices. Neither instructor had used Avida-ED before this implementation, and both are therefore considered novice users.

Description of implementation

Throughout the course, AR and RY alternated lead teaching responsibilities, and AR took the lead on the evolution unit. The lecture portion of the class met on Tuesdays and Thursdays for approximately ninety minutes. Avida-ED was implemented over the course of one week during the middle of the fall 2012 semester.
The instructors uploaded the pre-assessment to the online course management system and asked the students to complete it on their own outside of class. On the first Tuesday of the implementation, Avida-ED was introduced during the last ten to fifteen minutes of class, following the first lecture on natural selection. The lecture had focused primarily on sources of genetic variation and on defining selection as a function of fitness. AR presented the lecture information via PowerPoint and included several slides with multiple-choice questions for students to answer using clickers, as well as open-ended items to be addressed in their groups using a whiteboard and dry-erase marker. As students reported out from their groups, AR used an electronic pen to write on these slides; the annotated PowerPoint file was afterwards uploaded to the course management website for students to download for later reference. During her introduction, AR quickly explained Avida-ED by showing students the different components of the program, including both images and videos taken from screen grabs. She also provided the URL of the Avida-ED website and a link to the video tutorial on YouTube. AR then introduced the homework assignment; this was the same exercise on artificial selection that YN [A_APBio] had used for her class, with modifications (for an explanation of these modifications, see Chapter 7). Students were to download the homework assignment from the course management website and had one week to complete it individually outside of class. During the next lecture, AR did not mention the Avida-ED project (however, some students did take the opportunity to ask questions about the assignment before or after class). The day’s topics included the advantages and disadvantages of sexual reproduction, sexual selection, kin selection, and eusociality. Once again, AR provided several opportunities for students to discuss questions in their groups.
She electronically annotated the PowerPoint slides and made these available for students to download after class.

The following Tuesday, one week after the Avida-ED exercise had been assigned as homework, AR spent the first fifteen minutes of class debriefing with the students, asking them questions about what they had observed and relating their observations to concepts they had covered in lecture (e.g., “How is artificial selection similar to and different from natural selection?”). Afterward, AR assigned the post-assessment and Avida-ED user’s survey as homework to be completed outside of class via the course website and due the following Thursday. She then spent the rest of the class period reviewing natural selection and lecturing on other evolutionary mechanisms, particularly genetic drift. Neither the instructors nor the students mentioned Avida-ED or the artificial selection assignment for the remainder of the semester.

Summary

This implementation was one of the shortest in this study. Students engaged in an inquiry lesson that they completed on their own outside of class over the course of one week. They received very little instructional support in the form of in-class demonstrations or introductory activities; however, the instructors did make other resources, such as the instructional video and the Avida-ED user’s manual, available to students.

Case H_100CompBio: Introduction to Computational Biology

Institutional context

Institution H is a very large, public research university in the South Central United States. With over 50,000 students, the university is one of the five largest in the nation and is considered a top-tier research institution.

Course context

The computational evolution course is offered to freshmen enrolled in a research initiative program designed to give students experience working in research labs beginning in their first year of college. This is a new course and was, at the time of this study, only in its second year.
The course content is interdisciplinary, bringing together computer science and evolutionary biology. Twenty-four of thirty-two students completed the pre- and post-assessment and user’s survey during spring 2013.

Instructor context

TT is an Instructor and Research Associate studying computational biology and bioinformatics. He spends about a third of his appointment on research, with the rest devoted to teaching for the freshman research initiative. He earned his PhD working in a digital evolution lab, using Avida to examine the role of deleterious mutations as stepping-stones between adaptive peaks in fitness landscapes, and has had his work published in top-tier journals. Because he has extensive experience with digital evolution and has used Avida-ED in his teaching several times, he is considered an expert user.

Description of implementation

The introductory computational evolution course met three times a week, twice for lecture and once for lab, for one hour at a time. TT employed a “flipped” classroom structure, wherein students did preparatory work outside of class and used class time to practice skills under the supervision and guidance of the instructor and teaching assistant. The students were responsible for completing any assigned reading and watching video tutorials that TT had prepared before attending the lab. Avida-ED was implemented during the first few weeks of the spring semester, from mid-January to early February 2013. Unlike in the other courses, Avida-ED served as a foundation for the rest of the Computational Evolution course. Students learned the basics of the Avida research platform by first learning their way around Avida-ED. The educational version of the software served as scaffolding to ease the students into using the research platform. TT administered the pre-assessment during the first Monday lecture.
To prepare for the lab, TT had the students read a Nature Reviews article by Adami (2006) that provides an overview of the Avida system, and he used the article as a basis for discussion in class. To save time during class, students were also to have downloaded the software and viewed a video tutorial on Avida-ED that TT had created. The first lab session focused on the concept of fitness in Avida, which is calculated as an organism’s metabolic rate divided by its gestation time. Students were given a worksheet and asked to run a series of experiments in which they would grow populations of Avidians under various parameters (Table 8), systematically altering the population size and mutation rate. Before beginning their experiments, students were asked to develop a hypothesis with regard to the relative fitness of each treatment and which factor would be more important for predicting which population would evolve the highest fitness.

Table 8. Experimental parameters for Lab 1 of the Computational Evolution course.

Mutation Rate (Genomic): 9.00%, 3.00%, 1.00%
Population Size: 60 x 60, 30 x 30, 10 x 10

After recording their hypotheses, students performed the experiments, running each for 5,000 updates and recording average fitness, average gestation time, and average metabolic rate for each population. After they had collected all of their data, the students wrote a laboratory report. In the discussion, they were to state whether or not the results supported their hypotheses and provide an explanation as to why. If the results did support the hypotheses, the students were asked to design a follow-up experiment that would help them refine their ideas. During the next lecture, TT and the students debriefed from the lab. They discussed the effects of mutation rates on populations and how fitness is calculated in Avida. The lab reports were due the next week. The discussion of fitness and its relationship to natural selection continued into the second lab.
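The fitness definition the students worked with (metabolic rate divided by gestation time) can be sketched minimally as follows. This is an illustration only, not Avida's actual implementation, and all numeric values are hypothetical.

```python
# Minimal sketch of Avida's fitness definition (illustration only, not
# Avida code): fitness = metabolic rate / gestation time. All values
# below are hypothetical.

def avida_fitness(metabolic_rate: float, gestation_time: float) -> float:
    """Fitness as defined in Avida: metabolic rate / gestation time.

    An organism can increase its fitness either by performing more tasks
    (raising its metabolic rate) or by evolving a shorter self-copying
    loop (lowering its gestation time).
    """
    return metabolic_rate / gestation_time

# Hypothetical example: an organism that doubles its metabolic rate and
# halves its gestation time quadruples its fitness.
ancestor = avida_fitness(metabolic_rate=1.0, gestation_time=400.0)
evolved = avida_fitness(metabolic_rate=2.0, gestation_time=200.0)
assert evolved == 4 * ancestor
```

This ratio is why, in the students' experiments, populations could evolve higher average fitness through gains in either quantity the students were recording.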
Students downloaded a prepared workspace in which to grow monocultures of pre-existing organisms. For each population, they recorded average population fitness, gestation time, and metabolic rate with the mutation rate set to zero. The students were asked to infer, based on these data, how fitness is determined in Avida. Next, the students conducted a number of competition experiments between the same pre-existing organisms under different conditions, first with the mutation rate set to zero and then with the mutation rate at 3%. Before conducting the competition experiments, students were asked to hypothesize which organism would win each match (based on the results of the monocultures) and whether the mutation rate would affect the outcome. The purpose of this exercise was to show the stochastic nature of mutations and that it is not always possible to predict the outcomes of random processes. Once again, students wrote up the results of their experiments in a lab report, and they discussed the lab during the next lecture. After students submitted the second lab report, TT administered the post-assessment and Avida-ED user’s survey.

Summary

Ultimately, TT devoted two lab periods and three lectures to Avida-ED (a total of about five hours), but because of its scaffolding role, he continually referred back to this initial work with Avida-ED over the course of the semester as the students worked with the research version of Avida. This implementation also differed from those in other cases because, although the students did not conduct their own studies in Avida-ED, they did design a research project in Avida. Essentially, Avida-ED served as a primer for the research platform.

Chapter 4. Instructor experience with Avida-ED

Overview

In the preceding chapter, I described the diverse ways in which the eleven instructors participating in this study chose to implement Avida-ED in their classrooms.
In this chapter I explore factors that may have motivated the instructors to use the software as they did, including their goals for student learning and their beliefs about teaching science, and the degree to which Avida-ED aligned with those goals and beliefs.

Findings. The ten cases in this study varied greatly in terms of characteristics (e.g., size and type of institution, course type and level, etc.), but despite these differences the participating instructors had very similar teaching philosophies. These have been broken down into two main categories: instructor goals (and associated challenges of teaching science in general and evolution in particular) and instructor beliefs (regarding assessment, effective pedagogical strategies, their role in student learning, and Avida-ED).

Instructor goals: Science

Instructors mentioned a number of goals when it came to teaching science in general, and these tended to fall into four categories: practices and skills; the nature of science; scientific habits of mind; and knowledge transfer and application.

Science practices

Many of the instructors’ comments focused on engaging students in the practices of science, such as experimentation, model-based reasoning, explanation, prediction, and drawing inferences from or making arguments based on empirical evidence, and their pedagogical decisions depended significantly on providing students with opportunities to engage in these practices: “I think that science is all about analyzing evidence and making evidence-based claims… How to draw conclusions, how to make inferences.
[Students] have to be able to explain what they’ve learned or explain the results of a lab.” YN (A_APBio) “I want them to understand the role of experimental design and data analysis and evaluating whether hypotheses are true or false or have support or not have support… I want them to be able to communicate as well.” NN (B_300Evo) “The elucidation of causal factors via experimentation, the role of models and model-building, and so on.” NR (F_400Evo) “Document things like a scientist, so they have to maintain a lab notebook. They are required to think ahead of time and collect data, analyze. We want them to learn about stats, both descriptive and hypothesis testing. Then we have them do small reports where they are learning to write and present their results, in addition to practice creating tables and graphs. But the ultimate goal is using those skills in the final poster that they will have to do for their research project.” AE (G_100BioLabA) “I want them to learn how science works. People have ideas of science from TV and media, and they have very little to do with the reality of doing science.” TT (H_100CompBio)

The nature of science

The instructors wanted their students to engage in authentic science practices so that they would have an accurate sense of what scientists do that allows them to learn about and understand the world. They also wanted their students to develop a working definition of science that was at once reflective of its underlying, empirically based philosophy and useful in a practical sense: “The most important thing that I want [students] to learn is what science actually is: an approach based on the analysis of evidence used to explore and propose natural explanations for the physical world.” YN (A_APBio) “I think science primarily consists of two concepts: to explain and to predict. As we try to make sense of the natural world, can we have a reasonable explanation for which there is predictive value?
As I try to explain this to students, the sun rises, but it’s not because the rooster crows. That’s not a legitimate explanation, although the predictive value is pretty good. One could conduct an experiment and kill all the roosters, and the sun would still rise. Although that might be an inefficient way to go about it.” TN (C_400Evo) “I guess the most important thing when I teach the science courses is … how science … gives us the opportunity to ask questions that will fill in gaps in our knowledge.” JO (D_100Evo) “That science is about testable hypotheses, that we’re not making absolute statements about truth in the world, but that we’re testing ideas about what we think might be happening. So, to give them the idea that it’s an ongoing, iterative process, that we’re always updating our ideas. Nothing is set in stone. That it’s a creative process, that it’s about things that you can observe in the world, you know, that it’s not amenable to things that are not observable in the world.” HD (E_200Bio_HC)

Scientific habits of mind

A concept closely related to the nature of science is that of scientific epistemology, or science as a way of knowing and thinking about the world. Several instructors spoke to the importance of developing students’ abilities to think critically and engage in scientific reasoning, appealing to logic and the empirical nature of science: “There are particular habits, there are particular things we are interested in and we try to explain the realm of matter and energy…[Students] need to be good thinkers. They need to understand that any scientific explanation should be supported by data.” YN (A_APBio) “Mostly, I would like my students to have a better understanding about science, the value of basing understanding of the natural world on empirical data…” NR (F_400Evo) “When students come in we start talking science. The most common words that I hear from them are that something is scientifically proven, or we set out to prove this.
I’m trying to change that student mind-set. We really want to be observant and objective, to teach students how to evaluate evidence and use evidence to make conclusions. Not conclusions forever, but conclusions for the moment.” KA (G_100BioLabB)

Knowledge transfer and application

It was important for instructors that their students could draw on conceptual knowledge and scientific reasoning to solve problems in novel contexts and across scales. In addition, instructors wanted students to develop a level of scientific literacy that would allow them to apply scientific thinking and knowledge in their lives, helping them to make informed decisions: “[Students need] to verify the things they hear in the news or when they are making decisions later about medical treatments or something. I want them to apply a scientific mindset to those kinds of decisions.” YN (A_APBio) “I want [students] to understand concepts in a way that they can apply them to a new situation rather than just spit it back to me… And I want them to see the connection between whatever particular subject they’re studying and other subjects in biology or other subjects outside of biology.” NN (B_300Evo) “Students need to understand how things fit together, and the ability to tie those ideas that they’re reading into some of these larger contexts… I ask them to show me that they can apply that process to the particular problems that I give them.” JO (D_100Evo) “We want students to use what they know about biology to explain any system, so if you were given a paragraph or a case or a New York Times article, you should be able to explain it.” RY (G_100BioRes) I asked the instructors to discuss the challenges they had faced in pursuing the above goals for their students. These challenges fell into three rough categories: prior experience, scientific reasoning skills, and interest and motivation.
Prior experience

Several instructors noted that their students come to them with varying levels of scientific knowledge, with regard to both content and practice, and that it could be challenging to get all of the students on the same page conceptually. In addition, students often harbor misconceptions about the nature and practices of science that they developed during previous years’ schooling or from engaging in their communities, and these can be difficult to overcome: “There are a lot of misconceptions about what science is, and typically if a student feels that they already know something they don’t listen at all. And it doesn't, you know they don’t change their ideas. So all of that stuff at the beginning of the year that we carry out throughout the year, at the end of the year I’ll still have 50% of the kids saying that a theory is just a guess and can’t be proven and all of this crazy stuff that I just, I thought we hammered away at that all year long. But it never really changed their view because they’ve already had 17 years of school where they’ve developed that view.” YN (A_APBio) “Sometimes it’s differential preparation of students to jump into things that are very independent, sometimes that’s just because they’ve never done it before and sometimes because they’re not all at the same level of sophistication, for example in thinking about quantitative data, or have the ability to think about a concept rather than a fact.” NN (B_300Evo) “I think that this goes for both high achieving students as well as not so high achieving students: I think they come into their college science classes with the preconception that science is truth with a big T, that if a professor stands up there and says, “This is how something works” then that is sort of set in stone. That that’s an absolute and it’s not open to discussion.
I think they can struggle a bit with sort of this disconnect between, you know, having to learn definitions and hard facts and this is how it works versus us then telling them that there’s a process in science which is always open to interpretation, and open to revision, and is a continual process. I think that is a pretty hard thing they come across.” HD (E_200Bio_HC) “Some basic misunderstandings of the way science is done and the nature of truth in science. For instance, it is extremely hard to purge the word ‘prove’ from students’ vocabulary when they are talking about scientific results. There are also limitations in understanding various quantitative concepts, such as population thinking, probability, statistics, and algebra, that make understanding and designing experiments more difficult.” NR (F_400Evo)

Scientific reasoning

According to these instructors, students have a tendency to struggle when it comes to thinking scientifically, such as asking questions and developing hypotheses and building models, and this can prove challenging for instructors who wish to make classroom science experiences as authentic as possible. Instructors must also contend with the difficulties that some students have in making conceptual connections and reasoning across scales: “Can students state a hypothesis? That’s a big issue. Can they make hypotheses that can be tested in the real world that are relevant to present knowledge? I think a lot of what I do orbits around hypothesis. If there’s a hypothesis then it’s good stuff, otherwise it ends up being unorganized details or a house of cards in terms of abstractions.” TN (C_400Evo) “I think that… students in general have a hard time generating hypotheses. And I don’t know how that happens, because I think little kids like, are kids. I think they’re pretty good at it. They are always coming up with ideas about why something might be.
And by the time they hit college, they feel a lot less comfortable sort of putting that out there, and coming up with potential explanations or potential correlations or, you know, anything. They just feel uncomfortable doing that. And I’m not sure why, but that takes a little bit of practice. Just to get them to sit down and brainstorm and think about possible explanations.” HD (E_200Bio_HC) “Coming up with very good questions that students can answer is probably the most significant [challenge]. Gathering great data, lots of it – in terms of quality and quantity, great data. So that they can do a great analysis, so that they can use that evidence to inform their conclusions. I think the roadblocks are every single one of those phases. It starts off with coming up with really good questions.” KA (G_100BioLabB) “Students struggle with developing scientific models that are specific, that use scientific language, and that always make the right connections or relationships between concepts. Integrating concepts across scales. In a lot of cases, students go through their cell and molecular class and don’t think that they need to carry on those concepts about what cells and molecules are doing in ecosystems.” AR (G_100BioRes)

Interest and motivation

A common source of frustration was the lack of motivation among certain students, either due to a paucity of interest in science or because effort had to be incentivized in the form of points. Some instructors dealt with this by carefully choosing activities that would appeal to students and were relevant to their interests: “I think kids who learn the most about a specific topic are those who want to learn, and you can’t make anyone want to learn something that is not intrinsic. A large proportion of the students here don’t care about school and don’t want to learn about biology, and so they learn very little. And those students who are curious and interested learn a lot… [But most students] are very grade focused.
They don’t get the big picture, they don’t get that learning something in one class might help them in another class.” YN (A_APBio) “[It] depends to some extent on their experiences and sometimes their personalities. Part of that may be driven by their expectations, so when I first started it was a challenge because I didn’t realize that I’d be managing expectations. Particularly when you’re not teaching a course that’s not right out of a textbook and it’s not geared toward the GRE, a certain set of students might say well I didn’t learn anything because I wasn’t being prepared for the MCAT or something. That is a challenge, and you have to actually convince them that what they are doing is more worthwhile than what they think they ought to be doing in their great wisdom. That’s kind of a challenge, but it works well to sell the idea that learning this way is actually more beneficial and more satisfying.” NN (B_300Evo) “Motivation is the big ingredient. What I’ve learned is when I keep students motivated, that’s when everything falls into place. It’s the number one thing. There’s a lot of factors that feed into motivation. Students like to have a course that’s reasonably well-organized, they like an instructor that brings some energy and also some capacity for building the knowledge, so I always try to start each lecture with stuff that’s really simple and basic so that students can get that reward of understanding at least something in the lecture… But I think there has to be some aspect of every college course where students have to struggle on their own. Develop some ability for scholarship on their own... You want students to get some real capacity for that and to be able to on their own recognize their own ideas and act upon those ideas.” TN (C_400Evo) “The students feel this is a lot of work, especially for a two credit class. A lot of written work and thinking. It is difficult for us to accomplish what we want without it becoming overwhelming.
They associate too much of what they do with assessments, so it is hard to get them to do something for nothing… How do we get them to practice these skills? If you don’t give them the points they won’t do it. We tell them that the one page reports are to get them practice and to prepare for the poster at the end, but they wouldn’t do it if you didn’t put points associated with it.” AE (G_100BioLabA) “The hardest part is keeping them engaged. It’s a very open-ended course so there’s lots of points where they could fall off the wagon… I don’t like to use grades to punish students, but I have found that they are pretty effective at communicating with students, because they tell a student, “You need to do this better.” If you don’t incentivize it some way, then a lot of times they tend not to care. “Am I still getting an A? Okay, well then I don’t need to do anything better or work any harder.” And an unfortunate percentage of students have that attitude. In this last year, the way I overcame that was by adding a lot of assessment. When they got a bad grade on something, I made them come talk to me and said, “Well, you need to do better on this, that, or the other.” I think that kept a lot of students more engaged than they would have otherwise been. It gave them an incentive to work a little harder. I consider it mind control, because hey (chanting) you want a good grade, you want a good grade.” TT (H_100CompBio)

Instructor goals: Evolution

In contrast to their goals for science in general, which were much more process oriented, the instructors’ goals for student learning of evolution were primarily content-based, and spanned an array of topics from the essential role evolution plays in helping us to understand biology to fundamental evolutionary concepts.
NN (B_300Evo) was concerned, first and foremost, with helping his students to think evolutionarily (National Research Council, 2012) and adopt a Dobzhanskian view of evolution, understanding it as the light by which we make sense of biology (Dobzhansky, 1973): “I want them to understand the way that evolution is related to the other things that they’ve learned about in biology, to see why not taking an evolutionary perspective is going to lead you astray no matter what you’re doing. So that means showing them the relationship between genetics and evolution, showing them the relationships between behavior and evolution, development and evolution, molecular biology and evolution, so that by the end of it the idea of doing anything without having sort of the ideas of common ancestry and natural selection and the other forces of evolution in their mind all the time is silly. It’s silly not to have that in your mind all the time.” JO (D_100Evo) similarly discussed his desire to impress upon his students how critical the theory is to understanding biology: “The fact is that students need to see biology as embedded in evolution. That’s probably the main [goal], if I accomplish that then we’re good." For HD (E_200Bio_HC), a clear definition of evolution was key for her students, who were not biology majors and therefore not as familiar with the theory: "I think the big concept that we want them to understand is a good, solid definition of evolution, which is not that individuals change, right, individuals don’t evolve, populations of organisms change. So the average trait value of some group of organisms changes over time.
It sounds simple, but it takes a while and it takes some misconceptions to get out of the way to get that to them.” Most instructors discussed the importance of helping their students to grasp the basic mechanisms of evolution, especially evolution by natural selection, and to a lesser extent contributing factors such as drift: “Having an understanding of natural selection as a mechanism… Sort of again grounding them in that idea of how selection works. Secondary goals are some of the other processes of evolution, such as drift. Focusing on natural selection is the main one, and if we have time and it seems like they’re doing okay with it then getting into some of these other mechanisms.” HD (E_200Bio_HC) “I would like students to be aware of the different evolutionary processes like natural selection, mutation, [and] migration. I would like students to understand the basic requirements for evolution by natural selection, variation, heritability, differential fitness, covariance between heritable phenotype and fitness.” NR (F_400Evo) “I want them to understand the stochasticity of evolution, historical contingency, the three components of evolution and how they work, so variation, inheritance, and selection.” TT (H_100CompBio) Another key concept, one that is often a major source of misconceptions about evolution (Gregory, 2009), was the idea that mutations are random, but selection is not: “I guess the third big one is the distinction between random and non-random. So, having an understanding that change resulting from natural selection is not a random change. That is something, it’s not just a monkey plunking away on a typewriter, that is a non-random change. And to distinguish that from some things that are random, including some biological processes like mutation. But natural selection itself and the result of it is not random.
It’s a process that fits an organism to its environment, and you couldn’t do that if you just had random changes.” HD (E_200Bio_HC) “I would like students to understand the random nature of mutation and the non-random nature of selection.” NR (F_400Evo) Finally, several instructors discussed how important it was for students to understand the evidence for evolution, that evolution is indeed good science with testable hypotheses, and that the outcomes of experimental evolutionary biology are directly applicable to our lives: “Using the nature and process of scientific inquiry is a way for us to reinforce the validity of the evolutionary model and also to talk about some of the more recent findings.” JO (D_100Evo) “I would like students to understand that evolutionary biology is an experimentally tractable and socially relevant field. I would like students to have a good set of examples from the field, lab, [and] computer of evolution in action. I would like students to be able to design experiments to test evolutionary hypotheses of their own making.” NR (F_400Evo) “I like to show areas of application that they might not think, like medicine.” NN (B_300Evo) Not all of the instructors had well-defined goals when it came to the subject of evolution. AR and RY (G_100BioRes), who were teaching their course for the first time, had been planning their lessons as they came to them, and although their goals were somewhat vague when I spoke with them, they were very similar to those described by other instructors, especially with regard to defining evolution and understanding the underlying mechanisms: AR: “[Our goals] are not all exactly written yet. But I think I can give you the basic idea of what they’re going to look like. We want them to understand and be able to apply the concept of fitness to various cases.
We will want them to be able to identify different types of selection, especially competing selection pressures and how those can shape phenotypes.” RY: “Understand there are other mechanisms to evolution besides selection.” AR: “Yes, mechanisms. We want them to especially contrast selection and drift and be able to figure out what’s happening under drift… I forgot an important one. We want them to be able to define evolution correctly.” RY: “And of course the things we’ve already been hitting with them. Mutation is random, you can’t pass on a phenotype you acquire.” AR: “Understanding heritability.” In contrast, KA and AE, who taught the same introductory biology laboratory during different semesters and planned the course together, seemed to struggle with their goals for the unit on evolution, in part because of difficulties in finding appropriate exercises that accurately and authentically represent evolutionary concepts: “We don’t know. We really don’t know… We don’t have evolution well structured. At this point we don’t have a vision… Other than Avida-ED, students don’t really get to see any kind of selection in action… We’re trying to have some authenticity in our scientific investigations… In order to bring it alive, otherwise Avida-ED is just a computer game. What we want to see is the evolution… It’s not that we’re preaching to students that evolution occurs. We want them to explore and have it be alive for them. That’s the goal, other than see, evolution occurs. That only goes so far. We want to do things on the computer that we can’t do in real life because we don’t have the generation time.” KA (G_100BioLabB) Even the instructors who had very clear visions for what they wanted their students to understand about evolution faced certain challenges. 
These were primarily due to the difficulty of demonstrating evolutionary processes in real time, as EC discussed, or student prior knowledge, particularly with regard to the differential preparation of students and misconceptions about evolution. JO (D_100Evo) expressed frustration with the way that evolution was addressed during his students’ high school educations, calling it “timid”, and attributed this to the politicized nature of evolution and its diluting effects on teachers’ resolve. These teachers, he argued, were concerned about upsetting their students, using lukewarm language that inadvertently produced misunderstandings: “Outside of the very rare fundamentalist students that I have, I think it is because the teaching of evolution in K-12 is pretty timid. There is a whole mixture of understandings. Some of them really know the idea of descent with modification, and some of them still think that something turns into something else. So what we have is a very uneven understanding, and part of that I think is because of the K-12 political environment where the teachers are not laying it out explicitly the way that we understand it, but are trying to come up with other phrases, like change over time, that sort of have something to do with it and sort of sound less threatening, but that really aren’t a very good representation of what evolution is. So it’s a typical sort of science course issue of students coming in with misunderstandings they are absolutely convinced are real. And I think one of the bigger problems is that among the students who accept evolution rather than the ones who reject it, if they’ve got it wrong they are both accepting and promoting an incorrect idea." JO also explained that, because of its treatment in schools, most students’ understanding of evolution seemed to stop at Darwin in the 19th century.
They have very little knowledge of advances in evolutionary biology since then, and do not appreciate how evolution is integrated with other fields, such as genetics: “They know a lot about genetics, they know a lot about Mendel’s peas, but they don’t really understand how deeply embedded biology is in evolution.” Very few instructors mentioned ever having had issues with students who did not accept evolution, particularly those in upper-division courses. These instructors pointed out that upper-level students choose to take evolution courses, and rather than running into issues of acceptance they must deal instead with higher-order misconceptions: “It’s less of a challenge with this class because it’s a self-selected group of people that are, where it’s not a required class, so they’re there for a reason. Probably the most challenging thing is the level of abstraction that’s needed for a lot of evolutionary concepts.” NN (B_300Evo) “At the introductory biology level, I find that some fraction of the students do not believe in evolution, mainly as it applies to human evolution. I do not have that issue in my upper-level class. The main challenge is that students have a basic misunderstanding about some components of evolution. For instance, many students come in to my class thinking that a stressful environment can induce mutations that engender stress resistance.” NR (F_400Evo) Other specific misconceptions that instructors mentioned as challenging included the belief that specific mutations are directed by the environment or are driven by need, that evolution is completely random, that natural selection and evolution are the same, and that evolution is not supported by evidence, cannot be tested, or is otherwise unscientific.

Instructor views on assessment

These instructors viewed assessment not just as a means of assigning grades to students, but also as a way to monitor the development of student ideas and understanding (i.e., formative assessment).
Several explicitly stated that they steer clear of multiple-choice items in favor of short answer or essay, which provides them with more information on student thinking and alerts them to potential misconceptions. In addition, assessments were often integrated with their goals for engaging students in authentic science practices, particularly those involving effective communication; thus, in most cases the bulk of students’ grades was based on analytic products such as research papers, oral presentations, or posters, rather than on quizzes and exams.

Instructor views on pedagogy

The instructors differed somewhat in how they decided what to teach. A few were careful to align their content objectives to national and state standards and/or to big ideas in the field. Others, particularly those who were less experienced, followed precedent as a guide and referred to syllabi from prior offerings of the course. One instructor of an introductory course made his decisions based on content students would be learning in more advanced classes, determined by the program. Still others drew from their own expertise or experience, sometimes incorporating feedback from students and teaching assistants to iteratively make improvements to the course. Despite these differences in deciding what content would be the focus of instruction, the instructors held quite similar views with regard to how the content would be delivered. All of them, to a greater or lesser degree, expressed the importance of engaging students in meaningful, learner-centered activities. These often included collaborative group work and hands-on, inquiry-based projects, with the reasoning that if students are to gain an accurate sense of what scientists do, they ought to engage in science practice as much as possible: “If we can do a lab that’s first choice.
If I can ask them a question and they can actually figure it out then that would be first choice.” YN (A_APBio) “Virtually all of the laboratory work is done in groups… That sort of interaction among students, they realize that science is done often in groups…” NN (B_300Evo) “If I’m going to teach them about the nature of science then [I will have] them do as much science as possible, where they do these active investigations.” HD (E_200Bio_HC) “I’m trying to reignite student interest in the natural world through field experience and inquiry-based learning… I want to raise the bar so that it’s actually a university-level research experience.” KA (G_100BioLabB) “I have two main rules: get them doing science, and teach them enough of the computational stuff so that they can do science on the computer…I don’t know of any other way to teach computational research than just doing a whole lot of it." TT (H_100CompBio) Several instructors intimated, almost apologetically, that it is not always possible to avoid lecture (particularly in the case of high-enrollment lecture courses), but that they try to make these as interactive as possible to keep students engaged. They accomplish this by asking and eliciting student questions, using illustrative examples and analogies, and facilitating discussion, as well as making liberal use of instructional technologies (e.g., white boards, individual response systems, i.e., “clickers”, course management software, blogs, etc.): “I try not to do sort of straight lectures, but as I said get students to not only jump in with questions but ask them to do the interpretation for me. We read papers on a pretty regular basis, so there are class discussions of those that students are supposed to submit questions, so we have a discussion based on their questions not just mine.” NN (B_300Evo) “The problem with large land grant universities is that they will pile 300 to 400 students into a classroom, and it really limits pedagogical options.
So it’s very much lecture-oriented, but what I try to do is to set up during the course of the hour in the big classes is student response kinds of questions, both individual questions and group questions. We are using a non-clicker student response system that is web-based… Our course management software has a chat function, so we do an online chat during the class and students can ask questions during the lecture. We also use electronic interactive text markup software that allows students to read online and mark up the way they would their textbooks.” JO (D_100Evo) “I try to get as much feedback and have them be active with materials when I have them in class. I try not to do straight up lecture, I probably still do too much lecturing, but I try to have them, you know, respond to the material, talk to each other about it, answer reflection questions in the course of going through lecture material.” HD (E_200Bio_HC) “I give lectures, these are interactive, with small-group/whole-class discussion, small-group/whole-class activities, clicker-based quizzes and exercises. I try to make the class very interactive, where students interact with me and each other via discussion, answering questions, performing exercises, using clickers, and so on. I do have material that I present, but this is interspersed with interaction throughout the class. I always welcome questions from the students and I try not to go more than 15 minutes without asking a substantial question, breaking the class up for discussion, or running some activity.” NR (F_400Evo) “We have the students work in permanent groups…The class is basically a couple minutes of interactive lecture, then we do clickers that they talk about in their groups and then talk about as a class. We also do where they draw on carbonless [notebook paper] either individually or in groups, and we have them turn that in so that it can be used as formative assessment. We also use white boards.
I use them a lot and find them very effective in getting them to talk to one another. … Most of the class is spent with them actively doing something.” RY (G_100BioRes)

Instructor views on their role in student learning

The instructors also held very similar views about their role in the learning process. These views tended to be quite reform-oriented, and focused on their facilitation of student learning by creating opportunities for student engagement. As YN (A_APBio) explained, “I think I’m the one who has the big picture in my mind, so I’m trying to lead them to that. I’m pointing out all of the connections and I’m designing the questions.” Similarly, NN (B_300Evo) described how he orchestrates an experience that encourages and challenges students’ thinking: “[My role is] to organize the information in ways that [students] can understand it, to give them support as they do investigations, to make sure that they are getting something from them and not just getting frustrated. I set up the experience. In terms of content, I’m sort of the local expert who is helping them understand particular ideas correctly and apply them, but also instigating them thinking beyond their boundaries, and showing them new ways to think about problems that they might not have thought about before.” Other instructors self-identified as a guide or mentor, someone to support the students in their learning, generate interest and enthusiasm, and develop their confidence: “I think [my role] is really more like mentoring, putting them in front of the resources they need, try to ask questions that challenge them, and to get them to come up with strategies that work for them in terms of finding out what they need.” JO (D_100Evo) “I think of myself a little bit as a translator. I don’t know if that makes sense. But there’s all this jargon that they’re afraid of, and what I try to do is build up their sense of scientific curiosity, right, and sort of push the jargon to the side.
So I think my role is to be enthusiastic, so I’m trying to be an enthusiastic tour guide… I am trying to help them unpackage some of these scary concepts or scary ideas, and kind of lead them into it and model enthusiasm about it.” HD (E_200Bio_HC)

Instructor views on Avida-ED

Although instructors gave several reasons for choosing to use Avida-ED in their courses, these were essentially variations on the same two themes: students could use Avida-ED to observe evolution in action and to experiment with evolving populations in real time: “One of the hardest things about teaching evolution is that students haven’t seen it happening and they have many misconceptions about what causes evolutionary change. Avida-ED allows students to see evolution in action and to investigate the mechanisms firsthand.” YN (A_APBio) “One thing that I liked about it was that students could collect data on populations that were changing over time. I thought that was the best feature of Avida-ED. It was some inquiry-based thing that we could do in a laboratory setting.” KA (G_100BioLabB) “For me anyway it was the fact that they’d be working with something hands-on and could explore evolution in a system rather than just study a case system that we are telling them about. I want them to do something with it.” AR (G_100BioRes) The instructors saw the tool as much more dynamic than computer programs that merely simulate evolutionary processes. Because Avida-ED constitutes an instance of evolution, the outcomes are not predetermined and students are not simply engaging in confirmatory, “cookbook” exercises: “I had seen the benefit of doing these [other] simulations, but I liked the idea of [Avida-ED] being more open ended, so I could say I want you to define a question and see if you can explore it in this environment.
That was different from a lot of the other computer-based things that were geared to teaching a particular fact about the model or microevolution or whatever.” NN (B_300Evo) “Here, there isn’t a particular answer. It is an outcome that depends on the process that was activated by the students. So I think that is sort of a realistic simulation of what might happen in real life.” JO (D_100Evo) Some instructors stated that Avida-ED is the only tool available that allows students to observe and experiment with evolution in a short amount of time, making it convenient for use in a typical college classroom: “I realized that it was the only tool that students could, in a well-defined setting, really explore the dynamics of evolution. It’s the only tool that does that. You know, I’d love to bring students into the lab for ten years and have them do their own evolutionary experiment on fruit flies or whatever. But this was the only tool that would give them an active model of evolution.” TN (C_400Evo) “What else were we going to do with evolution? Can’t use bacteria, we don’t have enough time to use bacteria, what else can we do? We can’t do anything with evolution in the lab other than Avida-ED.” KA (G_100BioLabB) “It is relatively fast. We use it for two hours in the lab, and in those two hours you can address all of these [misconceptions]. And you can have different replicates going on at the same time, so you’ll have tons of evidence that you can look at and see that this [outcome] is not just a fluke.” AE (G_100BioLabA) “[Evolution is] so much faster on the computer. You can get many, many generations in such a short time.
And again, we’re looking at Freshmen, who need to have these results pretty quick.” JO (D_100Evo) For JO (D_100Evo), the greatest appeal of Avida-ED came from its ability to allow students to ask questions of their own design, giving them agency and control over their learning and adding meaning to the experience: “I want students to be engaged in something that’s their own, something that they can invest in, so that’s their own problem and their own focus… Mostly it’s about the student experience.” In sum, these instructors chose to use Avida-ED in their classrooms because it afforded them the ability to teach evolution via inquiry in a short amount of time. Not only would students be able to see digital populations evolving in real time, they would be able to manipulate the conditions under which those populations were evolving. In some cases, students could pursue their own questions. This afforded the students a sense of ownership and agency, which would foster interest and motivation. In the end, all of the instructors agreed that their implementations of Avida-ED were quite successful, even when the student assessment data did not reflect large gains. In addition, all of them stated that they would use Avida-ED in the future, although some, particularly those instructors using Avida-ED for the first time, suggested ways in which they would change the implementation to make it run more smoothly or to avoid some of the issues they had come up against. Most of the challenges that instructors encountered were related either to technological issues (e.g., the software was much slower for students with outdated versions of operating systems or older computers) or to their own unfamiliarity with Avida-ED (see Discussion below).
However, all of the instructors agreed that Avida-ED matched very well with their personal philosophies on teaching, particularly because it allowed students to learn about evolution while simultaneously engaging in authentic science practice, and found using Avida-ED to teach about evolution to be valuable for this reason. Overall, instructor feedback was overwhelmingly positive: “I wouldn’t be teaching evolution at all if it weren’t for Avida-ED. It makes it clear that evolution can be active process.” TT (H_100CompBio) “I think that Avida-ED makes me think that people are capable of understanding [evolution]. I have rejected what most people say, that most people aren’t going to get this, the general population can’t get this, it’s too hard. And I’m like, no, the general population isn’t getting this because we’re not giving them experiences like Avida-ED.” YN (A_APBio)

Chapter 5. Student learning outcomes

Overview.

One major question driving this study was the extent to which Avida-ED influences student learning and acceptance of evolution. To investigate this, I asked instructors to administer an assessment immediately before and after their students had engaged in exercises with Avida-ED. The assessment measured both the students’ understanding of fundamental concepts (the origins of genetic diversity and the basic mechanism of adaptive evolution by natural selection) and their acceptance of evolution as a real phenomenon and explanatory, evidence-based theory. In this chapter, I report the patterns that emerged from looking across student assessment outcomes for each case. For details on the assessment instrument and how content and acceptance scores were determined, see Chapter 2.

Student learning of foundational evolution concepts.

Student responses to two open-ended assessment items were scored using a rubric and represented as a proportion of an ideal response (see Chapter 2).
Mean content scores for each case were then compared to reveal significant differences from pre- to post-test (one-tailed paired Student’s t-test for cases in which scores were normally distributed, and one-tailed Mann-Whitney U test for cases in which score distributions did not meet assumptions of normality according to a Kolmogorov-Smirnov Goodness of Fit test). Table 9 provides a summary of the data for each case. In six of the ten cases, average student content scores increased significantly from pre- to post-test (Fig. 4). All six of these were lower-division courses. Only one lower-division course (D_100Evo) did not show a statistically significant increase on the post-test, though it was quite close (p = 0.051), and there was a moderate effect size (d = 0.59). None of the upper-division courses had significant increases from pre- to post-test in average student content score. In two of these three cases, B_300Evo and F_400Evo, students had average pre-test content scores of 51% and 52% of the ideal response, respectively. These were the highest pre-test scores of all ten cases. Students in these two courses had similarly high scores on the post-test. In contrast, students in the third upper-division course, C_400Evo, had the second lowest pre-test score (19%) and the lowest post-test score (26%).

Table 9. Summary of average content scores by case. Pre- and post-test content scores and pre/post change are reported as the average percentage of the ideal response. Significance values marked with an asterisk were calculated using non-parametric analyses (Mann-Whitney U test). Effect sizes were determined using Cohen’s d and are interpreted accordingly: 0.2 = small, 0.5 = moderate, 0.8 = large.
Case Code      n    Pre content  Post content  Pre/post change  p      Effect Size (d)
A_APBio        17   46%          56%           10%              0.01   0.69
B_300Evo       9    51%          53%           2%               0.40   0.13
C_400Evo       15   19%          26%           7%               0.09   0.51
D_100Evo       12   21%          31%           10%              0.05   0.59
E_200Bio_HC    30   17%          32%           15%              0.00   1.62
F_400Evo       33   52%          56%           4%               0.12   0.22
G_100BioLabA   153  33%          39%           6%               0.00*  0.32
G_100BioLabB   234  36%          40%           5%               0.01*  0.29
G_100BioRes    101  36%          50%           15%              0.00*  0.85
H_100CompBio   24   33%          44%           11%              0.01   0.68

Figure 4. Average pre/post student content scores for each of the ten cases. Error bars are based on the standard error of the mean. Possible scores range from 0% - 100% of the ideal response. Pre/post differences with a significance level of p < 0.05 are denoted by a single asterisk; significance levels of p < 0.01 are denoted by double asterisks. Mean differences were tested for significance using one-tailed paired Student’s t-tests for cases with normal distributions, and Mann-Whitney U tests for those with non-normal distributions.

Student acceptance of evolution.

Average student acceptance scores for the ten cases ranged from 73.94 to 90.04 on the pre-test and 76.28 to 91.06 on the post-test, or moderate to very high acceptance for both pre- and post-tests (Table 10). Average acceptance score increased significantly from pre- to post-test in four of the ten cases (Fig. 5). These four cases were all lower-division courses that had statistically significant gains in average content score. As with the content score, students in two of the three upper-division courses, B_300Evo and F_400Evo, had very high acceptance on both the pre- and post-tests, with no significant change. Students in the remaining upper-division course, C_400Evo, also failed to show a significant change in acceptance from pre- to post-test; however, their average acceptance score on the post-test was the lowest of all of the cases (76.28).

Table 10. Summary of average acceptance scores by case.
Significance values marked with an asterisk were calculated using non-parametric analyses (Mann-Whitney U test). Effect sizes were determined using Cohen’s d and are interpreted accordingly: 0.2 = small, 0.5 = moderate, 0.8 = large.

Case Code      n    Pre acceptance  Post acceptance  Pre/post change  p      Effect Size (d)
A_APBio        17   82.98           89.85            6.87             0.00   0.92
B_300Evo       9    90.04           88.51            -1.53            0.15   -0.15
C_400Evo       15   76.15           76.28            0.13             0.48   0.01
D_100Evo       12   75.48           79.10            3.62             0.29   0.21
E_200Bio_HC    30   81.86           88.82            6.95             0.00   0.66
F_400Evo       33   88.58           91.06            2.48             0.06   0.22
G_100BioLabA   153  80.70           82.23            1.53             0.22*  0.14
G_100BioLabB   234  80.21           83.14            2.93             0.00*  0.28
G_100BioRes    101  73.94           78.03            4.09             0.02*  0.29
H_100CompBio   24   86.17           88.73            2.57             0.06   0.24

Figure 5. Average pre/post student acceptance scores for each of the ten cases. Error bars are based on the standard error of the mean. Possible scores range from 20 (complete nonacceptance) to 100 (complete acceptance). Pre/post differences with a significance level of p < 0.05 are denoted by a single asterisk; significance levels of p < 0.01 are denoted by double asterisks. Mean differences were tested for significance using one-tailed paired Student’s t-tests for cases with normal distributions, and Mann-Whitney U tests for those with non-normal distributions.

Understanding and acceptance of evolution.

Most of the lower-division courses showed significant increases in both average content and average acceptance scores, suggesting a relationship between these two factors. A Pearson correlation confirmed that a significant, positive association exists between the change in average content score and the change in average acceptance score across the ten cases (r = 0.73, p < 0.01; Fig. 6).
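The per-case pre/post comparison procedure described above (a normality check on the paired differences, then a one-tailed paired t-test or a Mann-Whitney U test, with Cohen's d as the effect size) can be sketched in Python. This is a minimal illustration only: the scores and the helper name `compare_pre_post` are hypothetical, not the study's actual analysis code.

```python
# Minimal sketch of the pre/post comparison procedure described above.
# Scores and the helper name are hypothetical.
import numpy as np
from scipy import stats

def compare_pre_post(pre, post, alpha=0.05):
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    diffs = post - pre
    # Kolmogorov-Smirnov goodness-of-fit test against a normal
    # distribution fitted to the paired differences
    ks = stats.kstest(diffs, 'norm', args=(diffs.mean(), diffs.std(ddof=1)))
    if ks.pvalue > alpha:
        # differences look normal: one-tailed paired Student's t-test
        p = stats.ttest_rel(pre, post, alternative='less').pvalue
        test = 't-test'
    else:
        # non-normal: one-tailed Mann-Whitney U test
        p = stats.mannwhitneyu(pre, post, alternative='less').pvalue
        test = 'Mann-Whitney U'
    # Cohen's d from the pooled standard deviation
    # (interpreted as 0.2 = small, 0.5 = moderate, 0.8 = large)
    pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2)
    d = (post.mean() - pre.mean()) / pooled_sd
    return test, p, d

# Hypothetical case: content scores (% of ideal response) for five students
test, p, d = compare_pre_post([20, 25, 30, 35, 40], [30, 34, 41, 44, 52])
```

The tests are one-tailed because the hypothesis is directional: scores are expected to increase from pre- to post-test.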
In order to ensure that this pattern reflected actual student learning gains and was not simply the result of lower-division students showing greater gains due to their inexperience with evolution content, normalized gains (g-avg; Hake, 2002) were calculated for each student’s pre- and post-assessment scores, which were then averaged for each case. Average normalized gains in student content and acceptance scores across the ten cases were also significantly, positively correlated (r = 0.60, p < 0.05; Fig. 7), further supporting a positive association between student learning of foundational evolution concepts and their acceptance of evolution subsequent to engaging in lessons with Avida-ED.

Figure 6. A Pearson correlation between pre/post change in average acceptance score and average content score across the ten cases reveals a significant, positive relationship (r = 0.73; p < 0.01).

Figure 7. A Pearson correlation between average student normalized gains in both content and acceptance scores across the ten cases reveals a significant, positive relationship (r = 0.60; p < 0.05).

Chapter 6. Student affective response to Avida-ED

Overview.

This chapter summarizes the results of a survey on student experiences that was administered by course instructors after the post-assessment once students had completed all work with Avida-ED. The survey included a number of items intended to measure various affective responses to Avida-ED, including student interest and enjoyment, perceptions of learning, and self-efficacy. The initial purpose of the survey was formative in nature, as a way to obtain student feedback in order to improve the software for future student use.
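The normalized gain referenced above (g; Hake, 2002) scales each student's raw pre/post improvement by the maximum improvement available to that student, so students who start near the ceiling are not penalized. A minimal sketch, with a hypothetical function name and illustrative values:

```python
# Normalized gain (Hake, 2002): raw gain divided by the gain still
# available on the pre-test. Function name and values are illustrative.
def normalized_gain(pre, post, max_score=100.0):
    if max_score == pre:          # already at ceiling: no gain possible
        return 0.0
    return (post - pre) / (max_score - pre)

# A student moving from 40% to 70% realizes half of the available gain:
g = normalized_gain(40.0, 70.0)   # (70 - 40) / (100 - 40) = 0.5
```

Averaging these per-student gains within a case yields the g-avg values that are compared across cases.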
However, cross-case analysis of student survey responses in light of other variables (assessment results and instructor familiarity with Avida-ED) revealed patterns of interest, namely that there was no association between student affect toward Avida-ED or instructor familiarity and learning or acceptance of evolution, but there was a strong association between student affective response and instructor familiarity with Avida-ED.

Findings.

With the exception of case G_100BioRes, student survey responses were generally favorable (mean greater than 3; Table 11). Students reported that they were attentive in class and actively participated in exercises involving Avida-ED. They had little difficulty understanding and using the software. Most students were interested in Avida-ED and enjoyed using it. Many students, particularly those in A_APBio and H_100CompBio, reported that Avida-ED helped them to better understand both evolution and the nature of science, and that they felt much more comfortable discussing the topic of evolution. However, most students were not particularly interested in continuing their experimentation with Avida-ED or in sharing the program with their families and friends. In terms of instructional resources, students appreciated most of what was provided to them.

Table 11. Summary of survey data for each of seven cases. Survey items were based on a 5-point Likert scale (1 = low; 5 = high); item mean response is reported for each case. Values are listed in case order: A_APBio, B_300Evo, E_200Bio_HC, G_100BioLabA, G_100BioLabB, G_100BioRes, H_100CompBio.

N:  19  9  31  154  268  112  28
Your general level of attentiveness during lessons involving Avida-ED:  3.53  3.44  3.26  3.56  3.40  2.67  3.61
Your general level of participation during lessons involving Avida-ED:  4.63  4.11  4.23  4.69  4.52  2.93  4.14
The relative ease with which you were able to understand Avida-ED:  3.95  3.56  3.13  4.07  3.89  3.53  4.11
The relative ease with which you were able to use Avida-ED:  3.68  3.44  3.45  4.18  3.99  2.73  3.93
Your overall interest in Avida-ED:  3.47  3.44  2.87  2.92  2.61  2.01  3.36
Your overall enjoyment of Avida-ED:  3.89  3.44  3.42  3.53  3.25  2.46  3.57
Avida-ED significantly increased my understanding of evolutionary processes:  4.53  3.11  3.92  3.80  3.65  2.66  4.25
Avida-ED significantly increased my understanding of the process of science:  3.95  2.78  3.75  3.49  3.42  2.70  3.96
I feel much more comfortable discussing the topic of evolution:  3.84  2.56  3.83  3.57  3.57  2.76  4.14
I will continue to experiment with Avida-ED in my free time:  2.05  2.78  1.42  2.05  1.85  1.73  2.96
I have or will share Avida-ED with friends or family:  2.37  2.44  1.79  2.08  1.99  1.75  3.00
Instructor Familiarity:  3  2  2  2  2  1  3
Helpfulness of resources (1 = not at all helpful; 5 = extremely helpful):
  Avida-ED tutorial:  3.79  3.00  3.00  3.18  2.99  2.69  3.38
  Avida-ED lessons/projects:  3.53  3.56  3.41  3.33  3.15  2.26  3.82
  In-class demonstration(s):  3.83  3.17  3.41  3.61  3.44  2.30  3.56
  Conversations with peers:  3.47  3.75  3.75  3.67  3.55  2.83  3.96
  Conversations with instructor(s):  3.94  3.75  4.09  3.88  3.76  2.49  3.92
  Avida-ED user’s manual:  2.50  3.33  2.33  2.74  2.55  2.21  2.67
Content g-avg:  17%  -10%  18%  1%  2%  20%  9%
Acceptance g-avg:  37%  -9%  39%  3%  4%  12%  14%
They seemed to find conversations with instructors to be particularly helpful, but tended not to find the Avida-ED user’s manual to be of great use. A correlation matrix (Table 12) shows the Pearson correlation coefficients between different survey items, as well as instructor familiarity with Avida-ED and student content and acceptance normalized gains from the assessment. There were several strong associations between various affective factors (e.g., interest and enjoyment in Avida-ED were quite closely associated, r = 0.90, p < 0.01). Several of the affective measures were also significantly and positively associated with instructor familiarity. However, neither instructor familiarity nor any of the affective factors were associated significantly with gains in content or acceptance.

Table 12. Correlation matrix showing associations between survey items, instructor familiarity with Avida-ED, and assessment data (average normalized student gains in content and acceptance scores). Cells shaded in yellow are significant at the 0.05 level while cells shaded in orange are significant at the 0.01 level. Column numbers correspond to the numbered row variables.

Variable                     (1)    (2)    (3)    (4)    (5)    (6)    (7)    (8)    (9)    (10)   (11)   (12)
(1) Attentive
(2) Participate              0.89
(3) Understand               0.57   0.43
(4) Use                      0.89   0.89   0.67
(5) Interest                 0.85   0.64   0.31   0.51
(6) Enjoy                    0.92   0.88   0.38   0.72   0.90
(7) Increase Evolution       0.76   0.75   0.43   0.67   0.66   0.86
(8) Increase Science         0.64   0.65   0.36   0.63   0.50   0.74   0.97
(9) Comfort                  0.55   0.56   0.37   0.62   0.33   0.59   0.90   0.97
(10) Continue                0.52   0.11   0.52   0.29   0.64   0.35   0.13   0.03   -0.04
(11) Share                   0.67   0.28   0.60   0.43   0.76   0.57   0.48   0.40   0.33   0.92
(12) Instructor Familiarity  0.83   0.67   0.53   0.63   0.85   0.89   0.91   0.83   0.72   0.50   0.78
Content g-avg                -0.50  -0.37  -0.26  -0.45  -0.39  -0.24  0.19   0.32   0.37   -0.58  -0.33  -0.05
Accept g-avg                 -0.06  0.10   -0.29  -0.13  0.07   0.27   0.57   0.65   0.61   -0.50  -0.16  0.33

Table 13 shows correlation coefficients between student ratings of different resources in terms of how helpful they were (1 = not at all helpful, 5 = extremely helpful) and the various affective factors. With the exception of the Avida-ED user’s manual, most of the resources were significantly and positively associated with ease of use, interest, enjoyment, and the degree to which students felt Avida-ED helped them to understand evolution. All resources, again excluding the user’s manual, were also significantly and positively associated with instructor familiarity. The Avida-ED user’s manual was only significantly associated with average normalized student gains in content and acceptance scores; however, these associations were strongly negative (r = -0.91, p < 0.01; r = -0.67, p < 0.05, respectively). That is, students who reported that the Avida-ED user’s manual was helpful tended to have lower normalized gains in both content and acceptance scores than students who did not find it helpful.

Table 13. Correlations between survey items dealing with affect and the degree to which students found various resources helpful. Cells shaded in yellow are significant at the 0.05 level while cells shaded in orange are significant at the 0.01 level.

                        Avida-ED  Lessons/  In-class  Conversations  Conversations       Avida-ED
                        tutorial  projects  demos     with peers     with instructor(s)  user’s manual
Understand              0.56      0.31      0.47      0.22           0.18                0.19
Use                     0.52      0.69      0.85      0.73           0.77                0.32
Interest                0.76      0.93      0.75      0.77           0.77                0.64
Enjoy                   0.85      0.91      0.95      0.77           0.91                0.40
Increase Evolution      0.88      0.75      0.91      0.61           0.79                -0.08
Increase Science        0.77      0.66      0.83      0.58           0.73                -0.25
Comfort                 0.63      0.32      0.60      0.55           0.57                0.73
Continue                0.73      0.21      0.46      0.52           0.51                0.62
Share                   0.64      0.20      0.41      -0.37          0.73                0.52
Instructor Familiarity  0.91      0.87      0.86      0.69           0.75                0.22
Content g-avg           0.11      -0.39     -0.19     -0.48          -0.30               -0.91
Accept g-avg            0.44      0.10      0.28      -0.03          0.24                -0.67

Chapter 7. Discussion

Overview.

In this final chapter, I synthesize the findings reported in the previous four chapters in order to tell a story of teaching and learning with digital evolution. I first summarize each component of the study individually, then discuss the implications of these findings for teaching with digital evolution, and for science education in general. Next I present the limitations of the study, and my thoughts on future directions that this research may take. I end with a brief summary of key findings.

Discussion of findings.

Implementation of Avida-ED

Despite differences in the specific activities used, all of the instructors participating in the study implemented Avida-ED in ways that were, to varying degrees, both aligned with reform recommendations and consistent with its intended use, specifically as a tool for the integration of evolution content and the active, authentic engagement of students in research activities. Many of the instructors endeavored to take full advantage of Avida-ED’s strengths by having students pose their own questions and design a study.
This sort of open-ended project led to engagement in a wide range of research activities. However, even implementations that were more guided (e.g., A_APBio and D_100Evo) remained well within the range of practices considered learner-centered inquiry (National Research Council, 2000). Therefore, all ten cases illustrated a moderate to high degree of consistency with intended use. Implementations differed primarily in duration, amount of instructional support provided, and the source of lesson materials used. Duration of implementations ranged from one to fifteen weeks, with a median duration of two weeks. (It is important to note that although the number of weeks serves as a rough estimate of duration, the actual duration is dependent on the number of hours spent on Avida-ED in class as well as the amount of time students spent working with Avida-ED on their own outside of class. Therefore, the actual duration is difficult to estimate.) Although there is evidence from the literature that increasing the duration of implementations is associated with more favorable outcomes (Durlak & DuPre, 2008; Sadler et al., 2010), there was no evidence of an effect of duration on student assessment outcomes (content and acceptance score) in this study. However, duration may have influenced student affective outcomes; the case with the shortest implementation (G_100BioRes) also had the poorest outcomes with respect to student affect, and duration was significantly negatively correlated with how well students could understand Avida-ED (r = -0.76, p < 0.05), although duration was not significantly correlated with any other affective factor. It is likely that other factors in addition to duration played a role in influencing student response in this case. The instructors differed with regard to the amount of instructional support they provided students.
For example, most instructors encouraged the students to work in groups on their projects, while students in G_100BioRes primarily worked by themselves. These students may not have been able to utilize their peers for support and feedback as they worked on their projects. Some instructors explicitly pointed students to resources (e.g., NN provided URL links to the Avida-ED website, which included the user’s manual, as well as the video tutorial on YouTube), while others did not; those students who were not explicitly directed to resources may not have been aware of their existence and would therefore not have had access to them. The instructors also differed in the amount of time they spent introducing students to Avida-ED. For example, YN spent five 50-minute class periods working on introductory exercises with her students in order to prepare them for the open-ended inquiry activity that they completed on their own outside of class. In contrast, AR spent only ten to fifteen minutes demonstrating the program to the students before assigning the same exercise. The cases also differed in the source of the lesson materials that were used, and this varied with the degree to which instructors were familiar with Avida-ED. Expert users designed their own materials while novices used existing materials produced by others, and experienced users used some combination of the two (e.g., materials that were produced by modifying existing materials). Instructors who were more experienced with Avida-ED may have been better prepared to design new materials or modify existing materials and align these with specific course objectives. As an example, two of the expert users (YN and NR) were able to directly link their experiments with digital organisms to biological systems.
It appears as though increased experience with using Avida-ED may afford instructors increased creativity and the ability to imagine how broadly digital evolution can be applied in their classrooms (Koehler & Mishra, 2008). This same pattern has been found…

Instructor beliefs

Interviews with each of the participating instructors revealed that, although they came from a variety of backgrounds and were teaching different courses, they all held views of teaching and learning that are closely aligned with those advanced by national science education reform initiatives and supported by evidence from discipline-based education research (Achieve, 2013; American Association for the Advancement of Science, 2011; Singer et al., 2012). These included learner-centered, inquiry-based pedagogical strategies that integrated course content with authentic science practices, and were reflected in their goals for student learning. In hindsight, this should not be surprising, as it likely explains both their enthusiasm toward Avida-ED and their willingness to participate in education research. These findings are consistent with the literature on teacher beliefs and their influences on practice. Cronin-Jones (1991) has argued that teacher beliefs are primarily influenced by four factors. These include views on what subject matter knowledge students need to know, beliefs about how students learn and the teacher’s role in their learning, and student capabilities. The instructors in this study had very similar goals for student learning, both in terms of science in general and evolution in particular. They all wanted their students to appreciate the nature of science by engaging directly in authentic science practices. For evolution, their goals consisted primarily of addressing common misconceptions and helping students develop a deeper understanding of fundamental evolutionary concepts.
These goals aligned well with their inquiry-based teaching approaches, which illustrate their beliefs about how students learn science; that is, via active, learner-centered pedagogy. Finally, the instructors saw themselves as creators of opportunities for student engagement and enthusiastic mentors or guides, there to facilitate rather than direct student learning. A number of the instructors illustrated their familiarity with reform-based best practices by lamenting that they still lectured more than they would like due to time limitations and large class sizes, but emphasized that they attempted to make those lectures as interactive as possible by encouraging students to ask questions, work in groups, and engage in meaningful discussion. Instructors also employed a broad range of tools in their teaching to keep students actively involved, including an array of educational technologies. Through their beliefs and actions, these instructors displayed a firm commitment to their students’ learning. All of the instructors indicated having had positive experiences with using Avida-ED in their classrooms, and stated that Avida-ED aligned well with their teaching philosophies, particularly because it allowed students to observe evolution happening and to engage in authentic research practices, such as pursuing questions and hypotheses that they had developed with experimental protocols of their own design, collecting and analyzing data, and communicating their results. Several instructors stated that Avida-ED was the only tool they had encountered that allowed them to teach evolution in this way, and in such a short time, which is often a limiting factor in college courses. Although very similar in their views on teaching and learning, there were other ways in which the instructors differed. One major difference was their degree of familiarity with Avida-ED.
Three instructors were considered expert users, who had worked with Avida-ED to a significant extent and possessed a deep understanding of its theoretical underpinnings, affordances, and limitations. Four instructors had used it prior to this implementation and so were considered experienced, although their knowledge of the program was not as deep as the experts’. The remaining instructors were considered novices, as this was their first time implementing Avida-ED in the classroom, and they possessed very little knowledge about its affordances and limitations. The degree to which instructors were familiar with Avida-ED may have influenced their experience with using the program. All three expert Avida-ED users are members of BEACON, an NSF-funded Science Center for the study of evolution in action. Avida and Avida-ED are the flagship products of BEACON research on computational experimental evolution and evolution education. The expert users were familiar with Avida through their participation in BEACON, where work on Avida and Avida-ED is presented frequently. They are familiar with the literature on both programs, in some cases having contributed directly to that literature, and have used both in teaching and research capacities. Indeed, one instructor (YN) had used Avida-ED in her Master’s thesis, and the dissertation research of another (TT) had been based entirely on his experiments in Avida. The third instructor (NR) has served as co-principal investigator on several funded research projects involving Avida. These instructors are intimately aware of the programs’ capabilities. They understand the substrate neutrality of natural selection as a process (Dennett, 1995), and how, therefore, the evolution of digital organisms serves as a model for the evolutionary process, including biological evolution (Pennock, 2007a, 2007b).
Instructors who were not as familiar with Avida sometimes expressed frustration that the program was not more closely aligned with their preferred model system. For example, Avidians are asexual, and so genetic recombination during sexual reproduction cannot account for variation in the population. Avida cannot be used to model the Central Dogma of molecular biology (Crick, 1970), because the Avidian genome consists of computer code that directly accounts for phenotype – phenotypic expression is not mediated by transcription, translation, or protein synthesis as in biological organisms with genomes composed of nucleic acids. In addition, Avidians are haploid; they are essentially prokaryotic organisms, which can pose a challenge when the majority of organisms used as examples by an instructor are eukaryotic. In light of these apparent limitations, it can be difficult for instructors to appreciate the value of using a digital system to model “real life”:

RY (G_100BioRes): “I wonder what the cost/benefit analysis would be for spending all of that time on [learning to use Avida-ED]. Because [Avidians] are asexual – well, because they are so weird. They’re just these letters. And [students] have to make this connection, and I wonder if it takes so much for us to-”

AR: “There’s a lot of conceptual jumps that they have to make to apply the Avida system.”

RY: “We really want them to explain what’s happening in the cases we use. What’s happening in elephants, or what’s happening in snakes. What’s happening in real life. There’s a couple of things we need to deal with there, and I wonder if they get caught on – I wonder if we spent more time, how easy it would be for them to move from genes and alleles and As, Ts, Cs, and Gs and proteins to the Avidians.
And since the Avidians don’t have that gene to protein to phenotype step that is really emphasized then it is not the model that we are using… There are no proteins, and the little colored balls are not nucleotides, and they are prokaryotes, which we don’t emphasize prokaryotes in our class. That’s just us.”

AR: “I think that’s just biology in general.”

RY: “If this were the cell and molecular class and we were looking at transcription and translation, then maybe we would spend more time on prokaryotes. But when you’re teaching an organismal class, you just don’t use prokaryotes.” (Post-implementation interview)

The issue motivating RY’s questions seems to be concern over the universality of Avida as a model system. A perennial challenge of using model organisms in science courses is that it can be difficult for students to see the general, broad patterns that the models are meant to illustrate; instead, students can get bogged down by the idiosyncrasies of particular organisms or systems (Grosslight, Unger, Jay, & Smith, 1991; Harrison & Treagust, 2000). This is true whether Drosophila, Escherichia coli, or Avidians are the focus. To avoid this “forest for the trees” phenomenon, it is important for instructors to connect model organisms and systems to “real world” contexts that exhibit the same fundamental patterns. Doing so can increase relevancy for students and help them to understand phenomena at a level of generality that transcends specific examples, ultimately allowing students to apply ideas more broadly. That is the intended purpose of Avida-ED as a teaching tool: to serve not as a digital representation of a prokaryotic organism, but instead as a general model of universal evolutionary mechanisms. Expert users of Avida-ED understand this, and as a result are able to use the tool appropriately, drawing parallels between biological and digital evolution.
The less experienced users, on the other hand, may fall into the same trap as their students and get caught up in the specifics of the digital system to the extent that they struggle to see how it can be generalized to biological systems.

Another notable difference between expert and less experienced users, as mentioned above, has to do with the curricular materials they used in their implementations of Avida-ED. All three experts designed their own materials, while less experienced users tended to use materials that had been developed by someone else (e.g., the Avida-ED Curriculum Development team); this was true for all of the novice users. This is significant, because not only did the less experienced users have to deal with navigating unfamiliar software, they also had to contend with using unfamiliar lesson materials. In terms of pedagogical content knowledge, this left the less experienced users, and the novices especially, doubly disadvantaged.

The findings presented here make a strong case for the importance of professional development opportunities aimed at preparing teachers to effectively implement reform-based curricular materials in the classroom (Powell & Anderson, 2002; Sunal et al., 2001). Such opportunities will only become more essential as the use of technology in teaching increases in popularity. In response to this trend, Shulman’s ideas of pedagogical and curricular content knowledge have been extended to include technology, resulting in what has been named technological pedagogical content knowledge (or TPACK), a framework for understanding the complex interactions between content, pedagogy, and technology, and how these interactions produce effective and creative uses of technology for teaching (Koehler & Mishra, 2008; Mishra & Koehler, 2006; Voogt, Fisser, Pareja Roblin, Tondeur, & van Braak, 2013).
Although “technology” refers to any tools that might be used as instructional aids, including white boards, overhead projectors, and the like, TPACK usually deals with emergent technologies such as computers, the Internet, social media, and software that have been adopted for educational purposes. Because these technologies change so quickly and new technologies are constantly being invented, their effective integration into the classroom poses an additional challenge for teachers. Technological pedagogical content knowledge is a young and rapidly growing field; introduced in 2005 by Koehler and Mishra, it has generated dozens of research articles since that time (Voogt et al., 2013). However, the degree to which TPACK intersects with research on teacher beliefs has not yet been clearly established, particularly with regard to how teachers’ beliefs about teaching and learning influence their use of technology in the classroom, and vice versa.

Koehler and Mishra (2008) argue that particular technologies have specific affordances and constraints. These constraints are either inherent to the technology or imposed by the user’s preconceptions about its purpose; the latter condition is known as “functional fixedness,” whereby the ideas we hold about an object’s function can inhibit our ability to use the object for a different function. Functional fixedness often stands in the way of creative uses of a technology, and as such makes it difficult for teachers to imagine how that technology might be useful as a tool for teaching and learning. In the case of Avida, functional fixedness may lead an instructor to believe that it is useful only as a simulation in a computational context, and may preclude that instructor from imagining how it can be used as a dynamic model of biological evolution. One way around this issue is to direct instructors to the literature on Avida and to provide professional development experiences that involve experimenting with Avida-ED.
This work is already being undertaken at BEACON by the Avida-ED Curriculum Development team. As instructors become more experienced with using the program, they should also be encouraged to develop their own curricular materials for use with Avida-ED, just as the expert users in this study have done. As Koehler and Mishra (2008) point out:

The teacher, Dewey argued, is not merely the creator of the curriculum, but is a part of it: teachers are curriculum designers. The idea of teachers as curriculum designers is based on an awareness of the fact that implementation decisions lie primarily in the hands of particular teachers in particular classrooms. Teachers are active participants in any implementation or instructional reform we seek to achieve, and thus require a certain degree of autonomy and power in making pedagogical decisions. (p. 21; emphasis in the original)

In order to achieve the goals advanced by initiatives such as Vision & Change and the Next Generation Science Standards, teachers at all levels of schooling (K-16) must first possess an appropriate attitude toward change, one that embraces reform-oriented pedagogical practices (Cronin-Jones, 1991; Roehrig & Kruse, 2005), and second must be provided with opportunities to improve their pedagogical content knowledge by engaging in professional development activities (reviewed in Hew & Brush, 2007; Powell & Anderson, 2002; Roehrig & Luft, 2004; Schneider, Krajcik, & Blumenfeld, 2005; Sunal et al., 2001).

Student learning outcomes. Cross-case analysis of the student assessment data provides evidence that Avida-ED may be an effective tool for teaching evolution content. Students in lower-division biology courses who engaged in lessons with Avida-ED demonstrated increased knowledge of foundational evolutionary concepts, such as the role of random genetic mutations in producing variation at the population level and the basic mechanism of adaptive evolution by natural selection.
Evidence suggests that students’ experience with Avida-ED also had a positive influence on their acceptance of evolution. Average student acceptance scores increased significantly in four of the six lower-division courses, despite the fact that acceptance of evolution was already quite high on both the pre- and post-assessments for all ten cases in this study. There was a significant, positive correlation between change in content score and change in acceptance score from pre- to post-assessment across the ten cases. This pattern holds even when accounting for differences in education level by comparing average normalized student gains for both content and acceptance. These data indicate that there is a positive relationship between learning foundational evolution concepts and increasing acceptance of evolution.

In their exhaustive review of the literature on acceptance and understanding of evolution among undergraduate students, Lloyd-Strovas and Bernal (2012) concluded that there is sufficient evidence to support a positive relationship between instruction in evolution and understanding of evolution, and between instruction in evolution and acceptance of evolution. The results of the current study serve to further support these relationships. However, the authors also concluded that there is insufficient evidence to support a relationship of any kind between understanding of evolution and acceptance of evolution. Although this may seem counterintuitive at first, it should not be too surprising that there is little evidence of a positive relationship between understanding and acceptance of evolution. The two constructs are not necessarily linked; one need not understand evolution in order to accept it, nor must one accept evolution in order to understand it mechanistically. Meadows et al.
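The passage does not define the normalized gain used for these comparisons; a standard choice in science education research, and a reasonable assumption here, is Hake’s average normalized gain, which expresses a class’s improvement as a fraction of the improvement possible:

```latex
% Hake-style average normalized gain (an assumed formulation; the study's
% exact definition may differ). <S_pre> and <S_post> are mean percentage
% scores on the pre- and post-assessment.
\[
  \langle g \rangle
    = \frac{\langle S_{\mathrm{post}} \rangle - \langle S_{\mathrm{pre}} \rangle}
           {100\% - \langle S_{\mathrm{pre}} \rangle}
\]
```

Because the denominator scales each course’s gain by its room to improve, this statistic permits comparison across courses that begin at different pre-test levels, which is what makes it suitable for accounting for differences in education level.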
(2000) described teachers who held young-earth creationist beliefs but were able to compartmentalize their beliefs and teach the science of evolution without conflict (similar to Gould’s (1999) non-overlapping magisteria, the idea that scientific and metaphysical knowledge can co-exist without coming into conflict). In their widely cited paper on the subject, Bishop and Anderson (1990) comment on this lack of association between understanding and acceptance:

It appears that a majority of both sides of the evolution-creation debate do not understand the process of natural selection or its role in evolution. One result of this lack of knowledge is that the debate is reduced to, as creationists argue, a dispute between two different kinds of faith. Most students who believed in the truth of evolution apparently based their beliefs more on acceptance of the power and prestige of science than on an understanding of the reasoning that had led scientists to their conclusions. (p. 426)

In their study of non-science student misconceptions about evolution, Robbins and Roy (2007) echo this sentiment, finding that “[M]ost students who claimed to ‘believe in’ evolution were … only substituting a scientific authority for a divine one” (p. 462). An appeal to authority is not reflective of the kind of critical, evidence-based reasoning we hope to develop in our students. Instead, students should observe patterns in data and use these to construct logical arguments, just as scientists do.

In addition to a lack of association between overall levels of understanding and acceptance, most studies on the subject have found little or no relationship between the change in understanding due to instruction and the change in acceptance. Indeed, students may increase their levels of conceptual understanding without a concomitant increase in acceptance (Nadelson & Southerland, 2010).
Lawson and Worsnop (1992) noted that evolution acceptance does not change to the same degree as evolution knowledge. The same pattern was illustrated in the current study – more cases had significant increases in student content scores than in acceptance scores. Even so, the relationship between learning and acceptance in this study was strong.

Student affective response. It is essential to cultivate student interest and engagement in the classroom. Interest has been linked to motivation, self-efficacy, achievement, and intent (Ainley & Ainley, 2011; Brophy, 1983, 1999; Hidi & Renninger, 2006; Hidi et al., 2004; Linnenbrink & Pintrich, 2003; Linnenbrink-Garcia, Pugh, Koskey, & Stewart, 2012; Mitchell, 1993; Schraw & Lehman, 2001; Singh et al., 2002). Instructors who create and maintain situational interest in the classroom – by engaging students in cognitively challenging, hands-on, meaningful inquiry-based activities, and providing opportunities for choice and social interaction – may influence student personal interests and inspire students to persist in a field (Mitchell, 1993; Rotgans & Schmidt, 2011). Student affect in response to instructional approaches in science education could therefore be critical for retention of STEM students and increasing the number of graduates and workers in STEM fields, both of which have been cited as crucial for the economic future of the country (S. Olson & Riordan, 2012).

Avida-ED, as a virtual lab space for the observation and study of evolution in action, is theoretically well positioned to produce situational interest. From the assessment data collected in this study, I found significant increases in both understanding and acceptance of evolution for students in lower-division courses, and a significant, positive association between content and acceptance gains across all ten cases. These results suggest Avida-ED is a promising tool for teaching evolution in ways consistent with reform recommendations.
In addition, survey data provide evidence that Avida-ED elicits a favorable affective response from students. Therefore, Avida-ED may be useful not only as a tool for increasing student understanding and acceptance of evolution, but also for increasing student interest in science.

Associations between student responses to survey items were consistent with the literature on interest. Student interest was tied very closely to enjoyment and attentiveness, which were also associated with participation and ease of use, all positive interactions that are associated with increased motivation and learning (Ainley & Ainley, 2011; Brophy, 1983, 1999; Linnenbrink & Pintrich, 2003). In addition, students’ perceived increases in their understanding of evolution and of the nature of science were very strongly correlated (r = 0.97, p < 0.01), and each of these factors was related to how comfortable students felt when discussing evolution (r = 0.90 and 0.97, respectively; p < 0.01). The students’ perceptions of what they learned and their confidence in discussing these ideas may be linked to their self-efficacy, or the situation-specific belief that one can succeed in a given domain (Bandura, 1993). Self-efficacy has been shown to be important for predicting an individual’s success and persistence within a domain (Bandura, 1993; Fencl & Scheel, 2005; Linnenbrink & Pintrich, 2003; Singh et al., 2002). Student self-efficacy can influence academic performance, and can be positively influenced by certain instructional strategies, particularly those that are student-centered (Fencl & Scheel, 2005; Linnenbrink & Pintrich, 2003; Linnenbrink-Garcia et al., 2012). Instructors who engage students in these activities have an influence on student learning and on their self-efficacy.
This is also linked to students’ interest in science and confidence to do science:

“Attitudinal and affective variables such as self-concept, confidence in learning mathematics and science, mathematics/science interest and motivation, and self efficacy have emerged as salient predictors of achievement in mathematics and science. These factors also predict mathematics and science avoidance on the part of students, which affects long-term achievement and career aspirations in the mathematics/science fields” (Singh et al., 2002, p. 324).

Several studies in the research literature on interest have found a positive effect of interest and related factors, such as engagement, motivation, and self-efficacy, on student learning and achievement (Schraw et al., 2001; Schraw & Lehman, 2001). The data from this study, however, do not support an association between student affective and cognitive outcomes. No measures of affect reported by students were significantly correlated with normalized learning gains, nor were they associated with change in student acceptance of evolution. There were only weak to moderate positive associations between student perceived learning of evolution and actual gains in content and acceptance scores (r = 0.19 and 0.57, respectively), though neither was significant; this suggests that students may have difficulty predicting their own learning gains, particularly if their experience with Avida-ED was largely negative. Negative emotion, such as frustration and anxiety, felt during an instructional task can greatly influence student interest and engagement (Ainley & Ainley, 2011; Bergin, 1999; Linnenbrink & Pintrich, 2003; Singh et al., 2002). Unlike in most of the cases, students in G_100BioRes reported a predominantly negative experience with Avida-ED, having the lowest average for most survey items, even though their learning gains were very similar to those in other lower-division courses.
Indeed, when this case is removed from the analysis, the correlation between average normalized student gain in content score and student perceptions of what they learned becomes significant (n = 6, r = 0.85, p < 0.05), suggesting that students who had positive experiences with Avida-ED were much more likely to indicate that it had a positive influence on their learning.

Despite the lack of association between affective and cognitive outcomes in this study, there were several strong correlations between various affective factors and instructor familiarity with Avida-ED. In particular, students of more experienced instructors reported significantly greater levels of interest and enjoyment, as well as a greater perceived increase in understanding of evolution. In addition, students with more experienced instructors found instructional resources to be more helpful. Notably, the only case in this analysis that was taught by novices, G_100BioRes, was also the course in which students reported a largely negative experience using Avida-ED. Although the specific reasons are unknown, one could speculate, based on the nature of the implementation (see Chapter 3), that students were frustrated by being given a challenging assignment with Avida-ED, receiving very little introduction to the software and very little instructional support, and having little time to complete the assignment – the entire implementation lasted only one week. In addition, the instructors may have lacked sufficient technological pedagogical and curricular content knowledge of Avida-ED to support the students as they worked on the assignment. Research has shown that instructors can significantly influence student situational interest by providing adequate background knowledge needed for completing a task (Schraw et al., 2001) and possessing a high degree of content knowledge (Rotgans & Schmidt, 2011); both of these factors appear to have been lacking in the case of G_100BioRes.
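The sensitivity of a small-n correlation to a single case can be illustrated with a short sketch. The per-case numbers below are hypothetical stand-ins, not the study’s data; they are constructed so that one case has typical learning gains but an atypically low perceived-learning rating, mirroring the role G_100BioRes plays in the analysis described above:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical per-case values: (average normalized content gain,
# mean perceived-learning rating on a 1-5 scale). "G_100BioRes" mimics
# a case with ordinary gains but a negative affective response.
cases = {
    "A": (0.30, 4.1), "B": (0.42, 4.5), "C": (0.25, 3.9),
    "D": (0.38, 4.3), "E": (0.22, 3.8), "F": (0.35, 4.2),
    "G_100BioRes": (0.31, 2.0),
}

def case_correlation(data, exclude=()):
    """Correlate gains against perceived learning, optionally excluding cases."""
    pairs = [v for k, v in data.items() if k not in exclude]
    gains, perceived = zip(*pairs)
    return pearson(gains, perceived)

r_all = case_correlation(cases)                           # weakened by the outlier
r_sub = case_correlation(cases, exclude={"G_100BioRes"})  # strong without it
print(f"all cases: r = {r_all:.2f}; outlier removed: r = {r_sub:.2f}")
```

The single discordant case drags the overall correlation down sharply; removing it restores a near-linear relationship. This is precisely why such exclusions must be reported and justified, as is done above, rather than performed silently.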
These results suggest that it is beneficial for teachers to take the time to engage in professional development and improve their expertise with regard to the instructional strategies they choose to use. For Avida-ED, that could mean spending substantial time experimenting with the software, reading the literature on Avida, and coming to understand the philosophical and scientific bases of digital evolution to the degree that the expert users in this study did. Expert users are familiar with possible technical glitches within the software, and can also anticipate patterns that may surprise students and prompt questions. The expert instructors in this study understood that the evolution occurring in Avida-ED is real and illustrative of the substrate-neutral nature of evolutionary processes. This understanding allowed them to apply the same principles governing the evolution of digital organisms to other (e.g., biological) contexts, and may have helped students to see that the patterns transcend particular systems. Experts were able to develop sophisticated, parallel experiments that showed the same phenomena in both digital and biological populations; this may be key both for helping students make connections and for capturing student interest. Analyzing the data from the Avida-ED user’s survey through the lens of research on interest suggests that: 1) it is important to engage student interest; 2) Avida-ED is well suited to engage student interest; and 3) the degree to which Avida-ED engages student interest may depend on instructor familiarity, suggesting that professional development with regard to using and teaching with Avida-ED (and with instructional technologies more generally) is advisable.

Implications for science education. The outcomes of this study have provided valuable insights for the successful classroom implementation of reform- and research-based tools like Avida-ED, and potentially for science education in general.
Drawing from the cases in this study, I propose the following set of criteria as benchmarks for successful implementation.

Instructor goals for student learning must be aligned with reforms. Because Avida-ED was designed to integrate science content and practice, using it in more traditional ways (e.g., “cookbook” or confirmatory laboratory exercises) amounts to inappropriate assimilation of the tool and may result in poor outcomes (Henderson & Dancy, 2011).

Instructors must be aware of the role of context in science teaching. It is critical that instructors and their students do not become too caught up in the specifics of a particular model system, especially when the goal is for students to understand the target phenomena broadly, at a level that transcends individual contexts (Grosslight et al., 1991; Harrison & Treagust, 2000). Related to this point, it is good practice to include multiple contexts for illustrating phenomena (D. E. Brown, 1992). Evolution is a substrate-neutral process that will occur spontaneously in any system that possesses variation, inheritance, and selection (Dennett, 1995). In this sense, digital and biological evolution are merely two different contexts in which to observe the phenomenon of evolution. Selecting a single context to the exclusion of all others may interfere with students’ ability to generalize across contexts, particularly if the chosen context is not universally representative. For example, it is problematic when instructors do not see the value in simple model organisms such as bacteria and Avidians. For some instructors, haploid, asexual organisms are so far removed from their preferred multicellular, diploid, sexually reproducing examples that it may seem unnecessary to adopt a more generalized definition of evolution that is inclusive of prokaryotes. Instructors who limit their view of what counts as a legitimate model organism may not immediately recognize the applicability of a system like Avida.
This can have unintended negative consequences for student learning. For example, RY and AR defined evolution as “change in allele frequencies over time.” This is a common and useful definition of evolution so long as it is limited to organisms with DNA-based genomes. If, however, one wants students to understand the generalizability and substrate neutrality of natural selection, the above definition fails and serves only to constrain students’ understanding. This was observed several times in student responses to the second content question on the assessment. When asked what was necessary for an organism without cells or DNA to evolve, several students responded, “Evolution is change in allele frequencies over time. An organism without DNA doesn’t have alleles, therefore it cannot evolve.” This response often persisted even after students had used Avida-ED – a system in which the digital organisms lack cells and DNA, and yet evolve – demonstrating the limited scope of the students’ understanding of evolutionary mechanisms and their inability to apply them to novel contexts.

Instructor familiarity with teaching tools, including the limits and affordances of a tool, is critical. From a practical standpoint, instructors need to be technically familiar with the tool in order to anticipate or troubleshoot student difficulties. Beyond that, instructors need to be able to assess the extent to which a particular tool aligns with their objectives, and to understand a tool’s capabilities so that it can be used to its full potential. All of these issues serve to underscore the importance of pedagogical content knowledge, and particularly technological pedagogical content knowledge. To increase levels of TPACK, it is essential that instructors engage in professional development activities around the curricular materials they choose to use. For Avida-ED, this is especially true in light of the students’ affective responses.
In contrast to other studies on student interest, there was no association between student affect and learning or acceptance of evolution. There was, however, a strong relationship between student affective response and instructor familiarity with Avida-ED, suggesting that instructor TPACK influences this relationship as well.

It is important to provide students with sufficient instructional support over time. The duration of an implementation matters (Durlak & DuPre, 2008; Sadler et al., 2010), and students should be given ample time to become familiar with a tool and to complete all associated tasks. In this study, two weeks seemed to be enough time for most students to learn how to use Avida-ED and engage in a more or less full inquiry cycle; however, allowing less time may have had a negative impact on student affect (case G_100BioRes). Also, it is advisable to provide students with a demonstration or tutorial – a series of simple tasks designed to familiarize students with the tool – allowing them to ease into it, progressing from simple to more complex concepts and practices. As mentioned above, and depending on one’s goals for student learning, it is good practice to provide students with multiple contexts in which to observe phenomena. Students may also appreciate the provision of supportive materials such as supplementary articles, videos, or user’s manuals. Although the current study did not find a significant effect of instructional supports on student learning outcomes, there was evidence of a significant association with student affect, which has been shown to influence student interest and intention.

Finally, it is critical for instructors to know their students, assessing early and often. Formative assessment serves to inform instructors of student misunderstandings and difficulties, allowing them to adjust their instruction accordingly to meet student needs.
It is especially important to monitor student progress when introducing a new tool like Avida-ED. Several instructors noted that Avida-ED was challenging for their students (especially freshmen) because it is a haploid, asexual system similar to prokaryotes, and the students simply had no experience with prokaryotes. The references to bacterial cultures (Petri dish, freezer, etc.) had little meaning to them. JO [D_100Evo] also encountered this issue and explained during his post-implementation interview:

“[Avida-ED] is set up to look like the sorts of bench experiments that people in this field do, and a lot of the references to the types of things that you experience when you work with bacterial culture on the bench, these guys had no experience with. So this stuff didn't really have any meaning to them. You know, putting stuff on a plate and putting in the freezer, all of those things that are typical practices for that kind of a lab, they had no experience with and so they had no way of, it didn't help them, in other words. They didn't know what they were doing, they didn't know why they were doing it if they had nothing to relate it to.”

The purpose of those analogies – the circular Avidian genome, referring to the population viewer as a virtual Petri dish, saving digital organisms to a “freezer” – is to give students some way to anchor the foreign digital system to the familiar organic world. If students are not familiar with bacteria and the common procedures used to study them, the analogy is rendered useless and may actually become a source of confusion. Another problem arises when students do recognize the similarities between Avidians and bacteria, but do not understand the analogical relationship and instead consider Avidians a digital representation of bacteria. Avidians are, of course, not bacteria, and their “biology” is quite different; if students do not understand this, it can lead to misconceptions.
Assessing students with regard to these ideas early in an implementation may help to avoid potential confusion.

Limitations of the study. The evidence gathered from this study suggests that Avida-ED may be a powerful tool for increasing student understanding and acceptance of evolution. However, there are several important limitations worth noting, primarily related to the scope and design of the study.

With regard to the analysis of student responses to open-ended assessment items, I was rather conservative. Rather than analyzing responses at the level of individual critical components or even individual questions, I collapsed student responses across the two questions into a single content score. This reduced variability, providing greater statistical power and making differences across cases clearer, but at the expense of detail. The cumulative content score reveals nothing about the variability in student responses from case to case, and there was indeed variability. For example, in three of the cases (A_APBio, B_300Evo, and F_400Evo) one hundred percent of students mentioned “mutations” (one of the two critical components) in their responses on both the pre- and post-test. Therefore, any gains on this question came from an increase in the number of students who also mentioned that these mutations are random (the second critical component) on the post-test. Differences such as these among the cases are likely attributable to differences in instruction, as instructors focused on concepts to differing degrees depending on their objectives. Student outcomes were therefore most certainly dependent upon instruction, but reporting this level of detail was not a priority given the goals of the study. Another limitation imposed by my analytic conservatism concerned the strict criteria I used when analyzing student open-ended responses.
I compared student responses to a rubric based on ideal responses, essentially ignoring any parts of student responses that were not related to the critical components I had identified. In so doing, I lost much of the nuance present in student responses, perhaps giving the impression that the other ideas students contributed were incorrect (indeed, I was criticized by a participating instructor for the “narrow mindedness” of my scoring). This strict evaluation of student responses might also have created a ceiling effect with regard to student scores. Note that the mean content score did not exceed 60% of the ideal response for any case, even on the post-test (Fig. 4). This result may seem discouraging, but it probably has less to do with student ability and more to do with how difficult it would have been for any student to earn a perfect content score given the rubric (see below for further discussion of this point). Although the evidence presented here suggests that Avida-ED positively influences both student understanding and acceptance of evolution, it is important to note that any gains due to Avida-ED cannot be isolated from other instruction that was happening at the same time. That is, there is no way to claim that Avida-ED was entirely responsible for observed increases in student outcomes. However, because Avida-ED was part of the instruction in each of these cases, and because the pre- and post-assessments were given immediately before and after all instruction involving Avida-ED, it is safe to say that Avida-ED at least contributed to those positive outcomes. It was not the aim of this study to determine the degree to which instruction with Avida-ED may or may not be more effective than other instructional approaches. Directions for future research. This study has provided several insights with regard to teaching and learning science with digital evolution specifically, and educational technology more generally.
However, the study raised many questions and issues that may deserve further investigation. Perhaps the most compelling outcome of this study is the apparent influence of Avida-ED on student understanding and acceptance of evolution, and the relationship between these factors. Given that the majority of studies on understanding and acceptance of evolution have failed to show that the two constructs are related (Bishop & Anderson, 1990; Brem et al., 2003; Demastes et al., 1995; Ingram & Nelson, 2006; Lloyd-Strovas & Bernal, 2012), how might we account for the significant, positive relationship found in the current study? There are several hypotheses that could explain this pattern and provide avenues for future investigations. Hypothesis 1 (“Seeing is believing”): Student gains in understanding and acceptance of evolution may be due to their use of Avida-ED, an instructional technology that integrates content with authentic science practice just as advocated by national reforms. Avida represents an instance of the evolutionary process rather than a simulation of it. With Avida-ED, students directly observe and experiment with populations of evolving digital organisms, collecting and analyzing real data while pursuing answers to questions that they have developed. In none of the studies reviewed by Lloyd-Strovas and Bernal (2012) was this type of instructional strategy utilized. Some of the studies used active learning and inquiry approaches (although the degree to which activities engaged students in authentic science practices is not clear), but none included direct observation of or experimentation with evolving populations. Discipline-based education research has shown that integrating science content and practice is the most effective way for students to learn about science (Singer et al., 2012), and has formed the basis for recommendations in science education reform at the national level.
If the integration of content and practice via Avida-ED can account for the positive association between understanding and acceptance of evolution in this case, my study will have provided additional support for the efficacy of those reform-based pedagogical approaches. Hypothesis 2: Rather than owing to factors intrinsic to Avida-ED, the relationship could be explained by the nature of the instrumentation used to assess student understanding and acceptance of evolution. The content questions were very basic and well aligned with what students would have seen while using Avida-ED, so increases in conceptual knowledge were expected. Student acceptance of evolution, however, was unexpectedly high when compared to acceptance rates in the literature among similar groups, especially on the pre-test. Many studies that have reported low levels of evolution acceptance have deliberately connected the scientific idea of evolution to religious belief in their assessment instruments, and have treated the two as diametrically opposed (Miller et al., 2006; Newport, 2012). In addition, these studies ask questions related to acceptance of human evolution, although it has been documented that some people are willing to accept the evolution of non-human organisms, reserving the notion of special creation for humans alone (Alters & Alters, 2001). Asking questions about human evolution and religious beliefs is problematic because such questions can trigger a defensive response. For example, in a study by Demastes et al. (1995) on student understanding of natural selection, student beliefs in evolution were not tested at the request of the participating teachers, “[B]ecause, in the teachers’ judgment, asking such questions might bring to the fore controversies that could interfere with the attempts by the teachers to help the students understand the basics of evolution” (p. 543). Berkman et al.
(2008) have also noted that teachers who cover evolution often do not broach the subject of human origins, presumably for the same reason. For this study, I wanted any changes in acceptance scores to be related to what the students observed and inferred from their interaction with Avida-ED, and did not wish to trigger an emotional or defensive response. The high levels of acceptance reported here, then, could be due to my purposeful avoidance of questions pertaining to religion or human evolution. For future studies, it would be worth including some mention of human evolution to see whether acceptance levels are as high, or gains as great, as observed in this study. Hypothesis 3: It is well documented that many people are willing to accept microevolution, broadly defined as changes within the same “kind” of organism, but not macroevolution, or evolution of different “kinds” (Alters & Alters, 2001). Avida-ED offers direct observation of and experimentation with microevolutionary processes, but does not address some aspects of macroevolutionary processes that some people may find problematic.3 Perhaps this explains the high levels of and increase in acceptance observed in the current study: for those students who oppose speciation and the idea that humans are phylogenetically related to other primates, it may be easier to accept the changes observed in Avida-ED, which result in the adaptation of populations to their environments, as evolution.

3 Macroevolution includes the evolution of novel features (e.g., the appearance of feathers in the lineage of theropod dinosaurs that gave rise to modern birds); in this sense, Avida-ED does illustrate macroevolutionary processes, as new functions evolve from old. However, it does not illustrate other macroevolutionary processes such as speciation.
Hypothesis 4: Another possibility is that the correlation between understanding and acceptance of evolution is not “real”, and that these two factors are instead mediated by an unmeasured third factor. Ha et al. (2012) found that the positive relationship between understanding and acceptance in their study was mediated by what they called “feeling of certainty”, a cluster of cognitive intuitions that makes a person feel that their beliefs are correct, with or without supporting evidence. Future studies could attempt to identify other factors that contribute to the relationships found in this work. In addition to the hypotheses regarding understanding and acceptance listed above, several other issues emerging from this study could potentially lead to future lines of research: 1. Low average content scores. As mentioned above, the highest average content score for any of the cases on the post-test was only about 55% of the ideal response. There are several reasons why this may be. The ideal responses were based on phenomena that apply to all evolving life forms, biological and digital. As a result, they excluded other explanations that are not necessarily wrong, but which do not apply in all cases. For example, many students explained that population-level variation arises by sexual reproduction and genetic recombination during gamete formation. In sexually reproducing, diploid organisms, these absolutely are sources of variation. However, Avidians, and some strains of E. coli, for example, are haploid and asexual; they do not swap genetic material with other organisms, nor do they pick up genome fragments from the environment to insert into their own genomes.
Therefore, even though there were increases in the frequency of concepts such as sexual reproduction and recombination in student responses from pre- to post-test (which indicated that the students had learned), these ideas would not have contributed to the students’ overall score for that item, as students could not have obtained that knowledge from Avida-ED. In this way, the rubric was very strict; any information provided that was not part of the ideal response was excluded and did not contribute to the students’ scores. It was very difficult for students to obtain a “perfect” score based on this rubric. Indeed, only one student out of 628 met the rubric criteria required to earn all 10 points on the post-assessment. No students received a perfect score on the pre-assessment. 2. I used one assessment for all 10 cases, and, other than the fact that all of the instructors used Avida-ED, I had no control over the mode of instruction in any of the courses. The assessment was therefore unlikely to be aligned with instruction, and is not necessarily representative of the level of student learning. It is not the scores per se that are important for this study, but the change in scores from pre to post. Keeping this in mind, the pattern that emerged was that in 6 of the 10 cases, the average content score increased significantly after instruction with Avida-ED. 3. The case of C_400Evo. None of the three upper-division courses had statistically significant gains in either student content or acceptance scores from pre- to post-assessment. However, one of those three cases differed markedly from the other two.
Both cases B_300Evo and F_400Evo had among the highest content and acceptance scores (pre and post), which makes sense given that these students, who were all in their junior or senior year of study and had successfully completed many biology courses, would be expected to have mastered the fundamental concepts targeted by the assessment in addition to possessing a high rate of evolution acceptance. Case C_400Evo, in stark contrast, had among the lowest pre-assessment scores for both content and acceptance, and the lowest scores for both on the post-assessment. This raises concerns, particularly because the demographics of this case were very different from the other nine. Unlike the other institutions, this was an HBCU in the southern United States, and all of the students in the class were racial minorities. Bailey et al. (2011) studied attitudes toward science among students from the same demographic and concluded that they interact with science, and particularly evolutionary biology, differently from other, more “mainstream” demographics. They suggest that this is because African Americans tend to have a more fatalistic worldview than other sub-populations, owing to their relatively strong belief in God as an external locus of control in their lives. This fatalistic worldview is often at odds with the progressive nature of science, and may cause African Americans to reject and avoid engagement in science at higher rates than other groups. In addition, they found that the strength of religious beliefs among the African American college students in their study was negatively correlated with knowledge of and attitudes toward evolution (and science in general). Therefore, there is evidence to suggest that the students in case C_400Evo are indeed different from students in the other cases, owing to the issues discussed in the work of Bailey et al. (2011), and that these differences may account for the results of my study.
Despite the above issues, the evidence suggests that engaging students in authentic science practice using a tool like Avida-ED improves not only student understanding of content but also acceptance of established scientific ideas, and that the degree to which acceptance increases is related to student learning. Although the exact nature of this relationship is not yet understood and requires further investigation, I am optimistic that Avida-ED can be used to address the issue of evolution denial in the United States, and that insights arising from this work can be extended to address other socio-politically contentious issues in which understanding and acceptance of science co-vary, such as climate change and the safety of vaccines. Shifting to these other important topics will involve developing the means to engage students in authentic research opportunities, perhaps in the form of computer simulations – for example, educational versions of the same tools that climate scientists have used to make predictions about the effects of global warming (Edwards, 2001). Until such tools become readily available, instructors can still engage students in science practices by establishing problems for students to investigate using real data (Rule & Meyer, 2009). Encouraging students to find patterns in these data may allow them to directly test common misconceptions and perhaps even develop and pursue questions of their own. These pedagogical practices are well aligned with national science education reform recommendations, as they not only give students access to conceptual knowledge but also integrate this knowledge with science practices. Emerging from this work are also implications for issues of educational inequity. Working at the elbows of scientists is an effective way for students to learn about the nature and practices of science (Sadler et al., 2010).
While it is not feasible for all students to access these kinds of research experiences, thanks to developments in educational technology it is feasible to infuse every science course with authentic research opportunities. As more open-ended, inquiry-based tools like Avida-ED become available for use in classrooms, professional development opportunities for instructors will become increasingly essential so that they can use those tools effectively and serve as mentors for students as they participate legitimately in the work of science. With regard to teaching, this study may serve to illustrate the importance of technological pedagogical content and curricular knowledge among teachers and its relationship to student affect, particularly in light of reform efforts such as NGSS and Vision & Change. Although instructor experience with using Avida-ED was not significantly associated with student learning and acceptance gains, there was a strong, positive correlation with student interest, enjoyment, and self-efficacy, among other affective factors. Further analysis is required to identify what expert instructors do differently from novice instructors that results in more positive experiences for students. Factors that may contribute include the complexity of tasks and projects designed by the instructor, the degree to which instructors engage students in authentic science practices, the amount of student control (over questions, hypotheses, experimental design, etc.), and the support provided by the instructor, among others. Unlike novice instructors, experienced Avida-ED users were able to create sophisticated exercises linking digital and biological contexts, and these may have increased situational interest for students. The literature on interest indicates a relationship between situational interest and student personal interest, motivation, and future intent.
Assuming there is a positive association between instructor PCK (including TPACK) and student interest, as evidenced by the current study, it is possible that improving instructor pedagogical content knowledge might indirectly, through effective classroom engagement, positively influence student persistence in STEM. In addition, bringing authentic research opportunities into the classroom with tools like Avida-ED may free instructors to shift their role from purveyor of knowledge to mentor and guide. Assuming a more supportive role could prove more fulfilling for instructors and more meaningful for students, further contributing to retention of students in STEM. Conclusions. Avida-ED is an educational tool based on a research platform for the study of experimental evolution, and as such simultaneously allows students to observe evolution in action and to engage in authentic research practices (National Research Council, 2012; Pennock, 2007a; Speth, Long, Pennock, & Ebert-May, 2009). Although it has been in use by educators for many years, there has heretofore been no systematic inquiry with regard to how Avida-ED has been implemented in classrooms and its effects on student learning of evolution. With those goals in mind, this study set out to answer the following questions:

1. How are biology instructors at different institutions across the United States using Avida-ED in their courses?
2. What are instructors’ educational goals and beliefs about teaching and learning science, and how do these influence implementation decisions?
3. To what degree is instructor implementation of Avida-ED aligned with reform-oriented pedagogical strategies?
4. To what degree does Avida-ED allow instructors to teach in ways consistent with both their personal teaching philosophies and national science education reform recommendations?
5. How does learning with Avida-ED influence student outcomes with regard to understanding and acceptance of evolution?
6.
How do students respond affectively to Avida-ED, and what factors might influence this response? I hypothesized that Avida-ED would provide unique opportunities for learner-centered, inquiry-based pedagogy leading to improved understanding of evolution, but that student outcomes would depend on how instructors chose to implement the program. To test these hypotheses, I conducted a national, multiple-case study to examine how eleven biology instructors teaching ten different courses at eight institutions across the US were using Avida-ED in their classrooms. I interviewed instructors prior and subsequent to their use of Avida-ED, and used their responses along with course materials to characterize each case. I designed two survey instruments, one to assess gains in student understanding and acceptance of evolution, and another to characterize student experiences with using Avida-ED. Analysis of the data revealed the following key findings; each is discussed briefly below:

1. Instructors used Avida-ED in a variety of ways, but all adopted reform-based pedagogical strategies.
2. All of the instructors held views on teaching and learning that were well aligned with reform-based pedagogical practices.
3. All of the instructors indicated that Avida-ED allowed them to teach evolution and the nature of science in ways consistent with their personal teaching philosophies (and, therefore, national science education reform recommendations).
4. Students in lower-division courses significantly improved both their understanding and acceptance of evolution after using Avida-ED.
5. Increased understanding of evolution was positively associated with increased acceptance.
6. Student learning outcomes were not associated with student affective response.
7. Instructor familiarity with using Avida-ED was not associated with student learning outcomes.
However, instructor familiarity was highly influential with regard to both how Avida-ED was implemented and student affective response. Instructors used Avida-ED in a variety of ways, but all adopted reform-based pedagogical strategies. The ways in which Avida-ED was implemented by each of the instructors were influenced by a number of factors, including their personal views on teaching and learning, their goals for student learning about science as well as their specific lesson objectives, the type of course they were teaching (e.g., general biology versus evolution), the level of their students, the context of the course (e.g., lecture or laboratory), and associated constraints such as time and class size. Regardless of these considerations, all of the instructors chose to use Avida-ED as a way to engage students, to a greater or lesser degree, in authentic science practices. In each case, the focus of the lessons was not just on teaching evolutionary concepts but also on giving students experience with doing science. That is, they used Avida-ED to integrate content with science practices, precisely as advocated by national science education reforms (Achieve, 2013; Brewer & Smith, 2011; College Board, 2011; National Research Council, 2011). All of the instructors held views on teaching and learning that were well aligned with reform-based pedagogical practices. The instructors in this study were all greatly invested in their students’ learning, and were, to varying degrees, familiar with the findings of discipline-based education research (Singer et al., 2012) and research on how people learn. They expressed a desire to teach in ways that eschewed traditional lecture and instead engaged students in active learning. They all wanted their students to understand in a fundamental way both course content and the nature and practices of science.
Their progressive views on education likely explain both their eagerness to use Avida-ED and their willingness to participate in educational research; all of them, even those whose work focuses primarily on science, had previously been involved in their own science education research projects. All of the instructors agreed that Avida-ED allowed them to teach evolution and the nature of science in ways consistent with their personal teaching philosophies (and, therefore, national science education reform recommendations). The instructors were able to use Avida-ED as a virtual lab space to enable students to observe evolution in action and test evolutionary hypotheses in real time. What’s more, they were able to do so in a digital environment that required few physical resources and produced large amounts of data in a short amount of time. They were able to use Avida-ED to model biological evolution and better understand its mechanisms. Students were able to ask questions that interested them and design their own studies, giving them agency and making the experience more authentic. The digital system eliminated many of the constraints posed by biological model organisms such as bacteria, and some instructors intimated that if it were not for Avida-ED they would not be able to teach evolution in this way. Here was a tool that effectively met their need for learner-centered, inquiry-based pedagogy. Students in lower-division courses significantly improved both their understanding and acceptance of evolution after using Avida-ED. Average content scores were significantly greater on the post-assessment for six of the ten cases; all of these were lower-division courses. Similarly, average acceptance scores increased significantly in four of the ten cases, again all lower-division courses, despite relatively high initial levels of acceptance.
These results indicate that Avida-ED is a promising tool for teaching about fundamental evolution concepts, one that will nicely complement our existing repertoire of strategies for teaching evolution. Increased understanding of evolution was positively associated with increased acceptance. Student normalized gains in content and acceptance scores across the ten cases were significantly, positively correlated, indicating a strong relationship between learning and acceptance. This finding is particularly interesting given that many studies on learning and acceptance of evolution have reported negative associations (Bailey et al., 2011) or have failed to find any relationship at all (Bishop & Anderson, 1990; Brem et al., 2003; Demastes et al., 1995; Ingram & Nelson, 2006; Lloyd-Strovas & Bernal, 2012). Student learning outcomes were not associated with student affective response. Literature on interest, motivation, and self-efficacy has found positive associations between these affective conditions and learning (Ainley & Ainley, 2011; Hidi et al., 2004; Linnenbrink & Pintrich, 2003; Schraw & Lehman, 2001). That was not the case in this study: there were no associations between either student understanding or acceptance of evolution and affective response. That is, students learned from Avida-ED regardless of their experience working with it. Instructor experience with using Avida-ED was not associated with student learning outcomes. However, instructor experience was highly influential with regard to both how Avida-ED was implemented and student affective response. Expert Avida-ED users had worked extensively with the program and possessed deep theoretical and practical knowledge of the system. This allowed them to design rich, sophisticated lessons that drew parallels between digital and biological evolution.
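The gain-correlation analysis described above can be illustrated with a brief sketch. This is a minimal, hypothetical reconstruction, not the analysis code from the study: it assumes Hake-style normalized gains, g = (post − pre) / (max − pre), and a Pearson correlation, and all scores below are invented for illustration.

```python
# Hypothetical sketch of correlating normalized gains in content and
# acceptance scores. Assumes Hake-style normalized gain; all data are
# invented for illustration and do not come from the study.

def normalized_gain(pre, post, max_score):
    """Fraction of the possible improvement actually achieved."""
    if max_score == pre:          # already at ceiling; treat gain as zero
        return 0.0
    return (post - pre) / (max_score - pre)

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented pre/post pairs (content out of 10, acceptance out of 50):
content = [(3, 6), (4, 5), (2, 7), (5, 8), (4, 6)]
accept = [(30, 38), (35, 36), (28, 40), (40, 44), (33, 37)]

g_content = [normalized_gain(pre, post, 10) for pre, post in content]
g_accept = [normalized_gain(pre, post, 50) for pre, post in accept]

r = pearson_r(g_content, g_accept)
print(f"r = {r:.2f}")  # a positive r indicates the two gains co-vary
```

Using normalized gains rather than raw score differences controls for the fact that students who start near the ceiling have little room to improve.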
Student interest in and enjoyment of Avida-ED were significantly, positively associated with instructor experience, as were student perceptions of learning with regard to both evolution and the nature of science, and student confidence in discussing evolution. In contrast, novice users of Avida-ED lacked technological pedagogical content knowledge of Avida-ED (Koehler & Mishra, 2008) and opted to use existing lesson materials. Although their students still learned from Avida-ED, student experiences with the program were much more negative. This finding points to the complex interaction between student cognitive and affective responses to instruction, and to the importance of professional development opportunities for instructors who are newly adopting curricular materials. The days of the “sage on the stage” are numbered. No longer can we sustain the idea of the instructor as a wellspring of knowledge to be transferred to students via passive instructional approaches. The findings of discipline-based education research have shown that the learning of science is best accomplished when content is integrated with active engagement in science practice (Singer et al., 2012), and national reforms have revised their recommendations to follow the data (Achieve, 2013; American Association for the Advancement of Science, 2011). It is time for science teachers to begin enacting these reform-based initiatives. In order to do so, they will require access to professional development opportunities and learner-centered, inquiry-based curricular resources that support these practices. This work contributes to our thinking about the ways in which Avida-ED, digital evolution for education, could be just such a tool.

APPENDIX

Pre-implementation interview protocol

Background Information
1. Name
2. Institution
3. Role/title/departmental affiliation(s)
4. Courses taught; how long
5. Proportion of appointment dedicated to teaching/research/other (service, outreach)
6.
Description of course/department

Science in general
7. What goals do you have for your students? What are the most important things that students should learn about science?
8. What are the most challenging aspects of reaching those goals?
9. How do you decide what to teach and how to teach it?
10. What is your role in the learning process?
11. During classes, what sorts of interactions do you like to encourage?
12. What is your preferred teaching style? Briefly describe a typical day in your classroom.
13. How do you typically assess student learning?

Evolution
14. What knowledge and skills are relevant to evolution? What should students know and be able to do at the end of the unit or course (what are your learning objectives)?
15. Unpack your syllabus: for each topic, what do you want students to know, and why do you think that’s important? How did you decide to teach these topics in particular?
16. What challenges/limitations do you encounter when teaching evolution? How have you dealt with these in the past?

Avida-ED
17. How/when did you first learn about Avida-ED?
18. What made you want to use it in your classroom?
19. In what course(s) are you using/have you used Avida-ED? (List and briefly describe: subject area focus, grade level, size, offered to majors/non-majors/both, pre-requisites (if any) or requisite for subsequent courses, required for majors?, etc.)
20. How do/will you use Avida-ED in your course(s)?
21. What other kinds of instructional resources have you used/do you use in your course(s)? List and describe.

Post-implementation interview protocol

• Please briefly describe how you implemented Avida-ED in your class. What did you do? Did this differ from your original plan? (Include dates)
• How successful was your implementation of Avida-ED (were you able to meet your goals)? What factors contributed to this assessment?
• Were there any gains that you noticed that weren’t directly assessed? What were they, and how do you quantify them?
• For areas where you didn’t see gains, why do you think this was and what would you do next time to address those specifically?
• Were there any questions that, looking back, you wish you had asked (assessed students on)?
• Is there anything you weren’t able to do in the past that Avida-ED allowed you to do?
• Were there still things you wanted to do but were not able to do even with Avida-ED?
• Were there things that you couldn’t do because you were using Avida-ED?
• What else did you learn from this implementation?
• Will you use Avida-ED in your course again? What would you change next time?
• To what degree does Avida-ED allow you to overcome the challenges/limitations that you mentioned in the pre-interview?
• What aspects of Avida-ED make it a useful tool for teaching/learning about evolution/NOS?
• What were the greatest affordances of using Avida-ED in your class?
• What sorts of challenges did you encounter while teaching with Avida-ED?
• What did you do to prepare to use the software and how long did it take you? Quantify: range of times.
• To what degree do you feel that Avida-ED aligns with your personal teaching goals and philosophy?
• To what extent has Avida-ED changed the way your students learn about evolution/NOS? In what ways?
• To what extent has Avida-ED changed the way you teach about evolution/NOS? In what ways?
• To what extent has Avida-ED changed the way you think about teaching evolution/NOS? In what ways?

Implementation rubric

Case:
Criteria:
1. Do students observe evolution in action? Y/N
2. Does the implementation integrate content and science practice? Y/N
3. Which of the Next Generation Science Standards Practices are included?
a. Asking questions and defining problems
b. Developing and using models
c. Planning and carrying out investigations
d. Analyzing and interpreting data
e. Using mathematics and computational thinking
f. Constructing explanations and designing solutions
g. Engaging in argument from evidence
h.
Obtaining, evaluating, and communicating information 4. Duration of implementation: 5. Instructional supports provided:   145 Student Assessment Part 1. Short Answer. Please provide a brief (1 – 2 sentence) response to each question. 1. Explain how variation arises in a population. 2. Imagine that a new life form was just discovered on another planet. It is not made up of cells nor does it contain DNA. What characteristics of this life form would be necessary in order for it to evolve? Explain your reasoning.   146 Part 2. Please indicate the degree to which you agree or disagree with each of the following statements by placing an X in the appropriate box.   Neither     Organisms existing today are the result of evolutionary processes 1. that have occurred over millions of years. Evolution is a process that is 2. happening right now. Evolution cannot ever be observed 3. because it happens over very long periods of time. Evolutionary biology generally 4. does not investigate testable ideas about the natural world. Evolutionary biology relies on 5. evidence to make claims about the natural world. The available data are ambiguous 6. (unclear) as to whether evolution actually occurs. Evolution can explain changes in 7. populations of species over time. Evolutionary theory is supported 8. by factual, historical and laboratory data. Computer programs can create 9. instances of evolution (within a computational environment). Evolution is a scientifically valid 10. theory.     Strongly Disagree Disagree                                                                                                     147 Agree nor Disagree Agree Strongly Agree Assessment Rubric Constructed Response Items Each student constructed response was compared to an ideal response and assessed for degree of accurateness and completeness. Each ideal response was broken down by two or more critical components, each with a maximum score of two points. 
Responses that were accurate and complete (well aligned with the ideal response) were given 2 points. Responses that were mostly accurate but incomplete (emerging understanding) were given 1 point. Responses that were ambiguous, incorrect, or missing the relevant critical component were given 0 points. The percentage of the ideal response was calculated as the ratio of points assigned to points possible.

1. Explain how variation arises in a population.
Ideal response: All variation at the population level ultimately arises from random mutations caused by errors during genome replication in individual organisms.
Critical components of the ideal response:
• Mutation
o AC (Accurate and Complete) – 2 points: Mutations are responsible for variation (with or without additional factors).
o AI (Accurate but Incomplete; emerging understanding) – 1 point: Defines mutation but does not use the term (e.g., “changes” in DNA).
o Ambiguous, Incorrect, or Not Present – 0 points.
• Randomness (in association with mutations)/Replication errors
o AC (Accurate and Complete) – 2 points: Indicates that mutations are random.
o AI (Accurate but Incomplete; emerging understanding) – 1 point: Errors during DNA replication (without mentioning that these are random).
o Ambiguous, Incorrect, or Not Present – 0 points.

2. Imagine that a new life form was just discovered on another planet. It is not made up of cells, nor does it contain DNA. What characteristics of this life form would be necessary in order for it to evolve? Explain your reasoning.
Ideal response: In order to evolve, the organism must satisfy three criteria: it must have some form of code or information that is copied/replicated and passed to offspring; variation in the population must be caused by random mutations or changes to the code during replication; and selection must act on individuals so that those possessing certain traits are able to survive and reproduce better than competitors in a given environment.
Critical components of the ideal response:
• Inheritance: Information that is copied
o AC (Accurate and Complete) – 2 points: Information/code/genetic material, etc., that can be replicated/reproduced and passed to offspring.
o AI (Accurate but Incomplete; emerging understanding) – 1 point: Code/info only OR reproduction only.
o Ambiguous, Incorrect, or Not Present – 0 points.
• Variation: Mutation
o AC (Accurate and Complete) – 2 points: Variation caused by (random) mutations/changes in code.
o AI (Accurate but Incomplete; emerging understanding) – 1 point: Mutation/changes only OR variation only (without elaborating on where the variation comes from) OR “randomization” only.
o Ambiguous, Incorrect, or Not Present – 0 points.
• Selection: Differential survival/reproduction; competition
o AC (Accurate and Complete) – 2 points: Differential reproduction/survival; competition.
o AI (Accurate but Incomplete; emerging understanding) – 1 point: Fitness (without elaboration); adaptability; changing due to interaction with environment; utilizing environmental resources to survive – all without mention of reproductive/survival advantage.
o Ambiguous, Incorrect, or Not Present – 0 points.

Avida-ED User’s Survey

Provide a brief (1 – 2 sentence) response to each of the following questions. All responses will be kept strictly anonymous; please be as honest as possible.
1. What did you like most/least about Avida-ED?
Most:
Least:
2. What did you find most challenging about using Avida-ED?
3. What is the most important thing you learned by using Avida-ED?

For each of the following items, circle the category that best describes your experience with Avida-ED.
Your general level of attentiveness during lessons involving Avida-ED:
Not at all attentive / Somewhat attentive / Attentive / Very attentive / Extremely attentive

Your general level of participation during lessons involving Avida-ED:
Never participated / Rarely participated / Sometimes participated / Usually participated / Always participated

The relative ease with which you were able to understand Avida-ED:
Did not understand at all / Had many difficulties / Had some difficulties / Had few difficulties / Had no difficulties

The relative ease with which you were able to use Avida-ED:
Not at all able to use / Had many difficulties / Had some difficulties / Had few difficulties / Had no difficulties

Your overall interest in Avida-ED:
Not at all interested / Somewhat interested / Interested / Very interested / Extremely interested

Your overall enjoyment of Avida-ED:
Hate it / Don’t like it / It’s okay / Like it / Love it

Indicate how effective the following materials were in helping you to use Avida-ED. (Response options: Not at all effective, Somewhat effective, Effective, Very effective, Extremely effective, N/A (not available/did not use).)
• Avida-ED tutorial
• Avida-ED lessons/projects
• In-class demonstration(s)
• Conversations with peers
• Conversations with instructor(s)
• Avida-ED user’s manual
• Other (describe):

Indicate the degree to which you agree or disagree with each of the following statements. (Response options: Disagree, Somewhat disagree, Neither agree nor disagree, Somewhat agree, Agree.)
• Avida-ED significantly increased my understanding of evolutionary processes.
• Avida-ED significantly increased my understanding of the process of science.
• I feel much more comfortable discussing the topic of evolution.
• I will continue to experiment with Avida-ED in my free time.
• I have or will share Avida-ED with friends or family.

Please provide the following demographic information. This information is for analytic purposes only and will be kept strictly anonymous.
Gender: Female / Male
Class level: Freshman / Sophomore / Junior / Senior / Graduate
Major:

Thank you for your participation in this survey! In the space below, please feel free to provide any other feedback that you think will help us to improve Avida-ED.