STUDYING ENGAGEMENT TO INFORM DESIGN OF CALCULUS 2 COMPUTATIONAL LABS

By

Andrew Joseph Krause

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

Mathematics Education—Doctor of Philosophy

2022

ABSTRACT

STUDYING ENGAGEMENT TO INFORM DESIGN OF CALCULUS 2 COMPUTATIONAL LABS

By

Andrew Joseph Krause

This is a study of student engagement with computational labs in Calculus 2. The labs task students with using MATLAB to investigate contexts such as rocket science, disease modeling, and market economics forecasting by modifying and executing provided code, guided by questions that have students report the results of their simulations or observations of the model, as well as interpretive questions that ask them to make sense of their findings in the context of the realistic situation.

Locally, the study is situated within an iterative design process with the goal of improving the labs and their implementation, meaning there are direct connections between the research and improvements to the written tasks or teaching practices. The goal of this dissertation, from a research perspective, is to examine student engagement to contribute to the development of theory about lab-type activities and the learning opportunities they facilitate. To accomplish both goals, this study follows a design-based research (DBR) methodology, which provides a structure for relating the local goal of informing practice to the broader research goal of developing theory that is useful beyond this context.

The labs can be considered novel learning activities, with novel learning goals, in a novel teaching context. The labs were novel learning activities because they were the original product of the curriculum designers and they were still being substantially revised during the third year of the project, when this study took place. The learning goals were novel at the research site because they involved collaborative and higher-level objectives, a notable departure from the traditional emphasis on individual mastery of various rote calculation methods. The teaching context was novel because this research was conducted during the first time the labs were taught "at-scale", meaning it was the first time the labs were being led by graduate students during recitations connected to a multi-section large-lecture.

In this novel context, studying engagement provides a way to understand how the labs facilitate students' attainment of those learning goals without attempting to measure students' individual attainment of those learning goals. Students learn through engagement, so understanding how students engage with the labs is a way to understand how students are learning—if students are engaged in the mathematical practices the labs are intended to facilitate, then they are learning those mathematical practices.

This study examined engagement through interviews, classroom observations, and a post-course survey. Students were observed collaborating productively by discussing their thinking, tinkering with code, and co-authoring solutions. Off-task behaviors were rarely observed. Some interview participants identified intrinsic learning goals, while other students were invested in completing the labs simply because they were a required, graded assignment. Some students found the labs to be interesting, challenging, and useful, but the broader sentiment was a more negative reaction that was especially pronounced in the post-course survey data.
These negative feelings, however, did not appear to disrupt students' behavioral or cognitive engagement in the classroom, because students continued to put forward effort to complete the labs through the end of the semester, suggesting that negative emotional engagement does not necessarily lead to overall disengagement or undermine students' motivation. In other words, even though some students might not be trying to learn how to use MATLAB to model Calculus 2 concepts, they learn some of those skills anyway because they have to do the things they are supposed to learn.

Keywords: Calculus 2, labs, engagement, active learning

Dedicated to Brandon Nieporte, a best friend missed by many.

ACKNOWLEDGEMENTS

Above all, I thank my wife, Elizabeth Krause, for her unwavering support throughout a decade-long journey to produce this dissertation and complete my graduate studies—yes, a decade. In the meantime, Liz has returned to school to earn her teaching certificate, taught in public schools for 5 years, carried our two young children, and left her career to support our family during the pandemic. She is a super mom and a super wife. I certainly would have succumbed to the pressure of work and school without her providing a steadfast foundation for our family. Liz, congratulations on making it through the last decade. I love and appreciate you!

I also thank my advisor, Vincent Melfi, for his decade of service supporting me on my journey, and the sheer amount of time he has invested in my success. I tried to quit more than once, but Vince calmly guided me and wisely advised me about how to balance work, school, and family. Vince has helped me pick up the pieces when I have stumbled and illuminated the path when I saw no way forward. Beyond his support for my research, Vince has been an amazing mentor who has helped me grow into the educator I am today. He was instrumental in the creation of the instructional mentoring program in the Department of Mathematics, which has become the focus of my professional work. We also worked together on the first pilot of the Quantitative Literacy courses, which have grown into a beacon of modern educational progress at MSU. I hope to continue our collaboration for many years, which will yield continued teaching excellence in the mathematical sciences at MSU.

I thank the rest of my committee members, Ralph Putnam, Shiv Karunakaran, and Willie Wong. I fondly remember hours of discussion with Ralph as we tried to wrap our heads around the problem of researching a large and rapidly changing teaching landscape. Shiv provided crucial guidance when my research balanced on the precipice of collapse. Without Shiv's help, there is no way I would have been able to recover from a failed dissertation proposal, change direction, and produce this dissertation. I admire Willie as a colleague and a mentor because of the sacrifices he made to lead the lab design project and his ability to advise my educational research even though he spends most of his professional life as a mathematics researcher.

I thank Tsveta Sendova, my main teaching collaborator, who has tolerated years of working with me as I have struggled to juggle my responsibilities while we have supported a vibrant teaching community together. I also thank Rachael Lund, Dave Bramer, Ryan Maccombs, and Alec Drachman, who have made my teaching work incredibly rewarding and a source of professional energy.
I thank my fellow students for your support, especially Jeff Craig, Lynette Guzman, Durrell Jones, Younggon Bae, Abe Edwards, Chuck Fessler, and Katie Westby, who have all helped me grow in significant ways as an educator and scholar. I thank the amazing college teachers I have worked with who have inspired me as an educator. A special thanks goes out to Reshma Menon, Sarah Klanderman, Nicholas Rekuski, Chloe Lewis, and Samara Chamoun for your incredible dedication as teachers that is an inspiration to everyone. I thank all of the academic staff who have supported me, especially Lisa Keller and Freda Cruél, who have worked tirelessly to help me stay on track. I have missed countless deadlines for far too long—I cannot thank the two of you enough for your patience and guidance.

Finally, I thank the rest of my family. I thank my kids, Dexter and Cecilia, for loving me no matter how distracted and stressed I have been as I have worked to finish this degree. I also thank my parents and Liz's parents for the crucial support they have given me by being awesome grandparents to my kids. And, of course, I thank my mom and dad for their life-long support—I thank you for everything I have. Love you all!

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
CHAPTER 1: OVERVIEW OF STUDY AND BACKGROUND OF PROBLEM
    Research Question
    Purpose
CHAPTER 2: BACKGROUND OF THE PROBLEM AND CONCEPTUAL FRAMEWORK
    Nationwide Movement to Improve Gateway Courses
        Design Principle 1: Facilitate Collaboration and Active Learning
        Design Principle 2: Present Realistic and Relevant Applications of Calculus
        Design Principle 3: Imbuing Mathematical Practices in the Curriculum
    Conceptual Framework
        Why Study Engagement?
        Definition of Engagement
        How To Study Engagement
CHAPTER 3: METHOD
    Methodology
    Research Context
        Rationale for Context Selection
    Participants
    Positioning of the Researcher
    Data Collection
        Interviews
        Classroom Observations
        Course Survey
    Data Analysis
        Open Coding
        Descriptive Coding
        Thematic Analysis Coding
        Survey Coding
CHAPTER 4: RESULTS
    Behavioral Engagement
        Individual Behavioral Engagement Modalities
        Group Behavioral Engagement Modalities
        Behavioral Engagement Modality Timelines
    Emotional Engagement
        Interest and Enjoyment
        Confusion
        Frustration and Anger
        Feelings about Groupwork
        Confidence
    Cognitive Engagement
        Student Learning Goals and Grade Motivation
    Student Perceptions
        Perception of Learning Value
        Perception of MATLAB
        Perception about Applications of Calculus Ideas
        Survey Data
CHAPTER 5: DISCUSSION, IMPLICATIONS, AND CONCLUSION
    Discussion of Research Findings
        Claim 1: Wide Range of Behavioral Engagement Modalities Demands Attention
        Claim 2: Negative Emotional Engagement Does Not Cause Disengagement
        Claim 3: Intrinsic Motivational Factors May Be Less Important Than Suggested
        Claim 4: Multiple Forms of Data are Required to Understand Student Engagement
    Connections to the Literature
        Contribution to Engagement Research
        Understanding Different Design Goals
        Attending to Local Data
        Active Learning
        Enriching Undergraduate Courses with New Learning Goals
        Recommendations of the Curriculum Foundations Project
    Implications for Local Practice
    Conclusion
        Limitations of this Research
        Future Directions
APPENDICES
    APPENDIX A: SURVEY
    APPENDIX B: INTERVIEW CONSENT FORM
    APPENDIX C: OBSERVATION CONSENT FORM
    APPENDIX D: STUDENT INTERVIEW PROTOCOL
    APPENDIX E: LAB TASK DETAILS
    APPENDIX F: CODEBOOK
    APPENDIX G: POST-COURSE SURVEY DATA
REFERENCES

LIST OF TABLES

Table 1: Post-course survey responses to Q48: Rate your usual participation in the Labs.
Table 2: Post-course survey response data for a subset of survey questions.
Table 3: Alignment between Curriculum Foundations Project and lab tasks.
Table 4: Codebook.
Table 5: Survey responses about behaviors that the labs facilitate.
Table 6: Survey responses about skills related to MATLAB coding.
Table 7: Survey responses about the usefulness and effectiveness of the labs.
Table 8: Survey responses about the logistics of the labs.
Table 9: Survey responses about the groupwork component of the labs.

LIST OF FIGURES

Figure 1: Reeves' (2006) Design-Based Research Model, as presented in Cotton et al. (2009).
Figure 2: Behavioral engagement modality timelines representing Observation 2.
Figure 3: Behavioral engagement modality timelines representing Observation 7.
Figure 4: Behavioral engagement modality timelines representing Observation 3.

CHAPTER 1: OVERVIEW OF STUDY AND BACKGROUND OF PROBLEM

This is a study of student engagement with computational labs in Calculus 2. The labs were designed to connect Calculus 2 concepts, such as integration and series, to disciplinary STEM practice through numerical simulations, computational modeling, and a team-based learning experience. The labs tasked students with using MATLAB to investigate contexts such as rocket science, disease modeling, and market economics forecasting by modifying and executing provided code, guided by questions that had students report the results of their simulations or observations of the model, as well as interpretive questions that asked them to make sense of their findings in the context of the realistic situation.

The project to develop and implement the labs reflected an ongoing effort at the research site to improve undergraduate courses by shifting curriculum towards rich, collaborative experiences with interdisciplinary contexts and away from a traditional emphasis on individual mastery of various calculation methods. The lab design project was conceived as a three-year, iterative design process. The focus of the first year was creating and piloting the lab activities in a small class (~30 students) co-taught by the two tenure-stream mathematics faculty leading the project. In the second year, the faculty leads co-taught medium-sized lectures (~80 students) divided into two recitation sections (~40 students each) in which the labs were led by a pair of teaching assistants; development focused on revising the lab materials and addressing logistical issues with scaling up the labs. In the third year, a large-scale pilot was implemented in which the faculty leads co-taught one full-sized lecture (~160 students) and the labs were used in all associated recitation sections (~30 students each), each taught by a single teaching assistant (the regular "large-lecture plus recitation" structure for Calculus 2 at the research site); development focused on revising the labs for at-scale deployment and developing teaching supports for lab instructors.
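To make the flavor of these labs concrete, the sketch below shows the kind of provided code a lab might ask students to modify and run. It is a hypothetical illustration written for this dissertation, not an excerpt from the actual lab materials: it steps a simple SIR disease model forward in time with Euler's method, connecting the disease-modeling context to the Calculus 2 idea of accumulating change.

% Hypothetical lab-style snippet: a simple SIR disease model
% advanced with Euler's method. A lab in this style might ask
% students to change beta or gamma, re-run the simulation, and
% interpret the resulting epidemic curve.
beta = 0.3;    % transmission rate (per day)
gamma = 0.1;   % recovery rate (per day)
dt = 0.1;      % time step (days)
T = 160;       % simulation length (days)
n = round(T/dt);
S = zeros(1, n); I = zeros(1, n); R = zeros(1, n);
S(1) = 0.99; I(1) = 0.01; R(1) = 0;   % initial population proportions
for k = 1:n-1
    newInfections = beta * S(k) * I(k);
    newRecoveries = gamma * I(k);
    % Euler step: accumulate each rate of change over the interval dt
    S(k+1) = S(k) - dt * newInfections;
    I(k+1) = I(k) + dt * (newInfections - newRecoveries);
    R(k+1) = R(k) + dt * newRecoveries;
end
t = (0:n-1) * dt;
plot(t, S, t, I, t, R); legend('S', 'I', 'R');
xlabel('Days'); ylabel('Proportion of population')

An accompanying question in this style might ask students to report the day on which infections peak and then interpret how doubling the transmission rate changes that peak in the context of a real outbreak.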
The lab development was based on teaching experience of the faculty leads, as well as on suggestions resulting from discussions with Engineering faculty. This dissertation examined student engagement with the labs during the third year of the project.

The goal of this dissertation, from a research perspective, was to examine student engagement to contribute to the development of theory about how lab-type activities facilitate learning. The local goal of this study was to inform the iterative design process to improve these specific labs and their implementation at the research site. To accomplish both goals, this study follows a design-based research (DBR) methodology, which provides a structure to inform local practice while developing generalizable theory (Cotton, Lockyer, Brickell, & Brickell, 2009; Philippakos, Howell, & Pellegrino, 2021; Reeves, 2006). The structure of design-based research is illustrated in Figure 1.

Figure 1: Reeves' (2006) Design-Based Research Model, as presented in Cotton et al. (2009).

The development work of the faculty project leaders, who can be considered "Practitioners" in this framing, follows the flow that is illustrated: they analyzed practical problems, developed solutions, and engaged in testing, refinement, and reflection. As the "Researcher", I played a collaborative role in the iterative cycles of testing and refinement by providing expertise with mathematics education scholarship: first to situate the project within the literature and second to design this study to inform the testing, refinement, and reflection. One way that I was able to contribute as a collaborator was that I identified the design principles that have implicitly guided the development of the labs throughout the project:

1. Prescribe active learning, specifically collaboration, in the small-class recitation setting.
2. Enrich the curriculum with realistic and relevant applications of calculus content.
3. Incorporate modern mathematical practices, such as data analysis, coding, modeling, using computational tools, and communicating ideas mathematically.

This dissertation is structured to emphasize the general study of engagement, while the practice-based implications are presented in the background. The research question and purpose (below), the background and theoretical framing (Chapter 2), method (Chapter 3), and results (Chapter 4) all focus on the general theory of student engagement, and this study draws from and contributes to the literature about engagement. The local impact on practice is discussed in the introduction to the context (above), to motivate the methodology (Chapter 3), and in the conclusion (Chapter 5). The local context provides the "what and why" of the study, while the "how" is drawn from literature about engagement.

Research Question

This research is focused on answering a single research question: How do students engage with computational labs in a particular Calculus 2 class? This is a complex question because of the multi-faceted nature of "student engagement" involving (at least) behavioral, emotional, and cognitive components that must be assessed through a variety of data (Fredricks, Blumenfeld, & Paris, 2004; Middleton, Jansen, & Goldin, 2017).
These engagement components will be explored in detail in the literature review, but briefly: behavioral engagement comprises the actions students perform, emotional engagement comprises affective components such as enjoyment and confidence, and cognitive engagement is the mental investment students make in completing a task in order to learn. Each component manifests itself in different ways, which is why a variety of data were collected from three sources: classroom observations, student interviews, and a post-course survey. Each of these data collection methods is individually limited, but together they provide a rich data set for this research that allowed me to make sense of the complicated and diverse ways in which students engage with the labs.

Purpose

The purpose of this dissertation is three-fold. First, this dissertation situates the lab design project within the literature to provide a research lens through which to understand how students engaged with the labs. Second, this research advances the scholarship about active learning by responding to the call of Freeman and colleagues (2014) for "second-generation research" about active learning. Third, this research intersects with the MAA Calculus studies (e.g. Bressoud, Mesa, & Rasmussen, 2015; Rasmussen et al., 2016) by enriching our conception about what it means to "attend to local data" to inform teaching improvements.

This dissertation situates the lab design project within the literature. By contributing to the identification and honing of the design principles, the research provided a way to understand the project in terms of the literature related to each design thread. These connections to the literature are detailed in Chapter 2, which provides a theoretical framework for understanding the lab design project.

This research also answers the call of Freeman and colleagues (2014) for "second-generation research" about active learning. They presented a meta-analysis of 225 studies that found that active learning improves student outcomes in STEM courses, especially for groups of students traditionally underrepresented in STEM fields. The authors considered the studies that they meta-analyzed as focusing on comparisons of student outcomes in active learning and traditional lecture classrooms, and categorized these as the "first-generation" of research. They suggested that first-generation research has generated sufficient evidence about the benefits of active learning to support movements to discontinue traditional lecture and claimed, "calls to increase the number of students receiving STEM degrees could be answered, at least in part, by abandoning traditional lecturing in favor of active learning" (p. 8410). They issued a call for second-generation research to inform course design, understand the impact of different types of active learning for different populations, and develop effective instructional techniques for active learning. This dissertation engages with the second item in their call, by examining the classroom impact of a specific type of active learning activity.

This research also expands on findings from the MAA Calculus studies by enriching our conception about what it means to "attend to local data" (e.g. Bressoud, Mesa, & Rasmussen, 2015; Rasmussen et al., 2016).
The Insights and Recommendations from the MAA National Study of College Calculus (Bressoud et al., 2015) found that institutions with effective calculus programs attend to local data to inform ongoing teaching improvements, and provided examples including various ways to employ student outcome data, student evaluations, and in-class teaching observations. There is, however, a lack of guidance about the role of qualitative research and examples of qualitative data that departments might collect. This study provides an example of how engagement data can be collected and analyzed to inform local teaching practices and contribute to scholarship broadly. This curriculum design project was targeted at improving the quality of students' learning experience with Calculus 2, not improving grade outcomes. A qualitative approach allows us to make sense of how the curriculum supports the learning goals (or fails to do so) throughout the implementation of the project, prior to the development of quantitative metrics of those goals. In other words, interviews and observations can reveal that meaningful things are happening during the labs, even if we are unable to directly assess individual attainment of the learning goals at this point in the project.

CHAPTER 2: BACKGROUND OF THE PROBLEM AND CONCEPTUAL FRAMEWORK

This chapter focuses on reviewing the literature that provides background about the problem and a conceptual framework for examining engagement. First, background about the problem will be provided by situating the Calculus 2 lab design project within the literature. Each of the design principles is connected to different aspects of the literature, so the review is organized by exploring those connections to each principle sequentially. The Calculus 2 lab design project is part of a nationwide movement to improve gateway mathematics courses, so the connections between this project and the literature are important for establishing how this study can inform a broad audience of reformers. Second, the literature about engagement research is reviewed to provide a conceptual framework for this dissertation. This part of the review will explain why this study focuses on understanding engagement and draw on the literature to justify the research method.

Nationwide Movement to Improve Gateway Courses

The 2012 PCAST report—Engage to excel: Producing one million additional college graduates with degrees in Science, Technology, Engineering, and Mathematics—is a broad call for improvements and expansion of undergraduate STEM programs. The report specifically identifies recruitment and retention in the first two years of postsecondary education as a large component of the solution:

Fewer than 40% of students who enter college intending to major in a STEM field complete a STEM degree. Merely increasing the retention of STEM majors from 40% to 50% would generate three-quarters of the targeted 1 million additional STEM degrees over the next decade. (p. 7)

The recommendations outlined in the PCAST report, supported by findings that negative experiences in introductory mathematics courses are one of the primary reasons that students leave STEM fields (Bressoud, Mesa, Hsu, & Rasmussen, 2014), have contributed to a widespread movement to improve teaching in introductory mathematics courses.
While Calculus 2 is not always considered an "introductory" course, the Progress through Calculus project (Rasmussen et al., 2016) focuses on understanding the pathway through Calculus 2 because of the importance of success in Calculus 2 for students to pursue degrees in engineering, physics, chemistry, computer science, and many other STEM fields. Directly to this point, the Calculus 2 lab project was designed, in part, through a collaboration with the College of Engineering to better prepare the students for calculus applications in upper-division Engineering courses by improving student engagement with Calculus 2 coursework. Therefore, this study is located within an active field of research and development and can inform these activities broadly, beyond the scope of informing the local context.

The Calculus 2 lab project is also situated among several more specific threads of scholarship aligned with the design principles of the project. The first principle, to facilitate collaboration and active learning, is closely tied to the popular movement to promote active learning in undergraduate STEM courses. The second principle, to enrich the content with realistic and relevant applications, is related to reform movements to imbue modeling in algebra and calculus curricula. The third principle, to incorporate more authentic mathematical practice and higher-level tasks, is similar to the conceptualization of the Standards for Mathematical Practice in the Common Core State Standards (Common Core State Standards Initiative, 2010). In the following sections, I will explain how this study draws from and contributes to the literature from each of these fields.

Design Principle 1: Facilitate Collaboration and Active Learning

Efforts to improve teaching in undergraduate mathematics courses have coalesced around promoting active learning, which I take as an umbrella term used to describe classroom activities that involve students doing anything other than passively consuming a lecture. The phrase "active learning" appeared in The Piaget Primer: Thinking, Learning, and Teaching (Labinowicz, 1980), perhaps its first use to describe a particular teaching and learning configuration. As the reference to Piaget suggests, this conceptualization grew out of constructivist theory that students must actively construct their own knowledge rather than passively absorb it through instruction. At this point, "active learning" referred to students' activity in the learning context, rather than an instructional method, which it has since evolved to describe. Bonwell and Eison (1991) established the concept of active learning as an instructional activity and defined active learning as "instructional activities involving students in doing things and thinking about what they are doing" (p. iii). Freeman and colleagues (2014) further broadened the definition by including "approaches as diverse as occasional group problem-solving, worksheets or tutorials completed during class, use of personal response systems with or without peer instruction, and studio or workshop course designs" (p. 8410). In a meta-analysis of 225 studies, Freeman and colleagues (2014) found that active learning improved exam scores by 6%, on average, and that students in classes using traditional lecture were 1.5 times more likely to fail than students in classes employing active learning.
Furthermore, there is evidence that active learning can mitigate inequities in STEM persistence among women and racial minorities (Kogan & Laursen, 2014; Laursen, Hassi, Kogan, & Weston, 2014). Freeman and colleagues (2014) called into question the continued use of traditional lecture, referencing the fact that similar findings in the medical field would necessitate immediately abandoning the control (traditional lecture) over the treatment (active learning) for future studies. These findings have provided justification to promote active learning to improve teaching practices in undergraduate mathematics classes. They called for employing modern educational psychology and cognitive science methods to understand the details and differences between active learning in different contexts, specifically to "elaborate on recent work indicating that underprepared and underrepresented students may benefit most from active methods" (p. 8413). For example, Johnson, Keller, Andrews-Larson, Fortune, and Keene (2018) found that males outperformed females in an active learning abstract algebra classroom, calling into question general claims that active learning will narrow achievement gaps.

Shifting teaching practices is a complicated process. One of the barriers to improving teaching is that long-standing instructional traditions are difficult to change, especially with institutional structures in place that support traditional large-lecture teaching—well-established lecture-recitation funding structures, years of instructor experience teaching using traditional methods, and large-scale course materials that support traditional teaching methods and learning goals, to name a few. Individual efforts to implement active learning approaches within these structures have often failed or been forgotten as the reformer is rotated out of the course, which has led to scholarly efforts to develop sustainable systems of educational change, such as the Departmental Action Team (DAT) model (Reinholz, Pilgrim, & Finkelstein, 2018).

This study contributes to this literature thread because it examines a curriculum design project in a large-scale Calculus 2 course. The large-scale course utilizes a rigid course structure to provide a uniform experience across approximately 60 sections each year (more than 1500 students), taught by a variety of tenure-track faculty, instructional faculty, graduate students, and undergraduate learning assistants. As such, one of the design principles of the Calculus 2 lab project was to develop a curriculum element that intrinsically promoted active learning—in other words, active learning was baked into the curriculum. A similar approach can be found in the Active Calculus curriculum, which is explicitly designed to support active learning in the classroom through student-led investigations (Boelkins, Austin, & Schlicker, 2019). This study serves to understand how student activity in the classroom is impacted by the effort to prescribe active learning structurally.

Design Principle 2: Present Realistic and Relevant Applications of Calculus

The goal to enrich the content with realistic and relevant applications is related to reform movements to imbue modeling in algebra and calculus curricula, which will be reviewed in this section. One reform project targeted at incorporating modeling into college algebra curricula presents interesting parallels to this study and the Calculus 2 lab project (Ganter & Haver, 2011).
These previous algebra reforms, along with the Curriculum Foundations Project (Ganter & Barker, 2004; Ganter & Haver, 2011), have motivated similar efforts to incorporate modeling in calculus curricula. This section reviews the literature from these reform movements and explains how this dissertation contributes to this scholarship.

Curriculum Foundations Project. The Curriculum Foundations Project (Ganter & Barker, 2004; Ganter & Haver, 2011) was an effort to collect recommendations from the disciplines served by mathematics courses about what aspects of mathematics are important for students in their discipline—for example, biology, chemistry, engineering, statistics, economics, or the social sciences. Unsurprisingly, the recommendations are similar from nearly all of the 22 working groups: emphasize conceptual understanding, emphasize problem-solving skills that include applying familiar mathematics in novel settings, emphasize mathematical modeling, emphasize communication including writing logical arguments in words, improve interdisciplinary cooperation, emphasize the use of appropriate technology, use appropriate assessments to emphasize skills and knowledge beyond rote calculation, and encourage the use of active learning. The Calculus 2 labs were designed to address some of these areas, and this research seeks to understand the extent to which the labs facilitate student engagement in these areas.

College Algebra Modeling Reform. This dissertation aims to contribute to the scholarship about designing and implementing reformed curricula in gateway mathematics courses. The MAA published a report (Ganter & Haver, 2011) on several institutions' college algebra reforms targeted at incorporating modeling into college algebra curricula. These reforms were responses to the MAA College Algebra Guidelines (CRAFTY, 2007), which called for algebra curricula to attempt to (p. 1):

• involve students in a meaningful and positive, intellectually engaging, mathematical experience;
• provide students with opportunities to analyze, synthesize, and work collaboratively on explorations and reports;
• strengthen students' algebraic and quantitative abilities useful in the study of other disciplines;
• improve students' ability to communicate mathematical ideas clearly in oral and written form; and
• develop students' ability to use technology for understanding and doing mathematics.

The goals of this algebra reform closely parallel the goals of the Calculus 2 labs project, and some of the findings from this project are relevant to this dissertation, especially those about students' responses to the reformed curricula. The findings most relevant to this study can be found in the report from the reform at South Dakota State University (Flint, Hunter, & Kemp, 2011), which focused on incorporating mathematical modeling in a College Algebra course. Although they had positive feedback from some students about the new curriculum helping them see the usefulness of mathematics in their lives, ultimately the reform failed because of negative student perceptions. Students were aware that there were two versions of College Algebra being offered and viewed the reformed modeling version as being more work for the same grade. The reform effort was abandoned, and the modeling curriculum was relegated to an honors section.
Student perceptions are difficult to manage when different versions of a course are offered in parallel, especially when the reformed version diverges from students' expectations about what their math class should be like and students perceive that the reformed curriculum requires more work for the same grade. Flint, Hunter, and Kemp's findings are echoed in the preliminary findings from the first two years of the Calculus 2 lab project (Krause, Maccombs, & Wong, 2020). The data for this dissertation were collected during the third year of the project, when the issue of parallel versions was somewhat resolved by deploying the labs to an entire group of recitations tied to a single large-lecture.

Modeling in Calculus. There have been efforts to develop modeling activities within the calculus curriculum. The following review of these studies serves to locate the Calculus 2 lab project within current scholarly discussions. At the 2019 Joint Math Meetings, an entire session focused on similar efforts: the MAA Contributed Paper Session on Incorporating Programming and Computing in the Math Major Curriculum. Because of the recency of this work, few reports of these projects have been published in other scholarly outlets. In several of the presentations, the educators established the importance of programming and computation within the mathematics curriculum and explained challenges in their implementation. Sullivan and Fasteen (2019) paired 75-minute computational explorations with each of the four courses in their calculus sequence to develop students' programming skills and explore mathematical concepts that are enhanced by technology. Akelbek (2019) reported on the integration of MATLAB within courses that serve math majors who are typically dual majors in computer science, engineering, or other sciences. Kim (2019) reported on a project to embed Python programming within Calculus I using the Sage library. Kilty, Marr, and McAllister (2019) presented their approach for developing programming skills while simultaneously teaching calculus content in the traditional sequence; they also reported plans to expand the programming focus from Calculus I to the rest of the calculus sequence. H. Johnson (2019) reported on his development of computational methods to explore Leibniz' early conceptions about discrete differences of function values to develop a discrete conception of derivatives and integration.

The main challenge presented in this work was that the interdisciplinary work of teaching programming and mathematics simultaneously is complicated, and students are resistant when they are aware that the combination of computer science and mathematics is a divergence from the traditional calculus curriculum, which is perceived by students to be easier. Preliminary findings from this Calculus 2 lab project were presented, with an emphasis on making sense of overwhelmingly negative student feedback amid interviews and observations that suggested that students were engaging in the intended mathematical activity during class (Krause, Maccombs, Wong, & Iwen, 2019). These findings parallel those about the modeling algebra reform presented by Flint, Hunter, and Kemp (2011), and are explored in detail in the Results and Conclusion chapters of this dissertation.
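To give a concrete sense of the kind of computational exploration described above, the following few lines of MATLAB sketch a discrete-difference treatment of derivatives and integrals; this is a hypothetical illustration written for this dissertation, not code drawn from any of the cited projects.

% Hypothetical sketch: discrete differences and discrete sums as
% analogues of the derivative and the integral.
x = linspace(0, 2*pi, 200);
y = sin(x);
h = x(2) - x(1);       % uniform spacing between sample points
dy = diff(y) / h;      % discrete difference quotient, approximates cos(x)
Y = cumsum(y) * h;     % discrete accumulation, approximates 1 - cos(x)
plot(x(1:end-1), dy, x, Y);
legend('discrete derivative', 'discrete integral')

The inverse relationship between differencing and accumulating mirrors, in discrete form, the Fundamental Theorem of Calculus, which is the kind of conceptual connection these computational explorations aim to surface.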
Furthermore, the inquiry methods presented at the conference highlighted a lack of the kind of qualitative inquiry that is useful for understanding the student engagement these projects facilitate, and thus this study serves to illustrate such a methodology.

Design Principle 3: Imbuing Mathematical Practices in the Curriculum

The third design principle was to imbue more authentic mathematical practices and higher-level tasks into the curriculum. The labs were designed to get students to analyze and interpret the results of numerical simulations and expose students to computational modeling, beyond the traditionally emphasized calculation techniques that can turn into rote practice. The Standards for Mathematical Practice (Common Core State Standards Initiative, 2010) were designed to guide K-12 curricula, but it is informative to consider how they can inform teaching in Calculus 2 related to the previously stated goals. The Mathematical Practices were not explicitly part of the initial design process for the labs, but rather have been drawn on a posteriori to understand the design goals within an existing framework in the literature. The Mathematical Practices, as enumerated by the Standards document, are listed below, and a complete description for each practice can be found in the Standards document:

• MP1 - Make sense of problems and persevere in solving them.
• MP2 - Reason abstractly and quantitatively.
• MP3 - Construct viable arguments and critique the reasoning of others.
• MP4 - Model with mathematics.
• MP5 - Use appropriate tools strategically.
• MP6 - Attend to precision.
• MP7 - Look for and make use of structure.
• MP8 - Look for and express regularity in repeated reasoning.

Conceptual Framework

In this section, I will review the literature about engagement to establish the conceptual framework for this study. First, I will establish a link between learning and engagement to justify why examining engagement is a reasonable approach for understanding the learning opportunities facilitated by the labs. Second, I will characterize engagement as a multi-dimensional construct involving behavioral, emotional, and cognitive components. Third, I will review methods used to examine engagement as background for the method of this study.

Why Study Engagement?

The labs can be considered novel learning activities, with novel learning goals, in a novel teaching context, though the research was conducted during the third year of the project. The labs were novel learning activities because they were the original product developed by two faculty members during the first two years of the pilot and were still being substantially revised during the third year. The learning goals were novel because they involved collaborative and higher-level objectives, a notable departure from the traditional emphasis on individual mastery of various rote calculation methods at the research site. The teaching context was novel because this research was conducted during the first time the labs were taught "at-scale", meaning it was the first time the labs were used during recitations connected to a multi-section large-lecture (see Research Context in Chapter 3 for details about this structure).

In this novel context, it was not clear what instructional factors should be examined, nor how attainment of the learning goals should be assessed. Should we look at how instructors are interacting with each group? How are they facilitating discussion? How are instructors introducing and launching the activity during class?
The learning goals were difficult to assess because they included collaborative and novel elements. Notably, improving grades and drop/fail/withdraw (DFW) rates was not one of the design goals, and the labs were not designed to improve outcomes on the quizzes and exams, the primary grade measures, which focused on more traditional Calculus 2 content and an emphasis on individual, rote, by-hand calculation fluency. What learning measure should we look at instead? How should we measure group-based learning and shared idea development? Are all students expected to achieve some level of computational modeling fluency? As these questions demonstrate, the complexity of implementing novel instructional activities makes it exceptionally difficult to construct a quantitative study to measure the extent to which the design principles were fulfilled. One of the most important confounding factors is that the tasks and implementation were constantly changing because the design and research are iterative.

Doyle (1979, 1983) and Doyle and Carter (1984) conceptualized academic tasks as the medium for learning. Doyle (1979) explained that "students acquire the operations that they use to reach the goal of the task" (p. 6), meaning that students learn based on what they do. Students learn through engagement, so understanding how students engage with the labs is a way to understand how students are learning. I posit that an examination of student engagement should be a first step toward studying the effectiveness of novel learning contexts. By virtue of their novelty, these contexts lack existing measures of efficacy. It is also unclear whether any quantitative measures can be effective in measuring whether the intervention succeeded in implementing its original design goals, especially during the early stages of the design process when the curriculum is rapidly changing. Examining engagement is a lens to understand one aspect of the learning associated with a task, especially in the situation that the task or learning goals are novel, as they are in this context. This qualitative and exploratory approach also provides a way to examine the tasks and implementation without having pre-existing notions about the specific aspects that should be studied and can identify these aspects for future research.

Definition of Engagement

Middleton, Jansen, and Goldin (2017) explained that current mathematics education discussions are "predicated on the fact that in order to learn mathematics, students must engage with mathematics" (p. 667) and claimed that learning is fundamentally inseparable from the engagement through which the learning takes place. Middleton and colleagues caution, however, that researchers need to carefully define their conceptualization of engagement because there is no consistency in the field. In this subsection, I will summarize a multi-dimensional characterization of student engagement, involving behavioral, emotional, and cognitive components, which I adopt as my conceptual framing of engagement.

Doyle refers to "student engagement" as being a quality of how students do tasks but does not develop the construct of engagement in detail. Since Doyle's initial conceptualization of the relationship between learning and engagement in academic tasks, the scholarship about student engagement has expanded substantially, and engagement is now understood as a multidimensional construct that is a fundamental aspect of learning.
Fredricks, Blumenfeld, and Paris (2004) characterized engagement as a "meta" construct that manifests through a dynamic relationship between behavioral, emotional, and cognitive components, and this perspective is expanded on by Middleton, Jansen, and Goldin (2017).

One challenge of adapting the existing constructs of engagement to undergraduate mathematics education is that the conceptualization of engagement grew out of interest in the connection between K-12 students' engagement with school and overall success in that context, namely that higher levels of engagement are correlated with lower dropout rates. This is analogous to the STEM retention issue in undergraduate education, but the "dropout" outcome in undergraduate education is related to persistence in STEM fields, rather than persistence in undergraduate education in general. One key difference between the college and K-12 contexts is that K-12 engagement includes participation in non-academic activities (e.g., student government), while in the college context I am conceptualizing engagement as being isolated within specific disciplinary domains. There is a lack of consistency in describing these different kinds of engagement, but one way to separate the two is by labeling the former as school engagement (referring to overall engagement with the endeavor of school and having implications for overall persistence and success in school) and the latter as disciplinary engagement (referring to engagement with specific academic tasks). In this sense, this study aims to understand disciplinary engagement because it is focused on specific tasks in a single mathematics course (the labs), rather than school engagement with the entire college system. Throughout this research, I use the word "engagement" to refer to disciplinary engagement with the labs.

Another challenge of adopting previous definitions of engagement is that there is a lack of consistency across the literature. There is a documented call (e.g. Fredricks et al., 2004; Fredricks & McColskey, 2012; Middleton et al., 2017) for theoretical development to unify these constructs to provide consistency, and thus it is necessary for researchers to carefully define each construct as they conceptualize them without assuming a consensus definition, which I do in the subsections that follow.

Behavioral engagement. I adopt behavioral engagement to mean the observable actions, the things that students do, associated with the labs. In the context of the labs, some examples of behavioral engagement are arriving to class on time, completing the prelab assignment, communicating with group members, and experimenting with MATLAB during class. Behavioral engagement draws on the idea of participation and includes involvement in academic, social, or extracurricular activities and is considered crucial for achieving positive academic outcomes and preventing dropping out (Connell & Wellborn, 1991; Finn, 1989). Other scholars define behavioral engagement in terms of positive conduct, such as following the rules, adhering to classroom norms, and the absence of disruptive behavior such as skipping school or getting into trouble (Finn, Pannozzo, & Voelkl, 1995; Finn & Rock, 1997).

Emotional engagement. Emotional engagement encompasses the affective components of engagement; it involves students' emotional reactions to a task such as interest in the task, importance of doing well, the relationship between the task and students' learning and personal goals, and the cost of completing the task.
Emotional engagement is presumed to impact other aspects of student engagement through motivational factors, such as students' willingness to complete a task in a particular way. In the context of the labs, examples of emotional engagement (or lack thereof) include enjoying the labs, being curious about the problems that are presented as the context of each lab, and feeling frustrated that the labs do not align with the exam content.

Emotional engagement has been used to describe affective components that are associated with school engagement (Fredricks & McColskey, 2012). Emotional engagement focuses on the extent of positive (and negative) reactions to teachers, classmates, academics, or school. Others conceptualize emotional engagement as identification with the school, which includes belonging, or a feeling of being important to the school, and valuing, or an appreciation of success in school-related outcomes (Finn et al., 1995; Finn & Rock, 1997). Positive emotional engagement is presumed to create student ties to the institution and influence their willingness to do the work (Connell & Wellborn, 1991; Finn, 1989).

Cognitive engagement. Cognitive engagement refers to the psychological investment that students make to complete a task in a certain way, including thoughtfulness and willingness to exert effort to comprehend the learning goals of the task. Krause and Putnam (2016) identified diverse student dispositions towards their online calculus homework, with some students using their online homework as a tool to master the content while other students employed shortcuts to complete the assignments as easily as possible, dispositions that exemplify differing levels of cognitive engagement. Student strategies to complete tasks and self-regulated learning attributes are other constructs included in cognitive engagement. In the context of the labs, cognitive engagement includes things like an investment in understanding the goal of the labs (as opposed to simply completing them for a grade), viewing the labs as a means to develop computational skills and a conceptual understanding of the calculus content, willingness to engage with challenging problems, openness to novel problems that require creativity to solve, and students' perceptions about the role of the labs within the curriculum. Cognitive engagement has also been defined as students' level of investment in learning (Fredricks & McColskey, 2012). It includes being thoughtful, strategic, and willing to exert the necessary effort for comprehension of complex ideas or mastery of difficult skills (Fredricks et al., 2004).

How To Study Engagement

Broadly, methods for studying engagement fall into five categories: self-report surveys, interviews, observations, experience sampling, and teacher ratings of students. I will review each method in this section and explain why this study is designed to draw primarily on interviews and observations, with a post-course survey providing background information.

Interviews. The main benefits of interview methods are that interviews provide a means to access students' non-observable engagement (emotional and cognitive) and provide more flexibility than surveys to understand variability in engagement. I have employed interviews because they provide an account of how students experienced the labs without limiting their responses to pre-determined positions defined by limited options on a survey.
In the following subsections, I will describe both semi-structured and stimulated-recall interview methods, and I will explain why I elected to employ semi-structured interviews over stimulated-recall interviews.

Semi-structured interviews. Semi-structured interviews are conducted by following an interview protocol that guides the interviewer to facilitate a loosely structured conversation that touches on the topics outlined in the protocol. The interviewer can prompt the interview participant to expand on their responses and can probe for more depth. Semi-structured interviews provide space for students to describe their experiences without limiting their responses to prescribed areas and allow for a more conversational interview experience (Merriam & Tisdell, 2016). Semi-structured interviews were fruitful for my data collection. Students were able to explain a variety of factors that impacted their engagement with the labs and described several factors that were previously unrecognized by myself or the curriculum developers, while also addressing common aspects that I prompted for using the interview protocol (Appendix D). For example, students expressed the need for a lab preparation activity that would let them think slowly and carefully about the lab context, so that they could be more prepared to engage with the lab and complete it in the time allotted—this was an unexpected finding that was mentioned by several interview participants. One specific concern was that the labs involved a substantial portion of reading to set the context, which took 10-15 minutes at the beginning of the lab, and students suggested that allowing them to do this orienting activity before class would improve the lab experience. Providing structure to the interviews allowed me to collect data from each participant about previously identified areas of interest or concern. For example, question 15, "Are you invested in completing the labs?", was included in the interview protocol to elicit data about students' cognitive engagement with the labs, specifically regarding their disposition about why they choose to complete the labs.

Stimulated-recall interviews. Another way to conduct interviews is by using stimulated-recall methods, which involve showing students some artifact (e.g., a video of an observation) and asking them to reflect. For example, Webel (2013) studied high school students' perceptions about working in small groups in their math class by showing students videos of themselves working in groups and asking them to explain and justify their behaviors. One of the strengths of this method is that students' descriptions of their behaviors do not always align with observed behaviors, and thus the video reflection provides structure for students to reflect on a different version of the classroom experience than they might remember through recall alone. The goal of showing the video is not to assist students in recreating the most accurate description of the events, but rather to facilitate reflection about the episode by stimulating them with the video. I video-recorded students completing the labs, with the focus on a single group working through the lab, which would have been well-suited for stimulated-recall interviews, but logistical constraints drove me away from this method.
My goal was to capture a wide range of students' experiences by interviewing a large number of students, and this would have been logistically infeasible if I needed to coordinate my observations with my interview participant pool. One complicating factor is that students were randomly assigned to new groups each week in most of the lab sections, and thus it would have been impossible to schedule interviews of group members without knowing ahead of time which groups students would be working with.

Observations. Observations can also be used to study behavioral engagement along factors such as student attentiveness, compliance in doing assigned work, and enthusiasm about the activity (Fredricks et al., 2004). Observations are less useful for studying emotional or cognitive engagement because those dimensions of engagement involve mostly non-observable components (i.e., students' thoughts and feelings). However, Nystrand and Gamoran (1991) studied cognitive engagement by examining instructional discourse through an evaluation of the focus of student comments and questions in the classroom. They distinguished substantive engagement from procedural engagement by measuring the frequency of questions and comments that focused on learning the content beyond the scope of the task (substantive engagement) and those that focused primarily on completing the task (procedural engagement). For this dissertation, I employed observation data primarily to make sense of students' behavioral engagement. It was rare to observe emotional and cognitive engagement during the observations, but short utterances such as, "I don't see the point of these labs," or "Oh, that is interesting," revealed some aspects of emotional or cognitive engagement.

Self-report surveys. Self-report surveys involve providing students with items that address various aspects of engagement and having students select responses that best represent their experience. Behavioral engagement can be studied through questionnaires aimed at measuring behavioral engagement on a single scale (low-engagement/high-engagement) that includes a combination of factors including conduct, persistence, and participation. Emotional engagement can be studied through self-report surveys about emotions including happiness, interest, frustration, inclusion, and self-efficacy. Cognitive engagement can be studied using self-report questionnaires that aim to measure metacognition (setting goals and organizing efforts), effort control (managing concentration to complete work), and cognitive strategy use (actively monitoring comprehension or avoiding difficult problems) (Fredricks et al., 2004). For example, Kong, Wong, and Lam (2003) conducted a qualitative examination of 5th grade students' engagement with mathematics to develop a quantitative instrument to measure engagement along cognitive (surface vs. deep strategy for learning and reliance on the teacher), affective (interest, achievement orientation, anxiety, and frustration), and behavioral (attentiveness, diligence, time spent) dimensions.
Items that represented surface learning strategies included statements relating to memorization:
• "I find memorizing formulas is the best way to learn mathematics"
• "I think the best way of learning mathematics is to memorize facts by repeatedly working on mathematics problems"

The strength of self-report methods is that they are cheap and easy to administer, and they provide insights into students' subjective perceptions of their emotional and cognitive engagement, which are not directly observable. The weakness of self-report surveys is that they rely on students' subjective perceptions, and they may not reflect the reality of students' experiences. In the context of this research, the post-course survey is a self-report survey. It can provide important information about broad trends in students' perceptions about their engagement with the labs, but the data are unreliable and incomplete. The survey data are unreliable because the survey is a post-course survey. As such, students' perceptions are heavily influenced by their most recent experiences, namely a high-stakes, uniform final exam that is disconnected from the labs. We can expect that student responses will reflect this end-of-semester experience, which is a sampling of a single instance of students' perceived engagement, rather than a representation of their overall engagement throughout the semester. It is also important to note that students' perceptions are heavily influenced by their a priori expectations about how a mathematics class should be structured. Because the labs are a divergence from the traditional structure, we can expect that students will have a negative reaction to instructional activities that are novel and challenging, especially if students perceive that the extra work does not contribute to their goals for their education. Middleton and colleagues (2017) advise that interviews and observations can validate self-report data, which justifies the design of this dissertation to draw on interview, observation, and survey data.

Experience sampling. Experience sampling methods (ESM) emerged from research about "flow", the state of engagement where individuals are so focused that they lose sense of time and space (Csikszentmihalyi, 1990). This work involves understanding variations in engagement over time, so ESM was developed to capture in-the-moment data, rather than retrospective self-reports. In educational contexts, this has been done using electronic pagers, and more recently with smartphone applications in a study of calculus students' engagement (Beymer, 2020). Two advantages of ESM are that it allows the researcher to track engagement over time and it reduces problems with recall. However, ESM requires a higher level of participation from the research participants, it can be distracting to the classroom environment, and the frequent surveying of students requires that a small number of items be used in the survey, which can limit the ability to capture the multi-dimensional nature of engagement. While ESM offers the ability to capture student engagement over time, which has proven to be an important aspect of students' engagement with the labs, ESM is logistically infeasible for this dissertation, which is a first study about student engagement with the labs. It is reasonable to consider ESM as a possible avenue to understand students' engagement in subsequent studies, especially when known aspects of engagement are targeted.

Teacher ratings of students.
Teacher ratings of students are most useful for understanding teaching contexts involving young children who are unable to complete surveys (Middleton et al., 2017). In the teaching context that this study examined, the lecturers teach in large classes of more than 100 students, and the lab instructors teach multiple sections of 30 or more students only once per week. My expectation, based on previous teaching experience, is that the faculty and teaching assistants who were interacting infrequently with students would not be able to provide meaningful evaluations of students on issues of interest to the research, because they were not focused on engagement. Because of these constraints, teacher ratings of students are not used in this dissertation and a review of this literature is omitted.

CHAPTER 3: METHOD

The overall purpose of this study is to examine student engagement to understand students' learning experience with the labs. This study is design-based research intended to inform the local teaching context and provide general understanding about how lab-type activities can be designed and implemented. Descriptive data were collected using a survey, student interviews, and classroom observations. The data were analyzed using qualitative methods informed by a Straussian approach to grounded theory.

Methodology

This study follows a design-based research (DBR) methodology, which provides a structure that relates the local goal of informing practice to the broader research goal of developing theory that is useful beyond the local context (Cotton et al., 2009; Philippakos et al., 2021; Reeves, 2006). Locally, the study is situated within an iterative design process focused on improving the labs and their implementation—there are direct connections between the research and improvements to the written tasks or teaching practices. These improvements were context-specific adjustments to the tasks and implementation, such as tweaking the order of class start-up activities or adjusting the language of the lab materials. These practice-based implications are not the focus of the research because they are difficult to generalize, so they are addressed at the margins of this study: first in the introduction to the context (Chapter 1), here to motivate the methodology, and later in the conclusion where the practical implications are discussed (Chapter 6). The local context provides the "what and why" of the study. The focus of this dissertation, from a research perspective, is to understand student engagement with the labs to develop generalizable theory about designing and implementing lab-type activities in college mathematics classes. To accomplish this, the dissertation situates the lab development project within college mathematics education scholarship, draws on engagement literature to frame the research approach (see Chapter 2), and employs established methods for data collection and analysis (described below). To examine engagement and generate theory about a novel instructional context, I drew on the Straussian approach to grounded theory (Howard-Payne, 2016; Mills, Durepos, & Wiebe, 2010; Strauss & Corbin, 1997). Grounded theory provides a framework for the study to generate new theories about the design and implementation of lab-type activities. Adopting a Straussian approach means that I drew on my experiences and immersion in the context to inform data collection and analysis, being mindful of the importance of the context to understand the data.
This approach can be contrasted with a Glaserian approach (Howard-Payne, 2016) that would aim for objective detachment in data collection and seek to limit pre-existing conceptions' influence on data coding and analysis. The design for data collection follows from the Straussian approach because I sought to collect data about specific aspects of engagement based on existing theory—that engagement is a multidimensional construct that involves behavioral, emotional, and cognitive components (Middleton, Jansen, & Goldin, 2017). The semi-structured interview protocol was designed to elicit specific data about students' emotional and cognitive engagement with the labs, while creating space for students to provide unexpected data about their experiences. The classroom observations were designed to provide complementary data about behavioral engagement. The data analysis follows from the Straussian approach because I employed pre-existing notions to understand the data. For example, during the open-coding phase, I had pre-existing ideas about how data could be sorted, namely based on the structure by which the interviews and observations were designed to collect data. Interviewees were asked, "Do you feel like the labs help you learn?", and their responses were coded as "student perceptions about learning". Because of this design, I was careful to identify data that fit awkwardly within the pre-existing categories and used those occurrences to create new coding categories or adjust coding definitions. Furthermore, the Straussian approach "embraces the open coding practice, which includes the conceptualization of even solitary occurrences" (Howard-Payne, 2016, p. 56), whereas the Glaserian approach relies on initial coding that identifies patterns and trends. I recognized that unique responses are important to consider, especially because of the small sample of wildly diverse student experiences I was able to capture. I did not seek to identify or describe a general student experience, but rather to capture the range of student experiences, and thus valued singular occurrences in the data.

Research Context

This research examined a large-enrollment Calculus II course at a large, public university—the course serves approximately 1500 students each year. The course was taught in a large-lecture plus small-recitation format, which means that students met in a large lecture hall thrice weekly, where the core calculus content was delivered via lecture, and they met once weekly in a small class with a graduate student instructor. Teaching styles and strategies varied from instructor to instructor, but the main components of the instruction were uniform—each section had a common syllabus, schedule, online homework, and exams. The uniformity of the course was intended to provide a consistent learning experience across many sections taught by a variety of faculty lecturers and graduate student recitation leaders. Because of the uniformity, instructional changes generally happen at the coordinator level, because the course coordinator adjusts the online homework and provides instructions/lessons/activities to the recitation instructors, thus directing students' focus on specific aspects of the content. The traditional structure of the recitations, which the labs were designed to replace, was a 30-minute review followed by a short quiz.
The implementation of the labs was accompanied by a restructuring of the recitations, meaning that recitation classes would either be a lab for the entire 50-minute class or a mini-test, which was a quiz-type assessment that covered more sections than the previous weekly quizzes. The data collection for this study was conducted during the Fall 2018 and Spring 2019 semesters, during the 3rd year of the reform effort initiated in Fall 2016. The implementation plan was designed as a 3-year project. The focus of the first year (Fall 2016-Spring 2017) was creating and piloting the lab activities in a small class (30 students/semester) co-taught by the two tenure-stream faculty leading the project. In the second year (Fall 2017-Spring 2018), the faculty leads co-taught medium-sized lectures (80 students/semester) divided into two recitation sections in which the labs were led by a pair of teaching assistants; development focused on revising the lab materials and addressing logistical issues with scaling up the labs. In the third year (Fall 2018-Spring 2019), a large-scale pilot was implemented in which the faculty leads co-taught one full-sized lecture (160 students/semester) and the labs were used in all 7 recitation sections, each taught by a single teaching assistant (the regular structure); development focused on revising the labs for at-scale deployment and developing teaching supports for lab instructors. The labs were run at-scale in the year following the development (Fall 2019-Spring 2020); the labs were used in all 24 sections in the Fall 2019 semester (across 4 large-lectures), and then in all 38 sections in the Spring 2020 semester (across 5 large-lectures), which was interrupted by the pandemic (Krause, Maccombs, & Wong, 2020).

Rationale for Context Selection

I selected these Calculus II labs as the subject of my research because the labs are an example of a widespread movement to incorporate collaborative, lab-type activities in undergraduate mathematics courses, and there has been limited inquiry about students' learning experiences with these lab-type activities. At the research site, labs are attractive replacements for traditional recitations because Calculus II is taught thrice weekly in a large-lecture format paired with a weekly small-class recitation. These recitations have often been limited to a short review followed by a weekly quiz that focuses on computation. Recitation sections are intended to involve students, but sometimes inexperienced instructors review lecture content by demonstrating solutions from the board, essentially lecturing in a small class setting and providing limited opportunities for students to discuss ideas or work with their classmates. By incorporating labs, active learning is prescribed structurally, ensuring that students are working in groups and the recitation leader is not lecturing. The small class setting of recitations is a good fit for labs because the instructor is available to help with the complex questions that challenging lab problems generate. The in-class labs are a recent teaching innovation and constitute a substantial portion of students' learning experiences in the course—one of every four class days was dedicated to the labs. Student achievement data were tracked by the department, yet student engagement with the labs has been largely unexamined.
We had data about lab completion and course grades, so correlations could be analyzed, but we did not have a good sense about how students collaborate to complete the labs, what resources they employ, how they perceive the learning utility of the labs, or what kinds of mathematics they engage with or discuss. By exploring these aspects of student engagement, we can develop a better understanding about how the labs shape students' learning experiences. Furthermore, at the research site, similar lab elements have been enacted in four separate large-enrollment courses. Instructional improvements informed by this research can impact thousands of students each year, just in the local context.

Participants

The participants of the study were selected from the students enrolled in the sections that employed the labs. These are Calculus II students who are generally in their first year of undergraduate study and are STEM-intending—most of the students are engineering majors (70%), with other STEM majors (8%) also represented (e.g., chemistry, physics, secondary STEM teaching). Few math majors enroll in Calculus II (less than 8%), because there is a small total number of math majors, and many have completed the course prior to college or enroll in the honors sections that focus more on preparing for advanced mathematics study. The recruitment, selection, and compensation processes for each component of the data analysis are addressed in the following subsections about each particular data type.

Positioning of the Researcher

I have experience teaching both small sections and large lectures of the Calculus II course, providing professional development for graduate student and post-doc instructors who teach the course, and contributing to the development of the labs. Furthermore, I am familiar with the types of data that are already being collected at a department level, such as grade data, assignment completion data, longitudinal course-taking patterns, and demographic data. Because of my closeness to the development project, I was careful to address my biases, which were tilted towards believing that the labs, if implemented correctly, stand to provide a rich and meaningful experience in a calculus class. Throughout the research, I tried to be mindful to identify disconfirming evidence of this belief, namely by being attentive to problems with the labs that led to negative student experiences.

Data Collection

The data for this study were collected through interviews, classroom observations, and a post-course survey of the Calculus II students.

Interviews

The purpose of the interviews was to collect detailed data from participants about their engagement with the labs, namely the emotional and cognitive components that are impossible to observe. Twenty-five individual semi-structured student interviews (including the 5 follow-ups) constitute the interview data set for this research. In total, I conducted 29 interviews with students, which included the 5 follow-up interviews, and 6 interviews with instructors, but I omitted 4 student interviews (technology failures, no recording) and the instructor interviews (poor interview protocol, data beyond the scope of the research) from the data set. The omitted interviews are acknowledged here because they do exist as background data, and my conducting of the interviews likely had some impact on my interpretation of the other data.
Interview participants were solicited using a bulk email sent to all students enrolled in sections that used the labs, and a $5 gift card was offered as compensation for a 20-30 minute interview. I also visited the lecture the same day that the email was sent to briefly explain the research and encourage students to participate. The email included a link to a short Google Form survey that collected students' contact information, as well as their class (e.g., Freshman, Sophomore), recitation section, and experience with Calculus II. This extra information was included on the survey so that I could select a variety of students, but it was unnecessary because I eventually tried to arrange interviews with every student who responded to the email. My goal was to interview as many students as possible, with a variety of backgrounds, to capture a breadth of student experiences. While there was likely self-selection bias—perhaps more ambitious students were more likely to participate—I was pleasantly surprised by the diversity of experiences and perspectives. The number of interviews appears to have been effective for capturing a wide range of students' experiences. The interviews were conducted using the semi-structured interview protocol, which focused on collecting information about students' behavioral, cognitive, and emotional engagement with the labs (Appendix D). The behavioral components that were explored include the typical activity structure during the labs (what they do), use of resources to complete the labs, interactions with the instructor, and interactions with their group members. To understand students' emotional engagement, I asked students to describe how the labs make them feel (e.g., frustrated, challenged), whether the labs are enjoyable, how they feel about the coding component, and whether they feel like the labs are worthwhile. To understand students' cognitive engagement, I asked students if the labs were helpful for learning calculus overall, understanding connections between concepts, understanding applications of calculus, and practicing modeling (or understanding what modeling is). I also asked students to evaluate whether the labs are worthwhile in terms of succeeding in the course or their long-term learning goals. The interviews also collected more extensive background information about participants' calculus course-taking history, career trajectory, disposition towards mathematics, goals for the course, and study strategies to provide context to students' experiences. Interviews were planned throughout the semester, with the goal of capturing students' learning experiences at different points in the semester. I was mindful that students' feelings about the labs would change throughout the semester, especially surrounding midterm and final exams, which turned out to be a worthwhile consideration. Students' perceptions of the labs were more negative at the end of the semester, when their focus turned to their course grade and the extent to which exams determined it. I also conducted follow-up interviews with 5 of the students who were interviewed at the beginning of the semester to get a sense of how students' perceptions changed throughout the semester. For these follow-up interviews, I recycled the same interview protocol, but skipped repeat questions about the background information that was unchanged. The interview audio files were transcribed using a two-step process. First, the files were uploaded to temi.com for AI transcription.
Second, I manually edited the transcripts for accuracy. I uploaded both the audio file and transcript for each interview into MaxQDA for analysis, so I could listen to the conversation if I had any questions about the transcript. I also wrote a field memo immediately following each interview to summarize the interview in a few sentences and recorded it in the descriptive memo of the interview file in MaxQDA. I started this practice after finding it difficult to keep track of the different interviews by number alone. The field notes were used to organize the interview data, by helping me remember which interview was which, but were not included in the data set.

Classroom Observations

The purpose of the classroom observations was to collect data about students' behavioral engagement, with specific attention to how the groups worked together during the lab. Sixteen classroom observations constitute the observation data set. Twenty observations were conducted, but four observations are omitted because of data issues, such as failed technology (or user error) that resulted in a failure to capture some component of the data. Because several sections were scheduled simultaneously, I distributed my observations so that sections led by different instructors were observed, covering as many classrooms as possible. The study was designed to capture observations of classrooms with different instructors to gain understanding about what certain instructors do, or omit, that might impact student engagement. I solicited observation participants in the classroom, before class began, by asking a group for their consent to participate in the study by allowing me to video-record their work during a lab. I selected the observation group by convenience, in the sense that I tended to observe groups near an edge of the classroom so that the camera and I were less disruptive to the physical classroom space. It may be the case that my data were skewed by my decision to select groups near the periphery of the room, because groups physically located near the center of the room may have had different engagement patterns. Recordings were conducted by positioning the camera so that the entire group was in view, although not all students' screens or workstations were visible. After pilot data collection, I decided to also audio-record the observations by placing a small recorder on one of the students' desks. Dubbing this audio over the camera audio provided much better audio-video quality. This setup effectively captured the entirety of conversation in a group, and also captured the screen of at least one student; it was always the case that some students' screens were not captured by the recording, so it is impossible to know completely what every student in the observation was doing on their computer. During the recordings, I also kept field notes, including a 1-2 line summary of the observation. While these notes are not included in the data, the notes were important for data organization, and they helped me discuss and design my open-coding approach with my advisors throughout the data analysis process. The recordings were processed using Camtasia for audio-video editing, and the video files were uploaded directly into MaxQDA for analysis. Video data were not transcribed.

Course Survey

The purpose of the survey was to collect background data from the broadest group possible, with relatively little cost, being mindful that survey data are limited.
All Calculus II students had an opportunity to participate in pre- and post-course surveys for course credit—the survey includes factual questions (e.g., year of study) and Likert-type questions (e.g., "The labs are a useful tool for learning calculus overall" – agree/disagree). Students were asked to share information about their academic status, major and career trajectory, time commitments, course goals and expectations, plans for future math courses, high school and college mathematics experience, calculator experience, and dispositions and habits about mathematics and college work (see Appendix A). This survey was part of the departmental data collection that is used to monitor the status of all large-enrollment courses, and it was not specifically designed for data collection for this dissertation, which is why the bulk of the survey is unrelated to the research question. The post-course survey was designed by the lab designers as a teaching-related data collection mechanism. The researcher provided input for the design but was not directly responsible for the authoring or administration of the survey. The pre-survey was posted on the uniform course website during the first two weeks of the course, and the post-survey was posted during the last two weeks. Students were emailed a reminder to complete the survey through a bulk email sent by the Calculus Coordinator, who is responsible for coordinating the entire calculus sequence but is not necessarily teaching every student involved. The lecturers and recitation instructors also reminded students about the survey during class. Students in the lab sections were also surveyed about their experience with the labs within the same, single survey, but with additional questions about the labs. They were asked to describe the frequency with which they engaged in different behaviors during the labs, such as writing MATLAB code or analyzing results/data to solve a problem—these are elements of their behavioral engagement. Students were also asked to quantify (on a Likert-type scale) aspects of their emotional and cognitive engagement, including their perceptions about the usefulness of the labs as a learning activity, the extent to which the labs make learning calculus feel important, and their perceptions about feeling included and taking a leadership role during the labs. Students were also asked to rate the appropriateness of the length and difficulty, the quality of the feedback, and the group work dynamics associated with the labs—these questions focus on understanding the impact of different structural components on students' learning experiences and are not data relevant to the research question for this dissertation. Several students submitted two post-course survey entries, perhaps following both the initial email and the reminder email. In these cases, the second submission was used in the data for this study and the first response was omitted. The responses varied between the two submissions, but these differences were not investigated in this study.
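Resolving these duplicates is mechanical. As a minimal sketch, assuming a hypothetical export of the survey with illustrative column names (StudentID, SubmittedAt are not the actual field names), the rule of keeping the later entry could be applied in MATLAB as follows:

```matlab
% Minimal sketch: keep only each student's latest post-course survey entry.
% The file name and column names are illustrative, not the actual export.
T = readtable('post_survey_export.csv');
T = sortrows(T, {'StudentID', 'SubmittedAt'}, {'ascend', 'descend'});
[~, ia] = unique(T.StudentID);   % first row per student = latest submission
T = T(ia, :);                    % duplicates resolved: second submission kept
```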
Data Analysis

The data were analyzed using a three-phase coding process that involved open coding, descriptive coding, and thematic analysis coding. The open coding sorted the data into sets of common codes. The descriptive coding identified variation and highlighted nuance within the open code sets. The thematic analysis coding organized the data within the engagement framework, along behavioral, emotional, and cognitive dimensions. Coding involved both inductive and deductive processes, because the study was designed to produce a codebook that reflects the evolved analysis but draws on several pre-existing notions brought to the research. Following the Straussian approach to grounded theory, the coding was conducted using the constant-comparative method. The following sections elaborate on each of these phases by describing the coding process for each data source and providing examples of the coding process. The codebook is presented in Appendix G.

Open Coding

The open coding phase is designed to identify new concepts and patterns, which is an inductive process. The goal of the open coding phase was to organize and categorize data by applying codes to label similar data instances, then defining the codes using that data. Following the constant-comparative method, I refined the definitions of my codes throughout the open coding phase, working to ensure that each data instance was described by the code definition. As I created new codes, I wrote a brief definition of each code. As subsequent data instances were coded using existing codes, I checked that the definition applied, and updated the definition if needed. Following a definition update, I needed to check that previously coded instances fit the updated definition to either confirm or remove the code. I also needed to compare the code and the updated definition to other codes to determine if codes overlapped and should be merged.

There are several aspects of the open coding phase that obscured the boundaries between data collection and data analysis, aligning this study with a Straussian approach to grounded theory. First, I brought a priori conceptualizations about student engagement to the research, informed by the literature and experience (e.g., Krause & Putnam, 2016). These conceptualizations significantly shaped the research questions and interview protocol, which was developed to elicit data about specific aspects of student engagement and instructional factors. Therefore, my study was targeted at uncovering certain kinds of data about specific aspects of students' learning experience, and thus some of the coding was prescribed. One example is apparent in interview question 13 (Do you feel like the labs help you learn?), which is designed to elicit data about students' perceptions of the learning value of the labs and is tied to students' cognitive engagement. As a result, participant responses to this question were often coded "perception of learning value", which is a subset of the "cognitive engagement" theme. Similarly, interview question 14 (How do the labs make you feel about your learning? Are you interested in the labs? Do you find the labs enjoyable?) was designed to elicit data about students' emotional engagement with the labs, and thus the responses were coded "enjoy/interest", which is a subset of the "emotional engagement" theme. Second, following the constant-comparative data analysis paradigm, my codebook and the structure of my coding categories were updated throughout the data analysis. For example, during the coding of the interviews, I made a specific effort to update and re-align my codebook after analyzing 2 interviews, and the codes were not stabilized until approximately 10 interviews had been analyzed.
I followed a similar practice for coding the classroom observations, which resulted in a methodological decision about the grain-size of the video analysis: that I would code silent work / collaboration by the length of the instance (to get a measure of the amount of time that students collaborated), while coding mathematical practices by exercise (to get a sense of which practices were facilitated by which exercise). Third, my review of the literature about engagement (Fredricks et al., 2004; Fredricks & McColskey, 2012; Middleton et al., 2017) drove my decision to seek a coding structure that fit the conceptualization of engagement being multi-dimensional (behavioral/emotional/cognitive). This approach ensured that my data would answer my research questions, but it was a preconceived notion that needed to be critically examined. Similarly, my effort to understand and describe the productive struggle observed in-class led me to adopt the CCSS-M Standards for Mathematical Practice (Common Core State Standards Initiative, 2010) as a coding framework, which dictated how subsequent data were viewed. The main risk of this philosophical approach is that the study could corroborate my preconceived understanding of the phenomenon, rather than generating novel theory. Because of this, I needed to be attentive that my codes and definitions were aligned with and drawn from the data and be careful that I was not forcing data into a rigid structure defined by these preconceptions.

Interview Open Coding. The interview coding was conducted using MaxQDA and the interview audio files with attached transcripts. I open coded the transcripts while I listened to the audio recordings, meaning that I labeled a participant's response to a question, or part of a response, based on what the response was about and how it fit within existing codes or necessitated a new code.

Observation Open Coding. I analyzed the observation data by watching the videos within MaxQDA and coding portions of the recording, meaning that I labeled an episode based on what it was about and how it fit within existing codes or necessitated a new code. The engagement was coded at the group-level first, meaning that my attention was on the entire group activity, instead of being focused on an individual student. After this first phase of group-level coding, I was able to construct a "behavioral engagement timeline" (presented in Chapter 4), and it became obvious that similar timelines at the individual-level would illuminate interesting relationships between the group-level and individual-level engagement dynamics.

One aspect of this analysis that stands to be critiqued is the grain-size of the coding, that is, determining the length of time that constitutes an "instance" to be coded. Students can fluidly shift between modalities, as shown in the timelines in Chapter 4, but there is some minimum time required to switch modalities. For example, in Observation 2, from 34:40-37:43, the group is discussing slowly, as they look through their calculation, with several 20-second pauses. The conversation chain is maintained through the pauses, and thus this entire episode is coded "Whole-group Discussion". When a longer pause interrupts the chain of discussion, however, this is a switch in modalities: for example, an individual might discuss a solution approach, work for 2 minutes to explore that method individually, and then afterward engage in conversation to discuss their findings. In this situation, the individual switched modalities as they engaged in different activities. Because of this, I chose 30 seconds as the minimum pause interval for coding "working silently" as an activity that broke up a discussion episode.
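To make the grain-size rule concrete, the sketch below applies the 30-second threshold to a set of invented utterance timestamps. The actual coding was done by hand in MaxQDA, so this only illustrates the segmentation logic:

```matlab
% Minimal sketch of the 30-second grain-size rule (the real coding was done
% by hand in MaxQDA). Each row of `utterances` is an invented [startSec endSec]
% span of group talk; 2080-2263 seconds corresponds to 34:40-37:43.
utterances = [2080 2105; 2122 2150; 2170 2263; 2310 2335];
minPause   = 30;                   % pauses shorter than this maintain the chain

episodes = utterances(1, :);
for k = 2:size(utterances, 1)
    gapSec = utterances(k, 1) - episodes(end, 2);
    if gapSec < minPause
        episodes(end, 2) = utterances(k, 2);      % short pause: same discussion episode
    else
        episodes = [episodes; utterances(k, :)];  %#ok<AGROW> long pause: a "working silently" instance splits the episodes
    end
end
disp(episodes)   % merged "Whole-group Discussion" episodes: [2080 2263; 2310 2335]
```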
Another complication is that the group-level modalities span a longer timescale based on the overall collaboration structure, rather than the moment-by-moment activity, to capture the differences between group dynamics throughout a lab session. Consider, for example, a situation where a group is engaged in a whole-group discussion for 10 minutes, but one student tunes out for 2 minutes while they work silently. In this situation, the group-level modality is "whole-group discussion", and the individual-level modalities may switch between "whole-group discussion" and "working silently" for individual group members, with the majority of the individual modalities being "whole-group discussion".

Episodes that revealed students' emotional or cognitive engagement were rare to observe, and short when they did appear. Occasionally a student would say, "Oh, that is interesting!" or "These labs are so frustrating!" and these instances were coded to match similar, more numerous instances in the interviews.

Descriptive Coding

The goal of the descriptive coding phase was to identify patterns and variation within each open code set, to highlight the nuance and understand the complexity of each set. Descriptive codes were applied to each instance of the open-code set by comparing each instance and describing it with a word or phrase. For example, to descriptive-code the instances in the "interest/enjoyment" open-code set, I looked at each instance and asked, "what does the instance tell us about the participant's interest in or enjoyment with the labs?" The following descriptive codes were generated:
• Finds the labs INTERESTING (or not)
• REAL-WORLD CONTEXTS are interesting
• enjoy CHALLENGE of labs
• CONFUSION prevents interest/enjoyment
• feeling RUSHED prevents interest/enjoyment

Thematic Analysis Coding

The goal of the thematic analysis coding was to organize the open codes within the engagement framework along behavioral, emotional, and cognitive dimensions. To accomplish this, the open codes were organized into thematic code sets based on how those open code definitions related to the engagement dimensions. Individual instances of coded data were not examined for this coding phase, and instead the coding happened at the set level while working with the definitions previously generated in the open coding phase. Logistically, this was the simplest coding phase, because the raw data was not examined as it was for the other two coding phases. One aspect of this thematic analysis coding that stands to be critiqued is that interview and survey data about emotional and cognitive engagement are drawn from participants' reports about their own emotional and cognitive engagement, which is a perception that is influenced by that same emotional and cognitive engagement. There is some cyclical dependency involved here that can amplify how participants reported about their experiences, and this could happen in the positive or negative direction. A second aspect of thematic analysis coding that stands to be critiqued is the conceptualization of the cognitive engagement dimension. The open codes related to the behavioral and emotional dimensions were straightforward to sort into thematic codes because these dimensions were defined most clearly in the literature.
Behavioral engagement is related to actions students performed—this can be directly observed, and participants can reliably report about the actions they take during the labs. Emotional engagement is related to participants' feelings about the labs—this is not often observed, but participants can again reliably report about their feelings about the labs. The cognitive dimension was less clear, because it is related to the psychological investment that students make to complete a task, which is only indirectly accessible by either observation or interview because it is related to the value students place on the labs based on how students perceive the labs as aligning with their goals. Because of this relationship between students' goals and their perceived value of the labs, data about students' perceptions of the labs are included in the cognitive engagement component.

Survey Coding

The details about the coding process for the survey data are included in this separate subsection because the survey data were analyzed at several different times during the research process. The survey data were analyzed in a preliminary phase using simple tabulation and simple descriptive statistics as soon as the survey was completed each semester, because these data were the most readily available and the simplest to process. This preliminary survey analysis informed ongoing design revisions (Krause et al., 2019), fulfilling the design-based research methodological philosophy of contributing directly to practice. The survey data cited in Chapter 4 of this dissertation were organized in a separate process from the preliminary analysis and are organized according to the codebook structure that emerged after the analysis of the interview and observation data. In this sense, the survey data can be used to understand the interview and observation data within a larger context, rather than being data used to draw conclusions on their own. The interview and observation data are detailed but from a small participant pool, and the survey data are more superficial but from a broader participant pool.
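As a minimal sketch of that preliminary pass, assuming a hypothetical survey export with an illustrative Likert item column (LabsUsefulOverall is not the actual field name), the tabulation amounts to counts and percentages per response option:

```matlab
% Minimal sketch of the preliminary survey tabulation; the file name and the
% column name LabsUsefulOverall are illustrative, not the actual export.
T = readtable('post_survey_export.csv');
levels = {'Strongly disagree', 'Disagree', 'Neutral', 'Agree', 'Strongly agree'};
item = categorical(T.LabsUsefulOverall, levels, 'Ordinal', true);

Response = levels';
counts   = countcats(item);              % simple tabulation per response option
percent  = 100 * counts / sum(counts);   % simple descriptive statistics
disp(table(Response, counts, percent))
```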
CHAPTER 4: RESULTS

The results of the study are presented below, organized by the engagement dimensions. The findings about the behavioral dimension include both individual and group-level engagement, primarily drawn from the observation data. The findings about the emotional dimension are drawn primarily from interview data about affective aspects of participants' experiences with the labs, such as enjoyment and confidence. The findings about the cognitive dimension were drawn from both interview and observation data, and the analysis involved more conceptual development than was needed for the behavioral and emotional dimensions.

Behavioral Engagement

One layer of behavioral engagement involves logistical factors that structure the learning environment. The lab instructors were directed to have students draw a card as they arrived to class to create random groups of 3-4 students. In the case that the card draw resulted in a group of 1-2, this group would be combined with another group. Most instructors followed these directions, but some allowed students to form their own groups. Interview participants reported that this group forming process was acceptable, and there were few complaints about group members. Survey responses indicated similar satisfaction with the groupwork. Multiple interview participants suggested that keeping groups for several consecutive weeks would be more productive, because they would establish a working relationship. Working on the labs required students to have MATLAB installed and to download a MATLAB file for each lab, or to use a cloud-based option—there were few technical issues with this. Observations revealed that some groups shared laptops while in other groups every group member had their own laptop; if an individual student experienced technical issues, they either troubleshot those issues themselves, with the help of the instructor, or put their laptop away and shared with a group member. The activities were written as a narrative within the MATLAB file, which allowed students to conduct their computation and coding within the narrative structure of the activity. Students were provided an accompanying worksheet (hard-copy) to report their findings from the activity, and a single worksheet was turned in to be graded for a single group grade—an example worksheet is provided in Appendix E.
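To give a sense of that narrative structure, the fragment below is a hypothetical imitation of a lab file rather than an excerpt from an actual lab: prose and questions are interleaved as comments with short blocks of provided code that students modify and re-run.

```matlab
% (Hypothetical imitation of a lab file, for illustration only.)
% A dorm of N students is hit by a flu outbreak. The number of infected
% students I(t) can be modeled by the logistic equation dI/dt = k*I*(N - I).
N  = 400;       % dorm population
k  = 5e-5;      % contact rate
I0 = 3;         % infected students on day 0

[t, I] = ode45(@(t, I) k*I*(N - I), [0, 60], I0);
plot(t, I), xlabel('days'), ylabel('infected students')

% EXERCISE 2: Double the contact rate k and re-run the code above. How does
% the day on which half of the dorm is infected change? Record your group's
% answer, and your interpretation, on the worksheet.
```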
Observations revealed that some students took on the role of the recorder and wrote all of the answers their group discussed, while other groups passed around the worksheet for each group member to write or edit. The nature of a single document being turned in for the whole group seemed to inspire collaboration, specifically because students posed conjectures and confirmed their reasoning before they accepted the answer for the group. Observations revealed that the instructors had a limited impact on observation participants' engagement, simply because interactions with the instructor were just a small portion of the class time. For example, in the observations presented in the timelines that follow, the reader can see that Observation 2 and Observation 3 contain a single 2-3 minute conversation with the instructor and Observation 7 contains two 5-10 minute conversations with the instructor. Interactions with instructors involved a range of topics and durations, including short questions to confirm results and long discussions to pose conjectures and plan solution methods.

Another layer of behavioral engagement involves the activities related to idea development, which can be examined at both the individual and group level. At the individual level, a student's behavioral engagement can be directly observed and categorized into distinct modalities that involve different collaborative and individual activities. At the group level, behavioral engagement emerges from the interactions between group members. Five individual-level modalities and five group-level modalities are described below. Following these descriptions, several timelines show the behavioral engagement modalities throughout a class, at both the group and individual level. These timelines provide a visual representation of the wide range of behavioral engagement observed.

Individual Behavioral Engagement Modalities

A variety of individual behaviors were observed during the labs, which are categorized into five individual behavioral engagement modalities described below. The engagement modality of each group member is independent but related to and influenced by what their group members are doing.

Whole-group Discussion. A student engaged in whole-group discussion is participating in a discussion that is open to the entire group. The student may be sharing their thinking with the entire group or listening to a group member who is sharing. Just one instance of active participation during a discussion warrants coding the engagement by that student during that instance as whole-group discussion, because the student has contributed to the discussion.

Split discussion. A student engaged in split discussion is participating in a discussion that involves only one other group member while the other group members are engaged with something else. The student may be sharing their thinking with or listening to the other split-discussion participant. An example of a split discussion is if a student taps their neighbor on the shoulder and softly explains their thinking, without addressing their speech to the other group members who are working silently or discussing something else. Sometimes groups that are engaged in split discussion are visually split, with a group of 4 sitting in two different pairs. Other times, the pairings of a split discussion can shift throughout the class. For example, discussion pairs can switch, as is shown in the Observation 3 timeline below, where Individual D has split conversations with both Individual A and Individual C. Students engaged in split discussions usually do come together for short whole-group discussions throughout the class to check the understanding and ideas of the other pair of students in their group.

Working silently. Students are engaged in working silently when they are writing, reading, or typing silently. The students were assumed to be on-task if they appeared to be working but their screen was not visible.

Silent Observer. Students are silent observers when they are watching, listening, or thinking, but not working on their computer, writing, or discussing. The student may appear to be watching and listening to their group, or they may appear to be looking around the classroom, scrolling through the lab (without reading or changing inputs), or lost in thought. A student is assumed to be engaged as a silent observer unless they are doing something actively off-task.

Off-Task. Students are off-task when they are actively doing something not related to completing the lab. Off-task behaviors include browsing the internet, texting, or instant messaging and are explicitly distinct from being a silent observer because it is visually obvious that the student is engaged in something else.

Note about Off-Task Behaviors. It was rare to observe students who were off-task. In all 16 observations, two or more students within a group were focused on completing the lab as intended—that is, they were putting forth a good-faith effort to progress through each of the questions and were discussing their thinking with at least one other student. It is also worth noting that the effect of being videotaped may have inspired students to stay on-task during the labs. It was uncommon for students to be off-task for more than a couple of minutes during a lab session, but prolonged off-task behaviors were observed in three observations:
• Observation 13: One group member in a group of three is persistently instant messaging throughout the class period. This student seems to try to appear engaged, either to the recording or his group members, by doing things like looking at his classmates and nodding, or by scrolling up and down the lab when he minimizes his instant messaging.
At one point, the other two group members are discussing a method, then this off-task student offers the output from the code he runs, but it is not what the other group members were talking about and they are confused by his input. Ultimately, this student never focuses on the discussion long enough to get involved.
• Observation 19: This is a group of three that was joined by a late 4th student 35 minutes into class, who opened their computer but said nothing to the group for the remaining 12 minutes of the observation. This student did, however, add their name to the sheet to earn the group grade.
• Observation 20: This group involved two off-task students. The first off-task student looked through their just-handed-back exam for 7 minutes while their group members worked on the lab. Then this student disappears to work with another group for 6 minutes and returns with "answers" from the other group. While this is happening, a fourth student, who came to class late, is visibly off-task, doing things like rolling his paper into a tube and blowing through it. The two off-task students also engage in an off-task discussion while their two group members continue to work on the lab.
Note that these three most prolonged off-task behaviors occurred after the midterm of the semester, and Observations 19 and 20 were during the last lab of the semester.

Group Behavioral Engagement Modalities

Beyond individual engagement, there is group-level engagement that emerges from the interactions of each group member and can be categorized into modalities similar to the individual-level behavioral engagement. Below, the five modalities of group-level behavioral engagement are described, followed by a short sketch that summarizes the coding thresholds.

Whole-group discussion. When two or more group members are engaged in whole-group discussion, the group-level engagement is whole-group discussion. The individuals may engage in short episodes of working silently as they read the lab, run code, or do calculations, but they come back to the group for discussion after each step to share and verify their thinking. One exception to the coding is that when short whole-group discussions (<30 seconds) interrupt longer periods of silent work (>30 seconds), those individual whole-group discussion instances are considered part of a group-level parallel work episode, as demonstrated in the timelines of Observation 3 and Observation 7 below.

Split discussion. When two or more group members are engaged in split discussion, the group-level engagement is split discussion. The individuals may engage in short episodes of working silently as they read the lab, run code, or do calculations, but they come back to their discussion with their split-discussion partner. One exception to the coding is that when short split discussions (<30 seconds) interrupt longer periods of silent work (>30 seconds), those individual-level split discussion instances are considered part of a group-level parallel work episode, as demonstrated in the timeline of Observation 7 below.

Parallel work. Groups engaged in parallel work are working on and thinking about the same task, with a mixture of whole-group discussion, split discussion, and working silently. Students who are engaged in parallel work are focused on reading the prompts, doing calculations, editing/running code, and sharing their thoughts with their teammates. If all group members are working silently for less than 2 minutes, then the episode consisting of discussion—silent work—discussion is coded as parallel work.

Working Silently. When all group members are working silently or being silent observers for longer than 2 minutes, the group-level engagement is working silently. The activities students are engaged in include writing, reading, typing, and observing. Note that students were assumed to be on-task unless directly observed to the contrary, with attention to the fact that being a silent observer is a mode of on-task engagement.

Off-task. Groups are off-task when two or more group members are directly observed doing something not related to completing the lab. Off-task behaviors include silent behaviors such as browsing the internet, texting, and instant messaging, as well as off-task discussions.
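The 2-minute threshold that separates group-level working silently from parallel work can be summarized computationally. The sketch below is an illustration of the rule only (the actual coding was done by hand from the video), using an invented second-by-second record of which members are talking:

```matlab
% Minimal sketch of the group-level 2-minute rule (illustration only; the
% actual coding was done by hand from video). talking(i, s) is true when
% member i is discussing during second s; here it is invented random data.
nMembers = 4;
nSeconds = 50 * 60;                          % a 50-minute recitation
talking  = rand(nMembers, nSeconds) < 0.005;

allSilent = ~any(talking, 1);                % seconds when every member is silent
d = diff([false, allSilent, false]);
runStart = find(d == 1);
runEnd   = find(d == -1) - 1;
runLen   = runEnd - runStart + 1;

% Only all-silent runs of 2 minutes or longer count as group-level "working
% silently"; shorter silent stretches fold into a "parallel work" episode.
isLong = runLen >= 120;
silentEpisodes = [runStart(isLong)', runEnd(isLong)'];
disp(silentEpisodes)                         % [startSec endSec] per episode
```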
Behavioral Engagement Modality Timelines

In this section, timelines are presented for three observations, providing a visual representation of individual and group-level behavioral engagement; they are a visualization of the different ways that active learning and group work "look" in the classroom. These timelines show how individual behavior is related to group behavior and provide more detail about the variety of ways that students interact during collaborative tasks. These examples show that engagement is not homogeneous within a group and capture how different group members engage during discussions and silent work. Note that the various modalities of behavioral engagement are not hierarchical, meaning that one particular mode of engagement is not taken to be superior to another; this is simply a description of different ways that groups interact. Nearly all of the observed groups were productive and completed the lab activities as intended, which is evidence that a variety of engagement modalities are productive forms of collaboration. In a sense, this data shows that "active learning" happens with a range of "active" taking place in the classroom.

These timelines also show that interactions with instructors constituted a minimal portion of students' activity in the classroom, typically limited to a few interactions per class, each lasting only a few minutes. There were no observations of an instructor lecturing to their students, and any explanation or presentation from the front of the room was limited to just a few minutes at the beginning of class. These findings from the observations are closely related to Design Principle 1: Prescribe active learning, specifically collaboration, in the small-class recitation setting.

The timelines show that the vast majority of student engagement during Observations 2 and 7 is discussion-oriented, while engagement during Observation 3 is more oriented toward individual work. Observations 2 and 7 are more aligned with the goal of facilitating collaborative active learning because more than half of the class time involves discussion within the group. Even though Observation 3 is less discussion-based, it still amounts to nearly half of the class being discussion-oriented, with the students actively engaged in individual work (running code, doing calculations) the rest of the time. Ultimately, all three of these observations show different ways that students engage for a class-long activity that facilitates active learning.

Observation 2 Description and Interpretation. Observation 2 involves a group that engages in whole-group discussion for a large portion of the class, led by Individual A—she takes the lead on writing the group solution, drives the discussion throughout, and sets the pace of the group's progress.
Individual B collaborates closely with Individual A, and the other two group members take a more passive role. At the beginning of class, the group tries to get started by reading the lab and making sense of things individually, then shifts to whole-group discussion for the majority of the class. They are only off-task for a couple of minutes near the end of class as they discuss another class.

Observation 7 Description and Interpretation. Observation 7 involved a group that engaged in a mixture of split discussion, parallel work, and whole-group discussion. Individual A plays the role of the communicator in her group. For the beginning of class, there are two discussion threads going on and she navigates between both of them to provide cohesion to her group. Individual B works silently most of the class, but amiably discusses his thinking with Individual A when she initiates conversation. The other two group members work as a pair, engaging in split discussion with each other throughout class. Individual C and Individual D are also quick to discuss with Individual A when she initiates conversation, but hardly communicate with Individual B at all. Towards the end of class, the engagement shifts to whole-group discussion, but Individual B continues working silently as the other group members discuss their thinking.

Observation 3 Description and Interpretation. Observation 3 involved the group that worked the most silently of any group observed, but this group still collaborated throughout class, just less frequently than other groups. The group completed the lab effectively and discussed each of their answers in a whole-group discussion. The group simply spent long periods of class looking at their laptops individually and came together only for short discussions to ask questions and share their thinking.

The timelines are presented below, with the group-level timeline shown first, followed by each individual-level timeline. The instructor interaction is shown only on the group-level timeline because including it on the individual-level timelines made it impossible to fit all five on the same page.

The top timeline represents the group-level engagement. The four timelines below represent the individual-level engagement, starting with Individual A and ending with Individual D.
Figure 2: Behavioral engagement modality timelines representing Observation 2.

The top timeline represents the group-level engagement. The four timelines below represent the individual-level engagement, starting with Individual A and ending with Individual D.
Figure 3: Behavioral engagement modality timelines representing Observation 7.

The top timeline represents the group-level engagement. The four timelines below represent the individual-level engagement, starting with Individual A and ending with Individual D.
Figure 4: Behavioral engagement modality timelines representing Observation 3.

Emotional Engagement

Emotional engagement involves students' affective response, or their feelings about the labs. In the context of the labs, examples of emotional engagement (or lack thereof) include enjoying the labs, being curious about the problems that are presented as the context of each lab, feeling frustrated that the labs do not align with the exam content, and feeling confident about being able to successfully complete the labs.
Emotional engagement is presumed to impact other aspects of student engagement through motivational factors, and students' perceptions about the task influence both their emotional and cognitive engagement. The data captures a wide range of emotional engagement, with examples of both positive and negative emotional engagement represented. The survey data tended to be negative, the interview data tended to be a mixture of positive and negative (often within the same interview), and emotional engagement only appeared in the observations in rare instances where students expressed their feelings aloud, often as proclamations of frustration.

Interest and Enjoyment

Multiple participants explained that they found the labs interesting or enjoyable, citing reasons such as finding the real-world contexts interesting, enjoying the challenge of the labs or how they were like puzzles, and being able to work on mathematics during class by experimenting with variables to see how things play out. One participant, majoring in elementary education, felt that the labs were designed mostly for engineers and did not understand how they could connect to her goals, but she found the labs interesting, enjoyable, and challenging, and perceived that they helped her learn. It is worth noting that asking participants about their interest and enjoyment directly, rather than letting those topics emerge organically, may have elicited more positive responses. Interview Question 14 states: "How do the labs make you feel about your learning? Are you interested in the labs? Do you find the labs enjoyable?"

Confusion

Multiple participants explained that they were so confused they could not make sense of what was going on, or that they did not have enough time and felt rushed to complete the labs. Multiple participants initially reported that they found the context of the labs interesting, but in follow-up interviews explained that they were so frustrated and confused that they did not enjoy the labs. One of these students explained, "I guess if I like understand what was going on, I'd probably find them a lot more enjoyable. Like I mean they're kind of a fun take on math I guess, but I just don't, I can't like…get to the point." Multiple participants complained that their confusion was never resolved, in part, because they did not receive feedback that helped them understand their mistakes on the labs.

Frustration and Anger

Multiple interview participants expressed frustration with the labs. Some were frustrated by the timing of the labs because they felt rushed during class or were unable to finish the labs carefully because they ran out of time. Others reported being frustrated that they could not earn a full score on the labs and were doubly frustrated when they perceived that the timing of the labs was related to their low scores. Another group of participants expressed frustration that the labs did not help them study for quizzes and exams and seemed to be disconnected from the rest of the class. Finally, some participants expressed frustration that they did not get feedback that was useful for learning from their mistakes and improving on future labs. Similar instances of frustration, sometimes even anger, were captured in the observations when students expressed their feelings in short verbal outbursts.
In one observation, students discussed frustration that the lab took the entire class after they were told that it was easier than previous labs. In several observations, students expressed frustration that the labs made them feel more confused. In other observations, students discussed their frustration about not being able to earn a perfect score on the labs—at the end of one observation in which students had expressed confusion throughout, a group member said, "well, there's another 7 out of 10."

Feelings about Groupwork

Many participants reported that working on the labs in groups made the lab experience more enjoyable. Many participants reported that they generally liked groupwork, and that the labs were not much different than previous groupwork experiences. Reports about negative experiences with group members were rare, with the only instance being related to frustration that some group members had not properly prepared for the lab (i.e., they had not downloaded the files before class). Several participants reported that keeping the same groups for several weeks was helpful because some introduction period is required for group members to meet each other and learn each member's strengths, and this overhead is reduced when working with the same group. On the post-course survey, students reported that they felt included in their groups.

Q48: Rate your usual participation in the Labs. (Linear scale: 1 = "I often feel included during Labs"; 4 = "I rarely feel included during Labs")

Answer selection                              1     2     3     4
Fall 2018 (n=105; response rate = 68.1%)      36%   39%   15%   10%
Spring 2019 (n=92; response rate = 60.1%)     38%   40%   18%   4%

Table 1: Post-course survey responses to Q48: Rate your usual participation in the Labs.

One interview participant, an international student who was an elementary education major, reported negative experiences of feeling excluded from her group and wondered if it was because she was an international student. She explained, "It depends because some group members it seems ignore me, I don't know….Basically... I mean maybe because I don't speak English well. People tend to be more, talk more to other people, not me, but I still, try to participate."

Confidence

Interview participants shared a variety of perspectives about their confidence as it related to the labs. Nearly all of the participants expressed high self-confidence in their math ability in general, but some shared that recent struggles in Calculus 2, such as a low exam score, had shaken their confidence. Several students reported that they had gained confidence with their coding skills, which had been daunting when they were first starting the labs. Although students reported that they struggled on the labs and lacked confidence to complete them, no interview participants identified these feelings as damaging their overall confidence in their mathematical abilities—participants isolated their lack of confidence in completing the labs from their overall confidence in their ability to succeed in Calculus 2.

Cognitive Engagement

Cognitive engagement involves engaging in a task with focus, determination, investment, and effort, and is related to students' goals. In the context of the labs, students are cognitively engaged when they are attempting to complete the lab by working with their group members, which could include persevering through difficulties.
In contrast, students are cognitively disengaged if they are not putting forward an effort to complete the lab or contribute to their group's thinking. Cognitive engagement is related to intrinsic and extrinsic motivation based on the value that students assign to the labs and how the labs align with their goals.

Student Learning Goals and Grade Motivation

Students' cognitive engagement is closely related to their own learning goals and their perception of how the labs help them achieve those goals. All of the interview participants reported that they were invested in completing the labs, but the motivation for their investment differed based on the relationship between their goals and their perception of whether the labs could facilitate those goals. Some students were cognitively engaged because they valued the labs as a way to learn mathematics and improve their coding skills, some were engaged because of their habits of being a "good student" and doing the tasks assigned to them, and some were engaged because they were trying to earn as many points as possible to earn a high grade.

The participants who affirmed that they were invested in the labs because they valued them as a learning opportunity always identified their own learning goals associated with the labs. Some explained that exposure to the material in different ways is useful for their learning. Others shared that they were invested because of their interest in the labs—they liked seeing how computer simulations can be used, they wanted to learn how to use MATLAB, or they were interested in understanding the realistic contexts of the labs. Other participants explained that they were invested because the labs were challenging, and they were driven to get the right answer.

A second group of participants explained that they completed the labs simply because they were an assigned task. These students explained that the assignment being graded was a secondary consideration, beyond the simple fact that they try to do all of the tasks assigned to them. One participant explained that they trusted that the labs must be helpful for learning if they were part of the curriculum.

A third group of participants shared that their main investment in the labs was to earn the grade. Some explained that they were completing the labs because they were a graded assignment, and they might allocate time elsewhere if the labs were not graded. Multiple participants who shared that they were motivated solely by earning points on the labs also explained that they were confused by the labs to the point that they felt they were not learning, or that the learning goals were not transparent. One participant shared that without earning a grade, the pain of struggling through the labs would not be worth it.

Student Perceptions

Interview participants were asked to discuss their perceptions about various aspects of their learning experience with the labs. This perception data, being self-report data, is indirectly related to both emotional and cognitive engagement based on the alignment between students' goals and their perceptions about how the labs align with those goals. When students' goals were more intrinsic, such as wanting to learn about coding in MATLAB, their emotional engagement and perceptions tended to be positive. When students' goals were more extrinsic, such as wanting to improve their course grade, their emotional engagement and perceptions tended to be negative.
Negative emotional engagement and perceptions about the labs did not, however, correlate with behavioral disengagement with the labs; every observation revealed students engaged in completing the labs, even when they did not enjoy the labs or did not think the labs were helping them learn calculus.

Perception of Learning Value

Participants were asked if they thought that the labs helped them learn, overall, and the responses were mixed. Some participants who affirmed that the labs were helpful for their learning experience explained that the labs provided a different perspective that helped them see the connections between calculus concepts and understand mathematics more broadly. Other students who found the labs useful appreciated the challenge of the labs and how the labs helped them understand realistic applications of calculus, including the connections between calculus and physics. One student explained, "it goes back to how I want mathematics to work for me…I want to see the real-world application of it. So, I see that's the learning experience for me is I can see that real world application, which is really cool." Multiple students found the labs valuable because they provided an opportunity to work on interesting mathematics during class, rather than rushing through a few review problems before a quiz, as they perceived recitations in some of their other classes to be structured.

Participants who expressed that they did not find the labs useful were often confused by the labs, either because of the barrier-to-entry that coding in MATLAB presented or because they struggled to understand how the mathematics from class was applicable. One student wondered why they were using MATLAB simulations when it seemed like the graph could have been provided on a handout and the same analysis performed without MATLAB. Another student explained that the goals of the labs were misaligned with her longer-term goals as a student pursuing a teaching career in secondary education. Another student explained that the labs were interesting applications of calculus when they could understand them, but that on the whole the labs seemed unrefined, and the MATLAB barrier ultimately made it more difficult to succeed in the course.

Perception of MATLAB

Interview participants were asked how MATLAB impacted their learning experience with the labs. Students explained several reasons that they found their experience with MATLAB useful. Some students expressed that knowledge about how to use MATLAB would be useful for engineers, either from experience in their engineering classes or because a friend or relative who was an engineer told them so. One student explained that working with the coding in MATLAB helped him understand the logic behind the mathematics better, but that was mostly connected to being careful with parentheses in both MATLAB and the online homework system. Another student explained that it was useful to see a computer simulation of the mathematics, but that they were confused overall by the labs. One student credited his positive experience with the labs as the reason for enrolling in a subsequent data-science course, even though the labs (which he found interesting) did not help him with the exam-related parts of the course with which he was struggling. Multiple participants were confused by the MATLAB coding and attributed their difficulties to their lack of coding experience or to feeling too rushed during the labs to learn how to code within MATLAB. Follow-up interviews with two of the participants revealed that less coding expertise was required than their first impressions suggested; the labs merely required students to change parameters or run parts of the existing code.
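To illustrate this low coding demand, the following is a hypothetical sketch in the style of the provided lab code (it is not taken from the actual labs): the model is already written, and a student's work consists of changing the parameter values at the top and re-running the script to observe the effect, here for a simple infection-spread model, one plausible context for this style of lab.

    % Hypothetical sketch in the style of the provided lab code (not an
    % actual lab). Students change the parameters below and re-run.
    beta  = 0.30;    % infection rate -- try 0.10, 0.50, ...
    gamma = 0.10;    % recovery rate  -- try 0.05, 0.20, ...
    S = 990; I = 10; R = 0;   % initial susceptible/infected/recovered counts
    N = S + I + R;
    days = 100;
    history = zeros(days, 3);
    for t = 1:days
        newInfections = beta * S * I / N;   % simple SIR-style update
        newRecoveries = gamma * I;
        S = S - newInfections;
        I = I + newInfections - newRecoveries;
        R = R + newRecoveries;
        history(t, :) = [S, I, R];
    end
    plot(1:days, history);                  % observe how the outbreak evolves
    legend('Susceptible', 'Infected', 'Recovered');
    xlabel('Day'); ylabel('People');

In this style, the mathematical work lies in interpreting the output, not in writing the program.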
Other students did not see the purpose of using MATLAB, thought it seemed like an overly complicated graphing calculator, and wondered whether the graphs could simply be provided on a printed handout.

Perception about Applications of Calculus Ideas

Interview participants were asked if the labs helped them understand or appreciate the applications of calculus. A common response was a short affirmation without detail, such as "yes", "oh yeah, for sure", "yes, kind of", and "yes, big yes". Some participants expanded with a short explanation, such as "Yes, I like the little scenarios" and "Yes, it makes it seem not as pointless". Another group of responses described specific ways that participants found the labs to be related to real-world applications. For example, interview participants reported that they liked seeing how computer simulations could be used in mathematics, appreciated that the applications helped them see how you can use math, and understood applications of calculus in other classes. In one observation, students were captured saying, "this is like what we're doing in physics". One interview participant offered a particularly detailed reflection:

It really opened my eyes up to the applications of math, what it means, what derivatives, integrals, sequences, what, how they all relate to like a lot of things within like the physical sciences and like really how it's just like a different way to describe things. And I think that's really, really interesting. And that's what I'm really looking forward to with CMSE 201 because that's really, it's, I mean it's modeling data with computers specifically with python. That's literally what the whole course is about. So, I'm excited to see where that goes and yeah, taking that in a little bit more and not. Yeah, I'm just excited for that.

Another group of interview participants explained that they were too confused by the labs to understand the applications of calculus, and multiple students specified that MATLAB and the coding were too confusing. Several follow-up interviews revealed that students who, at the beginning of the semester, felt positive about the labs helping them understand the applications of calculus later explained that the labs felt too disconnected from the rest of the course curriculum to be useful.

Survey Data

Select responses from the post-course survey in the Fall 2018 and Spring 2019 semesters are presented below. The Fall 2018 survey was sent to 154 students; 105 students responded, a 68.1% response rate. In the Fall 2018 survey, 13 students submitted two responses, and the first response was omitted from the data. The Spring 2019 survey was sent to 152 students; 92 responded, a 60.1% response rate. In the Spring 2019 survey, 13 students submitted two responses, and the first response was omitted from the data. There are different response counts for each question because several students on each survey left some questions blank. As mentioned in Chapter 3, the post-course survey was designed by the lab designers as a teaching-related data collection mechanism. The researcher provided input for the design but was not directly responsible for the authoring or administration of the survey.
Because of this, the survey is not directly tied to the research question for this dissertation, but instead provides rough insights about students' broad perceptions of the labs. Therefore, only a subset of the survey data is presented in Table 2 below. This selection of survey data illustrates that survey responses regarding the learning value of the labs tended to be negative, but responses related to the groupwork experience tended to be positive. This survey data provided insights about how to interpret the interview and observation data, especially in highlighting the importance of understanding students' negative reactions to the labs. There were only weak correlations between responses and grade outcomes, indicating that this is not a situation where high-achieving students had positive reactions and low-achieving students had negative reactions. The responses regarding the learning value of the labs tend to be negative across achievement levels. A complete tabulation of the survey data is available in Appendix G.

Question text                                 Term  N    Strongly  Agree  Neither  Disagree  Strongly
                                                         agree                               disagree
The Labs is a useful tool for learning        SS19  90   4%        8%     11%      20%       57%
calculus overall.                             FS18  104  3%        10%    14%      22%       51%
The Labs help you practice modeling.          SS19  91   3%        24%    20%      13%       40%
                                              FS18  103  6%        17%    25%      17%       34%
The Labs are important for your success       SS19  91   5%        12%    9%       21%       53%
in MTH 133.                                   FS18  103  4%        16%    13%      18%       50%
Working in groups on the Labs is enjoyable.   SS19  92   13%       36%    26%      8%        17%
                                              FS18  104  12%       42%    22%      12%       13%
Working in groups on the Labs helps you       SS19  92   11%       42%    17%      12%       17%
learn.                                        FS18  104  12%       38%    27%      13%       12%

Table 2: Post-course survey response data for a subset of survey questions. Response distributions are presented separately for each term that the survey was administered. The number of responses to each question is listed in the N column.

CHAPTER 5: DISCUSSION, IMPLICATIONS, AND CONCLUSION

This final chapter of the dissertation includes a discussion of the research findings, connections to the literature, implications for practice, and concluding remarks. The "Discussion of the Research Findings" section provides a summary and my interpretations of the findings. The "Connections to Literature" section explains how this study contributes to several threads of research literature, which were first presented in Chapter 2. The "Implications for Practice" section explains how this study informed and impacted the local implementation, and what insights can be garnered for teaching in other contexts. The "Conclusion" section includes my perspectives about the limitations and affordances of this study, and directions for future research.

Discussion of Research Findings

This study was designed to investigate the research question: How do students engage with computational labs in a particular Calculus II class? The data shows that engagement varied widely among the 20 participants interviewed and 16 groups observed. Some groups spent most of their time in whole-group discussion; other groups primarily worked silently, interrupted by short, focused discussions. Some students enjoyed the labs and found them interesting and challenging; other students dreaded the labs and found them confusing and frustrating. Some students appreciated the labs as an opportunity to explore applications of mathematics and learn computational modeling skills; other students complained about the lack of alignment between labs and exams and found the labs to lack value.
There was, however, one important consistency across the data: most students were engaged, behaviorally and cognitively, during the labs. Every group observed worked to complete the labs as intended; few students engaged in off-task behaviors, and most students put forward some mental effort to contribute to the development of their groups' ideas. The groupwork looked different in every observation, but every group still completed the lab.

Based on this brief summary, the reader may wonder, "so now what?" Recall that this study was designed to capture as broad a range of student experiences as possible; I conducted as many interviews and observations as possible, across the span of the entire semester, and valued singular data points. This study was successful in capturing a wide range of experiences, demonstrating substantial variation in how students engaged with the computational labs. This variation has important consequences for how mathematics educators understand lab-type activities, as explained in the four claims below.

Claim 1: Wide Range of Behavioral Engagement Modalities Demands Attention

Students were observed engaging with the labs by discussing their thinking, tinkering with code individually, co-authoring solutions, and silently observing group discussions. The various "group engagement modalities" defined in Chapter 4 describe different ways groups collaborated, and the engagement timelines provide a visual representation of what active learning "looks like" for three groups that exemplify the contrast. Observation 2 involved a group that engaged in whole-group discussion for a large portion of the class. Observation 7 involved a group that engaged in a mixture of split discussion, parallel work, and whole-group discussion. Observation 3 involved a group that worked silently for a large portion of the class. Each of these groups was productive towards completing the lab, and thus the variation of behavioral engagement is evidence that collaborative learning can "look" different between groups and that there is not necessarily an ideal engagement modality.

The identification of this variation of behavioral engagement modalities can inform researchers, curriculum designers, teaching coordinators, and instructors. Researchers should account for the observation that productive groupwork in an active learning setting can involve silent reflection and individual work, as well as discussion—"small group discussion" is not equivalent to "active learning", "groupwork", or "collaboration". Curriculum designers could indicate which behaviors are intended or useful for a particular task, either by directly prompting students to engage in particular behaviors or by making suggestions in a teaching guide. For example, a task might include the instructions: "Individually explore how changing this parameter affects the simulation, then discuss your findings with your group members to pose a conjecture." Teaching coordinators could address the range of behavioral engagement their teaching team might encounter in the classroom and discuss how to facilitate certain behaviors. Instructors can benefit from being aware that "active learning" involves various behaviors so they can more wisely facilitate activity in the classroom. In particular, this study shows how individual silent work is an important component of group work for some students, and it can lead to short, rich discussions even when it may appear that students are not working together.
Claim 2: Negative Emotional Engagement Does Not Cause Disengagement

Interview participants reported that the labs exposed them to interesting applications of calculus, that they enjoyed the variety the labs provided, that they appreciated working on challenging problems, and that they found the experience using MATLAB to be useful. Several interview participants expressed that they were frustrated when they could not finish or fully grasp the labs because they wanted to "figure it out." These findings suggest that the labs are a useful teaching tool for inspiring curiosity and exposing students to interesting applications of mathematics. It is likely, however, that this group of interview participants is not representative of the Calculus II student body, because they volunteered for an interview rather than ignoring the email solicitation and devoting their time to scrambling to keep up with their classes.

The broader sentiment was a more negative reaction that was especially pronounced in the post-course survey results. Students' perceptions that the labs were difficult to complete during class, were challenging to earn full credit on, and were unrelated to preparing for exams were some of the factors that contributed to the negative reaction. These factors are related to a larger issue: the labs were perceived as a barrier to earning a high course grade, which is many students' primary concern, especially because of the impact Calculus 2 grades have on admission to or success in students' academic programs. A lack of clarity about the purpose of the labs within the curriculum, or about how to improve lab performance, compounded the negative emotional reaction.

These negative feelings, however, did not appear to disrupt students' behavioral or cognitive engagement in the classroom, because students continued to put forward effort to complete the labs through the end of the class. These findings suggest that negative emotional engagement does not necessarily lead to overall disengagement or undermine students' motivation. In fact, it may be the case that the design and implementation of lab-type activities need not provide students with a positive emotional experience, and it might be expected that students will report negative feelings towards activities that generate opportunities for productive struggle. Many learners of mathematics can remember a time when learning through struggle was not fun or enjoyable in the moment but was fulfilling after the struggle was resolved; this may be the trajectory of developing a productive disposition. Calculus 2 students may have had more than a decade of experiences of being successful with mathematics that focused on direct instruction and rote calculations. They may not have experience with group-based tasks that are designed to be complicated and challenging, that draw on knowledge from different group members, and that may not be possible to complete individually.

Many Calculus 2 students hope to pursue engineering and are acutely aware that their grade in Calculus 2 is important for admission into an engineering program. For these students, experiences of productive struggle with Calculus 2 labs may feel less like rich learning opportunities and more like unnecessary barriers to their engineering goals, especially when Calculus 2 is a prerequisite class outside of the core of their major, leading to the negative emotional reactions captured by this study.
Claim 3: Intrinsic Motivational Factors May Be Less Important Than Suggested

Cognitively, a large majority of students were invested in the labs during class, meaning they were putting forth a good-faith effort to progress through each of the questions, discussing their thinking with their group members, and collaboratively generating the answers they wrote on their group document. It was rare to observe students engaging in off-task behaviors for more than a couple of minutes during a lab session. There was a small group of students who identified intrinsic learning goals associated with the labs, such as wanting to gain experience with MATLAB per the suggestion of an older sibling, and who were deeply invested in extracting a rich learning experience from the labs. Many students, however, were simply invested in completing the labs because they were a required assignment. The labs being graded provided extrinsic motivation for the rest of the students, resulting in the overall productive cognitive engagement that was observed. It is important to note that these forms of cognitive engagement are not presented as a hierarchy, nor is the purpose to judge which form is "better" and which is negative or deficient. To the contrary, these are all productive forms of cognitive engagement because they all led students to approach the labs as intended, which resulted in a rich learning experience.

Middleton and colleagues (2017) posed the question, "Should teachers rely heavily on intrinsic reward as opposed to reward extrinsic to mathematics, as suggested by much of the research?" (p. 669). Maybe not. This study found that Calculus 2 students are able to engage with challenging tasks even when the main motivation is extrinsic. Several interview participants shared that they completed the labs because they were graded, and others completed the labs simply because they were an assignment and they try to complete all of their assignments. For these students, the primary cognitive engagement was tied to generally being a good student and doing their assignments; they did not necessarily have intrinsic motivation tied to specific learning goals, even if they recognized that the labs might improve their coding skills. They certainly did not identify that they were invested in completing the labs because they hoped the labs would help them build a productive disposition or develop their ability to make conjectures and support them with evidence. However, nearly all of the observations revealed students engaged in those practices, because those practices were required to complete the tasks, which implies that they learned some of the knowledge and skills the tasks were designed to emphasize. It is an open question whether the cognitive engagement would have been different if there had been a more pronounced intrinsic motivation.

In other words, even though some students might not have the conscious intention to learn how to use MATLAB to model Calculus 2 concepts, they learn some of those skills anyway because they have to do the things they are supposed to learn, like analyzing the results of code they have run. One student exclaimed during a lab, "Oh, that's what convergence means!", illustrating how the labs can help clarify calculus concepts, but it is unlikely that this student entered the classroom with the intent to develop a better conceptual understanding of convergence.
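As a hypothetical illustration (not taken from the labs) of how running code can make convergence concrete, consider plotting the partial sums of a convergent series in MATLAB and watching them level off at the limit:

    terms = 1 ./ (1:1000).^2;      % first 1000 terms of the series sum(1/n^2)
    partialSums = cumsum(terms);   % running total after each term
    plot(partialSums); hold on;    % the curve flattens as more terms are added
    plot([1, 1000], [pi^2/6, pi^2/6], '--');   % the limit, pi^2/6
    xlabel('Number of terms'); ylabel('Partial sum');
    legend('Partial sums', 'Limit \pi^2/6');

A student who runs this kind of code sees the partial sums settle toward a fixed value, which is the observable meaning of convergence, whether or not that was the concept they set out to learn.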
Intrinsic motivation is related to the alignment between students' goals and the perceived learning opportunities, and it is difficult (or impossible) to bring those into alignment when students have many other factors influencing their academic goals, like admission into academic programs in engineering. So, while it may be beneficial for teachers to try to inspire intrinsic motivation, it is worth recognizing that doing so may be difficult and that extrinsic motivation can still lead to positive outcomes.

Claim 4: Multiple Forms of Data are Required to Understand Student Engagement

This study captured variation in student engagement that was highlighted by the contrast between the interview, observation, and survey data. The survey captured negative student perceptions, summarized in Table 2, with more than 70% of students indicating that they disagree or strongly disagree with the statement that "The Labs is [sic] a useful tool for learning calculus overall." In contrast, multiple interview participants shared that the labs were helpful for their learning experience, helped them see the connections between calculus concepts, helped them understand realistic applications of calculus, and provided an opportunity to work on interesting mathematics during class. Some interview participants reported that they were confused during most of the labs, but observations revealed that the groupwork setting provided structure that helped most groups resolve their confusion during class. Furthermore, observations revealed that most students were engaged with the intended mathematical practices during the labs, which would imply that they were learning those practices, according to Doyle's (1983) reasoning that engagement is the context of learning. This contrasting data is evidence that any one type of data provides an incomplete picture of the engagement that the labs facilitate and the learning opportunities that they present.

There are several practical implications of this claim. First, we should be careful when drawing conclusions from survey data, and we should be mindful of the context of the survey. The survey used for this study was conducted at the end of the semester, with the final exam looming, and the labs were not designed to align with the content of the final exam. At this point in students' experience, their conception of "learning calculus" was likely closely tied to their goal of preparing for the final exam, and the labs were not helping them advance that goal. This contextual assumption does not invalidate the survey data, but rather suggests that we should be careful about making assumptions based on survey data about how curriculum or teaching affects engagement, especially when evaluating a curriculum element that is challenging. Second, we should be mindful that students who volunteer for interviews are not representative of the entire class population. These students may tend to be more intrinsically motivated overall and thus more likely to value challenging academic tasks that align with their intrinsic learning goals. Third, video-recorded classroom observations in a Calculus 2 lab setting may be unlikely to capture negative emotional or cognitive engagement. Students in this setting seemed to be generally focused on following the directions to complete the lab, and the only indications of negative emotional or cognitive engagement were short, rare utterances.
The act of video-recording the observations may have inspired more positive engagement and stifled expressions of negative engagement, but this still highlights the shortcomings of any single data collection method. Ultimately, the results of this study imply that multiple data collection methods are required to capture the breadth of student experiences.

Connections to the Literature

This study is situated within at least four threads of the literature, which contributed to the design of the study and are advanced by the findings. First, this study served to locate and adapt engagement theories within the context of college students learning Calculus 2. Second, this study contributed to understanding active learning in college mathematics, specifically with lab-type activities, which are one particular format of active learning. Third, this study informs the curriculum reform literature by contributing to the conversation about curriculum recommendations and their relation to persistence in STEM. Finally, this study explored tasks that require higher-level mathematical and interdisciplinary thinking.

Contribution to Engagement Research

This study is an example of how to study student engagement in a college mathematics context, which provides insights about the research method and what this research approach can contribute. This study illustrates how examining student engagement is a productive lens for understanding students' learning experiences. The study also exemplifies how the approach may produce findings that are more generalizable to other teaching contexts. This study also advanced the multi-dimensional conception of engagement by detailing the behavioral, emotional, and cognitive dimensions in the context of college mathematics learning.

This study illustrated how examining student engagement is a productive lens to understand students' learning experiences, especially with novel tasks that involve novel learning goals. This approach was useful because it allowed me to understand the learning experiences facilitated by the labs without trying to measure individual attainment of evolving learning goals, especially in a collaborative setting. It is not clear how group-based activities can be assessed at the individual level, especially when the goal is for students to interact with a group to draw on the different strengths of different group members to accomplish something together that none of the individuals could complete alone. In this situation, the intent is NOT that every student should internalize 100% of the task, but rather that the group accomplishment is greater than the sum of the parts. The research lens provided a way to systematically explore what was happening in the classroom and in students' minds without being constrained to measure things that fit within an individual post-course exam or survey. Measuring is especially problematic when tasks and implementation are in the process of being developed (i.e., during the "pilot" phase), as was the case with this research.

The study also exemplifies how the approach may produce findings that are more generalizable to other teaching contexts. Currently there is a weak link between disparate lab-type reforms because reports about improvements to student outcomes in those specific settings are not transferable—knowing that students who completed a specific set of labs improved on a specific final exam question is not transferable outside of that context because the tasks will not be implemented in another setting verbatim.
The findings produced by this study are transferable to other contexts because they illuminate the learning process by detailing the nuance of student engagement with the tasks.

Understanding the engagement dimensions. This study advanced the theoretical approach to studying engagement by detailing the multi-dimensional conceptualization of engagement within the context of college mathematics teaching and learning.

Behavioral engagement. The theory about behavioral engagement was advanced by expanding the provided definition of "the patterns of participation students display with others" (Middleton et al., 2017, p. 669). This study found that specific, distinct patterns of behavioral engagement exist (the modalities of behavioral engagement presented in Chapter 4), and the presented timelines illustrated how those modalities play out in-the-moment to explain what different patterns of group engagement might "look" like.

Emotional engagement. In terms of emotional engagement, Middleton and colleagues (2017) ask, "with respect to emotional engagement is it sufficient or even desirable to aim toward fostering positive feelings and avoiding negative ones, for example, by offering encouragement and providing success experiences?" (p. 669). This study found that negative emotional engagement, specifically around feelings of frustration, may be unavoidable when students are confronted with tasks that are challenging, novel, and different from traditional activities that focus on rote practice of a fixed solution method. Furthermore, the negative emotional engagement can be amplified when the tasks are perceived as "more work for the same grade" in comparison to non-lab sections running in parallel, or even in comparison to students' preconceptions about what should happen in Calculus II classrooms. These feelings of frustration are not necessarily counter-productive and can exist alongside feelings of curiosity or interest in a realistic application of mathematics—students found the zombie apocalypse lab interesting but were also frustrated by the mathematics of the modeling.

Cognitive engagement. In terms of cognitive engagement, Middleton and colleagues (2017) state, "The presence of a range of desires or objects of engagement does not necessarily detract from the effectiveness of engagement in learning mathematics" (p. 668). This study revealed that there were productive outcomes connected to both intrinsic and extrinsic motivational factors. Instead of placing these factors that influence cognitive engagement on a hierarchy to distinguish between engagement and disengagement, it is more productive to recognize the different motivational factors that can lead to students being invested in completing a task, and how those factors are connected to emotional engagement. Students who were more intrinsically invested in the labs tended to have more positive emotional engagement, but extrinsically motivated students with negative emotional engagement still contributed to their groups' progress through the labs and were observed engaging in the intended mathematical practices. Students cannot "go through the motions" of modeling or making conjectures if they need to present their reasoning to their group members. In other words, students did the practices because the tasks required them, even if they did not enjoy them or were not trying to learn modeling.
It was important to link the observation of mathematical practices to cognitive engagement to capture this nuance, because it is difficult to ascertain through the interviews or survey what students' cognitive engagement entailed.

Understanding Different Design Goals

This study provides an example of how to examine a curriculum element that was designed to facilitate active learning targeting novel learning goals, but was not designed to improve students' grades. The lab content was not captured by quizzes and exams, and the labs were group activities that involved only a group grade. No individual assessment data was collected, because an emphasis was placed on making the labs a collaborative experience. Furthermore, the lab designers were explicitly trying to avoid changes to the overall grade distributions in the course. As planned, course grade and exam outcomes were unchanged by the lab implementation, and the labs seemed to have little or no impact on individual grade outcomes (Krause et al., 2019). Finally, there is no concept inventory that can be used to measure learning outcomes associated with the lab goals, akin to the Calculus Concept Inventory (Epstein, 2013) that is used to measure precalculus and Calculus 1 conceptual understanding. Therefore, student engagement was examined because it provided a lens to understand the learning experience qualitatively when there was not a readily available metric to measure learning outcomes quantitatively.

This kind of engagement research can be considered second-generation research, as called for by Freeman and colleagues (2014), to inform course design, understand the impact of different types of active learning for different populations, and develop effective instructional techniques for active learning. The first-generation studies that were reviewed for that work involved measuring the improvement to grades associated with active learning, for implementations of active learning targeted at improving those measures. This dissertation represents a different research goal and provides an example of how to understand in-process curriculum design projects that are intended to change the nature of students' learning experience in ways that are not directly related to exam scores and course grades. Similarly, this study also calls into question general claims that active learning will narrow achievement gaps, as was questioned by E. Johnson and colleagues (2018). The previous research serves an important role in inspiring change by providing actionable changes that instructors can implement by incorporating active learning, but the nuance of the implementation and the assessment of learning is of utmost importance to make specific claims that active learning in a particular setting improved student outcomes.

Attending to Local Data

Chapter 11 of the Insights and Recommendations from the MAA National Study of College Calculus (Zazkis & Nunez, 2015) explained that successful calculus programs attend to local data to guide curricular and structural modifications. The goal of that chapter is to provide recommendations about the conditions that facilitate continual teaching improvement in a department: frequent attention to planning ongoing improvements, awareness and responsiveness to changing needs, participation of faculty, and attention to local data.
The chapter emphasizes the fourth of those conditions, and the authors claim to "give the reader a sense for what types of data he or she may wish to begin collecting at his or her institution and the types of program initiatives that these data may provide the impetus for" (p. 123). Notably, that chapter does not provide recommendations about collecting data to inform specific curriculum development efforts within the calculus curriculum, but instead focuses on data collection toward the purpose of motivating change. I argue that while these recommendations are important, there is a lack of guidance about how data (specifically qualitative data) can be employed to guide the development of reformed instructional activities. In the context of this dissertation, attention to local data (feedback from faculty and students in the Engineering department) provided an impetus to develop the labs, and this research informed the implementation.

One way that scholars have attempted to understand the impact of shifting to more active learning is by searching for patterns in student achievement data such as DFW rates, course grades, exam scores, and concept inventory scores. These achievement studies examine the relationships between quantitative measures of student achievement and various instructional variables. Many studies related to active learning are achievement studies, constructed as follows: (1) design a curriculum and/or teaching reform to incorporate active learning, and (2) track student outcomes and compare them to those of students with the traditional curriculum and/or teaching. Achievement studies are important because they can provide evidence to support the argument that we must stop lecturing in undergraduate mathematics courses. It is also important that this data cleanly fits within the framework of the scientific method—there is a control (traditional teaching practices), a treatment (active learning), and an observable, repeatable measure (student outcomes). This kind of evidence is convincing to a wide audience of scientists, engineers, and mathematicians who, as a group, have been committed to traditional lecture structures in the past and value the experimental design of these studies.

Another recommendation described in Chapter 11 of the Insights and Recommendations concerns student success in subsequent courses. In the context of this study, success in subsequent classes is more distant, and thus complicated to understand. The goal of the lab design project was to enhance upper-division students' ability to employ calculus in Engineering contexts one or two years after their calculus experience, after completion of the entire calculus sequence and several Engineering courses. In this context, it is difficult to measure the impact of the lab project on student success in Engineering courses because doing so would require a costly longitudinal study, and it is unlikely that the exact impact of Calculus II could be isolated from students' experiences in the other subsequent courses. Therefore, it is important to promote research methods that can provide some of these insights, which this study does. For example, the findings suggest that students do achieve some familiarity with MATLAB (used in subsequent Engineering courses) and some understanding of how Calculus II content can be used for computational modeling, even if those effects are not measured in subsequent Engineering course outcomes.
Active Learning

As mentioned in the Literature Review (Chapter 2), this study responds to Freeman and colleagues' (2014) call for second-generation research about active learning to inform course design, develop effective instructional techniques for active learning, and understand the impact of different types of active learning for different populations. As design-based research, this study was situated within an iterative design that informed local practice, while aiming to understand student engagement with the labs to develop generalizable theory about designing and implementing lab-type activities in college mathematics classes. The contributions to broad literature threads are discussed here, and implications for local practice are discussed in the "Implications" section that follows.

This study found that the labs provided a collaborative learning environment that emphasized active learning and enriched the learning experience for many students. The observations captured students engaged with modeling realistic situations that involved applications of calculus content and higher-level mathematical practices, which are typically absent in traditional recitation-type classes that focus on an instructor-led review of rote calculation exercises. It is notable that this study examined classrooms with graduate student instructors who were not involved in the design of the labs and that the labs were implemented as planned—there were no observations or reports of instructors adapting the labs into an instructor-led lecture presentation. In this sense, Design Principle 1 (prescribe active learning, specifically collaboration) was achieved. Broadly, this is evidence that lab-type activities can facilitate active learning without requiring significant professional development or placing a huge demand on instructors. In fact, observations revealed that students spent little time interacting with the instructor, perhaps only during a brief whole-class launch at the beginning of class, wrap-up comments at the end of class, or when the group asked the instructor a few questions. A large portion of the observations revealed that class time was primarily spent working and discussing with group members.

These findings are especially enlightening for understanding the challenge of implementing sustainable change in college mathematics classes, namely that reform efforts often fail or are forgotten as the faculty leading reforms are rotated out of the course (Reinholz et al., 2018). Implementing a lab-type intervention provides a mechanism to structurally prescribe active learning that is less dependent on the uptake of modern teaching practices by individual instructors. Lab-type activities make it more obvious, to the instructor, that certain activities should be left for students to do, because the lab can specifically prompt students, for example, to run a segment of code and report the results. Consider Exercise 3 from the Baseball Rocketry Lab: "summarize what you did to solve this problem, including whether you did it using pen-and-paper or computer simulations." This is a straightforward prompt for students to explain their work and thinking, and it would feel strange, to the instructor, to solve the problem at the front of the room and then ask students to complete this summary.
This can be contrasted with instructions for a traditional review-type recitation, where an instructor might be directed to “help students review this set of exercises”, which can be taken up by the instructor as an instructor-led, lecture-type presentation of the solutions to those exercises. Also, this study found that student-teacher interactions were generally short conversations (3-5 minutes) occurring a few times throughout the class, meaning that students spent a majority of the class interacting with their group rather than with the instructor. The instructor’s role was located more in the background than is typically observed in a traditional classroom, where the instructor can lead the conversation and solution presentations throughout class.

Enriching Undergraduate Courses with New Learning Goals

The labs were designed to enrich the curriculum with realistic and relevant applications (Design Principle 2) and to incorporate modern mathematical practices, such as data analysis, coding, modeling, using computational tools, and communicating ideas mathematically (Design Principle 3). These design principles are closely related to several threads of the literature related to reforming the undergraduate mathematics curriculum to include modeling, higher-level learning goals, mathematical practices, and interdisciplinary skills; those connections are explored in this section. This study found that the labs did facilitate student engagement aligned with these principles. This engagement is evidence that the labs provided an opportunity for learning experiences that satisfied these principles for some students, but this study was not designed to measure the extent to which this was accomplished. Further work is needed to understand the impact of the labs among the broader student population. These findings suggest that implementing lab-type activities may be a pathway to reform courses and curriculum in other contexts, though further inquiry is needed to understand the impact in those contexts.

Modeling in calculus. The lab design project is an example of how modeling can be incorporated into the calculus curriculum, and this study illuminates aspects of lab design and implementation that curriculum developers and instructors might expect and attend to. The observation data suggests that the labs did facilitate engagement with modeling for the groups that were observed, and many interview participants reported that the labs helped them see the connections between concepts and understand the realistic applications of calculus. One interview participant reported that he enrolled in a computational modeling course after his experience with the labs, in spite of his overall struggle with Calculus II; this is an inspiring finding, suggesting that labs have the potential to facilitate a positive experience with realistic applications of mathematics through modeling, and that these kinds of experiences can support persistence in STEM. This suggests that labs may be a way to support positive experiences with mathematical modeling and counteract some of the factors that lead many students to leave STEM fields after negative experiences in calculus courses, thus addressing the switcher–persister imbalance described by the MAA Calculus Studies (e.g., Ellis, Rasmussen, & Duncan, 2013).
This observation data should be critiqued because the act of observing the groups may have inspired engagement that might have been different if the students were not being watched and recorded, and the interview data should be critiqued because the interview participants were not necessarily representative of the broader student population. Nonetheless, the findings suggest that lab-type activities may be a way to incorporate a modeling experience that can make the Calculus 2 curriculum feel more interesting and relevant.

The modeling can be lost on students if their experience is dominated by feeling overwhelmed, lost, or out-of-place. Some participants reported feeling overwhelmed by the overall challenge of completing and earning a good grade on the labs, and others specifically identified their lack of confidence or experience with coding as a barrier to engaging with the labs. Some participants reported feeling lost when they could not see how to advance through the lab or did not understand the solution; some of these students reported that they might find the labs more useful if they could “get it”, but ultimately were left with unresolved confusion and a negative experience with modeling. Other participants reported feeling out-of-place, such as one student, a secondary education major, who wondered whether the labs were designed for engineering students who could understand them better.

Students may also have negative reactions to labs that involve modeling if the sections that involve labs are run in parallel to “traditional” sections that involve classroom activities like additional examples and review. When students are aware that these two options are available for earning the same credit in the course, they can feel that they are doing more work for the same grade, or even that they are disadvantaged by not getting a traditional review each week during recitation. These findings are similar to those reported by Flint and colleagues (2011), who found that running college algebra sections that emphasized modeling in parallel to traditional sections was problematic.

Higher-level collaborative learning goals. The lab project is also an example of how higher-level, collaborative learning goals can be incorporated into the calculus curriculum, and this research provides some insights about implementing lab-type activities. One barrier to implementing activities that target higher-level collaborative learning goals is that these goals are not well-aligned with traditional assessment practices. At the research site, it is common for individual, timed quizzes and exams to constitute a large portion of the course grade. This grade emphasis affects students’ engagement with the labs because they accurately determine that the labs have only a small impact on their course grade: ultimately, they must earn high scores on traditional exams that focus on calculation techniques to earn a high grade. It is difficult to incorporate the lab content on these quizzes and exams because the lab activities are not well-suited for individual, timed work; the very nature of the labs is that they rely on collaboration for groups to achieve more than the individuals could working alone. Another barrier is that these activities can draw a negative reaction from students if they perceive that the activities add to their workload without adding to their grade.
This research found that some students felt that the labs required them to do more work than students in sections without the labs, and that the labs may have hurt their progress toward earning a high grade. These findings are very similar to those reported by Flint and colleagues (2011).

Recommendations of the Curriculum Foundations Project

The Curriculum Foundations Project (Ganter & Barker, 2004; Ganter & Haver, 2011) presents recommendations from the disciplines served by mathematics courses about what aspects of mathematics are important for students in their discipline. The lab design project is an example of how these recommendations can be incorporated into the calculus curriculum to involve more interdisciplinary skills, and this study illuminates aspects of lab design and implementation that curriculum developers and instructors might expect and attend to. As an exploratory study, this dissertation does not attempt to identify the extent to which the recommendations are addressed, but the findings that students were engaged with the labs are evidence that the labs did provide some learning opportunities related to the recommendations.

Some of the recommendations are implicit to the lab project as a whole; the italicized phrases in the list below align with the recommendations in the Curriculum Foundations Project:
• The labs were activities that emphasize mathematical modeling.
• The labs were inspired by interdisciplinary cooperation between Mathematics and Engineering faculty.
• The labs emphasized skills and knowledge beyond rote calculation.
• The labs were implemented as group activities to encourage active learning.

Other recommendations have more explicit ties to the tasks themselves. Table 3 illustrates specific prompts within the labs that are aligned with several of the recommendations.

Recommendation: emphasize conceptual understanding
• Baseball Rocketry Lab Exercise 1: Which is larger? The Δv gained from the first baseball thrown or the Δv gained from the last baseball thrown? Give an intuitive explanation.
• Boom Bust Butterfly Lab Exercise 1: Why do we expect R(X) to be a decreasing function? Why should R(X) always be non-negative?

Recommendation: emphasize problem-solving skills that include applying familiar mathematics in novel settings
• Trials and Tribble-ations Exercise 1: Why is it that the faulty zoom lens can cause the strategy given in the section “Mathematical Explanation” of the Lab Document to fail? Provide a position for the tribble to hide and a sequence of scans that would lead you to the wrong conclusion.
• Boom Bust Butterfly Exercise 3: When v < 1, does the convergence toward market equilibrium happen faster for less volatile markets or more volatile ones? Give an intuitive explanation.

Recommendation: emphasize communication, including writing logical arguments in words
• Trials and Tribble-ations Exercise 2: Describe your algorithm by filling in your responses below.
• Trials and Tribble-ations Exercise 5: What do you notice that is different qualitatively between the v = 2.8 case versus the other three cases? What does this say about attempts to forecast stock prices, in a volatile market, based on incomplete information?

Table 3: Alignment between Curriculum Foundations Project recommendations and lab tasks.

Implications for Local Practice

This study informed the iterative design of the labs.
Contributions were made to the lab designers’ understanding about how students engaged with the labs, even during the data collection and preliminary analysis phases, before the research was complete. In the year following the data collection for this dissertation, the fourth year of the project (the 2019–2020 academic year), the labs were being run at-scale, meaning that the labs were being used in all 24 sections in the fall semester and all 38 sections in the spring semester, which was ultimately interrupted by the COVID pandemic.

One improvement inspired by this research was that pre-lab activities were implemented in the semester following data collection, the fourth year of the project. These activities involved students watching a short (5-10 minute) video that explained some of the context of the lab, downloading the MATLAB files, running the first instance of code, and submitting the results online. One goal of this pre-lab activity was to address situations where some students spent the first 5-15 minutes of class just getting started. Another goal of this pre-lab activity was to provide a more consistent instructor launch, that is, a short explanation about the context and goal of each lab, which was found to be lacking in some classrooms.

A second improvement inspired by this research was that a lab practicum was piloted in the semester following data collection, the fourth year of the project. This was an assessment mechanism designed to increase the percentage of the course grade that the labs constituted, consequently reducing the weight of the final exam. It was decided, partially informed by this research, that the lab experience was valuable and should be represented more prominently in the grading scheme. The lab practicum was designed as a two-part assessment, with the first part involving reflection questions about the previous labs and the second part involving a modeling exercise. The goal of the reflection component was to inspire students to look back on their lab experiences and take away some knowledge about the modeling skills they had developed. The goal of the modeling component was that students would be able to showcase the coding and modeling skills they had developed by individually completing a task that was simpler than those intended to be completed in groups during the regular labs.

A third way that this research contributed to the local teaching context is that it situated the lab design project within current literature. This provided a way to clarify the design principles of the project and draw on existing knowledge about ways to understand lab-type curriculum design. A fourth way that this research contributed to the local teaching context is that the study contributed to a continuing shift in the department’s teaching culture by involving faculty in, and exposing them to, this kind of educational research. This influence is discussed by Krause, Maccombs, and Wong (2020) in the presentation of the preliminary findings of this research. Unfortunately, the pandemic terminated the lab design project, but hopefully this dissertation will provide inspiration to resurrect the labs and the rich modeling experience they provided students.

Conclusion

This research sought answers to the questions “How can the labs work?” and “What should we pay attention to when designing and implementing labs?” It was exploratory, descriptive, and theory-generating. This study was not trying to answer “Are the labs working?” or trying to measure the effectiveness of the labs.
We should expect that there are many successes and failures throughout an iterative design process as rich as this lab design project, so it is important to learn something at every step along the way.

To answer “How can the labs work?”, we can consider the most positive outcomes. For students who were intrinsically motivated to engage with the labs because they wanted to improve their coding skills or enjoyed the challenge of realistic applications of calculus, the labs provided a rich opportunity to engage with mathematics and computational science. For students who were engaged with the labs because they were required, graded assignments, the labs still provided a collaborative experience that exposed students to MATLAB coding and mathematical modeling. To answer “What should we pay attention to?”, we can consider the rich data generated by this study. We can understand students’ learning experiences with the labs by examining engagement, which reveals what students do, why they do it, and how they feel about it.

The lab design project and this dissertation suggest that design-based research is a way to change undergraduate mathematics, which is notoriously resistant to change (Reinholz et al., 2018). The DBR focus on designing, researching, and improving “interventions” provides a pathway for evolutionary change that incrementally improves the curriculum. This approach is in contrast to the revolutionary change that is required to install an entirely new curriculum, such as the Project DIRACC approach to calculus (e.g., Thompson & Dreyfus, 2016). Installing labs can be a productive way to enrich the curriculum as an add-on, rather than shifting the focus of the entire curriculum. Especially when the goal is to target higher-level learning goals, labs provide a pathway to experiment with new learning activities without tearing apart the existing assessment mechanisms in place in large-scale courses.

Limitations of this Research

Here are several critiques of this study, which can guide future research.

First, the research design, being an iterative process, was not always focused on engagement. The design of the interview protocol and the survey items could be improved by revising them to focus more explicitly on engagement. There were many interview discussions and several survey items that addressed logistical considerations of the labs. Some of this data was useful for informing context-specific teaching changes, but little of it was useful for this dissertation research focused on understanding engagement. A strengthened interview protocol would also help the interviewer avoid leading questions, which was a problem for me in some of the early interviews.

Second, the interview participant pool was limited by convenience because I interviewed every student who voluntarily responded to my solicitation. A future study could recruit participants who represent portions of the population that may not have been included in the interview pool. For example, there were few interviewees who expressed the strong negative feelings about the labs that were reflected in the post-course survey.

Third, the observation data was limited because students were aware that they were being recorded. The effect of being observed could have led to engagement that was different from the engagement in the classroom when an observer was not present.

Fourth, the approach to data analysis could have been improved.
The breadth of the data collection was effective for capturing diverse student experiences, but the volume of data proved to be too large for effective coding and analysis. One alternative is that a two-phase analysis process could have been employed to quickly screen the interviews and observations to identify a subset of cases to examine in detail.

Finally, both the interview and survey data are self-report data, an indirect representation of students’ actual engagement that is filtered by students’ perceptions about the labs and the alignment between their own goals and the learning goals of the labs.

Future Directions

This dissertation can inform several future research threads. First, there can be continued work defining the behavioral, emotional, and cognitive engagement dimensions, and categorizing the components of each dimension. In particular, future research can further clarify cognitive engagement, specifically the relationship between students’ perceptions about a learning activity and their intrinsic and extrinsic motivation related to the activity.

Second, future work can explore the relationships between various modalities of group engagement and active learning. This research presented one way to categorize and visualize individual-level and group-level behavioral engagement, and future research can expand on this work to understand how those different behavioral engagement modalities facilitate different kinds of learning.

Third, there can be continued work to understand how negative emotional and cognitive engagement impact students’ learning experience. This research found that students can stay on task and complete their work even when they are having a negative experience and relying on extrinsic motivation factors. Future work could explore how learning outcomes vary among students with these negative experiences versus students who have positive experiences associated with intrinsic motivation factors.

Fourth, future work could explore how engagement studies can be tied to achievement studies, such as the studies referenced by Freeman and colleagues (2014) that found grade improvements associated with active learning, to play a complementary role in understanding ongoing reform efforts. Engagement studies can serve as the first line of inquiry while curriculum development is ongoing, which can inform the development of achievement studies that can be employed to understand the impact of the curriculum after the development has stabilized. For example, engagement studies could inform the development of assessment measures for curriculum design projects by identifying the knowledge, skills, and practices that those curriculum projects facilitate in the classroom. These assessment measures could include individual assessments, such as quiz and exam items, as well as group assessments like a lab practicum. These assessment measures could also include concept inventories, like the Calculus Concept Inventory (Epstein, 2013), that could align learning measures across studies. Ultimately, this dissertation can inspire future researchers to attend to engagement as one of the ways to understand students’ learning experiences.

APPENDICES

APPENDIX A: SURVEY

Administration of survey: The survey was administered as a pre- and post-course survey using Qualtrics and a link provided on the course website. Students were informed about the survey by an email from the course supervisor and were also reminded by their instructors during class.
Students received extra credit for completing the survey (as is typical for the course), but were asked to consent to their survey responses being used as research data.

About this Survey

This survey will ask you about your mathematical experiences in calculus, your perceptions of math, and your time commitments this semester. This survey is confidential. Your name and email will be used by a departmental administrator so that we can identify who has taken the survey. Your instructor may request the results of the survey, but all personally identifiable information will be removed prior to sharing the results with them. Accurate data is very important to us. Future departmental decisions and policies may be based on the results of this survey. Therefore we ask that you please electronically sign the statement below:

I will answer the following survey questions truthfully and to the best of my knowledge. [Electronic Signature box]

Consent for Research Participation

Separate from the request to complete this survey for course improvement, you are also being asked to participate in this survey for research. Researchers are required to provide a consent form to inform you about the research study, to convey that participation is voluntary, to explain the risks and benefits of participation, and to empower you to make an informed decision. You should feel free to ask the researchers any questions you may have.

Study Title: Student Engagement with Labs in MTH 133
Researcher: Andrew Krause
Email: krausea3@msu.edu
IRB #: STUDY00001288

Check the box below to indicate that you agree to participate in a study about students' experiences in MTH 133. By agreeing to participate in the study, you verify that you are at least 18 years of age. Your participation in the study entails completion of this survey. Participation in this research project is completely voluntary. You have the right to say no. You may change your mind at any time and withdraw. You may choose not to answer specific questions or to stop participating at any time. Whether you choose to participate or not will have no effect on your grade or evaluation. The study will have no cost. Do you agree to participate in this study? [Yes/No]

If you have questions or concerns about your role and rights as a research participant, would like to obtain information or offer input, or would like to register a complaint about this study, you may contact, anonymously if you wish, Michigan State University’s Human Research Protection Program at 517-355-2180, Fax 517-432-4503, e-mail irb@msu.edu, or regular mail at 4000 Collins Road, Suite 136, Lansing, MI 48910.

Written Responses
1. [pre-survey only] Why did you decide to enroll in MTH 133? Describe the factors that influenced your decision.
2. [post-survey only] How has MTH 133 informed your view about the role of mathematics in your current career goal?

Student Status
3. For which MTH Course are you filling out this survey? a. 132 b. 133 c. 234
4. What Class level do you consider yourself? a. Freshman b. Sophomore c. Junior d. Senior e. Other [fill-in-the-blank]

Career Goals
5. Which of the following BEST describes your current career goal? a. a STEM career (including health and social sciences) b. a career in education c. a career in other fields d. undecided

[if (a) a STEM career]
6. Which of the following BEST describes your current career goal? a. Medical professional (e.g., doctor, dentist, vet.) b. Other health professional (e.g., nurse, medical technician)
c. Life scientist (e.g., biologist, medical researcher) d. Earth/Environmental scientist (e.g., geologist, meteorologist) e. Physical Scientist (e.g., chemist, physicist, astronomer) f. Engineer: Electrical, Computer Science g. Engineer: Civil, Environmental, Biosystems/Agriculture h. Engineer: Mechanical and Aerospace i. Engineer: Chemical and Materials j. Mathematician k. Social Scientist (e.g., psychologist, sociologist) l. Other (please specify)

[if (b) a career in education]
7. Which of the following BEST describes your current career goal? a. Science/Math teacher b. Other teacher (please specify)

[if (c) a career in other fields]
8. Which of the following BEST describes your current career goal? a. Business administration b. Lawyer c. English/Language Arts specialist d. Packaging e. Other (please specify)

Commitments
9. Approximately how many hours per week during this semester do you expect to… [Options: 0, 1-5, 6-10, 11-15, 16-20, 21-30, 30+]
a. work at a job this semester/term?
b. participate in organized extracurricular activities such as sports, college newspaper, or clubs this semester/term?
c. spend preparing for all classes this semester (studying, reading, writing, doing homework or lab work, analyzing data, rehearsing, or other academic activities outside of class)?
d. spend preparing for MTH 133 this semester (studying, reading, writing, doing homework or lab work, analyzing data, rehearsing, or other academic activities outside of class)?

Plans and Projections
10. What grade do you expect in MTH 133? a. 4.0 b. 3.5 c. 3.0 d. 2.5 e. 2.0 f. 1.5 g. 1.0 h. 0.0
11. Do you intend to take another math course after this one? a. Yes b. No c. I don’t know yet
12. Is another math course required for your major? a. Yes b. No c. I don’t know yet
13. How important is a good grade in this course in influencing your decision whether or not to take another math course? a. Not important at all b. Unimportant c. Slightly unimportant d. Slightly important e. Important f. Very important

High School Experience
14. My mathematics courses in high school have prepared me to [Options: Strongly agree, agree, somewhat agree, somewhat disagree, disagree, strongly disagree]
a. Complete complex calculations without a calculator
b. Solve word problems
c. Factor expressions
d. Solve equations
e. Solve inequalities
15. The teacher of my last mathematics course in high school [Options: Strongly agree, agree, somewhat agree, somewhat disagree, disagree, strongly disagree]
a. Lectured most of the time
b. Primarily showed us how to get answers to specific questions
c. Frequently had us work in groups
d. Frequently had us solve challenging problems
e. Cared that I was successful in the course

Previous Course Experience
16. What was the last math course you took before this one? (excluding statistics courses) a. College Algebra / Trigonometry / Pre-Calculus b. Calculus I c. Calculus II d. Calculus III (Multivariable) e. Other (please specify)
17. Where did you take that previous math course? a. High school b. A community college c. MSU d. Another University e. Other (please specify)
18. How long ago did that previous math course end? a. 0-1 months ago b. 2-3 months ago c. 4-8 months ago d. 9-14 months ago e. 15+ months ago
19. What grade did you receive in that previous math course? a. 4.0 -- A b. 3.5 -- A- or B+ c. 3.0 -- B d. 2.5 -- B- or C+ e. 2.0 -- C f. 1.5 -- C- or D+ g. 1.0 -- D h. 0.0 -- E or F

Calculator
20. Please rate the following statements: [Options: Strongly agree, agree, somewhat agree, somewhat disagree, disagree, strongly disagree]
a. I am comfortable in using a graphing calculator
b. I am comfortable in using a computer algebra system (e.g., Maple, MATLAB)
c. I am comfortable with programming (e.g., Python, C++, Java, etc.)
21. In high school I was allowed to use graphing calculators on exams a. Always b. Sometimes c. Never
22. In high school I was allowed to use calculators that performed symbolic operations on exams (e.g., TI-89, TI-92) a. Always b. Sometimes c. Never

Point of View
23. Please rate the following statements: [Options: Strongly agree, agree, somewhat agree, somewhat disagree, disagree, strongly disagree]
a. I believe I have the knowledge and abilities to succeed in this course
b. I understand the mathematics that I have studied
c. I am confident in my mathematics abilities
d. I enjoy doing mathematics
24. When I experience a difficulty in my math class… [Scale 1-4] I try hard to figure it out on my own → I quickly seek help or give up trying
25. For me, making unsuccessful attempts when solving a mathematics problem is… [Scale 1-4] A natural part of solving the problem → an indication of my weakness in mathematics
26. My success in mathematics PRIMARILY relies on my ability to… [Scale 1-4] Solve specific kinds of problems → make connections and form logical arguments
27. My score on my mathematics exam is a measure of how well… [Scale 1-4] I understand the covered material → I can do things the way the teacher wants
28. If I had a choice… [Scale 1-4] I would never take another mathematics course → I would continue to take mathematics
29. When studying mathematics in a textbook or in course materials, I tend to… [Scale 1-4] Memorize it the way it is presented → make sense of the material, so that I understand it
30. When solving mathematics problems, graphing calculators or computers help me to… [Scale 1-4] Understand underlying mathematics ideas → find answers to problems
31. The primary role of a mathematics instructor is to… [Scale 1-4] Work problems so students know how to do them → help students learn to reason through problems on their own
32. Please rate the following statements [Options: Strongly agree, agree, somewhat agree, neither agree nor disagree, somewhat disagree, disagree, strongly disagree]
a. Mathematics instructors should show students how mathematics is relevant
b. If I am unable to solve a problem within a few minutes, it is an indication of my weakness in mathematics
c. In order to succeed in calculus at a college or university, I must have taken it before
d. Mathematics is about getting exact answers to specific problems
e. The process of solving a problem that involves mathematical reasoning is a satisfying experience.

About the Labs [post-course survey only]
33. Which of the following did you experience during labs? [Options: Never, Rarely, Sometimes, Often, Always]
a. You solved a problem using pen/pencil and paper.
b. You solved a problem by writing or editing MATLAB code.
c. You analyzed and interpreted results/data to solve a problem.
34. Please indicate the degree to which you either agree or disagree with the following statements about the design of the Application Labs: [options: (1) strongly agree, (2) agree, (3) neither agree nor disagree, (4) disagree, (5) strongly disagree, or (6) not applicable.]
a. The Labs is a useful tool for learning calculus overall.
b. The Labs help you prepare for quizzes and exams.
c. The Labs help you understand connections between calculus concepts.
d. The Labs help you understand applications of calculus ideas.
e. The Labs help you practice modelling.
f. The Labs make learning calculus seem important.
g. The Labs are important for your success in MTH 133.
h. The Labs are important for your achieving your long-term goals at MSU.
i. Weekly quizzes would be more helpful than the Labs for your learning.
j. The length of the Labs is appropriate.
k. Labs should be shorter.
l. Labs should be longer.
m. The difficulty of the Labs is appropriate.
n. The Labs are too easy.
o. The Labs are too difficult.
p. It is easy to determine what the Labs questions are asking.
q. The feedback on the Labs helps you learn.
r. The Labs should be worth a larger part of the course grade.
s. The Labs should be worth a smaller part of the course grade.
t. The Labs would still be useful if they were not graded.
35. Please indicate the degree to which you either agree or disagree with the following statements about working in groups on the Labs: [options: (1) strongly agree, (2) agree, (3) neither agree nor disagree, (4) disagree, (5) strongly disagree, or (6) not applicable.]
a. Working in groups on the Labs is enjoyable
b. Working in groups on the Labs helps you learn
c. Working in groups on the Labs has helped you make connections for studying outside of class
d. Working in groups on the Labs is similar to groupwork experiences in the past
e. Working in groups on the Labs is better than groupwork experiences in the past
f. Working in groups on the Labs is worse than groupwork experiences in the past
g. I prefer being assigned to groups
h. I prefer choosing my own group
i. Changing groups periodically is a nice way to work with new people
j. Changing groups periodically disrupts productive group work dynamics
k. Changing groups periodically alleviates the stress of being in an unproductive group
36. Rate your usual participation in the Labs: [scale 1-4]
a. I am often very engaged during the Labs → I am rarely engaged during the Labs
b. I often take the lead during Labs → I rarely take the lead during Labs
c. I often feel included during Labs → I rarely feel included during Labs
37. Please indicate the degree to which you either agree or disagree with the following statements about MATLAB: [options: (1) strongly agree, (2) agree, (3) neither agree nor disagree, (4) disagree, (5) strongly disagree, or (6) not applicable.]
a. You can write original MATLAB code to solve problems.
b. You can modify existing MATLAB code to solve problems.
c. You can read and understand MATLAB code.

APPENDIX B: INTERVIEW CONSENT FORM

Research Participant Information and Consent Form

1. EXPLANATION OF THE RESEARCH and WHAT YOU WILL DO: You are being asked to participate in a research study about students’ learning experiences with the MTH 133 labs. Your participation will consist of an audio-recorded interview. You must be at least 18 years old to participate in this research.

2. YOUR RIGHTS TO PARTICIPATE, SAY NO, OR WITHDRAW: Participation in this research project is completely voluntary. You have the right to say no. You may change your mind at any time and withdraw. You may choose to stop participating at any time. Whether you choose to participate or not will have no effect on your grade or evaluation in MTH 133.

3. COSTS AND COMPENSATION FOR BEING IN THE STUDY: No cost will be incurred to you based on your participation in the study.
The interview is expected to take no longer than 30 minutes, and you will be compensated for your time with a $5 Amazon gift card at the end of the interview.

4. CONTACT INFORMATION FOR QUESTIONS AND CONCERNS: If you have concerns or questions about this study, such as scientific issues, how to do any part of it, or to report an injury, please contact the researcher, Andrew Krause, krausea3@msu.edu, 810-691-2546. If you have questions or concerns about your role and rights as a research participant, would like to obtain information or offer input, or would like to register a complaint about this study, you may contact, anonymously if you wish, Michigan State University’s Human Research Protection Program at 517-355-2180, Fax 517-432-4503, e-mail irb@msu.edu, or regular mail at 4000 Collins Road, Suite 136, Lansing, MI 48910.

5. DOCUMENTATION OF INFORMED CONSENT. Your signature below means that you voluntarily agree to participate in this research study.

__________________________________________________________ _________________
Signature Date
__________________________________________________________________
Print Name

APPENDIX C: OBSERVATION CONSENT FORM

Research Participant Information and Consent Form

1. EXPLANATION OF THE RESEARCH and WHAT YOU WILL DO: You are being asked to participate in a research study about students’ learning experiences with the MTH 133 labs. Your participation will consist of a video-recorded observation of your normal classroom activity while completing one of the labs. You must be at least 18 years old to participate in this research.

2. YOUR RIGHTS TO PARTICIPATE, SAY NO, OR WITHDRAW: Participation in this research project is completely voluntary. You have the right to say no. You may change your mind at any time and withdraw. You may choose to stop participating at any time. Whether you choose to participate or not will have no effect on your grade or evaluation in MTH 133.

3. COSTS AND COMPENSATION FOR BEING IN THE STUDY: There is no cost nor compensation to participate in this study.

4. CONTACT INFORMATION FOR QUESTIONS AND CONCERNS: If you have concerns or questions about this study, such as scientific issues, how to do any part of it, or to report an injury, please contact the researcher, Andrew Krause, krausea3@msu.edu, 810-691-2546. If you have questions or concerns about your role and rights as a research participant, would like to obtain information or offer input, or would like to register a complaint about this study, you may contact, anonymously if you wish, Michigan State University’s Human Research Protection Program at 517-355-2180, Fax 517-432-4503, e-mail irb@msu.edu, or regular mail at 4000 Collins Road, Suite 136, Lansing, MI 48910.

5. DOCUMENTATION OF INFORMED CONSENT. Your signature below means that you voluntarily agree to participate in this research study.

__________________________________________________________ _________________
Signature Date
__________________________________________________________________

APPENDIX D: STUDENT INTERVIEW PROTOCOL

Introduction

I am trying to learn about how students interact with the labs to have a better understanding about how students learn calculus, which will help to improve mathematics instruction.

Warm-up Questions
1. How is your calculus class going? What is going well? What things stand out that aren’t as helpful or hold you back? How do you feel about your instructor?

Demographic Questions
2. Can you please tell me your current academic class and list the college mathematics classes you have taken previously? a. AP calculus? b. Repeats? c. Courses at different institutions?
3. What is your major and what is your planned career trajectory?

Disposition toward mathematics
4. What are your feelings about mathematics in general? a. Skill in mathematics? b. Confidence with mathematical ability?
5. How does your mathematics class make you feel? a. Do you feel like you can succeed in the class? b. Do you feel like you belong in the class?
6. How has MTH 133 informed your view about the role of mathematics in your current career goal?

Goals and Study Strategy
7. How did you decide to enroll in MTH 133? (requirement, you think it will be useful)
8. What are your goals for MTH 133?
9. Can you explain a typical week of your learning associated with MTH 133? I am trying to get a picture of how you learn calculus, so tell me any details that you think are relevant.
a. What do you do during lecture? During recitation?
b. What do you do outside of class?
c. How do you prepare for quizzes or exams?
d. Do you visit the MLC, office hours, or some other kind of out-of-class instruction?
e. What kind of homework do you do? Webwork? Other problems?

About the Labs
10. Can you explain what typically happens or what you typically do during your recitation/lab?
a. What do you do? i. Solving a problem using pen and paper? ii. Writing/editing MATLAB code? iii. Analyzing/interpreting results?
b. What about your group members?
c. What about other classmates/groups?
d. What does your instructor do?
e. Does the lecturer mention/launch/debrief the labs during lecture?
11. How does MATLAB impact your learning experience?
a. Did you gain insight about calculus concepts by seeing how they are represented in computer simulations? If so, which concepts and what insights?
b. Do you feel proficient with MATLAB? i. Modifying existing code? ii. Writing original code?
12. Do you use any outside resources (other than MATLAB) while you are completing your labs?
a. Calculators? (How do you use them / what do you use them for?)
b. Websites? (How do you use them / what do you use them for?)
c. Classmates in other sections?
13. Do you feel like the labs help you learn?
a. What do the labs help you learn, specifically? i. Practice/preparation for exams/quizzes? ii. Understand connections between concepts? iii. Understand/appreciate applications of calculus ideas? iv. Practice modelling?
b. Do you feel like class time spent on the labs is worthwhile in the context of succeeding in the course? i. Is there something else that would be a better use of class time to help you succeed in the course? (Perhaps compare the goals of the labs and quizzes and compare the effectiveness of those goals.)
c. Do you feel like class time spent on the labs is worthwhile in the context of your long-term goals at MSU? i. Is there something else that would be a better use of class time to help you achieve your long-term goals?
14. How do the labs make you feel about your learning?
a. Are you interested in the labs?
b. Do you find the labs enjoyable? What aspects? i. Working in groups? ii. Doing something different than solving problems? iii. Working through problems without the instructor? iv. Seeing how calculus can be used in practical contexts?
15. Are you invested in the labs?
a. Why do you complete the labs? Trying to finish them? Good learning opportunity? Interesting?
b. If the labs were ungraded, would you attend recitation to complete the labs?
16. Are there any changes that you would make to the way that the labs are done?
a. Introduction/follow up?
b. Related homework assignment?
c. Tied to quizzes/exams?

APPENDIX E: LAB TASK DETAILS

This appendix describes the labs in detail and is drawn from previous work about the labs (Krause, Maccombs, and Wong, in review). Note that the lab numbering changed between Fall 2018 and Spring 2019, during the data collection for this dissertation.

Lab deployment

Students complete labs in groups of three to four students, randomly assigned at the start of each recitation. Labs consist of a main lab document in the form of a MATLAB LiveScript file, which is e-mailed to students and made available for download on the course website prior to the recitation, and a printed worksheet distributed by the recitation instructor. The MATLAB LiveScript format provides an interactive programming notebook interspersing prose documentation with executable code. Students run the provided LiveScript file either on a local MATLAB installation on their personal laptops, or through the MATLAB Online cloud computing service. Our university has an institutional MATLAB license covering use by all students and faculty. Students follow the instructions given in the MATLAB LiveScript file, and record their answers to factual and interpretive questions on the printed worksheet, which are then collected and graded by the recitation instructor. The MATLAB LiveScripts and printed worksheets can be downloaded from www.calculus2labs.com.

Lab 1: Numerical Integration

Main learning objective: sums can be used to approximate integrals. This learning objective ties into the definition of the Riemann integral as area under the curve, and provides students with an introduction to numerical integration. Additionally, this ties into the “integral test” concept in the portion of Calculus II concerning “sequences and series”.
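To make this objective concrete, here is a minimal MATLAB sketch of the idea, written for this appendix rather than excerpted from the actual Lab 1 LiveScript (the real files are available at www.calculus2labs.com). It compares left-endpoint Riemann sums of f(x) = e^(-x^2), a function with no elementary antiderivative, against MATLAB's built-in numerical integrator:

    % Illustration only (not lab code): approximate the integral of f on
    % [a, b] with left-endpoint Riemann sums, then compare the results
    % against MATLAB's adaptive integrator.
    f = @(x) exp(-x.^2);           % integrand with no elementary antiderivative
    a = 0; b = 2;
    for n = [10 100 1000]          % number of subintervals
        dx = (b - a) / n;          % width of each subinterval
        x = a + (0:n-1) * dx;      % left endpoints of the subintervals
        fprintf('n = %4d: Riemann sum = %.6f\n', n, sum(f(x)) * dx);
    end
    fprintf('integral():  %.6f\n', integral(f, a, b));

As n grows, the sum approaches the value of the integral, which is the direction of approximation emphasized in Lab 1; Lab 2 runs the relationship in reverse, using an integral to approximate a sum.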
Lab 2: Baseball Rocketry

Main learning objective: integrals can be used to approximate sums. This learning objective gives the converse of the previous one, showing that in practical applications integrals and sums are mostly interchangeable. Application setting: derive the relationship between fuel consumed and change of velocity in rocketry (discovering the Tsiolkovsky equation); evaluate which of two two-stage rocket designs has better performance.

Lab 3: Zombie Attacks

Main learning objective: exponential growth and decay, and an introduction to differential/difference equations as models of positive/negative feedback loops. Application setting: disease transmission in an insular community.

Lab 4: Trials and Tribble-ations

Main learning objective: sequence/series convergence via comparison against a geometric sequence/series. Underlying every series convergence test is a comparison against a well-chosen model limiting behavior; this lab ties in with the derivation of the ratio test. Application setting: algorithm design (run-time guarantees), binary search, signal quantization with error.

Lab 5: Boom Bust Butterfly

Main learning objective: modes of sequence divergence. Addresses the common misconception that for a sequence to diverge its terms must escape to infinity. Application setting: regular (convergent and periodic) versus chaotic behaviors in market economics forecasting.

Lab 6: Leibniz’s Wheel

Main learning objective: the connection between rates of convergence and the radius of convergence of a power series. Application setting: design a method to efficiently evaluate the natural log function numerically using only addition, subtraction, multiplication, and division; basics of computer science.

Lab 7: Etch-a-sketch Time

Main learning objective: parametric curves, their derivatives, and reconstructions of curves from derivatives. Application setting: reconstruction of a racetrack layout from on-board velocity data from a racecar.

Sample worksheet questions for Lab 5

Exercise 1. The number of industry players in the next quarter can be computed from the number in the previous quarter by the formula X(t+1) = R(X(t)) * X(t). The rate of growth function R(X) typically exhibits the following two features: (i) R(X) is a decreasing function of X, and (ii) R(X) is never less than zero.
• Explain why, in the context of our model of a market economy, these two features should be intuitively expected.
• Explain also why the solution to R(X) = 1 is considered the market equilibrium.

Exercise 2. [NB: The lab introduced a volatility parameter v into the rate of growth function R(X).]
• For X(0) = 0.4, compute the percent change from X(0) to X(1) for the three volatility levels v = 0.5, 1, and 2.
• Do the same for X(0) = 2.
• You may use MATLAB to perform these computations.

Exercise 3. In the low volatility setting, does the convergence to market equilibrium happen faster for less volatile markets or more volatile ones? How does this compare with our intuitive understanding of volatility?

Exercise 4. In the medium volatility setting, does the convergence to market equilibrium happen faster for less volatile markets or more volatile ones? How does this compare with our intuitive understanding of volatility?

Exercise 5. Run the simulation in the "Chaos" section a few times, with different values of volatility. What do you notice that is different qualitatively between the v = 2.8 case versus the other three cases? What does this say about attempts to forecast stock prices, in a very highly volatile market, based on incomplete information?
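The worksheet excerpt above does not reproduce the lab's actual rate-of-growth function, so the sketch below substitutes a standard stand-in with the two properties named in Exercise 1: the Ricker-type map R(X) = exp(v(1 - X)), which is decreasing in X, never negative, and equals 1 at the equilibrium X = 1. This is an assumption made for illustration only (the real model lives in the downloadable LiveScript files); it is meant to show how the volatility cases in Exercises 2-5 can be simulated:

    % Hypothetical stand-in for the lab's model: R(X) = exp(v*(1 - X))
    % is decreasing in X, always positive, and satisfies R(1) = 1.
    R = @(X, v) exp(v * (1 - X));
    X0 = 0.4;                                % initial market size
    for v = [0.5 1 2 2.8]                    % volatility levels from the lab
        X = zeros(1, 40);
        X(1) = X0;
        for t = 1:39
            X(t+1) = R(X(t), v) * X(t);      % X(t+1) = R(X(t)) * X(t)
        end
        % Percent change from X(0) to X(1) in the worksheet's notation
        % (MATLAB indices are shifted by one); this is the Exercise 2 computation.
        pctChange = 100 * (X(2) - X(1)) / X(1);
        fprintf('v = %.1f: %%change X(0)->X(1) = %6.1f%%, X(40) = %.4f\n', ...
            v, pctChange, X(40));
    end

With this choice of R, the low-volatility runs (v = 0.5 and v = 1) converge steadily toward the equilibrium, v = 2 sits at the edge of oscillatory behavior, and v = 2.8 wanders chaotically, mirroring the qualitative contrast that Exercise 5 asks students to describe.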
APPENDIX F: CODEBOOK

Each entry below gives the open code and its definition, followed by the descriptive codes (the word in CAPS is the code); the thematic code is listed in parentheses after the open code.

DISCUSSION (behavioral engagement): students are engaged in discussion with group members.
• WHOLE-GROUP: discussion is directed towards the entire group
• SPLIT: discussion is directed as an aside to just one other group member

SILENT (behavioral engagement): students are working silently, or observing silently as group members discuss or work.
• at least one group member is WRITING on the lab document
• some group members are WORKING SILENTLY on their computer or paper
• some group members are being SILENT OBSERVERS as others work or discuss

STUCK (behavioral engagement): students reach an impasse.
• resolved by asking the INSTRUCTOR a question
• discussed with ANOTHER GROUP
• resolved by PERSEVERING

OFF-TASK (behavioral engagement): students are off-task when they are actively doing something not related to completing the lab. Must be directly observed, not assumed, to be off-task. Note: off-task behaviors include browsing the internet, texting, instant messaging, or off-task discussion. If a student appears to be off-task, but their screen is not visible in the data, the instance should be coded as working silently.

INTEREST/ENJOYMENT (emotional engagement): student expresses finding interest and/or enjoyment in the labs, or lack thereof.
• finds the labs INTERESTING (or not)
• REAL-WORLD CONTEXTS are interesting
• enjoys the CHALLENGE of the labs
• CONFUSION prevents interest/enjoyment
• feeling RUSHED prevents interest/enjoyment

CONFUSION (emotional engagement): student expresses that they were confused by the labs or explains how feeling confused made them feel.
• confused by the CODING
• confused by the LABS specifically
• GENERAL CONFUSION about Calculus 2

FRUSTRATION (emotional engagement): student expresses frustration or anger.
• frustrated by TIMING, feeling rushed, or not being able to complete the lab
• frustrated by GRADES and their perception that earning a full score was too difficult
• frustrated by the lack of CONNECTION between the labs and quizzes/exams

GROUPWORK (emotional and cognitive engagement): student comments about their perception or feelings about groupwork or their group members.
• LIKES GROUPWORK
• groupwork was HELPFUL
• comments about GROUP MEMBERS
• comments about CHANGING groups

CONFIDENCE (emotional engagement): student expresses a state of confidence or the impact of the experience on confidence.
• has SELF-CONFIDENCE
• LACK of confidence
• struggle REDUCED confidence
• perseverance IMPROVED confidence

GRADES (cognitive engagement): student expresses a connection between their lab experience and their grades, either overall or in terms of quiz and exam studying and success.
• reported that the GRADE on the lab was the most important motivational factor
• reported that they complete all their ASSIGNMENTs

LEARNING VALUE (cognitive engagement): student discusses their perception of the learning value of the labs.
• the labs were HELPFUL for learning
• the labs provided a DIFFERENT PERSPECTIVE
• appreciated the CHALLENGE the labs provided
• APPLICATIONS of the labs were useful
• the CONNECTION between the labs and the rest of the curriculum was not apparent

MATLAB (cognitive engagement): student discusses their perception of MATLAB or coding and how it impacted their experience with the labs.
• MATLAB useful for ENGINEERS
• coding in MATLAB helped them UNDERSTAND the mathematics better
• computer SIMULATIONS useful/interesting

Table 4: Codebook.

APPENDIX G: POST-COURSE SURVEY DATA

Response options: never / rarely / sometimes / often / always

You solved a problem using pen/pencil and paper.
  SS19 (N=93): 11% / 11% / 33% / 26% / 19%
  FS18 (N=104): 9% / 18% / 22% / 27% / 24%

You solved a problem by writing or editing MATLAB code.
  SS19 (N=93): 17% / 28% / 31% / 16% / 8%
  FS18 (N=105): 13% / 28% / 26% / 15% / 18%

You analyzed and interpreted results/data to solve a problem.
  SS19 (N=92): 11% / 11% / 43% / 24% / 11%
  FS18 (N=104): 6% / 10% / 37% / 40% / 8%

Table 5: Survey responses about behaviors that the labs facilitate. Response distributions are presented separately for each term that the survey was administered, with the number of responses (N) to each question.

Response options: strongly agree / agree / neither agree nor disagree / disagree / strongly disagree

You can write original MATLAB code to solve problems.
  SS19 (N=91): 9% / 12% / 12% / 24% / 43%
  FS18 (N=104): 6% / 15% / 19% / 18% / 41%

You can modify existing MATLAB code to solve problems.
  SS19 (N=90): 8% / 31% / 19% / 19% / 23%
  FS18 (N=105): 4% / 40% / 17% / 16% / 23%

You can read and understand MATLAB code.
  SS19 (N=91): 9% / 23% / 25% / 21% / 22%
  FS18 (N=105): 8% / 28% / 17% / 24% / 24%

Table 6: Survey responses about skills related to MATLAB coding.
Response distributions are presented separately for each term that the survey was administered, with the number of responses (N) to each question.

Response options: strongly agree / agree / neither agree nor disagree / disagree / strongly disagree

The Labs is a useful tool for learning calculus overall.
  SS19 (N=90): 4% / 8% / 11% / 20% / 57%
  FS18 (N=104): 3% / 10% / 14% / 22% / 51%

The Labs help you prepare for quizzes and exams.
  SS19 (N=90): 6% / 4% / 8% / 16% / 67%
  FS18 (N=102): 3% / 7% / 9% / 18% / 64%

The Labs help you understand connections between calculus concepts.
  SS19 (N=90): 8% / 19% / 17% / 12% / 44%
  FS18 (N=102): 4% / 16% / 19% / 17% / 45%

The Labs help you understand applications of calculus ideas.
  SS19 (N=92): 8% / 21% / 18% / 11% / 42%
  FS18 (N=103): 7% / 22% / 10% / 18% / 43%

The Labs help you practice modeling.
  SS19 (N=91): 3% / 24% / 20% / 13% / 40%
  FS18 (N=103): 6% / 17% / 25% / 17% / 34%

The Labs make learning calculus seem important.
  SS19 (N=92): 5% / 14% / 23% / 15% / 42%
  FS18 (N=103): 6% / 15% / 17% / 17% / 45%

The Labs are important for your success in MTH 133.
  SS19 (N=91): 5% / 12% / 9% / 21% / 53%
  FS18 (N=103): 4% / 16% / 13% / 18% / 50%

The Labs are important for your achieving your long-term goals at MSU.
  SS19 (N=90): 4% / 12% / 11% / 19% / 53%
  FS18 (N=103): 3% / 13% / 17% / 17% / 50%

Weekly quizzes would be more helpful than the Labs for your learning.
  SS19 (N=92): 53% / 26% / 5% / 5% / 10%
  FS18 (N=103): 40% / 36% / 9% / 11% / 5%

Table 7: Survey responses about the usefulness and effectiveness of the labs. Response distributions are presented separately for each term that the survey was administered, with the number of responses (N) to each question.

Response options: strongly agree / agree / neither agree nor disagree / disagree / strongly disagree

The length of the Labs is appropriate.
  SS19 (N=91): 10% / 23% / 22% / 18% / 27%
  FS18 (N=104): 7% / 31% / 28% / 23% / 12%

Labs should be shorter.
  SS19 (N=91): 22% / 27% / 22% / 11% / 18%
  FS18 (N=104): 13% / 15% / 35% / 23% / 14%

Labs should be longer.
  SS19 (N=90): 11% / 9% / 22% / 20% / 38%
  FS18 (N=102): 5% / 12% / 31% / 25% / 27%

The difficulty of the Labs is appropriate.
  SS19 (N=91): 4% / 12% / 14% / 32% / 37%
  FS18 (N=104): 5% / 21% / 25% / 22% / 27%

The Labs are too easy.
  SS19 (N=89): 6% / 3% / 8% / 24% / 60%
  FS18 (N=102): 4% / 2% / 18% / 40% / 36%

The Labs are too difficult.
  SS19 (N=90): 37% / 49% / 11% / 1% / 2%
  FS18 (N=102): 20% / 31% / 29% / 15% / 5%

It is easy to determine what the Labs questions are asking.
  SS19 (N=90): 6% / 10% / 7% / 26% / 52%
  FS18 (N=104): 5% / 11% / 25% / 23% / 37%

The feedback on the Labs helps you learn.
  SS19 (N=83): 6% / 13% / 14% / 16% / 51%
  FS18 (N=100): 3% / 11% / 17% / 23% / 46%

The Labs should be worth a larger part of the course grade.
  SS19 (N=92): 5% / 8% / 13% / 22% / 52%
  FS18 (N=102): 8% / 13% / 17% / 22% / 41%

The Labs should be worth a smaller part of the course grade.
  SS19 (N=93): 41% / 40% / 14% / 3% / 2%
  FS18 (N=103): 28% / 22% / 29% / 12% / 9%

The Labs would still be useful if they were not graded.
  SS19 (N=92): 17% / 23% / 26% / 9% / 25%
  FS18 (N=102): 8% / 15% / 25% / 20% / 33%

Table 8: Survey responses about the logistics of the labs.
Response options: strongly agree / agree / neither agree nor disagree / disagree / strongly disagree

Working in groups on the Labs is enjoyable.
  SS19 (N=92): 13% / 36% / 26% / 8% / 17%
  FS18 (N=104): 12% / 42% / 22% / 12% / 13%

Working in groups on the Labs helps you learn.
  SS19 (N=92): 11% / 42% / 17% / 12% / 17%
  FS18 (N=104): 12% / 38% / 27% / 13% / 12%

Working in groups on the Labs has helped you make connections for studying outside of class.
  SS19 (N=92): 12% / 26% / 17% / 21% / 24%
  FS18 (N=104): 8% / 23% / 21% / 27% / 21%

Working in groups on the Labs is similar to groupwork experiences in the past.
  SS19 (N=92): 11% / 41% / 23% / 12% / 13%
  FS18 (N=104): 13% / 33% / 24% / 14% / 16%

Working in groups on the Labs is better than groupwork experiences in the past.
  SS19 (N=92): 7% / 17% / 36% / 18% / 22%
  FS18 (N=102): 4% / 20% / 33% / 22% / 22%

Working in groups on the Labs is worse than groupwork experiences in the past.
  SS19 (N=92): 13% / 23% / 39% / 14% / 11%
  FS18 (N=102): 11% / 19% / 37% / 20% / 14%

I prefer being assigned to groups.
  SS19 (N=91): 11% / 24% / 31% / 15% / 19%
  FS18 (N=104): 13% / 27% / 33% / 17% / 11%

I prefer choosing my own group.
  SS19 (N=91): 23% / 24% / 31% / 16% / 5%
  FS18 (N=103): 11% / 23% / 34% / 16% / 17%

Changing groups periodically is a nice way to work with new people.
  SS19 (N=91): 13% / 42% / 31% / 9% / 5%
  FS18 (N=104): 21% / 39% / 21% / 9% / 10%

Changing groups periodically disrupts productive group work dynamics.
  SS19 (N=91): 7% / 31% / 32% / 24% / 7%
  FS18 (N=103): 9% / 22% / 34% / 19% / 16%

Changing groups periodically alleviates the stress of being in an unproductive group.
  SS19 (N=91): 12% / 43% / 31% / 10% / 4%
  FS18 (N=104): 19% / 39% / 27% / 6% / 9%

Table 9: Survey responses about the groupwork component of the labs. Response distributions are presented separately for each term that the survey was administered, with the number of responses (N) to each question.

REFERENCES

Akelbek, M. (2019). Implementing MATLAB programming to beginner users in math courses. In Joint Mathematics Meetings Full Program. Baltimore. Retrieved from http://jointmathematicsmeetings.org/amsmtgs/2217_abstracts/1145-r1-816.pdf

Beymer, P. (2020). Is it worth it? Three papers examining students’ perceptions of cost in mathematics. Michigan State University.

Boelkins, M., Austin, D., & Schlicker, S. (2019). Active calculus. Retrieved from activecalculus.org

Bonwell, C., & Eison, J. (1991). Active learning: Creating excitement in the classroom. Clearinghouse on Higher Education at The George Washington University.

Bressoud, D., Mesa, V., Hsu, E., & Rasmussen, C. (2014). Successful calculus programs: Two-year colleges to research universities. In NCTM Research Presession. Retrieved from http://www.maa.org/sites/default/files/pdf/cspcc/CSPCC-NCTM.pdf

Bressoud, D., Mesa, V., & Rasmussen, C. (2015). Insights and recommendations from the MAA national study of college calculus. Washington, DC: MAA Press. Retrieved from http://www.maa.org/sites/default/files/pdf/cspcc/InsightsandRecommendations.pdf

Common Core State Standards Initiative. (2010). Common Core State Standards for Mathematics. Washington, DC.

Connell, J., & Wellborn, J. (1991). Competence, autonomy, and relatedness: A motivational analysis of self-system processes. In M. Gunnar & L. Sroufe (Eds.), Self processes and development (pp. 43–77). Lawrence Erlbaum Associates, Inc.

Cotton, W., Lockyer, L., Brickell, G. J., & Brickell, G. (2009). A journey through a design-based research project. Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2009, 1364–1371.

CRAFTY. (2007). College Algebra Guidelines. Washington, DC.