PROFILES OF STUDENT ENGAGEMENT IN SYNCHRONOUS AND ASYNCHRONOUS SCIENCE INSTRUCTION By Matthew J. Schell A DISSERTATION Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Educational Psychology and Educational Technology – Doctor of Philosophy 2022 ABSTRACT PROFILES OF STUDENT ENGAGEMENT IN SYNCHRONOUS AND ASYNCHRONOUS SCIENCE INSTRUCTION By Matthew J. Schell Virtual instruction at the K-12 level is on the rise, yet we know very little about the ways students engage in different types of virtual instruction. The goals of this study were to: 1) describe high school students’ engagement in virtual science courses in terms of behavioral, affective, cognitive-value, and cognitive-self-regulatory dimensions; 2) explore whether students’ engagement patterns across these dimensions differed depending on whether science activities were synchronous or asynchronous; and 3) examine whether these engagement patterns were associated with students’ final course grades or over-summer retention in a virtual high school. Students enrolled in a range of science courses at a virtual high school (n=124) provided multiple reports (n=493) of their engagement during both synchronous and asynchronous learning activities. Latent Profile Analysis (LPA) conducted with these data suggested five distinct situational engagement profiles representing different constellations of the affective, behavioral, cognitive-value, and cognitive-self-regulatory dimensions of engagement. During synchronous instruction, students tended to engage in ways characterized by higher engagement in all dimensions compared with asynchronous instruction. These high engagement profiles were also associated with higher final course grades. There were few differences in the extent to which profiles predicted retention; however, lower self-regulation was associated with higher rates of retention. This dissertation is dedicated to my patient, understanding, supportive, and caring family: Dani, Huxley, and Liesl. I’m thankful for our adventures together the past several years that helped keep life in perspective and allowed me to stay motivated and engaged throughout my graduate school journey. ACKNOWLEDGMENTS First and foremost, I am extremely grateful to my adviser, Jennifer Schmidt, for her invaluable advice, understanding, and patience during my time as a graduate student. You saw me as a whole person, with a life and responsibilities outside of my PhD studies, and for that I am forever appreciative. I would also like to thank my committee members, Elizabeth Boltz, Matthew Koehler, and Kristy Robinson, for their feedback, advice, and ideas throughout the dissertation writing process. The final result is better because of your input. Additionally, I would like to express gratitude to the faculty in the EPET program. The formal and informal conversations and interactions I had with you all throughout the years were immensely helpful in shaping my identity as a researcher and scholar. Sharon Hammond, your expert knowledge of the bureaucracy within the College of Education was enormously beneficial and kept me from wasting countless hours stumbling around websites trying to figure out what form I needed to complete for some milestone or another. Finally, thank you to my fellow EPET graduate students. I’ll fondly remember the times we spent collaborating in lab, walking around campus, commiserating about our shared experiences, and celebrating each other’s accomplishments. 
iv TABLE OF CONTENTS LIST OF TABLES ...................................................................................................................... vii LIST OF FIGURES ..................................................................................................................... ix INTRODUCTION..........................................................................................................................1 LITERATURE REVIEW .............................................................................................................2 WHAT IS ENGAGEMENT? ..............................................................................................2 DIMENSIONS OF ENGAGEMENT ..................................................................................4 A PERSON-ORIENTED APPROACH TO ENGAGEMENT ..................................................7 THEORIZED AFFORDANCES AND CONSTRAINTS OF SYNCHRONOUS AND ASYNCHRONOUS INSTRUCTION ..........................................................................................9 ENGAGEMENT AS A FACILITATOR FOR RETENTION AND GRADES .....................12 CURRENT STUDY .....................................................................................................................13 METHODS ...................................................................................................................................14 STUDY PROCEDURE .....................................................................................................14 SETTING ...........................................................................................................................16 PARTICIPANTS ...............................................................................................................17 MEASURES .................................................................................................................................17 DIMENSIONS OF ENGAGEMENT ................................................................................17 PREDICTORS OF ENGAGEMENT ................................................................................18 OUTCOME MEASURES .................................................................................................19 ANALYSES ..................................................................................................................................19 RESULTS .....................................................................................................................................21 PRELIMINARY RESULTS ..............................................................................................21 PROFILE OF STUDENT ENGAGEMENT .....................................................................23 INSTRUCTIONAL MODALITY AS A PREDICTOR OF ENGAGEMENT .................26 ENGAGEMENT AS A PREDICTOR OF GRADES AND RETENTION ......................27 DISCUSSION ...............................................................................................................................28 SITUATIONAL ENGAGEMENT PROFILES .................................................................29 DISENTANGLING AND DEFINING COGNITIVE ENGAGEMENT ..........................31 INSTRUCTIONAL MODALITY AND INDIVIDUAL DIFFERENCES AS PREDICTORS OF ENGAGEMENT ................................................................................33 v OVER-SUMMER-RETENTION AND FINAL COURSE GRADES AS AN OUTCOME OF ENGAGEMENT PROFILES .................................................................36 MEASURING 
ENGAGEMENT IN ONLINE LEARNING ............................................38 USEFULNESS OF A PERSON-ORIENTED APPROACH .............................................40 EDUCATIONAL IMPLICATIONS ..................................................................................42 FUTURE DIRECTIONS .............................................................................................................45 LIMITATIONS AND DELIMITATION ..................................................................................46 CONCLUSIONS ..........................................................................................................................49 APPENDICES ..............................................................................................................................63 APPENDIX A – PARENTAL CONSENT AND STUDENT ASSENT TEXT ...............64 APPENDIX B – RECRUITMENT EMAILS SENT TO PARENTS AND STUDENTS .......................................................................................................................67 APPENDIX C – INSTRUCTIONAL ACTIVITIES TABLES .........................................69 APPENDIX D – ENGAGEMENT MEASURES ..............................................................76 APPENDIX E – ADDITIONAL ANALYTICAL INFORMATION ................................78 APPENDIX F – MISSING DATA ANALYSIS ...............................................................81 REFERENCES .............................................................................................................................83 vi LIST OF TABLES Table 1: Participant demographic characteristics ..........................................................................51 Table 2: Descriptive statistics and correlations for all engagement, predictor, and outcome variables ..........................................................................................................................52 Table 3: Within/between correlations for items measuring affective engagement........................53 Table 4: Within/between correlations for items measuring behavioral engagement .....................54 Table 5: Within/between correlations for items measuring cognitive-self-regulatory engagement ....................................................................................................................................55 Table 6: Within/between correlations for items measuring cognitive-value engagement .............56 Table 7: Fit statistics for candidate latent engagement profiles .....................................................57 Table 8: Means and 95% confidence intervals of engagement dimensions among profiles (n = 493).........................................................................................................................................58 Table 9: Odds ratios and standards errors for pairwise profile comparison. 
Reference profile is listed first ........................................................................................................................59 Table 10: Mean difference pairwise comparisons of distal outcomes ...........................................60 Table C1: Total occurrences of primary instructional activity across all teachers ........................69 Table C2: Description of content and instructional activities during the study period for Teacher 1 – Earth Science..............................................................................................................70 Table C3: Description of content and instructional activities during the study period for Teacher 2 – Physical Science .........................................................................................................71 Table C4: Description of content and instructional activities during the study period for Teacher 3 – Forensic Science ........................................................................................................72 Table C5: Description of content and instructional activities during the study period for Teacher 4 – Physical Science .........................................................................................................73 Table C6: Description of content and instructional activities during the study period for Teacher 5 – Veterinary Science .....................................................................................................74 vii Table C7: Description of content and instructional activities during the study period for Teacher 5 – Chemistry ...................................................................................................................75 viii LIST OF FIGURES Figure 1: Raw score profile means from selected profile solution ................................................61 Figure 2: Standardized profile means from selected profile solution ............................................62 ix INTRODUCTION The number of K-12 students taking online courses and enrolling in virtual schools is increasing. For example, during the 2004-05 school year only 300,000 students were enrolled in these types of courses; however, by 2019, prior to the COVID-19 pandemic, that number had grown to over 1 million online course enrollments and up to 1.5 million students taking online credit recovery courses (Digital Learning Collaborative, 2020). Enrollment in full-time 100% virtual schools grew from around 10,000 nationally in 2000, to 375,000 in 2019 (Miron et al, 2018; Digital Learning Collaborative, 2020). The COVID-19 pandemic exponentially increased students’ participation in virtual education, with over 90% of households with school age children experiencing some form of virtual learning during this period, even if only temporarily (Mcelrath, 2020). Because of the ubiquity of virtual education in today’s educational landscape, it is increasingly important that practitioners and researchers understand how educators can best support students’ engagement in virtual learning spaces to promote positive educational outcomes. Engagement is a particular challenge for virtual educators. As enrollment at virtual schools has steadily increased, so have concerns about high rates of attrition (Aud et al., 2012; Hawkins & Barbour, 2010). 
For example, a report by Grad Nation indicated that 61% of virtual schools suffer from low graduation rates (defined as a graduation rate less than 67%) compared with only 4% of traditional schools (Atwall et al., 2020). Additionally, Baker et al. (2007) and San Pedro et al. (2014) reported that students participating in virtual learning tend to disengage in a variety of ways. Student engagement has reportedly suffered to an even greater extent during the COVID-19 pandemic (Toth, 2021). Since engaged students are more likely to finish school, learn more, and experience higher achievement, keeping virtual students engaged is of utmost 1 importance to increase retention and graduation rates (Greene et al., 2004; Heddy & Sinatra, 2013; Reschly and Christenson, 2012). Virtual education can be divided into two broad types of learning experiences: synchronous and asynchronous learning. Due to inherent differences in how teaching and learning happens in these two instructional modalities it is likely that the way students engage differs, partially based on the technological constraints and affordances characteristic of synchronous and asynchronous learning. This is important because a basic understanding of how students engage in these two instructional modalities is the first step towards designing virtual experiences which encourage students to engage more fully with their learning. I sought to explore the ways that students engage in science instruction at a virtual high school and to explore whether there is systematic variation in how students engage by learning modality. Additionally, this study will examine the association between students’ engagement in virtual science instruction and two educational outcomes: 1) their persistence at their current school and 2) their final course grade. Engagement is closely linked with dropout and persistence in face-to-face schools (Finn, 1989; Reschly & Christenson, 2012). A deeper understanding of student engagement in virtual spaces my hold similar promise for understanding the roots of persistence in these types of learning environments. LITERATURE REVIEW WHAT IS ENGAGEMENT? Student engagement is often thought of as the active embodiment of students’ motivation; yet, it has long been hampered by the lack of a clear definition. In the Handbook of Research on Student Engagement Christenson et al. (2012) offers the following definition based on a 2 synthesis of studies representing decades of research from numerous disciplines and many scholars. Student engagement refers to the student’s active participation in academic and co- curricular or school-related activities, and commitment to educational goals and learning. Engaged students find learning meaningful, and are invested in their learning and future. It is a multidimensional construct that consists of behavioral, cognitive, and affective subtypes. Student engagement drives learning; requires energy and effort; is affected by multiple contextual influences; and can be achieved for all learners (p. 816–817). Research shows that how one engages in academic activities may depend on individual characteristics (gender, ethnicity, SES; Fredricks et al., 2018; Shernoff & Schmidt, 2008; Tomaszewski et al., 2020), one’s relationships and interactions with others such as teachers, peers, and parents, and the context of their learning environment (Skinner & Pitzer, 2012). Models of engagement vary in the number of dimensions they include, and in how those dimensions are defined. 
However, multiple reviews of the engagement literature demonstrate that the behavioral, affective and cognitive dimensions are the most commonly identified and studied engagement dimensions (Christenson et al., 2012; Fredricks et al., 2004; Fredricks & McColskey, 2012). Additionally, research in science has regularly framed engagement using these three dimensions (Sinatra et al., 2015; Vedder-Weiss, 2017). Conceptualizing engagement as consisting of these three dimensions is productive because previous work has shown that students can be simultaneously highly engaged in some of these dimensions, but not in others. These unique patterns of engagement are differentially related to important educational outcomes such as academic achievement, academic integrity, and educational aspirations (Connor & Pope, 3 2013; Wang & Peck, 2013). In keeping with past theoretical traditions, and taking note of the associations between patterns of engagement and important academic outcomes, this study conceptualized engagement as consisting of affective, behavioral, and cognitive components. DIMENSIONS OF ENGAGEMENT Evidence suggests that positive activating emotions, such as enjoyment, hope, and pride, are positively correlated with achievement, and negative deactivating emotions, such as boredom and hopelessness, impair academic performance (Pekrun, 2006; Pekrun et al., 2002). In alignment with previous work this study conceptualized affective engagement as consisting of positive emotions such as interest and enjoyment (Fredricks et al., 2004; Schmidt et al., 2018; Skinner et al. 2009).1 Importantly, studies in virtual settings have commonly conceptualized affective engagement in ways similar to their face-to-face counterparts (Borup et al., 2014; Dixson, 2015). It is also important to acknowledge that some emotions which may be construed as negative, such as frustration, may sometimes manifest themselves in students who are highly engaged (Baker et al., 2010). For example, a student may become frustrated working on a difficult problem, or angry when learning about global warming, while remaining highly engaged in learning. Although emotions are considered an important variable in the quality of student learning, research on emotions in science classrooms is lacking (Fortus, 2014; Wickman, 2006). A better understanding of the role that emotions play in predicting important academic outcomes will enable policymakers, teachers, and school administrators to enact practices which are supportive of students’ affective engagement. 1 Note that some models of engagement do not include an affective engagement dimension (Finn, 1989; Martin, 2007), and other researchers do not view emotions as a distinct dimension of engagement, but as a predictor of other forms of engagement (Pekrun & Linnenbrink-Garcia, 2012). 4 Studies have examined behavioral engagement at a variety of levels: from school level engagement to momentary engagement during learning activities (Appleton et al., 2006; Gobert et al., 2015; Schmidt et al., 2018). The purpose of this study was to describe the way students engaged during classroom instruction in two common instructional modalities, and so conceptualized behavioral engagement as the concentration, effort, and work students put into their academic activities during synchronous and asynchronous instruction (Heddy et al., 2014; Skinner et al., 2009). 
Behavioral engagement is important because of its role in predicting numerous academic and social outcomes such as academic coping, resilient mindsets, increased learning, and higher achievement (Appleton et al., 2008; Klem & Connell, 2004; Skinner & Pitzer, 2012). Sinatra et al. (2015) found strong links between behavioral engagement and achievement outcomes in science; however, these achievement outcomes typically relied on low- level processing tasks, such as simple recall. The researchers suggested that students who engage along multiple dimensions of engagement may be more successful when faced with complex achievement tasks in science domains, underscoring the need to conceptualize engagement as a multidimensional construct. The conceptualization of cognitive engagement is particularly rife with variation in the way researchers choose to define it; however, Fredricks et al. (2004) identified two distinct definitions of cognitive engagement. One approach focuses on the degree to which students perceive their academic activities as valuable and important. Research in the engagement field commonly includes a value component (Appleton et al., 2006; Connor & Pope, 2013; Schmidt et al., 2018). The other approach emphasizes metacognition and self-regulatory strategies (Ainley, 2012; Cleary & Zimmerman, 2012). Research examining engagement in virtual environments has made the distinction between these two types of cognitive engagement in ways similar to 5 face-to-face learning (Borup et al, 2014; Dixson, 2015). Both cognitive engagement definitions suggest that the way students cognitively engage with learning situations is variable, and based on context. However, it is possible that some students may engage highly with self-regulatory strategies, but do not value the content they are learning. Therefore, cognitive engagement was defined as two separate dimensions in this study: cognitive-self-regulatory, and cognitive-value. The cognitive-self-regulatory dimension was conceptualized as the cognitive processes purposefully employed by students during learning (Fredricks et al., 2016; Pintrich & De Groot, 1990). The cognitive-value dimension was conceptualized similar to Schmidt et al. (2018) and Connor and Pope (2013) as the degree to which students perceive their academic activities as valuable and important to themselves and their future goals. Cognitive engagement is linked to a variety of important educational outcomes. Increased levels of cognitive engagement are associated with higher achievement (Greene et al., 2004) and increased motivation (Guthrie et al., 2004). Additionally, after reviewing the literature on cognitive engagement, Green (2015) found that, although we know less about the predictors and outcomes of cognitive engagement in science compared with other domains, cognitive engagement was indeed associated with increased achievement in science classes. A more complete knowledge of cognitive engagement will give educational professionals better ways to cultivate cognitive engagement in their classrooms and give researchers deeper insight into how cognitive engagement encourages academic outcomes. Although these dimensions of engagement are not wholly distinct from one another in a variety of domains including science, researchers commonly conceptualize and measure dimensions of engagement as separate, but related, constructs (Sinatra et al., 2015; Wickman, 2006). 
Multiple research studies support the duality of the claim that dimensions of engagement 6 are related yet distinct. When multiple dimensions of engagement were assessed separately researchers found moderate correlations between dimensions ranging from 0.4 – 0.55 and that modeling these dimensions separately produced more accurate representations of the data (Appleton et al., 2006; Reeve & Tseng, 2011; Schmidt et al., 2018; Skinner et al., 2009). Other studies have suggested that students might be engaged in one dimension, but not engaged in other dimensions (Renninger & Bachrach, 2015; Sinatra et al., 2015). Modeling and describing multidimensional frameworks of engagement is needed to more thoroughly understand how different dimensions contribute to important academic outcomes. For instance, some studies have found that behavioral engagement and cognitive engagement are both associated with higher academic achievement (Fredricks & McColskey 2012; Greene, 2015). A PERSON-ORIENTED APPROACH TO ENGAGEMENT Broadly, person-oriented approaches are those which consider multiple variables functioning together and simultaneously within individuals (Bergman & Trost, 2006). The common conceptualization of engagement as multidimensional suggests that a person-oriented approach may provide valuable insights. Person-oriented approaches allow researchers to describe how students experience multiple dimensions of engagement differently or similarly and simultaneously. This approach is an alternative to variable-oriented modeling approaches, which require researchers to create an aggregate measure of engagement, or to describe one dimension of engagement while controlling for the others. Indeed, recent research in the motivation and engagement fields have explored how person-oriented approaches can give unique insights into how students engage (Conner & Pope, 2013; Schmidt et al., 2018; Wormington & Linnenbrink-Garcia, 2017). To summarize, person-oriented approaches give researchers a more holistic view of student engagement by identifying particular constellations of 7 engagement and exploring how they are associated with predictors and outcomes, rather than testing for effects of individual dimensions separately. One of the first studies to explore profiles of engagement using a person-oriented approach was Connor and Pope (2013). They theorized a taxonomy consisting of seven possible profiles defined by varying levels of engagement in the affective, behavioral, and cognitive dimensions, providing empirical support for three of these profiles using cluster analysis. Subsequent studies have continued to find evidence of the multidimensionality of engagement in traditional face-to-face classrooms using a variety of engagement frameworks. Salmela-Aro et al. (2016) examined engagement through the concept of burnout, although separate dimensions of engagement were not specified. Van Rooij et al. (2017) studied how school level engagement profiles predicted academic success in college. Finally, Schmidt et al., (2018) constructed momentary engagement profiles in face-to-face high school classrooms using cluster analysis. 
Although these studies provide evidence to suggest the different ways students might engage, few studies have explored this variability in science classrooms (Schmidt et al., 2018 being a notable exception), none explored student engagement in virtual learning environments, only some used the most recent statistical techniques (Magidson and Vermunt (2002) note that latent profile analysis is now considered the superior choice for person-oriented approaches compared with traditional cluster analysis), and just one created profiles at a grain size smaller than the school level. In this study I addressed unique questions regarding student engagement by using the most recent statistical techniques to understand how virtual high school students engage during classroom instruction. 8 THEORIZED AFFORDANCES AND CONSTRAINTS OF SYNCHRONOUS AND ASYNCHRONOUS INSTRUCTION Students attending virtual high schools generally experience a mix of synchronous and asynchronous instruction in their courses. Synchronous instruction takes place in real time using technologies such as video conferencing, instant messaging, tele-conferencing, or other means of live communication (Watts, 2016). Asynchronous instruction is characterized by time independent communication between participants such as text-based lessons, discussion forums, email, collaborative virtual spaces, and virtual bulletin boards. Learners are free to access the educational content anywhere, at any time, and complete assignments and lessons when convenient (Watts, 2016). Each instructional modality has unique affordances and challenges that may differentially support students’ engagement across a variety of dimensions when learning. Theories of motivation identify relatedness as a basic human need necessary for individuals to feel motivated and engaged (Ryan & Deci, 2000). Synchronous instruction may have greater affordances than asynchronous instruction for supporting students’ engagement through relatedness because it may allow students to experience higher quality interpersonal interactions in real time. For example, during synchronous sessions students can work together in small groups to complete an assignment, providing an opportunity to discuss academic topics and have more casual conversations, both of which might enhance perceptions of relatedness. Studies suggest that synchronous instruction can result in rich interpersonal communication, increased sense of belonging, and increased social presence (Falloon, 2011; Giesbers et al., 2014; Nippard & Murphy, 2007). Multiple studies have found that including synchronous interactions in online courses strengthens social presence and interpersonal relationships (Gunawardena & 9 Zittle, 1997; Yamagata-Lynch, 2014) and that instantaneous communication with teachers and other students is advantageous compared with asynchronous learning (Lin & Gao, 2020; Watts, 2016). Conversely, multiple studies found that asynchronous instruction leads to lower feelings of social presence, relatedness, or sense of belonging (Francescucci & Rohani, 2019; Hrastinski, 2008; Lin & Gao, 2020). Students who lack feelings of relatedness or social presence are more likely to become less motivated and disengage from the course (Anderman, 1999; Skinner & Furrer, 2003). Autonomy is another basic need that humans require to experience motivation and engagement (Ryan & Deci, 2000). In education, higher feelings of student autonomy are linked with increased student engagement (Reeve et al., 2004; Vasalampi et al., 2009). 
Teachers commonly provide autonomy support by offering choice to students (Flowerday & Schraw, 2000). Perhaps the clearest affordance of asynchronous learning is that learners may choose when to complete assignments, where to access the content, and the pace at which to move through lessons (Pang & Jen, 2018). For instance, a student who works a part-time job during weekday afternoons to help pay household bills would benefit from the flexibility afforded by asynchronous instruction. While there are certainly multiple ways that synchronous instruction might be structured to support students’ autonomy, at a very broad level this instructional modality typically restricts when, where, and how students can participate in learning activities. Students who are forced to attend synchronous sessions at inopportune times or places may experience lower engagement due to tiredness, mental fatigue, or distractions at home or in a public place (Olson & McCracken, 2015). During asynchronous learning students may explore additional resources or topics of interest without the time constraints and pressures of a synchronous agenda (Lin & Gao, 2020). Students can move through an asynchronous lesson as quickly or slowly as they need to stay engaged. Since asynchronous instruction does not require immediate responses, studies have found that learners think more deeply about questions, reflect more, and consider discussion prompts to a greater degree prior to expressing their ideas compared to synchronous instruction (Brierton et al., 2016; Falloon, 2011; Lowenthal et al., 2017). Additionally, some asynchronous learning experiences lead to more self-directed learning behaviors and cognitive engagement (Lin & Gao, 2020). Since communication is dynamic during synchronous instruction, discussions about the relevance, meaningfulness, or usefulness of the content can be more tailored and nuanced compared with asynchronous learning. For example, during a lesson on aquatic ecosystems in a Biology course a student might ask why a local pond has no plants or fish. The answer might be that runoff from a local manufacturing site has turned the water too acidic to support life, sparking a discussion about acids and bases, connecting both Biology and Chemistry concepts within the context of aquatic health. On the other hand, asynchronous instruction allows students time to reflect, think, and absorb content, which may allow them to more deeply consider the relevance of the information, and subsequently engage at a higher level compared with synchronous instruction. For instance, after exploring a lesson on radioactive half-life, a student might pause their formal learning to explore more about how this topic impacts a local nuclear power plant by watching some videos about the difficulties posed by storing nuclear waste. Making content relevant is important because academic content that is more applicable, useful, or meaningful to students for purposes outside of school is linked with higher student engagement (Anderman et al., 2011; Schmidt et al., 2018; Wang & Eccles, 2013). Teacher enthusiasm is often conceptualized as teachers expressing positive emotions and teaching with high energy in the classroom (Barsade & Gibson, 2007; Kunter et al., 2011; Zhang, 2014). Studies show that enthusiastic teaching results in students who are more engaged affectively, behaviorally, and cognitively (Patrick et al., 2000; Zhang, 2014). 
Synchronous instruction allows students to hear voice inflection and see facial expressions and body language, both of which are important for expressing high energy and enthusiasm. However, teachers may find it more difficult to express their enthusiasm for content during asynchronous instruction since there is typically substantially less student-teacher interaction in this modality. Students at most virtual K12 schools (including this one) encounter the same academic content both synchronously and asynchronously. Few studies describe the student experience when participating in both modalities. It may be the case that students engage differently in each modality when they are given the opportunity to participate in both. Describing how students broadly engage in each of these modalities is an important first step to understanding the differences in the way students experience synchronous and asynchronous instruction. Understanding how students engage in each of these modalities may help teachers design virtual learning experiences to encourage engagement and allow researchers to better understand how engagement operates in online courses. ENGAGEMENT AS A FACILITATOR FOR RETENTION AND GRADES Given the high attrition rates observed in K-12 virtual schools, examining student retention is an important phenomenon to consider. In fact, virtual schools suffer from substantially lower graduation rates than traditional public schools (Atwall et al., 2020). High engagement may be one way to increase low retention. Multiple decades of research show that 12 higher student engagement is associated with lower dropout rates of students in traditional school settings (Finn, 1989; Reschly & Christenson, 2012; Rumberger & Rotermund, 2012). Little research exists examining the association between engagement and retention in virtual high school settings; however, researchers have examined the association between student engagement and retention in other online environments. In a review of the literature around student retention in higher education online learning courses Muljana (2019) found that students were more likely to remain enrolled when their instructors employed strategies to encourage engagement. For example, in courses where instructors helped foster a sense belonging among students and provided adequate feedback, retention rates were high. Additionally, recent work has examined the association between engagement and retention in Massive Open Online Courses (MOOCs). Xiong et al. (2015) and Chaw and Tang (2019) found that students who were more highly engaged in a MOOC were significantly more likely to finish the course. Given the evidence that suggests engagement and retention are closely linked constructs in a variety of settings, it seems likely they are associated in virtual high school settings also. Course grades are one way that achievement in high school science courses is commonly operationalized. A large body of research connects higher engagement with higher achievement in face-to-face science courses. Grabau & Ma (2017) found that multiple measures of science engagement significantly predicted higher scores on a science achievement test. Additionally, multiple studies have found that increased behavioral, cognitive, and affective engagement predict higher science achievement (Mo, 2008; Reschly et al., 2008; Ucar & Sungur, 2017). 
CURRENT STUDY In this study I described different ways that students engaged during virtual high school science instruction by identifying a set of “engagement profiles” consisting of affective, 13 behavioral, cognitive-self-regulatory, and cognitive-value dimensions. Since it is likely that membership in these profiles is fluid, partly based on the context of the learning environment, this study explored the extent to which instructional modality (synchronous vs asynchronous) predicted engagement profile membership and subsequently the extent to which engagement profiles predicted students’ over-summer retention and final course grades in science. Thus, I examined three research questions: 1. What profiles of engagement do students display in virtual high school science courses? 2. To what extent is instructional modality (synchronous vs. asynchronous) associated with students’ situational engagement profiles? 3. To what extent are the ways students engage differentially related to over-summer retention and final course grade in science? METHODS STUDY PROCEDURE Data were collected through brief, repeated online surveys administered at the end of daily lessons in which students were asked to report on multiple dimensions of their engagement during that day’s learning activities. Data were collected for five weeks during the spring semester, with the timing of data collection planned to avoid standardized testing and end-of semester activities. Teachers participating in the study collected students’ reports of their engagement as part of their normal classroom instruction, and consent was sought to use these data for research purposes. Prior to the study, all procedures were approved by the human subjects review boards at the university that conducted the research and the school district in which the study was conducted. Ethical safeguards were respected in the treatment of all research data as described in 14 the American Psychological Association’s Ethical Principles of Psychologists and Code of Conduct (APA, 2017). Permission was obtained from the school and teachers to conduct this study. All students (and guardians for those under age 18) whose data was used in this study provided active consent. Teachers and a school administrator emailed consent letters to all parents and assent letters to all students prior to the study launch, and reminders to complete consent and assent forms were sent throughout the study (Appendix A). Teachers also emailed parents and students a recruitment letter describing the study and prompting parents and students to complete the consent and assent forms (Appendix B). Return of consent and assent forms was incentivized by giving away several Amazon gift cards via a random drawing at the conclusion of the study. Students were given a total of ten surveys measuring their perceptions of engagement over the course of five weeks. One time each week students completed a survey at the end of a synchronous lesson and at the end of an asynchronous lesson. Students were expected and encouraged to both attend the daily synchronous sessions and to complete the asynchronous lessons. Data were gathered from synchronous sessions on the same days that data was gathered from asynchronous lessons. All surveys contained the same questions asking students to reflect on their behavioral, affective, cognitive-self-regulatory, and cognitive-value engagement over the course of the preceding lesson. 
Teachers posted links to surveys at the end of selected asynchronous text-based lessons. Lessons were selected in consultation with teachers and had features consistent with most lessons in the course. That is, the selected lessons were “business as usual” in the context of the course. At the end of synchronous lessons teachers provided students a link to the survey and gave students time to complete it in class. Additionally, teachers sent out reminder emails to all students on survey days. SETTING This study took place at a virtual high school in Michigan. The high school serves students in grades 9-12, enrolls approximately 1,200 students, and has existed for 6 years. All students in attendance at this high school are Michigan residents, with many students residing in the larger urban centers of the state. The school was in its fourth year of implementing NGSS, and curriculum at this school was mapped to NGSS prior to the start of the school year. Study participants were recruited from five science courses: Physical Science, Chemistry, Earth Science, Veterinary Science, and Forensic Science. These courses represent all commonly taken science courses at this school, with the notable exception of Biology. Curriculum and pacing were standardized across sections of the same course. Major concepts within each course were organized into units, and units were broken down into daily lessons. Each lesson featured an asynchronous component in which students were expected to independently access and review a variety of text-based resources including diagrams, photos, figures, tables, graphs, and short activities. Students had the freedom to complete the asynchronous lesson at any point during, or after, the day it was assigned. Lessons also included a synchronous component, which featured a 45-minute video conferencing lesson with teachers and the other students in the class. For each synchronous and asynchronous lesson component in which data were collected, teachers reported the primary instructional activity during that portion of the lesson. Teachers reported using a variety of instructional activities during synchronous and asynchronous sessions, though certain activities were more common within each modality. The most common primary instructional activity reported during asynchronous instruction was reading (45%); however, asynchronous sessions included a variety of other reported primary instructional activities, including videos, completing review notes, quizzes, and labs, which together accounted for 55% of the total observations. Each day students attended a synchronous session and completed a complementary asynchronous lesson, providing students two opportunities, using different modalities, to encounter the same topic. The most common type of primary instructional activity reported during synchronous instruction was lecture (38%); however, teachers also reported practice questions, guided notes, labs or simulations, discussions, and small group work as primary activities, which together accounted for 62% of the instructional activity observations. See Tables C1 – C7 in Appendix C for a complete summary of the academic content and instructional activities during each day of data collection. PARTICIPANTS Five science teachers agreed to participate in the study, and the study was open to all students enrolled in a course with a participating teacher. 
A total of 124 students consented to have their data used for research purposes out of a possible 768 students, resulting in a 16.1% participation rate, which is in line with what the school typically experiences when surveying students. The curriculum and pacing were standardized across sections of the same course. All high school grade levels (9th – 12th) were represented in these courses. See Table 1 for full demographic information. MEASURES DIMENSIONS OF ENGAGEMENT Student self-perceptions of engagement in affective, behavioral, cognitive-self-regulatory, and cognitive-value dimensions were measured using modified versions of previously validated surveys. All survey items measuring engagement were on a 5-point Likert-type scale from 1 (Strongly disagree) to 5 (Strongly agree). To assess affective and behavioral engagement I used a modified version of Skinner et al.’s (2009) questionnaire. Affective engagement was measured with four survey items. Sample items include “I felt interested in what I was working on during my lesson today” and “The lesson today was fun.” Behavioral engagement was measured with five survey items. Sample items include “I tried hard to do well during this lesson” and “I concentrated while doing my lesson today.” To measure cognitive-self-regulatory engagement I utilized four items based on Pintrich and De Groot (1990) and Fredricks et al. (2016). Sample items included “I asked myself questions throughout the lesson today to make sure I understood the material” and “In my lesson today, I tried to connect what I was learning to things I have learned before.” Finally, to assess cognitive-value engagement I used four survey items based on Voelkl (1996), Schmidt et al. (2018), and Connor and Pope (2013). Sample items included “This lesson was important to me” and “This lesson was important to my future.” Items corresponding to each dimension of engagement were subjected to confirmatory factor analysis to construct composite indicators. This process is described in the preliminary results section. See Appendix D for full measures. PREDICTORS OF ENGAGEMENT All courses consisted of both a synchronous and an asynchronous component. Each student report was identified as coming from either a synchronous or asynchronous learning experience. This instructional modality indicator was explored as a predictor of students’ engagement profiles. Individual characteristics often have important implications for how students engage in science (Skinner & Pitzer, 2012). These characteristics include gender, ethnicity, and socio-economic status (SES). All variables were obtained from school records and were operationalized as follows: gender (male or female; this school does not have a non-binary indicator), ethnicity (White, Asian, Black, Latinx, Native American, or Alaska Native), and SES (qualification for free or reduced lunch). In this study, underrepresented minority in STEM (URM) status was defined as belonging to an ethnicity other than White or Asian. OUTCOME MEASURES I used engagement profiles to predict two outcome variables, final course grade and over-summer retention, while controlling for gender, URM status, SES, and instructional modality. Grades and over-summer retention data were obtained from school records. Grades were recorded on a 0 – 100-point scale. 
Over-summer retention was operationalized as the students who attended the same school during spring semester 2021 and fall semester 2021 (students who were retained to the next school semester), which is not the same as dropout or graduation rates.2 Students who were not retained may have enrolled elsewhere, and their departure from school may have many explanations. However, examining this particular persistence indicator allowed the consideration of whether students who remain in this unique learning environment engage in their learning experiences in different ways from those who (for a number of unknown reasons) withdraw. 2 Since all 12th graders in this study graduated at the conclusion of the spring semester, they were excluded from analyses which involved over-summer retention. ANALYSES Latent Profile Analysis (LPA) was used to identify student engagement profiles consisting of affective, behavioral, cognitive-self-regulatory, and cognitive-value dimensions modeled as observed variables using factor scores.3 3 Indicators were initially modeled as latent variables; however, model identification issues necessitated modeling indicators as observed variables using factor scores. The profile identification process involves examining several different models with various assumptions about the means, variances, and covariances of the indicators used to construct the profiles. Fit statistics, theoretical interpretability, and considerations of parsimony are then used to select the profile solution. In this study I considered six different model types:
• Model 1: Varying means, equal variances, and covariances fixed to 0
• Model 2: Varying means, equal variances, and equal covariances
• Model 3: Varying means, varying variances, and covariances fixed to 0
• Model 4: Varying means, varying variances, and equal covariances
• Model 5: Varying means, equal variances, and varying covariances
• Model 6: Varying means, varying variances, and varying covariances
For each of these models I examined solutions of 2 to 10 profiles, for a total of 54 models. If a model did not converge, I increased the number of starts and iterations and reran the model. No further analysis was completed on models which did not converge after increasing starts and iterations. To account for the nesting of observations within students I used the TYPE = COMPLEX command in Mplus to adjust the standard errors to match the clustering of observations within students. To select a final model solution I followed the three steps suggested by Morin et al. (2016) and Nylund et al. (2007). I first used fit statistics (primarily BIC) to identify several candidate solutions (Nylund et al., 2007). Next, I considered the theoretical interpretability of the candidate solutions. This included graphing the raw and z-scored means of each profile, and considering profile size, the variance/covariance structure of the profiles, and the theoretical meaning of each profile. In the third step, I considered both entropy and posterior classification percentages to quantify how well the proposed model solution did at classifying observations into the correct profile. After selecting an optimal profile solution, instructional modality and covariates were added as predictors of profile membership using the 3-step method (Asparouhov & Muthén, 2014). 
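Before turning to those 3-step analyses, the enumeration step itself can be made concrete with a minimal Mplus input sketch for one candidate specification (Model 2 with five profiles). The data file name, variable names, and number of random starts below are illustrative assumptions rather than the study's actual files or syntax.

TITLE:     Candidate LPA - Model 2, five profiles (varying means,
           equal variances, equal covariances across profiles)
DATA:      FILE = engagement_scores.dat;    ! assumed file of factor scores
VARIABLE:  NAMES ARE id affect behav cogsr cogval;
           USEVARIABLES ARE affect behav cogsr cogval;
           CLASSES = c(5);
           CLUSTER IS id;          ! repeated reports nested within students
           ! predictors are added later via, e.g., AUXILIARY = modality (R3STEP);
ANALYSIS:  TYPE = MIXTURE COMPLEX; ! mixture model with cluster-robust SEs
           STARTS = 1000 200;      ! random starts to avoid local solutions
MODEL:     %OVERALL%
           ! indicator means vary across profiles by default and variances are
           ! held equal across profiles; the WITH statements estimate indicator
           ! covariances that are likewise held equal across profiles (Model 2)
           affect WITH behav cogsr cogval;
           behav WITH cogsr cogval;
           cogsr WITH cogval;

Re-specifying the variances or the WITH statements within profile-specific sections (%c#1%, %c#2%, and so on) would yield the other model types listed above, and the BIC used for enumeration is read from the standard model fit output of each run.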
The 3-step method uses separate multinomial logistic regressions in a pairwise fashion to compare all possible profile combinations and has the advantage of reducing the likelihood of profile membership shift when adding predictors to the model compared with other methods. For example, one logistic regression was used to compare profile 1 to profile 2. Another logistic regression was used to compare profile 1 with profile 3, and so forth. Finally, profile membership was used to predict the final grade that students earned in their science course, along with whether they returned to the same school for the fall 2021 semester. Outcomes of profile membership were modeled using the manual ML 3-step approach to include instructional modality, gender, URM, and SES as covariates, as recommended by Nylund-Gibson et al. (2019). The manual ML 3-step method allows the researcher to control for covariates, more accurately represent the uncertainty of profile membership, and investigate a broader array of distal outcomes compared with automated approaches in Mplus (Asparouhov & Muthén, 2014; Nylund-Gibson et al., 2019). Appendix E gives more information about finite mixture models and latent profile analysis. RESULTS PRELIMINARY RESULTS Students had the opportunity to answer five synchronous and five asynchronous surveys throughout the course of the study. Students who consented to their data being used for research purposes generated a total of 493 survey responses (336, or 68%, asynchronous survey responses; 157, or 32%, synchronous survey responses). This corresponds to an average of 3.98 surveys per student. Tables 2 through 6 provide overall descriptive statistics and within-person and between-person correlations for all study variables. Correlations at both levels mostly followed expected patterns, both among engagement dimensions and among indicators within each engagement dimension. The exception was that two reverse-coded items within the cognitive-value dimension (“The lesson today was useless to me.” and “The lesson today was a waste of time.”) exhibited low within-person correlations with the remaining cognitive-value items. Additionally, both Krosnick and Presser (2010) and Hughes (2009) suggest reverse-coded items may introduce measurement error, and express caution about their use in applied research. Therefore, these items were removed from subsequent analyses. Prior to latent profile enumeration, a CFA with adjusted standard errors to account for the multi-level nature of the data (i.e., multiple observations within the same student) was performed to confirm the theorized four-factor structure of student engagement. Intraclass correlation coefficients indicated that between 66% and 77% of the variation in the four dimensions of engagement was due to between-person differences, and 0% to 5% was due to enrollment in different science classes. The substantial amount of variance explained at the between-person level necessitated adjusting the standard errors of the CFA. The loading of one item in each factor was fixed to one; other loadings were estimated freely. The model fit the data well (χ2(113) = 283.623, CFI = 0.961, RMSEA = 0.055). Standardized loadings were between 0.654 and 0.90. Measurement invariance tests by instructional modality, gender, underrepresented minority status in STEM, and SES all achieved strict invariance between groups. 
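As an illustration of how such a test might be specified, a minimal Mplus multiple-group CFA sketch, grouping by instructional modality, is shown below. The item names, group coding, and data file are assumptions for illustration rather than the study's actual syntax, and the strict-invariance models reported above would add equality constraints on the item residual variances beyond this baseline.

DATA:      FILE = engagement_items.dat;     ! assumed item-level data file
VARIABLE:  NAMES ARE id modality aff1-aff4 beh1-beh5 csr1-csr4 val1-val4;
           USEVARIABLES ARE aff1-aff4 beh1-beh5 csr1-csr4 val1-val4;
           GROUPING IS modality (0 = async 1 = sync);   ! assumed coding
           CLUSTER IS id;           ! repeated reports nested within students
ANALYSIS:  TYPE = COMPLEX;          ! cluster-robust standard errors
MODEL:     ! four-factor engagement structure; in multiple-group CFA Mplus
           ! holds factor loadings and item intercepts equal across groups
           ! by default, and strict invariance additionally requires holding
           ! item residual variances equal across groups
           affect BY aff1-aff4;
           behav BY beh1-beh5;
           cogsr BY csr1-csr4;
           cogval BY val1-val4;
OUTPUT:    STDYX;

Nested invariance models can then be compared, for example with chi-square difference tests or changes in CFI and RMSEA, before retaining the strictest tenable level.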
Strict invariance is the highest degree of invariance possible and means that all engagement survey items measured the same dimensions of engagement in each group with the same degree of precision. A series of analyses was performed to check for differences in response rates based on gender, ethnicity, socio-economic status, and final course grades, and for differences in outcome variables based on gender, ethnicity, and socio-economic status. Results of these analyses largely supported the assumption that different groups did not respond to surveys at different rates, and that there were not significant differences between groups in their final course grade or their enrollment status, with three exceptions. White and Asian students responded to a higher number of asynchronous surveys compared with underrepresented minorities. Additionally, students with higher achievement had higher response rates to the end-of-class surveys, on average. These biases will be considered when interpreting results. Finally, students from a higher SES background earned higher final science grades compared to their lower SES peers. To account for this systematic difference in science grades, SES was included as a covariate when examining outcomes of profile membership. See Appendix F for more detailed missing data analysis information. PROFILES OF STUDENT ENGAGEMENT To describe engagement in virtual high school science courses, I used Latent Profile Analysis and the statistical program Mplus version 8.6 (Muthén & Muthén, 1998–2020) to identify engagement profiles with affective, behavioral, cognitive-self-regulatory, and cognitive-value indicators. For all analyses, standard errors were adjusted to account for the nesting of responses within students. Of the six model types tested, only two models (Model 1: varying means, equal variances, and covariances fixed to 0; Model 2: varying means, equal variances, and equal covariances) resulted in meaningful profile solutions that converged and were interpretable. For both Models 1 and 2, the four-, five-, and six-profile solutions were identified as candidates for a final solution. The five-profile solution from Model 2 was selected based on fit indices (primarily BIC), theoretical interpretability, and parsimony (Masyn, 2013; Morin et al., 2016). See Table 7 
Profiles were named using guidelines suggested by Wormington and Linnenbrink-Garcia (2017) and an engagement taxonomy proposed by Conner and Pope (2013). Engagement dimensions with means below 2.5 were considered low engagement, those between 2.5 and 4 were considered moderate engagement, and those above 4 were considered high engagement. The Moderate all profile was characterized by a concordant pattern of engagement, whereby all dimensions were observed at approximately similar levels – in this case moderate (affective M = 3.09, behavioral M = 3.097, cognitive-self-regulatory M = 2.642, cognitive-value M = 2.839). A total of 78 observations, or 16% of the total sample, were classified in this profile. The Mental profile (n = 10, 2%) contained observations with moderately low levels of affective (M = 2.24) and cognitive-value engagement (M = 2.164), and low levels of behavioral (M = 1.723) and cognitive-self-regulatory engagement (M = 1.455), suggesting a discordant pattern of engagement (at least one dimension has a substantially different mean value compared with the other dimensions in the same profile). However, large standard errors around the means of the engagement dimensions suggest that this characterization as discordant may, or may not, be observed consistently in larger samples. Perhaps most striking were the extremely low level of behavioral engagement and the fact that the mean levels of all dimensions were the lowest in the entire sample. The Moderate/High all profile was the second largest profile, containing 191 observations, or 39% of the sample. This concordant profile was characterized by high behavioral engagement (M = 3.996), moderately high affective engagement (M = 3.872), and moderate cognitive-self-regulatory (M = 3.469) and cognitive-value engagement (M = 3.57). The next profile, Lower Self-regulatory, was one of the smaller profiles in the study (n = 17, or 3%), but was also one of the two discordant profiles identified. The 95% confidence interval of cognitive-self-regulatory engagement did not overlap with any other dimension, suggesting discordance among engagement dimensions in this profile. More specifically, this profile consisted of high affective (M = 3.969) and behavioral engagement (M = 3.932), moderate cognitive-value engagement (M = 3.39), and moderately low cognitive-self-regulatory engagement (M = 2.285). The High all profile was the largest profile in the study, containing 197 observations, or 40% of the sample. Means of all dimensions of engagement in this profile were well above four on a 5-point Likert scale, suggesting observations in this profile represented quite high concordant engagement. Specifically, the high all profile contained observations displaying high affective (M = 4.559), behavioral (M = 4.764), cognitive-self-regulatory (M = 4.261), and cognitive-value engagement (M = 4.382). Variances of all dimensions of engagement, which were constant across profiles, indicated significant within-profile variation in all dimensions of engagement (affective: σ² = 0.308, SE = 0.043; behavioral: σ² = 0.068, SE = 0.009; cognitive-self-regulatory: σ² = 0.319, SE = 0.042; cognitive-value: σ² = 0.478, SE = 0.058). Correlations between engagement dimensions within each profile were significant and ranged between 0.462 and 0.832, similar to the range of correlations in the overall sample (0.59 to 0.81).
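As a concrete illustration of the naming guidelines described above, the short Python sketch below applies the low/moderate/high thresholds to the rounded profile means from Table 8. It is illustrative only: the actual naming also weighed confidence intervals and interpretability, so means falling near a band boundary were judged qualitatively.

```python
# Illustrative only: apply the naming thresholds (low < 2.5, moderate 2.5-4,
# high > 4) to the rounded dimension means reported in Table 8.
profile_means = {
    "Moderate All":          {"affective": 3.09, "behavioral": 3.10, "cog_self_reg": 2.64, "cog_value": 2.84},
    "Mental":                {"affective": 2.24, "behavioral": 1.72, "cog_self_reg": 1.46, "cog_value": 2.16},
    "Moderate/High All":     {"affective": 3.87, "behavioral": 4.00, "cog_self_reg": 3.47, "cog_value": 3.57},
    "Lower Self-regulatory": {"affective": 3.97, "behavioral": 3.93, "cog_self_reg": 2.29, "cog_value": 3.39},
    "High All":              {"affective": 4.56, "behavioral": 4.76, "cog_self_reg": 4.26, "cog_value": 4.38},
}

def label(mean: float) -> str:
    """Translate a dimension mean into the low/moderate/high naming bands."""
    if mean < 2.5:
        return "low"
    return "moderate" if mean <= 4 else "high"

for profile, dims in profile_means.items():
    print(profile, {dim: label(m) for dim, m in dims.items()})
```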
INSTRUCTIONAL MODALITY AS A PREDICTOR OF ENGAGEMENT

Instructional modality, gender, URM, and SES were added to the model using the automated (R3STEP) 3-step procedure in Mplus (Asparouhov & Muthén, 2014). A series of multinomial logistic regressions between all profile pairs broadly revealed that students were more likely to engage at higher levels in all dimensions during synchronous learning, and that students from lower socio-economic backgrounds were more likely to experience lower engagement compared with their higher SES peers. Specifically, when students participated in synchronous instruction, they were more likely to experience engagement characteristic of the high all profile compared with both the moderate all profile (β = 0.89, p < 0.05, OR = 2.44) and the moderate/high all profile (β = 0.70, p < 0.01, OR = 2.01). Students who were eligible for free or reduced lunch had a higher likelihood of experiencing engagement characteristic of the moderate all profile compared with both the moderate/high all profile (β = -1.38, p < 0.05, OR = 0.25) and the high all profile (β = -1.57, p < 0.05, OR = 0.21). Notably, the two discordant profiles (mental and lower self-regulatory) were not predicted by any included variable, possibly because of their small size and the resulting low power. Additionally, gender and URM were not predictive of profile membership. All predictor variables and covariates were included in the regression models; therefore, the regression coefficients may be interpreted as the independent effect of each predictor/covariate. Results are reported in odds ratios to assist with interpretability. Table 9 contains odds ratios and standard errors for all predictor variables and pairwise comparisons.

ENGAGEMENT AS A PREDICTOR OF GRADES AND RETENTION

To evaluate mean differences in final science grade and retention proportion between profiles, I employed the manual 3-step approach in Mplus. This method allows the researcher to examine mean differences in outcome variables between profiles while also controlling for covariates. Outcomes were evaluated in separate models because of non-random missingness in the retention variable. All 12th graders included in this study graduated after spring semester 2021, and so did not return to the school for fall semester 2021. Therefore, the dataset for the model which estimated mean differences in final course grade contained all observations, and the model which estimated mean differences in over-summer-retention omitted observations from 12th graders. As a robustness check, two additional methods for modeling outcomes were tested: in one, both grades and retention were modeled simultaneously with all 12th graders removed from the dataset; in the other, both outcomes were again modeled simultaneously, but FIML was implemented to deal with missingness in the retention variable for 12th graders. Notably, the pattern of significant results was the same for all three options, and the mean structure of the engagement profiles when omitting 12th graders was similar to the structure with 12th graders included, further supporting the decision to model grades and retention separately.

After adjusting for covariates (gender, URM, SES, and instructional modality), patterns of engagement were significantly predictive of final course grades, with profiles characterized by higher engagement predicting higher final course grades. Specifically, engagement characteristic of the moderate all profile was associated with lower grades on average compared with engagement characteristic of the moderate/high all profile (mean difference = 11.32, SE = 4.31, p = 0.030), the lower self-regulatory profile (mean difference = 17.37, SE = 5.09, p = 0.004), and the high all profile (mean difference = 18.18, SE = 4.01, p < 0.001). Additionally, the moderate/high all profile was associated with lower grades compared with the high all profile (mean difference = 6.95, SE = 2.56, p = 0.009). Although not significant, the differences in mean grades between the mental profile and all other profiles were quite large, ranging between 24.12 and 31.07. The absence of significance may be due to the small number of observations in the mental profile, reducing the statistical power of those comparisons. Different patterns of engagement were also significantly predictive of over-summer-retention after adjusting for covariates. In this analysis, over-summer-retention is reported as the proportion of observations within each profile that returned to the school the next semester. Specifically, the lower self-regulatory profile had a higher over-summer-retention proportion on average compared with both the moderate/high all profile (mean difference = 0.26, SE = 0.12, p = 0.034) and the high all profile (mean difference = 0.24, SE = 0.12, p = 0.047). Notably, the lower self-regulatory profile contained only 3% of the total observations. See Table 10 for all mean comparisons of grades and retention.

DISCUSSION

In this study I described how students engaged in virtual high school science classrooms in terms of multiple engagement dimensions, examined whether these engagement patterns were meaningfully associated with instructional modality, and then tested whether students' engagement patterns were predictive of final course grade and over-summer-retention. Analyses identified several patterns of engagement, characterized by unique combinations of the affective, behavioral, cognitive-self-regulatory, and cognitive-value dimensions, that were meaningfully distinct from one another. Results suggested that fuller forms of engagement were more likely to be observed in synchronous rather than asynchronous instruction, and among students from a higher SES background. Students who spent more time in profiles characterized by higher levels of engagement across all dimensions were more likely to earn higher grades in their science course. Finally, this study found that fuller forms of engagement are not necessarily associated with an increased likelihood of students returning to their current school; however, these results should be interpreted with caution based on the small sample size of some profiles.

SITUATIONAL ENGAGEMENT PROFILES

The identified engagement profiles suggested that students engage in meaningfully distinct ways during virtual high school science instruction. The five identified profiles suggest two broader patterns. Concordant patterns of engagement are characterized by all dimensions of engagement moving together at similar levels. Alternatively, discordant patterns of engagement are characterized by at least one dimension of engagement within a profile being at a substantially different level compared with the other dimensions.
Characterizing profiles as concordant or discordant is useful as a broad categorization; however, it is important to note that even profiles described as concordant exhibit varying degrees of discordance among their indicators. For example, in the moderate/high all profile the magnitude of cognitive-self-regulatory engagement is more similar to cognitive-value than to affective or behavioral engagement. In this study, confidence intervals of engagement dimensions, interpretability, and the magnitude of discordance among indicators were considered when characterizing profiles as concordant or discordant. Considering that the correlation among engagement dimensions was moderate, it is perhaps unsurprising that most observations in this study (94.5%) were classified into a concordant profile. However, discordant profiles may signal important life or school circumstances that impact how students engage at school. For example, the mental profile might indicate students are distracted by events outside of school such as weekend plans, and lower self-regulatory engagement might indicate students need explicit instruction regarding self-regulatory strategy use. The tendency of engagement profile indicators to form concordant profiles has been mirrored in several other studies examining student engagement profiles (Virtanen et al., 2018; Wang & Peck, 2013). For example, Virtanen et al. (2018) found that students only engaged in concordant ways. However, this study found discordant engagement to be substantially less common than did other studies that constructed engagement profiles. In this study, 5.5% of observations were classified into discordant profiles, whereas Conner and Pope (2013), Wang and Peck (2013), and Schmidt et al. (2018) identified discordant profiles in 48%, 22%, and 50% of observations, respectively. One reason for this difference in discordant profile membership may be discrepancies in the frequency of specific instructional activities across studies. For instance, in Schmidt et al. (2018) the most common instructional practice was laboratory activities, and these activities were also consistently associated with membership in several different discordant profiles. Comparatively, in this study virtual labs or simulations were one of the least common instructional activities. This means that the instructional activity (i.e., laboratory/simulations) that accounted for a large proportion of discordant profile membership in Schmidt et al. (2018) was much less common in this study. These results suggest that perhaps some types of instructional activities are less supportive of discordant patterns of engagement (e.g., lecture) and others are more supportive of discordant engagement (e.g., labs or simulations). A second and third reason for the low discordant profile membership in this study might be the difference in school modality (face-to-face versus virtual) and the grain size of the engagement measurements. For example, Conner and Pope (2013) and Wang and Peck (2013) both measured engagement in face-to-face settings at the school level, and Schmidt et al. (2018) measured engagement at the momentary level. This study asked students to consider their engagement during the entire preceding virtual lesson. Therefore, it is possible that students can more accurately differentiate between dimensions of engagement in face-to-face settings, or that perceptions of engagement are inherently different when reflecting on momentary instructional episodes, entire classes, or school more broadly.
DISENTANGLING AND DEFINING COGNITIVE ENGAGEMENT

Nearly all studies which include a cognitive engagement dimension conceptualize it as either a value-based dimension or a self-regulatory-based dimension (Fredricks et al., 2004). This study measured engagement in both ways simultaneously by including a cognitive-value and a cognitive-self-regulatory dimension. Results showed that in the discordant profiles (mental and lower self-regulatory) the two cognitive dimensions substantially diverged from one another (by an average of 0.7 points on a 1 to 5 scale in the mental profile and 1.1 in the lower self-regulatory profile). Since there is meaningful separation between the cognitive-value and cognitive-self-regulatory dimensions in at least two of the profiles, measuring both types of cognitive engagement simultaneously is important. Conceptualizing cognitive engagement as consisting of two separate dimensions has implications for future research and theory. Using the term "cognitive engagement" to describe both self-regulatory and value components adds more confusion to an already muddled construct. For example, studies by Appleton et al. (2006) and Ainley (2012) both include a "cognitive engagement" dimension; however, one defines cognitive engagement as value-based, while the other defines it as self-regulatory-based. Future studies should include and name both definitions of cognitive engagement when possible, or clearly articulate and name how they conceptualize cognitive engagement when only one dimension is included. It may even be helpful to replace the term "cognitive engagement" with "value engagement" and "self-regulatory engagement" where appropriate. It is noteworthy that cognitive-self-regulatory engagement was consistently the lowest-scoring dimension among all engagement profiles. In each profile the mean of cognitive-self-regulatory engagement was the lowest of all dimensions, and the 95% confidence interval of cognitive-self-regulatory engagement did not overlap with at least one other engagement dimension. This is inconsistent with previous research conducted in face-to-face settings (van Rooij et al., 2017; Wang & Peck, 2013). One explanation is that students might not know how or when to apply self-regulatory strategies in online environments to the same extent as in face-to-face learning, or they may not recognize certain thoughts as self-regulatory in nature and therefore not report them as such. Finally, some questions used in this study referred to specific self-regulatory strategies. If students were not using the self-regulatory strategies specified, then questionnaire results would indicate lower self-regulation than was actually the case. Self-regulation is known to vary based on classroom environment and school context and is especially important for student success in online environments (Boekaerts et al., 2000; Wijekumar et al., 2006). The extent of students' familiarity with self-regulatory strategies is important knowledge for teachers because self-regulatory strategies can be taught, and self-regulation is positively associated with performance and satisfaction in online courses (Wang et al., 2013a).

INSTRUCTIONAL MODALITY AND INDIVIDUAL DIFFERENCES AS PREDICTORS OF ENGAGEMENT

I found that synchronous instruction predicted membership in profiles characterized by moderately higher engagement in all dimensions compared with asynchronous instruction.
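The modality analysis itself used the R3STEP procedure in Mplus. Purely as an illustration of the general model form, and ignoring the classification uncertainty that the 3-step methods are designed to handle, a multinomial logistic regression predicting profile membership from modality and the covariates might be sketched in Python as follows; the data and variable names are hypothetical.

```python
# Hedged sketch with simulated data; not the manual/automated 3-step procedure
# used in the study, which accounts for uncertainty in profile assignment.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 493
data = pd.DataFrame({
    "profile": rng.integers(0, 5, n),      # stand-in for modal profile assignment (0-4)
    "synchronous": rng.integers(0, 2, n),  # 1 = synchronous lesson, 0 = asynchronous
    "female": rng.integers(0, 2, n),
    "urm": rng.integers(0, 2, n),
    "not_frl": rng.integers(0, 2, n),      # 1 = not eligible for free/reduced lunch
})

X = sm.add_constant(data[["synchronous", "female", "urm", "not_frl"]])
fit = sm.MNLogit(data["profile"], X).fit(disp=False)

# Exponentiated coefficients are odds ratios relative to the reference profile.
print(np.exp(fit.params))
```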
Previous research has shown that specific instructional activities are important in determining the way students engage (Schmidt et al., 2018; Skinner & Pitzer, 2012). However, in this study the primary instructional activities in both modalities were highly similar. Eighty-two percent of the instructional activities recorded during the study were reported in both the synchronous and asynchronous sessions, whereas only 18% were unique to a single modality. Instructional activities which were analogous between modalities included direct instruction (lecture in synchronous, reading in asynchronous), completing or reviewing notes, virtual labs/simulations, and various types of assessment (quizzes, tests, and practice questions). Instructional activities which were unique to a specific modality were video (asynchronous only) and discussions and small group work (synchronous only). The similarity of instructional activities between modalities provides evidence that instructional practices were not confounded with instructional modality, and since instructional practices between modalities were mostly similar, it is likely that some of the unique affordances of synchronous instruction impacted the way that students engaged. In summary, across an array of instructional activities, those delivered during synchronous instruction were associated with higher engagement across all dimensions compared with those delivered during asynchronous instruction. The unique affordances of synchronous sessions and constraints of asynchronous learning may help explain some of the association between instructional modality and engagement. Higher teacher enthusiasm, social presence, and content relevance are all associated with higher student engagement (Anderman et al., 2011; Roorda et al., 2011; Zhang, 2014). An exclamation point at the end of a sentence in an asynchronous text document does not carry the same weight as the voice inflection, facial expressions, and body language conveyed during a synchronous lesson, all of which help teachers to express their enthusiasm about the content. Additionally, synchronous learning provides opportunities for students to interact in real time, which may have enhanced student perceptions of social presence and, in turn, benefited student engagement (Lin & Gao, 2020; Yamagata-Lynch, 2014). Finally, synchronous interaction allows teachers to modify their instruction in real time to suit the interests of the class, possibly making the content more relevant to individual students, whereas asynchronous text and video are static, regardless of the way students experience the relevance of the content. Although recorded video, which is a common element of asynchronous instruction, does offer some affordances similar to synchronous instruction, the absence of live interaction between participants markedly lessens the extent to which teachers are able to generate and express enthusiasm, foster social presence, and use students' questions and interests to highlight the relevance of the content. Notably, the explanations for the association between instructional modality and engagement presented here are speculative; future research might examine the extent to which these features of synchronous instruction explain the results found in this study.
Finally, although this school claims to be committed to the NGSS curriculum, teacher-reported instructional activities demonstrated a substantial lack of practices representative of NGSS instruction (science and engineering practices and driving questions) in both modalities, but particularly during asynchronous lessons. The restricted application of NGSS instructional practices in asynchronous lessons may have played a role in the decreased engagement students experienced while learning in that modality. Two affordances typical of asynchronous instruction are the flexibility granted to students regarding where and when they learn. However, since this school had a substantial synchronous component, the degree to which students could choose when to do their schoolwork was restricted, although students retained flexibility of place and pace. So, we do not know whether students reported lower levels of engagement in asynchronous instruction because of inherent differences in affordances between synchronous and asynchronous learning, or because the synchronous session schedule required students to engage with the content at specific times throughout the day. To summarize, synchronous sessions may allow teachers to utilize instructional practices associated with engagement to a greater degree than asynchronous instruction, and some of the theorized benefits of asynchronous instruction may not manifest themselves to as large an extent as speculated in K-12 virtual schools because the large synchronous component limits the flexibility students have to choose when, where, and at what pace they complete their asynchronous lessons. Lower socio-economic status (as measured by free/reduced lunch eligibility) was associated with profiles described by lower levels of engagement. Previous research shows that students from lower socio-economic backgrounds generally experience lower engagement compared with their medium or high socio-economic peers (Fullerton, 2002; Tomaszewski, 2020). This lower engagement may be a symptom of the larger issue of a digital and technical divide reflecting inequalities in a variety of domains, including SES. For example, students from lower SES backgrounds may have access to lower quality hardware, have workspaces less conducive to working with technology, have less assistance from individuals familiar with technology, and have less time to use the technology (Beaunoyer et al., 2020). All four of these factors may contribute to students' frustration attending a virtual school, lead to undesirable experiences while learning at a virtual school, and result in lower engagement during lessons. This finding highlights why it is important for teachers to understand the context of students' personal lives, and to identify school-level supports to help students access appropriate technology and use it effectively.

OVER-SUMMER-RETENTION AND FINAL COURSE GRADES AS AN OUTCOME OF ENGAGEMENT PROFILES

In general, students who displayed situational engagement patterns in which all dimensions of engagement were high also earned higher science grades. The reader will recall that engagement patterns are situational, which is to say that students who displayed these high engagement patterns in some situations may have displayed different engagement in other situations. Similarly, Conner and Pope (2013) and Wang and Peck (2013) found that specific profiles of engagement characterized by higher levels of all dimensions, but in particular behavioral and cognitive engagement, were associated with higher GPA.
In virtual settings, studies commonly report that user behaviors and actions (attendance, type or amount of content accessed, etc.) within an online learning course positively predict achievement (Bonafini et al., 2017; Green et al., 2018; Hartnett, 2012). However, few studies examine how other dimensions of engagement are associated with achievement in virtual settings, with Leino et al. (2021) being a notable exception. Leino et al. (2021) found that positive emotions were indirectly associated with higher course grades and negative emotions were predictive of lower course grades. There was one exception to the general association between higher engagement and higher grades. Engagement characterized by low self-regulation was associated with significantly higher grades compared to the moderate all profile. There are a few possible explanations for this finding. First, the lower self-regulatory profile was quite small (n = 17); therefore, this finding may not appear consistently with larger sample sizes. Second, it may also be the case that affective, behavioral, and cognitive-value engagement compensate for a lack of engagement with self-regulatory strategies. Third, across all profiles cognitive-self-regulatory engagement was consistently the lowest-scoring dimension of engagement. It is possible that this dimension is not as salient when students actively contemplate their engagement during class, even if they do in fact utilize self-regulatory strategies. It may be that students did in fact utilize self-regulatory strategies, but not the ones specifically described in the questions they answered. The specificity of the cognitive-self-regulatory questions might have led to lower scores compared to affective, behavioral, and cognitive-value engagement, which asked more generally about emotions, behaviors, and thoughts. Results comparing differences in over-summer-retention indicated the lower self-regulatory profile was associated with increased retention compared with some other types of engagement (moderate/high all and high all). This may be because students attending virtual schools do so as a "last resort" after experiencing a variety of difficulties at their face-to-face school. Students who can refocus on their academics and regulate their behavior might then transition back to a face-to-face school, while those who struggle to self-regulate remain at virtual schools. However, there are two reasons these results should be interpreted with caution. First, as discussed in the context of final course grades, the lower self-regulatory profile contained a limited number of observations, meaning that a relatively small number of individuals may greatly alter the retention proportion in this profile. Indeed, five out of the fourteen observations in this profile were from the same person, who did persist at the current school. With a larger number of individuals and observations in this profile, these results may not remain consistent. Second, even though previous studies demonstrate a connection between engagement and persistence in school, there is a broad array of reasons why students may, or may not, return to a specific school, independent of student engagement. Muljana (2019) provides a review of the literature describing factors commonly associated with persistence in online learning and at online schools.
Although engagement certainly plays a part in the process by which students and their families make the decision about withdrawing from a given school, or from school more generally, it is by no means the only consideration. The decision to persist at online schools is complex and relies on many variables at multiple levels (institution, instructor, and student).

MEASURING ENGAGEMENT IN ONLINE LEARNING

Prior to this study there were no reports describing the way that virtual high school students simultaneously engaged across different engagement dimensions. This study makes it apparent that, similar to their face-to-face peers, high school students in virtual science classrooms do indeed engage in both concordant and discordant ways, and that specific patterns of engagement are differentially associated with important predictors and distal outcomes. Interestingly, some discordant profiles identified by previous research in high school science classrooms were not identified in this study, but others were. For example, two face-to-face studies, Conner and Pope (2013) and Schmidt et al. (2018), identified a busily engaged profile, which was not identified in this study. This profile was characterized by relatively low affective and cognitive engagement and relatively high behavioral engagement. It may be that there is something inherently different about virtual environments that makes it less likely for students to busily engage. For instance, students attending virtual school may struggle to engage behaviorally because they are not in a physical classroom with teachers constantly redirecting and monitoring behavior. The lower self-regulatory profile in this study was similar to a profile identified in Wang and Peck (2013). It may be that a subset of students in both face-to-face and virtual settings struggle with self-regulatory strategy use in specific learning situations. These students often appear happy and productive in class; however, results from Wang and Peck (2013) suggest these students might be at risk of poorer academic outcomes compared with their more fully engaged peers. Studies of student engagement in online learning are diverse in the way they both define and measure engagement. Some studies include affective, behavioral, and cognitive dimensions and define them in ways quite similar to their face-to-face counterparts (Borup et al., 2020). Other studies define some dimensions in similar ways (e.g., behavioral), but operationalize that dimension differently to capture the unique way it manifests in online settings. For example, Dixson (2015) created an Online Student Engagement Survey which includes some questions specifically tailored to students' experiences in online courses. San Pedro et al. (2014) and Gobert et al. (2015) use microbehavior data to assess engagement. Microbehavior data are records of student behavior which are automatically captured by an online system (e.g., errors made, log-in data, content visited, number of requests for automated help, etc.). Studies focused on disengagement seem to be more common in online settings, possibly because automating data collection for a large number of student microbehaviors is much more feasible in online settings compared with face-to-face learning. The distinction between microbehaviors in online learning contexts and behavioral engagement as traditionally defined in face-to-face learning suggests that future studies examining engagement in online settings might consider including two forms of behavioral engagement.
One dimension might focus on perceptions of student hard work and concentration, and another dimension might focus on automatically collected student microbehaviors. Substantial divergence between these two domains of behavioral engagement might indicate a disconnect between students' perceptions of their effort and the actual patterns of behavior students exhibit in an online course.

USEFULNESS OF A PERSON-ORIENTED APPROACH

Person-oriented analyses are particularly useful when the indicators used in profile analysis vary substantially within profiles (i.e., discordant profiles) and when these discordant profiles are either predicted by, or predict, variables of interest. If neither of these conditions is met, then a variable-centered approach may be warranted, since person-oriented approaches are generally more complex and more subjective due to model selection and interpretation steps that are not present in most variable-centered techniques. Importantly, neither approach is superior to the other; they are designed for different purposes and to examine different types of research questions. In this study two discordant profiles were identified which contained a total of only 5.5% of the observations; however, both met the rule-of-thumb criteria set by Lubke and Neale (2006) for retaining profiles. There is nothing inherently wrong with keeping smaller profiles if there are reasons to believe those small profiles further theoretical understanding or assist in practical application. For example, student presentations are a learning activity that occurs in many science classes but takes up a small percentage of overall instructional time during a semester. If a particular maladaptive pattern of student engagement oftentimes occurs during student presentations, even though the related profile may have a small number of observations, teachers may be able to intervene in ways that encourage fuller forms of engagement during that activity. In this study, a variable-oriented approach would have resulted in similar, but not the same, conclusions as a person-oriented approach. Notably, I believe the decision to use a person-oriented approach in this study was well-grounded in the evidence about the way in which students engage (Beymer et al., 2019; Schmidt et al., 2018). The discordant profiles identified in this study were small; however, some significant outcomes were associated with membership in these discordant profiles. A variable-centered approach would likely have identified associations between predictors/outcomes and concordant profile membership, but not with discordant profile membership. Although the predictive power of the discordant profiles should be interpreted with caution in this study due to their small size, the significant findings suggest that there may be some unique discordant patterns of engagement associated with important academic outcomes that warrant further study. Interestingly, some prior work directly comparing profiles in high school science classrooms found that profiles created from end-of-class surveys (like this study) resulted in more concordant profiles compared with profiles created with surveys measuring momentary engagement during specific learning activities (Beymer et al., 2019). It may be that students are able to distinguish between different dimensions of their engagement to a greater extent during instruction, as opposed to retrospectively reflecting on their engagement over the course of an entire class.
Additionally, means on all engagement dimensions were relatively high, suggesting a possible ceiling effect of the measures, or that most students who participated in this study were already highly engaged in their science courses. It is possible that highly engaged students are more likely to engage in concordant ways compared with their less engaged peers, resulting in smaller discordant profiles compared to a hypothetical sample of students with more moderate engagement on average. Finally, it may be that students' retrospective recollection of engagement tends to be less nuanced in online settings compared with face-to-face classrooms. In virtual settings, students have more freedom to engage with other activities outside of the academic lesson. For example, if a student is not enjoying or interested in a particular lecture in a virtual class, they might play a game on their phone or look up an interesting article on the internet. Although these positive feelings are not related to the academic content, students may recall generally higher levels of interest and enjoyment when they are asked to reflect on their engagement at the conclusion of a class than they otherwise might.

EDUCATIONAL IMPLICATIONS

Multiple studies support the conclusion that both synchronous and asynchronous instruction are necessary to create a virtual learning experience which maximizes student learning (Moorhouse & Wong, 2022; Murphy et al., 2011). In this study I extended those results, demonstrating that synchronous instruction is facilitative of a pattern of engagement in which all dimensions of engagement are high. Further, results indicate that this pattern of high engagement is also associated with higher levels of course achievement. Therefore, it seems that including some form of synchronous instruction within a virtual course is beneficial to several academic outcomes. Notably, the results of this study do not suggest removing asynchronous content from high school virtual science courses. Asynchronous learning may be associated with a host of other important outcomes (goal structures, career aspirations, course satisfaction, etc.) not considered in this study, may provide benefits to students beyond encouraging engagement, and offers an alternative to students who are unable to attend synchronous sessions; therefore, to conclude that asynchronous instruction has no place in high school science education would be a mistake. However, specific recommendations regarding the ideal amounts of synchronous and asynchronous instruction time to include within a virtual course and the types of instructional activities to incorporate were outside the purview of this study, and may depend on grade level, subject, teacher and student preference, or a host of other factors. This study provided a first broad look at the engagement patterns of students in a virtual school that included both synchronous and asynchronous modalities, but more research involving other virtual high schools is needed to make broader generalizations about implications for instructional design. For example, it is possible that asynchronous instruction might be more engaging than was found in this study in virtual schools that design asynchronous work to better leverage the affordances of this modality. It seems likely that some of the important affordances of asynchronous instruction were not utilized at this school, which may have contributed to lower levels of engagement among students in this modality.
For example, students' perceptions of autonomy and choice may have been considerably lessened during asynchronous instruction due to the extensive synchronous session schedule. As a result, students were restricted in when they could complete their asynchronous lessons. As a way to encourage more autonomy during asynchronous instruction, teachers could embed other types of choice and control within these lessons, such as allowing students to select from a menu of assessment or assignment options (Schmidt et al., 2018), or helping students "frame" their experience by choosing the topic, the task, or how to define the problem (Stroupe, 2014). Improving discussion and interaction among students and between students and teachers during asynchronous instruction may help to foster a sense of belonging (Martin et al., 2018), highlight the relevance of the content (Schmidt et al., 2019), and express enthusiasm (Diwaele & Changchen, 2021). During this study, asynchronous lessons at the school relied heavily on instructional practices which are not particularly effective at encouraging social interaction among students. For instance, all primary instructional practices reported during asynchronous instruction (reading, video, completing notes, quizzes, and virtual labs) required no interaction or communication with other students or teachers. Enhancing the asynchronous component of courses with instructional practices which encourage interaction and communication, such as discussion forums, Flipgrid discussions, or collaborative virtual bulletin boards, may heighten students' perceptions of belonging and, in turn, increase engagement (deNoyelles et al., 2014; Martin et al., 2018). These same discussion-based interactions might also allow teachers to highlight the relevance of the content to a greater degree by responding to questions and comments posed by students. For example, a Biology student might pose a question in a Flipgrid discussion about local ecosystems to which other students and the teacher can respond. Finally, videos or screencasts featuring the instructor, either in response to student questions on a video-based discussion forum or as stand-alone recordings, may allow teachers to express their enthusiasm for the content to a greater extent compared with text-based communication. Teachers often think about engagement as a singular entity. In this study I provided a framework to help teachers understand that engagement is more nuanced and consists of multiple dimensions that may, or may not, move together. For example, whereas previously a teacher might see a student as simply "disengaged", they might now understand that the student is disengaged behaviorally, but may still be engaged affectively and cognitively. Understanding engagement in this fashion is useful because a teacher might apply an intervention targeted at the specific dimensions that are in need of support. For instance, consider a group of students who fail to see the value of a class project. The utility-value literature is replete with examples of effective interventions to increase students' perceptions of value (Hulleman et al., 2010; Harackiewicz & Hulleman, 2010). The teacher could select and use one of these interventions (e.g., students writing about the personal relevance of the activity) to enhance students' perceptions of value regarding the project, thus supporting cognitive-value engagement. However, note that further study is needed prior to making recommendations regarding classroom interventions.
Over-summer-retention was positively predicted by only the lower self-regulatory profile. As discussed previously, there are several possible methodological and practical reasons for this finding; however, it might also be the case that retention is not a useful measure of success for K-12 virtual schools. It may be that low retention rates at K-12 virtual schools are not a problem, and that the schools are functioning as intended for many students who attend for a short time during a period of their life characterized by extenuating circumstances. For example, many students attended the school in this study as a "last chance" after failing out of their traditional school. Those students might reengage academically and return to their previous school, resulting in lower retention for the virtual school, but a positive outcome for the students. Some students also have health conditions (physical or mental) that prevent their attendance at a face-to-face school for a stretch of time. After recovering, they may choose to return to their traditional school, again reducing retention at the virtual school but resulting in positive student outcomes. Travel, work, and extracurricular activities are other circumstances that might cause students to attend virtual schools for a limited time, but are not indicative of poor student outcomes. Additional research is needed to understand whether retention is a useful indicator of K-12 virtual school quality, and if so, how retention should be operationalized.

FUTURE DIRECTIONS

This study was a first step to more deeply understanding the different ways that students engage while learning in their virtual high school science courses. Future studies might explore several directions. First, studies might examine how various instructional activities predict patterns of engagement. For example, engagement data could be collected during specific instructional activities (e.g., lecture, lab, discussion, etc.), and those learning activities could then be used to predict engagement profiles. Additionally, these instructional activity data might help to provide a sense of the variety of activities teachers choose to include in single lessons. The variety and specific types of activities included in each modality may be more predictive of engagement than broadly categorizing instruction as synchronous or asynchronous. Future studies might also measure specific instructional practices discussed in this study (e.g., relatedness, autonomy, relevance, teacher enthusiasm) and test whether those instructional practices mediate the relationship between instructional modality and engagement. Next, the way students engage may be associated with the ratio of synchronous to asynchronous instruction. Future studies might examine how different amounts of synchronous and asynchronous instruction within a course are associated with how students engage. For instance, it might be that students engage differently in a course with mostly synchronous instruction compared to a course with mostly asynchronous instruction. Finally, future studies might directly compare findings from a person-oriented approach with those from a variable-oriented approach using the same dataset to answer similar research questions.
For example, comparing the ability of engagement profiles to predict grades and retention with the ability of a variable-oriented approach to do the same would provide some insight into the similarity of inferences when using these two methods, and possibly help guide researchers in selecting an approach to match the purpose of their study.

LIMITATIONS AND DELIMITATIONS

As with any study, there were several limitations to this research. First, the generalizability of these results is limited to the included population. This study included participants from high school virtual science courses; therefore, results may not extend to traditional face-to-face settings or to other subjects (Sinatra et al., 2015). Engagement profiles provide a unique and meaningful way to describe student engagement; however, the generalizability of the profiles identified in this study to other populations is limited. Techniques such as LPA require more evidence from multiple studies before strong inferences can be drawn about the way students engage. Second, Masyn (2013) provides a useful reminder that latent profile analysis is an exploratory statistical technique that assumes a priori the existence of some number of subpopulations in the sample, and that the exact number of profiles is subject to a somewhat subjective decision-making process. Third, White/Asian students completed significantly more asynchronous surveys compared with URM students. If non-URM students engage differently than URM students, then it is possible that the predictive ability of instructional modality may be more descriptive of White/Asian students compared with other ethnicities. Underrepresented minority status in STEM was included as a covariate in the predictor analysis to minimize this potential bias. Next, students who earned higher course grades also had a higher response rate, meaning that the engagement described in this study was likely more characteristic of higher-achieving students. However, a substantial portion of surveys were still obtained from students who earned lower grades, with 31% of survey responses coming from students who earned less than 80%. Importantly, students' knowledge of self-regulatory strategies and their achievement prior to the start of this study may have impacted how students engaged in each instructional modality, and how engagement was related to final course grades and retention. For instance, students who used self-regulatory strategies to a greater extent or had higher prior achievement may have been more likely to earn higher course grades. In the future, a pre-survey given to students could collect data on these (and other) individual differences to be used as covariates in analyses and taken into account when discussing conclusions and implications. The sample of students included in this study was smaller than originally anticipated. Over 400 unique students completed at least one end-of-class report, totaling over 1,200 observations; however, only 124 students, contributing 493 observations, were included in the study because the remaining parents and students did not provide consent. Therefore, this study may have included students who were already more engaged with school more generally, which would likely be reflected in their reports of higher engagement in their classes. However, means of the engagement dimension variables, while high, were not alarmingly so (ranging between 3.59 and 4.12 on a 1-5 Likert scale), providing confidence in the validity of the profiles estimated.
Finally, the survey response rate within students was relatively low: students completed an average of about four out of a possible ten surveys. One of the advantages of giving students the opportunity to complete ten surveys was to represent engagement more accurately within each modality, since engagement may fluctuate from day to day. If students did not take full advantage of the surveys offered, then the engagement profiles may not be representative of as wide a range of learning contexts as originally hypothesized. However, the number of surveys completed during each week of the study declined only modestly (from 125 in the first week to 96 in the final week), suggesting that even if the total range of learning contexts was not reported by the average student, the sample as a whole was representative of the entire range of instructional activities in the study. Admittedly, this study looks at learning contexts at quite a large grain size. Learning contexts are broadly described as synchronous or asynchronous, with no adjustments made for the types of learning activities occurring within those modalities. Previous work has shown that learning context influences students' engagement at finer grain sizes (e.g., what students are actively doing in their science courses, such as applied activities, lecture, seatwork, etc.) (Schmidt et al., 2018). Therefore, the way students engage might differ depending on the activities they participate in during synchronous or asynchronous lessons. This study provided a useful first step in describing the way students broadly engage in these two modalities, giving educators insight into the different ways engagement manifests itself during synchronous and asynchronous learning. Finally, the COVID-19 pandemic certainly impacted students' ability to engage with their schoolwork and teachers' ability to provide instruction to students at some point and to some degree. Although virtual education was "normal" for this group of students and educators, their lives were undoubtedly affected by sick family members, economic instability, pervasive anxiety, or several other circumstances related to the pandemic. During the pandemic, some students, parents, and teachers might have viewed participating in a research study as an unnecessary stressor in an already overloaded environment. Therefore, it is possible that only the more highly motivated and engaged students, parents, and teachers agreed to participate (mean engagement in this study was quite high). In a different historical context, participation in this study might have been higher, especially among those students who are generally less motivated and engaged with school.

CONCLUSIONS

This study is a first step towards understanding and describing how students engage during instruction in virtual secondary school settings. Students in high school virtual science courses evidenced distinct patterns of engagement, experienced higher engagement while participating in synchronous instruction, and earned higher grades in science when engaging at a higher level. Most of the time students engaged in concordant ways, with all four dimensions of engagement being nearly equal at various levels (all high, all low, etc.); however, students also encountered situations in which they simultaneously experienced higher engagement in some dimensions and lower engagement in others.
These results demonstrate that students in secondary virtual schools engage in a variety of ways during instruction, both similar to and different from their face-to-face peers. Although some studies are beginning to utilize a wider variety of conceptual frameworks (Borup et al., 2014), previous engagement research in virtual education has mostly focused on engagement at the school level and on microbehaviors in asynchronous lessons, examining how these types of engagement are associated with increased retention and persistence in online education and at virtual schools. This view of engagement fails to account for a myriad of other conceptualizations of engagement and various other important predictors and outcomes of engagement that are vital for designing effective online lessons and instructional episodes. These results add to the nascent but growing body of literature describing and understanding the complexity of engagement in virtual secondary school instruction.

Table 1: Participant demographic characteristics (N = 124)
Sex: Male 35%; Female 65%
Grade Level: 9th 14%; 10th 35%; 11th 36%; 12th 15%
Race/Ethnicity: White 74%; Black 16%; Latinx 7%; Native American or Alaska Native 2%; Asian < 1%
Free/Reduced Lunch Eligible: 72%
Teacher (Course): Teacher 1 (Physical Science) 9%; Teacher 2 (Physical Science) 23%; Teacher 3 (Earth Science) 11%; Teacher 4 (Forensic Science) 23%; Teacher 5 (Veterinary Science) 11%; Teacher 5 (Chemistry) 23%

Table 2: Descriptive statistics and correlations for all engagement, predictor, and outcome variables
Variables: 1. Aff. Enga.; 2. Beh. Enga.; 3. Cog. S-R Enga.; 4. Cog. Val. Enga.; 5. Synchronous; 6. Final Grade; 7. Retention
Within-person correlations:
1. Aff. Enga.
2. Beh. Enga. 0.45***
3. Cog. S-R Enga. 0.31***, 0.45***
4. Cog. Val. Enga. 0.40***, 0.28***, 0.09
5. Synchronous 0.09, 0.10, -0.03, 0.16**
Between-person correlations:
1. Aff. Enga.
2. Beh. Enga. 0.76***
3. Cog. S-R Enga. 0.73***, 0.76***
4. Cog. Val. Enga. 0.81***, 0.59***, 0.64***
5. Synchronous 0.05, 0.22*, 0.05, 0.21*
6. Final Grade 0.50***, 0.49***, 0.32***, 0.39***, 0.07
7. Retention -0.07, -0.06, -0.09, -0.08, 0.07, -0.18*
Mean(SD): 3.87(0.98), 4.12(0.79), 3.72(0.87), 3.59(1.03), 0.32(0.47), 75.82(21.90), 0.73(0.45)
Min – Max: 1.0 – 5.0, 1.0 – 5.0, 1.0 – 5.0, 1.0 – 5.0, 0.0 – 1.0, 0.05 – 100, 0 – 1
α: 0.95, 0.93, 0.77, 0.88, -, -, -
Note: * indicates p < 0.05, ** indicates p < 0.01, *** p < 0.001. Statistics for the cognitive-value dimension were calculated after the removal of two reverse-coded items.

Table 3: Within/between correlations for items measuring affective engagement
Within-person:
1. Feel good
2. Interest .39***
3. Fun .34***, .39***
4. Enjoy .26***, .41***, .42***
Between-person:
1. Feel good
2. Interest 0.75***
3. Fun 0.76***, 0.87***
4. Enjoy 0.75***, 0.89***, 0.88***
Mean(SD): 3.99, 3.87, 3.69, 3.92
Min – Max: 1.0 – 5.0 for all items
Note: * indicates p < 0.05, ** indicates p < 0.01, *** p < 0.001

Table 4: Within/between correlations for items measuring behavioral engagement
Within-person:
1. Try hard
2. Work hard .59***
3. Participate .33***, .36***
4. Careful .38***, .50***, .40***
5. Attention .34***, .45***, .40***, .47***
Between-person:
1. Try hard
2. Work hard 0.83***
3. Participate 0.75***, 0.73***
4. Careful 0.79***, 0.80***, 0.67***
5. Attention 0.71***, 0.72***, 0.67***, 0.83***
Mean(SD): 4.11, 4.06, 3.95, 4.19, 4.28
Min – Max: 1.0 – 5.0 for all items
Note: * indicates p < 0.05, ** indicates p < 0.01, *** p < 0.001

Table 5: Within/between correlations for items measuring cognitive-self-regulatory engagement
Within-person:
1. Ask questions
2. Think .54***
3. Connect .44***, .44***
4. Understand info. .28***, .25***, .35***
5. Learn .33***, .32***, .38***, .18***
Between-person:
1. Ask questions
2. Think 0.83
3. Connect 0.69, 0.69
4. Understand info. 0.54, 0.60, 0.56
5. Learn 0.73, 0.75, 0.66, 0.49
Mean(SD): 3.56, 3.61, 3.70, 4.23, 3.52
Min – Max: 1.0 – 5.0 for all items
Note: * indicates p < 0.05, ** indicates p < 0.01, *** p < 0.001
Table 6: Within/between correlations for items measuring cognitive-value engagement
Within-person:
1. Waste
2. Useless .26***
3. Meaning .09, .25***
4. Important future .13**, .16**, .41***
5. Important me .12*, .20***, .42***, .43***
Between-person:
1. Waste
2. Useless 0.83***
3. Meaning 0.60***, 0.61***
4. Important future 0.58***, 0.51***, 0.78***
5. Important me 0.64***, 0.65***, 0.82***, 0.74***
Mean(SD): 4.06, 3.94, 3.55, 3.50, 3.74
Min – Max: 1.0 – 5.0 for all items
Note: * indicates p < 0.05, ** indicates p < 0.01, *** p < 0.001. The "Waste" and "Useless" items were removed from further analyses.

Table 7: Fit statistics for candidate latent engagement profiles
Model 1 (varying means, equal variances, and covariances fixed to 0):
2 profiles: npar = 13, AIC = 3865, BIC = 3919, Entropy = 0.891, LL = -1919.669; profile sizes (small to large) = 167, 326; smallest profile = 34.00%
3 profiles: npar = 18, AIC = 3341, BIC = 3417, Entropy = 0.896, LL = -1652.782; profile sizes = 94, 165, 234; smallest profile = 19.00%
4 profiles: npar = 23, AIC = 3034, BIC = 3140, Entropy = 0.905, LL = -1498.731; profile sizes = 53, 113, 137, 190; smallest profile = 11.00%
5 profiles: npar = 28, AIC = 2927, BIC = 3045, Entropy = 0.922, LL = -1435.807; profile sizes = 7, 48, 116, 137, 185; smallest profile = 1.40%
6 profiles: npar = 33, AIC = 2852, BIC = 2990, Entropy = 0.908, LL = -1393.162; profile sizes = 6, 48, 58, 103, 127, 151; smallest profile = 1.20%
7 profiles: npar = 38, AIC = 2797, BIC = 2957, Entropy = 0.916, LL = -1360.735; profile sizes = 3, 6, 47, 55, 104, 128, 150; smallest profile = 0.60%
8 profiles: npar = 43, AIC = 2757, BIC = 2938, Entropy = 0.894, LL = -1335.749; profile sizes = 7, 15, 41, 43, 70, 87, 107, 123; smallest profile = 1.40%
9 profiles: npar = 48, AIC = 2710, BIC = 2912, Entropy = 0.904, LL = -1307.466; profile sizes = 3, 4, 6, 44, 45, 60, 64, 123, 144; smallest profile = 0.80%
10 profiles: npar = 53, AIC = 2661, BIC = 2884, Entropy = 0.904, LL = -1277.844; profile sizes = 6, 14, 21, 24, 31, 49, 57, 75, 106, 110; smallest profile = 1.20%
Model 2 (varying means, equal variances, and equal covariances):
2 profiles: npar = 19, AIC = 2750, BIC = 2830, Entropy = 0.80, LL = -1356.261; profile sizes = 82, 411; smallest profile = 17.00%
3 profiles: npar = 24, AIC = 2706, BIC = 2807, Entropy = 0.824, LL = -1329.234; profile sizes = 45, 58, 390; smallest profile = 9.00%
4 profiles: npar = 29, AIC = 2655, BIC = 2777, Entropy = 0.886, LL = -1298.708; profile sizes = 8, 80, 196, 209; smallest profile = 1.60%
5 profiles (selected solution): npar = 34, AIC = 2621, BIC = 2764, Entropy = 0.88, LL = -1276.38; profile sizes = 10, 17, 78, 191, 197; smallest profile = 2.00%
6 profiles: npar = 39, AIC = 2586, BIC = 2750, Entropy = 0.89, LL = -1254.231; profile sizes = 5, 10, 17, 78, 189, 194; smallest profile = 1.00%
7 profiles: non-interpretable model
8 profiles: npar = 49, AIC = 2547, BIC = 2753, Entropy = 0.902, LL = -1224.386; profile sizes = 4, 4, 17, 36, 50, 95, 141, 146; smallest profile = 0.80%
9 profiles: non-interpretable model
10 profiles: non-interpretable model
Note: The 5-profile solution from Model 2 is the selected solution. Adjusted LMR estimates and p-values are not included in this table because no profile comparisons were significant. npar = free parameters; AIC = Akaike Information Criterion; BIC = Bayesian Information Criterion; LL = Log Likelihood.

Table 8: Means and 95% confidence intervals of engagement dimensions among profiles (n = 493)
Affective: Moderate All (n = 78) 3.09 [2.78, 3.40]a; Mental (n = 10) 2.24 [1.68, 2.80]b; Moderate/High All (n = 191) 3.87 [3.72, 4.02]c; Lower Self-regulatory (n = 17) 3.97 [3.62, 4.32]c; High All (n = 197) 4.56 [4.42, 4.69]d
Behavioral: Moderate All 3.10 [2.94, 3.25]a; Mental 1.72 [1.37, 2.08]b; Moderate/High All 4.00 [3.90, 4.09]c; Lower Self-regulatory 3.93 [3.76, 4.10]c; High All 4.76 [4.70, 4.83]d
Cognitive Self-Regulatory: Moderate All 2.64 [2.44, 2.84]a; Mental 1.46 [0.80, 2.11]b; Moderate/High All 3.47 [3.33, 3.61]c; Lower Self-regulatory 2.29 [1.75, 2.82]d; High All 4.26 [4.12, 4.40]e
Cognitive Value: Moderate All 2.84 [2.46, 3.22]a; Mental 2.16 [1.56, 2.77]a; Moderate/High All 3.57 [3.40, 3.74]b; Lower Self-regulatory 3.39 [2.87, 3.91]b; High All 4.38 [4.19, 4.58]c
Note: Means and 95% confidence intervals of indicators within each profile. Within each row, different subscripts indicate significantly different means using Tukey's honestly significant difference tests.

Table 9: Odds ratios and standard errors for pairwise profile comparisons (reference profile is listed first)
Reference profile is listed first Instructional Modality Gender URM SES Odds Ratio Odds Ratio Odds Ratio Odds Ratio Moderate All vs Mental 0.73(0.59) Not estimable 1.59(1.46) Not estimable Moderate All vs Moderate/High All 1.21(0.54) 2.02(1.12) 0.69(0.43) 0.25(0.15)* Moderate All vs Lower Self-regulatory 2.83(2.07) 9.07(13.63) 0.68(0.77) 0.22(0.18) Moderate All vs High All 2.44(0.99)* 1.62(0.97) 0.93(0.62) 0.21(0.13)* Mental vs Moderate/High All 1.65(1.15) Not estimable 0.43(0.37) Not estimable Mental vs Lower Self-regulatory 3.86(3.61) Not estimable 0.43(0.54) Not estimable Mental vs High All 3.32(2.25) Not estimable 0.59(0.50) Not estimable Moderate/High All vs Lower Self-regulatory 2.33(1.49) 4.48(6.66) 1.000(1.11) 0.88(0.80) Moderate/High All vs High All 2.01(0.52)** 0.80(0.40) 1.36(0.74) 0.83(0.39) Lower Self-regulatory vs High All 0.86(0.53) 0.18(0.26) 1.36(1.50) 0.94(0.87) Note: *p < 0.05, **p < 0.01, ***p < 0.001. Standard errors given in parentheses after odds ratio. “Not estimable” values are due to homogeneity of the predictor variable in at least one of the comparison profiles. Significant values are bolded. Instructional Modality: 0 = asynchronous, 1 = synchronous Gender: 0 = male, 1 = female URM: 0 = White/Asian, 1 = Black, Latinx, Native American, or Alaska Native SES: 0 = free/reduced lunch eligible, 1 = not free/reduced lunch eligible How to interpret Odds Ratios: For every 1 unit change in the predictor variable, the odds of membership in the comparison profile, compared to the reference profile, will change by the value of the given odds ratio. Odds ratios greater than 1 indicate greater likelihood of being in the comparison profile, while odds ratios less than 1 indicate greater likelihood of being in the reference profile. For example, the odds of an observation being classified in the high all profile is 2.44 higher compared with the moderate all profile if the observation occurred during synchronous instruction. 59 Table 10: Mean differences pairwise comparisons of distal outcomes Intercept Outcome: Final Course Grade Difference SE p Moderate All vs Mental 12.71 22.74 0.576 Moderate All vs Moderate/High All -9.35 4.31 0.030 Moderate All vs Lower Self-regulatory -14.75 5.09 0.004 Moderate All vs High All -16.10 4.01 0.000 Mental vs Moderate/High All -22.06 21.94 0.315 Mental vs Lower Self-regulatory -27.46 22.04 0.213 Mental vs High All -28.81 22.13 0.193 Moderate/High All vs Lower Self-regulatory -5.40 4.17 0.195 Moderate/High All vs High All -6.75 2.57 0.009 Lower Self-regulatory vs High All -1.35 4.05 0.738 Intercept Outcome: Over-summer-retention Difference SE p Moderate All vs Mental -0.17 0.13 0.195 Moderate All vs Moderate/High All 0.06 0.12 0.586 Moderate All vs Lower Self-regulatory -0.17 0.10 0.099 Moderate All vs High All 0.003 0.12 0.982 Mental vs Moderate/High All 0.24 0.12 0.052 Mental vs Lower Self-regulatory 0.004 0.10 0.970 Mental vs High All 0.18 0.12 0.152 Moderate/High All vs Lower Self-regulatory -0.23 0.09 0.007 Moderate/High All vs High All -0.06 0.09 0.497 Lower Self-regulatory vs High All 0.17 0.08 0.036 Final Course Grade Profile Means (%) Retention Proportion Moderate All 71.75 0.79 Mental 58.86 0.90 Moderate/High All 82.98 0.75 Lower Self-regulatory 89.12 0.98 High All 89.93 0.80 Note: *p < 0.05, **p < 0.01, ***p < 0.001. Bolded comparisons are significantly different. 
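The interpretation note under Table 9 describes how the reported odds ratios shift the likelihood of membership in one profile relative to another. The short sketch below works one of those values through to probabilities; it is illustrative only, the odds ratio of 2.44 comes from Table 9, and the baseline probability is a hypothetical value chosen for the example rather than a quantity estimated in this study.

```python
# Illustrative only: translating an odds ratio from Table 9 into probabilities.
# The baseline probability below is hypothetical, not an estimate from the study.

def shift_probability(baseline_prob: float, odds_ratio: float) -> float:
    """Apply an odds ratio to a baseline probability and return the new probability."""
    baseline_odds = baseline_prob / (1 - baseline_prob)
    new_odds = baseline_odds * odds_ratio
    return new_odds / (1 + new_odds)

if __name__ == "__main__":
    # Hypothetical baseline: a 50% chance of being in the High All (vs. Moderate All)
    # profile for an observation made during asynchronous instruction.
    p_async = 0.50
    p_sync = shift_probability(p_async, odds_ratio=2.44)  # OR for synchronous instruction
    print(f"Asynchronous: {p_async:.2f}  ->  Synchronous: {p_sync:.2f}")
    # Roughly 0.50 -> 0.71: the same observation is considerably more likely to fall in
    # the High All profile when it occurs during synchronous instruction.
```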
60

Figure 1: Raw score profile means from selected profile solution. [Figure: raw-score means (1–5 scale) of the affective, behavioral, cognitive-self-regulatory, and cognitive-value dimensions plotted for the five profiles: Moderate All (n = 78, 16%), Mental (n = 10, 2%), Moderate/High All (n = 191, 39%), Lower Self-regulatory (n = 17, 3%), and High All (n = 197, 40%).]

61

Figure 2: Standardized profile means from selected profile solution. [Figure: standardized means of the same four engagement dimensions plotted for the five profiles: Moderate All (n = 78, 16%), Mental (n = 10, 2%), Moderate/High All (n = 191, 39%), Lower Self-regulatory (n = 17, 3%), and High All (n = 197, 40%).]

62

APPENDICES

63

APPENDIX A – PARENTAL CONSENT AND STUDENT ASSENT TEXT

Parental Consent Form Text

As part of normal instruction in your student's science class this semester, teachers asked them to complete exit tickets to share how they felt during science instruction and how engaged they were during the lessons. Teachers can use this information to inform their future instruction. This consent form gives permission for Michigan State University (MSU) researchers to use the exit ticket information collected by teachers as part of a research study. Additionally, your child's final grade for their science course and demographic information (gender, free/reduced lunch eligibility, ethnicity, and enrollment status) would be collected from school records. The purpose of this disclosure of information is to understand the ways that students engage while reading lessons in the online school and attending live virtual classes. Gathering data from as many students as possible is important because it will help us learn how to help all students engage more effectively, and it will help teachers to create and design meaningful and educational learning activities. This study requires no additional time commitment from your student(s) since they have already completed the exit tickets: we are just seeking access to your child's exit tickets and school records as described above. Your child's participation in this study is voluntary. All identifying information will be removed from your child's school records and exit ticket responses before being given to MSU researchers, and your child will not be identified in any research. Additionally, if both you AND your student complete consent forms on or before Friday, May 28th, you will be eligible for a drawing to win a $100 Amazon gift card. If completed after this date, you will be eligible for one of four $25 Amazon gift cards (you may choose to give these gift cards to your student). You must live in the state of Michigan to be eligible. To give away these gift cards, a random drawing from eligible participants will be held at the conclusion of the study. Each eligible participant will be placed in one of four groups, and each group will have one opportunity to win a gift card. You will receive an email prior to the drawing indicating the date and time of the drawing if you are eligible. The information we are collecting for the study will be shared with the research team, including Matthew Schell and Jennifer Schmidt. If you have any questions about this consent or the research study, please contact the lead researcher, Matthew Schell, at schellma@msu.edu.
Please complete ONE of the following options:

By typing my name and date I agree to allow my student's classroom exit tickets, final science course grade, and demographic information (gender, free/reduced lunch eligibility, ethnicity, and enrollment status) to be shared anonymously with MSU researchers for research purposes.

64

Parent/Guardian First Name_________________ Parent/Guardian Last Name_______________ Date__________
Student First Name__________________ Student Last Name__________________

By typing my name and date I do NOT agree to allow my student's classroom exit tickets, final science course grade, and demographic information (gender, free/reduced lunch eligibility, ethnicity, and enrollment status) to be shared anonymously with MSU researchers for research purposes.

Parent/Guardian First Name_________________ Parent/Guardian Last Name_______________ Date__________
Student First Name__________________ Student Last Name__________________

Student Assent Form Text

You have the opportunity to be involved in a research study in partnership with Michigan State University. As part of instruction in your science class this semester, you completed exit tickets to share how you felt during science instruction and how engaged you were during your lessons. This study requires no additional time or effort from you since you have already completed the exit tickets. This consent form gives permission for Michigan State University (MSU) researchers to use the exit ticket information collected by your teachers as part of a research study. Additionally, your final grade for your science course and your demographic information (gender, free/reduced lunch eligibility, ethnicity, and enrollment status) would be collected from school records. The purpose of releasing this information is to understand the ways that students engage while reading lessons in the online school and attending live virtual classes. Gathering data from as many students as possible is important because it will help us learn how to help all students engage more effectively, and it will help teachers to create and design meaningful and educational learning activities.

65

In order to participate in this study, both you AND your parent/guardian must give permission by completing consent forms. If you and your parent/guardian complete the consent forms on or before Friday, May 28th, your parent/guardian will be eligible for a drawing to win a $100 Amazon gift card. If completed after this date, your parent/guardian will be eligible for one of four $25 Amazon gift cards (they may choose to share this with you). Your parent/guardian will be placed in one of four groups, and each group will have one opportunity to win a gift card. Drawings will take place at the conclusion of the study and will be randomly selected from those who complete the form, whether you agree to have your data included or not. Your parent/guardian will receive an email indicating the date and time of the drawing if you are eligible. Your assent below gives Michigan State University researchers permission to use your data from exit tickets collected by your teachers and your demographic information described above. Your participation in this study is voluntary. Your answers on these exit tickets and your grades information will be kept confidential. Your teacher will never be able to link your responses on these exit tickets with your name, and researchers will never know your identity.
The information we are collecting for the study will be shared with the research team, including Matthew Schell and Jennifer Schmidt. If you have any questions about this consent or the research study, please contact the lead researcher, Matthew Schell, at schellma@msu.edu.

Please complete ONE of the following options:

By typing my name and date I agree to allow my classroom exit tickets, final science course grade, and demographic information (gender, free/reduced lunch eligibility, ethnicity, and enrollment status) to be shared anonymously with MSU researchers for research purposes.

First Name_________________ Last Name_______________ Date__________

By typing my name and date I do NOT agree to allow my classroom exit tickets, final science course grade, and demographic information (gender, free/reduced lunch eligibility, ethnicity, and enrollment status) to be shared anonymously with MSU researchers for research purposes.

First Name_________________ Last Name_______________ Date___________

66

APPENDIX B – RECRUITMENT EMAILS SENT TO PARENTS AND STUDENTS

Parental Recruitment Email Text

Dear [name of parent],

I wanted to share some information about a study that will be taking place in your child's science class this semester. Your child has the opportunity to participate in some exciting educational research at Michigan Great Lakes Virtual Academy (MGLVA)! MGLVA will be working with educational researchers at Michigan State University (MSU) to learn about the ways in which students engage while learning virtually. As part of normal instruction in your student's science class this semester, their teachers will ask them to complete exit tickets to share how they feel during science instruction and how engaged they are during the lessons. I am asking for your permission to use the data collected by teachers through these short exit tickets in a research study. Additionally, your child's grades for this course and demographic information (gender, free/reduced lunch eligibility, ethnicity, and enrollment status) would also be collected from school records. This study requires no time commitment outside of class: we are just seeking access to your child's exit tickets and school records as described above. All identifying information will be removed from your child's school records and exit ticket responses before being given to MSU researchers, and your child will not be identified in any research. Gathering data from as many students as possible is important because it will help us learn how to help all students engage more effectively, and it will help teachers to create and design meaningful and educational learning activities. The purpose of the study is to understand the ways that students engage while reading lessons in the online school and attending live virtual classes. If you have any questions or do not want your child's data released to MSU researchers, please contact the lead researcher, Matthew Schell, at schellma@msu.edu.

Regards,
Matthew Schell

Student Recruitment Email Text

Hello Learners!

Would you like the opportunity to help make science education more fun, entertaining, and engaging (along with becoming eligible for an Amazon gift card)? This is your chance! You have the opportunity to be involved in a research study in partnership with Michigan State University. As part of your science class this semester, your teachers will be asking you to complete exit tickets to share how you feel during science instruction and how engaged you are during your lessons.

67
I am asking for your permission to use the data collected by your teachers and your course grades for a research study. Your answers on these exit tickets and your grades information will be made completely anonymous and you will not be identified in any way. Your participation in this study will greatly benefit researchers’ understanding of how students feel and engage in virtual science classrooms and courses. Data from this research will provide insight into the different ways that students engage with science content and allow educators to design more interesting and engaging lessons. Additionally, if you complete the consent form linked in this email you will be eligible for a drawing to win one of several $25 Amazon gift cards. Drawings will take place at the conclusion of the study and will be randomly selected from those who complete the consent form. Please click on and complete the consent form to enter the drawing. Consent form link: If you have any questions, please contact the lead researcher, Matthew Schell, at schellma@msu.edu. A. Student Assent Form Text You have the opportunity to be involved in a research study in partnership with Michigan State University. As part of instruction in your science class this semester your teachers will ask you to complete exit tickets to share how you feel during science instruction and how engaged you are during your lessons. I am asking for your permission to use the data collected by your teachers and your course grades in a research study. Your participation in this study will greatly benefit researchers’ understanding of how students feel and engage in virtual science classrooms and courses. Data from this research will provide insight into the different ways that students engage with science content and allow educators to design more interesting and engaging lessons. Additionally, if you complete this consent form you will be eligible for a drawing to win one of several $25 Amazon gift cards. Drawings will take place at the conclusion of the study and will be randomly selected from those who complete the consent form. Your consent below gives Michigan State University researchers permission to use your data from exit tickets collected by your teachers and your course grade for research purposes. Participation in this study is optional. Your answers on the exit tickets will be made completely anonymous and you will not be identified in any way. If you have any questions about this consent, or the research study, please contact the lead researcher, Matthew Schell, at schellma@msu.edu. 
68

APPENDIX C – INSTRUCTIONAL ACTIVITIES TABLES

Table C1: Total occurrences of primary instructional activity across all teachers

Synchronous Instructional Activity      Total Count (days)
  Lecture                               20
  Practice Questions                    16
  Guided Notes                          5
  Virtual Lab/Simulation                4
  Discussion                            3
  Quiz                                  2
  Small Group Work                      2
Asynchronous Instructional Activity
  Reading                               20
  Video                                 12
  Complete/Review Notes                 5
  Quiz/Test                             4
  Virtual Lab/Simulation                3

69

Table C2: Description of content and instructional activities during the study period for Teacher 1 – Earth Science

Week 1
  Synchronous: Astronomy – Earth, Moon, Sun, Solar system. Activities: Lecture, guided notes, quiz.
  Asynchronous: Introduction to Astronomy, the Sun. Activities: Independent reading.
Week 2
  Synchronous: Terrestrial and jovian planets; planetary motion. Activities: Lecture, discussion, quiz.
  Asynchronous: Terrestrial and jovian planets. Activities: Online lessons (reading and videos).
Week 3
  Synchronous: Color and brightness of stars. Activities: Lecture, guided notes.
  Asynchronous: Color and brightness of stars. Activities: Online lessons (reading and videos).
Week 4
  Synchronous: Astronomy unit test review and questions. Activities: Polling questions, discussion.
  Asynchronous: Astronomy unit test. Activities: Unit test.
Week 5
  Synchronous: No information.
  Asynchronous: No information.

70

Table C3: Description of content and instructional activities during the study period for Teacher 2 – Physical Science

Week 1
  Synchronous: Introduction to mixtures. Activities: Lecture, notes, review questions.
  Asynchronous: Introduction to mixtures. Activities: Online lessons (reading and videos).
Week 2
  Synchronous: Mixtures and solubility lab. Activities: Lab overview, in-class demo, discussion.
  Asynchronous: Mixtures and solubility lab. Activities: Video, virtual lab, lab report.
Week 3
  Synchronous: Valence electrons. Activities: Class discussion, polling questions, escape room activity.
  Asynchronous: Valence electrons. Activities: Online lessons (reading and videos), practice quiz.
Week 4
  Synchronous: Introduction to chemical reactions. Activities: Class discussion, formative assessment.
  Asynchronous: Introduction to chemical reactions. Activities: Online lessons (reading and videos), practice quiz.
Week 5
  Synchronous: Balancing chemical equations. Activities: Lecture, Nearpod practice, virtual simulation.
  Asynchronous: Balancing chemical equations. Activities: Online lesson (reading and videos), virtual simulation, practice quiz.

71

Table C4: Description of content and instructional activities during the study period for Teacher 3 – Forensic Science

Week 1
  Synchronous: Detecting latent blood and blood typing. Activities: Lecture, practice questions, formative quiz.
  Asynchronous: Blood types and usefulness in crime investigation. Activities: Online lesson (reading and videos).
Week 2
  Synchronous: Footwear impressions, casting, tire impression evidence. Activities: Create footwear print, video, polling questions.
  Asynchronous: Investigating crimes using footwear and tire impression evidence. Activities: Online lesson (reading and videos).
Week 3
  Synchronous: Investigating crime using soil, glass, and tool evidence. Activities: In-class activity/lab.
  Asynchronous: Continue working on crime case with new evidence. Activities: Reading, continue writing essay.
Week 4
  Synchronous: Personal injury crimes. Activities: Lecture, small group work.
  Asynchronous: Personal injury crimes. Activities: Reading.
Week 5
  Synchronous: Business and financial crimes. Activities: Lecture, small group work.
  Asynchronous: Business and financial crimes. Activities: Reading.

72

Table C5: Description of content and instructional activities during the study period for Teacher 4 – Physical Science

Week 1
  Synchronous: Introduction to mixtures. Activities: Lecture, guided notes.
  Asynchronous: Introduction to mixtures. Activities: Online lesson (reading and videos).
Week 2
  Synchronous: Factors that influence solubility. Activities: Virtual lab.
  Asynchronous: Factors that influence solubility. Activities: Complete notes, video.
Week 3
  Synchronous: Electron energy levels. Activities: Lecture, practice questions.
  Asynchronous: Electron energy levels. Activities: Complete notes, website activity.
Week 4
  Synchronous: Chemical bonds – metallic and hydrogen. Activities: Lecture, practice questions.
  Asynchronous: Chemical bonds – metallic and hydrogen. Activities: Video.
Week 5
  Synchronous: Balancing chemical equations. Activities: Lecture, practice questions.
  Asynchronous: Balancing chemical equations. Activities: Virtual simulation.

73

Table C6: Description of content and instructional activities during the study period for Teacher 5 – Veterinary Science

Week 1
  Synchronous: Case studies and previewed two example case studies. Activities: Lecture, scenario polling questions.
  Asynchronous: Problems and diseases with animals. Activities: Read articles.
Week 2
  Synchronous: Problems with animals and reviewed case study #2. Activities: Lecture, polling questions.
  Asynchronous: Problems and diseases with animals. Activities: Read articles.
Week 3
  Synchronous: Problems with animals and reviewed case study #3. Activities: Lecture, polling questions.
  Asynchronous: Problems and diseases with animals. Activities: Read articles.
Week 4
  Synchronous: Practice questions to review for test. Activities: Whole class review game.
  Asynchronous: Reviewed content in "Problems with Animals" unit. Activities: Reviewed guided notes.
Week 5
  Synchronous: Requirements to obtain a DVM, and job opportunities. Activities: Polling questions with teacher explanation.
  Asynchronous: Reviewed content for "Career Opportunities" unit. Activities: Reviewed guided notes.

74

Table C7: Description of content and instructional activities during the study period for Teacher 5 – Chemistry

Week 1
  Synchronous: Mole ratios and molar mass. Activities: Lecture, guided notes.
  Asynchronous: Molar mass, stoichiometry, Avogadro's number. Activities: Online lessons (reading and videos).
Week 2
  Synchronous: Acids and bases. Activities: Lecture, practice questions.
  Asynchronous: Environmental impact related to acids and bases. Activities: Online lessons (reading and videos).
Week 3
  Synchronous: Acid and base equations; difference between weak and strong acids and bases. Activities: Lecture, practice questions.
  Asynchronous: Neutralization reactions and titrations. Activities: Online lessons (reading and videos).
Week 4
  Synchronous: Reaction rates and energy of activation. Activities: Lecture, practice questions.
  Asynchronous: Factors affecting reaction rates. Activities: Online lessons (reading and videos).
Week 5
  Synchronous: Reaction rates and equilibrium. Activities: Lecture.
  Asynchronous: Review content for unit test (reaction rates and equilibrium). Activities: Review guided notes and textbook.

75

APPENDIX D – ENGAGEMENT MEASURES

Behavioral Engagement
5-point Likert scale (1 = Strongly disagree to 5 = Strongly agree)

  Asynchronous item: I tried hard to do well during the lesson today. / Synchronous item: I tried hard to do well during class today. (Skinner et al., 2009)
  Asynchronous item: I worked as hard as I could during the lesson today. / Synchronous item: I worked as hard as I could during class today. (Skinner et al., 2009)
  Asynchronous item: I participated during the lesson today. / Synchronous item: I participated during class today. (Skinner et al., 2009)
  Asynchronous item: I read carefully during the lesson today. / Synchronous item: I listened carefully during class today. (Skinner et al., 2009)
  Asynchronous item: I paid attention during the lesson today. / Synchronous item: I paid attention during class today. (Skinner et al., 2009)
Affective Engagement
5-point Likert scale (1 = Strongly disagree to 5 = Strongly agree)

  Asynchronous item: I felt good during the lesson today. / Synchronous item: I felt good during class today. (Skinner et al., 2009)
  Asynchronous item: I felt interested in what I was working on during the lesson today. / Synchronous item: I felt interested in what I was working on during class today. (Skinner et al., 2009)
  Asynchronous item: I had fun during the lesson today. / Synchronous item: I had fun during class today. (Skinner et al., 2009)
  Asynchronous item: I enjoyed learning new things during the lesson today. / Synchronous item: I enjoyed learning new things during class today. (Skinner et al., 2009)

76

Cognitive-Value Engagement
5-point Likert scale (1 = Strongly disagree to 5 = Strongly agree)

  Asynchronous item: The lesson today was useless to me. / Synchronous item: Class today was useless to me. (Voelkl, 1996)
  Asynchronous item: The lesson today was important to me. / Synchronous item: Class today was important to me. (Schmidt et al., 2018)
  Asynchronous item: The lesson today was important to my future. / Synchronous item: Class today was important to my future. (Schmidt et al., 2018)
  Asynchronous item: The lesson today was meaningful to me. / Synchronous item: Class today was meaningful to me. (Conner & Pope, 2013)
  Asynchronous item: The lesson today was a waste of time. / Synchronous item: Class today was a waste of time for me. (Voelkl, 1996)

Cognitive-Self-Regulatory Engagement
5-point Likert scale (1 = Strongly disagree to 5 = Strongly agree)

  Asynchronous item: I asked myself questions to make sure I understood the material during the lesson today. / Synchronous item: I asked myself questions to make sure I understood the material during class today. (Pintrich & De Groot, 1990)
  Asynchronous item: I paused once in a while to think about what I was learning during my lesson today. / Synchronous item: I paused once in a while to think about what I was learning during class today. (Pintrich & De Groot, 1990)
  Asynchronous item: I tried to connect what I was learning to things I have learned before during the lesson today. / Synchronous item: I tried to connect what I was learning to things I have learned before during class today. (Fredricks et al., 2016)
  Asynchronous item: I tried to understand the information during the lesson today. / Synchronous item: I tried to understand the information during class today. (Fredricks et al., 2016)
  Asynchronous item: I thought about the things I would need to learn before my lesson today. / Synchronous item: I thought about the things I would need to learn before class today. (Pintrich & De Groot, 1990)

77

APPENDIX E – ADDITIONAL ANALYTICAL INFORMATION

The primary analysis for this study consisted of three steps. I first identified a student engagement profile solution, then predicted profile membership based on modality of instruction, and finally used engagement profiles to predict final science course grades and over-summer retention. To identify engagement profiles, I used latent profile analysis (LPA). LPA is a specific type of analysis in the mixture modeling family of statistical techniques. LPA (and all mixture modeling techniques) assumes that there are multiple, distinct subpopulations within an overall sample of data. These subpopulations are assumed to be different based on a number of indicators that the researcher is theoretically interested in. A simple example best illustrates this point. Imagine a research study in a science classroom where students' interest in science is the single variable under study. In this sample it might be the case that about half of the students were very interested in science, and the other half were not interested at all, with very few being moderately interested. If one were to take the mean of this sample, it would be easy to conclude that students' interest in science was average; however, that would be missing an important part of the story. In fact, students in this sample are not generally moderately interested in science; most lie near the extremes of interest (high and low). One might say there are two subpopulations of students within the overall sample: those who are highly interested in science and those who are not interested in science. LPA and other mixture modeling techniques can quantify and describe these various subpopulations, leading to insights that may not emerge using other statistical techniques.
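To make the interest example concrete, the brief sketch below simulates such a sample with hypothetical numbers (the subgroup means, spreads, and sizes are invented for illustration and are not data from this study). It shows how the overall mean suggests "moderate" interest even though almost no individual student is moderately interested, which is the pattern mixture models are designed to recover.

```python
# A toy version of the two-subpopulation interest example (hypothetical data).
import numpy as np

rng = np.random.default_rng(0)
high_interest = rng.normal(loc=4.5, scale=0.3, size=50)  # highly interested subgroup
low_interest = rng.normal(loc=1.5, scale=0.3, size=50)   # uninterested subgroup
interest = np.clip(np.concatenate([high_interest, low_interest]), 1, 5)

print(f"Sample mean: {interest.mean():.2f}")  # close to 3.0, i.e., looks 'average'
moderate_share = np.mean((interest > 2.5) & (interest < 3.5))
print(f"Share of students near the mean (2.5-3.5): {moderate_share:.2f}")  # near zero
# The mean hides the structure: almost no one sits near the mean, so describing the
# two latent subgroups (as LPA does) is more informative than a single sample average.
```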
MIXTURE MODELS

78

Mixture models assume that there is unobserved heterogeneity in a sample of observations. This unobserved heterogeneity manifests itself as multiple subpopulations within the overall sample, with each observation's membership unknown. These subpopulations are modeled as latent variables and given the name latent classes in the case of categorical indicators, or latent profiles in the case of continuous indicators. The number and nature of the classes or profiles present in the population, along with membership in these classes or profiles, is unknown prior to analysis. Therefore, the goal of mixture modeling is to identify these subpopulations and assign observations to them using mixture probabilities. Mixture models share an underlying goal with traditional clustering approaches; however, a key benefit of using mixture modeling, in addition to likelihood estimation-based fit indices, is that probabilities of profile membership are utilized, unlike cluster analysis. For a detailed overview of finite mixture modeling see Masyn (2013).

ENGAGEMENT PROFILES

I used LPA to identify student engagement profiles consisting of affective, behavioral, cognitive-self-regulatory, and cognitive-value dimensions using data from both synchronous and asynchronous surveys. LPA is superior to traditional cluster analysis because of its ability to account for uncertainty in profile membership assignment and its use of likelihood estimation-based fit statistics. These profiles make it possible to analyze the multivariate data collected on engagement within the parsimony of a single model. Broadly, indicator variables of profiles in LPA can be modeled as latent or observed variables. Latent variable modeling has certain advantages over modeling variables as observed when the constructs under study are latent in nature. One key benefit of modeling constructs as latent is the ability to reduce a large number of observed indicators to a much smaller number of factors, while also taking into account the correlations among the observed variables that constitute each factor (Tabachnick & Fidell, 2013). Since engagement is a latent construct, these variables were initially modeled as such during the LPA enumeration process; however, issues with convergence and model interpretability precluded selecting a model solution with latent engagement indicators. Therefore, engagement dimensions were modeled as observed variables using factor scores. Modeling LPA indicators as factor scores, as opposed to a composite mean, retained some of the benefits of modeling indicators as latent variables, but without the complexity that resulted in uninterpretable solutions.
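The enumeration process summarized in Table 7 compared solutions with increasing numbers of profiles on AIC, BIC, entropy, and smallest-profile size. The study estimated these models in Mplus with specific variance and covariance constraints; the sketch below is only a rough analogue using scikit-learn's GaussianMixture (which cannot impose all of the Mplus constraints, such as holding variances equal across profiles), shown to illustrate the enumeration logic. The variable `engagement` is an assumed (n_observations x 4) array of the four dimension scores.

```python
# Minimal sketch of LPA-style profile enumeration; illustrative stand-in for Mplus.
import numpy as np
from sklearn.mixture import GaussianMixture

def relative_entropy(model, X):
    """Mplus-style relative entropy: values near 1 indicate cleaner profile separation."""
    probs = model.predict_proba(X)
    ent = -np.sum(probs * np.log(np.clip(probs, 1e-12, 1.0)))
    return 1.0 - ent / (X.shape[0] * np.log(model.n_components))

def enumerate_profiles(X, max_profiles=10, seed=42):
    """Fit 2..max_profiles mixture solutions and collect fit statistics (cf. Table 7)."""
    rows = []
    for k in range(2, max_profiles + 1):
        gmm = GaussianMixture(n_components=k, covariance_type="diag",
                              n_init=50, random_state=seed).fit(X)
        sizes = np.bincount(gmm.predict(X), minlength=k)
        rows.append({"profiles": k, "AIC": gmm.aic(X), "BIC": gmm.bic(X),
                     "entropy": relative_entropy(gmm, X),
                     "smallest_profile_pct": 100 * sizes.min() / len(X)})
    return rows

# Usage (with `engagement` defined elsewhere):
# for row in enumerate_profiles(engagement): print(row)
```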
PREDICTORS AND COVARIATES

I predicted profile membership using a dummy-coded instructional modality variable (i.e., synchronous or asynchronous). I also examined the predictive effects of all covariates in this step (gender, ethnicity, and SES). To predict profile membership, multinomial logistic regressions were completed in Mplus using the 3-step method (Asparouhov & Muthén, 2014). This method has the advantage of reducing the likelihood of profile membership shift when adding predictors to the model. The 3-step method involves using separate multinomial logistic regressions to examine the ability of all included variables to predict profile membership between all possible pairs of profiles. For example, one logistic regression was used to compare profile 1 to profile 2. Another logistic regression was used to compare profile 1 with profile 3. This process proceeded through all possible comparisons between profiles. All predictor variables and covariates were included in each logistic regression to control for the effects of one another.
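For readers unfamiliar with this kind of model, the sketch below shows the general structure of a multinomial regression predicting profile membership from the dummy-coded predictors. It is a simplification: it regresses modal (most likely) profile assignments on the predictors, whereas the 3-step method used in Mplus carries classification uncertainty forward into this step. The column names are hypothetical.

```python
# Simplified sketch of profile-membership regression; the study's analyses used the
# 3-step method in Mplus, not this modal-assignment shortcut.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def profile_membership_model(df: pd.DataFrame):
    """df columns (hypothetical): 'profile' (0-4 modal profile assignment) plus the
    dummy-coded predictors 'synchronous', 'female', 'urm', 'not_frl' (cf. Table 9)."""
    predictors = sm.add_constant(df[["synchronous", "female", "urm", "not_frl"]])
    model = sm.MNLogit(df["profile"], predictors).fit(disp=False)
    # Exponentiated coefficients are odds ratios relative to the reference profile
    # (the lowest-coded profile), read as described in the note to Table 9.
    odds_ratios = np.exp(model.params)
    return model, odds_ratios
```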
80

APPENDIX F – MISSING DATA ANALYSIS

Missing data analyses evaluated the extent to which data were systematically missing from the sample. These missing data analyses included only students who consented to participate in the study. First, I conducted a grouping composition check of response rates on synchronous and asynchronous surveys to check for systematic differences in the number of survey responses based on individual factors (gender, ethnicity, and SES), using a t-test for all three variables. The proportion of surveys answered in synchronous and asynchronous modalities was calculated for each individual student on a scale of 0 (no surveys answered) to 1 (all surveys answered). I also tested whether there was an association between students' response rate and achievement (final course grade) by examining the correlation between these two variables. Finally, I examined whether there were systematic differences between students based on the same individual factors (gender, ethnicity, and SES) on each outcome variable (final course grades and over-summer retention). A t-test was used to check for differences in final course grades and a chi-square test was used to check for differences in over-summer retention.

For gender, t-tests indicated there was no difference in response rates on either synchronous (t(95.74) = 0.53, p = 0.60) or asynchronous surveys (t(70.50) = -1.08, p = 0.28), no difference in final course grades (t(92.99) = -1.09, p = 0.28), and no difference in retention (χ²(1, n = 124) = 1.23, p = 1.0). For ethnicity, t-tests indicated there was no difference in response rates on synchronous surveys (t(80.51) = 1.70, p = 0.09), but White/Asian students responded to a significantly higher number of asynchronous surveys compared with underrepresented minorities (t(63.65) = 2.44, p < 0.05, mean difference = 14%, eta squared = 0.044). This is a small effect size according to Cohen (1988). There was no difference in final course grades (t(59.86) = 0.55, p = 0.58) or retention (χ²(1, n = 124) = 3.485, p = 0.06) based on ethnicity. For socio-economic status, t-tests indicated there was no difference in response rates on either synchronous (t(60.49) = 0.09, p = 0.93) or asynchronous surveys (t(70.36) = 1.22, p = 0.23). However, students who were not eligible for free or reduced lunch earned higher final grades compared to those students who were eligible (t(122.67) = 3.56, p < 0.001, mean difference = 11.26, eta squared = 0.089). This is a moderate effect size according to Cohen (1988). There was no difference in retention based on socio-economic status (χ²(1, n = 124) = 0.16, p = 0.69). Finally, students who earned higher final course grades completed more surveys (r(122) = 0.52, p < 0.001).

82

REFERENCES

83

REFERENCES

Ainley, M. (2012). Students' interest and engagement in classroom activities. In S. J. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of Research on Student Engagement (pp. 283–302). New York, NY: Springer.

Anderman, L. H. (1999). Classroom goal orientation, school belonging, and social goals as predictors of students' positive and negative affect following transition to middle school. Journal of Research and Development in Education, 32, 89–103. https://psycnet.apa.org/record/1999-10018-002

Anderman, L. H., Andrzejewski, C. E., & Allen, J. (2011). How do teachers support students' motivation and learning in their classrooms. Teachers College Record, 113, 969–1003. https://doi.org/10.1177/016146811111300502

Appleton, J. J., Christenson, S. L., & Furlong, M. J. (2008). Student engagement with school: Critical conceptual and methodological issues of the construct. Psychology in the Schools, 45(5), 369–386. https://doi.org/10.1002/pits

Appleton, J. J., Christenson, S. L., Kim, D., & Reschly, A. L. (2006). Measuring cognitive and psychological engagement: Validation of the student engagement instrument. Journal of School Psychology, 44(5), 427–445. https://doi.org/10.1016/j.jsp.2006.04.002

Asparouhov, T., & Muthen, B. O. (2014). Auxiliary variables in mixture modeling: A 3-step approach using Mplus. Structural Equation Modeling: A Multidisciplinary Journal, 21(3), 329–341. https://doi.org/10.1080/10705511.2014.915181

Atwall, M. N., Balfanz, R., Manspile, E., Byrnes, V., & Bridgeland, J. M. (2020). Building a grad nation: Progress and challenge in raising high school graduation rates. https://www.americaspromise.org/report/2020-building-grad-nation-report

Aud, S., Hussar, W., Johnson, F., Kena, G., Roth, E., Manning, E., Wang, X., & Zhang, J. (2012). The Condition of Education 2012 (NCES 2012-045). U.S. Department of Education, National Center for Education Statistics. Washington, DC. Retrieved [10/1/2018] from http://nces.ed.gov/pubsearch

Baker, R. S. J. d. (2007). Modeling and understanding students' off-task behavior in intelligent tutoring systems. Proceedings of ACM CHI 2007: Computer-Human Interaction, 1059-1068. https://doi.org/10.1145/1240624.1240785

Baker, R. S., D'Mello, S. K., Rodrigo, M. T., & Graesser, A. C. (2010). Better to be frustrated than bored: The incidence and persistence of affect during interactions with three different computer-based learning environments. International Journal of Human-Computer Studies, 68(4), 223–241. https://doi.org/10.1016/j.ijhcs.2009.12.003

84

Barsade, S. G., & Gibson, D. E. (2007). Why does affect matter in organizations? Academy of Management Perspectives, 21, 36–59. doi:10.5465/AMP.2007.24286163

Beaunoyer, E., Dupéré, S., & Guitton, M. J. (2020). COVID-19 and digital inequalities: Reciprocal impacts and mitigation strategies. Computers in Human Behavior, 111, 106424. https://doi.org/10.1016/j.chb.2020.106424

Bergman, L. R., & Trost, K. (2006). The person-oriented versus the variable-oriented approach: Are they complementary, opposites, or exploring different worlds. Merrill-Palmer Quarterly, 52, 601–632. https://doi.org/10.1353/mpq.2006.0023

Beymer, P. N., Schell, M. J., Alberts, K. M., Rosenberg, J. M., & Schmidt, J. A. (April, 2019).
Student engagement profiles in formal and informal STEM learning settings. Paper presented at the 2019 American Educational Research Association Annual Meeting, Toronto, ON. Boekaerts, M., Pintrich, P. R., & Zeidner, M. (Eds.). (2000). Handbook of self-regulation . San Diego, CA: Academic. Bonafini, F. C., Chae, C., Park, E., & Jablokow, K. W. (2017). How much does student engagement with videos and forums in a MOOC affect their achievement? Online Learning Journal, 21(4), 223–240. https://doi.org/10.24059/olj.v21i4.1270 Borup, J., West, R. E., Graham, C. R., & Davies, R. S. (2014). The adolescent community of engagement: A framework for research on adolescent online learning. Journal of Technology and Teacher Education, 22(1), 107–129. https://www.learntechlib.org/p/112371/ Brierton, S., Wilson, E., Kistler, M., Flowers, J., & Jones, D. (2016). A comparison of higher order thinking skills demonstrated in synchronous and asynchronous online college discussion posts. NACTA Journal, 60(1), 14-21. https://www.jstor.org/stable/10.2307/nactajournal.60.1.14 Chaw, L. Y., & Tang, C. M. (2019). Driving high inclination to complete massive open online courses (MOOCs): Motivation and engagement factors for learners. Electronic Journal of E-Learning, 17(2), 118–130. https://doi.org/10.34190/JEL.17.2.05 Christenson, S. J., Reschly, A. L., & Wylie, C. (2012). Handbook of Research on Student Engagement. New York, NY: Springer. Cleary, T. J., & Zimmerman, B. J. (2012). A cyclical self-regulatory account of student engagement: Theoretical foundations and applications. In S. J. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of Research on Student Engagement (pp. 237– 258). New York, NY: Springer. 85 Conner, J. O., & Pope, D. C. (2013). Not just robo-students: Why full engagement matters and how schools can promote it. Journal of Youth and Adolescence, 42(9), 1426–1442. https://doi.org/10.1007/s10964-013-9948-y deNoyelles, A., Zydney, J., & Chen, I. (2014). Strategies for creating a community of inquiry through online asynchronous discussions. Journal of Online Learning and Teaching, 10(1), 153-165. 153–n/a. Retrieved from http://search.proquest.com/docview/1614680520?accountid=458 Dewaele, J. M., & Li, C. (2021). Teacher enthusiasm and students’ social-behavioral learning engagement: The mediating role of student enjoyment and boredom in Chinese EFL classes. Language Teaching Research, 25(6), 922–945. https://doi.org/10.1177/13621688211014538 Digital Learning Collaborative. (2020). Snapshot 2020: A review of K-12 online, blended, and digital learning. Retrieved from https://www.digitallearningcollab.com/publications- overview. Dixson, M. D. (2015). Measuring Student Engagement in the Online Course : The Online Student Engagement Scale ( OSE ). Online Learning Journal, 19(4). https://doi.org/10.24059/olj.v19i4.561 Falloon, G. (2011). Making the connection: Moore’s theory of transactional distance and its relevance to the use of a virtual classroom in postgraduate online teacher education. Journal of Research on Technology in Education, 43, 187-209. https://doi.org/10.1080/15391523.2011.10782569 Finn, J. D. (1989). Withdrawing from school. Review of Educational Research, 59(2), 117–142. https://doi.org/10.3102/00346543059002117 Flowerday, T., & Schraw, G. (2000). Teacher beliefs about instructional choice: A phenomenological study. Journal of Educational Psychology, 92(4), 634–645. https://doi.org/10.1037// 0022-0663. 92.4.634 Fortus, D. (2014). Attending to affect. 
Journal of Research in Science Teaching, 51(7), 821–835. https://doi.org/10.1002/tea.21155 Francescucci, A., & Rohani, L. (2019). Exclusively synchronous online (VIRI) learning: The impact on student performance and engagement outcomes. Journal of marketing Education, 41(1), 60-69. https://doi.org/10.1177/0273475318818864 Fredricks, J. A., Hofkens, T., Wang, M. Te, Mortenson, E., & Scott, P. (2018). Supporting girls’ and boys’ engagement in math and science learning: A mixed methods study. Journal of Research in Science Teaching, 55(2), 271–298. https://doi.org/10.1002/tea.21419 86 Fredricks, J. A., Wang, M. Te, Schall Linn, J., Hofkens, T. L., Sung, H., Parr, A., & Allerton, J. (2016). Using qualitative methods to develop a survey measure of math and science engagement. Learning and Instruction, 43, 5–15. https://doi.org/10.1016/j.learninstruc.2016.01.009 Fredricks, J. A., & McColskey, W. (2012). The measurement of student engagement: A comparative analysis of various methods and student self-report instruments. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of Research on Student Engagement (pp. 763–782). New York, NY: Springer. Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109. https://doi.org/10.3102/ 00346543074001059 Fullarton, S. (2002). Student engagement with school: Individual and school-level influences. Camberwell, Victoria: Australian Council for Educational Research. https://research.acer.edu.au/cgi/viewcontent.cgi?article=1030&context=lsay_research Furrer, C., & Skinner, E. (2003). Sense of relatedness as a factor in children’s academic engagement and performance. Journal of Educational Psychology, 95, 148 – 162. https://doi.org/10.1037/0022-0663.95.1.148 Giesbers, B., Rienties, B., Tempelaar, D., & Gijselaers, W. (2014). A dynamic analysis of the interplay between asynchronous and synchronous communication in online learning: The impact of motivation. Journal of Computer Assisted Learning, 30(1), 30-50. doi:10.1111/jcal.12020 Gobert, J. D., Baker, R. S., & Wixon, M. B. (2015). Operationalizing and Detecting Disengagement Within Online Science Microworlds. Educational Psychologist, 50(1), 43– 57. https://doi.org/10.1080/00461520.2014.999919 Grabau, L. J., & Ma, X. (2017). Science engagement and science achievement in the context of science instruction: a multilevel analysis of U.S. students and schools. International Journal of Science Education, 39(8), 1045–1068. https://doi.org/10.1080/09500693.2017.1313468 Green, R.A., Whitburn, L.Y., Zacharias, A., Byrne, G., Hughes, D.L. (2018). The relationship between student engagement with online content and achievement in a blended learning anatomy course. Anatomy Science Education. 11, 471 – 477. https://doi.org/10.1002/ase.1761 Greene, B. A. (2015). Measuring cognitive engagement with self- report scales: Reflections from over 20 years of research. Educational Psychologist, 50, 14-30. https://doi.org/10.1080/00461520.2014.989230 Greene, B. A., Miller, R. B., Crowson, H. M., Duke, B. L., & Akey, K. L. (2004). Predicting high school students’ cognitive engagement and achievement: Contributions of classroom 87 perceptions and motivation. Contemporary Educational Psychology, 29, 462–482. http://dx.doi.org/ 10.1016/j.cedpsych.2004.01.006 Gunawardena, C. N., & Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer- mediated conferencing environment. 
American Journal of Distance Education, 11(3), 8–26. https://doi.org/10.1080/08923649709526970 Guthrie, J. T., Wigfield, A., Barbosa, P., Perencevich, K. C., Taboada, A., Davis, M. H., & Tonks, S. (2004). Increasing reading comprehension and engagement through concept- oriented reading instruction. Journal of Educational Psychology, 96, 403–423. http://dx.doi.org/10.1037/ 0022-0663.96.3.403 Harackiewicz, J. M., & Hulleman, C. S. (2010). The importance of interest: The role of achievement goals and task values in promoting the development of interest. Social and Personality Psychology Compass. 4(1), 42-52. https://doi.org/10.1111/j.1751- 9004.2009.00207.x Hartnett, M. (2012). Relationships between online motivation, participation, and achievement: More complex than you might think. Journal of Open, Flexible and Distance Learning, 16(1), 28. Retrieved from http://search.informit.com.au/documentSummary;dn=383717019325123;res=IELHSS Hawkins, A., & Barbour, M. K. (2010). U.S. virtual school trial period and course completion policy study. American Journal of Distance Education, 24, 5–20. https://doi.org/10.1080/08923640903529295 Heddy, B. C., & Sinatra, G. M. (2013). Transforming misconceptions: Using transformative experience to promote positive affect and conceptual change in students learning about biological evolution. Science Education, 97, 725–744. https://doi.org/10.1002/sce.21072 Heddy, B. C., Sinatra, G. M., Seli, H., & Mukhopadhyay, A. (2014). Transformative experience as a facilitator of interest development and transfer in a college success course for at-risk students. Philadelphia, PA: Paper presented at the American Educational Research Association. Hrastinski, S. (2008). Asynchronous and synchronous e-learning. Educause Quarterly, 31(4), 51- 55. https://er.educause.edu/articles/2008/11/asynchronous-and-synchronous-elearning Hughes, G. D. (2009). The Impact of Incorrect Responses to Reverse-Coded Survey Items. Research in the Schools 16(2), 76-88. https://web.s.ebscohost.com/ehost/pdfviewer/pdfviewer?vid=0&sid=2ca2321f-0773-4635- 8e0b-7cb31ccb81c2%40redis Hulleman, C. S., Godes, O., Hendricks, B. L., & Harackiewicz, J. M. (2010). Enhancing interest and performance with a utility value intervention. Journal of Educational Psychology, 102(4), 880–895. https://doi.org/10.1037/a0019506 88 Klem, A. M., & Connell, J. P. (2004). Relationships matter: Linking teacher support to student engagement and achievement. Journal of School Health, 74(7), 262–273. https://doi.org/10.1111/j.1746-1561.2004.tb08283.x Krosnick, J.A., & Presser, S. (2010). Question and Questionnaire Design. In P.V. Marsden & J. D. Wright (Eds.), Handbook of Survey Research (2nd ed.) (pp. 263 - 313). Emerald Group Publishing. Kunter, M., Frenzel, A., Nagy, G., Baumert, J., & Pekrun, R. (2011). Teacher enthusiasm: Dimensionality and context specificity. Contemporary Educational Psychology, 36, 289– 301. doi:10.1016/j.cedpsych.2011.07.001 Leino, R. K., Gardner, M. R., Cartwright, T., & Döring, A. K. (2021). Engagement in a virtual learning environment predicts academic achievement in research methods modules: A longitudinal study combining behavioral and self-reported data. Scholarship of Teaching and Learning in Psychology. https://doi.org/10.1037/stl0000281 Lin, X., & Gao, L. (2020). Students’ sense of community and perspectives of taking synchronous and asynchronous online courses. Asian Journal of Distance Education, 15(1), 2020. https://files.eric.ed.gov/fulltext/EJ1289947.pdf Lowenthal, P., Dunlap, J., & Snelson, C. 
(2017). Live synchronous web meetings in asynchronous online courses: Reconceptualizing virtual office hours. Online Learning Journal, 21(4), 177-194. https://doi.org/10.24059/olj.v21i4.1285 Lubke, G., & Neale, M. C. (2006). Distinguishing between latent classes and continuous factors: Resolution by maximum likelihood? Multivariate Behavioral Research, 41(4), 499–532. https://doi.org/10.1207/s15327906mbr4104_4 Magidson, J., and Vermunt, J.K. (2002). Latent class models for clustering: a comparison with K-means. Canadian Journal of Marketing Research, 20, 36-43. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.128.9157&rep=rep1&type=pdf Martin, A. J. (2007). Examining a multidimensional model of student motivation and engagement using a construct validation approach. British Journal of Educational Psychology, 77(2), 413–440. https://doi.org/10.1348/000709906X118036 Martin, N. I., Kelly, N., & Terry, P. C. (2018). A framework for self-determination in massive open online courses: Design for autonomy, competence, and relatedness. Australasian Journal of Educational Technology, 34(2), 35–55. https://doi.org/10.14742/ajet.3722 Masyn, K.E. (2013). Latent Class Analysis and Finite Mixture Modeling. In T.D. Little (Eds.), The oxford handbook of quantitative methods volume 2. (pp. 551 – 611). New York: Oxford University Press. 89 Mcelrath, K., (2020, August 26). Nearly 93% of households with school-age children report some form of distance learning during COVID-19. United States Census Bureau. https://www.census.gov/library/stories/2020/08/schooling-during-the-covid-19- pandemic.html Miron, G., Shank, C., & Davidson, C. (2018). Full-Time Virtual and Blended Schools: Enrollment, Student Characteristics, and Performance. Boulder, CO: National Education Policy Center. Retrieved [January 6th, 2019] from https://nepc.colorado.edu/publication/virtual-schools-annual-2018 Mo, Y. 2008. Opportunity to learn, engagement, and science achievement: Evidence form TIMSS 2003 data. Unpublished doctoral diss., State University, VA Moorhouse, B. L., & Wong, K. M. (2022). Blending asynchronous and synchronous digital technologies and instructional approaches to facilitate remote learning. Journal of Computers in Education, 9(1), 51–70. https://doi.org/10.1007/s40692-021-00195-8 Morin, A. J. S., Meyer, J. P., Creusier, J., & Biétry, F. (2016). Multiple-group analysis of similarity in latent profile solutions. Organizational Research Methods, 19(2), 231–254. https://doi.org/10.1177/1094428115621148 Muljana, P. S. (2019). Factors contributing to student retention in online learning and recommended strategies for improvement : A systematic literature review. Journal of Information Technology Education: Research, 18, 19–57. Murphy, E., Rodríguez-Manzanares, M. A., & Barbour, M. (2011). Asynchronous and synchronous online teaching: Perspectives of Canadian high school distance education teachers. British Journal of Educational Technology, 42(4), 583–591. https://doi.org/10.1111/j.1467-8535.2010.01112.x Muthén, L.K. and Muthén, B.O. (1998-2017). Mplus User’s Guide. Eighth Edition. Los Angeles, CA: Muthén & Muthén https://www.statmodel.com/download/usersguide/MplusUserGuideVer_8.pdf Nippard, E., & Murphy, E. (2007). Social presence in the web-based synchronous secondary classroom. Canadian Journal of Learning and Technology, 33. Retrieved from http://www.cjlt.ca/content/vol33.1/nippard.html Nylund-Gibson, K., Grimm, R. P., & Masyn, K. E. (2019). 
Prediction from latent classes: A demonstration of different approaches to include distal outcomes in mixture models. Structural Equation Modeling, 26, 1–19. https://doi.org/10.1080/10705511.2019.1590146 Nylund, K. L., Asparouhov, T., & Muthén, B. O. (2007). Deciding on the number of classes in latent class analysis and growth mixture modeling: A Monte Carlo simulation study. Structural Equation Modeling, 14, 535–569. http://dx.doi.org/10.1080/10705510701575396 90 Olson, J. S., & McCracken, F. E. (2015). Is it worth the effort? The impact of incorporating synchronous lectures into an online course. Online Learning, 19(2), 1-12. https://doi.org/10.24059/olj.v19i2.499 Pang, L., & Jen, C. C. (2018). Inclusive dyslexia-friendly collaborative online learning environment: Malaysia case study. Education and Information Technologies, 23(3), 1023- 1042. https://doi.org/10.1007/s10639-017-9652-8 Patrick, B.C., Hisley, J., & Kempler, T. (2000). “What’s everybody so excited about?”: The effects of teacher enthusiasm on student intrinsic motivation and vitality. The Journal of Experimental Education, 68, 217–236. doi:10.1080/00220970009600093 Pekrun, R. (2006). The control-value theory of achievement emotions: Assumptions, corollaries, and implications for educational research and practice. Educational Psychology Review, 18, 315–341. http://dx.doi. org/10.1007/s10648-006-9029-9 Pekrun, R., & Linnenbrink-Garcia, L. (2012). Academic emotions and student engagement. In S. L. Christenson, A. L. Reschley, & C. Wylie (Eds.), Handbook of Research on Student Engagement (pp. 259–282). New York, NY: Springer. Pekrun, R., Goetz, T., Titz, W., & Perry, R. P. (2002). Academic emotions in students’ self- regulated learning and achievement: A program of qualitative and quantitative research. Educational Psychologist, 37, 91–106. http://dx.doi.org/10.1207/S15326985EP3702_4 Pintrich, P. R., & DeGroot, E. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82 , 33–40. https://doi.org/10.1037/0022-0663.82.1.33 Reeve, J., & Tseng, C. M. (2011). Agency as a fourth aspect of students’ engagement during learning activities. Contemporary Educational Psychology, 36, 257–267. http://dx.doi.org/10.1016/j.cedpsych.2011.05.002 Reeve, J., Jang, H., Carrell, D., Jeon, S., & Barch, J. (2004). Enhancing students’ engagement by increasing teachers’ autonomy support. Motivation and Emotion, 28(2), 147–169. https://doi.org/10.1023/b:moem.0000032312.95499.6f Renninger, K. A., & Bachrach, J. E. (2015). Studying triggers for interest and engagement using observational methods. Educational Psychologist, 50(1), 58–69. https://doi.org/10.1080/00461520.2014.999920 Reschly, A. L., Huebner, S. E., Appleton, J. J., & Antaramian, S. (2008). Engagement as flourishing: The contribution of positive emotions and coping to adolescents’ engagement at school and with learning. Psychology in the Schools, 45(5), 419–431. https://doi.org/10.1002/pits 91 Reschly, A. L., & Christenson, S. L. (2012). Jingle, jangle, and conceptual haziness: Evolution and future directions of the engagement construct. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of Research on Student Engagement. (pp. 3–19). New York: Springer Science. Roorda, D. L., Koomen, H. M. Y., Spilt, J. L., & Oort, F. J. (2011). The influence of affective teacher-student relationships on students' school engagement and achievement: a meta- analytic approach. Review of Educational Research, 81(4), 493-529. 
https://doi.org/10.3102/0034654311421793 Rumberger, R., & Rotermund, S. (2012). The relation- ship between engagement and high school drop-out. In S. Christenson, A. Reschly, & C. Wylie (Eds.), Handbook of Research on Student Engagement. (pp. 491–514). New York, New York: Springer. Ryan, R. M., & Deci, E. L. (2000). Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. American Psychologist, 55, 68–78. https://selfdeterminationtheory.org/SDT/documents/2000_RyanDeci_SDT.pdf Salmela-Aro, K., Moeller, J., Schneider, B., Spicer, J., & Lavonen, J. (2016). Integrating the light and dark sides of student engagement using person-oriented and situation-specific approaches. Learning and Instruction, 43, 61–70. https://doi.org/10.1016/j.learninstruc.2016.01.001 San Pedro, M. O. Z., Baker, R. S. J. D., & Rodrigo, M. M. T. (2014). Carelessness and affect in an intelligent tutoring system for mathematics. International Journal of Artificial Intelligence in Education, 24(2), 189–210. https://doi.org/10.1007/s40593-014-0015-y Schmidt, J. A., Rosenberg, J. M., & Beymer, P. N. (2018). A person-in-context approach to student engagement in science: Examining learning activities and choice. Journal of Research in Science Teaching, 55(1), 19-43. https://doi.org/10.1002/tea.21409 Schmidt, J. A., Kafkas, S. S., Maier, K. S., Shumow, L., & Kackar-Cam, H. Z. (2019). Why are we learning this? Using mixed methods to understand teachers’ relevance statements and how they shape middle school students’ perceptions of science utility. Contemporary Educational Psychology, 57, 9–31. https://doi.org/10.1016/j.cedpsych.2018.08.005 Shernoff, D. J., & Schmidt, J. A. (2008). Further evidence of an engagement-achievement paradox among U.S. high school students. Journal of Youth and Adolescence, 37(5), 564– 580. https://doi.org/10.1007/s10964-007-9241-z Skinner, E. A., & Pitzer, J. R. (2012). Developmental dynamics of student engagement, coping, and everyday resilience. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of Research on Student Engagement. (pp. 21–44). NewYork, NY: Springer. 92 Skinner, E. A., Kindermann, T. A., & Furrer, C. J. (2009). A motivational perspective on engagement and disaffection. Educational and Psychological Measurement, 69(3), 493– 525. https://doi.org/10.1177/0013164408323233 Sinatra, G. M., Heddy, B. C., & Lombardi, D. (2015). The challenges of defining and measuring student engagement in science. Educational Psychologist, 50(1). https://doi.org/10.1080/00461520.2014.1002924 Spurk, D., Hirschi, A., Wang, M., Valero, D., & Kauffeld, S. (2020). Latent profile analysis: A review and “how to” guide of its application within vocational behavior research. Journal of Vocational Behavior, 120, 103445. https://doi.org/10.1016/j.jvb.2020.103445 Tabachnick, B.G., & Fidell, L.S. (2013). Using multivariate statistics (6th edn). Boston: Pearson Education. Tomaszewski, W., Xiang, N., & Western, M. (2020). Student engagement as a mediator of the effects of socio-economic status on academic performance among secondary school students in Australia. British Educational Research Journal, 46(3), 610–630. https://doi.org/10.1002/berj.3599 Toth, M. (2021, March 17). Why student engagement is important in a post-COVID world – and 5 strategies to improve it. Learning Sciences International. https://www.learningsciences.com/blog/why-is-student-engagement-important/ Uçar, F. M., & Sungur, S. (2017). 
The role of perceived classroom goal structures, self-efficacy, and engagement in student science achievement. Research in Science and Technological Education, 35(2), 149–168. https://doi.org/10.1080/02635143.2017.1278684
van Rooij, E. C. M., Jansen, E. P. W. A., & van de Grift, W. J. C. M. (2017). Secondary school students' engagement profiles and their relationship with academic adjustment and achievement in university. Learning and Individual Differences, 54, 9–19. https://doi.org/10.1016/j.lindif.2017.01.004
Vasalampi, K., Salmela-Aro, K., & Nurmi, J. E. (2009). Adolescents' self-concordance, school engagement, and burnout predict their educational trajectories. European Psychologist, 14(4), 332–341. https://doi.org/10.1027/1016-9040.14.4.332
Vedder-Weiss, D. (2017). Serendipitous science engagement: A family self-ethnography. Journal of Research in Science Teaching, 54(3), 350–378. https://doi.org/10.1002/tea.21369
Virtanen, T., Lerkkanen, M.-K., Poikkeus, A.-M., & Kuorelahti, M. (2018). Student engagement and school burnout in Finnish lower-secondary schools: Latent profile analysis. Scandinavian Journal of Educational Research, 62(4), 519–537.
Voelkl, K. E. (1996). Measuring students' identification with school. Educational and Psychological Measurement, 56(5), 760–770. https://doi.org/10.1177/0013164496056005003
Wang, C. H., Shannon, D. M., & Ross, M. E. (2013). Students' characteristics, self-regulated learning, technology self-efficacy, and course outcomes in online learning. Distance Education, 34(3), 302–323. https://doi.org/10.1080/01587919.2013.835779
Wang, M.-T., & Eccles, J. S. (2013). School context, achievement motivation, and academic engagement: A longitudinal study of school engagement using a multidimensional perspective. Learning and Instruction, 28, 12–23. https://doi.org/10.1016/j.learninstruc.2013.04.002
Wang, M.-T., & Peck, S. C. (2013). Adolescent educational success and mental health vary across school engagement profiles. Developmental Psychology, 49(7), 1266–1276. https://doi.org/10.1037/a0030028
Watts, L. (2016). Synchronous and asynchronous communication in distance learning: A review of the literature. The Quarterly Review of Distance Education, 17(1), 23–32. https://eric.ed.gov/?id=EJ1142962
Wickman, P. O. (2006). Aesthetic experience in science education: Learning and meaning-making as situated talk and action. Mahwah, NJ: Erlbaum.
Wijekumar, K., Ferguson, L., & Wagoner, D. (2006). Problem with assessment validity and reliability in web-based distance learning environments and solutions. Journal of Educational Multimedia & Hypermedia, 15, 199–215. Retrieved from http://www.editlib.org/p/6259?nl
Wormington, S. V., & Linnenbrink-Garcia, L. (2017). A new look at multiple goal pursuit: The promise of a person-centered approach. Educational Psychology Review, 29, 407–445. https://doi.org/10.1007/s10648-016-9358-2
Xiong, C., Ge, J., Wang, Q., & Wang, X. (2017). Design and evaluation of a real-time video conferencing environment for support teaching: An attempt to promote equality of K-12 education in China. Interactive Learning Environments, 25, 596–609. https://doi.org/10.1080/10494820.2016.1171786
Yamagata-Lynch, L. C. (2014). Blending online asynchronous and synchronous learning. The International Review of Research in Open and Distributed Learning, 15(2). https://doi.org/10.19173/irrodl.v15i2.1778
Zhang, Q. (2014). Assessing the effects of instructor enthusiasm on classroom engagement, learning goal orientation, and academic self-efficacy.
Communication Teacher, 28(1), 44–56. https://doi.org/10.1080/17404622.2013.839047