THE EFFECTS OF COMMUNITY OF INQUIRY, LEARNING PRESENCE, AND MENTOR PRESENCE ON LEARNING OUTCOMES: A REVISED COMMUNITY OF INQUIRY MODEL FOR K-12 ONLINE LEARNERS

By

Yining Zhang

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

Educational Psychology and Educational Technology—Doctor of Philosophy

2018

ABSTRACT

THE EFFECTS OF COMMUNITY OF INQUIRY, LEARNING PRESENCE, AND MENTOR PRESENCE ON LEARNING OUTCOMES: A REVISED COMMUNITY OF INQUIRY MODEL FOR K-12 ONLINE LEARNERS

By

Yining Zhang

Amid an explosive increase in K-12 online education in the United States, the quality of online learning has become a primary concern for researchers, educators, and policymakers. Two theoretical frameworks, Community of Inquiry (CoI) and self-regulated learning (SRL), have provided especially insightful explanations of how students learn in online settings. The present dissertation proposes a revised CoI framework that incorporates learning presence (i.e., self-efficacy and SRL strategies) and mentor presence, and connects them to learning outcomes in a K-12 online setting, drawing on a sample of 696 high-school-level online learners. The study yields four key findings. First, in contrast to Shea and Bidjerano’s (2010) model, this study found a significant relationship between teaching presence and SRL, and a non-significant relationship between teaching presence and self-efficacy. Second, reflecting the fact that learning under the supervision of a mentor is a unique feature of K-12 online learning, this study found that mentor presence significantly predicted students’ use of self-regulated learning strategies.
Third, when the revised framework was used to predict learning outcomes (i.e., satisfaction, perceived progress, and final grade), the hypothesized model containing CoI and SRL was found to relate to those outcomes, answering previous scholars’ calls for a realistic integration of learning presence and learning outcomes into the CoI framework. And fourth, based on a comparison of two groups of students with different primary online-learning locations (i.e., at-home vs. at-school), the study found that at-school students reported significantly lower feelings of isolation, greater ability to generate curiosity once online learning begins, more positive perceptions of their mentors’ practice as problem solvers, and greater use of goal-setting and help-seeking strategies than students whose primary online-learning location was at home. In short, this study is among the first to shed light on the relationships among CoI, self-efficacy, SRL, and learning outcomes in a K-12 online-learning context. Its findings establish the capacity of the proposed theoretical framework to identify important components of K-12 online learning while striking a delicate balance between extensiveness and parsimony. This study also extends our understanding of the mechanisms of online learning among K-12 students, and thus has considerable practical implications for online educators as well as future researchers. Specific recommendations for future research projects are also provided.

To my late grandfather, who taught me the value of education. To my parents, who have loved and supported me unconditionally throughout my life. To my husband, who encourages me with steadfast love, care, and understanding. And to my daughter, who inspires me to be more than what I was the day before: May you understand the journey to knowledge never ends.

ACKNOWLEDGEMENTS

First and foremost, I would like to express my deepest appreciation to my dissertation committee chair and advisor, Dr.
Chin-Hsi Lin, for his continuous support during my doctoral studies. He stimulated my research interest in online learning, and instilled in me – by his own example – a love of and dedication to rigorous research and scholarship. Without Dr. Lin’s guidance and frequent help, this dissertation would never have been completed. I would also like to thank my other committee members: Dr. Dongbo Zhang, for his support and mentorship as I pursued my interest in self-regulated learning; Dr. Lisa Linnenbrink-Garcia, for her insightful suggestions on developing my research and conducting statistical analyses; and Dr. Rand Spiro, for his care and encouragement throughout my doctoral work.

I would also like to extend my appreciation to the Michigan Virtual Learning Research Institute for allowing me to conduct my dissertation research and providing me with assistance whenever I requested it. In this regard, special thanks are due to Dr. Joe Freidhoff, Dr. Kathryn Kennedy, Dr. Jungah Bae, Dr. Kristen DeBruler, and the members of the Blackboard team for their steady support, and to my participants for their enthusiasm. I would like to thank my editor, Dr. Dan MacCannell, for his reliable editing and proofreading of my papers throughout my doctoral studies. I am also thankful to the entire EPET program at Michigan State University: all the faculty and staff that I worked with, took classes from, and met throughout my doctoral studies, and all of my fellow doctoral students. Thank you for offering me an opportunity to start a wonderful journey, and for your encouragement and help along the way.

Last but not least, I would like to thank my family for supporting me with love, care, understanding, encouragement, and inspiration throughout my life. Thanks for always being there for me. Thanks for believing in me. Thanks for loving me.

TABLE OF CONTENTS

LIST OF TABLES .......................................................................................................................... x
LIST OF FIGURES ...................................................................................................................... xii
CHAPTER 1 ................................................................................................................................... 1
INTRODUCTION .......................................................................................................................... 1
CHAPTER 2 ................................................................................................................................... 8
LITERATURE REVIEW ............................................................................................................... 8
Community of Inquiry ................................................................................................................ 8
Teaching Presence .................................................................................................................. 8
Social Presence ..................................................................................................................... 10
Cognitive Presence ................................................................................................................ 12
Self-regulated Learning ............................................................................................................ 13
Mentors in K-12 Online Courses .............................................................................................. 17
Mentor Presence .................................................................................................................... 17
The Impact of Mentor Presence ............................................................................................ 18
Learning Locations for K-12 Online Learners ......................................................................... 20
Research Questions ................................................................................................................... 22
CHAPTER 3 ................................................................................................................................. 27
METHODS ................................................................................................................................... 27
Context ...................................................................................................................................... 27
Participants ................................................................................................................................ 27
Procedure ................................................................................................................................... 29
Survey ....................................................................................................................................... 30
Demographic Information ..................................................................................................... 30
Community of Inquiry .......................................................................................................... 31
Teaching Presence ............................................................................................................ 31
Social Presence ................................................................................................................. 31
Cognitive Presence ............................................................................................................ 32
Self-efficacy .......................................................................................................................... 32
Online Self-regulated Learning Strategies ............................................................................ 32
Mentor Presence .................................................................................................................... 33
Satisfaction ............................................................................................................................ 33
Perceived Progress ................................................................................................................ 33
Data Analysis ............................................................................................................................ 34
Instrument Validation ........................................................................................................... 35
Exploratory Factor Analysis ............................................................................................. 35
Confirmatory Factor Analysis ........................................................................................... 36
Structural Model ................................................................................................................... 38
Testing Mediation ............................................................................................................. 38
RQ1. Shea’s Model and the Alternative Model .................................................................... 38
RQ2. Model with Mentor Presence ...................................................................................... 39
RQ3. Model with Learning Outcomes .................................................................................. 39
RQ4. Learning at School vs. Learning at Home ................................................................... 39
CHAPTER 4 ................................................................................................................................. 40
RESULTS ..................................................................................................................................... 40
Exploratory Factor Analysis ..................................................................................................... 40
Confirmatory Factor Analysis ................................................................................................... 44
Composite Reliability ........................................................................................................... 46
Convergent Validity .............................................................................................................. 47
Factor Loading .................................................................................................................. 47
Average Variance Extracted ............................................................................................. 47
Discriminant Validity ................................................................................................................ 48
Descriptive Statistics ................................................................................................................. 48
Measurement Model ................................................................................................................. 49
RQ1. Shea’s Model and the Alternative Model ........................................................................ 52
RQ2. Model with Mentor Presence .......................................................................................... 56
RQ3. Model with Learning Outcomes – Satisfaction ............................................................... 59
RQ3. Model with Learning Outcomes – Perceived Progress ................................................... 64
RQ3. Model with Learning Outcomes – Final Grade ............................................................... 67
RQ4. Studying at Home or at School ........................................................................................ 70
CHAPTER 5 ................................................................................................................................. 75
DISCUSSION ............................................................................................................................... 75
Comparing Shea’s Model against the Alternative Model ......................................................... 77
Teaching Presence and Self-regulated Learning Strategies .................................................. 77
Teaching Presence and Self-efficacy .................................................................................... 78
Mentor Presence in K-12 Online Learning ............................................................................... 80
Mentor Presence .................................................................................................................... 80
Mentor Presence, Self-efficacy, and Self-regulated Learning Strategies .............................. 81
Connecting the CoI Models with Learning Outcomes ............................................................. 83
Incorporating Learning Presence into the CoI Model ........................................................... 83
Distinctive Learning Outcomes ............................................................................................ 85
Learning At-home vs. Learning At-school ............................................................................... 86
Feeling of Isolation ............................................................................................................... 86
Triggering Event in Cognitive Presence ............................................................................... 88
Perceived Mentor’s Help ...................................................................................................... 89
Goal-setting & Help-seeking Learning Strategies ................................................................ 90
Limitations and Recommendations for Future Research .......................................................... 90
Implications for Practitioners .................................................................................................... 92
Learning Presence ................................................................................................................. 92
Mentor Presence .................................................................................................................... 93
Online Learning at School .................................................................................................... 94
CHAPTER 6 ................................................................................................................................. 96
CONCLUSION ............................................................................................................................. 96
APPENDICES .............................................................................................................................. 98
APPENDIX A: Parental Consent Form .................................................................................... 99
APPENDIX B: Student Assent Form ..................................................................................... 101
APPENDIX C: Survey ............................................................................................................ 103
REFERENCES ........................................................................................................................... 107

LIST OF TABLES

Table 1. Descriptive Statistics ...................................................................................................... 28
Table 2. Test of the Validity of the Latent Constructs – Eigenvalues of the Exploratory Factor Analysis (N = 348) ........................................................................................................................ 40
Table 3. Test of the Validity of the Latent Constructs – Exploratory Factor Analysis (N = 348) ....................................................................................................................................................... 41
Table 4. Test of the Validity of the Latent Constructs – Confirmatory Factor Analysis (N = 348) ....................................................................................................................................................... 44
Table 5. Test of the Validity of the Latent Constructs – The Standardized Factor Loading, SSI, SEV, AVE, and Composite Reliability of the Latent Constructs (N = 348) ................................. 46
Table 6. Test of the Validity of the Latent Constructs – Discriminant Validity for the Measurement Model (N = 348) ..................................................................................................... 48
Table 7. Correlation Table ............................................................................................................ 49
Table 8. Measurement Model ....................................................................................................... 50
Table 9. Fit Indices of Shea’s Model and the Alternative Model ................................................. 52
Table 10. The Structural Model for the Alternative Model ......................................................... 53
Table 11. The Structural Model for the Modified Alternative Model ......................................... 55
Table 12. The Structural Model for the Model with Mentor Presence ........................................ 56
Table 13. The Structural Model for the Modified Model with Mentor Presence ........................ 58
Table 14. The Structural Model for the Hypothesized Model with Satisfaction ......................... 60
Table 15. The Structural Model for the Modified Model with Satisfaction ................................ 62
Table 16. The Structural Model for the Hypothesized Model with Perceived Progress .............. 64
Table 17. The Structural Model for the Modified Model with Perceived Progress ..................... 65
Table 18. The Structural Model for the Hypothesized Model with Final Grade ......................... 68
Table 19. The Structural Model for the Modified Model with Final Grade ................................. 69
Table 20. The Demographic Differences between “At-school” Students and “At-home” Students ....................................................................................................................................................... 71
Table 21. Group Comparisons with Means and ANOVAs .......................................................... 72
Table 22. Summary of the Tested Hypotheses ............................................................................. 76

LIST OF FIGURES

Figure 1. Shea’s Model ................................................................................................................... 3
Figure 2. The Hypothesized Alternative Model ........................................................................... 23
Figure 3. The Hypothesized Alternative Model with Mentor Presence ....................................... 24
Figure 4. The Hypothesized Alternative Model with Mentor Presence and Learning Outcomes 25
Figure 5. The Path Coefficients of the Alternative Model ........................................................... 54
Figure 6. The Path Coefficients of the Modified Alternative Model ........................................... 55
Figure 7. The Path Coefficients of the Alternative Model with Mentor Presence ....................... 57
Figure 8. The Path Coefficients of the Modified Alternative Model with Mentor Presence ....... 59
Figure 9. The Path Coefficients of the Alternative Model with Mentor Presence and Satisfaction ....................................................................................................................................................... 61
Figure 10. The Path Coefficients of the Modified Alternative Model with Mentor Presence and Satisfaction .................................................................................................................................... 63
Figure 11. The Path Coefficients of the Alternative Model with Mentor Presence and Perceived Progress ......................................................................................................................................... 65
Figure 12. The Path Coefficients of the Modified Alternative Model with Mentor Presence and Perceived Progress ........................................................................................................................ 66
Figure 13. The Path Coefficients of the Alternative Model with Mentor Presence and Final Grade .............................................................................................................................................. 69
Figure 14. The Path Coefficients of the Modified Alternative Model with Mentor Presence and Final Grade .................................................................................................................................... 70

CHAPTER 1

INTRODUCTION

Enrollment in K-12 online courses has increased dramatically over the past decade. During the 2007-2008 school year, an estimated 1.03 million K-12 students in the United States took at least one online course, 47% more than had done so in 2005-2006 (Wicks, 2010). By the 2014-2015 school year, there were some 3.8 million online K-12 course enrollments by an estimated 2.2 million students (Watson, Pape, Murin, Gemin, & Vashaw, 2015). In addition to these overall increases in enrollment in K-12 online learning, the number of U.S. states providing it has also increased markedly: from 38 in 2006 (De Laat, Lally, Lipponen, & Simons, 2007), to all 50 states and the District of Columbia by 2011 (Watson, Murin, Vashaw, Gemin, & Rapp, 2011).
Some states, including Michigan, Virginia, Alabama, Florida, Idaho, Georgia, and West Virginia, have even included online courses among their high-school graduation requirements (Corry & Stella, 2012; DiPietro, Ferdig, Black, & Preston, 2008; Watson et al., 2015). Amid this explosive increase in K-12 online education, its quality has become a primary concern for students, parents, researchers, educators, and policymakers (Black, Ferdig, & DiPietro, 2008; Borup, Graham, & Davies, 2013; Corry & Stella, 2012; Liu & Cavanaugh, 2012). As enrollment expands both numerically and geographically, many institutions that offer online courses are facing challenges involving both higher student-attrition rates and lower academic performance, as compared to brick-and-mortar schools (Bernard, Abrami, Borokhovski, Wade, Tamim, Surkes, & Bethel, 2009; Borup et al., 2013; Cavanaugh, Gillan, Kromrey, Hess, & Blomeyer, 2004; Smith, Clark, & Blomeyer, 2005).

To address these two issues, it is crucial to understand K-12 students’ online-learning processes and the factors that predict their performance. Two theoretical frameworks have provided especially insightful explanations of how students learn online in both higher-education and K-12 contexts. The first is the Community of Inquiry (CoI) framework devised by Garrison, Anderson, and Archer (2001), and the second, Zimmerman’s (1986) self-regulated learning (SRL) theory. CoI proposes that three components – teaching, social, and cognitive presences – explain how learners construct knowledge in online settings. SRL, on the other hand, focuses on how learners regulate their own learning processes metacognitively, motivationally, and behaviorally. CoI reflects social constructivism, insofar as it focuses on a dialogic approach to knowledge construction, and postulates that when teaching and social presences are provided, learners will necessarily acquire knowledge.
This framework, however, does not take account of learners’ confidence, goal orientation, beliefs, or systematic efforts to plan, implement, and monitor their learning (Cho, Kim, & Choi, 2017; Shea & Bidjerano, 2010); and when learners are not motivated, or do not monitor their learning progress, desirable learning outcomes are less likely (Pintrich & De Groot, 1990). To address these limitations, Shea and Bidjerano (2010) developed a revised CoI model that includes learning presence, which relates closely to the strategic-learner role in SRL theory. Shea and Bidjerano’s model further suggests that teaching presence, social presence, and learning presence are all associated with cognitive presence (see Fig. 1). Support for Shea and Bidjerano’s approach was provided by Cho et al.’s (2017) findings that SRL played a significant role in students’ perceived CoI, and that developing students’ SRL is crucial to the creation of positive online communities of inquiry.

Figure 1. Shea’s Model

Though their model is more comprehensive than either CoI or SRL by itself when it comes to explaining online-learning performance, Shea and Bidjerano (2010) conceded that it still had limitations with regard to the conception of learning presence. Specifically, their revised CoI operationalized learning presence as self-efficacy and effort regulation, but the latter construct covers only a part of SRL. In other words, self-regulation is a metacognitive, motivational, and behavioral process in which learners actively participate in their own learning (Zimmerman, 1986); yet Shea and Bidjerano’s revised CoI included only students’ self-efficacy and one aspect of strategic learning, and did not consider the potential influence of other metacognitive uses of learning strategies.
Because students’ degree of SRL plays an important role in their learning achievement (Graham & Harris, 2000; Pintrich & De Groot, 1990; Schunk & Zimmerman, 2007; Zimmerman, 2008; Zimmerman & Bandura, 1994), it is necessary to incorporate various types of SRL strategies into any model that aims at a comprehensive understanding of the mechanisms through which effective online-learning experiences are created (Barnard, Paton, & Lan, 2008; Kim, Park, Cozart, & Lee, 2015; Ley & Young, 2001; Lin, Zhang, & Zheng, 2017; McMahon & Oliver, 2001; Rockinson-Szapkiw, Wendt, Whighting, & Nisbet, 2016; Wang, Shannon, & Ross, 2013). Another limitation of Shea and Bidjerano’s revised CoI is its lack of predictive power regarding students’ learning outcomes (academic performance, learning satisfaction, or perceived learning). If we are to arrive at valid theoretical models that are useful for promoting effective online education, such models must be linked to measurable student-achievement outcomes (Akyol & Garrison, 2011a; Garrison, Cleveland-Innes, & Fung, 2010; Rockinson-Szapkiw et al., 2016).

Perhaps more importantly, most of the previous studies that examined how students learned online using the CoI and/or SRL frameworks were conducted in postsecondary online-learning settings (Akyol & Garrison, 2011b; Shea & Bidjerano, 2010, 2012), leaving the rapidly increasing population of K-12 online students under-researched (Hawkins, Barbour, & Graham, 2011). Rather than taking it for granted that there are no important differences between postsecondary and K-12 online learning, it is crucial to establish a conceptual framework that is sensitive to how students learn online in the K-12 context specifically (Barbour, 2013; Barbour & Reeves, 2009; Corry & Stella, 2012; Rice, 2006; Smith et al., 2005).
In particular, when applying Shea and Bidjerano’s (2010) revised CoI model, two unique characteristics of K-12 online learning should be considered: the presence of on-site mentors (Borup & Drysdale, 2014), and the use of in-school time for students to take online courses (Roblyer & Marshall, 2002). It is reasonably clear that the physical separation between learners and teachers makes it difficult for online teachers to provide personalized and targeted support to their K-12 students (Hawkins et al., 2011). To mitigate this problem, many K-12 online-learning institutions assign learners to mentors who are based in the students’ own brick-and-mortar schools – usually, existing staff members who have been specially trained for the purpose (Barbour & Mulcahy, 2004, 2009; Drysdale, Graham, & Borup, 2014; Harms, Niederhauser, Davis, Roblyer, & Gilbert, 2006; Wicks, 2010). Some states, such as Michigan, even require that their school districts provide each student who learns online with a mentor who is a state-certified teacher (Michigan Virtual Learning Research Institute, 2014). Unlike online instructors, who cannot be physically present, these mentors can be in the classroom or learning lab during times when students are encouraged or required to be studying (Hannum, Irvin, Lei, & Farmer, 2008).

The second unique feature of the existing K-12 online-learning environment is its use of in-school lab learning time, for two main reasons. First, it helps avoid the potential digital divide created by a lack of access to technology in students’ homes (Roblyer & Marshall, 2002). For example, the 2014 Keeping Pace with K-12 Online Learning report showed that students in five charter schools in New Orleans did not have stable and reliable Internet access at home.
As a solution, they used desktops at school or relied on libraries to access online-learning content (Watson, Pape, Murin, Gemin, & Vashaw, 2014). Second, it is believed that requiring some in-school learning time could provide additional monitoring and support to high-school students, who may have less experience with online courses (Roblyer & Marshall, 2002). Removing seat-time requirements may lead to poor learning performance if learners spend as little time as possible on content learning (Rice, 2014).

In this study, I propose a further revision of the CoI framework that incorporates aspects of CoI and SRL as well as actual learning outcomes, and which is specially tailored to K-12 online settings. The next section comprises a literature review structured around CoI, SRL, and the two key distinctive features of such settings (i.e., mentors and learning locations). It is argued that, to better understand the mechanism(s) that can predict students’ cognitive presence in online learning, several components – including teaching presence, social presence, mentor presence, motivation, and online SRL – need to be taken into account. A second aim of the study is to understand how learning location may or may not affect students’ perceptions of teaching presence, social presence, mentor presence, and cognitive presence, as well as their learning outcomes. At a theoretical level, findings from studies such as this one can help us gain a more comprehensive understanding of the mechanisms of students’ online learning. It is hoped that this, in turn, will provide future K-12 online-learning studies with a more robust foundation, and answer previous scholars’ calls for further investigation of the teaching, social, and cognitive aspects of online learning; its impacts on knowledge building; and which of its components are critical to the promotion of achievement (Barbour & Reeves, 2009; Rice, 2006; Smith et al., 2005).
It will also shed light on whether and how one’s primary learning location may affect students’ online learning perceptions and learning outcomes. At a practical level, Cavanaugh, Barbour, and Clark (2009) have argued that K-12 online-learning studies should provide best-practice guidelines for online teachers. For K-12 online-learning instructors, mentors, and administrators, findings from the current study could elucidate not only the complex relationships between the studied underlying constructs that exist within online-learning settings, but also the influence of the quality of teaching presence, mentor presence, social presence, and course characteristics on students’ cognitive presence. Findings on differences associated with students’ choices of primary learning location can also inform online educators, researchers, and policy-makers about where such differences may arise, and prompt a rethinking of how students’ online learning experiences can be reshaped both at school and at home.

CHAPTER 2
LITERATURE REVIEW

Community of Inquiry

The CoI framework explains the complexities of online learning by incorporating the conceptual components that are believed to be fundamental to successful knowledge-construction in online-learning environments, focusing on active collaboration and construction among active participants in such learning communities (Garrison, 2007; Garrison et al., 2001). At its root, CoI reflects deep and meaningful learning through the complex dynamics of design, facilitation, and interaction within the course (Akyol & Garrison, 2011a). As well as its three aforementioned key components (teaching presence, social presence, and cognitive presence), the CoI framework considers how these three components interact with each other, and how such interaction results in individuals’ constructive online-learning experiences.

Teaching Presence. According to the CoI framework, teaching presence has three key responsibilities.
The first is the design, inception, and organization of the learning content, learning activities, and learning schedules. The second is the facilitation of learning discourse and students’ online collaborative learning; and the third is direct content-related instruction. Teaching presence plays the most fundamental role in online learning, as it sets the tone for the whole learning experience (Anderson, Rourke, Garrison, & Archer, 2001; Garrison, Anderson, & Archer, 2010; Garrison et al., 2001). It can also facilitate and direct social presence and cognitive presence, in a manner that enables learners to generate meaningful learning outcomes (Anderson et al., 2001). Teaching presence was found to be a significant predictor of self-efficacy in Shea and Bidjerano’s (2010) structural equation modeling (SEM) analysis of 3,000 undergraduates taking online courses. In addition to its relationship to self-efficacy, teaching presence has a direct and positive correlation with self-regulation. Two components of teaching presence – facilitating discourse and direct instruction – have been found to be directly correlated with metacognition, as these responsibilities help to develop one’s metacognitive awareness in monitoring and regulating learning (Akyol & Garrison, 2011a). Ebner and Ehri (2016) demonstrated that teachers could develop students’ self-regulation abilities through deliberate and structured teaching of thinking procedures for online learning, and when Crippen and Earl (2007) provided worked examples to online learners, it resulted in better problem-solving skills. The generally accepted account of the relationship between teaching presence and cognitive presence is that well-structured, carefully facilitated, direct instruction by online instructors has a positive influence on the establishment of higher-order cognitive learning (Garrison, 2007). Meyer (2004), for example, found that triggering questions raised by teachers influenced the quality of students’ responses.
Similarly, Celentin (2007) pointed out that the role of the teacher was strongly related to whether class discussions reached the highest level of cognitive inquiry. Garrison et al.’s (2001) SEM model of the relationships among CoI’s three presences found that teaching presence was a significant, direct predictor of cognitive presence. Shea and Bidjerano’s (2010) structural model of the relationships among the components of CoI found that teaching presence directly and positively predicted cognitive presence. In another study of more than 2,000 college students, Shea and Bidjerano (2012) again found that teaching presence predicted cognitive presence, even after delivery mode and students’ prior online learning experience were controlled for. And more recently, Rockinson-Szapkiw et al. (2016) highlighted the singular power of teaching presence, over and above social presence, to positively predict the effectiveness of online learning environments. Previous studies have consistently demonstrated that teaching presence is positively associated with satisfaction and/or perceived progress (Akyol & Garrison, 2008; Akyol & Garrison, 2014; Arbaugh, 2013; Garrison et al., 2010; Oliver, Osborne, & Brandy, 2009; Rockinson-Szapkiw et al., 2016). Garrison et al. pointed out that teaching presence had a major and positive impact on learning, which set the tone for the overall learning process. In a study of students in an online course, based on transcript analysis and a survey, all of Akyol and Garrison’s (2014) respondents reported that teaching presence was the most critical factor in their learning. The same study also found that teaching presence was strongly and positively correlated with satisfaction and perceived learning. Rockinson-Szapkiw et al., meanwhile, affirmed the importance of teaching presence to the quality of course design and implementation.
Based on the relevant literature, this study proposes to test the following four hypotheses related to teaching presence:

H1: Teaching presence positively predicts students’ online SRL strategies.
H2: Teaching presence positively predicts students’ self-efficacy.
H3: Teaching presence positively predicts cognitive presence.
H4: Teaching presence positively predicts learning outcomes.

Social Presence. Social presence refers to the degree to which the learner creates personal but purposeful relationships, and develops social bonds, in his/her learning context (Garrison, 2007; Garrison et al., 2001, 2010). Its three main aspects are affective communication, open communication, and group cohesion. Ideally, social presence operates to create open and comfortable conditions for students’ inquiry and their high-quality interaction with other parties, in the service of a wider educational goal (Garrison et al., 2010). Garrison (2007) sharply criticized some previous studies that artificially isolated social presence from CoI’s other two constructs, and therefore failed to take account of the complex dynamics that took place between social presence and the other two in real-world settings. The present research endorses the idea that social presence cannot effectively be studied independently of cognitive presence and teaching presence. It has been hypothesized that a positive social presence should predict students’ self-efficacy, by serving as a source of “social persuasion” (Shea & Bidjerano, 2010, p. 1724). Wighting, Nisbet, and Rockinson-Szapkiw (2013) found that a higher sense of social community was associated with higher confidence in learning; and Shea and Bidjerano (2010) found that social presence directly and positively predicted students’ self-efficacy. Arguing that CoI should be transformed from a theoretical and descriptive tool into one that can be used predictively in quantitative studies, Garrison et al.
(2010) created an SEM model that included its three presences and their interactive relationships. This model’s results suggested that social presence mediated the relationship between teaching presence and cognitive presence. Shea and Bidjerano (2008) used cognitive presence as the dependent variable in a hierarchical regression analysis, and found that social presence significantly predicted cognitive presence. In a second study, Shea and Bidjerano (2010) constructed an SEM model that established the direct and indirect relationships among a number of variables in online learning, and confirmed a direct and positive relationship between social presence and cognitive presence. Subsequently, Shea and Bidjerano (2012) confirmed that social presence was a significant and positive predictor of cognitive presence. Social presence has also been found to be significantly and positively correlated with learning-outcome variables including satisfaction and perceived progress (Akyol & Garrison, 2008; Arbaugh, 2013; Cobb, 2011; Hawkins, Graham, Sudweeks, & Barbour, 2013; Kang, Liw, Kim, & Park, 2014; Nisbet, Wighting, & Rockinson-Szapkiw, 2013; Rockinson-Szapkiw et al., 2016; Zhan & Mei, 2013). For example, echoing Wighting et al. (2013), Nisbet et al. (2013) reported that the higher online students’ sense of social community was, the higher they rated all three perceived-learning areas, i.e., cognitive, affective, and psychomotor. And a recent meta-analysis conducted by Richardson, Maeda, Lv, and Caskurlu (2017) showed large positive correlations between social presence and satisfaction, and between social presence and perceived learning. Based on the relevant literature, this study proposes to test three additional hypotheses related to social presence, as follows:

H5: Social presence positively predicts students’ self-efficacy.
H6: Social presence positively predicts cognitive presence.
H7: Social presence positively predicts learning outcomes.

Cognitive Presence.
Cognitive presence is the process whereby learners explore information, connect ideas, and apply new ideas to other settings through active construction and reflection as part of a community of inquiry (Garrison et al., 2001). Cognitive presence in an online-learning context can be represented as four phases in the progressive development of inquiry: triggering event, exploration, integration, and resolution. According to the CoI framework, cognitive presence is a cycle consisting of all four of these phases. Rockinson-Szapkiw et al. (2016) examined the relationship between CoI and perceived learning outcomes, and found that students with high levels of perceived cognitive presence tended to have higher course grades. Using hierarchical regression analysis, Shea and Bidjerano (2008) found that cognitive presence was a significant predictor of learners’ overall online learning satisfaction; similarly, in three consecutive studies conducted over a two-year period, Kang et al. (2014) found that cognitive presence significantly predicted students’ learning satisfaction. Akyol and Garrison (2011a) found that students’ perceived levels of cognitive presence were significantly and positively associated with both their perceived learning and their learning satisfaction. Qualitative interviews conducted as part of the same study revealed that students considered their class projects as opportunities to synthesize and evaluate course content from throughout the semester, and attributed their satisfaction to their perceived cognitive presence. In light of the findings of the reviewed literature, this study proposes to test the following hypothesis related to cognitive presence:

H8: Cognitive presence positively predicts learning outcomes.
Self-regulated Learning

In contrast to the CoI framework, which mainly describes the processes of online learning at a macro level, SRL theory helps to explain them at the micro level of individual characteristics (Shea & Bidjerano, 2010). As Zimmerman (1986, p. 308) summarized it, SRL is a multidimensional construct that regards students as “metacognitively, motivationally, and behaviorally active participants in their own learning process”. More specifically, it describes students’ proactive self-direction of their own learning processes, whereby their beliefs are transformed into learning goals and thus into academic performance (Zimmerman, 2008). Self-regulation is not a quality that one either possesses or does not possess; rather, it refers to the use of one or more specific processes that vary according to the specific learning task one is facing (Zimmerman, 2008). Early studies of SRL mainly focused on typologies of learning strategies (e.g., monitoring, planning, rehearsal, and time management) and/or on how to train students to adopt such strategies (Zimmerman, 2008). While these strategies were demonstrably effective, their ability to foster real improvements in learning was found to be limited unless students adopted them spontaneously (Zimmerman, 2008). Noting that there is more to SRL than merely choosing and using appropriate learning strategies (Schunk & Zimmerman, 2012), researchers have greatly expanded the scope of this construct to include additional interrelated motivational processes, such as self-efficacy (Zimmerman, 2002; Zimmerman & Schunk, 2012). Grounded in social cognitive theory, self-efficacy emphasizes the close interaction between behavior, person, and environment, and is shaped by four types of influences: mastery experience, vicarious learning, verbal persuasion, and physiological states (Bandura, 1997).
Self-efficacy in online learning has been found to be an effective predictor of students’ use of metacognitive learning strategies (Cho & Shen, 2013; Shea & Bidjerano, 2010), and both self-efficacy and the use of SRL strategies have been confirmed as necessary to becoming a highly self-regulated learner (Zimmerman, 2002). It has been convincingly argued that in online-learning contexts, becoming a self-regulated learner is even more important than it is in face-to-face ones (Tsai, Shen, & Fan, 2013). As Zimmerman (1986, 2002) has pointed out, self-regulation strongly emphasizes students’ personal choices, goal-setting, and learning autonomy. Online learning, because it lacks face-to-face interaction between teacher and students, requires the learner to be highly proactive and organized, planning his or her own learning and evaluating progress regularly (Puzziferro, 2008). Lynch and Dembo (2004) found that the self-regulatory attributes that were critical to online learning success included motivation, Internet self-efficacy, time management, study-environment management, and learning-assistance management. Building upon SRL theory and the unique characteristics of online learning environments, Barnard, Lan, To, Paton, and Lai (2009) developed the Online Self-regulated Learning Questionnaire (OSLQ) to examine the psychometric properties of SRL in such environments. Their results identified several key aspects of online self-regulation, including environment-structuring, goal-setting, time management, help-seeking, task strategies, and self-evaluation. And Barnard-Brak, Lan, and Paton (2010) distinguished five profiles of online learners, ranging from the most competent self-regulators to the least competent, and found that academic achievement differed significantly across these profile memberships, with the most competent self-regulators attaining the highest achievement.
The advantages to students of being highly autonomous and self-regulated have also been noted in the case of K-12 online learning. Consistent with findings derived from postsecondary online-learning settings, constructs such as self-efficacy, intrinsic motivation, goal-setting, time management, and metacognitive skills can be key to achieving K-12 online learning success (Cavanaugh et al., 2004; DiPietro et al., 2008; Kim et al., 2015; Lin et al., 2017; Weiner, 2003). Moreover, younger learners are likely to possess lower levels of self-regulation ability than adult ones, leading the former to need more frequent teacher-student communication, more guidance from teachers as online-learning-behavior scaffolders, and more explicit instruction in cognitive and metacognitive skills (Cavanaugh et al., 2004). Some studies have found that online SRL skills are an important predictor of cognitive presence. For example, Garrison (2007) noted that to promote cognitive inquiry, individuals need to enhance their own metacognitive awareness of their tasks, and especially the ability to identify the level of their own cognitive contributions during class activities. In a similar vein, Pawan, Paulus, Yalcin, and Chang (2003) recommended that, to promote higher levels of cognitive presence, students should apply a strategy of self-coding to their responses in class. The SEM results for Shea and Bidjerano’s (2010) model establishing the relationship between CoI, self-efficacy, and effort regulation indicated that self-efficacy directly predicted the respondents’ online cognitive presence, and that online effort regulation also partially mediated the association between these learners’ self-efficacy and their cognitive presence. Cho et al. (2017) used cluster analysis to classify students into four groups based on different self-regulation levels, and found that the high-self-regulation group had a stronger sense of CoI than the low-self-regulation one.
Wang, Peng, Huang, Hou, and Wang (2008) found that online students’ self-efficacy predicted their learning outcomes via the mediating effect of SRL strategies. A learner’s degree of self-regulation is strongly and positively correlated with the likelihood of his or her academic success (Zimmerman, 2008). As Schunk (2005) put it, “self-regulated learning is seen as a mechanism to help explain achievement differences among students and as a means to improve achievement” (p. 85). In a study of students enrolled in an online liberal-arts course, Puzziferro (2008) established a clear relationship between online SRL subscales (i.e., time and study environment, effort regulation) and learning performance. The same study’s findings also indicated that high learning satisfaction was positively correlated with a variety of self-regulation subscales, including both cognitive and metacognitive strategies. Barnard et al. (2008) used the OSLQ to measure students’ online SRL skills, and found that approximately one-third of the relationship between perceptions of online-course communication and academic performance was mediated by online SRL strategies, further highlighting the importance of self-regulation in online learning. In the K-12 context, Roblyer and Marshall (2002) devised an instrument for detecting the likely success or failure of online students, and reported that intrinsic motivation, self-efficacy, time management, and goal-setting were critical determinants of middle- and high-school students’ online learning success. Roblyer, Davis, Mills, Marshall, and Pape (2008) found that self-efficacy and SRL skills, including goal-setting, time management, and self-reflection, significantly predicted whether online students would be successful or unsuccessful. Based on the literature reviewed above, this study proposes to test the following three hypotheses related to SRL:

H9: Self-efficacy positively predicts online SRL strategies.
H10: Self-efficacy positively predicts cognitive presence.
H11: Online SRL strategies positively predict cognitive presence.

Mentors in K-12 Online Courses

Most U.S. K-12 students who are enrolled in online courses also regularly attend a physical school locally, which typically provides each of them with a mentor (Borup & Drysdale, 2014). Mentors can help maintain a supportive on-site learning environment while also tracking students’ learning progress (Harms et al., 2006), and ultimately their role is to ensure that students achieve success in their online courses (Wicks, 2010). Barbour and Mulcahy (2004, p. 11) considered the establishment of the on-site mentor role to be “one of the most dramatic changes” in distance learning since its extension to the K-12 sphere. It is important that such mentors’ support should be both continuous and comprehensive (Barbour, 2012; Freidhoff, Borup, Stimson, & DeBruler, 2015; Harms et al., 2006).

Mentor Presence. For the purposes of the present research, mentors’ roles can be said to fall largely into four categories: relationship builders, monitors, content-learning facilitators, and problem solvers (e.g., Aronson & Timms, 2003; Barbour & Mulcahy, 2004; Borup & Drysdale, 2014; de la Varre, Keane, & Irvin, 2011; Drysdale, 2013; Hannum et al., 2008; Kennedy & Cavanaugh, 2010; Taylor et al., 2016; Wortmann et al., 2008). The relationship-builder role refers to the mentor’s endeavors to establish positive relationships with learners to make them feel less disconnected and isolated (Barbour & Mulcahy, 2004, 2009; Drysdale, 2013; Hannum et al., 2008; Pettyjohn, 2012). This is of critical importance, insofar as students’ connections with their online learning environments and teachers have always been of primary concern in online learning, in both theory and practice (Gunawardena et al., 2009; Swan, 2002).
A monitor’s responsibilities include reminding students to submit assignments on a regular basis, contacting them immediately if they are found to be falling behind, tracking their study progress, detecting possible failures, and being aware of absences (Aronson & Timms, 2003; Barbour & Mulcahy, 2004; de la Varre et al., 2011; Drysdale, 2013). As for the role of content-learning facilitator, some studies have maintained that school-based mentors do not necessarily need to be content experts (Aronson & Timms, 2003; Borup & Drysdale, 2014). However, an increasing amount of research has shown that mentors do participate in some content-related learning activities, even where direct instruction is not listed among their official role responsibilities (Barbour & Mulcahy, 2004; de la Varre et al., 2011; O’dwyer, Carey, & Kleiman, 2007; Taylor et al., 2016). Last but not least, mentors should work as problem solvers; this role can include providing technical assistance to students (Barbour & Mulcahy, 2004; de la Varre et al., 2011; Hannum et al., 2008; Keane, de la Varre, Irvin, & Hannum, 2008), and conducting orientations to help familiarize students with the online learning environment (Aronson & Timms, 2003; Pettyjohn, 2012).

The Impact of Mentor Presence. The U.S. Department of Education (2007) reported that students who sought assistance only from their online instructors were less likely to achieve success than those who also took advantage of local on-site support. Roblyer et al. (2008) found that assigning students on-site mentors, and designating a specific period and location at school for online-course participation, were both effective tactics for increasing pass rates, and in some cases doubling them. Hannum et al. (2008) found that trained facilitators played a positive role in student retention and in encouraging students to spend more time engaging with their course content.
This in turn echoed Simpson’s (2004) finding that mentors’ constant communication with students helped to improve retention rates. Recently, Taylor et al. (2016) reported that credit-recovery students who received extensive instructional support from their mentors tended to have higher course pass rates than those who had little such support. And Ferdig (2010) found that at-risk students who checked in with their mentors on at least two days per week passed at least one course, and were likely to express a belief that mentors were a critical factor in helping them to achieve success. Previous studies have also reported mentors’ positive impacts on the promotion of motivation and on the enhancement of SRL strategies (de la Varre et al., 2011; Freidhoff et al., 2015; Murphy & Rodriguez-Manzanares, 2009; Staker, 2011). For instance, de la Varre et al. found that mentors’ presence in the classroom helped to encourage students’ self-efficacy, and that mentors’ support was especially beneficial to those who were struggling with their courses or felt frustrated by online learning. Freidhoff et al. (2015) interviewed 14 on-site mentors from programs with a history of high performance and found that all of them described working with students on time-management issues, including educating them about why they needed time-management skills in online learning, finding solutions for procrastination, and helping them arrange times for study. Based on the relevant literature, this study proposes to test the following four hypotheses related to mentor presence:

H12: Mentor presence positively predicts self-efficacy.
H13: Mentor presence positively predicts online SRL strategies.
H14: Mentor presence positively predicts cognitive presence.
H15: Mentor presence positively predicts learning outcomes.

Learning Locations for K-12 Online Learners

K-12 online learning is not limited to learning behavior that happens at home.
The 2015 Keeping Pace with K-12 Digital Learning Report revealed that a current trend in K-12 online learning is a shift from mostly online delivery toward frequent combination with onsite learning (Watson et al., 2015). The same document reported that many students actually take their online courses at a physical learning institution, such as their local school or another formal learning center, rather than at home. De la Varre et al. (2011) likewise reported that taking an online course does not mean that learning happens exclusively in a virtual learning environment: K-12 online learners actually take their online courses in school classrooms, libraries, or home-school settings. However, this does not mean that all K-12 online learners nowadays use a location other than their homes to access online learning content. In fact, in contrast to those who choose to learn from school labs or classrooms, a considerable number of students choose to take their online courses from home without seeking support from their physical schools (Roblyer & Marshall, 2002; Watson et al., 2015). Although no study so far has explicitly compared the learning outcomes of students who do the majority of their online learning at school with those who do it at home, it has been suggested that seat time at school could be beneficial for online learning. Collecting feedback from virtual high school instructors, Roblyer and Marshall (2002) found that a common theme among these instructors concerned the location of students’ online learning. On the one hand, a lack of Internet access in some students’ homes may contribute to a digital divide and affect their learning experience and performance. On the other hand, teachers believed that requiring students to actually sit in a room with access to online learning, and allocating certain time during the school day for online courses, helps guarantee the quality of online learning.
This has much to do with the hypothesis that K-12 online learners may show low levels of self-regulated learning ability (both cognitive and metacognitive) when it comes to learning independently and autonomously in an online learning environment (Borup & Drysdale, 2014; Rice, 2006). For example, both Harms et al. (2006) and de la Varre et al. (2011) reported that students who study in the same physical room as other students spend more time on learning content and less time on off-task behaviors. Studies have also found that a shared physical location may be particularly helpful for those who are struggling with learning content or receive very little support from home (Archambault, Diamond, Brown, Cavanaugh, Coffey, Foures-Aalbu, & Zygouris-Coe, 2010; Freidhoff et al., 2015; Pettyjohn, 2012). It is also believed that mentors set the local climate of students’ at-school online learning experience (de la Varre, Irvin, Jordan, Hannum & Farmer, 2014), and greatly facilitate those who choose to learn their online courses at school (Borup & Drysdale, 2014; Freidhoff et al., 2015). Because mentors work as liaisons between the online instructor and the student (de la Varre et al., 2014), a student who uses the school lab as his/her primary online learning location may benefit from mentors’ practices, including but not limited to building rapport (Barbour & Mulcahy, 2004, 2009; Drysdale, 2013; Hannum et al., 2008; Pettyjohn, 2012), progress monitoring (Aronson & Timms, 2003; Barbour & Mulcahy, 2004; de la Varre et al., 2011; Drysdale, 2013), content facilitation and direct instruction (Aronson & Timms, 2003; Barbour & Mulcahy, 2004; Borup & Drysdale, 2014; de la Varre et al., 2011; O’dwyer et al., 2007; Taylor et al., 2016), and trouble-shooting (Aronson & Timms, 2003; Barbour & Mulcahy, 2004; de la Varre et al., 2011; Hannum et al., 2008; Keane et al., 2008; Pettyjohn, 2012), as discussed in previous sections.
Accordingly, based on the relevant literature, this study proposes to test the following hypothesis related to the primary online learning location:

H16: Students who use school as their primary online learning location demonstrate higher levels of social presence, mentor presence, cognitive presence, self-efficacy, online SRL strategies, and learning outcomes than those who choose to spend the majority of their online learning time at home.

Research Questions

The present study aims to extend Shea and Bidjerano’s (2010) revised CoI framework by incorporating additional components drawn from SRL theory, as well as the effects of mentor presence; it will then examine the resultant new framework in terms of its power to predict learning outcomes in K-12 online settings. This study also aims to test whether or not students’ primary online learning location matters. It is guided by the following four research questions:

1. Looking at Shea and Bidjerano’s (2010) model, does adding a path from teaching presence to SRL make the model a better fit? Shea and Bidjerano’s (2010) model is shown in Fig. 1, and the hypothesized alternative model, which builds on their model by adding the path from teaching presence to self-regulated learning, is shown in Fig. 2.

Figure 2. The Hypothesized Alternative Model

H1: Teaching presence positively predicts students’ online SRL strategies.
H2: Teaching presence positively predicts students’ self-efficacy.
H3: Teaching presence positively predicts cognitive presence.
H5: Social presence positively predicts students’ self-efficacy.
H6: Social presence positively predicts cognitive presence.
H9: Self-efficacy positively predicts online SRL strategies.
H10: Self-efficacy positively predicts cognitive presence.
H11: Online SRL strategies positively predict cognitive presence.

2. How well does the model explain cognitive presence if mentor presence is added? The hypothesized alternative model with mentor presence is shown in Fig. 3.

Figure 3. The Hypothesized Alternative Model with Mentor Presence

H12: Mentor presence positively predicts self-efficacy.
H13: Mentor presence positively predicts online SRL strategies.
H14: Mentor presence positively predicts cognitive presence.

3. How well does the model explain learning outcomes (i.e., satisfaction, perceived progress, final grade)? The hypothesized model with mentor presence and learning outcomes is shown in Fig. 4. The three types of learning outcome were tested separately.

Figure 4. The Hypothesized Alternative Model with Mentor Presence and Learning Outcomes

H4: Teaching presence positively predicts learning outcomes.
H7: Social presence positively predicts learning outcomes.
H15: Mentor presence positively predicts learning outcomes.
H8: Cognitive presence positively predicts learning outcomes.

4. How do the associations between social presence, cognitive presence, mentor presence, self-efficacy, online SRL strategies, and learning outcomes vary according to whether the student’s primary online learning location is home or school? The hypothesis relevant to this question, H16, is as follows:

H16: Students who use school as their primary online learning location demonstrate higher levels of social presence, mentor presence, cognitive presence, self-efficacy, online SRL strategies, and learning outcomes than those who choose to spend the majority of their online learning time at home.

CHAPTER 3
METHODS

Context

This study was conducted at a nonprofit virtual school in the Midwestern United States during the spring semester of 2017. Although termed a school, this institution is a non-profit, state-wide supplemental program, authorized by the state and overseen by state agencies. It is one of the largest online-learning schools in the country.
Although this institution's online courses were the main focus of inquiry, it should be noted that all the students who participated in this study also attended their brick-and-mortar schools on a daily basis. Typically, each student would take one or two courses from the virtual school but the majority at their own school. Each virtual-school course was self-paced and asynchronous. Most of the communication between students and their online teachers took place online through discussion forums and messages. In addition to their online instructor(s), whose main responsibility was to deliver learning content, each student had an adult offline mentor. For the most part, students were encouraged to complete their online-learning activities during designated seat times, in labs located in their local schools, under the supervision of their mentors. However, some seat-time waivers were issued, entitling certain students to be mentored entirely online (see Rice et al., 2014). Assessment methods for online learners included, but were not limited to, essays, discussions, projects, and computer-graded quizzes.

Participants

All 696 individuals who participated in this study, by responding to an online survey, were taking online high-school-level courses at the virtual school described above at the time of their participation. As shown in Table 1, 72% of the participants were female (N = 502) and 28% were male (N = 193); one additional student did not indicate their gender. Most of the participants were 12th graders (N = 313, 45%), followed by 11th graders (N = 163, 23%), 10th graders (N = 132, 19%), 9th graders (N = 59, 8%), 8th graders (N = 25, 4%), and 7th graders (N = 4, 1%). Table 1.
Descriptive Statistics (Ns, means, standard deviations, minima, maxima, and Cronbach's alphas for the demographic and study variables). Teaching presence, Mentor presence, Social presence, Cognitive presence, Self-efficacy, Online SRL strategies, Satisfaction, and Perceived progress are the composite scores of the measured items.

Regarding the ethnicity of the participants, the majority were Caucasian (84%), while 2% were African American, 3% Hispanic, 7% Asian, and 4% of other ethnicities. The virtual-school subjects in which the participants were enrolled included Foreign Languages (30%), Science (16%), Social Science (14%), Math (10%), English (8%), and others (22%). Two students did not report the subjects of the online courses they were taking. Most participants were enrolled on a non-credit-recovery basis (N = 650; 94%), while around 6% (N = 40) enrolled for credit-recovery purposes; 6 respondents did not answer this question. When the students were asked about their prior knowledge of the subject of their course, around 10% rated it as very poor, 21% as poor, 37% as fair, 26% as good, and 6% as very good.
Other background questions about each learner covered their previous online-learning experience and primary learning location. With regard to the former, 38% reported that they had never taken an online course before, and 33% that they had taken only one; an additional 13% had taken two, and 16% had taken three or more. Two students did not answer this question. When asked about their primary location for completing most of their online coursework, around 58% of the students (N = 399) reported that they completed it at their own school, and around 42% (N = 295) that they completed it at home. One student did not answer this question.

Procedure

Participants were recruited primarily with the assistance of two gatekeepers at the research site, who offered help in contacting IT staff there to resolve technical issues with the survey settings, announced the study on the virtual school's learning-management systems, and ensured that the Qualtrics survey link was displayed when each student logged in to those systems. The gatekeepers set a window of five days during which students could complete the study. To help ensure a high response rate, the researcher provided 20 gift cards worth $20 each, which were randomly awarded to 20 students who completed the survey. The first page of the online survey was a parental consent form (see Appendix A). Before answering any questions, each student first had to give this form to their parents and obtain their approval for study participation. The second page of the survey was a student assent form (see Appendix B), completion of which was also required before anyone could start the survey proper. On both of these forms, the researcher provided information about the nature of the study; made it clear that participation was completely voluntary and could be terminated by the student at any point; and explained that all responses would remain confidential and would not have any effect on course grades.
Completing the survey typically took 20–25 minutes.

Survey

Survey approaches are typically used to measure the opinions, attitudes, or characteristics of a sample of a population, and have been very extensively utilized in educational research (Creswell, 2005). The use of a survey method is appropriate for the present study, given the large number of students involved and the need to conduct statistical analyses. The survey questions are provided in Appendix C.

Demographic Information. This part of the survey instrument asked about students' gender, grade level, ethnicity, and the subject of the online course currently being taken. If they were taking more than one course, they were asked to choose the one that they thought best represented their online-learning experience. The multiple-choice response categories for course subject included English, Foreign Languages, Math, Science, Social Science, and "Other". This section also asked the students their reason for enrollment (i.e., credit-recovery or non-credit-recovery), the major location in which they completed most of their online coursework (i.e., home, school, or other places), and their prior knowledge of the subject they were taking (i.e., very poor, poor, fair, good, or very good).

Community of Inquiry. The participants' perceptions of teaching, social, and cognitive presence were measured using a modified version of an instrument originally developed by Arbaugh, Cleveland-Innes, Diaz, Garrison, Ice, Richardson, and Swan (2008) based on the CoI framework. Specifically, some of Arbaugh et al.'s items were slightly modified to better fit the current study's participants and context. In the modified instrument used here, each item was measured on a five-point Likert scale, ranging from 1 = strongly disagree to 5 = strongly agree. Of its 30 items, 12 were designed to assess teaching presence, six to assess social presence, and 12, cognitive presence.
The reliability and validity of the factor structure of the three constructs in the original instrument have been widely confirmed (Arbaugh et al., 2008; Garrison et al., 2010; Rockinson-Szapkiw et al., 2016; Shea & Bidjerano, 2010, 2012).

Teaching Presence. Teaching presence comprised three dimensions: design and organization, facilitation, and direct instruction. A sample item for design and organization is The instructor clearly communicated important course topics. A sample item for facilitation is The instructor was helpful in guiding the class towards understanding course topics in a way that helped me clarify my thinking. And for direct instruction, a sample item is My instructor provided useful illustrations that helped make the course content more understandable to me. The Cronbach's alpha for teaching presence was computed as .93.

Social Presence. Social presence consisted of two dimensions: affective expression and open communication. A sample item for affective expression is I have a sense of belonging in the course. A sample item for open communication is I felt comfortable conversing through the online medium. The Cronbach's alpha for social presence was .88.

Cognitive Presence. Cognitive presence included four dimensions: triggering event, exploration, integration, and resolution. A sample item for triggering event is Problems posed increased my interest in course issues. A sample item for exploration is I utilized a variety of sources to explore problems. A sample item for integration is Learning activities helped me construct solutions; and a sample item for resolution is I can describe ways to apply the knowledge created in this course. The Cronbach's alpha computed for cognitive presence was .94.

Self-efficacy. The four-item instrument used in the present research to measure students' self-efficacy was adopted from the Motivated Strategies for Learning Questionnaire (MSLQ; Pintrich, Smith, Garcia, & McKeachie, 1993).
It was answered using a five-point Likert scale, again ranging from 1 = strongly disagree to 5 = strongly agree. This instrument has been deemed to have good construct validity by previous online-learning studies (Shea & Bidjerano, 2010, 2012). The Cronbach's alpha for self-efficacy was found to be .92. A sample item for self-efficacy is I believe I will receive an excellent grade in this class.

Online Self-regulated Learning Strategies. The instrument used to examine SRL strategies was based on a 24-item questionnaire developed by Barnard et al. (2009) to measure students' online self-regulation skills in terms of five aspects: goal-setting, help-seeking, task strategies, self-evaluation, and time management. Shea and Bidjerano (2012) further refined the original questionnaire by excluding items that were conceptually ambiguous, and confirmed the factor structure of the new version using exploratory factor analysis (EFA). The results showed that it contained three latent factors: goal-setting, strategic learning, and help-seeking. The current study adopted Shea and Bidjerano's version of the survey, and rephrased some items to better fit the specific situation of the targeted online-learning institution. A sample item for goal-setting is I set goals to help me manage studying time for my online courses. A sample item for strategic learning is I work on extra problems or do additional readings in my online course beyond the assigned ones to master the course content. And a sample help-seeking item is I find someone who is knowledgeable in course content so that I can consult with him/her when I need help. The Cronbach's alpha for online self-regulated learning strategies was .92.

Mentor Presence.
This part of the present study's survey was adapted from previous literature regarding mentors' roles (Borup & Drysdale, 2014), and includes 16 items that describe a variety of such roles, including problem-solver, social-relationship builder, progress tracker, and content instructor. The participants responded using the same five-point Likert scale described above. A sample item for problem-solver is The mentor helped me become familiar with the course platform; a sample item for social-relationship builder is The mentor expressed appreciation for my contribution; and a sample item for content instructor is The mentor helped me with content learning. The Cronbach's alpha for mentor presence was .96.

Satisfaction. Kuo, Walker, Schroder, and Belland (2014) developed five items to measure students' satisfaction with online learning. The present study adopted four of these items, as the remaining one was not appropriate to K-12 online-learning settings. Students responded via the same five-point Likert scale described above. The Cronbach's alpha for satisfaction was .90, and a sample item is Overall I am satisfied with this class.

Perceived Progress. The current study's instrument for measuring students' perceived progress was adopted from Lin, Zheng, and Zhang (2016), and uses the same five-point Likert scale as above. The Cronbach's alpha for perceived progress was computed as .90, and a sample item is I understand most of the learning content in my class.

Data Analysis

SEM was the main form of statistical analysis used in the current study. A multivariate technique, it examines the complex relationships among variables in a hypothesized model to establish the extent to which the model fits the data (Schumacker & Lomax, 2010).
In contrast to other statistical methods, SEM allows researchers to consider all the variables in the model simultaneously and to make decisions on whether to keep, reject, or modify the model (Kline, 2011). Conducting SEM always involves the creation of a measurement model and a structural model. The first tests the relationships of the various factor loadings with the latent factors, while the second captures the direct and indirect structural relationships among multiple variables, with the inclusion of measurement errors (Kline, 2011). Fit statistics in SEM provide information about the degree to which a given hypothesized model is supported by the data. The chi-square test, a traditional approach to measuring the fit of a proposed model, compares the difference between the sample and fitted covariance matrices (Hu & Bentler, 1999). A non-significant chi-square value indicates a good model fit and little difference between the sample covariance matrix and the reproduced matrix (Schumacker & Lomax, 2010). However, since the chi-square test is very sensitive to sample size, a large sample can easily result in a statistically significant chi-square value. Multiple additional goodness-of-fit indices are therefore usually computed (Schumacker & Lomax, 2010). Two useful alternative fit statistics are the comparative fit index (CFI) and the root mean square error of approximation (RMSEA) (Hu & Bentler, 1999). CFI compares the fit of the hypothesized model with that of a baseline model, taking sample size into account, while RMSEA examines how well the model, with optimal parameter estimates, approximates the population covariance matrix (Schumacker & Lomax, 2010). CFI values larger than .90 are considered to reflect an adequate fit, and larger than .95, a good fit. RMSEA values smaller than .08 suggest an acceptable fit, and smaller than .05, a good fit (Hu & Bentler, 1999).
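These cutoff conventions can be expressed as a small helper function. The sketch below is illustrative only (the function name and the coarse labels are my own, and this is not part of the study's Mplus analysis); it simply encodes the CFI and RMSEA thresholds cited from Hu and Bentler (1999):

```python
# Hedged sketch: classify SEM model fit from CFI and RMSEA using the
# cutoff conventions cited in the text (Hu & Bentler, 1999).

def classify_fit(cfi: float, rmsea: float) -> str:
    """Return a coarse fit label: CFI > .95 and RMSEA < .05 is 'good';
    CFI > .90 and RMSEA < .08 is 'adequate'; anything else is 'poor'."""
    if cfi > 0.95 and rmsea < 0.05:
        return "good"
    if cfi > 0.90 and rmsea < 0.08:
        return "adequate"
    return "poor"

# Example using the revised measurement model's indices reported in
# Chapter 4 (CFI = .929, RMSEA = .043):
print(classify_fit(0.929, 0.043))  # adequate
```

Note that both indices must satisfy a threshold jointly here; in practice researchers report each index separately, as this study does.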
If the initial model proves to be unacceptable, it needs to be modified through the addition or deletion of paths until all remaining paths achieve statistical significance (Kline, 2011). All SEM analyses in the current study were conducted using Mplus 7.

Instrument Validation. A double-split cross-validation technique was employed to examine the validity of the latent factor structure in the present study's data (Hair, Black, Babin, Anderson, & Tatham, 2006). This requires splitting the entire dataset into two randomly selected sub-datasets that are then used to validate one another (Cudeck & Browne, 1983). Specifically, EFA is first employed to identify the factor structure in one half of the sample, and confirmatory factor analysis (CFA) is then used to confirm that structure in the other half. As the current study involves CoI, online SRL strategies, and mentor presence, it was decided to conduct both EFA and CFA on the measured items to validate those belonging to the theorized latent constructs. All participants (N = 696) were randomly split into two datasets of 348 participants each; the first was used for EFA and the second for CFA. The 67 items included in this testing comprised measurements of teaching presence (12 items), social presence (6 items), cognitive presence (12 items), mentor presence (16 items), self-regulated learning strategies (10 items), self-efficacy (4 items), satisfaction (3 items), and perceived progress (4 items).

Exploratory Factor Analysis. EFA is used to find the latent variables that can account for the observed variation and covariation among the observed variables (Costello & Osborne, 2005). As an exploratory method, it identifies a small number of latent constructs that can account for a larger set of measured variables.
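The double-split step described above can be sketched in a few lines. This is an illustrative sketch only (the seed and variable names are hypothetical, not taken from the study): shuffle the 696 participant indices and divide them into two disjoint halves of 348, one half per analysis.

```python
# Illustrative sketch of double-split cross-validation: randomly divide
# the 696 participants into two halves of 348, one reserved for EFA and
# one for CFA. (Seed and variable names are hypothetical.)
import numpy as np

rng = np.random.default_rng(seed=2017)      # any fixed seed for reproducibility
indices = rng.permutation(696)              # shuffled participant indices
efa_half, cfa_half = indices[:348], indices[348:]

# The halves are equal-sized and disjoint:
assert len(efa_half) == len(cfa_half) == 348
assert not set(efa_half.tolist()) & set(cfa_half.tolist())
```

Each half would then be passed to the respective factor analysis, so that the structure discovered exploratively in one half is confirmed independently in the other.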
To decide whether data are suitable for factor analysis, two tests, the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy and Bartlett's Test of Sphericity, are first conducted (Bartlett, 1950; Kaiser, 1974). Kaiser suggested that KMO values less than .50 are unacceptable; .50 to .59, miserable; .60 to .69, mediocre; .70 to .79, middling; .80 to .89, meritorious; and .90 to 1.00, marvelous. The null hypothesis of Bartlett's Test of Sphericity is that the population correlation matrix is an identity matrix, that is, that the variables are uncorrelated. The test yields a chi-square statistic; if this statistic is significant, the null hypothesis is rejected, and it is suitable to conduct factor analysis on the measured variables. Following Kaiser's rule, eigenvalues larger than 1, together with the scree test, which examines the graph of the eigenvalues and looks for the break point in the data, were used to decide the number of factors retained (Kaiser & Rice, 1974). Both the unrotated and the rotated results were examined. Since the loadings from the unrotated solution do not allow a direct interpretation, the rotated factors were used to present the results. Because the resulting factors were assumed to be correlated with each other, oblique (Promax) rotation was preferred over orthogonal rotation. An item's factor loading should be greater than .40 on its own component and less than .40 on all other components (Stevens, 1996).

Confirmatory Factor Analysis. CFA answers questions about the extent to which the items under the latent constructs in a model indeed measure those latent constructs (Wang & Wang, 2012). CFA's primary concern is how well the model fits the data.
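The two screening steps described for the EFA, Bartlett's sphericity test on the item correlation matrix and Kaiser's eigenvalue-greater-than-one retention rule, can be sketched directly with numpy and scipy. The data below are simulated (six items driven by one common factor), purely to illustrate the computations; nothing here reproduces the study's actual responses.

```python
# Minimal numpy/scipy sketch of Bartlett's Test of Sphericity and
# Kaiser's eigenvalue > 1 rule, run on simulated (not study) data.
import numpy as np
from scipy import stats

def bartlett_sphericity(data: np.ndarray):
    """Chi-square statistic, df, and p-value for Bartlett's test.

    Uses the standard statistic -(n - 1 - (2p + 5)/6) * ln|R| with
    df = p(p - 1)/2, where R is the item correlation matrix.
    """
    n, p = data.shape
    corr = np.corrcoef(data, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(corr))
    df = p * (p - 1) / 2
    return chi2, df, stats.chi2.sf(chi2, df)

def kaiser_retained(data: np.ndarray) -> int:
    """Number of factors with a correlation-matrix eigenvalue > 1."""
    eigvals = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))
    return int((eigvals > 1).sum())

# Simulated example: 348 respondents, 6 items loading on one factor.
rng = np.random.default_rng(0)
factor = rng.normal(size=(348, 1))
items = factor + 0.5 * rng.normal(size=(348, 6))

chi2, df, p = bartlett_sphericity(items)
print(df, p < .001, kaiser_retained(items))
```

With strongly intercorrelated items like these, the test is highly significant and a single factor is retained, which is the qualitative pattern the study reports for its own data (Bartlett chi-square = 20127.40, df = 2211, p < .001).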
In the current study, chi-square statistics, together with multiple goodness-of-fit indices including RMSEA and CFI, were applied to determine the fit of the models used (Hu & Bentler, 1999). If a hypothesized CFA model is deemed a good fit for the data, the tested factorial structure in the measurement model is considered valid. As well as using model-fit indices, researchers should refer to the convergent and discriminant validity of their measurements. Convergent validity is how well a factor is measured by its associated survey items (Campbell & Fiske, 1959). There are two ways to establish the convergent validity of the measured items: 1) testing the factor loading of each item, and 2) comparing the average variance extracted (AVE) with the composite reliability (Fornell & Larcker, 1981; Hair et al., 2006). A factor loading greater than .70 is thought to indicate a well-defined construct (Gefen, Straub, & Boudreau, 2000; Hair et al., 2006). In this study, the factor loadings of all items under each latent construct were computed and checked against this .70 threshold. Another way of testing convergent validity is to compare the AVE against the composite reliability. To compute the AVE, one sums the squared factor loadings and divides by the number of indicators in the latent construct (Hair et al., 2006). If 1) the composite reliability is greater than the AVE, and 2) the AVE is greater than .50, convergent validity is deemed satisfactory. Discriminant validity is evaluated to test whether a latent construct explains more variance in its observed variables than either measurement error or other latent constructs in the measurement model (Fornell & Larcker, 1981). Specifically, the AVEs of any two latent constructs should both exceed the shared variance (i.e., the square of the correlation) between these two constructs.
In other words, the square root of the AVE of each latent construct should exceed that construct's correlations with the other latent constructs.

Structural Model. The second step of SEM is to examine the structural model using statistical-significance tests of the path coefficients, to test the researcher's hypotheses (Kline, 2011). A structural model captures the structural relations among multiple variables, both latent and observed, while simultaneously taking measurement errors into account (Kline, 2011). According to Cohen (1988), a standardized path coefficient between .1 and .3 reflects a small effect size; .3 to .5, a medium effect size; and greater than .5, a large effect size.

Testing Mediation. SEM allows researchers to test direct, indirect, and total effects simultaneously (Kline, 2011). A direct effect is the direct path from one construct to another; an indirect effect is the path from one construct to another through one or more mediators in the model; and the total effect is the sum of the direct and indirect effects. Bootstrapping is a method used to determine the confidence intervals (CIs) for mediation effects. It generates an empirical approximation of the sampling distribution through repeated resampling, and uses that distribution to establish CIs and calculate p-values (MacKinnon, Lockwood, & Williams, 2004). Bootstrapping with 5,000 resamples was employed to build CIs (Preacher & Hayes, 2006).

RQ1. Shea's Model and the Alternative Model. Two separate SEM models, Shea's Model and the Alternative Model (which adds one more direct path from TP to SRL), were established and compared. Chi-square significance testing and model-fit indices were used as criteria for deciding whether a better model for the data in the present study could be developed. Any insignificant direct effects were removed.

RQ2. Model with Mentor Presence.
Based on the model identified in RQ1, one more variable (i.e., mentor presence) was added to the SEM model to answer RQ2. Again, model-fit indices and standardized path coefficients were examined to determine whether the model could be accepted. Then, paths with insignificant direct effects were removed, and the final model was tested for mediation effects.

RQ3. Model with Learning Outcomes. Since previous studies found that two of the outcome variables used in this study, satisfaction and perceived progress, differed in terms of their relationships with learning variables (Eom, Wen, & Ashill, 2006), it was decided that they would be used as endogenous variables, separately, in two different models. Final grade was also used as a separate learning-outcome variable. As with the SEM analysis conducted in connection with RQ2, model-fit indices and standardized path coefficients were first analyzed in each model. If a direct effect between two variables was identified as insignificant, its path was removed from the SEM model. The bootstrap method with 5,000 resamples was then used to determine the significance of mediation effects.

RQ4. Learning at School vs. Learning at Home. To answer RQ4, regarding the difference between students who completed the majority of their online coursework at school and those who completed it at home, descriptive statistics and ANOVA were used. These two groups of students were compared on the items of the different presences, including social presence, cognitive presence, mentor presence, self-efficacy, self-regulated learning strategies, satisfaction, perceived progress, and final grade.

CHAPTER 4
RESULTS

This chapter includes statistical analyses of this study's data, including tests of the validity of the latent constructs, examination of descriptive statistics, correlations, the measurement model, and analyses leading to the answers for each research question.
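The RQ4 group comparison described in the methods, a one-way ANOVA contrasting at-school and at-home students on each measure, can be sketched as follows. The group sizes mirror the sample (399 at-school, 295 at-home), but the scores themselves are simulated for illustration; the means and standard deviation used are hypothetical, not the study's results.

```python
# Illustrative one-way ANOVA for the RQ4 comparison of at-school vs.
# at-home students. Scores are simulated; only the group sizes (399 and
# 295) come from the study, and the group means are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
at_school = rng.normal(3.9, 0.8, size=399)   # hypothetical composite scores
at_home = rng.normal(3.4, 0.8, size=295)

f_stat, p_value = stats.f_oneway(at_school, at_home)
print(f_stat > 0, p_value < .05)
```

With two groups, this one-way ANOVA is equivalent to an independent-samples t test (F equals t squared); the study runs one such comparison per measured item or scale.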
Exploratory Factor Analysis

As previously mentioned, the data subset for EFA was drawn from 348 participants randomly selected from the entire sample (N = 696). The KMO test and Bartlett's Test of Sphericity were first conducted to decide the appropriateness of conducting EFA on these data. The KMO result was .95, and the Bartlett's Test of Sphericity result was 20127.40 (df = 2211, p < .001), which showed that the intercorrelation matrix in the sample was suitable for factor analysis. Next, PCA with Promax rotation was conducted to arrive at the factor solution of the measured variables. Following Kaiser's rule and the scree-plot results (Kaiser & Rice, 1974), a seven-factor solution with eigenvalues > 1 was extracted, which explained 85.04% of the variance among the 67 tested items (see Table 2). The first factor (eigenvalue = 26.99) explained 55.79% of the variance and contained all 16 items from mentor presence plus one item from self-regulated learning strategies (i.e., If needed, I try to ask my online teacher/mentors about the question that I don't understand), which also reflects the help that a student received from mentors. The factor loadings were all larger than .40, ranging from .40 to .80 (see Table 3).

Table 2. Test of the Validity of the Latent Constructs – Eigenvalue of the Exploratory Factor Analysis (N = 348)

Factor   Eigenvalue   Difference   Proportion   Cumulative
1        26.99        22.11        .5579        .5579
2        4.88         2.04         .1008        .6587
3        2.84         .60          .0587        .7174
4        2.24         .48          .0463        .7637
5        1.77         .43          .0365        .8002
6        1.33         .23          .0275        .8277
7        1.10         .07          .0227        .8504

Table 3.
Test of the Validity of the Latent Constructs – Exploratory Factor Analysis (N = 348): standardized loadings of the 67 items on the seven rotated factors. TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, MP = Mentor Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy, SATIS = Satisfaction, PROG = Perceived Progress.

The second factor (eigenvalue = 4.88) explained 10.08% of the variance and contained all 12 items from cognitive presence (factor loadings ranging from .50 to .71), one item from satisfaction (i.e., Overall I am satisfied with this class; factor loading .47), and all four items from perceived progress (factor loadings ranging from .40 to .45). The third factor (eigenvalue = 2.84) explained 5.87% of the variance and contained all 12 items from teaching presence and two items from social presence. The factor loadings for teaching presence ranged from .40 to .73, and the factor loadings for the two social-presence items were .42 and .40.
The fourth factor (eigenvalue = 2.24) explained 4.63% of the variance and contained four items, all from the self-efficacy scale; their factor loadings were .78, .83, .81, and .68, respectively. The fifth factor (eigenvalue = 1.77) explained 3.65% of the variance; the factor loadings larger than .40 were all from the self-regulated learning items. Of the 10 items, 7 had factor loadings ranging from .45 to .69; the remaining three showed factor loadings of .39, close to the .40 threshold. The sixth factor (eigenvalue = 1.33) explained 2.75% of the variance and contained four items from mentor presence, with loadings ranging from .56 to .65; these four items also loaded on the first factor. The last factor (eigenvalue = 1.10) explained 2.27% of the variance and contained three items, all from the satisfaction scale; their factor loadings were .45, .58, and .52, respectively. The results of the PCA showed that the 67 tested items reflected seven components, which largely corresponded to the eight latent variables measured in this study. Of the seven components, one combined cognitive presence and perceived progress; a potential explanation is that both latent variables measure students' perceptions of their cognitive achievement in the course. Thus, the classification from the EFA using this half of the dataset is largely consistent with the proposed measurement model of this study.

Confirmatory Factor Analysis

The other data subset (N = 348), representing the other half of the total data, was used for CFA of the construct validity of the latent factors.
As shown in Table 4, there were eight latent variables in the model: teaching presence (12 items), social presence (6 items), cognitive presence (12 items), mentor presence (16 items), online SRL strategies (10 items), self-efficacy (4 items), satisfaction (3 items), and perceived progress (4 items). The measurement model yielded χ2 (2116) = 8115.876, p < .001, RMSEA = .065 (90% CI from .064 to .067), CFI = .831, and SRMR = .047. Following the modification indices, correlated residuals were added within each latent variable: two in teaching presence, four in social presence, six in cognitive presence, four in online SRL strategies, and 11 in mentor presence. The revised model yielded χ2 (2089) = 4632.357, p < .001, RMSEA = .043 (90% CI from .041 to .044), CFI = .929, and SRMR = .040, indicating a sufficient fit to the data.

Table 4. Test of the Validity of the Latent Constructs – Confirmatory Factor Analysis (N = 348)

Item      Standardized β   S.E.
TP_1      .74***           .03
TP_2      .68***           .03
TP_3      .69***           .03
TP_4      .51***           .04
TP_5      .79***           .02
TP_6      .80***           .02
TP_7      .76***           .03
TP_8      .74***           .03
TP_9      .76***           .03
TP_10     .77***           .02
TP_11     .67***           .03
TP_12     .68***           .03
SP_1      .84***           .02
SP_2      .77***           .03
SP_3      .73***           .03
SP_4      .70***           .03
SP_5      .70***           .03
SP_6      .62***           .04
CP_1      .56***           .04
CP_2      .70***           .03
CP_3      .74***           .03
CP_4      .64***           .04
CP_5      .72***           .03
CP_6      .73***           .03
CP_7      .81***           .02
CP_8      .78***           .02
CP_9      .79***           .02
CP_10     .70***           .03
CP_11     .74***           .03
CP_12     .72***           .03
SRL_1     .73***           .03
SRL_2     .71***           .03
SRL_3     .76***           .03
SRL_4     .77***           .03
SRL_5     .40***           .05
SRL_6     .53***           .04
SRL_7     .66***           .04
SRL_8     .69***           .03
SRL_9     .54***           .04
SRL_10    .50***           .05
EFFI_1    .91***           .01
EFFI_2    .93***           .01
EFFI_3    .84***           .02
EFFI_4    .77***           .02
MP_1      .79***           .02
MP_2      .77***           .02
MP_3      .74***           .03
MP_4      .76***           .03
MP_5      .85***           .02
MP_6      .84***           .02
MP_7      .76***           .03
MP_8      .77***           .02
MP_9      .78***           .02
MP_10     .72***           .03
MP_11     .62***           .04
MP_12     .67***           .03
MP_13     .73***           .03
MP_14     .77***           .03
MP_15     .77***           .02
MP_16     .71***           .03
SATIS_1   .92***           .01
SATIS_2   .91***           .01
SATIS_3   .80***           .02
PROG_1    .73***           .03
PROG_2    .87***           .02
PROG_3    .85***           .02
PROG_4    .84***           .02
***p < .001. TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, MP = Mentor Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy, SATIS = Satisfaction, PROG = Perceived Progress

Composite Reliability. Table 5 presents the composite-reliability scores of each latent construct. Four steps were used to calculate them: 1) all factor loadings under a construct were summed; 2) this sum was squared to obtain the Square of the Sum of All Factor Loadings under the Same Latent Construct (SSI); 3) the error variances of the construct's indicators were summed to obtain the Sum of Error Variances (SEV); and 4) composite reliability was computed as SSI/(SSI + SEV). According to Nunnally and Bernstein (1994), a composite reliability larger than .70 is recommended.
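The four-step computation can be sketched in a few lines of Python. The loadings below are illustrative only, not values from Table 4:

```python
# Composite reliability via the four steps described above:
# 1) sum the standardized loadings of a construct;
# 2) square that sum (SSI);
# 3) sum the indicator error variances (SEV), where a standardized
#    indicator's error variance is 1 - loading**2;
# 4) CR = SSI / (SSI + SEV).
def composite_reliability(loadings):
    ssi = sum(loadings) ** 2
    sev = sum(1 - l ** 2 for l in loadings)
    return ssi / (ssi + sev)

# Illustrative loadings only, not taken from the CFA tables.
cr = composite_reliability([0.80, 0.70, 0.75, 0.65])
print(round(cr, 2))  # 0.82
```

Note that SEV here is derived from the standardized loadings; a construct with uniformly high loadings therefore yields a composite reliability well above the .70 benchmark.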
The computed composite-reliability scores for each of the eight latent constructs were all satisfactory in this regard: TP = .99, SP = .99, CP = .99, MP = .98, SRL = .97, EFFI = .99, SATIS = .99, and PROG = .99.

Table 5. Test of the Validity of the Latent Constructs – The SSI, SEV, AVE, and Composite Reliability of the Latent Constructs (N = 348)

Construct   SSI      SEV   AVE   Composite Reliability
TP          73.79    .34   .52   .99
SP          19.01    .18   .53   .99
CP          74.48    .35   .52   .99
SRL         39.56    .37   .41   .97
EFFI        11.90    .06   .75   .99
MP          145.20   .31   .57   .98
SATIS       6.92     .04   .77   .99
PROG        10.82    .09   .68   .99
SSI = the square of the sum of all factor loadings under the same latent construct; SEV = the sum of the error variances of each indicator; AVE = the average of the squared factor loadings; Composite Reliability = SSI/(SSI + SEV). TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, MP = Mentor Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy, SATIS = Satisfaction, PROG = Perceived Progress

Convergent Validity. Two criteria were used to test the convergent validity of the measured items: 1) the factor loading of each item, and 2) a comparison of the Average Variance Extracted (AVE) with the composite reliability (Fornell & Larcker, 1981; Hair et al., 2006).

Factor Loading. The standardized factor loadings of each item on the latent constructs are shown in Table 4. Of the 67 factor loadings, 51 were larger than .70, the threshold recommended by Hair et al. (2006) and Gefen et al. (2000). Of the remaining 16 items, 10 had loadings larger than .60, five had loadings larger than .50, and one had a loading larger than .40. The construct validity of the measurement model can therefore be deemed satisfactory.

Average Variance Extracted. According to Hair et al. (2006), convergent validity is adequate when 1) composite reliability is greater than AVE, and 2) all AVEs are larger than .50.
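The two convergent-validity criteria can be checked directly from a construct's standardized loadings. The loadings below are illustrative, not values from the CFA tables:

```python
# Convergent-validity check in the sense of Hair et al. (2006):
# AVE is the average of the squared standardized loadings, and the
# criteria are (1) composite reliability > AVE and (2) AVE > .50.
def ave(loadings):
    return sum(l ** 2 for l in loadings) / len(loadings)

def composite_reliability(loadings):
    ssi = sum(loadings) ** 2
    sev = sum(1 - l ** 2 for l in loadings)
    return ssi / (ssi + sev)

# Illustrative loadings only.
loadings = [0.80, 0.70, 0.75, 0.65]
a, cr = ave(loadings), composite_reliability(loadings)
print(round(a, 2), cr > a and a > 0.50)  # 0.53 True
```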
As shown in Table 5, all the latent constructs' composite reliabilities were greater than the corresponding AVEs, and all AVEs were greater than .50 (TP = .52, SP = .53, CP = .52, EFFI = .75, MP = .57, SATIS = .77, and PROG = .68) except in the case of SRL (.41). Nevertheless, as most AVEs exceeded the required level and all composite reliabilities were greater than the AVEs, it is reasonable to conclude that convergent validity was satisfactory.

Discriminant Validity

As mentioned above, the AVE of any two latent constructs should be greater than the shared variance (i.e., the squared correlation) between those constructs; equivalently, the square root of each AVE should exceed the construct's correlations with the other constructs (Fornell & Larcker, 1981). Table 6 shows the inter-construct correlations (off the diagonal) and the square roots of the AVEs (on the diagonal). All the diagonal values were greater than all the inter-construct correlations, so the measurement model met the discriminant-validity criterion.

Table 6. Test of the Validity of the Latent Constructs – Discriminant Validity for the Measurement Model (N = 348)
[Lower-triangular matrix of the eight latent constructs. Diagonal values in parentheses are the square roots of the AVEs, ranging from .64 (SRL) to .88 (EFFI); off-diagonal values are the inter-construct correlations, ranging from .23 to .67.] TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, MP = Mentor Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy, SATIS = Satisfaction, PROG = Perceived Progress

Descriptive Statistics

As mentioned earlier, Table 1 shows the descriptive statistics of the items measured in the study.
The items included teaching presence (12 items), social presence (6 items), cognitive presence (12 items), mentor presence (16 items), online SRL strategies (10 items), self-efficacy (4 items), satisfaction (3 items), perceived progress (4 items), and final grade. The reliabilities of these scales were: teaching presence (α = .93), social presence (α = .88), cognitive presence (α = .94), mentor presence (α = .96), online SRL strategies (α = .87), self-efficacy (α = .92), satisfaction (α = .90), and perceived progress (α = .90). Table 1 also shows the demographic information for this study, including gender, grade, ethnicity, subject, and whether the student took the course for credit-recovery purposes. Table 7 presents the correlations between the measured variables; all variables were significantly correlated with each other at the p < .001 level.

Table 7. Correlation Table

                 1       2       3       4       5       6       7       8       9
1. TP            1
2. MP            .61***  1
3. SP            .68***  .51***  1
4. CP            .68***  .51***  .69***  1
5. EFFI          .45***  .38***  .49***  .56***  1
6. SRL           .45***  .49***  .51***  .56***  .44***  1
7. SATIS         .62***  .45***  .68***  .70***  .59***  .47***  1
8. PROG          .52***  .43***  .54***  .65***  .70***  .45***  .70***  1
9. Final grade   .27***  .19***  .26***  .32***  .25***  .24***  .40***  .26***  1
***p < .001. TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, MP = Mentor Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy, SATIS = Satisfaction, PROG = Perceived Progress

Measurement Model

A CFA was performed to test the fit of the latent variables to the data. As shown in Table 8, there were eight latent variables in the model: teaching presence (TP, 12 items), social presence (SP, 6 items), cognitive presence (CP, 12 items), mentor presence (MP, 16 items), online SRL strategies (SRL, 10 items), self-efficacy (EFFI, 4 items), satisfaction (SATIS, 3 items), and perceived progress (PROG, 4 items).
The measurement model yielded χ2 (2116) = 8115.876, p < .001, RMSEA = .065 (90% CI from .064 to .067), CFI = .831, and SRMR = .047. Following the modification indices, correlated residuals were added within each latent variable: two in teaching presence, four in social presence, six in cognitive presence, four in online SRL strategies, and 11 in mentor presence. The revised model yielded χ2 (2089) = 4632.357, p < .001, RMSEA = .043 (90% CI from .041 to .044), CFI = .929, and SRMR = .040, indicating a sufficient fit to the data.

Table 8. Measurement Model

Item      Standardized β   S.E.
TP_1      .76***           .02
TP_2      .71***           .02
TP_3      .67***           .02
TP_4      .52***           .03
TP_5      .81***           .02
TP_6      .83***           .01
TP_7      .77***           .02
TP_8      .75***           .02
TP_9      .75***           .02
TP_10     .78***           .02
TP_11     .71***           .02
TP_12     .70***           .02
SP_1      .81***           .02
SP_2      .75***           .02
SP_3      .64***           .03
SP_4      .67***           .03
SP_5      .71***           .02
SP_6      .63***           .03
CP_1      .62***           .01
CP_2      .75***           .01
CP_3      .77***           .01
CP_4      .64***           .03
CP_5      .71***           .02
CP_6      .75***           .02
CP_7      .81***           .02
CP_8      .81***           .02
CP_9      .80***           .02
CP_10     .71***           .02
CP_11     .75***           .02
CP_12     .68***           .02
SRL_1     .70***           .02
SRL_2     .69***           .02
SRL_3     .76***           .02
SRL_4     .73***           .02
SRL_5     .43***           .04
SRL_6     .54***           .03
SRL_7     .67***           .03
SRL_8     .73***           .02
SRL_9     .55***           .03
SRL_10    .54***           .03
EFFI_1    .87***           .01
EFFI_2    .92***           .01
EFFI_3    .87***           .01
EFFI_4    .78***           .02
MP_1      .78***           .02
MP_2      .77***           .02
MP_3      .71***           .02
MP_4      .77***           .02
MP_5      .82***           .01
MP_6      .86***           .01
MP_7      .73***           .02
MP_8      .80***           .02
MP_9      .78***           .02
MP_10     .70***           .02
MP_11     .64***           .02
MP_12     .67***           .02
MP_13     .78***           .02
MP_14     .78***           .02
MP_15     .79***           .02
MP_16     .72***           .02
SATIS_1   .91***           .01
SATIS_2   .90***           .01
SATIS_3   .80***           .02
PROG_1    .74***           .02
PROG_2    .88***           .01
PROG_3    .87***           .01
PROG_4    .86***           .01
***p < .001. TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, MP = Mentor Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy, SATIS = Satisfaction, PROG = Perceived Progress

RQ1. Shea's Model and the Alternative Model

To answer RQ1, regarding two models of the structural relationships among teaching presence, social presence, self-efficacy, online SRL strategies, and cognitive presence, two SEM analyses were conducted using Mplus 7. The first model (see Fig. 1) was developed by Shea and Bidjerano (2012). Building on it, the alternative model proposed by the current study (see Fig. 2) added one more path, from teaching presence to online SRL strategies. Both models contained the same five latent variables: TP, SP, SRL, EFFI, and CP. TP was measured by 12 indicators covering the teacher's direct instruction, facilitation, and design/organization of the course. CP was measured by 12 indicators covering triggering events, exploration, integration, and resolution. SP was measured by six items, including affective expression and open communication. SRL was measured by 10 items, including goal setting, strategic learning, and help seeking. EFFI was measured by four indicators.

Table 9 presents a comparison between Shea's model and the alternative model in terms of model fit. For Shea's model, the results were χ2 (878) = 2368.510 (p < .001), RMSEA = .050 with a 90% CI from .048 to .053, CFI = .926, and SRMR = .074. For the alternative model, they were χ2 (877) = 2295.049 (p < .001), RMSEA = .049 with a 90% CI from .047 to .052, CFI = .930, and SRMR = .045.

Table 9.
Fit Indices of Shea's Model and the Alternative Model

Model               χ2         df    p        RMSEA   90% CI of RMSEA   CFI    SRMR
Shea's Model        2368.510   878   < .001   .050    .048, .053        .926   .074
Alternative Model   2295.049   877   < .001   .049    .047, .052        .930   .045

Since the first model was nested in the second, a chi-square difference test could be used to compare the two models. The chi-square difference was 73.461 (df = 1, p < .001). Therefore, the alternative model proposed by the present study, with one extra path from TP to SRL, can be deemed more appropriate than Shea's model for explaining the structural relationships between the components of CoI and SRL among this sample of K-12 online learners.

Table 10 shows the standardized coefficients of the paths among CoI, EFFI, and SRL. The path coefficients of the hypothesized alternative model are also shown in Fig. 5, with solid lines representing significant predictions and dotted lines non-significant ones. As the figure indicates, all the path estimates were statistically significant except for the path from TP to EFFI (β = .08, p > .05). This non-significant path was therefore removed, and a new model, the Modified Alternative Model, was run.

Table 10. The Structural Model for the Alternative Model

                    Standardized β   S.E.
EFFI on   TP        .08              .08
          SP        .50***           .08
SRL on    EFFI      .37***           .04
          TP        .37***           .04
CP on     SRL       .24***           .04
          EFFI      .13***           .04
          TP        .24***           .06
          SP        .40***           .06
***p < .001. TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy

Figure 5. The Path Coefficients of the Alternative Model

The standardized coefficients of the structural model for the Modified Alternative Model are shown in Table 11 and Fig. 6. The path from SP to EFFI had a large effect size (β = .57), and the path from SP to CP had a medium effect size (β = .41).
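Returning to the nested-model comparison above, the chi-square difference test can be reproduced in a few lines. With one degree of freedom the chi-square survival function has a closed form, so only the standard library is needed:

```python
import math

# Chi-square difference test for nested SEM models.
# For df = 1, P(chi2 > x) = erfc(sqrt(x / 2)) in closed form;
# larger df would require a stats package (e.g., scipy.stats.chi2.sf).
chi2_shea, df_shea = 2368.510, 878
chi2_alt, df_alt = 2295.049, 877

diff = chi2_shea - chi2_alt          # 73.461
df_diff = df_shea - df_alt           # 1
p = math.erfc(math.sqrt(diff / 2))

print(round(diff, 3), df_diff, p < .001)  # 73.461 1 True
```

A significant difference favors the less constrained model, here the alternative model with the extra TP-to-SRL path.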
All the remaining paths had small effect sizes (EFFI to SRL: .38, TP to SRL: .37, SRL to CP: .24, TP to CP: .23, and EFFI to CP: .12). Around 33% of the total variance in EFFI was accounted for by SP; around 40.5% of the total variance in SRL was accounted for by TP and EFFI; and around 72.4% of the total variance in CP was accounted for by TP, SP, SRL, and EFFI.

Table 11. The Structural Model for the Modified Alternative Model

                    Standardized β   S.E.
EFFI on   SP        .57***           .03
SRL on    EFFI      .38***           .04
          TP        .37***           .04
CP on     SRL       .24***           .04
          EFFI      .12***           .04
          TP        .23***           .06
          SP        .41***           .07
***p < .001. TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy

Figure 6. The Path Coefficients of the Modified Alternative Model

RQ2. Model with Mentor Presence

To establish the structural relationships among the CoI constructs, SRL, self-efficacy, and mentor presence, the Modified Alternative Model presented above was extended through the addition of MP. Again, all SEM analyses were conducted using Mplus 7. To differentiate this new model from the previous ones, it is referred to as the Hypothesized Alternative Model with Mentor Presence. It consisted of six latent variables: TP, SP, SRL, CP, EFFI, and MP. The indicators for TP, SP, SRL, CP, and EFFI were the same as those in the Modified Alternative Model; the newly added latent variable MP comprised 16 measured indicators. Fig. 3 shows this new model, which was used to test the hypotheses that MP predicts EFFI, SRL, and CP. Tests of model fit yielded χ2 (1670) = 3846.492 (p < .001), RMSEA = .044 with a 90% CI from .042 to .046, CFI = .929, and SRMR = .043. Table 12 presents the standardized coefficients for the Hypothesized Alternative Model with Mentor Presence, which are also shown in Fig. 7.
The solid lines represent significant predictions, and the dotted lines non-significant ones. As Fig. 7 shows, only two path estimates were statistically non-significant: from MP to CP (β = -.03, p > .05) and from MP to EFFI (β = .09, p > .05). Again, these two non-significant paths were removed and a new model, the Modified Alternative Model with Mentor Presence, was run.

Table 12. The Structural Model for the Model with Mentor Presence

                    Standardized β   S.E.
EFFI on   SP        .51***           .05
          MP        .09              .05
SRL on    EFFI      .35***           .04
          TP        .22***           .05
          MP        .24***           .05
CP on     SRL       .24***           .04
          EFFI      .12***           .04
          TP        .25***           .06
          SP        .41***           .07
          MP        -.03             .04
***p < .001. TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, MP = Mentor Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy

Figure 7. The Path Coefficients of the Alternative Model with Mentor Presence

The new model without the two non-significant paths is illustrated in Fig. 8. As shown in Table 13, the path from SP to EFFI (β = .58, p < .001) had a large effect size, while the paths from EFFI to SRL (β = .35, p < .001) and from SP to CP (β = .41, p < .001) both had medium effect sizes. All the other paths had small effect sizes: TP to SRL = .22 (p < .001), MP to SRL = .24 (p < .001), SRL to CP = .24 (p < .001), EFFI to CP = .12 (p < .01), and TP to CP = .24 (p < .001). Around 33.4% of the variance in EFFI was accounted for by SP; around 43.2% of the total variance in SRL was accounted for by TP, EFFI, and MP; and around 72.3% of the total variance in CP was accounted for by TP, SP, SRL, and EFFI.

Table 13. The Structural Model for the Modified Model with Mentor Presence

                    Standardized β   S.E.
EFFI on   SP        .58***           .03
SRL on    EFFI      .35***           .04
          TP        .22***           .05
          MP        .24***           .05
CP on     SRL       .24***           .04
          EFFI      .12**            .04
          TP        .24***           .06
          SP        .41***           .07
**p < .01, ***p < .001. TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, MP = Mentor Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy

Figure 8. The Path Coefficients of the Modified Alternative Model with Mentor Presence

RQ3. Model with Learning Outcomes – Satisfaction

To explore the structural relationships among CoI, SRL, self-efficacy, mentor presence, and satisfaction, satisfaction (SATIS) was added to the model described in the previous section (see Fig. 4), with the expectation that TP, SP, MP, and CP would all have direct effects on SATIS. The overall model results were χ2 (1848) = 4213.685 (p < .001), RMSEA = .044 with a 90% CI from .042 to .045, CFI = .928, and SRMR = .043.

Table 14 shows the standardized coefficients for the model after adding SATIS, and Fig. 9 presents the standardized path coefficients. Again, the solid lines represent significant predictions and the dotted lines non-significant ones. As Fig. 9 shows, all the path estimates were statistically significant except for the paths from TP to SATIS (β = .01, p > .05) and from MP to SATIS (β = -.06, p > .05).

Table 14. The Structural Model for the Hypothesized Model with Satisfaction

                    Standardized β   S.E.
EFFI on   SP        .61***           .04
SRL on    EFFI      .35***           .05
          TP        .22***           .05
          MP        .24***           .04
CP on     SRL       .24***           .04
          EFFI      .12**            .04
          TP        .25***           .06
          SP        .39***           .07
SATIS on  CP        .36***           .06
          TP        .01              .06
          SP        .57***           .08
          MP        -.06             .04
**p < .01, ***p < .001. TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, MP = Mentor Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy, SATIS = Satisfaction.

Figure 9. The Path Coefficients of the Alternative Model with Mentor Presence and Satisfaction

After the non-significant paths from TP to SATIS and from MP to SATIS were removed, the resulting model yielded χ2 (1850) = 4216.289 (p < .001), RMSEA = .044 with a 90% CI from .042 to .045, CFI = .928, and SRMR = .043. Fig. 10 presents this modified model with the standardized coefficient on each path, and Table 15 shows its standardized coefficients.
In the modified model, two paths showed large effect sizes: from SP to EFFI (β = .61, p < .001) and from SP to SATIS (β = .53, p < .001). Medium effect sizes were found for three paths: from SP to CP (β = .39, p < .001), from CP to SATIS (β = .36, p < .001), and from EFFI to SRL (β = .35, p < .001). Small effect sizes were found for the five remaining paths: from TP to SRL (β = .22, p < .001), MP to SRL (β = .24, p < .001), SRL to CP (β = .24, p < .001), EFFI to CP (β = .12, p < .01), and TP to CP (β = .25, p < .001).

Table 15. The Structural Model for the Modified Model with Satisfaction

                    Standardized β   S.E.
EFFI on   SP        .61***           .03
SRL on    EFFI      .35***           .04
          TP        .22***           .05
          MP        .24***           .05
CP on     SRL       .24***           .04
          EFFI      .12**            .04
          TP        .25***           .06
          SP        .39***           .07
SATIS on  CP        .36***           .06
          SP        .53***           .06
**p < .01, ***p < .001. TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, MP = Mentor Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy, SATIS = Satisfaction.

Figure 10. The Path Coefficients of the Modified Alternative Model with Mentor Presence and Satisfaction

Mediation was then tested using the bootstrapping procedures previously described. For the mediation between SP and SATIS, the unstandardized total effect was .77 (p < .001), the total indirect effect was .20 (p < .001), and the direct effect was .57 (p < .001). The 95% CI of the unstandardized indirect effect ranged from .12 to .31, so the total indirect effect from SP to SATIS was significant. The relationship between SP and SATIS was mediated by: 1) CP (B = .15, 95% CI from .09 to .24); and 2) EFFI, SRL, and CP in sequence (B = .02, 95% CI from .01 to .04). Both mediation effects were significant.

RQ3. Model with Learning Outcomes – Perceived Progress

Another hypothesized model, using perceived progress (PROG) as the learning-outcome variable in place of SATIS, is presented in Fig. 4. This model aims to elucidate the structural relationships among CoI, SRL, EFFI, MP, and PROG.
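The percentile-bootstrap logic behind the mediation tests above can be sketched on simulated data. The dissertation ran its bootstrapping in Mplus; the sample size, effect sizes, and variable names below are hypothetical, chosen only to illustrate how a bootstrap CI for an indirect effect a*b is formed:

```python
import random

# Percentile-bootstrap test of an indirect (mediated) effect a*b.
# Simulated data: x -> m (a path) and m -> y controlling for x (b path).
random.seed(1)
n = 200
x = [random.gauss(0, 1) for _ in range(n)]
m = [0.6 * xi + random.gauss(0, 1) for xi in x]              # a = 0.6
y = [0.5 * mi + 0.3 * xi + random.gauss(0, 1)
     for xi, mi in zip(x, m)]                                # b = 0.5, c' = 0.3

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def centered(v):
    mean = sum(v) / len(v)
    return [a - mean for a in v]

def indirect_effect(idx):
    """a*b estimate computed from one (re)sample of row indices."""
    xc = centered([x[i] for i in idx])
    mc = centered([m[i] for i in idx])
    yc = centered([y[i] for i in idx])
    a = dot(mc, xc) / dot(xc, xc)                # slope of m ~ x
    # coefficient of m in y ~ m + x (closed-form two-predictor OLS)
    smm, sxx, smx = dot(mc, mc), dot(xc, xc), dot(mc, xc)
    b = (dot(mc, yc) * sxx - dot(xc, yc) * smx) / (smm * sxx - smx ** 2)
    return a * b

boot = sorted(
    indirect_effect([random.randrange(n) for _ in range(n)])
    for _ in range(1000)
)
lo, hi = boot[24], boot[974]   # 95% percentile confidence interval
print(0 < lo)  # True: the CI excludes zero, so the indirect effect is significant
```

The decision rule matches the one used in the text: an indirect effect is deemed significant when its bootstrap CI excludes zero.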
It was hypothesized that TP, SP, MP, and CP would all have direct effects on PROG. The overall model results were χ2 (1910) = 4493.232 (p < .001), RMSEA = .045 with a 90% CI from .043 to .047, CFI = .923, and SRMR = .046. Table 16 shows the standardized coefficients for the model using PROG as its learning-outcome variable, and Fig. 11 presents the standardized path coefficients. All the path estimates were statistically significant except for the paths from MP to PROG (β = .04, p > .05) and from TP to PROG (β = -.01, p > .05).

After removing these non-significant paths (see Table 17 and Figure 12), the new model yielded χ2 (1912) = 4499.775 (p < .001), RMSEA = .045 with a 90% CI from .043 to .047, CFI = .922, and SRMR = .047. In this model, the paths from SP to EFFI (β = .60, p < .001) and from CP to PROG (β = .60, p < .001) showed large effect sizes. Medium effect sizes were found for two paths: from EFFI to SRL (β = .35, p < .001) and from SP to CP (β = .37, p < .001). All the other paths showed small effect sizes: TP to SRL (β = .22, p < .001), MP to SRL (β = .24, p < .001), SRL to CP (β = .24, p < .001), EFFI to CP (β = .16, p < .001), TP to CP (β = .24, p < .001), and SP to PROG (β = .21, p < .01).

Table 16. The Structural Model for the Hypothesized Model with Perceived Progress

                    Standardized β   S.E.
EFFI on   SP        .61***           .03
SRL on    EFFI      .35***           .04
          TP        .22***           .05
          MP        .24***           .05
CP on     SRL       .24***           .04
          EFFI      .16***           .04
          TP        .27***           .06
          SP        .34***           .07
PROG on   CP        .61***           .06
          SP        .33***           .09
          MP        .04              .04
          TP        -.01             .07
***p < .001. TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, MP = Mentor Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy, PROG = Perceived Progress.

Figure 11. The Path Coefficients of the Alternative Model with Mentor Presence and Perceived Progress

Table 17. The Structural Model for the Modified Model with Perceived Progress

                    Standardized β   S.E.
EFFI on   SP        .60***           .03
SRL on    EFFI      .35***           .04
          TP        .22***           .05
          MP        .24***           .05
CP on     SRL       .24***           .04
          EFFI      .16***           .04
          TP        .24***           .06
          SP        .37***           .07
PROG on   CP        .60***           .06
          SP        .21**            .07
**p < .01, ***p < .001. TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, MP = Mentor Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy, PROG = Perceived Progress.

Figure 12. The Path Coefficients of the Modified Alternative Model with Mentor Presence and Perceived Progress

As with the SATIS-outcome model described in the previous section, mediation between SP and PROG was then tested using bootstrapping procedures. Here, the bootstrapped unstandardized total effect was .36 (p < .001), the total indirect effect was .21 (p < .01), and the direct effect was .15 (p > .05). The 95% CI of the unstandardized indirect effect ranged from .12 to .42. The relationship between SP and PROG was mediated by: 1) CP (β = .15, p < .01, 95% CI from .09 to .32); 2) EFFI and CP (β = .04, p < .05, 95% CI from .01 to .08); and 3) EFFI, SRL, and CP (β = .02, p < .01, 95% CI from .01 to .04).

RQ3. Model with Learning Outcomes – Final Grade

The last hypothesized model used students' final grade as the learning-outcome variable. As shown in Fig. 5, the model explores the structural relationships among CoI, SRL, EFFI, MP, and final grade. The hypothesis was that TP, SP, MP, and CP would all relate to final grade. The model yielded χ2 (1728) = 3934.584 (p < .001), RMSEA = .044 with a 90% CI from .042 to .045, CFI = .928, and SRMR = .043. Table 18 shows the standardized coefficients for the model using final grade as its learning-outcome variable, and Fig. 13 presents the standardized path coefficients. All the path estimates were statistically significant except for the paths from TP to final grade (β = .05, p > .05), from MP to final grade (β = -.05, p > .05), and from SP to final grade (β = .03, p > .05).
After removing these non-significant paths (see Table 19 and Figure 14), the new model yielded χ2 (1731) = 3935.637 (p < .001), RMSEA = .044 with a 90% CI from .042 to .045, CFI = .928, and SRMR = .043. In this model (see Table 19), the path from SP to EFFI (β = .58, p < .001) showed a large effect size. Medium effect sizes were found for two paths: from EFFI to SRL (β = .35, p < .001) and from SP to CP (β = .41, p < .001). All the other paths showed small effect sizes: TP to SRL (β = .22, p < .001), MP to SRL (β = .24, p < .001), SRL to CP (β = .24, p < .001), EFFI to CP (β = .12, p < .01), TP to CP (β = .24, p < .001), and CP to final grade (β = .29, p < .01).

Table 18. The Structural Model for the Hypothesized Model with Final Grade

                          Standardized β   S.E.
EFFI on        SP         .58***           .03
SRL on         EFFI       .35***           .04
               TP         .22***           .05
               MP         .24***           .05
CP on          SRL        .24***           .04
               EFFI       .12**            .04
               TP         .24***           .06
               SP         .41***           .07
Final Grade on CP         .25***           .08
               TP         .05              .09
               SP         .03              .11
               MP         -.05             .05
**p < .01, ***p < .001. TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, MP = Mentor Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy

Figure 13. The Path Coefficients of the Alternative Model with Mentor Presence and Final Grade

Table 19. The Structural Model for the Modified Model with Final Grade

                          Standardized β   S.E.
EFFI on        SP         .58***           .03
SRL on         EFFI       .35***           .04
               TP         .22***           .05
               MP         .24***           .05
CP on          SRL        .24***           .04
               EFFI       .12**            .04
               TP         .24***           .06
               SP         .41***           .07
Final Grade on CP         .29***           .04
**p < .01, ***p < .001. TP = Teaching Presence, SP = Social Presence, CP = Cognitive Presence, MP = Mentor Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy

Figure 14. The Path Coefficients of the Modified Alternative Model with Mentor Presence and Final Grade

RQ4. Studying at Home or at School

First, descriptive statistics were used to examine where students chose to complete the majority of their online coursework (see Table 20).
Of the 696 participants, about 58% reported that their primary location for completing most of their online coursework was at school, and the remaining 42% reported that they completed most of their work at home.

Table 20. The Demographic Differences between "At-school" Students and "At-home" Students

                                           At-school   At-home   Total
Gender             Male                    102         90        192
                   Female                  297         205       502
Grade              7th grade               2           2         4
                   8th grade               16          9         25
                   9th grade               33          26        59
                   10th grade              80          52        132
                   11th grade              80          82        162
                   12th grade              189         124       313
Subject            English                 27          31        58
                   Foreign languages       110         97        207
                   Math                    40          28        68
                   Science                 62          49        111
                   Social science          56          41        97
                   Other subjects          103         49        152
Credit-recovery    Credit-recovery         22          18        40
                   Non-credit-recovery     374         275       649
Number of online   1 course                165         101       266
courses before     2 courses               137         89        226
                   3 courses               47          40        87
                   4 or more courses       50          64        114

As Table 20 shows, slightly more students chose to complete the majority of their work at school, regardless of gender, grade, subject, and whether the student took the online course for credit-recovery purposes. Another point of interest in Table 20 concerns prior online-course experience: students who had previously taken one, two, or three online courses more often learned primarily at school, whereas students who had already taken four or more online courses showed a preference for learning at home (N = 64) over learning at school (N = 50).

Next, this study examined whether students' primary online learning location was related to their online-learning perceptions and outcomes, including their perceptions of social presence, cognitive presence, mentor presence, online self-regulated learning strategies, and learning outcomes.
One-way ANOVAs were conducted to evaluate the impact of the two primary online learning locations on the various online learning presences and outcomes. As shown in Table 21, the ANOVAs revealed significant group effects on several items.

Table 21. Group Comparisons with Means and ANOVAs

Item          M At-home   M At-school   F
SP_1          3.58        3.73          3.16
SP_2          3.39        3.59          4.29*
SP_3          3.33        3.47          2.01
SP_4          3.89        3.83          .68
SP_5          4.15        4.12          .12
SP_6          3.75        3.76          .02
CP_1          3.31        3.51          6.77**
CP_2          3.56        3.79          6.55*
CP_3          3.52        3.71          4.95*
CP_4          3.96        4.02          .55
CP_5          3.95        3.95          .00
CP_6          3.88        3.99          1.64
CP_7          3.82        3.88          .62
CP_8          3.83        3.94          2.47
CP_9          3.86        3.94          .94
CP_10         4.04        4.09          .39
CP_11         3.97        4.02          .56
CP_12         4.07        4.10          .10
SRL_1         3.99        4.07          1.14
SRL_2         3.73        3.95          6.26*
SRL_3         3.86        4.02          4.16*
SRL_4         3.72        3.74          .11
SRL_5         3.47        3.50          .03
SRL_6         2.90        2.81          .82
SRL_7         3.35        3.41          .37
SRL_8         3.41        3.53          1.93
SRL_9         3.69        3.80          1.50
SRL_10        3.46        3.65          4.06*
EFFI_1        4.11        4.07          .32
EFFI_2        4.14        4.13          .01
EFFI_3        4.29        4.31          .05
EFFI_4        3.93        4.02          1.46
MP_1          3.41        3.68          8.24**
MP_2          3.53        3.74          5.40*
MP_3          3.64        3.87          6.56*
MP_4          3.61        3.77          3.09
MP_5          3.72        3.86          2.43
MP_6          3.57        3.79          5.39*
MP_7          4.20        4.29          1.32
MP_8          3.99        4.02          .12
MP_9          3.85        3.87          .08
MP_10         3.99        4.04          .28
MP_11         3.54        3.42          1.51
MP_12         3.59        3.45          1.67
MP_13         3.47        3.49          .02
MP_14         3.49        3.58          .87
MP_15         3.44        3.60          2.54
MP_16         3.65        3.76          1.33
SATIS_1       4.00        4.03          .15
SATIS_2       3.87        3.94          .60
SATIS_3       3.83        3.89          .38
PROG_1        4.24        4.29          .50
PROG_2        4.19        4.24          .42
PROG_3        4.09        4.14          .37
PROG_4        4.08        4.12          .20
Final grade   78.39       81.41         2.59
*p < .05, **p < .01. SP = Social Presence, CP = Cognitive Presence, MP = Mentor Presence, SRL = Online SRL Strategies, EFFI = Self-efficacy, SATIS = Satisfaction, PROG = Perceived Progress.

In terms of social presence, only one measured item showed a significant difference: "I never feel isolated in this online course." The mean for at-school students was 3.59 and the mean for at-home students was 3.39 (p < .05).
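The group comparisons above rest on one-way ANOVAs. The F statistic can be computed from scratch as the ratio of between-group to within-group mean squares; the score lists below are illustrative, not the study's data:

```python
# One-way ANOVA computed from scratch: F = MS(between) / MS(within).
def one_way_anova_f(*groups):
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum(sum((v - sum(g) / len(g)) ** 2 for v in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical item ratings for the two location groups.
at_school_scores = [1, 2, 3]
at_home_scores = [2, 3, 4]
f = one_way_anova_f(at_school_scores, at_home_scores)
print(f)  # 1.5
```

With only two groups, as here, the one-way ANOVA is equivalent to an independent-samples t test (F = t²).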
In terms of cognitive presence, three items showed significant differences: "Problems posed increased my interest in course issues" (M = 3.51 for at-school students and M = 3.31 for at-home students, p < .01), "Course activities raised my curiosity" (M = 3.79 for at-school students and M = 3.56 for at-home students, p < .05), and "I felt motivated to explore content related questions" (M = 3.71 for at-school students and M = 3.52 for at-home students, p < .05).

In terms of mentor presence, at-school students reported significantly higher mentor presence on the following four items: "The mentor helped me get used to the online learning environment" (M = 3.68 for at-school students and M = 3.41 for at-home students, p < .01), "The mentor helped me become familiar with the course platform" (M = 3.74 for at-school students and M = 3.53 for at-home students, p < .05), "The mentor helped me when I met technical problems" (M = 3.87 for at-school students and M = 3.64 for at-home students, p < .05), and "The mentor fostered a sense of learning community" (M = 3.79 for at-school students and M = 3.57 for at-home students, p < .05).

Lastly, in terms of self-regulated learning strategies, three items demonstrated significant differences: "I set short-term (daily or weekly) goals as well as long-term goals (monthly or for the semester)" (M = 3.95 for at-school students and M = 3.73 for at-home students, p < .05), "I keep a high standard for my learning in my online courses" (M = 4.02 for at-school students and M = 3.86 for at-home students, p < .05), and "If needed, I try to ask my online teacher/mentors about questions that I don't understand" (M = 3.65 for at-school students and M = 3.46 for at-home students, p < .05).

CHAPTER 5

DISCUSSION

Both the CoI framework and SRL theory have been widely accepted and used to explain students' online learning processes and outcomes.
The former focuses on active collaboration and construction among learners and teachers aimed at building deep and meaningful learning through teaching-, social-, and cognitive presences (Garrison, 2007; Garrison et al., 2001); the latter considers the metacognitively, motivationally, and behaviorally active learning processes initiated by learners (Zimmerman, 1986). The primary goals of the present study have been to extend Shea and Bidjerano's (2010) model to include CoI presence and SRL as potential explanations of students' online-learning process, and to evaluate the predictive power of this model using a large sample of data from an actual K-12 online-learning setting. In spite of – or perhaps because of – the profusion of online-learning research in higher-education settings, very few frameworks have explicitly focused on the unique features of younger online learners or on differences in online-learning environments between the school and college levels (Barbour, 2012, 2013; Barbour & Reeves, 2009; Borup et al., 2013; Cavanaugh et al., 2009; Rice, 2006). The present research is a response to these and other scholars' calls for a study of existing theoretical frameworks' applicability in K-12 online learning settings. This study expands on previous research in four ways: 1) by testing whether the most prominent previously established model can usefully be applied to K-12 online learners; 2) by testing the addition of mentor presence to that model; 3) by linking the previously established model to learning outcomes (i.e., satisfaction, perceived learning, and final grade); and 4) by testing whether students who mostly learn at-school or at-home show similar or different levels of online learning presences and learning outcomes.
Overall, the results provide strong evidence in support of the hypotheses presented in Chapter 2, above, and help to explain the complicated relationships among CoI, self-efficacy, SRL strategies, mentor presence, and learning-outcome variables in K-12 online learning. Table 22 summarizes the results of the tested hypotheses, and the following sections discuss the main findings relating to each research question.

Table 22. Summary of the Tested Hypotheses

Hypothesis                                                               Result
H1: Teaching presence positively predicts students' online               Supported
    SRL strategies.
H2: Teaching presence positively predicts students'                      Not Supported
    self-efficacy.
H3: Teaching presence positively predicts cognitive presence.            Supported
H4: Teaching presence positively predicts learning outcome.              Not Supported
H5: Social presence positively predicts students' self-efficacy.         Supported
H6: Social presence positively predicts cognitive presence.              Supported
H7: Social presence positively predicts learning outcome.                Supported1
H8: Cognitive presence positively predicts learning outcome.             Supported
H9: Self-efficacy positively predicts online SRL strategies.             Supported
H10: Self-efficacy positively predicts cognitive presence.               Supported
H11: Online SRL strategies positively predict cognitive presence.        Supported
H12: Mentor presence positively predicts self-efficacy.                  Not Supported
H13: Mentor presence positively predicts online SRL.                     Supported
H14: Mentor presence positively predicts cognitive presence.             Not Supported
H15: Mentor presence positively predicts learning outcome.               Not Supported
H16: Students who use school as their primary online learning            Partially
     location demonstrate higher levels of social presence, mentor       Supported2
     presence, cognitive presence, self-efficacy, online SRL
     strategies, and learning outcomes, compared with those who
     choose to spend the majority of their online-learning time
     at home.

Table 22.
(cont'd)
1 Supported in the satisfaction and perceived progress models but not in the final grade model.
2 Students who use school as their primary online learning location did not demonstrate significantly higher levels of self-efficacy or learning outcomes, compared with those who choose to spend the majority of their online-learning time at home.

Comparing Shea's Model against the Alternative Model

As previously discussed, Shea and Bidjerano's (2010) SEM model confirmed the relationship between the CoI framework, self-efficacy, and self-regulation (see Fig. 1). Guided by prior literature on students' self-regulation under teachers' guidance, the present study added one more direct path to Shea's model – from teaching presence to self-regulation – with the aim of improving its fit with a sample of data from K-12 online learners. SEM analyses showed that the hypothesized model was a better fit than Shea's model. The new model also found 1) a significant relationship between teaching presence and SRL that was not specified in Shea's model, and 2) that teaching presence did not predict self-efficacy significantly, even though Shea's model had shown a strong correlation between these two constructs. These two differences and the reasons they emerged are discussed in detail below.

Teaching Presence and Self-regulated Learning Strategies. The new model proposed and tested in the current study confirmed a direct path between teaching presence and self-regulation, with a medium effect size (.45). Such a finding echoes SRL theory on the importance of teachers empowering students to become self-aware and self-directed during the instruction process (Zimmerman, 2002). Zimmerman, Bonner, and Kovach (1996) highlighted that teachers need to prepare students to learn on their own, through techniques such as setting goals and self-evaluating learning: i.e., to become self-regulated.
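The model comparison described above – adding one path (teaching presence to self-regulation) and testing whether fit improves – is typically carried out with a chi-square difference test for nested SEM models. The sketch below uses hypothetical fit statistics as placeholders, not the values estimated in this dissertation:

```python
# Hedged sketch of a chi-square difference test for nested SEM models.
# The chi-square and df values below are HYPOTHETICAL placeholders, not
# the fit statistics reported in this study.
from scipy import stats

chi2_base, df_base = 1250.0, 520        # baseline model (no TP -> SRL path)
chi2_revised, df_revised = 1180.0, 519  # revised model (adds TP -> SRL path)

# Freeing one path costs one degree of freedom; a significant drop in
# chi-square indicates the added path improves model fit.
delta_chi2 = chi2_base - chi2_revised
delta_df = df_base - df_revised
p_value = stats.chi2.sf(delta_chi2, delta_df)
print(f"chi-square difference = {delta_chi2:.1f}, df = {delta_df}, p = {p_value:.3g}")
```

A significant p-value here would favor the less constrained (revised) model, which is the logic behind preferring the model with the added teaching presence to self-regulation path.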
The present findings also echo the idea that it is through online teaching presence that students become metacognitively self-regulated (Akyol & Garrison, 2011b). The medium effect size of the relationship between teaching presence and self-regulation is consistent with previous quasi-experimental studies' findings that students' SRL skills can be improved through intentional and structured teaching practice (Crippen & Earl, 2007; Ebner & Ehri, 2016). When examining the correlation between teaching presence and self-regulation, Shea and Bidjerano (2010) reported a .19 correlation. In the current study, in contrast, the correlation between the latent factor of teaching presence and the latent factor of self-regulation was .33. There are two possible explanations for this discrepancy. First, since most of the courses in the current study were purchased from a third party, it is possible that they were uniformly well-made and thoughtfully designed to facilitate students' self-regulation development. As a general matter, it is possible that these and other course-design considerations may relate to whether and how well teachers can simultaneously deliver content knowledge and metacognitive-thinking guidance. The second potential explanation resides in scholars' differing choices of self-regulation measurements. Although the current study and Shea and Bidjerano's (2010) were both clearly situated within the concept of self-regulation, each used a different scale to measure it. Notably, Shea and Bidjerano used four items from the "effort regulation" section of the MSLQ (Pintrich et al., 1993) as indicators for self-regulation, while the current study incorporated concepts such as goal-setting and help-seeking in addition to effort regulation. The additional goal-setting and help-seeking measurements may account for the stronger observed correlation between teaching presence and SRL.

Teaching Presence and Self-efficacy.
Shea and Bidjerano (2010) found that teaching presence significantly predicted self-efficacy, whereas the present study did not find the same relationship to be significant. The reason for this may reside in the differences between the participants selected in the two studies. Shea and Bidjerano's respondents comprised both blended-learning and fully online students, while in the present study all students took their relevant courses online only; and as Shea and Bidjerano conceded, they found much stronger correlations between teaching presence and self-efficacy among blended-learning students (who engaged in some forms of face-to-face interaction) than among students in fully online courses. The question, then, is why teaching presence is less relevant (or in our case, not relevant at all) to self-efficacy in online-learning settings. Research on traditional face-to-face settings has found that teachers can foster students' self-efficacy through a range of practices, such as providing specific and accurate feedback, setting challenging tasks, individualizing learning, showing care, and demonstrating effective modeling practices (Linnenbrink & Pintrich, 2003; Pajares, 2012). It is possible that online-learning environments are deficient in vicarious learning experiences and/or verbal persuasion from teachers, due to the limits such environments place on teacher-student interaction. If so, it would be especially challenging for online learners to form strong self-efficacy. Previous studies have also found that it is difficult for teachers to adjust their lesson plans and tailor learning to each individual in K-12 online language courses (Lin & Zheng, 2015; Oliver et al., 2009), which were being taken by 29% of our sample.
Therefore, given that the context of the current study involved an asynchronous learning format and less active roles on the part of the instructors than in Shea and Bidjerano's sample, it is possible that there really was no significant relationship between teaching presence and self-efficacy to be found.

Mentor Presence in K-12 Online Learning

Mentor presence was included in the current study's modeling to reflect a key feature of K-12 online learning, one that has been called a critical change to such learning (Barbour & Mulcahy, 2004) and an important factor in helping K-12 students succeed online (Ferdig, 2010; Roblyer et al., 2008; Taylor et al., 2016). De la Varre (2011) recommended that mentors' work be considered as a component of the CoI framework to reflect their facilitation and engagement activities, since in real-life online-learning settings, teaching and facilitating practices are actually distributed between the instructor and the mentor, each contributing separately. Building on Shea and Bidjerano's (2010) model, this study added one more factor – MP – to reflect this trait of K-12 online learning. The SEM results showed a satisfactory fit for the resulting model, confirming that adding mentor presence to the existing combined SRL/CoI framework could provide a more comprehensive picture of the mechanisms that predict K-12 online learning.

Mentor Presence. Previous studies have subdivided K-12 mentors into four roles: relationship builders, monitors, content-learning facilitators, and technical-problem solvers (e.g., Aronson & Timms, 2003; Barbour & Mulcahy, 2004; Borup & Drysdale, 2014; de la Varre et al., 2011; Drysdale, 2013; Hannum et al., 2008; Kennedy & Cavanaugh, 2010; Taylor et al., 2016). The current study examined the mentoring practices used by K-12 online-learning mentors, and found that their practices include all the roles that previous studies have mentioned.
This is reflected in the fact that all 16 items of the measured mentor presence loaded onto the latent factor of mentor presence with factor loadings greater than .60. Taken as a whole, this study confirmed the relevance of these four pre-existing categories.

Descriptive statistics showed that the composite score of mentor presence was perceived at a moderate level by K-12 online learners (M = 3.74), and the means for the 16 sub-scales of mentor presence that reflected mentors' different roles in facilitating online learning ranged from 3.47 to 4.25. This reflects the fact that mentors can play a variety of roles in facilitating students' learning, rather than just limiting themselves to the original purpose of tracking students' progress (Borup & Drysdale, 2014; Drysdale, 2013; Kennedy & Cavanaugh, 2010; Taylor et al., 2016). Specifically, this finding confirmed previous studies' findings about the commonplace, albeit unofficial, extension of mentors' roles into direct content instruction (Barbour & Mulcahy, 2004; de la Varre, 2011; O'Dwyer et al., 2007).

Mentor Presence, Self-efficacy, and Self-regulated Learning Strategies. After the addition of MP, the model illustrated in Fig. 8 indicated that such presence did not significantly predict students' self-efficacy. The non-significant direct effect of mentor presence on self-efficacy did not support previous studies' findings about the positive impact of mentoring on students' confidence in online learning (de la Varre et al., 2011; Freidhoff et al., 2015; Murphy & Rodriguez-Manzanares, 2009).
A potential explanation is that those studies explored how mentors' explicit encouragement affected students' beliefs about whether they would achieve success in online learning, while the mentor presence measured in the current study mainly covered relationship building, monitoring, content-learning instruction or facilitation, and technical-problem solving, with only one of the 16 items asking an encouragement-related question (i.e., "The mentor encouraged students when needed"). Since this study is the first to use a quantitative approach to describe mentor presence, replication of this method is strongly needed. This finding also suggests that, in order to promote students' self-efficacy, mentors may need to use explicit self-efficacy-promoting strategies, which echoes some professional-development programs that have specifically trained mentors in strategies for improving students' self-efficacy and motivation (Staker, 2011).

K-12 online learners have been found to have relatively low metacognitive skills, which makes their online learning a challenging experience (Borup & Drysdale, 2014; Cavanaugh et al., 2009). It is therefore crucial for each child to have an on-site mentor who can help them with the development of such metacognitive skills (Borup & Drysdale, 2014). The present study's modeling also identified a direct relationship between mentor presence and online SRL strategies, implying that students with higher levels of perception of their mentors' presence were more likely to demonstrate higher levels of online SRL strategies. This relationship directly supports previous findings regarding mentors' positive role in developing students' SRL skills (Freidhoff et al., 2015; Harms et al., 2006), and further confirms the importance of mentors teaching students learning skills rather than merely supervising their content learning (Harms et al., 2006; Wicks, 2010).
In the context of the current study, students enrolled in only one or two online courses offered by the research site as a supplement to the curricula of their own brick-and-mortar schools (Watson et al., 2011). In addition, it was required by law that each student be assigned a mentor employed by his or her school district, who would be responsible for "determining appropriate instructional methods for each pupil, diagnosing learning needs, assessing pupil learning, prescribing intervention strategies, reporting outcomes, and evaluating the effects of instruction and support strategies" (Freidhoff et al., 2015, p. 112). More than half of the participants in this study (57.5%) used school as their primary learning location, attending lab sessions overseen by their mentors on school days to learn virtual-school course content and complete virtual-school assignments. It is possible that during the above-mentioned lab sessions, mentors' practices of keeping track of progress, reminding students of assignment due dates, and keeping students focused on learning helped to develop students' self-regulated learning skills such as time-management, goal-setting, or self-monitoring and self-reflection.

Connecting the CoI Models with Learning Outcomes

In this study, the three learning-outcome variables SATIS, PROG, and final grade were included in separate hypothesized models, all of which also contained CoI, SRL, EFFI, and MP. Overall, the results from these learning-outcome models provided strong support for the hypothesized relationships between all these variables, as explained more fully below.

Incorporating Learning Presence into the CoI Model. The models demonstrated positive relationships between CoI-framework and SRL-theory variables, on the one hand, and learning-outcome variables on the other.
Social- and cognitive presences were both found to positively predict students' perceptions of their learning progress and learning satisfaction, while only cognitive presence was found to predict their final grade. These findings are consistent with prior findings about the important roles played by social presence and cognitive presence in directly correlating with students' satisfaction with learning and their perceptions that learning has occurred (Akyol & Garrison, 2008; Arbaugh, 2013; Kang et al., 2014; Rockinson-Szapkiw et al., 2016). Since most of the previous studies of online learning using the CoI framework focused on post-secondary environments, the present study has also extended the relationship between the CoI framework and learning outcomes to K-12 online learning contexts, and confirmed that it is useful in that context, following certain modifications. This dissertation's models have also directly answered scholars' calls for a natural consideration and integration of learning presence into the CoI framework (Barnard et al., 2008; Cho et al., 2017; Lin et al., 2017; Shea & Bidjerano, 2010). The empirical evidence presented in Chapter 4, above, suggests that learning presence – including self-efficacy and metacognitive learning strategies – played a major role in explaining how learners transferred their perceptions of teaching-, mentor-, or social presence into learning outcomes. In general, the present study's finding that students with high learning presence are likely to perceive high cognitive presence and in turn show high learning outcomes is consistent with previous studies' findings about the intercorrelations between CoI and SRL predictors (Cho et al., 2017; Garrison, 2007; Lin et al., 2017; Lynch & Dembo, 2004; Pawan et al., 2003; Puzziferro, 2008; Roblyer & Marshall, 2002; Shea & Bidjerano, 2010, 2012; Weiner, 2003).
This study's identification of learning presence's mediation effect on the relationship between social presence and learning outcomes also adds empirical support to the mediating role of SRL in online learning, and answers calls for increased attention to learning presence within the CoI framework (Barnard et al., 2008; Cho et al., 2017; Shea & Bidjerano, 2010; Wang et al., 2008). For example, Wang et al. concluded that self-efficacy affected online-learning outcomes via learning strategies, which highlighted the importance of self-directed learning in online contexts. Barnard et al. found that approximately one-third of the relationship between students' perceived social presence and their learning achievement was explained by their online SRL behaviors. Similarly, the current study found that about half of the relationship between students' perceived social presence and their satisfaction was accounted for by learning presence (i.e., self-efficacy and SRL strategies). The mediations tested above have also shown in detail how social presence predicted learning outcomes under the mediation of learning presence, thereby revealing the operation of a metacognitive, motivational, and behavioral learning process in the transformation of perceptions of online learning into learning outcomes.

Distinctive Learning Outcomes. Although the proposed three learning-outcome models are similar to each other in terms of their prediction paths, one path highlighted the difference between them and is therefore worthy of further discussion. The finding that social presence directly predicted satisfaction and perceived progress, but not final grade, leads us to reconsider two things: the first is the role of social presence in students' self-perceived learning progress (both emotionally and cognitively), and the second is the need to differentiate perceived learning progress and actual final grade as two distinct variables in evaluating online learning.
Both satisfaction and perceived progress are believed to be important components of successful online learning (Kuo et al., 2014; Lin et al., 2016), and are highly correlated with each other (Eom, Wen, & Ashill, 2006; Lin et al., 2016). The former refers to "the favorability of a student's subjective evaluation of the various outcomes and experiences associated with education" (Elliott & Shin, 2002), and the latter focuses on measuring one's subjective feeling "that learning is taking place" (Rockinson-Szapkiw et al., 2016, p. 21). The fact that social presence directly predicted these two outcomes shows that when asking about students' self-perceived outcomes in learning, whether about psychological emotion (Bolliger & Martindale, 2004) or about the cognitive aspect of learning (Eom et al., 2006), students' social experiences throughout the learning process really matter. That is to say, a strong social presence can strongly shape how students "perceive" their entire learning journey (Elliott & Shin, 2002). Second, the distinct findings between the perceived progress model and the final grade model confirmed the existence of a difference between students' self-perceived learning outcomes and their actual learning outcomes, although both measures reflect students' cognitive learning progress (Barbour, 2010; 2013; Ferdig et al., 2009). Lin et al. (2016) used perceived progress in measuring students' cognitive learning outcomes due to a lack of access to grade-related data, and specifically pointed out that "We could not verify the degree to which these responses were accurate reflections of students' online learning, as opposed to optimistic or pessimistic projections. Thus, where possible, future research should incorporate additional data stored in online learning management systems" (p. 70).
These slightly divergent findings from the two models support this point, and suggest that we need to use multiple sources of data to measure students' cognitive learning outcomes in a more accurate and comprehensive way.

Learning At-home vs. Learning At-school

This dissertation's final research question examined whether students' primary learning location (i.e., learning at-home or learning at-school) made a significant difference to students' perceived social presence, mentor presence, self-efficacy, online SRL strategies, and learning outcomes. The two groups of students did not differ significantly on most of the measured items. However, they still differed on some items, as described in Chapter 4, and these significant differences suggested some interesting patterns. In sum, the significant differences concentrated on the following aspects: feelings of isolation in social presence, triggering events in cognitive presence, solving technical problems in mentor presence, and goal-setting and help-seeking in online SRL strategies. Despite these differences, students did not demonstrate significant differences across groups in the three measured learning outcomes.

Feeling of Isolation. Social interaction correlates with K-12 students' academic performance (Hawkins et al., 2013). The item "I never feel isolated in this online course" is the only item in the latent factor of social presence that showed a significant difference between learning-at-home students and learning-at-school students. Not surprisingly, those whose primary online learning location was school showed significantly less sense of isolation than those whose primary online learning location was home.
This finding is consistent with previous findings about how differences in physical space may affect students' online learning (Borup & Drysdale, 2014; Drysdale, 2013; Freidhoff et al., 2015), and how the physical presence of students in the same classroom helps to eliminate the sense of isolation generated by online learning (e.g., Borup & Drysdale, 2014; Charania, 2010; Pettyjohn, 2012). As Murphy and Rodriguez-Manzanares (2008) described in their study, in addition to the "organized yet formal" online communication between students and teachers, online students believed that the face-to-face interaction that takes place at labs makes students feel at ease and comfortable. The "non-course-specific or social interaction" (Drysdale et al., 2014, p. 18) gave students extra psychological support and reassurance in addition to interactions regarding academic content. It should be noted that although the item listed above showed a significant difference between the two types of learners, of the six measured social presence items, five showed no significant difference between at-home learners and at-school learners. On the one hand, this shows the diverse nature of online learners: different learners perceive their social presence in different ways, and in general this was not affected by where the learner decided to complete the majority of their learning activities. On the other hand, this finding reflects that a high level of social presence is related to complex factors that are not limited to learning location. A well-designed, supportive learning environment and a healthy, active rapport with peers and teachers in the virtual environment seem more important in determining one's perception of social presence online (de la Varre et al., 2011), so that students know that "there is a person behind the computer and not a robot" (Murphy & Rodriguez-Manzanares, 2009, p. 5).
Specifically, the role that online instructors take in building social rapport (DiPietro et al., 2008; Murphy & Rodriguez-Manzanares, 2009) through their teaching practices may play a more important role in the current study.

Triggering Event in Cognitive Presence. Cognitive presence as measured in this study contained four subsets reflecting the four phases of the practical inquiry cycle: triggering event, exploration, integration, and resolution (Garrison et al., 2001). There are three items in each of the four subsets. All three items in the triggering event subset showed a significant difference between at-home and at-school students. No significant differences were detected in exploration, integration, or resolution. The triggering event subset measured whether students' motivation and curiosity were raised by the problems posed in the course. Students who completed the majority of their online learning at school demonstrated significantly higher scores on triggering event than at-home students. Although no studies in the K-12 online learning context have revealed such a difference among learning locations, studies conducted in higher-education settings showed consistent findings. For example, Akyol and Garrison (2011a) found that more of the triggering events occurred during the face-to-face sessions than the online sessions. In another study, Vaughan and Garrison (2005) found that more triggering events were coded in face-to-face settings (13%) than in online settings (8%). Perhaps more importantly, some quotes from the participants in that study explained why curiosity is more likely to emerge in a physical setting: "(the face-to-face setting) develops a sense of community"; "face-to-face --- the discussions are in the moment and I often forget afterwards what was said" (p. 6).
In this study, then, it is possible that the actual presence of people in the same physical location helped to develop a strong sense of community – regardless of whether the students present in the same room actually belonged to the same community or studied the same subject. The physical presence of the students, regardless of learning content, built up a similar sense of community, which then helped to promote students' cognitive presence in the triggering event phase. Physical presence offered additional cues for learners to initiate their learning curiosity and become motivated in learning (Vaughan & Garrison, 2005).

Perceived Mentor's Help. Mentors' practices in this study fell largely into four categories: relationship builders, monitors, content-learning facilitators, and problem solvers. The results suggested that mentors showed a stronger presence as problem solvers for those who chose to use school as their primary online learning location. In addition, mentors' practice of fostering a sense of learning community was scored higher by at-school students. Beyond meeting students' academic needs, online learning programs should also consider students' technical needs during online learning (Watson, 2007; Pettyjohn, 2012). A considerable number of students did show relatively low levels of technical skills before starting online learning (Oliver et al., 2009), which may then limit their access and performance (Cavanaugh et al., 2009). In many K-12 online learning programs, mentors were assigned by schools to provide technical support, such as conducting orientations to familiarize students with the online learning environment and helping to troubleshoot technical problems throughout the entire online learning journey (Aronson & Timms, 2003; Barbour & Mulcahy, 2004; de la Varre et al., 2011; Hannum et al., 2008; Keane et al., 2008; Pettyjohn, 2012; Borup & Drysdale, 2014).
The findings of this study not only confirmed the existence of this role, but also extended our understanding of the mentor's role as a problem solver. Those who chose to complete the majority of their online learning in school labs perceived a stronger presence of their mentors when it came to helping them solve technical problems.

Goal-setting & Help-seeking Learning Strategies. When asked about their online SRL strategies, study-at-school students showed significantly higher scores on goal-setting and help-seeking strategies, while all other strategies showed no significant between-group differences. Goal-setting belongs to the forethought phase of self-regulated learning (Zimmerman, 1986); learners use this strategy prior to or during their online learning. Help-seeking belongs to the second phase – performance control of learning (Zimmerman, 1986) – and reflects how learners use strategies to seek help to adjust their learning. Findings from this study provide empirical support for previous hypotheses about K-12 online learners' low levels of self-regulated learning abilities (Borup & Drysdale, 2014; Cavanaugh et al., 2009; Rice, 2006), the possible benefits for students of having sit-in time at school (Roblyer & Marshall, 2002), and the differences between learning-at-school and learning-at-home students, especially in terms of self-regulated learning abilities (Borup & Drysdale, 2014; Rice, 2006). In particular, the finding that students who complete the majority of their online learning at school demonstrated significantly higher goal-setting and help-seeking strategies can be seen as an addition to the existing finding that students learning online content in the same physical room spend less time on off-task behaviors (de la Varre et al., 2011; Harms et al., 2006).
Such differences may be explained by the practices of mentors during students' physical sit-in times at school, as mentors were found to act as problem solvers and to promote self-regulated learning strategies (Aronson & Timms, 2003; Barbour & Mulcahy, 2004; de la Varre et al., 2011; Hannum et al., 2008; Keane et al., 2008; Pettyjohn, 2012).

Limitations and Recommendations for Future Research

Several limitations of this study need to be noted. First, learning outcomes in this study comprised learner satisfaction, perceived learning, and final grade. Learning outcomes need not be limited to these three aspects alone. For example, Beldarrain (2008) collected students' grades attained in each exam, in addition to their overall final course grades; while Borup et al. (2013) included students' changes in disposition as a dimension of course outcomes. It has also been recommended that standardized test results be considered when examining the effectiveness of learning (U.S. Department of Education, 2008). Future studies should consider a wider variety of learning-outcome metrics when examining CoI, SRL, and related factors in online learning.

A second limitation concerns the sources of the data collected in this study. Due to limited access to the target school's learning-management system and records, TP, MP, and SP could only be measured based on students' self-reported perceptions. Future studies should include multiple sources of data (e.g., learning-management system, learning records, teacher perceptions, teacher and student interviews, open-ended questions, and/or mentor journals) to verify students' self-reported data (Barbour, 2012; Borup et al., 2013; Cavanaugh et al., 2009). Such data will also provide in-depth insights into how perceived CoI and SRL elements may correlate with each other.

A third limitation regards the random-sampling method used in this study.
This approach may run the risk of not being inclusive or representative enough of a specific population of students (Creswell, 2005). The sample for this research consisted primarily of females (69.9%), high-school students (95%), and Caucasians (82.3%). Moreover, the participants cannot be considered "fully" online learners, as they still attended their brick-and-mortar schools on a daily basis. Researchers should therefore be cautious about generalizing from the findings of this study to the experiences of K-12 students with other demographic, cultural, and learning backgrounds. More studies should be conducted to test whether the revised CoI models proposed in this study can be generalized. Last but not least, SRL is a complex construct that entails students' active, goal-oriented regulation of behavior, motivation, and cognitive and metacognitive learning tasks (Pintrich et al., 1993; Zimmerman, 2008); yet learning presence as measured in the present study included only self-efficacy and SRL. Future studies of similar topics should consider also including intrinsic goal orientation and task value within learning presence.

Implications for Practitioners

Based on the findings of the present study, it is clear that online learning should emphasize students' roles as strategic learners (see also McMahon & Oliver, 2001; Shea & Bidjerano, 2010; Wang et al., 2013). Online learning is a complicated process that involves self-efficacy and SRL beyond what is represented in the existing CoI framework, and the present study's more inclusive framework was effective in predicting students' learning outcomes. This has several implications for online educators.

Learning Presence. First, the learning-presence issues identified in this dissertation's revised CoI model could inform course design, online instruction, and offline facilitation.
Online instructors and mentors should structure an online-learning environment that activates students' motivational and SRL processes in order to help them reach better learning outcomes (Cho et al., 2017; Kim et al., 2015; McMahon & Oliver, 2001; Wang et al., 2013). Although online learning encompasses a variety of learning resources and activities, it is often assumed that these resources and activities can be mastered without the guidance of teachers (McMahon & Oliver, 2001). Ley and Young (2001) proposed four principles for embedding SRL into teaching: 1) structuring an effective learning environment, 2) integrating cognitive and metacognitive learning strategies into instructional activities, 3) modeling each student's self-monitoring process through goal-setting and feedback-giving, and 4) allowing students opportunities for self-monitoring. This dissertation's findings tend to support those principles. Its other practical implications include the importance of teachers sharing their thinking with students, to make the latter more aware of their own understandings. Teachers should also design activities such as writing reflective journals, using problem-based or inquiry-based learning, designing authentic learning activities, or using concept mapping to promote SRL (Artino, 2008; McMahon & Oliver, 2001). Online educators also need to develop teaching strategies that enhance students' self-efficacy. It is important to identify students with low self-efficacy and develop action plans to support them (Murphy & Rodriguez-Manzanares, 2009). Artino (2008) recommended that online teachers seek to improve students' self-efficacy in two specific ways. The first is helping students identify and set goals that are challenging yet realistic, thus increasing their willingness to exert effort to achieve them, and the likelihood that they will commit to the goal and sustain their learning effort.
The second is providing students with timely and honest feedback that focuses exclusively on the task itself. This will help learners understand their progress and where there is room for improvement, while developing their self-efficacy as they master the goals they have set. To these ends, teachers should explicitly specify the requirements for each assignment or learning module on the syllabus; set deadlines to facilitate goal completion; communicate regularly about goals and performance; and use online gradebooks to allow self-monitoring.

Mentor Presence. Within the sphere of U.S. K-12 education, the facilitation of learning by mentors is unique to online learning. The physical separation between learners and online teachers makes it necessary for mentors to provide targeted, timely, and individualized support for K-12 online learners (Hawkins et al., 2011). The present study identified a range of mentors' responsibilities, and found that mentor presence was closely related to students' SRL strategies. The findings also revealed a clear classification of mentor presence into just two main types: academic and non-academic. By the same token, this dissertation has confirmed that mentors play multiple roles in supporting students' learning, and are not limited to the original purpose of "monitoring learning" (de la Varre et al., 2011). In the proposed revision to the CoI framework, MP was found to correlate significantly with SRL. These results call further attention to each mentor's responsibility to aid in the development of self-regulation skills (Borup & Drysdale, 2014; Harms et al., 2006), especially as pre-college-aged students may lack sufficient SRL abilities to thrive in online environments, which are typically more autonomous and student-centered than face-to-face schooling (Weiner, 2003; Wicks, 2010). Previous sections have discussed how teachers and mentors can enhance learning presence through a range of different instructional approaches.
Beyond those approaches, it is particularly important for a mentor to act as a "role model" and a "cheerleader" for students (Wortmann et al., 2008, p. 12). From a mentor-as-role-model, students can learn, by observing the mentor's behavior, how to develop the cognitive and metacognitive learning strategies that benefit online learning, as an adjunct to receiving direct support for the development of SRL strategies. The mentor-as-cheerleader, meanwhile, should establish a relationship of trust with students and celebrate their successes, with the aim of keeping them motivated to learn.

Online Learning at School. K-12 online learning does not necessarily take place only in virtual environments. Many K-12 online learners either are required or have the option to complete their learning in a fixed location at school, in most cases in school labs with computers and mentors (de la Varre et al., 2011; Watson et al., 2015). The findings in this study do not indicate that studying at school is necessarily better than studying at home. In fact, the absence of significant differences between the two groups of students on the three types of learning outcomes indicates that learning location did not matter much for the sample in this study. However, the significant differences that did emerge on some of the measured items suggest that completing online learning in school labs may have certain benefits, which leads to a further question: how can we ensure that the same quality of online learning is delivered to those who choose to spend the majority of their online learning time at home? In particular, online-learning programs need to make sure that students possess a strong social presence, receive guidance in developing self-regulated learning strategies, and get help when needed, regardless of their primary online-learning locations.
Another possible solution can be found in Borup and Drysdale's (2014) study of the use of an online facilitator as a monitor and liaison between students and their online courses. Although such online facilitators do not share the same physical space with students, they can still function as on-site facilitators do, making an impact on student learning.

CHAPTER 6

CONCLUSION

The number of K-12 online learners has increased dramatically over the past few years. It is therefore critically important to understand K-12 students' online-learning processes and the factors that predict their online-learning outcomes. A theoretical model that captures the students situated in online learning, reflects the interactions among all parties, and examines the influence of those interconnections on students' online-learning experiences is thus highly desirable (de la Varre et al., 2011). Based on the CoI framework and SRL theory, this study is among the first to shed light on the relationships among existing CoI variables, self-efficacy, SRL, and learning outcomes in the K-12 online-learning context. To help understand the influence of K-12 mentoring – which is unique to online learning in the U.S. – mentor presence was also included as a factor in this dissertation's modeling. In addition to examining the relationships among this expanded range of factors, this study also investigated whether a student's primary online-learning location mattered in online learning. This study's four main findings can be summed up as follows. First, in contrast to Shea and Bidjerano's (2010) widely used model, it found a significant relationship between teaching presence and the use of self-regulated learning strategies, and a non-significant relationship between teaching presence and self-efficacy. Second, its models confirmed mentors' practices in four domains and found that they significantly predicted students' online SRL strategies.
Third, the models of satisfaction, perceived progress, and final grade provided strong support for the hypothesized relationships among these learning outcomes, CoI, and SRL, supporting previous calls for a natural consideration and integration of learning presence into the CoI framework – though it should be noted that minor differences exist between the final-grade model and the other two models. And fourth, this study's comparison of learning-at-home and learning-at-school students suggested that, in general, there were no significant differences between these two groups on most of the items in TP, SP, MP, CP, online SRL strategies, and learning outcomes. However, those whose primary online-learning location was at school reported significantly weaker feelings of isolation, greater ability to generate curiosity once online learning starts, higher perceptions of the mentor's practice as a problem solver, and greater use of goal-setting and help-seeking strategies. In short, this dissertation has established a theoretical framework for K-12 online learning that enables us to identify important components of such learning while striking a fine balance between extensiveness and parsimony. The findings indicate that SRL serves important functions in connecting CoI presences and learning outcomes. It is hoped that the processes identified in this study will be useful and relevant to K-12 online-learning institutions and educators seeking to improve online learning via a wide range of approaches. Given the important role of SRL in online learning, the results of the modeled relationships also warrant further academic investigation in other K-12 online-learning contexts.

APPENDICES

APPENDIX A: Parental Consent Form

We are inviting your child to participate in a research study designed to help us understand how to support students in online courses.
If you agree to allow your child to participate in this study, we will ask your child to complete a questionnaire designed to better understand their online learning. There are no correct or incorrect answers, and your child's responses here will in no way impact their standing in their online course. The researchers are interested only in their online-learning process. Their participation is voluntary, and they may decline to answer the questionnaire or may skip any items that they feel uncomfortable answering. Your child can withdraw from the study at any time without penalty. Your child's final course grade will also be collected. All data, including grades and responses, are confidential, and their privacy will be protected to the maximum extent allowable by law. They will be given a unique identifier, and following the completion of the online course, all documents will contain only this unique identifier. There are no direct benefits to participating in this study, although we hope that they will gain more insight into their learning through participation. Upon completing the survey, your child will be given the opportunity to be entered in a drawing for a $20 Amazon gift card for your child's time in this study (there will be twenty gift cards in total). If you have any concerns or questions about this research study, such as scientific issues or how to do any part of it, or to report an injury, please contact the following investigators:

• Dr. Chin-Hsi Lin, responsible project investigator, 513A Erickson Hall, Michigan State University, East Lansing, MI 48824. chinhsi@msu.edu, (517) 353-5400.
• Yining Zhang, secondary investigator, Erickson Hall, Michigan State University, East Lansing, MI 48824. zhangy58@msu.edu, (517) 580-9736.
If you have any questions or concerns about your role and rights as a research participant, would like to obtain information or offer input, or would like to register a complaint about this research study, you may contact, anonymously if you wish, Michigan State University's Human Research Protection Program at 517-355-2180, fax 517-432-4503, e-mail irb@msu.edu, or regular mail at 408 W. Circle Dr., Rm. 207 Olds, East Lansing, MI 48824. By entering your initials and clicking the "yes" button, you indicate your consent to have your child participate.

APPENDIX B: Student Assent Form

We are inviting you to participate in a research study designed to help us understand how to support students in online courses. If you agree to participate in this study, we will ask you to complete a questionnaire designed to better understand your online learning. There are no correct or incorrect answers, and your responses here will in no way impact your standing in your online course. Your participation is voluntary, and you may decline to answer the questionnaire or may skip any items that you feel uncomfortable answering. You can withdraw from the study at any time without penalty. Your final course grade will also be collected. All data, including grades and responses, are confidential, and your privacy will be protected to the maximum extent allowable by law. You will be given a unique identifier, and following the completion of your online course, all documents will contain only this unique identifier. There are no direct benefits to participating in this study, although we hope that you will gain more insight into your learning through participation. Upon completing the survey, you will be given the opportunity to be entered in a drawing for a $20 Amazon gift card for your time in this study (there will be twenty gift cards in total).
If you have any concerns or questions about this research study, such as scientific issues or how to do any part of it, or to report an injury, please contact the following investigators:

• Dr. Chin-Hsi Lin, responsible project investigator, 513A Erickson Hall, Michigan State University, East Lansing, MI 48824. chinhsi@msu.edu, (517) 353-5400.
• Yining Zhang, secondary investigator, Erickson Hall, Michigan State University, East Lansing, MI 48824. zhangy58@msu.edu, (517) 580-9736.

If you have any questions or concerns about your role and rights as a research participant, would like to obtain information or offer input, or would like to register a complaint about this research study, you may contact, anonymously if you wish, Michigan State University's Human Research Protection Program at 517-355-2180, fax 517-432-4503, e-mail irb@msu.edu, or regular mail at 408 W. Circle Dr., Rm. 207 Olds, East Lansing, MI 48824. By entering your initials, you indicate your willingness to participate.

APPENDIX C: Survey

Demographic
1. What is your gender?
   a. Female   b. Male
2. What is your grade level?
   a. 5th grade   b. 6th grade   c. 7th grade   d. 8th grade   e. 9th grade   f. 10th grade   g. 11th grade   h. 12th grade
3. What is your ethnicity?
   a. White   b. African American   c. Hispanic   d. Asian   e. Other
4. What is the subject of the online course you are taking with MVU? (If you are taking multiple courses this semester, choose the one that best represents your experience in this school.)
   a. English   b. Foreign languages   c. Math   d. Science   e. Social science   f. Other subjects
5. What is the name of this online course?
6. How would you rate your prior knowledge of this subject?
   a. Very poor   b. Poor   c. Fair   d. Good   e. Very good
7. Are you taking this course for credit recovery?
8. How many online courses have you taken before?
   a. 1   b. 2   c. 3 or more
9. Where did you complete most of your online coursework?
   a. School   b. Home   c. Other

Teaching presence
Design and organization
1.
The instructor clearly communicated important course topics.
2. The instructor clearly communicated important course goals.
3. The instructor provided clear instructions on how to participate in course learning activities.
4. The instructor clearly communicated important due dates/time frames for learning activities.

Facilitation
5. The instructor was helpful in guiding the class towards understanding course topics in a way that helped me clarify my thinking.
6. The instructor helped to keep course participants engaged and participating in productive dialogues.
7. The instructor helped keep the course participants on task.
8. The instructor encouraged course participants to explore new concepts in this course.
9. Instructor actions reinforced the development of a sense of community among course participants.

Direct instruction
10. My instructor provided useful illustrations that helped make the course content more understandable to me.
11. My instructor presented helpful examples that allowed me to better understand the content of the course.
12. My instructor provided clarifying explanations or other feedback that allowed me to better understand the content of the course.

Social presence
Affective expression
1. I have a sense of belonging in the course.
2. I never feel isolated in this online course.
3. Online or web-based communication is an excellent medium for social interaction.

Open communication
4. I felt comfortable conversing through the online medium.
5. I felt comfortable participating in the course activities.
6. I felt comfortable interacting with others in this course.

Cognitive presence
Triggering event
1. Problems posed increased my interest in course issues.
2. Course activities raised my curiosity.
3. I felt motivated to explore content-related questions.

Exploration
4. I utilized a variety of sources to explore problems.
5. Brainstorming and finding relevant information helped me resolve questions.
6.
The course was valuable in helping me appreciate different perspectives.

Integration
7. Learning activities helped me construct solutions.
8. Learning activities helped me construct explanations.
9. Reflection on course content helped me understand fundamental concepts in this class.

Resolution
10. I can describe ways to apply the knowledge created in this course.
11. I have developed solutions to course problems that can be applied in practice.
12. I can apply the knowledge in this course to other, non-class-related activities.

Self-efficacy
1. I believe I will receive an excellent grade in this class.
2. I'm confident I can do an excellent job in this course.
3. I expect to do well in this class.
4. I'm certain I can master the skills being taught in this class.

Self-regulated learning strategies
Goal setting
1. I set standards for my assignments in online courses.
2. I set short-term (daily or weekly) goals as well as long-term goals (monthly or for the semester).
3. I keep a high standard for my learning in my online courses.
4. I set goals to help me manage my studying time for my online courses.

Strategic learning
1. I take thorough notes for my online courses.
2. I work extra problems or do additional readings in my online courses beyond the assigned ones to master the course content.
3. I try to schedule time every day or every week to study for my online courses.
4. I reflect on my learning in online courses to examine it.

Help seeking
5. I find someone who is knowledgeable in the course content so that I can consult with him/her when I need help.
6. If needed, I try to ask my online teacher/mentors about questions that I don't understand.
7. I interact with my online teacher/mentors to help me understand how I am doing in my online classes.

Mentor presence
1. The mentor helped me get used to the online learning environment.
2. The mentor helped me become familiar with the course platform.
3. The mentor helped me when I met technical problems.
4.
The mentor gave me suggestions on solving technical problems.
5. The mentor expressed appreciation for my contributions.
6. The mentor fostered a sense of learning community.
7. The mentor was friendly when interacting with me or other students.
8. The mentor encouraged students when needed.
9. The mentor was interested in students as individuals.
10. The mentor kept track of my progress.
11. The mentor checked in with me constantly.
12. The mentor reminded me of assignment due dates.
13. The mentor kept me focused on learning.
14. The mentor addressed my misunderstandings about content learning.
15. The mentor helped me with content learning.
16. The mentor answered my questions about content.

Satisfaction
1. Overall, I am satisfied with this class.
2. I would recommend this online course to other students.
3. I would take an online course like this again in the future.

Perceived progress
1. I have met most of the requirements teachers made in the class.
2. I understand most of the learning content in my class.
3. I learned to identify the central issues of the course.
4. I developed the ability to communicate clearly about the subject.

REFERENCES

Akyol, Z., & Garrison, D. R. (2008). The development of a community of inquiry over time in an online course: Understanding the progression and integration of social, cognitive and teaching presence. Journal of Asynchronous Learning Networks, 12, 3-22. doi:10.24059/olj.v12i3.66

Akyol, Z., & Garrison, D. R. (2011a). Understanding cognitive presence in an online and blended community of inquiry: Assessing outcomes and processes for deep approaches to learning. British Journal of Educational Technology, 42, 233–250. doi:10.1111/j.1467-8535.2009.01029.x

Akyol, Z., & Garrison, D. R. (2011b). Assessing metacognition in an online community of inquiry. The Internet and Higher Education, 14(3), 183–190.
doi:10.1016/j.iheduc.2011.01.005

Akyol, Z., & Garrison, D. R. (2014). The development of a community of inquiry over time in an online course: Understanding the progression and integration of social, cognitive and teaching presence. Journal of Asynchronous Learning Networks, 12, 3-22.

Anderson, T., Liam, R., Garrison, D. R., & Archer, W. (2001). Assessing teacher presence in a computer conferencing context. Journal of the Asynchronous Learning Network, 5(2). Retrieved from http://auspace.athabascau.ca/handle/2149/725

Arbaugh, J. B. (2013). Does academic discipline moderate CoI-course outcomes relationships in online MBA courses? The Internet and Higher Education, 17, 16-28. doi:10.1016/j.iheduc.2012.10.002

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. The Internet and Higher Education, 11(3–4), 133–136. doi:10.1016/j.iheduc.2008.06.003

Archambault, L., Diamond, D., Brown, R., Cavanaugh, C., Coffey, M., Foures-Aalbu, D., & Zygouris-Coe, V. (2010). Research committee issues brief: An exploration of at-risk learners and online education. International Association for K-12 Online Learning. Retrieved from http://eric.ed.gov/?id=ED509620

Aronson, J. Z., & Timms, M. J. (2003). Net choices, net gains: Supplementing high school curriculum with online courses. San Francisco: WestEd. Retrieved from https://works.bepress.com/michael_timms/9/download/

Artino, A. (2008). Practical guidelines for online instructors. TechTrends, 52, 37-45. doi:10.1007/s11528-008-0153-x

Bandura, A. (1997). Self-efficacy: The exercise of control. New York: Freeman.

Barbour, M. K. (2012). Models and resources for online teacher preparation and mentoring. In K. M. Kennedy & L. Archambault (Eds.), Lessons learned in teacher mentoring: Supporting educators in K-12 online learning environments (pp. 83–102).
Vienna, VA: International Association for K-12 Online Learning.

Barbour, M. K. (2013). The landscape of K-12 online learning: Examining what is known. Handbook of Distance Education, 3, 574–593. doi:10.4324/9780203803738.ch36

Barbour, M. K., & Mulcahy, D. (2004). The role of mediating teachers in Newfoundland's new model of distance education. The Morning Watch, 32(1-2), 1–14. Retrieved from http://www.mun.ca/educ/faculty/mwatch/fall4/barbourmulcahy.htm

Barbour, M. K., & Mulcahy, D. (2009). Beyond volunteerism and good will: Examining the commitment of school-based teachers to distance education. Retrieved from http://digitalcommons.sacredheart.edu/ced_fac/177/

Barbour, M. K., & Reeves, T. C. (2009). The reality of virtual schools: A review of the literature. Computers & Education, 52, 402–416. doi:10.1016/j.compedu.2008.09.009

Barnard, L., Lan, W. Y., To, Y. M., Paton, V. O., & Lai, S.-L. (2009). Measuring self-regulation in online and blended learning environments. The Internet and Higher Education, 12, 1–6. doi:10.1016/j.iheduc.2008.10.005

Barnard, L., Paton, V., & Lan, W. (2008). Online self-regulatory learning behaviors as a mediator in the relationship between online course perceptions with achievement. The International Review of Research in Open and Distance Learning, 9. doi:10.19173/irrodl.v9i2.516

Barnard-Brak, L., Lan, W. Y., & Paton, V. O. (2010). Profiles in self-regulated learning in the online learning environment. The International Review of Research in Open and Distance Learning, 11, 61–78. doi:10.4018/978-1-61692-901-5.ch002

Bartlett, M. S. (1950). Tests of significance in factor analysis. British Journal of Mathematical and Statistical Psychology, 3(2), 77-85. doi:10.1111/j.2044-8317.1950.tb00285.x

Beldarrain, Y. (2008). Engaging the 21st century learner: An exploratory study of the relationship between interaction and achievement in the virtual high school (Doctoral dissertation). Capella University, Minnesota.

Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C.
A., Tamim, R. M., Surkes, M. A., & Bethel, E. C. (2009). A meta-analysis of three types of interaction treatments in distance education. Review of Educational Research, 79, 1243–1289. doi:10.3102/0034654309333844

Black, E. W., Ferdig, R. E., & DiPietro, M. (2008). An overview of evaluative instrumentation for virtual high schools. American Journal of Distance Education, 22, 24–45. doi:10.1080/08923640701713422

Bolliger, D. U., & Martindale, T. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-learning, 3(1), 61-68. Retrieved from https://www.learntechlib.org/p/2226

Borup, J., & Drysdale, J. S. (2014). On-site and online facilitators: Current and future direction for research. In R. Ferdig & K. Kennedy (Eds.), Handbook of research on K-12 online and blended learning (pp. 325–346). ETC Press. Retrieved from http://press.etc.cmu.edu/files/Handbook-Blended-Learning_Ferdig-Kennedy-etal_web.pdf

Borup, J., Graham, C. R., & Davies, R. S. (2013). The nature of adolescent learner interaction in a virtual high school setting. Journal of Computer Assisted Learning, 29, 153–167. doi:10.1111/j.1365-2729.2012.00479.x

Campbell, D. T., & Fiske, D. W. (1959). Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin, 56, 81-105. doi:10.1037/h0046016

Cavanaugh, C., Gillan, K. J., Kromrey, J., Hess, M., & Blomeyer, R. (2004). The effects of distance education on K-12 student outcomes: A meta-analysis. Naperville, IL: Learning Point Associates. Retrieved from http://eric.ed.gov/?id=ED489533

Cavanaugh, C. S., Barbour, M. K., & Clark, T. (2009). Research and practice in K-12 online learning: A review of open access literature. The International Review of Research in Open and Distributed Learning, 10(1). doi:10.19173/irrodl.v10i1.607

Celentin, P. (2007). Online training: Analysis of interaction and knowledge-building patterns among foreign language teachers.
Journal of Distance Education, 21(3), 39–58. Retrieved from http://ijede.ca/index.php/jde/article/view/29/35

Charania, A. K. (2010). Preparing future teachers for virtual schooling: Assessing their preconceptions and competence (Doctoral dissertation). Iowa State University. Retrieved from http://lib.dr.iastate.edu/etd/11447

Cho, M. H., & Shen, D. (2013). Self-regulation in online learning. Distance Education, 34, 290–301. doi:10.1080/01587919.2013.835770

Cho, M. H., Kim, Y., & Choi, D. (2017). The effect of self-regulated learning on college students' perceptions of community of inquiry and affective outcomes in online learning. The Internet and Higher Education, 34, 10-17. doi:10.1016/j.iheduc.2017.04.001

Cobb, S. C. (2011). Social presence, satisfaction, and perceived learning of RN-to-BSN students in web-based nursing courses. Nursing Education Perspectives, 32, 115-119. doi:10.5480/1536-5026-32.2.115

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.

Corry, M., & Stella, J. (2012). Developing a framework for research in online K-12 distance education. The Quarterly Review of Distance Education, 13(3), 133–151.

Costello, A. B., & Osborne, J. W. (2005). Exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research, and Evaluation, 10(7), 1-9.

Creswell, J. W. (2005). Educational research: Planning, conducting, and evaluating quantitative and qualitative research. Upper Saddle River, NJ: Merrill.

Crippen, K. J., & Earl, B. L. (2007). The impact of Web-based worked examples and self-explanation on performance, problem solving, and self-efficacy. Computers & Education, 49, 809-821. doi:10.1016/j.compedu.2005.11.018

Cudeck, R., & Browne, M. W. (1983).
Cross-validation of covariance structures. Multivariate Behavioral Research, 18, 147-167. doi:10.1207/s15327906mbr1802_2

de la Varre, C., Irvin, M., Jordan, A., Hannum, W., & Farmer, T. (2014). Reasons for student dropout in an online course in a rural K-12 setting. Distance Education, 35, 324-344. doi:10.1080/01587919.2015.955259

de la Varre, C., Keane, J., & Irvin, M. J. (2011). Dual perspectives on the contribution of on-site facilitators to teaching presence in a blended learning environment. Journal of Distance Education (Online), 25(3). Retrieved from http://www.ijede.ca/index.php/jde/article/view/751/1285

De Laat, M., Lally, V., Lipponen, L., & Simons, R.-J. (2007). Online teaching in networked learning communities: A multi-method approach to studying the role of the teacher. Instructional Science, 35(3), 257–286. doi:10.1007/s11251-006-9007-0

DiPietro, M., Ferdig, R. E., Black, E. W., & Preston, M. (2008). Best practices in teaching K-12 online: Lessons learned from Michigan Virtual School teachers. Journal of Interactive Online Learning, 7(1), 10–35. Retrieved from http://www.ncolr.org/jiol/issues/pdf/7.1.2.pdf

Drysdale, J. S. (2013). Online facilitators and sense of community in K-12 online learning (Unpublished doctoral dissertation). Brigham Young University, Provo.

Drysdale, J. S., Graham, C. R., & Borup, J. (2014). An online high school "shepherding" program: Teacher roles and experiences mentoring online students. Journal of Technology and Teacher Education, 22, 9-32.

Ebner, R. J., & Ehri, L. C. (2016). Teaching students how to self-regulate their online vocabulary learning by using a structured think-to-yourself procedure. Journal of College Reading and Learning, 46, 62-73. doi:10.1080/10790195.2015.1075448

Elliott, K. M., & Shin, D. (2002). Student satisfaction: An alternative approach to assessing this important concept. Journal of Higher Education Policy and Management, 24, 197-209. doi:10.1080/1360080022000013518

Eom, S. B., Wen, H.
J., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education, 4, 215–235. doi:10.1111/j.1540-4609.2006.00114.x

Ferdig, R. E. (2010). Understanding the role and applicability of K-12 online learning to support student dropout recovery efforts. Lansing, MI: Michigan Virtual University. Retrieved from http://www.mivu.org/Portals/0/RPT_RetentionFinal.pdf

Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18, 39–50. doi:10.2307/3151312

Freidhoff, J., Borup, J., Stimson, R., & DeBruler, K. (2015). Documenting and sharing the work of successful on-site mentors. Journal of Online Learning Research, 1(1), 107–128. Retrieved from https://www.learntechlib.org/p/149918

Garrison, D. R. (2007). Online community of inquiry review: Social, cognitive, and teaching presence issues. Journal of Asynchronous Learning Networks, 11(1), 61–72. Retrieved from https://eric.ed.gov/?id=EJ842688

Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15, 7–23. doi:10.1080/08923640109527071

Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the community of inquiry framework: A retrospective. The Internet and Higher Education, 13, 5–9. doi:10.1016/j.iheduc.2009.10.003

Garrison, D. R., Cleveland-Innes, M., & Fung, T. S. (2010). Exploring causal relationships among teaching, cognitive and social presence: Student perceptions of the community of inquiry framework. The Internet and Higher Education, 13, 31–36. doi:10.1016/j.iheduc.2009.10.002

Gefen, D., Straub, D. W., & Boudreau, M. (2000). Structural equation modeling and regression: Guidelines for research practice.
Communications of the Association for Information Systems, 4(7), 1–78.
Retrieved from http://cits.tamiu.edu/kock/NedWebArticles/Gefenetal2000.pdf

Graham, S., & Harris, K. R. (2000). The role of self-regulation and transcription skills in writing and writing development. Educational Psychologist, 35, 3–12. doi:10.1207/s15326985ep3501_2

Gunawardena, C., Hermans, M. B., Sanchez, D., Richmond, C., Bohley, M., & Tuttle, R. (2009). A theoretical framework for building online communities of practice with social networking tools. Educational Media International, 46, 3–16. doi:10.1080/09523980802588626

Hair, J. F., Jr., Black, W. C., Babin, B. J., Anderson, R. E., & Tatham, R. L. (2006). Multivariate data analysis (6th ed.). Upper Saddle River, NJ: Pearson Prentice Hall.

Hannum, W. H., Irvin, M. J., Lei, P., & Farmer, T. W. (2008). Effectiveness of using learner-centered principles on student retention in distance education courses in rural schools. Distance Education, 29, 211–229. doi:10.1080/01587910802395763

Harms, C. M., Niederhauser, D. S., Davis, N. E., Roblyer, M. D., & Gilbert, S. B. (2006). Educating educators for virtual schooling: Communicating roles and responsibilities. The Electronic Journal of Communication, 16(1 & 2). Retrieved from http://public.vrac.iastate.edu/~gilbert/papers/JP2007-harms-niederhauser.pdf

Hawkins, A., Barbour, M. K., & Graham, C. R. (2011). Strictly business: Teacher perceptions of interaction in virtual schooling. International Journal of E-Learning & Distance Education, 25(2). Retrieved from http://www.ijede.ca/index.php/jde/article/view/726

Hawkins, A., Graham, C. R., Sudweeks, R., & Barbour, M. K. (2013). Academic performance, course completion rates, and student perception of the quality and frequency of interaction in a virtual high school. Distance Education, 34, 64–83. doi:10.1080/01587919.2013.770430

Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6, 1–55. doi:10.1080/10705519909540118

Kaiser, H. F. (1974). An index of factorial simplicity. Psychometrika, 39, 31–36. doi:10.1007/bf02291575

Kaiser, H. F., & Rice, J. (1974). Little Jiffy, Mark IV. Educational and Psychological Measurement, 34, 111–117. doi:10.1177/001316447403400115

Kang, M., Liw, B. T., Kim, J., & Park, Y. (2014). Learning presence as a predictor of achievement and satisfaction in online learning environments. International Journal on E-Learning, 13, 193–208. Retrieved from https://eric.ed.gov/?id=EJ1035981

Keane, J., de la Varre, C., Irvin, M. J., & Hannum, W. (2008). Learner-centered social support: Enhancing online distance education for underserved rural high school students in the United States. ALT-C 2008 Research Proceedings, 39–48.
Retrieved from http://repository.alt.ac.uk/435/

Kennedy, K., & Cavanaugh, C. (2010). Development and support of online teachers: The roles of mentors in virtual schools. Journal of Technology Integration in the Classroom, 2(3), 37–42. Retrieved from http://connection.ebscohost.com/c/articles/55301570/development-support-online-teachers-roles-mentors-virtual-schools

Kim, C., Park, S. W., Cozart, J., & Lee, H. (2015). From motivation to engagement: The role of effort regulation of virtual high school students in mathematics courses. Educational Technology & Society, 18(4), 261–272. Retrieved from http://www.ifets.info/journals/18_4/20.pdf

Kline, R. B. (2011). Principles and practice of structural equation modeling. New York, NY: Guilford.

Kuo, Y.-C., Walker, A. E., Schroder, K. E. E., & Belland, B. R. (2014). Interaction, Internet self-efficacy, and self-regulated learning as predictors of student satisfaction in online education courses. The Internet and Higher Education, 20, 35–50. doi:10.1016/j.iheduc.2013.10.001

Ley, K., & Young, D. B. (2001). Instructional principles for self-regulation. Educational Technology Research and Development, 49, 93–103. doi:10.1007/bf02504930

Lin, C.-H., Zhang, Y., & Zheng, B. (2017). The role of learning strategies in online language learning: A structural equation modeling analysis. Computers & Education, 113, 75–85. doi:10.1016/j.compedu.2017.05.014

Lin, C.-H., & Zheng, B. (2015). Teaching practices and teacher perceptions in online world language courses. Journal of Online Learning Research, 1(3), 275–303. Retrieved from https://www.learntechlib.org/p/171055

Lin, C.-H., Zheng, B., & Zhang, Y. (2016). Interactions and learning outcomes in online language courses. British Journal of Educational Technology, 48, 730–748. doi:10.1111/bjet.12457

Linnenbrink, E. A., & Pintrich, P. R. (2003). The role of self-efficacy beliefs in student engagement and learning in the classroom. Reading & Writing Quarterly, 19(2), 119–137.
doi:10.1080/10573560308223

Liu, F., & Cavanaugh, C. (2012). Factors influencing student academic performance in online high school algebra. Open Learning: The Journal of Open, Distance and E-Learning, 27, 149–167. doi:10.1080/02680513.2012.678613

Lynch, R., & Dembo, M. (2004). The relationship between self-regulation and online learning in a blended learning context. The International Review of Research in Open and Distributed Learning, 5(2), 1–16. doi:10.19173/irrodl.v5i2.189

MacKinnon, D. P., Lockwood, C. M., & Williams, J. (2004). Confidence limits for the indirect effect: Distribution of the product and resampling methods. Multivariate Behavioral Research, 39, 99–128. doi:10.1207/s15327906mbr3901_4

McMahon, M., & Oliver, R. (2001). Promoting self-regulated learning in an online environment. Retrieved from http://ro.ecu.edu.au/cgi/viewcontent.cgi?article=5815&context=ecuworks

Meyer, K. A. (2004). Evaluating online discussions: Four different frames of analysis. Journal of Asynchronous Learning Networks, 8, 101–114. Retrieved from http://onlinelearningconsortium.org/sites/default/files/v8n2_meyer_1.pdf

Michigan Virtual Learning Research Institute. (2014). Mentoring fundamentals: A guide for mentoring online learners, Version 1. Lansing, MI: Michigan Virtual University. Retrieved from https://micourses.org/resources/pdf/toolkit/mentor_guide_14.pdf

Murphy, E., & Rodriguez-Manzanares, M. (2009). Learner-centredness in high-school distance learning: Teachers’ perspectives and research-validated principles. Australasian Journal of Educational Technology, 25, 597–610. doi:10.14742/ajet.1110

Nisbet, D., Wighting, M., & Rockinson-Szapkiw, A. (2013). Measuring sense of community and academic learning in graduate education. The International Journal of Interdisciplinary Educational Studies, 7(1), 1–8.

Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory (3rd ed.). New York, NY: McGraw-Hill, Inc.

O’Dwyer, L. M., Carey, R., & Kleiman, G. (2007).
A study of the effectiveness of the Louisiana Algebra I online course. Journal of Research on Technology in Education, 39, 289–306. doi:10.1080/15391523.2007.10782484

Oliver, K., Osborne, J., & Brady, K. (2009). What are secondary students’ expectations for teachers in virtual school environments? Distance Education, 30, 23–45. doi:10.1080/01587910902845923

Pajares, F. (2012). Motivational role of self-efficacy beliefs in self-regulated learning. In D. H. Schunk & B. J. Zimmerman (Eds.), Motivation and self-regulated learning: Theory, research, and applications (pp. 111–140). New York, NY: Routledge.

Pawan, F., Paulus, T. M., Yalcin, S., & Chang, C.-F. (2003). Online learning: Patterns of engagement and interaction among in-service teachers. Language Learning & Technology, 7, 119–140.

Pettyjohn, T. J. (2012). Stakeholder’s perceptions of supplemental online learning for credit recovery. Retrieved from http://digitalcommons.georgiasouthern.edu/etd/402/

Pintrich, P. R., & De Groot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82, 33–40. doi:10.1037/0022-0663.82.1.33

Pintrich, P. R., Smith, D. A., Garcia, T., & McKeachie, W. J. (1993). Reliability and predictive validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educational and Psychological Measurement, 53, 801–813. doi:10.1177/0013164493053003024

Preacher, K. J., & Hayes, A. F. (2004). SPSS and SAS procedures for estimating indirect effects in simple mediation models. Behavior Research Methods, Instruments, & Computers, 36, 717–731. doi:10.3758/bf03206553

Puzziferro, M. (2008). Online technologies self-efficacy and self-regulated learning as predictors of final grade and satisfaction in college-level online courses. American Journal of Distance Education, 22, 72–89. doi:10.1080/08923640802039024

Rice, K. L. (2006). A comprehensive look at distance education in the K–12 context.
Journal of Research on Technology in Education, 38, 425–448. doi:10.1080/15391523.2006.10782468

Rice, K. (2014). Research and history of policies in K-12 online and blended learning. In R. Ferdig & K. Kennedy (Eds.), Handbook of research on K-12 online and blended learning (pp. 51–82). Pittsburgh, PA: ETC Press.

Rice, K., Huerta, L., Shafer, S. R., Barbour, M. K., Miron, G., Gulosino, C., & Horvitz, B. (2014). Virtual schools in the U.S. 2014: Politics, performance, policy, and research evidence. Retrieved from http://nepc.colorado.edu/publication/virtual-schools-annual-2014

Richardson, J. C., Maeda, Y., Lv, J., & Caskurlu, S. (2017). Social presence in relation to students’ satisfaction and learning in the online environment: A meta-analysis. Computers in Human Behavior, 71, 402–417. doi:10.1016/j.chb.2017.02.001

Roblyer, M. D., Davis, L., Mills, S. C., Marshall, J., & Pape, L. (2008). Toward practical procedures for predicting and promoting success in virtual school students. American Journal of Distance Education, 22, 90–109. doi:10.1080/08923640802039040

Roblyer, M. D., & Marshall, J. C. (2002). Predicting success of virtual high school students: Preliminary results from an educational success prediction instrument. Journal of Research on Technology in Education, 35, 241–255. doi:10.1080/15391523.2002.10782384

Rockinson-Szapkiw, A., Wendt, J., Whighting, M., & Nisbet, D. (2016). The predictive relationship among the community of inquiry framework, perceived learning and online, and graduate students’ course grades in online synchronous and asynchronous courses. The International Review of Research in Open and Distributed Learning, 17(3). doi:10.19173/irrodl.v17i3.2203

Schumacker, R. E., & Lomax, R. G. (2010). Structural equation modeling. New York, NY: Palgrave Macmillan.

Schunk, D. H. (2005). Self-regulated learning: The educational legacy of Paul R. Pintrich. Educational Psychologist, 40, 85–94. doi:10.1207/s15326985ep4002_3

Schunk, D.
H., & Zimmerman, B. J. (2007). Influencing children’s self-efficacy and self-regulation of reading and writing through modeling. Reading & Writing Quarterly, 23, 7–25. doi:10.1080/10573560600837578

Schunk, D. H., & Zimmerman, B. J. (2012). Motivation and self-regulated learning: Theory, research, and applications. New York, NY: Routledge.

Shea, P., & Bidjerano, T. (2008). Measures of quality in online education: An investigation of the community of inquiry model and the net generation. Journal of Educational Computing Research, 39, 339–361. doi:10.2190/EC.39.4.b

Shea, P., & Bidjerano, T. (2010). Learning presence: Towards a theory of self-efficacy, self-regulation, and the development of a communities of inquiry in online and blended learning environments. Computers & Education, 55, 1721–1731. doi:10.1016/j.compedu.2010.07.017

Shea, P., & Bidjerano, T. (2012). Learning presence as a moderator in the community of inquiry model. Computers & Education, 59, 316–326. doi:10.1016/j.compedu.2012.01.011

Simpson, O. (2004). The impact on retention of interventions to support distance learning students. Open Learning, 19(1), 79–95. doi:10.1080/0268051042000177863

Smith, R., Clark, T., & Blomeyer, B. (2005). A synthesis of new research in K–12 online learning. Naperville, IL: Learning Point Associates.

Staker, H. (2011). The rise of K-12 blended learning: Profiles of emerging models. Innosight Institute. Retrieved from http://eric.ed.gov/?id=ED535181

Swan, K. (2002). Building learning communities in online courses: The importance of interaction. Education, Communication & Information, 2(1), 23–49. doi:10.1080/1463631022000005016

Taylor, S., Clements, P., Heppen, J., Rickles, J., Sorensen, N., Walters, K., & Michelman, V. (2016). Getting back on track: The role of in-person instructional support for students taking online credit recovery. Washington, DC: American Institutes for Research.
Retrieved from http://www.air.org/system/files/downloads/report/In-Person-Support-Credit-Recovery.pdf

Tsai, C.-W., Shen, P.-D., & Fan, Y.-T. (2013). Research trends in self-regulated learning research in online learning environments: A review of studies published in selected journals from 2003 to 2012: Colloquium. British Journal of Educational Technology, 44, E107–E110. doi:10.1111/bjet.12017

U.S. Department of Education. (2007). Connecting students to advanced courses online. Washington, DC. Retrieved September 16, 2016, from http://www.ed.gov/admins/lead/academic/advanced/index.html

Vaughan, N., & Garrison, D. R. (2005). Creating cognitive presence in a blended faculty development community. The Internet and Higher Education, 8, 1–12. doi:10.1016/j.iheduc.2004.11.001

Wang, Y., Peng, H., Huang, R., Hou, Y., & Wang, J. (2008). Characteristics of distance learners: Research on relationships of learning motivation, learning strategy, self-efficacy, attribution and learning results. Open Learning, 23(1), 17–28. doi:10.1080/02680510701815277

Wang, C.-H., Shannon, D. M., & Ross, M. E. (2013). Students’ characteristics, self-regulated learning, technology self-efficacy, and course outcomes in online learning. Distance Education, 34, 302–323. doi:10.1080/01587919.2013.835779

Wang, J., & Wang, X. (2012). Structural equation modeling: Applications using Mplus. West Sussex, UK: Higher Education Press.

Watson, J., & Ryan, J. (2007). A national primer on K-12 online learning. Washington, DC: North American Council on Online Learning.

Watson, J., Murin, A., Vashaw, L., Gemin, B., & Rapp, C. (2011). Keeping pace with K-12 online learning: An annual review of policy and practice, 2011. Evergreen Education Group. Retrieved from http://eric.ed.gov/?id=ED535912

Watson, J., Pape, L., Murin, A., Gemin, B., & Vashaw, L. (2014). Keeping pace with K-12 digital learning: An annual review of policy and practice.
Retrieved from https://files.eric.ed.gov/fulltext/ED558147.pdf

Watson, J., Pape, L., Murin, A., Gemin, B., & Vashaw, L. (2015). Keeping pace with K-12 digital learning: An annual review of policy and practice. Retrieved from http://www.kpk12.com/wp-content/uploads/Evergreen_KeepingPace_2015.pdf

Weiner, C. (2003). Key ingredients to online learning: Adolescent students study in cyberspace - the nature of the study. International Journal on E-Learning, 2(3), 44–50. Retrieved from https://www.learntechlib.org/p/14497

Wicks, M. (2010). A national primer on K-12 online learning (Version 2). International Association for K-12 Online Learning. Retrieved from http://eric.ed.gov/?id=ED514892

Wighting, M. J., Nisbet, D., & Rockinson-Szapkiw, A. J. (2013). Measuring sense of community and academic learning in graduate education. The International Journal of Interdisciplinary Educational Studies, 7(1), 1–8. doi:10.18848/2327-011x/cgp/v07i01/53294

Wortmann, K., Cavanaugh, C., Kennedy, K., Beldarrain, Y., Letourneau, T., & Zygouris-Coe, V. (2008). Online teacher support programs: Mentoring and coaching models. North American Council for Online Learning.

Zhan, Z., & Mei, H. (2013). Academic self-concept and social presence in face-to-face and online learning: Perceptions and effects on students’ learning achievement and satisfaction across environments. Computers & Education, 69, 131–138. doi:10.1016/j.compedu.2013.07.002

Zimmerman, B. J. (1986). Becoming a self-regulated learner: Which are the key subprocesses? Contemporary Educational Psychology, 11, 307–313. doi:10.1016/0361-476X(86)90027-5

Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory Into Practice, 41(2), 64–70. doi:10.1207/s15430421tip4102_2

Zimmerman, B. J. (2008). Investigating self-regulation and motivation: Historical background, methodological developments, and future prospects. American Educational Research Journal, 45, 166–183. doi:10.3102/0002831207312909

Zimmerman, B.
J., & Bandura, A. (1994). Impact of self-regulatory influences on writing course attainment. American Educational Research Journal, 31, 845–862. doi:10.2307/1163397

Zimmerman, B. J., Bonner, S., & Kovach, R. (1996). Developing self-regulated learners: Beyond achievement to self-efficacy. Washington, DC: American Psychological Association.

Zimmerman, B. J., & Schunk, D. H. (2012). Motivation: An essential dimension of self-regulated learning. In D. H. Schunk & B. J. Zimmerman (Eds.), Motivation and self-regulated learning: Theory, research, and applications (pp. 1–30). New York, NY: Routledge.