JUST TEXT ME: A SELF-REGULATED LEARNING INTERVENTION FOR COLLEGE STUDENTS

By

William John Imbriale

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Educational Psychology and Educational Technology - Doctor of Philosophy

2020

ABSTRACT

This study examines the impact of a self-regulated learning (SRL) intervention delivered via text messaging. The intervention was designed to foster SRL and improve academic performance for college students. Treatment participants received SRL resources, reminders, and reflective prompts via text message over the course of a semester in a large college freshman engineering course. Academic and SRL measures were compared between treatment and control groups. Outcome measures included test grades, final course grade, metacognitive self-regulation, study approach, behavioral disaffection, and organization. Treatment participants were expected to show higher levels of academic performance and SRL. Results suggested that the intervention had very limited effects on academic outcomes and no effect on SRL outcomes. The study has implications for technology interventions at institutions hoping to help students develop better SRL behaviors.

Copyright by WILLIAM JOHN IMBRIALE 2020

For Siobhan, Will, Declan, & Baby

ACKNOWLEDGEMENTS

I can trace this journey back to my childhood when thinking about all the people who supported, encouraged, and loved me throughout. First and foremost are my parents, Danielle DePonte and William A. Imbriale, who from an early age instilled the importance of education and always allowed me to pursue my interests, no matter how much they changed from year to year. Mom and Dad - your love, encouragement, and wisdom helped me persist through the peaks and valleys not just of this experience, but throughout my life.
My brother Paul and sister Noel have also been incredibly instrumental and supportive at all stages of my life. Thank you to my grandparents, aunts, uncles, extended family, and in-laws who took an interest in my work and provided me guidance and support.

When I was in graduate school at Teachers College, Dr. Pamela Felder approached me in my first year and planted the seed that I should pursue a doctorate someday. I had not taken the idea seriously until then, and I still remember this brief but important moment. There have been many other faculty, colleagues, and mentors who have encouraged and supported me along the way. These include Dr. Bill Stanwood, Dr. Geoff Brackett, Dr. Christine Shakespeare, Dr. Tim Lynch, Dr. Joe Hoffman, Dr. Mike Alfultis, Professor Michael Rosenfeld, Professor Jere Greland, Captain Catie Hanft, and many others. Thank you for allowing me the time, space, financial support, and mentorship to pursue my academic endeavors while guiding me through my professional career. I would like to especially highlight and thank Professors Amie Carter and Joanne Rydzewski for working with me to make this project a reality.

I am firmly convinced that I was part of one of the best doctoral programs in the country. There are so many people to thank who taught me so much over these past six years. First and foremost is Dr. Chris Greenhow, without whom I would not have made it to the end. Chris - thank you for taking me on and supporting me at such a critical time. Your time, commitment, expertise, and enthusiasm throughout the years will not be forgotten. I also owe a debt of gratitude to Dr. Lisa Linnenbrink-Garcia, who struck that perfect balance of challenge and support throughout the dissertation process. Lisa - your expertise and talent are incredible. Thank you as well for guiding me through this project. Thank you to Dr. Aman Yadav and Dr.
Matt Wawryznski for your willingness and insights through this process and for helping me take my work to the next level. There are many other people I've met throughout EPET who made this an incredible experience. Thank you Dr. Kelly Mix for mentoring me and getting me started. Thank you to the EPET faculty, especially Dr. Cary Roseth and Dr. Emily Bouck, who supported and pushed me to new levels. Lastly, I would like to thank the entering EPET hybrid cohort of 2014. This group comprises incredibly kind and talented people whom I consider close friends. I will miss you all very much.

In closing, all of this was ultimately for, and completely supported by, my family. Siobhan - you gave up so much for this, from the late nights I was working, to the lengthy library sessions, to helping me through stressful and uncertain times. I know you don't believe this, but this is just as much yours as it is mine. I am glad we've reached this pinnacle together and that we'll be able to take the time to enjoy the view. To Will and Declan - you boys gave up a lot too, even though you probably don't realize it yet. Thank you for being delightful, caring, fiercely loving young boys. I can't wait to watch you grow up and witness all that you will become.

I know there are many others I have not mentioned here. My deepest thanks and appreciation to all who helped make this possible.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
INTRODUCTION
    Purpose
    Research Questions
        Research Question 1
        Research Question 2
THEORETICAL OVERVIEW
    Self-Regulated Learning
        SRL and Performance Feedback
        SRL Phases and Processes
        SRL Assumptions
        Teaching SRL
EMPIRICAL RESEARCH OVERVIEW
    SRL Interventions
        Self-Reflection in the College Classroom
        The Self-Regulation Empowerment Program
        SRL in Engineering
        Text Message Interventions
        Prompting SRL with Technology
        Combining Training, Coaching, and Text Messaging
        Study Logs and Text Reminders
    Current Study
METHODS
    Participants
    Design
    Procedures
        Intervention Components
            Teaching effective SRL
            Test reminders
            Prompting self-reflection
        Study Steps
            Consent form and intake information
            Data collection
        Intervention Schedule
        Remind Technology
    Measures
        Self-Regulated Learning Measures
            Self-Regulation Strategy Inventory - Self-Report (SRSI-SR)
            Motivated Strategies for Learning Questionnaire (MSLQ)
            SRL and MSLQ limitations
        Academic Performance
        Treatment Fidelity
    Analysis Approach
        Exploratory Factor Analysis
        Primary Analyses
        Additional Analyses
        Power Analysis
RESULTS
    SRL Factors
    Correlation Analysis
    Multiple Regression Analysis
        Assumptions
        Effects on SRL
        Effects on Academic Performance
    Ancillary Analyses: Participant Engagement with SRL Material
DISCUSSION
    Research Questions
        Research Question 1
        Research Question 2
    Comparing Results to Previous Studies
        Text Messaging and Prompting SRL
        Teaching SRL
        Self-Reflection
    Intervention Effectiveness
        Impact on Students with Low GPA
        Considering Content and Timing of Messages
    Implications
        Implications for SRL Theory
        Implications for Empirical Research
        Implications for Practice
    Limitations
        Sample
        Statistical Assumptions and Missing Data
        Treatment Fidelity
        Measurement Limitations
APPENDICES
    APPENDIX A Tables
    APPENDIX B Figures
    APPENDIX C Message, Treatment, and Exam Schedule
    APPENDIX D Self-Reflection Form
    APPENDIX E SRL Measures and Items
REFERENCES

LIST OF TABLES

Table 1 Sample Demographics
Table 2 Background Information
Table 3 Control Group Time-Series Design
Table 4 Factor Loadings of SRL Variables
Table 5 Factor Correlation Matrix
Table 6 Correlation Matrix of Variables
Table 7 Outcome Measure Summary
Table 8 Metacognitive Self-Regulation (MCSR) Models
Table 9 Study Approach (SA) Models
Table 10 Behavioral Disaffection (BD) Models
Table 11 Organization (ORG) Models
Table 12 Treatment vs. Treatment Plus Summary
Table 13 Multivariate and Univariate Analyses of Variance for Outcome Measures
Table 14 Exam Models
Table 15 Final Exam and Transcript Grade Models
Table 16 Treatment Fidelity
Table 17 Reflection Entries
Table 18 Reflection Data Summary

LIST OF FIGURES

Figure 1 A Model of Self-Regulated Learning (Butler & Winne, 1995)
Figure 2 Phases and Sub-processes of Self-Regulation (Zimmerman, 2000)
Figure 3 Treatment and Control Overview
Figure 4 Scree Plot for Factor Analysis
Figure 5 Interaction Effect for Exam 1

INTRODUCTION

College and university students are struggling to complete their degrees on time, and many leave higher education without a degree (Hrabowski, 2014; Mabel, Castleman, & Bettinger, 2017). Today's college students bring increasing needs and greater challenges to their institutions in the areas of student success, retention, and persistence (Hrabowski, 2014). Many higher education institutions have adapted to these challenges, leveraging new technologies to attract and support students in a variety of ways (Christensen, 2011).
Colleges and universities are also investing in programs to increase degree earners in challenging fields such as science, technology, engineering, and mathematics (i.e., STEM) (Hrabowski, 2014). They are also increasing efforts to reduce time to graduation by focusing on closing student achievement gaps (Scott-Clayton, 2011). These efforts can help students stay enrolled and graduate while helping institutions maintain enrollment and avoid potential financial issues (Christensen, 2011). Competition for new students is fierce, and improving retention and graduation rates is a high priority within institutions (Hrabowski, 2014; Macfadyen & Dawson, 2010). Institutions therefore need to consider how they support students throughout their degree programs.

Additional public pressures on institutions to improve enrollment, retention, and graduation rates also exist. For example, the U.S. Department of Education's 'College Scorecard' site requires schools to publish their retention rates, graduation rates, costs, and student career earnings (U.S. Department of Education, 2018). This allows prospective and current students to compare schools by these important benchmarks. Rankings by publications such as U.S. News and World Report, Forbes, and others continue to push institutions to perform at high levels in order to appeal, and be held accountable, to students and families (Meredith, 2004).

Institutions have tested many approaches to improving student success, often with mixed results (Lauricella & Kay, 2013; Mabel et al., 2017; Oreopoulos, Patterson, Petronijevic, & Pope, 2018). Two focus areas for these efforts are STEM majors and students of color (Harackiewicz & Priniski, 2018; Hrabowski, 2014). Colleges and universities have implemented robust support services for students who may be considered 'at-risk' of not completing a degree in a timely manner (Scott-Clayton, 2011).
Support services have ranged from structured degree paths to custom-built online alert systems (Scott-Clayton, 2011; Tampke, 2012). Some institutions emphasize advising and career services programs, despite limitations in serving the entire student population effectively (Scott-Clayton, 2011). Others have implemented technological interventions that failed to produce significant results (Oreopoulos et al., 2018). These trends appear likely to continue as higher education endures ongoing pressure to improve key outcomes while attracting and retaining students, especially those at risk of dropping out. Higher education should consider implementing new research-based programs and initiatives to help shorten student time to graduation.

Technology's role in improving student learning outcomes, performance, and time to graduation is a growing area of research in higher education today. Nearly every college student owns a smartphone (Lauricella & Kay, 2013), and colleges, universities, and researchers have considered how students might use mobile technology for academic benefit. Using a mobile communication platform to communicate with students presents potential benefits, including improved connection to faculty members, peers, and institutional resources (Georgina & Hosford, 2009; Jones, Edward, & Reid, 2009; Lauricella & Kay, 2013). Institutions have built scalable interventions through mobile engagement and technology (Lauricella & Kay, 2013; Tabuenca, Kalz, Drachsler, & Specht, 2015). The benefits of interacting with college students via mobile engagement are still being determined. Some text message support solutions improve GPA, but not necessarily retention (Castleman & Meyer, 2016). Other institutions are experimenting with mobile technology, but some faculty or staff may be reluctant to embrace it (Bull & McCormick, 2012). This study aims to fill some of the knowledge gaps surrounding this approach to student success.
Purpose

Self-regulated learning (SRL) is one theoretical basis that can inform interventions to improve college student academic performance. This paper adds to the literature on the best uses of mobile technology to improve college student SRL. It examines one specific approach that colleges and universities could take to help their students succeed using mobile technology.

The purpose of this study is to test the effects of a mobile communication strategy that combines the successes of previous SRL interventions. This study combines components of previous interventions that produced significant effects on SRL or academic performance and that were adaptable to a mobile environment. Some previous SRL interventions were administered in hard copy (e.g., paper and pen) or delivered in person, and they used valid and reliable measures. These studies are covered in greater detail as part of the literature review.

Robust SRL training programs can help students improve study habits and academic performance (Cleary, Platten, & Nelson, 2008; Cleary & Zimmerman, 2004). Building on previous research, a similar training was delivered via a learning management system (LMS). Part of the intervention for this study used text message communication as a prompting mechanism for this SRL training program. Prior research also suggests that reminders and other text-message prompts can help college students succeed in a variety of areas. In addition, previous studies have shown that structured self-reflection can help students improve academic performance (Zimmerman et al., 2011). The other two parts of the intervention were therefore reminder prompts and self-reflection prompts. Students were prompted via text message to use the SRL strategies they had learned from the training program.
The study helps determine whether a multi-faceted approach can influence SRL and academic performance for college students. The results of this study contribute to SRL theory, empirical research on mobile technology interventions, and practitioner approaches supporting college student SRL.

The first section of this study reviews the theoretical basis of the intervention, focusing on two theories of SRL (Butler & Winne, 1995; Zimmerman, 2002). Empirical research informing the intervention is also reviewed. A new intervention to improve student SRL and academic achievement is then described. Research questions, methods, procedures, results, and implications for future research follow.

Research Questions

There are two central research questions for this study. These questions are outlined below with accompanying hypotheses.

Research Question 1

Do students receiving the intervention show higher levels of SRL? Does the degree to which students participate in the intervention make any difference in its effect?

Students participating in the intervention should utilize SRL at higher levels than those not participating in the intervention. SRL was measured using items from the Self-Regulation Strategy Inventory - Self-Report (SRSI-SR) (Cleary, 2006) and the metacognitive self-regulation scale of the Motivated Strategies for Learning Questionnaire (MSLQ) (Pintrich, Smith, Garcia, & McKeachie, 1991). The measures were administered three times over the course of a traditional college semester (15 weeks), allowing group comparisons at the beginning (week 2), middle (week 7), and end (week 15) of the term.

Research Question 2

Do students show improved academic performance if they participate in the intervention? How are these effects moderated by past academic performance?

Students participating in the intervention were expected to show better performance on exams and in their final course grade.
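The moderation implied by Research Question 2 (treatment effects conditional on past academic performance) is typically tested as a regression with an interaction term. The following is a minimal illustrative sketch using simulated, hypothetical numbers; the variable names and coefficient values are not the study's data, only an example of how a treatment-by-prior-GPA interaction behaves:

```python
import numpy as np

# Hypothetical moderation model:
#   exam = b0 + b1*treatment + b2*prior_gpa + b3*(treatment * prior_gpa)
# A negative interaction coefficient b3 would mean the treatment
# benefits students with lower prior GPA more.

rng = np.random.default_rng(42)
n = 200
treatment = rng.integers(0, 2, size=n).astype(float)  # 0 = control, 1 = treatment
prior_gpa = rng.uniform(2.0, 4.0, size=n)             # hypothetical prior GPA

true_b = np.array([50.0, 20.0, 8.0, -5.0])            # illustrative coefficients
X = np.column_stack([np.ones(n), treatment, prior_gpa, treatment * prior_gpa])
exam = X @ true_b                                     # noiseless, so OLS recovers exactly

b_hat, *_ = np.linalg.lstsq(X, exam, rcond=None)

# Simple slopes: the treatment effect at a given prior GPA is b1 + b3*gpa.
effect_at_gpa_2 = b_hat[1] + b_hat[3] * 2.0   # larger effect for a low-GPA student
effect_at_gpa_4 = b_hat[1] + b_hat[3] * 4.0   # smaller effect for a high-GPA student
```

With a negative interaction term, the simple slope of treatment shrinks as prior GPA rises, which is the pattern the hypothesis below anticipates.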
Students with below-average college GPA were expected to benefit more than those with higher college GPA. Underperforming students may have a greater need for an intervention that could impact academic performance (Frankfort, O'Hara, & Salim, 2015; Paunesku, Walton, Romero, Smith, Yeager, & Dweck, 2015). An interaction effect between treatment and past academic performance was expected.

THEORETICAL OVERVIEW

Self-Regulated Learning

Self-regulation is how individuals plan, manage, and reflect on their actions within an environment (Bandura, 1991). Individual experiences vary within these environments, and those experiences may significantly influence whether the student succeeds or struggles (Usher & Schunk, 2018). While environments are imposed, selected, or created, individuals can influence them through their actions (Bandura, 1986). Student actions are guided by both internal and external factors (Usher & Schunk, 2018). Students can direct their own actions and intentions, while their environments can be structured in ways that foster SRL (Usher & Schunk, 2018). Technology can be considered both a means to achieving goals and an element of the learning environment that can be implemented to help students (Usher & Schunk, 2018). Students can create or adjust their learning environments through technological means such as social media or participating in student support groups in an electronic setting (Usher & Schunk, 2018). This self-control in an academic environment is often difficult to achieve; many students leave too many elements of their learning experience to external controls and forces (Bandura, 1986).

Self-regulation comprises a set of subfunctions that are utilized to determine self-directed actions (Bandura, 1991). In particular, forethought and planning provide individuals with the ability to set goals that guide future behavior (Usher & Schunk, 2018).
Social cognitive theory of self-regulation involves a self-system that includes ways in which individuals perceive and compare themselves with others (Bandura, 1978). Attention can be directed based on success or failure and on personal determinations and conclusions about performance (Bandura, 1991). The combination of goals and feedback plays an important role in self-regulated learning and can increase student performance efforts (Bandura, 1991). Monitoring performance regularly is more effective than intermittent monitoring when executing self-regulation (Bandura, 1991). Self-regulation can be used effectively when learners have a clear understanding of how they are performing (Bandura, 1991). More specifically, self-observation can help performance when evidence of progress is apparent; otherwise, ambiguous actions and self-appraisals are possible (Bandura, 1991). Instructors can help direct students' self-regulatory behaviors and help them learn new strategies that they can directly control (Usher & Schunk, 2018).

Humans employ self-regulation in many domains, such as approaching athletic competition or taking medication on time (Clark & Patel, 2015; Kitsantas, Kavussanu, Corbatto, & Van De Pol, 2018). Self-regulation also occurs in the academic setting. Self-regulated learning (SRL) is the process by which students use self-directed actions to improve their academic skills (Zimmerman, 2002). Academic skills, including SRL, have been shown to have a significant relationship with academic outcomes (Robbins, Lauver, Le, Davis, Langley, & Carlstrom, 2004).

SRL is a multi-faceted and complex contributor to college student academic success or failure. Some SRL strategies are more effective than others depending on the individual student (Butler & Winne, 1995). Certain strategies and actions are also more effective in different learning contexts.
For example, students may need to implement more intensive SRL strategies in challenging courses than in subject matter that comes more naturally (Ben-Eliyahu & Linnenbrink-Garcia, 2015). Students may adjust their SRL actions depending on what assistance they believe they need in relation to their learning environment (Winne, 2018). Students may also determine whether or not the investment in certain SRL activities is worth the perceived benefit (e.g., spending extra time to study a challenging topic) (Winne, 2018). Generally speaking, students need an understanding of how they are performing in their learning environment, information about effective SRL strategies they can use, and an opportunity to practice new SRL strategies (Winne, 2018).

There are numerous theories of SRL that focus on different factors and variables but share some important assumptions. Two theories are discussed here in order to frame the intervention tested in this study.

SRL and Performance Feedback

Several SRL models focus on specific learning tasks and the related feedback that helps or hinders a student in applying their SRL strategies. The Butler and Winne (1995) SRL model is built on performance feedback loops (see Figure 1). In this model, students use their past history, content knowledge, and strategies to set goals for learning tasks. Students implement strategies and tactics that yield a product for the given task (Butler & Winne, 1995). Instructors then review the product and provide feedback (Butler & Winne, 1995). Peer reviewers or students themselves may also review these products. The student's SRL strategies for subsequent tasks are then guided by this feedback (Butler & Winne, 1995). For example, students may reuse the strategies that worked well on a similar previous task. Students may implement new strategies at their own discretion or based on the feedback they received for tasks on which they performed poorly.
Students may not implement effective SRL automatically, especially if a discrepancy exists between their expected and the necessary study approach (Butler & Winne, 1995). College students often lack a sense of how prepared they are for an exam or how they performed (Peverly et al., 2003). Internal monitoring is an important part of the SRL process and may include goal reassessment, strategy adjustment, or reconsideration of past experiences (Butler & Winne, 1995). In other words, students must have an accurate perspective on their SRL actions and how those actions affect performance.

The Butler and Winne (1995) model puts an emphasis on the role of external feedback, in contrast to other models that highlight self-reflection. The use of SRL strategies is mostly determined by the student's cognitive knowledge and beliefs about how to handle specific tasks or challenges (Butler & Winne, 1995). Feedback cues and loops are critically important in determining whether or not the student can adapt learning behaviors to succeed in subsequent activities. For example, a student may miss an opportunity to change their behavior on future tasks if feedback suggests they are weak in mathematics but they believe otherwise. Students should have adequate knowledge of SRL strategies, including when and how to apply them (Butler & Winne, 1995).

There are several instances where feedback can prompt students to learn new SRL strategies. These include faculty teaching (e.g., what study strategies might be most effective for an upcoming exam), tutoring or academic coaching (Wolters & Hoops, 2015), or self-evaluation and reflection (Butler & Winne, 1995). Developing knowledge of SRL strategies is a recursive and lengthy process (Butler & Winne, 1995). It requires active engagement whereby students take the time to review their strategies and utilize feedback cues to continue working effectively or to change their actions.
The Butler and Winne (1995) model is focused on task outcomes, students' use of SRL, and students' use of feedback to adjust their strategies. The instructor plays a very important role in this model: instructors who provide regular, ongoing, and valuable feedback will be more effective in helping students implement effective SRL than those who do not provide such feedback. While students may be able to make assertions about their own performance, expert feedback makes this process easier for learners (Butler & Winne, 1995).

SRL Phases and Processes

Zimmerman's (2002) model is often cited in research on college student SRL (Roth, Ogrin, & Schmitz, 2016). This model takes a broader approach to SRL than does the Butler and Winne (1995) model: student actions are the primary focus, with more emphasis on personal reflection. The model distinguishes between behavioral actions, control of the environment, and covert SRL strategies (Roth et al., 2016). Behaviors in Zimmerman's (2002) model can be adapted to different situational and motivational settings (Roth et al., 2016). Zimmerman (2002) further extended Bandura's (1991) theories on self-regulation into educational environments.

The Zimmerman (2002) model provides a straightforward and overarching framework for analyzing the effects of SRL interventions (see Figure 2). The model lends itself well to studying technology interventions because of its cyclical feedback structure (Kitsantas & Dabbagh, 2011). It features a cyclical procedure with three phases: forethought, performance, and self-reflection (Schunk, 2012). Forethought includes goal orientation and strategic planning. Performance involves self-monitoring and self-control, where learners focus on the learning tasks at hand (e.g., taking an exam). Self-reflection involves the evaluation of the outcome achieved and whether or not the learner is satisfied with the result (Zimmerman, 2002).
Multiple processes occur simultaneously within the forethought, performance, and self-reflection phases (Zimmerman, 2002). The forethought phase consists of goal setting and strategic planning, where students can set specific learning goals for themselves and outline how they plan to achieve them (Zimmerman, 2002). This may include very specific tasks such as studying a vocabulary list or completing practice problems (Zimmerman, 2002). The forethought phase also includes self-motivation, such as the student considering how confident they are and what they expect to achieve (Zimmerman, 2002).

The performance phase of the Zimmerman (2002) model includes self-control and self-observation. Self-control is when the actions determined in the forethought phase are implemented in the learning task itself (Zimmerman, 2002). This may include focusing on a certain aspect of the learning task or spending more time on a more difficult topic (Zimmerman, 2002). Self-observation is the process by which students can attribute their actions to success or failure (Zimmerman, 2002). Students may find that one approach to a learning task or problem is more successful than another as they are performing the task.

The self-reflection phase involves self-judgment or self-evaluation (Zimmerman, 2002). Students might compare their academic performance to classmates or to a standard they are hoping to meet (Zimmerman, 2002). Causal attribution is another component of self-reflection, where students determine (accurately or inaccurately) what caused their success or failure in the learning task (Zimmerman, 2002). This leads to self-satisfaction or affect, which then drives student motivation for the next learning task (i.e., leading back to the forethought phase) (Zimmerman, 2002).

SRL Assumptions

There are some basic assumptions and commonalities about SRL across various models beyond the ones highlighted here.
SRL models generally share the commonality of multiple phases of self-regulation (Butler & Winne, 1995; Corno & Kanfer, 1993; Pintrich & De Groot, 1990; Zimmerman, 2002). These processes typically include a preemptive planning stage, a performance control stage, and a reaction or reflective phase (Butler & Winne, 1995; Pintrich, 2004; Zimmerman, 2002). Within these phases, students take a myriad of cognitive, motivational, behavioral, or contextual actions.

One assumption is that students are active and constructive participants in their own learning (Azevedo, 2005). SRL is an active process in which students must engage in their planning, performance, and reflection in order for effective learning to occur (Zimmerman, 2002). In other words, SRL does not occur automatically; students must consider their actions as they approach academic tasks and challenges.

Another assumption is that students can monitor their SRL actions, cognition, motivations, and behavior (Azevedo, 2005). Students are able to consider the actions that they took on a given task and reflect on whether or not those actions were effective. This self-reflection component of SRL is very important both in Zimmerman's (2002) model and in the intervention tested in this study. Monitoring is particularly important as students adjust their approach to learning tasks (Butler & Winne, 1995). Feedback cues provided by an instructor or other source (e.g., classmates) are critical moments in the learning process where a student can consider alternative ways to use SRL as a means of reaching their goals (Butler & Winne, 1995).

A third assumption is that elements of the student environment may encourage or impede successful use of SRL strategies (Azevedo, 2005). Teachers and faculty may choose to integrate SRL instruction into their courses, which has been shown to be effective in previous studies (e.g., Zimmerman et al., 2011).
In addition, instructors who provide valuable and regular feedback are more likely to help students foster their own SRL and performance than those who do not (Butler & Winne, 1995). Instructors who provide very little feedback or do not spend time working with students on learning strategies may be missing an opportunity to help them succeed.

SRL models also assume that goals and standards for comparison exist to help students determine what their SRL actions should be (e.g., to continue or discontinue an action) (Azevedo, 2005). Students assess their own performance in relation to others in their learning environment, possibly leading to changes in behavior (Bandura, 1991). This is a social cognitive approach to SRL in which students learn how effective their strategies are in relation to the performance of others in a similar situation (Bandura, 1991). For example, if students see themselves performing at a lower or less satisfactory level than their peers, they may change their SRL strategies in hopes of improving their outcomes.

SRL activity is assumed to be a mediator between personal and contextual characteristics (Azevedo, 2005). Students regulate their motivations and actions to achieve specific goals. Student success is not dictated entirely by student SRL actions; rather, SRL is assumed to mediate how a student performs given their background, past academic history, and other challenges (Azevedo, 2005).

Teaching SRL

Students bring preexisting SRL skills and beliefs to the college setting (Bandura, 1991). Students also develop SRL during their college experiences (Winne, 1995). SRL skills can change over time with teaching and guidance (Sitzmann & Ely, 2010). Most learners need assistance with three main components of SRL: gathering valuable information about how they use SRL, access to effective tactics and strategies, and the chance to practice new tactics and SRL strategies (Winne, 2018).
Over time, these new SRL strategies can become automatic and easy for the learner to execute (Winne, 2018). Student SRL skills may not be as effective in college as they were in high school (Peverly, Brobst, Graham, & Shaw, 2003). In particular, students may not be aware of how prepared they are for exams (Peverly et al., 2003). Students are seldom required to evaluate their own work, which may represent a missed opportunity for students to develop their SRL skills (Zimmerman, 2002). These limitations can have a wider impact on whether or not a college student is able to succeed in his or her coursework.

SRL strategies can be improved through committed and effective teaching by integrating SRL instruction into coursework (Schunk & Zimmerman, 1998; Zimmerman, 2002; Zusho & Edwards, 2011). For example, an instructor may implement a self-reflection exercise following an exam or show students how to revisit mistakes and errors to improve their learning (Zusho & Edwards, 2011). Many colleges and universities have instituted first-year seminars or 'learning to learn' courses to help students orient their skills toward the college setting (Wolters & Hoops, 2015; Zusho & Edwards, 2011). Teaching and discussion of SRL strategies are often incorporated into these courses. Effective exercises and components of introductory self-regulatory 'learning to learn' courses may include note taking, test taking, and time management (Wolters & Hoops, 2015; Zusho & Edwards, 2011).

The student is always at the center of SRL learning and activity. Changing and improving SRL practices requires a high level of intentionality on the part of the student (Bandura, 1991). Regardless of intervention or environmental influence, the student is the one who decides what their SRL actions will be (Winne, 2018). Learning and understanding effective SRL is challenging even for successful learners (Winne, 2018).
Reviewing and analyzing feedback as it relates to SRL may require the undoing of strategies that had been effective previously (Winne, 2018). SRL improvement is dependent on intentional student actions combined with effective teaching and proper guidance (Butler & Winne, 1995).

Teaching SRL may encourage positive student behaviors including improved study strategies, time management, goal setting, and learning strategies (Pintrich, 2004; Schunk & Zimmerman, 1998). Teaching and fostering SRL are time-consuming and intensive processes for which many instructors may be unwilling to sacrifice class time (Butler & Winne, 1995). Some researchers have explored the affordances of technology-supported interventions for fostering SRL as an alternative (Azevedo, 2005). Technology interventions used to foster SRL are reviewed in the following section, along with the empirical work on text message interventions.

EMPIRICAL RESEARCH OVERVIEW

Technology has provided opportunities to help students with SRL. Students use less SRL than in the past, often due to high school experiences and parental influence (Zusho & Edwards, 2011). Technological interventions may provide an avenue for institutions to bolster SRL skills. Technology also provides an opportunity for instructors and institutions to provide structure and scaffolding for SRL (Greene, Moos, & Azevedo, 2011). In other words, technology can be used as a mechanism for students to think about and utilize their SRL. Common SRL interventions such as learning to learn courses and other efforts typically occur periodically (e.g., once per week). Students may lack an understanding of what is required to succeed in difficult coursework and how to approach specific tasks (Zusho & Edwards, 2011). Students may thus need more regular and ongoing support for SRL. Some successful SRL interventions are reviewed below.
SRL Interventions

A myriad of SRL interventions have been implemented and researched for different student groups. Successful interventions have been studied for at-risk students (Cleary, Platten, & Nelson, 2008), students with learning disabilities (Butler & Schnellert, 2015), and children with depression (Ehrenreich-May, Kennedy, & Remmes, 2015). Most SRL interventions in higher education consist of tutoring, workshops, or coursework (Wolters & Hoops, 2015). These interventions vary in length and scope. Tutorial sessions may last as little as a half hour, while other interventions last an entire semester (Wolters & Hoops, 2015). Many of these interventions and programs are optional for students or require extensive time and resources to implement (Wolters & Hoops, 2015).

Self-Reflection in the College Classroom

SRL interventions are often embedded into specific courses. Instructors can teach effective SRL and content matter simultaneously (Wolters & Hoops, 2015). One successful example of an embedded self-reflection intervention in college mathematics can be found in Zimmerman et al. (2011). Almost 500 students in developmental and introductory college mathematics at a large technical institution participated in a study in which the treatment group received reflective teaching practices from their instructor (Zimmerman et al., 2011). Participants were encouraged to use a math revision self-reflection form after each quiz and exam they completed. The form required participants to review each problem they answered incorrectly and reflect on how they could have better prepared for the question and how they could prepare for a similar question in the future (Zimmerman et al., 2011). Participants were offered the chance to earn points back on their grade if they returned the form in a timely manner. Results showed that intervention participants were more likely to receive better grades on subsequent exams (Zimmerman et al., 2011).
Students were also more likely to complete important gateway courses and pass math placement exams (Zimmerman et al., 2011). The intervention in Zimmerman et al. (2011) required little alteration of the instructor's teaching approach. It encouraged instructors to quiz students more often so that feedback was more frequent (Zimmerman et al., 2011). The study did not include formal teaching on how to use SRL strategies. Rather, the instructors in the intervention group focused more on using mistakes to improve learning (Zimmerman et al., 2011).

The Self-Regulation Empowerment Program

The Self-Regulation Empowerment Program (SREP) is an SRL training and development program designed to assist school-aged children in improving their performance through effective SRL practices (Cleary & Zimmerman, 2004). Several interventions were used in the program, including cognitive modeling, cognitive coaching, practice sessions, and graphing (Cleary & Zimmerman, 2004). A key component of the SREP was student work with a learning coach. The learning coach was tasked with helping students implement changes to their learning strategies (Zimmerman, 2000). Through regular meetings and workshops, the learning coach helped students assess their strengths and weaknesses to determine which SRL strategies the student already possessed and which they needed to obtain (Cleary & Zimmerman, 2004). The program fostered SRL through a question protocol that helped students determine what was working well for them (Cleary & Zimmerman, 2004). The learning coach asked students how they decided on certain learning strategies for tasks and how they kept track of where they were in their exam preparation (Cleary & Zimmerman, 2004). One key intervention in the SREP was the self-regulation graph (Cleary & Zimmerman, 2004). This simple worksheet involved students plotting their academic goals and actual scores.
For each grade received, the student outlined what strategies they used in preparation for the examination. The intention was for students to evaluate the effectiveness of their strategies based on the grade they received (Cleary & Zimmerman, 2004). If students did not meet their goal, they were encouraged by the learning coach to use other strategies or SRL processes. This exercise encouraged students to set specific performance outcome goals and plan ahead for how they would prepare (Cleary & Zimmerman, 2004).

The SREP has been successful in improving SRL and academic performance with high school students (Cleary et al., 2008). Eight ninth-graders with below-average classroom test scores in biology showed improvement in class performance over the class average (Cleary et al., 2008). The intervention also significantly improved self-regulation and motivation (Cleary et al., 2008). The SREP is designed to help students develop new SRL skills (Cleary & Zimmerman, 2004). College students, however, are expected to enter with these skills already in place. While many college students possess SRL skills, they are often used inaccurately or inadequately (Peverly et al., 2003). There is little research on the implementation of the SREP at the college level or via an electronic medium.

SRL in Engineering

Academic achievement in the first year of an engineering degree program often lays the groundwork for future success or dropout in the major (Nelson, Shell, Husman, Fishman, & Soh, 2015). First-year 'gateway' courses are of particular importance. More than one third of engineering students will drop out of introductory courses in their engineering program (Gainen, 1995). Student SRL inside and outside the classroom is particularly important if students are going to persevere through the challenging nature of first-year courses in engineering programs (Nelson et al., 2015). Retention in engineering and other STEM programs is particularly challenging.
A number of interventions have been successful in slowing first-year attrition rates in engineering programs. Successful programs include summer bridge programs, undergraduate research initiatives, cohort learning communities, and faculty mentoring (Ricks, Richardson, Stern, Taylor, & Taylor, 2014). Less research has been conducted on ways to improve engineering student SRL in the first year. Engineering students may unintentionally adopt significant maladaptive SRL habits (e.g., surface learning), particularly in introductory courses in their program of study (Nelson et al., 2015). This suggests that students in the first year of an engineering major may benefit from SRL interventions. Student GPA has been shown to be a significant predictor of success in engineering coursework (Huang & Fang, 2013). The need to assist students with lower academic profiles early in engineering courses and programs presents a significant opportunity for effective interventions (Huang & Fang, 2013).

Text Message Interventions

Text messaging, also known as short message service (SMS) messaging, refers to short typed messages sent between two or more mobile devices (Kasesniemi & Rautiainen, 2002). The use of text messaging technology on mobile devices is pervasive amongst college students, with over 90 percent of students sending text messages on a regular basis (Bull & McCormick, 2012). Text messaging offers several affordances due to its ubiquity amongst college students, its instantaneous connectivity, and student preference for it over email (Lauricella & Kay, 2013). Challenges include the small number of characters permitted in each text message (typically 140 characters), competition for attention with other mobile notifications, and the potential to infringe on students' personal technological space (Lauricella & Kay, 2013). Recent studies have analyzed the possibilities for text message and mobile technology to improve student SRL.
Purposes of these studies include reducing procrastination (Davis & Abbitt, 2013), improving self-awareness of time management (Tabuenca et al., 2015), and creating better interactions between students (Zamani-Miandashti & Ataei, 2015). Most college students have access to mobile phones and texting technology, which has led to increased research on their potential educational and supportive uses (Lauricella & Kay, 2013). Text messages sent to students may increase college student SRL and academic performance.

Text messages can provide a more effective means for colleges and universities to reach their students than email and other forms of communication (Griffiths & Hmer, 2004; Lauricella & Kay, 2013). Text message communication can improve faculty and administrative contact with students (Nix, Russell, & Keegan, 2006). Most college students are willing to use mobile devices to help achieve their educational goals (Lauricella & Kay, 2013). Some studies have suggested that text messaging is effective in helping underachieving or at-risk students persist in college (Frankfort et al., 2015). This approach has been helpful in assisting students to navigate complex administrative processes (Nix et al., 2006).

Castleman and Page (2015) utilized a text message intervention to help low-income high school students and their families navigate the college enrollment process. They sampled more than 2,400 graduating high school seniors from underrepresented minority and low-income groups in large school districts around the country (Castleman & Page, 2015). Almost 900 parents also participated in the study. The researchers used two interventions. The first was a set of text messages about the college entrance process. The second was a text message program in which mentors were available to students via text message. Results found that the text message intervention increased college enrollment amongst students (Castleman & Page, 2015).
The intervention was more successful in districts with less available student support (Castleman & Page, 2015). This study was focused on achieving administrative tasks rather than student SRL, but the results obtained are promising for related research. Castleman and Page (2015) were effective in part because of their timely reminders for students and parents to focus on specific parts of the enrollment process.

Recent research has extended the Castleman and Page (2015) work to determine if text message prompting can help students navigate the financial aid process or complete their degree (Castleman & Page, 2016; Mabel et al., 2017). Mabel et al. (2017) implemented a text message campaign aimed specifically at college students late in their academic careers who were struggling to complete their programs. Messages sent to students encouraged goal setting and making connections to helpful institutional resources. Results found that the campaign helped reduce dropout but did not have an impact on overall academic performance (Mabel et al., 2017). These results may have been due in part to the multifaceted nature of the intervention. The message content focused on important deadlines and administrative tasks (e.g., when to file for financial aid) while encouraging students to take general actions that would benefit their academics (e.g., to set goals and reduce procrastination). Messaging focused on specific class strategies may be better at helping students link their actions to specific desired course outcomes. The Mabel et al. (2017) intervention included a combination of SRL and administrative prompts. The SRL components of Mabel et al. (2017) focused on setting goals and encouraging help-seeking behaviors, but not on specific course approaches. While the intervention did not increase academic performance, the reduction in student dropout is promising (Mabel et al., 2017).
The use of text message reminders may help students improve their SRL behavior and academic performance in several ways. Personalized messages may be effective in engaging students and promoting positive SRL behaviors (Goh, Seet, & Chen, 2012; Kim & Keller, 2007; Tabuenca et al., 2015). Personalized messages give the impression that information disseminated to the student is directed specifically to them, which is likely to garner more attention and lead to action (Goh et al., 2012; Tabuenca et al., 2015). Much of the past research in this area focused on generic messages (e.g., reminders regarding due dates, review sessions, etc.) (Goh et al., 2012; Kim & Keller, 2007; Nix et al., 2006). Goh et al. (2012) sent text messages to students multiple times per week throughout a semester. The message content focused on keeping students interested and motivated in their studies rather than on actual performance. Some self-reported measures suggested that the intervention impacted SRL; however, the data presented did not show the full aggregate MSLQ measures (Goh et al., 2012). Instead, the researchers highlighted significant differences between treatment and control groups for specific MSLQ items. These items included "I expect to do well in this class," "I outline material to organize my thoughts," and "when I become confused… I go back and try to figure it out" (Goh et al., 2012). Presenting results by item in this manner raises some concern regarding the validity of the results.

Third-party programs and products have also helped colleges and universities implement text message technology to improve student academic outcomes. Admithub (Admithub Inc., 2017) provides a two-way text messaging chat bot that gives automated advice and information to help students successfully transition to college (Peterson, 2016). Frankfort et al.
(2015) analyzed the effectiveness of Persistence Plus (Persistence Plus LLC, 2016), a proprietary program that sends daily text messages to college students about how to assimilate socially and commit to a time and place to study. It also provides an outlet for students to express how they are feeling, their well-being, goals, and challenges (Frankfort et al., 2015). The program allows college personnel to intervene where necessary. Persistence Plus (Persistence Plus LLC, 2016) texted students six times per week on these topics. Results of the intervention showed a reduction in fail rates for remedial courses and increased retention for at-risk groups (Frankfort et al., 2015).

Prompting SRL with Technology

Technology can be used to foster and scaffold SRL. Prompting can take place in a Learning Management System (LMS), by mobile device, or via some other medium. Several studies have been successful in fostering performance outcomes using effective and well-timed SRL prompts. Sitzmann, Bell, Kraiger, and Kanar (2009) conducted two studies to assess the effectiveness of SRL prompts in two different conditions. The first study analyzed the impact of an SRL intervention on 93 working adults (mean age = 44) in an online training course on the Blackboard LMS. The course included ten modules with text and videos on the functions and uses of the Blackboard LMS. Students worked on the course at their own pace. Self-monitoring and self-evaluation questions were asked throughout the course (Sitzmann et al., 2009). For example, students were prompted to consider whether they had thoughts that interfered with their focus or whether they believed the tactics they were using in the training were effective (Sitzmann et al., 2009). Prompts were implemented in three different conditions: immediate (throughout the course), delayed (in the second half of the course), and control (no prompts) (Sitzmann et al., 2009).
Hierarchical linear modeling results suggested that the SRL prompts had a positive effect on participants over time. Participants in both the immediate and delayed conditions showed an improvement in test scores (Sitzmann et al., 2009). The second study in Sitzmann et al. (2009) assessed the impact of SRL prompts on undergraduate students in a three-hour study. In contrast to the first study, this experiment was much more controlled and shorter in duration. Students were enrolled in a radar-tracking simulation course involving highly technical training. Results suggested that the SRL prompts had a positive effect on basic performance over the control group of .41 SD (immediate condition) and .53 SD (delayed condition) (Sitzmann et al., 2009). The two studies presented in Sitzmann et al. (2009) show the potential for SRL prompts to affect student performance, but only in two very specific, non-traditional learning environments. Initial results are promising, but additional research should be performed to see if these results can be replicated in a more traditional undergraduate environment.

Sitzmann and Ely (2010) showed similar results to those in Sitzmann et al. (2009): continuous SRL prompting led to learning improvement amongst 479 adults (mean age = 42) in an online Excel training course based in an LMS. Over the course of ten self-paced modules, the researchers assessed changes in three SRL constructs: metacognition, motivation, and concentration (Sitzmann & Ely, 2010). The study also assessed student time on task, measured by how much time students spent in the LMS during training. These measures drew from three separate self-report instruments but were not clearly linked to SRL theory. The authors did not detail the particular aspects of metacognition, motivation, and concentration they were seeking to understand. Results showed that prompting SRL had a significant effect on academic performance (Sitzmann & Ely, 2010).
The intervention did not have a significant main effect on SRL activity as defined in the study. Rather, results showed that subjects in the continuous prompt condition spent more time in the LMS per module than the control condition, with time on task fully mediating the effect (Sitzmann & Ely, 2010). Results also showed that learning and time on task were significantly correlated (Sitzmann & Ely, 2010). As in Sitzmann et al. (2009), while the results are very promising, the intervention was implemented with a very specific, non-traditional population. The study did not show that the intervention had a direct effect on SRL, though it did have an effect on academic performance.

Sitzmann and Johnson (2012) found that a planning and self-regulation prompt intervention embedded in an LMS had a significant impact on student learning outcomes in an online course. Students were required to plan when, where, and for how long they would study. They were also asked prompting questions regarding their progress in the course, their perceived understanding of the material, and other reflective questions (Sitzmann & Johnson, 2012). Results found that students spent significantly less time on task than planned (Sitzmann & Johnson, 2012). Students receiving the planning and prompting intervention performed better than students in the control condition by six to eight percentage points (Sitzmann & Johnson, 2012). Performance was further increased when students followed through on their study plans, approximately 14 points higher than the control condition (Sitzmann & Johnson, 2012). Students in the treatment condition also showed significantly lower attrition (Sitzmann & Johnson, 2012). Data suggested that planning too much time for studying and falling short had a potential negative effect on participants and that the intervention might only be beneficial for those with high cognitive abilities (Sitzmann & Johnson, 2012).
Kitsantas and Dabbagh (2011) suggested several possibilities for the role of web 2.0 technologies in college SRL. This research suggested starting points for instructors and administrators to use web 2.0 technology such as video conferencing, blogging, and wikis (Dabbagh & Kitsantas, 2012; Kitsantas & Dabbagh, 2011). For example, video conference interview assignments can be recorded and used to provide modeling, coaching, and advice (Kitsantas & Dabbagh, 2011). Blogs could be used by students to demonstrate their learning processes, with instructor feedback on how to improve their learning (Kitsantas & Dabbagh, 2011).

There is significant research on course-level interventions to help students be more academically successful (e.g., Hodges & Kim, 2010; Kitsantas & Dabbagh, 2011). Much of this research focuses on how interventions can help students better self-regulate in both online and in-person environments (Wandler & Imbriale, 2017; Wang, Shannon, & Ross, 2013). Linking intervention prompts to a specific course may help students better connect their actions to learning outcomes. General prompting may not be as effective in improving student learning (Mabel et al., 2017). Structuring an intervention at the course level can help instructors and support personnel tailor the messaging so that its timing is in line with the progression of the class.

Combining Training, Coaching, and Text Messaging

Recent research has combined teaching SRL online with regular text message reminders to help foster student academic performance. Oreopoulos et al. (2018) worked with over 9,000 students to create online study and time management plans reinforced by study tips, reminders, and coaching via text message throughout the year. The intervention was heavily focused on increasing student study time. While data suggested that this was achieved, the treatment did not improve student GPA, credit completion, or retention (Oreopoulos et al., 2018).
The study sample included a range of participants from both highly selective institutions and open-access community colleges (Oreopoulos et al., 2018). Study time may not have been the best focus for this type of intervention. Survey data for the study suggested that an increase in study time was correlated with higher GPAs (Oreopoulos et al., 2018); however, accumulation of study time is not necessarily linked to performance (Plant, Ericsson, Hill, & Asberg, 2005; Schuman, Walsh, Olson, & Etheridge, 1985). Further, quantity of study may not translate to effective study (Plant et al., 2005). Efficient study in even a brief period of time may be just as effective. The Oreopoulos et al. (2018) study was also quite complicated in its implementation, consisting of a high number of messages throughout the semester which students may have ultimately ignored. Prior research on SRL interventions shows that an intervention focused on only one aspect of SRL is more effective (Zimmerman et al., 2011).

Another recent study looked at the potential of a goal setting exercise with 1,400 Canadian college students (Dobronyi, Oreopoulos, & Petronijevic, 2017). Students were offered the option to receive text message reminders throughout the semester pertaining to the goals they set. Similar to Oreopoulos et al. (2018), the intervention had no significant effects on student GPA, credit completion, or persistence (Dobronyi et al., 2017). While goal setting is a very important part of SRL (Zimmerman, 2002), the goal setting exercise in this study focused on matters that were too broad and not necessarily specific to coursework (Dobronyi et al., 2017). The goals that students were asked to set included envisioning their future social life, family life, career, and how to maintain a balanced life (Dobronyi et al., 2017).
These are important goals but are not directly linked to specific coursework and proximal goals, both of which have been helpful in improving academic performance outcomes (Sitzmann & Ely, 2011).

Study Logs and Text Reminders

Time management can be a challenge for many college students. Intervention research has been conducted to test the impact of text reminders to log and monitor study activity. With a small group of online graduate students (n = 36), Tabuenca et al. (2015) texted students on a regular basis, reminding them to log their study time and activity using a mobile application. Participants preferred receiving regularly scheduled notifications in the morning so that they could plan their day (Tabuenca et al., 2015). The authors hypothesized that attention-grabbing messages would have more impact on SRL than generic messages (Tabuenca et al., 2015). Notifications in this study included the student's name and useful content that was not repetitive. A link to the logging tool was provided in each message (Tabuenca et al., 2015). Results showed a correlation between amount of time studied and grade performance, but there was no significant difference between treatment and control groups on final grades (Tabuenca et al., 2015). There are several significant concerns with this particular study. First, the number of students ultimately completing the program (n = 13) was extremely limited. Second, the student population (online graduate students) is not an ideal group for testing this type of intervention, as graduate students likely have a history of effective SRL practices. As mentioned previously, interventions targeting study time have been inconsistent in their impact on SRL and student learning (Plant et al., 2005).

Current Study

Few studies have tested SRL prompting via text messaging with a traditional college-aged population while also integrating a teaching or training component, as was the goal of the current study.
Some past studies that prompted or taught SRL lacked technology or text message elements (e.g., Cleary et al., 2008; Zimmerman et al., 2011). Text message intervention studies were mostly focused on assisting with administrative rather than educational tasks (e.g., Castleman & Page, 2015). Studies that integrated technology into SRL prompting were conducted with non-traditional college populations (e.g., Sitzmann et al., 2009; Sitzmann & Ely, 2010). Studies that combined text messaging and SRL prompting implemented approaches that are not well aligned with effective SRL practices in theory. These included increasing study time (Oreopoulos et al., 2018) and encouraging students to set lofty, non-proximal goals (Dobronyi et al., 2017).

This study also sought to improve upon studies that were not as successful as they perhaps could have been. For example, some studies were too invasive and texted students too often (e.g., Oreopoulos et al., 2018). Others were not focused or nuanced enough to be effective (e.g., Dobronyi et al., 2017; Mabel et al., 2017). This study combined elements of three previously successful interventions into one new intervention program delivered via text message technology. These elements included an SRL teaching and training program delivered electronically (Cleary et al., 2008), text message SRL prompts regarding the course (Castleman & Page, 2015; Mabel et al., 2017), and prompts for self-reflection following each exam (Zimmerman et al., 2011). Participants were able to work with the intervention on their own time. The timing of the messages was strategically scheduled in order to nudge students when they should take certain SRL actions regarding their class.

Two central research questions guide this study. The first asks whether participants receiving the intervention show higher levels of SRL and whether level of participation makes any difference in its effect.
Intervention participants should utilize SRL at higher levels than those in the control group. The second research question looks at whether or not the intervention has any effect on academic performance over the course of a full semester. In addition, it asks how these effects are moderated by past academic performance at the college level. In particular, can the intervention help raise the academic performance of students who have lower college GPAs? Students participating in the intervention were expected to show better performance on exams and final grade than their control group counterparts.

METHODS

Participants

Study participants were enrolled in a gateway class critical to degree progression and completion. Participants were enrolled in introduction to ship systems, a freshman-level engineering class at a small public institution in the Northeast. This particular course is an important prerequisite for students wishing to participate in one of three applied learning experiences required for an engineering degree. The withdrawal and fail rate for the course over the last four years is approximately 12%. The high enrollment in this course was desirable as similar courses can lack individualized attention and support. The larger class size also provided additional participants to help detect smaller effect sizes (Jones et al., 2009). Participants were recruited both in class and via email. Participants were asked to provide access to their academic record, including overall GPA and demographic data. Gathering this information was necessary in order to determine how the intervention impacted students. Participation in the study was completely optional. Participants were provided incentives through a regimental program at the school. A total sample of 100 participants was obtained for this study. Students were provided the option to participate in the intervention but elect not to submit their data for purposes of the study.
Eight students chose this option and their information was removed from the study. Two additional students withdrew from or dropped the course prior to completion and thus their information was removed. This resulted in 51 participants in the control group and the remaining 49 as part of the treatment group. Treatment and control were assigned based on the section of the course the participant was enrolled in. Demographics of the sample were reflective of the institution's student population (see Table 1). The sample consisted of 85% male participants and 18% identifying as part of a racial minority. Gender proportions for the sample were similar to the rest of the student population (χ²(1, n = 100) = .35, p = .55). Race frequencies were not reflective of the student population of the institution sampled (χ²(1, n = 100) = 4.96, p < .05). The sample for this study consisted of 18% minority participants, significantly lower than the institution's overall rate of 28%. The majority of the sample (67%) had parents with at least a college degree. Approximately one quarter (23%) of the sample had parents with at least some college and 10% with a high school diploma or lower. Withdrawal and fail rates for the course were similar to past years with 12% of treatment participants and 10% of control participants either failing or withdrawing from the course. Descriptive statistics were generated to determine the overall structure of the data, looking specifically at the initial differences between the treatment and control groups. Both treatment and control showed very similar means for college GPA, high school GPA, and commitment to the engineering major. Students in both treatment and control groups had similar entering college and high school GPAs (see Table 2).

Design

This study used a quasi-experimental design with college subjects enrolled in a freshman gateway engineering course. Two lecture hall sections of the same engineering course were used for study recruitment.
Participants from one section of the course received the treatment with the other serving as the control. Both treatment and control sections were taught by the same faculty member. The measurement instrument for treatment and control groups was administered on the same dates for both sections. Separating treatment and control by section helped prevent contamination between the two groups (Campbell & Stanley, 1963) and made the experiment easier to administer and organize. The study used a control group time-series design as outlined in Table 3 (Ary, Jacobs, & Sorensen, 2010). The between-subjects quasi-experimental design allowed for comparison of the treatment group against the control group. This helped determine the extent to which the SRL intervention had an impact on the outcome variables. The main benefit of this design is the ability to use intact groups to analyze treatment and control conditions (Ary et al., 2010). Threats to internal validity such as maturation and regression should be reduced as long as the two courses are similar (e.g. taught by the same faculty member with the same content and same exams) (Ary et al., 2010). There are some potential internal validity threats in this model, mainly the possibility of selection bias (e.g. one section is taught better than the other). Homogeneity of variance for all outcome measures was analyzed to determine if groups were similar in background (academic and demographic). The analysis approach section provides an overview of why specific statistical tests were selected for this study. Results were analyzed using exploratory factor analysis and multiple regression analysis. Multiple regression models for each outcome measure at each time point were analyzed using within-group and between-group variables. This helped determine whether or not significant differences existed between or within groups over the course of the semester as a result of the treatment condition (Ary et al., 2010).
Main effects of the treatment and interactions with college GPA were analyzed.

Procedures

Procedures were based on previous studies that successfully affected SRL and academic performance (Castleman & Page, 2015; Cleary et al., 2008; Mabel et al., 2017; Sitzmann et al., 2009; Sitzmann & Ely, 2010). Components of these studies were combined to develop the current intervention and adjusted to fit the context of the class. Adjustments included changing the message content to fit the course and corresponding schedule (e.g. test reminders). Some messages were developed in collaboration with the faculty member (e.g. helpful resources to access). Messages used in this intervention were different from those used in prior studies but followed a similar approach (e.g. Sitzmann & Ely, 2010; Sitzmann & Johnson, 2012). SRL measures were gathered using in-person questionnaires three times during the semester: at the beginning (week 2), middle (week 7), and end (week 15).

Intervention Components

Teaching effective SRL. Successful SRL interventions often included a teaching or instructional component (Cleary, 2006; Schunk & Zimmerman, 1998; Zusho & Edwards, 2011). This is especially true of interventions that attempted to improve student SRL strategies. As research has shown, teaching SRL has been executed effectively through 'learning to learn' courses and other programs (Pintrich, 2004; Schunk & Zimmerman, 1998). There are various open educational resources (OER) and online assistance available, including online videos and tutorials, to help students with SRL. These resources often lack the expense and coordination necessary for a full in-person course. The intervention in this study used OER that focused on college student learning and effective SRL practices (College Success, n.d.; Dennis Learning Center, 2018; Dillon, Lamoreaux, Nissila, & Priester, 2018). Students were prompted to review relevant chapters and resources via text message.
These chapters and resources included instructional videos and short exercises accessible through the institution's LMS. The content's progression was based on the Self-Regulation Empowerment Program (Cleary et al., 2008). These resources were provided via text, an approach that has been successful in previous interventions (Castleman & Page, 2015). Information on effective SRL was sent in the first several weeks of the semester so that students could implement the information in class. Participants were asked to read and engage with the content. Content included how to study effectively, how to prepare for class, time management, and other related topics (College Success, n.d.). Test reminders. Prior interventions that reminded students about important dates were successful in helping them remain enrolled in college (Castleman & Page, 2015, 2016). College students are often underprepared for examinations and may ineffectively manage their time (Peverly et al., 2003). Reminders implemented for this intervention included upcoming exam dates and how students should prepare. The intervention reminded students to begin exam preparation well in advance so that SRL could be activated in a timely manner. Other important reminders were developed in collaboration with the instructor (see Appendix C). Prompting self-reflection. Self-reflection is an important part of the SRL process (Zimmerman, 2002). Structured self-reflection has been successful in boosting college math students' academic performance (Zimmerman et al., 2011). In that intervention, students completed a standard self-reflection form at the end of each quiz or examination (Zimmerman et al., 2011). The intervention for this study integrated a similar form in an online survey format that was delivered via text message shortly after each exam was graded and returned to participants. The form included very similar questions to those used in Zimmerman et al. (2011) (see Appendix D).
The survey was intended to prompt student consideration of their study approaches to the exam. As part of the reflection, students were directed to look at the questions they answered incorrectly and consider how they could better prepare for the next exam. Prompts referred participants to campus resources if needed. The Zimmerman et al. (2011) study required participants to complete one form for each question the student answered incorrectly on an exam. This form was modified so participants could consider the SRL strategies they used and how they impacted their performance. It also prompted participants to consider what changes they could make in preparation for their next examination.

Study Steps

Steps for the study were as follows:

1. Participants were recruited via in-class and email recruitment. They were asked to complete a release form and provide background demographic and personal information. Participants in the treatment section also received instructions on how to enroll in Remind.com (Remind Inc., 2015), the text messaging software used in the study.

2. Participants in the treatment condition were sent SRL prompts throughout the semester (see message schedule in Appendix C). Some messages included links for students to read about effective SRL practices while others included reminders about upcoming exams and prompts to complete a reflection exercise.

3. Both treatment and control groups completed in-person surveys regarding their SRL actions over the course of the semester. Surveys took approximately ten minutes to complete at each of the three time points. Participants requesting to be removed from the study stopped receiving messages.

4. Academic outcome data (e.g. exam and final grade scores) were gathered at the end of the semester for those participants who consented to release them.

The main component of the intervention consisted of a series of text messages sent to students regarding the course and corresponding SRL tips and prompts.
Messages were sent directly to participants 30 times over the course of a 15-week semester (see Appendix C for message, exam, and measurement schedule). Consent form and intake information. Participants completed the consent form in class with the researcher present to explain the study and its purpose. Demographic information was collected including student age, race, gender, major, and high school and college GPAs. Participants in the treatment section registered their cell phone number in order to participate in the study. Participants were also able to download the Remind app (Remind Inc., 2015). Participants received the same messages regardless of whether they signed up for text messages or the mobile app. Participants were sent an initial message to test the system and troubleshoot any issues prior to the start of the study. Data collection. Several data points were used to understand how much students accessed the SRL material provided to them. Participants were directed to SRL material in the LMS, where they were encouraged to complete a short quiz to check their understanding of the content. This was intended to reinforce the SRL material sent to participants and encourage them to review the material. Reminder messages were tracked through the Remind.com (Remind Inc., 2015) system. The system calculated how many participants opened the messages sent to them to determine how many participants read them. Participation in the self-reflection component of the intervention was measured by the number of students who completed the reflection form in the online survey system. Additional data from reflection surveys helped determine how participants utilized this portion of the intervention.

Intervention Schedule

Participants were sent messages on a pre-determined schedule in line with the progression of the course. Study participants were typically messaged twice per week, on Mondays and Wednesdays.
Previous studies contacted participants on a more frequent basis, up to six times per week (Davis & Abbitt, 2013). The number of messages was limited for this study as many students in the United States see their mobile devices as more social than academic in purpose (Jones et al., 2009). Text messages included material on effective SRL, reminders for upcoming exams, and self-reflection prompts (see Figure 3 and Appendix C). Regular and ongoing SRL prompts have been more effective than prompts that only take place in the first or second half of the semester (Sitzmann & Ely, 2010). Sitzmann et al. (2009) showed positive effects of three prompts per online module in a four-module course. Further, SRL practices may develop over a significant period of time (Kanfer & Ackerman, 1989). The intervention was therefore implemented throughout the duration of a traditional 15-week semester. Messaging students approximately twice per week throughout the semester was intended to reach students frequently without becoming too invasive or annoying. Messages were sent in the morning (appx. 11AM) so that students could plan their time and approach for the day (Tabuenca et al., 2015).

Remind Technology

Remind.com (Remind Inc., 2015) allows for easy communication between instructors and students. Primarily designed for K-12 teachers and administrators, Remind.com is a low-cost program offering individual memberships and allowing for wide access to faculty members who may lack financial resources (Remind Inc., 2015). Teachers, faculty, administrators, students, and others can set up multiple 'classes' in the Remind.com (Remind Inc., 2015) web platform, which generates a unique code for individuals to sign up and receive text message communications. Instructors controlling the class are then able to send scheduled or instantaneous text messages to those enrolled.
The software allows instructors and others to communicate individually with students via text message 'office hours' when they are available (Remind Inc., 2015). Faculty members can send and receive messages through the web interface, text messages on their mobile device, or through the Remind.com mobile app (Remind Inc., 2015).

Measures

This study used a variety of measures for SRL, academic performance, and treatment fidelity. These measures are described below.

Self-Regulated Learning Measures

This study used items from two separate instruments to measure SRL for both treatment and control groups. These included the Self-Regulation Strategy Inventory - Self-Report (SRSI-SR) (Cleary, 2006) and the metacognitive self-regulation subscale of the Motivated Strategies for Learning Questionnaire (MSLQ) (Pintrich et al., 1991). The full SRL measurement instrument consisted of these two scales, resulting in 40 total items. Each item was measured on a self-reported seven-point Likert scale as described below. After all SRL data were collected, an exploratory factor analysis was conducted to combine and interpret the factors underlying these items. The exploratory factor analysis process is described in detail in the results. Self-Regulation Strategy Inventory - Self-Report (SRSI-SR). The Self-Regulation Strategy Inventory - Self-Report (SRSI-SR) is a measurement instrument used to understand the frequency with which students use various SRL strategies in an academic environment (Cleary, 2006). The assessment includes 28 items on a scale of 1 (never) to 7 (always) (Cleary, 2006). The composite score of the SRSI-SR has high internal reliability (⍺ = .92) (Cleary, 2006). The SRSI-SR items have previously been factored into three constructs: environment and behavior management (⍺ = .88, 12 items), seeking and learning information (⍺ = .84, 8 items), and maladaptive behaviors (⍺ = .72, 8 items) (Cleary et al., 2008). The environment and behavior management factor included items such as "I think about how best to study before I begin studying" and "I study hard even when there are more fun things to do" (Cleary, 2006). Seeking and learning information sample items included "I think about the types of questions that might be on a test" (Cleary, 2006). The maladaptive regulatory behavior questions included items such as "I let my friends interrupt me when I am studying" (Cleary, 2006). The SRSI-SR was developed based on the Zimmerman & Martinez-Pons (1988) model of strategic learning. This model included ten SRL strategies, many of which appear in more recent models of SRL (Zimmerman, 2002). The SRSI-SR is shorter in length and does not measure the same number of constructs as the larger MSLQ (Pintrich et al., 1991). High scores on the SRSI-SR suggest that students are engaged in their SRL strategies and intentions (Cleary, 2006). While not used as frequently as the MSLQ (Pintrich et al., 1991), the SRSI-SR has been used in a number of studies on college student SRL (Cleary et al., 2008). Motivated Strategies for Learning Questionnaire (MSLQ). SRL was also measured using a portion of the MSLQ (Pintrich, 2004). The full MSLQ instrument measures 15 constructs using 81 seven-point Likert-scale questions (Pintrich et al., 1991). Constructs measured include motivation, cognition, metacognition, and resource management strategies (Pintrich, 2004). Learning strategies scales measure cognitive and metacognitive strategies including rehearsal, elaboration, organization, critical thinking, and metacognitive self-regulation. Resource management strategy scales are also included, comprising time and study environment, effort regulation, peer learning, and help seeking (Pintrich et al., 1991). The MSLQ is a widely accepted instrument to measure SRL (Roth, Ogrin, & Schmitz, 2016).
The instrument utilizes consistently reliable (α > .70) and valid measures of SRL at the college level (Duncan & McKeachie, 2005; Pintrich, Smith, Garcia, & McKeachie, 1993; Zimmerman & Kitsantas, 2014). The MSLQ is based on a general cognitive model of learning and information processing (Pintrich et al., 1993) and is commonly used to measure critical SRL components found in models such as Zimmerman (2002), Pintrich & DeGroot (1990), and others. The metacognitive self-regulation (MCSR) subscale was the only scale of the MSLQ chosen for this study. Metacognitive self-regulation is a critical component of SRL and comprises planning, monitoring, and regulating (Pintrich et al., 1991). MCSR is a student's ability to self-regulate the comprehension of their own learning (Richardson, Abraham, & Bond, 2012). This subscale was chosen because its description is broad as it pertains to SRL theory (Pintrich et al., 1991). The twelve items in the subscale are indicative of three common phases in SRL theory: planning, monitoring, and regulating learning (Pintrich et al., 1991). There are some overlapping items with the SRSI-SR (Cleary, 2006) in the areas of seeking and learning information. These duplications and overlaps were sorted out in the exploratory factor analysis in the results section below. The MCSR scale of the MSLQ comprises 12 items using a seven-point Likert scale of 1 (not at all true of me) to 7 (very true of me) (Pintrich et al., 1991). Composite scores on the metacognitive self-regulation scale have been correlated with academic performance (Pintrich et al., 1991; Richardson et al., 2012). These were measured over time to understand how SRL practices changed as a result of the intervention. All items used to develop the SRL measures can be found in Appendix E. SRL and MSLQ limitations. There are inherent measurement issues when self-reporting SRL.
Researchers debate the proper ways to measure SRL due to concerns over self-reporting and varying measurement instruments (Winne, 2010; Zimmerman, 2008). Many components of SRL are intertwined with one another and take place simultaneously. This makes it challenging to determine which element(s) of SRL are occurring at any given time. Studies that only analyze one component of self-regulated learning potentially overlook other processes that are taking place (Winters, Greene, & Costich, 2008).

Academic Performance

Academic performance was measured using three exams in weeks four, eight, and fourteen. A cumulative final exam was also administered at the very end of the semester. Each examination consisted of 50 multiple-choice questions worth two points per question for a maximum score of 100. Final grades were calculated using a composite of the three exam scores (20% each), the final examination (30%), a laboratory score (10%), and a maximum of three possible extra credit points, for a maximum score of 100. Though these scores were converted to letter grades (A-F) for student transcripts, all academic performance was measured on a 100-point scale for the purposes of this study. Exams 1 through 3, the final examination, and the transcript score (including extra credit and laboratory scores) were included in the final analysis.

Treatment Fidelity

A number of data points were collected to measure the extent to which participants accessed material and participated in the treatment. These data points included the percentage of participants who read each text message, the number of times a participant accessed LMS materials, the number of quizzes for understanding that were completed, and the number of reflections completed. The reflection data included qualitative information consisting of open response entries from each participant.
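The grade composite described above can be expressed as a short calculation. The following is an illustrative Python sketch only (the study did not publish code); the cap at 100 after adding extra credit is an assumption based on the stated maximum score.

```python
def final_course_grade(exams: list[float], final_exam: float,
                       lab: float, extra_credit: float = 0.0) -> float:
    """Final grade composite: three exams at 20% each, final exam at 30%,
    laboratory score at 10%, plus up to 3 extra credit points.
    Capping at 100 is an assumption based on the stated maximum score."""
    assert len(exams) == 3 and 0 <= extra_credit <= 3
    weighted = 0.2 * sum(exams) + 0.3 * final_exam + 0.1 * lab
    return min(100.0, weighted + extra_credit)
```

For example, perfect scores with full extra credit would still yield the stated maximum of 100 under this assumed cap.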
Analysis Approach

The multifaceted nature of the intervention, the multiple measurement instruments used, the anticipated progression of SRL over time, and the data collected for this study presented a number of opportunities and challenges for analyzing and interpreting the data. This section provides a brief overview of the analysis approach that was used.

Exploratory Factor Analysis

An exploratory factor analysis (EFA) was conducted to better understand and consolidate SRL variables from the measurement instruments used. The survey instrument consisted of 40 items from established measures (Cleary, 2006; Pintrich et al., 1991). There was some similarity and overlap between the individual items. The SRSI-SR (Cleary, 2006) factor analysis has not been replicated widely; therefore, a factor analysis would either support or diverge from the factor structure identified previously. The 40 items were analyzed using oblimin rotation as SRL measures are expected to be positively correlated with one another (Field et al., 2012). There is a wide range of opinion on adequate sample size to run an appropriate factor analysis (Field et al., 2012). The total sample size for this study (n = 100) required several considerations prior to running the factor analysis. To help determine how factors should be extracted from the sample, Kaiser-Meyer-Olkin (KMO) factor adequacy tests were run to determine if the sample size and data were adequate for factor analysis. Kaiser (1974) recommends that KMO values of more than .5 be achieved for all variables at a minimum. KMO values around .7 are considered 'middling' and values of at least .8 'meritorious' (Kaiser, 1974). A KMO test was run on the first measure (40 items, n = 97, MSA = .7). Of the 40 total items, seven had individual KMOs at a 'mediocre' level (MSA < .6) (Kaiser, 1974).
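For readers unfamiliar with the statistic, the KMO measure compares squared correlations against squared partial correlations: values near 1 indicate that shared variance dominates, which favors factor analysis. A minimal Python sketch of the computation follows (the study's analyses were conducted in R; this is an illustration, not the author's code).

```python
import numpy as np

def kmo(data: np.ndarray) -> tuple[float, np.ndarray]:
    """Kaiser-Meyer-Olkin measure of sampling adequacy for an
    (observations x items) matrix: overall and per-item values."""
    r = np.corrcoef(data, rowvar=False)            # item correlation matrix
    inv = np.linalg.inv(r)                         # precision matrix
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / d                             # partial correlations
    np.fill_diagonal(partial, 0)
    np.fill_diagonal(r, 0)                         # exclude self-correlations
    r2, p2 = r**2, partial**2
    overall = r2.sum() / (r2.sum() + p2.sum())
    per_item = r2.sum(axis=0) / (r2.sum(axis=0) + p2.sum(axis=0))
    return overall, per_item
```

The per-item values correspond to the individual MSAs reported above; items falling below Kaiser's thresholds can then be flagged for review.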
Conducting an exploratory factor analysis on data from the first measurement instance only would provide an acceptable but limited data set; increasing the sample size would provide stronger data for analysis. Rather than increasing sample size through simulated or imputed data, measurement results from all three instances were combined for analysis (n = 278). There are limitations to combining data across time points. These include weighting individual participants multiple times and pooling data across time points that are later analyzed separately as outcome measures in the regression analysis. Despite these limitations, the increase in sample size benefits the factor analysis and helps better define the latent variables (Field et al., 2012). The overall KMO value for this data set was significantly higher (KMO = .86), with the minimum KMO for any individual item at .73. With a larger sample to draw from, this data set is more appropriate for exploratory factor analysis. The determinant of the correlation matrix for this data set was < .000001, suggesting that some multicollinearity exists in the matrix. None of the correlations in the matrix exceeded r = .65. Bartlett's test of sphericity was conducted to determine if the data correlated between items sufficiently for analysis. The dataset was appropriate for exploratory factor analysis with χ²(780) = 4,550, p < .0001. A number of EFAs were conducted to determine which model would best reflect SRL theory and the intervention's intended outcomes. The decision criteria used to select factors and contributing items were based on eigenvalues, scree plot analysis, and interpretability of the results. Items that loaded at least .40 or higher on only one factor were first considered. Stevens (2002) recommends .40 as a minimum loading for an item to be retained in a factor analysis. Cross-loading was avoided by removing items that loaded more than .30 on two or more factors.
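Bartlett's test of sphericity, referenced above, tests whether the correlation matrix departs from an identity matrix; with 40 items its degrees of freedom are 40 × 39 / 2 = 780, matching the reported χ²(780). A hedged Python sketch of the standard test statistic (the actual analysis was run in R):

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(data: np.ndarray) -> tuple[float, int, float]:
    """Bartlett's test of sphericity for an (observations x items) matrix:
    tests whether items are sufficiently intercorrelated for factor analysis."""
    n, p = data.shape
    r = np.corrcoef(data, rowvar=False)
    # Statistic based on the log-determinant of the correlation matrix
    _, logdet = np.linalg.slogdet(r)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * logdet
    df = p * (p - 1) // 2            # 40 items -> 780 df, as reported
    p_value = stats.chi2.sf(chi2, df)
    return chi2, df, p_value
```

A very small determinant (as reported for this data set) produces a large χ² and a significant result, supporting factorability while also hinting at multicollinearity.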
Items not meeting these criteria were removed from the analysis. Review of the remaining items and loadings yielded the four-factor model presented in the results section.

Primary Analyses

The intervention tested was intended to improve college student SRL and academic performance throughout the semester. Measurements were taken at multiple time points to determine whether or not the treatment had any impact at the beginning, middle, end, or throughout the academic semester. It was therefore critical to consider the variable of time in the analysis. The main interest of this study was determining if significant differences between treatment and control groups existed at any time. The effects of the independent variables on the outcome measures were analyzed at each time point. Multiple regression was used in the analysis so that main and interaction effects could be examined for each outcome measure at each time point. Each time point and examination were treated as individual outcome measures of interest. MANCOVA was initially used to analyze differences between groups on the various outcome measures. One key drawback of this approach, however, was the challenge of integrating prior college GPA (a continuous variable) into the model in a way that was interpretable and meaningful. Another limitation of MANCOVA analysis is listwise deletion, which would require excluding any participant who did not complete all three measurement instances. With approximately 12% of data missing, MANCOVA would leave out a significant number of participants in a study with a relatively low sample size. Multiple regression allowed models to be built to determine the treatment's impact at each time point while also including both categorical and continuous variables (Field et al., 2012). Two different models were run for each outcome variable at each time point.
The first model looked at the main and interaction effects of the treatment condition and the participant's previous college GPA. The second model included all elements of the first model but added demographic and background data to determine how much of an effect the treatment had on the outcomes when controlling for these factors. To control for the possibility of Type I errors, a Bonferroni correction was applied across the four SRL outcome variables at each of three time points and the five academic performance outcomes (17 total tests). After the correction was applied, the new statistical significance threshold was p < .003. Numerous resources were consulted for the analysis of these results. Field, Miles, and Field (2012) was referred to for processes involving checking assumptions, structuring data, and executing analysis using R (R Core Team, 2019). Colleagues from Michigan State University's department of Counseling, Educational Psychology, and Special Education, members of the dissertation committee, and staff of the Center for Statistical Training and Consulting (CSTAT) were also consulted throughout this process.

Additional Analyses

The two research questions inquire about the degree to which participation in the treatment condition impacted the outcome measures. These questions ask about differences between groups overall, rather than at any specific point in time. Therefore MANOVA analyses were performed to determine if participants who completed the quizzes, material, or reflections differed on the outcome measures from participants who had not. Participants who completed any quiz, accessed LMS material, or finished at least one reflection were coded as 'treatment plus'. Three groups were therefore used for this analysis: control, treatment, and treatment plus. MANOVA analysis was conducted to determine if there were any differences in the SRL and academic outcome measures between these three groups.
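The two nested regression models and the Bonferroni threshold described above can be sketched as follows. This is an illustrative Python sketch with entirely hypothetical data and variable names; the dissertation's analysis was run in R with the actual study variables.

```python
import numpy as np

ALPHA = 0.05
N_TESTS = 17                              # 4 SRL factors x 3 time points + 5 academic outcomes
BONFERRONI_THRESHOLD = ALPHA / N_TESTS    # ~.003, matching the reported threshold

def fit_ols(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Ordinary least squares coefficients (intercept prepended) via lstsq."""
    design = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    return beta

# Hypothetical data: treatment indicator (0/1) and prior college GPA
rng = np.random.default_rng(1)
treat = rng.integers(0, 2, size=100)
gpa = rng.uniform(2.0, 4.0, size=100)
outcome = 70 + 5 * gpa + rng.normal(0, 5, size=100)

# Model 1: treatment, GPA, and their interaction
X1 = np.column_stack([treat, gpa, treat * gpa])
b1 = fit_ols(X1, outcome)

# Model 2: Model 1 plus background covariates (e.g., a demographic dummy)
demo = rng.integers(0, 2, size=100)
X2 = np.column_stack([treat, gpa, treat * gpa, demo])
b2 = fit_ols(X2, outcome)
```

Each test's p-value would then be compared against `BONFERRONI_THRESHOLD` rather than the nominal α = .05.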
The limited sample size for this analysis (n = 22) makes results challenging to interpret. Univariate ANOVAs were conducted for final exam grade and final course grade to avoid issues of overlap and intercorrelation with the three exam measures. Entries from reflection prompt exercises were coded using a basic schema to determine if any trends existed among participant reflections.

Power Analysis

A statistical power analysis was performed for the resulting sample using G*Power 3.1 (Faul, Erdfelder, Lang, & Buchner, 2007). Using linear multiple regression (fixed model, R² deviation from zero), the study could detect a medium effect size (f² = .15) with α = .05, 1-β = .94, n = 100, and two predictors. Prior power analyses in related research suggest a similar intervention could achieve as much as a small effect on SRL measures and a medium effect on academic performance outcomes (e.g. Zimmerman et al., 2011). While this analysis suggests that the sample provided adequate power to detect a medium effect, it was not sufficient to detect a small effect (f² = .02). This limitation should be taken into consideration as the results are presented and discussed. A separate power analysis for MANOVA tests was conducted. Using G*Power 3.1 (Faul et al., 2007) global effects measures, this sample could detect a medium effect size (V = .25) with α = .05, 1-β = .96, n = 100, k = 3 (assuming r = .5 between measures and seven response variables).

RESULTS

SRL Factors

Exploratory factor analyses were conducted to extract SRL variables from the items used in the measurement instrument. The survey instrument consisted of 40 items including self-report questions from the SRSI-SR (Cleary, 2006) and the MSLQ metacognitive self-regulation (MCSR) scale (Pintrich et al., 1991). Multiple analyses were conducted before determining the structure for this study.
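The regression power analysis above can be reproduced from the noncentral F distribution. The sketch below assumes G*Power's convention of a noncentrality parameter λ = f²·n for the fixed-model R² test; it is an illustrative Python approximation of, not a substitute for, the G*Power calculation.

```python
from scipy import stats

def regression_power(f2: float, n: int, predictors: int, alpha: float = 0.05) -> float:
    """Approximate post-hoc power for a fixed-model multiple regression
    (R^2 deviation from zero), mirroring the G*Power setup."""
    u = predictors                  # numerator degrees of freedom
    v = n - predictors - 1          # denominator degrees of freedom
    lam = f2 * n                    # noncentrality parameter (assumed convention)
    f_crit = stats.f.ppf(1 - alpha, u, v)
    return stats.ncf.sf(f_crit, u, v, lam)

power_medium = regression_power(f2=0.15, n=100, predictors=2)
power_small = regression_power(f2=0.02, n=100, predictors=2)
```

Under these assumptions the medium-effect power lands near the reported 1-β = .94, while the small-effect power falls well short, consistent with the limitation noted above.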
The 40 items were analyzed using oblimin rotation, as SRL measures are expected to be positively correlated with one another (Field et al., 2012). Ten of the 40 components had eigenvalues greater than one, suggesting that a large number of factors should be extracted. The scree plot, however, suggested that anywhere between three and six factors were needed (see Figure 4). Four SRL factors were extracted using a combination of eigenvalues, scree plot analysis, and interpretability of the results. These factors made logical sense in describing constructs common to SRL theory and aligned with the intended SRL outcomes of this particular study. Table 4 shows the loadings after rotation. The four factors extracted somewhat aligned with the original measures presented in previous studies (Cleary, 2006; Pintrich et al., 1991). Anywhere between two and 11 items loaded on each of the four factors. The four factors extracted were metacognitive self-regulation (MCSR - factor I), behavioral disaffection (factor II), study approach (factor III), and organization (factor IV). Four items in the MCSR factor had relatively low loadings in comparison to items in other factors. These items had loadings between .38 and .49, around the recommended .40 threshold (Stevens, 2002). While it is generally recommended that smaller loadings be excluded with smaller sample sizes (Field et al., 2012), these items were included in the MCSR factor to increase reliability. The combined data set from three time points also allowed for a larger sample to extract from (n = 278); therefore, these lower factor loadings can be considered (Field et al., 2012). All factors showed good reliabilities with Cronbach's α > .70 and eigenvalues > 1. The four factors correlated somewhat well with one another, with the exception of behavioral disaffection, which only correlated with the other factors at relatively low levels (see Table 5).
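The reliability criterion above (Cronbach's α > .70) can be computed directly from item scores. A minimal sketch of the formula α = k/(k−1) · (1 − Σ item variances / variance of totals); the item data below are made up for illustration and are not the study's:

```python
# Cronbach's alpha from raw item scores. Illustrative data only -- the
# three items below are hypothetical, not items from the SRSI-SR or MSLQ.
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: list of per-item score lists, one inner list per item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Three highly consistent items -> alpha well above the .70 criterion.
items = [[1, 2, 3, 4, 5],
         [1, 2, 3, 4, 5],
         [2, 2, 3, 4, 4]]
print(round(cronbach_alpha(items), 2))   # 0.97
```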
As expected, factor correlations showed that MCSR, study approach, and organization were all positively correlated with one another while behavioral disaffection correlated negatively (see Table 5). Arabic numerals are used to indicate when the measurement was taken during the semester (e.g. MCSR1 was measured at the beginning of the semester, MCSR2 in the middle, and MCSR3 at the end). Correlation Analysis A correlation matrix was generated to determine which variables have significant linear relationships (see Table 6). Some data differed significantly from the normal distribution; therefore, Spearman's correlation coefficient was used to test the relationships between each variable (Field et al., 2012). Each SRL factor was positively correlated with the same SRL measure at different time points. For example, MCSR1 was positively correlated with MCSR2 (r = .71, p < .001) and MCSR3 (r = .54, p < .001). MCSR was also significantly correlated with the other SRL measures, especially study approach and organization. Behavioral disaffection was negatively correlated with academic performance throughout the semester. The negative correlation between behavioral disaffection at the end of the semester and the final grade was statistically significant (r = -.26, p < .05). The treatment did not show a significant biserial relationship with any of the SRL measures. The SRL and academic performance outcome measures mostly showed slight negative correlations with the treatment, but none at a statistically significant level. There were also some negative correlations between SRL and academic measures, including a significant negative correlation between study approach and test 3 (r = -.23, p < .05).
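Spearman's coefficient, chosen above because some variables departed from normality, is simply a Pearson correlation computed on ranks. A minimal sketch with midranks for ties (illustrative data, not the study's):

```python
# Spearman's rho: Pearson correlation on the ranks of the data, used when
# variables depart from the normal distribution (as noted above).
def ranks(xs):
    """1-based ranks, with average (mid) ranks for tied values."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def spearman(xs, ys):
    return pearson(ranks(xs), ranks(ys))

# A monotone but nonlinear relationship still gives rho = 1.
print(spearman([1, 2, 3, 4], [1, 4, 9, 16]))   # 1.0
```

Because it operates on ranks, Spearman's rho captures the monotone associations reported here without assuming normality or linearity.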
Behavioral disaffection at the end of the semester showed significant negative correlations with test 2 (r = -.28, p < .05), final exam (r = -.23, p < .05), and final grade (r = -.26, p < .05). No other SRL measure had a significant correlation with the academic outcome measures. Academic background indicators (e.g. college GPA) showed significant correlations with academic performance in the course, including all test grades and final grades. High school GPA was also highly and significantly correlated with all test grades and final grade. The treatment condition had a positive relationship with results on the first exam only that approached statistical significance (r = .18, p = .08). Unexpectedly, the SRL outcome measures did not correlate positively with the academic outcome measures, and in some instances correlated negatively. Multiple Regression Analysis Assumptions Assumption tests were conducted for all 17 regression models to determine if the statistical tests were appropriate for analysis. Assumption tests performed included analyzing linear relationships between variables, multicollinearity between predictor variables, independence of the residuals, homoscedasticity, normal distribution of the residuals, and outliers or influential cases. Using scatterplots, all independent and dependent variables appeared to have linear relationships using least squares estimates. Multicollinearity appeared in regression models that included an interaction term between previous college GPA and the treatment condition when the treatment condition was indicated by a dummy code (0, 1). Variance inflation factors (VIF) were consistently greater than 10 for these models, with some standardized beta values > 1, also suggesting multicollinearity. To account for this, the treatment condition was recoded to be centered on 0 (-1 for control, 1 for treatment). The previous college GPA variable was also centered on 0 by subtracting the mean from each observation.
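The recoding described above can be demonstrated directly: with a 0/1 dummy, the interaction term (treatment × GPA) overlaps heavily with the dummy itself, and centering both predictors breaks that overlap. A minimal sketch with made-up data (six hypothetical participants, not the study's):

```python
# Why the treatment indicator was recoded: under 0/1 dummy coding the
# interaction term is highly correlated with the dummy itself, producing
# multicollinearity; effect coding (-1/+1) plus mean-centering GPA removes
# the overlap. Data are illustrative only.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

treat = [0, 0, 0, 1, 1, 1]                  # dummy coding
gpa = [2.0, 3.0, 4.0, 2.0, 3.0, 4.0]

# Interaction under dummy coding: highly correlated with the dummy.
inter_dummy = [t * g for t, g in zip(treat, gpa)]
r_dummy = pearson(treat, inter_dummy)

# Recode: -1/+1 treatment, GPA centered at its mean.
treat_c = [2 * t - 1 for t in treat]
mean_gpa = sum(gpa) / len(gpa)
gpa_c = [g - mean_gpa for g in gpa]
inter_c = [t * g for t, g in zip(treat_c, gpa_c)]
r_centered = pearson(treat_c, inter_c)

print(round(r_dummy, 2), round(r_centered, 2))   # 0.93 0.0
```

In this balanced example the predictor-interaction correlation drops from roughly .93 to exactly zero after centering, which is why the recoded models showed VIF < 10.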
The new regression models and interaction terms did not show multicollinearity, with all VIF < 10. All models were tested for the assumption of independent errors using the Durbin-Watson test; all models met this assumption. Scatter plots of fitted values and residuals were analyzed for any possible heteroscedasticity in the data that might indicate violations of the assumptions of linearity and normally distributed residuals. Most regression models met this assumption, with the exception of five SRL outcome measures (MCSR1, BD1, ORG3, SA2 and SA3). Cook's distance identified influential cases in the data for these models. As the treatment condition did not show any significant main or interaction effects in these models, all original data were retained for each model. The size and nature of the sample contributed to some issues with normally distributed variables and outcome measures (see Table 7). Though these variables displayed distributions that differ from the normal distribution, they can still be used in regression as long as the results are not extrapolated to the general population (Field et al., 2012). Three treatment participants in the grade distribution data had grades more than two standard deviations below the mean final grade (M = 80.45, SD = 9.01). These participants' data were removed for the grade outcome analyses only. One additional participant in the treatment condition was removed due to having missed one of the three semester exams, resulting in a standardized final grade of Z = -1.93. Though these students were removed from the grades data analysis, they were retained as part of the SRL data analysis. Given these concerns and challenges with assumptions, results should be interpreted with caution, especially when considering their generalizability to other populations. For example, researchers and practitioners should refrain from generalizing these results to more racially diverse, traditional-aged college populations (e.g.
ages 18-20), or to courses delivered in different formats or subjects. The sample for this study is demographically very specific and relatively small; therefore, the results may not necessarily translate to other populations. Effects on SRL The first research question for this study asks whether students receiving the intervention showed higher levels of SRL. The null hypothesis for this question is that there is no significant difference between the treatment and control groups. Regression analysis used the four SRL factors extracted through the EFA as outcome measures. Two separate regression models were run for each outcome measure at each of the three time points. The first model utilized treatment condition as the predictor variable with college GPA as a covariate. College GPA was included in order to account for college success coming into the semester and to control for students' ability to navigate the academic environment. College GPA referred to the cumulative GPA the student had on record prior to the beginning of the semester. For most participants in the study, this GPA reflected their first semester's overall performance, as the course for this study typically took place in students' second semester of the first year. The interaction effect between college GPA and the treatment condition was also included in the first model. Results suggest that the treatment condition did not have a significant effect on any of the SRL measures, including MCSR, behavioral disaffection, study approach, or organization (see Tables 8-11). These results held across all time points. In addition, there was no significant interaction effect between treatment condition and college GPA. The second model added demographic and academic background information to the first model.
Demographic variables including gender and race were added to determine if the treatment had any main or interaction effect when controlling for differences in gender and race. Due to the sample in this study, 'minority' was utilized to represent participants who self-identified as black, Hispanic, Asian, or multiracial. These ethnicities were included in the minority designation at the institution sampled. Two additional covariates were added: student commitment to the engineering major (on a scale of 1 to 10) and parents' highest level of education (high school graduate to advanced degree holder). This model helped determine if the treatment had any main or interaction effect when controlling for demographics as well as participant commitment to their academic program and parental academic background. Results suggest that the treatment condition did not have a significant main effect on any of the SRL measures using this model. This result held across all time points. In addition, there was no significant interaction effect between treatment condition and college GPA in these models (see Tables 8-11). The first research question also asked if level of participation in the intervention had any effects on the SRL outcome measures. The SRL outcome measures (4 measures x 3 time points = 12 in total) were analyzed in a MANOVA to determine if participants in the treatment (n = 21) or treatment plus (n = 28) conditions differed significantly from the control group on the SRL measures. Participants in the treatment plus condition completed at least one component of the intervention during the semester (accessed LMS material, completed a review quiz, or completed a reflection exercise). Results showed that there were no significant differences between the three groups (control, treatment, treatment plus) on the SRL outcome measures (F(24, 120) = 1.00, p = .47).
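The follow-up univariate comparisons used throughout this chapter reduce to one-way ANOVA F statistics across the three groups (the MANOVA itself is multivariate and omitted here). A minimal sketch with made-up observations, not the study's data:

```python
# One-way ANOVA F statistic, as used in the univariate follow-up tests
# comparing the control, treatment, and treatment-plus groups.
# Data are illustrative only.
def anova_f(groups):
    """groups: list of lists of observations, one inner list per group."""
    all_obs = [x for g in groups for x in g]
    grand = sum(all_obs) / len(all_obs)
    k, n = len(groups), len(all_obs)
    # Between-group sum of squares: weighted squared deviations of group means.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: deviations from each group's own mean.
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Three small groups with separated means yield a large F.
f = anova_f([[1, 2, 3], [4, 5, 6], [2, 3, 4]])
print(round(f, 2))   # 7.0
```

An F near 1 (with a large p, as in the results above) indicates group means no more spread out than chance would produce.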
These results suggest that the degree to which participants completed the intervention components had no additional effect on SRL (see Tables 12 and 13). Effects on Academic Performance The second research question asks if treatment participants showed improved academic performance, and how any effects were moderated by past academic performance. Multiple regressions were performed to analyze the effects of the treatment condition on academic performance outcomes using college GPA as a covariate. The academic outcomes included scores on the three exams, the final exam, and the overall final grade (five models in total). Interaction effects between the treatment condition and college GPA were included in all models. A Bonferroni correction was applied due to the number of regression models run; after the correction, the statistical significance threshold was p < .003. Results suggested that the treatment had a significant main and interaction effect on the first exam score only; however, the level of significance did not meet the Bonferroni-corrected threshold (see Table 14). There was also a significant positive interaction effect between college GPA and the treatment condition by traditional standards (p < .05), but not when accounting for the Bonferroni correction (p < .003). This suggests that the treatment may have had a stronger effect on students with lower prior achievement (see Figure 5). The main effect of the treatment and the interaction effect between college GPA and the treatment condition on the first test were not statistically significant when including demographic and academic background information in model 2. Multiple regression analysis for the other academic outcome measures did not indicate any significant main or interaction effects of the treatment condition. College GPA, however, was a significant predictor for all academic outcome measures.
The treatment also did not have any statistically significant interaction effect with college GPA on these other academic outcomes (see Tables 14 and 15). College GPA remained a significant predictor in model 2 for tests 2 and 3, the final exam, and the final grade. Commitment to the engineering major also had a significant positive main effect on exam 3, the final exam, and the final grade (see Tables 14 and 15). Degree of participation in the intervention did not have a main effect on the academic outcome measures. The three semester exams were analyzed as outcome measures in a MANOVA to determine if differences existed between the control, treatment, and treatment plus (e.g. accessed LMS material, completed a review quiz or reflection) conditions. Results suggested that the treatment conditions may have had an effect on the outcome measures at the multivariate level, F(6, 184) = 2.68, p < .05. However, there were no statistically significant differences detected for any of the outcomes with the follow-up univariate analyses for the first (F(2, 93) = 1.92, p = .15), second (F(2, 93) = 1.66, p = .20), or third (F(2, 93) = .15, p = .86) exams (see Table 13). Univariate ANOVAs were conducted for final exam grade and final course grade to avoid issues of overlap and intercorrelation with the three exam measures. Results showed that the treatment conditions did not have a significant effect on the final exam (F(2, 93) = .65, p = .52) or the final grade for the course (F(2, 93) = .14, p = .87) (see Table 13). Ancillary Analyses: Participant Engagement with SRL Material Additional data were obtained from treatment elements that allowed for further analysis. The treatment for this study consisted of three components that allowed for analysis of participant activity. The placement of the SRL material in the LMS and the optional nature of the intervention led to challenges with the intervention's overall fidelity.
Participants were directed to have push notifications enabled, allowing the messages to appear on their mobile device without having to access the mobile application. Participants may not have received messages if they disabled notifications on their phone or for other reasons. According to Remind (Remind Inc., 2015) dashboard data obtained for this study, between 82 and 98 percent of participants read each message. Another component of the intervention involved students accessing SRL training material in the LMS. Of the 49 participants in the treatment condition, 22 students (45%) accessed the material at least once over the course of the semester, with each of these participants accessing the material 3.77 times on average. Participants were prompted by text message to access the material eight separate times over the course of the semester as part of the 30 text messages sent. In addition, each SRL module in the LMS had a quiz to complete and check for understanding. Only eight of the 49 participants (16%) completed at least one of these quizzes, completing an average of 3.25 of the eight quizzes (see Table 16). Just over half of the participants in the treatment condition completed any portion of the exercises or viewed the SRL materials in the LMS system. None of the participants in the treatment condition completed all the exercises or accessed all of the materials for the intervention. Less than a third of treatment participants completed at least one of the three reflection exercises (n = 16); these participants completed an average of 1.19 of the three reflection exercises sent to them. The most reflections were completed after the first exam (n = 11), with fewer completed after the second exam (n = 2) and third exam (n = 8). The data obtained from the reflection prompts give some indication as to how participants in the treatment condition approached exams and reflected on their performance (see Table 17).
Reflection prompts should be adjusted to better capture the SRL strategies used by participants relative to the research questions. For example, structured interview questions such as those in Cleary & Zimmerman (2004) may garner better data about student SRL. Implementing a reflection such as the one used in Cleary & Zimmerman (2004) would be more extensive but would likely result in improved qualitative data more reflective of the phases of SRL (Zimmerman, 2002). Reflection prompt questions for this study closely followed those used in Zimmerman et al. (2011). The reflection exercises were intended to be brief and general so that participants would both complete them and answer openly (i.e. open-ended answers). Raw data were obtained from the online survey system used to administer the self-reflection exercises (see Appendix D). Treatment participants were prompted to complete a short self-reflection exercise after they had received the results of each of their three exams. Approximately one third of treatment participants completed at least one reflection during the semester (n = 16). Responses were coded using a basic schema (see Table 18) to determine if any trends existed among participant reflections (n = 21). The majority of participants who completed reflections (n = 12) reported engaging in study approaches that suggested some form of SRL. These included study approaches suggesting strategic planning, self-experimentation, or self-instruction, amongst others (Zimmerman, 2002). A smaller group of participants indicated that they had studied with other students (n = 6), while another group indicated that they simply reread material (n = 6). One participant reported not doing any studying in preparation for their exam. Reflection prompt data suggest that students in the treatment condition engaged in a variety of exam preparation activities.
Participants reported studying a little over 4.5 hours per exam and practicing approximately 25 questions per test. Each of the three semester exams consisted of 50 questions. The amount of practice appeared to decline markedly by the end of the semester: participants completing reflections reported practicing only five questions each in preparation for the third exam of the term. It is unclear if this is an indication of increased confidence, less available time to study, or use of other approaches to studying. Participants' level of preparation for exams varied considerably. Approximately half of the reflections (n = 12) indicated that the participant did something with the material suggesting effective SRL. The other half of the reflections indicated either generic group work, simply rereading material, or not preparing at all (n = 13). The narrative responses seem to confirm that the treatment condition did little to change behavior. Participants were asked to reflect on the general strategies they utilized to prepare for exams and what they planned to do differently in preparation for future tests (Zimmerman et al., 2011). Responses mentioned various SRL approaches, ranging from reviewing class slides to creating their own quizzes and attending study groups. Participants made a number of observations when reflecting on what could have been done differently. Some participants said they did not study enough, while others claimed to have made 'silly mistakes.' Others stated they did not take the time to 'remember specific details of certain questions.' These results suggest that a slight majority of participants in the treatment condition who completed reflections engaged in at least basic SRL approaches to their studies. While studying in groups is not necessarily a specified component of the SRL models reviewed here, its use could indicate strategic planning, help seeking, and developing task strategies, as examples.
Given the online format of the reflection form and the limited space to write, more expansive descriptions of the SRL tactics used would be helpful for a more in-depth analysis. In reflecting on what had gone wrong or was not effective in their preparation, participants reported a number of observations about their respective study approaches. Using a basic coding schema, the main challenges included not memorizing enough material (n = 6), not accurately anticipating exam questions (n = 5), and not studying enough or leaving adequate time for studying (n = 4). Participants indicated a range of reasons why their exam may not have gone as planned. A total of six different reasons were coded across all 21 responses. Responses mostly highlighted that not enough time or practice was spent in preparation for the exam or that participants felt they did not memorize enough. Some responses (n = 3) went as far as to suggest that the faculty member and course materials did not provide adequate information to prepare for the exam. The last question in the reflection prompt asked participants to consider what strategies they would use to prepare for future exams. The top answers included starting to study sooner (n = 6), reviewing more specific details (n = 6), and better understanding concepts (n = 4). DISCUSSION SRL interventions at the college level typically consist of tutoring, targeted coursework, or brief workshops (Wolters & Hoops, 2015). This study looked at whether an abbreviated program could be administered via a text messaging application. It was expected that skills learned from online SRL modules (e.g. learning strategies, study strategies, time management) would be utilized effectively through regular prompting. Past studies have suggested that text message and similar interventions can help students and families with administrative tasks (e.g.
financial aid and registration), but have yet to realize their potential in effecting change on college student SRL and academic success (Castleman & Page, 2016; Goh et al., 2012; Mabel et al., 2017; Oreopoulos et al., 2018). Of the 17 multiple regression models run, only one yielded a significant main effect of the treatment or a treatment by college GPA interaction on the outcome measure (Test 1); however, this result did not meet the more conservative Bonferroni-corrected threshold. Research Question 1 The first research question asked whether participants in the intervention group showed higher levels of SRL than those in the control group. It was expected that participants in the intervention would display higher levels of SRL, especially if they completed more components of the intervention. The intervention did not significantly change behavior according to the SRL measures obtained for this study, even for participants who completed more components of the program (e.g. reflections, review quizzes). The lack of change in SRL over time was unexpected. Past studies suggested that changes to SRL behavior over the course of a semester were possible (Sitzmann & Ely, 2010). In addition, the intervention was structured so that students would learn effective SRL approaches at the beginning of the semester that would prove beneficial at the end of the term. The fact that SRL did not change significantly over time may suggest one of several possible conclusions. Developing effective SRL may take more time than was allowed here. While college students are considered more adaptable to different learning environments, research suggests that this is not the case for all (Peverly et al., 2003). For example, some studies suggest that college students may not accurately predict performance on an upcoming exam, even if they spent time studying the material (Peverly et al., 2003).
College students may need to be highly adaptable with limited opportunities to practice new SRL approaches. This is especially relevant for classes with only a few exams, as was the case in this study (Peverly et al., 2003). Another possible explanation is that student SRL approaches are so ingrained that a more intensive intervention is needed to change behavior. Some research has shown that SRL can be taught (e.g. Cleary & Zimmerman, 2004), but debate remains as to how much SRL approaches can actually be altered at the college level (Peverly et al., 2003). This may be especially true for students with a history of lower academic performance. If previous college GPA is the best predictor of academic performance and an indicator of successful SRL approaches, a student with a lower college GPA may not be easily influenced by SRL interventions. The intervention did not have any significant impact on the SRL outcome measures, and there was some indication that it had a slightly negative though non-significant effect. The correlation analysis showed the treatment condition had negative correlations with SRL outcome measures at numerous points during the semester (an increase in behavioral disaffection is a negative outcome). It is possible that the text message prompts, reminders, and SRL activities caused students to access their mobile devices, creating a more distracting situation. Some research suggests that social media and mobile engagement can be positively correlated with engagement and communication, but also have a negative impact on college performance (Junco, 2012). In studying time spent on Facebook in relation to college performance, Junco (2012) found that large increases in time on Facebook equated to lower college performance. The number of times a student checked Facebook, however, was not found to have any correlation with college GPA (Junco, 2012).
There are some instances where checking social media such as Facebook may have benefits for college students, including increasing interactions and collaborations with classmates (Junco, 2012). Time on Facebook was associated with lower college GPA, but does not necessarily equate to less time studying, and it remains less of a predictor than prior academic history (e.g. high school and college GPA) (Junco, 2012). Students who received the prompts in this study may have ended up using their phones for distracting purposes such as social media. The deep thinking required for SRL may not be fostered by text prompts, which may actually deter students from productive SRL behaviors. Butler & Winne's (1995) SRL model emphasizes feedback either provided to or sought out by the learner. Participants who use effective SRL may seek feedback on their own (e.g. visiting faculty office hours, reviewing incorrect answers on an exam), but there was no student-specific feedback opportunity provided through this particular intervention. While the reflections prompted students to do this on their own, the intervention did not have the capacity to, nor the intention of, providing individual feedback. Future interventions may look for more ways to provide specific, recursive, and ongoing feedback in order to be more effective (Butler & Winne, 1995). One last possible explanation is that the full intervention would have been more effective had it been required for students as part of this particular course. The optional nature of the intervention may have lowered the perceived incentive to complete its components, especially accessing LMS material and completing reflection exercises. In other words, more participants may have completed much more of the intervention had the components been required by the faculty member and counted toward the course grade.
This may have allowed for a much better evaluation of the intervention against the control condition. Future research should consider ways in which such an intervention can be fully embedded and required as part of a college course. Programs that embedded SRL teaching have been shown to be effective in increasing both SRL and academic outcomes (e.g. Cleary & Zimmerman, 2004; Zimmerman, 2011). Research Question 2 Participants in the treatment condition were expected to show better academic performance than the control condition. The second research question also asked if past academic performance moderated the main effects of the intervention on academic outcomes. Results suggested that the intervention had a significant positive effect on the first exam at the beginning of the semester, though not at a level that satisfied the Bonferroni correction that was applied. The same effect did not appear in subsequent exams, the cumulative final exam, or participants' final grades. In addition, the interaction effect was not significant when applying the Bonferroni correction. The main effect of the treatment condition and the interaction effect were found to be non-significant in model 2, which included demographic and other background variables. These outcomes were somewhat expected given the lack of change in SRL behavior for the treatment condition. Similar to the SRL measures, the treatment condition had negative, non-significant correlations with some academic outcome measures. Students in the treatment group performed worse than the control group on subsequent exams on average, but not at a statistically significant level. The lack of a significant finding for both SRL measures and academic performance runs somewhat contrary to previous studies. It does not support previous results suggesting that SRL approaches could improve over time with regular training and prompting (Sitzmann & Ely, 2010).
Participants may have relied more heavily on practice exams or means other than the intervention to plan their SRL approaches. Changes to SRL approaches may have been based on prior college experiences and feedback provided by classmates or others. Treatment participants may have relied less on the intervention and more on feedback from study groups or on their performance on exams throughout the semester. Butler and Winne (1995) emphasize the critical importance of feedback in SRL, which may have been supplied by the first test. Some interaction data suggest a possible initial novelty effect of the intervention (Davis & Abbitt, 2013). For example, the most reflections were completed after the first exam (n = 11), substantially more than after the second exam (n = 2) or third exam (n = 7). The number of completed LMS quizzes also decreased over the course of the semester, from a high on the first review quiz (n = 8) to a low on the last three quizzes (n = 2). These data suggest that student interaction with the intervention generally decreased after a short period of time. Accessing LMS materials and completing reflections started low and remained low throughout the semester. The newness of the intervention for treatment participants may have led to better SRL and test preparation behaviors early in the semester. Any effects of the intervention may have faded as participants got used to the message prompts and stopped seeing them as useful to their success in the course. While the information provided may have been helpful, requiring participants to access it through the LMS may not have been most effective. Comparing Results to Previous Studies The intervention included components of previous studies that researched the impact of SRL teaching (Cleary & Zimmerman, 2004; Cleary et al., 2008), SRL prompting (Sitzmann et al., 2009; Sitzmann & Ely, 2010), and reflection activities (Zimmerman et al., 2011).
The results from this study extend related research in a number of ways. This intervention added a technological component to successful interventions that were delivered without technology. The Self-Regulation Empowerment Program (Cleary et al., 2008) was delivered successfully via in-person intensive advisement and teaching. This intervention modified the program in a manner that could be delivered via the LMS with text message reminders. In addition, this intervention modified the in-person reflection exercises used in Zimmerman et al. (2011) to be prompted by text message and administered through an electronic surveying system. Results of those studies suggested the interventions helped improve academic performance to some degree. Only the Cleary et al. (2008) study had results suggesting that the intervention impacted SRL. It should be noted that the Cleary et al. (2008) study sample consisted of urban high school students, while Zimmerman et al. (2011) and this study included participants in college. This study added a text message component to previous interventions that used another form of technology to prompt SRL. Studies such as Sitzmann et al. (2009) and Sitzmann and Ely (2010) implemented SRL prompts by embedding them in an LMS or other software used in the learning process. Results suggested that SRL prompts could modestly improve academic performance, though those same results were not found here (Sitzmann et al., 2009; Sitzmann & Ely, 2010). Participants in those studies were mostly adult learners taking online courses or training (Sitzmann et al., 2009; Sitzmann & Ely, 2010). This study included participants from a more traditional college-aged population. Recent research used a prompting and reminding approach to improve SRL and academic performance amongst college students (Dobronyi et al., 2017; Oreopoulos et al., 2018).
These studies intervened with college students on a more general level, where interventions were focused on general college success rather than supplementing specific courses (Dobronyi et al., 2017; Oreopoulos et al., 2018). Those studies also did not see an increase in the outcome measures of interest (overall GPA, credit completion, and retention) but did see an increase in study time amongst participants (Oreopoulos et al., 2018). Results of the intervention tested here are very much aligned with these studies in that academic indicators were not significantly improved and the effects on SRL are questionable (Dobronyi et al., 2017; Oreopoulos et al., 2018). Reference studies for this research varied in sample size. Some used a one-subject case study approach (Cleary & Zimmerman, 2004) while others used sample sizes of 3,000 (Oreopoulos et al., 2018). In general, the sample size for this study was significantly smaller than in most other similar studies referenced here. This small sample size needs to be considered in the interpretation of the results. For example, the small sample contributed to issues of non-normal data sets. Though outliers were removed for academic outcome measures, the small sample size in this study led to some influential cases that had more of an impact on results than they would have in larger samples.

Text Messaging and Prompting SRL

As mentioned in the literature review, Mabel et al. (2017) implemented a text message campaign to help students complete their degrees. The campaign reduced college dropout but did not improve academic performance (Mabel et al., 2017). The intervention in this study had some initial impact on academic performance on the first exam that did not continue into the middle and end of the semester. The scope of this study did not include the impact on dropout from the course, major program, or institution.
Future studies should consider dropout from course, major, or institution as outcome measures of interest. Past studies showed that SRL prompts could have promising results for academic performance in specific populations (Sitzmann et al., 2009; Sitzmann & Ely, 2010). Results have shown that intervention groups improved academic performance but did not improve SRL measures (Sitzmann et al., 2009; Sitzmann & Ely, 2010). SRL prompts have been shown to increase time on task, which can be correlated with academic performance (Sitzmann & Johnson, 2012). Academic performance improvements on multiple choice exams were shown in Sitzmann and Johnson (2012). Results of this study do not align with prior findings suggesting that academic performance could be modestly improved. Merely prompting SRL may not be enough to change behavior. Sitzmann and Johnson (2012) found that prompting SRL was effective in improving academic performance and increasing time on task when paired with a planning intervention. This study included a self-reflection component rather than a planning intervention. Zimmerman et al. (2011) showed that reflective exercises could significantly increase academic outcomes but not necessarily self-efficacy or self-evaluation judgments. Zimmerman et al. (2011) also emphasized the need for an instructional approach to SRL training in addition to reflection activities.

Teaching SRL

The Self-Regulation Empowerment Program (SREP) successfully educated high school students on effective SRL, contributing to improvement in biology class performance (Cleary & Zimmerman, 2004). This study implemented components of the SREP, where students were asked to read online material about effective SRL. Cleary et al. (2008) showed increased academic performance, SRL, and motivation amongst a sample of eight high school participants. The study presented here included more participants while administering a similar program via mobile prompts.
The lack of participation in the LMS training component of the intervention leaves many questions as to whether results shown in Cleary et al. (2008) could be replicated. SRL measures used here were different from those in Cleary et al. (2008), though many of the questions in the survey instrument were similar. Cleary et al. (2008) showed significant SRL improvement within the SREP treatment group but had full participation from its eight participants. The SREP consisted of regular meetings prior to the regular school day, where SRL lessons were delivered in 50-minute sessions over the course of 11 weeks. Each session included intensive instruction on the many components of SRL as they relate to the Zimmerman (2002) model. After an initial introductory period of the course, the program included training on effective learning approaches using instruction, modeling, and practice with feedback (Cleary et al., 2008). Results of this study did not show significantly higher levels of SRL for participants in the treatment group. The SREP had a more intensive, in-person approach to SRL education than this intervention. The electronic text message approach taken in this study did not appear to have the same effect as in-person instruction on SRL. Given the intricacies involved in any SRL process and approach, generic text messaging may not provide adequate feedback to influence student behavior. Specific feedback is an important component of both SRL theory (Butler & Winne, 1995) and successful SRL interventions (Cleary et al., 2008). In other words, SRL feedback from an instructor may be more effective than SRL reminders or prompts via text message. Future research should consider ways in which a text message intervention can be changed to provide more student-specific feedback that better replicates an in-person experience.
Though this approach has worked in a number of text interventions that focused on student enrollment and financial aid processes (Castleman & Page, 2015, 2016), it has yet to show the same level of impact on SRL. Students may not have adequate SRL to effectively utilize less structured information in their coursework (Moos, 2018). In other words, some students may struggle to see ways to integrate SRL strategies without clear direction. Future studies may want to experiment with modifications to the intervention tested here. Improvements may include a two-way communication method that can better connect participants to learning resources (e.g., tutoring or coaching). Other research might consider ways to embed SRL teaching and training materials directly into the intervention, rather than requiring students to access them through the LMS. Future studies may use multiple treatment groups to determine the optimal time of day or evening to message participants. A treatment group that receives messages in the evening may better prompt students when more time may be available for studying (Tabuenca et al., 2015). Additional studies should ensure that SRL learning modules are required as part of the study or as part of the course in which students are enrolled. Requiring or embedding SRL activities has proven effective previously (e.g., Zimmerman et al., 2011).

Self-Reflection

Reflection activities in Zimmerman et al. (2011) were successful in helping college students improve performance on several math exams. Results of this study did not show improvement in academic performance for participants in the treatment condition. Zimmerman et al. (2011) found that academic performance increased over time. Results of this study suggest that the treatment condition, including reflection activities, did not significantly improve academic performance at any time during the semester. The intervention in Zimmerman et al. (2011) was structured in a way that helped foster SRL over time.
The participants in the treatment condition in the Zimmerman et al. (2011) study received additional SRL-based instruction within the remedial math class in which they were enrolled. Instruction included modeling how to correct errors, guided self-reflection exercises, and an incentive system to promote additional attempts at exam questions that were answered incorrectly (Zimmerman et al., 2011). In addition, treatment participants took additional quizzes every three class sessions as a means of reviewing feedback more frequently (Zimmerman et al., 2011). As part of the reflection exercises, treatment participants completed a self-reflection form, which was replicated in this intervention (Zimmerman et al., 2011). The exercises served as an iterative process by which students used self-reflection to prepare for subsequent exams. Similar to this study, the treatment in Zimmerman et al. (2011) did not yield any significant differences in self-efficacy and self-evaluation. Though the SRL measures and constructs were different, the intervention in Zimmerman et al. (2011) also did not show significant improvements in SRL measures. This study may have seen better SRL and academic performance results had more participants completed the reflections. This is another example of how the intervention tested here lacked instructor feedback, perhaps relying too heavily on students to self-reflect on their own with little guidance. Students lacking effective SRL approaches may not have received enough SRL instruction through the LMS modules to make a difference. Future iterations of the intervention should look at ways that an instructor, tutor, or mentor can use the information from the self-reflections to better guide students as they prepare for subsequent exams. This intervention was intended to improve student SRL strategies so that they would be implemented effectively. Reflection responses should have indicated use of SRL strategies taught through the LMS modules.
Participants would be expected to report using effective SRL strategies such as test preparation, self-evaluation of learning, or organizing materials. Exam preparation reflection data suggested that a portion of students did not utilize effective SRL strategies (e.g., participants just read material). When asked about future strategies for subsequent exams, all participants indicated that they planned to use some form of SRL. Despite the low number of reflections, there is some indication that the reflection process at least led students to consider their SRL approaches for future exams. While some of these data are encouraging, they do not show a strong enough pattern to explain the lack of difference in SRL and academic performance measures between treatment and control groups over the course of the semester.

Intervention Effectiveness

The intervention did not appear to assist students in their academic performance or in their SRL approaches. Given the sample in this study and the results presented, this type of intervention may not be effective in its impact on SRL or academic performance with the general college-aged population. The overall effect on SRL and academic performance generally falls short of the results of other past similar studies. The hypothesis that the intervention would have a significant positive effect on all or most SRL and academic outcomes was not confirmed through this study. While the intervention may have had its challenges with participation, the rate at which students viewed the messages sent to them was encouraging. Anywhere between 82 and 98 percent of participants read each text message sent over the course of the term. There are no data in this study to compare text messaging and email, but it is safe to say that the text message technology was effective in reaching students.
Participants seemed open to receiving and reading text messages about their academics; however, results suggest that few students took action beyond reading these messages. The text components of the intervention had significantly more participation than other elements, perhaps due to the ease of receiving information via text versus looking up additional information or exercises in an LMS. Future studies looking at this type of intervention should ensure that completion of all elements is mandated as part of enrollment in the course. Past studies have shown that incorporating SRL teaching into coursework has been effective in improving student performance (Zimmerman et al., 2011). The optional nature of this intervention likely led to few students taking the time to complete its components. Additional data should be gathered from participants regarding how they reacted to the messaging and other components of the intervention. For example, treatment participants might be surveyed about how they reacted to the messages, whether they took any action when they received the messaging, and how likely they were to change their study behaviors. Additional data could be gathered on whether participants found the information helpful, and if they believed it might have influenced their SRL or performance. Supplemental information could also be collected through tutoring services, faculty office hours, and elsewhere to determine if treatment participation led to any changes in help-seeking behaviors. Results showed some negative non-significant correlations between the treatment and SRL measures. Researchers may consider looking into whether text message interventions could in fact be detrimental to college students. Receiving regular and ongoing text message prompts may distract even the best students from proper studying.
While a student may take the time to read an SRL prompt on their phone, they may subsequently spend time in a more distracting medium such as social media. This could pose unintended challenges to achieving student SRL. Text message prompts may be as much a distraction to some students as a benefit to others.

Impact on Students with Low GPA

The intervention was expected to show a significant interaction effect for students with low college GPAs, in line with previous studies (Mabel et al., 2017). The treatment condition did not show a significant interaction effect with college GPA on any of the exams or the final grade. A significant interaction effect would have suggested that the treatment had an impact on students with low GPAs. Future research should consider ways to improve supports for college students with low GPAs through similar interventions. A text message intervention like the one tested here may be more beneficial to weaker college students than stronger ones, though results of this study do not confirm this benefit. Participants with better academic histories and stronger SRL skills did not seem to benefit from the intervention any more than students in the control condition. High-achieving students may not need to adjust their SRL approaches or look to additional resources to help improve their overall academic standing.

Considering Content and Timing of Messages

Timing, frequency, and content approaches to SRL prompts varied in previous studies. Some studies sent text messages to students approximately once per week (e.g., Castleman & Page, 2015) while others were far more frequent at approximately six times per week (e.g., Frankfort et al., 2015). Other interventions messaged students at specific, strategic times (Mabel et al., 2017; Tabuenca et al., 2015). Research suggests that practitioners should carefully design interventions to meet intended outcomes; however, the approach should be grounded in research (Yeager et al., 2016).
The intervention in this study was intended to get students to consider their SRL on a more frequent basis than they might otherwise. It was designed to prompt students regularly but non-intrusively and drive behavioral change. Messages were sent approximately twice per week in the late morning hours (approximately 11:00 AM). This timing was expected to increase the probability that students would be awake to read the messages and use them to plan their SRL for the rest of the day. Finding the balance in communication with college students via cell phone, social networking, and text messaging is an ongoing challenge. These challenges are likely to continue as student use of mobile and text message technology continues to evolve. Future studies may consider implementing multiple treatment conditions where messages are sent to students at different frequencies and different times of day to find the optimal approach.

Implications for SRL Theory

Results of this study are worth considering in the context of SRL theory. The roles of feedback and continuous prompting may suggest ways to improve similar interventions and to conduct future research. One central difference between the two models considered for this study (Butler & Winne, 1995; Zimmerman, 2002) is the role of external feedback. One model theorizes that significant feedback to the student will improve SRL (Butler & Winne, 1995). Specific feedback on student performance may foster more intentional SRL based on the learning context and better internal monitoring (Butler & Winne, 1995). Providing course-specific feedback or embedding SRL teaching specific to the learning context has been successful in previous studies (Zimmerman et al., 2011). Zimmerman (2002), while not explicitly including external feedback in the SRL model, implies that it plays a role in the SRL self-reflection phase.
This phase includes self-evaluation and determining causal attribution, amongst other processes (Zimmerman, 2002). This intervention did not provide customized student-specific feedback, but rather generic SRL prompts and strategies that could be helpful to any student. For Butler and Winne (1995), feedback is an important part of SRL and plays a significant role in helping the learner determine whether their strategies for learning tasks were appropriate. Feedback also helps the learner determine what their approach for subsequent tasks might be (Butler & Winne, 1995). Including specific external feedback as suggested in Butler and Winne (1995) may help improve SRL behavior; however, more data would need to be gathered to determine if this change could potentially lead to improved SRL outcomes. Continuous prompting to utilize SRL may trigger students to use strategies at specific times rather than to actually change their overall approach. For example, the average college student may already use well-established goal setting approaches and won't need prompting to use effective strategies. In considering Butler and Winne (1995), continuous prompting could be viewed as a mechanism to foster actions within the cognitive system, providing internal and recursive feedback, setting goals, and planning strategies. In considering the three phases of Zimmerman's (2002) SRL model, continuous prompting via text message may be helpful to all three phases of SRL (forethought, performance, and self-reflection). Results of this study do not suggest that the intervention tested had any significant impact on SRL. Results did not show that SRL in the treatment condition was used any more than in the control condition. Determining which phase of SRL might be impacted by this intervention is beyond the scope of this study; however, future research should consider ways to identify which aspects of SRL are impacted by similar interventions.
Future research could examine whether and when the timing of prompts affects SRL results, as found in studies such as Sitzmann et al. (2009) and Sitzmann and Ely (2010). Prompting may still trigger SRL more frequently or at different time periods than among those not prompted. Additional research could analyze whether SRL prompting is helpful or necessary at all.

Implications for Empirical Research

The results of this study align with those found in Oreopoulos et al. (2018). A similar text message prompt intervention increased participant study time but did not lead to increases in GPA and other outcome measures (Oreopoulos et al., 2018). This study did not look at the impact on study time, but results suggest that the intervention did not improve SRL. This study showed similar academic results to Oreopoulos et al. (2018), but it did not measure impact on retention or credit completion in other courses. Research in SRL prompting with mobile technology has considered the place of technology in the academic lives of students, the timing of interventions, and the settings where students might receive the most benefit. This research should continue as colleges, universities, and their students evolve in the mobile era. Further research should broadly consider how text message interventions can be implemented for students generally, not necessarily linked to a specific course. Institutions may consider using text messages to work with students who are in academic distress or falling behind. This intervention may be of particular help to institutions with large enrollments, retention challenges, or financial limitations. Regardless of the reason, more research on its effectiveness is needed. The framework for the intervention researched here may be adjusted for future programs and scaled to fit institutional needs. Longitudinal studies with larger samples would be beneficial to this niche field.
This study shows that prompting college student SRL consistently over time in a way that fosters results is challenging (Sitzmann et al., 2009). Intervening to help students perform academically shows some promise, but the means by which students obtain high grades may not necessarily be linked to SRL. Text message prompts may simply serve as reminders rather than leading to a change in SRL behaviors. Given the results of this study, practitioners should give significant thought before investing the time and effort to plan and execute this type of intervention. Text messaging is still a strong and effective way to reach students directly; however, its purpose needs to be carefully considered. College students may have well-established SRL approaches that are difficult to change in just one semester. Studies that were successful in doing so either worked with a younger demographic (e.g., high school students), an older demographic (online adult learners), or a highly specialized, short-term course (Cleary et al., 2008; Sitzmann & Ely, 2010). Future research should consider requiring the intervention for a group of students. Including participation in the intervention as part of both student support and grading expectations would most likely increase engagement in all aspects of the intervention. This would allow future studies to fully assess its effects or lack thereof. Additional data points should be collected as they pertain to ways in which students react to text message prompts and how they can be improved from an end user perspective.

Implications for Practice

Practitioners should consider new ways to teach students effective SRL strategies. Results of this study and others (e.g., Oreopoulos et al., 2018) suggest that the time necessary to add this enhancement to a particular course may not yield any significant increase in overall SRL approaches or academic performance.
The percentage of students who read the messages makes text messaging a promising avenue for faculty, graduate assistants, and others to communicate with students, but it should not be expected to necessarily improve SRL or academic performance. The purpose and content of text messages integrated into courses should be carefully considered before implementation. This study looked to increase SRL and academic performance, but text messaging may be more effectively utilized for class reminders, communicating changes to the course, or encouraging visits to office hours. In these ways, text messaging with programs like Remind (Remind Inc., 2015) could improve faculty communication with students about their classes. Setting up a text messaging program allows for a separate communication channel that may be better able to capture student attention than other media, including email. Practitioners need to be very intentional with how they build and deliver an intervention such as the one tested here. For example, practitioners should consider whether the intervention will serve as just a way to enhance studying or as a communication channel with students. They should consider ways in which the intervention can provide specific feedback and how it might be customized to fit the needs of different learners. For example, customized feedback might be sent to students who struggled on a specific topic in a previous exam (as suggested in Butler & Winne, 1995). Other prompts might also direct students to specific resources such as supplemental material or on-campus tutoring. The content of the messages should also speak explicitly to effective SRL strategies rather than the more generic overarching tips tested here. An SRL training or education program may be better delivered through an alternative medium or in person. The depth necessary for a successful SRL education program such as Cleary et al.
(2008) may require more than text message prompts; it may need more in-person teaching and contact. Successful SRL education studies either had completely dedicated courses or were fully integrated into the instructor's teaching approach (Zimmerman et al., 2011).

Sample Limitations

This study's sample lacked sufficient diversity in both race and gender to generalize to a larger, more traditional undergraduate population. The sample obtained for the study was smaller than anticipated (n = 100). The course used to administer this study is highly specialized, with very distinct learning outcomes and a unique student population. The sample included mostly freshman students, which prevents extrapolating effects of the intervention to a wider population. The sample obtained presented numerous challenges, including normality issues and the lack of statistical power to detect a small effect. Larger sample sizes in future studies can help determine if interventions such as the one tested here could have at least a minimal effect on SRL and academic outcomes. Results of this study should be interpreted very cautiously due to the challenges with the sample, statistical assumptions, and insufficient statistical power to detect small effects.

Statistical Assumptions and Missing Data

There were several elements of the sample and the data obtained that did not meet statistical assumptions for multivariate normality, presenting challenges to the interpretation of results. The outcome measures for both treatment and control groups appeared to diverge significantly from the normal distribution. Shapiro-Wilk test statistics for each MANOVA model also presented some challenges, including non-normal distributions for the SRL outcome measures. Results from this study should be interpreted with caution due to these assumptions not being met. Missing data were also a challenge in interpreting the results and impact of the intervention over time.
Despite surveying participants in person, approximately 12% of SRL data were missing, either due to students' lack of attendance or inconsistent survey completion within the study.

Treatment Fidelity

A significant challenge to this study was the length and structure of the intervention. The optional nature of the treatment likely lowered the intentionality with which students participated. This was especially true of the online SRL training modules and the self-reflection exercises. None of the participants in the sample completed the entire intervention (i.e., completed all training modules, read all text messages, and completed all self-reflections). Future studies should find ways to ensure that participants fully participate in the intervention to determine if full completion could be effective.

Measurement Limitations

This study measured only some elements of SRL; other constructs that are typically studied in similar research, including self-efficacy and motivation, were not measured here and may be of interest to researchers. Future research should consider how a similar intervention might impact students beyond the measures and constructs considered here. This study was only able to measure the impact of the intervention within the course sampled. It was not able to look at how students utilized SRL in past or subsequent semesters. Prior academic information (college GPA) was obtained, but the study was unable to determine if the intervention had any effect on SRL from a baseline before the course began. Future studies should consider looking at how an intervention changes SRL over time from one semester to the next.
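The sample-size concerns raised in the limitations above can be made concrete with a quick power calculation. The sketch below uses only the Python standard library and a normal approximation to the two-sample t-test; the per-group n of 50 and the small effect size d = 0.2 are illustrative round numbers, not this study's exact cell sizes.

```python
import math

def normal_cdf(x: float) -> float:
    """Standard normal CDF via the error function (stdlib only)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def approx_power(d: float, n_per_group: int, z_crit: float = 1.96) -> float:
    """Normal-approximation power for a two-sided two-sample t-test.

    d is Cohen's d; z_crit = 1.96 corresponds to alpha = .05, two-sided.
    """
    ncp = d * math.sqrt(n_per_group / 2.0)  # noncentrality under H1
    return normal_cdf(ncp - z_crit) + normal_cdf(-ncp - z_crit)

def n_for_power(d: float, target: float = 0.80) -> int:
    """Smallest per-group n whose approximate power reaches the target."""
    n = 2
    while approx_power(d, n) < target:
        n += 1
    return n

# With roughly 50 students per group, a small effect (d = 0.2) is detected
# less than a fifth of the time under this approximation.
print(round(approx_power(0.2, 50), 2))  # about 0.17
print(n_for_power(0.2))                 # roughly 390-395 per group
```

At roughly 50 students per group, power to detect a small effect sits far below the conventional .80 target, which is consistent with the caution urged above; reaching .80 for d = 0.2 would require close to 400 students per group.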
82 APPENDICES 83 APPENDIX A Tables n 6 0 6 82 6 15 85 25 43 23 8 1 8 % 6 0 6 82 6 15 85 25 43 23 8 1 8 Asian Black Hispanic White Multiracial Gender Female Male Table 1 Sample Demographics (N = 100) Characteristic Race Parent level of Education Advanced degree College graduate Some college High school degree Some high school Unknown 84 Table 2 Background Information Measure 0.55 Engineering Major 1.33 High school GPA College GPA -0.70 Note. Student commitment to Engineering major, on scale of 1-10, with 10 being highest. High School GPA on a 0-100 scale. High School GPA on other scales were converted for the purposes of this study. Treatment SD M 2.38 4.56 0.75 SD 2.20 3.94 0.62 Control M 8.30 89.50 2.74 8.04 88.27 2.67 t p .59 .19 .48 Cohen's d -.11 -.29 -.08 85 Y1 Y1 X -- X -- Table 3 Control Group Time-Series Design Group T C Adapted from Ary et al., 2010 Note. 'T' represents treatment condition, 'C' represents control condition. X' indicates treatment, 'Y' indicates measurement timepoints. Y3 Y3 X -- Y2 Y2 86 Table 4 Factor Loadings of SRL Variables A. 
Metacognitive self-regulation (MCSR) (Factor 1) Q34 Q37 I ask myself questions to make sure I understand the material I have been studying in class I try to think through a topic and decide what I am supposed to learn from it rather than just reading it over when studying I think about the types of questions that might be on a test Before I study new course material thoroughly, I often skim it to see how it is organized I ask my instructor about the topics that will be on upcoming tests Q3 Q33 Q4 Q30 When reading for this course, I make up questions to help focus my reading Q27 Q7 Q15 Q35 I think about how best to study before I begin studying I quiz myself to see how much I am learning during studying I try to identify the format of upcoming tests I try to change the way I study in order to fit the course requirements and instructor's teaching style I try to forget about the topics that I have trouble learning I lose important materials pertaining to class I forget to bring my materials when I need to study I give up or quit when I do not understand something Q38 When studying for this course I try to determine which concepts I don't understand well B. Behavioral disaffection (Factor II) *Q13 *Q10 *Q20 *Q19 C. Study approach (Factor III) Q2 Q16 Q1 I try to study in a quiet place I try to study in a place that has no distractions (e.g. Noise, people talking) I make sure no one disturbs me when I study IV .26 .14 .11 -.18 .13 Factors III I II .58 .57 -.14 .13 .14 -.21 .69 .67 .63 .59 .56 .54 .53 .53 .53 .49 .47 .42 .38 -.16 .15 .11 .14 .12 .94 .68 .56 87 Table 4 (cont’d) D. Organization (Factor IV) Q25 Q9 Note. * - reverse coded I carefully organize my study materials so I don't lose them I use binders or folders to organize my study materials SS loadings Proportion Var Alpha 88 I 2.96 .15 .82 II -.18 .15 1.83 .09 .79 Factors III 1.75 .09 .74 IV .76 .73 1.30 .07 .73 Table 5 Factor Correlation Matrix 1 2 3 4 Note. 
Variable                                   1      2      3      4
1 Metacognitive self-regulation (MCSR)     -
2 Behavioral disaffection                -.11     -
3 Study approach                          .44   -.15     -
4 Organization                            .43   -.17    .21     -

Note. Correlations represent the average across all measurements.

Table 6
Correlation Matrix of Variables

Measure                       1        2        3        4        5        6        7        8        9
1  College GPA                -
2  High School GPA           .41***    -
3  Engineering Major
   Commitment                .06      .13       -
4  Parent Education          .02      .02     -.10       -
5  Treatment                -.02     -.05     -.09      .16       -
6  Test1                     .44***   .30**    .10      .06      .18       -
7  Test2                     .63***   .34***   .16      .04     -.15      .53***    -
8  Test3                     .41***   .44***   .26*     .06     -.03      .46***   .59***    -
9  Final Exam                .48***   .41***   .34***  -.01     -.06      .57***   .59***   .63***    -
10 Transcript Grade          .59***   .48***   .32***   .06      .05      .74***   .77***   .79***   .86***
11 MCSR1                    -.19      .02      .14     -.03     -.12     -.10     -.02     -.01      .05
12 MCSR2                    -.18     -.09      .25*    -.02     -.02     -.01      .03      .01      .06
13 MCSR3                    -.30**   -.04      .17     -.07     -.14     -.06     -.07     -.02     -.11
14 BD1                      -.08     -.09     -.18     -.08      .15     -.08     -.14     -.01     -.11
15 BD2                      -.05     -.15     -.13     -.14      .13     -.10     -.15      .01      .02
16 BD3                      -.09     -.14     -.07     -.07     -.06     -.16     -.28     -.19     -.23*
17 SA1                       .01     -.13      .02     -.20*    -.10     -.02     -.16     -.13     -.02
18 SA2                      -.02     -.10      .10     -.22*    -.18     -.07     -.02     -.07      .15
19 SA3                       .01     -.30**   -.02     -.18      .02     -.09     -.19     -.23*    -.20
20 ORG1                      .23*     .02      .21*     .03     -.06     -.05      .02     -.07     -.05
21 ORG2                      .16     -.08      .07      .02      .01     -.16     -.08     -.16     -.17
22 ORG3                     -.09      .05      .16      .02     -.07     -.01     -.04     -.04      .02

Note. 'MCSR' = metacognitive self-regulation, 'BD' = behavioral disaffection, 'SA' = study approach, 'ORG' = organization; 1 = beginning of semester, 2 = middle of semester, 3 = end of semester.
* p < .05, ** p < .01, *** p < .001

Table 6 (cont'd)

Measure                      10       11       12       13       14       15       16       17
10 Transcript Grade           -
11 MCSR1                     .01       -
12 MCSR2                     .09      .71***    -
13 MCSR3                    -.08      .54***   .53***    -
14 BD1                      -.13     -.17     -.16     -.21       -
15 BD2                      -.04     -.17     -.21     -.37***   .59***    -
16 BD3                      -.26*    -.13     -.14     -.26*     .48***   .51***    -
17 SA1                      -.08      .42***   .35***   .40***  -.15     -.06     -.13       -
18 SA2                       .01      .32**    .45***   .35***  -.15     -.25*    -.01      .61***
19 SA3                      -.20      .32**    .24*     .50***  -.15     -.26*    -.20      .52***
20 ORG1                      .00      .35***   .45***   .40***  -.22*    -.19     -.13      .17
21 ORG2                     -.15      .43***   .49***   .44***  -.03     -.30***  -.20      .17
22 ORG3                      .04      .30**    .32***   .52***  -.07     -.19     -.10      .12

Table 6 (cont'd)

Measure        18       19       20       21       22
18 SA2          -
19 SA3         .47***    -
20 ORG1        .33***   .13       -
21 ORG2        .35***   .30***   .74***    -
22 ORG3        .19      .27***   .51***   .48***    -

Table 7
Outcome Measure Summary

              Treatment                           Control                             Combined
Outcome       M      SD     Ske    Kur    n       M      SD     Ske    Kur    n       M      SD     Ske    Kur    n
BD1           2.41   1.04   1.17   0.93   47      2.17   0.98   1.44   4.05   46      2.29   1.01   1.29   2.50   93
BD2           2.68   1.09   1.16   2.12   46      2.47   1.26   0.31  -0.55   41      2.57   1.18   0.79   0.97   87
BD3           2.69   1.00   0.73   0.33   49      2.89   1.22   0.35  -0.36   37      2.81   1.13   0.66   0.34   86
MCSR1         4.69   0.87  -0.81   0.31   47      4.82   0.97  -0.33   0.41   46      4.75   0.92  -0.58   0.20   93
MCSR2         4.79   0.66  -0.13  -0.43   45      4.82   1.02   0.30  -0.44   41      4.81   0.86  -0.03  -0.05   86
MCSR3         4.85   0.76   0.21  -0.44   49      5.11   0.81  -0.14  -0.15   37      5.00   0.80   0.10  -0.24   86
ORG1          4.98   1.41  -0.62  -0.47   47      5.07   1.63  -0.59  -0.03   46      5.03   1.52  -0.59  -0.33   93
ORG2          4.99   1.33  -0.54  -0.31   45      4.86   1.62  -0.07  -0.95   41      4.92   1.48  -0.41  -0.38   86
ORG3          4.76   1.36  -0.24  -0.51   49      4.95   1.37  -0.05  -0.15   37      4.87   1.36  -0.15  -0.44   86
SA1           4.97   1.12  -0.77  -0.19   47      5.12   1.28   0.25  -0.73   46      5.05   1.20  -0.33  -0.52   93
SA2           4.87   0.89  -1.12   0.82   45      5.09   1.35   0.42  -0.61   41      4.98   1.15  -0.70   0.64   86
SA3           5.11   1.24  -0.63  -0.74   49      5.05   1.29  -0.33  -0.82   37      5.08   1.26  -0.50  -0.78   86
Test1        76.89  10.48  -0.33  -0.43   45     72.47  12.08  -0.38  -0.53   51     74.54  11.51  -0.40  -0.38   96
Test2        74.93  13.02  -0.52  -0.50   45     78.98  11.45  -0.42  -0.63   51     77.08  12.31  -0.50  -0.51   96
Test3        77.56  11.50  -0.33   0.77   45     78.71   9.58  -1.04   2.27   51     78.17  10.49  -0.78   1.84   96
Final Exam   79.98  10.75  -0.29  -0.29   45     81.75   8.40  -0.39  -0.80   51     80.92   9.56  -0.42  -0.46   96
Final Grade  81.22   9.11  -0.23   0.53   45     80.27   8.33  -0.16  -1.12   51     80.71   8.67  -0.17  -0.40   96

Note. 'MCSR' = metacognitive self-regulation, 'BD' = behavioral disaffection, 'SA' = study approach, 'ORG' = organization; 1 = beginning of semester, 2 = middle of semester, 3 = end of semester.
Table 8
Metacognitive Self-Regulation (MCSR) Models

MCSR1
Model 1: Treatment & Past Performance (R² = .01, F = 0.42)
Variable                                       B      SE      β       p
Treatment                                    -0.04   0.10   -.04    .690
Prior GPA                                     0.10   0.15    .08    .480
Treatment x Prior GPA                         0.09   0.15    .07    .539
Model 2: Demographics & Education Background (R² = .11, F = 0.94)
Treatment                                    -0.06   0.11   -.07    .559
Prior GPA                                     0.07   0.16    .05    .678
Gender (male)                                -0.38   0.31   -.15    .225
Minority                                      0.02   0.32    .01    .944
Engineering major commitment                  0.03   0.05    .06    .587
Parent Education level (some college)         0.37   0.31    .16    .242
Parent Education level (college graduate)    -0.42   0.26   -.21    .109
Parent Education level (advanced degree)      0.34   0.21    .20    .113
Treatment x Prior GPA                         0.10   0.16    .07    .560

MCSR2
Model 1: Treatment & Past Performance (R² = .00, F = 0.05)
Treatment                                    -0.01   0.10   -.01    .915
Prior GPA                                     0.05   0.14    .04    .722
Treatment x Prior GPA                        -0.01   0.14    .00    .971
Model 2: Demographics & Education Background (R² = .13, F = 1.07)
Treatment                                     0.01   0.10    .01    .950
Prior GPA                                     0.01   0.14    .01    .965
Gender (male)                                -0.55   0.29   -.24    .059
Minority                                      0.31   0.33    .12    .345
Engineering major commitment                  0.06   0.04    .17    .175
Parent Education level (some college)         0.10   0.28    .05    .722
Parent Education level (college graduate)    -0.33   0.24   -.19    .166
Parent Education level (advanced degree)      0.06   0.19    .04    .768
Treatment x Prior GPA                         0.07   0.14    .06    .607

MCSR3
Model 1: Treatment & Past Performance (R² = .03, F = 0.82)
Treatment                                    -0.13   0.09   -.16    .146
Prior GPA                                    -0.07   0.13   -.06    .579
Treatment x Prior GPA                         0.03   0.13    .03    .815
Model 2: Demographics & Education Background (R² = .12, F = 0.96)
Treatment                                    -0.09   0.09   -.12    .329
Prior GPA                                    -0.07   0.13   -.06    .610
Gender (male)                                -0.40   0.25   -.19    .119
Minority                                      0.04   0.28    .02    .884
Engineering major commitment                  0.05   0.04    .16    .211
Parent Education level (some college)         0.00   0.25    .00    .989
Parent Education level (college graduate)    -0.12   0.21   -.07    .579
Parent Education level (advanced degree)      0.03   0.18    .02    .869
Treatment x Prior GPA                         0.14   0.13    .13    .291

Table 9
Study Approach (SA) Models

SA1
Model 1: Treatment & Past Performance (R² = .02, F = 0.49)
Variable                                       B      SE      β       p
Treatment                                    -0.12   0.13   -.10    .368
Prior GPA                                    -0.14   0.19   -.09    .447
Treatment x Prior GPA                         0.09   0.19    .06    .613
Model 2: Demographics & Education Background (R² = .10, F = 0.79)
Treatment                                    -0.01   0.14   -.01    .920
Prior GPA                                    -0.19   0.20   -.12    .360
Gender (male)                                -0.22   0.41   -.07    .582
Minority                                      0.60   0.46    .16    .197
Engineering major commitment                  0.07   0.06    .14    .269
Parent Education level (some college)        -0.24   0.40   -.08    .549
Parent Education level (college graduate)    -0.34   0.33   -.14    .301
Parent Education level (advanced degree)     -0.10   0.27   -.05    .713
Treatment x Prior GPA                         0.15   0.20    .09    .454

SA2
Model 1: Treatment & Past Performance (R² = .02, F = 0.57)
Treatment                                    -0.07   0.13   -.06    .576
Prior GPA                                    -0.12   0.19   -.07    .537
Treatment x Prior GPA                        -0.17   0.19   -.10    .377
Model 2: Demographics & Education Background (R² = .08, F = 0.69)
Treatment                                    -0.04   0.14   -.04    .761
Prior GPA                                    -0.15   0.21   -.09    .468
Gender (male)                                -0.58   0.41   -.17    .158
Minority                                      0.47   0.42    .14    .264
Engineering major commitment                 -0.02   0.07   -.03    .801
Parent Education level (some college)        -0.27   0.41   -.09    .503
Parent Education level (college graduate)     0.09   0.34    .03    .796
Parent Education level (advanced degree)      0.10   0.28    .04    .727
Treatment x Prior GPA                        -0.24   0.21   -.14    .256

SA3
Model 1: Treatment & Past Performance (R² = .02, F = 0.42)
Treatment                                     0.04   0.14    .03    .776
Prior GPA                                    -0.22   0.20   -.12    .287
Treatment x Prior GPA                         0.04   0.20    .02    .840
Model 2: Demographics & Education Background (R² = .06, F = 0.47)
Treatment                                     0.06   0.15    .05    .697
Prior GPA                                    -0.28   0.22   -.16    .200
Gender (male)                                -0.46   0.42   -.14    .270
Minority                                     -0.04   0.47   -.01    .940
Engineering major commitment                 -0.04   0.07   -.07    .558
Parent Education level (some college)        -0.27   0.42   -.09    .517
Parent Education level (college graduate)    -0.05   0.35   -.02    .886
Parent Education level (advanced degree)      0.00   0.29    .00    .997
Treatment x Prior GPA                         0.04   0.22    .02    .848

Table 10
Behavioral Disaffection (BD) Models

BD1
Model 1: Treatment & Past Performance (R² = .03, F = 0.85)
Variable                                       B      SE      β       p
Treatment                                     0.11   0.13    .09    .393
Prior GPA                                    -0.22   0.19   -.13    .249
Treatment x Prior GPA                        -0.09   0.19   -.05    .647
Model 2: Demographics & Education Background (R² = .14, F = 1.19)
Treatment                                     0.13   0.14    .11    .355
Prior GPA                                    -0.18   0.19   -.11    .370
Gender (male)                                 0.80   0.39    .25    .044
Minority                                     -0.32   0.44   -.09    .474
Engineering major commitment                 -0.02   0.06   -.05    .698
Parent Education level (some college)        -0.27   0.38   -.09    .487
Parent Education level (college graduate)    -0.12   0.32   -.05    .709
Parent Education level (advanced degree)      0.41   0.26    .20    .123
Treatment x Prior GPA                        -0.04   0.19   -.02    .828

BD2
Model 1: Treatment & Past Performance (R² = .04, F = 1.31)
Treatment                                     0.11   0.11    .10    .327
Prior GPA                                    -0.26   0.16   -.18    .106
Treatment x Prior GPA                         0.08   0.16    .05    .621
Model 2: Demographics & Education Background (R² = .13, F = 1.09)
Treatment                                     0.14   0.12    .14    .238
Prior GPA                                    -0.28   0.17   -.20    .101
Gender (male)                                 0.04   0.33    .01    .908
Minority                                      0.02   0.33    .01    .951
Engineering major commitment                 -0.05   0.05   -.10    .388
Parent Education level (some college)        -0.01   0.33   -.01    .968
Parent Education level (college graduate)     0.15   0.27    .07    .596
Parent Education level (advanced degree)      0.27   0.22    .15    .227
Treatment x Prior GPA                         0.11   0.17    .08    .516

BD3
Model 1: Treatment & Past Performance (R² = .10, F = 3.11)
Treatment                                    -0.13   0.12   -.11    .281
Prior GPA                                    -0.49   0.17   -.30    .005
Treatment x Prior GPA                        -0.03   0.17   -.02    .883
Model 2: Demographics & Education Background (R² = .20, F = 1.76)
Treatment                                    -0.04   0.12   -.04    .764
Prior GPA                                    -0.47   0.18   -.31    .009
Gender (male)                                 0.62   0.33    .22    .066
Minority                                     -0.21   0.37   -.07    .568
Engineering major commitment                  0.01   0.05    .02    .834
Parent Education level (some college)         0.00   0.34    .00    .992
Parent Education level (college graduate)    -0.30   0.28   -.14    .285
Parent Education level (advanced degree)      0.38   0.23    .20    .105
Treatment x Prior GPA                         0.05   0.18    .03    .789

Table 11
Organization (ORG) Models

ORG1
Model 1: Treatment & Past Performance (R² = .01, F = 0.23)
Variable                                       B      SE      β       p
Treatment                                    -0.06   0.16   -.04    .713
Prior GPA                                     0.00   0.24    .00    .986
Treatment x Prior GPA                        -0.17   0.24   -.08    .469
Model 2: Demographics & Education Background (R² = .07, F = 0.61)
Treatment                                    -0.14   0.18   -.09    .464
Prior GPA                                    -0.19   0.27   -.09    .478
Gender (male)                                -0.64   0.52   -.15    .219
Minority                                     -0.13   0.53   -.03    .809
Engineering major commitment                  0.07   0.08    .10    .425
Parent Education level (some college)         0.59   0.52    .15    .260
Parent Education level (college graduate)     0.09   0.43    .03    .837
Parent Education level (advanced degree)      0.14   0.35    .05    .698
Treatment x Prior GPA                        -0.21   0.27   -.10    .432

ORG2
Model 1: Treatment & Past Performance (R² = .01, F = 0.35)
Treatment                                     0.06   0.16    .04    .711
Prior GPA                                    -0.15   0.23   -.07    .531
Treatment x Prior GPA                         0.18   0.23    .09    .446
Model 2: Demographics & Education Background (R² = .16, F = 1.32)
Treatment                                    -0.02   0.18   -.02    .898
Prior GPA                                    -0.35   0.25   -.17    .167
Gender (male)                                -1.03   0.50   -.25    .044
Minority                                     -0.40   0.57   -.08    .488
Engineering major commitment                  0.00   0.07    .00    .984
Parent Education level (some college)         0.66   0.49    .18    .184
Parent Education level (college graduate)    -0.43   0.41   -.13    .305
Parent Education level (advanced degree)      0.51   0.34    .19    .135
Treatment x Prior GPA                         0.13   0.25    .06    .594

ORG3
Model 1: Treatment & Past Performance (R² = .01, F = 0.15)
Treatment                                    -0.10   0.15   -.07    .532
Prior GPA                                     0.04   0.22    .02    .852
Treatment x Prior GPA                         0.02   0.22    .01    .915
Model 2: Demographics & Education Background (R² = .20, F = 1.80)
Treatment                                    -0.22   0.16   -.16    .167
Prior GPA                                    -0.14   0.23   -.07    .551
Gender (male)                                -1.08   0.44   -.29    .015
Minority                                     -0.39   0.49   -.09    .423
Engineering major commitment                  0.02   0.07    .04    .748
Parent Education level (some college)         0.93   0.44    .27    .038
Parent Education level (college graduate)    -0.40   0.37   -.14    .286
Parent Education level (advanced degree)      0.51   0.30    .21    .098
Treatment x Prior GPA                        -0.07   0.23   -.04    .767

Table 12
Treatment vs. Treatment Plus Summary

                      Treatment (n = 21)    Treatment Plus (n = 28)    Control (n = 51)
                      M        SD           M        SD                M        SD
College GPA            2.53     0.83         2.77     0.68              2.74     0.62
Commitment to Major    8.05     2.25         8.04     2.52              8.30     2.20
Test 1                77.89     9.19        76.15    12.21             72.47    12.08
Test 2                73.15    10.65        76.23    15.85             78.98    11.45
Test 3                77.26     8.77        77.77    14.72             78.71     9.58
Final Exam            81.16    10.55        79.11    11.18             81.75     8.40
Final Grade           81.21     7.61        81.23    11.06             80.27     8.33

Note. Any subject that accessed material, completed a quiz, or completed a reflection was coded as 'treatment plus' for analysis.

Table 13
Multivariate and Univariate Analyses of Variance for Outcome Measures

Effect: Treatment condition
Multivariate (SRL)a:        F = 1.00,   p = .47,   η² = .007
Multivariate (Exams 1-3)b:  F = 2.68,   p = .016,  η² = .040
Univariatec:
  Exam 1:       F = 1.923,  p = .152,  η² = .040
  Exam 2:       F = 1.66,   p = .197,  η² = .034
  Exam 3:       F = 0.15,   p = .858,  η² = .003
  Final Exam:   F = 0.65,   p = .522,  η² = .014
  Final Grade:  F = 0.14,   p = .867,  η² = .003

Note. Multivariate F ratios were generated from Pillai's statistic. Treatment condition consists of control, treatment, and treatment plus (participants who accessed or completed at least one portion of the intervention). Multivariate (SRL) includes metacognitive self-regulation, behavioral disaffection, study approach, and organization measures. aMultivariate (SRL) df = 24, 120. bMultivariate (Exams 1-3) df = 6, 184. cUnivariate df = 2, 93.

Table 14
Exam Models

Test 1
Model 1: Treatment & Past Performance (R² = .27, F = 10.93)
Variable                                       B      SE      β       p
Treatment                                     2.51   1.04    .22    .017
Prior GPA                                     7.54   1.54    .45    .000
Treatment x Prior GPA                        -3.84   1.54   -.23    .014
Model 2: Demographics & Education Background (R² = .37, F = 4.62)
Treatment                                     2.19   1.13    .19    .055
Prior GPA                                     6.88   1.65    .41    .000
Gender (male)                                 2.01   3.24    .06    .536
Minority                                     -9.58   3.34   -.28    .005
Engineering major commitment                  0.71   0.55    .13    .201
Parent Education level (some college)        -0.59   3.30   -.12    .858
Parent Education level (college graduate)     1.98   2.73   -.09    .472
Parent Education level (advanced degree)     -1.07   2.20   -.05    .629
Treatment x Prior GPA                        -2.99   1.70   -.18    .083

Test 2
Model 1: Treatment & Past Performance (R² = .40, F = 20.38)
Treatment                                    -1.89   1.01   -.15    .065
Prior GPA                                    11.23   1.50    .62    .000
Treatment x Prior GPA                        -1.60   1.50   -.09    .289
Model 2: Demographics & Education Background (R² = .47, F = 6.88)
Treatment                                    -2.18   1.12   -.17    .056
Prior GPA                                    11.13   1.64    .61    .000
Gender (male)                                 4.34   3.22    .12    .182
Minority                                     -6.17   3.24   -.17    .068
Engineering major commitment                  0.38   0.55    .06    .489
Parent Education level (some college)        -0.98   3.29   -.03    .767
Parent Education level (college graduate)     2.27   2.27    .08    .407
Parent Education level (advanced degree)     -0.07   2.20    .00    .976
Treatment x Prior GPA                        -1.17   1.69   -.06    .490

Test 3
Model 1: Treatment & Past Performance (R² = .15, F = 5.24)
Treatment                                    -0.35   1.04   -.03    .735
Prior GPA                                     6.03   1.53    .39    .000
Treatment x Prior GPA                        -1.25   1.53   -.08    .418
Model 2: Demographics & Education Background (R² = .33, F = 3.91)
Treatment                                    -0.68   1.10   -.06    .536
Prior GPA                                     6.25   1.61    .39    .000
Gender (male)                                 2.60   3.16    .08    .414
Minority                                     -3.95   3.26   -.12    .229
Engineering major commitment                  1.88   0.54    .36    .001
Parent Education level (some college)        -1.58   3.22   -.05    .626
Parent Education level (college graduate)     3.46   2.67    .14    .198
Parent Education level (advanced degree)     -1.01   2.15   -.05    .640
Treatment x Prior GPA                        -0.28   1.66   -.02    .868

Table 15
Final Exam and Transcript Grade Models

Final Exam
Model 1: Treatment & Past Performance (R² = .20, F = 7.31)
Variable                                       B      SE      β       p
Treatment                                    -0.65   0.89   -.07    .466
Prior GPA                                     6.08   1.32    .44    .000
Treatment x Prior GPA                        -0.65   1.32   -.05    .623
Model 2: Demographics & Education Background (R² = .35, F = 4.32)
Treatment                                    -0.82   0.93   -.09    .378
Prior GPA                                     5.37   1.36    .39    .000
Gender (male)                                 4.09   2.66    .15    .129
Minority                                     -3.27   2.75   -.12    .238
Engineering major commitment                  1.78   0.45    .39    .000
Parent Education level (some college)         1.28   2.72    .05    .640
Parent Education level (college graduate)     0.18   2.25    .01    .937
Parent Education level (advanced degree)     -1.21   1.81   -.07    .509
Treatment x Prior GPA                         0.26   1.40    .02    .854

Transcript Grade
Model 1: Treatment & Past Performance (R² = .37, F = 17.15)
Treatment                                     0.71   0.73    .08    .334
Prior GPA                                     7.65   1.08    .61    .000
Treatment x Prior GPA                        -1.42   1.08   -.11    .189
Model 2: Demographics & Education Background (R² = .54, F = 9.30)
Treatment                                     0.31   0.73    .03    .675
Prior GPA                                     7.27   1.06    .57    .000
Gender (male)                                 2.55   2.09    .10    .226
Minority                                     -5.27   2.15   -.20    .017
Engineering major commitment                  1.50   0.36    .36    .000
Parent Education level (some college)        -0.24   2.13   -.01    .911
Parent Education level (college graduate)     2.21   1.76    .11    .214
Parent Education level (advanced degree)     -0.57   1.42   -.04    .691
Treatment x Prior GPA                        -0.73   1.10   -.06    .508

Table 16
Treatment Fidelity

                                                   n      M       %
Treatment subjects                                49      -       -
Subjects accessing SRL material                   22    3.77     45
Subjects completing review quizzes**               8    3.25     16
Subjects completing any portion of intervention   28      -      57
Subjects completing reflections*                  16    1.19     33

Note. Data reflect only those participants who accessed material as part of the intervention. * A maximum of three reflections could be completed. ** A maximum of eight review quizzes were available.

Table 17
Reflection Entries

Entry 1 (Exam 1). Preparation: Went to study groups. What went wrong: I did not study alone enough. Strategies for next exam: Start reviewing earlier.
Entry 2 (Exam 1). Preparation: I reviewed class slides and notes for about 30 mins. What went wrong: I made silly mistakes even though I knew the concept. Strategies for next exam: -
Entry 3 (Exam 1). Preparation: Tried focusing on major components of each slide. What went wrong: I did not remember the significance of the minor parts or their definition. Strategies for next exam: Explain to myself how each part works with each other to make the system work.
Entry 4 (Exam 1). Preparation: Study session. What went wrong: Simple mistake or didn't study that exact thing. Strategies for next exam: Go over PowerPoints more precise.
Entry 5 (Exam 1). Preparation: Went over notes and made up some questions and quizzes with friends. What went wrong: There were a few things that I didn't expect on exam. Strategies for next exam: Go over all notes.
Entry 6 (Exam 1). Preparation: Went to review sessions, went on ship to look at the things we were learning about, studied power point slides. What went wrong: Some questions were not from power point slides and i did not know them. Strategies for next exam: Start to study earlier.
Entry 7 (Exam 1). Preparation: Study the slides. What went wrong: Too little information about detailed operation on each pump, pipe, thread. Strategies for next exam: Do personal research when applicable.
Entry 8 (Exam 1). Preparation: Quiz myself on the material over and over. What went wrong: I did not memorize a graph that was present in the exam (found in the PowerPoint material). Strategies for next exam: Continued quizzing and practice, perhaps work to relate the topics to the ship as well.
Entry 10 (Exam 1). Preparation: I didnt study. What went wrong: I didnt know all the material. Strategies for next exam: Study earlier.
Entry 11 (Exam 1). Preparation: Power point slides, practice questions, review classes. What went wrong: Nothing. Strategies for next exam: Paying attention in class and listening to the teacher is the best way to prepare. Besides that I will make sure to know everything on the powerpoints.
Entry 12 (Exam 2). Preparation: I read over my notes and the powerpoints. What went wrong: I only had either stupid mistakes or got questions wrong that were not on the PowerPoints and I would have no way to study for. Strategies for next exam: Study harder.
Entry 13 (Exam 3). Preparation: Study group. What went wrong: The powerpoints given to us by the teacher were messy, hard to read and understand, and some of the slides were incorrect with the wrong information. Strategies for next exam: Review and rewrite PowerPoints.
Entry 14 (Exam 2). Preparation: Studied powerpoints and made flashcards. What went wrong: Small details that would have been remembered if studied once more. Strategies for next exam: Study the smaller details.
Entry 15 (Exam 3). Preparation: Slides/flash cards. What went wrong: Not enough memorization. Strategies for next exam: Study earlier.
Entry 16 (Exam 3). Preparation: Looked over power points and made up my own potential exam questions. What went wrong: None really. This was my best exam grade. Strategies for next exam: Review more of the notes taken in class.
Entry 17 (Exam 3). Preparation: Looked over PowerPoints. What went wrong: Not enough information expressed in words on the powerpoints made it hard to really know what to review. It also made it hard to take notes in class. Strategies for next exam: Same strategy.
Entry 18 (Exam 3). Preparation: I reviewed all of the powerpoints, taking notes on them as I went along. What went wrong: I think I did everything right, since I got a 90. Strategies for next exam: Understanding the content works well.
Entry 19 (Exam 3). Preparation: Reviewed powerpoints and studied with a group of 5 other students. What went wrong: Trying to remember different systems. Strategies for next exam: Review the systems I do not fully understand.
Entry 20 (Exam 3). Preparation: Read over notes and made flash cards. What went wrong: I didn't completly understand the question. Strategies for next exam: Put in more time.
Entry 21 (Exam 3). Preparation: Read over the material. What went wrong: Did not know enough of the material. Strategies for next exam: Flashcards.

Table 18
Reflection Data Summary

Exam preparation activity     n    What went wrong on the exam?              n    What strategies will you use on the next exam?    n
Engaged with material        12    Did not memorize                          6    Start studying sooner                             6
Only read material            6    Did not anticipate questions              5    Review more specifics                             6
Collaborated with others      6    Didn't study enough                       4    Better understand concepts                        4
No action                     1    Made mistakes                             3    More rote practice/self-quizing                   2
                                   Was not provided adequate information    3    Study group                                       1
                                                                                 Find answers on their own                         1
                                                                                 Nothing better                                    1

APPENDIX B

Figures

Figure 1. A Model of Self-Regulated Learning (Butler & Winne, 1995).

Figure 2. Phases and Sub-processes of Self-Regulation (Zimmerman, 2002).

Figure 3. Treatment and Control Overview (weeks 1-15).
Control: Intake form completed; First Measure; Test 1; Second measure; Test 2; Test 3; Third measure; Final Exam.
Treatment: Intake form completed; First Measure; Reminder - Exam 1; Test 1*; Reflection - Exam 1; Second measure; Reminder - Exam 2; Test 2*; Reflection - Exam 2; Reminder - Exam 3; Test 3*; Reflection - Exam 3; Third measure; Final Exam.
Throughout the term (treatment only): Text messages - SRL lessons and quizzes in LMS; Text messages - class SRL prompts.

Note. * Any subject that accessed material, completed a quiz, or completed a reflection was coded as 'treatment plus' for analysis.
Items highlighted in grey were common to both treatment and control groups. Text messages were sent to treatment participants approximately twice weekly.

Figure 4. Scree Plot for Factor Analysis (eigenvalue by component number).

Figure 5. Interaction Effect for Test 1, Model 1.

APPENDIX C

Message, Treatment, and Exam Schedule

Each message below describes a text message prompt that was sent to students in the treatment group, by week in the term.

Week 1
1. This is a test of the Remind system for [your class]. There is no need to respond. Please participate by clicking on the messages you receive! (Testing system)
2. As a reminder, you will be receiving messages on your phone about [your class] as part of a research study. Please start by completing this survey! It will take you approximately 10 minutes to complete. (Research study reminder)
3. Check out the 'modules - for research study participants only' section of Blackboard to see what's coming up.

Week 2
4. Are you getting the grades you want? Visit Blackboard to read about strategies that can help you this semester! https://press.rebus.community/blueprint1/chapter/20-the-basics-of-study-skills/ (Teaching SRL - learning strategies) [First Measure]
5. There are many myths about learning and studying. Visit 'approaches to learning' in Blackboard to find out why you should reconsider how you learn! https://courses.lumenlearning.com/suny-collegesuccess-lumen1/chapter/deep-learning/ (Teaching SRL - study strategies)

Week 3
6. Goal setting and time management are critical to success. Visit Blackboard to find out how to excel at both. https://courses.lumenlearning.com/suny-collegesuccess-lumen1/chapter/defining-goals/ (Teaching SRL - goal setting)
7. Your first test for [class] is coming up! This is a great time to start organizing and studying. (Reminder prompt)

Week 4
8. We can all find ways to improve how we approach exams. Visit Blackboard to consider ways you can hone this important skill!
https://dennislearningcenter.osu.edu/strategic-test-taking/ (Teaching SRL - Test taking strategies) [First exam] 9. Procrastination affects us all, but it doesn’t have to! Visit Blackboard to learn how to combat procrastination and manage your time better. https://dennislearningcenter.osu.edu/procratination-ways-to-avoid-it-episode-1/ https://dennislearningcenter.osu.edu/procratination-ways-to-avoid-it-episode-3/ (Teaching SRL - strategic planning) Week 5 10. By now you should have received the results of your first exam. After reviewing, click here to complete an exercise. (Link to Qualtrics form - Self-reflection prompt) 11. Engaging in more active learning could be your key to success. Visit Blackboard to find out how to get started. https://press.rebus.community/blueprint1/chapter/27-taking-notes-in-class/ (Teaching SRL - active engagement) Week 6 12. It's important to learn from our mistakes. Visit Blackboard to find ways to fix common academic mishaps. https://courses.lumenlearning.com/suny-collegesuccess-lumen1/chapter/evaluating-results/ (Teaching SRL - reflection) 13. Remember to review the slides from class regularly to solidify the concepts you learn in class! (In consultation with faculty member). Week 7 14. Make sure you review your notes from class to help reinforce your learning and prepare for your upcoming exam! (In consultation with faculty member). 15. Your second test for [class] is coming up next week! Now is a great time to start organizing and studying. (Reminder prompt) [Second Measure] Week 8 [Second Exam] 16. Spend time focusing on terminology and definitions. They are a critical component of [your class]. (In consultation with faculty member). 17. Lab time on the ship is a great opportunity to reinforce the concepts you've learned in the classroom. Take advantage! (In consultation with faculty member). Week 9 18. By now you should have received the results of your second exam. After reviewing, click here to complete an exercise.
(Links to Qualtrics form - Self-reflection prompt) Week 10 19. Key words written on the board often serve as important hints. Spend some extra time reviewing these definitions! (In consultation with faculty member). 20. Use your study time with peers effectively - plan ahead to make the most of your limited time! (In consultation with faculty member). Week 11 21. Evening hours with [tutors] can be particularly helpful if you need a helping hand! (In consultation with faculty member). 22. Review diagrams from class closely. Pay extra attention to detail. (In consultation with faculty member). Week 12 23. Everything you've learned will come together soon - keep at it! (In consultation with faculty member). 24. As you're studying, look ahead to how what you've learned will come together this summer on the ship! (In consultation with faculty member). Week 13 25. Your third exam for [class] is coming up next week! Now is a great time to start organizing and studying. (Reminder prompt) 26. Focus your understanding of how each lesson you've learned fit together. This will help you with challenging questions on upcoming exams. (In consultation with faculty member). Week 14 27. Make sure you are planning your time strategically these last few weeks. Focus on making your study sessions efficient and effective! [Third Exam] 28. Finals are right around the corner! Now is a great time to start organizing and studying. Use your time and energy wisely! (Reminder prompt). Week 15 29. By now you should have received the results of your third exam. After reviewing, click here to complete a short exercise. (Links to Qualtrics form - Self-reflection prompt). [Third Measure] 30. The final exam is cumulative - make sure you go all the way back to the start of the term to start preparing (In consultation with faculty member). 31. Best of luck with finals this week. You have prepared well. Thank you for participating in this study and enjoy your summer! (Closing message).
APPENDIX D

Self-Reflection Form

The self-reflection form below was administered electronically to students in the treatment condition following each exam.

Q1. What are the last four digits of your student ID?
________________________________________________________________

Q2. Test I just received back with my grade
• Test 1 (1)
• Test 2 (2)
• Test 3 (3)

Q3. How much time did you spend studying for this exam?
________________________________________________________________

Q4. How many practice questions did you do in preparation for this exam?
________________________________________________________________

Q5. What did you do to prepare for this exam? Be specific!
________________________________________________________________

Q6. Explain what strategies or processes went wrong on this exam.
________________________________________________________________

Q7. We now recommend that you work through the questions you answered incorrectly. Other than memorization, what strategies will you use on the next exam?
________________________________________________________________

Q8. We now recommend you work on alternative, similar problems that you have reviewed. Write how many questions you reviewed below.
________________________________________________________________

Q9. How confident are you now that, on the next exam, you can answer questions similar to those you answered incorrectly?
• Definitely not confident (1)
• Not confident (2)
• Undecided (3)
• Confident (4)
• Very confident (5)

Q10. Would you like some additional assistance? If so, speak with your instructor during office hours or visit the Learning Center for tutoring. You can also make an appointment with the Academic Coach by visiting the Luce Library.

Form adapted from Zimmerman et al. (2011)

APPENDIX E

SRL Measures and Items

All questions and scales were administered as one collective instrument at each measurement point of the semester.

SRSI - SR: Managing environment and behavior
1.
I make sure no one disturbs me when I study.
2. I make a schedule to help me organize my study time.
3. I finish all of my studying before I hang out with my friends.
4. I try to study in a quiet place.
5. I think about how best to study before I begin studying.
6. I try to study in a place that has no distractions (e.g. Noise, people talking).
7. I quiz myself to see how much I am learning during studying.
8. I study hard even when there are more fun things to do.
9. I tell myself to keep trying when I can't learn a topic or idea.
10. I use binders or folders to organize my study materials.
11. I tell myself exactly what I want to accomplish during studying.
12. I carefully organize my study materials so I don't lose them.

SRSI - SR: Seeking and learning information
13. I ask my instructor questions when I do not understand something.
14. I try to see how my notes from class relate to things I already know.
15. I make pictures or drawings to help me learn concepts.
16. I look over my assignments if I don't understand something.
17. I think about the types of questions that might be on a test.
18. I ask my instructor about the topics that will be on upcoming tests.
19. I rely on my notes to study.
20. I try to identify the format of upcoming tests.

SRSI - SR: Maladaptive regulatory behavior
21. I forget to bring my materials when I need to study.
22. I avoid going to extra-help, office hours, or tutoring.
23. I lose important materials pertaining to class.
24. I give up or quit when I do not understand something.
25. I let my friends interrupt me when I am studying.
26. I avoid asking questions in class about things I don't understand.
27. I wait to the last minute to study for tests.
28. I try to forget about the topics that I have trouble learning.

SRSI - SR questions adapted from Cleary, 2006.

MSLQ Questions - Metacognitive self-regulation
1. During class time I often miss important points because I'm thinking of other things. (reversed)
2.
When reading for this course, I make up questions to help focus my reading.
3. When I become confused about something I'm reading for this class, I go back and try to figure it out.
4. If course materials are difficult to understand, I change the way I read the material.
5. Before I study new course material thoroughly, I often skim it to see how it is organized.
6. I ask myself questions to make sure I understand the material I have been studying in class.
7. I try to change the way I study in order to fit the course requirements and instructor's teaching style.
8. I often find that I have been reading for class but don't know what it was all about. (reversed)
9. I try to think through a topic and decide what I am supposed to learn from it rather than just reading it over when studying.
10. When studying for this course I try to determine which concepts I don't understand well.
11. When I study for this class, I set goals for myself in order to direct my activities in each study period.
12. If I get confused about taking notes in class, I make sure I sort it out afterwards.

From Pintrich et al., 1991

REFERENCES

Admithub [College student messaging software]. Boston, MA: AdmitHub Inc.

Ary, D., Jacobs, L. C., & Sorensen, C. (2010). Introduction to research in education. Belmont, CA: Cengage Learning.

Azevedo, R. (2005). Using hypermedia as a metacognitive tool for enhancing student learning? The role of self-regulated learning. Educational Psychologist, 40(4), 199–209. doi:10.1207/s15326985ep4004_2

Bandura, A. (1978). The self-system in reciprocal determinism. American Psychologist, 33, 344-358.

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice Hall.

Bandura, A. (1991). Social cognitive theory of self-regulation. Organizational Behavior and Human Decision Processes, 50, 248–287.

Ben-Eliyahu, A., & Linnenbrink-Garcia, L. (2015). Integrating the regulation of affect, behavior, and cognition into self-regulated learning paradigms among secondary and post-secondary students. Metacognition and Learning, 10(1), 15–42. doi:10.1007/s11409-014-9129-8

Bull, P., & McCormick, C. (2012). Mobile learning: Integrating text messaging into a community college pre-algebra course. International Journal on E-Learning, 11(3), 233–245. Retrieved from https://www-learntechlib-org.proxy1.cl.msu.edu/primary/p/35397/

Butler, D. L., & Schnellert, L. (2015). Success for students with learning disabilities: What does self-regulation have to do with it? In Cleary, T. J. (Ed.), Self-regulated learning interventions with at-risk youth: Enhancing adaptability, performance and well-being (pp. 89-112). Washington, DC: American Psychological Association.

Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research, 65, 245–281. doi:10.2307/1170684

Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Boston: Houghton Mifflin.

Castleman, B. L., & Meyer, K. (2016). Can text message nudges improve academic outcomes in college? Evidence from a West Virginia initiative. Working paper, EdPolicyWorks (43), 31, University of Virginia.

Castleman, B. L., & Page, L. C. (2015). Summer nudging: Can personalized text messages and peer mentor outreach increase college going among low-income high school graduates? Journal of Economic Behavior & Organization, 115, 144–160. doi:10.1016/j.jebo.2014.12.008

Castleman, B. L., & Page, L. C. (2016). Freshman year financial aid nudges: An experiment to increase FAFSA renewal and college persistence. Journal of Human Resources, 51, 389–415. doi:10.3368/jhr.51.2.0614-6458R

Christensen, C. M. (2011). The innovative university. San Francisco, CA: Jossey-Bass.

Clark, N. M., & Patel, M. R. (2015). Self-regulation-based interventions for children and adolescents with asthma. In Cleary, T. J. (Ed.), Self-regulated learning interventions with at-risk youth: Enhancing adaptability, performance and well-being (pp. 181-202). Washington, DC: American Psychological Association.

Cleary, T. J. (2006). The development and validation of the self-regulation strategy inventory—self-report. Journal of School Psychology, 44(4), 307–322. doi:10.1016/j.jsp.2006.05.002

Cleary, T. J., Platten, P., & Nelson, A. (2008). Effectiveness of the self-regulation empowerment program with urban high school students. Journal of Advanced Academics, 20(1), 70–107. doi:10.4219/jaa-2008-866

Cleary, T. J., & Zimmerman, B. J. (2004). Self-regulation empowerment program: A school-based program to enhance self-regulated and self-motivated cycles of student learning. Psychology in the Schools, 41(5), 537–550. doi:10.1002/pits.10177

College Success. (n.d.). In OER Services. Retrieved from https://courses.lumenlearning.com/suny-collegesuccess-lumen1/

Corno, L., & Kanfer, R. (1993). The role of volition in learning and performance. Review of Research in Education, 19, 301–341. doi:10.2307/1167345

Dabbagh, N., & Kitsantas, A. (2012). Personal learning environments, social media, and self-regulated learning: A natural formula for connecting formal and informal learning. The Internet and Higher Education, 15, 3–8. doi:10.1016/j.iheduc.2011.06.002

Davis, D. R., & Abbitt, J. T. (2013). An investigation of the impact of an intervention to reduce academic procrastination using short message service (SMS) technology. Journal of Interactive Online Learning, 12, 78–102. Retrieved from https://www.ncolr.org/

Dennis Learning Center. (2018). Resources. Retrieved from https://dennislearningcenter.osu.edu/student-resources/

Dillon, D., Hill, L. B., Lamoreaux, A., Nissila, P., & Priester, T. (2018). Blueprint for success in college: Indispensable study skills and time management strategies (v. 2.2) [Creative Commons]. Retrieved from https://press.rebus.community/blueprint1/

Dobronyi, C., Oreopoulos, P., & Petronijevic, U. (2017). Goal setting, academic reminders, and college success: A large-scale field experiment (NBER Working Paper No. 23738). Retrieved from National Bureau of Economic Research website: https://www.nber.org/papers/w23738.pdf

Duncan, T. G., & McKeachie, W. J. (2005). The making of the motivated strategies for learning questionnaire. Educational Psychologist, 40, 117–128. doi:10.1207/s15326985ep4002_6

Ehrenreich-May, J., Kennedy, S. M., & Remmes, C. S. (2015). Emotion regulation interventions and childhood depression. In Cleary, T. J. (Ed.), Self-regulated learning interventions with at-risk youth: Enhancing adaptability, performance and well-being (pp. 157-180). Washington, DC: American Psychological Association.

Faul, F., Erdfelder, E., Lang, A. G., & Buchner, A. (2007). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39, 175-191. doi:10.3758/bf03193146

Field, A., Miles, J., & Field, Z. (2012). Discovering statistics using R. Thousand Oaks, CA: Sage.

Frankfort, J., O'Hara, R. E., & Salim, K. (2015). Behavioral nudges for college success: Research, impact and possibilities. In Castleman, B. L., Schwartz, S., & Baum, S. (Eds.), Decision making for student success: Behavioral insights to improve college access and persistence (pp. 143-160). New York: Routledge.

Gainen, J. (1995). Barriers to success in quantitative gatekeeper courses. New Directions for Teaching and Learning, (61), 5–14. doi:10.1002/tl.37219956104

Georgina, D. A., & Hosford, C. C. (2009). Higher education faculty perceptions on technology integration and training. Teaching and Teacher Education, 25, 690–696. doi:10.1016/j.tate.2008.11.004

Goh, T. T., Seet, B. C., & Chen, N. S. (2012).
The impact of persuasive SMS on students’ self- regulated learning. British Journal of Educational Technology, 43, 624–640. doi: 10.1111/j.1467-8535.2011.01236.x based learning environments. New Directions for Teaching and Learning, 126, 107– 115. doi:10.1002/tl.449 Greene, J. A., Moos, D. C., & Azevedo, R. (2011). Self-regulation of learning with computer- 129 collaboration. Peabody Journal of Education, 89, 291–304. doi:10.1080/0161956X. 2014.913440 timely information via SMS. Presented at the 11th International Conference of the Association for Learning Technology, Exeter, UK. Retrieved from http://www.alt.ac.uk/ altc2004/timetable/files/133/alt-c-2004-v1-LGAH%20.ppt. Griffiths, L., & Hmer, A. (2004). U R L8 4 ur exam:) - students’ opinions towards receiving Harackiewicz, J. M., & Priniski, S. J. (2018). Improving student outcomes in higher education: The science of targeted intervention. Annual Review of Psychology, 69(1), 409–435. doi: 10.1146/annurev-psych-122216-011725 Hodges, C. B., & Kim, C. (2010). Email, self-regulation, self-efficacy, and achievement in a college online mathematics course. Journal of Educational Computing Research, 43(2), 207–223. doi:10.2190/EC.43.2.d Hrabowski, F. A. (2014). Institutional change in higher education: Innovation and Huang, S., & Fang, N. (2013). Predicting student academic performance in an engineering Jones, G., Edwards, G., & Reid, A. (2009). How can mobile SMS communication support and Junco, R. (2012). Too much face and not enough books: The relationship between multiple Kaiser, H.F. (1974). An index of factorial simplicity. Psychometrika. 39, 31-36. doi:10.1007/ Kanfer, R., & Ackerman, P. L. (1989). Motivation and cognitive abilities: An integrative/ Kasesniemi, E.L. & Rautiainen, P. (2002). Mobile culture of children and teenagers in Kim, C., & Keller, J. M. (2007). 
Effects of motivational and volitional email messages (MVEM) with personal messages on undergraduate students' motivation, study habits and achievement. British Journal of Educational Technology, 39, 36–51. doi:10.1111/j 1467-8535.2007.00701.x indices of Facebook use and academic performance. Computers in Human Behavior, 28(1), 187–198. doi:10.1016/j.chb.2011.08.026 aptitude-treatment interaction approach to skill acquisition. Journal of Applied Psychology, 74, 657-690. doi: 10.1037/0021-9010.74.4.657 Finland. in J. Katz & M. Aakhus (Eds.), Perpetual contact: Mobile communication, private talk, public performance. Cambridge: Cambridge University Press, pp. 170-192. dynamics course: A comparison of four types of predictive mathematical models. Computers & Education, 61, 133–145. doi:10.1016/j.compedu.2012.08.015 enhance a first year undergraduate learning environment? Research in Learning Technology, 17, 201-218. doi:10.1080/09687760903247625 bf02291575 130 learning. New Directions for Teaching and Learning, 2011, 99–106. doi:10.1002/tl.448 education classrooms. Research in Learning Technology, 21. doi:10.3402/rlt.v21i0.19061 for educators: A proof of concept. Computers & Education, 54, 588–599. doi:10.1016/ j.compedu.2009.09.008 analysis of the effects of the U.S. News and World Report college rankings. Research in Higher Education, 45(5), 443–461. doi: 10.1023/B:RIHE. 0000032324.46716.f4 evidence on strategies to increase college completion for students at risk of late departure, US Department of Education, Institute of Education Science. Retrieved from http://j.mp/2yB8PXS athletes: A social cognitive perspective. In Schunk, D.H. & Greene, J.A. (Eds.), Handbook of self-regulation of learning performance: 2nd edition. pp. 194-207. New York: Routledge. Kitsantas, A., & Dabbagh, N. (2011). The role of Web 2.0 technologies in self-regulated Kitsantas, A., Kavussanu, M., Corbatto, D.B., & Van De Pol, P.K.C. (2018) Self-regulation in Lauricella, S., & Kay, R. (2013). 
Exploring the use of text and instant messaging in higher Mabel, Z., Castleman, B. L., & Bettinger, E. P. (2017). Finishing the last lap: Experimental Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” Meredith, M. (2004). Why do universities compete in the ratings game? An empirical Moos, D.C. (2018). Emerging classroom technology: Using self-regulation principles as a guide Nelson, K. G., Shell, D. F., Husman, J., Fishman, E. J., & Soh, L.-K. (2015). Motivational and Nix, J., Russell, J., & Keegan, D. (2006). Mobile learning/SMS academic administration kit. Oreopoulos, P., Patterson, R. W., Petronijevic, U., & Pope, N. G. (2018). Lack of study time is Paunesku, D., Walton, G. M., Romero, C., Smith, E. N., Yeager, D. S., & Dweck, C. S. (2015). Persistence Plus (2016). [Text messaging software]. Boston, MA. Persistence Plus the problem, but what is the solution? Unsuccessful attempts to help traditional and online college students. (NBER Working Paper No. 25036). Retrieved from National Bureau of Economic Research website: https://www.nber.org/papers/w25036 for effective implementation. In Schunk, D.H. & Greene, J.A. (Eds.), Handbook of self- regulation of learning performance: 2nd edition. pp. 243-253. New York: Routledge. self-regulated learning profiles of students taking a foundational engineering course. Journal of Engineering Education, 104(1), 74–100. doi:10.1002/jee.20066 Presented at the EDEN Take Learning Mobile conference, Dublin, Ireland. Mind-set interventions are a scalable treatment for academic underachievement. Psychological Science, 26, 784-793. doi:10.1177/0956797615571017 LLC. Retrieved from 131 the Motivated Strategies for Learning Questionnaire (MSLQ). January 4, 2017, from http://blog.admithub.com/admithub-launches-first-college-chatbot- with-georgia-state self-regulation: A study on the relationship of self-regulation, note taking, and test taking. Journal of Educational Psychology, 95, 335–346. 
doi: 10.1037/0022-0663.95.2.335 learning in college students. Educational Psychology Review, 16, 385–407. doi:10.1007/ s10648-004-0006-x Peterson, D. (2016). AdmitHub launches first college chatbot with Georgia State. Retrieved Peverly, S. T., Brobst, K. E., Graham, M., & Shaw, R. (2003). College adults are not good at Pintrich, P. R. (2004). A conceptual framework for assessing motivation and self-regulated Pintrich, P. R., & De Groot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82, 33-40. doi: 10.1037/0022-0663.82.1.33 Pintrich, P. R., Smith, D.A.F., Garcia, T., & McKeachie, W.J. (1991). A manual for the use of Pintrich, P. R., Smith, D. A. F., Garcia, T., & Mckeachie, W. J. (1993). Reliability and predictive Plant, E. A., Ericsson, K. A., Hill, L., & Asberg, K. (2005). Why study time does not predict R Core Team (2019). R: A language and environment for statistical computing. R Foundation for Remind (2015). [Student-teacher text messaging software]. San Francisco, CA: Remind, Inc. Richardson, M., Abraham, C., & Bond, R. (2012). Psychological correlates of university Ricks, K. G., Richardson, J. A., Stern, H. P., Taylor, R. P., & Taylor, R. A. (2014). An engineering learning community to promote retention and graduation of at-risk engineering students. American Journal of Engineering Education (AJEE), 5(2), 73–90. doi: 10.19030/ajee.v5i2.8953 grade point average across college students: Implications of deliberate practice for academic performance. Contemporary Educational Psychology, 30, 96–116. doi:10.1016/ j.cedpsych.2004.06.001 validity of the motivated strategies for learning questionnaire (MSLQ). Educational and Psychological Measurement, 53(3), 801–813. doi:10.1177/0013164493053003024 Statistical Computing, Vienna, Austria. Retrieved from https://www.R-project.org/ students’ academic performance: A systematic review and meta-analysis. 
Psychological Bulletin, 138, 353–387. doi:10.1037/a0026838 Robbins, S. B., Lauver, K., Le, H., Davis, D., Langley, R., & Carlstrom, A. (2004). Do psychosocial and study skill factors predict college outcomes? A meta-analysis. Psychological Bulletin, 130, 261–288. doi:10.1037/0033-2909.130.2.261 132 a systematic literature review of self-report instruments. Educational Assessment, Evaluation and Accountability, 28, 225–250. doi: 10.1007/s11092-015-9229-2 that college grades are affected by quantity of study. Social Forces, 63, 945–966. doi: 10.2307/2578600 Pearson. reflective practice. New York: Guilford Press. of prompting self-regulation in technology-delivered instruction. Personnel Psychology; Durham, 62, 697–734. doi:10.1111/j.1744-6570.2009.01155.x progress at community colleges? In Castleman, B.L., Schwartz, S., & Baum, S. (Eds.), Decision making for student success: Behavioral insights to improve college access and persistence (pp. 102-123). New York: Routledge. Roth, A., Ogrin, S., & Schmitz, B. (2016). Assessing self-regulated learning in higher education: Schuman, H., Walsh, E., Olson, C., & Etheridge, B. (1985). Effort and reward: The assumption Schunk, D. (2012). Learning theories: An educational perspective (Sixth edition). Boston, MA: Schunk, D. H., & Zimmerman, B. J. (1998). Self-regulated learning: From teaching to self- Scott-Clayton, J.E. (2011). The shapeless river: does a lack of structure inhibit students’ Sitzmann, T., Bell, B. S., Kraiger, K., & Kanar, A. M. (2009). A multilevel analysis of the effect Sitzmann, T., & Ely, K. (2010). Sometimes you need a reminder: The effects of prompting self- regulation on regulatory processes, learning, and attrition. Journal of Applied Psychology, 95, 132–144. doi:10.1037/a0018080 Sitzmann, T., & Ely, K. (2011). A meta-analysis of self-regulated learning in work-related Sitzmann, T., & Johnson, S. K. (2012). The best laid plans: Examining the conditions under Stevens, J.P. (2002). 
Applied multivariate statistics for the social sciences (4th ed.). Hillsdale, NJ: Tabuenca, B., Kalz, M., Drachsler, H., & Specht, M. (2015). Time will tell: The role of mobile Tampke, D. R. (2012). Developing, implementing, and assessing an early alert system. Journal which a planning intervention improves learning and reduces attrition. Journal of Applied Psychology, 97, 967–981. doi:10.1037/a0027977 learning analytics in self-regulated learning. Computers & Education, 89, 53–74. doi: 10.1016/j.compedu.2015.08.004 of College Student Retention: Research, Theory and Practice, 14(4), 523–532. doi: 10.2190/CS.14.4.e training and educational attainment: what we know and where we need to go. Psychological Bulletin, 137(3), 421. doi: doi:10.1037/a0022777 Erlbaum. 133 collegescorecard.ed.gov/ U.S. Department of Education (2018). College scorecard. Retreived from https:// Usher, E.L., & Schunk, D.H. (2018). Social cognitive theoretical perspective of self-regulation. In Schunk, D.H. & Greene, J.A. (Eds.), Handbook of self-regulation of learning performance: 2nd edition. pp. 19-35. New York: Routledge. 173-187. doi: 10.1207/s15326985ep3004_2 Psychologist, 45, 267–276. doi:10.1080/00461520.2010.517150 learning environments. Online Learning, 21. Doi: 10.24059/olj.v21i2.881 & Greene, J.A. (Eds.), Handbook of self-regulation of learning performance: 2nd edition. pp. 36-48. New York: Routledge. Wandler, J., & Imbriale, W. J. (2017). Promoting college student self-regulation in online Wang, C.H., Shannon, D.M., & Ross, M.E. (2013). Students’ characteristics, self-regulated learning, technology self-efficacy, and course outcomes in online learning. Distance Education, 34, 302–323. doi: 10.1080/01587919.2013.835779 Winne, P.H., (1995). Inherent details in self-regulated learning. Educational Psychologist, 30, Winne, P.H. (2010). Improving measurements of self-regulated learning. Educational Winne, P.H. (2018) Cognition and metacognition within self-regulated learning. 
In Schunk, D.H. Winters, F. I., Greene, J. A., & Costich, C. M. (2008). Self-regulation of learning within computer-based learning environments: A critical analysis. Educational Psychology Review, 20, 429–444. doi: doi:10.1007/s10648-008-9080-9 Wolters, C.A. & Hoops, L.D. (2015) Self-regulated learning interventions for motivationally disengaged college students. In Cleary, T.J. (Ed.), Self-regulated learning interventions with at-risk youth: Enhancing adaptability, performance and well-being. (pp. 67-88). Washington, DC: American Psychological Association. Yeager, D. S., Romero, C., Paunesku, D., Hulleman, C. S., Schneider, B., Hinojosa, C., Lee, Zamani-Miandashti, N., & Ataei, P. (2015). College students’ perceptions of short message Zimmerman, B.J. (2000). Attaining self-regulation: A social-cognitive perspective. In M. Boekaerts, P. Pintrich, & M. Seidner (Eds.), Self-regulation: Theory, research, and applications (pp. 13–39). Orlando, FL: Academic Press. H.Y., O'Brien, J., Flint, K., Roberts, A., Trott, J., Greene, D., Walton, G.M., & Dweck, C. S. (2016). Using design thinking to improve psychological interventions: The case of the growth mindset during the transition to high school. Journal of Educational Psychology, 108, 374–391. doi:10.1037/edu0000098 service-supported collaborative learning. Innovations in Education and Teaching International, 52, 426–436. doi:10.1080/14703297.2014.900453 134 Practice, 41, 64–70. doi:10.1207/s15430421tip4102_2 methodological developments, and future prospects. American Educational Research Journal, 45(1), 166–183. doi:10.3102/0002831207312909 Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory Into Zimmerman, B. J. (2008). Investigating self-regulation and motivation: Historical background, Zimmerman, B. J., & Kitsantas, A. (2014). Comparing students’ self-discipline and self- regulation measures and their prediction of academic achievement. Contemporary Educational Psychology, 39, 145–155. 
doi:10.1016/j.cedpsych.2014.03.004 Zimmerman, B.J. & Martinez-Pons, M. (1988). Construct validation of a strategy model of student self-regulated learning. Journal of Educational Psychology, 80, 284-290. doi: 10.1037/0022-0663.80.3.284 Zimmerman, B. J., Moylan, A., Hudesman, J., White, N., & Flugman, B. (2011). Enhancing self- Zusho, A., & Edwards, K. (2011). Self-regulation and achievement goals in the college classroom. New Directions for Teaching and Learning, 21–31. doi:10.1002/tl.441 reflection and mathematics achievement of at-risk urban technical college students. Psychological Test and Assessment Modeling, 53(1), 141–160. Retrieved from www.journaldatabase.info 135