IS IT WORTH IT? THREE PAPERS EXAMINING STUDENTS' PERCEPTIONS OF COST IN MATHEMATICS

By Patrick N. Beymer

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Educational Psychology and Educational Technology—Doctor of Philosophy

2020

ABSTRACT

IS IT WORTH IT? THREE PAPERS EXAMINING STUDENTS' PERCEPTIONS OF COST IN MATHEMATICS

By Patrick N. Beymer

This dissertation comprises three independent studies, all of which focus on providing a better understanding of students' cost beliefs. Each paper in this dissertation has a unique aim. In paper one, validity evidence is gathered in order to develop a short cost scale for use with intensive longitudinal methodologies. Paper two addresses the potential jangle fallacy between emotional cost and negative emotions. In paper three, dynamic structural equation modeling is used to address the dynamic nature of cost in the classroom by examining the associations between anticipated costs, experienced costs, math achievement, and STEM career intentions. Each paper describes its emergent findings and discusses implications for research and practice.

for alecia, always

ACKNOWLEDGEMENTS

There are many who are embedded in the development of who I am and who I am becoming as a scholar. To make a comprehensive list of all of the people who have shaped my educational journey would be impossible. Nevertheless, I would like to thank some of those who have been instrumental to my growth.

First, I am forever grateful to my wife, Alecia. Your relentless encouragement is truly the reason that I am where I am. I can never thank you enough for reading manuscripts, watching practice talks (alongside Ripley and Starbuck), and providing space for me to work through ideas. Watching you gracefully navigate academia has been a source of inspiration. You are my person.

To my family. Dad, thank you for believing in me and always providing a source of optimism. Payton, thank you for taking the time to listen to me rant. Aaron, thank you for always picking up the phone to talk. Chris, thank you for reminding me to take time for family.

I also owe thanks to many friends for keeping me grounded and providing unwavering support. Rob, my best friend, thank you for always being there and showing me how to be a little more spontaneous. To everyone I played intramural basketball with at Michigan State, thank you for taking me in and giving me an outlet outside of academic work. To my lab colleagues, thank you for engaging in scholarly conversations that contributed to my academic identity. To Kristy, thank you for laying the groundwork and showing me how to navigate the challenges that come with graduate school. Your friendship and mentorship have been invaluable. To Allison, thank you for being generous with your time and willing to talk about all things, academic or not. To Josh, thank you for making me laugh and pushing my analytic skills beyond where I ever thought they could be.

Finally, I have been lucky enough to have received tremendous support from academic mentors along the way. DeLeon, thank you for allowing me to work alongside you and encouraging me to apply to graduate school. Your words of encouragement and belief in me have made a world of difference. Margareta, thank you for always supporting me. To Cary, Lisa, and Jessica: I could not have asked for a better dissertation committee. Each of you has shaped my academic development in unique ways.
Thank you for instilling in me a commitment to high standards.

Last, I am forever thankful to my advisor, Jen. Thank you for always allowing me to knock on your door to talk about academics or the latest concert, for playing catch outside of Erickson Hall during the summers, and for giving me the opportunity to work alongside you. You have provided me with so many academic skills: how to design a study, how to clearly articulate an argument (even though you were not always wild about it), how to analyze data. But more importantly, you have shown me how to be kind, even when the ever-growing to-do list feels like it is going to topple over. I can only hope that I will become the advisor to my students that you have been for me.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
INTRODUCTION. Dissertation Overview
  Measurement That Reflects the Theorizing of Cost
  The Jingle-Jangle of Cost
  Understanding the Short-Term Dynamics of Cost
  The Present Dissertation
    Dissertation Structure
      Paper One
      Paper Two
      Paper Three
  Contribution to the Field
  REFERENCES
PAPER ONE. Validity Evidence for a Short Cost Scale of College Students' Perceptions of Cost
  Abstract
  Introduction
  What is Cost and How Do We Measure It?
  The Need for a Shortened Scale
  The Present Validity Study
    Evidence of Content Validity
    Evidence of Structural Validity
    Evidence of Convergent Validity
    Evidence of Predictive Validity
  Study One
  Method
    Sample
    Procedure and Measures
      Cost
      Theoretically Relevant Variables
    Data Analytic Strategy
  Results
    Preliminary Analysis
    Content Validity
    Structural Validity
    Convergent Validity
    Predictive Validity
    Study One Discussion
  Study Two
  Method
    Sample
    Procedure and Measures
      Theoretically Relevant Variables
    Data Analytic Strategy
  Results
    Preliminary Analysis
    Structural Validity
    Convergent Validity
    Predictive Validity
    Study Two Discussion
  Study Three
  Method
    Sample
    Procedure and Measures
      Cost
    Data Analytic Strategy
  Results
    Preliminary Analysis
    Structural Validity
    Between- and Within-Person Trajectories
    Study 3 Discussion
  Overall Discussion
    Significance and Conclusion
  APPENDIX
  REFERENCES
PAPER TWO. Can You Hear It? Toward Conceptual Clarity of Emotional Cost and Negative Emotions
  Abstract
  Introduction
  Cost as a Potential Contributor to the Jingle-Jangle Problem
    Jingle Fallacy of Emotional Cost
    Jangle Fallacy of Emotional Cost
  Conceptualization and Operationalization of Negative Emotions
  Anticipated and State Emotions
  Importance of Emotional Cost
  Anticipated and Experienced Cost
  Addressing the Jangle Fallacy of Emotional Cost and Negative Emotions
  The Present Study
  Method
    Course Description and Sample
    Procedure
    Measures
      Pre-survey Measures
        Anticipated Emotional Cost
        Anticipated Negative Emotions
        Theoretically Relevant Variables
      Daily Diary Measures
        Experienced Emotional Cost
        Daily Negative Emotions
        Theoretically Relevant Variables
      Course Grades
    Data Analytic Strategy
      Structural Validity
      Convergent Validity
      Predictive Validity
  Results
    Preliminary Analysis
    Primary Analysis
      Structural Validity
      Convergent Validity
        Correlations Between Emotional Cost and Negative Emotions
        Associations with Theoretically Relevant Variables
      Predictive Validity
        Engagement
        Course Grades
  Discussion
    Is Anticipated Emotional Cost Different from Anticipated Negative Emotions?
    The Nuances of Experienced Emotional Cost
    Why Anticipated and Experienced Emotional Cost May Function Differently
    Limitations
    Future Directions
  Conclusion
  APPENDIX
  REFERENCES
PAPER THREE. Students' Perceptions of Cost as Antecedents of Their Mathematics Achievement and Intentions to Remain in STEM
  Abstract
  Introduction
  Persistence and Achievement in Mathematics
  Gateway Courses
  Importance of Cost
    Dimensions and Consequences of Cost
    Cost Beliefs as Anticipated and Experienced
  Modeling the Dynamics of Cost Beliefs
  The Present Study
  Method
    Course Description and Sample
    Procedure
    Measures
      Pre-survey Measures
        Anticipated Cost
        Expectancies
        Value
      Daily Diary Measures
        Experienced Cost
      Post-survey and Achievement Measures
        Intentions to Pursue STEM and Achievement
    Data Analytic Strategy
  Results
    Preliminary Analysis
    Main Analysis
      Associations Between Anticipated Cost and Experienced Cost
      Associations Among Anticipated Cost, Course Grades, and STEM Career Intentions
      Associations Among Experienced Cost, Course Grades, and STEM Career Intentions
      Experienced Cost as a Mediator (RQ2b)
      Ancillary Findings
  Discussion
    The Nature of Anticipated Cost Beliefs
    Understanding Experienced Cost
    How Anticipated and Experienced Cost Beliefs Inform Expectancy-Value Theory
    Future Directions and Limitations
  Conclusion
  APPENDIX
  REFERENCES
CONCLUSION. Dissertation Takeaways
  Overview of Findings
    Paper One
    Paper Two
    Paper Three
  Looking Across Three Papers
  Future Directions
    Defining Cost More Clearly
    Understanding Dimensions of Cost
    A Larger Focus on Anticipated and Experienced Cost Beliefs
    Why Do Some Students Experience High Costs and Others Do Not?
  Concluding Thoughts
  REFERENCES

LIST OF TABLES

Table 1.1. Original cost items
Table 1.2. Means and standard deviations of cost items, full scale administration
Table 1.3. Expert rankings
Table 1.4. Confirmatory factor analysis at time 2 and time 4
Table 1.5. Time 2 and 4 correlations
Table 1.6. Regressions on midterm exam, final grade, and continued stats interest
Table 1.7. Final item selection
Table 1.8. Means and standard deviations of original cost items
Table 1.9. Study 2 results
Table 1.10. Means and standard deviations of the full cost scale and short cost scale across studies
Table A.1. Correlations of cost items from full Flake et al. (2015) scale at time 2 from study 1
Table A.2. Correlations of cost items from full Flake et al. (2015) scale at time 4 from study 1
Table A.3. Correlations of post-survey cost items from full Flake et al. (2015) scale in study 2
Table A.4. Within- and between-person correlations from shortened cost scale in study three
Table 2.1. Participant demographic characteristics
Table 2.2. Within- and between-person correlations, means, and standard deviations of all variables
Table 2.3. Confirmatory factor analysis of anticipated measures
Table 2.4. Confirmatory factor analyses of experienced and daily measures
Table 2.5. Multilevel model results of anticipated variables predicting engagement
Table 2.6. Multilevel model results of daily variables predicting engagement
Table 2.7. Linear regression results of anticipated variables predicting course grades
Table B.1. Survey items
Table 3.1. Participant demographic characteristics
Table 3.2. Within- and between-person correlations, means, and standard deviations of all variables
Table 3.3. Results from dynamic structural equation models
Table C.1. Study timeline
Table C.2. Survey items
Table C.3. Confirmatory factor analysis results

LIST OF FIGURES

Figure 1.1. Mean levels of between-person cost perceptions throughout 11 weeks of a calculus semester
Figure 1.2. International female student's cost perceptions throughout 11 weeks of a calculus semester
Figure 1.3. White male student's cost perceptions throughout 11 weeks of a calculus semester
Figure 3.1. Hypothesized model
Figure 3.2. Trimmed model

INTRODUCTION. Dissertation Overview
According to expectancy-value theory, cost is a term that is broadly used to refer to the various things one must forgo in order to partake in a specific activity (Eccles et al., 1983). Briefly, expectancy-value theory posits that students' motivation to pursue achievement-related tasks is driven by students' expectancies for success on those tasks and the value that is placed on those tasks. Cost has been discussed as a type of task value (Eccles et al., 1983), but researchers have also suggested that cost deserves its own unique place in expectancy-value theory (i.e., expectancy-value-cost theory; Barron & Hulleman, 2015). Research surrounding cost has grown over the past several years, bringing with it a number of challenges (Wigfield & Eccles, 2020). This dissertation aims to address some of these challenges through three studies. Below, I discuss three issues surrounding cost and how this dissertation begins to address them.

Measurement That Reflects the Theorizing of Cost

Cost has been described as both what one must sacrifice to engage in a task (i.e., experiences) and the anticipated effort required to complete an activity (i.e., anticipations; Eccles, 2005); yet researchers have typically examined cost using only an anticipated perspective (Gaspard et al., 2015; Jiang et al., 2018; Perez et al., 2014). Because cost is defined as the anticipated effort as well as the actual experiences of giving something up to complete a task, research is needed to understand both aspects of cost. Further, researchers have called for an examination of complex processes in the classroom (Hilpert & Marchand, 2018), with a specific emphasis on cost (Feldon et al., 2019). Intensive longitudinal methodologies (Bolger & Laurenceau, 2013) are one way to assess dynamic constructs; however, concerns such as time constraints and participant fatigue are common in repeated-measures designs that use long scales (Goetz et al., 2016; Hektner et al., 2007; Zirkel et al., 2015). Thus, in order to explore the dynamic nature of motivation, and specifically cost, in the classroom, a short cost scale is needed. Still, concerns around short scales exist (Widaman et al., 2011). One way to address many of the concerns associated with short scales is by gathering validity evidence (e.g., content, structural, convergent). Paper one of this dissertation addresses these concerns through multiple sources of validity evidence to develop a short cost scale that captures the four dimensions of cost.

The Jingle-Jangle of Cost

With more research being conducted on cost (Wigfield & Eccles, 2020), multiple dimensions, beyond what was originally proposed, are being discussed. Because of this, it is easy for researchers to conflate dimensions of cost with one another, as well as with other already developed and highly researched constructs. Further, results from studies assessing cost become difficult to interpret when researchers use many different conceptualizations and operationalizations (Wigfield & Eccles, 2020). Emotional cost is one such dimension of cost that potentially suffers from this jingle-jangle fallacy (Kelley, 1927). Though some argue emotional cost is its own unique dimension of cost (Perez et al., 2019), others have used emotional cost interchangeably with other dimensions of cost, such as psychological cost (Eccles et al., 1983; Wigfield & Eccles, 2000).
Because of this, a possible jingle fallacy (i.e., two things with similar names that are actually different constructs) may exist between emotional cost and psychological cost; however, perhaps more pressing is the possible jangle fallacy (i.e., two things with dissimilar names that are actually the same construct) between emotional cost and negative emotions. Researchers have often used negative emotions as a way to define emotional cost (Bergey et al., 2019), whereas others have suggested differences between the two (Jiang et al., 2018). As a first step to bring conceptual clarity to theorizing about emotional cost, paper two of this dissertation examines the potential jangle fallacy between emotional cost and negative emotions by gathering validity evidence.

Understanding the Short-Term Dynamics of Cost

As mentioned above, cost has been conceptualized as both the anticipated effort of engaging in a task and the actual sacrifices one must make to complete an activity (Eccles, 2005). Research is needed to address both aspects of cost. Because cost has been shown to negatively predict achievement (Conley, 2012; Jiang et al., 2018; Safavian et al., 2013; Trautwein et al., 2012) and to predict avoidance intentions, procrastination, negative affect (Jiang et al., 2018), and lower intentions to attend graduate school (Battle & Wigfield, 2003; Perez et al., 2014), continuing to understand cost, especially using an anticipated and experienced framework, is beneficial for future intervention work when deciding when it is best to intervene (Barron & Hulleman, 2015; Rosenzweig et al., 2020). Further, as unique dimensions of cost have been shown to be differentially predictive of academic outcomes (Perez et al., 2014), continued research is needed to understand how these dimensions function. Again, this is beneficial for researchers looking to develop cost-reducing interventions, as a student with high task effort cost beliefs may respond better to an intervention focused on time-management strategies, whereas a student with high emotional cost may respond better to an intervention focused on emotion regulation strategies. The final paper of this dissertation assesses how anticipated and experienced costs predict achievement and STEM-career intentions using a novel methodological approach, dynamic structural equation modeling (Asparouhov et al., 2018).

The Present Dissertation

Through a series of three studies, using the same sample, this dissertation explores cost as a type of anticipated belief and as an immediate experience assessed through intensive data collection methods (Bolger & Laurenceau, 2013) in order to provide more conceptual clarity around the construct of cost. The three papers that comprise this dissertation contribute to the literature both methodologically (measuring two types of cost) and substantively (examining relations of cost types to each other and to academic outcomes). The present dissertation has three main objectives:

1) Establish and validate a short cost scale for practical assessment and use with intensive longitudinal methods. The short cost scale is then used to measure experienced cost in studies two and three.
2) Provide more conceptual clarity of cost by disentangling emotional cost and negative emotions.
3) Explore the associations among anticipated cost, experienced cost, math achievement, and STEM persistence intentions (see the illustrative sketch below).
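To make the kind of short-term dynamics targeted by the third objective more concrete, the following is a minimal simulated sketch. It is purely illustrative: the variable names, sample size, and simple autoregressive structure are assumptions made here for exposition, not the dynamic structural equation models actually estimated in paper three.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n_students, n_days = 150, 11                          # hypothetical sample and diary length

# Pre-semester anticipated cost (e.g., on a 1-5 scale) sets a person-specific baseline.
anticipated = rng.normal(3.0, 0.8, n_students)
baseline = 1.0 + 0.6 * anticipated + rng.normal(0, 0.4, n_students)

# Daily experienced cost fluctuates around each student's baseline,
# with some carry-over (autoregression) of yesterday's deviation.
phi = 0.3
experienced = np.zeros((n_students, n_days))
experienced[:, 0] = baseline + rng.normal(0, 0.5, n_students)
for t in range(1, n_days):
    deviation = experienced[:, t - 1] - baseline
    experienced[:, t] = baseline + phi * deviation + rng.normal(0, 0.5, n_students)

# Between-person association: students who anticipate more cost tend to experience more.
person_means = experienced.mean(axis=1)
print("corr(anticipated, mean experienced):",
      round(np.corrcoef(anticipated, person_means)[0, 1], 2))
```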
Dissertation Structure

This dissertation takes the format of an independent three-article dissertation. That is, this dissertation includes three independent empirical studies, each addressing a different issue related to cost.

Paper One

The focus of paper one is to gather validity evidence for a short measure of students' perceptions of cost based on Flake et al.'s (2015) four-dimension cost measure. A multi-step validation process is used to determine items for a short cost scale. The value in validating a short cost scale is that the measure can be used in future studies that employ intensive longitudinal designs. The short cost scale that is validated in study one is subsequently used in studies two and three of this dissertation to measure experienced cost through the use of a daily diary methodology.

Paper Two

In the second study, I seek to provide conceptual clarity between emotional cost and negative emotions using one-time and repeated-measures data collected from a sample of undergraduate students from introductory calculus courses. Multiple forms of validity evidence are gathered (e.g., convergent, structural, predictive) to assess the possible jangle fallacy (Kelley, 1927) between emotional cost and negative emotions using an anticipated and experienced perspective.

Paper Three

The third study examines the dynamics of students' cost beliefs. Using dynamic structural equation modeling (Asparouhov et al., 2018), I examine the associations among the costs that students anticipate in their calculus classes, the costs they experience while taking the class, their STEM career intentions at the end of the semester, and their course achievement.

Contribution to the Field

The present dissertation provides an exploration of students' perceptions of cost in college mathematics courses. Study one provides researchers and educators with a short cost scale that will allow for quick assessments of students' perceptions of cost. This type of short scale will be particularly beneficial for researchers examining experienced cost in the classroom or when using intensive longitudinal methods that often require short assessments (Hektner et al., 2007). Study two contributes to the cost literature by providing an initial attempt to disentangle emotional cost and negative emotions. Study three provides additional evidence of the importance of considering both anticipated and experienced cost beliefs in understanding how perceptions of cost may shift over time, and how they influence important student outcomes. First, these studies have implications for how cost should be theorized and operationalized in future studies. Second, by examining differences between anticipated and experienced cost, researchers can make better-informed decisions regarding where and when to intervene when developing cost interventions. Last, to my knowledge, the outcomes being assessed have not been examined using the four-dimension cost scale. Therefore, insight regarding the types of cost that are associated with the outcomes will also provide researchers with more information for future research regarding interventions.

REFERENCES

Asparouhov, T., Hamaker, E. L., & Muthén, B. (2018). Dynamic structural equation models. Structural Equation Modeling: A Multidisciplinary Journal, 25(3), 359-388. doi:10.1080/10705511.2017.1406803

Barron, K. E., & Hulleman, C. S. (2015). Expectancy-value-cost model of motivation. In J. S. Eccles & K. Salmela-Aro (Eds.), International encyclopedia of social and behavioral sciences: Motivational psychology (2nd ed.). New York, NY: Elsevier.
Battle, A., & Wigfield, A. (2003). College women's value orientations toward family, career, and graduate school. Journal of Vocational Behavior, 62(1), 56. doi:10.1016/S0001-8791(02)00037-4

Bergey, B. W., Ranellucci, J., & Kaplan, A. (2019). The conceptualization of costs and barriers of a teaching career among Latino preservice teachers. Contemporary Educational Psychology, 59, 101794. doi:10.1016/j.cedpsych.2019.101794

Bolger, N., & Laurenceau, J. (2013). Intensive longitudinal methods: An introduction to diary and experience sampling research. New York: Guilford Press.

Conley, A. M. (2012). Patterns of motivation beliefs: Combining achievement goal and expectancy-value perspectives. Journal of Educational Psychology, 104, 32-47. doi:10.1037/a0026042

Eccles, J. S. (2005). Subjective task values and the Eccles et al. model of achievement-related choices. In A. J. Elliot & C. S. Dweck (Eds.), Handbook of competence and motivation (pp. 105-121). New York: Guilford.

Eccles (Parsons), J. S., Adler, T. F., Futterman, R., Goff, S. B., Kaczala, C. M., Meece, J. L., et al. (1983). Expectancies, values, and academic behaviors. In J. T. Spence (Ed.), Achievement and achievement motives (pp. 75-146). San Francisco, CA: W. H. Freeman.

Feldon, D. F., Callan, G., Juth, S., & Jeong, S. (2019). Cognitive load as motivational cost. Educational Psychology Review, 31(2), 319-337. doi:10.1007/s10648-019-09464-6

Flake, J. K., Barron, K. E., Hulleman, C., McCoach, B. D., & Welsh, M. E. (2015). Measuring cost: The forgotten component of expectancy-value theory. Contemporary Educational Psychology, 41, 232-244. doi:10.1016/j.cedpsych.2015.03.002

Gaspard, H., Dicke, A., Flunger, B., Schreier, B., Häfner, I., Trautwein, U., & Nagengast, B. (2015). More value through greater differentiation: Gender differences in value beliefs about math. Journal of Educational Psychology, 107(3), 663-677. doi:10.1037/edu0000003

Goetz, T., Bieg, M., & Hall, N. C. (2016). Assessing academic emotions via the experience sampling method. In M. Zembylas & P. A. Schutz (Eds.), Methodological advances in research on emotion and education (pp. 245-258). Springer, Cham. doi:10.1007/978-3-319-29049-2_19

Hektner, J. M., Schmidt, J. A., & Csikszentmihalyi, M. (2007). Experience sampling method: Measuring the quality of everyday life. Thousand Oaks, CA: Sage Publications.

Hilpert, J. C., & Marchand, G. C. (2018). Complex systems research in educational psychology: Aligning theory and method. Educational Psychologist, 53(3), 185-202. doi:10.1080/00461520.2018.1469411

Jiang, Y., Rosenzweig, E. Q., & Gaspard, H. (2018). An expectancy-value-cost approach in predicting adolescent students' academic motivation and achievement. Contemporary Educational Psychology, 54, 139-152. doi:10.1016/j.cedpsych.2018.06.005

Kelley, T. L. (1927). Interpretation of educational measurements. Oxford, England: World Book Co.

Perez, T., Cromley, J. G., & Kaplan, A. (2014). The role of identity development, values, and costs in college STEM retention. Journal of Educational Psychology, 106, 315-329. doi:10.1037/a0034027

Perez, T., Wormington, S. V., Barger, M. M., Schwartz-Bloom, R. D., Lee, Y., & Linnenbrink-Garcia, L. (2019). Science expectancy, value, and cost profiles and their proximal and distal relations to undergraduate science, technology, engineering, and math persistence. Science Education, 103(2), 264-286. doi:10.1002/sce.21490

Safavian, N., Conley, A., & Karabenick, S. (2013, April). Examining mathematics cost value among middle school youth. In E. M. Anderman (Chair), Is it worth my time and effort? Exploring students' conceptions of the cost of learning. Symposium conducted at the annual meeting of the American Educational Research Association, San Francisco, CA.
Trautwein, U., Marsh, H. W., Nagengast, B., Lüdtke, O., Nagy, G., & Jonkmann, K. (2012). Probing for the multiplicative term in modern expectancy-value theory: A latent interaction modeling study. Journal of Educational Psychology, 104(3), 763-777. doi:10.1037/a0027470

Widaman, K. F., Little, T. D., Preacher, K. J., & Sawalani, G. M. (2011). On creating and using short forms of scales in secondary research. In K. H. Trzesniewski, M. B. Donnellan, & R. E. Lucas (Eds.), Secondary data analysis: An introduction for psychologists (pp. 39-61). Washington, DC: American Psychological Association. doi:10.1037/12350-003

Wigfield, A., & Eccles, J. S. (2000). Expectancy-value theory of achievement motivation. Contemporary Educational Psychology, 25(1), 68-81. doi:10.1006/ceps.1999.1015

Wigfield, A., & Eccles, J. S. (2020). 35 years of research on students' subjective task values and motivation: A look back and a look forward. In A. J. Elliot (Ed.), Advances in motivation science (Vol. 7, pp. 161-198). Elsevier Inc.

Zirkel, S., Garcia, J. A., & Murphy, M. C. (2015). Experience-sampling research methods and their potential for education research. Educational Researcher, 44(1), 7-16. doi:10.3102/0013189X14566879

PAPER ONE. Validity Evidence for a Short Cost Scale of College Students' Perceptions of Cost

Abstract

Students' perceptions of cost and how much they have to sacrifice to engage in a class predict their academic success. Although there is a growing body of literature on how cost influences academic outcomes, we have little understanding of how students' costs fluctuate in their academic environment. Classroom constraints make collecting longitudinal data difficult, and short measures are needed to facilitate classroom-based research. In this paper, the 19-item scale developed by Flake et al. (2015) to measure four dimensions of students' perceptions of cost in college classrooms was shortened. A thorough validation study was conducted by considering multiple sources of validity evidence (content, structural, convergent, and predictive) from two sample classrooms (statistics and calculus) to select one item for each dimension. The resulting four-item scale was then piloted in introductory college calculus classrooms to examine how the items performed. This validation study results in a short scale that researchers can use to assess students' perceptions of cost in the classroom, paving the way for research into students' perceptions of cost, how it changes over time, and how we can intervene to support success.

Introduction

Historically, motivation researchers have studied what motivates students toward a goal; however, researchers have recently started focusing on what might motivate students away from a goal. One process that has been shown to motivate students away from goals is their perceptions of cost. Expectancy-value theory (Eccles et al., 1983) posits that cost reflects what one must sacrifice to engage in a task (i.e., their experiences) as well as the anticipated effort that will be needed to complete the task. Despite this conceptualization of cost from the 1980s, only in the past decade has the literature on students' experienced cost beliefs grown.
As cost beliefs may be malleable throughout a course, researchers have promoted the need to consider how these beliefs function in situ (Feldon et al., 2019). Psychological research into motivation, engagement, and achievement has the potential to impact real-world learning environments if we can translate research into actionable practices for educators to consider in the classroom (Slavin, 2018). Effective translation requires that we conduct research in classrooms; however, this can be difficult because research takes time away from instruction. One way to minimize the research burden for students and teachers in real-world classrooms is to use short measures that are unobtrusive and quick to complete (Bryk et al., 2013). In the current study, I focus on measuring college students' perceptions of cost in the classroom and how these costs influence academic and motivational outcomes.

Over the past decade, burgeoning research has found that students with high perceptions of cost, compared to students with low perceptions of cost, report lower intentions to attend graduate school in STEM fields (Battle & Wigfield, 2003), higher intentions to leave their STEM major (Perez et al., 2014), and lower achievement (Conley, 2012; Trautwein et al., 2012; Safavian et al., 2013); however, there is a lack of research examining how cost develops in the classroom over time. Further, researchers have found aspects of student motivation to be promising constructs to intervene on in the classroom for boosting students' achievement (Hulleman & Harackiewicz, 2009; Yeager & Walton, 2011). Barron and Hulleman (2015) proposed that students' perceptions of cost may be particularly promising to intervene on in the classroom. To advance and test motivation interventions and pedagogical tools, we need to examine how cost develops and influences behavior in real-world classrooms over time. Such studies are difficult to conduct because they require recruiting instructors who will allow time to be taken from instruction for data collection, repeatedly throughout the semester. The purpose of the current study is to provide validity evidence for a short cost scale to meet this end. I focus specifically on developing a measure that researchers can use for intensive longitudinal methods, such as experience sampling techniques (Hektner et al., 2007). Intensive longitudinal designs allow researchers to examine "...life as it is lived..." (Allport, 1942, p. 56) and within-person differences across time (Bolger & Laurenceau, 2013), both of which are challenging with longer surveys.

What is Cost and How Do We Measure It?

Eccles et al. (1983) first introduced cost as a part of the expectancy-value theory of achievement motivation. Expectancy-value theory posits that students' motivation to pursue achievement-related tasks is driven by the value that is placed on a task and the students' expectancies for success on the task (Eccles et al., 1983). Task value was proposed to contain multiple dimensions: attainment value refers to one's personal belief about the importance of a task; intrinsic value refers to whether the individual believes the task is enjoyable or interesting; and utility value refers to the usefulness of a task to one's life. Cost was included as a dimension of task value, as well as hypothesized to be a potential mediator of the relationship between task value and academic outcomes (Eccles et al., 1983; Linnenbrink-Garcia & Patall, 2015; Wigfield & Cambria, 2010). Similar to value, cost is hypothesized to predict achievement-related outcomes and choices.
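One compact way to summarize where cost sits in these two framings is the following schematic, a simplification for orientation only rather than a formal equation taken from Eccles et al. (1983) or Barron and Hulleman (2015); here M denotes motivation and E denotes expectancy for success.

```latex
% Cost folded into subjective task value (classic expectancy-value reading):
\[ M \propto E \times \underbrace{(\text{attainment} + \text{intrinsic} + \text{utility} - \text{cost})}_{\text{subjective task value}} \]

% Cost pulled out as its own component (expectancy-value-cost reading):
\[ M \propto E \times \text{value} - \text{cost} \]
```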
Although students' expectancies and values have been studied in depth, perceptions of cost have largely been understudied until more recently (Flake et al., 2015; Johnson & Safavian, 2016; Perez et al., 2014; Wigfield & Cambria, 2010). Recent work has found students' perceptions of cost to be associated with achievement, avoidance goals, negative classroom affect, and STEM major intentions (Bergey et al., 2018; Jiang et al., 2018; Johnson et al., 2016; Perez et al., 2014, 2019). Researchers have also found cost perceptions to increase throughout the course of college (Robinson et al., 2018), with females experiencing greater increases in perceived cost compared to males (Gaspard et al., 2017).

As originally conceptualized by Eccles and colleagues (1983), cost was composed of three dimensions (task effort, loss of valued alternatives, and psychological cost of failure). Broadly, cost refers to an individual's subjective beliefs about negative consequences associated with engaging with a task (Eccles et al., 1983; Linnenbrink-Garcia & Patall, 2015; Wigfield & Cambria, 2010). Effort cost refers to the amount of effort one must put forth to engage in an activity. Loss of valued alternatives refers to the valued alternatives one must give up in order to complete an activity. Psychological cost, also referred to as emotional cost in some research, is the negative psychological experiences one might encounter when completing an activity (e.g., stress or anxiety; Eccles et al., 1983; Wigfield & Eccles, 2000).

Researchers have mostly focused on measuring two or three dimensions of cost (Conley, 2012; Luttrell et al., 2010; Perez et al., 2014; Trautwein et al., 2012); however, other researchers have proposed additional dimensions of cost beyond the original three (Flake et al., 2015; Johnson & Safavian, 2016). From this recent research, multiple cost scales have emerged (Conley, 2012; Flake et al., 2015; Kosovich et al., 2015; Perez et al., 2014; Trautwein et al., 2012). Flake et al.'s (2015) measure of cost was selected for shortening due to the theoretical breadth of its dimensions and the depth of validity evidence for the scale. Flake et al. (2015) proposed a model of cost with four distinct dimensions. This model extended previous research in two ways: specific construct definitions were developed to differentiate cost from the other components of expectancy-value theory, and those definitions were not subject specific, such that costs could be measured in a variety of classes and/or with different populations of students. The four dimensions include task effort cost, outside effort cost, loss of valued alternatives, and emotional cost (for the full scale and construct definitions, see Table 1.1). Items to measure these dimensions were reviewed by experts and selected through multiple psychometric studies testing the factor structure of the scale.

Table 1.1. Original cost items

Task effort cost – negative appraisals of time, effort, or amount of work put forth to engage in a task
  TE1  This class demands too much of my time.
  TE2  I have to put too much energy into this class.
  TE3  This class takes up too much time.
  TE4  This class is too much work.
  TE5  This class requires too much effort.

Outside effort cost – negative appraisals of time, effort, or amount of work put forth for tasks other than the task of interest
  OE1  I have so many other commitments that I can't put forth the effort needed for this class.
  OE2  Because of all of the other demands on my time, I don't have enough time for this class.
  OE3  I have so many other responsibilities that I am unable to put in the effort that is necessary for this class.
  OE4  Because of other things that I do, I don't have time to put into this class.

Loss of valued alternatives – a negative appraisal of what is given up as a result of engaging in the task of interest
  L1   I have to sacrifice too much to be in this class.
  L2   This class requires me to give up too many other activities I value.
  L3   Taking this class causes me to miss out on too many other things I care about.
  L4   I can't spend as much time doing the other things that I would like because I am taking this class.

Emotional cost – negative appraisals of a psychological state that results from exerting effort for the task
  EM1  I worry too much about this class.
  EM2  This class is too exhausting.
  EM3  This class is emotionally draining.
  EM4  This class is too frustrating.
  EM5  This class is too stressful.
  EM6  This class makes me feel too anxious.

Note: Operational definitions and items taken from Flake et al. (2015).
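To make the four-factor structure concrete, the sketch below shows how the measurement model for the full 19-item scale might be written in lavaan-style syntax using the third-party semopy package, assuming a data frame whose columns are named as in Table 1.1. This is an illustrative specification only, not the analysis script used in the studies reported below.

```python
import pandas as pd
from semopy import Model   # SEM package that accepts lavaan-style model syntax

# Four correlated first-order factors, one per cost dimension in Flake et al. (2015).
MODEL_DESC = """
TaskEffort    =~ TE1 + TE2 + TE3 + TE4 + TE5
OutsideEffort =~ OE1 + OE2 + OE3 + OE4
LossOfAlt     =~ L1 + L2 + L3 + L4
Emotional     =~ EM1 + EM2 + EM3 + EM4 + EM5 + EM6
"""
# A higher-order general cost factor could be added with a line such as:
# Cost =~ TaskEffort + OutsideEffort + LossOfAlt + Emotional

items = pd.read_csv("cost_items.csv")   # hypothetical file of item responses
cfa = Model(MODEL_DESC)
cfa.fit(items)
print(cfa.inspect())                     # loadings and factor covariances
```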
Task effort cost, loss of valued alternatives, and emotional cost were adapted from the original Eccles et al. (1983) conceptualization, but specifically focus on the negative appraisal of each type of cost. As discussed by Flake et al. (2015) in response to focus groups conducted with students, for something to be considered a cost, it must reflect a negative subjective appraisal that the work being done is too much. That is, an item that reads "This class requires a lot of effort" may not capture a negative appraisal, whereas an item that reads "This class requires too much effort" more clearly captures this. Outside effort cost was added to the model and was defined as the negative appraisals of the effort, time, or amount of work that is put forth on other tasks that take away focus from the focal activity. For example, a student may have calculus homework to complete but experience outside effort cost while completing it, because they also need to spend time that evening taking care of a sick family member.

Another aspect of the Flake et al. (2015) operationalization and scale is that it is not domain or subject specific. It was developed to capture a general level of students' cost in a variety of classes and in a way that could be applicable to a diverse range of students. The result is that this scale measures how much cost a student is experiencing to engage in a class and cannot capture why a student is experiencing that cost. If a student endorses the item "Taking this class forces me to give up too many other things I care about", we do not know what those things that they care about are. This is a different approach than asking something like, "This class forces me to give up time with family", which a student with high or low cost may endorse, given how much they value time with their family. Identifying why students experience costs requires qualitative modes of inquiry and is not captured by the quantitative ratings given by students on this cost scale (Flake et al., 2015, p. 242).

The Need for a Shortened Scale

As longitudinal research in classrooms is necessary if we are to understand how cost develops and interacts with academic and motivational outcomes over time, a short scale measuring cost is needed.
Feldon et al. (2019) echo this, suggesting a need to measure cost using event-sampling techniques, a type of intensive longitudinal methodology, in order to describe the dynamic nature of motivational change that students experience as they juggle many classes and outside activities from day to day and week to week.

Research using intensive longitudinal data collection methods has increased significantly over the past decade (Hamaker & Wichers, 2017); however, using methods such as experience sampling and daily diaries in education research comes with its own set of challenges. For example, researchers using the experience sampling technique (i.e., students are assessed at random times) in a classroom may struggle to collect data multiple times during a 40-minute class period over ten consecutive days. Because surveys can cause disruptions in the classroom, it is important that the surveys are kept short when using intensive longitudinal methods (Goetz et al., 2016; Hektner et al., 2007; Zirkel et al., 2015). Though measuring one construct with 15-20 items may not take long, researchers typically measure multiple constructs. It is recommended to keep survey completion time under two minutes (Csikszentmihalyi & Larson, 2014) so that students are less likely to become fatigued or experience negative emotions (Gogol et al., 2014).

Researchers have also suggested the need to examine complex systems in education research (Hilpert & Marchand, 2018). There are many longitudinal studies examining change in educational constructs such as expectancies, values, and cost over longer time periods (Robinson et al., 2018), but short-term change has largely been ignored (for an exception, see Kosovich et al., 2017). By using intensive methods, researchers can focus on the complexities of within- and between-person differences among students during the short-term duration of a single class (Murayama et al., 2017). Finally, understanding students’ experiences during class is critical for making practical recommendations to educators and researchers (Zirkel et al., 2015). Collecting these data allows researchers to make important recommendations to educators about, for example, which activities are associated with the highest levels of student engagement in the classroom. Moreover, researchers can then make intervention recommendations at the most opportune times.

Researchers have used single items or short forms to assess many constructs in education research when using intensive longitudinal methods, including emotions (Bieg et al., 2017; Goetz et al., 2010), perceived challenge (Strati et al., 2016), expectancies and perceived utility (Durik et al., 2018), and engagement (Shernoff & Vandell, 2007), among others. Having a short form to assess cost would allow researchers to understand this construct in real time and how it is related to other important constructs. Short scales come with concerns such as limited theoretical breadth (Smith et al., 2000, 2012; Widaman et al., 2011); however, through rigorous validation, they can demonstrate acceptable psychometric properties and serve an important purpose. To address the concerns regarding short scales (Smith et al., 2000, 2012; Widaman et al., 2011), I conducted three studies, gathering multiple sources of validity evidence for the shortened measure, which I outline below.
The Present Validity Study In order to develop a scale that can be used to assess cost beliefs in real-time, I gathered multiple sources of validity evidence, including content, structural, convergent, and predictive that are consistent with a unified theory of construct validity (AERA et al., 2014; Bandalos, 19 2018). My goal was to develop a four-item cost scale (one item per dimension of cost) that would be able to assess each dimension individually, but also would be able to measure a higher- order cost factor including all four dimensions. First, in study one, items were ranked from the full scale based on multiple sources of validity evidence and one item was selected from each subscale to create a short, four-item form. Consistent with prior research validating short scales (Cheung & Lucas, 2014; Donnellan et al., 2006), those four items were then examined in studies two and three to ensure they maintained similar properties. In study two, a larger sample was used to evaluate those items in comparison to results from the full scale. Last, in study three, the short cost scale was piloted to examine how it performed and to show how college students’ perceptions of cost may vary over time. Evidence of Content Validity Evidence of content validity addresses how the construct is operationalized, defined, and measured. Longer measures are often considered to be superior when compared to shorter ones because a short-form may not capture the multidimensionality or needed content breadth of a construct (Jordan & Turner, 2008). One avenue for addressing this concern is to have expert reviewers report on whether the short measure sufficiently captures all aspects of the construct (Widaman et al., 2011). In the current study, I took this approach by asking experts to review the full scale and report on which item from each dimension of cost they believed to be the most relevant. One concern with this approach is that any one expert’s opinion will weigh too heavily in the item selection (Smith et al., 2000). To address this in the assessment of content validity, I aggregated responses from multiple experts. 20 Evidence of Structural Validity Assessing the psychometric properties of a short scale can be challenging and one concern is that the items selected do not strongly relate to the construct they are meant to measure (Widaman et al., 2011). I address this concern by comparing the pattern coefficients from all items in the long version and identifying the item with the strongest loading across two waves of data collection with a single sample. Then, to decrease the likelihood of a sample specific selection, I evaluate those items in a second, larger sample to ensure consistent desirable properties (Widaman et al., 2011). Another concern with short scales is that they will not retain the factorial integrity of the longer form (Smith et al., 2000; Widaman et al., 2011). As recommended by Smith et al. (2012), I address this concern by evaluating the factor structure of the shortened scale in study three. Although I was unable to test whether each single item retains the factor structure of each dimension, I was able to test whether the four items retain the higher order factor of general cost. Evidence of Convergent Validity If scores from the shortened cost scale are to be interpreted as measures of cost, they should relate to other theoretically relevant constructs in a similar manner to that of the longer version of the scale. 
Here, I considered evidence of convergent validity, which refers to the degree to which a construct exhibits expected intercorrelations with other theoretically relevant constructs (Widaman et al., 2011). Items in a short-scale should correlate similarly to theoretically-relevant constructs as the full-scale does. I examined convergent evidence of the short-form by comparing correlations between all items and single items to expectancies and values in study one. Through this process, I identified one item from each subscale that best recovers the relationship from the full scale. Then, in a second sample from another classroom, I 21 again examined how these items correlate to the same constructs, ensuring they correlate similarly to the other motivational constructs as the full scale does. Evidence of Predictive Validity A central goal of the shortened version of the scale is that it can be used in classrooms to predict student outcomes. Long scales are often thought to have stronger predictive validity than short scales (Lord & Novick, 1968) due to the capability of long scales to record greater discrimination by increasing the number of categories in the scale (Bergkvist & Rossiter, 2007). To evaluate that the scores from the short version will predict outcomes similarly to the longer version, I examined regressions of the full-scale as well as all single items on key outcomes of interest (i.e., achievement and continued interest) and identified which items adequately recover the predictive relationship when compared to the full scale. Study One The purpose of study one was to establish initial validity evidence for the use of a short- cost scale in college classrooms. My goal was to choose one item from each subscale for further evaluation. I assessed four different forms of validity evidence: content validity, structural validity, convergent validity, predictive validity with two waves of data from the same sample. Sample Method This study draws on previously collected data of students’ motivation in an introductory statistics course during 2015. The statistics course took place at a public Canadian university and consisted of 154 college students. Demographic information was not collected because the institutional review board deemed this information to be sensitive. During the year of data collection, 78% of undergraduates reported that they were Canadian citizens. The statistics 22 course was offered in the faculty of health which includes majors such as psychology, kinesiology, and nursing. Demographic information for the larger population of students in the faculty of health are: 65% female and 88% between the ages of 17 to 76 with a mean age of 22.5. Procedure and Measures Students responded to an online survey for participation credit across four time points during the semester assessing their perceptions of cost, expectancies, and values. During the final survey students were also asked to respond to items related to their continued interest in statistics. Midterm test grades and final course grades were also collected. In the present study, I used data from time points two and four (T2 and T4) separately across all tests of validity. I preregistered this study (link for preregistration: https://osf.io/qrjc7/) and selected T2 because it occurred after students had some experience in the course but before they had taken the midterm, so midterm grades could be used as a performance outcome. 
T4 was chosen because it occurred at the end of the course, when students had formed an impression of their costs, but before final exams and final grade submission (so final grade could be used as an outcome).

Cost
Students’ perceptions of cost were assessed at time points two and four using a Likert scale that ranged from 1 (completely disagree) to 6 (completely agree) (Flake et al., 2015; see Table 1.1 for the full scale and definitions). The scale assesses four dimensions of cost: task effort cost (five items; “This class demands too much of my time.”; ω = T2: 0.93 [0.91, 0.94]; T4: 0.96 [0.95, 0.97]), outside effort cost (four items; “I have so many other responsibilities that I am unable to put in the effort that is necessary for this class.”; ω = T2: 0.90 [0.87, 0.93]; T4: 0.93 [0.91, 0.95]), loss of valued alternatives (four items; “I can’t spend as much time doing the other things that I would like because I am taking this class.”; ω = T2: 0.91 [0.88, 0.93]; T4: 0.93 [0.91, 0.95]), and emotional cost (six items; “This class is too stressful.”; ω = T2: 0.94 [0.93, 0.96]; T4: 0.96 [0.94, 0.97]). Interval omega reliability estimates are reported for all cost subscales because a factor model was used; 95% confidence intervals are presented in brackets.

Theoretically Relevant Variables
Separate scales were used to assess students’ expectancies and values related to statistics at time points two and four (Kosovich et al., 2015). Three items were used to assess students’ expectancies (e.g., “I know I can learn the material in my statistics class.”; α = T2: .84 [0.79, 0.88]; T4: .94 [0.92, 0.96]) and three items were used to assess students’ values (e.g., “I think my statistics class is useful.”; α = T2: .89 [0.86, 0.92]; T4: .95 [0.93, 0.63]). At the end of the semester (T4), continued interest (i.e., a desire to re-engage with an activity or task in the future) was also assessed (three items; “I am interested in taking more statistics classes.”; α = .88 [0.85, 0.92]). These items were adapted from Kosovich et al. (2015; e.g., “I want to take more math/science classes in the future”). All items were assessed on a 6-point Likert scale from 1 (completely disagree) to 6 (completely agree). Interval alpha reliability estimates, with 95% confidence intervals, are reported for scales that did not use a factor model. Students’ grades (midterm exam grades and final course grades) were also collected as measures of achievement. Grades were on a 0-100% scale.
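To make the reliability reporting concrete, the following is a minimal sketch, and not the code used in this study, of how a point estimate of coefficient omega can be obtained from a one-factor CFA, here for the task effort cost subscale. The data frame name (costs) and column names (te1-te5) are hypothetical placeholders; the interval estimates reported above would additionally require a bootstrap or a dedicated package, which is omitted for brevity.

```r
# Minimal sketch (not the study's original code): coefficient omega for the
# task effort cost subscale from a one-factor CFA. `costs` and te1-te5 are
# hypothetical placeholders for the item-level data.
library(lavaan)

te_model <- 'task_effort =~ te1 + te2 + te3 + te4 + te5'

# std.lv = TRUE fixes the factor variance to 1, so omega can be computed
# directly from the loadings and residual variances.
fit <- cfa(te_model, data = costs, std.lv = TRUE)

est    <- lavInspect(fit, "est")
lambda <- est$lambda[, "task_effort"]  # factor loadings
theta  <- diag(est$theta)              # residual variances

# omega = (sum of loadings)^2 / ((sum of loadings)^2 + sum of residual variances)
omega <- sum(lambda)^2 / (sum(lambda)^2 + sum(theta))
omega
```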
Data Analytic Strategy
As described above, I examined four sources of validity evidence. Equal weight was given to each, and items were selected after considering all of the evidence: expert rankings, confirmatory factor analysis, correlations, and linear regressions (each described in more detail below). For each source of evidence, one item was ranked as the “best item.” For example, if one item performed best in three out of four types of evidence, it would be selected for the shortened scale. If a tie existed between items, I examined the next “best item.”

Capturing the theoretical breadth of cost is central to the proposed use of the scale for understanding how cost changes over time. As such, I presented the full version of the scale to a panel of seven experts in the field of motivation who were not involved in the current study. All experts held a Ph.D. in educational psychology or a similar field and were familiar with cost research. Each expert received an electronic form including each of the items for the four dimensions of cost along with the definitions for each dimension. Experts ranked the items for each dimension based on how relevant they were to the given dimension (1 = most relevant; 2 = second most relevant, etc.). I calculated the average ranking for each item across the seven experts to determine the item that was rated as the most relevant; the item with the mean ranking closest to 1 for each dimension was considered the most relevant.

To evaluate the assumption that the shortened scale would have strong structural validity evidence, I used confirmatory factor analysis to examine the highest factor loadings at time points two and four. The items were ranked from highest to lowest factor loading at both time points (1 = highest factor loading). I then calculated the mean of the rankings of each item across both time points and recorded the item with the highest average ranking.

Next, I evaluated whether the shortened scale showed strong convergent validity by examining correlations between the cost scale and students’ expectancies and values. I correlated students’ perceptions of cost with expectancies and values for each subscale. Correlations of time two variables were only examined with other time two variables, and correlations of time four variables were only examined with other time four variables. I correlated both the full subscale for each dimension and each individual item of each subscale. It was expected that cost would be negatively correlated with expectancies and value. To rank the items, I used three criteria. First, the statistical significance and magnitude of the correlations were examined for similarities between single items and the full subscale with regard to expectancies and values. This provided evidence of whether the full subscales and single items yield the same conclusions regarding the hypothesized associations between perceived cost and expectancies and values. Second, to examine whether the single items and the full subscales produce similar correlations, the correlations from the single items were subtracted from those of the full subscales. The average difference (using expectancies and values) was then calculated. An average difference that is close to zero suggests that not much information is lost when going from the full subscale to a single item. Third, I calculated the absolute value of the difference in correlations to quantify the magnitude of the differences. An average (using expectancies and values) was then taken of the absolute values across the scores. Again, an average score close to zero suggested that the magnitudes do not differ much between the single items and the full subscale. The item for each subscale that provided the strongest evidence across both time points (i.e., a difference closest to 0) was ranked the highest.

Finally, I evaluated whether a single item for each dimension of cost would predict outcomes similarly to the full subscale for that dimension (e.g., task effort cost). This is critical for the scale’s intended use to predict outcomes in classroom settings. I ran linear regression models to examine the predictive validity of each item. At time two, each subscale was used to predict students’ midterm grades, and at time four, each subscale was used to predict students’ final grades and continued interest in statistics.
Then, each single-item was included in a linear regression (one item for each regression) as a predictor of grades and continued interest (using the time points described 26 above). I hypothesized that cost would negatively predict both grades and continued interest in statistics. I considered two ways of examining evidence of predictive validity. The first suggests that the purpose of predictive validity should not be to choose the item that is the most predictive, but rather choose the item that is closest to the true correlation (i.e., the correlation that is closest to the correlation of the full-scale; Borsboom et al., 2004; Rossiter, 2002). The second suggests that if comparing between a full scale and a single item of the same construct, the true correlation is unknown and therefore the higher of the correlations should be chosen (Bergkvist & Rossiter, 2007). I chose to focus on the most predictive item. The R2 values were examined to ensure that researchers will not be sacrificing much predictive validity when using the single item sub-scales. I examined mean scores of R2 values across all three outcome variables (grades at time two and four and continued interest). The item with the highest average R2 was chosen as this suggests that item is explaining the most variance in the outcome. Preliminary Analysis Results I first examined means and standard deviations of all cost variables at both time points (see Table 1.2). Correlations of all cost items and subscales are included in the Appendix from the full Flake et al. (2015) scale. At time two, 154 students completed the survey and at time four, 137 students completed the survey. Thus, the attrition rate was 11% from T2 to T4. 27 Time 2 SD M Table 1.2. Means and standard deviations of cost items, full scale administration Time 4 SD M Task effort cost 3.20 1.08 3.42 1.28 3.16 1.24 3.33 1.37 TE1 3.30 1.32 3.53 1.31 TE2 3.05 1.18 3.30 1.40 TE3 3.18 1.18 3.54 1.39 TE4 3.31 1.22 3.42 1.44 TE5 Outside effort cost 2.91 1.06 3.03 1.15 2.92 1.25 2.96 1.27 OE1 2.83 1.18 3.08 1.26 OE2 3.02 1.22 3.12 1.36 OE3 2.87 1.17 2.99 1.18 OE4 Loss of valued alternatives 2.75 1.03 2.95 1.18 2.60 1.18 2.82 1.24 L1 2.73 1.09 2.94 1.31 L2 L3 2.62 1.14 2.93 1.39 3.03 1.25 3.09 1.29 L4 Emotional cost 3.09 1.15 3.32 1.29 3.56 1.36 3.66 1.44 EM1 EM2 2.97 1.17 3.24 1.38 2.83 1.32 3.17 1.52 EM3 2.87 1.21 3.20 1.36 EM4 EM5 3.08 1.28 3.31 1.39 EM6 3.25 1.48 3.34 1.43 Note: The range for all items was between 1 = Completely Disagree and 6 = Completely Agree. 28 Content Validity Table 1.3 includes the results from the expert rankings. On average, when examining task effort cost, item TE5 (See Table 1.1 for original cost items), was rated as most relevant. For outside effort cost, item OE1, had the highest average rating of relevance to the construct. The average ratings suggested that item L2 for loss of valued alternatives was the most relevant. Last, item EM3 of emotional cost, was rated as the most relevant across all experts. 29 Table 1.3. Expert rankings Item Task effort cost TE1 TE2 TE3 TE4 TE5 Outside effort cost OE1 OE2 OE3 OE4 Loss of valued alternatives L1 L2 L3 L4 Emotional cost EM1 EM2 EM3 EM4 EM5 EM6 Note: An average item ranking closer to 1 represents the item that was ranked as most relevant to the construct definition. Rankings of each subscale were as follows: task effort cost (1-5), outside effort cost (1-4), loss of valued alternatives (1-4), emotional cost (1-6). 
Average ranking 3.00 3.57 2.71 2.86 1.86 2.14 2.57 2.57 2.29 2.71 1.86 2.71 2.29 4.14 4.14 2.00 4.00 2.86 3.57 30 Structural Validity I examined a four-factor CFA at T2 and T4 using maximum likelihood (see Table 1.4) with Lavaan (v.0.6-5) in R (Rosseel, 2012). The CFA at time two fit the model well (CFI = .97, TLI = .96, RMSEA = .06). At T2, the highest factor loadings for each item were as follows: TE1, OE4, L2, EM5. The CFA at time four also fit the model well (CFI = .97, TLI = .96, RMSEA = .07). At T4, the highest factor loadings for each item were as follows: TE4, OE3, L2, EM5. I then examined the average ranking of each item across T2 and T4. That is, I assigned a value of 1 to the highest factor loading, a value of 2 to the second highest factor loading, and so on. Across both CFA’s the items with the highest factor loadings were TE1, OE3, L2, and EM5. I note that all items had strong factor loadings, most above .80 and all above .78. These factor loadings were comparable to the original developed scale (Flake et al., 2015). 31 Item Rank T2 1 2 5 4 2 4 2 3 1 3 1 2 4 6 4 5 3 1 2 T4 3 5 2 1 4 2 3 1 4 3 1 2 4 6 2 4 5 1 3 Table 1.4. Confirmatory factor analysis at time 2 and time 4 Item Task effort TE1 TE2 TE3 TE4 TE5 Outside effort OE1 OE2 OE3 OE4 LOVA L1 L2 L3 L4 Emotional EM1 EM2 EM3 EM4 EM5 EM6 Note: The factor loadings are standardized pattern coefficients. LOVA = Loss of Valued Alternatives Average Rank 2 3.5 3.5 2.5 3 3 2.5 2 2.5 3 1 2 4 6 3 4.5 4 1 2.5 Outside effort T4 T2 .888 .811 .825 .883 .890 .82 .842 .876 Task effort T4 T2 .887 .913 .878 .853 .921 .783 .939 .844 .853 .903 LOVA T2 .825 .921 .831 .808 T4 .862 .939 .872 .817 Emotional T2 .808 .854 .844 .857 .900 .864 T4 .841 .919 .876 .858 .939 .883 32 Convergent Validity Task effort cost, outside effort cost, loss of valued alternatives, and emotional cost were all negatively correlated with expectancies and value. Another consistent pattern was that expectancies had higher negative correlations with every dimension of cost compared to value. When examining each of the full subscales of cost every correlation with expectancies and value were significant. The correlations between expectancies and each individual item were also significant (p <.001); however, when examining correlations between value and each of the individual items, there were inconsistencies in the level of significance when comparing value to the full subscale and the individual items. For example, at time two, the correlation between of OE3 and value was -.143 (p = .08); however, the correlation between outside effort cost (full subscale) and value was -.235 (p = .003). The items with the smallest absolute value difference were TE5, OE2, L2, EM3. These four items were all negatively and significantly correlated with expectancies and values in a similar way when comparing them to the subscale correlations. See Table 1.5 for results. 33 Time 2 Time 4 Avg. difference Absolute value of difference Table 1.5. 
Time 2 and 4 correlations Task effort cost TE1 TE2 TE3 TE4 TE5 Outside effort cost OE1 OE2 OE3 OE4 LOVA L1 L2 L3 L4 Emotional cost EM1 EM2 EM3 EM4 EM5 EM6 Note: The average difference in correlations was calculated by subtracting the single item correlations from the full-subscale correlation across both time points and then taking the average (e.g., (Task Effort Expectancy Time 2 – TE1 Expectancy Time 2) + (Task Effort Expectancy Time 4 – TE1 Expectancy Time 4) + (Task Effort Value Time 2 – TE1 Value Time 2) + (Task Effort Value Time 2 – TE1 Value Time 2)/4). I then took the absolute value of the average difference. Items with the smallest absolute difference are bolded. LOVA = Loss of valued alternatives. Expectancies Value -.517*** -.448*** -.498*** -.451*** -.486*** -.517*** -.462*** -.414*** -.451*** -.407*** -.408*** -.500*** -.420*** -.478*** -.473*** -.437*** -.632*** -.552*** -.560*** -.579*** -.593*** -.577*** -.574*** -.329*** -.320*** -.311*** -.271*** -.296*** -.332*** -.308*** -.298*** -.300*** -.241** -.285*** -.392*** -.352*** -.407*** -.320*** -.340*** -.405*** -.287*** -.398*** -.415*** -.366*** -.356*** -.379*** Expectancies Value -.562*** -.530*** -.458*** -.460*** -.498*** -.521*** -.452*** -.374*** -.443*** -.361*** -.408*** -.517*** -.493*** -.463*** -.445*** -.428*** -.723*** -.627*** -.629*** -.687*** -.601*** -.623*** -.649*** -.274*** -.275*** -.172* -.254*** -.227** -.278*** -.235** -.205** -.294*** -.143 -.183* -.222** -.234** -.174* -.161* -.213** -.377*** -.238** -.387*** -.337*** -.373*** -.337*** -.331*** -.027 -.061 -.062 -.044 -.009 -.042 -.008 -.076 -.004 -.033 -.027 -.058 -.053 -.108 -.041 -.030 -.051 -.061 -.051 .027 .061 .062 .044 .009 .042 .008 .076 .040 .033 .027 .058 .053 .108 .041 .030 .051 .061 .051 34 Predictive Validity For all measures of cost, I found that students who reported higher levels of cost had lower midterm exams, final grades, and reported lower continued interest in statistics compared to students who reported lower levels of cost. When examining which items were most similar to the full subscales for each dimension of cost, I found that TE5, OE4, L2, and EM3 were the most predictive items (see Table 1.6). 35 Table 1.6. Regressions on midterm exam, final grade, and continued stats interest Task effort TE1 TE2 TE3 TE4 TE5 Outside effort OE1 OE2 OE3 OE4 LOVA L1 L2 L3 L4 Emotional EM1 EM2 EM3 EM4 EM5 EM6 Note: LOVA = Loss of Valued Alternatives. Midterm exams were predicted using time two variables. Final grade and continued stats interest were predicted using time four variables. Continued Stats Interest B (SE) -.37*** (.08) -.34*** (.08) -.32*** (.08) -.28*** (.08) -.34*** (.08) -.32*** (.07) -.38*** (.09) -.25** (.09) -.35*** (.08) -.25*** (.08) -.41*** (.09) -.42*** (.09) -.31*** (.09) -.41*** (.08) -.31*** (.08) -.32*** (09) -.45*** (.08) -.28*** (.08) -.39*** (.08) -.40*** (.07) -.39*** (.08) -.37*** (.08) -.39*** (.07) Final Grade B (SE) -3.15*** (.84) -2.41** (.80) -2.86*** (.83) -2.08** (.79) -3.02*** (.77) -3.19*** (.74) -3.21*** (.94) -2.46** (.86) -2.67** (.87) -2.68*** (.80) -2.79** (.93) -3.05*** (.92) -2.22** (.89) -2.74*** (.83) -2.60*** (.78) -2.36** (.86) -4.01*** (.81) -2.97*** (.74) -3.25*** (.77) -3.26*** (.69) -3.80*** (.77) -3.36*** (.76) -3.16*** (.74) Avg. 
R2 .121 .100 .100 .071 .100 .137 .074 .048 .073 .051 .078 .093 .069 .102 .065 .072 .171 .099 .150 .177 .142 .142 .128 Midterm Exam B (SE) -4.59*** (.96) -3.46*** (.83) -3.29*** (.80) -3.16*** (.90) -3.07*** (.92) -4.45*** (.82) -2.28* (1.04) -1.75* (.88) -2.17* (.94) -1.00 (.89) -2.07* (.93) -3.34*** (1.03) -.3.22*** (.89) -2.94** (.98) -1.43 (.95) -2.76*** (.84) -4.70*** (.89) -3.05*** (.76) -4.61*** (.86) -4.22*** (.77) -3.60*** (89) -3.90*** (.80) -2.65*** (.72) R2 .137 .108 .105 .078 .072 .169 .032 .027 .036 .009 .033 .068 .082 .059 .015 .070 .162 .099 .165 .173 .101 .143 .087 R2 .095 .063 .081 .049 .102 .122 .079 .057 .066 .076 .062 .075 .043 .075 .076 .053 .154 .106 .116 .142 .154 .127 .118 R2 .131 .128 .104 .087 .126 .121 .110 .060 .116 .068 .139 .137 .083 .171 .105 .094 .197 .093 .170 .216 .170 .156 .180 36 Study One Discussion The goal of study one was to establish initial validity evidence for a shortened scale assessing students’ perceptions of cost and identify one item from each subscale to further evaluate. I focused on four pieces of validity evidence to support the use of the scale in classrooms to measure cost over time and predict outcomes. The final items selected were as follows (see Table 1.7): TE5, OE4, L2, EM3. I focus on these items moving forward in studies two and three. 37 L2 EM3 CFA TE1 OE3 L2 EM5 Table 1.7. Final item selection Exp. Ranking Task effort TE5 Outside effort OE1 LOVA Emotional Task effort cost TE5 Outside effort cost OE4 Loss of valued alternatives L2 Emotional cost EM3 Note: LOVA = Loss of Valued Alternatives. Correlations TE5 OE2 L2 EM3 Regression TE5 OE4 L2 EM3 Final Selection TE5 OE4 L2 EM3 This class requires too much effort Because of other things that I do, I don’t have time to put into this class. This class requires me to give up too many other activities I value. This class is emotionally draining. 38 Though I developed this ranking system as a reasoned, a-priori way of choosing items, I note that most of the items performed similarly and differences that determined rankings were often differences in a loading or correlation out to the hundredths place, putting items in the “best” category by a slim margin. Despite this, some items were ranked consistently as the best item across all four sources of validity evidence. For example, when examining task effort cost, three out of four tests suggested that TE5 was the best item for selection. The CFA was the only test that did not find TE5 as the best item for selection; however, at both time points TE5 still had a high factor loading (all loadings were above .80). EM3 also was the best item in three out of four tests. Similar to task effort cost, the CFA showed that EM5 had the highest factor loading across two time points. Again, EM3 still had a high factor loading (above .84) for both time points. Last, in all four tests, L2 was found to be the best item. For outside effort cost, there was no item consistency across pieces of validity evidence. However, differences in the items were small, so I examined the “second-best” item across each of the tests. In doing so, I found that OE4 was the next best item in the expert rankings, CFA, and correlations. Therefore, I choose OE4 as the final item for outside effort cost. I next, examined how these items performed in a larger sample in study two. Study Two The purpose of study two was to evaluate the use of the items chosen for the short-cost scale in another sample. 
In study two, I considered structural, convergent, and predictive validity using the same methods that were used in study one. The purpose of study two was to evaluate if the items chosen in study one demonstrated replicable and acceptable psychometric properties in study two. 39 Sample Method The sample for this study was taken from a larger study assessing students’ cost beliefs throughout a semester. Data were collected during the Fall of 2018 from large introductory calculus courses at a midwestern university in the United States (N = 563 students who completed the post-survey). Demographic information was collected from institutional records from the university. Of this sample, 37% identified as female. Race/ethnicity was reported as follows: 66% White, 15% International, 9% Asian, 4% Black, 4% Hispanic, 2% two or more races; less than 1% not reported. This sample was fairly representative of the university: 67% White, 13% International, 5% Asian, 7% Black, 4% Hispanic, 3% two or more races. Students’ reported their class level as follows: 82% first-year; 14% second-year; 3% third-year; 1% fourth- year; less than 1% other. Procedure and Measures Students completed a pre-survey during the first two weeks of the semester and a post- survey during the final two weeks of the semester. These surveys were distributed through an online link as a departmental wide survey. Students who participated in the pre- and post-survey received course credit added onto their final course grade. For the purposes of this study, I use only those who completed the post-survey. Cost Students’ perceptions of cost were assessed during the post-survey using the same full- scale as in study 1 (Flake et al., 2015), but using a Likert scale that ranged from 1(strongly disagree) to 7(strongly agree). Reliabilities were as follows: task effort cost: w = 0.93 [0.92, 40 0.94]; outside effort cost: w = 0.91 [0.90, 0.92]; loss of valued alternatives: w = 0.89 [0.88, 0.91]; emotional cost: w = 0.93 [0.92, 0.94]. Theoretically Relevant Variables The expectancies and values scales that were used in study one, were also used in study two (Kosovich et al., 2015); however, these items focused on calculus rather than statistics. These items were on a 7-point Likert scale from 1(strongly disagree) to 7(strongly agree). Reliabilities were as follows: expectancies: a = .80 [0.77, 0.83]; values: a = .83 [0.81, 0.86]. Students’ final calculus grades were collected through institutional records. Grades were reported on a 4.0 scale. Data Analytic Strategy Using the full cost scale from the post-survey, I report on three of the same pieces of validity evidence as in study one: structural, convergent, and predictive. In the tables presented below, I bold the items that were selected in study one to evaluate their performance in this second sample. Preliminary Analysis Results I examined means and standard deviations of all cost variables taken during the post- survey measure (see Table 1.8). Correlations of all cost items and subscales are included in the Appendix from the full Flake et al. (2015) scale. 41 Table 1.8. Means and standard deviations of original cost items Task effort cost TE1 TE2 TE3 TE4 TE5 Outside effort cost OE1 OE2 OE3 OE4 Loss of valued alternatives L1 L2 L3 L4 Emotional cost EM1 EM2 EM3 EM4 EM5 EM6 Note: The range for all items was between 1 = Strongly Disagree and 7 = Strongly Agree. 
SD 1.40 1.58 1.62 1.61 1.57 1.53 1.31 1.43 1.53 1.49 1.45 1.33 1.46 1.47 1.53 1.62 3.83 1.82 1.68 1.81 1.70 1.73 1.82 M 3.40 3.40 3.47 3.38 3.38 3.39 3.13 3.05 3.20 3.18 3.11 3.21 3.10 3.13 3.18 3.41 3.76 4.14 3.48 3.76 3.59 3.70 3.89 42 Structural Validity I used a CFA (N = 531) to examine the psychometric properties of the full-cost scale (see Table 1.9). The CFA fit the model well (CFI = .98, TLI = .98, RMSEA = .05). Similar factor loadings for all items were found across the dimensions of cost and all selected items from study one had strong psychometric properties. 43 Table 1.9. Study 2 results R2 Task effort .128 .149 TE1 .084 TE2 .095 TE3 .081 TE4 TE5 .094 Outside effort .073 .057 OE1 .060 OE2 .047 OE3 OE4 .064 LOVA .084 .067 L1 L2 .054 .093 L3 .049 L4 Emotional .216 .139 EM1 .121 EM2 EM3 .172 .202 EM4 .179 EM5 EM6 .164 Note: Factor loadings are standardized values. LOVA = Loss of Valued Alternatives. CFA Loadings Expectancies Value .878 .788 .867 .869 .851 .847 .872 .807 .855 .832 .824 .846 .793 .769 .863 .842 .860 .877 .812 Regression Final Grade -0.28*** (.03) -0.27*** (.03) -0.20*** (.03) -0.21*** (.03) -0.20*** (.03) -0.22*** (.03) -0.23*** (.03) -0.18*** (.03) -0.18*** (.03) -0.16*** (.03) -0.19*** (.03) -0.24*** (.03) -0.20*** (.03) -0.17*** (.03) -0.22*** (.03) -0.15*** (.03) -0.34*** (.03) -0.23*** (.02) -0.23*** (.03) -0.25*** (.02) -0.29*** (.02) -0.27*** (.02) -0.25*** (.02) -.305*** -.321*** -.171*** -.298*** -.262*** -.283*** -.291*** -.257*** -.220*** -.258*** -.296*** -.300*** -.260*** -.251*** -.326*** -.206*** -.317*** -.185*** -.310*** -.294*** -.334*** -.279*** -.259*** Correlations -.446*** -.448*** -.328*** -.410*** -.367*** -.407*** -.418*** -.385*** -.362*** -.329*** -.403*** -.428*** -.407*** -.369*** -.398*** -.323*** -.480*** -.326*** -.421*** -.431*** -.482*** -.421*** -.423*** 44 Convergent Validity I next examined correlations between each dimension of cost and students’ expectancies and values (see Table 1.9). Again, all correlations between cost and expectancies and values were negative. Expectancies again had higher negative correlations with cost compared to value. Last, all correlations were significant (p <.001). The absolute value of the average difference between the items selected in study one and the full subscale were as follows: TE5: .031; OE4: .005; L2: .054; EM3: .036. The differences found between the items chosen in study one and the full subscale were all less than or equal to .054, demonstrating replicable properties. Predictive Validity As in study one, I first used the composite subscale mean to predict final grade and then used each individual item of the cost scale as a predictor. I found that for all cost dimensions, students who reported high levels of cost, had lower final grades compared to students who reported lower levels of cost. I then examined R2 to find the most predictive item for each subdimension of cost (see Table 1.9). When considering TE5, R2 = .094 (full TE scale R2 = .128). For OE4, R2 = .047 (full OE scale R2 = .073). For L2, R2 = .054 (full L scale R2 = .084). Last, for EM3, R2 = .172 (full EM scale R2 = .216). Similar to study one, the differences found in R2 between the chosen items and the full-scale were small (less than or equal to .044). Study Two Discussion I examined how the items chosen in study one performed in a second sample. 
When looking across the three pieces of evidence examined in study two (i.e., structural, convergent, and predictive), I found that the item properties were consistent with what I observed in study one and with the full subscales. Although the items chosen in study one were not always the “best” item in study two, the differences in results that determined the rankings were small. That is, the estimates differed at the hundredth, or in some cases the thousandth, decimal place. Therefore, the items chosen in study one demonstrated consistent validity evidence in study two.

Study Three
The purpose of study three was to pilot the short cost scale with the items chosen in study one. Here, I tested the intended use of this scale. That is, I used the scale to collect daily diary reports, an intensive longitudinal methodology, to examine college students’ experiences of cost over time. I further examined the structural validity of the short scale by testing whether it recovered the higher-order cost factor from the original validation study. I hypothesized that the model fit of the higher-order cost factor, using the four chosen items, would be acceptable. I also report scale score means and variances as well as the between- and within-person trajectories of cost throughout the semester as evidence that the short version of the scale captures meaningful differences across students over time.

Method
Sample
A subsample of the students who participated in study two also participated in the piloting of the short cost scale (N = 273). This subsample was taken from three of the calculus courses, all taught by separate instructors.
Procedure and Measures
Throughout the semester, a subsample of students enrolled in the calculus courses also completed a daily diary measure for 11 consecutive weeks. Surveys were distributed through an online platform, Remind. Through Remind, students were emailed a link to the daily diary survey during the last 10 minutes of their class. Students had the remainder of the day to respond to the survey before it closed at midnight. Daily diary surveys rotated between Monday and Wednesday to avoid day-of-the-week effects. As an incentive for participation, students who completed 80% of the daily diary surveys were entered into one of two drawings for a $75 Amazon gift card in their course section (six gift cards in total). In total, 1,424 daily diary responses were collected. The response rate was 47.42%, and students responded to an average of 5.22 daily surveys (SD = 3.6).
Cost
The four items that were chosen in study one were piloted in study three using a daily diary approach. The stem of each item read: “After today’s class I feel like:”.
Data Analytic Strategy
I piloted the shortened cost measure and report on its structural validity using a CFA (a minimal sketch of this type of model is shown below), reliability, and descriptive statistics. Further, I examined between- and within-person changes over the course of the semester to illustrate that the scale can capture differences that exist when examining inter- and intra-individual change.

Results
Preliminary Analysis
I examined means and standard deviations of all of the single cost items taken during the daily diary measure (four items were used to assess cost). Means and standard deviations were as follows: task effort cost: M = 3.01, SD = 1.65; outside effort cost: M = 2.66, SD = 1.50; loss of valued alternatives: M = 2.76, SD = 1.59; emotional cost: M = 3.08, SD = 1.75.
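As flagged in the Data Analytic Strategy above, the following is a minimal sketch, and not the code used in this study, of how a one-factor model of the four daily cost reports with standard errors robust to the nesting of reports within students can be specified. The data frame name (diary), item names (te, oe, lova, em), and identifier (student_id) are hypothetical placeholders, and the cluster argument assumes a recent version of lavaan.

```r
# Minimal sketch (not the study's original code): one-factor CFA of the four daily
# cost items with standard errors adjusted for repeated reports nested in students.
library(lavaan)

cost_model <- 'cost =~ te + oe + lova + em'

fit <- cfa(cost_model,
           data      = diary,         # long format: one row per student per diary day
           estimator = "MLR",         # robust maximum likelihood
           cluster   = "student_id")  # standard errors robust to repeated measurements

fitMeasures(fit, c("cfi", "tli", "rmsea"))
standardizedSolution(fit)             # standardized loadings for the four items
```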
I also examined how the short scale compared to the long version in each sample by taking the means and standard deviations of the four items in all of the other data collection waves (see Table 1.10). Across studies, the short scale recovers a mean and variance comparable to the long scale. Within- and between-person correlations of the cost items from the short scale are included in the Appendix.

Table 1.10. Means and standard deviations of the full cost scale and short cost scale across studies M Full Cost Scale 3.01 Short Cost Scale 2.94 Study 3 M 2.88 SD M 0.96 0.98 Study 2 SD M 1.15 1.19 3.41 3.34 3.21 3.13 SD 1.43 SD 1.27 1.30 Study 1 T2 T4
Note: The short cost scale was comprised of items TE5, OE4, L2, EM3. The scale for study 1 was a 6-point Likert scale. The scale for studies 2 and 3 was a 7-point Likert scale.

Structural Validity
To assess the assumption that the short cost scale would have strong psychometric properties for the higher-order cost factor, I used a CFA to examine a one-factor model of cost across all item responses. Due to the nested nature of the data (up to 11 measurements within person), I adjusted the standard errors so that they were robust to the dependency of repeated measurements (N = 1,301). I found the fit to be acceptable: CFI = .99, TLI = .98, RMSEA = .05. The standardized loadings were as follows: task effort cost = .874; outside effort cost = .832; loss of valued alternatives = .876; emotional cost = .767. The full four-item scale produced a reliability of ω = 0.90 [0.89, 0.91].

Between- and Within-Person Trajectories
I next examined plots of mean levels of cost perceptions over time. I first examined between-person cost perceptions over time (see Figure 1.1). From this graph, we can see that cost does not appear to vary much at the between-person level, though there are slight differences between the dimensions of cost throughout the semester.

Figure 1.1. Mean levels of between-person cost perceptions throughout 11 weeks of a calculus semester. LOVA = Loss of valued alternatives.

However, I then examined individual cases and found that, from week to week, the scale can detect within-person changes in cost perceptions that coincide with course events. To demonstrate this variability, I present two individual students as case studies. I first present an account of an International Female student’s cost perceptions throughout the course of the semester (see Figure 1.2). From examining this student’s reports, we can see how variable her perceptions of cost were throughout the semester. For example, there is a large dip in task effort and emotional cost around week six. This week in the semester coincides with the week before a midterm exam, when no homework was assigned, which suggests that the scale is sensitive to environment-induced fluctuations. Further, we can see that the different dimensions of cost are perceived quite differently. That is, this student reports generally high levels of task effort cost and relatively low levels of loss of valued alternatives.

Figure 1.2. International female student’s cost perceptions throughout 11 weeks of a calculus semester. LOVA = Loss of valued alternatives.
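Plots in the style of Figures 1.1 and 1.2 can be produced from the diary data with standard tidyverse tools. The sketch below uses the same hypothetical variable names as the previous sketch and is not the original plotting code; it computes weekly between-person means for each cost dimension and draws them on the 1-7 response scale.

```r
# Minimal plotting sketch (not the study's original code): weekly between-person
# means of the four cost dimensions. `diary`, `week`, te, oe, lova, em, and
# `student_id` are hypothetical placeholders.
library(dplyr)
library(tidyr)
library(ggplot2)

weekly_means <- diary %>%
  group_by(week) %>%
  summarise(across(c(te, oe, lova, em), ~ mean(.x, na.rm = TRUE)), .groups = "drop") %>%
  pivot_longer(c(te, oe, lova, em), names_to = "dimension", values_to = "cost")

ggplot(weekly_means, aes(x = week, y = cost, colour = dimension)) +
  geom_line() +
  geom_point() +
  scale_x_continuous(breaks = 1:11) +
  coord_cartesian(ylim = c(1, 7)) +   # the daily items used a 1-7 response scale
  labs(x = "Week of semester", y = "Mean perceived cost")

# Filtering to one student first (e.g., filter(diary, student_id == "s017"))
# yields an individual trajectory in the style of Figure 1.2.
```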
Second, I present an account of a White Male student (see Figure 1.3). In this account, we see that this student reports high levels of emotional cost and fairly low levels of the other cost dimensions throughout the semester. Unlike the student presented above, this student’s cost perceptions do not show large peaks or dips before exam weeks, but his emotional cost increases as the semester goes on. We can also see variability in this student’s responses over time, as well as the importance of having item content across all four sub-dimensions of cost.

Figure 1.3. White male student’s cost perceptions throughout 11 weeks of a calculus semester. LOVA = Loss of valued alternatives.

Study 3 Discussion
Using a CFA, I found evidence that the four items chosen recovered the general cost factor from the original validation study. I also found that the short scale adequately recovers the means and standard deviations of the full cost scale across studies. Finally, by plotting students’ cost perceptions throughout the calculus course, I was able to see that the scale captures important within- and between-person differences.

Overall Discussion
I sought to provide validity evidence for the use of a short cost scale, adapted from the original Flake et al. (2015) scale, in college classrooms. It is often a misunderstanding that short scales are intended to substitute for long scales (Ziegler et al., 2014); rather, I aimed to provide a practical measure that can be used by researchers and educators in the context of classrooms when short measures are necessary. Across three studies, I provide content, structural, convergent, and predictive validity evidence for the short cost scale. This scale offers researchers and educators a way to quickly assess students’ perceptions of cost in the classroom. Here, I discuss limitations of this study and potential future research.

Research on students’ perceptions of cost has overwhelmingly focused on the college student population; therefore, I found it best to focus on this same population. However, a better understanding of how cost perceptions differ among different populations of students is needed. As research on students’ perceptions of cost begins to focus more on students in K-12 settings, we may need to revisit the use of this short cost scale. I found validity evidence for the use of this short cost scale in college classrooms, but validation is an ongoing process of accumulating evidence about interpreting scores in different settings (AERA et al., 2014). Thus, researchers should continue to assess the validity of this scale before use in other settings, such as high school classrooms. Given how similarly the items performed, other items from the full version of the scale could also be considered. A longitudinal investigation should also be considered in the future (i.e., longitudinal measurement invariance, short-term trajectories) to address whether the factor structure of cost holds over time. This would aid in the interpretation of whether this scale can be used in longitudinal studies. As evidence from study three suggests that cost perceptions may vary within person more so than between persons, this scale will provide an option for researchers to begin assessing within-person differences and changes more systematically.
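As a pointer for such work, the following is a minimal sketch, and not code from this dissertation, of how the within- versus between-person split in daily cost reports could be quantified with an intercept-only multilevel model. Here cost is assumed to be a composite of the four short-scale items, and the data frame and identifier names are hypothetical placeholders.

```r
# Minimal sketch (not the study's original code): splitting variance in daily cost
# reports into between- and within-person parts with an intercept-only model.
library(lme4)

null_model <- lmer(cost ~ 1 + (1 | student_id), data = diary)

vc      <- as.data.frame(VarCorr(null_model))
between <- vc$vcov[vc$grp == "student_id"]  # between-person variance
within  <- vc$vcov[vc$grp == "Residual"]    # within-person (report-to-report) variance

icc <- between / (between + within)         # proportion of variance between persons
icc
```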
Future researchers may wish to focus more on the reasons there may be peaks or dips in perceptions of cost throughout the semester by conducting interviews along with the collection of intensive data. Examining individual cases of students may aid researchers in understanding individual student experiences, something that has been absent from educational psychology research, specifically with regard to race and ethnicity and is an exciting area of future study (Zusho & Kumar, 2018). Significance and Conclusion Short scales are often used with little to no validity evidence (Flake et al., 2017; Barry et al., 2014). In the current study, I take a comprehensive approach to validation of a short scale to measure students’ perceptions of cost in the classroom. Given the evidence I present here, this scale can be used to assess cost beliefs over time in college classrooms, where short measures are needed. Each item represents a subdimension of cost from the full-length scale and the four- items adequately form a general cost factor, consistent with the higher-order cost factor from the full scale. This scale also aids researchers hoping to understand the antecedents and consequences of these cost beliefs. Despite the limitations above, the present study provides numerous sources of strong validity evidence across multiple samples for a short-cost scale. 57 Given the importance in education research of understanding how phenomena fluctuate in-class over time, as well as understanding the antecedents and consequences of cost over time, this short-cost scale will give researchers an instrument to understand these associations and ultimately promote effective pedagogy in the classroom. 58 APPENDIX 59 .69 .71 .74 .52 .43 .52 .43 .46 .67 .64 .65 .53 .54 .70 .63 .66 .60 .57 .65 .60 .67 .62 .54 .44 .54 .47 .45 .63 .55 .60 .51 .56 .61 .54 .56 .51 .54 .56 .54 .70 .65 .58 .64 .50 .56 .68 .64 .64 .56 .57 .73 .59 .69 .60 .66 .68 .66 Table A.1. Correlations of cost items from full Flake et al. (2015) scale at time 2 from study 1 TE TE1 TE2 TE3 TE4 TE5 OE OE1 OE2 OE3 OE4 LV LV1 LV2 LV3 LV4 EM EM1 EM2 EM3 EM4 EM5 EM6 TE1 TE2 TE3 TE4 TE5 OE OE1 OE2 OE3 OE4 .75 .72 .73 .76 .60 .55 .55 .51 .51 .74 .69 .68 .62 .64 .70 .59 .66 .60 .58 .70 .58 TE .91 .89 .84 .87 .88 .65 .57 .63 .54 .56 .77 .72 .73 .63 .64 .80 .68 .75 .66 .68 .76 .69 .55 .51 .50 .46 .46 .65 .62 .62 .54 .52 .75 .64 .72 .58 .61 .74 .67 .64 .74 .64 .58 .61 .58 .52 .59 .52 .51 .52 .50 .53 .53 .73 .62 .50 .56 .63 .50 .50 .41 .39 .47 .47 .46 .41 .87 .86 .87 .90 .70 .61 .64 .68 .55 .61 .52 .51 .55 .55 .54 .55 .65 .69 .71 .59 .53 .53 .56 .47 .51 .42 .42 .42 .44 .45 .52 .61 .55 .55 .61 .46 .55 .45 .46 .50 .53 .47 .49 60 LV .87 .93 .88 .87 .71 .57 .68 .63 .61 .68 .60 LV1 LV2 LV3 LV4 EM EM1 EM2 EM3 EM4 EM5 EM6 .74 .69 .63 .67 .54 .65 .60 .53 .66 .58 Table A.1. (cont’d) TE TE1 TE2 TE3 TE4 TE5 OE OE1 OE2 OE3 OE4 LV LV1 LV2 LV3 LV4 EM EM1 EM2 EM3 EM4 EM5 EM6 Note. TE = Task effort cost composite; OE = Outside effort cost composite; LV = Loss of valued alternatives composite; EM = Emotional cost composite. All correlations were significant at the p < .001 level. .65 .59 .48 .54 .52 .51 .56 .52 .76 .75 .71 .80 .74 .58 .46 .58 .53 .52 .54 .45 .85 .87 .88 .88 .90 .90 .77 .77 .78 .68 .56 .63 .59 .59 .64 .58 .67 .67 .64 .71 .79 .75 .72 .75 .73 61 .81 .78 .75 .63 .75 .71 .64 .69 .59 .58 .63 .64 .65 .63 .77 .65 .78 .69 .67 .62 .59 .52 .59 .54 .59 .56 Correlations of cost items from full Flake et al. 
(2015) scale at time 4 from study 1 TE TE1 TE2 TE3 TE4 TE5 OE OE1 OE2 OE3 OE4 LV LV1 LV2 LV3 LV4 EM EM1 EM2 EM3 EM4 EM5 EM6 TE1 TE2 TE3 TE4 TE5 OE OE1 OE2 OE3 OE4 .81 .87 .86 .80 .75 .70 .65 .71 .66 .82 .77 .78 .72 .70 .77 .68 .71 .69 .66 .74 .68 TE .93 .91 .93 .95 .93 .81 .76 .72 .76 .68 .86 .81 .81 .76 .74 .85 .75 .79 .76 .76 .83 .76 .72 .75 .70 .74 .66 .61 .72 .64 .63 .63 .63 .68 .70 .90 .92 .92 .88 .85 .75 .84 .78 .72 .76 .67 .65 .70 .67 .72 .70 .75 .80 .72 .83 .75 .77 .78 .69 .72 .61 .63 .69 .62 .69 .65 .85 .77 .71 .69 .73 .68 .82 .77 .79 .71 .71 .83 .74 .75 .76 .74 .79 .75 .72 .68 .66 .70 .58 .78 .71 .74 .70 .68 .81 .70 .75 .71 .73 .79 .71 .78 .83 .81 .69 .68 .59 .65 .57 .77 .71 .71 .70 .65 .78 .69 .75 .67 .69 .76 .69 .86 .83 .81 .77 .74 .75 .68 .82 .79 .76 .71 .71 .78 .69 .71 .71 .70 .77 .69 62 LV .89 .94 .91 .88 .78 .67 .74 .74 .68 .74 .68 LV1 LV2 LV3 LV4 EM EM1 EM2 EM3 EM4 EM5 EM6 .82 .70 .71 .72 .61 .70 .70 .58 .70 .62 Table A.2. (cont’d) TE TE1 TE2 TE3 TE4 TE5 OE OE1 OE2 OE3 OE4 LV LV1 LV2 LV3 LV4 EM EM1 EM2 EM3 EM4 EM5 EM6 Note: TE = Task effort cost composite; OE = Outside effort cost composite; LV = Loss of valued alternatives composite; EM = Emotional cost composite. All correlations were significant at the p < .001 level. .72 .72 .61 .68 .66 .68 .67 .64 .80 .82 .77 .78 .75 .65 .57 .60 .62 .55 .61 .56 .88 .92 .90 .89 .94 .91 .82 .84 .76 .74 .62 .68 .69 .64 .71 .65 .75 .71 .71 .79 .80 .80 .79 .89 .80 63 .75 .74 .66 .57 .64 .55 .57 .79 .69 .68 .71 .69 .71 .54 .67 .60 .66 .63 .57 .78 .67 .62 .62 .55 .59 .76 .70 .67 .67 .63 .74 .57 .72 .63 .66 .69 .59 .63 .58 .58 .52 .56 .75 .68 .63 .66 .64 .72 .53 .67 .64 .68 .65 .58 Table A.3. Correlations of post-survey cost items from full Flake et al. (2015) scale in study 2 TE TE1 TE2 TE3 TE4 TE5 OE OE1 OE2 OE3 OE4 LV LV1 LV2 LV3 LV4 EM EM1 EM2 EM3 EM4 EM5 EM6 TE1 TE2 TE3 TE4 TE5 OE OE1 OE2 OE3 OE4 .68 .77 .75 .72 .67 .59 .64 .55 .59 .80 .74 .70 .72 .65 .74 .59 .70 .64 .68 .66 .59 TE .89 .85 .90 .89 .89 .73 .65 .69 .62 .64 .86 .78 .74 .76 .72 .83 .64 .78 .71 .75 .75 .67 .70 .75 .70 .63 .60 .66 .59 .57 .44 .53 .49 .50 .52 .48 .88 .90 .87 .89 .76 .67 .67 .70 .64 .62 .49 .59 .53 .55 .57 .51 .72 .69 .73 .68 .60 .62 .60 .57 .56 .42 .54 .47 .51 .51 .45 .69 .65 .56 .56 .61 .54 .55 .45 .54 .44 .47 .49 .45 .68 .59 .61 .62 .57 .53 .40 .50 .47 .47 .48 .43 .69 .67 .69 .60 .50 .56 .55 .50 .67 .61 .56 .60 .57 .73 .57 .67 .62 .64 .68 .62 64 LV .87 .88 .89 .87 .74 .58 .70 .66 .66 .67 .60 LV1 LV2 LV3 LV4 EM EM1 EM2 EM3 EM4 EM5 EM6 .68 .71 .65 .65 .51 .59 .58 .57 .60 .51 Table A.3. (cont’d) TE TE1 TE2 TE3 TE4 TE5 OE OE1 OE2 OE3 OE4 LV LV1 LV2 LV3 LV4 EM EM1 EM2 EM3 EM4 EM5 EM6 Note: TE = Task effort cost composite; OE = Outside effort cost composite; LV = Loss of valued alternatives composite; EM = Emotional cost composite. All correlations were significant at the p < .001 level. .73 .71 .71 .76 .70 .65 .51 .59 .57 .56 .57 .56 .83 .87 .87 .87 .89 .86 .70 .65 .65 .63 .70 .67 .73 .72 .75 .68 .71 .68 .64 .50 .63 .57 .56 .57 .49 .67 .66 .48 .63 .60 .60 .60 .53 65 Table A.4. Within- and between-person correlations from shortened cost scale in study three Within-person TE OE LOVA EM Between-person TE OE LOVA EM Note: All correlations were significant at the p < .001 level. LOVA EM .38 .68 TE .38 .43 .40 .77 .84 .77 OE .44 .36 .82 .67 66 REFERENCES 67 REFERENCES Allport, G. W. (1942). The use of personal documents in psychological science (Bulletin 49). New York: Social Science Research Council. 
American Educational Research Association (AERA), American Psychological Association (APA), & National Council on Measurement in Education (NCME). (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association. Bandalos, D. L. (2018). Measurement theory and applications for the social sciences. New York: Guilford Press. Barron, K. E., & Hulleman, C. S. (2015). Expectancy-value-cost model of motivation. In J. S. Eccles & K. Salmelo-Aro (Eds.), International encyclopedia of social and behavioral sciences: Motivational psychology (2nd ed.). New York, NY: Elsevier. Barry, A. E., Chaney, B., Piazza-Gardner, A. K., & Chavarria, E. A. (2014). Validity and reliability reporting practices in the field of health education and behavior: A review of seven journals. Los Angeles, CA: SAGE Publications. doi:10.1177/1090198113483139 Battle, A., & Wigfield, A. (2003). College women’s value orientations toward family, career, and graduate school. Journal of Vocational Behavior, 62(1), 56. doi:10.1016/S0001- 8791(02)00037-4 Bergey, B. W., Parrila, R. K., & Deacon, S. H. (2018). Understanding the academic motivations of students with a history of reading difficulty: An expectancy-value-cost approach. Learning and Individual Differences, 67, 41-52. doi:10.1016/j.lindif.2018.06.008 Bergkvist, L., & Rossiter, J. R. (2007). The predictive validity of multiple-item versus single- item measures of the same constructs. Journal of Marketing Research, 44(2), 175-184. doi:10.1509/jmkr.44.2.175 Bieg, M., Goetz, T., Sticca, F., Brunner, E., Becker, E., Morger, V., & Hubbard, K. (2017). Teaching methods and their impact on students' emotions in mathematics: An experience- sampling approach. ZDM: The International Journal on Mathematics Education, 49(3), 411-422. doi:10.1007/s11858-017-0840-1 Bolger, N. & Laurenceau, J. (2013). Intensive longitudinal methods: An introduction to diary and experience sampling research. New York: Guilford Press. Borsboom, D., Mellenbergh, G. J., & van Heerden, J. (2004). The concept of validity. Psychological Review, 111(4), 1061-1071. doi:10.1037/0033-295X.111.4.1061 68 Bryk, A. S., Yeager, D. S., Hausman, H., Muhich, J., Dolle, J. R., Grunow, A., … & Gomez, L. (2013, June). Improvement research carried out through net-worked communities: Accelerating learning about practices that support more productive student mindsets. In A White Paper prepared for the White House meeting on “Excellence in Education: The Importance of Academic Mindsets. Retrieved from http://www.carnegiefoundation.org/sites/default/files/improve- ment_research_NICs_bryk-yeager.pdf Cheung, F., & Lucas, R. E. (2014). Assessing the validity of single-item life satisfaction measures: Results from three large samples. Quality of Life Research, 23(10), 2809-2818. doi:10.1007/s11136-014-0726-4 Conley, A. M. (2012). Patterns of motivation beliefs: Combining achievement goal and expectancy-value perspectives. Journal of Educational Psychology, 104, 32-47. doi: 10.1037/a0026042 Csikszentmihalyi, M. & Larson, R. (2014). Validity and reliability of the experience-sampling method. In Flow and the Foundations of Positive Psychology (pp. 35-54). Dordrecht: Springer Netherlands. http://doi.org/10.1007/978-94-017-9088-8_3 Donnellan, M. B., Oswald, F. L., Baird, B. M., & Lucas, R. E. (2006). The mini-IPIP scales: Tiny-yet-effective measures of the big five factors of personality. Psychological Assessment, 18(2), 192-203. doi:10.1037/1040-3590.18.2.192 Durik, A. 
M., Schwartz, J., Schmidt, J. A., & Shumow, L. (2018). Age differences in effects of self-generated utility among black and hispanic adolescents. Journal of Applied Developmental Psychology, 54, 60-68. doi:10.1016/j.appdev.2017.11.004 Eccles (Parsons), J. S., Adler, T. F., Futterman, R., Goff, S. B., Kaczala, C. M., Meece, J. L., et al. (1983). Expectancies, values, and academic behaviors. In J. T. Spence (Ed.), Achievement and achievement motives (pp. 75–146). San Francisco, CA: W. H. Freeman. Feldon, D. F., Callan, G., Juth, S., & Jeong, S. (2019). Cognitive load as motivational cost. Educational Psychology Review, 31(2), 319-337. doi:10.1007/s10648-019-09464-6 Flake, J. K., Barron, K. E., Hulleman, C., McCoach, B. D., & Welsh, M. E. (2015). Measuring cost: The forgotten component of expectancy-value theory. Contemporary Educational Psychology, 41, 232-244. doi:10.1016/j.cedpsych.2015.03.002 Flake, J. K., Pek, J., & Hehman, E. (2017). Construct validation in social and personality research: Current practice and recommendations. Social Psychological and Personality Science, 8(4), 370-378. doi:10.1177/1948550617693063 69 Gaspard, H., Häfner, I., Parrisius, C., Trautwein, U., & Nagengast, B. (2017). Assessing task values in five subjects during secondary school: Measurement structure and mean level differences across grade level, gender, and academic subject. Contemporary Educational Psychology, 48, 67-84. doi:10.1016/j.cedpsych.2016.09.003 Goetz, T., Bieg, M., & Hall, N. C. (2016). Assessing academic emotions via the experience sampling method. In M. Zembylas & P. A. Schutz (Eds.) Methodological advances in research on emotion and education (pp. 245-258). Springer, Cham. doi:10/1007/978-3- 319-29049-2_19 Goetz, T., Frenzel, A. C., Stoeger, H., & Hall, N. C. (2010). Antecedents of everyday positive emotions: An experience sampling analysis. Motivation and Emotion, 34(1), 49-62. doi:10.1007/s11031-009-9152-2 Gogol, K., Brunner, M., Goetz, T., Martin, R., Ugen, S., Keller, U., . . . Preckel, F. (2014). "my questionnaire is too long!" the assessments of motivational-affective constructs with three-item and single-item measures. Contemporary Educational Psychology, 39(3), 188- 205. doi:10.1016/j.cedpsych.2014.04.002 Hamaker, E. L., & Wichers, M. (2017). No time like the present: Discovering the hidden dynamics in intensive longitudinal data. Current Directions in Psychological Science, 26(1), 10-15. doi:10.1177/0963721416666518 Hektner, J. M., Schmidt, J. A., & Csikszentmihalyi, M. (2007). Experience sampling method: Measuring the quality of everyday life. Thousand Oaks, Calif: Sage Publications. Hilpert, J. C., & Marchand, G. C. (2018). Complex systems research in educational psychology: Aligning theory and method. Educational Psychologist, 53(3), 185-202. doi:10.1080/00461520.2018.1469411 Hulleman, C. S., & Harackiewicz, J. M. (2009). Promoting interest and performance in high school science classes. Science, 326(5958), 1410-1412. doi:10.1126/science.1177067 Jiang, Y., Rosenzweig, E. Q., & Gaspard, H. (2018). An expectancy-value-cost approach in predicting adolescent students’ academic motivation and achievement. Contemporary Educational Psychology, 54, 139-152. doi:10.1016/j.cedpsych.2018.06.005 Johnson, M. L., & Safavian, N. (2016). What is cost and is it always a bad thing? furthering the discussion concerning college-aged students' perceived costs for their academic studies. Journal of Cognitive Education and Psychology, 15(3), 368-390. doi:10.1891/1945-8959.15.3.368 Johnson, M. 
L., Taasoobshirazi, G., Clark, L., Howell, L., & Breen, M. (2016). Motivations of traditional and nontraditional college students: From self-determination and attributions, to expectancy and values. The Journal of Continuing Higher Education, 64(1), 3-15. doi:10.1080/07377363.2016.1132880 Jordan, J. S., & Turner, B. A. (2008). The feasibility of single-item measures for organizational justice. Measurement in Physical Education and Exercise Science, 12(4), 237-257. doi:10.1080/10913670802349790 Kosovich, J. J., Flake, J. K., & Hulleman, C. S. (2017). Short-term motivation trajectories: A parallel process model of expectancy-value. Contemporary Educational Psychology, 49, 130-139. doi:10.1016/j.cedpsych.2017.01.004 Kosovich, J. J., Hulleman, C. S., Barron, K. E., & Getty, S. (2015). A practical measure of student motivation: Establishing validity evidence for the expectancy-value-cost scale in middle school. The Journal of Early Adolescence, 35(5-6), 790-816. doi:10.1177/0272431614556890 Linnenbrink-Garcia, L., & Patall, E. A. (2015). Motivation. In L. Corno & E. M. Anderman (Eds.), Handbook of educational psychology (Vol. 3, pp. 91-103). New York, NY: Routledge. Lord, F. M., & Novick, M. R. (1968). Statistical theories of mental test scores. MA: Addison-Wesley. Luttrell, V. R., Callen, B. W., Allen, C. S., Wood, M. D., Deeds, D. G., & Richard, D. C. S. (2010). The mathematics value inventory for general education students: Development and initial validation. Educational and Psychological Measurement, 70(1), 142-160. doi:10.1177/0013164409344526 Murayama, K., Goetz, T., Malmberg, L.-E., Pekrun, R., Tanaka, A., & Martin, A. J. (2017). Within-person analysis in educational psychology: Importance and illustrations. British Journal of Educational Psychology Monograph Series II, 12, 71-87. Perez, T., Cromley, J. G., & Kaplan, A. (2014). The role of identity development, values, and costs in college STEM retention. Journal of Educational Psychology, 106, 315-329. doi:10.1037/a0034027 Perez, T., Dai, T., Kaplan, A., Cromley, J. G., Brooks, W. D., White, A. C., . . . Balsai, M. J. (2019). Interrelations among expectancies, task values, and perceived costs in undergraduate biology achievement. Learning and Individual Differences, 72, 26-38. doi:10.1016/j.lindif.2019.04.001 Robinson, K. A., Lee, Y., Bovee, E. A., Perez, T., Walton, S. P., Briedis, D., & Linnenbrink-Garcia, L. (2018). Motivation in transition: Development and roles of expectancy, task values, and costs in early college engineering. Journal of Educational Psychology. doi:10.1037/edu0000331 Rosseel, Y. (2012). lavaan: An R package for structural equation modeling. Journal of Statistical Software, 48(2), 1-36. http://www.jstatsoft.org/v48/i02/ Rossiter, J. R. (2002). The C-OAR-SE procedure for scale development in marketing. International Journal of Research in Marketing, 19(4), 305-335. doi:10.1016/S0167-8116(02)00097-6 Safavian, N., Conley, A., & Karabenick, S. (2013, April). Examining mathematics cost value among middle school youth. In E. M. Anderman (Chair), Is it worth my time and effort? Exploring students' conceptions of the cost of learning. Symposium conducted at the annual meeting of the American Educational Research Association, San Francisco, CA. Shernoff, D. J., & Vandell, D. L. (2007). Engagement in after-school program activities: Quality of experience from the perspective of participants. Journal of Youth and Adolescence, 36(7), 891-903. doi:10.1007/s10964-007-9183-5 Slavin, R. E. (2018).
Educational psychology: Theory and practice (Twelfth ed.). NY: Pearson. Smith, G. T., Combs, J. L., & Pearson, C. M. (2012). Brief instruments and short forms. In H. Cooper, P. M. Camic, D. L. Long, A. T. Panter, D. Rindskopf, & K. J. Sher (Eds.), APA handbook of research methods in psychology, Vol. 1. Foundations, planning, measures, and psychometrics (pp. 395-409). Washington, DC, US: American Psychological Association. doi:10.1037/13619-021 Smith, G. T., McCarthy, D. M., & Anderson, K. G. (2000). On the sins of short-form development. Psychological Assessment, 12(1), 102-111. doi:10.1037/1040- 3590.12.1.102 Strati, A., Schmidt, J., & Maier, K. (2017). Perceived challenge, teacher support, and teacher obstruction as predictors of student engagement. Journal of Educational Psychology, 109(1), 131-U152. doi:10.1037/edu0000108 Trautwein, U., Marsh, H. W., Nagengast, B., Lüdtke, O., Nagy, G., & Jonkmann, K. (2012). Probing for the multiplicative term in modern expectancy–value theory: A latent interaction modeling study. Journal of Educational Psychology, 104(3), 763-777. doi:10.1037/a0027470 Widaman, K. F., Little, T. D., Preacher, K. J., & Sawalani, G. M. (2011). On creating and using short forms of scales in secondary research. In K. H. Trzesniewski, M. B. Donnellan, & R. E. Lucas (Eds.), Secondary data analysis: An introduction for psychologists (pp. 39- 61). Washington, DC, US: American Psychological Association. doi:10.1037/12350-003 Wigfield, A., & Cambria, J. (2010). Expectancy-value theory: Retrospective and prospective. In S. Karabenick & T. Urdan (Eds.), Advances in motivation and achievement. The next decade of research on motivation (Vol 16, pp. 35-70) New York: Taylor Francis Group. doi:10.1108/S0749-7423(2010)000016A005 Wigfield, A., & Eccles, J. S. (2000). Expectancy–Value theory of achievement motivation. Contemporary Educational Psychology, 25(1), 68-81. doi:10.1006/ceps.1999.1015 72 Yeager, D. S., & Walton, G. M. (2011). Social-psychological interventions in education: They're not magic. Review of Educational Research, 81(2), 267-301. doi:10.3102/0034654311405999 Ziegler, M., Kemper, C. J., & Kruyen, P. M. (2014). Short scales – five misunderstandings and ways to overcome them. Journal of Individual Differences, 35(4), 185-189. doi:10.1027/1614-0001/a000148 Zirkel, S., Garcia, J. A., & Murphy, M. C. (2015). Experience-sampling research methods and their potential for education research. Educational Researcher, 44(1), 7-16. doi:10.3102/0013189X14566879 Zusho, A., & Kumar, R. (2018). Introduction to the special issue: Critical reflections and future directions in the study of race, ethnicity, and motivation. Educational Psychologist, 53(2), 61-63. doi:10.1080/00461520.2018.1432362 73 PAPER TWO. Can You Hear It? Toward Conceptual Clarity of Emotional Cost and Negative Emotions Abstract Research on cost beliefs has surged over the past several years. Though many dimensions of cost have been found to exist, researchers have often conflated particular dimensions with one another. Moreover, some dimensions of cost may actually just refer to already established constructs. This paper explores the potential jangle fallacy between emotional cost and negative emotions with particular attention to the costs and emotions that students anticipated to be associated with a course, as well as the costs and emotions that students actually experience during the course. 
Results of this study suggest that anticipated emotional cost is similar to some anticipated negative emotions and different from others. The same was true of experienced emotional cost and daily negative emotions. Future directions are discussed for providing more conceptual clarity around this construct.

Introduction

Expectancy-value theory (Eccles et al., 1983) posits that students' expectancies and values are critical antecedents of their motivation and academic achievement. Though values and expectancies have garnered a considerable amount of attention, students' perceptions of cost have received less attention until recently and have been described as the "forgotten component of expectancy-value theory" (Flake et al., 2015). Cost beliefs broadly refer to what a student must sacrifice to do well on a given task (Eccles et al., 1983) and have been shown to negatively predict course-taking intentions (Battle & Wigfield, 2003), intentions to pursue a STEM major (Perez et al., 2014), and achievement (Conley, 2012; Safavian et al., 2013; Trautwein et al., 2012). Because of the links between cost beliefs and important academic outcomes, researchers have suggested that cost may be a promising construct for intervention (Barron & Hulleman, 2015); however, research on cost is still in its infancy and more work is needed to clarify both conceptual and operational definitions of cost (Wigfield & Eccles, 2020). For example, researchers agree that cost has multiple dimensions, but have debated whether three (Eccles et al., 1983; Perez et al., 2014) or four (Flake et al., 2015) dimensions of cost exist. It is also unclear whether these particular dimensions fully differ from other constructs in motivation and emotion research. Of particular interest to this study is the conceptualization of emotional cost, or a student's negative appraisal of a psychological state from putting forth effort on a particular task (Flake et al., 2015). As jingle-jangle fallacies have been prominent in psychological research, it is possible that emotional cost actually refers to negative emotions but has simply been given a new title. Given that emotion assessment largely suffers from a jingle-jangle fallacy (Weidman et al., 2017), understanding whether emotional cost is exacerbating this issue is imperative. Thus, the purpose of this study is to examine the potential conflation of emotional cost and negative emotions.

Cost as a Potential Contributor to the Jingle-Jangle Problem

Kelley (1927) first coined the concept of jingle-jangle fallacies. The jingle fallacy refers to the idea that two things with similar names actually refer to different constructs, whereas the jangle fallacy refers to the idea that two things with dissimilar names are actually the same construct (Marsh et al., 2018). Academics have noted that researchers need to exercise caution to make sure they are not measuring the same construct under the guise of a different name (Heyman & Dweck, 1992). Pajares (2009) also noted that problems arise from conceptually similar constructs that are operationalized differently in order to fulfill a particular research agenda. Accordingly, researchers have suggested that one way to establish conceptual distinctiveness between two constructs in question is to provide validity evidence (e.g., structural, predictive, convergent, discriminant) using analytic methods such as confirmatory factor analyses (CFA) and regression (Bong, 1996; Bong & Skaalvik, 2003; Marsh et al., 2018).
Nearly a century ago, Kelley (1927) considered the jingle-jangle fallacy with respect to achievement and intelligence. More recently, researchers have examined a number of possible jingle-jangle fallacies including self-efficacy and self-concept (Marsh et al., 2018), perceived relevance and cognitive engagement (Reschly & Christenson, 2012), autonomy as independence vs. volition (Van Petegem et al., 2013), and personality and values (Higgs & Lichtenstein, 2013). Perhaps most notably, Marsh has examined jingle-jangle fallacies in many papers considering self-beliefs (Marsh et al., 2018), sport motivation orientations (Marsh, 1994), and academic motivation orientations (Marsh et al., 2003). Throughout these papers, different sources of validity (e.g., structural, convergent, discriminant) are considered to assess whether a jingle- jangle problem exists. For example, Marsh and colleagues (2018) examined the potential jingle- jangle fallacy regarding self-concept, self-efficacy, and outcome expectancies using well- established measures of each construct. Findings suggested that among the scales that were examined, math self-concept, math self-efficacy, and math outcome expectancies were largely indistinguishable, given that correlation coefficients between constructs were mostly larger than .90. Importantly, despite the high intercorrelations, findings also suggested some distinctiveness. For example, self-concept measures were more strongly correlated with math posttest outcomes than self-efficacy and the same outcomes. The authors speculated that such distinct relations with academic outcomes may reflect the different time-reference points for the two distinct self- measures. The two constructs may have been differently related to achievement outcomes 76 because self-efficacy is generally future-oriented, whereas self-concept focuses on past accomplishments. Recently, research surrounding students’ perceptions of cost has gained momentum. Expectancy-value theory (Eccles et al., 1983) posits that students’ achievement related choices, engagement, and persistence are driven by two fundamental questions: Can I do this task, and do I want to do this task? The former refers to a students’ expectancies or a students’ belief about their competence. The latter focuses on the extent that a student values the activity or task (i.e., the subjective importance that a student places on the task). Within this theory, Eccles et al. (1983) discuss the role of students’ perceptions of cost as a type of value that refers to what a student must give up or the effort required in order to complete a task; however, researchers have focused much less on cost until recently. Eccles et al., (1983) suggested three dimensions of cost (though others have suggested four; see Flake et al., 2015). The three most commonly discussed dimensions include: task effort cost: how much effort is required to engage in a task; loss of valued alternatives cost: what must be given up in order to engage in a task; emotional/psychological cost: psychological consequences associated with the failure of a particular task. Whereas, effort cost and loss of valued alternatives appear to be fairly unique constructs that researchers tend to agree on, emotional and psychological cost have been more difficult to discern. With this recent attention and theoretical development of the construct of cost, there is also a potential risk of a jingle-jangle fallacy. 
Considering that researchers are beginning to provide different conceptualizations and empirical evidence of multiple dimensions of cost, some of which may be overlapping (Perez et al., 2019; Wigfield & Eccles, 2020), early examination of the potential jingle-jangle of cost may provide researchers with more clarity surrounding this construct. 77 Jingle Fallacy of Emotional Cost Emotional and psychological cost have been used somewhat interchangeably (Eccles et al., 1983; Wigfield & Eccles, 2000) and researchers have taken up different conceptualizations of these two terms (Flake et al., 2015; Perez et al., 2014). For example, Perez et al. (2014) refer to psychological cost as, “the risk for, or actual, psychological or emotional cost associated with engaging in the task” (p. 316). Psychological cost has also been described as the psychological threat and fear of failure that is associated with the task-at-hand (Eccles et al, 1983; Perez et al., 2019). On the other hand, Flake et al. (2015), refer to emotional cost as the, “negative appraisals of a psychological state that results from exerting effort for the task” (p. 237). Other researchers have referred to emotional cost as the negative emotional or psychological consequences that result from engaging in an activity, or the threats to one’s self-worth that may be attached to failure (Rosenzweig et al., 2020). Further, emotional cost has been explicitly conceptualized as the negative emotions experienced when engaging in a task (Gaspard et al., 2015; Jiang et al., 2018). Eccles and Wigfield (2020) recently defined cost as both the emotional and psychological costs incurred from pursuing a task, with a particular focus on anticipated anxiety and the social and emotional costs tied to failure. As researchers continue to use different terms to define measures of cost, it may be that psychological cost and emotional cost represent two qualitatively distinct constructs. Accordingly, Perez et al. (2019) suggests that emotional costs differ from psychological costs, though some similarities might exist across these dimensions. For example, psychological cost and emotional cost have both been found to be negatively correlated with expectancies/competence beliefs, value, and course grades (Flake et al., 2015; Gaspard et al., 2015, 2017; Perez et al., 2014). Still, researchers do not have a clear understanding of the different dimensions of cost just yet, making it difficult to compare findings 78 from studies (Wigfield & Eccles, 2020). Because of this, studies are needed to compare different measures of cost (Wigfield & Eccles, 2020). Although expectancies and values have been shown to be negatively correlated to all dimensions of cost (Flake et al., 2015; Gaspard et al., 2017; Perez et al., 2014), dimensions do appear to differ in their predictive ability. For example, Perez et al., (2014) found that psychological cost was not significantly related to intentions to leave a STEM major, whereas other forms of cost were. In another study examining the costs of teaching careers among Latino preservice teachers, Bergey and colleagues (2019) found emotional costs to be highest when compared to other forms of cost. These conceptual distinctions have led to how emotional and psychological cost have been operationalized in the development of scales to measure these dimensions. Researchers have often described costs in similar ways, but used different items to measure these constructs (Wigfield & Eccles, 2020). Perez et al. 
(2014) focus on the reasons why a student may experience psychological costs (e.g., I would be embarrassed if I found out that my work in my science major was inferior to that of my peers; It frightens me that the courses required for my science major are harder than courses required for other majors.), whereas, Flake et al., (2015) focus on how much cost a student experiences in their measure (e.g., I worry too much about this class; This class is too exhausting). This itself creates friction in how researchers may understand or conceptualize cost. The above scales are two of many that show how researchers have interpreted psychological and emotional cost, which suggests that they may be measuring different constructs altogether (see Gaspard et al., 2017; Jiang et al., 2018; Johnson & Safavian, 2016). As can be seen from these scales, emotional cost is being operationalized in a way that focuses more on specific negative emotions and some threshold that is being crossed to insinuate that a cost is experienced. Though there may be a potential jingle fallacy between psychological 79 cost and emotional cost, understanding the differences between the two is beyond the scope of this paper. This jingle fallacy is deserving of attention; however, before addressing it, research is needed to disentangle emotional cost from negative emotions, a possible jangle fallacy. Jangle Fallacy of Emotional Cost Whereas there are different ways academics make meaning of emotional costs, a pressing concern regards the conceptualization and operationalization of emotional cost that scholars may have failed to differentiate from negative emotions. Indeed, academics have conflated these constructs by stating: “The emotional costs scale assessed negative emotions associated with teaching” (Bergey et al., 2019, p. 5). This is in conflict with how Flake et al. (2015) conceptualized emotional cost as stated above.3 Other scholars have attempted to differentiate emotional cost and negative emotions/affect by claiming that the two overlap, but suggesting that emotional cost focuses on anticipated negative affect of a specific task and negative emotions/affect focus on experiencing emotions in a specific situation; therefore, emotional cost and negative emotions are not the same (Jiang et al., 2018). This is, however, in conflict with how Eccles (2005) described cost as being what an individual sacrifices to engage in an activity as well as the anticipated effort that is needed to finish the activity. As such, cost may involve two separate components: anticipated and experienced. Both of which may be related to or distinct from anticipated and daily negative emotions. Whereas cost research is more limited, negative emotions have been studied extensively to date. In the sections that follow, I briefly review the literature on the conceptualization, operationalization, and correlates of negative emotions and then provide a similar review for emotional cost. Following this I discuss the 3 Bergey et al. (2019) used the Flake et al. (2015) scale in their study of the costs associated with teaching among Latino preservice teachers. 80 importance of considering anticipated and experienced costs and how to best address this potential jangle fallacy. Conceptualization and Operationalization of Negative Emotions Emotions have been described as containing multiple physiological, cognitive, affective, motivational, and expressive components (Kleinginna & Kleinginna, 1981; Pekrun & Linnenbrink-Garcia, 2012; Scherer, 2000). 
Pekrun’s (2006) control-value theory of achievement emotions provides an integrative framework for understanding the antecedents and consequences of emotions in educational and achievement situations. Broadly, this theory suggests that the learning environment informs students’ control- and value-appraisals, which predict their emotions. In turn, these emotions predict student learning behaviors and achievement. Control- value theory particularly focuses on discrete achievement emotions that are separated according to object focus (i.e., activity or outcome), valence (i.e., positive or negative), and activation (i.e., activated or deactivated). Thus, emotions can be categorized into one of four possible combinations of valence and activation level (e.g., negative-activated, negative-deactivated, positive-activated, positive-deactivated). The emotions that students experience span a wide spectrum including excitement, happiness, frustration, and boredom (Pekrun et al., 2002; Pekrun & Linnenbrink-Garcia, 2012). Emotion research has gained attention due to the key role that emotions play in shaping students’ pursuit of STEM fields. In particular, students tend to experience more negative emotions in STEM areas than in other domains (Dettmers et al., 2011; Dowker et al., 2016). Additionally, negative emotional experiences in STEM have been shown to be associated with a reduced likelihood of enrolling in future STEM courses and pursuing a STEM career (Eccles, 2012; Wigfield et al., 2002). Negative emotions have also been shown to negatively predict outcomes 81 such as self-regulation, learning behaviors, psychological health, and academic achievement (Diener, 2000; Goetz et al., 2013b; Pekrun et al., 2002). Research has also shown that the outcomes associated with emotions may differ by the emotion’s valence and the level of activation (Feldman Barrett & Russell, 1998). For example, negative-deactivated emotions, such as boredom, tend to undermine motivation and engagement (Pekrun et al., 2010); whereas negative-activated emotions, such as frustration, are assumed to have more complex associations with achievement-outcomes (Pekrun & Linnenbrink-Garcia, 2012). However, in general, negative emotions tend to be negatively correlated with positive emotions (Goetz et al., 2016b; Pekrun, 2006). In this study, three negative emotions are considered that capture different aspects of valence and activation. The first two emotions are generally considered to be negative achievement emotions: anger and boredom. Anger is considered to be a negative-activated emotion and is generally shown to be negatively related to academic outcomes (Linnenbrink- Garcia et al., 2016). Boredom is typically considered to be a negative-deactivated emotion; however, researchers have proposed that multiple types of boredom exist that may cross different dimensions (Goetz et al., 2014; Linnenbrink-Garcia et al., 2016). Boredom has been found to be associated with negative academic outcomes (Goetz et al., 2006, 2007; Pekrun et al., 2009, 2010) and school dropout (Bearden et al., 1989), but under specific circumstances has been shown to be related to self-reflection and initiating creativity (Seib & Vodanovich, 1998; Vodanovich, 2003). 
The last negative emotion considered, confusion, has been described using many terms such as an epistemic emotion (Brun et al., 2008; Pekrun et al., 2017), a knowledge emotion (Silvia, 2010), an affective state (Hess, 2003; Keltner & Shiota, 2003), a cognitive feeling state (Clore, 1992), and a discrete emotion (Rozin & Cohen, 2003). It is often considered to be a negative- 82 activating emotion (i.e., moderate arousal; Sazzad et al., 2011). Confusion has been shown to be associated with negative learning outcomes and disengagement (D’Mello & Graesser, 2014; Dweck, 1986; Kennedy & Lodge, 2016), but has also been shown to be positively related to engagement, knowledge exploration, and other learning outcomes (D’Mello & Graesser, 2014; Vogl et al., 2019). Anticipated and State Emotions In the conceptualization of emotions, scholars have made distinctions between anticipated emotions and state or momentary emotions4. Anticipated emotions are conceptualized as expected affective experiences associated with future situations or states (Perugini & Bagozzi, 2001; Wilson & Gilbert, 2003; van der Pligt & de Vries, 1998). They therefore do not constitute the experience of an emotion but rather are a cognitive appraisal of an expected future affective state. For instance, a student may expect to feel anxious during an upcoming exam, while presently not experiencing any anxiety at all. Anticipated emotions are closely related to affective forecasting, which refers to an individual’s prediction about how they will feel in the future (Wilson & Gilbert, 2003). Empirical research generally suggests that individuals tend to poorly estimate their anticipated emotions (i.e., future emotions) and that their retrospective emotions (i.e., trait emotions) differ from their actual experienced emotions (i.e., state emotions; Dunn et al., 2003; Goetz et al., 2013a). These poor estimations have been linked to a number of mechanisms including perceived competence (Goetz et al., 2013a), emotional exhaustion (Goetz et al., 2015), and gender stereotypes (Schuster & Martiny, 2016). In spite of these inaccurate predictions, anticipated emotions have been shown to be predictors of actual 4 Anticipated and state or momentary emotions are two types of distinctions with regards to emotions that scholars have made. Other distinctions that exist include trait emotions, prospective, retrospective emotions, and anticipatory emotions (see Ben-Eliyahu, 2019; Pekrun, 2006; Pekrun & Linnenbrink-Garcia, 2012). 83 emotions and behaviors, such as STEM career aspirations (Richard et al., 1996; Schuster & Martiny, 2016; Wilson & Gilbert, 2005). State or momentary emotions have been described as short-lived emotional experiences that tend to be strongly associated with situational cues (Eid et al., 1999). These emotional states are less stable over time and are in response to the changing environment (Linnenbrink, 2006). Within a given school subject, a student might experience high boredom during some classroom activities, and low boredom in others based on characteristics of the content and the particular activity. As discussed above, anticipated emotions may reflect more of a cognitive appraisal of an expected future state, whereas momentary emotions reflect that actual experienced emotions. Momentary or state emotions also differ from trait emotions, which refer to stable individual characteristics that are domain-specific (Bieg et al., 2014; Robinson & Clore, 2002). 
Much in the same way that scholars have conceptualized anticipated and experienced emotions as distinct, conceptualizations of cost suggest similar distinctions, though there is no evidence as to whether anticipated and experienced cost function differently as is the case with emotions. Because of this, an anticipated and experienced framework is necessary when attempting to disentangle emotional cost and negative emotions in order to more rigorously examine whether they are distinct from one another. Given that empirical research suggests that anticipated and experienced emotions differ in their predictive ability of academic outcomes (Schuster & Martiny, 2016; Wilson & Gilbert, 2005), it may be that anticipated emotional cost differs in its predictive ability when compared to experienced emotional cost, both in magnitude and direction. Like anticipated emotions, anticipated costs may prepare students for future challenges or potentially constrain progress, leading to distraction and withdrawal. Examining anticipated constructs can provide early warning signs of future difficulties such as low achievement, STEM 84 attrition, and disengagement. Students who anticipate high costs or high negative emotions may create a self-fulfilling prophecy by poorly estimating how they think they will feel in a particular course or major, which in turn could increase their negative experiences in class (e.g., experienced costs, negative emotions), potentially leading to attrition from their program of study. By understanding these anticipated beliefs, researchers will be better prepared to provide opportunities for early intervention. Because of this, it is necessary to understand both anticipated and experienced beliefs if researchers plan to provide more conceptual clarity around cost. Importance of Emotional Cost Emotional cost was first introduced as another term to describe psychological cost twenty years ago (Wigfield & Eccles, 2000). Even today, these terms are used somewhat interchangeably. Wigfield and Eccles (2020) describe this aspect of cost as “…the emotional or psychological costs of pursuing the task, particularly the cost of failure (e.g., Will taking this advanced course make me feel emotionally drained?)” (p. 169). Arguably, researchers have taken up the term emotional cost, rather than psychological cost, to describe the negative psychological states a student experiences from engaging in task (Dietrich et al., 2019; Flake et al., 2015; Gaspard et al., 2015, 2017; Jiang et al., 2018; Rosenzweig et al., 2020). Though many researchers still tend to measure multiple dimensions as a higher-order cost factor (Kosovich et al., 2015; Trautwein et al., 2012), when emotional cost is examined separately, it has been shown to be negatively correlated with expectancies, value, grades, long-term interest, and overall motivation in different ways than effort costs and loss of valued alternative costs (Flake et al., 2015; Gaspard et al., 2015, 2017). Thus, emotional cost is important in its own right; however, 85 research is still needed to understand it in more detail, especially from an anticipated and experienced perspective. Anticipated and Experienced Cost Just as emotions have been described using an anticipated and experienced framework, cost may benefit from this distinction as well. 
Anticipated costs refer to one’s anticipated or expected negative appraisals of what must be given up or what is required to complete a given task, whereas experienced costs refer to one’s experienced or immediate negative appraisals of what is currently being given up or what is required to complete a given task. When considering emotional cost, anticipated emotional cost refers to the anticipated negative emotional experiences or psychological states that students associate with a specific task and experienced emotional cost refers to the actual negative emotional experiences or psychological states associated with that task. Because anticipated beliefs function differently than experiences due to situational factors (Eid et al., 1999), as well as the cognitive appraisal aspect of anticipated beliefs (Perugini & Bagozzi, 2001; Wilson & Gilbert, 2003; van der Pligt & de Vries, 1998), it is important to explore both aspects when considering the possible jangle fallacy between emotional cost and negative emotions. For example, it is possible that anticipated emotional cost and anticipated negative emotions function similarly, but experienced emotional cost and momentary emotions do not. This may be due to the fact that students appraise emotional costs and negative emotions similarly, but differentiate the experiences of emotional cost and negative emotions differently. Further, it may be that emotional cost captures a specific type of negative emotion, but not all negative emotions. Emotional cost could represent only negative-activated emotions (e.g. anger), but not negative-deactivated emotions (e.g., boredom). This may also differ depending on 86 whether it is anticipated or experienced. Given Eccles (2005) definition of cost as an anticipated and experienced belief, as well as researchers calling for the examination of the dynamic nature of cost using intensive longitudinal methodologies, such as event-based and diary approaches (Feldon et al., 2019), researchers need to consider both experienced and anticipated cost. Addressing the Jangle Fallacy of Emotional Cost and Negative Emotions Researchers addressing jingle-jangle fallacies typically focus on gathering validity evidence in order to provide clarity around constructs (Marsh, 1994; Marsh et al., 2003, 2018). Structural, convergent, and predictive validity are common forms of validity assessed when examining these fallacies. Structural validity refers to the psychometric properties of a construct, convergent validity refers to the extent to which a construct is related to other variables, and predictive validity refers to how predictive a construct is of a particular outcome. Because convergent and predictive validity rely on other theoretically relevant variables to test for validity evidence, it is important to understand the expected relations between those variables. Expectancies refer to how well a student believes they will do on a particular activity, whereas value broadly refers to how much a student wants to complete a particular activity (Eccles et al., 1983; Wigfield & Eccles, 2000). Expectancy-value theory and empirical evidence suggests that cost is negatively related to both expectancies and value (Eccles et al., 1983; Flake et al., 2015). That is, the higher a student’s expectancies and value, the lower their perceived costs are. Control-value theory posits similar relations between negative emotions and control and value, where control is considered to be a part of a student’s expectancies (Pekrun, 2006). 
Control-value theory, as well as empirical evidence, also suggests that positive emotions are generally inversely related to negative emotions (Goetz et al., 2016b; Pekrun, 2006). In this study, I examine how happiness and excitement, positive-activated emotions (Linnenbrink- 87 Garcia et al., 2016), are related to negative emotions and emotional cost. If emotional cost is not distinct from negative emotions, similar correlations should be observed. Thus, I focus on the relations between expectancies, value, positive emotions, emotional cost, and negative emotions. Expectancy-value theory (Eccles et al., 1983) and control-value theory (Pekrun, 2006) suggest that cost and emotions, respectively, are predictors of achievement-related outcomes and behaviors, such as course grades and engagement. Therefore, when considering predictive validity, these two outcomes are considered. The Present Study The purpose of this study was to examine the potential jangle fallacy between emotional cost and negative emotions. Specifically, I examined the Flake et al. (2015) emotional cost scale and how it may be conflated with discrete negative emotions. As is often the case with studies examining jingle-jangle fallacies, this study was exploratory in nature and thus, no specific hypotheses were formed (Marsh, 1994; Marsh et al., 2003, 2018). The broad research aim of this study was to contribute to the ongoing efforts of providing more conceptual clarity of cost, with a specific emphasis on emotional cost. The three emotions chosen -- anger, boredom, and confusion – are all negative, but differ in terms of activation. It is possible that emotional cost may be capturing a specific type of negative emotion rather than all negative emotions. As research suggests that anticipated emotions may function differently than experienced emotions (Schuster & Martiny, 2016; Wilson & Gilbert, 2005), it may also be that cost functions differently depending on whether it is anticipated or experienced. Moreover, it may be that emotional cost and negative emotions function similarly when they are measured daily, but not when they are anticipated. As discussed above, anticipated beliefs function more as a cognitive appraisal of the emotions or costs that 88 students expect to experience in the future (Perugini & Bagozzi, 2001; Wilson & Gilbert, 2003; van der Pligt & de Vries, 1998), whereas momentary emotions or experienced costs reflect what is actually felt in-the-moment due to situational factors (Eid et al., 1999). Emotional cost is conceptualized to reflect an appraisal of a negative psychological state. That is, if a student is experiencing high emotional cost, some threshold must have been crossed. A student should be experiencing too much stress or too much frustration. Because of this, emotional cost should be negatively related to motivational beliefs and academic outcomes. Negative emotions, on the other hand, do not necessarily suggest that a threshold has been crossed. A student could experience some amount of stress that leads to adaptive outcomes (i.e., eustress; Crum et al., 2013; O’Sullivan, 2011). Further, some emotions that have been categorized as negative, such as boredom, may not reflect the emotions considered in emotional cost. Thus, this study seeks to contribute to the expanding literature on cost by more thoroughly understanding how emotional cost may be similar or different from particular negative emotions. 
Method

Course Description and Sample

Five sections of two large introductory calculus courses at a midwestern university in the United States participated in the present study. Two sections represented a calculus course that is not required for STEM majors (i.e., basic-STEM), whereas the other three sections represented a course that is generally required for students who plan to complete a STEM major (i.e., applied-STEM). Both courses are designed to help students build foundational calculus knowledge (e.g., limits, derivatives, integrals) so that they are able to apply these skills in the future. Each section met for 50 minutes, three times per week, during the fall 2018 semester (Monday, Wednesday, Friday). Mondays and Wednesdays were reserved for large lectures taught by the faculty of record, while Fridays were reserved for smaller sections taught by a teaching assistant. Each of the three sections of the applied-STEM calculus course was taught by a different instructor, whereas the same instructor taught both sections of the basic-STEM course. The total enrollment across the two basic-STEM calculus sections was 514, with a 30% participation rate (N = 156). The total enrollment across the three applied-STEM sections was 584, with a 47% participation rate (N = 273). Of the students who participated in the study, 64% were enrolled in the applied-STEM course and 36% were enrolled in the basic-STEM course. Across all five sections, 62% of the students who participated in the study identified as White, 63% identified as male, and 78% were first-year students (see Table 2.1 for full demographic characteristics).

Table 2.1. Participant demographic characteristics (% of students, N = 429)
Sex: Male, 63%; Female, 37%
Race/Ethnicity: White, 62%; International, 20%; Asian, 6%; Black, 5%; Hispanic, 5%; Two or more races, 2%; Not reported, < 1%
Class level: First-year, 78%; Second-year, 15%; Third-year, 5%; Fourth-year, < 1%; Fifth-year, 1%; Not reported, 1%

Procedure

A pre-survey, distributed through an online link, was completed during the first two weeks of the semester. If students completed the pre-survey (along with a post-survey that was not used in this study), they received course credit that was attached to their final grade. Students enrolled in the course also completed a daily diary measure, a type of intensive longitudinal measure, for 11 consecutive weeks throughout the semester. Students were emailed a link to the survey during the last ten minutes of their class using an online platform, Remind. Students then had the remainder of the day to respond to the survey, which closed at midnight. The daily diary surveys were rotated between Mondays and Wednesdays to avoid day-of-the-week effects. Students enrolled in the basic-STEM calculus course who completed 80% of the daily diary surveys received course credit, whereas students enrolled in the applied-STEM course were entered into one of two drawings for a $75 Amazon gift card in their course section (six gift cards in total). Overall, 2,435 daily diary responses were collected. Students responded to an average of 5.68 surveys (SD = 3.56), with a range of responses between 1 and 11. A response rate of 52% was observed. A response rate between 50 and 75 percent is generally expected of intensive longitudinal studies with college students (Feldman Barrett, 2004; Hektner et al., 2007); thus, 52% is satisfactory. Demographic information and course grades were collected through the university's institutional office.
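As a quick check on the figures just reported, the average number of surveys per student and the overall response rate follow directly from the counts above. A minimal sketch in R; the object names are illustrative rather than taken from the study's scripts.

# Diary counts reported in the Procedure section.
n_students  <- 429    # participants across the five sections
n_prompts   <- 11     # weekly diary surveys
n_responses <- 2435   # diary responses collected

n_responses / n_students                # average surveys per student, about 5.68
n_responses / (n_students * n_prompts)  # overall response rate, about 0.52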
Measures

Pre-survey Measures

A 7-point Likert scale (1 = Strongly agree, 7 = Strongly disagree) was used to assess all pre-survey measures.

Anticipated Emotional Cost. Anticipated emotional cost was assessed using an adapted scale from the original Flake et al. (2015) scale. Six items were used to assess anticipated emotional cost (e.g., "I'll worry too much about this class."; α = .91). The full scale is included in the Appendix.

Anticipated Negative Emotions. Anticipated negative emotions were assessed using three separate single items, as is often the case when assessing discrete emotions (Bieg et al., 2013, 2014; Goetz et al., 2016b). Students were asked, "In my calculus course this fall, I expect to be angry/bored/confused."

Theoretically Relevant Variables. In order to assess convergent validity, a number of theoretically relevant variables were assessed during the pre-survey. Three items were used to assess values (e.g., "I think my calculus class is useful."; α = .85) and three items were used to assess expectancies (e.g., "I know I can learn the material in my calculus class."; α = .84; Kosovich et al., 2015). Single items were used to assess anticipated positive emotions (e.g., "In my calculus course this fall, I expect to be happy/excited.").

Daily Diary Measures

A daily diary approach is one type of intensive longitudinal methodology (Bolger & Laurenceau, 2013) that can be used to assess individuals' subjective experiences in relation to a particular course. This is also often referred to as an end-of-class report (Durik et al., 2018; Schmidt et al., 2017). Students were asked to complete a daily diary survey once per week for 11 consecutive weeks after one of the large lectures. All items were assessed on a 7-point Likert scale (1 = Strongly agree, 7 = Strongly disagree). Single items were used to assess experienced emotional cost as well as discrete emotions. This is common considering intensive longitudinal methodologies are designed to collect repeated measures (Csikszentmihalyi & Larson, 2014; Goetz et al., 2016a; Zirkel et al., 2015).

Experienced Emotional Cost. Students reported on their perceptions of experienced emotional cost in relation to the specific day's class. The item was selected from the original Flake et al. (2015) scale after being validated for use in the daily diary survey (Beymer et al., under review). The item was as follows: "After today's class I feel like, this class is emotionally draining."

Daily Negative Emotions. Consistent with prior research (Bieg et al., 2013, 2014; Goetz et al., 2013b, 2016b), students were asked to rate their negative emotions after class using three separate items for the three distinct emotions. The items read: "Thinking about today's class, please indicate the amount to which you felt angry/bored/confused."

Theoretically Relevant Variables. Theoretically relevant variables were also assessed using the diary assessment in order to explore validity evidence. The following constructs were assessed using single items. Daily value was assessed using the item: "Thinking about the work you did in class today, was it important to you?" (Schmidt et al., 2019). Happiness and excitement were assessed with single items (e.g., "Thinking about today's class, please indicate the amount to which you felt happy/excited."). Daily engagement was also assessed and was used in the examination of predictive validity. A composite measure of engagement was formed from two items (i.e., "Thinking about the work you did in class today, how well were you concentrating?"; "Thinking about the work you did in class today, how hard were you working?"; α = .79; Beymer et al., 2020).
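The internal consistency estimates reported for the multi-item measures (e.g., α = .91 for anticipated emotional cost, α = .79 for daily engagement) can be reproduced with standard tools. A minimal sketch using the psych package; the data frame and item names (pre_survey, em1-em6) are illustrative assumptions, not the study's actual variable labels.

library(psych)  # alpha() for internal consistency

# Six anticipated emotional cost items from the pre-survey (illustrative names).
emo_items <- pre_survey[, paste0("em", 1:6)]

alpha(emo_items)  # Cronbach's alpha; reported as .91 in the text

# Composite scores used in later analyses can be formed as item means.
pre_survey$emo_cost_antic <- rowMeans(emo_items, na.rm = TRUE)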
Course Grades

Final course grades were collected as a measure of achievement through the university's institutional records office. Grades were reported on a 4.0 scale.

Data Analytic Strategy

As is often the case when examining jingle-jangle fallacies, the main form of analysis focuses on gathering validity evidence (Marsh, 1994; Marsh et al., 2003, 2018). As such, this study assessed structural, convergent, and predictive validity using the statistical software R (R Core Team, 2018).

Structural Validity

Structural validity represents a way to assess the psychometric properties of a construct. One common way to assess structural validity is through the use of a confirmatory factor analysis (CFA). Here I examined the fit and loadings of a one- and two-factor model for the variables representing anticipated negative emotions and anticipated emotional cost. A CFA was also used to assess experienced emotional cost and daily negative emotions, but in a slightly different fashion because a single item was used to assess emotional cost. Therefore, two separate two-factor models were examined. The first model examined emotional cost as part of a higher-order cost factor, whereas the second model examined emotional cost as part of a factor with the negative emotions.

Convergent Validity

Convergent validity refers to the extent to which a construct is related to other theoretically relevant constructs (Widaman et al., 2011). Most commonly, convergent validity is assessed by examining correlations between constructs. First, I examined the correlations between emotional cost and the three negative emotions. If the correlations are positive and large, this would suggest that emotional cost and the negative emotions may be similar constructs. Next, I examined the correlations between emotional cost, negative emotions, and other theoretically relevant constructs. Theory and empirical data suggest that cost should be negatively related to expectancies and values (Eccles et al., 1983; Flake et al., 2015; Gaspard et al., 2017; Perez et al., 2014) and that negative emotions should be negatively related to positive emotions (Goetz et al., 2016b; Pekrun, 2006). If emotional cost and negative emotions are similar, I expect to see similar correlations, in both magnitude and direction, between emotional cost and negative emotions with expectancies and values. The same is true when examining how negative emotions and emotional cost relate to positive emotions.

Predictive Validity

Finally, predictive validity was examined to assess how emotional cost and negative emotions predicted outcomes of interest. In order to examine predictive validity, regressions were used and the coefficients were examined, both in direction and magnitude (Borsboom et al., 2004; Rossiter, 2002). Linear regressions were used to examine the associations between anticipated emotional cost, anticipated negative emotions, and course grades. Multilevel models were used to examine how anticipated emotional cost and anticipated negative emotions predicted daily engagement. Multilevel models were also used to examine how experienced cost and daily emotions predicted daily engagement. In total, twelve models were examined (one for each independent variable predicting each of the outcomes). In order to compare the magnitude of the effects of the multilevel models, Cohen's d was calculated using the EMAtools package (v.0.1-3; Kleiman, 2017) in R. As suggested by Cohen (1988), effect sizes around 0.2 were considered small, effect sizes around 0.5 were considered medium, and effect sizes around 0.8 were considered large.
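To make the analytic strategy concrete, the sketch below shows one of the twelve predictive models in each form: a linear regression of course grade on an anticipated predictor, and a cross-classified multilevel model of daily engagement with random intercepts for student and week. The lme4 and lmerTest packages are the ones named in the text; the data frames (students, daily) and variable names are illustrative assumptions.

library(lme4)      # mixed-effects models
library(lmerTest)  # p-values for lmer() fixed effects

# Anticipated emotional cost predicting final course grade (one row per student).
m_grade <- lm(grade ~ emo_cost_antic, data = students)
summary(m_grade)

# Anticipated emotional cost predicting daily engagement; diary reports are
# cross-classified by student and week, so both receive random intercepts.
m_engage <- lmer(engagement ~ emo_cost_antic + (1 | student_id) + (1 | week),
                 data = daily)
summary(m_engage)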
Results

Preliminary Analysis

Correlations, means, and standard deviations of all variables are presented in Table 2.2 and are discussed more below when considering convergent validity. Anticipated anger, M = 3.77, and anticipated emotional cost, M = 3.71, had the highest mean values compared to anticipated boredom and confusion, and experienced emotional cost had the highest mean value, M = 3.19, compared to the negative daily emotions measured. Intraclass correlations (ICCs) of daily variables were examined to understand how much variation occurred between-person. More variation occurred between persons for emotional cost than for any of the negative emotions examined, with all negative emotions showing similar between-person variation (ICCs for emotional cost: 0.66; anger: 0.42; boredom: 0.47; confusion: 0.43). ICCs were also examined between-week, but little variance could be attributed to week (i.e., ICCs were less than or equal to 0.01 for all daily variables).

Table 2.2. Within- and between-person correlations, means, and standard deviations of all variables. Daily diary variables: 1. Emotional cost, 2. Angry, 3. Bored, 4. Confused, 5. Value, 6. Happy, 7. Excited, 8. Engagement. Pre-survey and achievement variables: 9. Emotional cost, 10. Angry, 11. Bored, 12. Confused, 13. Value, 14. Expectancies, 15. Happy, 16. Excited, 17. Grade. Note: The range for all items was between 1 = Strongly Disagree and 7 = Strongly Agree. Course grades were on a 4.0 scale.
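The ICCs reported in the preliminary analysis can be recovered from intercept-only (unconditional) models of each daily variable. A minimal sketch, again with illustrative data frame and variable names:

library(lme4)

# Unconditional model for one daily variable (experienced emotional cost).
m0 <- lmer(emo_cost_daily ~ 1 + (1 | student_id) + (1 | week), data = daily)

# Proportion of variance between students (the ICC of .66 reported above) and
# between weeks (reported as <= .01 for all daily variables).
vc <- as.data.frame(VarCorr(m0))
icc_student <- vc$vcov[vc$grp == "student_id"] / sum(vc$vcov)
icc_week    <- vc$vcov[vc$grp == "week"] / sum(vc$vcov)
c(student = icc_student, week = icc_week)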
Primary Analysis

Structural Validity

To examine the psychometric properties of emotional cost and negative emotions, CFAs were conducted using the lavaan package (v.0.6-5; Rosseel, 2012) in R. First, I examined the fit of a one- and two-factor model using the anticipated measures collected during the pre-survey (see Table 2.3). Using Hu and Bentler's (1999) guidelines for fit indices, the one-factor model did not fit the data well (CFI = .91, TLI = .88, RMSEA = .12) compared to the two-factor model (CFI = .98, TLI = .97, RMSEA = .06). Because boredom had a low factor loading in both models, I removed boredom to examine fit with only the negative-activated emotions; however, the pattern of fit remained the same. The one-factor model with boredom removed fit poorly (CFI = .93, TLI = .90, RMSEA = .13), whereas the two-factor model with boredom removed fit well (CFI = .99, TLI = .98, RMSEA = .05). This provides some evidence that anticipated negative emotions and anticipated emotional cost may represent two unique constructs.

Table 2.3. Confirmatory factor analysis of anticipated measures
                     One-factor   Two-factor
Emotional Cost
  EM1                   0.761        0.761
  EM2                   0.789        0.793
  EM3                   0.779        0.780
  EM4                   0.815        0.818
  EM5                   0.836        0.841
  EM6                   0.801        0.800
Negative emotions
  Angry                 0.538        0.793
  Bored                 0.174        0.305
  Confused              0.561        0.805
Note: Factor loadings are standardized values. One-factor model fit: CFI = .912, TLI = .882, RMSEA = .124. Two-factor model fit: CFI = .980, TLI = .973, RMSEA = .060. EM = Emotional Cost.

Using a CFA, I then examined two separate two-factor models to assess which model fit better: one in which experienced emotional cost was part of a higher-order cost factor with the other dimensions included (i.e., task effort, outside effort, loss of valued alternatives; Beymer et al., under review), or one in which experienced emotional cost was part of a factor with the negative emotions (see Table 2.4). Because of the nested nature of the data (up to 11 measurements per person), standard errors were adjusted so that they were robust. The fit of Model 1 (i.e., experienced emotional cost as part of a higher-order cost factor) was good: CFI = .98, TLI = .96, RMSEA = .06, whereas Model 2 (i.e., experienced emotional cost as part of a negative emotions factor) did not fit well: CFI = .91, TLI = .86, RMSEA = .12. This suggests that experienced emotional cost may be better suited as part of a higher-order cost factor than a negative emotions factor.

Table 2.4. Confirmatory factor analyses of experienced and daily measures (standardized factor loadings for the cost indicators TE, OE, LV, and EM and the negative emotion indicators Angry, Confused, and Bored under each model). Note: Model 1 fit: CFI = .975, TLI = .960, RMSEA = .062. Model 2 fit: CFI = .914, TLI = .861, RMSEA = .115. TE = Task Effort Cost; OE = Outside Effort Cost; LV = Loss of Valued Alternatives Cost; EM = Emotional Cost.
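The one- versus two-factor comparison for the anticipated measures can be specified compactly in lavaan, the package named above. A minimal sketch under the assumption that the pre-survey items are stored as em1-em6, angry, bored, and confused; these names are illustrative.

library(lavaan)

# One-factor model: all anticipated items load on a single factor.
one_factor <- '
  general =~ em1 + em2 + em3 + em4 + em5 + em6 + angry + bored + confused
'

# Two-factor model: anticipated emotional cost and anticipated negative
# emotions as separate, correlated factors.
two_factor <- '
  emo_cost =~ em1 + em2 + em3 + em4 + em5 + em6
  neg_emo  =~ angry + bored + confused
'

fit1 <- cfa(one_factor, data = pre_survey)
fit2 <- cfa(two_factor, data = pre_survey)

fitMeasures(fit1, c("cfi", "tli", "rmsea"))  # fit indices compared in the text
fitMeasures(fit2, c("cfi", "tli", "rmsea"))
standardizedSolution(fit2)                   # standardized loadings (cf. Table 2.3)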
Convergent Validity

Correlations were examined to explore evidence of convergent validity with respect to emotional cost and negative emotions (see Table 2.2).

Correlations Between Emotional Cost and Negative Emotions. The correlations between emotional cost and negative emotions were relatively low, with the largest correlation being between experienced emotional cost and daily confusion, r = .58 (between-person). All correlations were significant and ranged from small to moderate, with the exception of the correlation between experienced emotional cost and anticipated boredom, r = .03. This suggests that, in general, there is a small to moderate positive association between emotional cost and negative emotions. These small to moderate correlations may suggest the presence of two distinct constructs.

Associations with Theoretically Relevant Variables. Next, I examined how emotional cost and negative emotions were related to other theoretically relevant variables.

Value. Anticipated emotional cost and all of the anticipated negative emotions were negatively related to value. These correlations were small, ranging from r = -.21 to -.36. Value was also negatively related to experienced emotional cost and all daily negative emotions, with small correlations ranging from r = -.12 to -.23. Anticipated boredom was most highly correlated with value, r = -.36, whereas similar correlations with value were found for anticipated emotional cost, anticipated anger, and anticipated confusion. The lowest correlation was between value and daily anger, r = -.12. Daily value, measured using the daily diary approach, was also negatively correlated with anticipated emotional cost and anticipated negative emotions, with small correlations between r = -.14 and -.32. Correlations between daily value and experienced emotional cost and experienced negative emotions were also negative and small, both between- and within-person, ranging from rbetween = -.07 to -.39 and rwithin = -.01 to -.17. Interestingly, experienced emotional cost had the smallest within-person correlation with daily value, rwithin = -.01, compared to the daily negative emotions, whereas daily confusion had the smallest between-person correlation with daily value, rbetween = -.07, compared to the other daily measures. Taken together, anticipated emotional cost and the anticipated negative emotions were similarly related to value. The same was true when considering experienced emotional cost and the daily negative emotions.

Expectancies. Expectancies showed negative correlations with anticipated emotional cost and anticipated negative emotions. The correlations were small and ranged from r = -.18 to -.37. The same was true for the correlations of experienced emotional cost and daily negative emotions with expectancies, which ranged from r = -.16 to -.32. Anticipated emotional cost had the highest correlation with expectancies, r = -.37, and the second highest correlation was between daily emotional cost and expectancies, r = -.32, suggesting that emotional cost was more strongly negatively related to expectancies than were the negative emotions. Still, all correlations were relatively small, suggesting some similarities between anticipated emotional cost and negative emotions.
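The within- and between-person coefficients reported throughout this section reflect a decomposition of each daily measure into a person mean and a daily deviation from that mean. The sketch below illustrates that decomposition under assumed, hypothetical names (a long-format data frame daily with columns emo_cost, value, and student_id); the exact estimation routine used for Table 2.2 may differ.

library(dplyr)

# Split each daily measure into a between-person part (person mean) and a
# within-person part (daily deviation from the person mean)
decomposed <- daily %>%
  group_by(student_id) %>%
  mutate(
    emo_cost_b = mean(emo_cost, na.rm = TRUE),
    emo_cost_w = emo_cost - emo_cost_b,
    value_b    = mean(value, na.rm = TRUE),
    value_w    = value - value_b
  ) %>%
  ungroup()

# Between-person correlation: person means, one row per student
person_level <- distinct(decomposed, student_id, emo_cost_b, value_b)
cor(person_level$emo_cost_b, person_level$value_b, use = "complete.obs")

# Within-person correlation: person-mean-centered daily scores
cor(decomposed$emo_cost_w, decomposed$value_w, use = "complete.obs")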
Positive Emotions. Finally, I considered the correlations of the positive emotions, happiness and excitement, with emotional cost and the negative emotions. I first considered anticipated happiness. The correlations of anticipated happiness with anticipated emotional cost and anticipated negative emotions were all small and negative, ranging from r = -.27 to -.35. The same was true for the relations of anticipated happiness with experienced emotional cost and daily negative emotions; these correlations ranged from r = -.11 to -.30. I next examined the correlations of daily happiness with anticipated emotional cost and anticipated negative emotions. These correlations were similar, ranging from r = -.31 to -.38. Correlations of daily happiness with daily emotional cost and daily negative emotions were also small and negative, ranging from rbetween = -.13 to -.36 and rwithin = -.07 to -.12. Daily happiness had a comparatively small correlation with daily anger, rbetween = -.13, relative to the other daily variables.

Anticipated excitement was considered next. Again, correlations of anticipated excitement with anticipated emotional cost and anticipated negative emotions were small and negative, ranging from r = -.28 to -.37. Anticipated excitement was also negatively correlated with experienced emotional cost and experienced negative emotions; these correlations ranged from r = -.13 to -.30, with the smallest correlation between anticipated excitement and daily anger, r = -.13. Daily excitement had similar correlations with anticipated emotional cost and anticipated negative emotions, ranging from r = -.22 to -.34. Finally, correlations between daily excitement and the daily emotional cost and daily negative emotion variables were slightly different. All correlations were small and negative, except for the correlations between daily excitement and daily boredom, rbetween < .001 and rwithin = .01, which were positive and close to zero. In general, however, the correlations among positive emotions, negative emotions, and emotional cost were similar.

Predictive Validity

Finally, predictive validity was examined. The lme4 (v.1.1-19; Bates et al., 2015) and lmerTest (v.3.1-0; Kuznetsova et al., 2017) packages were used to carry out multilevel modeling in R, due to the nested nature of the data (i.e., daily variables nested within students and weeks). The daily responses at level one (within-student) consisted of experienced emotional cost, the three daily negative emotions, and engagement. The student-level variables included anticipated emotional cost and the three anticipated negative emotions. In these models, level one represented the daily diary variables. Level two was cross-classified, representing the student and the week. Although no week-specific variables were included in the analysis, cross-classified models were used in order to account for the fact that multiple students were responding to the diary variables during the same weeks. The built-in lm function was used to carry out linear regressions in R (R Core Team, 2018).

Engagement. First, I examined the multilevel model results of anticipated emotional cost and anticipated negative emotions predicting daily engagement (see Table 2.5). The only significant predictor of daily engagement was anticipated boredom, B = -0.20, SE = 0.03, d = 0.60. This suggests that students who reported higher levels of anticipated boredom experienced lower levels of daily engagement throughout the course.

Table 2.5. Multilevel model results of anticipated variables predicting engagement
                      Emo cost          Anger             Boredom            Confusion
Fixed effects
  Intercept, B00      4.63*** (0.16)    4.76*** (0.13)    5.36*** (0.14)     4.56*** (0.17)
  B01 (SE)            0.01 (0.04)       -0.03 (0.03)      -0.20*** (0.03)    0.02 (0.03)
  d                   0.02              0.10              0.60               0.05
Random effects (σ²)
  Student, r0         0.75***-0.85*** across the four models
  Week, r1            0.02*** in all models
  Level-1 error, ε    0.94 in all models
Note. ***p < .001 for an α (Type 1 error rate) of .05.
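A minimal sketch of how one of these cross-classified models could be specified with the lme4 and lmerTest packages is shown below. The data frame and variable names (daily, engagement, anticipated_boredom, student_id, week) are hypothetical, and the t-to-d conversion shown (d = 2t / sqrt(df)) is a common effect-size approximation for multilevel models, in the spirit of the EMAtools package cited above, rather than necessarily the exact routine used here.

library(lme4)
library(lmerTest)

# Daily engagement predicted by a student-level (anticipated) belief, with
# crossed random intercepts for student and week
m_bored <- lmer(engagement ~ anticipated_boredom + (1 | student_id) + (1 | week),
                data = daily)
summary(m_bored)  # fixed effect (B01), its SE, and Satterthwaite df

# Approximate Cohen's d for the fixed effect: d = 2t / sqrt(df)
est <- coef(summary(m_bored))["anticipated_boredom", ]
d   <- 2 * est["t value"] / sqrt(est["df"])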
Next, I examined the daily variables as predictors of daily engagement (see Table 2.6). The only two significant predictors were experienced emotional cost and daily boredom; however, the coefficients were in opposite directions. Those who experienced higher emotional cost had higher levels of engagement, B = 0.04, SE = 0.02, d = 0.09, whereas those who experienced higher levels of daily boredom had lower levels of engagement, B = -0.18, SE = 0.02, d = 0.46. The effect size for experienced emotional cost on engagement was extremely small, whereas the effect size for daily boredom on engagement was small but approached medium.

Table 2.6. Multilevel model results of daily variables predicting engagement
                      Emo cost          Anger             Boredom            Confusion
Fixed effects
  Intercept, B00      4.53*** (0.09)    4.72*** (0.08)    5.18*** (0.08)     4.61*** (0.08)
  B10 (SE)            0.04* (0.02)      -0.03 (0.02)      -0.18*** (0.02)    0.01 (0.02)
  d                   0.09              0.07              0.46               0.04
Random effects (σ²)
  Student, r0         0.82***           0.82***           0.67***            0.82***
  Week, r1            0.02***           0.02***           0.02***            0.02***
  Level-1 error, ε    0.95              0.95              0.92               0.94
Note. *p < .05, ***p < .001 for an α (Type 1 error rate) of .05.

Course grades. When examining how anticipated emotional cost and anticipated negative emotions predicted course grade using linear regression, anticipated boredom was the only nonsignificant predictor. Students who reported high levels of anticipated emotional cost, B = -0.17, SE = 0.03, anticipated anger, B = -0.11, SE = 0.02, or anticipated confusion, B = -0.15, SE = 0.02, had lower course grades compared to those with lower levels of these predictors (see Table 2.7). Thus, with the exception of boredom, anticipated negative emotions and anticipated emotional cost were similarly predictive of course grade; however, the proportion of variance explained in each model was extremely small, ranging from R² = .03 to .04.

Table 2.7. Linear regression results of anticipated variables predicting course grades
                      Emo cost          Anger             Boredom            Confusion
  Intercept           3.41*** (0.12)    3.24*** (0.09)    2.94*** (0.10)     3.51*** (0.12)
  B (SE)              -0.17*** (0.03)   -0.11*** (0.02)   -0.04 (0.03)       -0.15*** (0.02)
  R²                  0.03              0.03              0.002              0.04
Note. ***p < .001 for an α (Type 1 error rate) of .05.
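The course-grade models are simple linear regressions fit separately for each anticipated belief. A minimal sketch is below, assuming a student-level data frame named students with hypothetical columns grade (on a 4.0 scale), anticipated_emo_cost, anticipated_anger, anticipated_boredom, and anticipated_confusion.

# One regression per anticipated predictor, as summarized in Table 2.7
predictors <- c("anticipated_emo_cost", "anticipated_anger",
                "anticipated_boredom", "anticipated_confusion")
fits <- lapply(predictors, function(p) {
  lm(reformulate(p, response = "grade"), data = students)
})
lapply(fits, summary)  # coefficient, SE, and R-squared for each model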
Discussion

Research on cost beliefs has grown in the past several years, with researchers proposing new dimensions of cost and developing new measures to assess it (Wigfield & Eccles, 2020). Though cost is a promising area of research and has great potential for intervention work (Barron & Hulleman, 2015; Rosenzweig et al., 2020), there is still uncertainty surrounding what the different dimensions of cost are, and there is concern that researchers may be conflating dimensions of cost with other well-researched constructs (Wigfield & Eccles, 2020). The purpose of this study was to examine the potential jangle fallacy (Kelley, 1927) between emotional cost and negative emotions in hopes of providing more conceptual clarity around emotional cost. This study has added to the literature by considering this potential jangle fallacy as well as by considering cost from an anticipated and experienced framework. This is critical given that cost has been described as comprising both the anticipated effort one must put forth as well as the actual sacrifices that take place in order to complete a task (Eccles, 2005; Eccles & Wigfield, in press).

Is Anticipated Emotional Cost Different from Anticipated Negative Emotions? Yes and No

The results surrounding anticipated emotional cost and anticipated negative emotions were somewhat conflicting. First, when considering evidence that supports these constructs as distinct, it is important to note that the correlations between anticipated emotional cost and the anticipated negative emotions were relatively low, not surpassing r = .48. By contrast, Marsh et al. (2018) pointed to high correlations among self-beliefs as evidence that self-efficacy, self-concept, and expectancies were similar constructs. Thus, the correlational evidence here suggests that anticipated emotional cost may differ from the examined anticipated negative emotions. This is further supported by the two-factor CFA model, which fit the data better than the one-factor model.

When considering how anticipated emotional cost and the anticipated negative emotions were correlated with theoretically relevant variables (e.g., value, expectancies, happiness), the correlations were relatively similar, providing some evidence that these anticipated constructs are similar. More specifically, the anticipated beliefs were all negatively correlated with value, expectancies, anticipated happiness, and anticipated excitement, and the correlations were similar in magnitude. The same was generally true when considering how anticipated emotional cost and anticipated negative emotions were correlated with daily reports of value, happiness, and excitement; however, anticipated boredom was more strongly correlated with daily value than were anticipated anger, anticipated confusion, and anticipated emotional cost. These correlations were generally consistent with theory and empirical evidence suggesting that cost is negatively correlated with expectancies and value (Flake et al., 2015; Gaspard et al., 2015, 2017). The negative emotions examined had similar correlations with these variables as well. Again, this suggests that there are possible similarities between emotional cost and negative emotions.

There was also other evidence that anticipated boredom may function differently, not only from anticipated emotional cost, but also from anticipated confusion and anticipated anger. This can be seen in its low factor loading in the CFAs and in the multilevel model results examining the associations between the anticipated beliefs and daily engagement, where boredom was the only significant predictor of daily engagement. Moreover, when examining the linear regressions of anticipated beliefs predicting course grades, boredom was the only nonsignificant predictor. Thus, it may be that anticipated emotional cost is more similar to anticipated anger and anticipated confusion than to anticipated boredom. This is not entirely surprising given that boredom has been considered harder to place on the circumplex (Goetz et al., 2014; Linnenbrink-Garcia et al., 2016). The emotional cost scale includes emotions such as exhaustion, frustration, stress, and anxiety (Flake et al., 2015). These generally represent negative-activated emotions, with the exception of exhaustion (Linnenbrink-Garcia et al., 2016). This may be why anticipated emotional cost and anticipated anger seem to function similarly in some instances. The same may hold for why anticipated emotional cost functions similarly to anticipated confusion in some instances; confusion has also been considered a negative-activated emotion (Sazzad et al., 2011). Conversely, boredom generally represents a negative-deactivated emotion (Goetz et al., 2014; Linnenbrink-Garcia et al., 2016) that anticipated emotional cost may not tap into.
Evidence from the CFAs also suggests that the items referencing frustration, anxiety, and stress in the emotional cost measure have more similar loadings than the other three items referencing worrying too much, being emotionally drained, and exhaustion (Flake et al., 2015). As discussed below, being emotionally drained may be more similar to a particular type of boredom (Goetz et al., 2014), but because a composite measure is used for anticipated emotional cost, those aspects of boredom are not represented as well as they are in the experienced emotional cost item.

The Nuances of Experienced Emotional Cost

Though similarities existed among the anticipated beliefs, more differences were found among the experienced constructs. First, results from the CFAs suggested that emotional cost loads better onto a higher-order cost factor than onto a negative emotions factor, suggesting that experienced emotional cost is more related to other dimensions of experienced cost than to daily negative emotions. Similar to the anticipated constructs, the correlations between experienced emotional cost and daily emotions ranged from small to moderate, providing some evidence that experienced emotional cost and daily emotions are distinct constructs.

Correlations between the daily variables and other theoretically relevant variables were small. Interestingly, experienced emotional cost was more similar to daily boredom and daily confusion in some instances. For example, the correlations of experienced emotional cost, daily boredom, and daily confusion with daily happiness and daily excitement were more similar to one another than to those for daily anger; however, this was not the case when examining how those variables related to daily value. Correlations between the daily variables and value, expectancies, anticipated happiness, and anticipated excitement were all similar. Thus, it is interesting that daily boredom may be more similar to experienced emotional cost than anticipated boredom is to anticipated emotional cost. This may be due to how students interpret the items when responding to the surveys. Perhaps the item used for experienced emotional cost ("This class is emotionally draining."; Beymer et al., under review) more closely resembles a form of boredom. Goetz et al. (2014) discuss multiple types of boredom, measured using the experience sampling method, that represent different levels of arousal and valence. Experiencing emotional cost may be more similar to a type of indifferent boredom that is characterized by a general indifference to, or withdrawal from, tasks. It may also be more similar to other negative-deactivated emotions that were not considered in this study, such as exhaustion or tiredness (Linnenbrink-Garcia et al., 2016; Yik et al., 2011).

However, conflicting evidence was found when examining the multilevel results of the daily variables predicting engagement. Whereas experienced emotional cost was found to be a positive predictor of engagement, boredom was found to be a negative predictor. Though they differed in their predictive ability, the significance levels are worth noting: experienced emotional cost was significant at the p < .05 level, whereas boredom was significant at the p < .001 level. Thus, more work is needed to truly understand the similarities and differences between these constructs.
Why Anticipated and Experienced Emotional Cost May Function Differently

One possibility as to why anticipated and experienced cost may relate differently to negative emotions concerns how anticipated beliefs function differently from experiences in general. That is, anticipated beliefs reflect cognitive appraisals (Perugini & Bagozzi, 2001; Wilson & Gilbert, 2003; van der Pligt & de Vries, 1998), whereas experiences reflect what is felt (Eid et al., 1999). Perhaps anticipating emotional costs and negative emotions feels more similar to students because they draw on the same past experiences to make their appraisals. For example, a student who is asked to anticipate the emotional cost or negative emotions they will experience in a calculus course likely reflects back on past math classes to make the future-oriented appraisal. Moreover, they could anticipate emotional cost and negative emotions as being similar given their future predictions about the course. Conversely, this is not how experienced emotional cost and daily emotions are assessed. Students are asked to reflect specifically on a particular day's class and think about their experiences. These more proximal experiences may differ given that a student is only reflecting on that day; therefore, the nuances between emotional cost and negative emotions may be more easily seen.

Similarly, another possibility lies in the fact that when considering emotional cost, some threshold must be crossed, such that some loss or cost is incurred (Flake et al., 2015). Perhaps this is more difficult to ascertain when anticipating emotional cost. As discussed, negative emotions such as boredom (Seib & Vodanovich, 1998; Vodanovich, 2003), confusion (D'Mello & Graesser, 2014; Vogl et al., 2019), and stress (Crum et al., 2013; O'Sullivan, 2011) do not always lead to maladaptive behaviors. For something to be considered a cost, it should be negatively related to behaviors. Perhaps that threshold of loss is more difficult to predict when anticipated, but when emotional cost is experienced, it is more proximal and thus felt more strongly. Because of this, experienced emotional cost may provide an experience that is unique from daily negative emotions.

Finally, another possible explanation for why anticipated and experienced cost relate differently to anticipated and experienced negative emotions, respectively, is measurement. Flake et al.'s (2015) full emotional cost scale may capture more negative-activated emotions such as frustration and anxiety, whereas the short emotional cost scale (Beymer et al., under review) does not. Perhaps the short scale misses some of the complexity of cost as a construct. Still, this is a difficult issue to address when attempting to measure the dynamic nature of cost using intensive longitudinal methods (Bolger & Laurenceau, 2013), where single items are often necessary (Csikszentmihalyi & Larson, 2014; Goetz et al., 2016a; Zirkel et al., 2015).

Limitations

Studies examining jingle-jangle fallacies often consider convergent and discriminant validity (Marsh, 1994; Marsh et al., 2003, 2018). Though convergent validity was considered, I was unable to gather evidence of discriminant validity due to survey constraints. This piece of validity evidence, however, is important moving forward. Showing that emotional cost and negative emotions are unrelated to variables that they should not be related to would provide more evidence for or against the jangle fallacy of emotional cost and negative emotions.
Second, I chose to focus primarily on negative-activated emotions; however, negative-deactivated emotions, such as sadness and exhaustion, may show similarities to emotional cost. Further, other negative-activated emotions, such as frustration and annoyance, may show similarities with emotional cost. More research is needed in this area as researchers work to more clearly define cost dimensions.

Future Directions

This study was a first step in examining the potential jangle fallacy between emotional cost and negative emotions; however, it was beyond the scope of this paper to examine the possible jingle fallacy between psychological cost and emotional cost. Some argue they may be different (Perez et al., 2019), while others have used these terms interchangeably (Eccles et al., 1983; Wigfield & Eccles, 2000). Understanding the possible jingle fallacy between these two types of cost is an important next step in order to understand cost more clearly. Other issues surrounding dimensions of cost exist. For example, Perez et al. (2019) discuss how ego cost has been used in a similar fashion to psychological cost (Jiang et al., 2018). In order for results to be understood across studies (Wigfield & Eccles, 2020), it is important that researchers first agree on what cost is and how its dimensions differ.

Though research examining the possible jingle fallacy between emotional cost and psychological cost is needed, work is still necessary to understand the jangle fallacy of emotional cost and negative emotions in more detail. Many of the results of this study were conflicting. Thus, researchers need to further examine how emotional cost may be distinct from negative emotions. Future work should consider more theorizing about distinct dimensions of cost as well as empirical evidence. Qualitative work in the form of focus groups has been used to interview students about the types of costs they face (Flake et al., 2015; Johnson & Safavian, 2016). This is a promising avenue for understanding the types of costs that students actually incur. Measures can then be developed based on what is learned from these focus groups (Flake et al., 2015). Ideally, this will aid researchers in developing appropriate measures that capture unique aspects of cost.

Conclusion

With the recent surge of research on cost, researchers are frequently describing new dimensions of cost (Wigfield & Eccles, 2020). However, it is crucial to make sure that researchers are not conflating dimensions of cost with already defined cost dimensions or with other well-established constructs. Though more work is needed to thoroughly understand dimensions of cost beliefs, this study provides an empirical look at how emotional cost functions in comparison to three negative emotions. Importantly, researchers need to take caution when developing new measures of cost and do their due diligence in understanding what they are truly measuring. As Wigfield and Eccles (2020) state, more research is needed comparing different measures of cost dimensions. Further, I would add that more research is needed comparing not only different measures of cost dimensions, but also other constructs that may be similar to cost dimensions.

APPENDIX

Table B.1. Survey items
Measure: Anticipated perceptions of cost in calculus
Details: Average score for all 19 items and average score for each sub-scale (4-6 items each) from Flake et al. (2015). Response scale: 7-point scale (1 = Strongly disagree, 7 = Strongly agree)
Emotional Cost:
1.
I’ll worry too much about this class. 2. This class will be too exhausting. 3. This class will be emotionally draining. 4. This class will be too frustrating. 5. This class will be too stressful. 6. This class will make me feel too anxious. 122 REFERENCES 123 REFERENCES Barron, K. E., & Hulleman, C. S. (2015). Expectancy-value-cost model of motivation. In J. S. Eccles & K. Salmelo-Aro (Eds.), International encyclopedia of social and behavioral sciences: Motivational psychology (2nd ed.). New York, NY: Elsevier. Bates, D., Maechler, M., Bolker, B., & Walker, S. (2015). Fitting linear mixed-effects models using lme4. Journal of Statistical Software, 67(1), 1-48. doi:10.18637/jss.v067.io1 Battle, A., & Wigfield, A. (2003). College women’s value orientations toward family, career, and graduate school. Journal of Vocational Behavior, 62(1), 56. doi:10.1016/S0001- 8791(02)00037-4 Bearden, L. J., Spencer, W. A., & Moracco, J. C. (1989). A study of high school dropouts. School Counselor, 37(2), 113–120. Ben-Eliyahu, A. (2019). Academic emotional learning: A critical component of self-regulated learning in the emotional learning cycle. Educational Psychologist, 54(2), 84-105. doi:10.1080/00461520.2019.1582345 Bergey, B. W., Ranellucci, J., & Kaplan, A. (2019). The conceptualization of costs and barriers of a teaching career among latino preservice teachers. Contemporary Educational Psychology, 59, 101794. doi:10.1016/j.cedpsych.2019.101794 Beymer, P. N., Ferland, M., & Flake, J. K. (under review). Validity evidence for a short scale of college students' perceptions of cost. Manuscript submitted for publication. Beymer, P. N., Rosenberg, J. M., & Schmidt, J. A. (2020). Does choice matter or is it all about interest? an investigation using an experience sampling approach in high school science classrooms. Learning and Individual Differences, 78. doi:10.1016/j.lindif.2019.101812 Bieg, M., Goetz, T., & Hubbard, K. (2013). Can I master it and does it matter? An intraindividual analysis on control–value antecedents of trait and state academic emotions. Learning and Individual Differences, 28, 102-108. doi:10.1016/j.lindif.2013.09.006 Bieg, M., Goetz, T., & Lipnevich, A. (2014). What students think they feel differs from what they really feel - academic self-concept moderates the discrepancy between students' trait and state emotional self-reports. Plos One, 9(3), e92563. doi:10.1371/journal.pone.0092563 Bolger, N., & Laurenceau, J. (2013). Intensive longitudinal methods: An introduction to diary and experience sampling research. New York, NY: Guilford Publications. 124 Bong, M. (1996). Problems in academic motivation research and advantages and disadvantages of their solutions. Contemporary Educational Psychology, 21, 149–165. http://dx.doi.org/10.1006/ceps.1996.0013 Bong, M., & Skaalvik, E. M. (2003). Academic self-concept and self-efficacy: How different are they really? Educational Psychology Review, 15, 1–40. http://dx.doi.org/10.1023/A:1021302408382 Borsboom, D., Mellenbergh, G. J., & van Heerden, J. (2004). The concept of validity. Psychological Review, 111(4), 1061-1071. doi:10.1037/0033-295X.111.4.1061 Brun, G., Doguoglu, U., & Kuenzle, D. (Eds.). (2008). Epistemology and emotions. Aldershot, UK: Ashgate. Clore, G. L. (1992). Cognitive phenomenology: Feelings and the construction of judgment. In L. L. Martin & A. Tesser (Eds.), The construction of social judgments (pp. 133–163). Hillsdale, NJ: Erlbaum. Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). 
Hillsdale, NJ: L. Erlbaum Associations. Conley, A. M. (2012). Patterns of motivation beliefs: Combining achievement goal and expectancy-value perspectives. Journal of Educational Psychology, 104, 32-47. doi: 10.1037/a0026042 Crum, A. J., Salovey, P., & Achor, S. (2013). Rethinking stress: The role of mindsets in determining the stress response. Journal of Personality and Social Psychology, 104(4), 716-733. doi:10.1037/a0031201 Csikszentmihalyi, M. & Larson, R. (2014). Validity and reliability of the experience-sampling method. In Flow and the Foundations of Positive Psychology (pp. 35-54). Dordrecht: Springer Netherlands. http://doi.org/10.1007/978-94-017-9088-8_3 D’Mello, S. K., & Graesser, A. C. (2014). Confusion. In R. Pekrun & L. Linnenbrink-Garcia (Eds.), International handbook of emotions in education (pp. 289–310). New York: Routledge. Dettmers, S., Trautwein, U., Lüdtke, O., Goetz, T., Frenzel, A. C., & Pekrun, R. (2011). Students’ emotions during homework in mathematics: Testing a theoretical model of antecedents and achievement outcomes. Contemporary Educational Psychology, 36(1), 25-35. doi:10.1016/j.cedpsych.2010.10.001 Diener, E. (2000). Subjective well-being: The science of happiness and a proposal for a national index. American Psychologist, 55(1), 34-43. doi:10.1037/0003-066X.55.1.34 125 Dietrich, J., Moeller, J., Guo, J., Viljaranta, J., & Kracke, B. (2019). In-the-moment profiles of expectancies, task values, and costs. Frontiers in Psychology, 10, 1662. doi:10.3389/fpsyg.2019.01662 Dowker, A., Sarkar, A., & Looi, C. (2016). Mathematics anxiety: What have we learned in 60 years? Frontiers in Psychology, 7, 508. doi:10.3389/fpsyg.2016.00508 Dunn, E.W., Wilson, T.D., & Gilbert, D.T. (2003). Location, location, location: The misprediction of satisfaction in housing lotteries. Personality and Social Psychology Bulletin, 29, 1421-1432. Durik, A. M., Schwartz, J., Schmidt, J. A., & Shumow, L. (2018). Age differences in effects of self-generated utility among black and hispanic adolescents. Journal of Applied Developmental Psychology, 54, 60-68. doi:10.1016/j.appdev.2017.11.004 Dweck, C. S. (1986). Motivational processes affecting learning. American Psychologist, 41(10), 1040–1048. doi:10.1037/0003-066XX.41.10.1040 Eccles, J. (2012, April). Expectancy-value theory and gendered academic and occupational choices. Paper presented at the 2012 annual meeting of the American Educational Research Association, Vancouver, British Columbia, Canada. Eccles, J. S. (2005). Subjective task values and the Eccles et al. model of achievement related choices. In A. J. Elliott & C. S. Dweck (Eds.), Handbook of competence and motivation (pp. 105-121). New York: Guilford. Eccles (Parsons), J. S., Adler, T. F., Futterman, R., Goff, S. B., Kaczala, C. M., Meece, J. L., et al. (1983). Expectancies, values, and academic behaviors. In J. T. Spence (Ed.), Achievement and achievement motives (pp. 75–146). San Francisco, CA: W. H. Freeman. Eccles, J. S., & Wigfield, A. (in press). From expectancy value theory to situated expectancy value theory: A development, social cognitive, and sociocultural perspective on motivation. Contemporary Educational Psychology. Eid, M., Schneider, C., & Schwenkmezger, P. (1999). Do you feel better or worse? the validity of perceived deviations of mood states from mood traits. European Journal of Personality, 13(4), 283-306. Feldman Barrett, L. (2004). Feelings or words? understanding the content in self-report ratings of experienced emotion. 
Journal of Personality and Social Psychology, 87(2), 266-281. doi:10.1037/0022-3514.87.2.266 Feldon, D. F., Callan, G., Juth, S., & Jeong, S. (2019). Cognitive load as motivational cost. Educational Psychology Review, 31(2), 319-337. doi:10.1007/s10648-019-09464-6 126 Flake, J. K., Barron, K. E., Hulleman, C., McCoach, B. D., & Welsh, M. E. (2015). Measuring cost: The forgotten component of expectancy-value theory. Contemporary Educational Psychology, 41, 232-244. doi:10.1016/j.cedpsych.2015.03.002. Gaspard, H., Dicke, A., Flunger, B., Schreier, B., Häfner, I., Trautwein, U., & Nagengast, B. (2015). More value through greater differentiation: Gender differences in value beliefs about math. The Journal of Educational Psychology, 107(3), 663-677. doi:10.1037/edu0000003 Gaspard, H., Häfner, I., Parrisius, C., Trautwein, U., & Nagengast, B. (2017). Assessing task values in five subjects during secondary school: Measurement structure and mean level differences across grade level, gender, and academic subject. Contemporary Educational Psychology, 48, 67-84. doi:10.1016/j.cedpsych.2016.09.003 Goetz, T., Becker, E. S., Bieg, M., Keller, M. M., Frenzel, A. C., & Hall, N. C. (2015). The class half empty: How emotional exhaustion affects the state-trait discrepancy in self-reports of teaching emotions. PLoS ONE, 10(9), e0137441. Goetz, T., Bieg, M., & Hall, N. C. (2016a). Assessing academic emotions via the experience sampling method. In M. Zembylas & P. A. Schutz (Eds.) Methodological advances in research on emotion and education (pp. 245-258). Springer, Cham. doi:10/1007/978-3- 319-29049-2_19 Goetz, T., Bieg, M., Lüdtke, O., Pekrun, R., & Hall, N. C. (2013a). Do girls really experience more anxiety in mathematics? Psychological Science, 24, 2079-2087. Goetz, T., Frenzel, A. C., Hall, N. C., Nett, U. E., Pekrun, R., & Lipnevich, A. A. (2014). Types of boredom: An experience sampling approach. Motivation and Emotion, 38, 401–419. doi:10.1007/s11031-013-9385-y Goetz, T., Frenzel, A. C., Pekrun, R., & Hall, N. C. (2006). The domain specificity of academic emotional experiences. Journal of Experimental Education, 75(1), 5–29. doi:10.3200/JEXE.75.1.5-29. Goetz, T., Frenzel, A. C., Pekrun, R., Hall, N. C., & Ludtke, O. (2007). Between- and within- domain relations of students’ academic emotions. Journal of Educational Psychology, 99(4), 715–733. doi:10.1037/0022-0663.99.4.715. Goetz, T., Lüdtke, O., Nett, U. E., Keller, M. M., & Lipnevich, A. A. (2013b). Characteristics of teaching and students’ emotions in the classroom: Investigating differences across domains. Contemporary Educational Psychology, 38(4), 383-394. doi:10.1016/j.cedpsych.2013.08.001 Goetz, T., Sticca, F., Pekrun, R., Murayama, K., & Elliot, A. J. (2016b). Intraindividual relations between achievement goals and discrete achievement emotions: An experience sampling approach. Learning and Instruction, 41, 115-125. doi:10.1016/j.learninstruc.2015.10.007 127 Hektner, J. M., Schmidt, J. A., & Csikszentmihalyi, M. (2007). Experience sampling method: Measuring the quality of everyday life. Thousand Oaks, Calif: Sage Publications. Hess, U. (2003). Now you see it, now you don't--the confusing case of confusion as an emotion: Commentary on Rozin and Cohen (2003). Emotion, 3(1), 76-80. doi:10.1037/1528- 3542.3.1.76 Heyman, G. D., & Dweck, C. S. (1992). Achievement goals and intrinsic motivation: Their relation and their role in adaptive motivation. Motivation and Emotion, 16, 231–247. 
http://dx.doi.org/10.1007/BF0099 1653 Higgs, M., & Lichtenstein, S. (2010). Exploring the ‘Jingle fallacy’: A study of personality and values. Journal of General Management, 36(1), 43-61. doi:10.1177/030630701003600103 Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1-55. 10.1080/10705519909540118 Jiang, Y., Rosenzweig, E. Q., & Gaspard, H. (2018). An expectancy-value-cost approach in predicting adolescent students’ academic motivation and achievement. Contemporary Educational Psychology, 54, 139-152. doi:10.1016/j.cedpsych.2018.06.005 Johnson, M. L., & Safavian, N. (2016). What is cost and is it always a bad thing? furthering the discussion concerning college-aged students' perceived costs for their academic studies. Journal of Cognitive Education and Psychology, 15(3), 368-390. doi:10.1891/1945-8959.15.3.368 Kelley, T. L. (1927). Interpretation of educational measurements. Oxford, England: World Book Co. Keltner, D., & Shiota, M. N. (2003). New displays and new emotions: A commentary on Rozin and Cohen (2003). Emotion, 3(1), 86-91. doi:10.1037/1528-3542.3.1.86 Kennedy, G. & Lodge, J. M. (2016). All roads lead to Rome: Tracking students’ affect as they overcome misconceptions. In S. Barker, S. Dawson, A. Pardo, & C. Colvin (Eds.) Show Me the Learning. Proceedings ASCILITE 2016 (pp. 318-328). Kleiman, E. (2017). EMAtools. Data management tools for real-time monitoring/ecological momentary assessment data. R package version 0.1.3. https://cran.r- project.org/web/packages/EMAtools/index.html. Kleinginna, P. R., & Kleinginna, A. M. (1981). A categorized list of emotion definitions, with suggestions for a consensual definition. Motivation and Emotion, 5(4), 345-379. doi:10.1007/BF00992553 128 Marsh, H., Pekrun, R., Parker, P., Murayama, K., Guo, J., Dicke, T., & Arens, A. (2018). The murky distinction between self-concept and self-efficacy: Beware of lurking jingle-jangle fallacies. Journal of Educational Psychology, 111(2), 331-353. doi:10.1037/edu0000281 O'Sullivan, G. (2011). The relationship between hope, eustress, self-efficacy, and life satisfaction among undergraduates. Social Indicators Research, 101(1), 155-172. doi:10.1007/s11205-010-9662-z Pajares, F. (2009). Toward a positive psychology of academic motivation: The role of self- efficacy beliefs. In R. Gilman, E. S. Huebner, & M. J. Furlong (Eds.), Handbook of positive psychology in schools (pp. 149– 160). New York, NY: Routledge/Taylor & Francis Group. Pekrun, R. (2006). The control-value theory of achievement emotions: Assumptions, corollaries, and implications for educational research and practice. Educational Psychology Review, 18(4), 315-341. doi:10.1007/s10648-006-9029-9 Pekrun, R., Elliot, A. J., & Maier, M. A. (2009). Achievement goals and achievement emotions: Testing a model of their joint relations with academic performance. Journal of Educational Psychology, 101(1), 115–135. doi:10.1037/a0013383. Kosovich, J. J., Hulleman, C. S., Barron, K. E., & Getty, S. (2015). A practical measure of student motivation: Establishing validity evidence for the expectancy-value-cost scale in middle school. The Journal of Early Adolescence, 35(5-6), 790-816. doi:10.1177/0272431614556890 Kuznetsova, A., Brockhoff, P. B., & Christensen, R. H. B. (2017). lmerTest package: Tests in linear mixed effects models. Journal of Statistical Software, 82(13), 1–26. 
https://doi.org/10.18637/jss. v082.i13. Linnenbrink, E. A. (2006). Emotion research in education: Theoretical and methodological perspectives on the integration of affect, motivation, and cognition. Educational Psychology Review, 18(4), 307-314. doi:10.1007/s10648-006-9028-x Linnenbrink-Garcia L., Wormington S.V., Ranellucci J. (2016) Measuring Affect in Educational Contexts: A Circumplex Approach. In M. Zembylas & P. Schutz (Eds.) Methodological Advances in Research on Emotion and Education. Springer. Marsh, H. W. (1994). Sport motivation orientations: Beware of jingle-jangle fallacies. Journal of Sport and Exercise Psychology, 16(4), 365-380. doi:10.1123/jsep.16.4.365 Marsh, H. W., Craven, R. G., Hinkley, J. W., & Debus, R. L. (2003). Evaluation of the big-two- factor theory of academic motivation orientations: An evaluation of jingle-jangle fallacies. Multivariate Behavioral Research, 38(2), 189-224. doi:10.1207/S15327906MBR3802_3 129 Applied Social Psychology, 18, 111-129. Robinson, M. D., & Clore, G. L. (2002). Belief and feeling: Evidence for an accessibility model of emotional self-report. Psychological Bulletin, 128(6), 934-960. doi:10.1037/0033- 2909.128.6.934 Pekrun, R., Goetz, T., Daniels, L. M., Stupnisky, R. H., & Perry, R. P. (2010). Boredom in achievement settings: Exploring control-value antecedents and performance outcomes of a neglected emotion. Journal of Educational Psychology, 102(3), 531-549. doi:10.1037/a0019243 Pekrun, R., Goetz, T., Titz, W., & Perry, R. P. (2002). Academic emotions in students' self- regulated learning and achievement: A program of qualitative and quantitative research. Educational Psychologist, 37(2), 91-105. doi:10.1207/S15326985EP3702_4 Pekrun, R. & Linnenbrink-Garcia, L. (2012). Academic emotions and student engagement. In S. L. Christensen, A. L. Reschly, & C. Wylie (eds.), Handbook of Research on Student Engagement, pp. 259-282. New York, NY: Springer US. Pekrun, R., Vogl, E., Muis, K. R., & Sinatra, G. M. (2017). Measuring emotions during epistemic activities: The Epistemically-Related Emotion Scales. Cognition and Emotion, 31, 1268–1276. http://dx.doi.org/10.1080/02699931.2016.1204989 Perez, T., Cromley, J. G., & Kaplan, A. (2014). The role of identity development, values, and costs in college STEM retention. Journal of Educational Psychology, 106, 315-329. doi: 10.1037/a0034027 Perez, T., Wormington, S. V., Barger, M. M., Schwartz‐Bloom, R. D., Lee, Y., & Linnenbrink‐ Garcia, L. (2019). Science expectancy, value, and cost profiles and their proximal and distal relations to undergraduate science, technology, engineering, and math persistence. Science Education, 103(2), 264-286. doi:10.1002/sce.21490 Perugini, M., & Bagozzi, R. P. (2001). The role of desires and anticipated emotions in goal- directed behaviours: Broadening and deepening the theory of planned behaviour. British Journal of Social Psychology, 40, 79-98. R Core Team (2018). R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing. http://www.R-project.org Reschly, A. L., & Christenson, S. L. (2012). Jingle, jangle, and conceptual haziness: Evolution and future directions of the engagement construct. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (p. 3–19). Springer Science + Business Media. https://doi.org/10.1007/978-1-4614-2018-7_1 Richard, R., van der Pligt, J., de Vries, N. (1996). Anticipated affect and behavioral choice basic. 130 Rosenzweig, E. 
Q., Wigfield, A., & Hulleman, C. S. (2020). More useful or not so bad? examining the effects of utility value and cost reduction interventions in college physics. Journal of Educational Psychology, 112(1), 166-182. doi:10.1037/edu0000370 Rosseel, Y. (2012). Lavaan: An r package for structural equation modeling. Journal of Statistical Software, 48(2), 1-36. http://www.jstatsoft.org/v48/i02/ Rossiter, J. R. (2002). The C-OAR-SE procedure for scale development in marketing. International Journal of Research in Marketing, 19(4), 305-335. doi:10.1016/S0167- 8116(02)00097-6 Rozin, P., & Cohen, A. B. (2003). Reply to commentaries: Confusion infusions, suggestives, correctives, and other medicines. Emotion, 3(1), 92-96. doi:10.1037/1528-3542.3.1.92 Safavian, N., Conley, A., & Karabenick, S. (2013, April). Examining mathematics cost value among middle school youth. In E. M. Anderman (Chair), Is it worth my time and effort? Exploring students’ conceptions of the cost of learning. Symposium conducted at the annual meeting of the American Educational Research Association, San Francisco, CA. Sazzad, M. S., AlZoubi, O., Calvo, R. A., & D’Mello, S. K. (2011). Affect detection from multichannel physiology during learning. In S. Bull & G. Biswas (Eds.), Proceedings of the 15th International Conference on Artificial Intelligence in Education (pp. 131–138). New York, NY: Springer. Scherer, K. R. (2000). Emotions as episodes of subsystems synchronization driven by nonlinear appraisal processes. In I. Granic & M. D. Lewis (Eds.), Emotion, development, and self- organization: Dynamic systems approaches to emotional development (pp. 70–99). New York: Cambridge University Press. Schmidt, J. A., Kafkas, S. S., Maier, K. S., Shumow, L., & Kackar-Cam, H. Z. (2019). Why are we learning this? using mixed methods to understand teachers’ relevance statements and how they shape middle school students’ perceptions of science utility. Contemporary Educational Psychology, 57, 9-31. doi:10.1016/j.cedpsych.2018.08.005 Schmidt, J. A., Shumow, L., & Kackar-Cam, H. Z. (2017). Does mindset intervention predict students’ daily experience in classrooms? A comparison of seventh and ninth graders’ trajectories. Journal of Youth and Adolescence, 46(3), 582-602. doi:10.1007/s10964-016- 0489-z Schuster C, & Martiny, S. E. (2016). Not feeling good in stem: Effects of stereotype activation and anticipated affect on women’s career aspirations. Sex Roles, 76, 40-55. Seib, H. M., & Vodanovich, S. J. (1998). Boredom proneness and psychosocial development. The Journal of Psychology, 132, 642–652. 131 Silvia, P. J. (2010). Confusion and interest: The role of knowledge emotions in aesthetic experience. Psychology of Aesthetics, Creativity, and the Arts, 4(2), 75-80. doi:10.1037/a0017081 Trautwein, U., Marsh, H. W., Nagengast, B., Lüdtke, O., Nagy, G., & Jonkmann, K. (2012). Probing for the multiplicative term in modern expectancy–value theory: A latent interaction modeling study. Journal of Educational Psychology, 104(3), 763-777. doi:10.1037/a0027470 van der Pligt, J. & Nanne K., De Vries (1998). Expectancy-Value models of health behaviour: The role of salience and anticipated affect. Psychology & Health, 13, 289-305. Van Petegem, S., Van Petegem, S., Vansteenkiste, M., Vansteenkiste, M., Beyers, W., & Beyers, W. (2013). The Jingle–Jangle fallacy in adolescent autonomy in the family: In search of an underlying structure. Journal of Youth and Adolescence, 42(7), 994-1014. doi:10.1007/s10964-012-9847-7 Vodanovich, S. J. (2003a). 
On the possible benefits of boredom: A neglected area in personality research. Psychology and Education: An Interdisciplinary Journal, 40(3–4), 28–33. Vogl, E., Pekrun, R., Murayama, K., & Loderer, K. (2019). Surprised-curious-confused: Epistemic emotions and knowledge exploration. Emotion. doi:10.1037/emo0000578 Weidman, A. C., Steckler, C. M., & Tracy, J. L. (2017). The jingle and jangle of emotion assessment: Imprecise measurement, casual scale usage, and conceptual fuzziness in emotion research. Emotion, 17(2), 267-295. doi:10.1037/emo0000226 Widaman, K. F., Little, T. D., Preacher, K. J., & Sawalani, G. M. (2011). On creating and using short forms of scales in secondary research. In K. H. Trzesniewski, M. B. Donnellan, & R. E. Lucas (Eds.), Secondary data analysis: An introduction for psychologists (pp. 39- 61). Washington, DC, US: American Psychological Association. doi:10.1037/12350-003 Wigfield, A., Battle, A., Keller, L. B., & Eccles, J. S. (2002). Sex differences in motivation, self- concept, career aspiration, and career choice: Implications for cognitive development. In A. McGillicuddy-De Lisi & R. De Lisi (Eds.), Biology, society, and behavior: The development of sex differences in cognition (pp. 93–124). Westport, CT: Ablex. Wigfield, A., & Eccles, J. S. (2000). Expectancy–Value theory of achievement motivation. Contemporary Educational Psychology, 25(1), 68-81. doi:10.1006/ceps.1999.1015 Wigfield, A. & Eccles, J. S. (2020). 35 years of research on students’ subjective task values and motivation: A look back and a look forward. In A. J. Elliot (Ed.), Advances in motivation science (Vol. 7, pp. 161-198). Elsevier Inc. Wilson, T. D., & Gilbert, D. T. (2003). Affective forecasting. In M. P. Zanna (Ed.), Advances in experimental social psychology (Vol. 35, pp. 345–411). San Diego, CA: Academic Press. 132 Wilson, T. D., & Gilbert, D. T. (2005). Affective forecasting: Knowing what to want. Current Directions in Psychological Science, 14, 131-134. Yik, M., Russell, J. A., & Steiger, J. H. (2011). A 12-point circumplex structure of core affect. Emotion, 11(4), 705-731. doi:10.1037/a0023980 Zirkel, S., Garcia, J. A., & Murphy, M. C. (2015). Experience-sampling research methods and their potential for education research. Educational Researcher, 44(1), 7-16. doi:10.3102/0013189X14566879 133 PAPER THREE. Students’ Perceptions of Cost as Antecedents of Their Mathematics Achievement and Intentions to Remain in STEM Abstract Students’ perceptions of cost are important predictors of academic and motivational outcomes. Though cost has been described as the anticipated effort one must put forth on an activity and what an individual sacrifices to complete a task, no known work has examined the extent to which anticipated cost beliefs predict experienced cost or whether anticipated and experienced cost are differentially predictive of academic and motivational outcomes. The present study examined four dimensions of cost (task effort, outside effort, loss of valued alternatives, emotional cost) as anticipated and experienced beliefs and how they predict mathematics achievement and STEM-career intentions in introductory college calculus courses. Overall, results suggested that students with high anticipated cost beliefs also experienced higher cost beliefs and had higher course grades. Unlike anticipated cost, higher experienced cost was associated with lower grades. whereas students with high experiences of cost had lower course grades. 
Results are discussed in terms of implications for theory and practice.

Introduction

The United States (U.S.) has emphasized the importance of science, technology, engineering, and mathematics (STEM) occupations as these jobs have been expected to grow at a faster rate than non-STEM occupations (Fayer et al., 2017); however, about half of the undergraduate students who declare STEM majors end up changing their major before completing the degree (Chen, 2013; Daempfle, 2003). As STEM interest tends to decline as students age (Brophy, 2008; Harackiewicz et al., 2016), gateway courses can act as one of the final areas where STEM attrition or persistence takes form (Chang et al., 2008; Stout et al., 2011). Researchers are still trying to understand the experiences that contribute to students' decisions to leave STEM pathways, but one reason may be that students simply perceive the costs of remaining in a STEM field to be too high. For example, STEM majors are often thought to be rather competitive and to have high workloads compared to other majors (Seymour & Hewitt, 1997). Students also perceive STEM majors as requiring them to sacrifice other activities they value, which has been shown to predict STEM attrition (Perez et al., 2014).

Expectancy-value theory suggests that students' achievement motivation and academic choices are driven by two factors: their beliefs about their ability to do a given task and their beliefs about the value of doing that task (Eccles et al., 1983; Wigfield & Eccles, 2000). In this model, cost has been conceptualized as a negative value and refers to the sacrifices one must make in order to partake in an activity or complete a task (Eccles et al., 1983). Despite fairly consistent evidence that students view STEM pathways as costly, little is known about how cost beliefs function in the short term, throughout the duration of a single course, for example. The purpose of this study was to examine students' experiences of cost during a gateway college calculus course and to explore how these experienced costs mediate the associations between anticipated cost and academic outcomes (i.e., course grades and STEM career intentions), while controlling for expectancies and values.

Persistence and Achievement in Mathematics

In order for students to realize a STEM career, they have to be successful in STEM courses and make the choice to persist in pursuit of this career path. Current shortages in STEM careers are one result of high attrition among STEM majors, particularly among women and minority students (DePass & Chubin, 2009; Hall et al., 2011; Ingersoll & May, 2012; Ingersoll & Perda, 2010; U.S. Department of Education, National Center for Education Statistics,
Given that researchers have found these student beliefs to be critical predictors of achievement and persistence in STEM courses, it may be particularly important to consider these outcomes in gateway STEM courses where STEM attrition tends to be high. Gateway Courses Gateway courses refer to high-risk, high enrollment, often introductory courses that must be passed for a student to continue with their selected major (Koch & Rodier, 2014). For example, calculus is generally the first mathematics course that is required for STEM majors and must be passed in order to continue with the intended STEM major (Mattern et al., 2015; Radunzel et al., 2015). Academic performance in college gateway STEM courses is particularly important for student persistence in STEM majors (Gasiewski et al., 2012; Jones et al., 2010; Lawson et al., 2007; Shaw & Barbuti, 2010) as STEM attrition tends to be high for students who do not perform well in these courses (Chang et al., 2008; Stout et al., 2011). Students who perform poorly in gateway courses may self-select out of STEM majors and/or may be prevented from continuing due to program or university guidelines. Further, STEM attrition is more prevalent for women and students of color in gateway courses (Koch, 2017). Women and students of color are also more likely to switch their major to a non-STEM major and are less 136 likely to complete a STEM degree (National Science Board, 2018; National Science Foundation, 2017). Research has shown that motivation in gateway STEM courses also impacts achievement and STEM persistence (Ackerman et al., 2013; Dai & Cromley, 2014; Gore, 2006; Hernandez et al., 2013; Perez et al., 2014; Zare, 2009). Generally, motivation researchers have focused on the absence or presence of beliefs that appear to be motivationally adaptive, but less research has focused on beliefs that, when present can be seen as barriers to one’s motivation. Cost beliefs (i.e., what one must sacrifice to do well on a specific task) are one type of motivational detractor. Students’ perceptions of cost have been found to be predictive of reduced achievement and STEM persistence (Perez et al., 2014); however, cost research is still in its infancy and more work is needed to understand how cost predicts academic outcomes and choices. Importance of Cost According to expectancy-value theory, students’ achievement motivation and academic choices are driven by two factors: success expectancies and task value (Eccles et al., 1983). Expectancies refer to how well one believes they will do on a particular task or activity (Eccles et al., 1983; Wigfield & Eccles, 2000), whereas task value is broadly conceptualized as the extent to which a student wants to complete a task. Task value was originally conceptualized as having three dimensions. Attainment value refers to the importance that a student places on a given task. Intrinsic value refers to the enjoyment that a student gains by completing a task. Utility value refers to the usefulness of a particular task for a student’s future plans or goals (Eccles et al., 1983). Eccles et al. (1983) originally discussed students’ perceptions of cost (i.e., what a student must give up to complete a task or the effort required to complete the task) as a possible mediator of task value, but in later work described it as a fourth dimension of task value 137 (Eccles, 2005; Wigfield & Eccles, 1992, 2000; Wigfield et al., 2016). 
Other researchers have recently provided empirical evidence that cost is an independent construct from task value and students’ expectancies (Barron & Hulleman, 2015; Chiang et al., 2011; Conley, 2012; Luttrell et al., 2010; Trautwein et al., 2012), though there is still considerable debate about where cost “fits” in the model (Eccles & Wigfield, in press; Wigfield & Eccles, 2020). In this study, I consider cost separately from value and expectancies. Dimensions and Consequences of Cost Eccles et al. (1983) originally posited three dimensions of cost that students may experience: the effort required to engage in a task (i.e., task effort cost), valued alternatives that must be given up in order to engage in a task (i.e., loss of valued alternatives), and the psychological consequences that come with failure on the particular task (i.e., psychological cost). Wigfield and Eccles (2000) later expanded psychological cost to incorporate experiences of anxiety and negative emotional experiences that students associate with a specific task, which they referred to as emotional cost. More recently, Flake et al. (2015) suggested dividing effort cost into two unique components after conducting focus groups with students and reviewing literature from fields outside of education. Task effort (TE) cost refers to a student’s perception of how much effort is needed to engage in a task (e.g., a student believes that completing calculus homework will take too many hours to finish). Outside effort (OE) cost refers to a student’s perception of how much effort is needed for other activities, which will inhibit the student from completing the main task (e.g., a student works a part-time job, which takes away from putting time and effort into her calculus homework). Loss of valued alternatives (LV) and emotional (EM) cost were retained as meaningful cost dimensions. Of note here is that Flake et al. (2015) reiterated that students’ 138 subjective cost beliefs are what is important. That is, two students may have the same calculus homework and it may take each of them three hours to complete, but only one student may believe this is too much time to put into homework. Researchers have noted that there may be other dimensions of cost, such as financial cost and ego cost (Eccles, 2005; Johnson & Safavian, 2016); however, in the framing of the current study I focus on the four dimensions specified by Flake et al. (2015). Students’ cost perceptions have been shown to be negatively correlated with their expectancies and task values. (Flake et al., 2015; Gaspard et al., 2017; Perez et al., 2014; Safavian & Conley, 2016). For example, if a student perceives a task to require too much effort, she may end up valuing the task less after deciding that the task is not worth the effort. The same student may also feel less competent if she perceives the task to require too much effort. Conversely, a student who perceives a task to require little effort may experience high expectancies and may value the task more. However, other research has shown that a student can perceive high cost, while still experiencing high expectancies and high value (Conley, 2012; Perez et al., 2014). For example, Johnson and Safavian (2016) conducted focus groups with college students who reported that having high frustration or stress pushed them to work harder in their course. 
Other researchers have found that perceptions of high cost are negatively correlated with math achievement (Conley, 2012; Jiang et al., 2018; Safavian et al., 2013; Trautwein et al., 2012) and inversely related to interest and utility value (Luttrell et al., 2010). In other studies, high cost perceptions have been linked to high levels of procrastination, avoidance intentions, and negative affect (Jiang et al., 2018), as well as lower intentions to attend graduate school (Battle & Wigfield, 2003; Perez et al., 2014). Researchers have also found that students' perceptions of cost in STEM subjects tend to increase throughout their educational careers (grades 5-12), with female students experiencing larger increases in perceived cost compared to male students (Gaspard et al., 2017). This suggests that students' perceptions of cost related to STEM subjects may be highest when they enter college. Considering that high levels of cost tend to be negatively associated with motivational outcomes, achievement, and course-taking intentions, cost may be a promising construct to intervene on (Barron & Hulleman, 2015). In thinking about designing interventions focused on students' perceptions of cost, two questions arise. The first question is: what is the best point of intervention? Is it more effective to address students' beliefs about the anticipated or expected costs of a course they have yet to take, or to intervene at the point where students are actually engaged in the course itself and experiencing the daily costs for themselves? The second question is: on what dimension of cost should we intervene? In order to answer these questions, researchers must examine how different dimensions of anticipated and experienced costs function.

Cost Beliefs as Anticipated and Experienced

As described above, Eccles (2005) defined cost as what an individual must sacrifice to engage in a task and the anticipated effort that will be needed to finish that task; however, empirical work to date has primarily operationalized cost in terms of one's anticipated beliefs about a task, rather than one's experienced sacrifices (Gaspard et al., 2015; Jiang et al., 2018; Perez et al., 2014). In other words, cost is measured in terms of how costly one expects a future activity to be. Given that Eccles (2005) considered cost to refer to both anticipated and experienced sacrifices, researchers should consider both of these perspectives on cost. It is reasonable to assume that one's beliefs about the costs of a particular task may shift once they have more concrete experiences with the task itself. That is, a person might perceive a task as more or less costly when they are in the middle of it, relative to before they started it. Anticipated costs refer to the expected or anticipated negative appraisals of what will be given up, invested, or required to partake in a particular activity or task. For example, a student may have formed preconceived beliefs that a math class will take up too much time or that it will be extremely frustrating. In essence, then, anticipated cost beliefs are predictions. These predictions are, of course, likely shaped by students' past experiences (Eccles et al., 1983). Experienced costs refer to one's immediate negative appraisals of what is currently being given up, invested, or required in order to partake in the particular activity or task at hand.
For example, during week five of the semester, a student may feel that their math class does not take too much effort (low experienced cost); however, this student's experiences of cost may change during an exam week, when they have to invest and sacrifice much more. Experienced cost may fluctuate and change from week to week depending on other factors both inside and outside of the class. Researchers have explored other constructs, such as emotions, using an anticipated and experienced framework when examining decision-making (Keltner & Lerner, 2010; Loewenstein & Lerner, 2003) and have found that experienced emotions often drive behavior in different ways than when anticipating future consequences. Experienced emotional responses are sensitive to the vividness and timing of outcomes, whereas anticipated emotional responses are not; thus, they have different potential to shape behavior (Loewenstein & Lerner, 2003). Experienced emotions can drive individuals to behave in ways that contradict their long-term goals. Anticipated emotions are not actual emotions being experienced; rather, they are cognitive appraisals of an expected future state (Perugini & Bagozzi, 2001; Wilson & Gilbert, 2003; van der Pligt & de Vries, 1998). For example, a student may expect to be anxious during a future exam but does not currently feel anxious. These anticipated emotions are similar to the concept of affective forecasting, in which individuals predict how they will feel in the future (Wilson & Gilbert, 2003). Research has generally shown that individuals tend to estimate their future emotions poorly (Dunn et al., 2003; Goetz et al., 2013); such misestimations have been linked to perceived competence (Goetz et al., 2013), emotional exhaustion (Goetz et al., 2015), and gender stereotypes (Schuster & Martiny, 2016). Even so, anticipated emotions have been shown to be significant predictors of outcomes such as STEM career aspirations (Richard et al., 1996; Schuster & Martiny, 2016; Wilson & Gilbert, 2005). Unlike anticipated emotions, momentary or state emotions are short-lived emotional experiences, are strongly related to particular situations (Eid et al., 1999), and are less stable over time (Linnenbrink, 2006). Thus, students' anticipated cost beliefs may act as cognitive appraisals of the costs they expect to experience in the future. These anticipated beliefs may be formed from previous experiences as well as future expectations. Perhaps a student who anticipates what he or she has to give up is actually considering the consequence, rather than actually giving something up, whereas what a student must actually give up may shift from week to week and is more dependent on context. For example, students may form preconceived notions and overestimate the costs associated with taking a course, only to find that they do not actually incur high levels of cost when taking the course. Taken together, a student may anticipate that a calculus course will take up too much time, but find that during the course of the semester, some weeks require more effort than others. Thus, it is important to understand whether anticipated and experienced costs impact outcomes uniquely so that researchers can better understand when and where to intervene.
Modeling the Dynamics of Cost Beliefs

As mentioned above, cost has been referred to as what one sacrifices as well as the anticipated effort required to complete a task (Eccles, 2005), yet little work has been conducted to understand how cost beliefs function in the classroom. Researchers have called for the need to examine complex systems in the classroom (Hilpert & Marchand, 2018), with others specifically suggesting a need to understand the dynamic nature of cost using intensive longitudinal methodologies, such as event-based techniques (Feldon et al., 2019). Through the use of intensive longitudinal methods, researchers can begin to understand the complexities of motivational processes, such as cost, including within- and between-person effects among students (Murayama et al., 2017). Additionally, in order to make sound recommendations to educators and researchers, understanding students' classroom experiences is critical (Zirkel et al., 2015). Considering that cost beliefs may vary significantly throughout the semester, researchers must consider them as dynamic. Not only is it likely that one's anticipated cost beliefs predict one's experiences of cost throughout the semester, it is also possible that one's anticipated cost beliefs predict how one's experienced cost one week carries over to the next week. Anticipated cost may also predict how much one's experiences of cost fluctuate throughout the semester; however, researchers can only answer these questions through the use of intensive longitudinal methods that offer affordances for understanding the complexities of motivational processes. Dynamic structural equation modeling (DSEM; Asparouhov et al., 2018) is one approach that can aid in understanding these complexities.

DSEM is a relatively new technique for modeling intensive longitudinal data over time (see McNeish & Hamaker, 2019 for an overview). Whereas growth models are concerned with developmental processes and change over time, time-series models such as DSEM focus on understanding the variability of processes and when they will deviate from the mean (McNeish & Hamaker, 2019). DSEM is a type of multilevel time-series modeling approach that allows for the examination of variation across individuals and includes predictors at the within- and between-person levels. Further, whereas in traditional time-series models missing data are often removed (Newman, 2014), DSEM uses Bayesian estimation via Markov Chain Monte Carlo (MCMC) algorithms. Because of this, a missing value at a particular measurement occasion is sampled from a conditional posterior distribution that is dependent on other data and parameters in the model (Hamaker et al., 2018). If necessary, DSEM can also account for unequal measurement time intervals (Zhou et al., 2019). Though DSEM can provide new insights into dynamic processes, it is still relatively new; thus, there are limitations with this approach, such as how to address model fit (Asparouhov et al., 2018) and sample size concerns (McNeish, 2019), that are considered in more detail in the discussion.

The Present Study

The present study was designed to examine how anticipated and experienced cost beliefs are associated with one another, and with student achievement and intentions to pursue a STEM career, while controlling for students' expectancies and values at the beginning of the semester.
Because students' cost beliefs throughout the semester may be dependent on one another, I also control for the week-to-week carryover effects of experienced cost as well as the variability in these beliefs. Four dimensions of cost were examined, as suggested by prior research (Flake et al., 2015). The research questions framing this study are:

RQ1a) What are the dynamic associations between anticipated and experienced cost? To answer this question, I examined how anticipated cost is related to experienced cost, the variability in experienced cost, and the week-to-week carryover effects of experienced cost (i.e., the within-person effect). Empirical evidence on anticipated and experienced emotions has suggested differences in their predictive ability (Schuster & Martiny, 2016; Wilson & Gilbert, 2005). Because of this, it may be that anticipated costs predict course grades and STEM career intentions differently than experienced costs. Because anticipated costs may be more of a cognitive appraisal than experienced costs, students may be able to prepare better for those costs and have better long-term outcomes, such as higher grades. In contrast, because experienced costs are more proximal and more difficult to adapt to, students with high experiences of cost may have maladaptive long-term outcomes, such as lower grades. Anticipated cost is also expected to predict experienced costs, in that students with high anticipated costs will experience high costs as well.

RQ1b) What are the associations between anticipated cost, course grades, and STEM career intentions? As theory and empirical data suggest that cost is a negative predictor of academic outcomes (Eccles et al., 1983; Perez et al., 2014; Jiang et al., 2018), I hypothesize that anticipated cost will negatively predict grades and STEM career intentions.

RQ2a) What are the dynamic associations between experienced cost, course grades, and STEM career intentions? To answer this question, I examined how experienced cost, the variability in experienced cost, and the week-to-week carryover effects of experienced cost are related to course grades and STEM career intentions. Similar to research question 1a, I expect that experienced cost will negatively predict grades and STEM career intentions. No hypotheses were made regarding how the week-to-week carryover or the variability in experienced cost would predict grades and STEM career intentions, as these variables were exploratory in nature.

RQ2b) To what extent does experienced cost act as a mediator of the effects of anticipated cost on course grades and STEM career intentions? This research question was exploratory, given that no known research has tested cost, let alone experienced cost, as a mediator, though theory does suggest that these variables may mediate the relation between affective memories and academic outcomes (Eccles et al., 1983). For a student to consider the cost he or she might experience during a course (i.e., anticipated cost), he or she likely draws upon memories of past experiences. Therefore, it is likely that experienced cost will mediate the relation between anticipated cost and the academic outcomes of interest in this study. I also examined whether the week-to-week carryover effects of experienced cost and the variance of experienced cost act as mediators. No hypotheses were made here, as these tests were completely exploratory.
Method

Course Description and Sample

Data for this study were collected from five sections of two large introductory calculus courses at a midwestern university in the United States. Three of the sections represented a calculus course that is generally required for students who plan to complete a STEM major (i.e., applied-STEM), while the other two sections represented a calculus course that is not required for STEM majors (i.e., basic-STEM). Students in the basic-STEM course typically pursue business majors or are pre-med. Both courses are designed to help students build foundational calculus knowledge (e.g., limits, derivatives, integrals) so that they can apply their skills in the future. Each section met three times per week for 50 minutes during the fall 2018 semester (Monday, Wednesday, Friday). On Mondays and Wednesdays, class time consisted of large lectures taught by the faculty of record, whereas on Fridays students met in smaller sections taught by teaching assistants. There was a separate instructor for each of the three sections of the applied-STEM calculus course. The same instructor taught both sections of the basic-STEM calculus course.

The total enrollment across the three sections of the applied-STEM calculus course was 584, with 47% agreeing to participate in the study (N = 273). The total enrollment across the two sections of the basic-STEM calculus course was 514, with 30% agreeing to participate in the study (N = 156). Of the students who participated in the study, 36% were in the basic-STEM calculus course and 64% were in the applied-STEM calculus course. Of the students who participated in the survey, 62% identified as White, 63% identified as male, and 78% were first-year students (see Table 3.1 for full demographic characteristics).

Table 3.1. Participant demographic characteristics
                           % Students (N = 429)
Sex
  Male                     63%
  Female                   37%
Race/Ethnicity
  White                    62%
  International            20%
  Asian                    6%
  Black                    5%
  Hispanic                 5%
  Two or more races        2%
  Not reported             < 1%
Class level
  First-year               78%
  Second-year              15%
  Third-year               5%
  Fourth-year              < 1%
  Fifth-year               1%
  Not reported             1%

Procedure

Students completed a pre-survey during the first two weeks of the semester and a post-survey during the final two weeks of the semester. Surveys were distributed through an online link. Students who participated in the pre- and post-survey received course credit added onto their final course grade. Throughout the semester, students enrolled in the course also completed a daily diary measure, a type of intensive longitudinal data collection, for 11 consecutive weeks. Surveys were distributed through an online platform, Remind. Through Remind, students were emailed a link to the daily diary survey during the last ten minutes of their class. Students had the remainder of the day to respond to the survey before it closed at midnight. Daily diary surveys rotated between Monday and Wednesday to avoid day-of-the-week effects. Students in the applied-STEM calculus course who completed 80% of the daily diary surveys were entered into one of two drawings for a $75 Amazon gift card in their course section (six gift cards in total). Students in the basic-STEM calculus course who completed 80% of the daily diary surveys received course credit. In total, 2,435 daily diary responses were collected. On average, students responded to 5.68 surveys (SD = 3.56), with a range of responses between 1 and 11.
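The overall diary response rate follows directly from these per-student completion counts. As a minimal sketch of the calculation (the completion counts below are hypothetical and are not the study's records):

    # Sketch: daily-diary response rates from per-student completion counts.
    # Counts are hypothetical; the study collected up to 11 weekly reports per student.
    import numpy as np

    n_weeks = 11
    completed = np.array([11, 3, 7, 5, 9, 2])          # hypothetical surveys completed per student

    per_student_rate = completed / n_weeks              # proportion of prompts completed per student
    print(f"Mean completed: {completed.mean():.2f}")    # average number of surveys per student
    print(f"Response rate: {per_student_rate.mean():.0%}")

With the reported mean of 5.68 completed surveys out of 11 possible, the same calculation gives 5.68 / 11, or roughly 52%, consistent with the response rate reported below.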
Generally, a response rate between 50 and 75 percent is expected of intensive longitudinal studies with college students (Feldman Barrett, 2004; Hektner et al., 2007). A response rate of 52% was observed. Demographic information and course grades were obtained through the university's institutional office. A timeline of the study is presented in the Appendix.

Measures

Pre-survey Measures

All measures were assessed on a 7-point Likert scale (1 = Strongly agree, 7 = Strongly disagree). Example items are included below, while full scales are provided in the Appendix.

Anticipated Cost. Four dimensions of anticipated cost were assessed using an adapted version of the Flake et al. (2015) scale. Anticipated task effort cost was assessed using five items (e.g., "This class will demand too much of my time."; α = .92). Anticipated outside effort cost was assessed using four items (e.g., "I have so many other commitments that I won't be able to put forth the effort needed for this class."; α = .89). Anticipated loss of valued alternatives was assessed using four items (e.g., "I'll have to sacrifice too much to be in this class."; α = .85). Anticipated emotional cost was assessed using six items (e.g., "I'll worry too much about this class."; α = .91).

Expectancies. Expectancies were assessed using three items (e.g., "I know I can learn the material in my math class."; α = .84; Kosovich et al., 2015).

Value. Value was assessed using three items (e.g., "I think my math class is important."; α = .87; Kosovich et al., 2015).

Daily Diary Measures

Daily diary data collection is a type of intensive longitudinal methodology (Bolger & Laurenceau, 2013) used to collect individuals' subjective experiences in relation to a specific course. This has also been referred to as an end-of-class report (Durik et al., 2018; Schmidt et al., 2017). Similar to other studies that have used this methodology (Durik et al., 2018; Schmidt et al., 2017; Schweinle et al., 2016), students were asked to complete a daily diary report once a week for 11 consecutive weeks after one of the large lectures. All items were assessed using a 7-point Likert scale (1 = Strongly agree, 7 = Strongly disagree) and included the stem: "After today's class I feel like:".

Experienced Cost. Students reported on their perceptions of experienced cost in relation to the specific day's class. In total, four items were used to assess experienced cost (one per dimension). These items were selected from the original Flake et al. (2015) scale after a multimethod validation study (Beymer et al., under review). The items used were as follows: experienced task effort cost: "This class requires too much effort"; experienced outside effort cost: "Because of other things that I do, I don't have time to put into this class"; experienced loss of valued alternatives: "This class requires me to give up too many other activities that I value"; experienced emotional cost: "This class is emotionally draining". Given that intensive longitudinal methodologies are designed to collect repeated measures, short scales are generally preferable to longer, multi-item scales (Csikszentmihalyi & Larson, 2014; Goetz et al., 2016; Zirkel et al., 2015).

Post-survey and Achievement Measures

Post-survey measures were completed during the final two weeks of the semester. Intentions to pursue STEM was assessed using a 7-point Likert scale (1 = Definitely will not, 7 = Definitely will). Course grades were obtained through university institutional records and were on a 4.0 scale.
Intentions to Pursue STEM and Achievement. Students' intentions to pursue a STEM career were assessed using a single item in the post-survey: "To what extent do you intend to pursue a STEM-related career?" (adopted from Estrada et al., 2011). Estrada-Hollenbeck et al. (2009) found this measure to be significantly correlated with a number of behaviors related to intentions of pursuing a career in the given field (i.e., STEM, science). Final course grades were collected as a measure of achievement.

Data Analytic Strategy

Dynamic structural equation modeling (DSEM; Asparouhov et al., 2018) was used to model within- and between-person effects. DSEM allows for the modeling of multilevel intensive longitudinal data using Bayesian estimation. Markov Chain Monte Carlo (MCMC) algorithms were used to obtain posterior distributions of each parameter and to describe a 95% credibility interval (Asparouhov et al., 2018). If 0 is not contained in the interval, this suggests that the estimate is significant. I used the default, noninformative priors for all parameters, 20,000 iterations, and thinning of every 10th iteration for more stable results. The default number of chains (i.e., two) and number of discarded iterations (i.e., the first half of each chain) were used in Mplus (Muthén & Muthén, 1998-2017). I used the Mplus default for the point estimate of the central tendency of posterior distributions, which is the median. DSEM uses Full Information Maximum Likelihood (FIML); therefore, listwise deletion does not occur for missing cases; rather, all available data are used to estimate parameters.

Within-person, I modeled the autoregressive relation of experienced cost. That is, experienced cost at time t was regressed on experienced cost at time t-1 to examine whether experienced cost has week-to-week carryover effects. Further, I modeled the within-person residual variance of experienced cost to examine whether the magnitude of week-to-week fluctuations varies from student to student. The within-person variability takes into account the extent to which unexplained variances in a participant's measurements oscillate around his or her trajectory, or, in other words, how reactive students' cost beliefs are to outside forces that are not accounted for in the model. The log function was used here to ensure that the estimate of the variance is positive (Hamaker et al., 2018). Because the dispersion of these residuals can vary between subjects, I modeled the within-person variability as a random effect (Zhou et al., 2019). Alongside modeling the residual variance at the between-person level, I also estimated the random effects of experienced cost and of the autoregressive relation of experienced cost between-person (i.e., these parameters are allowed to vary by individual). Last, I modeled experienced cost, the autoregressive parameter, and the residual variance as mediators of the effects of anticipated cost, expectancies, and value on course grade and STEM career intentions. Due to the complexity of modeling in DSEM, all constructs are modeled as observed measures; however, experienced cost, the autoregressive parameter, and the residual variance are modeled as latent factors at the between-person level (i.e., these paths are random). Anticipated cost, expectancies, and value were grand-mean centered (see Figure 3.1 for the hypothesized model).
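To make this modeling setup concrete, the following sketch simulates data from a process of the kind just described: each student has a person-specific mean level of experienced cost, a person-specific week-to-week carryover (autoregressive) coefficient, and a person-specific log residual variance, each of which depends on anticipated cost. All parameter values and variable names are illustrative assumptions; the sketch generates data of this form rather than reproducing the study's estimation, which was carried out with Bayesian DSEM in Mplus.

    # Illustrative simulation of a two-level AR(1) process of the kind modeled with DSEM.
    # Person-specific intercept, carryover, and log residual variance each depend on a
    # between-person predictor (anticipated cost). All parameter values are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    n_students, n_weeks = 200, 11

    ac = rng.normal(0, 1, n_students)                                      # grand-mean-centered anticipated cost

    intercept = 3.0 + 0.6 * ac + rng.normal(0, 0.8, n_students)            # person mean of experienced cost
    carryover = np.clip(0.3 + rng.normal(0, 0.1, n_students), -0.9, 0.9)   # person-specific AR(1) slope
    log_var = -0.8 + 0.3 * ac + rng.normal(0, 0.3, n_students)             # person-specific log residual variance
    resid_sd = np.sqrt(np.exp(log_var))

    ec = np.zeros((n_students, n_weeks))                                   # weekly experienced cost reports
    ec[:, 0] = intercept + rng.normal(0, resid_sd)
    for t in range(1, n_weeks):
        # AR(1) fluctuation around each student's own mean, with person-specific residual SD
        ec[:, t] = intercept + carryover * (ec[:, t - 1] - intercept) + rng.normal(0, resid_sd)

    # Between-person association between anticipated cost and mean experienced cost
    print(round(np.corrcoef(ac, ec.mean(axis=1))[0, 1], 2))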
Figure 3.1. Hypothesized model. Four separate models were examined, one for each dimension of cost: task effort cost, outside effort cost, loss of valued alternatives, and emotional cost. The black circles represent random effects across individuals.

Equations are presented below.

Level 1 model:

    EC_{ti} = \beta_{EC,i} + \beta_{EE,i} EC_{t-1,i} + e_{ti}   (1)

Level 2 model:

    \beta_{EC,i} = \beta_{00} + \beta_{01} AC_i + \beta_{02} E_i + \beta_{03} V_i + u_{0i}   (2)
    \beta_{EE,i} = \beta_{10} + \beta_{11} AC_i + \beta_{12} E_i + \beta_{13} V_i + u_{1i}   (3)
    \log(\sigma^2_{EC,i}) = \beta_{20} + \beta_{21} AC_i + \beta_{22} E_i + \beta_{23} V_i + u_{2i}   (4)
    Grade_i = \beta_{30} + \beta_{31} AC_i + \beta_{32} E_i + \beta_{33} V_i + \beta_{34} \beta_{EC,i} + \beta_{35} \beta_{EE,i} + \beta_{36} \log(\sigma^2_{EC,i}) + u_{3i}   (5)
    SI_i = \beta_{40} + \beta_{41} AC_i + \beta_{42} E_i + \beta_{43} V_i + \beta_{44} \beta_{EC,i} + \beta_{45} \beta_{EE,i} + \beta_{46} \log(\sigma^2_{EC,i}) + u_{4i}   (6)

In Equation 1, EC_{ti} represents experienced cost measured at time t for student i, \beta_{EC,i} represents the student-specific intercept, and \beta_{EE,i} is the coefficient on the lagged experienced cost variable, which allows for a carryover effect in experienced cost from week to week. Last, e_{ti} represents the student-level residual at time t; this captures the difference between the predicted and observed value. In Equation 2, the student-specific intercept for experienced cost, \beta_{EC,i}, is equal to the average intercept across all students, \beta_{00}, with anticipated cost, \beta_{01} AC_i, expectancies, \beta_{02} E_i, and value, \beta_{03} V_i, as predictors of the student-specific intercept, plus a student-specific random intercept, u_{0i}. In Equation 3, the student-specific slope, \beta_{EE,i}, is equal to the average autoregressive relationship between experienced cost at time t-1 and time t, \beta_{10}, with anticipated cost, \beta_{11} AC_i, expectancies, \beta_{12} E_i, and value, \beta_{13} V_i, as predictors of the student-specific slope, plus a student-specific random slope, u_{1i}. In Equation 4, the log of the residual variance of experienced cost, \log(\sigma^2_{EC,i}), is equal to the expected log residual variance across students, \beta_{20}, with anticipated cost, \beta_{21} AC_i, expectancies, \beta_{22} E_i, and value, \beta_{23} V_i, as predictors, plus a random effect, u_{2i}, which allows the within-person residual variance to vary randomly across students. Because Equation 4 is expressed on the log scale, estimates must be exponentiated to arrive at the residual variance; for example, the exponentiated coefficient for anticipated cost represents the multiplicative change in the residual variance for a one-unit change in anticipated cost. In Equation 5, the achievement of student i, Grade_i, is equal to the average intercept across students, \beta_{30}, with anticipated cost, \beta_{31} AC_i, expectancies, \beta_{32} E_i, value, \beta_{33} V_i, experienced cost, \beta_{34} \beta_{EC,i}, the autoregressive slope of experienced cost, \beta_{35} \beta_{EE,i}, and the log of the residual variance of experienced cost, \beta_{36} \log(\sigma^2_{EC,i}), as predictors of achievement, along with a student-specific residual, u_{3i}. Last, in Equation 6, the STEM career intentions of student i, SI_i, are modeled with the same set of predictors, \beta_{40} through \beta_{46}, along with a student-specific residual, u_{4i}.

Results

Preliminary Analysis

Correlations, means, and standard deviations of all variables were examined (see Table 3.2). As expected, dimensions of cost were generally highly and positively correlated with each other. Measures of anticipated cost were more strongly correlated with other measures of anticipated cost, and measures of experienced cost were more strongly correlated with other measures of experienced cost. Finally, expectancies and value were positively correlated with each other and negatively correlated with cost.
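Table 3.2 separates within-person from between-person correlations. As a brief sketch of how such a decomposition can be computed from long-format diary data (simulated values and hypothetical column names, not the study's records), between-person correlations are taken across person means and within-person correlations across person-mean-centered weekly scores:

    # Sketch: within- vs. between-person correlations from long-format diary data.
    # Data are simulated with a shared person-level component so the two levels differ.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)
    n_students, n_weeks = 100, 11
    person = rng.normal(0, 1.0, n_students)    # person-level component shared by both variables

    df = pd.DataFrame({
        "student": np.repeat(np.arange(n_students), n_weeks),
        "te_cost": 3.0 + np.repeat(person, n_weeks) + rng.normal(0, 1.0, n_students * n_weeks),
        "em_cost": 3.2 + np.repeat(person, n_weeks) + rng.normal(0, 1.0, n_students * n_weeks),
    })

    # Between-person: correlate person means
    means = df.groupby("student")[["te_cost", "em_cost"]].mean()
    between_r = means["te_cost"].corr(means["em_cost"])

    # Within-person: correlate person-mean-centered weekly scores
    centered = df[["te_cost", "em_cost"]] - df.groupby("student")[["te_cost", "em_cost"]].transform("mean")
    within_r = centered["te_cost"].corr(centered["em_cost"])

    print(f"between-person r = {between_r:.2f}, within-person r = {within_r:.2f}")

In this simulation, the shared person-level component produces a sizable between-person correlation while the within-person correlation stays near zero, which illustrates why the two levels are reported separately in Table 3.2.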
A confirmatory factor analysis (CFA) was also conducted to confirm that the dimensions of anticipated cost separated from one another and from expectancies and values. A six-factor CFA was considered, including anticipated task effort, outside effort, emotional, and loss of valued alternatives cost, along with expectancies and value. This six-factor solution fit the data well (CFI = 0.98, TLI = 0.97, RMSEA = 0.04) according to Hu and Bentler's (1999) guidelines. Standardized factor loadings are presented in the Appendix. A CFA was not conducted for the experienced measures of cost, given that a single item was used for each dimension.

Table 3.2. Within- and between-person correlations, means, and standard deviations of all variables. Note. TE = Task Effort Cost; OE = Outside Effort Cost; LV = Loss of Valued Alternatives; EM = Emotional Cost. All items were measured on a 7-point Likert scale except for grade, which was on a 4.0 scale. Correlations between experienced cost and all other variables were computed at the daily level. *p < 0.05; **p < 0.01; ***p < 0.001.

Because there were differences in how each calculus course was incentivized, I examined differences in response rates by course. Results from a t-test suggest that there were significant differences in response rate by course [t(340) = 3.65, p < .001]. The average response rate was 47% for students enrolled in the applied-STEM calculus course and 59% for students enrolled in the basic-STEM calculus course. Because of this, I also examined mean differences by course type (i.e., applied-STEM vs. basic-STEM) for each variable. I further conducted a missing data analysis to examine whether daily diary response rates (the proportion of surveys completed out of a possible total of 11) were significantly different across groups defined by gender, race/ethnicity, and class level (e.g., sophomore). A t-test was used to examine differences in response rate by gender, and a one-way ANOVA was used to examine response rate differences by race/ethnicity and class level. A significant difference was found in response rate by gender [t(340) = 3.07, p < .01]. The average response rate was 48% for males and 58% for females. An overall significant difference was found in response rate by race/ethnicity [F(6, 422) = 2.23, p < .05]; however, Tukey's post-hoc analysis revealed no significant differences between groups. Response rates by racial/ethnic group were as follows: White = 55%; International = 45%; Asian = 49%; Black = 39%; Hispanic = 48%; Two or more races = 63%; Not reported = 9%.
Finally, no significant difference in response rate by year in school was found [F(5, 423) = 1.58, p = .17].

A MANOVA was run to test for differences by course in anticipated cost, expectancies, value, grade, and STEM intentions. There was a significant overall effect across variables, F(8, 305) = 17.33, p < 0.001; Wilks' Lambda = 0.68. There were significant differences between courses on expectancies, F(1, 305) = 8.15, p < 0.01 (MbasicSTEM = 5.45; MappliedSTEM = 5.79); value, F(1, 305) = 9.41, p < 0.01 (MbasicSTEM = 5.45; MappliedSTEM = 5.81); course grade, F(1, 305) = 4.69, p < 0.05 (MbasicSTEM = 2.60; MappliedSTEM = 2.92); and STEM intentions, F(1, 305) = 134.51, p < 0.01 (MbasicSTEM = 3.55; MappliedSTEM = 5.77). I also examined whether there were mean differences by course type in experienced cost using a MANOVA. An overall significant effect across experienced cost variables was found, F(4, 2213) = 9.06, p < 0.001; Wilks' Lambda = 0.98. Across all cost dimensions, students in the basic-STEM calculus course reported higher mean levels of experienced cost relative to students in the applied-STEM calculus course: experienced task effort cost, F(1, 2216) = 15.89, p < 0.001 (MbasicSTEM = 3.28; MappliedSTEM = 3.01); experienced outside effort cost, F(1, 2216) = 32.33, p < 0.001 (MbasicSTEM = 3.04; MappliedSTEM = 2.66); experienced loss of valued alternatives, F(1, 2216) = 9.88, p = 0.002 (MbasicSTEM = 2.98; MappliedSTEM = 2.76); and experienced emotional cost, F(1, 2216) = 12.17, p < 0.001 (MbasicSTEM = 3.34; MappliedSTEM = 3.08).

I also used a chi-square test to examine whether there were systematic differences in students who had missing or complete data on the outcome variables (STEM career intentions and course grade) by gender, race/ethnicity, class level, and calculus course. For grades, there were no significant differences by gender [χ2(1) = 1.78, p = .18] or calculus course [χ2(1) = .03, p = .86]. The same pattern was found when examining STEM career intentions: gender [χ2(1) = 2.47, p = .12]; calculus course [χ2(1) = 1.98, p = .16]. Significant differences were found when examining grades and STEM career intentions by race/ethnicity and class level. First, when examining grades by race/ethnicity [χ2(6) = 35.90, p < .001], 4% of grades were missing (N = 16 out of 429 students). White students were missing 4% of grades (N = 10 out of 267 students), Black students were missing 10% (N = 2 out of 20 students), Hispanic students were missing 10% (N = 2 out of 20 students), students reporting two or more races were missing 11% (N = 1 out of 9 students), and the one student who did not report his/her race was missing grade data. When examining grades by class level [χ2(5) = 91.44, p < .001], freshmen were missing 3% (N = 9 out of 334 students), sophomores were missing 2% (N = 1 out of 63 students), juniors were missing 5% (N = 1 out of 20 students), and 5 out of 7 (71%) students who did not report their class level were missing grades. When examining missing STEM career intentions by race/ethnicity [χ2(6) = 12.68, p < .05] and class level [χ2(5) = 16.93, p < .01], a total of 26% was missing (N = 111 out of 429 students).
White students were missing 22% of STEM career intentions (N = 60 out of 267 students), International students were missing 32% (N = 27 out of 85 students), Asian students were missing 15% (N = 4 out of 27 students), Black students were missing 45% (N = 9 out of 20 students), Hispanic students were missing 35% (N = 7 out of 20 students), students reporting two or more races were missing 33% (N = 3 out of 9 students), and the one student who did not report his/her race was also missing these data. Freshmen were missing 23% (N = 78 out of 334 students), sophomores were missing 30% (N = 19 out of 63 students), juniors were missing 35% (N = 7 out of 20 students), 2 out of 2 seniors (100%) were missing these data, and 5 out of 7 (71%) students who did not report their class level were missing these data.

To assess convergence of estimation for each model, I examined the potential scale reduction (PSR) factor (i.e., the ratio of the total variation over the within-chain variation), autocorrelation plots, and trace plots. Convergence criteria were all acceptable upon examination. Across all models, the PSR factor reached a satisfactory level close to one. For autocorrelation plots, a small autocorrelation around 0.1 is desired (Muthén, 2010); this was true across all models. Last, trace plots depict the estimate of a model parameter sampled in each MCMC chain across the number of iterations (Jebb & Woo, 2015). Assuming there is no abnormality in model convergence, the trace plot should show no trend or large fluctuations. Again, this was true across all models.

Main Analysis

Four models (one for each dimension of cost) were examined. All estimates are included in Table 3.3, and a trimmed model showing significant results is presented in Figure 3.2. Across all four models, all fixed effects (within- and between-person) and variance components were significant. For example, on average, experienced task effort across students was 3.12 (95% CI = [3.00, 3.25]) and had a variance component of 0.91 (95% CI = [0.73, 1.12]). Further, within-person, the previous week's reported TE cost was positively related to the current week's reported TE cost, 0.29 (95% CI = [0.22, 0.36]). The same held true when examining between-person results for TE cost, 0.30 (95% CI = [0.22, 0.38]), with a variance of 0.11 (95% CI = [0.07, 0.16]). This suggests that experienced TE cost has week-to-week carryover effects within- and between-person and varies significantly from student to student. Further, the residual within-person variability in experienced cost was significant across all models, implying that the consistency of experienced cost varies from week to week within- and between-person. Below, I focus on the examined paths in the hypothesized models and report the significant findings.
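The PSR factor used above compares total variation (across chains) to within-chain variation; values close to 1 indicate that the chains agree. The following sketch illustrates this style of check for a single parameter using simulated draws from two chains; it is a generic Gelman-Rubin-style computation under assumed values, not the exact Mplus implementation.

    # Sketch of a two-chain potential scale reduction (PSR) check for one parameter.
    # Posterior draws are simulated; in practice they come from the retained MCMC iterations.
    import numpy as np

    rng = np.random.default_rng(2)
    chains = np.stack([
        rng.normal(0.30, 0.04, 10_000),    # hypothetical retained draws, chain 1
        rng.normal(0.30, 0.04, 10_000),    # hypothetical retained draws, chain 2
    ])

    n = chains.shape[1]
    within = chains.var(axis=1, ddof=1).mean()       # average within-chain variance (W)
    between = n * chains.mean(axis=1).var(ddof=1)    # between-chain variance (B)
    total = (n - 1) / n * within + between / n       # pooled estimate of the total variance
    psr = np.sqrt(total / within)

    print(f"PSR = {psr:.3f}")   # values near 1 suggest the chains have converged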
Table 3.3. Results from dynamic structural equation models. For each of the four models (Model 1: TE; Model 2: OE; Model 3: LV; Model 4: EM), the table reports the within-person (Level 1) fixed effects; the between-person (Level 2) fixed effects, including the effects of anticipated cost, expectancies, and value on experienced cost, on the week-to-week carryover of experienced cost, and on the log residual variance of experienced cost, as well as the effects of these predictors on grade and STEM intentions; the indirect effects of anticipated cost, expectancies, and value on grade and STEM intentions through experienced cost, its carryover, and its log residual variance; and the variance components. Note. Regression coefficients at the within-person level were standardized estimates averaged over clusters (i.e., students); those at the between-person level were unstandardized. The default point estimate of the central tendency of posterior distributions in Mplus is the median. The table shows the posterior standard deviation (SD) and 95% credibility interval (CI) for each parameter estimate. * denotes that the interval excludes zero, suggesting that the parameter differs from zero. TE = Task Effort Cost; OE = Outside Effort Cost; LV = Loss of Valued Alternatives; EM = Emotional Cost; Exp. = Experienced; Expec. = Expectancies; ind. eff. = indirect effect.
Figure 3.2. Trimmed model. Results in the figure are presented in the same order as in Table 3.3: task effort cost, outside effort cost, loss of valued alternatives, emotional cost. * denotes that the interval excludes zero, suggesting that the parameter differs from zero. Note: Indirect effects are not shown. Regression coefficients at the within-person level were standardized estimates averaged over clusters (i.e., students); those at the between-person level were unstandardized.

Associations Between Anticipated Cost and Experienced Cost (RQ1a)

Across all four models, anticipated cost positively predicted experienced cost: TE cost: 0.62 (95% CI = [0.50, 0.74]); OE cost: 0.58 (95% CI = [0.45, 0.70]); LV cost: 0.58 (95% CI = [0.45, 0.70]); EM cost: 0.65 (95% CI = [0.53, 0.76]). This suggests that students who anticipated that there would be high cost involved in the course tended to experience those higher costs. Similarly, across all four models, anticipated cost positively predicted the magnitude of the within-person variability in experienced cost: TE cost: 1.32 (i.e., exp(0.28)); OE cost: 1.52 (i.e., exp(0.42)); LV cost: 1.57 (i.e., exp(0.45)); EM cost: 1.40 (i.e., exp(0.34)). This suggests that students who reported high levels of anticipated cost had greater variability in their experiences of cost than students who reported lower anticipated cost, on average.
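Because the within-person variability is modeled on the log scale (Equation 4), these coefficients translate into multiplicative changes in the residual variance once exponentiated. For the task effort model, for example,

    \exp(\beta_{21}) = \exp(0.28) \approx 1.32,

so a one-unit increase in anticipated TE cost is associated with a within-person residual variance in experienced TE cost that is roughly 32% larger, holding expectancies and value constant.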
Associations Among Anticipated Cost, Course Grades, and STEM Career Intentions (RQ1b)

Controlling for the level of cost experienced during the course, students who reported higher levels of anticipated TE cost, 0.24 (95% CI = [0.09, 0.39]), OE cost, 0.34 (95% CI = [0.17, 0.52]), and LV cost, 0.25 (95% CI = [0.09, 0.42]), had higher grades compared to students with low anticipated TE, OE, and LV cost. Anticipated EM cost was not a significant predictor of course grades. None of the anticipated cost measures were significantly related to STEM career intentions. Given that finding cost to be a positive predictor of grades is inconsistent with past research (Perez et al., 2015), a path model was also examined in which the experienced predictors were dropped. With the exception of anticipated emotional cost, the positive relation between all anticipated costs and grades remained.

Associations Among Experienced Cost, Course Grades, and STEM Career Intentions (RQ2a)

Across all four models of cost, experienced cost negatively predicted course grade, controlling for anticipated cost: TE cost: -0.40 (95% CI = [-0.57, -0.23]); OE cost: -0.47 (95% CI = [-0.70, -0.24]); LV cost: -0.42 (95% CI = [-0.63, -0.22]); EM cost: -0.34 (95% CI = [-0.49, -0.20]), suggesting that students who experienced high levels of cost had a lower final grade in their calculus course compared to students who experienced low levels of cost. There were no significant findings when examining experienced cost as a predictor of STEM career intentions. Further, no other significant effects were found related to the week-to-week carryover effects of cost or the variability of experienced cost.

Experienced Cost as a Mediator (RQ2b)

Indirect effects were found for all four models of cost. Experienced cost mediated the relation between anticipated cost and course grade as follows: TE cost: -0.25 (95% CI = [-0.37, -0.14]); OE cost: -0.27 (95% CI = [-0.43, -0.14]); LV cost: -0.24 (95% CI = [-0.39, -0.12]); EM cost: -0.22 (95% CI = [-0.33, -0.12]). This implies that the association between anticipated cost and grade appears to be partially explained by experienced cost. It is possible that this mediation is an example of an inconsistent mediation model, or a model where a mediated effect has a different sign than other direct or mediated effects (Blalock, 1969; Davis, 1985; MacKinnon et al., 2000, 2007). Because anticipated cost was a positive predictor of experienced cost and experienced cost was a negative predictor of grades, a suppressor effect is observed when examining mediation because of the negative indirect effects (Kline, 2015; Shieh, 2006).

Ancillary Findings

For all four models of cost, students who reported high expectancies experienced lower levels of cost compared to those who reported low expectancies: TE cost: -0.23 (95% CI = [-0.39, -0.06]); OE cost: -0.27 (95% CI = [-0.42, -0.12]); LV cost: -0.34 (95% CI = [-0.50, -0.18]); EM cost: -0.21 (95% CI = [-0.40, -0.03]). That is, students who reported high expectancies experienced lower levels of cost compared to those with lower expectancies. For LV cost, expectancies were also found to negatively predict the variability in experienced cost, -.30 (95% CI = [-0.57, -0.02]), suggesting that those with high expectancies had lower variability in their experiences of LV cost.
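The indirect effects reported for RQ2b correspond, to a close approximation, to the product of the two component paths. Using the task effort model and the estimates reported above as an example,

    0.62 \times (-0.40) \approx -0.25 ,

which matches the reported indirect effect of anticipated TE cost on grade through experienced cost. Because the direct path from anticipated TE cost to grade was positive (0.24) while this indirect path was negative (-0.25), the two nearly cancel, which is the pattern of inconsistent mediation and suppression described above. The same product-of-paths logic underlies the positive indirect effects of expectancies on grade reported next (e.g., for TE cost, (-0.23) \times (-0.40) \approx 0.09).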
Indirect effects were found for all four models of cost when examining experienced cost as a mediator between expectancies and course grade: TE cost: 0.09 (95% CI = [0.02, 0.18]); OE cost: 0.12 (95% CI = [0.05, 0.23]); LV cost: 0.14 (95% CI = [0.06, 0.26]); EM cost: 0.07 (95% CI = [0.01, 0.16]). That is, experienced cost partially explains the association between expectancies and course grades. Similarly, this may be a case of inconsistent mediation and a suppression effect, where the direct effects from expectancies to experienced cost and from experienced cost to course grades were both negative, but the indirect effect was positive. No significant effects were found when examining value as a predictor of experienced TE cost, the week-to-week carryover, or the within-person variability; however, when examining STEM career intentions, in all models of cost, students who reported higher value for their course reported higher intentions to pursue a STEM career than students who reported low value: TE cost: 0.37 (95% CI = [0.11, 0.63]); OE cost: 0.37 (95% CI = [0.09, 0.63]); LV cost: 0.35 (95% CI = [0.09, 0.61]); EM cost: 0.32 (95% CI = [0.01, 0.59]).

Discussion

The purpose of this study was to examine how students' cost beliefs, as both anticipated and experienced, predict course grade in a calculus course and STEM career intentions. Further, this study examined the extent to which experienced costs mediate the effects of one's anticipated cost beliefs on course grade and STEM career intentions. Analyses controlled for expectancies and values, as these factors have been shown to be related to both cost beliefs and the academic outcomes of interest. Eccles (2005) proposed that cost comprises the anticipated effort (i.e., expected costs) and the things one must sacrifice (i.e., actual experienced costs) that are needed to complete a task; however, research has yet to explore how these different time frames of cost relate to one another and whether they have differential effects on student outcomes. The findings of this study suggest that both anticipated and experienced cost beliefs independently impact achievement; however, these constructs were found to predict achievement in different ways. That is, anticipated cost positively predicted achievement, whereas experienced cost negatively predicted achievement. Though results were similar across dimensions of cost, some dimensions of cost were more predictive of course grades than others.

The Nature of Anticipated Cost Beliefs

As discussed, Eccles (2005) considers cost as the anticipated effort and the actual sacrifices one must make. Results from this study suggest that anticipated cost may act as a "double-edged sword," in that students who anticipated that their calculus course would be more costly actually experienced higher costs throughout the course, but also had higher grades, after controlling for expectancies, values, and experienced cost. Perhaps those who anticipate higher costs experience a "self-fulfilling prophecy" in that they are setting themselves up for experiencing higher costs during the course. Conversely, this relationship could also be interpreted as students accurately forecasting how they will experience costs during the course, which may better prepare students for the course, as indicated by higher grades. On the other hand, if a student does not anticipate costs and then experiences high levels of these costs, their grades may suffer.
Students who reported high anticipated cost beliefs also experienced larger variability in their experiences of cost. Taken together, anticipating high costs may carry both positive and negative consequences. It is possible that anticipating high costs allows students to prepare for those future sacrifices by, for example, spending more time studying for an exam. In turn, this may positively impact their course grade. It could also be that students who anticipate high costs tend to experience higher costs throughout the course, and those with higher experienced costs tend to have lower grades. Though it is unclear from this study what students are drawing on to make assumptions about the costs they will experience, it may be that they are drawing on past experiences from other math courses. Eccles (1983) posited that students' affective memories predict components of task value. Perhaps students draw on past affective memories when they anticipate the costs they will experience. Interestingly, anticipated task effort cost and emotional cost were most predictive of experienced cost, but outside effort cost and loss of valued alternatives cost were most predictive of larger variability in experienced costs. Perhaps students struggled to anticipate outside effort and loss of valued alternatives cost and therefore experienced larger swings in the variability for these types of costs. For example, it may be hard to anticipate what other demands will arise that take away from the calculus course (i.e., outside effort) or whether new valued alternatives will come about that must be given up in order to be successful (i.e., loss of valued alternatives). On the other hand, students may be more familiar with the effort needed to be successful in a math course (i.e., task effort cost) and the emotional toll it will take (i.e., emotional cost), due to taking previous math courses throughout high school. Thus, it is possible that students struggle to anticipate things that are out of their control in the future. That is, it is difficult to predict what other demands will become salient, especially as a first-year college student who does not have much experience in this new environment. When considering the effect of anticipated cost on course grades, as mentioned above, students who anticipated higher costs also had higher course grades for TE, OE, and LV cost; however, the cost literature suggests that cost negatively predicts achievement (Conley, 2012; Jiang et al., 2018; Safavian et al., 2013; Trautwein et al., 2012). It is possible that, regarding these dimensions, students anticipated high costs and thus were able to plan to counterbalance these costs by putting in the effort and time needed to perform well in their calculus course. It may be that these students are rigid students who achieve highly and thus overprepare. Anticipated emotional cost, on the other hand, did not significantly predict course grades. Perhaps even if high emotional cost is anticipated, it is more difficult to counterbalance these costs by just putting in more effort, for example. No significant effects were found regarding anticipated cost as a predictor of STEM intentions, which also conflicts with past research showing that negative cost beliefs are associated with lower intentions to pursue a STEM career or take STEM courses (Battle & Wigfield, 2003; Perez et al., 2014).
Mean levels of STEM career intentions were relatively high in this sample (M = 5.77), which may have contributed to ceiling effects. Further, there were no significant correlations between any cost variables and STEM career intentions. This non-significant finding may be an artifact of the sample as well, given that it was made up mostly of students who intend to pursue STEM careers. As evidenced by the MANOVA in the preliminary analysis, students from the applied-STEM course did have significantly higher STEM intentions than students in the basic-STEM course (M basic-STEM = 3.55; M applied-STEM = 5.77); however, the sample was still largely made up of students in the applied-STEM course.

Understanding Experienced Cost

After controlling for anticipated cost, the experience of cost was negatively related to grades and was a stronger predictor than anticipated cost. This was true for all dimensions of experienced cost; recall, however, that anticipated emotional cost was not a predictor of grades. The negative relation for experienced cost contrasts with the results for anticipated cost, but it is more in line with other research on cost (Conley, 2012; Jiang et al., 2018; Safavian et al., 2013; Trautwein et al., 2012). It may be that it is more difficult for students to respond to proximal events that impact their experiences of cost. For example, a student may receive more homework in his other classes in a particular week, making it difficult to put time into his calculus class. Given the findings of this study, it may be that anticipating costs is more adaptive than experiencing costs, if students are able to plan how to respond to those costs in advance; however, experienced cost was a mediator of anticipated cost and grades. Experienced cost partially explains the relation between anticipated cost and grades (i.e., anticipated cost predicts grades through experienced cost). Thus, experienced costs may be the most promising point of intervention. Still, it is important to consider the effects of experienced cost in conjunction with those of anticipated cost (i.e., results must be interpreted with respect to the models in this study). Students with high anticipated costs and low experienced costs may have higher grades, whereas students who anticipate high costs and experience high costs may have lower grades. As researchers have noted the potential benefits of cost interventions (Barron & Hulleman, 2015; Rosenzweig et al., 2020), it may be worth intervening throughout the course, where students experience costs, rather than only when they anticipate them. This could prove to be more adaptive than intervening at one particular time point. It may be advantageous to help students work through their experiences of high costs without suffering adverse academic consequences; however, more research is needed to truly understand the effects of experienced costs and how they vary from week to week in order to help students respond to these costs throughout the semester. Researchers are beginning to develop cost reduction interventions that target only certain dimensions (Rosenzweig et al., 2020); however, different strategies may be necessary for different dimensions of cost. Considering that task effort and outside effort costs have to do with students responding to challenges related to effort in the course of interest as well as in other courses they may be taking, time management strategies may be useful in helping to combat some of the negative associations found for these dimensions.
Regarding emotional cost, strategies for emotion regulation may be helpful throughout the semester, so that students know how to regulate the feelings of excessive anxiety and stress that appear to be contributing to lower achievement. Outside effort and loss of valued alternatives cost were the strongest predictors of grades; thus, it may be best to focus first on how to combat the negative effects of these two types of cost. These two dimensions may also be the toughest to counter, given that it is difficult to remove the outside sources that contribute to outside effort cost and loss of valued alternatives. Qualitative work may shed some light on why students experience costs. Much of the research on cost has explored how much cost students are experiencing, rather than why. This insight could also help to frame future interventions. It is also important to note that there was a positive association between experienced cost from one week to the next (i.e., students who experience high costs one week tend to experience higher costs the next week) and that the residual variances of cost were significant. This suggests that a more in-depth look is needed to understand the fluctuations of cost from week to week, both between- and within-person. Why are cost beliefs generally higher during some weeks of the semester than others? Further, how and why do experiences of cost differ between students during the same week? More research is needed to understand what causes this variability. One possible cause may be the feedback that students receive after exams. For example, a student may experience low emotional cost in the weeks leading up to an exam but, upon learning that he received a poor grade, may begin to experience higher emotional cost. Alternatively, other valued alternatives and demands may arise throughout the semester. A student may join a new club during the semester that takes away time from putting effort into her calculus course (i.e., outside effort cost). More work is needed to understand these shifts in order to make sounder intervention recommendations.

How Anticipated and Experienced Cost Beliefs Inform Expectancy-Value Theory

Eccles (2005) has discussed cost as what an individual sacrifices to engage in a task and the anticipated effort needed to complete the task. This study used a novel approach to examine both anticipated and experienced cost beliefs. The results suggest that both anticipated and experienced cost beliefs are significant predictors of course grades. Because of this, it is important that theorizing about cost distinguish between anticipated and experienced dimensions. Experienced cost was found to be a negative predictor of grades, whereas anticipated cost was a positive predictor; moreover, experienced cost was the stronger predictor of the two. As empirical evidence suggests that anticipated and experienced emotions have differential effects (Schuster & Martiny, 2016; Wilson & Gilbert, 2005), perhaps it is not surprising that anticipated cost and experienced cost were found to predict grades in different ways. Perhaps experienced emotional costs are similar to experienced emotional responses in that they are sensitive to the timing of outcomes, whereas anticipated emotional responses are not (Loewenstein & Lerner, 2003).
Similarly, it is possible that just as experienced emotions can drive students to act in ways that run counter to their future goals, experienced cost may also drive choices that push students away from their long-term goals, which may be why their grades suffer. This may also be related to temporal issues in how students respond to the self-report items. Anticipated beliefs may draw on past experiences or future expectations, whereas experiences are felt in the moment. Further, these temporal differences likely shape how each is associated with distal outcomes such as achievement and STEM career intentions. Asking students to anticipate their future costs may already act as an intervention, which could partially explain why positive relations were found between anticipated cost and grades. This distinction between anticipated and experienced beliefs is critical for researchers examining cost, as the implications may differ when, for example, designing interventions. This study examined four dimensions of cost beliefs and found all of them to be significant predictors of grades. Though one may argue that this is a reason to consider cost as a single dimension, the effects of certain dimensions were larger than others, and it is important to consider the magnitude of effects in this study. For example, experienced outside effort cost was the strongest predictor of grades, whereas experienced emotional cost was the weakest. Further, past work has found dimensions of cost to predict academic outcomes differently as well (Perez et al., 2014; Flake et al., 2015). One finding of this study was that anticipated emotional cost was not a predictor of grades, but experienced emotional cost was. Thus, by combining cost into a single dimension, critical information could be lost when designing future interventions, as discussed above. For example, the intervention strategies used to combat outside effort cost will differ from those used for emotional cost.

Future Directions and Limitations

DSEM is a relatively new analytic technique and, as such, does not have some of the capabilities of other modeling techniques. For example, DSEM is currently limited to two-level models in Mplus (Muthén & Muthén, 2018); thus, I was unable to control for the third level of nesting: course. Still, DSEM provides a novel approach to understanding between- and within-person effects of cost beliefs throughout a semester and how experiences of cost impact academic outcomes. Further, because DSEM is still in development, there are no robust means to compare model fit. Though one can examine the deviance information criterion (DIC), it is not always the most effective way to compare model fit (Asparouhov et al., 2018). Instead, I examined PSR factors, autocorrelation plots, and trace plots (Zhou et al., 2019), which were satisfactory. Still, there is uncertainty surrounding model fit, given the current state of DSEM. Additionally, given concerns about sample size and the number of parameters included in the models, I was unable to control for individual characteristics such as gender, race, and ethnicity. Though more research is needed to understand how cost beliefs function for students from different backgrounds, using DSEM still allowed for novel insights into how cost beliefs function in the classroom throughout the semester. Second, there may be other variables that shift throughout the semester and contribute to how a student experiences cost.
For example, a student may value the calculus course less at the beginning of the semester but, as it progresses, come to value it more. In turn, experiences of cost may shift from high to low. Further, students face competing demands that may vary throughout a semester. For example, one week a student may have to study for an exam in an English literature course that takes away from the time she could be spending on calculus; in other weeks, the student may not face these demands. Thus, experiences of cost likely depend on these competing demands and shifting values throughout a course. Future work is needed to understand how these other variables impact cost over time. Next, the role of anticipated cost beliefs calls for a more in-depth examination in the future. In this study, anticipated cost beliefs positively predicted experienced cost beliefs and grades. Although cost has generally been studied among college students (Flake et al., 2015; Perez et al., 2014), it is likely that students form cost beliefs based on their previous experiences related to math (e.g., high school courses, social perceptions of math). Because of this, more research is needed to understand cost beliefs in K-12 settings, as it is unclear how students form these beliefs. Further, it is possible that students’ self-reports of anticipated cost could be acting as a proxy for their current or immediate past mathematics performance. It is unclear how students form anticipated cost beliefs and whether they rely on past experiences, future assumptions, or both when anticipating costs. Thus, more work is needed to tease this apart and understand how students make these predictions. It is also important to note that this study is limited in its generalizability. Response rates were higher for women than for men; thus, more data were provided by women. The same can be said for students in the basic-STEM class compared to the applied-STEM class, so results may represent these groups disproportionately; however, as mentioned above, DSEM is in relatively early stages of development, making it difficult to control for many of these factors. Therefore, results should be interpreted with caution when attempting to generalize beyond the sample of this study.

Conclusion

The present study aimed to examine how anticipated cost beliefs and experienced costs predict achievement and STEM career intentions using a novel approach (i.e., DSEM). Evidence suggests that anticipated and experienced costs should both be considered in future work, given that both are predictors of course grades. As experienced cost was a strong negative predictor of grades, more work is needed to understand what impacts fluctuations in these experiences of cost, so that future interventions can be designed to combat these negative effects. Although anticipated and experienced costs did not predict STEM career intentions here, they did predict grades, which have been shown to be a precursor of STEM persistence (Gasiewski et al., 2012; Jones et al., 2010; Lawson et al., 2007; Shaw & Barbuti, 2010); thus, the results of this study add to the literature suggesting that cost may be a detractor from STEM persistence. Future work must consider the effects of high costs in STEM classrooms in order to combat STEM attrition rates.

APPENDIX
Table C.1. Study timeline
Surveys were administered during Monday and Wednesday class meetings across Weeks 1-15 of the semester: Pre-Surveys at the start of the semester, End of Class Surveys during the middle weeks, and Post-Surveys in the final weeks.

Table C.2. Survey items

Anticipated perceptions of cost in calculus. Average score for all 19 items and average score for each subscale (4-6 items each) from Flake et al. (2015). Response scale: 7-point scale (1 = Strongly disagree, 7 = Strongly agree).
Task Effort Cost:
1. This class will demand too much of my time.
2. I will have to put too much energy into this class.
3. This class will take up too much of my time.
4. This class will be too much work.
5. This class will require too much effort.
Outside Effort Cost:
1. I have so many other commitments that I won’t be able to put forth the effort needed for this class.
2. Because of all of the other demands on my time, I won’t have enough time for this class.
3. I have so many other responsibilities that I will be unable to put in the effort that is necessary for this class.
4. Because of other things that I do, I won’t have time to put into this class.
Loss of Valued Alternatives:
1. I’ll have to sacrifice too much to be in this class.
2. This class will require me to give up too many other activities I value.
3. Taking this class will cause me to miss out on too many other things I care about.
4. I won’t spend as much time doing the other things that I would like because I am taking this class.
Emotional Cost:
1. I’ll worry too much about this class.
2. This class will be too exhausting.
3. This class will be emotionally draining.
4. This class will be too frustrating.
5. This class will be too stressful.
6. This class will make me feel too anxious.

Expectancies. Average score for all 3 items from Kosovich et al. (2015). Response scale: 7-point scale (1 = Strongly disagree, 7 = Strongly agree).
1. I know I can learn the material in my math class.
2. I believe that I can be successful in my math class.
3. I am confident that I can understand the material in my math class.

Value. Average score for all 3 items from Kosovich et al. (2015). Response scale: 7-point scale (1 = Strongly disagree, 7 = Strongly agree).
1. I think my math class is important.
2. I value my math class.
3. I think my math class is useful.

Intentions to pursue a STEM career. Single item from Estrada et al. (2011). Response scale: 7-point scale (1 = Definitely will not, 7 = Definitely will).
1. To what extent do you intend to pursue a STEM-related career?

Table C.3. Confirmatory factor analysis results

Task Effort Cost: Ant. TE1 = 0.86; Ant. TE2 = 0.77; Ant. TE3 = 0.85; Ant. TE4 = 0.84; Ant. TE5 = 0.83
Outside Effort Cost: Ant. OE1 = 0.82; Ant. OE2 = 0.86; Ant. OE3 = 0.82; Ant. OE4 = 0.81
Loss of Valued Alternatives: Ant. LV1 = 0.80; Ant. LV2 = 0.84; Ant. LV3 = 0.84; Ant. LV4 = 0.64
Emotional Cost: Ant. EM1 = 0.77; Ant. EM2 = 0.84; Ant. EM3 = 0.74; Ant. EM4 = 0.81; Ant. EM5 = 0.84; Ant. EM6 = 0.78
Expectancies: Exp1 = 0.78; Exp2 = 0.83; Exp3 = 0.83
Value: Value1 = 0.87; Value2 = 0.83; Value3 = 0.84
Note: Factor loadings are standardized values. Item numbers correspond to the survey items in Table C.2. Ant. = Anticipated; TE = Task Effort Cost; OE = Outside Effort Cost; LV/LOVA = Loss of Valued Alternatives; EM = Emotional Cost; Exp = Expectancies.
REFERENCES

Ackerman, P., Kanfer, R., & Beier, M. (2013). Trait complex, cognitive ability, and domain knowledge predictors of baccalaureate success, STEM persistence, and gender differences. Journal of Educational Psychology, 105(3), 911-927. doi:10.1037/a0032338

Asparouhov, T., Hamaker, E. L., & Muthén, B. (2018). Dynamic structural equation models. Structural Equation Modeling: A Multidisciplinary Journal, 25(3), 359-388. doi:10.1080/10705511.2017.1406803

Barron, K. E., & Hulleman, C. S. (2015). Expectancy-value-cost model of motivation. In J. S. Eccles & K. Salmela-Aro (Eds.), International encyclopedia of social and behavioral sciences: Motivational psychology (2nd ed.). New York, NY: Elsevier.

Battle, A., & Wigfield, A. (2003). College women’s value orientations toward family, career, and graduate school. Journal of Vocational Behavior, 62(1), 56. doi:10.1016/S0001-8791(02)00037-4

Beymer, P. N., Ferland, M., & Flake, J. K. (under review). Validity evidence for a short scale of college students' perceptions of cost. Manuscript submitted for publication.

Blalock, H. M. (1969). Theory construction: From verbal to mathematical formulations. Englewood Cliffs, NJ: Prentice-Hall.

Bolger, N., & Laurenceau, J. (2013). Intensive longitudinal methods: An introduction to diary and experience sampling research. New York, NY: Guilford Publications.

Brophy, J. (2008). Developing students’ appreciation for what is taught in school. Educational Psychologist, 43(3), 132-141. doi:10.1080/00461520701756511

Chang, M. J., Cerna, O., Han, J., & Saenz, V. B. (2008). The contradictory roles of institutional status in retaining underrepresented minorities in biomedical and behavioral science majors. The Review of Higher Education, 31(4), 433-464. doi:10.1353/rhe.0.0011

Chen, X. (2013). STEM attrition: College students’ paths into and out of STEM fields. National Center for Education Statistics, Institute of Education Sciences (NCES 2014-001). Washington, DC: U.S. Department of Education.

Chiang, E. S., Byrd, S. P., & Molin, A. J. (2011). Children’s perceived cost for exercise: Application of an expectancy-value paradigm. Health Education & Behavior, 38(2), 143-149. doi:10.1177/1090198110376350

Conley, A. M. (2012). Patterns of motivation beliefs: Combining achievement goal and expectancy-value perspectives. Journal of Educational Psychology, 104, 32-47. doi:10.1037/a0026042

Crisp, G., Nora, A., & Taggart, A. (2009). Student characteristics, pre-college, college, and environmental factors as predictors of majoring in and earning a STEM degree: An analysis of students attending a Hispanic serving institution. American Educational Research Journal, 46(4), 924-942. doi:10.3102/0002831209349460

Cromley, J. G., Perez, T., & Kaplan, A. (2016). Undergraduate STEM achievement and retention: Cognitive, motivational, and institutional factors and solutions. Policy Insights from the Behavioral and Brain Sciences, 3(1), 4-11. doi:10.1177/2372732215622648

Csikszentmihalyi, M., & Larson, R. (2014). Validity and reliability of the experience-sampling method. In Flow and the foundations of positive psychology (pp. 35-54). Dordrecht: Springer Netherlands. doi:10.1007/978-94-017-9088-8_3

Daempfle, P. A. (2003). An analysis of the high attrition rates among first year college science, math, and engineering majors. Journal of College Student Retention, 5(1), 37-52. doi:10.2190/DWQT-TYA4-T20W-RCWH

Dai, T., & Cromley, J. G. (2014).
Changes in implicit theories of ability in biology and dropout from STEM majors: A latent growth curve approach. Contemporary Educational Psychology, 39(3), 233-247. doi:10.1016/j.cedpsych.2014.06.003 Davis, J. A. (1985). Logic of causal order. Los Angeles: SAGE Publications Inc. DePass, A. L., & Chubin, D. E. (Eds.). (2009). Understanding interventions that encourage minorities to pursue research careers: Building a community of research and practice. Bethesda, MD: American Society of Cell Biology. Dunn, E.W., Wilson, T.D., & Gilbert, D.T. (2003). Location, location, location: The misprediction of satisfaction in housing lotteries. Personality and Social Psychology Bulletin, 29, 1421-1432. Durik, A. M., Schwartz, J., Schmidt, J. A., & Shumow, L. (2018). Age differences in effects of self-generated utility among black and hispanic adolescents. Journal of Applied Developmental Psychology, 54, 60-68. doi:10.1016/j.appdev.2017.11.004 Eccles, J. S. (2005). Subjective task values and the Eccles et al. model of achievement related choices. In A. J. Elliott & C. S. Dweck (Eds.), Handbook of competence and motivation (pp. 105-121). New York: Guilford. Eccles (Parsons), J. S., Adler, T. F., Futterman, R., Goff, S. B., Kaczala, C. M., Meece, J. L., et al. (1983). Expectancies, values, and academic behaviors. In J. T. Spence (Ed.), Achievement and achievement motives (pp. 75–146). San Francisco, CA: W. H. Freeman. 186 Eccles, J. S., & Wigfield, A. (in press). From expectancy value theory to situated expectancy value theory: A development, social cognitive, and sociocultural perspective on motivation. Contemporary Educational Psychology. Eid, M., Schneider, C., & Schwenkmezger, P. (1999). Do you feel better or worse? the validity of perceived deviations of mood states from mood traits. European Journal of Personality, 13(4), 283-306. Estrada, M., Woodcock, A., Hernandez, P. R., & Schultz, P. W. (2011). Toward a model of social influence that explains minority student integration into the scientific community. Journal of Educational Psychology, 103(1), 206-222. doi:10.1037/a0022809 Estrada-Hollenbeck, M., Aguilar, M., Woodcock, A., Hernandez, P., & Schultz, P. W. (2009, May). The power of doing: How research experience contributes towards minority student integration into the scientific community. Poster session presented at the Third Understanding Interventions That Encourage Minorities to Pursue Research Careers Conference, Washington, D.C. Fayer, S., Lacey, A., & Watson, A. (2017). STEM occupations: past, present, and future. Retrieved from Bureau of Labor Statistics Website: https://www.bls.gov/spotlight/2017/science-technology- engineering-and-mathematics- stem-occupations-past-present-a nd-future/pdf/science-technology-engineering-and- mathematics- stem-occupations-past-present-and-future.pdf. Feldman Barrett, L. (2004). Feelings or words? understanding the content in self-report ratings of experienced emotion. Journal of Personality and Social Psychology, 87(2), 266-281. doi:10.1037/0022-3514.87.2.266 Feldon, D. F., Callan, G., Juth, S., & Jeong, S. (2019). Cognitive load as motivational cost. Educational Psychology Review, 31(2), 319-337. doi:10.1007/s10648-019-09464-6 Flake, J. K., Barron, K. E., Hulleman, C., McCoach, B. D., & Welsh, M. E. (2015). Measuring cost: The forgotten component of expectancy-value theory. Contemporary Educational Psychology, 41, 232-244. doi:10.1016/j.cedpsych.2015.03.002. Gasiewski, J. A., Eagan, M. K., Garcia, G. A., Hurtado, S., & Chang, M. J. (2012). 
From gatekeeping to engagement: A multicontextual, mixed method study of student academic engagement in introductory STEM courses. Research in Higher Education, 53(2), 229- 261. doi:10.1007/s11162-011-9247-y Gaspard, H., Dicke, A., Flunger, B., Schreier, B., Häfner, I., Trautwein, U., & Nagengast, B. (2015). More value through greater differentiation: Gender differences in value beliefs about math. The Journal of Educational Psychology, 107(3), 663-677. doi:10.1037/edu0000003 187 Gaspard, H., Häfner, I., Parrisius, C., Trautwein, U., & Nagengast, B. (2017). Assessing task values in five subjects during secondary school: Measurement structure and mean level differences across grade level, gender, and academic subject. Contemporary Educational Psychology, 48, 67-84. doi:10.1016/j.cedpsych.2016.09.003 Goetz, T., Becker, E. S., Bieg, M., Keller, M. M., Frenzel, A. C., & Hall, N. C. (2015). The class half empty: How emotional exhaustion affects the state-trait discrepancy in self-reports of teaching emotions. PLoS ONE, 10(9), e0137441. Goetz, T., Bieg, M., & Hall, N. C. (2016). Assessing academic emotions via the experience sampling method. In M. Zembylas & P. A. Schutz (Eds.) Methodological advances in research on emotion and education (pp. 245-258). Springer, Cham. doi:10/1007/978-3- 319-29049-2_19 Goetz, T., Bieg, M., Lüdtke, O., Pekrun, R., & Hall, N. C. (2013a). Do girls really experience more anxiety in mathematics? Psychological Science, 24, 2079-2087. Gore, P. A. (2006). Academic self-efficacy as a predictor of college outcomes: Two incremental validity studies. Journal of Career Assessment, 14(1), 92-115. doi:10.1177/1069072705281367 Hall, C., Dickerson, J., Batts, D., Kauffmann, P., & Bosse, M. (2011). Are we missing opportunities to encourage interest in STEM fields? Journal of Technology Education, 23(1), 32-46. doi:10.21061/jte.v23i1.a.4 Hall, N., & Webb, D. (2014). Instructors’ support of student autonomy in an introductory physics course. Physical Review Special Topics - Physics Education Research, 10(2), 020116. doi:10.1103/PhysRevSTPER.10.020116 Hamaker, E. L., Asparouhov, T., Brose, A., Schmiedek, F., & Muthén, B. (2018). At the frontiers of modeling intensive longitudinal data: Dynamic structural equation models for the affective measurements from the COGITO study. Multivariate Behavioral Research, 53(6), 820-841. doi:10.1080/00273171.2018.1446819 Harackiewicz, J. M., Smith, J. L., & Priniski, S. J. (2016). Interest matters: The importance of promoting interest in education. Policy Insights from the Behavioral and Brain Sciences, 3(2), 220–227. https://doi.org/10.1177/2372732216655542. Hektner, J. M., Schmidt, J. A., & Csikszentmihalyi, M. (2007). Experience sampling method: Measuring the quality of everyday life. Thousand Oaks, Calif: Sage Publications. Hernandez, P., Schultz, P., Estrada, M., Woodcock, A., & Chance, R. (2013). Sustaining optimal motivation: A longitudinal analysis of interventions to broaden participation of underrepresented students in STEM. Journal of Educational Psychology, 105(1), 89-107. doi:10.1037/a0029691 188 Hilpert, J. C., & Marchand, G. C. (2018). Complex systems research in educational psychology: Aligning theory and method. Educational Psychologist, 53(3), 185-202. doi:10.1080/00461520.2018.1469411 Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1-55. 
10.1080/10705519909540118 Hurtado, S., Newman, C. B., Tran, M. C., & Chang, M. J. (2010). Improving the rate of success for underrepresented racial minorities in STEM fields: Insights from a national project. New Directions for Institutional Research, 2010(148), 5-15. doi:10.1002/ir.357 Ingersoll, R. M., & May, H. (2012). The magnitude, destinations, and determinants of mathematics and science teacher turnover. Educational Evaluation and Policy Analysis, 34(4), 435-464. doi:10.3102/0162373712454326 Ingersoll, R. M., & Perda, D. (2010). Is the supply of mathematics and science teachers sufficient? American Educational Research Journal, 47(3), 563-594. doi:10.3102/0002831210370711 Ironsmith, M., Marva, J., Harju, B., & Eppler, M. (2003). Motivation and performance in college students enrolled in self-paced versus lecture-format remedial mathematics courses. Journal of Instructional Psychology, 30(4), 276. Jebb, A. T., & Woo, S. E. (2015). A bayesian primer for the organizational sciences: The “Two sources” and an introduction to BugsXLA. Organizational Research Methods, 18(1), 92- 132. doi:10.1177/1094428114553060 Jiang, Y., Rosenzweig, E. Q., & Gaspard, H. (2018). An expectancy-value-cost approach in predicting adolescent students’ academic motivation and achievement. Contemporary Educational Psychology, 54, 139-152. doi:10.1016/j.cedpsych.2018.06.005 Johnson, M. L., & Safavian, N. (2016). What is cost and is it always a bad thing? furthering the discussion concerning college-aged students' perceived costs for their academic studies. Journal of Cognitive Education and Psychology, 15(3), 368-390. doi:10.1891/1945-8959.15.3.368 Jones, M. T., Barlow, A. E. L., & Villarejo, M. (2010). Importance of undergraduate research for minority persistence and achievement in biology. The Journal of Higher Education, 81(1), 82-115. doi:10.1353/jhe.0.0082 Jones, B. D., Paretti, M. C., Hein, S. F., & Knott, T. W. (2010). An analysis of motivation constructs with first-year engineering students: Relationships among expectancies, values, achievement, and career plans. Journal of Engineering Education, 99(4), 319- 336. doi:10.1002/j.2168-9830.2010.tb01066.x 189 Keltner, D., & Lerner, J. S. (2010). Emotion. In D. Gilbert, S. Fiske, & G. Lindsey (Eds.), The handbook of social psychology (5th ed., pp. 312-347). New York, NY: McGraw Hill. Kline, R. B. (2015). Principles and practice of structural equation modeling (Fourth ed.). New York: Guilford Publications Inc. M.U.A. Koch, A. K. (2017, May). Many thousands failed: A wakeup call to history educators. Perspectives on History, 55, 18-19. Retrieved from https://www.historians.org/publications-and-directories/perspectives-on-history/may- 2017/many-thousands-failed-a-wakeup-call-to-history-educators. Koch, A. K., & Rodier, R. (2014). Gateways to Completion guidebook. Brevard, NC: John N. Gardner Institute for Excellence in Undergraduate Education. Kosovich, J. J., Hulleman, C. S., Barron, K. E., & Getty, S. (2015). A practical measure of student motivation: Establishing validity evidence for the expectancy-value-cost scale in middle school. The Journal of Early Adolescence, 35(5-6), 790-816. doi:10.1177/0272431614556890 Lawson, A. E., Banks, D. L., & Logvin, M. (2007). Self-efficacy, reasoning ability, and achievement in college biology. Journal of Research in Science Teaching, 44(5), 706- 724. doi:10.1002/tea.20172 Lent, R. W., Brown, S. D., & Hackett, G. (1994). Toward a unifying social cognitive theory of career and academic interest, choice, and performance. 
Journal of Vocational Behavior, 45(1), 79-122. doi:10.1006/jvbe.1994.1027 Linnenbrink, E. A. (2006). Emotion research in education: Theoretical and methodological perspectives on the integration of affect, motivation, and cognition. Educational Psychology Review, 18(4), 307-314. doi:10.1007/s10648-006-9028-x Loewenstein, G., & Lerner, J. S. (2003). The role of affect in decision making. In R. J. Davidson, K. R., Scherer, & H. H., Goldsmith (Eds.), Handbook of affective science (pp. 619-642). New York, NY: Oxford University Press. Luttrell, V. R., Callen, B. W., Allen, C. S., Wood, M. D., Deeds, D. G., & Richard, D. C. S. (2010). The mathematics value inventory for general education students: Development and initial validation. Educational and Psychological Measurement, 70(1), 142-160. doi:10.1177/0013164409344526 MacKinnon, D. P., Fairchild, A. J., & Fritz, M. S. (2007). Mediation analysis. Annual Review of Psychology, 58(1), 593-614. doi:10.1146/annurev.psych.58.110405.085542 MacKinnon, D. P., Krull, J. L., & Lockwood, C. M. (2000). Equivalence of the mediation, confounding and suppression effect. Prevention Science, 1(4), 173-181. doi:10.1023/A:1026595011371 190 Mattern, K., Radunzel, J., & Westrick, P. (2015). Development of STEM readiness benchmarks to assist educational and career decision making (ACT Research Report 2015-3). Iowa City, IA: ACT, Inc. McNeish, D. (2019). Two-level dynamic structural equation models with small samples. Structural Equation Modeling: A Multidisciplinary Journal. doi:10.1080/10705511.2019.1578657 McNeish, D., & Hamaker, E. L. (2019). A primer on two-level dynamic structural equation models for intensive longitudinal data in mplus. Psychological Methods. doi:10.1037/met0000250 Murayama, K., Goetz, T., Malmberg, L.-E., Pekrun, R., Tanaka, A., & Martin, A. J. (2017). Within-person analysis in educational psychology: Importance and illustrations. British Journal of Educational Psychology Monograph Series II, 12, 71– 87. Muthén, B. (2010). Bayesian analysis in Mplus: A brief introduction. Retrieved from https://www.statmodel. com/download/IntroBayesVersion%203.pdf Muthén, L. K., & Muthén, B.O. (1998-2017). Mplus User’s Guide, Eighth Edition. Los Angeles, CA: Muthén & Muthén. National Science Board. (2018). Science and engineering indicators 2018. NSB-2018-1. Alexandria, VA: National Science Foundation. Retrieved from https://www.nsf.gov/statistics/indicators/. National Science Foundation, Women, Minorities, and Persons with Disabilities in Science and Engineering (Arlington, VA: National Center for Science and Engineering Statistics, 2017). Retrieved from: https://www.nsf.gov/statistics/2017/nsf17310/digest/about-this- report/. Newman, D. A. (2014). Missing data: Five practical guidelines. Organizational Research Methods, 17(4), 372-411. doi:10.1177/1094428114548590 Perez, T., Cromley, J. G., & Kaplan, A. (2014). The role of identity development, values, and costs in college STEM retention. Journal of Educational Psychology, 106, 315-329. doi: 10.1037/a0034027 Perugini, M., & Bagozzi, R. P. (2001). The role of desires and anticipated emotions in goal- directed behaviours: Broadening and deepening the theory of planned behaviour. British Journal of Social Psychology, 40, 79-98. Radunzel, J., Mattern, K., & Crouse, J., & Westrick, P. (2015). Development and validation of a STEM benchmark based on the ACT STEM score. (ACT Technical Brief). Iowa City, IA: ACT. 191 Richard R, van der Pligt, J., de Vries, N. (1996). Anticipated affect and behavioral choice basic. 
Applied Social Psychology, 18, 111-129. Rosenzweig, E. Q., Wigfield, A., & Hulleman, C. S. (2020). More useful or not so bad? examining the effects of utility value and cost reduction interventions in college physics. Journal of Educational Psychology, 112(1), 166-182. doi:10.1037/edu0000370 Safavian, N., & Conley, A. (2016). Expectancy-value beliefs of early-adolescent hispanic and non-hispanic youth. AERA Open, 2. doi:10.1177/2332858416673357 Safavian, N., Conley, A., & Karabenick, S. (2013, April). Examining mathematics cost value among middle school youth. In E. M. Anderman (Chair), Is it worth my time and effort? Exploring students’ conceptions of the cost of learning. Symposium conducted at the annual meeting of the American Educational Research Association, San Francisco, CA. Schmidt, J. A., Shumow, L., & Kackar-Cam, H. Z. (2017). Does mindset intervention predict students’ daily experience in classrooms? A comparison of seventh and ninth graders’ trajectories. Journal of Youth and Adolescence, 46(3), 582-602. doi:10.1007/s10964-016- 0489-z Schuster C, & Martiny, S. E. (2016). Not feeling good in stem: Effects of stereotype activation and anticipated affect on women’s career aspirations. Sex Roles, 76, 40-55. Schweinle, A., Meyer, D. K., & Turner, J. C. (2006). Striking the right balance: Students' motivation and affect in elementary mathematics. The Journal of Educational Research, 99(5), 271-293. doi:10.3200/JOER.99.5.271-294 Seymour, E., & Hewitt, N. M. (1997). Talking about leaving: Why undergraduates leave the sciences. Boulder, CO: Westview Press. Shaw, E. J., & Barbuti, S. (2010). Patterns of persistence in intended college major with a focus on STEM majors. NACADA Journal, 30(2), 19-34. Shieh, G. (2006). Suppression situations in multiple linear regression. Educational and Psychological Measurement, 66(3), 435-447. doi:10.1177/0013164405278584 Stout, J., Dasgupta, N., Hunsinger, M., & McManus, M. (2011). STEMing the tide: Using ingroup experts to inoculate women's self-concept in science, technology, engineering, and mathematics (STEM). Journal of Personality and Social Psychology, 100(2), 255- 270. doi:10.1037/a0021385 Trautwein, U., Marsh, H. W., Nagengast, B., Lüdtke, O., Nagy, G., & Jonkmann, K. (2012). Probing for the multiplicative term in modern expectancy–value theory: A latent interaction modeling study. Journal of Educational Psychology, 104(3), 763-777. doi:10.1037/a0027470 192 van der Pligt, J. & Nanne K., De Vries (1998). Expectancy-Value models of health behaviour: The role of salience and anticipated affect. Psychology & Health, 13, 289-305. U.S. Department of Education, National Center for Education Statistics. (2005). Digest of education statistics. Retrieved from the National Center for Education Statistics web site at http://nces.ed.gov/programs/digest/d05/tables/dt05_009.asp Wigfield, A., & Eccles, J. S. (1992). The development of achievement task values: A theoretical analysis. Developmental Review, 12(3), 265-310. doi:10.1016/0273-2297(92)90011-P Wigfield, A., & Eccles, J. S. (2000). Expectancy–Value theory of achievement motivation. Contemporary Educational Psychology, 25(1), 68-81. doi:10.1006/ceps.1999.1015 Wigfield, A. & Eccles, J. S. (2020). 35 years of research on students’ subjective task values and motivation: A look back and a look forward. In A. J. Elliot (Ed.), Advances in motivation science (Vol. 7, pp. 161-198). Elsevier Inc. Wigfield, A., Tonks, S. M., & Klauda, S. L. (2016). Expectancy-value theory. In K. R. Wentzel & D. B. 
Miele (Eds.), Handbook of motivation at school (2nd ed., pp. 55-74). New York: Routledge.

Wilson, T. D., & Gilbert, D. T. (2003). Affective forecasting. In M. P. Zanna (Ed.), Advances in experimental social psychology (Vol. 35, pp. 345-411). San Diego, CA: Academic Press.

Zare, R. N. (2009). No more pencils, no more books. Journal of Chemical Education, 86(2), 142-144.

Zhou, L., Wang, M., & Zhang, Z. (2019). Intensive longitudinal data analyses with dynamic structural equation modeling. Organizational Research Methods. doi:10.1177/1094428119833164

Zirkel, S., Garcia, J. A., & Murphy, M. C. (2015). Experience-sampling research methods and their potential for education research. Educational Researcher, 44(1), 7-16. doi:10.3102/0013189X14566879

Zusho, A., Pintrich, P., & Coppola, B. (2003). Skill and will: The role of motivation and cognition in the learning of college chemistry. International Journal of Science Education, 25(9), 1081-1094. doi:10.1080/0950069032000052207

CONCLUSION. Dissertation Takeaways

Once a largely ignored construct, cost has seen a surge of research over the past several years (Flake et al., 2015; Perez et al., 2014; Jiang et al., 2018; Wigfield & Eccles, 2020), due to its promise for intervention (Barron & Hulleman, 2015; Rosenzweig et al., 2020). Researchers are still working to understand how to conceptualize and operationalize cost, as well as how it functions in the classroom (Eccles & Wigfield, in press; Wigfield & Eccles, 2020). Broadly, the purpose of this dissertation was to contribute to both theory and practice by providing empirical evidence of how cost operates in the classroom and by contributing to the theoretical discussion of cost. Expectancy-value theory (Eccles et al., 1983) positions cost as a type of task value that has been described as both the anticipated effort and the sacrifices one makes in order to complete a task (Eccles, 2005); however, researchers have generally focused on examining the anticipated nature of cost (Gaspard et al., 2015; Jiang et al., 2018; Perez et al., 2014), despite calls for examinations of the dynamic nature of cost in the classroom using intensive longitudinal methodologies (Feldon et al., 2019). In this dissertation, I examine cost from both an anticipated and an experienced perspective. An overview of findings is presented below.

Paper One Overview of Findings

Paper one of this dissertation focused on gathering validity evidence for a short cost scale to be used in papers two and three. Beginning with the Flake et al. (2015) cost scale, four items were chosen (one item representing each dimension of cost: task effort, outside effort, loss of valued alternatives, and emotional) after compiling multiple sources of validity evidence. This short scale provides an instrument for researchers to use to assess the dynamic nature of cost.

Paper Two

Study two sought to provide conceptual clarity about cost by examining the potential jangle fallacy (Kelley, 1927) between emotional cost and negative emotions. Both anticipated and experienced emotional cost and negative emotions were examined. Results suggested that the similarities and differences between anticipated emotional cost and anticipated negative emotions were nuanced. Anticipated emotional cost showed more similarities with anticipated anger and anticipated confusion than with anticipated boredom.
Experienced emotional cost likewise showed a mixed pattern of results with daily emotions, but it was more similar to daily boredom than to daily anger or daily confusion. Though more work is needed to understand emotional cost, initial results suggest that there is not enough evidence to conclude that the emotional cost dimension is nothing more than negative emotions with a new label.

Paper Three

The final study of this dissertation examined the relations among anticipated cost, experienced cost, mathematics achievement, and STEM career intentions using a novel analytic approach, dynamic structural equation modeling (Asparouhov et al., 2018). Results of this study suggest that those who anticipate high levels of cost tend to have higher experienced cost, more variability in those cost experiences, and higher grades; however, the experience of cost has the opposite effect on grades. Of the four dimensions of cost, outside effort cost (anticipated and experienced) was found to be the strongest predictor of grades, whereas experienced emotional cost was the weakest (anticipated emotional cost was not significant). As cost intervention work has recently come to the fore (Barron & Hulleman, 2015; Rosenzweig et al., 2020), this study provides important theoretical and practical findings about where and on which dimensions of cost it may be best to intervene.

Looking Across Three Papers

Taken together, the three papers in this dissertation contribute methodologically and substantively to the growing understanding of cost. Paper one provided a short cost scale that was necessary for examining experienced cost as a dynamic process in papers two and three. The latter two papers contribute to the conceptualization of cost by examining two separate issues. Across those two papers, results confirmed the importance of considering both anticipated and experienced cost. Anticipated and experienced costs appeared to function differently in both papers, as mentioned above. Importantly, in paper two, anticipated emotional cost was found to be a negative predictor of course grades, whereas in paper three, anticipated emotional cost was not a significant predictor after controlling for other variables, such as experienced emotional cost, expectancies, and values. Omitting relevant predictors can bias estimates and reduce their precision (Maxwell & Delaney, 2004); thus, including important theoretical variables is critical for understanding the effects of cost.

Future Directions

Though this dissertation contributes to understanding cost more clearly, questions remain that could not be addressed in these three studies. Here, I highlight four future directions for cost research.

Defining Cost More Clearly

As discussed throughout this dissertation, more work is needed to conceptualize and operationalize cost so that researchers are better able to compare results across studies (Wigfield & Eccles, 2020). The surge of research on cost has been welcomed (Wigfield & Eccles, 2020); however, because of the growing number of researchers conducting studies on cost, many definitions and measures of cost exist (Flake et al., 2015; Perez et al., 2014; Jiang et al., 2018). Research is needed to more clearly define what cost is and what cost is not. Using focus groups and interviews is one way to better understand the costs that students experience (Flake et al., 2015; Johnson & Safavian, 2016). Then, researchers can begin validating measures to support unique dimensions of cost for empirical use.
Interviews and focus groups can also aid in clarifying which dimensions of cost warrant attention. Because many dimensions of cost have been proposed, understanding those dimensions is an important step in understanding cost more clearly.

Understanding Dimensions of Cost

Eccles et al. (1983) originally proposed three cost dimensions: task effort cost, loss of valued alternatives, and psychological cost. However, other dimensions have been proposed. Wigfield and Eccles (2000) later referred to psychological cost as emotional cost; however, other researchers have suggested that psychological and emotional cost are different dimensions (Perez et al., 2019). Researchers have also discussed dimensions such as opportunity cost (Perez et al., 2019), ego cost (Jiang et al., 2018), outside effort cost (Flake et al., 2015), social cost, and financial cost (Johnson & Safavian, 2016). This is not to say that these costs do not exist, only that research is needed to conceptualize and operationalize these dimensions and understand how they are related to academic outcomes. With so many dimensions, it is possible that researchers will conflate one dimension of cost with another, or one dimension of cost with a construct that already exists (i.e., the jingle-jangle fallacy; Kelley, 1927). Still, research has shown that dimensions of cost predict academic outcomes differently (Perez et al., 2014). Cost has been discussed as a promising construct to intervene on (Barron & Hulleman, 2015); however, in order for researchers to make sound recommendations to classroom educators, more research is needed to understand how unique dimensions are related to academic outcomes. An intervention for a student with high task effort cost may look much different from one for a student with high emotional cost. For example, a student with high task effort cost may fare better with an intervention focused on time management strategies, whereas a student with high emotional cost may need guidance in learning emotional regulation strategies. Before intervention work takes place in the classroom, researchers need to carefully consider how each dimension functions and what strategies should be used to reduce high cost beliefs. Throughout this dissertation, I chose to use variable-centered approaches for analysis; however, person-oriented approaches, such as latent profile analysis, also provide opportunities to better understand how dimensions of cost interact with each other, as well as with other theoretically relevant constructs. These person-oriented approaches can help to identify naturally occurring constellations of variables and how they can be used to predict academic outcomes (Bergman & Trost, 2006; Laursen & Hoff, 2006; Magnusson, 2003). Researchers have already begun to examine cost using person-oriented approaches and have found different profiles of cost, expectancies, and values (Dietrich et al., 2019; Perez et al., 2019). Whereas these papers have focused on profiles of cost, expectancies, and value, future work should begin to examine profiles of different cost dimensions, as this may shed light on how these dimensions interact.

A Larger Focus on Anticipated and Experienced Cost Beliefs

Future research is also needed to understand more of the nuances of anticipated and experienced cost. This is necessary for a number of reasons. First, if cost interventions are to be most effective, researchers need to understand when best to intervene.
Though this paper showed some positive outcomes associated with anticipated costs, this is the first study examining cost using an anticipated and experienced framework. Thus, more work is needed to understand how these processes function differently. Second, as researchers have called for an examination of complex systems (Hilpert & Marchand, 2018), and of cost specifically (Feldon et al., 2019), research is necessary to understand the short-term changes associated with experienced cost. Within- and between-person effects of cost should be examined using intensive longitudinal methodologies (Murayama et al., 2017); however, more work is needed to understand how outside courses and activities impact students’ cost beliefs. Students do not take just one course throughout the semester; they take multiple courses. They also have activities outside of school that take time, such as jobs, family obligations, and other relationships. Thus, students’ experiences of cost throughout a course likely depend on these shifting values and competing demands that may change throughout the semester. Researchers should focus on these outside factors that may impact cost beliefs.

Why Do Some Students Experience High Costs and Others Do Not?

Finally, as results from paper one showed, students from different backgrounds may experience costs differently. Though researchers have focused on how much cost a student experiences (Flake et al., 2015), they have yet to focus on why students experience costs to the extent that they do. In order to answer this question, qualitative methodologies, such as interviews, are needed to complement quantitative methodologies. By interviewing students, researchers can begin to understand the fluctuations in students’ experiences of cost over time. This could further aid researchers in understanding why women and students of color in STEM fields tend to experience higher costs than White males (Gaspard et al., 2017). Studying individual cases of students can provide a unique understanding of how cost beliefs function for students from different backgrounds (Zusho & Kumar, 2018).

Concluding Thoughts

As research on cost continues to grow, it is important that researchers ask how they can best advance theory and practice. Education researchers often seek to build and refine theory; however, it is important to keep an eye toward helping students and teachers by making sound recommendations for classroom practice. This dissertation is a first step in understanding cost more thoroughly, in hopes of doing just that.

REFERENCES

Asparouhov, T., Hamaker, E. L., & Muthén, B. (2018). Dynamic structural equation models. Structural Equation Modeling: A Multidisciplinary Journal, 25(3), 359-388. doi:10.1080/10705511.2017.1406803

Barron, K. E., & Hulleman, C. S. (2015). Expectancy-value-cost model of motivation. In J. S. Eccles & K. Salmela-Aro (Eds.), International encyclopedia of social and behavioral sciences: Motivational psychology (2nd ed.). New York, NY: Elsevier.

Bergman, L. R., & Trost, K. (2006). The person-oriented versus the variable-oriented approach: Are they complementary, opposites, or exploring different worlds. Merrill-Palmer Quarterly, 52, 601-632. https://doi.org/10.1353/mpq.2006.0023

Dietrich, J., Moeller, J., Guo, J., Viljaranta, J., & Kracke, B. (2019). In-the-moment profiles of expectancies, task values, and costs. Frontiers in Psychology, 10, 1662. doi:10.3389/fpsyg.2019.01662

Eccles, J. S. (2005).
Subjective task values and the Eccles et al. model of achievement related choices. In A. J. Elliott & C. S. Dweck (Eds.), Handbook of competence and motivation (pp. 105-121). New York: Guilford. Eccles (Parsons), J. S., Adler, T. F., Futterman, R., Goff, S. B., Kaczala, C. M., Meece, J. L., et al. (1983). Expectancies, values, and academic behaviors. In J. T. Spence (Ed.), Achievement and achievement motives (pp. 75–146). San Francisco, CA: W. H. Freeman. Eccles, J. S., & Wigfield, A. (in press). From expectancy value theory to situated expectancy value theory: A development, social cognitive, and sociocultural perspective on motivation. Contemporary Educational Psychology. Feldon, D. F., Callan, G., Juth, S., & Jeong, S. (2019). Cognitive load as motivational cost. Educational Psychology Review, 31(2), 319-337. doi:10.1007/s10648-019-09464-6 Flake, J. K., Barron, K. E., Hulleman, C., McCoach, B. D., & Welsh, M. E. (2015). Measuring cost: The forgotten component of expectancy-value theory. Contemporary Educational Psychology, 41, 232-244. doi:10.1016/j.cedpsych.2015.03.002. Gaspard, H., Dicke, A., Flunger, B., Schreier, B., Häfner, I., Trautwein, U., & Nagengast, B. (2015). More value through greater differentiation: Gender differences in value beliefs about math. The Journal of Educational Psychology, 107(3), 663-677. doi:10.1037/edu0000003 202 Gaspard, H., Häfner, I., Parrisius, C., Trautwein, U., & Nagengast, B. (2017). Assessing task values in five subjects during secondary school: Measurement structure and mean level differences across grade level, gender, and academic subject. Contemporary Educational Psychology, 48, 67-84. doi:10.1016/j.cedpsych.2016.09.003 Hilpert, J. C., & Marchand, G. C. (2018). Complex systems research in educational psychology: Aligning theory and method. Educational Psychologist, 53(3), 185-202. doi:10.1080/00461520.2018.1469411 Jiang, Y., Rosenzweig, E. Q., & Gaspard, H. (2018). An expectancy-value-cost approach in predicting adolescent students’ academic motivation and achievement. Contemporary Educational Psychology, 54, 139-152. doi:10.1016/j.cedpsych.2018.06.005 Johnson, M. L., & Safavian, N. (2016). What is cost and is it always a bad thing? furthering the discussion concerning college-aged students' perceived costs for their academic studies. Journal of Cognitive Education and Psychology, 15(3), 368-390. doi:10.1891/1945-8959.15.3.368 Kelley, T. L. (1927). Interpretation of educational measurements. Oxford, England: World Book Co. Laursen, B., & Hoff, E. (2006). Person-centered and variable-centered approaches to longitudinal data. Merrill-Palmer Quarterly, 32, 377–389. https://doi.org/10.1353/mpq.2006.0029 Magnusson, D. (2003). The person approach: Concepts, measurement models, and research strategy. New Directions for Child and Adolescent Development, 101, 3–23. https://doi.org/10.1002/cd.79 Maxwell, S. E., & Delaney, H. D. (2004). Designing experiments and analyzing data: A model comparison perspective (2nd ed.). New York, NY: Taylor and Francis Group. Murayama, K., Goetz, T., Malmberg, L. E., Pekrun, R., Tanaka, A., & Martin, A. J. (2017). Within-person analysis in educational psychology: Importance and illustrations. British Journal of Educational Psychology Monograph Series II, 12, 71– 87. Perez, T., Cromley, J. G., & Kaplan, A. (2014). The role of identity development, values, and costs in college STEM retention. Journal of Educational Psychology, 106, 315-329. doi: 10.1037/a0034027 Perez, T., Wormington, S. V., Barger, M. M., Schwartz‐Bloom, R. 
D., Lee, Y., & Linnenbrink-Garcia, L. (2019). Science expectancy, value, and cost profiles and their proximal and distal relations to undergraduate science, technology, engineering, and math persistence. Science Education, 103(2), 264-286. doi:10.1002/sce.21490

Rosenzweig, E. Q., Wigfield, A., & Hulleman, C. S. (2020). More useful or not so bad? Examining the effects of utility value and cost reduction interventions in college physics. Journal of Educational Psychology, 112(1), 166-182. doi:10.1037/edu0000370

Wigfield, A., & Eccles, J. S. (2000). Expectancy-value theory of achievement motivation. Contemporary Educational Psychology, 25(1), 68-81. doi:10.1006/ceps.1999.1015

Wigfield, A., & Eccles, J. S. (2020). 35 years of research on students’ subjective task values and motivation: A look back and a look forward. In A. J. Elliot (Ed.), Advances in motivation science (Vol. 7, pp. 161-198). Elsevier Inc.

Zusho, A., & Kumar, R. (2018). Introduction to the special issue: Critical reflections and future directions in the study of race, ethnicity, and motivation. Educational Psychologist, 53(2), 61-63. doi:10.1080/00461520.2018.1432362