3 ESSAYS ON THE ECONOMICS OF HIGHER EDUCATION

By Joshua Brownstein

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Economics – Doctor of Philosophy

2023

ABSTRACT

Chapter 1 is The Effect of Honors College Participation on Student Outcomes. Honors education refers to programs for high-achieving students at U.S. post-secondary institutions. These programs provide high-achieving students with benefits such as the ability to enroll in exclusive courses with small class sizes, to live in special dorms, and to enroll in classes earlier than non-honors students. These changes to a student's college experience may change their academic outcomes in ways that concern students and policymakers. Results in most prior research on the effect of honors program participation on academic outcomes may be biased by unobserved differences between students who are and are not in an honors program. This paper addresses these unobserved differences by studying an honors college that uses GPA admissions cutoffs. The Michigan State University Honors College admits all students in the top 10% of the freshmen fall semester GPA distribution of each non-honors college. I use a regression discontinuity research design to compare outcomes of students above and below the cutoffs, and attribute differences in outcomes to differences in honors college participation. I find that participation in the honors college reduces the time for students to get their first degree and increases the probability that first-generation college students will graduate from MSU. However, the honors college has an insignificant effect on most outcomes for most groups I check, so the few significant findings may be due to random chance given the large number of significance tests.

Chapter 2 is How Low-Income Expectations Affect Student Loan Repayment Plan Choice: Survey Evidence from College Seniors. Income-driven repayment plans lower required payments for student loan borrowers when their income decreases. This helps to reduce student loan defaults. Despite universal availability, only a minority of student loan borrowers in the U.S. are in an income-driven repayment plan. In this study, I test whether a student's choice of repayment plan is related to their expectations of earning a low income. Using an information experiment in a web survey, I create two groups of college seniors with an exogenous difference in low-income expectations. I find that respondents who see the major-specific income information believe, on average, that they have a higher probability of earning a low income. However, those respondents are not any more likely to choose the income-driven repayment plan. I conclude that students' repayment plan preferences are not strongly related to their expectations of earning a low income. This may be because students care about things other than minimizing monthly payments when choosing a repayment plan.

Chapter 3 is The Effect of Test Score Performance Labels on Postsecondary Educational Outcomes: Evidence from Michigan. Standardized test scores and the labels associated with those scores provide students and their parents with highly credible information about a student's academic achievement. This information could cause students and their parents to change their beliefs regarding a student's academic ability. This may in turn change the student's future educational choices and thus their future educational outcomes.
In this chapter I use administrative data on Michigan students to examine how the labels that summarize a student's performance on standardized tests affect that student's postsecondary educational outcomes. I use a regression discontinuity research design to compare students who have similar test scores but who receive different summary labels. While some of my estimates are significant, almost all lack robustness to using another bandwidth, and I am likely to find some spurious effects given the large number of estimates in this chapter. I conclude that I do not find evidence of a large effect of performance labels on postsecondary outcomes.

ACKNOWLEDGEMENTS

Thank you to Scott Imberman, Ajin Lee, Leslie Papke, and Kris Renn for providing me feedback on these projects as members of my dissertation committee. Thank you to my parents Janet Brownstein and Stephen Brownstein for their love and support while I was working on this dissertation. Thank you to the economics doctoral students at Michigan State University for feedback on these projects at graduate student seminars.

Thank you to Justin Micomonaco for helping me with data requests for Chapter 1 and for helping me to learn about the MSU Honors College. Thank you to the MSU Office of the Registrar for providing me with the data for Chapter 1 and for sending out the emails with links to my survey that I discuss in Chapter 2. Thank you to Cody Orr for providing me code in Qualtrics for creating the income expectations questions in Chapter 2. Thank you to the National Science Foundation and the Michigan State University College of Social Science for funding the incentives for students to take the survey discussed in Chapter 2. The material in Chapter 2 is based upon work supported by the National Science Foundation under Grant No. 2049358. Any opinions, findings, and conclusions or recommendations expressed in that chapter are those of the author and do not necessarily reflect the views of the National Science Foundation. Thank you to Patria Wilson for helping me to fill out the National Science Foundation grant related to Chapter 2. Thank you to Belen Freight for helping me use the National Science Foundation grant to incentivize students to take the survey discussed in Chapter 2.

Thank you to the Michigan Education Data Center for providing me with the data for Chapter 3. Thank you to Michigan Education Data Center employees Jasmina Camo-Biogradlija and Kyle Kwaiser for their repeated communications with me related to applying for and getting access to data for Chapter 3. Chapter 3 uses data structured and maintained by the MERI-Michigan Education Data Center (MEDC). MEDC data are modified for analysis purposes using rules governed by MEDC and are not identical to those data collected and maintained by the Michigan Department of Education (MDE) and/or Michigan's Center for Educational Performance and Information (CEPI).
Results, information, and opinions solely represent the analysis, information, and opinions of the author and are not endorsed by, or reflect the views or positions of, grantors, MDE and CEPI or any employee thereof.

TABLE OF CONTENTS

CHAPTER 1: THE EFFECT OF HONORS COLLEGE PARTICIPATION ON STUDENT OUTCOMES
CHAPTER 2: HOW LOW-INCOME EXPECTATIONS AFFECT STUDENT LOAN REPAYMENT PLAN CHOICE: SURVEY EVIDENCE FROM COLLEGE SENIORS
CHAPTER 3: THE EFFECT OF TEST SCORE PERFORMANCE LABELS ON POSTSECONDARY OUTCOMES: EVIDENCE FROM MICHIGAN
BIBLIOGRAPHY
APPENDIX A: CHAPTER 1 APPENDIX
APPENDIX B: CHAPTER 2 APPENDIX
APPENDIX C: CHAPTER 3 APPENDIX

CHAPTER 1: THE EFFECT OF HONORS COLLEGE PARTICIPATION ON STUDENT OUTCOMES

1.1 Introduction and Motivation

Honors education refers to special programs that colleges and universities in the United States (U.S.) provide to high-achieving students. Colleges have these programs to improve the educational experience of high-achieving students and to incentivize high-achieving students to attend their college1. In 2016 there were at least 1,035 honors colleges and honors programs in the U.S.2 (Scott, Smith, and Cognard-Black 2017). While the specifics of the programs vary widely, common program elements include having honors courses3, having honors housing, and requiring students to complete a thesis (Scott, Smith, and Cognard-Black 2017). These patterns are similar to the patterns I found when looking at honors programs in national universities with a similar ranking to the university whose program I study4.

In this paper I study how a student's participation in an honors program changes their academic outcomes. While honors programs have aspects which have been shown to improve student outcomes, research on K-12 programs for high-achieving students has shown mixed results. One reason an honors student might do better academically than a non-honors student is that they are in classes with fewer students. A key feature of honors programs is to allow students access to exclusive classes with small class sizes. Quasi-experimental research in higher education settings has found smaller class sizes to improve students' ratings of courses (Monks and Schmidt 2011; Sapelli and Illanes 2016).

1 Large universities often advertise their honors programs as making a student's experience more like that of a small liberal arts college. This seems to be done to incentivize academically gifted students who want to attend a small liberal arts college to attend a large university instead. To the extent that going to a small liberal arts college causes students to have different academic outcomes, replicating those features in an honors program may cause the program to impact academic outcomes in a similar way. For an example of an honors college that advertises itself as having a "small-college atmosphere" see https://honorscollege.msu.edu/about/index.html.
2 In 2016 honors education was offered at an estimated 59% of U.S. public and non-profit undergraduate post-secondary institutions, 42% of two-year public and non-profit U.S. post-secondary institutions, and 68% of 4-year post-secondary institutions. 59% of both public and private non-profit post-secondary institutions offered honors education in 2016 (Scott and Smith 2016).

3 At MSU, compared to non-honors courses, honors courses are limited to honors students, have smaller class sizes, cover more material, cover material at a faster pace, and have more classroom interaction. See https://honorscollege.msu.edu/admissions/honors-experiences.html. Honors courses at other universities likely have similar features such as having small class sizes.

4 See Appendix A for a summary of these findings. In this paper I study the Michigan State University (MSU) Honors College. One of the findings is that, similar to the MSU Honors College, 20 of 50 honors programs that I looked at offered priority registration for honors students. This means that honors students can register for classes earlier than non-honors students.

Another reason honors students might do better academically than non-honors students is that they have higher ability peers. Prior research has found that in some cases being in post-secondary settings with higher ability peers improves a student's GPA (Carrell, Fullerton, and West 2009; Brady, Insler, and Rahman 2017)5. Peers also impact a variety of other outcomes for college students, such as whether they smoke, how much they binge drink, and whether they support affirmative action (Sacerdote 2011).

Like post-secondary honors education, gifted and talented programs in primary and secondary schools allow high-achieving students to take classes that go through advanced material with other high-achieving students. Studies have found positive effects on grades (Booij, Haan, and Plug 2017), on reading and math achievement (Card and Giuliano 2014), and on high school graduation and college enrollment (Cohodes 2020) for students in gifted and talented education at the K-12 level. However, other research finds no effect (Bui, Craig, and Imberman 2014; Abdulkadiroğlu, Angrist, and Pathak 2014) or a mix of positive, negative, and insignificant effects (Barrow, Sartain, and De La Torre 2020)6. This discrepancy between the positive outcomes for smaller classes and better peers and the mixed outcomes of K-12 programs makes it unclear what the effect of honors programs will be. This motivates me to study the effect of honors programs on student outcomes.

Another motivation for this study is that most other research on this topic is not able to credibly control for unobservable differences between honors and non-honors students. Most other studies compare honors and non-honors students based on the assumption that students select into honors programs based on observable characteristics like grades7. This assumption is likely wrong and leads to biased results because students who select into joining honors programs are probably different on unobservable characteristics such as organizational skills and motivation. These differences would lead honors students to have better outcomes even if honors programs did not change their college experience. In this paper I study the effect of honors college participation on academic outcomes while controlling for selection on unobservable factors.
I do this by studying the effect of participating in the MSU Honors College. The MSU Honors College considers for admission freshmen whose GPA is high relative to other freshmen students with similar majors. It does this by admitting first-year students whose cumulative GPA at the end of their first fall semester is above the cumulative GPAs of at least 90% of other freshmen in their non-honors college. This policy allows me to use a fuzzy regression discontinuity research design to compare individuals above and below the GPA cutoffs and to attribute discontinuities in outcomes at the cutoffs to a discontinuous increase in the proportion of honors students at the cutoffs. Because students cannot precisely control their GPA, being just above or just below a cutoff is as good as random. This allows me to address omitted variable bias by comparing honors students to non-honors students who are similar on unobservable characteristics like organizational skills and motivation.

5 Other studies have peer effect findings consistent with little or no effect of peer ability on high ability students (Carrell, Sacerdote, and West 2013; Booij, Leuven, and Oosterbeek 2017).

6 Barrow, Sartain, and De La Torre (2020) study the effect of being above cutoffs to get into selective high schools in Chicago. Their findings include no effect on ACT scores, a negative effect on GPA, especially for students from low-SES neighborhoods, and positive effects on student perceptions of personal safety and peer relationships.

7 Cosgrove (2004), Hartleroad (2005), Rinn (2007), Slavin, Coladarci, and Pratt (2008), Patton, Coleman, and Kay (2019), and Smeaton and Walsh (2019) estimate the effect of honors college participation on student outcomes by comparing honors students to non-honors students with high GPAs. This assumes that, for students with similar GPAs, aside from differences in a student's college experience caused by the honors program, there are no other differences between honors and non-honors students that cause their outcomes to be different.

Looking at all students in my sample who are close to the cutoffs, I do not find evidence of large effects on student outcomes from honors college participation. In some specifications I find that honors college participation reduces time to degree. While the effect is especially large for male students, I check nine outcomes, so some significant estimates are expected by chance, and the finding for all students near the cutoff is not statistically significant without covariates in the regression or when using a doughnut sample. In heterogeneity analysis I find that honors college participation increases the probability that first-generation college students graduate from MSU. This finding is consistent with marginally significant effects on the total number of credits completed for first-generation college students. However, the coefficients have large standard errors because of the low number of high-GPA first-generation students in my sample, and the results are not statistically significant when I use a bandwidth of 0.10 grade points.

To better understand the MSU Honors College I interviewed 10 honors students and 3 honors college advisors. These interviews help me better understand what it is like to be an honors student at MSU and how being an honors student might change student outcomes.
Some things I learned from the interviews are that, rather than taking honors classes, honors students mostly take regular classes and do additional projects; that the honors general education requirements are fulfilled by completing courses in specific disciplines; that honors students value being able to register for classes first; and that, unlike some of their non-honors peers, honors students did not have problems enrolling in the classes they wanted.

1.2 Literature Review

Many studies attempt to measure the causal effect of honors college participation on a student's academic outcomes by comparing honors students to observably similar non-honors students8. Most papers study programs at large 4-year public colleges (Cosgrove 2004; Hartleroad 2005; Rinn 2007; Slavin, Coladarci, and Pratt 2008; Keller and Lacy 2013; Furtwengler 2015; Brown, Winburn, and Sullivan-Gonzalez 2019; Diaz, Farruggia, Wellman, and Bottoms 2019; Lishinski and Micomonaco 2020). Other papers study smaller 4-year public colleges (Patton, Coleman, and Kay 2019; Smeaton and Walsh 2019) and community colleges (Honeycutt 2019). These studies look at differences in average outcomes between honors students and high-ability non-honors students (Cosgrove 2004; Hartleroad 2005; Rinn 2007; Slavin, Coladarci, and Pratt 2008; Patton, Coleman, and Kay 2019; Smeaton and Walsh 2019), use matching methods (Shushok 2006; Keller and Lacy 2013; Furtwengler 2015; Brown, Winburn, and Sullivan-Gonzalez 2019; Honeycutt 2019; Lishinski and Micomonaco 2020), and use hierarchical models (Diaz, Farruggia, Wellman, and Bottoms 2019). They find that honors college participation is associated with a student having: a higher GPA (Cosgrove 2004; Hartleroad 2005; Shushok 2006⁹; Rinn 2007; Furtwengler 2015; Brown, Winburn, and Sullivan-Gonzalez 2019; Diaz, Farruggia, Wellman, and Bottoms 2019; Honeycutt 2019; Lishinski and Micomonaco 2020), a higher retention rate (Shushok 2006¹⁰; Slavin, Coladarci, and Pratt 2008; Keller and Lacy 2013; Brown, Winburn, and Sullivan-Gonzalez 2019; Diaz, Farruggia, Wellman, and Bottoms 2019; Patton, Coleman, and Kay 2019; Smeaton and Walsh 2019), a higher graduation rate (Cosgrove 2004; Slavin, Coladarci, and Pratt 2008; Keller and Lacy 2013; Diaz, Farruggia, Wellman, and Bottoms 2019; Honeycutt 2019; Patton, Coleman, and Kay 2019; Lishinski and Micomonaco 2020), a longer time to graduate (Cosgrove 2004), more credits earned (Diaz, Farruggia, Wellman, and Bottoms 2019), and more credits for upper-level courses (Lishinski and Micomonaco 2020)11.

8 See Rinn and Plucker (2017) for a literature review of papers published from 2002 to 2017 on the effects of honors programs on student outcomes. Some papers in the review are referenced later in the paragraph.

9 Shushok (2006) found that honors students' GPAs are statistically significantly higher than the GPAs of matched non-honors students at the end of freshman year. The difference in GPAs was not statistically significant for the GPAs Shushok collected 3 years later.

10 Shushok (2006) finds that first-year retention rates for honors students are statistically significantly higher than first-year retention rates for matched non-honors students at the end of freshman year. The difference in retention rates is not statistically significant for students 3 years later. This may simply be due to the study's small sample size, as only 9 honors students and 15 non-honors students left the college during the period being analyzed.
11 There are also papers which associate honors college participation with variables I do not study, such as higher academic self-concept (Rinn 2007), increased interaction with faculty members (Shushok 2006), students taking classes with better teaching practices (Seifert, Pascarella, Colangelo, and Assouline 2007; Miller and Dumford 2018), and getting a higher standardized exam score (Seifert, Pascarella, Colangelo, and Assouline 2007).

There is one recent study on the effect of honors college participation on academic outcomes that uses a methodology that can credibly control for selection on both observable and unobservable characteristics. Pugatch and Thompson (2022) study the Oregon State University honors college. They use a regression kink research design based on the change in slope of the probability of honors college admission as a function of a student's honors college application score. Using student-course level data, they find that, looking at all students with application scores near the kink, honors college participation increases course GPA. However, they also find that honors college participation decreases course GPA for first-generation college students. Like this study, the researchers also use student level data to look at other academic outcomes. They look at the effect of honors college participation on overall grades, non-honors grades, overall number of credit hours, non-honors credit hours, ever graduating, graduating in less than 4, 5, and 6 years, and graduating in science or engineering. They do not find a significant impact on students' overall GPA. However, their point estimate is positive and of a similar magnitude to their course level data estimate. They find significant negative effects on the number of non-honors credits and graduating in less than 6 years. The authors dismiss the latter finding partly because 99% of students in their data graduate within 6 years. Their point estimate on the probability of ever graduating is large and negative at 7.7 percentage points but is not statistically significant.

This study complements Pugatch and Thompson's study in several ways. One is by producing a credible causal estimate of the effect of honors college participation at a different university. Another is that Pugatch and Thompson study students who were admitted to an honors program while they were in high school, while I study students who were admitted when they were already in college. Further, I study a variety of outcomes that Pugatch and Thompson do not. These outcomes include the number of minors, time to degree, and credits in upper-level courses. Finally, due to a larger sample size, I can provide more precise estimates for the student level outcomes both studies look at.

The admissions policy of the MSU Honors College allows me to study the effect of an honors program on academic outcomes with a fuzzy regression discontinuity research design (RDD). This research design is considered to have high internal validity because, absent manipulation of the running variable, being on either side of the cutoff is as good as random (Lee and Lemieux 2011). In other words, the RDD is less subject to potential omitted variable bias than other studies that rely on a selection-on-observables assumption. Studies which compare differences in outcomes between honors students and high-ability non-honors students may not be able to control for differences in other observable factors between these students.
Studies that use matching techniques can account for observable factors that affect student outcomes but may not completely control for unmeasured factors such as a student's level of ambition or how much a student cares about their college education.

One downside of an RDD is that estimates only apply to units near the cutoff who are treated because they are above the cutoff. In this study I estimate the effect of participating in the MSU Honors College for students who do not join the honors college when they are in high school, whose freshman GPA is near a GPA cutoff, and who would join the honors college if their GPA were above a GPA cutoff. The effect of honors college participation for students admitted into the Honors College when they are in high school or for students with average GPAs may be significantly different from my estimates. This methodology allows me to provide information about what might happen to student outcomes if the GPA cutoffs were lowered and more students were invited to join the MSU Honors College.

1.3 Institutional Background: MSU and the MSU Honors College

MSU is a large 4-year public university located in East Lansing, Michigan. 83% of students who applied to the university in Fall 2021 were admitted. In Fall 2020, 38,491 undergraduate students were enrolled in the university. These students were 90% full time and 68% white, and 80% of them were from the state of Michigan12.

The MSU Honors College invites first-year students with high GPAs13 to join the college. MSU is organized into 17 different non-honors colleges. These colleges represent specific categories of study such as business, communication arts and sciences, and education. Freshmen students are assigned to colleges based on their expected majors. The MSU Honors College invites to join the college all freshmen who are in the top 10% of each non-honors college's freshmen GPA distribution at the end of their first fall semester14. Transfer students can also be invited into the honors college this way if they transfer to MSU as first-year students15. There are no additional fees for being in the college and there are no punishments if a student starts out in the college and leaves it later. A large minority of students invited into the college this way do not accept their invitation16.

12 https://nces.ed.gov/collegenavigator/?q=Michigan+State+University&s=all&id=171100 The years were chosen based on the data available on the above website.

13 GPA stands for grade point average. Each course grade at MSU is assigned one of the following scores: 0, 1, 1.5, 2, 2.5, 3, 3.5, or 4. The better a student does in a class, the higher their course grade. Each class is a certain number of credits depending on how many hours the class meets each week. To calculate GPA, you first multiply a student's course grades by the number of credits in their classes to get the number of grade points they earned in each class. You then sum the grade points the student earned and divide by the number of credits the student took at MSU. While GPAs are generally determined using grades on assignments and exams, some students may be able to change their GPA by requesting that a professor raise their grade. See https://natsci.msu.edu/students/current-students/student-success-resources/academic-success/habits-to-develop-outside-of-class/calculating-your-gpa/.
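As a concrete illustration of the GPA formula described in footnote 13, the short sketch below computes a credit-weighted grade average in Python. It is only a worked example under the footnote's description: the course list is made up, and the function is not part of the chapter's analysis.

```python
# Minimal sketch of the GPA formula in footnote 13: a credit-weighted average
# of course grades on MSU's 0-4 scale. The course list is hypothetical.
def msu_gpa(courses):
    """courses: list of (grade, credits) pairs."""
    grade_points = sum(grade * credits for grade, credits in courses)
    total_credits = sum(credits for _, credits in courses)
    return grade_points / total_credits

# Example: a 4-credit class with a 3.5 and two 3-credit classes with a 4.0 and a 3.0
print(round(msu_gpa([(3.5, 4), (4.0, 3), (3.0, 3)]), 2))  # prints 3.5
```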
The benefits of being in the MSU Honors College include more flexible general education requirements, the ability to enroll in classes on the first day of each enrollment period, the ability to enroll in graduate courses, honors courses, and honors sections of regular courses, the ability to live on honors-only floors of residence halls, the ability to meet with honors college advisors, and the ability to apply for special scholarships. See Appendix A.2 for more details about the benefits of being enrolled in the MSU Honors College.

Students must fulfill certain requirements to stay in the college. These requirements include completing at least 3 honors experiences (explained below) by the end of their second spring semester, maintaining a GPA of at least 3.2, and completing an Honors College Academic Progress Plan once a year. The Honors College Academic Progress Plan is used to approve courses for the college's general education requirements and to have students reflect on their accomplishments and professional goals.

Students in the college who engage in enough honors activities are recognized as having graduated from the college. To graduate, a student must complete at least 8 honors experiences17. Honors experiences include participation in honors courses, participating in honors sections, taking the honors option in a non-honors course, and taking a graduate course. For an honors option, students do a project related to course material that is not required of other students, such as writing a business plan in an accounting course or writing a report on an additional experiment in a chemistry course18. If a student graduates from the MSU Honors College, that fact is recorded on the student's diploma and on their official MSU transcript. They are also recognized during graduation ceremonies with an Honors College stole, and their affiliation with the MSU Honors College is noted in the graduation program.

14 Students who participate in specific enrichment programs and are in the top 15% of their college's GPA distribution are also invited to join the MSU Honors College. Only a small percentage of students who are invited into the MSU Honors College are between the 85th and 90th percentiles of their GPA distribution.

15 Students who transfer as something other than first-year students can also petition to join the honors college.

16 From academic years 2017–2018 to 2021–2022, 54% of freshmen admitted into the college accepted their offer.

17 Students must complete 10 honors experiences if they have 2 degrees and want both degrees to be labeled as honors degrees.

18 See https://honorscollege.msu.edu/academics/honors-option-examples.html for other examples of honors option projects.

1.4 Data and Sample

This chapter uses student-level administrative data from MSU's Office of the Registrar. I restrict the sample to students who were freshmen and whose first semester at MSU as an undergraduate was fall semester 2009, 2010, 2011, 2012, or 2013. Students who were in a college whose 90th percentile GPA I was unable to identify19 are removed because I do not know how close those students' GPAs are to a cutoff to be considered for admission to the MSU Honors College. Students in colleges and cohorts where the GPA cutoff is 4.0 are removed. Because 4.0 is the maximum GPA a student can receive, when the cutoff is 4.0, I am unable to model the relationship between outcome variables and a student's GPA above the cutoff.
Students whose GPA at the end of their first semester is 4.0 are removed because 4.0 students may be systematically different from students with a lower GPA20. After removing those students, the analysis sample, which I also refer to as the All GPAs Sample, has 35,800 observations21.

19 These include students whose first college was recorded as being in: the Honors College, the College of Human Medicine, the Associate Provost for Undergraduate Education, or the Associate Provost for Undergraduate Services. Students do not have to declare a major until they have 56 credits. If students do not declare a major, their major is recorded as exploratory preference. Over 99% of Associate Provost for Undergraduate Education students have exploratory preference as their freshmen major. The most common majors for Associate Provost for Undergraduate Services are Study Abroad Course Access Track (33%) and Class Connection Tracking (24%). All College of Human Medicine students have a major of Bioethics, Humanities and Society.

20 Because 4.0 is the maximum GPA a student can have, students who have a 4.0 GPA may have a wide range of underlying abilities. This may make the average outcome of 4.0 students different from that of students with a GPA just below a 4.0. If there were no upper limit to a student's GPA this would not be an issue.

21 I start with a sample of 43,267 students whose first undergraduate term is Fall 2009, Fall 2010, Fall 2011, Fall 2012, or Fall 2013. 3,594 of those students are in a first college whose 90th percentile GPA I am unable to identify, 1,968 are in a starting year and first college whose 90th percentile GPA was 4.0, and 2,334 have a first semester GPA of 4.0.

Table 1.1 – Summary Statistics, Honors and Non-Honors Students

Variable              Honors Students    Non-Honors Students
Female Indicator      0.59 (0.49)        0.50 (0.50)
White Indicator       0.78 (0.42)        0.61 (0.49)
Black Indicator       0.05 (0.21)        0.09 (0.29)
First Gen Indicator   0.20 (0.40)        0.28 (0.45)
Age First Term        17.9 (0.52)        18.1 (0.75)
ACT Score             28.6 (3.6)         24.4 (3.4)
First Semester GPA    3.6 (0.47)         2.6 (1.1)
N                     2,320              33,480

Notes: Honors students are students who are in the MSU Honors College for at least 1 semester. All other students are non-honors students. The table shows the mean value for each variable for honors and non-honors students, with the standard deviation in parentheses. 8.3% of honors students and 22% of non-honors students have missing ACT scores. N = 2,128 for ACT statistics for honors students. N = 26,186 for ACT statistics for non-honors students.

Table 1.1 shows summary statistics for honors and non-honors students. 6% of students in the sample are honors students. Compared to non-honors students, honors students are more likely to be female, more likely to be white, less likely to be black, less likely to be first-generation college students, and have higher ACT scores and first semester GPAs. Honors and non-honors students on average start college at the same age, but the variability of ages is greater for non-honors students22.

To the extent honors college participation causes students to substitute non-honors peers for honors peers, participation will likely increase the ACT scores and grades of the students' peers. This is because honors students have higher ACT scores and first semester GPAs than non-honors students.
Honors students are encouraged to have other honors students as peers through access to things like honors classes and honors-only floors of residence halls, and through the existence of honors student organizations. Prior research has found that peers significantly impact a variety of outcomes in higher education settings, such as GPA and level of binge drinking (Carrell, Fullerton, and West 2009; Sacerdote 2011). Therefore, I expect honors students to have improved academic outcomes because they have higher ability peers.

22 In results available upon request, I get summary statistics for students admitted into the MSU Honors College when they are in high school and for students admitted into the MSU Honors College when they are already at MSU. Compared to students admitted when they were in high school, students admitted when they were in college are more likely to be female, less likely to be white, have lower ACT scores, and have higher first semester GPAs. The All GPAs Sample contains 1,124 high school admits and 1,196 college admits.

Table 1.2 – Summary Statistics, Close to Cutoffs Sample and All GPAs Sample

Variable              Close to Cutoffs   All GPAs
Female Indicator      0.57 (0.49)        0.51 (0.50)
White Indicator       0.76 (0.42)        0.62 (0.49)
Black Indicator       0.03 (0.18)        0.09 (0.29)
First Gen Indicator   0.20 (0.40)        0.28 (0.45)
Age First Term        18.0 (0.69)        18.1 (0.74)
ACT Score             26.4 (3.2)         24.7 (3.6)
First Semester GPA    3.8 (0.09)         2.8 (1.1)
N                     4,829              35,800

Notes: The table shows the mean value for each variable either for all students in the analysis sample (All GPAs) or for students in my sample whose 1st semester GPA is close to one of the GPA cutoffs to be admitted into the honors college (Close to Cutoffs), with the standard deviation in parentheses. Students in the Close to Cutoffs Sample have a first semester GPA minus the 90th percentile GPA for their year and college (running variable) of between -0.15 and 0.15. 13% of students in the Close to Cutoffs Sample and 21% of students in the All GPAs Sample have missing ACT scores. N = 4,223 for ACT statistics for the Close to Cutoffs Sample. N = 28,314 for ACT statistics for the All GPAs Sample.

Table 1.2 shows summary statistics for all students in the analysis sample (All GPAs Sample) and for a sample of students who are close to the cutoffs. Compared to the students in the All GPAs Sample, the students close to the cutoffs are more likely to be female and white, less likely to be black or first-generation students, and have higher ACT scores and first semester GPAs. The two groups are similar in age during their first term.

1.5 Empirical Methodology

My equation of interest is:

(1.1) $Outcome_{ict} = \beta_0 + \beta_1 HonorsCollege_{ict} + \boldsymbol{\beta} \boldsymbol{X}_i + \theta_{ct} + \epsilon_{ict}$

Outcome_ict represents an outcome for student i who started in non-honors college c and in year t. The main outcomes I study include: the student's cumulative GPA at the end of their 4th and 8th semesters at MSU23, whether the student graduated from MSU, the number of semesters it took the student to get their first BA or BS degree, the number of majors the student completed, the number of minors the student completed, the total number of credits the student earned at MSU, the number of credits the student earned for classes at the 300 level, and the number of credits the student earned for classes at the 400 level.

23 When counting semesters for cumulative GPA as an outcome, I do not count summers. For example, if a student started in Fall 2009 then their 3rd semester cumulative GPA would be their cumulative GPA at the end of Fall 2010 even if they took classes at MSU during Summer 2010. I also do not account for students who leave MSU for a semester and return later. For example, if a student started in Fall 2009, took no classes in Spring 2010 or Fall 2010, and returned in Spring 2011, then their 3rd semester cumulative GPA (Fall 2010) would be missing.
X_i is a vector of covariates for student i. This vector contains indicator variables for the student's race24, gender, and whether the student is a first-generation college student25. It also contains the student's age when they entered MSU as a continuous variable. θ_ct is a fixed effect for the combination of the first non-honors college a student enrolled in at MSU and the year, 2009–2013, in which the student was a freshman. Cutoffs depend on a student's first college-year combination. This fixed effect allows me to compare students who face the same GPA cutoff. HonorsCollege_ict is an indicator variable for the student being in the MSU Honors College for at least 1 semester.

Because students are chosen to be in the honors college based on their academic achievement, an OLS regression would be inconsistent, with the estimate of β1 likely being too large. The OLS estimate would include not only the causal effect of being in the Honors College, but also the difference in unobserved factors that affect academic outcomes between honors and non-honors students. These factors might include how much a student studies and how much a student enjoys attending lectures26. To address this issue, I use a fuzzy27 regression discontinuity research design where having a high enough 1st semester GPA to be considered for admission to the Honors College is an instrument for being in the college for at least 1 semester.

The empirical methodology for this project relies on the fact that the MSU Honors College uses GPA cutoffs when determining which freshmen get invited to join the college. The MSU Honors College invites into the college all freshmen whose GPA at the end of their first fall semester is in the top 10% of GPAs of freshmen in each non-honors college. For example, assume that there were 100 freshmen in the College of Music in Fall of 2009, that each student had a different GPA, and that the 10th highest GPA among those students was a 3.75. In that case, the MSU Honors College would invite the 10 freshmen in the College of Music who had a GPA of greater than or equal to 3.75 to join the college28.

24 Some students in my data have a race that is either not reported or not requested. I leave these students in the sample and treat not reported and not requested as separate race categories.

25 Being a first-generation college student means that none of the student's ancestors such as parents, grandparents, or great grandparents attended college or university.

26 Other examples of possible unobserved differences that OLS regressions might not account for include differences in innate intelligence or differences in the quality of the schools students attend before they start attending MSU.

27 This is a fuzzy regression discontinuity research design because the probability of being in the MSU Honors College does not go from 0 to 1 at the GPA cutoffs. The main reason some students below the cutoff are in the MSU Honors College is that they were invited into the college when they were in high school. While all students above the cutoffs are invited to join the MSU Honors College, many above-cutoff students decline their invitation to join the college.
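To make the invitation rule and the running variable concrete, here is a minimal pandas sketch of how they could be constructed. It is only an illustration: the column names (college, cohort, fall_gpa) are hypothetical, and in this chapter the 90th percentile cutoffs are taken from the Registrar's published reports rather than recomputed from the analysis sample.

```python
# Sketch of the cutoff rule: the cutoff is the 90th percentile of freshman fall
# GPA within each first college-cohort cell (rounded to two decimal places, as
# the cutoffs are in practice), and the running variable is GPA minus cutoff.
import pandas as pd

def add_running_variable(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    out["gpa_cutoff"] = (
        out.groupby(["college", "cohort"])["fall_gpa"]
        .transform(lambda g: g.quantile(0.90))
        .round(2)
    )
    out["running_var"] = out["fall_gpa"] - out["gpa_cutoff"]     # GPA minus cutoff
    out["above_cutoff"] = (out["running_var"] >= 0).astype(int)  # at or above the cutoff
    return out
```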
Because students do not know what the cutoffs will be, and because students cannot precisely control their GPA, those just above and just below the cutoffs should be similar in both observable and unobservable characteristics unrelated to honors college participation. This allows me to attribute differences in academic outcomes between students with similar GPAs on different sides of the GPA cutoffs to the difference in participation in the honors college at the cutoffs.

The first stage estimating equation is

(1.2) $HonorsCollege_{ict} = \beta_0 + \beta_1 AboveCutoff_{ict} + \beta_2 (FreshmenGPA_{ict} - GPACutoff_{ct}) + \beta_3 AboveCutoff_{ict}(FreshmenGPA_{ict} - GPACutoff_{ct}) + \boldsymbol{\beta} \boldsymbol{X}_i + \theta_{ct} + \epsilon_{ict}$

The second stage estimating equation is

(1.3) $Outcome_{ict} = \beta_0 + \beta_1 HonorsCollege_{ict} + \beta_2 (FreshmenGPA_{ict} - GPACutoff_{ct}) + \beta_3 AboveCutoff_{ict}(FreshmenGPA_{ict} - GPACutoff_{ct}) + \boldsymbol{\beta} \boldsymbol{X}_i + \theta_{ct} + \epsilon_{ict}$

AboveCutoff_ict is an indicator variable for whether the student is above a GPA cutoff. GPACutoff_ct is the minimum GPA the student needs to earn to be considered for admission into the MSU Honors College. It is specific both to the non-honors college the student was in as a freshman and to the year the student was a freshman. The distribution of GPA cutoffs used in my analysis is shown below in Figure 1.1. In both equations the coefficient of interest is β1. In Equation 1.3, β1 is the causal effect of ever being a part of the MSU Honors College on an outcome for students whose GPA is close to one of the cutoffs and who would join the MSU Honors College if their GPA were above a cutoff.

28 The cutoffs are calculated rounding to 2 decimal places. It might be the case that more than 10% of freshmen in a college are at or above a cutoff because many students have the same 1st semester GPA. In that case all students at or above the cutoff are invited to join the college.

Figure 1.1 – Distribution of GPA Cutoffs

Notes: N = 63. Cutoffs range from 3.6 to 3.93. For the years of my sample some colleges had cutoffs of 4.0. The 4.0 cutoffs are not included in the graph because students whose cutoff was 4.0 were not included in the analysis sample.

I also use Equation 1.2 and Equation 1.3 to measure how much students close to the cutoff participate in the MSU Honors College. I do this by looking at the following outcomes: the number of semesters a student is in the college, whether the student graduated from the college, and the number of honors experiences the student completed. The more students do things that they can only do as honors students, the more intense the treatment of being admitted to the honors program is, and the more likely it is that the program will change academic outcomes. The longer a student is in the MSU Honors College, the more time they have to engage in honors-student-only activities. Most of the things that count as honors experiences, including enrolling in honors courses, honors sections, and graduate courses, are things only honors students can do29. The more honors experiences students have, the more being admitted into the MSU Honors College changes their college experience. Graduating from the MSU Honors College means a student has completed at least 8 honors experiences and completed yearly academic progress plans. Those students have engaged much more with honors activities than students who were admitted into the college but who did not have any honors experiences.

29 Honors options also count as honors experiences, but both non-honors and honors students can do honors options. Honors students have a much stronger incentive to do them because only for honors students do they count towards getting a degree from the MSU Honors College.
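As an illustration of how the fuzzy RD system in Equations 1.2 and 1.3 could be taken to data, the sketch below estimates the treatment effect in Equation 1.3 by two-stage least squares with the linearmodels package, with Equation 1.2 as the implicit first stage. It reuses the hypothetical columns from the previous sketch plus assumed honors, outcome, covariate, and college_cohort columns; it is a minimal sketch of the setup, with standard errors clustered at the first college-cohort level as in the chapter, not the author's actual estimation code.

```python
# Minimal 2SLS sketch of Equations 1.2-1.3 under the assumptions stated above.
# above_cutoff instruments for ever being in the Honors College, the running
# variable enters with different slopes on each side of the cutoff, and
# C(college_cohort) absorbs the first college-cohort fixed effects.
import pandas as pd
from linearmodels.iv import IV2SLS

def fuzzy_rd_2sls(df: pd.DataFrame, bandwidth: float = 0.15):
    local = df[df["running_var"].abs() <= bandwidth].copy()
    local["slope_above"] = local["running_var"] * local["above_cutoff"]
    formula = (
        "outcome ~ 1 + running_var + slope_above + female + first_gen"
        " + age_first_term + C(college_cohort) + [honors ~ above_cutoff]"
    )
    return IV2SLS.from_formula(formula, data=local).fit(
        cov_type="clustered", clusters=local["college_cohort"]
    )

# Example usage:
# res = fuzzy_rd_2sls(df)
# print(res.params["honors"], res.std_errors["honors"])
```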
If there are discontinuities in observable characteristics at the GPA cutoffs, this may be evidence that students on either side of the cutoffs are different in ways other than their participation in the MSU Honors College. I test for this using the following equation:

(1.4) $Covariate_{ict} = \beta_0 + \beta_1 AboveCutoff_{ict} + \beta_2 (FreshmenGPA_{ict} - GPACutoff_{ct}) + \beta_3 AboveCutoff_{ict}(FreshmenGPA_{ict} - GPACutoff_{ct}) + \theta_{ct} + \epsilon_{ict}$

The models as specified above assume a linear relationship between a student's freshman fall semester GPA and the outcome variables, allowing for different slopes on each side of the GPA cutoffs. I use a bandwidth of 0.15 for all regressions in the main body of the paper. I include alternative specifications in Appendix A. These other specifications include using a bandwidth of 0.10, using a bandwidth of 0.20, removing students with GPAs within 0.01 grade points of the cutoffs (doughnut sample), and choosing a bandwidth using an algorithm and calculating confidence intervals using the method described in Calonico, Cattaneo, and Titiunik (2014).

To test for differences in the effect of Honors College participation for different subgroups, I use the following equation:

(1.5) $Outcome_{icts} = \beta_0 + \beta_1 HonorsCollege_{icts} + \beta_2 (FreshmenGPA_{icts} - GPACutoff_{ct}) + \beta_3 AboveCutoff_{icts}(FreshmenGPA_{icts} - GPACutoff_{ct}) + \beta_4 Subgroup_s + \beta_5 Subgroup_s HonorsCollege_{icts} + \beta_6 Subgroup_s (FreshmenGPA_{icts} - GPACutoff_{ct}) + \beta_7 Subgroup_s AboveCutoff_{ict}(FreshmenGPA_{icts} - GPACutoff_{ct}) + \theta_{ct} + \epsilon_{icts}$

Subscript s denotes whether individual i is a member of subgroup s. Subgroup_s is a subgroup indicator variable. This equation models the relationship between the running variable and the dependent variable differently for students who are and are not subgroup members. I estimate Equation 1.5 by instrumenting HonorsCollege_icts and Subgroup_s × HonorsCollege_icts with AboveCutoff_icts and Subgroup_s × AboveCutoff_icts. The coefficients of interest are β1 and β5. β1 is the treatment effect of honors college participation for students who are not members of the subgroup. β1 + β5 is the treatment effect for students who are members of the subgroup. The statistical test on β5 tests whether the treatment effect is different for subgroup members and non-subgroup members.

1.6 Results

1.6.1 Identification Test: Discontinuity in Density

Figure 1.2 – Histogram, Students Close to Cutoffs

Notes: N = 4,829. Each bar in this histogram has a width of 0.01. The histogram starts at GPA Minus Cutoff = -0.15.

A sudden change in the density of observations at the cutoffs may be evidence that individuals on different sides of the cutoffs are different in ways that are not related to participation in the MSU Honors College. Figure 1.2 shows the density of observations for students in my sample who have a GPA within 0.15 grade points of the cutoffs. For this study the running variable (GPA Minus Cutoff) is a student's GPA at the end of their freshman fall semester minus the 90th percentile of GPA for the student's cohort and first college30. The graph shows a small decrease in the number of observations where the running variable equals 0. I test for the significance of the change in the density of observations at the cutoffs using the test described in Cattaneo, Jansson, and Ma (2018), which builds on foundational work for this type of test in McCrary (2008).
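For intuition only, the sketch below shows a crude version of this kind of manipulation check: within a narrow symmetric window around the cutoff, it asks whether observations are split roughly evenly between the two sides. This is not the Cattaneo-Jansson-Ma estimator used in the chapter, which fits local polynomials to the density and does not require the density to be locally flat; the window width and column name are assumptions.

```python
# Crude stand-in for a density-manipulation test: under no manipulation and a
# locally flat density, observations in a narrow window around the cutoff
# should fall on either side with roughly equal probability.
from scipy.stats import binomtest

def crude_density_check(df, window: float = 0.03):
    near = df[df["running_var"].abs() <= window]
    n_above = int((near["running_var"] >= 0).sum())
    return binomtest(n_above, len(near), p=0.5)  # two-sided by default

# Example usage: print(crude_density_check(df).pvalue)
```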
I find that this decrease is statistically significant, with a test statistic of 2.2409 and a p-value of 0.03³¹. I do not think the significant test result means that students are precisely manipulating their GPA to be above the cutoffs. If they were, the density of observations would be much higher just above the cutoffs than just below the cutoffs. However, based on Figure 1.2, the density of observations declines slightly at the cutoffs. No student has an incentive to have a GPA just below a GPA cutoff. It is also the case that students cannot precisely control their GPA. GPA is generally determined by grades on tests, homework assignments, and projects. Students generally do not know precisely what grade they will earn on a project for different levels of work. Students do not know what questions will be on a test and therefore cannot study specific topics to get the exact score they want. Finally, the cutoffs change from year to year. Cutoffs are calculated after the fall semester based on the distribution of grades of freshmen in each college. Even if a student knew what the previous year's cutoff was and could precisely target their GPA to last year's cutoff, the cutoff may be higher when it is applied to the student. In that case the student's GPA would be below the cutoff and they would not be invited to join the MSU Honors College.

30 90th percentile GPAs by year and college were obtained from the MSU Enrollment and Term End Reports Ranking of Cumulative GPAs by Class and Level of Primary Major. See https://reg.msu.edu/roinfo/ReportView.aspx?Report=CTE-RankCumGPAs

31 This is for an algorithmically chosen bandwidth of 0.137. Specifying a bandwidth of 0.15, the test statistic is 1.9961 and the p-value is 0.0459.

1.6.2 Identification Test: Discontinuities in Covariates

Table 1.3 – Discontinuity in Covariates

Outcome               Above Cutoff         Mean Outcome
Female                -0.0090 (0.0277)     0.51
First Gen             0.0059 (0.0178)      0.28
Age First Semester    -0.0120 (0.0371)     18
ACT Score32           -0.1358 (0.1959)     25
White                 -0.0148 (0.0229)     0.62
Black                 0.0235** (0.0108)    0.09
American Native       0.0069** (0.0029)    0.00
Asian                 -0.0086 (0.0155)     0.05
Pacific Islander      0.0003 (0.0003)      0.00
Hawaiian              -0.0003 (0.0003)     0.00
Hispanic              -0.0052 (0.0077)     0.04
Two or More Races     0.0000 (0.0068)      0.02
Race Not Reported     0.0018 (0.0048)      0.01
Race Not Requested    -0.0037 (0.0151)     0.16

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Standard errors are in parentheses. Bandwidth = 0.15. Standard errors are clustered at the first college-cohort level. The regressions above use estimating Equation 1.4 from Section 1.5 of this chapter and include first college-cohort fixed effects. N = 4,829 except for ACT Score, where N = 4,223. The outcomes are indicator variables for being female, being a specific race, and being a first-generation college student, the student's age during their first semester at MSU, and the student's ACT score. Mean outcomes for the All GPAs Sample are shown.

In Table 1.3 I test whether there is a statistically significant discontinuity at the cutoffs for variables that should not be affected by a student enrolling in the MSU Honors College. Most of the coefficients are small and statistically insignificant. There is a statistically significant discontinuity in the proportion of black students and American Native students at the cutoff.
I do not think this is much of an issue given the small number of black and American Native students near the cutoff. To the extent it is an issue, I address it by doing a robustness check using a doughnut sample. In that sample, observations within 0.01 grade points of the cutoffs are removed. In Appendix A.4 I show that for the doughnut sample, no covariate that I check has a statistically significant change at the cutoff at the 5% level.

32 In results not shown, I test for a discontinuity at the cutoffs in the proportion of students whose ACT score is missing in my data. The discontinuity, at a decline of 0.0%, is small and insignificant.

1.6.3 Discontinuities in Honors College Participation at the Cutoffs

Figure 1.3 – Discontinuity in Proportion of Honors Students

Notes: N = 4,829. To create the graph, I regressed being an honors student on indicator variables for a student being in a particular first college and cohort. The graph plots the residuals from that regression. This was done because all my regressions include first college-cohort fixed effects. Only students in the analysis sample who have a running variable between -0.15 and 0.15 are included in the graph. I define an honors student as a student who was in the MSU Honors College for at least 1 semester. Each dot is the residual proportion of honors students whose running variable is an element of [x, x + 0.01). For the leftmost dot, x = -0.15.

Figure 1.3 shows a binned scatter plot of the residual proportion of honors students for different values of the running variable around the cutoffs. Residuals are from a regression of an indicator for a student being an honors student on indicator variables for students being in a particular first college and cohort. This was done because all my regressions include first college-cohort fixed effects; it means I am only comparing students who faced the same GPA cutoff. All binned scatter plots in this chapter plot residuals from regressing the variable of interest on first college-cohort indicator variables for the same reason. Binned scatter plots using the raw data are available upon request. In the figure, the proportion of students who are honors students discontinuously increases from about -0.05 to 0.2 at the cutoffs.

Table 1.4 – Discontinuity in Ever Being in the Honors College

Outcome: Ever in Honors College      No Covariates        With Covariates
Above Cutoff                         0.2871*** (0.0269)   0.2859*** (0.0259)
First College-Cohort Fixed Effects   Y                    Y
Covariates                           N                    Y
Mean Outcome                         0.06                 0.06

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Standard errors are in parentheses. N = 4,829. Bandwidth = 0.15. Standard errors are clustered at the first college-cohort level. All regressions include first college-cohort fixed effects. Covariates include the student's age when they entered MSU and indicators for being female, being a specific race, and being a first-generation college student. Mean outcomes for the All GPAs Sample are shown.

Table 1.4 shows that the increase in the proportion of honors students at the GPA cutoffs is statistically significant at the 1% level for a bandwidth of 0.15. This means that there are many students below the cutoffs who would have joined the MSU Honors College if their GPA had been a bit higher and they had been invited to join the college. Many students who are invited to join the MSU Honors College because they are at or above a GPA cutoff do not join the college.
Of the 1,759 students in my sample at or above the cutoffs who are not in the MSU Honors College during their first semester, only 828 (47%) ever become honors students. To learn about what kinds of invited students are more likely to accept their invitation, using the sample of 1,759 students, I regress an indicator for being in the honors college on indicators for being female, being a first-generation student, being a specific race, starting in a specific year, and having a specific first college, as well as the student's age when they started at MSU. Clustering standard errors at the first college-cohort level, several of the coefficients are statistically significant. Women are 7 percentage points more likely to accept their invitation than men. Students who are a year older when they start attending MSU are 4 percentage points less likely to accept their invitation. Students who start in 2013 are 10 percentage points more likely to accept their invitation than students who started in 2009. Students whose first college is James Madison33, Music, Natural Science, or Veterinary Medicine are more likely to accept their invitation than students whose first college is Agriculture and Natural Resources. The coefficients for being black (8 percentage points) and being a first-generation student (-5 percentage points) are both statistically insignificant.

33 Students in James Madison College have at least one of the following majors: International Relations, Comparative Cultures and Politics, Social Relations and Policy, or Political Theory and Constitutional Democracy. James Madison is a living-learning community where students in the college can live in a special dorm (Case Hall) connected to classrooms, a dining hall, and faculty advising offices. See https://jmc.msu.edu/ for more information.
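A minimal sketch of this acceptance regression, written as a linear probability model in statsmodels with standard errors clustered at the first college-cohort level. The DataFrame invited and its column names are hypothetical stand-ins for the 1,759 invited students described above; this illustrates the specification rather than reproducing the author's code.

```python
# Linear probability model for whether an invited student ever joins the
# Honors College, estimated on the (hypothetical) invited-student sample.
import statsmodels.formula.api as smf

def acceptance_lpm(invited):
    model = smf.ols(
        "honors ~ female + first_gen + C(race) + C(cohort) + C(college)"
        " + age_first_term",
        data=invited,
    )
    # Cluster standard errors at the first college-cohort level
    return model.fit(cov_type="cluster", cov_kwds={"groups": invited["college_cohort"]})
```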
Marginal honors students stayed in the honors college for an average of 7.8 semesters and completed an average of 5.3 honors experiences. About 52% of them ended up graduating from the honors college meaning they completed at least 8 honors experiences. These results show that honors college participation significantly changed the college experience of students near the cutoffs. 20 1.6.4 Results: Discontinuities in Academic Outcomes Figure 1.4 – Discontinuities in Selected Outcomes Notes: N = 4,829 for the top left graph. N = 4,403 for the top right graph. N = 4,561 for bottom left graph. N = 4,006 for bottom right graph. The top left graph has the most observations because some students left MSU before they earned a degree or before their 4th or 8th semesters. To create each graph, I regressed the outcome variable on indicator variables for a student being in a particular first college and cohort. In the graphs above I plot the residuals from those regressions. Graphs created using the raw data are available upon request. For the top right graph time to degree counts summers as 1 semester even if the student did not take any summer classes. For the bottom two graphs the variable is cumulative GPA at the end of the term. Each dot is the average residual for students whose running variable is an element of [x, x + 0.01). For the left most dot x = -0.15. Figure 1.4 contains binned scatter plots showing the discontinuity in: the proportion of students who graduated from MSU (top left), the number of semesters to get first degree (top right), 4th semester GPA (bottom left), and 8th semester GPA (bottom right). The only outcome that has a visually large discontinuity at the cutoffs is time to degree. Time to degree decreases by about 0.15 semesters at the cutoffs. 21 Table 1.6 – Effect of Honors College Participation on Student Outcomes Graduate Graduate Time to Time to 4th 4th MSU MSU Degree Degree Semester Semester GPA GPA Treatment Effect 0.0133 0.0178 -0.5883* -0.7789** -0.0173 0.0068 (0.0537) (0.0536) (0.3556) (0.3269) (0.0676) (0.0652) First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates N Y N Y N Y Number of 4,829 4,829 4,403 4,403 4,561 4,561 Observations Mean Outcome 0.79 0.79 13 13 3.0 3.0 8th 8th Total Total Credit Credit Semester Semester Credit Credit Hours Hours GPA GPA Hours Hours 300 300 Level Level Treatment Effect 0.0138 0.0503 -3.0377 -3.3693 -1.9085 -1.9186 (0.0685) (0.0633) (4.7788) (4.8348) (2.0016) (2.0133) First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates N Y N Y N Y Number of 4,006 4,006 4,829 4,829 4,829 4,829 Observations Mean Outcome 3.1 3.1 106 106 25 25 Credit Credit More More Number Number Hours 400 Hours 400 than One than One Minors Minors Level Level Degree Degree Treatment Effect 0.6898 0.8118 -0.0478 -0.0494 -0.0970 -0.0987 (2.1644) (2.1546) (0.0412) (0.0414) (0.0754) (0.0766) First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates N Y N Y N Y Number of 4,829 4,829 4,403 4,403 4,829 4,829 Observations Mean Outcome 17 17 0.03 0.03 0.15 0.15 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. Coefficients are 2SLS estimates for the treatment effect of ever participating in the MSU Honors College. Covariates include the student’s age when they enter MSU and indicators for being female, being a specific race, and being a first-generation college student. For all regressions the bandwidth is 0.15. 
Mean outcomes for students in the All GPAs Sample are shown. Time to degree counts summers as 1 semester. For the GPA regressions, 4th semester and 8th semester are calculated ignoring summers. GPA is cumulative GPA at the end of the semester. For more than one degree only students who have at least 1 degree are included in the regression.

Table 1.6 shows treatment effect estimates for ever being in the MSU Honors College on academic outcomes. Almost all outcomes have insignificant coefficients with or without covariates. The only exception to this is the negative coefficient on time to degree with covariates. According to that estimate, being in the MSU Honors College causes students near the cutoff to graduate 0.78 semesters sooner. This is a reduction in the number of semesters to graduate of about 6%34. The magnitude for time to degree is 24% smaller without covariates and is only statistically significant at the 10% level. The statistically significant coefficient might be a spurious result given that the 8 other outcomes I check are statistically insignificant and the more outcomes I check the more likely 1 is significant even if all true effects are 0. Based on these results, it seems like honors college participation does not affect student outcomes, with the possible exception of reducing the time it takes students to get their first degree.

In Appendix A.7 I estimate the effect of being in the MSU Honors College on outcomes not in Table 1.6. Outcomes in Tables A.22 and A.23 include cumulative GPA 2nd to 8th semesters, retention for 2nd to 8th semester, and time to first degree ignoring summers. In Table A.24 I look at the effect of honors college participation on the major of a student's bachelor's degree35. No coefficient in Tables A.22, A.23, or A.24 is statistically significant at the 5% level. This includes the coefficients on time to degree when calculated ignoring summer semesters. I conclude that I do not have evidence that honors college participation changes any of the outcomes in Appendix A.7.

34 The denominator for this calculation is the average of 13 semesters it took students in the All GPAs Sample to get their first degree.
35 My data contains the name of each degree or certificate a student earned at MSU. I take the name of the first bachelor's degree in each student's list of awards and classify the degree into 1 of 11 groups of degrees based on the degree groups in Andrews, Imberman, Lovenheim, and Strange (2022). If a student did not earn a bachelor's degree at MSU they are classified as being in a No Degree group of degrees. A list of which majors are classified as being part of each major group is available upon request.

1.6.5 Alternative Specifications: Full Sample

In Appendix A.3.1 I re-create Tables 1.3 to 1.6 using algorithmically chosen bandwidths and bias corrected confidence intervals from Calonico, Cattaneo, and Titiunik (2014). The results are presented in Tables A.1 to A.4 and are qualitatively similar to those above. In Appendix A.3.2 I re-do the analysis from Tables 1.3 to 1.6 for a bandwidth of 0.10 and a bandwidth of 0.20. The results are presented in Tables A.5 to A.10. In most cases changing the bandwidth does not change the significance of the results. The coefficient on the proportion of Black students is significant at the 10% level for a bandwidth of 0.20 but not significant for a bandwidth of 0.10. With a bandwidth of 0.15 the coefficient is significant at the 5% level. The negative treatment effect of honors college participation on time to degree is only significant at the 10% level for a bandwidth of 0.20. The treatment effect is significant at the 5% level for bandwidths of 0.10 and 0.15.

Finally, I re-create Tables 1.3 to 1.6 using a doughnut sample. This sample removes students in the analysis sample whose GPA at the end of their first fall semester is within 0.01 grade points of their cutoff. One reason for creating this sample was to address the significant discontinuity in the proportion of Black students at the cutoffs in the analysis sample. Another is to address identification issues arising from a jump of about 10 percentage points in the proportion of honors students between the bin from 0.02 to 0.01 grade points below the cutoffs and the bin from 0.01 grade points below the cutoffs up to the cutoffs. The results are presented in Tables A.11 to A.14. With the doughnut sample no covariates have a statistically significant discontinuity at the cutoff at the 5% level. The proportion of honors students still increases significantly at the cutoff and the treatment effect on honors college related outcomes is about the same as it is in Table 1.5. However, unlike in Table 1.6, no outcome has a significant coefficient at the 5% level when covariates are included. In particular, the estimated treatment effect for time to degree is about 35% of the magnitude it is in Table 1.6 and is not significant even at the 10% level. This result is consistent with the significant time to degree estimate in Table 1.6 being due to random variation rather than due to a real causal effect.

Another possible concern with my main specification is that I may not have a large enough range of observations above the cutoff to properly estimate the regression. To address this, in results available upon request, I re-estimate the results in Tables 1.3 to 1.6 dropping all students whose cutoff is 3.9 or greater. Results are similar to the main specification, with a first stage of 27 percentage points and a significant effect on time to degree with covariates of -0.84 semesters. To see if my results are robust to including students whose GPA at the end of their first term is 4.0, in results available upon request I re-estimate the results in Tables 1.3 to 1.6 including those 4.0 students. In all regressions I include an indicator variable for a student having a 4.0 GPA at the end of their first term in case those students are different from other students. Results are similar to the main specification, with a first stage of 30 percentage points and a significant effect on time to degree with covariates of -0.77 semesters. The only major difference is that I estimate compliers stayed in the honors college on average 4.5 semesters. This is much less than the estimate of 7.8 semesters in Table 1.5.

1.6.6 Discontinuity in High School Admits

In Appendix A.5 I look for a discontinuity in the proportion of students who were admitted into the MSU Honors College when they were in high school. I identify a student as a high school admit based on the student being in the MSU Honors College during their first term at MSU. Because those students' membership in the MSU Honors College is unrelated to the cutoffs, there should be no discontinuity in high school admits at the cutoffs. This is what I find in Figure A.1 and Table A.15. The discontinuity for high school admits is close to 0 and statistically insignificant.
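A back-of-the-envelope calculation illustrates the multiple-testing concern raised in Section 1.6.4. If all true effects were zero and each of the 22 outcomes examined in this chapter were tested independently at the 5% level, the probability of at least one spurious rejection would be about

1 − (1 − 0.05)^22 ≈ 0.68.

The tests are not independent, so this is only a rough guide, but it suggests that one or two isolated significant coefficients are not strong evidence of real effects.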
25 1.6.7 Heterogeneity: Female vs Male Table 1.7 – Male and Female Treatment Effect of Honors College Participation Graduate Time to 4th 8th Total Credit MSU Degree Semester Semester Credit Hours GPA GPA Hours 300 In Honors College 0.0351 -1.5449** 0.0586 0.0855 -4.6264 -1.8286 (0.1040) (0.6713) (0.0954) (0.1097) (8.1971) (3.8378) In Honors College * -0.0356 1.6551** -0.1323 -0.1202 3.0443 -0.0316 Female (0.1320) (0.7845) (0.1159) (0.1095) (9.8622) (6.1465) P(In Honors College + 0.99 0.80 0.38 0.61 0.78 0.59 Interaction) First College-Cohort Y Y Y Y Y Y Fixed Effects Number of Observations 4,829 4,403 4,561 4,006 4,829 4,829 Mean Outcome Males 0.77 13 3.0 3.1 104 25 Mean Outcome Females 0.81 12 3.1 3.2 107 25 Credit More Number Hours Than One Minors 400 Degree In Honors College 1.2094 0.0125 0.0301 (3.6923) (0.0519) (0.1315) In Honors College * -0.9019 -0.1062 -0.2248 Female (4.1635) (0.0806) (0.2011) P(In Honors College + 0.90 0.14 0.10 Interaction) First College-Cohort Y Y Y Fixed Effects Number of Observations 4,829 4,403 4,829 Mean Outcome Males 16 0.02 0.12 Mean Outcome Females 18 0.03 0.17 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. Regressions are 2SLS regressions where Above Cutoff and Above Cutoff * Female are instruments for In Honors College and In Honors College * Female. All regressions have a bandwidth of 0.15. Time to degree results only use students who graduated and counts summers as 1 semester. For the GPA regressions, 4th semester and 8th semester are calculated ignoring summers. GPA is cumulative GPA at the end of the semester. Mean outcomes are for all male or all female students in the All GPAs Sample. For more than one degree only students who have at least 1 degree are included in the regression. Table 1.7 shows results of regressions that explore differences in the effect of honors college participation for female and male students. For most outcomes, neither the treatment effect for female students, the treatment effect for male students, nor the difference between the 26 two treatment effects is statistically significant. The one exception to this is for time to degree. I estimate that male students graduate a statistically significant 1.5 semesters faster because they join the MSU Honors College. This is statistically significantly different than my estimated treatment effect for female students of an insignificant increase in time to degree of 0.1 semesters. As robustness checks, I re-run the regressions used to create Table 1.7 with a doughnut sample and with bandwidths of 0.10 and 0.20. The results are presented in Appendix Table A.16, A.17, and A.18. Results are similar to those in Table 1.7. In all alternative specifications I find that being in the MSU Honors College reduces time to degree for male students and has a near 0 effect for female students. The coefficient on male time to degree is significant for bandwidths of 0.1 and 0.2 but not when using the doughnut sample. I conclude that my time to degree results looking at all students near the cutoff are entirely driven by the effect of honors college participation on male students. 
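Written out, the specification behind Table 1.7 takes roughly the following form (the notation is mine, based on the table notes; f(GPA_i) stands for whatever running-variable controls the chapter's specification includes):

Y_i = α_c + τ1·Honors_i + τ2·(Honors_i × Female_i) + γ·Female_i + f(GPA_i) + ε_i,

where α_c are first college-cohort fixed effects and Honors_i and Honors_i × Female_i are instrumented with Above_i and Above_i × Female_i. The treatment effect for male students is τ1, the treatment effect for female students is τ1 + τ2, and the row labeled P(In Honors College + Interaction) I read as the p-value for the test that τ1 + τ2 = 0.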
27 1.6.8 Heterogeneity: First Generation College Students vs Second and Above Generation Students Table 1.8 –First Gen and Second and Above Gen Treatment Effect of Honors College Participation Graduate Time to 4th 8th Total Credit MSU Degree Semester Semester Hours GPA GPA In Honors College -0.0536 -0.6459 -0.0384 -0.0237 -9.2399* (0.0548) (0.4130) (0.0715) (0.0647) (5.2039) In Honors College * First 0.3257** 0.2345 0.1031 0.1739 30.3831** Gen (0.1367) (1.1164) (0.1379) (0.1746) (13.0604) P(In Honors College + 0.04 0.67 0.62 0.40 0.09 Interaction) First College-Cohort Fixed Y Y Y Y Y Effects Number of Observations 4,829 4,403 4,561 4,006 4,829 Mean Outcome 2nd and 0.82 12 3.1 3.2 107 Above Gen Mean Outcome First Gen 0.73 13 2.9 3.0 101 Credit Hours Credit Hours More Than Number 300 Level 400 Level One Degree Minors In Honors College -3.7226 0.4058 -0.0522 -0.1444 (2.2992) (2.4136) (0.0436) (0.1003) In Honors College * First 8.8233* 1.5211 0.0240 0.2255 Gen (4.9352) (5.2188) (0.0769) (0.2483) P(In Honors College + 0.25 0.68 0.71 0.69 Interaction) First College-Cohort Fixed Y Y Y Y Effects Number of Observations 4,829 4,829 4,403 4,829 Mean Outcome 2nd and 25 17 0.03 0.15 Above Gen Mean Outcome First Gen 22 16 0.03 0.14 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. Regressions are 2SLS regressions where Above Cutoff and Above Cutoff * First Gen are instruments for In Honors College and In Honors College * First Gen. All regressions have a bandwidth of 0.15. The regression for time to degree only includes students who graduated and counts summers as 1 semester. For the GPA regressions, 4th semester and 8th semester are calculated ignoring summers. GPA is cumulative GPA at the end of the semester. Mean outcomes are for 2nd and above generation or first-generation students in the All GPAs Sample. Table 1.8 shows differences in the effect of honors college participation for students who 28 are and are not first-generation college students. The only treatment effect significant at the 5% level is for graduation. I estimate that joining the MSU Honors College causes first generation college students to be 27 percentage points more likely to graduate from MSU. The other significant result in the table is the difference in the treatment effect on total number of credits completed at MSU. I estimate that being in the MSU Honors College causes first generation college students to complete 21 more credits at MSU and second and above generation college students to complete 9 credits less at MSU. Both treatment effects have p-values between 0.1 and 0.05. As robustness checks, I re-run the regressions used to create Table 1.8 with a doughnut sample and with bandwidths of 0.10 and 0.20. The results are presented in Appendix Tables A.19, A.20, and A.21. Results are qualitatively similar to those in Table 1.8, but p-values are larger for a bandwidth of 0.1 and the doughnut sample because of the smaller number of observations. The treatment effects for first generation college students for graduation and number of credits earned are significant at the 5% level for a bandwidth of 0.20 but not for a bandwidth of 0.10 or when using the doughnut sample. I conclude that participating in the MSU Honors College likely causes first generation college students to be more likely to graduate and to earn more credits at MSU. 
Because the effects are so large and because first generation college students are a population of interest for higher education policymakers, it is possible that this is the most important finding in this paper.

1.7 Discussion and Conclusion

In this paper I study how a student's participation in the MSU Honors College changes a variety of academic outcomes. The MSU Honors College invites all students whose GPA is in the top 10% of the GPA distribution in their non-honors college during their freshmen fall semester to join the MSU Honors College. This creates a large discontinuity in the probability of ever being in the college at these 90th percentile GPA cutoffs. This discontinuity allows me to use a fuzzy regression discontinuity research design to study the effect of participation in the MSU Honors College on student outcomes by looking for discontinuities in student outcomes at those GPA cutoffs.

Looking at all students in my analysis sample near the cutoffs, I do not find that honors college participation has a large effect on student outcomes. For 21 of 22 outcomes I look at, my estimated effects are statistically insignificant. I do find a significant effect for time to degree, but this effect is not significant when I exclude covariates or when I use a doughnut sample. Because I am checking 22 outcomes there is a good chance that I randomly find a significant effect even if all true treatment effects are 0. The time to degree effect I find in some specifications may just be a result of random variation.

In heterogeneity analysis, I show that honors college participation may cause large changes in a small number of academic outcomes for particular groups of students. I find that honors college participation causes male students to get their first degree significantly faster and that this effect is robust to all bandwidths I check. I also find, for at least one bandwidth I check, that honors college participation makes first generation college students significantly more likely to graduate and to earn significantly more credits at MSU. Because the effect is large, for an important outcome, and for a population of interest to higher education policymakers, it is possible that the effect on graduation for first generation students is the main finding of the paper.

To understand how being in the honors college changes the college experience of honors students, in 2022 I conducted interviews with 10 current honors students and 3 honors college advisors. One thing I learned from this is that most honors experiences are honors options. When I asked the students what honors experiences they had or planned to have, they generally listed at most one honors course or section, with the rest of their honors experiences being honors options. An honors advisor estimated that 80-90% of honors experiences are honors options and that one reason for this was the lack of honors courses and sections that were available for students to take. Another thing I learned is how significant the change in the general education requirements for honors students is. Non-honors students must take courses that fulfill general education requirements but do not fulfill any requirements to complete particular majors36. Honors students fulfill their requirements by taking courses in specific majors such as Philosophy 101. Courses taken to fulfill requirements for a minor or second major can also count toward fulfilling general education requirements for honors students.

36 Non-Honors students must complete ISS and IAH courses. See https://reg.msu.edu/academicprograms/Print.aspx?Section=215
A third thing I learned is that being an honors student may have little impact on who a student's peers are. Honors students do not take many classes with only honors students. Many of the students I talked to never lived in the honors-only floors of residence halls. There are a variety of student organizations that are affiliated with the honors college but the students I talked to were not very involved with them. The impression I got is that the main ways being an honors student changed a student's college experience were by letting them enroll in classes early, by having alternative general education requirements, and by doing honors options.

Honors students may get their degree faster because honors students can enroll in classes before non-honors students and because their general education requirements are easier to fulfill with coursework they would do even if they were not an honors student. Being able to enroll in classes earlier than most other students may prevent honors students from having to stick around for an additional semester because there was no more room to enroll in a class they needed to get their degree. In my interviews with them, honors students were always able to enroll in the classes they wanted at the times that they wanted. They discussed that some classes were small and filled up fast. They never had a problem getting into those classes, but some of their non-honors friends had trouble enrolling in those classes. General education requirements for honors students could be fulfilled with courses students were already taking to complete a second major or a minor37. This allows some honors students to finish their degree(s) taking fewer courses. If a student needs to take fewer courses, then they can graduate in less time38.

37 To see if students who had more than one major or who had at least one minor were driving the time to degree results, I estimated the treatment effect of being in the honors college using only students who graduated with a single major and no minor (single major students). Including all covariates in the regression, I estimate that single major students who join the honors college get their first degree 0.43 semesters sooner (p-value 0.261). Because this is much smaller than my main estimate of getting the first degree 0.78 semesters sooner, it provides some evidence that non-single major students are driving the time to degree effect.
38 If this was the main reason honors students got their first degree sooner, then I would expect to see large negative treatment effects on total number of credits earned at MSU. However, the estimated effects on credit hours for all students in the sample near the cutoff and for males, while negative, are statistically insignificant.

The main economic effect of a student finishing their degree sooner is that they can enter the workforce sooner. Each semester in college is about 4 months long. Assume joining the Honors College causes a student to graduate a semester earlier. Also assume the student earns the median earnings for MSU graduates of $61,101. In that case, joining the Honors College would increase the student's earnings by about $20,367 ($61,101 * 4/12).39 The additional time in the labor force might also increase future earnings if earnings increase with years in the labor force.

39 Earnings of MSU graduates are from the U.S. Department of Education's College Scorecard at the following URL: https://collegescorecard.ed.gov/school/?171100-. The statistic was taken from the website on 7/26/2022.

Joining the MSU Honors College may increase the graduation rate of first-generation students by giving them access to honors advisors and by getting them involved in the First-Generation Honors Association40. Being in the honors college allows students to meet with special advisors. According to my interviews, these advisors are easier to meet with than other advisors students have access to. While most students meet with honors advisors to discuss issues related to being an honors student, the advisors can discuss a variety of topics related to college such as how many credits a student should take each semester. Being able to easily meet with advisors might be especially important for first generation students because their parents cannot advise them about college based on their experience of being a college student. The First-Generation Honors Association is a student organization affiliated with the MSU Honors College. The organization's goal is to benefit first generation students by creating a community of high achieving first generation students and providing first generation students with advice and information to help them while in college. I attended one of the organization's events where they invited 4 college graduates who themselves were first generation students to discuss their experience in college and answer questions from event attendees. It is possible that joining the honors college may make first generation students aware of this organization and that participating in its activities may make students more likely to graduate. If other researchers could figure out what about the honors college is so beneficial for these students, then MSU or other universities might be able to improve the outcomes of first generation students by providing those benefits to first-generation students even if they are not in an honors program.

40 https://honorscollege.msu.edu/admissions/first-generation-honors-association.html

If joining the MSU Honors College increases the graduation rates of first-generation students, then it likely increases the future incomes of those students. College graduates make significantly higher incomes than those without a college degree (Abel and Deitz 2014). Graduating from college also opens the opportunity to get advanced degrees such as master's degrees and medical degrees which also are associated with higher incomes (Altonji and Zhong 2021).

One of my most surprising findings compared to prior literature is the lack of a significant effect of honors college participation on a student's GPA. Several previous studies have found honors college participation to be associated with earning a higher GPA (Cosgrove 2004; Hartleroad 2004; Shushok 2006; Rinn 2007; Furtwengler 2015; Brown, Winburn, and Sullivan-Gonzalez 2019; Diaz, Farruggia, Wellman and Bottoms 2019; Honeycutt 2019; Lishinski and Micomonaco 2020; Pugatch and Thompson 2022). One possibility is that the GPA effect is small and positive but that I do not have enough observations to detect the effect. This would be consistent with my positive estimate of the effect of honors college participation on 8th semester GPAs for all students in my sample near the cutoffs.
Another possibility is that the effect of honors college participation on GPA is positive for honors students on average, but that the effect is 0 for students who are on the margin of being admitted into the MSU Honors College. A third possibility is that the real effect of honors college participation is 0 and other studies are unable to control for unobserved variables that explain the GPA difference between honors and non-honors students. This would not explain the results from Pugatch and Thompson (2022) who find that on average honors college participation increases course GPA but that it decreases course GPA for first-generation students. There are many additional questions related to this research that future projects could explore. One set of questions relates to which aspects of the MSU Honors College cause the effects found in this paper assuming the significant results are causal rather than due to random variation in the data. Is the faster time to degree due to being able to enroll in classes first or due to something else? What is the effect of being able to take graduate classes, being in a dorm with other honors students, or having access to an honors advisor separate from all the other benefits of being in the college? Another set of questions relates to what the causal effect of honors college participation is on student outcomes for types of students not studied in this paper. How would participation in an honors college affect students in other parts of the GPA distribution? Do higher GPA students or lower GPA students benefit more from honors college participation? If the structure of the MSU Honors College was recreated at another university, would students at that college experience the same effects as students at MSU? Are the effects limited to large 4- year public universities or would students at other types of institutions, like community colleges, benefit from participating in an honors program? 33 CHAPTER 2: HOW LOW-INCOME EXPECTATIONS AFFECT STUDENT LOAN REPAYMENT PLAN CHOICE: SURVEY EVIDENCE FROM COLLEGE SENIORS 2.1 Introduction and Motivation Most college students in the United States (U.S.) get loans from the federal government to fund their college education (Woo, Bentz, Lew, Velez, and Smith 2017). The U.S. federal government offers student loan borrowers a choice between two kinds of repayment plans. One type of repayment plan sets payments so that the loan is paid off within a certain period41. The other type of repayment plan sets payments as a function of a borrower’s income42. The latter kind of plan is referred to as an income-driven repayment plan or IDR plan. IDR plans are preferred over time-based plans by scholars of student loans for their ability to reduce the loan payments of student loan borrowers when their incomes are low (Chapman and Dearden 2017). Borrowers on IDR plans are more likely to make required on time payments (Herbst 2023) and less likely to default on their student loans43 (U.S. Government Accountability Office 2015; Muller and Yannelis 2019). Preventing student loan default is important because defaulting on U.S. government student loans can lead to a variety of negative consequences for the borrower. These consequences include: a reporting of the default to credit bureaus leading to reduced access to private sources of credit, collection fees, wage garnishment, the garnishment of the borrower’s tax refund, and the inability to get more U.S. government student loans until the default is resolved. 
As of Q2 2021, 17 percent of student loan borrowers were in default (Ma and Pender 2021). Scott-Clayton (2019) finds that the proportion of students who graduated in 1996 who had ever defaulted on their student loans continued to increase over the 20 years they had data for. Using that data to forecast defaults in the future, Scott-Clayton projects that 40% of borrowers who graduated college in 2004 would default on their student loans at some point by 2023. Despite these facts, only 32% of borrowers in FY 2021 were in IDR plans (Ma and Pender 2021). Given the high default rate on student loans, and the fact that IDR plans likely reduce student loan default, it seems as if borrowers’ lives could be significantly improved if more of 41 The default plan choice for students who get loans from the U.S. federal government sets payments so that the loans are paid off in full if the minimum payment is made every month for 10 years. 42 After a certain number of years of making payments on one of these plans, all remaining loan balances are forgiven and required payments decrease to $0. The income-driven plan that is available to all new borrowers of U.S. government student loans, the Revised Pay as You Earn Plan, offers loan forgiveness after 20 years for undergraduate borrowers and after 25 years for graduate borrowers. See https://studentaid.gov/manage- loans/repayment/plans/income-driven for more information. 43 The U.S. Department of Education defines student loan default as not making required payments for at least 270 days. 34 them were on IDR plans. One reason why there may be both relatively low enrollment in IDR plans, and a high default rate on student loans, is that students have overly optimistic expectations about their future earnings44. In terms of reducing required payments, the biggest benefit of being on an IDR plan occurs when a borrower’s income is low. If borrowers believe they have an unreasonably low probability of earning a low income, then they may also believe that it is unlikely they will experience reduced payments should they choose an IDR plan instead of a time-based repayment plan. Should a student loan borrower earn a low income after they graduate while being on a time-based repayment plan, their required payments may be such a large proportion of their income that they are unable or unwilling to make them. If this is the case, then presenting students with relevant information about post-college incomes should cause them to: increase the probability that they believe they will earn a low income, be more likely to choose an IDR plan over a time-based repayment plan and reduce the probability that they default on their loans. The purpose of this research is to learn about the effect that a student’s expectations of earning a low income have on their choice of student loan repayment plan45. To study this, I field an online survey to college seniors at Michigan State University (MSU). Survey respondents are asked about the probability they expect to earn an income in different income ranges. They are also asked if they would prefer an IDR or non-IDR (time-based) plan if they had $30,000 in student loan debt. The survey includes an information experiment where respondents were randomly shown either information about the average income of U.S. college graduates (All Graduates Income Treatment), or information about the median earnings of MSU graduates with majors like their own major (Major Specific Income Treatment). 
The goal of providing this information is to create an exogenous difference in low-income expectations between 44 Colon (2021) finds that, on average, a sample of undergraduates at The Ohio State University underestimate the mean salary for employed workers in Ohio age 30 to 50 with specific groups of majors. In Cox, Kreisman, and Dynarski (2020) college students who participated in a laboratory experiment expect the typical earnings of the typical graduate to be $34,500 while the average earnings of 24-year-old graduates in 2015 was about $22,000. In a survey of NYU students Wiswall and Zafar (2015a) find that when they asked what NYU undergraduates thought 30-year-old college graduates with broad categories of majors in the US earned, the average response is statistically significantly above the authors calculations of the actual population earnings. The authors also find substantial heterogeneity in errors, with many students underestimating population earnings. Betts (1996) finds that in a sample of undergraduates at UC San Diego, the mean beliefs about the average salary of BA holders in 1990 is close to correct although the mean salary of BAs with psychology degrees is statistically significantly below mean beliefs about the salary of psychology graduates. 45 Abraham, Filiz-Ozbay, Ozbay, and Turner (2020) and Brownstein (2020) find that the probability a student believes they will earn a low income 6 months after leaving school is statistically significantly correlated with student loan repayment plan choice. 35 respondents who see the two types of income information. Questions about income expectations and repayment plan choice are asked both before and after the income information is shown. Controlling for pre-treatment differences in low-income expectations, I find that survey respondents who see the Major Specific Income Treatment have a subjective probability of earning a low income that is a statistically significant 7 percentage points higher than the survey respondents who see the All-Graduates Income Treatment. However, controlling for pre- treatment differences in repayment plan choice, I find that survey respondents who see the Major Specific Income Treatment are an insignificant 2 percentage points less likely to choose the IDR plan. Based on this, and similar results for various subsamples, I conclude that repayment plan choice is not very responsive to changes in low-income expectations. This may be because students care about things other than minimizing required monthly payments when picking a repayment plan. 2.2 Background on Student Loans and Income-Driven Repayment in the United States About 92% of all student loan debt in the U.S. is owed to the U.S. Federal Government (Peter G. Peterson Foundation 2021)46. Students who attend college apply for federal loans by filling out the Free Application for Federal Student Aid. Loans are offered to students as part of their overall financial aid package for a university. Students can borrow up to the lesser of either the cost of attendance, or a limit that is based on year in school and dependency status (Kirkham 2020). For federal student loans, there are limits both on the amount of borrowing per year and the lifetime amount of borrowing47. In the academic year 2020-2021, 25 percent of undergraduate students borrowed loans directly from the federal government (Ma and Pender 2021). 
Ma and Pender also found that 55 percent of students who graduated from public and non-profit 4-year universities in the 2019 – 2020 academic year had student loan debt. They calculate that the average amount of debt among people who graduated with debt that year was $28,400.

46 Every year most new student loan debt is also owed to the U.S. Federal Government. For example, in the 2020 – 2021 academic year, 87% of new student loan debt was owed to the Federal Government (Ma and Pender 2021). The other 13% was owed to private companies.
47 If students would like to borrow more than the limits for those loans, their parents may borrow Parent PLUS loans from the Federal Government up to the cost of attendance.

One of the major benefits of IDR plans is that they reduce the probability that borrowers will default on their student loans. Borrowers48 will be current on their loans if they make at least the minimum monthly loan payment. The minimum monthly loan payment is generally49 determined by the repayment plan the borrower is on. Once a borrower misses a payment, they are considered delinquent on that loan. Borrowers who are delinquent on their loans for a period of 90 days have their delinquency reported to the 3 major Credit Reporting Agencies (CRA's)50. If a borrower pays less than the minimum payment for 270 days, then their loan is in default. Default has several negative consequences for the borrower including: the entire amount of the loan is due immediately, the default is reported to the 3 major CRA's, being charged for collection costs, being prohibited from receiving additional federal student aid until the default is resolved, and sometimes having their wages, tax refunds and federal benefits garnished. To prevent these harms to borrowers, it is a worthwhile goal to reduce student loan defaults.

48 In this paper borrowers is used as a shorthand for U.S. citizens who have gotten student loans from the U.S. federal government.
49 Borrowers can temporarily lower their minimum monthly payment to $0 using deferment or forbearance. Deferment and forbearance can be given for a variety of approved circumstances such as getting treated for cancer or serving in the Peace Corps.
50 Credit Reporting Agencies (CRA's) are businesses that collect information about people's use of credit and sell that information to third parties (Irby 2020).

In the survey, respondents are given the choice between an IDR plan and a non-IDR plan. These plans are based on two51 of the repayment plans borrowers can choose from when they enter repayment. The non-IDR plan is based on the Standard Repayment Plan. The Standard Repayment Plan sets minimum monthly payments so that the loan would be paid off if the minimum payment is made every month for 10 years. If a borrower does not select a repayment plan before they begin paying back their loans, they are automatically put on the Standard Repayment Plan. The IDR plan is based on the Revised Pay as You Earn Plan (REPAYE). Unless a borrower has an FFEL loan, they can get on REPAYE52. REPAYE sets minimum payments equal to 10% of discretionary income with loan forgiveness53 after 20 years of payments for an undergraduate borrower or 25 years of payments for individuals who borrowed for graduate or professional school. Discretionary income is defined as income above 150% of the federal poverty line54. IDR plans lower the required payment of student loan borrowers when their income is low. This is the feature of IDR plans that probably lowers a borrower's probability of default. Even if this feature does not prevent defaults, it prevents students from losing a high proportion of their income on student loan payments when they most need the money. These benefits of IDR plans should make IDR plans more attractive to borrowers who believe they are more likely to earn a low income. However, IDR plans are not always better than non-IDR plans. If an IDR plan successfully lowers a borrower's payments, it causes the borrower to accrue more interest on their loan55 and take longer to pay off their loan. If borrowers care more about that than the benefits of lower payments, then they may continue to prefer a non-IDR plan even if they believe they are more likely to earn a low income.

51 There are currently seven different repayment plans for student loans. Four of those repayment plans set minimum payments as a function of the borrower's income.
52 https://fcaa.org/student-loan-repayment-plans/revised-pay-as-you-earn-repaye/. FFEL stands for Federal Family Education Loan program. These loans, which were available until 2010, were made by private institutions and guaranteed by the federal government.
53 According to Student Borrower Protection Center (2021), despite the first IDR plan becoming available in the U.S. in 1995, only 32 U.S. student loan borrowers have ever received loan forgiveness because they had been in an IDR plan for a long period of time. Despite this, it is probably the case that loan forgiveness is a salient feature of IDR plans. In Brownstein (2020), I find that decreasing the number of years until loan forgiveness for an IDR plan from 20 years to 15 increases the probability surveyed MSU students prefer an IDR plan to a non-IDR plan by about 20 percentage points.
54 For all states except Hawaii and Alaska, 150% of the federal poverty line for a household with a single individual in 2020 was $19,140 and for a household with 2 individuals was $25,860. See Office of the Assistant Secretary for Planning and Evaluation (2019).
55 A borrower may make lower total interest payments when making lower monthly payments if the borrower makes small payments long enough for a significant proportion of their loan balance to be forgiven.

Borrowers can learn about student loan repayment, including payment amounts and what repayment plans are available, by doing student loan exit counseling. Most exit counseling is done through a website created by the U.S. Department of Education (DoE)56. The information I provide students in the survey is like the information borrowers get on the exit counseling website. DoE requires colleges to have borrowers complete student loan exit counseling when they leave school57. If colleges do not offer or refer their borrowers to exit counseling, they may lose access to federal financial aid (Klepfer, Fernandez, Fletcher, and Webster 2015). Exit counseling provides information on loan balances, repayment obligations, and which repayment plans are available to the borrower. During exit counseling borrowers can enter their estimated future income, future expenses, and how much in student loans they borrowed from the federal government. The website then provides students with an estimated initial monthly payment, an

56 https://studentaid.gov/app/counselingInstructions.action?counselingType=exit. Schools can do other things to fulfill the requirement to provide exit counseling. However, anyone can use the U.S.
Department of Education’s website and most schools (including MSU) refer their students to the website for exit counseling. 57 It may be the case that a large proportion of borrowers do not complete exit counseling. In a survey of 13,000 high debt borrowers, 40% of respondents reported they did not receive any form of student loan counseling (Whitsett and O’Sullivan 2012). 38 estimated total amount paid, and a repayment period of either the number of years in repayment or the number of years until loan forgiveness. As part of this process, borrowers are asked to select a repayment plan from a menu of available repayment plans. The selected plan is sent to the borrower’s loan servicer to determine if they are eligible for the plan. If borrowers do not go through exit counseling, or they do not choose a specific repayment plan at the end of exit counseling, they are put on the Standard Repayment Plan. Students can change their repayment plan at any time by contacting their student loan servicer (Lane, 2020)58. 2.3 Literature Review There are many studies which look at the effect of providing students with information about what they can expect to earn after college on decisions related to college. Wiswall and Zafar (2015a) look at how U.S. students change their income expectations after being informed about the earnings of different groups of individuals. Treatments include being shown information about the average income of all college graduates and the average income of college graduates conditional on gender and major. In a companion paper, Wiswall and Zafar (2015b) use the same data to study how changes to major-specific earnings expectations caused by seeing major specific earnings information changed students’ expectations of what they would major in. Baker, Bettinger, Jacob, and Marinescu (2018) study the impact of income information on major choice for community college students. Hastings, Neilson, and Zimmerman (2018) find that Chilean student loan applicants who receive information about college- and major- specific incomes of past Chilean college graduates are less likely to attend and believe they would earn less if they enrolled in programs whose graduates earned low incomes. Bleemer and Zafar (2018) find that providing information to U.S. household heads about the expected returns to college increase the probability that respondents said they wanted to attend college. Hurwitz and Smith (2018) look at the effect of the release of a large amount of information about the income of college graduates in the College Scorecard. They find that after the information was released colleges with higher reported median incomes had more students send their SAT scores to them. Conlon (2021) finds that students are more likely to choose a major which they received income information about in an online survey. The above research shows that college students change their expectations and behaviors in response to seeing information on post-college incomes. 58 A loan servicer is a private company that the U.S. Federal Government contracts with to collect federal student loan payments. 39 Another group of studies uses experiments to study what affects student loan repayment plan choice. Abraham, Filiz-Ozbay, Ozbay, and Turner (2020) study how the description of IDR plans affects repayment plan choice. They find that students are statistically significantly more likely to choose the IDR plan when the description of the plan emphasizes its benefits. 
Cox, Kreisman, and Dynarski (2020) have college students participate in an incentivized laboratory experiment which involve students choosing between time-based and IDR repayment plans. They find that: being shown information about the incomes of recent college graduates causes students to decrease what they expect their income to be, that being shown that information did not change a student’s choice of repayment plans, and that students are statistically significantly more likely to select the repayment plan framed as the default plan. In Brownstein (2020) I field a small online survey to students at MSU where they choose either an IDR or non-IDR student loan repayment plan. Although many of my results are not statistically significant, I find that students are more likely to choose the IDR plan when: the amount of income not considered when calculating payments is lower59, the percent of non-exempt income determining payment is lower, and the number of years until loan forgiveness is lower. Muller and Yannelis (2019b) study a field experiment where borrowers are randomly sent or not sent pre-populated applications to enroll in an IDR plan. They find that individuals who receive the applications have much higher enrollment in IDR plans, lower loan payments, and a lower probability of failing to make a required loan payment. The method for eliciting distributional income expectations used in this study comes from Delavande and Rohwedder (2008). They find that, compared to eliciting expectations by asking for points on the cumulative distribution function, eliciting expectations by asking respondents to place balls in bins representing ranges of the probability distribution leads to a statistically significantly higher percentage of respondents with valid probability distributions. Delavande, Giné, and McDenzie (2011) find that using this method to elicit income expectations in developing countries provides reasonable responses that are predictive of future economic behavior. Orr (2020) uses this method to elicit the subjective expectations of college students, including questions about expected GPA conditional on a certain amount of studying, and 59 Income driven repayment plans calculate payments as a function of income above a certain amount such as 10% of income above 150% of the federal poverty line in the case of the Revised Pay as You Earn Plan. The results in Brownstein (2020) suggest that if the amount of exempt income was decreased, such as to 125% of the federal poverty line, that more borrowers would choose to be on the Revised Pay as You Earn Plan. 40 questions about income conditional on graduating with a certain GPA. 2.4 Description of Survey This paper analyzes data from a web survey of Michigan State University (MSU) college seniors60. MSU’s Office of the Registrar sent out emails that I wrote on October 19th, October 22nd, and October 25th, 2021. The emails described the survey and had a URL which could be used to take the survey. The emails also informed students that if they completed the survey, they could be sent $10 using either Venmo or Paypal. The 3 emails were sent to the same 7,000 students. The survey was closed on October 27th, 2021. Screenshots of the emails are available upon request. Before any data was analyzed, incomplete survey responses and any response after the first response by the same person were removed61. After that 1,581 responses were left. The survey has a response rate of 22.6%. 
The median time it took students in the sample to complete the survey is 9 minutes and 56 seconds. Survey respondents are asked about their income expectations in the form of a statistically valid probability distribution. The method of eliciting this distribution comes from Delavande and Rohwedder (2008). Survey respondents allocate 10 balls to the following income ranges: $0 - $30,000, $30,000 - 60,000, $60,000 - $90,000, $90,000 - $120,000, and greater than $120,000. Each ball they allocate to an income range represents a 10-percentage point probability that they expect to earn an annual income in that range. Survey respondents are asked about what income they expect to receive 5 years after graduating with an undergraduate degree from MSU62. Survey respondents are asked not to count any time in graduate or professional school as part of those 5 years63. Each time after they are asked for their income expectations, survey respondents are asked to choose between two different repayment plans. They are asked to assume they have 60 MSU’s Office of Financial Aid defines a senior as an undergraduate student who has completed at least 88 credits. 61 In cases where 2 or more responses had the same Venmo account name or the same email for Paypal, all responses except for the response with the earliest recorded date were deleted. 4 completed responses had neither a Venmo account nor an email for Paypal and therefore could not be checked against other responses. 62 Arcidiacono, Hotz, Maurel, and Monamo (2020) survey students at Duke University about their major and occupation specific earnings expectations in 2009. In 2015 they collect data on survey respondents’ actual earnings. They find that a student’s earnings expectations are informative about future earnings and that students sorted into occupations based on expected earnings. Wiswall and Zafar (2021) find that college students’ beliefs about future income are significant related to realized income 6 years later and that mean expected income is almost identical to mean realized income. 63 This was for two reasons. First, individuals in graduate or professional school have an unusually low income given their level of education. Second, borrowers who are in graduate or professional school can get a deferment and temporarily lower their required loan payment to $0. 41 graduated from MSU with $30,000 in student loan debt, and the debt has an interest rate of 5%. Repayment Plan 1 is an IDR plan like the widely available Revised Pay as You Earn Plan. Repayment Plan 2 is a time-based repayment plan like the Standard Repayment Plan. Information about the repayment plans is shown in three tables. The first table describes the two repayment plans. The other tables have estimates of minimum monthly payments, estimated length of time making payments, and total amount paid over the course of the loan. These estimates are given for the two repayment plans for starting post-college incomes of between $10,000 and $90,000 in $10,000 increments64. After being asked about their income expectations and choice of repayment plan for the first time, survey respondents are randomly shown one of the two information treatments described below. One information treatment contains information on the median yearly incomes of individuals in the U.S. with a college degree65. I refer to this treatment as the All-Graduates Income Treatment. 
This statistic is calculated using the American Community Survey 2015 – 2019 IPUMS file (Ruggles, Flood, Goeken, Grover, Meyer, Pacas, and Sobek 2020). This information is intended to be a placebo treatment in that it would not change a survey respondent’s income expectations. I expected that students would think that information about the incomes of college graduates of all ages, majors, and universities is too general to affect their earnings expectations66. The purpose of including a treatment like this is to deal with issues related to the Hawthorne effect and to have a control group without letting survey respondents know that they are in the control group. The other treatment shows survey respondents the median yearly earnings of MSU graduates with majors similar to the respondent’s primary major. I call this treatment the Major Specific Income Treatment. The median earnings data is from the U.S. Department of 64 See Appendix B for screenshots from survey. Total amount paid and length of time making payments are calculated assuming simple daily interest and income increasing at 5% on January 1 st of each year. Additional details about those calculations are available upon request. 65 I calculated the average income to be $53,268. College graduates are identified in the American Community Survey by having a degree field that is not N/A. 66 The income of college graduates varies depending on a student’s major. Using the data from the College Scorecard I describe in the next paragraph, median first year incomes for MSU graduates vary from $18,200 to $74,700 depending on the graduate’s major. Income also varies by age. In Chart 2 Abel and Deitz (2014) estimate that, controlling for worker characteristics, the incomes of college graduates increase from about $40,000 when they are in their 20’s to about $80,000 when they are in their 50’s. 42 Education’s College Scorecard67. The College Scorecard has data on median earnings for students of either a single major or a group of related majors. The median earnings statistic that a student who received the Major Specific Income Treatment sees is based on the survey respondent’s self-reported primary major. The statistics shown are for median earnings during the first year after students have graduated from MSU. Only students who got federal financial aid are included in the sample to calculate the medians. My hypothesis is that the major specific earnings data would increase the probability students expected to earn a low income, and that this would cause them to be more likely to choose the IDR plan. Cox, Kreisman, and Dynarski (2020) study student loan repayment plan choice by randomly providing or not providing students with information related to their future income. In that study, about half of college students who participated in a laboratory experiment are provided information on the distribution of earnings of 24-year-old bachelor’s degree holders. Those who see the information expect themselves and their peers to earn statistically significantly less than experiment participants who are not provided with that information. Based on this, I expect that providing students with information about the earnings of recent BA holders would shift their expected income distribution to center around lower incomes. This in turn would increase students’ subjective probability that they would earn a low income. 
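To make the trade-off between the two survey plans concrete, the sketch below compares monthly payments under a 10-year standard plan and a REPAYE-style IDR plan for the survey's hypothetical $30,000 loan at 5% interest. The assumptions are mine, not the survey's calculator: standard amortization for the 10-year plan and 10% of income above 150% of the federal poverty line for a one-person household (about $19,140 in 2020). The survey's tables were built with simple daily interest and growing income, so the numbers will not match exactly, but the crossover income implied by this simplified calculation is in the neighborhood of the figure reported in the text below.

```python
# Compare monthly payments on a standard 10-year plan and a REPAYE-style IDR plan.
def standard_monthly_payment(balance=30_000, annual_rate=0.05, years=10):
    r = annual_rate / 12
    n = years * 12
    return balance * r / (1 - (1 + r) ** -n)      # standard amortization formula

def idr_monthly_payment(income, exempt_income=19_140, share=0.10):
    return max(income - exempt_income, 0) * share / 12

std = standard_monthly_payment()                   # roughly $318 per month
for income in range(10_000, 90_001, 10_000):
    idr = idr_monthly_payment(income)
    cheaper = "IDR" if idr < std else "standard"
    print(f"income ${income:>7,}: IDR ${idr:7.2f} vs standard ${std:7.2f} -> {cheaper} lower")
```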
In my survey, given the hypothetical borrowing amount, available plans, and interest rate, borrowers whose annual income is less than $58,184 would have lower required monthly payments on the IDR plan described in the survey than on the non-IDR plan described in the survey. After being shown one of the treatments, respondents are again asked the same questions related to income expectations and repayment plan choice. Then survey respondents are asked four questions to test their understanding of the two repayment plans. See Appendix B.1 for screenshots of these questions. The survey ends with a series of questions related to the survey respondent's demographics and their college financial aid. This section includes questions about the survey respondent's gender, race, and age. Survey respondents are also asked how much student loan debt they have.68

68 Past research that compares how much student loan debt students say they have in surveys to university administrative records of student loan debt has found that many students do not correctly report how much student loan debt they have (Akers and Chingos, 2014; Andruska, Hogarth, Fletcher, Robes, and Wohlgemuth 2014). I use data on student loan debt only to categorize survey respondents who do and do not have student loans. Unfortunately, even this categorization likely has measurement error. Andruska, Hogarth, Fletcher, Robes, and Wohlgemuth (2014) find that 62 of 165 students in their study who reported in a survey that they had no student loan debt had student loan debt in administrative records.

2.5 Empirical Framework

The goal of this research project is to use a randomized information treatment to create exogenous variation in low-income expectations between two groups of students. I then want to see if the group that believes they have a higher probability of earning a low income is more likely to choose the IDR plan. To study how low-income expectations and repayment plan choice are affected by the treatments, I use the following estimating equations:

(2.1) Outcome_i = β0 + β1 sawMajorSpecificIncome_i + βX_i + ε_i

(2.2) Outcome_it = β0 + β1 sawMajorSpecificIncome_i + β2 AfterTreatment_t + β3 sawMajorSpecificIncome_i × AfterTreatment_t + ε_it

i indexes the survey respondent. Outcome_i represents two different variables. The first is the subjective probability a student believes they would earn a low income. For my main analysis, I define earning a low income as earning $0 to $30,000 a year. The second is an indicator variable that equals 1 if the respondent chooses the IDR plan and equals 0 if the respondent chooses the non-IDR plan. X_i is a vector of covariates. The covariates in the analysis are the same covariates I use in the balance tests. X_i includes indicator variables for having a single major, being female, being white, having a Pell Grant, being a first-generation college student, and having student loans. X_i also includes continuous variables for a student's age and the student's subjective probability that they will attend graduate or professional school in the next 20 years. Equation 2.1 uses data from the income expectations and repayment plan choice questions that are asked after the survey respondents see one of the treatments. The coefficient of interest in Equation 2.1 is β1. β1 is the average expected outcome for survey respondents in the case they saw the Major Specific Income Treatment minus the average expected outcome for survey respondents in the case they saw the All-Graduates Income Treatment (the treatment effect of the Major Specific Income Treatment).
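As a rough illustration of how Equations 2.1 and 2.2 could be estimated, the sketch below runs both regressions on simulated data with heteroskedasticity-robust standard errors. The data frame, column names, and the built-in 8-point effect are placeholders, not the survey data or the dissertation's code, and the covariates X_i are omitted for brevity.

```python
# Minimal sketch of Equations 2.1 and 2.2 on simulated data. Column names and
# the simulated treatment effect are illustrative placeholders only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1_396
pre = pd.DataFrame({
    "id": np.arange(n),
    "saw_major_specific": rng.integers(0, 2, n),
    "after_treatment": 0,
    "low_income_prob": rng.integers(0, 11, n) * 10.0,
})
post = pre.copy()
post["after_treatment"] = 1
post["low_income_prob"] = np.clip(
    pre["low_income_prob"] + 8 * post["saw_major_specific"] + rng.normal(0, 5, n), 0, 100)
panel = pd.concat([pre, post], ignore_index=True)

# Equation 2.1: post-treatment outcomes only.
eq21 = smf.ols("low_income_prob ~ saw_major_specific", data=post).fit(cov_type="HC1")
# Equation 2.2: stacked pre/post data; beta_3 on the interaction is the treatment
# effect net of pre-treatment differences in the outcome.
eq22 = smf.ols("low_income_prob ~ saw_major_specific * after_treatment",
               data=panel).fit(cov_type="HC1")
print(eq21.params["saw_major_specific"],
      eq22.params["saw_major_specific:after_treatment"])
```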
Equation 2.2 uses data from the income expectations and repayment plan choice questions that are asked both before and after the survey respondent has seen information about the income of college graduates. t indexes when the outcome is measured in the survey: t = 0 when the outcome is measured before the income information is shown, and t = 1 after the income information is shown. AfterTreatment_t is an indicator variable for the outcome being recorded after the survey respondent has seen the income information. The coefficient of interest in Equation 2.2 is β3, which is also the treatment effect of the Major Specific Income Treatment. Equation 2.2 improves on Equation 2.1 by controlling for pre-treatment differences in the outcome variable. The following equation is used to test whether the effect of the treatment on an outcome is statistically significantly different for different subgroups:

(2.3) Outcome_ist = β0 + β1 sawMajorSpecificIncome_i + β2 SubgroupMember_s + β3 AfterTreatment_t + β4 sawMajorSpecificIncome_i × SubgroupMember_s + β5 sawMajorSpecificIncome_i × AfterTreatment_t + β6 SubgroupMember_s × AfterTreatment_t + β7 sawMajorSpecificIncome_i × SubgroupMember_s × AfterTreatment_t + ε_ist

Equation 2.3 regresses outcomes for respondent i in subgroup s measured at time t. SubgroupMember_s is an indicator variable for being a member of a subgroup, such as having student loans or having a low-income major. Equation 2.3 has three sources of variation: variation in the outcome by treatment, variation in the outcome by subgroup, and variation in the outcome before and after the treatment. The coefficient of interest in Equation 2.3 is β7. β7 can be thought of as how much the effect of the Major Specific Income Treatment differs between survey respondents who are and are not members of the subgroup, controlling for pre-treatment differences in the outcome by subgroup and treatment.

2.6 Results

2.6.1 Analysis Sample

Before any data are analyzed, incomplete survey responses and any response after the first response by the same person are removed.69 After that, 1,581 responses are left. The survey has a response rate of 22.6%. 38 international students are removed from the sample because only U.S. citizens are eligible for student loans from the U.S. Federal Government. 147 additional students are dropped because of missing income information.70 This leaves an analysis sample of 1,396 completed responses.

69 In cases where 2 or more responses had the same Venmo account name or email for Paypal, all responses except for the response with the earliest recorded date were deleted. 4 completed responses had neither a Venmo account nor an email for Paypal and therefore could not be checked against other responses.

2.6.2 Summary Statistics

Appendix B.2 contains summary statistics for the analysis sample. The sample contains individuals with 95 different primary majors.
The 5 majors with the highest number of individuals in the sample are: Human Biology (107 respondents), Psychology (68 respondents), Finance (66 respondents), Neuroscience (62 respondents), and Kinesiology (59 respondents). 87% of respondents reported having only 1 major when they took the survey. Each MSU major is matched to a description of a major or group of majors in the College Scorecard to determine what income would be shown if the respondent received the Major Specific Income Treatment. Survey respondents were matched to 61 College Scorecard major descriptions, with some College Scorecard major descriptions being matched to more than one MSU major. The top 5 College Scorecard major descriptions in the data are: Physiology, Pathology, and Related Science (137 respondents), Psychology (68 respondents), Finance and Financial Services Management (66 respondents), Public Relations, Advertising, and Applied Communications (65 respondents), and Business Administration, Management, and Operations (63 respondents). 59% of the sample is female and 81% is white. The average age of respondents is 21. 59% of respondents have student loans and 31% have ever had a Pell Grant. 19% of respondents are first-generation college students. On average, individuals in the sample believed they had a 65% subjective probability of attending graduate or professional school in the next 20 years.71

70 These individuals are removed from the analysis because, based on their primary major, if they were selected to receive the Major Specific Income Treatment they would see a median income of "Data Not Available." For many of these majors there is a major or group of majors in the College Scorecard dataset that is similar to the MSU major; however, the median earnings of MSU graduates for that major or group of majors is listed as unavailable. A list of which majors were or were not in the analysis sample is available upon request.
71 27% of individuals were either in or seeking continuing education 6 months after graduating from MSU. See https://careernetwork.msu.edu/outcomes/, accessed November 11th, 2021. Using data from the 2007 – 2008 Baccalaureate and Beyond Longitudinal Study, Baum and Steele (2017) estimate that 39% of individuals who graduated with a bachelor's degree in 2007 – 2008 enrolled in a graduate degree program within 4 years of graduating from college.

2.6.3 Balance Tests

In Appendix B.3, I test for balance in covariates between survey respondents who saw the All-Graduates Income Treatment and survey respondents who saw the Major Specific Income Treatment. I regress a binary variable for seeing the Major Specific Income Treatment on 8 binary variables related to a survey respondent's demographics and college financial aid. The only covariate coefficient that is statistically significant at the 5% level is the coefficient for having a single major. An F-test of joint significance for that regression has a p-value of 0.2376. Therefore, I believe the covariates are balanced across the treatments. Because of that, I interpret estimates of the coefficients of interest as causal effects of seeing the Major Specific Income Treatment on the outcome, relative to seeing the All-Graduates Income Treatment.
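A minimal sketch of this kind of balance test on simulated data is below. The regression's joint F-test is the test reported above; the covariate list is an illustrative subset rather than the full set of 8 binary variables, and the data are random placeholders.

```python
# Sketch of a balance test: regress the treatment indicator on pre-treatment
# covariates and jointly test that all coefficients are zero. Simulated data;
# column names are illustrative placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1_396
bal = pd.DataFrame(
    rng.integers(0, 2, (n, 7)),
    columns=["saw_major_specific", "single_major", "female", "white",
             "pell", "first_gen", "has_loans"],
)
fit = smf.ols(
    "saw_major_specific ~ single_major + female + white + pell + first_gen + has_loans",
    data=bal,
).fit()
# With random assignment the joint F-test should usually fail to reject.
print(fit.f_pvalue)
```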
2.6.4 Distribution of Income Expectations by Treatment Before and After Treatment

Figure 2.1 - Income Expectations by Treatment Before and After Treatment
Notes: N = 2,792. Because each respondent gave their income expectations twice, each respondent has 2 observations.

Figure 2.1 shows the average distribution of income expectations for each treatment before and after seeing the income information. First, the figure shows that both treatments have similar average distributions of income expectations before the income information is shown. For the two treatments, the middle three income ranges have exactly the same average subjective probability, and the other two income ranges differ by no more than 2 percentage points. Second, the figure shows that both treatments cause income expectations to change. Survey respondents who see the All-Graduates Income Treatment believe they have a higher average probability of receiving an income between $30,000 and $60,000, and a lower or unchanged probability of earning an income in the other income ranges, after they see the income information. It is possible that income information causes survey respondents to believe they have an increased probability of receiving an income close to the typical income number they see. In the case of the All-Graduates Income Treatment this income is $53,268. This is different from my expectation that the information in the All-Graduates Income Treatment would be too general to affect a survey respondent's income expectations. Survey respondents who see the Major Specific Income Treatment increase the probability they believe they would earn between $0 and $30,000 and between $30,000 and $60,000. Those survey respondents also have a decreased average probability of earning an income in the other three income ranges. Given that 83% of survey respondents who see the Major Specific Income Treatment see a typical income less than $60,000, this is consistent with survey respondents responding to income information by increasing the probability they believe they will earn an income close to the income they see. Having the All-Graduates Income Treatment change survey respondents' income expectations does not invalidate my research design. As long as the two treatments create exogenous variation in low-income expectations, I can relate differences in low-income expectations, uncorrelated with anything else, to differences in repayment plan choice. However, having the All-Graduates Income Treatment change income expectations means that I do not have evidence for how students would change their income expectations and repayment plan choice if they were simply asked the income expectations and repayment plan questions twice.

2.6.5 Effect of Treatment on Low Income Expectations

Table 2.1 – Effect of Treatment on Low Income Expectations
Outcome: Percent Chance of Earning a Low Income
                                                 (1)          (2)          (3)
Major Specific Treatment                     7.9997***    8.0647***
                                             (1.2381)     (1.2152)
Major Specific Treatment * After Treatment                             7.0947***
                                                                       (1.6578)
Covariates                                       N            Y            N
N                                              1,396        1,396        2,792
Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. For the regression with Major Specific Treatment * After Treatment, each survey respondent has two observations: one before the treatment and one after the treatment.
This table shows the results of regressing the subjective probability (scaled to be between 0 and 100) that a survey respondent believed they would earn between $0 and $30,000 5 years after graduating from MSU on either a binary variable for the survey respondent seeing the Major Specific Income Treatment (Major Specific Treatment), or on a binary variable for seeing the Major Specific Income Treatment, a binary variable for the income expectations question coming after the treatment (After Treatment), and the interaction between those variables. Standard errors are robust to heteroskedasticity. Covariates included in the regression are binary variables for the survey respondent being female, being white, having only 1 major, having a Pell Grant, being a first-generation college student, and having student loans, and discrete variables for the survey respondent's age and the probability the survey respondent believed they would attend graduate or professional school within 20 years of answering the survey.

Table 2.1 shows estimates of how seeing the Major Specific Income Treatment changes low-income expectations relative to seeing the All-Graduates Income Treatment. Seeing the Major Specific Income Treatment causes students to believe they have, on average, an 8-percentage-point higher probability of earning a low income compared to if they saw the All-Graduates Income Treatment. Controlling for covariates changes the estimate very little, consistent with covariates being balanced across treatments. Controlling for pre-treatment differences in low-income expectations reduces the treatment effect to 7 percentage points. In all cases the effect is statistically significant.

2.6.6 Effect of Treatment on Repayment Plan Choice

Figure 2.2 - Plan Choice by Treatment Before and After Treatment
Notes: N = 2,792. Because each respondent chose a repayment plan twice, each respondent has 2 observations.

Table 2.2 – Effect of Treatment on Repayment Plan Choice
Outcome: Choose IDR Plan
                                                 (1)          (2)          (3)
Major Specific Treatment                     -0.0432*     -0.0455*
                                             (0.0261)     (0.0262)
Major Specific Treatment * After Treatment                             -0.0187
                                                                       (0.0367)
Covariates                                       N            Y            N
N                                              1,396        1,396        2,792
Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. For the regression with Major Specific Treatment * After Treatment, each survey respondent has two observations: one before the treatment and one after the treatment. This table shows the results of regressing an indicator variable for the survey respondent choosing the IDR plan on either a binary variable for the survey respondent seeing the Major Specific Income Treatment (Major Specific Treatment), or on a binary variable for seeing the Major Specific Income Treatment, a binary variable for the repayment plan choice question coming after the treatment (After Treatment), and the interaction between those variables. Standard errors are robust to heteroskedasticity. Covariates included in the regression are binary variables for the survey respondent being female, being white, having only 1 major, having a Pell Grant, being a first-generation college student, and having student loans, and discrete variables for the survey respondent's age and the probability the survey respondent believed they would attend graduate or professional school within 20 years of answering the survey.

Figure 2.2 shows the percent of the analysis sample who saw each treatment who chose the IDR plan before and after seeing the treatment.
Table 2.2 contains estimates of the treatment effect of seeing the Major Specific Income Treatment on the percentage chance students choose the IDR plan. I hypothesized that because the Major Specific Income Treatment would increase the subjective probability that survey respondents believed they would earn a low income, the Major Specific Income Treatment would cause survey respondents to be more likely to choose the IDR plan. Contrary to my hypothesis, all 3 regressions in Table 2.2 estimate that the treatment effect of the Major Specific Income Treatment is to decrease the percent of survey respondents who choose the IDR plan. The effect is about -4 percentage points without covariates, -5 percentage points with covariates, and -2 percentage points controlling for pre-treatment differences in plan choice. No coefficient is statistically significant. When controlling for pre-treatment differences in plan choice, the 95% confidence interval of the treatment effect is -9 percentage points to 5 percentage points. This is despite the fact that, consistent with my hypothesis, seeing the Major Specific Income Treatment causes students to have a higher average subjective probability of earning a low income compared to seeing the All-Graduates Income Treatment.

2.6.7 Robustness Checks

In Appendix B.3 I find a statistically significant difference between the treatments in the proportion of survey respondents who have only one major. This difference may help explain differences in outcomes by treatment. In results available upon request, I recreate Figures 2.1 and 2.2 and Tables 2.1 and 2.2 dropping all students with more than one major. The results are similar, with a statistically significant difference in low-income expectations by treatment but not a statistically significant difference in plan choice by treatment. The magnitudes of all coefficients of interest in these tables are within 1 percentage point of the coefficients of interest for the full sample.

To see if the low-income expectations results are robust to a change in the highest income that is considered low, I replicate the analysis on low-income expectations changing the definition of low income from earning between $0 and $30,000 to earning between $0 and $60,000. The results of this analysis are in Appendix B.4. Survey respondents who received the Major Specific Income Treatment have a subjective probability of earning between $0 and $60,000, measured after the survey respondents see the income information, that is 5 to 6 percentage points higher than for survey respondents who saw the All-Graduates Income Treatment. This after-treatment difference is statistically significant at the 5% level with or without covariates. Accounting for pre-treatment differences in income expectations reduces this difference to about 4 percentage points. In the regression using both before- and after-treatment income expectations, this lower coefficient value and a much larger standard error make the difference not statistically significant at the 10% level.

2.6.8 Heterogeneity by Having Student Loans

Only 41% of survey respondents in the analysis sample report they have student loans.72 Because only students with student loans must choose a student loan repayment plan, I want to see if the results are similar for respondents with and without student loans.
To see how survey respondents with student loans differ from survey respondents without student loans, I test for statistically significant differences in other covariates between the two groups using a multivariate regression. The results of this analysis are in Table B.10. Survey respondents with student loans are statistically significantly more likely to have a Pell Grant (23 percentage points) and to be a first-generation college student (11 percentage points). Overall, these results indicate that the main difference between students with and without student loans in my sample is that survey respondents with student loans come from families with a lower socio-economic status than those without student loans.

Tables B.11 to B.13 show the treatment effect of seeing the Major Specific Income Treatment for students who do and do not have student loan debt. The treatment effect of seeing the Major Specific Income Treatment on low-income expectations for survey respondents with student loans is a not statistically significant 4.72 percentage points higher than it is for survey respondents without student loans. For survey respondents with student loans, the treatment effect of seeing the Major Specific Income Treatment on the subjective probability of earning a low income is a statistically significant 9.09 percentage points. For survey respondents without student loans this effect is a not statistically significant 4.37 percentage points. The treatment effect of seeing the Major Specific Income Treatment on the probability of choosing an IDR plan for survey respondents with student loans is a not statistically significant -6.06 percentage points different than it is for survey respondents without student loans. For survey respondents with student loans, the treatment effect of seeing the Major Specific Income Treatment is a not statistically significant decrease in the probability of choosing an IDR plan of 4.24 percentage points. For survey respondents without student loans this effect is a not statistically significant increase in the probability of choosing an IDR plan of 1.81 percentage points.

72 Using information in the Common Data Set voluntarily reported by Michigan State University, The Institute for College Access & Success concluded that 25% of student debt for college graduates at Michigan State University was non-federal. See https://ticas.org/wp-content/uploads/2020/10/Michigan.pdf. The major source of non-federal student loan debt is student loans given by private sector financial companies. These companies, as far as I know, do not offer IDR plans. Because I did not ask if a survey respondent's loans were federal or private, I am unable to know which survey respondents with student loan debt had private student loans.

2.6.9 Heterogeneity by Income of Major

Appendix B.6 contains tables that look at the heterogeneity of results by the income of the survey respondent's major. For this analysis, the survey respondent's major income is equal to the income the respondent would see if they were chosen to receive the Major Specific Income Treatment. This means the major income is the income of MSU graduates with federal financial aid one year after they graduated with a major similar to the survey respondent's primary major. The median major income for the sample is $37,400. Survey respondents are classified as having a low-income major if their major income is below the sample median major income.
Survey respondents are classified as having a high-income major if their major income is equal to or above the sample median major income. The treatment effect of seeing the Major Specific Income Treatment on low-income expectations for survey respondents with a low-income major is a statistically significant 13.88 percentage points higher than it is for survey respondents with high-income majors. For survey respondents with a low-income major, the treatment effect of seeing the Major Specific Income Treatment on a survey respondent's subjective probability of earning a low income is a statistically significant 13.91 percentage points. For survey respondents with a high-income major this effect is a not statistically significant 0.04 percentage points. The treatment effect of seeing the Major Specific Income Treatment on the probability of choosing an IDR plan for survey respondents with a low-income major is a not statistically significant 8.05 percentage points different than it is for survey respondents with a high-income major. For survey respondents with a low-income major, the treatment effect of seeing the Major Specific Income Treatment is a not statistically significant increase in the probability of choosing an IDR plan of 2.06 percentage points. For survey respondents with a high-income major this effect is a not statistically significant decrease in the probability of choosing an IDR plan of 5.99 percentage points.

2.6.10 Change in Income Expectations, Change in Repayment Plan Choice

One reason for the small effect of the Major Specific Income Treatment on repayment plan choice may be that few survey respondents changed their low-income expectations when they saw the income information. Overall, 38% of survey respondents changed their low-income expectations after they saw the income information: 33% of survey respondents who saw the All-Graduates Income Treatment and 43% of respondents who saw the Major Specific Income Treatment.

Figure 2.3 - Increase in Low-Income Expectations After Treatment by Treatment
Notes: N = 531. The 865 respondents who did not change their low-income expectations after receiving the income information are not shown in the figure to make it easier to read.

Figure 2.3 shows the distribution of the increase in low-income expectations after the income information, separately for the All-Graduates Income Treatment and the Major Specific Income Treatment. Survey respondents who did not change their low-income expectations are removed to make the figure easier to read. Survey respondents who see the Major Specific Income Treatment are more likely to increase their subjective probability of earning a low income, while survey respondents who see the All-Graduates Income Treatment are more likely to decrease their subjective probability of earning a low income.

Figure 2.4 - Plan Choice After Treatment by Change in Low-Income Expectations
Notes: N = 1,337.

Figure 2.4 shows the relationship between the change in survey respondents' low-income expectations and the probability respondents choose the IDR plan after the treatment.73 If low-income expectations were strongly related to repayment plan choice, I would expect the probability of choosing the IDR plan after the treatment to be higher for survey respondents who had a larger increase in their probability of earning a low income after the treatment. In that case, the bars would get higher as you moved to the right on the graph.
Visually, there is no large consistent increase or decrease in the height of the bars as you move to the right along the graph. This is consistent with low-income expectations having little effect on repayment plan choice.

73 If the number of survey respondents who changed their low-income expectations by a certain number of percentage points, conditional on plan choice (for example, survey respondents who initially chose the IDR plan and whose low-income expectations decreased by 40 percentage points), is less than 10, then those changes in the probability of earning a low income are not shown.

Table 2.3 - Statistical Test of Relationship Between Change in Low Income Expectations and Change in Plan Choice
Coefficient on Change in Low Income Expectations (standard error), by outcome:
  IDR to IDR:                  -4.587×10^-4 (7.712×10^-4); no sample restriction; N = 1,396
  IDR to non-IDR:              -6.609×10^-4 (5.517×10^-4); no sample restriction; N = 1,396
  Non-IDR to IDR:               1.215×10^-3** (5.529×10^-4); no sample restriction; N = 1,396
  Non-IDR to non-IDR:          -9.55×10^-5 (7.419×10^-4); no sample restriction; N = 1,396
  Choose IDR After Treatment:   8.833×10^-4 (1.019×10^-3); only respondents who chose the IDR plan before the treatment; N = 884
  Choose IDR After Treatment:   1.881×10^-3* (1.025×10^-3); only respondents who chose the non-IDR plan before the treatment; N = 512
Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. IDR to IDR is an indicator variable for the survey respondent choosing the IDR plan both before and after the treatment. IDR to non-IDR is an indicator variable for choosing the IDR plan before the treatment and the non-IDR plan after the treatment. Non-IDR to IDR is an indicator variable for choosing the non-IDR plan before the treatment and the IDR plan after the treatment. Non-IDR to non-IDR is an indicator variable for choosing the non-IDR plan both before and after the treatment. Change in Low Income Expectations is the subjective probability (scaled to be between 0 and 100) that the survey respondent believed they would earn between $0 and $30,000 5 years after they graduated from MSU, measured after they saw the treatment, minus that subjective probability measured before they saw the treatment. Standard errors are robust to heteroskedasticity.

Table 2.3 shows the results of regressions of different variables related to plan choice on a survey respondent's change in the probability they believe they would earn a low income. The only coefficient that is statistically significant at the 5% level is in the regression on the indicator for switching from preferring the non-IDR plan before seeing a treatment to the IDR plan after seeing a treatment. Based on that regression, a 10-percentage-point increase in the probability the survey respondent believed they would earn a low income is associated with a 1.2-percentage-point increase in the probability of switching from the non-IDR plan to the IDR plan. The results in Figure 2.4 and Table 2.3 show that, for the most part, how a survey respondent changed their expectations of earning a low income after seeing the treatment is not related to whether they changed the repayment plan they preferred after seeing the treatment.

2.7 Discussion

Contrary to my hypothesis, I find that the Major Specific Income Treatment did not cause survey respondents to be statistically significantly more likely to choose the IDR plan. This is despite the Major Specific Income Treatment causing survey respondents to have a statistically significantly higher subjective probability of earning a low income.
When controlling for pre-treatment differences in covariates, the 95% confidence interval of the treatment effect of the Major Specific Income Treatment on choosing an IDR plan is -9 percentage points to 5 percentage points. This means it is highly unlikely the treatment effect of the Major Specific Income Treatment is above 5 percentage points. I conclude the survey respondents in my sample are not choosing a student loan repayment plan based on minimizing their required payments in the event they have a low income.

One possible explanation for the results is that survey respondents are worried about the costs of making low payments on their student loans. In general, if a borrower makes a smaller monthly payment on their student loans, they will pay more interest over the life of the loan. A lower monthly loan payment will also cause a respondent to take longer to pay off the loan. Survey respondents are shown tables with information about starting monthly payments, estimated total amount of money paid on the loan, and estimated total time making payments for different starting levels of income for both the non-IDR and IDR plans. This information is on the page where students are asked to choose either an IDR or non-IDR plan. These estimates assume a survey respondent's income increases by 5% at the start of every year.74 Whether a survey respondent would pay more on their student loans if they made lower payments depends on how much of their loans are forgiven. I estimate that if the survey respondent made either $30,000, $40,000, or $50,000 when they began making payments on their loans, their total payments would be higher on the IDR plan.75 However, because the IDR plan forgives any remaining loan balance after 20 years of payments, a student who is on the IDR plan described in the survey and who had $30,000 of student loan debt when they graduated from MSU might have lower total loan payments on the IDR plan, even though their required monthly payments are generally lower than they would be on the non-IDR plan. I estimate this happens if a survey respondent's income is either $10,000 or $20,000 when they begin making payments. Even if survey respondents understood the benefits of being on an IDR plan in terms of total payments when their starting income is $10,000 or $20,000, the cost of increased total payments when their starting income is $30,000, $40,000, or $50,000 may have made them less likely to change from the non-IDR plan to the IDR plan when they believed they had a higher probability of earning a low income.

74 The U.S. Department of Education assumes borrowers' incomes increase by 5% per year when it estimates future student loan payments on its exit counseling website.
75 This assumes the survey respondent starts out with $30,000 in student loan debt and makes the minimum required payment every month after they start paying back their loans, for as long as they have a positive loan balance.
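The following is a rough, self-contained sketch of this kind of total-payment comparison under the setup stated in the survey: a $30,000 balance, 5% interest, income growing 5% per year, and IDR forgiveness of any remaining balance after 20 years of payments. As in the earlier sketch, the IDR formula of 10% of income above $20,000 is an assumption consistent with figures reported in this chapter rather than the survey's exact wording, and the numbers it produces are illustrative, not the dissertation's estimates.

```python
# Rough simulation of total payments under the two hypothetical plans. The IDR
# payment rule (10% of income above $20,000) is an assumption for illustration.
def total_paid_idr(start_income, balance=30_000.0, rate=0.05, years=20):
    income, paid = float(start_income), 0.0
    for year in range(years):
        for month in range(12):
            owed = balance * (1 + rate / 12)               # balance with this month's interest
            payment = min(owed, max(0.0, 0.10 * (income - 20_000)) / 12)
            balance = owed - payment
            paid += payment
            if balance <= 0:
                return round(paid, 2)
        income *= 1.05                                     # assumed 5% annual income growth
    return round(paid, 2)                                  # remaining balance forgiven after 20 years

def total_paid_standard(balance=30_000.0, rate=0.05, years=10):
    r, n = rate / 12, years * 12
    return round(r * balance / (1 - (1 + r) ** -n) * n, 2)

for start in (10_000, 20_000, 30_000, 40_000, 50_000):
    print(start, total_paid_idr(start), total_paid_standard())
```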
To test if concerns about having higher total payments might explain the lack of an effect of the Major Specific Income Treatment on repayment plan choice, in Appendix B.7 I analyze the effect of the Major Specific Income Treatment on a survey respondent's subjective probability of earning between $30,000 and $60,000 a year. That is the income range in which I estimate a survey respondent would have higher total payments on the IDR plan than on the non-IDR plan. If the Major Specific Income Treatment causes survey respondents to believe they have a higher probability of earning $30,000 to $60,000 a year, the possible additional costs to them in terms of total payments would help explain why the Major Specific Income Treatment did not increase the probability respondents chose the IDR plan. Using data from before and after the treatments, the treatment effect of the Major Specific Income Treatment on that probability is a not statistically significant -2.98 percentage points. For all specifications in Appendix B.7 the treatment effect is negative. This result is not consistent with higher total payments explaining the lack of an effect of the treatment on plan choice.

Another thing survey respondents may have been concerned about is how long they would have student loans. Even if they earned less than $20,000, and therefore would not be required to make payments on the IDR plan, they would still have student loan debt. If having student loan debt imposes a mental cost on individuals no matter the level of payments, then even respondents who earned a low income would have a reason to make higher payments so they could be debt free sooner. This may have discouraged survey respondents who had a higher probability of earning a low income from choosing the IDR plan. If a borrower's starting salary is $10,000, $20,000, $30,000, or $40,000, the table in the survey shows an estimated time making payments on the IDR plan that is longer than the 10 years of payments on the non-IDR plan. There are both costs and benefits to having a low income while being on the IDR plan. This may help explain why causing survey respondents to have a higher subjective probability of earning a low income did not make them more likely to choose an IDR plan.

Finally, it may be the case that survey respondents did not respond to having a greater subjective probability of earning a low income by being more likely to choose the IDR plan because they did not understand the differences between the IDR plan and the non-IDR plan. Survey respondents correctly answer, on average, only 1.8 of the 4 questions testing their understanding of the two repayment plans. This is not statistically significantly different by treatment. Survey respondents who see the Major Specific Income Treatment correctly answer 0.11 fewer questions than survey respondents who see the All-Graduates Income Treatment.76 Neither is there a statistically significant difference in the number of correct responses between survey respondents with and without student loans.77 It may be the case that if survey respondents better understood the plans, they would have responded to having a greater subjective probability of earning a low income in the way I predicted. I test if the level of repayment plan understanding is related to the treatment effect of the Major Specific Income Treatment in Appendix B.8. To do this, I compare the treatment effect for survey respondents who answer 0 or 1 of the 4 questions correctly (survey respondents I label as having low plan understanding) to the treatment effect for students who answer 2 or more questions correctly (survey respondents I label as having high plan understanding). For both survey respondents with high and with low plan understanding, the Major Specific Income Treatment causes survey respondents to believe that they have a statistically significantly higher probability of earning a low income.

76 The p-value for this difference is 0.155.
77 Survey respondents without student loans answer on average 1.86 questions correctly while survey respondents with student loans answer on average 1.82 questions correctly. The p-value for the difference is 0.569.
However, for neither group did the Major Specific Income Treatment statistically significantly increase the probability of choosing the IDR plan. The treatment effects for the two groups were not statistically significantly different for either low-income expectations or repayment plan choice. Based on this, I do not think low understanding of the repayment plans explains why the Major Specific Income Treatment did not statistically significantly increase the probability a survey respondent chose the IDR plan.

2.8 Conclusion

In this paper I test the hypothesis that a student's subjective probability of earning a low income is a causal factor in whether they prefer an IDR or non-IDR student loan repayment plan. I predicted that students who had an exogenously higher subjective probability of earning a low income would be more likely to choose an IDR plan. I test that prediction using data from a web survey emailed to undergraduate seniors at MSU. The survey randomizes the type of information about post-college incomes survey respondents are shown, creating two groups of survey respondents with exogenously different probabilities of earning a low income. I find that seeing the Major Specific Income Treatment causes survey respondents to believe they have a statistically significantly higher probability of earning a low income compared to survey respondents who see the All-Graduates Income Treatment. Despite this, seeing the Major Specific Income Treatment does not cause respondents to have a statistically significantly different probability of choosing the IDR plan than survey respondents who see the All-Graduates Income Treatment. This pattern is similar across a variety of subsamples, such as survey respondents with and without student loans, survey respondents with lower- and higher-earning majors, and survey respondents with lower and higher understanding of the repayment plans. I conclude that changing a student loan borrower's expectation of earning a low income will not statistically significantly change their repayment plan choice. Attempts to increase take-up of IDR plans may have more success focusing on other changes to student loan repayment plan choice, such as emphasizing the benefits of IDR plans (Abraham, Filiz-Ozbay, Ozbay, and Turner, 2020) or making an IDR plan the default repayment plan choice (Cox, Kreisman, and Dynarski, 2020).

Future research could explore what borrowers in general, and students in particular, care most about when paying back their loans. Do they care about minimizing required payments, the total amount their loans cost, how long they have any debt, or some combination of the above? How do borrowers in general, and students in particular, balance the tradeoff between lower monthly payments and a longer time in debt? If students had more choices related to how they repaid their student loans, such as more control over the length of time they had to pay back their student loans on the non-IDR plan, or over how payments were calculated as a function of their annual income on an IDR plan, how would they design their repayment plan? A second line of future research might dig deeper into students' expectations of their futures. How do students expect their income to change over time?
How do students expect their incomes to change if they attend graduate or professional school? How accurate are students' beliefs about how much they will earn and how likely they are to attend graduate or professional school? What do students think they will be doing if they earn different ranges of income?

CHAPTER 3: THE EFFECT OF TEST SCORE PERFORMANCE LABELS ON POSTSECONDARY OUTCOMES: EVIDENCE FROM MICHIGAN

3.1 Introduction

Governments throughout the world administer standardized exams to their students to learn about their academic achievement (Schleicher 2015). Much of the literature on standardized testing has focused on how schools (Figlio and Loeb 2011; Figlio and Ladd 2015) and teachers (Donaldson and Papay 2015) react to test scores, especially when performance on these exams leads to rewards or sanctions. A less studied aspect of standardized testing is how testing provides knowledge about a student's academic achievement to the student and their parents. This information may be a significantly more credible signal of a student's academic achievement than grades, given the wide variation in grading practices among schools and teachers.78 If parents and students make decisions about post-secondary education based on beliefs about the student's academic ability, and if how standardized test scores are described changes those beliefs, then which label a student receives will change which post-secondary education choices students make. By changing a student's education choices, these labels may then change a student's postsecondary outcomes. This chapter looks at the causal effect of receiving different labels on standardized tests on post-secondary outcomes using administrative data on students in Michigan. I use a regression discontinuity research design to look at students who receive similar exam scores but different labels summarizing those scores. I look at students who are close to the cutoffs for receiving either the label associated with the highest scores or the label associated with the lowest scores on an 11th grade math and reading exam. While some of my estimates are statistically significant, almost all lack robustness to using another bandwidth. Also, if no labels had any effect on postsecondary outcomes, I would be likely to find some statistically significant effects anyway given the large number of estimates in this chapter. I conclude that I do not find evidence of a large effect of performance labels on postsecondary outcomes.

78 See Gershenson (2018) for evidence of differential grade inflation by the affluence of students in North Carolina. See Pattison, Grodsky, and Muller (2013) for evidence that while grades have risen over time, the signaling power of grades, as measured by the variance of grades and the predictive power of grades, has not decreased over time.

3.2 Literature Review

The study most similar to mine is Papay, Murnane, and Willett (2016). They use Massachusetts administrative data to study the effect of test score labels in grades 8 and 10 on post-secondary enrollment. They focus their analysis on students who receive free or reduced-price lunches and who live in urban school districts. Among those students near the cutoffs, being labeled Advanced rather than Proficient on the 10th grade math exam causes a 5-percentage-point increase in post-secondary attendance within a year after intended high school graduation. This effect is greatest among students who, when surveyed before taking the 10th grade exam, reported they did not plan on attending a 4-year college.
Two other papers use regression discontinuity research designs to study the effect of performance labels on K-12 outcomes. Avery and Goodman (2021) study the causal effect of receiving an Advanced label on a 10th grade math test on the probability of taking an Advanced Placement Calculus course for Massachusetts students. They find that for Black and Hispanic students, getting the Advanced label increases the probability a student will take an Advanced Placement Calculus course by 2.5 percentage points. Beuchert, Eriksen, and Krægpøth (2020) study the effect of 3rd grade test score labels for children in Denmark.79 Pooling the results of different labeling cutoffs, they find that getting a label associated with a lower score on the 3rd grade math exam causes an increase in scores on the 6th grade math exam of 6% of a standard deviation.

79 In Denmark scores receive one of the following 5 labels, ordered from the lowest scoring exams to the highest scoring exams: Considerably Below Average, Below Average, Average, Above Average, and Considerably Above Average. The effect size around the cutoff between Considerably Below Average and Below Average is greater than the effect size at any of the other cutoffs.

There is one study that looks at the effect of informing students about their academic ability on a mock standardized exam on high school outcomes. It provides evidence that information about academic ability can change academic outcomes by changing a student's choice about where to go to school. Bobba and Frisancho (2019) study the effect of providing information about academic ability on the secondary schooling choices of students from high-poverty neighborhoods in Mexico City. Schools are randomly assigned one of three treatments: no intervention, a mock secondary school admissions exam without students being informed of their scores, or a mock secondary school admissions exam with students informed of their scores. They find the combination of the mock exam and the information about the exam score made high-scoring students more likely to go to academic (college prep) schools and low-scoring students more likely to go to non-academic (vocational/technical) schools. This new sorting of students to schools led to an 8-percentage-point increase in the on-time high school graduation rate for students in the exam and information group compared to the no intervention group.

Papay, Murnane, and Willett (2016), Beuchert, Eriksen, and Krægpøth (2020), and Avery and Goodman (2021) use regression discontinuity research designs to look at the effect of test score labels on the future educational outcomes of tested students. In all of those papers the labels did not carry any consequences in terms of things like the ability to graduate or the ability to take certain classes. Also, in those papers parents are sent reports about their child's test performance that include the label that corresponds to their child's score. This is similar to the institutional setting in this paper. In the case of the Danish score report, parents are not given information about the underlying scale score that determines the label. This is different from the reports in Massachusetts and Michigan, which show parents the underlying scale score. My study builds on those studies in several ways. It is the first study to look at the effect of test score labels in Michigan. Because of Michigan's large population,80 this study can detect smaller effects than the prior literature.
Compared to Papay, Murnane, and Willett (2016) and Avery and Goodman (2021), this paper studies an exam taken in 11th grade rather than an exam taken in 10th grade. The closer in time the information is received to high school graduation, the more impact it might have, under the assumption that events closer in time to the measured outcome have a greater effect on that outcome than events further away in time.

80 The population of Denmark in Q1 2020 was 5,822,763. The estimated population of Massachusetts in 2019 was 6,892,503. The estimated population of Michigan in 2019 was 9,986,857. See https://www.dst.dk/en/Statistik/emner/befolkning-og-valg/befolkning-og-befolkningsfremskrivning/folketal for the population of Denmark and https://www.census.gov/data/tables/time-series/demo/popest/2010s-state-total.html for the populations of Massachusetts and Michigan.

3.3 Institutional Setting

The No Child Left Behind Act was signed into law in 2002.81 The act required all U.S. states to administer standardized exams to students in math and reading in grades 3 through 8 and once in high school.82 From the 2007 – 2008 school year to the 2013 – 2014 school year, 11th grade students in Michigan are required to take standardized exams in Math, Reading, Science, Social Studies, and Writing as part of the Michigan Merit Exam. For each exam, each student is assigned a scale score to indicate how well they did on the exam. Students with higher scale scores did better on the exam, generally by answering a higher proportion of the exam's multiple-choice questions correctly. Each student's performance on each exam is summarized by a performance label. Which label a student receives is based on their scale score. The performance labels, from lowest scores to highest scores, are: Not Proficient, Partially Proficient, Proficient, and Advanced. Scale scores are mapped to performance labels based on scores falling in non-overlapping intervals. This means that in any given year all students who receive a lower performance label, such as Not Proficient, have lower scale scores than all students who receive a higher performance label, such as Partially Proficient.

81 https://en.wikipedia.org/wiki/No_Child_Left_Behind_Act
82 In 2015 President Obama signed the Every Student Succeeds Act. While this law officially repealed the No Child Left Behind Act, it has its own set of requirements to test students in grades 3 to 8 and once in high school. See https://www.edweek.org/policy-politics/the-every-student-succeeds-act-an-essa-overview/2016/03 for more information.

3.4 Data and Sample

The data for this project come from the Michigan Education Data Center (MEDC). MEDC houses student-level data for all K-12 students who attend public schools in Michigan, including data on test scores and demographic information such as a student's race and gender. It also has data on post-secondary enrollment and degree completion from the National Student Clearinghouse. This chapter uses data on all students in Michigan who have 11th grade test scores from the 2007 – 2008 school year to the 2013 – 2014 school year.83 Students who are missing data on their race, their gender, or whether they are economically disadvantaged are not included in my sample.84

83 The 2007 – 2008 school year is the earliest year for which the Michigan Education Data Center has test score data. I choose 2013 – 2014 as my last year so I could analyze similar exam data across years. Starting in Spring 2015, Michigan made large changes to its 11th grade standardized exams, changing from the Michigan Merit Exam to the Michigan Student Test of Educational Progress. See https://medc.miedresearch.org/dataset/k-12-student-assessments.
84 I start with a sample of 803,798 students who were in 11th grade from school year 2007 – 2008 to school year 2013 – 2014. Of those students, 87,609 are missing data on their race and gender, 28,253 are missing data on whether they are economically disadvantaged, and 1,054 are missing data on their reading and math scores. Some of those groups of students overlap. Once all students with missing data are removed, I have a sample of 716,694 students.
I construct 4 samples to look at students near the cutoffs between the following pairs of performance labels: Proficient and Advanced on the math exam, Proficient and Advanced on the reading exam, Not Proficient and Partially Proficient on the math exam, and Not Proficient and Partially Proficient on the reading exam. For each sample I only include students who receive one of the performance labels in the sample's name. For example, the Math Proficient/Advanced sample only includes students who receive a Proficient or Advanced label on their 11th grade math exam.

Table 3.1 - Summary Statistics
Mean (standard deviation) of each variable, by sample:
Variable                       Math Proficient/   Reading Proficient/   Math Not Proficient/     Reading Not Proficient/
                               Advanced           Advanced              Partially Proficient     Partially Proficient
Female Indicator               0.46 (0.50)        0.53 (0.50)           0.52 (0.50)              0.46 (0.50)
White Indicator                0.88 (0.32)        0.85 (0.36)           0.72 (0.45)              0.67 (0.47)
Black Indicator                0.03 (0.16)        0.07 (0.26)           0.19 (0.39)              0.24 (0.43)
Hispanic Indicator             0.02 (0.13)        0.03 (0.16)           0.04 (0.20)              0.05 (0.21)
Asian Indicator                0.05 (0.22)        0.03 (0.17)           0.02 (0.13)              0.02 (0.14)
Two or More Races Indicator    0.01 (0.12)        0.02 (0.13)           0.02 (0.14)              0.02 (0.14)
Native American Indicator      0.00 (0.07)        0.01 (0.08)           0.01 (0.09)              0.01 (0.09)
Hawaiian Indicator             0.00 (0.03)        0.00 (0.03)           0.00 (0.03)              0.00 (0.03)
Economically Disadvantaged     0.15 (0.36)        0.24 (0.43)           0.42 (0.49)              0.47 (0.50)
Indicator
N                              198,116            393,103               508,594                  317,140
Notes: The table shows the mean of each variable for each sample, with the standard deviation in parentheses.

Table 3.1 shows summary statistics for the 4 samples. The main differences are between the Proficient/Advanced samples and the Not Proficient/Partially Proficient samples. A higher proportion of the Not Proficient/Partially Proficient samples are black and economically disadvantaged, and a lower proportion are white. The differences are smaller for the proportion female and the proportions of the other races in the data.
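A minimal sketch of how one of these samples and its running variable could be constructed is below. The data frame, column names, and the cutoff value are illustrative placeholders for the MEDC extract, not the actual Michigan Merit Exam scale scores or cutoffs.

```python
# Sketch of building the Math Proficient/Advanced RD sample: keep students with
# one of the two labels, center the scale score at the cutoff, and restrict the
# sample to the bandwidth. Column names and the cutoff are placeholders.
import pandas as pd

def build_sample(df: pd.DataFrame, cutoff: int, bandwidth: int) -> pd.DataFrame:
    sample = df[df["math_label"].isin(["Proficient", "Advanced"])].copy()
    sample["score_centered"] = sample["math_scale_score"] - cutoff
    sample["higher_label"] = (sample["score_centered"] >= 0).astype(int)
    return sample[sample["score_centered"].abs() <= bandwidth]

demo = pd.DataFrame({
    "math_label": ["Proficient", "Advanced", "Not Proficient"],
    "math_scale_score": [1098, 1105, 1050],   # illustrative scores, not real ones
})
print(build_sample(demo, cutoff=1100, bandwidth=5))
```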
3.5 Empirical Framework

My goal in this paper is to look at how receiving different performance labels changes a student's post-secondary outcomes. I do this by using a sharp regression discontinuity research design to compare the outcomes of students near the cutoffs for receiving different performance labels. By doing this, for students close to a cutoff, I can estimate the average treatment effect of receiving the label associated with higher scale scores rather than the label associated with lower scale scores on the other side of the cutoff. For my main results I use the following estimating equation:

(3.1) Outcome_isy = β0 + β1 HigherLabel_isy + β2 (ScaleScore_isy − Cutoff_s) + β3 HigherLabel_isy × (ScaleScore_isy − Cutoff_s) + θ_y + βX_i + ε_isy

In the equation, individual i takes the exam in subject s in school year y. HigherLabel_isy is an indicator variable for the student receiving the performance label in the sample associated with higher scale scores (either Advanced or Partially Proficient). Cutoff_s is the lowest scale score a student needs to receive the higher label. In my sample, cutoffs vary based on the subject of the exam (math or reading) and which labels are on either side of the cutoff. However, the cutoffs do not vary depending on the year of the exam. θ_y is a fixed effect for the year the exam was taken. X_i is a vector of covariates: indicator variables for a student's race, gender, and whether they are economically disadvantaged. I cluster standard errors at the year-of-exam level. The equation assumes a linear relationship between the outcome variable and the scale score of the exam, allowing the slope of the line to differ on either side of the cutoff. The coefficient of interest is β1, which is the average outcome for students near the cutoff if they get the label associated with higher scale scores minus the counterfactual average outcome of those students if they got the label on the other side of the cutoff associated with lower scale scores. I refer to this as the treatment effect of receiving the higher label.

For each regression I limit my sample to students whose scale scores are within a certain number of points of the cutoff. This value is called the bandwidth. I choose a bandwidth for each sample based on the following procedure. First, for a given sample, I calculate the mean squared error optimal bandwidth for each of my 6 outcome variables (ever enrolling in any post-secondary institution, ever enrolling in a 2-year institution, ever enrolling in a 4-year institution, having any post-secondary degree, having an associate degree, and having a bachelor's degree) using the method in Calonico, Cattaneo, and Titiunik (2014). Bandwidths are chosen using a uniform kernel, accounting for the indicators for a student's race, gender, and economic disadvantage being in the regression. The bandwidth I use for each sample is the average of the 6 calculated bandwidths, rounded to the nearest whole number. As a robustness check, I redo my analysis using bandwidths that are 0.5 times and 1.5 times the value of the chosen bandwidth, rounded to the nearest whole number. The results using these other bandwidths are presented in Appendix C and discussed in Section 3.10.
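As a rough illustration, the sketch below estimates an Equation 3.1-style local linear regression on simulated data, with separate slopes on each side of the cutoff, year fixed effects, covariates, and standard errors clustered by exam year. The data frame, column names, cutoff, and bandwidth are placeholders rather than the MEDC data or the dissertation's code.

```python
# Sketch of Equation 3.1 on simulated placeholder data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 5_000
df = pd.DataFrame({
    "score_centered": rng.integers(-5, 6, n),     # scale score minus the cutoff
    "year": rng.integers(2008, 2015, n),
    "female": rng.integers(0, 2, n),
    "econ_disadv": rng.integers(0, 2, n),
})
df["higher_label"] = (df["score_centered"] >= 0).astype(int)
df["bachelors"] = rng.binomial(1, 0.6, n)

# higher_label * score_centered expands to both main effects plus the
# interaction, allowing a different slope on each side of the cutoff.
fit = smf.ols(
    "bachelors ~ higher_label * score_centered + C(year) + female + econ_disadv",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["year"]})
print(fit.params["higher_label"])   # beta_1: treatment effect of the higher label
```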
To check for score manipulation of this kind, I use a modified version of Equation 3.1 in which the outcome is an indicator variable for a student being female, being a certain race, or being economically disadvantaged, and the other covariates are excluded from the regression. In those regressions β1 is the discontinuity at the cutoff in the proportion of students with that characteristic. A significant coefficient would be evidence that the traits of students change suddenly at the cutoff and would be consistent with scores being manipulated to get a specific performance label.

To look at heterogeneity, I use the following estimating equation.

(3.2)  $Outcome_{igsy} = \beta_0 + \beta_1 HigherLabel_{igsy} + \beta_2 HigherLabel_{igsy}Subgroup_g + \beta_3 Subgroup_g + \beta_4 (ScaleScore_{igsy} - Cutoff_s) + \beta_5 HigherLabel_{igsy}(ScaleScore_{igsy} - Cutoff_s) + \beta_6 Subgroup_g(ScaleScore_{igsy} - Cutoff_s) + \beta_7 HigherLabel_{igsy}Subgroup_g(ScaleScore_{igsy} - Cutoff_s) + \theta_y + \epsilon_{igsy}$

In Equation 3.2 I look at student i in subgroup g whose exam score is in subject s for an exam taken in school year y. Subgroup_g equals 1 if a student is a member of the subgroup and 0 otherwise. I look at 3 different subgroups: female students, black students, and economically disadvantaged students. In the regressions where the subgroup is black students, only students who are either white or black are included. Equation 3.2 assumes a linear relationship between the outcome variable and the scale score whose slope can differ both above and below the cutoff and between students who are and are not members of the subgroup. The estimates of interest are β1, which is the higher label treatment effect for students who are not members of the subgroup, and β1 + β2, which is the higher label treatment effect for students who are members of the subgroup. β2 is the difference between the two treatment effects.

3.6 Math Proficient/Advanced Cutoff Results

3.6.1 Identification Test: Discontinuity in Density

Figure 3.1 – Histogram Students Close to the Cutoff Math Proficient/Advanced Sample
Notes: N = 40,349. Each bar in this histogram shows the number of students in the sample who received a different scale score.

Figure 3.1 shows the number of students in the sample who receive different scale scores for values of the scale score close to the cutoff. A sudden change in the number of students at 0 would be consistent with scores being manipulated so that students receive a different performance label. Based on Figure 3.1, I do not find evidence of this, as the change in the number of observations is smooth at the cutoff.

3.6.2 Identification Test: Discontinuity in Covariates

Table 3.2 – Discontinuity in Covariates Math Proficient/Advanced Sample

Outcome                        Advanced Label       Mean Outcome
Female                         -0.0016 (0.0138)     0.46
White                          -0.0042 (0.0048)     0.88
Black                           0.0016 (0.0022)     0.03
Hispanic                        0.0026 (0.0036)     0.02
Asian                           0.0002 (0.0023)     0.05
Two or More Races               0.0004 (0.0019)     0.01
Native American                -0.0004 (0.0013)     0.00
Hawaiian                       -0.0003 (0.0005)     0.00
Economically Disadvantaged     -0.0013 (0.0085)     0.15

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 40,349. Bandwidth = 5 scale score points. All regressions include year fixed effects. Standard errors, clustered at the year level, are in parentheses. The outcomes are indicator variables for being female, being a specific race, or being economically disadvantaged. Mean outcomes for the Math Proficient/Advanced Sample are shown.
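Balance checks like those reported in Table 3.2 could be produced with a loop over the demographic indicators, re-estimating Equation 3.1 with each indicator as the outcome and no other covariates. The sketch below uses the same hypothetical column names as the earlier sketch and is not the author's code.

```python
# A sketch of the covariate-balance check: Equation 3.1 with each demographic
# indicator as the outcome and no other covariates. Hypothetical column names.
import pandas as pd
import statsmodels.formula.api as smf

COVARIATES = ["female", "white", "black", "hispanic", "asian",
              "two_or_more", "native_american", "hawaiian", "econ_disadv"]

def balance_checks(df: pd.DataFrame, bandwidth: float) -> pd.DataFrame:
    d = df.copy()
    d["score_c"] = d["scale_score"] - d["cutoff"]
    d["higher_label"] = (d["score_c"] >= 0).astype(int)
    d = d[d["score_c"].abs() <= bandwidth]

    rows = []
    for cov in COVARIATES:
        res = smf.ols(f"{cov} ~ higher_label + score_c + higher_label:score_c + C(year)",
                      data=d).fit(cov_type="cluster", cov_kwds={"groups": d["year"]})
        rows.append({"covariate": cov,
                     "discontinuity": res.params["higher_label"],  # jump in the proportion at the cutoff
                     "se": res.bse["higher_label"],
                     "p_value": res.pvalues["higher_label"]})
    return pd.DataFrame(rows)
```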
In Table 3.2 I estimate discontinuities in the proportion of students who have different observable characteristics at the Proficient/Advanced cutoffs. I find that all discontinuities are small and statistically insignificant. Based on this and on Figure 3.1 I conclude there is no manipulation of scale scores and my estimates are treatment effects of receiving an Advanced label on the 11th grade math exam. 69 3.6.3 Higher Label Treatment Effect Math Proficient/Advanced Sample Table 3.3 – Effect of Receiving a Higher Label Postsecondary Outcomes Math Proficient/Advanced Sample Any Any 2-Year 2-Year 4-Year 4-Year Postsecondary Postsecondary Enrollment Enrollment Enrollment Enrollment Enrollment Enrollment Advanced Label -0.0020 -0.0020 -0.0003 -0.0002 -0.0018 -0.0018 (0.0036) (0.0038) (0.0107) (0.0107) (0.0042) (0.0048) Year Fixed Y Y Y Y Y Y Effects Covariates N Y N Y N Y Mean Outcome 0.95 0.95 0.52 0.52 0.86 0.86 Any Any Associate Associate Bachelor’s Bachelor’s Postsecondary Postsecondary Degree Degree Degree Degree Degree Degree Advanced Label -0.0099 -0.0097* -0.0030 -0.0029 -0.0096 -0.0094 (0.0054) (0.0043) (0.0020) (0.0022) (0.0051) (0.0051) Year Fixed Y Y Y Y Y Y Effects Covariates N Y N Y N Y Mean Outcome 0.70 0.70 0.10 0.10 0.62 0.62 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 40,349. Bandwidth = 5 scale score points. Standard errors are clustered at the year level. Mean outcomes for the Math Proficient/Advanced Sample are shown. In Table 3.3 I estimate the treatment effect of receiving an Advanced label using the Math Proficient/Advanced Sample. For all the outcomes I check I estimate that the treatment effect is small and statistically insignificant both with and without covariates. I conclude that there is no effect on average of receiving an Advanced label on the 11th grade math exam on postsecondary outcomes. Table 3.4 – Male and Female Treatment Effect Math Advanced Label Any Two-Year Four-Year Any Associate Bachelor’s Postsecondary Enrollment Enrollment Postsecondary Degree Degree Enrollment Degree Advanced -0.0007 0.0112 0.0042 0.0009 -0.0010 -0.0000 (0.0054) (0.0082) (0.0086) (0.0082) (0.0053) (0.0095) Advanced * Female -0.0032 -0.0271 -0.0137 -0.0246 -0.0049 -0.0218 (0.0052) (0.0197) (0.0108) (0.0150) (0.0107) (0.0136) P(Advanced + 0.26 0.46 0.08 0.05 0.38 0.03 Interaction) Year Fixed Effects Y Y Y Y Y Y Mean Outcome 0.94 0.52 0.83 0.63 0.09 0.55 Males Mean Outcome 0.97 0.53 0.90 0.78 0.10 0.70 Females Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 40,349. Bandwidth = 5 scale score points. Mean outcomes are for students in the Math Proficient/Advanced Sample. All regressions include year fixed effects. Standard errors are clustered at the year level. 70 In Table 3.4 I estimate the treatment effect of receiving an Advanced label using the Math Proficient/Advanced Sample for male and female students. For most of the outcomes I check I estimate that the treatment effect for both groups of students and the difference between the two treatment effects is small and statistically insignificant. However, I estimate that receiving an Advanced label causes female students to be significantly less likely to complete any postsecondary degree (2.4 percentage points) and less likely to complete a bachelor’s degree (2.2 percentage points). While the treatment effect for males is very close to 0, for neither outcome are the effects for male and female students significantly different. 
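The subgroup results in Table 3.4 and in the later heterogeneity tables come from Equation 3.2, and the row labeled P(Advanced + Interaction) is the p-value from testing that β1 + β2 = 0, the treatment effect for subgroup members. A sketch of that calculation, again with hypothetical column names and not the author's code, is below.

```python
# A sketch of Equation 3.2: a full interaction of the higher-label indicator,
# the subgroup indicator, and the centered score, with year fixed effects. The
# p-value for beta_1 + beta_2 = 0 (the subgroup treatment effect) is computed
# from the clustered covariance matrix. Hypothetical column names.
import numpy as np
from scipy.stats import norm
import statsmodels.formula.api as smf

def heterogeneity(df, outcome: str, subgroup: str, bandwidth: float):
    d = df.copy()
    d["score_c"] = d["scale_score"] - d["cutoff"]
    d["higher_label"] = (d["score_c"] >= 0).astype(int)
    d = d[d["score_c"].abs() <= bandwidth]

    res = smf.ols(f"{outcome} ~ higher_label * {subgroup} * score_c + C(year)",
                  data=d).fit(cov_type="cluster", cov_kwds={"groups": d["year"]})

    b = res.params
    V = res.cov_params()
    inter = f"higher_label:{subgroup}"
    effect_members = b["higher_label"] + b[inter]            # beta_1 + beta_2
    var = (V.loc["higher_label", "higher_label"] + V.loc[inter, inter]
           + 2 * V.loc["higher_label", inter])
    p_value = 2 * (1 - norm.cdf(abs(effect_members / np.sqrt(var))))
    return b["higher_label"], effect_members, p_value        # beta_1, beta_1 + beta_2, p-value
```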
Table 3.5 – White and Black Treatment Effect Math Advanced Label Any Two-Year Four-Year Any Associate Bachelor’s Postsecondary Enrollment Enrollment Postsecondary Degree Degree Enrollment Degree Advanced -0.0022 0.0034 -0.0027 -0.0143 -0.0042* -0.0147 (0.0036) (0.0121) (0.0046) (0.0081) (0.0019) (0.0089) Advanced * Black 0.0363* -0.0027 0.0363* 0.1348* 0.0240 0.1412* (0.0176) (0.0570) (0.0185) (0.0657) (0.0331) (0.0637) P(Advanced + 0.10 0.99 0.13 0.09 0.57 0.07 Interaction) Year Fixed Effects Y Y Y Y Y Y Mean Outcome 0.95 0.53 0.86 0.70 0.10 0.62 White Mean Outcome 0.96 0.52 0.88 0.58 0.06 0.52 Black Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 36,634. Only white and black students are included in the regressions. Bandwidth = 5 scale score points. Mean outcomes are for students in the Math Proficient/Advanced Sample. All regressions include year fixed effects. Standard errors are clustered at the year level. In Table 3.5 I estimate the treatment effect of receiving an Advanced label using the Math Proficient/Advanced Sample for white and black students. No treatment effect nor any difference in treatment effect for white and black students is significant at the 5% level. However, point estimates for the treatment effect for black students are large and significant at the 10% level for both any postsecondary degree (12 percentage points) and bachelor’s degree (13 percentage points). These degree treatment effect estimates are much larger than the effect estimates for ever enrolling in a postsecondary institution (3 percentage points) or enrolling in a 4-year institution (3 percentage points) respectively. Assuming these are real effects rather than estimates being due to random variation, then it would mean getting a higher label would increase the probability of a student getting a bachelor’s degree for some black students who were already planning on enrolling in a postsecondary institution. I conclude that getting an Advanced label has little effect on postsecondary outcomes for white students but that it may 71 make black students more likely to get a bachelor’s degree. Table 3.6 – Difference by Economically Disadvantage Treatment Effect Math Advanced Label Any Two-Year Four-Year Any Associate Bachelor’s Postsecondary Enrollment Enrollment Postsecondary Degree Degree Enrollment Degree Advanced -0.0002 0.0014 0.0027 -0.0059 -0.0002 -0.0069 (0.0035) (0.0114) (0.0049) (0.0035) (0.0028) (0.0049) Advanced * -0.0173 -0.0168 -0.0402* -0.0404 -0.0245 -0.0296 Economically (0.0148) (0.0430) (0.0185) (0.0238) (0.0186) (0.0255) Disadvantaged P(Advanced + 0.29 0.72 0.07 0.11 0.19 0.17 Interaction) Year Fixed Effects Y Y Y Y Y Y Mean Outcome Not 0.96 0.52 0.88 0.73 0.09 0.66 Economically Disadvantaged Mean Outcome 0.91 0.57 0.74 0.51 0.12 0.41 Economically Disadvantaged Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 40,349. Bandwidth = 5 scale score points. Mean outcomes are for students in the Math Proficient/Advanced Sample. All regressions include year fixed effects. Standard errors are clustered at the year level. In Table 3.6 I estimate the treatment effect of receiving an Advanced label using the Math Proficient/Advanced Sample for students who are and are not economically disadvantaged. None of the treatment effects for students who are or are not economically disadvantaged nor the difference between the treatment effects are statistically significant. I conclude that getting an Advanced label has little average effect on postsecondary outcomes for either group of students. 
72 3.7 Reading Proficient/Advanced Cutoff Results 3.7.1 Identification Test: Discontinuity in Density Figure 3.2 – Histogram Students Close to the Cutoff Reading Proficient/Advanced Sample Notes: N = 122,628. Each bar in this histogram shows the number of students in the sample who received a different scale score. Figure 3.2 shows the number of students in the sample who receive different scale scores for values of the scale score close to the Proficient/Advanced cutoffs for the reading exam. Like for the Math Proficient/Advanced sample, I find no visual evidence of a discontinuity in the density of observations at the cutoffs. 73 3.7.2 Identification Test: Discontinuities in Covariates Table 3.7 – Discontinuity in Covariates Reading Proficient/Advanced Sample Female White Black Hispanic Advanced 0.0010 -0.0001 0.0004 -0.0001 Label (0.0068) (0.0028) (0.0020) (0.0018) Year Fixed Y Y Y Y Effects Mean 0.53 0.85 0.07 0.03 Outcome Asian Two or Native Hawaiian More American Races Advanced -0.0003 0.0003 -0.0001 -0.0000 Label (0.0016) (0.0007) (0.0010) (0.0004) Year Fixed Y Y Y Y Effects Mean 0.03 0.02 0.01 0.00 Outcomes Economically Disadvantaged Advanced 0.0008 Label (0.0057) Year Fixed Y Effects Mean 0.24 Outcomes Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 122,628. Bandwidth = 9 scale score points. Standard errors are clustered at the year level. The outcomes are indicator variables for being female, being a specific race or being economically disadvantaged. Mean outcomes for the Reading Proficient/Advanced Sample are shown. In Table 3.7 I estimate discontinuities in the proportion of students who have different characteristics at the Proficient/Advanced cutoffs for the reading exam. I find that all discontinuities are small and statistically insignificant. Again, I conclude that discontinuities in outcomes at the cutoffs are due to the higher performance label rather than manipulation of students’ scale scores. 74 3.7.3 Higher Label Treatment Effect Reading Proficient/Advanced Sample Table 3.8 – Effect of Receiving a Higher Label Postsecondary Outcomes Reading Proficient/Advanced Sample Any Any 2-Year 2-Year 4-Year 4-Year Postsecondary Postsecondary Enrollment Enrollment Enrollment Enrollment Enrollment Enrollment Advanced Label 0.0005 0.0005 -0.0027 -0.0028 0.0013 0.0014 (0.0028) (0.0027) (0.0077) (0.0077) (0.0047) (0.0049) Year Fixed Y Y Y Y Y Y Effects Covariates N Y N Y N Y Mean Outcome 0.90 0.90 0.58 0.58 0.73 0.73 Any Any Associate Associate Bachelor’s Bachelor’s Postsecondary Postsecondary Degree Degree Degree Degree Degree Degree Advanced Label -0.0013 -0.0011 0.0104** 0.0104** -0.0058 -0.0056 (0.0062) (0.0058) (0.0032) (0.0033) (0.0078) (0.0072) Year Fixed Y Y Y Y Y Y Effects Covariates N Y N Y N Y Mean Outcome 0.56 0.56 0.11 0.11 0.46 0.46 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 122,628. Bandwidth = 9 scale score points. Standard errors are clustered at the year level. Mean outcomes for the Reading Proficient/Advanced Sample are shown. In Table 3.8 I estimate the higher label treatment effect using the Reading Proficient/Advanced Sample. For 5 of 6 outcomes, I estimate that the treatment effect is small and statistically insignificant. However, I estimate that getting an Advanced label causes students to be a significant 1 percentage point more likely to earn an associate degree. 
Table 3.9 – Male and Female Treatment Effect Reading Advanced Label Any Two-Year Four-Year Any Associate Bachelor’s Postsecondary Enrollment Enrollment Postsecondary Degree Degree Enrollment Degree Advanced 0.0032 -0.0041 0.0060* 0.0060* 0.0142** -0.0022 (0.0026) (0.0084) (0.0030) (0.0025) (0.0051) (0.0070) Advanced * Female -0.0050 0.0025 -0.0088 -0.0138 -0.0069 -0.0069 (0.0044) (0.0053) (0.0077) (0.0097) (0.0067) (0.0083) P(Advanced + 0.67 0.85 0.74 0.46 0.14 0.38 Interaction) Year Fixed Effects Y Y Y Y Y Y Mean Outcome 0.88 0.55 0.69 0.50 0.10 0.41 Males Mean Outcome 0.92 0.60 0.77 0.61 0.13 0.50 Females Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 122,628. Bandwidth = 9 scale score points. Mean outcomes are for students in the Reading Proficient/Advanced Sample. All regressions include year fixed effects. Standard errors are clustered at the year level. In Table 3.9 I estimate the higher label treatment effect using the Reading 75 Proficient/Advanced Sample for male and female students. Consistent with my results for all students I find that getting an Advanced label causes male students to be 1.4 percentage points more likely to earn an associate degree. The effect for female students, while not significantly different than the male student effect, is about half the magnitude and not statistically significant at the 10% level. I do not find evidence of a significant effect for either male or female students for the other outcomes I look at. Table 3.10 – White and Black Treatment Effect Reading Advanced Label Any Two-Year Four-Year Any Associate Bachelor’s Postsecondary Enrollment Enrollment Postsecondary Degree Degree Enrollment Degree Advanced 0.0016 -0.0001 0.0015 -0.0015 0.0127*** -0.0065 (0.0023) (0.0083) (0.0044) (0.0054) (0.0033) (0.0074) Advanced * Black -0.0328 -0.0304 -0.0431* -0.0214 -0.0373** -0.0082 (0.0237) (0.0226) (0.0211) (0.0193) (0.0138) (0.0235) P(Advanced + 0.25 0.17 0.12 0.33 0.14 0.60 Interaction) Year Fixed Effects Y Y Y Y Y Y Mean Outcome 0.90 0.58 0.73 0.57 0.12 0.47 White Mean Outcome 0.91 0.61 0.72 0.39 0.07 0.31 Black Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 113,366. Only white and black students are included in the regressions. Bandwidth = 9 scale score points. Mean outcomes are for students in the Reading Proficient/Advanced Sample. All regressions include year fixed effects. Standard errors are clustered at the year level. In Table 3.10 I look at the Advanced label treatment effect for white and black students. Again, the only significant coefficients are for associate degree completion. I find getting an Advanced label causes white students to be 1.2 percentage points more likely and black students 2.5 percentage points less likely to earn an associate degree. While the black student effect is not significant, the difference between the two effects is. 
76 Table 3.11 – Difference by Economically Disadvantage Treatment Effect Reading Advanced Label Any Two-Year Four-Year Any Associate Bachelor’s Postsecondary Enrollment Enrollment Postsecondary Degree Degree Enrollment Degree Advanced -0.0012 -0.0023 0.0019 0.0009 0.0093** -0.0020 (0.0022) (0.0064) (0.0035) (0.0060) (0.0038) (0.0079) Advanced * 0.0101 -0.0007 -0.0032 -0.0118 0.0069 -0.0206** Economically (0.0066) (0.0153) (0.0126) (0.0127) (0.0106) (0.0073) Disadvantaged P(Advanced + 0.29 0.87 0.94 0.46 0.12 0.04 Interaction) Year Fixed Effects Y Y Y Y Y Y Mean Outcome Not 0.93 0.58 0.78 0.62 0.11 0.53 Economically Disadvantaged Mean Outcome 0.82 0.58 0.58 0.35 0.11 0.25 Economically Disadvantaged Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 122,628. Bandwidth = 9 scale score points. Mean outcomes are for students in the Reading Proficient/Advanced Sample. All regressions include year fixed effects. Standard errors are clustered at the year level. In Table 3.11 I look at the Advanced label treatment effect for students who are and are not economically disadvantaged. I find two significant treatment effects. Getting an Advanced label makes students who are not economically disadvantaged 0.93 percentage points more likely to earn an associate degree. I also find that getting an Advanced label makes economically disadvantaged students 2.3 percentage points less likely to earn a bachelor’s degree. The only outcome where the effects for the two groups are significantly different is for earning a bachelor’s degree. 77 3.8 Math Not Proficient/Partially Proficient Cutoff Results 3.8.1 Identification Test: Discontinuity in Density Figure 3.3 – Histogram Students Close to the Cutoff Math Not Proficient/Partially Proficient Sample Notes: N = 146,961. Each bar in this histogram shows the number of students in the sample who received a different scale score. Figure 3.3 shows the number of students who receive different scale scores for scores close to the cutoffs. I find no visual evidence of a discontinuous change in the number of students at the cutoffs. 78 3.8.2 Identification Test: Discontinuities in Covariates Table 3.12 – Discontinuity in Covariates Math Not Proficient/Partially Proficient Sample Female White Black Hispanic Partially -0.0068 0.0007 -0.0055 0.0015 Proficient (0.0052) (0.0068) (0.0054) (0.0022) Label Year Fixed Y Y Y Y Effects Mean 0.52 0.72 0.19 0.04 Outcome Asian Two or Native Hawaiian More American Races Partially -0.0004 0.0024** 0.0007 0.0005 Proficient (0.0009) (0.0008) (0.0011) (0.0003) Label Year Fixed Y Y Y Y Effects Mean 0.02 0.02 0.01 0.00 Outcomes Economically Disadvantaged Partially -0.0043 Proficient (0.0067) Label Year Fixed Y Effects Mean 0.42 Outcomes Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 146,961. Bandwidth = 7 scale score points. Standard errors are clustered at the year level. The outcomes are indicator variables for being female, being a specific race, or being economically disadvantaged. Mean outcomes for the Math Not Proficient/Partially Proficient Sample are shown. In Table 3.12 I estimate discontinuities in the proportion of students who have different traits at the Not Proficient/Partially Proficient cutoffs for the math exam. For most of the characteristics I check the discontinuity is small and statistically insignificant. The exception to this is that I estimate the proportion of students who are two or more races increases by a significant 0.24 percentage points at the cutoff. 
Even if this is due to manipulation of the scale score, the manipulation seems to be only for a small percentage of students. This difference also probably will not have a big effect on my estimates because students who are two or more races are only 2% of the Math Not Proficient/Partially Proficient Sample. 79 3.8.3 Higher Label Treatment Effect Math Not Proficient/Partially Proficient Sample Table 3.13 – Effect of Receiving a Higher Label Postsecondary Outcomes Math Not Proficient/Partially Proficient Sample Any Any 2-Year 2-Year 4-Year 4-Year Postsecondary Postsecondary Enrollment Enrollment Enrollment Enrollment Enrollment Enrollment Partially -0.0033 -0.0021 -0.0059 -0.0053 0.0011 0.0028 Proficient Label (0.0040) (0.0048) (0.0048) (0.0052) (0.0034) (0.0043) Year Fixed Y Y Y Y Y Y Effects Covariates N Y N Y N Y Mean Outcome 0.75 0.75 0.59 0.59 0.45 0.45 Any Any Associate Associate Bachelor’s Bachelor’s Postsecondary Postsecondary Degree Degree Degree Degree Degree Degree Partially 0.0027 0.0033 -0.0024 -0.0024 -0.0016 -0.0010 Proficient Label (0.0039) (0.0047) (0.0026) (0.0028) (0.0029) (0.0027) Year Fixed Y Y Y Y Y Y Effects Covariates N Y N Y N Y Mean Outcome 0.30 0.30 0.11 0.11 0.19 0.19 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 146,961. Bandwidth = 7 scale score points. Standard errors are clustered at the year level. Mean outcomes for the Math Not Proficient/Partially Proficient Sample are shown. In Table 3.13 I estimate the effect of receiving a Partially Proficient label on postsecondary outcomes. All the estimated treatment effects are small and statistically insignificant. Table 3.14 – Male and Female Treatment Effect Math Partially Proficient Label Any Two-Year Four-Year Any Associate Bachelor’s Postsecondary Enrollment Enrollment Postsecondary Degree Degree Enrollment Degree Partially Proficient -0.0029 -0.0012 0.0006 0.0154** 0.0091** 0.0007 (0.0078) (0.0079) (0.0046) (0.0053) (0.0036) (0.0026) Partially Proficient 0.0009 -0.0077 0.0028 -0.0222* -0.0209*** -0.0033 * Female (0.0086) (0.0079) (0.0091) (0.0096) (0.0038) (0.0070) P(Partially 0.62 0.09 0.65 0.39 0.01 0.68 Proficient + Interaction) Year Fixed Effects Y Y Y Y Y Y Mean Outcome 0.68 0.54 0.38 0.23 0.09 0.14 Males Mean Outcome 0.81 0.63 0.52 0.37 0.13 0.24 Females Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 146,691. Bandwidth = 7 scale score points. Mean outcomes are for students in the Math Not Proficient/Partially Proficient Sample. All regressions include year fixed effects. Standard errors are clustered at the year level. 80 In Table 3.14 I estimate the effect of receiving a Partially Proficient label for male and female students. For most outcomes I find that the male treatment effect, the female treatment effect, and the difference in the treatment effects is not statistically significant. I find that receiving a Partially Proficient label makes male students significantly more likely to earn a postsecondary degree (1.5 percentage points) and to earn an associate degree (0.91 percentage points). The effects for females are an insignificant 0.68 decrease in receiving any postsecondary degree and a significant 1.2 percentage point decrease in completing an associate degree. The difference in the associate degree treatment effects is statistically significant. 
Table 3.15 – White and Black Treatment Effect Math Partially Proficient Label Any Two-Year Four-Year Any Associate Bachelor’s Postsecondary Enrollment Enrollment Postsecondary Degree Degree Enrollment Degree Partially Proficient -0.0028 -0.0080 -0.0016 0.0022 -0.0043* -0.0009 (0.0036) (0.0055) (0.0038) (0.0029) (0.0018) (0.0044) Partially Proficient * 0.0054 0.0071 0.0298 0.0077 0.0037 0.0061 Black (0.0091) (0.0095) (0.0146) (0.0064) (0.0086) (0.0036) P(Partially Proficient 0.79 0.94 0.13 0.28 0.95 0.22 + Interaction) Year Fixed Effects Y Y Y Y Y Y Mean Outcome 0.74 0.58 0.46 0.33 0.12 0.22 White Mean Outcome 0.78 0.62 0.45 0.20 0.06 0.13 Black Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 133,804. Only white and black students are included in the regressions. Bandwidth = 7 scale score points. Mean outcomes are for students in the Math Not/Proficient Partially Proficient Sample. All regressions include year fixed effects. Standard errors are clustered at the year level. In Table 3.15 I estimate the effect of receiving a Partially Proficient label for white and black students. None of the treatment effects are statistically significant. 81 Table 3.16 – Difference by Economically Disadvantage Treatment Effect Math Partially Proficient Label Any Two-Year Four-Year Any Associate Bachelor’s Postsecondary Enrollment Enrollment Postsecondary Degree Degree Enrollment Degree Partially Proficient 0.0017 -0.0017 0.0065 0.0071 -0.0031 0.0007 (0.0039) (0.0058) (0.0040) (0.0047) (0.0038) (0.0039) Partially Proficient * -0.0131* -0.0109* -0.0143* -0.0120** 0.0012 -0.0067 Economically (0.0056) (0.0056) (0.0064) (0.0047) (0.0061) (0.0053) Disadvantaged P(Advanced + 0.12 0.06 0.22 0.28 0.67 0.11 Interaction) Year Fixed Effects Y Y Y Y Y Y Mean Outcome Not 0.80 0.62 0.52 0.38 0.13 0.26 Economically Disadvantaged Mean Outcome 0.67 0.54 0.35 0.19 0.08 0.10 Economically Disadvantaged Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 146,961. Bandwidth = 7 scale score points. Mean outcomes are for students in the Math Not Proficient/Partially Proficient Sample. All regressions include year fixed effects. Standard errors are clustered at the year level. In Table 3.16 I estimate the math Partially Proficient treatment effect for students who are and are not economically disadvantaged. None of the treatment effects are statistically significant. 82 3.9 Reading Not Proficient/Partially Proficient Cutoff 3.9.1 Identification Test: Discontinuity in Density Figure 3.4 – Histogram Students Close to the Cutoff Reading Not Proficient/Partially Proficient Sample Notes: N = 96,171. Each bar in this histogram shows the number of students in the sample who received a different scale score. Figure 3.4 shows the number of students in the sample who receive different scale scores for values close to the cutoff. Visually there is no discontinuous change in the number of students at the cutoff. 
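The visual checks in Figures 3.1 through 3.4 amount to tabulating how many students receive each scale score near a cutoff and looking for a jump at zero; formal density tests such as McCrary (2008) or Cattaneo, Jansson, and Ma (2018) could be applied to the same running variable. A sketch of the tabulation, with hypothetical column names and not the author's code, is below.

```python
# A sketch of the tabulation behind the density histograms: counts of students
# at each centered scale score within the bandwidth, plus the counts
# immediately below and above the cutoff. Hypothetical column names.
import pandas as pd

def density_check(df: pd.DataFrame, bandwidth: float) -> pd.Series:
    score_c = df["scale_score"] - df["cutoff"]
    counts = score_c[score_c.abs() <= bandwidth].value_counts().sort_index()
    just_below = counts[counts.index < 0].iloc[-1]   # highest score with the lower label
    just_above = counts[counts.index >= 0].iloc[0]   # lowest score with the higher label
    print(f"Count just below cutoff: {just_below}; just above: {just_above}")
    return counts                                    # one entry per scale score value
```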
83 3.9.2 Identification Check: Discontinuities in Covariates Table 3.17 – Discontinuity in Covariates Reading Not Proficient/Partially Proficient Sample Female White Black Hispanic Partially 0.0046 0.0101 -0.0065 -0.0027 Proficient (0.0033) (0.0022) (0.0080) (0.0033) Label Year Fixed Y Y Y Y Effects Mean 0.46 0.67 0.24 0.05 Outcome Asian Two or Native Hawaiian More American Races Partially -0.0013 0.0010 -0.0009 0.0004 Proficient (0.0016) (0.0021) (0.0014) (0.0006) Label Year Fixed Y Y Y Y Effects Mean 0.02 0.02 0.01 0.00 Outcomes Economically Disadvantaged Partially 0.0053 Proficient (0.0057) Label Year Fixed Y Effects Mean 0.47 Outcomes Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 96,169. Bandwidth = 9 scale score points. Standard errors are clustered at the year level. The outcomes are indicator variables for being female, being a specific race, or being economically disadvantaged. Mean outcomes for the Reading Not Proficient/Partially Proficient Sample are shown. In Table 3.17 I estimate discontinuities in the proportion of students who are female, who are a specific race, or who are economically disadvantaged at the Reading Not Proficient/Partially Proficient cutoff. All the discontinuities are small and statistically insignificant. 84 3.9.3 Higher Label Treatment Effect Reading Not Proficient/Partially Proficient Sample Table 3.18 – Effect of Receiving a Higher Label Postsecondary Outcomes Reading Not Proficient/Partially Proficient Sample Any Any 2-Year 2-Year 4-Year 4-Year Postsecondary Postsecondary Enrollment Enrollment Enrollment Enrollment Enrollment Enrollment Partially -0.0023 -0.0011 -0.0047 -0.0036 -0.0029 -0.0021 Proficient Label (0.0100) (0.0096) (0.0100) (0.0096) (0.0059) (0.0060) Year Fixed Y Y Y Y Y Y Effects Race and Gender N Y N Y N Y Controls Mean Outcome 0.68 0.68 0.56 0.56 0.36 0.36 Any Any Associate Associate Bachelor’s Bachelor’s Postsecondary Postsecondary Degree Degree Degree Degree Degree Degree Partially -0.0033 -0.0033 -0.0002 -0.0005 -0.0012 -0.0010 Proficient Label (0.0049) (0.0048) (0.0043) (0.0042) (0.0034) (0.0035) Year Fixed Y Y Y Y Y Y Effects Race and Gender N Y N Y N Y Controls Mean Outcome 0.23 0.23 0.09 0.09 0.13 0.13 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 96,169. Bandwidth = 9 scale score points. Standard errors are clustered at the year level. Mean outcomes for the Reading Not Proficient/Partially Proficient sample are shown. In Table 3.18 I estimate treatment effects of receiving a Partially Proficient label on the reading exam. All the effects are small and statistically insignificant. Table 3.19 – Male and Female Treatment Effect Reading Partially Proficient Label Any Two-Year Four-Year Any Associate Bachelor’s Postsecondary Enrollment Enrollment Postsecondary Degree Degree Enrollment Degree Partially -0.0094 -0.0089 -0.0091 -0.0086 0.0026 -0.0030 Proficient (0.0107) (0.0139) (0.0059) (0.0064) (0.0050) (0.0044) Partially 0.0144 0.0086 0.0127 0.0108 -0.0067 0.0035 Proficient * (0.0089) (0.0113) (0.0075) (0.0104) (0.0037) (0.0085) Female P(Partially 0.67 0.97 0.68 0.79 0.37 0.94 Proficient + Interaction) Year Fixed Y Y Y Y Y Y Effects Mean Outcome 0.63 0.51 0.32 0.19 0.08 0.11 Males Mean Outcome 0.74 0.61 0.41 0.27 0.11 0.15 Females Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 96,169. Bandwidth = 9 scale score points. Mean outcomes are for students in the Reading Not Proficient/Partially Proficient Sample. All regressions include year fixed effects. Standard errors are clustered at the year level. 
85 Table 3.20 – White and Black Treatment Effect Reading Partially Proficient Label Any Two-Year Four-Year Any Associate Bachelor’s Postsecondary Enrollment Enrollment Postsecondary Degree Degree Enrollment Degree Partially Proficient -0.0033 -0.0028 -0.0085 -0.0009 -0.0004 -0.0001 (0.0088) (0.0097) (0.0055) (0.0059) (0.0054) (0.0020) Partially Proficient * 0.0066 -0.0057 0.0207* -0.0099 -0.0107 -0.0003 Black (0.0077) (0.0132) (0.0102) (0.0090) (0.0077) (0.0100) P(Partially Proficient 0.82 0.58 0.33 0.16 0.10 0.96 + Interaction) Year Fixed Effects Y Y Y Y Y Y Mean Outcome 0.67 0.54 0.36 0.26 0.11 0.15 White Mean Outcome 0.74 0.62 0.37 0.16 0.05 0.08 Black Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 86,906. Only white and black students are included in the regressions. Bandwidth = 9 scale score points. Mean outcomes are for students in the Reading Not Proficient/Partially Proficient sample. All regressions include year fixed effects. Standard errors are clustered at the year level. Table 3.21 – Difference by Economically Disadvantage Treatment Effect Reading Partially Proficient Label Any Two-Year Four-Year Any Associate Bachelor’s Postsecondary Enrollment Enrollment Postsecondary Degree Degree Enrollment Degree Partially Proficient 0.0037 -0.0067 0.0052 0.0015 -0.0020 0.0029 (0.0093) (0.0115) (0.0055) (0.0074) (0.0063) (0.0036) Partially Proficient * -0.0111 0.0050 -0.0150 -0.0082 0.0039 -0.0069 Economically (0.0131) (0.0140) (0.0082) (0.0100) (0.0081) (0.0061) Disadvantaged P(Advanced + 0.62 0.90 0.31 0.35 0.74 0.48 Interaction) Year Fixed Effects Y Y Y Y Y Y Mean Outcome Not 0.74 0.60 0.42 0.30 0.12 0.19 Economically Disadvantaged Mean Outcome 0.62 0.51 0.29 0.15 0.07 0.07 Economically Disadvantaged Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 96,169. Bandwidth = 9 scale score points. Mean outcomes are for students in the Reading Not Proficient/Partially Proficient sample. All regressions include year fixed effects. Standard errors are clustered at the year level. In Table 3.19, Table 3.20, and Table 3.21 I estimate the effect of receiving a Partially Proficient label on the reading exam for male and female students, white and black students, and students who are and are not economically disadvantaged. The estimated effects for all groups of students are small and statistically insignificant. 3.10 Alternative Specifications In Appendix C, I estimate higher label treatment effects including year fixed effects and 86 covariates in the regressions for all samples using bandwidths that are 0.5 times and 1.5 times the bandwidths, rounded to the nearest whole number, used in the main body of the paper. I do this to see how sensitive my results are to my choice of bandwidth. Like the results in the main body of the paper, most estimated treatment effects are small and statistically insignificant. Sometimes I estimate large treatment effects for the smaller bandwidth but this same treatment effect for a larger bandwidth is a much smaller magnitude. For example, using a bandwidth of 3 scale score points, I estimate that getting an Advanced label on their math exam causes black students to be 24 percentage points more likely to complete a bachelor’s degree. Using a bandwidth of 8 scale score points, I estimate that effect to only be 6 percentage points. The significance level of a treatment effect often changes when I use a different bandwidth. Table C.17 and Table C.18 list all the treatment effects that are significant at the 5% level. 
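The bandwidth-sensitivity exercise in Appendix C amounts to re-estimating every specification at half and one-and-a-half times the chosen bandwidth. A sketch is below; it assumes an estimator function with the same signature as the hypothetical estimate_eq31 sketched earlier for Equation 3.1, and the outcome column names are placeholders.

```python
# A sketch of the Section 3.10 robustness exercise: re-run a given estimator at
# 0.5x, 1x, and 1.5x the chosen bandwidth (rounded to whole scale score points)
# and compare coefficients and significance. `estimator` is assumed to behave
# like the hypothetical estimate_eq31(df, outcome, bandwidth) sketched earlier.
OUTCOMES = ["any_enrollment", "enrollment_2yr", "enrollment_4yr",
            "any_degree", "associate_degree", "bachelors_degree"]  # placeholder names

def bandwidth_robustness(df, chosen_bandwidth: int, estimator):
    results = {}
    for mult in (0.5, 1.0, 1.5):
        bw = round(chosen_bandwidth * mult)
        results[bw] = {out: estimator(df, out, bw) for out in OUTCOMES}
    return results  # {bandwidth: {outcome: (coefficient, standard error)}}
```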
I estimate the treatment effect for 4 samples, 6 outcomes, and 7 groups of students, giving me estimates of 168 treatment effects.85 Out of those treatment effects, 34 are significant with at least one bandwidth, 8 are significant with at least 2 bandwidths, and only one is significant with all 3 bandwidths. Assuming all real treatment effects are 0, I would expect about 8 treatment effects to be significant due to random chance for any given choice of bandwidth. Given the limited number of significant effects I find relative to what I would expect due to random chance, and given how often the significance of an effect changes when I use a different bandwidth, the best way to interpret my results may be that I do not find strong evidence of a large effect of which performance label a student gets on their post-secondary outcomes.

85 The 4 samples are the Math Proficient/Advanced Sample, Reading Proficient/Advanced Sample, Math Not Proficient/Partially Proficient Sample, and Reading Not Proficient/Partially Proficient Sample. The 6 outcomes are any postsecondary enrollment, 2-year enrollment, 4-year enrollment, any postsecondary degree, associate degree, and bachelor's degree. The 7 groups of students are all, male, female, white, black, not economically disadvantaged, and economically disadvantaged.

3.11 Discussion and Conclusion

In this paper I study how the label a student receives summarizing their performance on standardized exams affects their postsecondary outcomes. I do this by using a regression discontinuity research design to compare the outcomes of students close to the cutoffs for receiving different performance labels. I use data on students in Michigan public schools who took 11th grade math and reading exams from the 2007 – 2008 school year to the 2013 – 2014 school year. I look both at the effect of receiving the label associated with the highest range of exam scores (Advanced) compared to the label for the second highest range of scores (Proficient) and at the effect of receiving the label for the second lowest range of scores (Partially Proficient) compared to the label for the lowest range of scores (Not Proficient). I look for effects for all students, for male students, for female students, for white students, for black students, for students who are not economically disadvantaged, and for students who are economically disadvantaged.

While I find some statistically significant treatment effects, it is possible that almost all of them are due to random chance rather than to treatment effects that differ from 0. Many of these effects are not significant when using another bandwidth, and I would expect to find some significant effects given the large number of estimates I produce. Out of a total of 168 estimated treatment effects, 34 are significant using at least one bandwidth and only one is significant using all 3 bandwidths. Because of this, the best interpretation of my results may be that I do not find strong evidence that which performance label a student receives affects their postsecondary outcomes.

There are many different directions that future researchers could go in when studying test score labels. They could see if the labels change the quality of the institution a student attends. They could see if the labels change the type of major a student chooses to study. They could look at the effect of labels in different states and for different grades.
They could look at the effect of labels on K-12 academic outcomes such as scores on future exams, grades, high school graduation, and characteristics of the K-12 schools students attend. 88 BIBLIOGRAPHY Abadulkadiroğlu, Atila, Joshua Angrist, and Parag Pathak. 2014. “The Elite Illusion: Achievement Effects at Boston and New York Exam Schools.” Econometria. Vol. 82(1): 137 – 196 Abel, Jaison R., and Richard Deitz. 2014. “Do the Benefits of College Still Outweigh the Costs?” Current Issues in Economics and Finance 20(3) Abraham, Katharine G., Emel Filiz-Ozbay, Erkut Y. Ozbay, and Lesley J. Turner. 2020. “Framing Effects, Earnings Expectations, and the Design of Student Loan Repayment Schemes.” Journal of Public Economics 183 Akers, Elizabeth J., and Matthew M. Chingos. 2014. “Are College Students Borrowing Blindly?” Brown Center on Education Policy, Brookings Institution. https://www.brookings.edu/wp-content/uploads/2016/06/are-college-students-borrowing- blindly_dec-2014.pdf Allcott, Hunt, and Dmitry Taubinsky. 2015. “Evaluating Behaviorally Motivated Policy: Experimental Evidence from the Lightbulb Market.” American Economic Review 105(8): 2501 – 2538 Altonji, Joseph G., and Ling Zhong. 2021. “The Labor Market Returns to Advanced Degrees.” Journal of Labor Economics 39(2): 303 – 360 Andrew, Rodney J., Imberman, Scott A., Lovenheim, Michael F., and Kevin M. Strange. 2022. “The Returns to College Major Choice: Average and Distributional Effects, Career Trajectories, and Earnings Variability.” NBER Working Paper 30331 https://www.nber.org/papers/w30331 Andruska, Emily A., Jeanne M. Hogarth, Cynthia N. Fletcher, Gregory R. Forbes, and Darin R. Wohlgemuth. 2014. “Do You Know What You Owe? Student’ Understanding of Their Student Loans.” Journal of Student Financial Aid 44(2): 125 – 148 Arcidiancono, Peter V., Joseph Hotz, Arnaud Maurel, and Teresa Romano. 2020. “Ex Ante Returns and Occupational Choice.” Journal of Political Economy 128(12): 4475 – 4522 Avery, Christopher, and Joshua Goodman. 2021. “Ability Signals and Rigorous Coursework: Evidence from AP Calculus Participation.” Working Paper. Baker, Rachel, Eric Bettinger, Brian Jacob, and Ioana Marinescu. 2018. “The Effect of Labor Market Information on Community College Students’ Major Choice.” Economics of Education Review 65: 18 – 30 Barrow, Lisa, Lauren Sartain, and Marisa De La Torre. 2020. “Increasing Access to Selective High Schools through Place-Based Affirmative Action: Unintended Consequences.” American Economic Journal: Applied Economics 12(4): 135 – 163 89 Baum, Sandy, and Patricia Steele. 2017. “Who Goes to Graduate School and Who Succeeds?” Urban Institute Report. https://www.urban.org/sites/default/files/publication/86981/who_goes_to_graduate_scho ol_and_who_succeeds_1.pdf Betts, Julian R. 1996. “What Do Students Know about Wages? Evidence from a Survey of Undergraduates.” Journal of Human Resources 31(1): 27 – 56 Beuchert, Louise, Tine L. M. Eriksen, Morten V. Krægpøth. 2020. “The impact of standardized test feedback in math: Exploiting a natural experiment in 3rd grade.” Economics of Education Review 77 Bleemer, Zachary, and Basit Zafar. 2018. “Intended College Attendance: Evidence from an Experiment on College Returns and Costs.” Journal of Public Economics 157: 184 – 211 Bobba, Matteo, and Veronica Frisancho. 2019. “Perceived Ability and School Choice.” TSE Working Paper 16-660 Booij, Adam, Edwin Leuven, and Hessel Oosterbeek. 2017. 
“Ability Peer Effects in University: Evidence from a Randomized Experiment.” Review of Economic Studies 84: 547 – 578 Booij, Adam, Ferry Haan, and Erik Plug. 2017. “Can Gifted and Talented Education Raise the Academic Achievement of All High-Achieving Students?” IZA Instiutute of Labor Economics Discussion Paper 10836 http://ftp.iza.org/dp10836.pdf Brady, Ryan R., Michael A. Insler, Ahmed S. Rahman. 2017. “Bad Company: Understanding Negative Peer Effects in College Achievement.” European Economic Review 98: 144 – 168 Brown, Robert D., Jonathan Winburn, and Douglass Sullivan-Gonzalez. 2019. “The Value Added of Honors Programs in Recruitment, Retention, and Student Success: Impacts of the Honors College at the University of Mississippi.” pp. 179–201 in The Demonstrable Value of Honors Education: New Research Evidence, edited by A. J. Cognard-Black, J. Herron, and P. J. Smith. National Collegiate Honors Council Monograph Series, Lincoln, NE: National Collegiate Honors Council. Brownstein, Joshua. 2020. “A Pilot Survey on Student Loan Repayment Plan Choice.” Unpublished. https://tinyurl.com/Brownstein2020 Bui, Sa A., Steven G. Craig, and Scott A. Imberman. 2014. “Is Gifted Education a Bright Idea? Assessing the Impact of Gifted and Talented Programs on Students?” American Economic Journal: Economic Policy 6(3) pp. 30 – 62 Calonico, Sebastion, Matias D. Cattaneo, Max H. Farrell, and Rocío Titiunik. 2017. “rdrobust: Software for regression-discontinuity designs.” The Stata Journal 17(2): 372 – 90 404 Calonico, Sebastian, Matias D. Cattaneo, and Rocio Titiunik. 2014. “Robust Nonparamteric Confidence Intervals for Regression-Discontinuity Designs.” Econometrica 82(6): 2295 – 2326 Card, David, and Laura Giuliano. 2014. “Does Gifted Education Work? For Which Students?” NBER Working Paper 20453 https://www.nber.org/papers/w20453 Carrell, Scott E., Richard L. Fullerton, James E. West. 2009. “Does Your Cohort Matter? Measuring Peer Effects in College Achievement.” Journal of Labor Economics 23(3): 439 – 464 Carrell, Scott E., Bruce I. Sacerdote, and James E. West. 2013. “From Natural Variation to Optimal Policy? The Importance of Endogenous Peer Group Formation.” Econometrica 81(3): 855 – 882 Cattaneo, Matias D., Michael Jansson, Xinwei Ma. 2018. “Manipulation Testing Based on Density Discontinuity.” The Stata Journal 18(1): 234 – 261. Chapman, Bruce, and Lorraine Dearden. 2017. “Conceptual and Empirical Issues for Alternative Student Loan Designs: The Significance of Laon Repayment Burdens for the United States”, Annals of the American Academy of Political and Social Science 671(1): 249 - 268 Cohodes, Sarah R. 2020. “The Long-Run Impacts of Specialized Programming for High- Achieving Students.” American Economic Journal: Economic Policy 12(1): 127 – 166 Conlon, John J. 2021 “Major Malfunction: A Field Experiment Correcting Undergraduates’ Beliefs about Salaries.” Journal of Human Resources 56(3): 922 – 939 Cosgrove, John R. 2004. “The Impact of Honors Programs on Undergraduate Academic Performance, Retention, and Graduation.” Journal of the National Collegiate Honors Council 5(2): 45 – 53 Cox, James C., Daniel Kreisman and Susan Dynarski. 2020. “Designed to Fail: Effects of the Default Option and Information Complexity on Student Loan Repayment”. Journal of Public Economics 192 Darolia, Rajeev, and Casandra Harper. 2018. 
“Information Use and Attention Deferment in College Student Loan Decisions: Evidence From a Debt Letter Experiment.” Educational Evaluation and Policy Analysis 40(1): 129 – 150 Diaz, Dulce, Susan P. Farruggia, Meredith E. Wellman, and Bette L. Bottoms. 2019. “Honors Education Has a Positive Effect on College Student Success.” pp. 59–91 in The Demonstrable Value of Honors Education: New Research Evidence, edited by A. J. 91 Cognard-Black, J. Herron, and P. J. Smith. National Collegiate Honors Council Monograph Series, Lincoln, NE: National Collegiate Honors Council. Delavande, Adeline, Xavier Giné, and David McDenzie. 2011. “Measuring Subjective Expectations in Developing Countries: A Critical Review and New Evidence.” Journal of Development Economics 94(2): 151 – 163 Donaldson, Morgaen L., and John P. Papay. 2015. “Teacher Evaluation for Accountability and Development.” pp. 174 – 193. In Handbook of Research in Education Finance and Policy. Edited by Helen F. Ladd and Margaret E. Goertz. Published by Taylor & Francis Group. Figlio, David N., and Helen F. Ladd. 2015. “School Accountability and Student Achievement” pp. 194 – 210. In Handbook of Research in Education Finance and Policy. Edited by Helen F. Ladd and Margaret E. Goertz. Published by Taylor & Francis Group. Figlio, David, and Susanna Loeb. 2011. “School Accountability.” Pp. 383 – 421 In Handbook of the Economics of Education Volume 3. Edited by Eric A. Hanushek, Stephen J. Machin, and Ludger Woessmann. Foote, Andrew, Lisa Schulkind, and Teny M. Shapiro. 2015. “Missed Signals: The Effect of ACT College-Readiness Measures on Post-Secondary Decisions.” Economics of Education Review 46: 39 – 51 Furtwengler, Scott R. 2015. “Effects of Participation in a Post-Secondary Honors Program with Covariate Adjustment Using Propensity Score.” Journal of Advanced Academics 26(4): 274 – 293 Gershenson, Seth. 2018. “Grade Inflation in High Schools (2005 – 2016).” Thomas B. Fordham Institute Report. https://fordhaminstitute.org/national/research/grade-inflation-high- schools-2005-2016 Hartleroad, Gayle. 2005. “Comparison of the Academic Achievement of First-Year Female Honors Program and Non-Honors Program Engineering Students.” Journal of the National Collegiate Honors Council Fall 2005: 109 – 120 Hastings, Justine, Christopher A. Neilson, and Seth D. Zimmerman. 2018. “The Effects of Earnings Disclosure on College Enrollment Decisions.” NBER Working Paper 21300. https://www.nber.org/papers/w21300 Herbst, Daniel. 2023. “The Impact of Income-Driven Repayment on Student Borrower Outcomes.” American Economic Journal: Applied Economics 15(1): 1 - 25 Honeycutt, Jane B. 2019. “Community College Honors Benefits: A Propensity Score Analysis.” pp. 203–27 in The Demonstrable Value of Honors Education: New Research Evidence, edited by A. J. Cognard-Black, J. Herron, and P. J. Smith. National Collegiate Honors 92 Council Monograph Series, Lincoln, NE: National Collegiate Honors Council. Hurwitz, Michael, and Jonathan Smith. 2018. “Student Responsiveness to Earnings Data in the College Scorecard.” Economic Inquiry 56(2): 1220 – 1243 Irby, Latoya. 2019. “What Are the 3 Major Credit Reporting Agencies?” https://www.thebalance.com/who-are-the-three-major-credit-bureaus-960416 (Accessed February 5, 2020). Irby, Latoya. 2020. “Understanding Your Credit Report and the FCRA” https://www.thebalance.com/what-you-should-know-about-the-fcra-960639 (Accessed February 5, 2020). James Monks and Robert M. Schmidt. 2011. 
“The Impact of Class Size on Outcomes in Higher Education.” The B.E. Journal of Economic Analysis & Policy 11(1): Article 62 Keller, Robert. R., and Michael G. Lacy. 2013. “Propensity Score Analysis of an Honors Program’s Contribution to Students’ Retention and Graduation Outcomes.” Journal of the National Collegiate Honors Council 14(2): 73 – 84 Kirkham, Elyssa. 2020. “How Much Money Can I Take Out In Student Loans?”. Student Loan Hero. January 14. https://studentloanhero.com/featured/how-much-student-loans-can-i- get/ Klepfer, Kasey, Chris Ferandez, Carla Fletcher, and Jeff Webster. 2015. “Informed or Overwhelmed? A Legislative History of Student Loan Counseling with a Literature Review on the Efficacy of Loan Counseling”. TG Research and Analytical Services. https://files.eric.ed.gov/fulltext/ED579985.pdf. (Accessed February 10, 2020) Lacy, T. Austin, Johnathan G. Conzelmann, and Nichole D. Smith. 2018. “Federal Income- Driven Repayment Plans and Short-Term Student Loan Outcomes.” Educational Researcher 47(4): 255 – 258 Lane, Ryan. 2020. “Can You Change Your Student Loan Repayment Plan?”. Nerdwallet. January 3. https://www.nerdwallet.com/blog/loans/student-loans/how-much-student- loans-cost/ Lee, David S., Thomas Lemieux. 2010 “Regression Discontinuity Designs in Economics.” Journal of Economic Literature 48: 281 – 355 Lishinski, Alex and Justin Micomonaco. 2020. “A Propensity Score Analysis of the Impact of Honors Program Participation on Student Success Outcomes.” Unpublished. Ma, Jennifer and Matea Pender. 2021. “Trends in College Pricing and Student Aid 2021.” College Board. https://research.collegeboard.org/pdf/trends-college-pricing-student-aid- 2021.pdf (Accessed February 2nd, 2022) 93 McCrary, Justin. 2008. “Manipulation of the running variable in regression discontinuity design: A density test.” Journal of Econometrics 142(2): 698 – 714 Miller, Angie L., and Amber D. Dumford. 2018. “Do High-Achieving Students Benefit From Honors College Participation? A Look at Student Engagement for First-Year Students and Seniors.” Journal for the Education of the Gifted 41(3): 217 – 241 Mueller, M. Holger, and Constantine Yannelis. 2019a. “Students in Distress: Home Prices and Student Loan Default in the Great Recession.” Journal of Financial Economics 131(1): 1 – 19 Muller, M. Holger, and Constantine Yannelis. 2019b. “Reducing Barriers to Enrollment in Federal Student Loan Repayment Plans: Evidence from the Navient Field Experiment.” Unpublished. https://cmepr.gmu.edu/wp-content/uploads/2019/08/MuellerYannelis.pdf Office of the Assistant Secretary for Planning and Evaluation. 2020. “Poverty Guidelines”. U.S. Department of Health and Human Services. January 8. https://aspe.hhs.gov/poverty- guidelines Orr, Cody. 2020. “Clocking Into Work and Out of Class: How College Students Make Their Credit Hour Enrollment and Financing Decisions.” Unpublished. https://orrcody.github.io/research/drafts/into-work-out-of-class.pdf Papay, John P., Richard J. Murnane and John B. Willett. 2016. “The Impact of Test Score Labels on Human-Capital Investment Decisions.” The Journal of Human Resources 52(2): 357 – 388 Pattison, Evangeleen, Eric Grodsky, and Chandra Muller. 2013. “Is the Sky Falling? Grade Inflation and the Signaling Power of Grades.” Educational Researcher 42(5): 259 – 265 Patton, Katie, David Coleman, and Lisa W. Kay. 2019. “High-Impact Honors Practices: Success Outcomes among Honors and Comparable High-Achieving Non-Honors Students at Eastern Kentucky University.” Pp. 
93–114 in The Demonstrable Value of Honors Education: New Research Evidence, edited by A. J. Cognard-Black, J. Herron, and P. J. Smith. National Collegiate Honors Council Monograph Series, Lincoln, NE: National Collegiate Honors Council. Peter G. Peterson Foundation. 2021. “10 Key Facts About Student Debt in the United States.” https://www.pgpf.org/blog/2021/05/10-key-facts-about-student-debt-in-the-united-states (Accessed February 2nd, 2022) Pugatch, Todd, and Paul Thompson. 2022. “Excellence for All? University Honors Programs and Human Capital Formation.” IZA Institute of Labor Economics. IZA DP No. 15354. Rinn, Anne N. 2007. “Effects of Programmatic Selectivity on the Academic Achievement, Academic Self-Concepts, and Aspirations of Gifted College Students.” Gifted Child 94 Quarterly 51(3): 232 – 245 Rinn, Anne N., and Jonathan A. Plucker. 2019. “High-Ability College Students and Undergraduate Honors Programs: A Systematic Review.” Journal for the Education of the Gifted 42(3): 187 – 215 Rotherham, Andrew J. 2006. “Making the Cut: How States Set Passing Scores on Standardized Tests.” Education Sector. Published July 2006 Ruggles, Steven, Sarah Flood, Ronald Goeken, Josiah Grover, Erin Meyer, Jose Pacas and Matthew Sobek. 2020. “IPUMS USA: Version 10.0.” Minneapolis, MN. https://doi.org/10.18128/D010.V10.0 Sacerdote, Bruce. 2011. “Peer Effects in Education: How Might They Work, How Big Are They and How Much Do We Know Thus Far?” Handbook of Economics of Education 3: 249 – 277 Sapelli, Claudio, and Gastón Illanes. 2016. “Class Size and Teacher Effects in Higher Education.” Economics of Education Review 52: 19 – 28 Schleicher, Andreas. 2015. “Are American students overtested? Listen to what students themselves say.” OECD Education and Sills Today. Published November 18, 2015. Scott, Richard I., and Patricia J. Smith. 2016. “Demography of Honors: The National Landscape of Honors Education.” Journal of the National Collegiate Honors Council 17(1) Scott, Richard I., Patricia J. Smith and Andrew J. Cognard-Black. 2017. "Demography of Honors: The Census of U.S. Honors Programs and Colleges." Journal of the National Collegiate Honors Council 18(1): 189 – 224 Scott-Clayton, Judith. 2019. “The Looming Student Loan Default Crisis is Worse than we Thought.” Brookings Institution. Evidence Speaks Reports, Vol 2, #34. https://www.brookings.edu/wp-content/uploads/2018/01/scott-clayton-report.pdf, (Accessed December 10, 2019) Seifert, Tricia A., Ernest T. Pascarella, Nicholas Colangelo, and Susan G. Assouline. 2007. “The Effect of Honors Program Participation on Experiences of Good practices and Learning Outcomes.” Journal of College Student Development 48(1): 57 – 74 Shushock, Frank Jr. 2006. “Student Outcomes and honors Programs: A Longitudinal Study of 172 Honors Students 2000 – 2004.” Journal of the National Collegiate Honors Council Fall/Winter 2006: 85 – 96 Slavin, Charlie, Theodore Coladarci, and Phillip A. Pratt. 2008. “Is Student Participation in an Honors Program Related to Retention and Graduation Rates?” Journal of the National 95 Collegiate Honors Council Fall/Winter 2008: 59 – 69 Smeaton, George, and Margaret Walsh. 2019, “Contributions of Small Honors Programs: The Case of a Public Liberal Arts College.” Pp. 229–52 in The Demonstrable Value of Honors Education: New Research Evidence, edited by A. J. Cognard-Black, J. Herron, and P. J. Smith. National Collegiate Honors Council Monograph Series, Lincoln, NE: National Collegiate Honors Council. Spisak, Art L., and Suzanne Carter Squires. 
2016. “The Effect of Honors Courses on Grade Point Averages.” Journal of the National Collegiate Honors Council 17(2): 103 – 114 Student Borrower Protection Center. 2021. “Education Department’s Decades-Old Debt Trap: How the Mismanagement of Income-Driven Repayment Locked Millions in Debt.” National Consumer Law Center. https://www.nclc.org/uncategorized/issue-brief- education-departments-decades-old-debt-trap-how-the-mismanagement-of-income- driven-repayment-locked-millions-in-debt.html (Accessed February 17th, 2022) U.S. Government Accountability Office. 2015. “Federal student loans: education could do more to help ensure borrowers are aware of repayment and forgiveness options”. GAO- 15-663. Whitsett, Healey, and Rory O’Sullivan. 2012. “Lost Without a Map: A Survey about Students’ Experiences Navigating the Financial Aid Process.” NERA Economic Consulting. https://www.nera.com/content/dam/nera/publications/archive2/PUB_Student_Loan_Borr owers_1012.pdf. (Accessed February 11, 2020) Wiswall, Matthew J., and Basit Zafar. 2015a. “How Do College Students Respond to Public Information about Earnings?” Journal of Human Capital 9(2): 117 – 169 Wiswall, Matthew J., and Basit Zafar. 2015b. “Determinants of College Major Choices: Identification from an Information Experiment.” Review of Economic Studies 82(2): 791 – 824 Wiswall, Matthew J., and Basit Zafar. 2021. “Human Capital Investment and Expectations about Career and Family.” Journal of Political Economy 129(5): 1361 – 1424 Woo, Jennie, H., Alexander H. Bentz, Stephen Lew, Erin Dunlop Velez, and Nichole Smith. 2017. “Repayment of Student Loan as of 2015 Among 1995-96 and 2003-04 First-Time Beginning Students” National Center for Education Statistics. https://nces.ed.gov/pubs2018/2018410.pdf 96 APPENDIX A: CHAPTER 1 APPENDIX A.1 Common Features of Similarly Ranked Honors Programs To learn about honors programs outside of Michigan State University (MSU), I looked online for information about honors programs at similarly ranked U.S. universities. I limited my search to national universities whose U.S. News and World Report 2022 ranking was within 20 spots of MSU’s ranking. In the process I checked the websites of 53 universities for information about the university’s honors program. 50 of those universities had honors programs. 48 of the programs had courses for honors students as a key feature of the program. 35 programs had honors housing, 29 programs had honors advising, 20 programs had priority registration allowing honors students to register for classes before non-honors students, and 20 programs required honors students to complete a thesis or capstone project to finish the program. A.2 Benefits of Being Enrolled in the MSU Honors College Students get a variety of benefits when they join the MSU Honors College. They get a different, more flexible set of general education requirements86. They can enroll in classes on the first day of each enrollment period. This is before most other students at MSU can enroll in courses. They can enroll in courses without being in the course’s required major or having completed the required prerequisites. This may require approval from the department that teaches the course. They can enroll in graduate-level courses as an undergraduate student87. They can enroll in honors courses. These courses are only available to honors students. 
On its website, the MSU Honors College describes the benefits of honors courses over regular courses as having smaller class sizes, covering the material in greater depth, covering the material at a faster pace, and having more classroom interaction.88 They can enroll in honors sections of courses. Courses with large numbers of students are often divided into multiple sections. Generally, all sections of a course are taught by the same professor, take the same exams, and have the same homework assignments. The main difference is that each section attends in-person meetings, such as lectures, at a different time. Honors sections cover the same material and fulfill the same major and prerequisite requirements as non-honors sections. However, compared to non-honors sections, honors sections have many of the benefits of honors courses, such as smaller section sizes and greater depth of coverage. They can meet with honors college advisors. Honors college advisors can help students with a variety of topics, including making plans to fulfill the requirements to graduate from the MSU Honors College, enrolling in courses outside their major, and making course plans consistent with their post-college graduation goals. They can apply to have an honors college peer mentor. Mentors are expected to share their experiences of being in the MSU Honors College and respond to communications from their mentee. Mentors are available to first- and second-year students. They can live on honors-only floors of residence halls. Students on honors-only floors sometimes organize floor-specific events.89 Finally, there are some merit scholarships available only to students enrolled in the MSU Honors College. Some of these scholarships are only available to students accepted into the college from high school,90 while other scholarships are available to all students who are currently members of the college.91 Because only a minority of students in the MSU Honors College receive these scholarships, and because these scholarships are merit based, I do not think they would have much effect on the students near the GPA cutoffs. Therefore, I do not expect them to influence my results.
86 The general education requirements for students enrolled in the MSU Honors College are: one course in introductory writing, two courses in arts and humanities, two lecture classes in natural sciences, and two social science courses. Each course must be 3 or 4 credits. By contrast, the university-wide requirements are: 8 credits in Arts and Humanities, 8 credits in Social, Behavioral, and Economic Sciences, 3 credits in Biological Sciences, 3 credits in Physical Sciences, and 2 credits of lab in either biological or physical sciences. Both the honors and non-honors general education requirements can be at least partially completed using AP, IB, or Dual Enrollment credits. See https://honorscollege.msu.edu/admissions/general-education-requirements.html for the honors college general education requirements and https://reg.msu.edu/Forms/ESAF/IS_DN_FAQ.aspx#IS1 for the non-honors general education requirements.
87 Students pay the same tuition for graduate classes as they do for undergraduate classes. I learned this in an email from an associate dean of the MSU Honors College.
88 https://honorscollege.msu.edu/academics/honors-experiences.html
89 This may not have much effect on students who were admitted to the MSU Honors College when they were freshmen.
While students at MSU are required to live on campus their first year, many students move off campus after their first year. 90 https://honorscollege.msu.edu/admissions/freshman-scholarships.html 91 https://honorscollege.msu.edu/programs/scholarships-for-current-students.html 98 A.3 Alternative Specifications All Students Near Cutoffs A.3.1 Bias-Corrected Results Table A.1 – Discontinuity in Covariates Female First Gen Age First ACT White Black Semester Score92 Above Cutoff -0.0061 0.0185 -0.0032 -0.0416 -0.0154 0.0248** [-0.07,0.05] [-0.02,0.09] [-0.07,0.10] [-0.44,0.42] [-0.06,0.05] [0.01,0.05] Bandwidth 0.186 0.157 0.180 0.144 0.188 0.166 First College-Cohort Y Y Y Y Y Y Fixed Effects Number of 5,613 4,990 5,479 4,113 5,639 5,178 Observations Mean Outcome 0.51 0.28 18 25 0.62 0.09 American Asian Pacific Hawaiian Hispanic Two or More Native Islander Races Above Cutoff 0.0067* -0.0131 -0.0002 -0.0002 -0.0104 -0.0005 [-0.00,0.01] [-0.05,0.01] [-0.00,0.00] [-0.00,0.00] [-0.03,0.01] [-0.02,0.02] Bandwidth 0.212 0.146 0.515 0.169 0.179 0.191 First College-Cohort Y Y Y Y Y Y Fixed Effects Number of 6,124 4,768 12,439 5,238 5,455 5,667 Observations Mean Outcome 0.00 0.05 0.00 0.00 0.04 0.02 Race Not Race Not Reported Requested Above Cutoff 0.0011 0.0059 [-0.01,0.01] [-0.03,0.03] Bandwidth 0.199 0.206 First College-Cohort Y Y Fixed Effects Number of 5,857 6,012 Observations Mean Outcome 0.01 0.16 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. The method for selecting bandwidths and calculating confidence intervals is from Calonico, Cattaneo, and Titiunik (2014). Robust 95% confidence intervals are below the coefficients in brackets. Coefficients are calculated using a triangular kernel. The outcomes are indicator variables for being female, being white, being black, and being a first-generation college student, the student’s age during their first semester at MSU and the student’s ACT score. Mean outcomes for the All GPAs Sample are shown. 92 In results not shown, I test for a discontinuity in the probability a student’s ACT score is missing at the cutoffs. The discontinuity, at a decline of 0.1%, is small and statistically insignificant. 99 Table A.2 – Discontinuity in Ever Being in the Honors College Ever in Ever in Honors Honors College College Above Cutoff 0.2880*** 0.2902*** [0.22,0.34] [0.23,0.34] First College- Y Y Cohort Fixed Effects Covariates N Y Bandwidth 0.148 0.150 Number of 4,798 4,810 Observations Mean Outcome 0.06 0.06 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. The method for selecting bandwidths and calculating confidence intervals is from Calonico, Cattaneo, and Titiunik (2014). Robust 95% confidence intervals are below the coefficients in brackets. Coefficients are calculated using a triangular kernel. Covariates include the student’s age when the entered MSU and indicators for being female, being a specific race, and being a first-generation college student. Robust 95% confidence intervals are below the coefficients in brackets. Mean outcomes for the All GPAs sample are shown. 
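To make the estimation strategy behind these tables concrete, the sketch below estimates a single discontinuity with a local-linear regression and a triangular kernel. It is a minimal illustration under assumed column names (gpa_minus_cutoff for the running variable centered at the cutoff, plus an outcome indicator); the estimates reported in Tables A.1 and A.2 additionally use the bias correction, robust confidence intervals, and data-driven bandwidths of Calonico, Cattaneo, and Titiunik (2014), which in practice would come from their rdrobust software rather than from a hand-rolled regression like this one.

```python
# Minimal local-linear RD sketch with a triangular kernel. Column names are
# hypothetical; the tables above additionally use the CCT (2014) bias-corrected
# procedure with robust confidence intervals and data-driven bandwidths.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def local_linear_rd(df: pd.DataFrame, outcome: str, running: str, h: float):
    """Estimate the jump in `outcome` at a cutoff of zero in `running`,
    using observations within bandwidth h and triangular kernel weights."""
    d = df[np.abs(df[running]) <= h].copy()
    d["above"] = (d[running] >= 0).astype(float)
    weights = 1 - np.abs(d[running]) / h  # weight falls to zero at the edge
    X = sm.add_constant(pd.DataFrame({
        "above": d["above"],
        "run": d[running],
        "run_x_above": d[running] * d["above"],
    }))
    res = sm.WLS(d[outcome], X, weights=weights).fit()
    return res.params["above"], res.bse["above"]

# Example call on hypothetical data:
# jump, se = local_linear_rd(students, outcome="female",
#                            running="gpa_minus_cutoff", h=0.186)
```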
Table A.3 – Intensity of Honors College Participation for Marginal Students Number of Number of Number of Number of Graduating Graduating Semesters Semesters Honors Honors from from in the in the Experiences Experiences Honors Honors Honors Honors College College College College Treatment Effect 7.6377*** 7.5778*** 5.2753*** 5.3360*** 0.5073*** 0.5055*** [6.5,8.4] [6.5,8.4] [4.4,6.2] [4.5,6.2] [0.34,0.63] [0.34,0.63] First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates N Y N Y N Y Bandwidth 0.151 0.136 0.142 0.117 0.157 0.144 Number of 4,870 4,534 4,662 4,004 4,972 4,691 Observations Mean Outcome 7.8 7.8 5.3 5.3 0.52 0.52 Honors Students Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. The method for selecting bandwidths and calculating confidence intervals is from Calonico, Cattaneo, and Titiunik (2014). Robust 95% confidence intervals are below the coefficients in brackets. Coefficients are calculated using a triangular kernel. The coefficients are 2SLS estimates for the treatment effect of ever participating in the MSU Honors College. Covariates include the student’s age when the entered MSU and indicators for being female, being a specific race, and being a first-generation college student. Robust 95% confidence intervals are below the coefficients in brackets. Mean outcomes for honors students in the All GPAs Sample who were not in the honors college during their first semester are shown. 100 Table A.4 – Effect of Honors College Participation on Student Outcomes Graduate Graduate Time to Time to 4th Semester 4th Semester MSU MSU Degree Degree GPA GPA Treatment Effect 0.0431 0.0484 -1.0562*** -1.3930*** 0.0323 0.0697 [-0.08,0.18] [-0.7,0.19] [-2.0, -0.4] [-2.3, -0.7] [-0.07, 0.17] [-0.03, 0.20] First College-Cohort Y Y Y Y Y Y Fixed Effects Covariates N Y N Y N Y Bandwidth 0.108 0.103 0.099 0.090 0.109 0.108 Number of 3,760 3,674 3,126 2,906 3,550 3,545 Observations Mean Outcome 0.79 0.79 13 13 3.0 3.0 8th Semester 8th Semester Total Credit Total Credit Credit Hours Credit Hours GPA GPA Hours Hours 300 Level 300 Level Treatment Effect 0.0612 0.1182* -0.6987 -0.3924 -2.0418 -1.8525 [-0.07, 0.21] [-0.02, 0.28] [-12, 11] [-12, 12] [-7.9, 2.7] [-7.6, 2.9] First College-Cohort Y Y Y Y Y Y Fixed Effects Covariates N Y N Y N Y Bandwidth 0.097 0.087 0.102 0.095 0.110 0.104 Number of 2,827 2,592 3,606 3,367 3,767 3,674 Observations Mean Outcome 3.1 3.1 106 106 25 25 Credit Hours Credit Hours More than More than Number Number 400 Level 400 Level One Degree One Degree Minors Minors Treatment Effect 0.9603 1.5477 -0.0443 -0.0463 -0.1302 -0.1348 [-4.0, 6.0] [-3.4, 6.7] [-0.15, 0.02] [-0.15, 0.02] [-0.39, 0.07] [-0.39, 0.08] First College-Cohort Y Y Y Y Y Y Fixed Effects Covariates N Y N Y N Y Bandwidth 0.099 0.092 0.170 0.150 0.158 0.146 Number of 3,435 3,251 4,769 4,439 5,038 4,768 Observations Mean Outcome 17 17 0.03 0.03 0.15 0.15 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. The method for selecting bandwidths and calculating confidence intervals is from Calonico, Cattaneo, and Titiunik (2014). Robust 95% confidence intervals are below the coefficients in brackets. Coefficients are calculated using a triangular kernel. Coefficients are 2SLS estimates for the treatment effect of ever participating in the MSU Honors College. 
Covariates include the student’s age when they enter MSU and indicators for being female, being a specific race, and being a first-generation college student. Mean outcomes for students in the All GPAs Sample are shown. Time to degree counts summers as 1 semester. For the GPA regressions, 4th semester and 8th semester are calculated ignoring summers. GPA is cumulative GPA at the end of the semester. For more than one degree only students who have at least 1 degree are included in the regression. 101 A.3.2 Additional Bandwidths All Students Near the Cutoffs Table A.5 – Discontinuity in Covariates 1 Female Female First Gen First Gen Age First Age First Semester Semester Above Cutoff -0.0100 0.0004 0.0221 0.0040 -0.0010 -0.0102 (0.0301) (0.0257) (0.0204) (0.0177) (0.0364) (0.0379) First College- Y Y Y Y Y Y Cohort Fixed Effects Bandwidth 0.10 0.20 0.10 0.20 0.10 0.20 Number of 3,472 5,866 3,472 5,866 3,472 5,866 Observations Mean Outcome 0.51 0.51 0.28 0.28 18 18 ACT ACT White White Black Black Score Score Above Cutoff -0.0244 -0.0518 -0.0181 -0.0275 0.0259* 0.0210** (0.2280) (0.1873) (0.0266) (0.0203) (0.0143) (0.0096) First College- Y Y Y Y Y Y Cohort Fixed Effects Bandwidth 0.10 0.20 0.10 0.20 0.10 0.20 Number of 3,028 5,113 3,472 5,866 3,472 5,866 Observations Mean Outcome 25 25 0.62 0.62 0.09 0.09 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. The regressions above are estimated using Equation 1.4 from Section 1.5 of this dissertation. The outcomes are indicator variables for being female, being a first-generation college student, the student’s age during their first semester at MSU, the student’s ACT score and indicator for being white and an indicator for being black. Mean outcomes for the All GPAs Sample are shown. 102 Table A.6 – Discontinuity in Covariates 2 American American Asian Asian Pacific Pacific Native Native Islander Islander Above Cutoff 0.0069** 0.0070** -0.0154 -0.0018 N/A -0.0001 (0.0031) (0.0027) (0.0188) (0.0135) (0.0001) First College- Y Y Y Y Y Y Cohort Fixed Effects Bandwidth 0.10 0.20 0.10 0.20 0.10 0.20 Number of 3,472 5,866 3,472 5,866 3,472 5,866 Observations Mean 0.00 0.00 0.05 0.05 0.00 0.00 Outcome Hawaiian Hawaiian Hispanic Hispanic Two or More Two or More Races Races Above Cutoff -0.0004 -0.0003 -0.0183* -0.0135* 0.0006 -0.0008 (0.0005) (0.0005) (0.0106) (0.0074) (0.0102) (0.0055) First College- Y Y Y Y Y Y Cohort Fixed Effects Bandwidth 0.10 0.20 0.10 0.20 0.10 0.20 Number of 3,472 5,866 3,472 5,866 3,472 5,866 Observations Mean 0.00 0.00 0.04 0.04 0.02 0.02 Outcome Race Not Race Not Race Not Race Not Reported Reported Requested Requested Above Cutoff 0.0001 0.0015 0.0187 0.0145 (0.0062) (0.0046) (0.0171) (0.0162) First College- Y Y Y Y Cohort Fixed Effects Bandwidth 0.10 0.20 0.10 0.20 Number of 3,472 5,866 3,472 5,866 Observations Mean 0.01 0.01 0.16 0.16 Outcome Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. The regressions above are estimated using Equation 1.4 from Section 1.5 of this dissertation. The outcomes are indicator variables for being the race described. Mean outcomes for the All GPAs Sample are shown. 
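The fixed-bandwidth checks in Tables A.5 and A.6 are estimated with Equation 1.4 from Section 1.5, which is not restated in this appendix. As a rough guide to how such a specification can be set up, the sketch below regresses an outcome on an above-cutoff indicator and the running variable interacted with that indicator, adds first college-cohort fixed effects, and clusters standard errors at the first college-cohort level. The exact functional form of Equation 1.4 and the column names here are assumptions, not the dissertation's code.

```python
# Sketch of a fixed-bandwidth reduced-form RD regression consistent with the
# table notes. Column names are hypothetical and the functional form is an
# assumption (Equation 1.4 itself appears in the main text).
import numpy as np
import statsmodels.formula.api as smf

def reduced_form_rd(df, outcome: str, bandwidth: float):
    d = df[np.abs(df["gpa_minus_cutoff"]) <= bandwidth].copy()
    d["above"] = (d["gpa_minus_cutoff"] >= 0).astype(float)
    model = smf.ols(
        f"{outcome} ~ above + gpa_minus_cutoff + above:gpa_minus_cutoff"
        " + C(first_college_cohort)",
        data=d,
    )
    # Cluster standard errors at the first college-cohort level.
    return model.fit(cov_type="cluster",
                     cov_kwds={"groups": d["first_college_cohort"]})

# res = reduced_form_rd(students, "female", bandwidth=0.10)
# print(res.params["above"], res.bse["above"])
```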
103 Table A.7 – Discontinuity in Ever Being in the Honors College Ever in Ever in Ever in Honors Honors Honors College College College Above Cutoff 0.3068*** 0.2859*** 0.3155*** (0.0311) (0.0259) (0.0241) First College- Y Y Y Cohort Fixed Effects Covariates Y Y Y Bandwidth 0.10 0.15 0.20 Number of 3,472 4,829 5,866 Observations Mean Outcome 0.06 0.06 0.06 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. All regressions include the following covariates: the student’s age when the entered MSU and indicators for being female, being a specific race, and being a first-generation college student. Mean outcomes for the All GPAs sample are shown. 104 Table A.8 – Intensity of Honors College Participation for Marginal Students Number of Number of Number of Number of Number of Number of Semesters Semesters Semesters Honors Honors Honors in the in the in the Experiences Experiences Experiences Honors Honors Honors College College College Above Cutoff 7.6817*** 7.8054*** 7.8924*** 5.3137*** 5.2714*** 5.3531*** (0.4869) (0.4422) (0.3833) (0.3878) (0.3789) (0.3243) First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates Y Y Y Y Y Y Bandwidth 0.10 0.15 0.20 0.10 0.15 0.20 Number of 3,472 4,829 5,866 3,472 4,829 5,866 Observations Mean Outcome 7.8 7.8 7.8 5.3 5.3 5.3 College Admits Graduating Graduating Graduating from from from Honors Honors Honors College College College Above Cutoff 0.5126*** 0.5182*** 0.5531*** (0.0599) (0.0578) (0.0489) First College- Y Y Y Cohort Fixed Effects Covariates Y Y Y Bandwidth 0.10 0.15 0.20 Number of 3,472 4,829 5,866 Observations Mean Outcome 0.52 0.52 0.52 College Admits Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. The coefficients are 2SLS estimates for the treatment effect of ever participating in the MSU Honors College. All regressions include the following covariates: the student’s age when the entered MSU and indicators for being female, being a specific race, and being a first-generation college student. Mean outcomes for honors students in the All GPAs Sample whose first semester in the honors college is not their first semester at MSU are shown. 105 Table A.9 – Effect of Honors College Participation on Student Outcomes 1 Graduate Graduate Graduate Time to Time to Time to MSU MSU MSU Degree Degree Degree Above Cutoff 0.0311 0.0178 0.0328 -0.8544** -0.7789** -0.5831** (0.0689) (0.0536) (0.0375) (0.3867) (0.3269) (0.2930) First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates Y Y Y Y Y Y Bandwidth 0.10 0.15 0.20 0.10 0.15 0.20 Number of 3,472 4,829 5,866 3,168 4,403 5,338 Observations Mean Outcome 0.79 0.79 0.79 13 13 13 4th 4th 4th 8th 8th 8th Semester Semester Semester Semester Semester Semester GPA GPA GPA GPA GPA GPA Above Cutoff 0.0518 0.0068 0.0108 0.0851 0.0503 0.0564 (0.0762) (0.0652) (0.0505) (0.0714) (0.0633) (0.0547) First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates Y Y Y Y Y Y Bandwidth 0.10 0.15 0.20 0.10 0.15 0.20 Number of 3,272 4,561 5,542 2,880 4,006 4,854 Observations Mean Outcome 3.0 3.0 3.0 3.1 3.1 3.1 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. Coefficients are 2SLS estimates for the treatment effect of ever participating in the MSU Honors College. 
All regressions include the following covariates: the student’s age when they enter MSU and indicators for being female, being a specific race, and being a first-generation college student. Mean outcomes for students in the All GPAs Sample are shown. Time to degree counts summers as 1 semester. For the GPA regressions, 4th semester and 8th semester are calculated ignoring summers. GPA is cumulative GPA at the end of the semester. For more than one degree only students who have at least 1 degree are included in the regression. 106 Table A.10 – Effect of Honors College Participation on Student Outcomes 2 Total Total Total Credit Credit Credit Credit Credit Credit Hours 300 Hours 300 Hours 300 Hours Hours Hours Level Level Level Above Cutoff -2.4737 -3.3693 -1.5321 -2.0451 -1.9186 -0.2072 (5.4790) (4.8348) (3.6429) (2.1083) (2.0133) (1.5904) First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates Y Y Y Y Y Y Bandwidth 0.10 0.15 0.20 0.10 0.15 0.20 Number of 3,472 4,829 5,866 3,472 4,829 5,866 Observations Mean Outcome 106 106 106 25 25 25 Credit Credit Credit More than More than More than Hours 400 Hours 400 Hours 400 One One One Level Level Level Degree Degree Degree Above Cutoff 0.5808 0.8118 1.4166 -0.0218 -0.0494 -0.0229 (2.4737) (2.1546) (1.7758) (0.0463) (0.0414) (0.0331) First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates Y Y Y Y Y Y Bandwidth 0.10 0.15 0.20 0.10 0.15 0.20 Number of 3,472 4,829 5,866 3,168 4,403 5,338 Observations Mean Outcome 17 17 17 0.03 0.03 0.03 Number Number Number Minors Minors Minors Above Cutoff -0.1538* -0.0987 -0.0229 (0.0841) (0.0766) (0.0331) First College- Y Y Y Cohort Fixed Effects Covariates Y Y Y Bandwidth 0.10 0.15 0.20 Number of 3,472 4,829 5,338 Observations Mean Outcome 0.15 0.15 0.15 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. Coefficients are 2SLS estimates for the treatment effect of ever participating in the MSU Honors College. All regressions include the following covariates: the student’s age when they enter MSU and indicators for being female, being a specific race, and being a first-generation college student. Mean outcomes for students in the All GPAs Sample are shown. Time to degree counts summers as 1 semester. For the GPA regressions, 4th semester and 8th semester are calculated ignoring summers. GPA is cumulative GPA at the end of the semester. For more than one degree only students who have at least 1 degree are included in the regression. 107 A.4 Results Using Doughnut Sample The doughnut sample is the All GPAs Sample without students whose GPA minus the GPA cutoff they face to be invited into the MSU Honors College is between -0.01 and 0.01. Table A.11 – Discontinuity in Covariates Female First Gen Age First ACT White Black Semester Score Above Cutoff -0.0024 -0.0309 -0.0121 -0.1005 0.0165 0.0051 (0.0343) (0.0256) (0.0535) (0.2425) (0.0288) (0.0114) First College- Y Y Y Y Y Y Cohort Fixed Effects Mean Outcome 0.51 0.28 18 25 0.62 0.09 American Asian Pacific Hawaiian Hispanic Two or Native Islander More Races Above Cutoff 0.0058* -0.0074 0.0004 -0.0003 -0.0070 -0.0006 (0.0033) (0.0146) (0.0004) (0.0003) (0.0098) (0.0096) First College- Y Y Y Y Y Y Cohort Fixed Effects Mean Outcome 0.00 0.05 0.00 0.00 0.04 0.02 Race Not Race Not Reported Requested Above Cutoff 0.0021 -0.0146 (0.0064) (0.0226) First College- Y Y Cohort Fixed Effects Mean Outcome 0.01 0.16 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. 
All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. Bandwidth = 0.15. The regressions above are estimated using Equation 1.4 from Section 1.5 of this dissertation. N = 4,317 except for ACT Score where N = 3,763. The outcomes are indicator variables for being female, being a specific race, being a first-generation college student, the student’s age during their first semester at MSU and the student’s ACT score. Mean outcomes for the All GPAs Sample are shown. 108 Table A.12 – Discontinuity in Ever Being in the Honors College Ever in Ever in Honors Honors College College Above Cutoff 0.3020*** 0.3001*** (0.0345) (0.0342) First College- Y Y Cohort Fixed Effects Covariates N Y Mean Outcome 0.06 0.06 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. N = 4,317. Bandwidth = 0.15. Standard errors are clustered at the first college – cohort level. Covariates include the student’s age when they entered MSU and indicators for being female, being a specific race, and being a first-generation college student. Mean outcomes for the All GPAs sample are shown. Table A.13 – Intensity of Honors College Participation for Marginal Students Number of Number of Number of Number of Graduating Graduating Semesters Semesters Honors Honors from from in the in the Experiences Experiences Honors Honors Honors Honors College College College College Treatment Effect 7.6684*** 7.6440*** 4.9822*** 4.9769*** 0.4849*** 0.4831*** (0.5172) (0.5213) (0.4540) (0.4546) (0.0745) (0.0749) First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates N Y N Y N Y Mean Outcome 7.8 7.8 5.3 5.3 0.52 0.52 College Admits Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. N = 4,317. Bandwidth = 0.15. Standard errors are clustered at the first college – cohort level. The coefficients are 2SLS estimates for the treatment effect of ever participating in the MSU Honors College. Covariates include the student’s age when they entered MSU and indicators for being female, being a specific race, and being a first-generation college student. Mean outcomes for honors students in the All GPAs Sample whose first semester in the honors college is not their first semester at MSU are shown (College Admits). 
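The treatment-effect estimates in this appendix are 2SLS estimates in which the above-cutoff indicator instruments for ever being in the MSU Honors College. A minimal fuzzy-RD sketch of that setup is below; the column names are hypothetical, the running-variable controls mirror the reduced form sketched earlier, and the demographic covariates are omitted for brevity.

```python
# Minimal fuzzy-RD 2SLS sketch: the above-cutoff indicator instruments for
# ever being in the Honors College, with first college-cohort fixed effects
# and clustered standard errors. Column names are hypothetical.
import numpy as np
import pandas as pd
from linearmodels.iv import IV2SLS

def fuzzy_rd_2sls(df: pd.DataFrame, outcome: str, bandwidth: float):
    d = df[np.abs(df["gpa_minus_cutoff"]) <= bandwidth].copy()
    d["above"] = (d["gpa_minus_cutoff"] >= 0).astype(float)
    exog = pd.get_dummies(d["first_college_cohort"], prefix="fc",
                          drop_first=True, dtype=float)
    exog.insert(0, "const", 1.0)
    exog["run"] = d["gpa_minus_cutoff"]
    exog["run_x_above"] = d["gpa_minus_cutoff"] * d["above"]
    res = IV2SLS(
        dependent=d[outcome],
        exog=exog,
        endog=d["ever_in_honors_college"],
        instruments=d["above"],
    ).fit(cov_type="clustered", clusters=d["first_college_cohort"])
    return res

# res = fuzzy_rd_2sls(students, "graduate_msu", bandwidth=0.15)
# print(res.params["ever_in_honors_college"])
```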
109 Table A.14 – Effect of Honors College Participation on Student Outcomes Graduate Graduate Time to Time to 4th 4th MSU MSU Degree Degree Semester Semester GPA GPA Treatment Effect 0.0117 0.0104 -0.2730 -0.2717 -0.0892 -0.0962 (0.0552) (0.0545) (0.4686) (0.4482) (0.0737) (0.0726) First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates N Y N Y N Y Number of 4,317 4,317 3,934 3,934 4,079 4,079 Observations Mean Outcome 0.79 0.79 13 13 3.0 3.0 8th 8th Total Total Credit Credit Semester Semester Credit Credit Hours Hours GPA GPA Hours Hours 300 300 Level Level Treatment Effect -0.0387 -0.0412 -2.5772 -2.8455 -2.6998 -2.9276 (0.0744) (0.0695) (5.3452) (5.4208) (2.2418) (2.2677) First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates N Y N Y N Y Number of 3,575 3,575 4,317 4,317 4,317 4,317 Observations Mean Outcome 3.1 3.1 106 106 25 25 Credit Credit More More Number Number Hours Hours than than One Minors Minors 400 400 One Degree Level Level Degree Treatment Effect 1.9152 1.7142 -0.0387 -0.0413 -0.1356 -0.1437 (2.3074) (2.3072) (0.0441) (0.0440) (0.0998) (0.1031) First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates N Y N Y N Y Number of 4,317 4,317 3,934 3,934 4,317 4,317 Observations Mean Outcome 17 17 0.03 0.03 0.15 0.15 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. Coefficients are 2SLS estimates for the treatment effect of ever participating in the MSU Honors College. Covariates include the student’s age when they enter MSU and indicators for being female, being a specific race, and being a first-generation college student. For all regressions the bandwidth is 0.15. Mean outcomes for students in the All GPAs Sample are shown. Time to degree counts summers as 1 semester. For the GPA regressions, 4th semester and 8th semester are calculated ignoring summers. GPA is cumulative GPA at the end of the semester. For more than one degree only students who have at least 1 degree are included in the regression. 110 A.5 Discontinuity in Honors Students Admitted in High School A student is identified as being admitted into the MSU Honors College when they are in high school if the student is enrolled in the MSU Honors College during their first term at MSU. Figure A.1 – Discontinuity in the Proportion of Honors Students Admitted in High School Notes: N = 4,829. Only students who have a running variable between -0.15 and 0.15 are included in the graph. I define an honors student admitted in high school as a student who was in the MSU Honors College during their first semester at MSU. Each dot is the proportion of honors students admitted in high school whose running variable is an element of [x, x + 0.01). For the left most dot x = -0.15. Table A.15 – Discontinuity in Honors Students Admitted in High School High High School School Honors Honors College College Admit Admit Above Cutoff -0.0082 -0.0066 (0.0160) (0.0160) First College- Y Y Cohort Fixed Effects Covariates N Y Mean Outcome 0.03 0.03 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. N = 4,829. Bandwidth = 0.15. Standard errors are clustered at the first college – cohort level. Covariates include the student’s age when the entered MSU and indicators for being female, being a specific race, and being a first-generation college student. Mean outcomes for the All GPAs sample are shown. 
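Two sample-construction steps used in this part of the appendix are easy to state in code: the doughnut sample drops students whose GPA is within 0.01 grade points of the cutoff they face, and Figure A.1 plots the proportion of high-school honors admits within 0.01-wide bins of the running variable. The sketch below, with hypothetical column names, shows both.

```python
# Sketch of the doughnut-sample restriction and the binned proportions
# underlying Figure A.1. Column names are hypothetical.
import numpy as np
import pandas as pd

def doughnut_sample(df: pd.DataFrame) -> pd.DataFrame:
    """Drop students whose GPA is within 0.01 grade points of their cutoff."""
    return df[np.abs(df["gpa_minus_cutoff"]) >= 0.01].copy()

def binned_proportions(df: pd.DataFrame, outcome: str,
                       lo: float = -0.15, hi: float = 0.15,
                       width: float = 0.01) -> pd.Series:
    """Mean of `outcome` within each [x, x + width) bin of the running
    variable, as plotted in Figure A.1."""
    d = df[(df["gpa_minus_cutoff"] >= lo) & (df["gpa_minus_cutoff"] < hi)].copy()
    d["bin"] = lo + width * np.floor((d["gpa_minus_cutoff"] - lo) / width)
    return d.groupby("bin")[outcome].mean()

# Example: share of students already in the Honors College during their first
# semester (the definition of a high school admit used in Section A.5).
# shares = binned_proportions(students, "hs_honors_admit")
```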
111 A.6 Alternative Specifications Heterogeneity Analysis Table A.16 – Male and Female Treatment Effect of Honors College Participation Additional Bandwidths 1 Graduate Graduate Time to Time to 4th 4th MSU MSU Degree Degree Semester Semester GPA GPA In Honors 0.1354 0.0291 -1.7950** -1.2989** 0.1188 0.0390 College (0.1269) (0.0767) (0.7479) (0.5248) (0.1106) (0.0890) In Honors -0.1822 -0.0129 1.8640** 1.4339** -0.1649 -0.0772 College * Female (0.1550) (0.1020) (0.9112) (0.6578) (0.1279) (0.1100) P(In Honors 0.57 0.75 0.89 0.73 0.64 0.55 College + Interaction) First College- Y Y Y Y Y Y Cohort Fixed Effects Bandwidth 0.10 0.20 0.10 0.20 0.10 0.20 Number of 3,472 5,866 3,168 5,338 3,272 5,542 Observations 8th 8th Total Total Credit Credit Semester Semester Credit Credit Hours 300 Hours 300 GPA GPA Hours Hours Level Level In Honors 0.1443 0.0955 0.6590 -3.9978 -1.2186 -1.4840 College (0.1383) (0.0937) (9.9098) (6.3313) (3.9977) (2.7672) In Honors -0.1686 -0.1129 -4.3632 4.0566 -1.3919 2.0709 College * Female (0.1313) (0.1042) (12.6867) (7.7112) (7.0457) (4.5856) P(In Honors 0.71 0.78 0.60 0.99 0.53 0.83 College + Interaction) First College- Y Y Y Y Y Y Cohort Fixed Effects Bandwidth 0.10 0.20 0.10 0.20 0.10 0.20 Number of 2,880 4,854 3,472 5,866 3,472 5,866 Observations Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. Regressions are 2SLS regressions where Above Cutoff and Above Cutoff * Female are instruments for In Honors College and In Honors College * Female. Time to degree only uses students who graduated and counts summers as 1 semester. For the GPA regressions, 4th semester and 8th semester are calculated ignoring summers. GPA is cumulative GPA at the end of the semester. 112 Table A.17 – Male and Female Treatment Effect of Honors College Participation Additional Bandwidths 2 Credit Credit More than More than Number Number Hours 400 Hours 400 One One Minors Minors Level Level Degree Degree In Honors 1.8041 1.8209 0.0770 0.0336 -0.0801 0.0814 College (4.8643) (3.0617) (0.0636) (0.0405) (0.1373) (0.1096) In Honors -2.4450 -1.3925 -0.1725* -0.0946 -0.1296 -0.2337 College * Female (5.6407) (3.5017) (0.0923) (0.0699) (0.2147) (0.1634) P(In Honors 0.81 0.83 0.15 0.25 0.12 0.12 College + Interaction) First College- Y Y Y Y Y Y Cohort Fixed Effects Bandwidth 0.10 0.20 0.10 0.20 0.10 0.20 Number of 3,472 5,866 3,168 5,338 3,472 5,866 Observations Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. Regressions are 2SLS regressions where Above Cutoff and Above Cutoff * Female are instruments for In Honors College and In Honors College * Female. For more than one degree only students who have at least 1 degree are included in the regression. 
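In the heterogeneity tables, Above Cutoff and Above Cutoff * Female instrument for In Honors College and In Honors College * Female, and the row labeled P(In Honors College + Interaction) is the p-value for the sum of the two endogenous-variable coefficients, which is the treatment effect for women. The sketch below shows one way to estimate that specification and form the linear-combination test from the 2SLS covariance matrix; the same recipe applies to the first-generation interactions in Tables A.19 through A.21. Column names are hypothetical and the running-variable and covariate controls are abbreviated.

```python
# Sketch of the interacted 2SLS specification and the test of the summed
# coefficients (the treatment effect for women). Column names are
# hypothetical and control variables are abbreviated.
import numpy as np
import pandas as pd
from scipy import stats
from linearmodels.iv import IV2SLS

def heterogeneous_2sls(df: pd.DataFrame, outcome: str, bandwidth: float):
    d = df[np.abs(df["gpa_minus_cutoff"]) <= bandwidth].copy()
    d["above"] = (d["gpa_minus_cutoff"] >= 0).astype(float)
    d["honors_x_female"] = d["ever_in_honors_college"] * d["female"]
    d["above_x_female"] = d["above"] * d["female"]
    exog = pd.get_dummies(d["first_college_cohort"], prefix="fc",
                          drop_first=True, dtype=float)
    exog.insert(0, "const", 1.0)
    exog["female"] = d["female"]
    exog["run"] = d["gpa_minus_cutoff"]
    exog["run_x_above"] = d["gpa_minus_cutoff"] * d["above"]
    res = IV2SLS(
        dependent=d[outcome],
        exog=exog,
        endog=d[["ever_in_honors_college", "honors_x_female"]],
        instruments=d[["above", "above_x_female"]],
    ).fit(cov_type="clustered", clusters=d["first_college_cohort"])

    # Treatment effect for women = main effect + interaction.
    b, V = res.params, res.cov
    total = b["ever_in_honors_college"] + b["honors_x_female"]
    se = np.sqrt(V.loc["ever_in_honors_college", "ever_in_honors_college"]
                 + V.loc["honors_x_female", "honors_x_female"]
                 + 2 * V.loc["ever_in_honors_college", "honors_x_female"])
    p_value = 2 * (1 - stats.norm.cdf(abs(total / se)))
    return res, total, se, p_value
```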
113 Table A.18 – Male and Female Treatment Effect of Honors College Participation Doughnut Sample Graduate Time to 4th 8th Total Credit MSU Degree Semester Semester Credit Hours GPA GPA Hours 300 In Honors College -0.0413 -0.9605 -0.0416 0.0002 -9.1723 -0.8313 (0.1085) (0.8465) (0.1108) (0.1032) (9.6000) (4.1556) In Honors College * 0.0916 1.1793 -0.0824 -0.0630 11.3384 -3.0823 Female (0.1346) (0.9157) (0.1347) (0.1060) (11.6801) (5.7875) P(In Honors College + 0.45 0.65 0.17 0.45 0.74 0.23 Interaction) First College-Cohort Y Y Y Y Y Y Fixed Effects Number of 4,317 3,934 4,079 3,575 4,317 4,317 Observations Mean Outcome Males 0.77 13 3.0 3.1 104 25 Mean Outcome 0.81 12 3.1 3.2 107 25 Females Credit More Number Hours Than Minors 400 One Degree In Honors College -2.7446 -0.0033 -0.0955 (3.9353) (0.0610) (0.1828) In Honors College * 7.8089* -0.0633 -0.0736 Female (4.5810) (0.0929) (0.2479) P(In Honors College + 0.07 0.32 0.22 Interaction) First College-Cohort Y Y Y Fixed Effects Number of 4,317 3,934 4,317 Observations Mean Outcome Males 16 0.02 0.12 Mean Outcome 18 0.03 0.17 Females Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Students with a GPA within 0.01 grade points of the cutoff have been removed from the sample. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. Regressions are 2SLS regressions where Above Cutoff and Above Cutoff * Female are instruments for In Honors College and In Honors College * Female. All regressions have a bandwidth of 0.15. Time to degree only uses students who graduated and counts summers as 1 semester. For the GPA regressions, 4th semester and 8th semester were calculated ignoring summers. GPA is cumulative GPA at the end of the semester. Mean outcomes are for all male or all female students in the All GPAs Sample. For more than one degree only students who have at least 1 degree are included in the regression. 114 Table A.19 – First Gen and Second and Above Gen Treatment Effect of Honors College Participation Additional Bandwidths 1 Graduate Graduate Time to Time to 4th 4th MSU MSU Degree Degree Semester Semester GPA GPA In Honors College -0.0254 -0.0339 -0.6884 -0.4959 0.0258 -0.0241 (0.0656) (0.0411) (0.4616) (0.3430) (0.0863) (0.0563) In Honors College 0.2797* 0.2810** -0.3037 -0.0162 0.0176 0.1078 * First Gen (0.1555) (0.1291) (1.4093) (0.8408) (0.1843) (0.1291) P(In Honors 0.12 0.04 0.42 0.50 0.80 0.48 College + Interaction) First College- Y Y Y Y Y Y Cohort Fixed Effects Bandwidth 0.10 0.20 0.10 0.20 0.10 0.20 Number of 3,472 5,866 3,168 5,338 3,272 5,542 Observations 8th 8th Total Total Credit Credit Semester Semester Credit Credit Hours 300 Hours 300 GPA GPA Hours Hours Level Level In Honors College 0.0187 0.0075 -7.7021 -7.1938* -3.1778 -1.8035 (0.0772) (0.0530) (5.2575) (4.2709) (2.2914) (1.8219) In Honors College 0.1397 0.1239 29.1312* 27.7376** 5.7985 7.4295 * First Gen (0.2217) (0.1700) (15.0205) (11.4473) (6.2156) (4.6394) P(In Honors 0.47 0.44 0.17 0.05 0.65 0.18 College + Interaction) First College- Y Y Y Y Y Y Cohort Fixed Effects Bandwidth 0.10 0.20 0.10 0.20 0.10 0.20 Number of 2,880 4,854 3,472 5,866 3,472 5,866 Observations Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. Regressions are 2SLS regressions where Above Cutoff and Above Cutoff * First Gen are instruments for In Honors College and In Honors College * First Gen. 
Time to degree only uses students who graduated and counts summers as 1 semester. For the GPA regressions, 4th semester and 8th semester are calculated ignoring summers. GPA is cumulative GPA at the end of the semester. 115 Table A.20 – First Gen and Second and Above Gen Treatment Effect of Honors College Participation Additional Bandwidths 2 Credit Credit More than More than Number Number Hours 400 Hours 400 One One Minors Minors Level Level Degree Degree In Honors College -0.1565 0.8016 -0.0255 -0.0069 -0.2177* -0.0523 (2.5708) (2.0809) (0.0489) (0.0318) (0.1203) (0.0775) In Honors College 2.7874 1.3236 0.0230 -0.0673 0.3569 0.0059 * First Gen (6.6946) (4.8042) (0.0956) (0.0669) (0.2534) (0.2303) P(In Honors 0.68 0.61 0.98 0.29 0.45 0.81 College + Interaction) First College- Y Y Y Y Y Y Cohort Fixed Effects Bandwidth 0.10 0.20 0.10 0.20 0.10 0.20 Number of 3,472 5,866 3,168 5,338 3,472 5,866 Observations Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. Regressions are 2SLS regressions where Above Cutoff and Above Cutoff * First Gen are instruments for In Honors College and In Honors College * First Gen. Time to degree only uses students who graduated and counts summers as 1 semester. For more than one degree only students who have at least 1 degree are included in the regression. 116 Table A.21 –First Gen and Second and Above Gen Treatment Effect of Honors College Participation Doughnut Sample Graduate Time to 4th 8th Total Credit MSU Degree Semester Semester Credit Hours GPA GPA Hours 300 Level In Honors College -0.0544 -0.3213 -0.1408* -0.0939 -7.7197 -4.0809* (0.0558) (0.4796) (0.0730) (0.0627) (5.5735) (2.3869) In Honors College * 0.2992** 0.2762 0.1962 0.2078 23.7289* 6.0180 First Gen (0.1442) (1.2199) (0.1454) (0.1799) (13.7509) (5.5272) P(In Honors 0.08 0.97 0.72 0.57 0.22 0.71 College + Interaction) First College- Y Y Y Y Y Y Cohort Fixed Effects Number of 4,317 3,934 4,079 3,575 4,317 4,317 Observations Mean Outcome 2nd 0.82 12 3.1 3.2 107 25 and Above Gen Mean Outcome 0.73 13 2.9 3.0 101 22 First Gen Credit More Number Hours Than Minors 400 One Level Degree In Honors College 0.6059 -0.0522 -0.1550 (2.6169) (0.0499) (0.1152) In Honors College * 5.8473 0.0595 0.0824 First Gen (6.2468) (0.0759) (0.2929) P(In Honors 0.25 0.91 0.78 College + Interaction) First College- Y Y Y Cohort Fixed Effects Number of 4,317 3,934 4,317 Observations Mean Outcome 2nd 17 0.03 0.15 and Above Gen Mean Outcome 16 0.03 0.14 First Gen Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Students with a GPA within 0.01 grade points of the cutoff have been removed from the sample. All regressions include first college-cohort fixed effects. Standard errors are clustered at the first college – cohort level. Regressions are 2SLS regressions where Above Cutoff and Above Cutoff * First Gen are instruments for In Honors College and In Honors College * First Gen. All have a bandwidth of 0.15. The regression for time to degree only includes students who graduated and counts summers as 1 semester. For the GPA regressions, 4th semester and 8th semester are calculated ignoring summers. GPA is cumulative GPA at the end of the semester. Mean outcomes are for 2nd and above generation or first-generation students in the All GPAs Sample. 
117 A.7 Additional Outcomes Analysis Sample Table A.22 – Effect of Honors College Participation on Student Outcomes Time to Time to Retention Retention Retention Retention Degree Degree to 4th to 4th to 8th to 8th Ignoring Ignoring Semester Semester Semester Semester Summers Summers Treatment Effect -0.2400 -0.3294 0.0442 0.0441 0.0230 0.0189 (0.2440) (0.2395) (0.0404) (0.0409) (0.0670) (0.0682) First College-Cohort Y Y Y Y Y Y Fixed Effects Covariates N Y N Y N Y Number of 3,812 3,812 4,829 4,829 4,829 4,829 Observations 2nd 2nd 3rd 3rd 5th 5th Semester Semester Semester Semester Semester Semester GPA GPA GPA GPA GPA GPA Treatment Effect -0.0279 -0.0158 -0.0229 -0.0018 -0.0001 0.0316 (0.0357) (0.0358) (0.0541) (0.0539) (0.0693) (0.0668) First College-Cohort Y Y Y Y Y Y Fixed Effects Covariates N Y N Y N Y Number of 4,750 4,750 4,608 4,608 4,457 4,457 Observations Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Standard errors are clustered at the first college – cohort level. All regressions include first college-cohort fixed effects. Coefficients are 2SLS estimates for the treatment effect of ever participating in the MSU Honors College. Covariates include the student’s age when the entered MSU and indicators for being female, being a specific race, and being a first-generation college student. For all regressions the bandwidth is 0.15. For time to degree ignoring summers all students who got their first degree during a summer semester are dropped and summer semesters count as 0 semesters. For the GPA regressions, the semester numbers are calculated ignoring summers. GPA is cumulative GPA at the end of the semester. Retention to semester X is measured as having a cumulative GPA at the end of semester X with semester number calculated ignoring summers. 118 Table A.23 – Effect of Honors College Participation on Student Outcomes 6th 6th 7th 7th Retention Retention Semester Semester Semester Semester to 2nd to 2nd GPA GPA GPA GPA Semester Semester Treatment Effect 0.0204 0.0517 0.0317 0.0660 0.0217 0.0216 (0.0717) (0.0677) (0.0722) (0.0679) (0.0245) (0.0250) First College-Cohort Y Y Y Y Y Y Fixed Effects Covariates N Y N Y N Y Number of 4,387 4,387 4,264 4,264 4,829 4,829 Observations Retention Retention Retention Retention Retention Retention to 3rd to 3rd to 5th to 5th to 6th to 6th Semester Semester Semester Semester Semester Semester Treatment Effect 0.0370 0.0380 -0.0275 -0.0258 -0.0319 -0.0299 (0.0377) (0.0379) (0.0500) (0.0485) (0.0510) (0.0502) First College-Cohort Y Y Y Y Y Y Fixed Effects Covariates N Y N Y N Y Number of 4,829 4,829 4,829 4,829 4,829 4,829 Observations Retention Retention to 7th to 7th Semester Semester Treatment Effect -0.0578 -0.0589 (0.0508) (0.0505) First College-Cohort Y Y Fixed Effects Covariates N Y Number of 4,829 4,829 Observations Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Standard errors are clustered at the first college – cohort level. All regressions include first college-cohort fixed effects. Coefficients are 2SLS estimates for the treatment effect of ever participating in the MSU Honors College. Covariates include the student’s age when the entered MSU and indicators for being female, being a specific race, and being a first-generation college student. For all regressions the bandwidth is 0.15. For the GPA regressions, the semester numbers are calculated ignoring summers. GPA is cumulative GPA at the end of the semester. 
Retention to semester X is measured as having a cumulative GPA at the end of semester X with semester number calculated ignoring summers. 119 Table A.24 – Effect of Honors College Participation on Major of Bachelor’s Degree Agriculture and Agriculture and Biology Biology Business Business Natural Natural and Health and Health and and Resources Resources Degree Degree Economics Economics Degree Degree Degree Degree Treatment Effect 0.0236 0.0246 -0.0365 -0.0279 0.0571 0.0573 (0.0249) (0.0249) (0.0601) (0.0597) (0.0526) (0.0536) First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates N Y N Y N Y Mean Outcome 0.03 0.03 0.13 0.13 0.24 0.24 Communications Communications Education Education Engineering Engineering Degree Degree Degree Degree and and Architecture Architecture Degree Degree Treatment Effect -0.0123 -0.0125 -0.0113 -0.0076 -0.0374 -0.0365 (0.0247) (0.0253) (0.0331) (0.0323) (0.0580) (0.0572) First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates N Y N Y N Y Mean Outcome 0.04 0.04 0.03 0.03 0.07 0.07 Information Information Liberal Liberal Physical Physical Technology Technology Arts Arts Sciences Sciences Degree Degree Degree Degree and Math and Math Degree Degree Treatment Effect 0.0021 0.0012 0.0153 0.0117 -0.0344 -0.0340 (0.0535) (0.0546) (0.0384) (0.0388) (0.0348) (0.0352) First College- Y Y Y Y Y Y Cohort Fixed Effects Covariates N Y N Y N Y Mean Outcome 0.04 0.04 0.04 0.04 0.02 0.02 Social Sciences Social Sciences Vocational Vocational Degree Degree Degree Degree Treatment Effect 0.0571 0.0551 -0.0060 -0.0089 (0.0470) (0.0467) (0.0175) (0.0177) First College- Y Y Y Y Cohort Fixed Effects Covariates N Y N Y Mean Outcome 0.14 0.14 0.02 0.02 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. N = 4,829. All regressions include first college- cohort fixed effects. Standard errors are clustered at the first college – cohort level. Coefficients are 2SLS estimates for the treatment effect of ever participating in the MSU Honors College. Covariates include the student’s age when they enter MSU and indicators for being female, and being a specific race, and being a first-generation college student. For all regressions the bandwidth is 0.15. Mean outcomes for students in the All GPAs Sample are shown. Only students who earn a bachelor’s degree at MSU are included in the regressions. 120 APPENDIX B: CHAPTER 2 APPENDIX B.1 Survey Screenshots93 Figure B.1 - All Graduates Income Treatment If the survey respondent sees the All-Graduates Income Treatment, they are not allowed to continue to the next page of the survey until they typed 53268 into the textbox. This is done to ensure that survey respondents process the information they see. There is a typo in the above figure. The calculation for the typical income of college graduates is made using the 2019 5-year ACS data. This data covers the year 2015 – 2019, not 2014 – 2019 which is written in the survey. 93 Screenshots for the rest of survey available upon request. 121 Figure B.2 - Major Specific Income Treatment Primary Major Agribusiness Management The major or group of majors and the income that is shown is based on what the survey respondent indicated their primary major is. Individuals who see the above page are not allowed to continue with the survey until they typed 44300 into the textbox. Survey respondents with other majors who see the Major Specific Income Treatment also have to type in the typical income shown before they can continue to the next page of the survey. 
122 Figure B.3 - Question to Elicit Own Income Expectations This question is asked twice: once before the survey respondent sees the treatment and once after the survey respondent sees the treatment. The survey respondent is not allowed to continue with the survey until the number of balls they place in the various income ranges equaled 10. A valid response with 10 balls in the 5 income ranges is shown in the above picture. However, survey respondents first see this question with no balls in any of the income ranges. 123 Figure B.4 - Intro to Question Eliciting Repayment Plan Preferences 124 Figure B.5 - Repayment Plan Choice Question Table Describing the Two Repayment Plans In the survey this table appears directly below the information in Figure B.4. 125 Figure B.6 - Repayment Plan Choice Question Payment Information Tables In the survey these tables appear directly below the table in Figure B.5. The $318.20 monthly payment is calculated using the bankrate.com loan calculator (URL: https://www.bankrate.com/calculators/mortgages/loan-calculator.aspx). The monthly payment is for a $30,000 loan with a loan term of 10 years at a 5% interest rate. The estimate of Total Amount Paid = $318.20 payment * 120 monthly payments over 10 years. The 5% increase in income comes from the U.S. Department of Education’s Loan Simulator. On the page where individuals provide their yearly salary and how much their incomes grow each year the default income growth is 5%. On that page is written “*According to a U.S. Department of Education and U.S. Department of Treasury analysis of a representative sample of actual student loan borrower incomes, the borrower incomes increase, on average, at a rate of 5% per year.” See https://studentaid.gov/loan-simulator/repayment/wizard/personal- 126 info/income-info accessed January 20th, 2022. Figure B.7 - Repayment Plan Choice Question In the survey Figure B.7 appears directly below Figure B.6. 127 Figure B.8 - Test of Understanding of Repayment Plans Introduction 128 Figure B.9 - Test of Understanding of Repayment Plans Questions 1 and 2 In the survey Figure B.9 appears directly below Figure B.8. The answer to Question 1 is $0. This is the formula to calculate a monthly payment on Repayment Plan 1 given an annual income at or above $20,000: 0.1 * (Annual Income – $20,000) / 12. Plugging $20,000 for Annual Income into the formula the expression equals $0. According to 41% of respondents answer this question correctly. The answer to Question 2 is $200. This is the formula to calculate a monthly payment on Repayment Plan 1 given an annual income at or above $20,000: 0.1 * (Annual Income – $20,000) / 12. Plugging $44,000 for Annual Income into the formula the expression equals $200. 38% of respondents answer this question correctly. 129 Figure B.10 - Test of Understanding of Repayment Plans Questions 3 and 4 In the survey Figure B.10 appears directly below Figure B.9. The answer to Question 3 is $400. The description of the plans in Figure B.8 says if you are on Repayment Plan 2 your payments do not change when your income changes. Therefore, the payment stays at $400. 66% of respondents answered this question correctly. The answer to Question 4 is $0. The description of the plans in Figure B.8 says “Any remaining loan balance after 20 years of payments is forgiven.” Therefore, at the end of the borrower’s 20th year their $10,000 was forgiven. Therefore, during the borrower’s 21st year they are not required to make any payments. 
39% of respondents answered this question correctly. Survey respondents see Figures B.8, B.9, and B.10 on the same page. That page comes after survey respondents are asked for their repayment plan choice a second time and before questions about covariates like age and gender. 130 B.2 Summary Statistics Table B.1 - Number of Majors Number Frequency Percent of Analysis Sample 1 1208 86.5% 2 180 12.9% 3 5 0.4% 4 or More 3 0.2% Table B.2 - Gender Gender Frequency Percent of Analysis Sample Female 826 59.2% Male 548 39.3% Other 22 1.6% Table B.3 - Race Race Frequency Percent of Analysis Sample American Indian or 2 0.1% Alaska Native Asian 118 8.5% Black or African 74 5.3% American Native Hawaiian or 6 0.4% Pacific Islander White 1135 81.3% Other 61 4.4% Table B.4 - Student Loans How much student loan Frequency Percent of Analysis debt the respondent had Sample when they answered the survey $0 567 40.6% $1 - $10,000 217 15.5% $10,001 - $20,000 226 16.2% $20,001 - $30,000 174 12.5% $30,001 - $40,000 78 5.6% $40,001 - $50,000 48 3.4% Greater than $50,000 86 6.2% 131 Table B.5 - Pell Grant Has the respondent Frequency Percent of every had a Pell Analysis Grant? Sample Yes 428 30.7% No 968 69.3% Table B.6 - First Generation College Student Is the respondent a Frequency Percent of first-generation Analysis Sample college student? Yes 267 19.1% No 1129 80.9% Table B.7 - Summary Statistics Continuous Variables Variable Mean Standard Min Max Deviation Age 21.35 1.76 12 46 Probability 65.41 31.93 0 100 will Attend Graduate or Professional School in the Next 20 Years 132 B.3 Balance Test Table B.8 - Multivariate Regression Test Major Specific Income Treatment Single Major 0.0940** (0.0391) Is Female 0.0196 (0.0280) Is White 0.0144 (0.0354) Has Pell Grant 0.0371 (0.0325) Is First Generation -0.0055 College Student (0.0372) Has Student Loans -0.0553* (0.0284) Age 0.0001 (0.0082) Probability Attend -0.0000 Graduate or (0.0004) Professional School N 1,396 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Each coefficient in the above table comes from a single regression using the following estimating equation. 𝑀𝑎𝑗𝑜𝑟_𝑆𝑝𝑒𝑐𝑖𝑓𝑖𝑐_𝑇𝑟𝑒𝑎𝑡𝑚𝑒𝑛𝑡𝑖 = 𝛽0 + 𝛽1 𝑆𝑖𝑛𝑔𝑙𝑒_𝑀𝑎𝑗𝑜𝑟𝑖 + 𝛽2 𝐼𝑠_𝐹𝑒𝑚𝑎𝑙𝑒𝑖 + 𝛽3 𝐼𝑠_𝑊ℎ𝑖𝑡𝑒𝑖 + 𝛽4 𝐻𝑎𝑠_𝑃𝑒𝑙𝑙_𝐺𝑟𝑎𝑛𝑡𝑖 + 𝛽5 𝐼𝑠_𝐹𝑖𝑟𝑠𝑡_𝐺𝑒𝑛_𝐶𝑜𝑙𝑖 + 𝛽6 𝐻𝑎𝑠_𝑆𝑡𝑢𝑑𝑒𝑛𝑡_𝐿𝑜𝑎𝑛𝑠𝑖 + 𝛽7 𝐴𝑔𝑒𝑖 + 𝛽8 𝑃𝑟𝑜𝑏_𝐺𝑟𝑎𝑑_𝑆𝑐ℎ𝑜𝑜𝑙𝑖 + 𝜖𝑖 Major_Specific_Treatment is a binary variable which equals 1 if the respondent sees the Major Specific Income Treatment. Single Major, Is Female, Is White, Has Pell Grant, Is First Generation College Student, and Has Student Loans are binary variables which equal 1 if the respondent has the attribute in the variable name. Standard errors are robust to heteroskedasticity. 133 B.4 Robustness Check Table B.9 - Effect of Treatment on Probability of Earning $0 to $60,000 Percent Chance of Percent Chance of Earning $0 to Earning $0 to $60,000 $60,000 Major Specific 5.2699*** 5.4729*** Treatment (1.8612) (1.8042) Covariates N Y N 1,396 1,396 Percent Chance of Earning $0 to $60,000 Major Specific 4.1118 Treatment * After (2.5362) Treatment Covariates N N 2,792 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. 
This table shows the results of the subjective probability (scaled to be between 0 and 100) a survey respondent believes they would earn $0 to $60,000 5 years after graduating from MSU regressed on binary variables for the survey respondent seeing the major specific income treatment (Major Specific Treatment) or a binary variable for seeing the Major Specific Income Treatment, a binary variable for the question about income expectations coming after the treatment (After Treatment) and an interaction between those variables. Standard errors are robust to heteroskedasticity. The covariates are binary variables for the survey respondent: being female, being white, having only 1 major, having a Pell Grant, being a first-generation college student and having student loans. There are also 2 other covariates: the survey respondent’s age and the probability the survey respondent believes they would attend graduate or professional school within 20 years of answering the survey. 134 B.5 Heterogeneity by Having Student Loans Table B.10 - Differences Between Survey Respondents with and without Student Loans Has Student Loans Single Major 0.0440 (0.0371) Is Female 0.0314 (0.0265) Is White 0.0002 (0.0324) Has Pell Grant 0.2340*** (0.0287) Is First Generation 0.1146*** College Student (0.0325) Age 0.0130 (0.0088) Probability Attend -0.0004 Graduate or (0.0004) Professional School N 1,396 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. The above table shows the results of a regression of an indicator variable for the survey respondent having student loans on indicator variables for the student having a single major, being female, being white, having a Pell Grant, and being a first- generation college student and continuous variables for the survey respondent’s age and their subjective probability of attending graduate or professional school. Standard errors are robust to heteroskedasticity. B.5.1 Has Student Loans Only Sample (829 Survey Respondents) Table B.11 - Effect of Treatment for Students with Student Loan Debt Percent Chance of Choose IDR Plan Earning a Low Income Major Specific 9.0887*** -0.0424 Treatment * After (2.2186) (0.0479) Treatment N 1,658 1,658 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. For this table, each survey respondent has two observations: one observation before the treatment and one after the treatment. Therefore, N = 829 * 2 = 1,658. This table shows the results of the subjective probability (scaled to be between 0 and 100) a survey respondent believes they will earn $0 to $30,000 5 years after graduating from MSU or a binary variable for the survey respondent choosing the IDR plan regressed on a binary variable for seeing the Major Specific Income Treatment, a binary variable for the question about income expectations coming after the treatment (After Treatment) and an interaction between those variables. Standard errors are robust to heteroskedasticity. 135 B.5.2 No Student Loans Sample (568 Survey Respondents) Table B.12 - Effect of Treatment for Students without Student Loan Debt Percent Chance of Choose IDR Plan Earning a Low Income Major Specific 4.3709* 0.0181 Treatment * After (2.4929) (0.0574) Treatment N 1,136 1,136 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. For this table, each survey respondent has two observations: one observation before the treatment and one after the treatment. Therefore, N = 568 * 2 = 1,136. 
This table shows the results of the subjective probability (scaled to be between 0 and 100) a survey respondent believed they would earn $0 to $30,000 5 years after graduating from MSU or a binary variable for the survey respondent choosing the IDR plan regressed on a binary variable for seeing the Major Specific Income Treatment, a binary variable for the question about income expectations coming after the treatment (After Treatment) and an interaction between those variables. Standard errors are robust to heteroskedasticity. B.5.3 Models with Interaction Terms Table B.13 - Heterogeneity in Effect of Treatment by Having Student Loans Percent Chance of Choose IDR Plan Earning a Low Income Major Specific 4.7178 -0.0606 Treatment * Have (3.3369) (0.0747) Student Loans *After Treatment N 2,792 2,792 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. For this table each survey respondent has two observations: one observation before the treatment and one after the treatment. Therefore, N = 1,396 * 2 = 2,792. This table shows the results of the subjective probability (scaled to be between 0 and 100) a survey respondent believes they will earn $0 to $30,000 5 years after graduating from MSU or a binary variable for the survey respondent choosing the IDR plan regressed on a binary variable for seeing the Major Specific Income Treatment, a binary variable for the question about income expectations coming after the treatment (After Treatment), a binary variable for the respondent having student loans (Have Student Loans), and all possible interaction terms using those 3 binary variables. Standard errors are robust to heteroskedasticity. 136 B.6 Heterogeneity by Income of Major Survey respondents are categorized as having a low-income major if the income they would have seen if they had seen the Major Specific Income Treatment was below $37,400. All other survey respondents are classified as having a high income major. This cutoff is chosen because it is the median income survey respondents would have seen if they had been shown the Major Specific Income Treatment. B.6.1 Low Income Majors (694 Survey Respondents) Table B.14 - Effect of Treatment on Students with Low Income Majors Percent Chance of Choose IDR Plan Earning a Low Income Major Specific 13.9117*** 0.0206 Treatment * After (2.6683) (0.0524) Treatment N 1,388 1,388 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. For this table, each survey respondent has two observations: one observation before the treatment and one after the treatment. Therefore, N = 694 * 2 = 1,388. This table shows the results of the subjective probability (scaled to be between 0 and 100) a survey respondent believes they will earn $0 to $30,000 5 years after graduating from MSU or a binary variable for the survey respondent choosing the IDR plan regressed on a binary variable for seeing the Major Specific Income Treatment, a binary variable for the question about income expectations coming after the treatment (After Treatment) and an interaction between those variables. Standard errors are robust to heteroskedasticity. B.6.2 High Income Majors (702 Survey Respondents) Table B.15 - Effect of Treatment on Students with High Income Majors Percent Chance of Choose IDR Plan Earning a Low Income Major Specific 0.0362 -0.0599 Treatment * After (1.6340) (0.0516) Treatment N 1,404 1,404 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. For this table each survey respondent has two observations: one observation before the treatment and one after the treatment. Therefore, N = 702 * 2 = 1,404. 
This table shows the results of the subjective probability (scaled to be between 0 and 100) a survey respondent believes they will earn $0 to $30,000 5 years after graduating from MSU or a binary variable for the survey respondent choosing the IDR plan regressed on a binary variable for seeing the Major Specific Income Treatment, a binary variable for the question about income expectations coming after the treatment (After Treatment) and an interaction between those variables. Standard errors are robust to heteroskedasticity. 137 B.6.3 Models with Interaction Terms Table B.16 - Heterogeneity in Effect of Treatment by Income of Major Percent Chance of Choose IDR Plan Earning a Low Income Major Specific 13.8754*** 0.0805 Treatment * Have (3.1288) (0.0735) Student Loans *After Treatment N 2,792 2,792 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. For this table each survey respondent has two observations: one observation before the treatment and one after the treatment. Therefore, N = 1,396 * 2 = 2,792. This table shows the results of the subjective probability (scaled to be between 0 and 100) a survey respondent believes they will earn $0 to $30,000 5 years after graduating from MSU or a binary variable for the survey respondent choosing the IDR plan regressed: on a binary variable for seeing the Major Specific Income Treatment, a binary variable for the question about income expectations coming after the treatment (After Treatment), a binary variable for the respondent having a low income major (Low Income Major), and all possible interaction terms using those 3 binary variables. Standard errors are robust to heteroskedasticity. B.7 Treatment Effect on Subjective Probability of Earning $30,000 to $60,000 Table B.17 - Treatment Effect on Subjective Probability of Earning $30,000 to $60,000 Percent Chance of Percent Chance of Earning $30,000 Earning $30,000 to to $60,000 $60,000 Major Specific -2.7190* -2.5806* Treatment (1.3939) (1.3828) Covariates N Y N 1,396 1,396 Percent Chance of Earning $30,000 to $60,000 Major Specific -2.9829 Treatment * After (1.8282) Treatment Covariates N N 2,792 Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. This table shows the results of the subjective probability (scaled to be between 0 and 100) a survey respondent believes they will earn $30,000 to $60,000 5 years after graduating from MSU regressed on binary variables for the survey respondent seeing the major specific income treatment (Major Specific Treatment) or a binary variable for seeing the Major Specific Income Treatment, a binary variable for the question about income expectations coming after the treatment (After Treatment) and an interaction between those variables. Standard errors are robust to heteroskedasticity. The covariates are binary variables for the survey respondent: being female, being white, having only 1 major, having a Pell Grant, being a first-generation college student and having student loans. There are also 2 other covariates: the survey respondent’s age and the probability the survey respondent believes they would attend graduate or professional school within 20 years of answering the survey. 138 B.8 Heterogeneity by Repayment Plan Understanding Survey respondents are classified as having low plan understanding if they get 0 or 1 questions right on the 4-question test of repayment plan understanding in the survey. 
B.8 Heterogeneity by Repayment Plan Understanding

Survey respondents are classified as having low plan understanding if they get 0 or 1 questions right on the 4-question test of repayment plan understanding in the survey. Survey respondents are classified as having high plan understanding if they get 2, 3, or 4 questions right on that test. This way of categorizing survey respondents is chosen to split the sample as evenly as possible.

B.8.1 Low Plan Understanding (672 Survey Respondents)

Table B.18 - Effect of Treatment on Students with Low Plan Understanding

                                                Percent Chance of         Choose IDR Plan
                                                Earning a Low Income
Major Specific Treatment *
After Treatment                                 7.7501*** (2.5706)        0.0117 (0.0543)
N                                               1,344                     1,344

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. For this table, each survey respondent has two observations: one observation before the treatment and one after the treatment. Therefore, N = 672 * 2 = 1,344. This table shows the results of the subjective probability (scaled to be between 0 and 100) a survey respondent believes they will earn $0 to $30,000 5 years after graduating from MSU, or a binary variable for the survey respondent choosing the IDR plan, regressed on a binary variable for seeing the Major Specific Income Treatment, a binary variable for the question about income expectations coming after the treatment (After Treatment), and an interaction between those variables. Standard errors are robust to heteroskedasticity.

B.8.2 High Plan Understanding (724 Survey Respondents)

Table B.19 - Effect of Treatment on Students with High Plan Understanding

                                                Percent Chance of         Choose IDR Plan
                                                Earning a Low Income
Major Specific Treatment *
After Treatment                                 6.6260*** (2.1355)        -0.0462 (0.0489)
N                                               1,448                     1,448

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. For this table, each survey respondent has two observations: one observation before the treatment and one after the treatment. Therefore, N = 724 * 2 = 1,448. This table shows the results of the subjective probability (scaled to be between 0 and 100) a survey respondent believes they will earn $0 to $30,000 5 years after graduating from MSU, or a binary variable for the survey respondent choosing the IDR plan, regressed on a binary variable for seeing the Major Specific Income Treatment, a binary variable for the question about income expectations coming after the treatment (After Treatment), and an interaction between those variables. Standard errors are robust to heteroskedasticity.

B.8.3 Models with Interaction Terms

Table B.20 - Heterogeneity in Effect of Treatment by Plan Understanding

                                                Percent Chance of         Choose IDR Plan
                                                Earning a Low Income
Major Specific Treatment *
Low Plan Understanding * After Treatment        1.1241 (3.3419)           0.0580 (0.0731)
N                                               2,792                     2,792

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. For this table, each survey respondent has two observations: one observation before the treatment and one after the treatment. Therefore, N = 1,396 * 2 = 2,792. This table shows the results of the subjective probability (scaled to be between 0 and 100) a survey respondent believes they will earn $0 to $30,000 5 years after graduating from MSU, or a binary variable for the survey respondent choosing the IDR plan, regressed on a binary variable for seeing the Major Specific Income Treatment, a binary variable for the question about income expectations coming after the treatment (After Treatment), a binary variable for the respondent having a low level of understanding of the repayment plans (Low Plan Understanding), and all possible interaction terms using those 3 binary variables. Standard errors are robust to heteroskedasticity.
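The heterogeneity models in Tables B.13, B.16, and B.20 include all two-way and three-way interactions of the treatment dummy, the after-treatment dummy, and the relevant group dummy. A minimal sketch of that kind of specification is below; it uses statsmodels on synthetic data, and the names df, treatment, after, and group are hypothetical placeholders (group stands for Have Student Loans, Low Income Major, or Low Plan Understanding depending on the table), not the chapter's actual variables.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2792  # two observations per respondent, as in Tables B.13, B.16, and B.20

# Hypothetical long-format data: one row per respondent-period.
df = pd.DataFrame({
    "treatment": rng.integers(0, 2, n),
    "after": rng.integers(0, 2, n),
    "group": rng.integers(0, 2, n),
})
df["low_income_prob"] = 30 + 10 * df["treatment"] * df["after"] + rng.normal(0, 25, n)

# 'treatment * after * group' expands to all main effects plus every two-way
# and three-way interaction; the coefficient reported in the tables is the
# three-way interaction term. Robust (heteroskedasticity-consistent) SEs.
model = smf.ols("low_income_prob ~ treatment * after * group", data=df).fit(cov_type="HC1")
print(model.params["treatment:after:group"])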
APPENDIX C: CHAPTER 3 APPENDIX

C.1 Robustness Check: Different Bandwidths

C.1.1 Math Proficient/Advanced Sample

Table C.1 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Math Proficient/Advanced Sample

                                Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Advanced Label                  -0.0018 (0.0050)   -0.0003 (0.0029)    0.0088 (0.0109)    0.0017 (0.0075)     -0.0004 (0.0083)   0.0027 (0.0054)
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Race and Gender Controls        Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       3                  8                   3                  8                   3                  8
Number of Observations          25,418             63,456              25,418             63,456              25,418             63,456
Mean Outcome                    0.75               0.75                0.59               0.59                0.45               0.45

                                Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Advanced Label                  -0.0107* (0.0049)  -0.0075 (0.0047)    -0.0049 (0.0048)   -0.0061** (0.0018)  -0.0101 (0.0066)   0.0004 (0.0049)
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Race and Gender Controls        Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       3                  8                   3                  8                   3                  8
Number of Observations          25,418             63,456              25,418             63,456              25,418             63,456
Mean Outcome                    0.30               0.30                0.11               0.11                0.19               0.19

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Standard errors are clustered at the year level. Mean outcomes for the Math Proficient/Advanced Sample are shown.

Table C.2 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Math Proficient/Advanced Sample, Male and Female Students

                                Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Advanced Label                  0.0005 (0.0075)    -0.0017 (0.0043)    0.0235* (0.0112)   0.0096 (0.0050)     0.0062 (0.0153)    0.0045 (0.0086)
Advanced * Female               -0.0040 (0.0086)   0.0036 (0.0048)     -0.0352 (0.0338)   -0.0182* (0.0077)   -0.0130 (0.0174)   -0.0038 (0.0101)
P(Advanced + Interaction)       0.50               0.57                0.69               0.47                0.24               0.92
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       3                  8                   3                  8                   3                  8
Number of Observations          25,418             63,456              25,418             63,456              25,418             63,456
Mean Outcome Males              0.94               0.94                0.52               0.52                0.83               0.83
Mean Outcome Females            0.97               0.97                0.53               0.53                0.90               0.90

                                Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Advanced Label                  0.0011 (0.0138)    -0.0017 (0.0075)    0.0010 (0.0075)    -0.0018 (0.0047)    -0.0002 (0.0141)   0.0064 (0.0054)
Advanced * Female               -0.0239 (0.0268)   -0.0127 (0.0110)    -0.0146 (0.0099)   -0.0102 (0.0092)    -0.0192 (0.0191)   -0.0134 (0.0087)
P(Advanced + Interaction)       0.19               0.12                0.05               0.05                0.04               0.47
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       3                  8                   3                  8                   3                  8
Number of Observations          25,418             63,456              25,418             63,456              25,418             63,456
Mean Outcome Males              0.63               0.63                0.09               0.09                0.55               0.55
Mean Outcome Females            0.78               0.78                0.10               0.10                0.70               0.70

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Mean outcomes are for students in the Math Proficient/Advanced Sample. All regressions include year fixed effects. Standard errors are clustered at the year level.
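As a rough illustration of how a bandwidth-restricted regression like those summarized in Table C.1 could be implemented, the sketch below uses statsmodels on synthetic data. The variable names are placeholders (not MEDC variable names), and the linear control in the running variable on each side of the cutoff is my own assumption for the sketch rather than necessarily the chapter's exact specification.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 60000

# Hypothetical student-level data with the scale score centered at the label cutoff.
df = pd.DataFrame({
    "score": rng.integers(-8, 9, n),
    "year": rng.integers(2006, 2012, n),
    "female": rng.integers(0, 2, n),
})
df["advanced"] = (df["score"] >= 0).astype(int)   # receives the higher (Advanced) label
df["enroll_4yr"] = rng.binomial(1, 0.45, n)

def rd_estimate(data, bandwidth):
    # Keep observations within the chosen bandwidth of the cutoff, then run a
    # local regression with year fixed effects; SEs are clustered by year.
    local = data[data["score"].abs() <= bandwidth]
    model = smf.ols(
        "enroll_4yr ~ advanced + score + advanced:score + female + C(year)",
        data=local,
    ).fit(cov_type="cluster", cov_kwds={"groups": local["year"]})
    return model.params["advanced"], model.bse["advanced"]

for bw in (3, 8):
    print(bw, rd_estimate(df, bw))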
Table C.3 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Math Proficient/Advanced Sample, White and Black Students

                                Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Advanced Label                  -0.0015 (0.0046)   -0.0010 (0.0025)    0.0083 (0.0120)    0.0001 (0.0091)     0.0005 (0.0074)    0.0016 (0.0056)
Advanced * Black                0.0648 (0.0401)    0.0294* (0.0137)    0.1848* (0.0934)   -0.0256 (0.0544)    0.1035** (0.0398)  0.0248 (0.0204)
P(Advanced + Interaction)       0.16               0.07                0.07               0.62                0.05               0.19
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       3                  8                   3                  8                   3                  8
Number of Observations          23,093             57,599              23,093             57,599              23,093             57,599
Mean Outcome White Students     0.95               0.95                0.53               0.53                0.86               0.86
Mean Outcome Black Students     0.96               0.96                0.52               0.52                0.88               0.88

                                Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Advanced Label                  -0.0154* (0.0074)  -0.0080 (0.0072)    -0.0091* (0.0047)  -0.0086*** (0.0021) -0.0135 (0.0080)   0.0010 (0.0083)
Advanced * Black                0.2608** (0.0924)  0.0927* (0.0461)    0.0328 (0.0402)    0.0438* (0.0207)    0.2579*** (0.0637) 0.0596 (0.0546)
P(Advanced + Interaction)       0.03               0.09                0.60               0.15                0.01               0.30
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       3                  8                   3                  8                   3                  8
Number of Observations          23,093             57,599              23,093             57,599              23,093             57,599
Mean Outcome White Students     0.70               0.70                0.10               0.10                0.62               0.62
Mean Outcome Black Students     0.58               0.58                0.06               0.06                0.52               0.52

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Mean outcomes are for students in the Math Proficient/Advanced Sample. All regressions include year fixed effects. Standard errors are clustered at the year level.

Table C.4 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Math Proficient/Advanced Sample, Not Economically Disadvantaged and Economically Disadvantaged Students

                                          Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Advanced Label                            0.0004 (0.0050)    0.0017 (0.0026)     0.0084 (0.0114)    0.0042 (0.0097)     0.0038 (0.0089)    0.0063 (0.0053)
Advanced * Economically Disadvantaged     -0.0189 (0.0180)   -0.0157 (0.0122)    0.0074 (0.0548)    -0.0218 (0.0412)    -0.0445 (0.0344)   -0.0240 (0.0149)
P(Advanced + Interaction)                 0.34               0.32                0.77               0.64                0.25               0.30
Year Fixed Effects                        Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                                 3                  8                   3                  8                   3                  8
Number of Observations                    25,418             63,456              25,418             63,456              25,418             63,456
Mean Outcome Not Econ. Disadvantaged      0.96               0.96                0.52               0.52                0.88               0.88
Mean Outcome Econ. Disadvantaged          0.91               0.91                0.57               0.57                0.74               0.74

                                          Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Advanced Label                            -0.0014 (0.0042)   -0.0022 (0.0040)    0.0008 (0.0061)    -0.0047 (0.0030)    -0.0046 (0.0063)   0.0063 (0.0054)
Advanced * Economically Disadvantaged     -0.1093*** (0.0256) -0.0360 (0.0252)   -0.0502* (0.0243)  -0.0113 (0.0174)    -0.0769** (0.0244) -0.0408 (0.0269)
P(Advanced + Interaction)                 0.01               0.18                0.05               0.32                0.01               0.22
Year Fixed Effects                        Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                                 3                  8                   3                  8                   3                  8
Number of Observations                    25,418             63,456              25,418             63,456              25,418             63,456
Mean Outcome Not Econ. Disadvantaged      0.73               0.73                0.09               0.09                0.66               0.66
Mean Outcome Econ. Disadvantaged          0.51               0.51                0.12               0.12                0.41               0.41

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Mean outcomes are for students in the Math Proficient/Advanced Sample. All regressions include year fixed effects. Standard errors are clustered at the year level.
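The subgroup tables (C.2 through C.16) add an interaction between the label dummy and a group indicator and report "P(Advanced + Interaction)", which I read as the p-value from testing that the total label effect for the interacted group (the label coefficient plus the interaction coefficient) equals zero. A minimal, self-contained sketch of that test on synthetic data is below; the variable names and the linear running-variable control are my own assumptions, not the chapter's exact code.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 25000
# Hypothetical data already restricted to a bandwidth of 3 around the cutoff.
df = pd.DataFrame({
    "score": rng.integers(-3, 4, n),
    "year": rng.integers(2006, 2012, n),
    "black": rng.integers(0, 2, n),
})
df["advanced"] = (df["score"] >= 0).astype(int)
df["any_degree"] = rng.binomial(1, 0.3, n)

model = smf.ols(
    "any_degree ~ advanced * black + score + advanced:score + C(year)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["year"]})

# Test that (Advanced Label + Advanced * Black) = 0 using a restriction vector.
r = np.zeros(len(model.params))
r[model.params.index.get_loc("advanced")] = 1
r[model.params.index.get_loc("advanced:black")] = 1
test = model.t_test(r)
print(model.params["advanced"], model.params["advanced:black"], float(test.pvalue))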
C.1.2 Reading Proficient/Advanced Sample

Table C.5 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Reading Proficient/Advanced Sample

                                Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Advanced Label                  -0.0022 (0.0032)   -0.0027** (0.0009)  -0.0049 (0.0070)   -0.0041 (0.0049)    0.0006 (0.0063)    -0.0021 (0.0034)
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Covariates                      Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       5                  14                  5                  14                  5                  14
Number of Observations          72,167             182,594             72,167             182,594             72,167             182,594
Mean Outcome                    0.90               0.90                0.58               0.58                0.73               0.73

                                Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Advanced Label                  -0.0088 (0.0058)   -0.0032 (0.0037)    0.0084* (0.0042)   0.0051 (0.0039)     -0.0094 (0.0076)   -0.0065 (0.0037)
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Covariates                      Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       5                  14                  5                  14                  5                  14
Number of Observations          72,167             182,594             72,167             182,594             72,167             182,594
Mean Outcome                    0.56               0.56                0.11               0.11                0.46               0.46

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Standard errors are clustered at the year level. Mean outcomes for the Reading Proficient/Advanced Sample are shown.

Table C.6 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Reading Proficient/Advanced Sample, Male and Female Students

                                Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Advanced Label                  0.0017 (0.0047)    -0.0013 (0.0027)    -0.0086 (0.0085)   -0.0026 (0.0056)    0.0105 (0.0061)    0.0025 (0.0029)
Advanced * Female               -0.0066 (0.0069)   -0.0025 (0.0054)    0.0062 (0.0098)    -0.0026 (0.0061)    -0.0163 (0.0100)   -0.0082 (0.0071)
P(Advanced + Interaction)       0.36               0.24                0.79               0.40                0.56               0.37
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       5                  14                  5                  14                  5                  14
Number of Observations          72,167             182,594             72,167             182,594             72,167             182,594
Mean Outcome Males              0.88               0.88                0.55               0.55                0.69               0.69
Mean Outcome Females            0.92               0.92                0.60               0.60                0.77               0.77

                                Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Advanced Label                  0.0061 (0.0033)    0.0014 (0.0042)     0.0145** (0.0055)  0.0050 (0.0052)     -0.0012 (0.0074)   -0.0032 (0.0059)
Advanced * Female               -0.0248** (0.0088) -0.0089 (0.0086)    -0.0116 (0.0086)   0.0002 (0.0053)     -0.0120 (0.0073)   -0.0065 (0.0094)
P(Advanced + Interaction)       0.11               0.26                0.67               0.27                0.25               0.14
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       5                  14                  5                  14                  5                  14
Number of Observations          72,167             182,594             72,167             182,594             72,167             182,594
Mean Outcome Males              0.50               0.50                0.10               0.10                0.41               0.41
Mean Outcome Females            0.61               0.61                0.13               0.13                0.50               0.50

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Mean outcomes are for students in the Reading Proficient/Advanced Sample. All regressions include year fixed effects. Standard errors are clustered at the year level.
Table C.7 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Reading Proficient/Advanced Sample, White and Black Students

                                Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Advanced Label                  -0.0002 (0.0032)   -0.0025* (0.0011)   -0.0014 (0.0065)   -0.0037 (0.0050)    0.0011 (0.0055)    -0.0024 (0.0027)
Advanced * Black                -0.0185 (0.0247)   -0.0250 (0.0154)    0.0195 (0.0337)    -0.0143 (0.0164)    -0.0240 (0.0233)   -0.0374** (0.0112)
P(Advanced + Interaction)       0.50               0.12                0.61               0.27                0.40               0.02
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       5                  14                  5                  14                  5                  14
Number of Observations          66,775             168,735             66,775             168,735             66,775             168,735
Mean Outcome White Students     0.90               0.90                0.58               0.58                0.73               0.73
Mean Outcome Black Students     0.91               0.91                0.61               0.61                0.72               0.72

                                Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Advanced Label                  -0.0087 (0.0064)   -0.0037 (0.0033)    0.0098* (0.0048)   0.0069 (0.0037)     -0.0086 (0.0086)   -0.0074 (0.0040)
Advanced * Black                -0.0171 (0.0280)   -0.0348** (0.0136)  -0.0203 (0.0139)   -0.0293** (0.0106)  -0.0175 (0.0327)   -0.0221 (0.0175)
P(Advanced + Interaction)       0.42               0.03                0.48               0.10                0.50               0.14
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       5                  14                  5                  14                  5                  14
Number of Observations          66,775             168,735             66,775             168,735             66,775             168,735
Mean Outcome White Students     0.57               0.57                0.12               0.12                0.47               0.47
Mean Outcome Black Students     0.39               0.39                0.07               0.07                0.31               0.31

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Mean outcomes are for students in the Reading Proficient/Advanced Sample. All regressions include year fixed effects. Standard errors are clustered at the year level.

Table C.8 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Reading Proficient/Advanced Sample, Not Economically Disadvantaged and Economically Disadvantaged Students

                                          Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Advanced Label                            -0.0010 (0.0036)   -0.0032* (0.0013)   -0.0023 (0.0040)   -0.0037 (0.0046)    0.0033 (0.0064)    -0.0004 (0.0038)
Advanced * Economically Disadvantaged     -0.0066 (0.0133)   0.0046 (0.0067)     -0.0144 (0.0232)   0.0008 (0.0098)     -0.0152 (0.0169)   -0.0091 (0.0106)
P(Advanced + Interaction)                 0.55               0.81                0.54               0.80                0.49               0.36
Year Fixed Effects                        Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                                 5                  14                  5                  14                  5                  14
Number of Observations                    72,167             182,594             72,167             182,594             72,167             182,594
Mean Outcome Not Econ. Disadvantaged      0.93               0.93                0.58               0.58                0.78               0.78
Mean Outcome Econ. Disadvantaged          0.82               0.82                0.58               0.58                0.58               0.58

                                          Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Advanced Label                            -0.0034 (0.0075)   -0.0022 (0.0026)    0.0089 (0.0053)    0.0038 (0.0040)     -0.0035 (0.0096)   -0.0038 (0.0040)
Advanced * Economically Disadvantaged     -0.0289 (0.0159)   -0.0082 (0.0143)    -0.0020 (0.0120)   0.0083 (0.0095)     -0.0318* (0.0154)  -0.0181 (0.0109)
P(Advanced + Interaction)                 0.04               0.50                0.47               0.21                0.02               0.08
Year Fixed Effects                        Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                                 5                  14                  5                  14                  5                  14
Number of Observations                    72,167             182,594             72,167             182,594             72,167             182,594
Mean Outcome Not Econ. Disadvantaged      0.62               0.62                0.11               0.11                0.53               0.53
Mean Outcome Econ. Disadvantaged          0.35               0.35                0.11               0.11                0.25               0.25

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Mean outcomes are for students in the Reading Proficient/Advanced Sample. All regressions include year fixed effects. Standard errors are clustered at the year level.
C.1.3 Math Not Proficient/Partially Proficient Sample

Table C.9 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Math Not Proficient/Partially Proficient Sample

                                Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Partially Proficient Label      -0.0113* (0.0057)  -0.0010 (0.0028)    -0.0140* (0.0062)  -0.0001 (0.0035)    -0.0057 (0.0065)   -0.0028 (0.0024)
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Covariates                      Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       4                  11                  4                  11                  4                  11
Number of Observations          88,885             222,002             88,885             222,002             88,885             222,002
Mean Outcome                    0.75               0.75                0.59               0.59                0.45               0.45

                                Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Partially Proficient Label      -0.0011 (0.0063)   0.0045 (0.0034)     -0.0001 (0.0042)   0.0018 (0.0029)     -0.0074* (0.0032)  -0.0004 (0.0030)
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Covariates                      Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       4                  11                  4                  11                  4                  11
Number of Observations          88,885             222,002             88,885             222,002             88,885             222,002
Mean Outcome                    0.30               0.30                0.11               0.11                0.19               0.19

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Standard errors are clustered at the year level. Mean outcomes for the Math Not Proficient/Partially Proficient Sample are shown.

Table C.10 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Math Not Proficient/Partially Proficient Sample, Male and Female Students

                                Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Partially Proficient Label      -0.0143 (0.0078)   -0.0022 (0.0073)    -0.0072 (0.0106)   -0.0013 (0.0069)    -0.0125* (0.0062)  0.0008 (0.0051)
Partially Proficient * Female   0.0035 (0.0086)    0.0031 (0.0097)     -0.0140 (0.0104)   0.0031 (0.0084)     0.0097 (0.0105)    -0.0065 (0.0111)
P(Advanced + Interaction)       0.15               0.80                0.01               0.64                0.77               0.43
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       4                  11                  4                  11                  4                  11
Number of Observations          88,885             222,002             88,885             222,002             88,885             222,002
Mean Outcome Males              0.68               0.68                0.54               0.54                0.38               0.38
Mean Outcome Females            0.81               0.81                0.63               0.63                0.52               0.52

                                Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Partially Proficient Label      0.0078 (0.0101)    0.0156** (0.0052)   0.0084** (0.0026)  0.0098** (0.0031)   -0.0063 (0.0043)   0.0022 (0.0030)
Partially Proficient * Female   -0.0184 (0.0139)   -0.0201* (0.0093)   -0.0162** (0.0051) -0.0144*** (0.0026) -0.0036 (0.0084)   -0.0045 (0.0078)
P(Advanced + Interaction)       0.26               0.46                0.28               0.15                0.14               0.72
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       4                  11                  4                  11                  4                  11
Number of Observations          88,885             222,002             88,885             222,002             88,885             222,002
Mean Outcome Males              0.23               0.23                0.09               0.09                0.14               0.14
Mean Outcome Females            0.37               0.37                0.13               0.13                0.24               0.24

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Mean outcomes are for students in the Math Not Proficient/Partially Proficient Sample. All regressions include year fixed effects. Standard errors are clustered at the year level.
Table C.11 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Math Not Proficient/Partially Proficient Sample, White and Black Students

                                Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Partially Proficient Label      -0.0162** (0.0064) -0.0022 (0.0040)    -0.0204* (0.0088)  -0.0022 (0.0044)    -0.0137*** (0.0034) -0.0060** (0.0023)
Partially Proficient * Black    0.0245* (0.0120)   0.0050 (0.0073)     0.0218 (0.0156)    0.0060 (0.0105)     0.0508** (0.0140)  0.0146* (0.0068)
P(Advanced + Interaction)       0.43               0.62                0.93               0.73                0.05               0.35
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       4                  11                  4                  11                  4                  11
Number of Observations          80,870             202,414             80,870             202,414             80,870             202,414
Mean Outcome White Students     0.74               0.74                0.58               0.58                0.46               0.46
Mean Outcome Black Students     0.78               0.78                0.62               0.62                0.45               0.45

                                Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Partially Proficient Label      -0.0059 (0.0051)   0.0031 (0.0025)     -0.0042 (0.0032)   0.0003 (0.0022)     -0.0098 (0.0053)   -0.0006 (0.0039)
Partially Proficient * Black    0.0299** (0.0088)  0.0031 (0.0107)     0.0155 (0.0084)    0.0024 (0.0083)     0.0216 (0.0116)    0.0047 (0.0049)
P(Advanced + Interaction)       0.05               0.55                0.28               0.79                0.18               0.46
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       4                  11                  4                  11                  4                  11
Number of Observations          80,870             202,414             80,870             202,414             80,870             202,414
Mean Outcome White Students     0.33               0.33                0.12               0.12                0.22               0.22
Mean Outcome Black Students     0.20               0.20                0.06               0.06                0.13               0.13

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Mean outcomes are for students in the Math Not Proficient/Partially Proficient Sample. All regressions include year fixed effects. Standard errors are clustered at the year level.

Table C.12 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Math Not Proficient/Partially Proficient Sample, Not Economically Disadvantaged and Economically Disadvantaged Students

                                                    Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Partially Proficient Label                          -0.0085 (0.0069)   -0.0010 (0.0043)    -0.0157** (0.0061) -0.0034 (0.0047)    0.0003 (0.0071)    0.0051 (0.0036)
Partially Proficient * Economically Disadvantaged   -0.0092 (0.0092)   -0.0019 (0.0046)    0.0031 (0.0087)    0.0065 (0.0056)     -0.0187 (0.0113)   -0.0215*** (0.0057)
P(Advanced + Interaction)                           0.04               0.29                0.22               0.50                0.05               0.00
Year Fixed Effects                                  Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                                           4                  11                  4                  11                  4                  11
Number of Observations                              88,885             222,002             88,885             222,002             88,885             222,002
Mean Outcome Not Econ. Disadvantaged                0.80               0.80                0.62               0.62                0.52               0.52
Mean Outcome Econ. Disadvantaged                    0.67               0.67                0.54               0.54                0.35               0.35

                                                    Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Partially Proficient Label                          0.0013 (0.0080)    0.0067 (0.0048)     -0.0013 (0.0072)   0.0019 (0.0033)     -0.0069 (0.0053)   0.0007 (0.0042)
Partially Proficient * Economically Disadvantaged   -0.0077 (0.0092)   -0.0072 (0.0076)    0.0029 (0.0089)    -0.0015 (0.0049)    -0.0032 (0.0096)   -0.0034 (0.0049)
P(Advanced + Interaction)                           0.29               0.92                0.68               0.93                0.14               0.44
Year Fixed Effects                                  Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                                           4                  11                  4                  11                  4                  11
Number of Observations                              88,885             222,002             88,885             222,002             88,885             222,002
Mean Outcome Not Econ. Disadvantaged                0.38               0.38                0.13               0.13                0.26               0.26
Mean Outcome Econ. Disadvantaged                    0.19               0.19                0.08               0.08                0.10               0.10

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Mean outcomes are for students in the Math Not Proficient/Partially Proficient Sample. All regressions include year fixed effects. Standard errors are clustered at the year level.
C.1.4 Reading Not Proficient/Partially Proficient Sample

Table C.13 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Reading Not Proficient/Partially Proficient Sample

                                Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Partially Proficient Label      -0.0046 (0.0112)   -0.0004 (0.0077)    -0.0044 (0.0121)   0.0012 (0.0083)     -0.0039 (0.0100)   -0.0049 (0.0041)
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Covariates                      Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       5                  14                  5                  14                  5                  14
Number of Observations          55,190             148,864             55,190             148,864             55,190             148,864
Mean Outcome                    0.68               0.68                0.56               0.56                0.36               0.36

                                Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Partially Proficient Label      -0.0000 (0.0068)   -0.0055 (0.0036)    -0.0053 (0.0047)   -0.0009 (0.0035)    0.0082 (0.0047)    -0.0016 (0.0026)
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Covariates                      Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       5                  14                  5                  14                  5                  14
Number of Observations          55,190             148,864             55,190             148,864             55,190             148,864
Mean Outcome                    0.23               0.23                0.09               0.09                0.13               0.13

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Standard errors are clustered at the year level. Mean outcomes for the Reading Not Proficient/Partially Proficient Sample are shown.

Table C.14 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Reading Not Proficient/Partially Proficient Sample, Male and Female Students

                                Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Partially Proficient Label      -0.0071 (0.0125)   -0.0065 (0.0100)    -0.0022 (0.0151)   -0.0016 (0.0111)    -0.0101 (0.0109)   -0.0143* (0.0071)
Partially Proficient * Female   0.0015 (0.0080)    0.0087 (0.0072)     -0.0080 (0.0096)   0.0020 (0.0093)     0.0101 (0.0143)    0.0173* (0.0073)
P(Advanced + Interaction)       0.68               0.78                0.42               0.96                1.00               0.44
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       5                  14                  5                  14                  5                  14
Number of Observations          55,190             148,864             55,190             148,864             55,190             148,864
Mean Outcome Males              0.63               0.63                0.51               0.51                0.32               0.32
Mean Outcome Females            0.74               0.74                0.61               0.61                0.41               0.41

                                Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Partially Proficient Label      -0.0061 (0.0140)   -0.0066 (0.0046)    -0.0008 (0.0087)   0.0015 (0.0033)     -0.0000 (0.0043)   -0.0034 (0.0042)
Partially Proficient * Female   0.0097 (0.0209)    0.0019 (0.0079)     -0.0108 (0.0099)   -0.0048 (0.0049)    0.0157 (0.0145)    0.0030 (0.0060)
P(Advanced + Interaction)       0.72               0.49                0.02               0.54                0.23               0.93
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       5                  14                  5                  14                  5                  14
Number of Observations          55,190             148,864             55,190             148,864             55,190             148,864
Mean Outcome Males              0.19               0.19                0.08               0.08                0.11               0.11
Mean Outcome Females            0.27               0.27                0.11               0.11                0.15               0.15

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Mean outcomes are for students in the Reading Not Proficient/Partially Proficient Sample. All regressions include year fixed effects. Standard errors are clustered at the year level.
Table C.15 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Reading Not Proficient/Partially Proficient Sample, White and Black Students

                                Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Partially Proficient Label      -0.0029 (0.0124)   -0.0009 (0.0076)    0.0001 (0.0157)    0.0035 (0.0081)     -0.0102 (0.0095)   -0.0103** (0.0041)
Partially Proficient * Black    -0.0120 (0.0173)   0.0030 (0.0056)     -0.0171 (0.0230)   -0.0078 (0.0095)    0.0167 (0.0149)    0.0156 (0.0098)
P(Advanced + Interaction)       0.44               0.84                0.44               0.73                0.64               0.64
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       5                  14                  5                  14                  5                  14
Number of Observations          49,821             134,602             49,821             134,602             49,821             134,602
Mean Outcome White Students     0.67               0.67                0.54               0.54                0.36               0.36
Mean Outcome Black Students     0.74               0.74                0.62               0.62                0.37               0.37

                                Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Partially Proficient Label      0.0048 (0.0087)    -0.0049 (0.0046)    -0.0070 (0.0075)   -0.0018 (0.0037)    0.0092* (0.0047)   -0.0013 (0.0014)
Partially Proficient * Black    -0.0166 (0.0095)   -0.0054 (0.0081)    -0.0064 (0.0095)   -0.0041 (0.0056)    -0.0014 (0.0107)   -0.0015 (0.0059)
P(Advanced + Interaction)       0.06               0.26                0.05               0.44                0.41               0.67
Year Fixed Effects              Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                       5                  14                  5                  14                  5                  14
Number of Observations          49,821             134,602             49,821             134,602             49,821             134,602
Mean Outcome White Students     0.26               0.26                0.11               0.11                0.15               0.15
Mean Outcome Black Students     0.16               0.16                0.05               0.05                0.09               0.09

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Mean outcomes are for students in the Reading Not Proficient/Partially Proficient Sample. All regressions include year fixed effects. Standard errors are clustered at the year level.

Table C.16 – Effect of Receiving a Higher Label on Postsecondary Outcomes
Reading Not Proficient/Partially Proficient Sample, Not Economically Disadvantaged and Economically Disadvantaged Students

                                                    Any Postsecondary Enrollment           2-Year Enrollment                      4-Year Enrollment
Partially Proficient Label                          -0.0041 (0.0119)   0.0106 (0.0072)     -0.0150 (0.0136)   0.0077 (0.0082)     0.0058 (0.0102)    0.0001 (0.0068)
Partially Proficient * Economically Disadvantaged   -0.0015 (0.0115)   -0.0238* (0.0101)   0.0204* (0.0103)   -0.0148 (0.0110)    -0.0192 (0.0148)   -0.0102 (0.0123)
P(Advanced + Interaction)                           0.73               0.31                0.72               0.57                0.41               0.29
Year Fixed Effects                                  Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                                           5                  14                  5                  14                  5                  14
Number of Observations                              55,190             148,864             55,190             148,864             55,190             148,864
Mean Outcome Not Econ. Disadvantaged                0.74               0.74                0.60               0.60                0.42               0.42
Mean Outcome Econ. Disadvantaged                    0.62               0.62                0.51               0.51                0.29               0.29

                                                    Any Postsecondary Degree               Associate Degree                       Bachelor’s Degree
Partially Proficient Label                          0.0013 (0.0085)    0.0015 (0.0046)     -0.0113 (0.0064)   -0.0021 (0.0049)    0.0087 (0.0063)    0.0049 (0.0030)
Partially Proficient * Economically Disadvantaged   -0.0023 (0.0092)   -0.0122 (0.0091)    0.0123 (0.0089)    0.0037 (0.0066)     -0.0011 (0.0063)   -0.0117* (0.0053)
P(Advanced + Interaction)                           0.91               0.19                0.89               0.75                0.16               0.16
Year Fixed Effects                                  Y                  Y                   Y                  Y                   Y                  Y
Bandwidth                                           5                  14                  5                  14                  5                  14
Number of Observations                              55,190             148,864             55,190             148,864             55,190             148,864
Mean Outcome Not Econ. Disadvantaged                0.30               0.30                0.12               0.12                0.19               0.19
Mean Outcome Econ. Disadvantaged                    0.15               0.15                0.07               0.07                0.07               0.07

Notes: * p < 0.1, ** p < 0.05, *** p < 0.01. Bandwidth is measured in scale score points. Mean outcomes are for students in the Reading Not Proficient/Partially Proficient Sample. All regressions include year fixed effects. Standard errors are clustered at the year level.
C.2 Summary of Significant Treatment Effects

Table C.17 – List of Higher Label Treatment Effects Significant at the 5% Level
Proficient/Advanced Sample

Sample                          Group of Students                  Outcome                          Bandwidth
Math Proficient/Advanced        All                                Associate Degree                 8
Math Proficient/Advanced        Female                             Any Postsecondary Degree         5
Math Proficient/Advanced        Female                             Associate Degree                 3, 8
Math Proficient/Advanced        Female                             Bachelor’s Degree                3, 5
Math Proficient/Advanced        White                              Associate Degree                 8
Math Proficient/Advanced        Black                              4-Year Enrollment                3
Math Proficient/Advanced        Black                              Any Postsecondary Degree         3
Math Proficient/Advanced        Black                              Bachelor’s Degree                3
Math Proficient/Advanced        Economically Disadvantaged         Any Postsecondary Degree         3
Math Proficient/Advanced        Economically Disadvantaged         Associate Degree                 3
Math Proficient/Advanced        Economically Disadvantaged         Bachelor’s Degree                3
Reading Proficient/Advanced     All                                Any Postsecondary Enrollment     14
Reading Proficient/Advanced     All                                Associate Degree                 9
Reading Proficient/Advanced     Male                               Associate Degree                 5, 9
Reading Proficient/Advanced     White                              Associate Degree                 9
Reading Proficient/Advanced     Black                              4-Year Enrollment                14
Reading Proficient/Advanced     Black                              Any Postsecondary Degree         14
Reading Proficient/Advanced     Not Economically Disadvantaged     Associate Degree                 9
Reading Proficient/Advanced     Economically Disadvantaged         Any Postsecondary Degree         5
Reading Proficient/Advanced     Economically Disadvantaged         Bachelor’s Degree                5, 9

Table C.18 – List of Higher Label Treatment Effects Significant at the 5% Level
Not Proficient/Partially Proficient Sample

Sample                                          Group of Students                  Outcome                          Bandwidth
Math Not Proficient/Partially Proficient        Male                               Any Postsecondary Degree         7, 11
Math Not Proficient/Partially Proficient        Male                               Associate Degree                 4, 7, 11
Math Not Proficient/Partially Proficient        Female                             2-Year Enrollment                4
Math Not Proficient/Partially Proficient        Female                             Associate Degree                 7
Math Not Proficient/Partially Proficient        White                              Any Postsecondary Enrollment     4
Math Not Proficient/Partially Proficient        White                              4-Year Enrollment                4, 11
Math Not Proficient/Partially Proficient        Black                              4-Year Enrollment                4
Math Not Proficient/Partially Proficient        Black                              Any Postsecondary Degree         4
Math Not Proficient/Partially Proficient        Not Economically Disadvantaged     2-Year Enrollment                4
Math Not Proficient/Partially Proficient        Economically Disadvantaged         Any Postsecondary Enrollment     4
Math Not Proficient/Partially Proficient        Economically Disadvantaged         4-Year Enrollment                4, 11
Reading Not Proficient/Partially Proficient     Female                             Associate Degree                 5
Reading Not Proficient/Partially Proficient     White                              4-Year Enrollment                14
Reading Not Proficient/Partially Proficient     Black                              Associate Degree                 5