THREE ESSAYS ON THE ECONOMICS OF EDUCATION

By

Riley Acton

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

Economics – Doctor of Philosophy

2020

ABSTRACT

Chapter 1: Effects of Reduced Community College Tuition on College Choices and Degree Completion

Recent efforts to increase college access concentrate on reducing tuition rates at community colleges, but researchers and policymakers alike have expressed concern that such reductions may not lead to long-run college completion gains. In this chapter, I use detailed data on students’ college enrollment and completion outcomes to study how community college tuition rates affect students’ outcomes across both public and private colleges. By exploiting spatial variation in tuition rates, I find that reducing tuition at a student’s local community college by $1,000 increases enrollment at the college by 3.5 percentage points (18%) and reduces enrollment at non-local community colleges, for-profit institutions, and other private, vocationally-focused colleges by 1.9 percentage points (15%). This shift in enrollment choices increases students’ persistence in college, the number of credits they complete, and the probability that they transfer to and earn bachelor’s degrees from four-year colleges.

Chapter 2: Community College Program Choices in the Wake of Local Job Losses

Deciding which field to study is one of the most consequential decisions college students make, but most research on the topic focuses on students attending four-year colleges. In this chapter, I study the extent to which community college students’ program choices respond to changes in local labor market conditions in related occupations. To do so, I exploit the prevalence of mass layoffs and plant closings across counties, industries, and time, and create occupation-specific layoff measures that align closely with community college programs. I find that declines in local employment deter students from entering closely related community college programs and instead induce them to enroll in other vocationally-oriented programs. Using data on occupational skill composition, I document that students predominantly shift enrollment between programs that require similar skills. These effects are strongest when layoffs occur in business, health, and law enforcement occupations, as well as when they take place in rural counties.

Chapter 3: Do Health Insurance Mandates Spillover to Education? Evidence from Michigan’s Autism Insurance Mandate (with Scott Imberman and Michael Lovenheim)

Social programs and mandates are usually studied in isolation, but interaction effects could create spillovers to other public goods. In this paper, we examine how health insurance coverage affects the education of students with Autism Spectrum Disorder (ASD) in the context of state-mandated private therapy coverage. Since Medicaid benefits under the mandate were far weaker than under private insurance, we proxy for Medicaid ineligibility and estimate effects via triple-differences. We find little evidence of an overall shift in ASD identification, but we do find substantial crowd-out of special education services for students with ASD from the mandate. The mandate led to increased mainstreaming of students in general education classrooms and a reduction in special education support services like teacher consultants. There is little evidence of changes in achievement, which supports our interpretation of the service reductions as crowd-out.

To Lily and Grace Acton. You give me hope for the future.
ACKNOWLEDGMENTS

Many people deserve acknowledgment for the role they have played in helping me complete this dissertation. First, I would like to express my sincere gratitude to my advisor, Scott Imberman, for his guidance throughout graduate school. I know that I would not be the economist I am today without his feedback, advice, and never-ending support. The other members of my committee played equally integral roles in helping me to develop my research and navigate this Ph.D. Steven Haider continuously pushed me to think bigger and go further than I previously thought possible, Stacy Dickert-Conlin painstakingly read my drafts and provided sound advice through every stage of the process, and Amanda Chuan stepped in to provide a fresh perspective on labor economics, the job market, and the transition from graduate student to faculty member. I also thank Mike Conlin for many valuable conversations, Mike Lovenheim for his coauthorship and mentorship from afar, and the economics department staff (Lori Jean Nichols, Margaret Lynch, Belen Feight, Jay Feight) for keeping everything running smoothly behind the scenes.

While research can often be a lonely pursuit, I am very grateful that I did not have to travel this path alone. Both my work and well-being have benefited greatly from conversations with the brilliant, funny economists in the basement of Berkey Hall. I am especially thankful to Cody Orr, Hannah Gabriel, Luke Watson, Gabrielle Pepin, Nick Rowe, and Chris Fowler for their friendship and thoughtful insights, as well as our coffee breaks and their well-timed “TFP shocks.” Thanks also to the MSU students who came before me and passed along institutional knowledge and advice. In particular, I thank Dylan Brewer, Alyssa Carlson, Katie Harris-Lagoudakis, and Sarajane Parsons, all of whom have shared invaluable wisdom with me over the past five years.

The broader economics and education policy fields have also provided me with much support on this journey. Thank you to the University of Michigan’s Education Policy Institute for several opportunities to share my work with leaders in the field and forge friendships with the next generation of education economists. Thanks also to the Association for Public Policy Analysis & Management (APPAM) and the Association for Education Finance & Policy (AEFP) for providing venues to present my research and opportunities to become part of a supportive community. I am also grateful to the many seminar participants, at MSU and elsewhere, who have provided feedback on these dissertation chapters.

Finally, I thank those in my “inner circle” who have seen me through the completion of this Ph.D. I would not be here without the support of my mentors at Ursinus College. Thank you to Jennifer VanGilder for enthusiastically steering me towards graduate school in economics and continuing to check in on me throughout the process of becoming an economist. Equal thanks go to Jeff Schepers for always believing in me, calling me when I most needed it, and reminding me that the only way to eat an elephant is “one bite at a time.”

My largest source of encouragement and inspiration has always been, and continues to be, my family. Mom and Dad: thank you for everything, from the frequent texts and phone calls, to the wine deliveries, and the plane tickets home. For the past 27 years, you have given me all of the tools and resources I needed to be successful, and I could not be more grateful for your love and support. Lily and Grace: I am very proud to be your big sister. Thank you for keeping me grounded, making me laugh, and giving me so much to be hopeful about. Opa: I am honored to follow in your footsteps as I enter the next phase of my academic career, and am so happy that Oma has been able to watch it unfold. I hope to have a fraction of the impact on this world that you did. Last, but certainly not least, I express my utmost appreciation for my partner, Samuel Christensen.
Your love, humor, and wisdom have gotten me through the many trials and tribulations of these past few years. I can’t wait to see what our next chapter together has in store.

PREFACE

This dissertation used data collected and maintained by the Michigan Department of Education (MDE) and Michigan’s Center for Educational Performance and Information (CEPI). The results, information, and opinions presented here solely represent the analysis, information, and opinions of the author and are not endorsed by, or reflect the views or positions of, grantors, MDE and CEPI, or any employee thereof.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES

CHAPTER 1  EFFECTS OF REDUCED COMMUNITY COLLEGE TUITION ON COLLEGE CHOICES AND DEGREE COMPLETION
  1.1 Introduction
  1.2 Michigan’s Postsecondary Education Market
    1.2.1 Michigan’s Community Colleges
    1.2.2 Private Competitors to Community Colleges
    1.2.3 Other Postsecondary Options
  1.3 Data and Sample
    1.3.1 Data Sources
    1.3.2 Sample Construction
  1.4 Empirical Strategy
  1.5 Results
    1.5.1 College Enrollment
    1.5.2 College Completion
    1.5.3 Heterogeneity
    1.5.4 Robustness Checks
  1.6 Conclusion

CHAPTER 2  COMMUNITY COLLEGE PROGRAM CHOICES IN THE WAKE OF LOCAL JOB LOSSES
  2.1 Introduction
  2.2 Conceptual Framework
  2.3 Institutional Setting & Enrollment Data
    2.3.1 Programs Offered by Michigan’s Community Colleges
    2.3.2 Students Enrolled in Michigan’s Vocational Programs
  2.4 Measuring Local Job Losses
    2.4.1 Using WARN Data to Generate Occupation-Specific Layoff Exposure
    2.4.2 Distribution of Layoffs Across Occupations
    2.4.3 Potential Measurement Error
  2.5 Effect of Job Losses on Enrollment in Related Programs
    2.5.1 Empirical Approach
    2.5.2 Main Results
    2.5.3 Robustness
  2.6 Substitution Effects
    2.6.1 Substitution out of Vocational Programs
    2.6.2 Substitution Between Vocational Programs
    2.6.3 Explaining Substitution with Occupation Characteristics
    2.6.4 Heterogeneity & Robustness
  2.7 Conclusion

CHAPTER 3  DO HEALTH INSURANCE MANDATES SPILLOVER TO EDUCATION? EVIDENCE FROM MICHIGAN’S AUTISM INSURANCE MANDATE
  3.1 Introduction
  3.2 Background
    3.2.1 Autism Spectrum Disorder and Therapy Options
    3.2.2 The Michigan Autism Insurance Mandate
    3.2.3 Special Education
  3.3 Data
    3.3.1 Michigan Administrative K-12 Schooling Data
    3.3.2 Measuring Insurance Status
  3.4 The Effect of the Autism Insurance Mandate on ASD and Special Education Incidence
    3.4.1 Empirical Approach
    3.4.2 Results
  3.5 The Effect of the Autism Insurance Mandate on Educational Services and Test Scores
    3.5.1 Empirical Approach
    3.5.2 Results
    3.5.3 Heterogeneous Treatment Effects and Robustness Checks
  3.6 Conclusion

APPENDICES
  APPENDIX A  CHAPTER 1 APPENDIX
  APPENDIX B  CHAPTER 2 APPENDIX
  APPENDIX C  CHAPTER 3 APPENDIX

BIBLIOGRAPHY

LIST OF TABLES

Table A.1.1 Mean Tuition Rates at Michigan Community Colleges, 2008-2016
Table A.1.2 Associate Degree Programs Offered by Community & Vocational Colleges
Table A.1.3 Baker College vs. Private Two-Year Colleges
Table A.1.4 Michigan’s Traditional Four-Year Colleges
Table A.1.5 Michigan’s Community College Districts
Table A.1.6 Descriptive Statistics, 2009-2016 High School Graduates
Table A.1.7 First Stage Estimate of In-District Status on Tuition
Table A.1.8 Balance Tests of Student Characteristics
Table A.1.9 Balance Tests of Census Tract Characteristics
Table A.1.10 Balance Tests of Distance to Postsecondary Institutions
Table A.1.11 Effect of In-District Status and Reduced Tuition on College Enrollment
Table A.1.12 Heterogeneous Effects by Graduation Year
Table A.1.13 Characteristics of Community and Vocational Colleges
Table A.1.14 Effect of In-District Status and Reduced Tuition on College Completion
Table A.1.15 Distribution of Degree Completion Increases Across Majors
Table A.1.16 Academic Program Categories
Table A.1.17 Distribution of Bachelor’s Degree Increases Across Professional Majors
Table A.1.18 Heterogeneity by Student Characteristics
Table A.1.19 Balance Tests of Student Characteristics, Varying Bandwidths
Table A.1.20 Local Community College Enrollment Results, Within Same School District
Table A.1.21 Full Enrollment Results for Within Same School District Sample
Table A.1.22 Placebo Tests
Table B.1.1 Programs Offered by Michigan’s Community Colleges
Table B.1.2 Program Groups and Associated Occupation Codes
Table B.1.3 Summary Statistics of Michigan’s High School Graduates
Table B.1.4 Summary Statistics of Vocational Students by Program
Table B.1.5 Industries with Highest Concentration of Occupation Groups
Table B.1.6 Correlation Between Occupation Composition Across Industries
Table B.1.7 Summary Statistics of Layoffs in Michigan, 2001-2017
Table B.1.8 Largest Layoffs by Occupation Group, 2001-2017
Table B.1.9 Effect of Job Losses on Enrollment in Related Community College Programs
Table B.1.10 Effect of Job Losses in Alternative Geographic Areas
Table B.1.11 Effect of Community College Layoffs on Overall Vocational Program Enrollment
Table B.1.12 Effect of Layoffs on College Enrollment Outcomes
Table B.1.13 Effect of Layoffs on Composition of Vocational Students
Table B.1.14 Effect of Layoffs on First-Year Course-Taking
Table B.1.15 Substitution Between Community College Program Groups
Table B.1.16 Substitution Between Narrower Community College Programs
Table B.2.1 Relationship Between Estimated Layoffs & Employment Change
Table B.3.1 Effect of Layoffs on Retention in Related Programs
Table B.3.2 Own-Layoff Effects on Program Retention Rates
Table C.1.1 Descriptive Tabulations of Analysis Variables
Table C.1.2 Overlap Between Free/Reduced Price Lunch and Medicaid in Michigan, by Family Income
Table C.1.3 The Effect of the ASD Insurance Mandate on Disability Incidence
Table C.1.4 The Effect of the ASD Insurance Mandate on Special Education Services
Table C.1.5 The Effect of the ASD Insurance Mandate on Other Special Education Services
Table C.1.6 The Effect of the ASD Insurance Mandate on Test Scores
Table C.1.7 The Effect of the ASD Insurance Mandate on Taking Regular Exams
Table C.1.8 Heterogeneous Effects of the ASD Insurance Mandate on ASD Incidence
Table C.1.9 The Effect of the ASD Insurance Mandate, by Gender and Race
Table C.1.10 The Effect of the ASD Insurance Mandate, by Grade
Table C.1.11 Heterogeneous Effects of the ASD Insurance Mandate on Test Scores, using Non-Special Education Control Group
Table C.1.12 The Effect of the ASD Insurance Mandate – Robustness Checks
Table C.1.13 The Effect of the ASD Insurance Mandate on Test Scores – Robustness Checks
Table C.1.14 The Effect of the ASD Insurance Mandate – No Sample Exclusion Based on Disadvantaged Status
Table C.1.15 The Effect of the ASD Insurance Mandate on Disability Incidence – No Sample Exclusion Based on Disadvantaged Status
Table C.1.16 The Effect of the ASD Insurance Mandate on Test Scores – No Sample Exclusion Based on Disadvantaged Status

LIST OF FIGURES

Figure A.1.1 Identified Community College District Boundaries
Figure A.1.2 Washtenaw Community College District Analysis Sample
Figure A.1.3 Distribution of Border Pair Tuition Differentials
Figure A.1.4 Correlation Between Tuition Differentials and Area Characteristics
Figure A.1.5 Reduced Form Estimates with Alternative Bandwidths
Figure A.1.6 School Districts Overlapping Community College Districts
Figure B.1.1 Differences in Course-Taking and Credit Completion by CC Program Group
Figure B.1.2 Labor Market Shocks in Michigan, 2001-2017
Figure B.1.3 Average Layoffs in Michigan Counties, 2001-2017
Figure B.1.4 Correlation Between National and State-Specific Industry Employment Shares, 2016
Figure B.1.5 Distribution of Layoffs by County, 2001-2017
Figure B.1.6 Robustness Checks for Pooled Specification
Figure B.1.7 Substitution into Program Groups Requiring Similar Skills
Figure B.1.8 Relationship Between Substitution Effects and Skill Distance
Figure B.1.9 Alternative Measures of Skill Distance
Figure B.1.10 Heterogeneous Own-Layoff Effects
Figure B.1.11 Robustness Checks for Own-Layoff Effects
Figure B.2.1 Comparison of Employment Counts in QCEW & CBP
Figure B.2.2 Relationship between Layoffs and Employment Changes, by Sector
Figure B.3.1 Effect of Layoffs on Program Choice for Later Enrollees
Figure B.4.1 Effect of Layoffs on Enrollment in Narrower Program Groups
Figure B.4.2 Substitution into Narrower Program Groups Requiring Similar Skills
Figure B.4.3 Relationship Between Substitution Effects & Skill Distance, Narrower Programs
Figure C.1.1 ASD and Non-ASD Special Education Incidence Event Studies
Figure C.1.2 Test Scores Event Studies, using Non-Disabled Control Group
Figure C.1.3 Event Study Estimates of Main Outcomes
Figure C.1.4 Test Scores Event Studies, using All Non-ASD Control Group
Figure C.1.5 Test Score Event Studies, using Non-ASD Special Ed Control Group

CHAPTER 1

EFFECTS OF REDUCED COMMUNITY COLLEGE TUITION ON COLLEGE CHOICES AND DEGREE COMPLETION

1.1 Introduction

Community colleges enroll nearly 40% of U.S. undergraduate students and are increasingly the focus of college access initiatives (National Center for Education Statistics, 2018).1 These institutions offer a variety of educational programs, including vocationally focused certificates, two-year associate degrees, and pathways to transfer to four-year colleges and universities. Moreover, community colleges offer these opportunities at a lower price than nearly all other postsecondary options, making them accessible to a large and diverse group of students, many of whom come from low-income backgrounds or are the first in their families to attend college (Ma and Baum, 2016). In recent years, policymakers have capitalized on community colleges’ commitment to access in their local communities by implementing programs that make community college low-cost or completely tuition-free (Smith, 2017).

1 In this paper, I use the term “community college” to refer to any publicly funded college that primarily offers sub-baccalaureate credentials. These institutions are also sometimes referred to as junior colleges, technical colleges, or city colleges.

As these types of programs grow in popularity, so too do questions about their potential consequences for students’ educational attainment and labor market outcomes. Policymakers and researchers alike have expressed concern that reducing the price of community college may deter students from enrolling in four-year colleges, potentially decreasing the probability that they earn bachelor’s degrees and receive wage premiums in the labor market. Notably absent from this discussion, however, is the possibility that reducing the price of community college could deter students from enrolling in private colleges that offer certificates and associate degrees, hereafter referred to as vocational colleges. These colleges primarily operate as for-profit entities, which have grown rapidly in the past two decades and now produce over 40% of less-than-two-year certificates and nearly 20% of associate degrees in the U.S., despite having higher average tuition rates, and lower average completion rates and wage premiums, than their public, not-for-profit counterparts (Deming et al., 2012; Cellini and Turner, 2018; Armona et al., 2018). While there is some evidence that community colleges and for-profit colleges compete for students in the two-year college market, particularly in the presence of declines in state funding for public higher education (Cellini, 2009; Goodman and Volz, 2019) or local labor demand shocks (Armona et al., 2018), there is currently no direct evidence on how tuition rates at public institutions alter students’ enrollment decisions in private institutions in the two-year sector, or how such a substitution effect may impact students’ longer-run educational outcomes.
In this paper, I empirically estimate the effects of community college tuition on students’ college enrollment decisions and outcomes across different sectors of the postsecondary education market. To isolate exogenous variation in community college tuition rates, I exploit an institutional feature of Michigan’s community college system in which students residing on either side of a “community college district” boundary face substantially different tuition rates at their local community college due to a locally provided tuition subsidy. This feature allows me to use a boundary fixed effects strategy that compares the college choices and outcomes of students who live just inside of a community college district and face an average community college tuition rate of $2,300 per year to their peers who live just outside of a community college district and face an average tuition rate of $4,100. While this approach is similar to that used by Denning (2017) and McFarlin et al. (2018) to study community college taxing districts in Texas, I am able to build upon both studies through the use of detailed, student-level administrative data from the Michigan Department of Education that contains students’ precise census blocks of residence, as well as comprehensive college enrollment and completion records across public and private colleges. Observing students’ census blocks of residence enables me to determine very accurately whether students reside within community college districts and to avoid the potential measurement error induced by inferring in-district status from the schools they attend. McFarlin et al. (2018) show that precisely measuring community college tuition is important in determining its effects on college enrollment, but are unable to observe in which colleges students enroll due to their use of restricted-access Census data.
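The logic of the boundary fixed effects comparison can be sketched with a small simulation. This is an illustrative sketch of my own construction, not the dissertation’s code or data: the number of border pairs, students per side, and the size of the shared neighborhood shocks are all stylized assumptions, while the roughly $2,300 vs. $4,100 tuition levels and a 3.5 percentage point response per $1,000 of tuition are taken from the text.

```python
# Stylized boundary fixed effects sketch (illustrative only; not the
# dissertation's code or data). Students on either side of a community
# college district border share local conditions (a border-pair effect)
# but face different tuition at the local community college.
import numpy as np

rng = np.random.default_rng(0)
n_pairs, n_per_side = 200, 100  # assumed sample sizes, for illustration

diffs = []
for _ in range(n_pairs):
    pair_effect = rng.normal(0.0, 0.04)  # shared neighborhood conditions
    # Tuition in $1,000s: about $2.3k in-district vs. $4.1k out-of-district
    # (the average rates reported in the text). Assumed true response:
    # each $1,000 of tuition lowers local CC enrollment by 3.5pp.
    p_out = np.clip(0.19 + pair_effect, 0.0, 1.0)
    p_in = np.clip(0.19 + pair_effect + 0.035 * (4.1 - 2.3), 0.0, 1.0)
    enroll_out = rng.binomial(1, p_out, n_per_side).mean()
    enroll_in = rng.binomial(1, p_in, n_per_side).mean()
    diffs.append(enroll_in - enroll_out)  # within-pair contrast

# With a balanced design, averaging the within-pair contrasts is
# equivalent to a regression of enrollment on in-district status with
# border-pair fixed effects. True effect here: 0.035 * 1.8 = 0.063.
effect = float(np.mean(diffs))
print(f"Estimated in-district effect: {effect:.3f}")
```

Because the border-pair effect is differenced out within each pair, the estimate recovers the in-district tuition contrast even though neighborhoods differ from one another.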
Meanwhile, Denning (2017) observes detailed college enrollment and completion records but must proxy for in-district status with the location of a student’s high school. By combining data on students’ precise residences with specific college enrollment records, I am better able to identify the direct effect of a community college’s tuition rate on a student’s decision to enroll in the college. In addition, the detailed college records in my dataset come from the National Student Clearinghouse (NSC), which covers 97% of all postsecondary institutions in the U.S. and now covers several of the largest national for-profit colleges (National Student Clearinghouse Research Center, 2017). This coverage allows me to determine the underlying substitution effects that drive an increase in community college attendance, including whether reduced community college tuition crowds out enrollment in similar private colleges, which previous work has not been able to consider.

Among students graduating from Michigan public high schools between 2009 and 2016, I find that reducing the tuition rate that a student faces at her local community college by $1,000 increases the probability of enrollment at the college within a year of high school graduation by 3.5 percentage points, about 18% of the mean enrollment rate. A portion of this increase can be attributed to students enrolling in their local community college who would not have enrolled in any postsecondary education program in the absence of the tuition reduction, as a $1,000 decrease in local community college tuition increases overall college enrollment by 0.7 percentage points (1% of the mean). At the same time, this tuition decrease causes students to reduce enrollment in non-local community colleges by 1.6 percentage points (8% of the mean) and in for-profit and other private, vocationally-focused colleges that offer two-year degrees by 0.4 percentage points (11% of the mean).
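The per-$1,000 magnitudes above come from rescaling in-district contrasts by the tuition gap at the district border, consistent with the first-stage estimate of in-district status on tuition referenced in Table A.1.7. A back-of-the-envelope version of that scaling is below; this is my own arithmetic, not output from the paper, and the 6.3pp reduced-form number is hypothetical, chosen only to be consistent with the 3.5pp-per-$1,000 figure reported in the text.

```python
# Back-of-the-envelope scaling from an in-district contrast to a
# per-$1,000 tuition effect (illustrative arithmetic, not paper results).

tuition_gap = 4.1 - 2.3  # avg. in- vs. out-of-district tuition, in $1,000s

# Hypothetical reduced-form effect of in-district status on local
# community college enrollment, in percentage points.
rf_local_cc = 6.3

# Wald/IV logic: per-$1,000 effect = reduced form / first-stage tuition gap
per_1k = rf_local_cc / tuition_gap
print(f"Effect per $1,000 of tuition: {per_1k:.1f}pp")  # 3.5pp
```

The same rescaling applies to the other outcomes quoted per $1,000, which is why the in-district contrasts themselves are roughly 1.8 times the per-$1,000 numbers.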
The remainder of the increase in local community college attendance can be attributed to a 1.0 percentage point decline in four-year college attendance; however, this estimate is statistically insignificant and is quite small compared to its mean. Using longer-run data from cohorts who graduated high school between 2009 and 2011, I find further evidence that reduced community college tuition increases persistence in college and degree completion. A $1,000 decrease in local community college tuition induces students to complete 2.5% more semesters of college and 2.7% more college credits, and to transfer to four-year colleges at a rate 6.5% higher than their peers who do not receive discounted tuition. This $1,000 tuition decrease also increases bachelor’s degree completion by 1.1 percentage points (3.5%), particularly in business and professional fields such as teacher education and exercise science. These improved outcomes are driven in part by students switching from higher-cost and lower-resourced vocational colleges that focus on labor market preparation to higher-resourced community colleges that promote transfer to four-year colleges. Consistent with this mechanism, I also find that reduced community college tuition induces students to earn general liberal arts associate degrees, which are designed to prepare students to transfer, rather than associate degrees in vocational subjects.

These results contribute to several strands of literature on college choice and the consequences of public subsidization of postsecondary education. First, the results add to a large body of empirical work on the effect of college costs on students’ college enrollment decisions. Most previous analyses find approximately a 3-5 percentage point increase in the probability of enrollment for each $1,000 decrease in the cost of a college option (Deming and Dynarski, 2010; Page and Scott-Clayton, 2016), with potentially even larger effects at the community college level.
However, recent estimates of students’ sensitivity to community college costs come from large-scale policy changes, such as the introduction of free tuition policies (Carruthers and Fox, 2016) or the expansion of community college districts (Denning, 2017), which may affect students’ choices and outcomes through multiple channels, such as informational campaigns, mentoring programs, or the construction of new college campuses. The results presented here isolate tuition variation by comparing observationally similar students who likely have similar exposure to college information, marketing, and campuses, and are very much in line with those of the broader literature. This finding suggests that, despite the already low cost of most community colleges in the U.S., students are responsive to the sticker prices advertised by community colleges and that policies that reduce advertised tuition rates by even small amounts may have meaningful impacts on students’ educational and labor market outcomes.

Second, this research provides the first direct evidence that students substitute towards community colleges and away from similar private colleges, including those in the for-profit sector, when community college tuition is low. Cellini (2009) and Goodman and Volz (2019) document a similar phenomenon in the context of changes in state funding for higher education, whereby increases in funding for public colleges deter students from attending for-profit institutions. In this paper, I find that this private-to-public enrollment shift also occurs as a direct result of a reduction in community college tuition and that the shift improves students’ educational attainment. However, as in Denning (2017), I do not find that, on average, students forgo initially attending four-year colleges when they have access to a low-cost community college or that students forgo opportunities to earn bachelor’s degrees by attending community colleges.
This finding comes in contrast to Carruthers and Fox (2016), who find that a broad, tuition-free community college program in Tennessee reduces four-year college enrollment, suggesting that the structure of community college tuition policies may play an important role in determining their effects on students' college choices and outcomes.

Finally, this work contributes to an expanding literature on the effects of community college attendance on educational and labor market outcomes. Because community colleges are uniquely situated between the labor market and four-year colleges, their impact on students' longer-term outcomes is often ambiguous and depends on students' counterfactual enrollment decisions (Rouse, 1995). Some students who attend community college may be made better off because, in the absence of community colleges, they would not have attended any college, while others may be made worse off because they are diverted from attending four-year colleges. Empirically, students who are deterred from attending four-year colleges tend to experience an educational attainment and labor market penalty (Reynolds, 2012; Goodman et al., 2017), while students who are induced to attend their local community college rather than not attending any college experience positive educational and labor market gains (Mountjoy, 2019). I find that students who are induced to attend their local community college rather than attending other predominantly two-year colleges are more likely to transfer to four-year colleges and earn bachelor's degrees. This result implies that gains from community college attendance can extend to a broader group of students than identified in prior work and suggests that policies that increase community college access without deterring students from attending four-year colleges could increase educational attainment and improve labor market outcomes.
1.2 Michigan's Postsecondary Education Market

The institutional setting for this analysis is the postsecondary education market in the state of Michigan. There are over 90 accredited colleges and universities in Michigan offering a wide range of academic programs, and over 90% of the state's high school graduates who enroll in college choose to attend one of them. Michigan has a largely decentralized community college system in which tuition rates are determined independently by each college and are based on a student's place of residence relative to specific geographic boundaries. This creates large differences in the tuition rates faced by otherwise observationally similar students who reside on either side of a given boundary. In addition, Michigan is home to a large private vocational college, Baker College, which has multiple locations throughout the state and enrolls over 25,000 students annually. Baker offers sub-baccalaureate academic programs similar to Michigan's community colleges but spends less per student on instruction and has much lower transfer rates than its public counterparts. The presence of this potential competitor in the two-year college market allows me to examine whether subsidizing community college tuition crowds out enrollment in similar private colleges.

1.2.1 Michigan's Community Colleges

Michigan is home to 28 public community colleges, which together enroll over 300,000 students annually (Michigan Community College Association, 2019). Each college is designed to serve a distinct geographic area, known as a community college district, and is given substantial autonomy over its administration. There is no overarching state law nor agency governing the specific operations of community colleges, and state intervention in their practices is rare (Hilliard, 2016).
The state government does, however, provide annual appropriations funding to community colleges, which accounts for approximately 20% of the community colleges' operating revenues.2 To supplement this funding, the colleges rely heavily on both tuition and fees (43% of operating revenues) and local property taxes (35% of operating revenues). For each college, local property taxes may only be assessed on properties within its community college district (Michigan House Fiscal Agency, 2017).3

Community college district boundaries are governed by the trustees of each college under state guidelines and may be composed primarily of counties, public school districts, or public intermediate school districts (ISDs), which are administrative organizations that support multiple school districts.4 Community college districts may also include or exclude specific cities, townships, or other geographic features, although any changes to boundaries must be voted on by residents of the district. Currently, 15 of the state's 28 community college districts are composed primarily of counties and 13 are composed primarily of school districts or ISDs.5 Based on conversations with state employees and community college staff members, it is my understanding that no community college boundaries changed during the time frame of the data, and that most have remained unchanged for several decades.

Community colleges offer tuition rates based on a student's place of residence relative to their community college district boundaries.6 In exchange for property tax funding, students residing within the boundaries of a district are offered the lowest tuition rate at their district's community college, averaging approximately $90 per credit. Students residing within Michigan, but outside of the district, are offered the next lowest rate,7 and students residing outside of the state are offered the highest rate.8 Critically for the analysis at hand, a sizable portion of Michigan high school students reside outside of any community college district and, therefore, face the higher, out-of-district tuition rate at any community college they wish to attend. Using data on students' census blocks of residence, I estimate that approximately 23% of Michigan's high school graduates reside in an area that is not part of any community college district.

2This funding is allocated based on a weighted performance funding model that takes into account prior-year funding, enrollment, and performance indicators, and rewards colleges for low administrative costs and adherence to best practices for community engagement (Michigan House Fiscal Agency, 2017). 3In 2015-2016, the average millage rate for community colleges was 2.51, i.e., $2.51 per $1,000 of taxable property value (Michigan Center for Educational Performance & Information, 2017). This millage rate is assessed on all properties in a community college's district, in addition to any other local property taxes (e.g., county, school district, township, or municipality taxes). Using data on aggregate real estate taxes and home values at the census tract level from the American Community Survey, I estimate that in-district areas in Michigan have an average total millage rate of 17.4, while out-of-district areas have an average total millage rate of 12.3. 4More information about Michigan's ISDs is available here: https://www.gomaisa.org/value-of-isds/. 5Table A.1.5 lists the geographic areas that comprise each community college district. I gather this information from individual community college websites, course catalogs, and conversations with colleges' institutional research staff.
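The millage arithmetic in footnote 3 is easy to verify directly. The sketch below is purely illustrative: the $100,000 taxable value is a hypothetical figure, not one drawn from the text.

```python
# Back-of-the-envelope check of the property-tax figures in footnote 3.
# A millage rate is dollars of tax per $1,000 of taxable property value.
# The $100,000 taxable value below is illustrative, not from the text.

def property_tax(taxable_value, millage):
    """Annual property tax implied by a millage rate."""
    return taxable_value * millage / 1000.0

cc_tax = property_tax(100_000, 2.51)     # avg community college levy: $251/yr
total_in = property_tax(100_000, 17.4)   # avg total millage, in-district areas
total_out = property_tax(100_000, 12.3)  # avg total millage, out-of-district
extra_tax = total_in - total_out         # in-district premium: $510/yr
```

On this hypothetical property, the in-district household pays roughly $510 more per year in total property taxes, which is the flip side of the tuition discount it receives.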
On average, these students face tuition rates at their local community college (the college to whose district area they reside nearest) that are 65% higher than those faced by their peers who live within the community college's district boundaries.9 This equates to an average annual cost difference of nearly $1,500 for a student taking a course load of 12 credits per semester. Given that the annual median family income of Michigan's community college students is approximately $60,000 (Chetty et al., 2017), this represents a difference of approximately 2.5% of annual median family income. Table A.1.1 provides summary statistics on the average in-district and out-of-district tuition rates between 2008 and 2016, measured in 2016 dollars. Following Denning (2017), I calculate semester tuition as the tuition rate for 12 credits and annual tuition as the tuition rate for 24 credits.

In addition to the tuition variation induced by community college district boundaries, students residing in different areas of the state and graduating in different years may also face substantially different local community college tuition rates. Without government oversight of tuition-setting policies, individual community colleges are free to differ in their relative in-district and out-of-district rates and may update these rates annually. Over the time frame of the data, real mean in-district tuition (measured in 2016 dollars) ranged from $76.90 per credit at Oakland Community College to $114.89 per credit at Mott Community College. Real mean out-of-district tuition ranged from $114.05 per credit at Wayne Community College to $221.22 per credit at Grand Rapids Community College. This range means that, on average between 2008 and 2016, it was less costly to be an out-of-district student at Wayne Community College than to be an in-district student at Mott Community College. Community college tuition rates, particularly for out-of-district students, have also steadily increased over the past decade. For the graduating high school class of 2008, the real average in-district tuition rate per credit was $82.47 and the average out-of-district rate was $134.46. By 2016, these average rates had increased to $106.10 and $176.58, respectively.

6Tuition rates are set based on students' residences regardless of whether students enroll in courses in-person or online. However, students who reside within a community college district are also able to enroll in online courses offered by other community colleges at a discounted rate (https://www.micollegesonline.org/courses.html). If anything, this feature should attenuate the estimates that follow, as it reduces the incentive for in-district students to enroll in their local community college. 7Macomb Community College also offers an "affiliate" tuition rate to students who reside outside of their district but in areas near their boundaries, which I incorporate in the empirical analysis. Results are also robust to treating this area as out-of-district. 8Michigan's community colleges differ in how long a student must be a resident of the district to qualify for in-district tuition. However, most require several months of residency, which makes it unlikely that students who do not reside in a district while attending high school would be able to claim in-district residency upon initial enrollment. 9The tuition prices used in this paper are the colleges' advertised tuition prices, also known as sticker prices. Both in-district and out-of-district students may qualify for federal, state, local, or institutional financial aid that will reduce their net price of attendance. Across Michigan's community colleges, data from IPEDS indicates that the average net price for in-district students is approximately 80% lower than the average net price for out-of-district students.
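The annual tuition figures above follow mechanically from the per-credit rates. As a quick check, using the 2008 and 2016 average rates quoted in the text and the 24-credit annual load (following Denning, 2017):

```python
# Annual tuition from per-credit sticker prices, using the text's convention
# of 12 credits per semester (24 credits per year). Rates are the 2008 and
# 2016 average per-credit rates quoted above.
CREDITS_PER_YEAR = 24

def annual_tuition(per_credit_rate, credits=CREDITS_PER_YEAR):
    return per_credit_rate * credits

gap_2008 = annual_tuition(134.46) - annual_tuition(82.47)   # $1,247.76
gap_2016 = annual_tuition(176.58) - annual_tuition(106.10)  # $1,691.52
```

The two annual gaps bracket the "nearly $1,500" average in-district/out-of-district difference cited above.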
1.2.2 Private Competitors to Community Colleges

Michigan's other postsecondary institutions may be grouped into two mutually exclusive categories: vocational colleges, which predominantly offer sub-baccalaureate degree programs, and traditional four-year colleges, which predominantly offer bachelor's and graduate degrees. I define a vocational college as a private institution that is either (1) a for-profit institution or (2) a not-for-profit institution that offers more than 25% of its degrees at the associate degree level and accepts 90% or more of applicants. These colleges are similar to the state's community colleges in that they provide access to a vast majority of interested students and offer academic programs that can be completed in two years or less: namely, associate degrees and short-term certificates. Community and vocational colleges also tend to offer degrees in similar fields, and both emphasize health and business subjects. Table A.1.2 highlights this point by comparing the types of associate degrees offered by the community and vocational colleges attended by Michigan's high school graduates. Given the overlap in program offerings, it is reasonable to believe that these vocational institutions compete with community colleges in the market for sub-baccalaureate education.
In Michigan, the colleges identified under these vocational college criteria and available in the NSC data are: Baker College (not-for-profit), Davenport University (not-for-profit), Everest Institute (for-profit), ITT Technical Institute (for-profit), and The International Academy of Design & Technology (for-profit).10 I also observe enrollment in other large national for-profit chains, such as the University of Phoenix, DeVry University, and Kaplan University, although these institutions do not report in which campus a student is enrolled, so I am unable to observe whether students enroll in Michigan, online, or elsewhere in the country.11 However, I do not observe enrollment in any smaller for-profit institutions located within Michigan, such as cosmetology schools.12 This lack of coverage includes institutions that do not participate in federal financial aid programs, which Cellini and Goldin (2014) show account for over half of for-profit enrollment in Michigan. It is not obvious that these types of non-degree-granting institutions would be popular among recent high school graduates, but to the extent that they are, I will overestimate the share of students not enrolling in college and will underestimate the share enrolling in vocational colleges. As such, my results should be interpreted as an upper bound of the effect of reduced community college tuition on overall college enrollment and a lower bound of the effect of reduced tuition on substitution away from vocational colleges.

The most popular private vocational institution among Michigan's high school graduates is Baker College, which has thirteen campuses throughout the state and enrolls over 70% of Michigan's vocational students.13 Baker is a private, not-for-profit institution that primarily offers degree programs designed to take two years or less. Such institutions are not common in the U.S. For example, according to the 2016 College Scorecard, there are 369 private, predominantly associate- or certificate-degree granting institutions in the U.S. but 2,587 for-profit private institutions offering the same types of degrees. However, in many ways, Baker College operates similarly to the more popular model of a private, for-profit two-year college. Table A.1.3 compares Baker to the universe of private colleges that predominantly grant associate degrees and certificates. Across several measures of institutional quality and outcomes, Baker appears more similar to its for-profit counterparts than to its not-for-profit peers. Given these similarities, the results from this paper should provide suggestive evidence on how reductions in local community college tuition may affect enrollment at for-profit vocational colleges.

10The three for-profit colleges in this list (Everest, ITT, and The International Academy of Design & Technology) shut down operations within Michigan during the time frame of the data. To my knowledge, no new colleges opened. 11Students who enroll in exclusively online programs are included in the NSC data, but I am unable to distinguish between on-campus and online enrollment within an institution. 12In 2017, the NSC reported coverage of 78% of multi-state for-profit institutions but 0% coverage of for-profits operating only in Michigan (National Student Clearinghouse Research Center, 2017).

1.2.3 Other Postsecondary Options

The remainder of undergraduate, degree-granting postsecondary institutions in Michigan are either public or private traditional four-year colleges. In recent years, public universities have primarily relied on students' tuition payments for operating expenses as state appropriations have declined and now account for only 21% of the universities' operating budgets (Michigan House Fiscal Agency, 2017).
As with the state's community colleges, there is little government oversight of the universities' practices and, as a result, there is a substantial amount of heterogeneity in tuition rates, expenditures, and program offerings among them. However, in contrast to community colleges, all public universities offer the same tuition rate to all in-state students regardless of their location of residence. Michigan also has several private four-year institutions, which finance their operating expenditures with students' tuition payments, private donations, and endowments, as they receive minimal support from the state. They tend to be much smaller and somewhat more expensive than the state's public universities and, overall, make up a small share of the postsecondary education market. Table A.1.4 provides summary statistics on these institutional attributes across the public and private sectors.

13Because of this large market share, my results are robust to any definition of vocational colleges that includes Baker College.

Students who choose not to enroll in community, vocational, or traditional four-year colleges generally enter the state's low-skill labor market. In the years following the Great Recession, young adults who have chosen this option in Michigan have faced high rates of unemployment and underemployment. Those who are employed are most likely to work in service and retail occupations, which have low median wages and minimal opportunities for advancement (Bureau of Labor Market Information and Strategic Initiatives, 2014).

1.3 Data and Sample

1.3.1 Data Sources

The data used in this paper primarily come from a student-level, administrative dataset provided by the Michigan Department of Education (MDE) and the state's Center for Education Performance and Information (CEPI).
This dataset contains academic records for all students enrolled in grades 9-12 in Michigan's public schools between 2007 and 2017 and further links these students to college enrollment and completion records from the NSC and a state-run data repository (STARR). The high school academic records provide rich information on students' demographic characteristics, including race/ethnicity, gender, free and/or reduced price lunch (FRPL) eligibility, English language learner (ELL) status, and special education enrollment; academic performance, including math and reading test scores on a state standardized test administered in eleventh grade; and place of residence measured at the census block level. The final component is a key advantage of the MDE/CEPI dataset, as it allows me to very accurately determine whether a student resides within a community college district.14 The college link provided through the NSC and STARR contains all dates and records of students' enrollments in colleges covered by either database. The data also include information on the academic programs in which students enroll, using six-digit Classification of Instructional Program (CIP) codes, the credits they complete, and the awards they receive.

14This feature of the data is a particular advantage in Michigan because the state has generous school choice policies and nearly 6% of K-12 students attend a school other than the one to which they are assigned (either within or outside their school district of residence). An additional 7% of students attend a charter school (Cowen et al., 2015). Thus, using the location of a student's high school to proxy for her place of residence, as is common in other settings with spatial variation (e.g., Denning, 2017), would likely introduce measurement error to the estimation procedure.
I match these data to postsecondary institutional information, including campus latitudes and longitudes, from the NCES' Integrated Postsecondary Education Data System (IPEDS). I also gather annual in-district and out-of-district tuition rates at each of Michigan's community colleges from Michigan's Workforce Development Agency.

1.3.2 Sample Construction

The goal of this paper is to estimate the causal effect of the tuition rate a student faces at her local community college on her postsecondary enrollment decisions and outcomes. To do so, I exploit the fact that students who live inside one of Michigan's community college districts face a substantially discounted tuition rate at their local community college. The challenge of this approach is that community college district areas may be spatially correlated with unobservable determinants of college choice. For example, community colleges may form their districts in geographic areas that have strong preferences for community college education, which would then bias any estimates of the effect of in-district status on college enrollment or outcomes. To mitigate this type of bias, I limit the sample to students who reside near a community college district boundary and use fixed effects to compare the outcomes of students who reside in close geographic proximity to one another and graduate from high school in the same year but differ in their in-district status at the local community college.15

15The two-mile bandwidth is chosen to maximize sample size and minimize observed differences between adjacent in-district and out-of-district students. Results using alternative bandwidths are included in the appendix and discussed in Section 1.5.5. Note that this approach is similar in spirit to regression discontinuity (RD) designs that exploit geographically discontinuous treatments.
However, because I do not observe students' exact addresses and must aggregate to the census block level, there is a mass point in the running variable at the geographic discontinuity, and I cannot use standard RD inference techniques that rely on a smooth distribution of individuals at the discontinuity (Keele et al., 2017).

To implement this empirical strategy, I first identify the census blocks that are located within each community college district. For community college districts consisting solely of counties, this is straightforward: I assign a census block to the community college district if the census block is contained within the county of interest. For community college districts that include public K-12 school districts, I first calculate the amount of geographic overlap between each census block and all overlapping school districts. I then match a census block to the school district with which it shares the most overlap and assign it to the community college district of that school district. Once I have mapped all census blocks to their corresponding community college districts, I identify community college district boundaries that divide a collection of census blocks that are contained within a given community college district from a collection of census blocks that are not contained within any community college district. Figure A.1.1 displays all 28 community college districts and bolds the district boundaries used in the analysis.16 To limit the analysis to students who differ in their in-district status, but reside within a small distance of one another, I divide each identified boundary into equal segments, each of which is no more than 5 miles long.
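The census-block-to-district matching step just described can be sketched as follows. The overlap areas and district names below are illustrative stand-ins (in practice the overlaps would come from a GIS overlay of block and district shapefiles), not the paper's actual data or code:

```python
# Assign each census block to the school district with which it shares the
# largest geographic overlap, then map that school district to its community
# college district. Overlap areas are assumed precomputed (e.g., via GIS).

def assign_block(block_overlaps, school_to_cc_district):
    """block_overlaps: {school_district: overlap_area} for one census block.
    Returns the community college district, or None if the best-matching
    school district lies outside every community college district."""
    best_school = max(block_overlaps, key=block_overlaps.get)
    return school_to_cc_district.get(best_school)

# Illustrative inputs: a block that mostly overlaps a school district
# belonging to the Washtenaw Community College district.
school_to_cc = {"Ann Arbor": "Washtenaw CC", "Ypsilanti": "Washtenaw CC"}
overlaps_block_a = {"Ann Arbor": 0.82, "Saline": 0.18}

district_a = assign_block(overlaps_block_a, school_to_cc)  # "Washtenaw CC"
```

A block whose best-matching school district is not part of any community college district maps to None, i.e., it is treated as out-of-district.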
Throughout the remainder of the text, I refer to these segments as "boundary segments." I next calculate the distance from the centroid of each student's census block to the nearest boundary segment and, in my main empirical specification, restrict the sample to students residing within two miles of their nearest boundary segment.17 An example of this sample restriction for the Washtenaw Community College district area is provided in Figure A.1.2. Each dot represents a single census block centroid that is no more than two miles from the nearest boundary segment, and dots displayed in the same shade are located closest to the same boundary segment. Intuitively, the empirical strategy compares the outcomes of students who live in census blocks shown in the same shade, but reside inside or outside of the community college district.18

16Both Bay de Noc Community College and Glen Oaks Community College have "service districts" in which students face tuition rates that are greater than the in-district rate but lower than the out-of-district rate. I do not include boundaries that divide these areas from areas not in any community college district, as they are less salient than the true community college district boundaries. 17In order to only include students who are likely to be affected by the local community college's listed tuition rate, I further exclude 6,687 students who are eligible for place-based promise scholarships upon high school graduation, or whose area of residence becomes eligible for a promise scholarship during the time frame of the data. I identify areas that are eligible for promise scholarships from the Upjohn Institute's Promise Database: https://www.upjohn.org/promise/promiseSearch.html.

Figure A.1.3 presents visual evidence on the differences in local community college tuition rates among students residing on either side of the identified boundary segments by plotting the distribution of in-district vs.
out-of-district tuition differentials across all border-year pairs. The average difference in tuition between in-district and out-of-district students is $1,617, which is only slightly higher than the average college-level difference of $1,463 (see Table A.1.1). However, there is some variation in this differential, with the interquartile range stretching from $1,315 to $2,036. To further explore this variation, Figure A.1.4 plots the tuition differentials against various demographic characteristics. There is no identifiable relationship between a border-year pair's tuition differential and the share of economically disadvantaged students, the median household income of the area, or students' average test scores. This finding suggests that the variation in tuition differentials likely comes from different tuition-setting policies and practices at colleges throughout the state rather than from differences in local economies or preferences for education.

Table A.1.6 then provides descriptive statistics on the entire sample of students who graduate from Michigan public high schools between 2009 and 2016, and on the analysis sample of students who live within two miles of their nearest boundary segment.19 I also present separate means for the in-district and out-of-district students in each sample. All variables are measured when a student takes the Michigan Merit Exam (MME), a required standardized test that is typically administered during a student's junior year of high school. Panel A shows that there are some differences in demographic characteristics between in-district and out-of-district students. For example, in-district students are less likely to be white and are more likely to be English language learners (ELL). This is not surprising, since community college districts tend to be located in more urban and diverse areas of the state. Panel B then shows that in-district students score slightly lower on their state standardized tests than their out-of-district peers.

Panel C reports college enrollment outcomes for the first year following a student's graduation from high school. I maintain all college enrollment spells that occur within this time frame, which may include enrollment at multiple institutions. As a result, the sum of enrollment in different college types is slightly larger than the total number of students who enroll in some form of postsecondary education. In both samples, about 30% of high school graduates enroll in a community college, with more in-district students doing so than out-of-district students, especially in the all-students sample. About 3% enroll in vocational colleges, with fewer in-district students doing so than out-of-district students. In the sample of all students, about 41% of graduates enroll in a four-year college, while about 38% do so in the analysis sample. There is little difference in this rate between in-district and out-of-district students. In total, about 70% of all Michigan public high school graduates enroll in college within one year, compared to about 67% of the analysis sample.

18I do not consider boundaries that divide two distinct community college districts, so students residing outside of a community college district of interest do not reside within any community college district. 19Students who graduate before 2009 or after 2016 are dropped from the sample due to incomplete college enrollment and completion data collection. Students enrolled in juvenile detention centers, adult education, or alternative education programs, as well as those missing academic or demographic variables, are also dropped from the sample.
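The two-mile sample restriction in Section 1.3.2 reduces to a standard point-to-segment distance computation from each census-block centroid to the candidate boundary segments. A minimal planar sketch with made-up coordinates in miles (an actual implementation would use projected GIS coordinates rather than these toy values):

```python
import math

def point_segment_distance(p, a, b):
    """Distance from point p to the segment from a to b, all (x, y) pairs."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:  # degenerate segment
        return math.hypot(px - ax, py - ay)
    # Project p onto the line through a and b, clamped to the segment.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def nearest_segment(centroid, segments):
    """Return (index, distance) of the closest boundary segment."""
    dists = [point_segment_distance(centroid, a, b) for a, b in segments]
    i = min(range(len(dists)), key=dists.__getitem__)
    return i, dists[i]

# Illustrative: a centroid 1.5 miles from its nearest boundary segment
# falls inside the two-mile bandwidth.
segments = [((0.0, 0.0), (5.0, 0.0)), ((5.0, 0.0), (5.0, 5.0))]
idx, dist = nearest_segment((2.0, 1.5), segments)
in_sample = dist <= 2.0
```

Each centroid is assigned to its nearest segment (which defines the boundary-segment fixed effect) and kept in the sample only if the distance is at most two miles.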
1.4 Empirical Strategy

The boundary fixed effects approach, as outlined in Figure A.1.2, naturally lends itself to the following reduced-form estimating equation:

Yibt = γ + δ Districti + XiΨ + µbt + νibt    (1.1)

where Yibt is an outcome of interest for student i who resides along boundary segment b and graduates from high school in year t. Districti is a dummy variable equal to 1 if student i resides in a community college district and equal to 0 otherwise. Xi is a vector of individual control variables that may affect college enrollment choices, such as a student's socioeconomic background and academic aptitude. µbt is a full set of boundary segment by year fixed effects, which will hold constant any factors affecting graduates who live in the same area along a community college boundary segment, such as local economic conditions or changing preferences for higher education. νibt is an idiosyncratic error term. The coefficient of interest is δ, which represents the effect of residing in a community college district on Yibt.

To estimate how community college tuition itself affects students' choices and outcomes, I also use a two-stage least squares approach similar to Denning (2017). I choose to use this approach because it is a straightforward way to scale the results by the mean difference between in-district and out-of-district tuition rates. The first stage equation is:

Tuitionibt = ζ + λ Districtib + XiΦ + µbt + υibt    (1.2)

and the second stage equation is:

Yibt = α + β \widehat{Tuition}ibt + XiΓ + µbt + εibt    (1.3)

where \widehat{Tuition}ibt is predicted from the first stage, and the remainder of the variables are defined as in previous equations.20

In order for β to represent the causal effect of local community college tuition in the 2SLS approach outlined above, it must be the case that (1) Cov(Districtib, Tuitionibt | Xi, µbt) ≠ 0 and that (2) Cov(Districtib, εibt | Xi, µbt) = 0.
The first assumption states that, within a narrowly defined geographic area and graduation year, and after controlling for observable characteristics, a student's in-district status is related to the tuition rate he or she faces at the local community college. Given that all community colleges in Michigan set different tuition rates for in-district and out-of-district students, this assumption should hold. However, it is also directly testable in the data. Table A.1.7 presents the estimated first stage value of \lambda in three specifications of equation (1.2): including no control variables, including only distance-related control variables, and including a full set of distance and student control variables.21 The estimated values are quite stable across the different specifications and indicate that in-district students face a local community college tuition rate that is approximately $1,800 lower than that of their out-of-district peers. All three estimates also have partial F-statistics greater than 40, limiting the probability that the 2SLS estimates suffer from weak instrument bias.

20 One could also estimate this relationship via ordinary least squares, but this would impose that the relationship between the tuition differential (in dollars) and the effect of residing in-district is linear, i.e., that the largest in-district effects occur where there are the largest raw tuition differentials. This may be reasonable, considering I do not see much correlation between tuition differentials and observable characteristics, but it is also possible that boundaries with high tuition differentials are unobservably different from boundaries with low differentials. Thus, to remain agnostic about this relationship, I prefer the 2SLS approach.

21 The distance-related variables are the distance between a student's census block of residence and the nearest campus of the local community college, the nearest vocational college, the nearest public university, and the nearest private four-year college. The student control variables are: a student's race (white, black, or Hispanic), gender, FRPL status, special education participation, ELL status, math and reading test scores, school of choice participation, on-time graduation status, and dual enrollment experience.

The second assumption states that, within a narrowly defined geographic area and graduation year, and after controlling for observable characteristics, a student's in-district status is uncorrelated with unobservable determinants of college choices or outcomes. This is also the assumption needed for the identification of \delta in the reduced form equation. It rules out the possibility that, for example, families choose to live in community college districts due to unobserved preferences for community college attendance. This assumption is inherently untestable. However, there are several reasons to believe that it is likely to hold.

First, community college district boundaries are not well-publicized by the state of Michigan. The state does not maintain any publicly available record of community college district boundaries, and each community college has discretion over whether and how it makes this information available to potential students. Thus, it is possible that a family could select a place of residence without knowing whether or not it is contained within a community college district.22

Second, very few students move into community college districts between 9th and 12th grade. This suggests that families do not anticipate community college attendance and move to take advantage of the subsidized tuition rates offered to in-district students. While nearly 14% of all students move census blocks during high school, less than 1% move from an out-of-district census block to an in-district census block.23 Moreover, conditional on beginning high school in a community college district, a student has a 99% probability of finishing high school in a community college district. In contrast, conditional on beginning high school outside of a community college district, a student has only a 4% probability of finishing high school in a community college district. While I do not observe students' residences after they graduate from high school, I restrict outcomes to students' enrollment choices within one year of high school graduation to avoid the possibility that students move into community college districts as adults.

22 Property taxes for the local community college are displayed on the tax bills of property owners who reside within community college districts, but there is no indication of in-district status, nor tuition rates, on these bills.

23 Author's own calculation based on a sample of students who have records for all grades 9-12 and non-missing census block information in at least two of those grades.

Third, students residing on either side of a community college district boundary appear quite similar across observable characteristics. Table A.1.8 reports balance tests of observable student characteristics and predicted community college enrollment along the boundary segments.24 The results indicate that students residing near one another, but on opposite sides of a community college district boundary, are quite similar. These students are similarly likely to be white, to be eligible for free or reduced price lunch, and to be English language learners. They also score similarly on standardized tests, graduate on-time from high school at similar rates, and have equal predicted community college attendance rates.
The only attributes across which the two groups differ are special education status and dual enrollment participation: in-district students are both less likely to be classified as special education students and slightly less likely to dual enroll in a college course while in high school, although the latter result is only marginally statistically significant.25 Tables A.1.9 and A.1.10 provide additional evidence of balance across neighborhood characteristics and distance to local colleges.

24 I predict enrollment on the full sample of high school graduates using a probit equation that includes the observable characteristics of the other balance tests. Specifically, I estimate enrollment as a function of a student's race, gender, FRPL eligibility, special education status, ELL status, math test score, reading test score, on-time graduation status, and dual enrollment status. This approach explicitly tests for differences in observable characteristics that are correlated with community college attendance.

25 Additional analyses suggest that in-district students are also less likely to be school of choice students, but this is unsurprising given that in-district school districts tend to be larger and more suburban, and students residing in rural areas are more likely to choice in to suburban school districts than suburban students are to choice in to rural districts.

Despite these mitigating factors, the largest threat to identification is the fact that community college district boundaries are often congruent with either county or school district boundaries, inducing compound treatments at the cutoff points (Keele et al., 2017).26 To my knowledge, there are no other specific community college policies that are discontinuously applied along community college district boundaries.
Nevertheless, school districts may provide different college information and guidance to students, and families often select where to live based on school district attributes (Caetano and Macartney, 2014), including the quality of the school district's college advising. A related concern is that families choose where to live based on preferences for other types of taxes or public goods, which may be correlated with their preferences for education more generally. However, I find that, along the boundaries, in-district residents face an average millage rate of 15.4, while out-of-district residents face an average rate of 12.3. Given that the average community college millage rate is about 2.5, this suggests that there is only about a 0.6 millage difference (i.e., $0.60 per $1,000 of taxable value) attributable to other types of taxes, which is rather small and unlikely to explain residential choices.

To address potential sorting into school districts, in Section 1.5.5, I repeat the analysis using a subset of students who live in school districts which are bisected by a community college district boundary. Students in this sample come from families who chose to live within the school district's boundaries, and therefore likely have similar preferences for education, and overwhelmingly attend the same high school, and therefore likely receive similar college counseling. However, only a fraction of these students live within the local community college's district. I find very similar effects of in-district status using this subsample of students, suggesting that neither residential sorting nor school-level policies are likely driving my main results.

26 The overlap of counties and community college districts is less concerning, as the vast majority of college advising and implementation of college access policies occurs at the school, school district, or intermediate school district level, rather than the county level.
Moreover, specifications that include county fixed effects produce qualitatively similar results, indicating that, among students residing along community college district boundaries, there are not unobserved differences in preferences for higher education institutions along county lines. These results are available from the author upon request.

1.5 Results

1.5.1 College Enrollment

Table A.1.11 presents the reduced form and 2SLS estimates for students' college enrollment choices within one year of high school graduation. The first four columns present estimates for four mutually exclusive college categories: the local community college (that at which in-district students receive reduced tuition), non-local community colleges (both in Michigan and in other states), private vocational colleges, and four-year colleges. However, students may enroll in more than one type of college within their first year following high school graduation, such that the sum of these estimates need not exactly equal the overall college enrollment effect presented in column (5). Panel A presents estimates for all cohorts of students, while Panel B presents estimates only for the 2009-2011 cohorts, who are also used for the analyses of college completion.

The first row of each panel presents the reduced form effects of residing in a community college district. For the "all cohorts" sample, residing in a community college district increases enrollment in the local community college within one year of high school graduation by 6.4pp (31%), while decreasing enrollment in non-local community colleges by 2.8pp (31%) and in private vocational colleges by 0.7pp (20%). All three of these estimates are statistically significant at the 99% confidence level and imply that students shift enrollment away from other two-year colleges and towards their local community college when they reside in a community college district.
In contrast, there is no statistically significant effect of in-district status on enrollment in four-year colleges, and the point estimate is quite small: -1pp, or 2.7% of the mean enrollment rate. On net, these enrollment effects increase overall college enrollment within one year of high school graduation by 1.3pp, or approximately 1.9% of the mean enrollment rate of 67.3%. The community college and vocational college enrollment effects are qualitatively similar for the 2009-2011 cohorts, but the overall college enrollment effect for this subsample is much smaller (0.6pp) and not statistically different from zero.27 All changes in enrollment behavior for these cohorts come from switching out of non-local community colleges and vocational colleges.

27 In Table A.1.12, I estimate the main specification including an interaction term between the in-district dummy variable and a dummy variable for being in the 2009-2011 cohorts. I find that the effects for local community college, vocational college, and four-year college enrollment are statistically no different for the 2009-2011 cohorts, compared to the 2012-2016 cohorts. However, the effects for non-local community college and overall college enrollment are statistically different between the two groups.

The second row of each panel presents the 2SLS estimates of the effect of reducing the tuition rate at a student's local community college by $1,000. Across all students, this reduction in tuition increases enrollment at the local community college by 3.5pp (18%), which is primarily driven by a 1.5pp (17%) decrease in enrollment in non-local community colleges and a 0.4pp (11%) decrease in enrollment in private vocational colleges. Taken together, these enrollment effects increase overall college enrollment in the year following high school graduation by a statistically significant 0.7pp, or approximately 1% of the mean enrollment rate of 67.3%. Again, the community college and vocational college enrollment effects are qualitatively similar using the 2009-2011 subsample, but the overall enrollment effect is smaller (0.4pp) and statistically insignificant.

1.5.2 College Completion

Table A.1.14 estimates how residing in a community college district affects longer-run educational outcomes for the 2009-2011 cohorts. The first row of the table presents reduced form effects. In-district status significantly increases the total number of college semesters students complete by 0.344 (4.2%) and the total number of credits students complete by 3.46 (4.5%), indicating that students increase their educational attainment when they have access to a low-cost local community college. Residing in a community college district also increases the probability that a student will transfer to a four-year college by 1.1pp (9.6%), where transfer is defined as a student beginning college at a community or vocational college but later enrolling in a four-year college. The 2SLS results in the second row indicate that reducing a student's local community college tuition rate by $1,000 increases the number of semesters of college she completes by 0.206 (2.5%), the number of credits she completes by 2.07 (2.7%), and her probability of transferring to a four-year college by 0.7pp (6.5%).

Columns (4) and (5) show that residing in a community college district does not significantly affect students' completion of certificates or associate degrees, although the coefficient for associate degree completion is positive. This lack of a degree completion effect could be driven by the fact that community colleges have lower completion rates than their vocational counterparts. On average, only 13.5% of students at Michigan's community colleges complete programs within 150% of their intended length, whereas 19.6% of students at vocational colleges do so.
However, column (6) indicates that in-district status increases bachelor's degree completion by a statistically significant 1.8pp (5.7%). The 2SLS estimate shows that reducing a student's local community college tuition rate by $1,000 increases her probability of completing a bachelor's degree by 1.1pp (3.5%).

To better understand these degree completion outcomes, Table A.1.15 reports the distribution of associate and bachelor's degree increases across seven categories of majors: (1) general studies, which primarily consists of pre-transfer programs at community colleges; (2) liberal arts and sciences; (3) health; (4) business; (5) technical fields, such as engineering and technology programs; (6) professional fields, such as education, criminal justice, and journalism; and (7) other or unspecified fields, which primarily consists of degrees awarded without a major recorded in the data.28 For each estimate, the outcome of interest is whether a student completes a given degree in a given field. These outcomes are mutually exclusive such that the sum of their coefficients must equal the overall degree completion increases presented in Table A.1.14.

28 For all students who enroll in a postsecondary institution covered by the NSC, the MDE/CEPI dataset records the six-digit federal Classification of Instructional Programs (CIP) code of the programs in which students enroll. I define a student as earning a degree in a given field of study if the student is enrolled in that field of study when she earns her degree. Table A.1.16 lists the set of two-digit CIP codes included in each category. If a student earns more than one degree of the same type (e.g., multiple associate degrees), only the field of study for her first degree is considered in this analysis.

Panel A reports the reduced form and 2SLS results for associate degree completion by field, indicating that a $1,000 decrease in a student's local community college tuition rate increases her probability of earning a general studies associate degree by 0.6pp (17.1%) and an associate degree in other or unspecified fields by 0.2pp (12.5%). These estimates indicate that, while reduced local community college tuition does not statistically significantly increase overall associate degree completion, it shifts the fields in which students earn associate degrees. Specifically, students are more likely to earn degrees that enable transfer to four-year colleges than degrees which lead to labor market entry. Panel B reports the effects of in-district status and reduced tuition on bachelor's degree completion by field, and shows that the increase in bachelor's degree completion is primarily driven by increases in business and professional fields of study. Given that business majors experience substantial earnings gains in the labor market (Andrews et al., 2017), this increase is likely to have longer-term payoffs for students.29

Taken together, these completion results indicate that having access to in-district tuition induces students to complete associate degrees that enable transfer to four-year colleges, and to ultimately complete bachelor's degrees. These improved outcomes are likely the result of differences in institutional resources and objectives between Michigan's community and vocational colleges. For example, community colleges spend about $1,166 more per student on instruction than vocational colleges and award a large share of their degrees in general liberal arts fields (two-digit CIP code 24). In contrast, vocational colleges rarely award degrees in this area. Given that these degrees are generally intended for students transferring to four-year colleges, it is not surprising that community colleges also have substantially higher rates of transfer than vocational colleges: 36% compared to 11%.
Table A.1.13 provides additional summary statistics on the differences between these institutions that further explain why attending a community college, rather than a vocational college, could improve students' educational attainment.

29 To further explore the increase in professional fields, Table A.1.17 presents separate estimates for disaggregated majors contained within this category. The results indicate that the increase is driven by more students completing degrees in education majors and in parks, recreation, leisure, and fitness studies majors. The largest majors in the latter category are exercise science (CIP 31.0505) and sports administration (CIP 31.0504). It is not obvious why the degree increases are largest in these fields, as community colleges in Michigan have transfer programs for a wide variety of majors; future work could explore why students primarily choose these pathways.

1.5.3 Heterogeneity

Table A.1.18 reports heterogeneous treatment effects by a student's FRPL eligibility, gender, and academic achievement for select college enrollment and completion outcomes.30 Panel A shows that FRPL eligible and ineligible students respond similarly to residing in a community college district with regard to local community college enrollment, but their substitution patterns differ. FRPL ineligible students, who come from higher-income families, respond to living in a community college district by changing which community college they attend: they are 3.3pp less likely to enroll in a non-local community college and 6.7pp more likely to enroll in their local community college. In contrast, FRPL eligible students respond to in-district status by reducing non-local community college enrollment by only 1.5pp.
These students also decrease enrollment in vocational colleges by 0.8pp and increase overall college enrollment by 1.8pp, indicating that having access to a low-tuition local community college option is particularly important for overall college enrollment among lower income students. However, FRPL eligible and ineligible students earn associate and bachelor's degrees at comparable rates.

Panel B shows that male students are more responsive to in-district status than female students: they are 7.2pp more likely to attend the local community college than their out-of-district peers, whereas female students are 5.6pp more likely to do so. The underlying substitution effects also differ by gender. Female students respond to in-district status by significantly reducing enrollment in non-local community colleges and vocational colleges, while male students only somewhat reduce enrollment in non-local community colleges and also reduce enrollment in four-year colleges. This difference in substitution patterns may stem from the fact that vocational colleges tend to offer degrees in female-dominated fields, such as healthcare. Nevertheless, as in the case of FRPL eligible and ineligible students, these differences do not persist when looking at completion outcomes.

30 For the binary FRPL status and gender variables, I extend equation (1.1) to include an interaction term between District_{i} and the demographic variable of interest. For the test score variable, students are assigned to score quartiles among all students who took the MME exam in the same year, based on their combined scores on the math and reading exams. I then modify equation (1.1) to include a dummy variable for the middle two quartiles, a dummy variable for the top quartile, and interaction terms between these dummy variables and District_{i}.
That is, even male students who forgo initially attending four-year colleges to attend their local community college do not forgo ultimately earning bachelor's degrees.

Lastly, Panel C reports the estimated effects by students' test scores. Students from the bottom three test score quartiles are very responsive to residing in a community college district: it increases their probability of enrolling in the local community college by 7.4-7.5pp. In contrast, students from the top quartile respond to in-district status by increasing their enrollment in the local community college by only 2.9pp. There are also differences among these groups when considering substitution effects. Students from the bottom quartile forgo enrollment in non-local community colleges, vocational colleges, and four-year colleges, whereas students from the middle quartiles primarily forgo enrollment in other community colleges. However, there are no decreases in bachelor's degree attainment among any group of students, which again suggests that the students who are deterred from attending four-year colleges do not forgo opportunities to earn bachelor's degrees.

1.5.4 Robustness Checks

The reduced form and 2SLS results both rely on the assumption that there are no unobservable differences between students residing on either side of a community college district boundary that affect their college choices and outcomes.
One threat to this assumption is that the two-mile bandwidth does not create appropriate treatment and control groups, because individual students may live several miles from one another and, therefore, may have different preferences over postsecondary education options or may be exposed to different social networks and information about college.31 To test whether the results hold across comparisons of students who reside farther from or closer to one another, I re-run the reduced form analysis for local community college enrollment using varying bandwidths from 0.1 miles to 4 miles. Figure A.1.5 presents the estimates from these specifications, which range from 2.5pp to 8.0pp and are all statistically significant at the 90% level or greater. Moreover, the 90% confidence intervals of all of the point estimates contain the 6.4pp estimate from the main specification, indicating that the two-mile bandwidth selection is not the main driver of the results.

31 Observed differences in student characteristics do not necessarily decrease as the bandwidth is narrowed, and in some cases actually increase. Table A.1.19 documents this fact by providing the balance tests from Table A.1.8 for varying bandwidths.

A greater threat to the identifying assumption is the fact that community college district boundaries are often congruent with school district boundaries, and a non-trivial share of families choose where to live based on school district characteristics. To test whether differences in school districts drive the college enrollment and completion results, I provide an alternative specification that compares the college choices and outcomes of students who reside in the same school district but live on opposite sides of a community college district boundary. This situation occurs when a community college district is congruent with a county (or multiple counties), but school districts in the area span more than one county.
Figure A.1.6 identifies the 25 school districts in the state in which at least 10% of high school residents reside within the community college district and at least 10% reside outside of it. Using these school districts as the analysis sample eliminates the concern that families sort into more desirable school districts that are located in community college districts. In addition, this approach holds constant the college counseling information provided by the school district, as the majority of students residing within one of these school districts attend the same high school: twenty-four of the twenty-five school districts contain only one high school, and 92% of students attend a high school that is located within their district of residence.

I repeat the reduced form and 2SLS analyses on this selected sample, but replace the boundary segment by year fixed effects with school district of residence by year fixed effects. Table A.1.20 presents results from this analysis for enrollment in the local community college for the 2009-2016 cohorts and bachelor's degree completion for the 2009-2011 cohorts. The first column of the table presents the local community college enrollment results from the main specification. The second column presents results from the within school district specification. Using this sample and specification, residing in-district increases enrollment at the local community college by 5.0pp, and reducing the tuition rate by $1,000 increases enrollment by 3.2pp. Neither of these estimates is statistically different from the analogous estimates produced by the main specification.32 The third and fourth columns show that the estimated degree completion effect is also similar when using the within school district specification, indicating that the results are unlikely to be driven by selection into particular school districts.
Another way to check the robustness of the main results is to examine whether college enrollment choices and completion outcomes discontinuously change along geographic boundaries other than community college districts. If the differences in college outcomes between in-district and out-of-district students residing along a community college district border are truly driven by differences in tuition rates, then there should be no differences in college choices or outcomes along borders where tuition rates do not differ and no other related policies are in place. To test whether this is true, I conduct two different placebo tests. First, I contract all community college district perimeters by two miles and compare the college choices of students residing within two miles of the new placebo boundary. This approach compares the choices and outcomes of students who all live within the same community college district, and face the same low tuition rate, but differ in how close they live to the center of the community college district. Second, I expand all community college district perimeters by two miles and compare the college choices of students residing within two miles of the new placebo boundary. In this approach, I compare students who live outside of a community college district but differ in how close they live to the nearest community college district boundary.

Table A.1.22 presents the results from these approaches. The first column indicates that students residing within a community college district, but on either side of the contracted placebo boundary, do not differ in their likelihood of attending the local community college. The second column shows that students residing outside of a community college district, but on the side of the expanded placebo boundary that is closer to the true community college district, are slightly more likely to attend the local community college.
However, this estimate (0.7pp) is quite small compared to the estimate of 6.4pp along the true community college district boundaries and is only marginally significant. The third column indicates that students who reside within a community college district, but inside the contracted placebo boundary, are slightly less likely to obtain bachelor's degrees, while the fourth column indicates that out-of-district students living on either side of the expanded placebo boundary are equally likely to obtain a bachelor's degree. Both sets of results indicate that enrollment and completion outcomes do not change in meaningful ways along non-community-college-district boundaries, providing additional validation that the main results capture the true effect of reduced community college tuition.

32 Table A.1.21 contains estimates for all one-year enrollment outcomes using this alternative specification. Given the reduced sample size, these estimates lack precision but are qualitatively similar to those produced by the main specification.

1.6 Conclusion

Community colleges serve millions of undergraduate students each year and are increasingly the focus of college access policies, making it critical to understand how students respond to their costs. In this paper, I provide new evidence on the effect of community college tuition rates on students' college enrollment decisions, persistence in college, and degree completion. To do so, I exploit the fact that Michigan's community colleges offer students different tuition rates depending on whether they live within or outside a college's district boundaries, as well as the fact that nearly one-quarter of Michigan's high school graduates do not live within the boundaries of any community college district. This geographic variation allows me to use a boundary fixed effects design that compares the outcomes of students who reside on either side of a community college district boundary but who are otherwise observationally similar.
I combine this approach with detailed administrative records from the Michigan Department of Education to track students' residences, college enrollment choices, and college completion outcomes over time.

Among students graduating from Michigan public high schools between 2009 and 2016, I find that a $1,000 decrease in the advertised tuition rate at a student's local community college upon graduating high school increases the probability of enrollment at the college by 3.5pp, or about 18%. This increase in local community college enrollment can be partially attributed to an increase in overall college enrollment, but is also due to a decrease in enrollment at non-local community colleges and at private vocationally-focused colleges that offer similar degree programs to community colleges. However, I find little evidence that students forgo attending four-year colleges or decrease their overall educational attainment in response to a low community college tuition rate. Instead, for students who graduate from high school between 2009 and 2011, I find an increase in persistence in college, credit completion, transfer to four-year colleges, and bachelor's degree completion. These improved outcomes may be attributed to the substitution towards local community colleges and away from non-local community colleges and vocational colleges, as overall college enrollment is not affected by reduced community college tuition for this subset of students. This finding suggests that gains from community college attendance can extend to more students than identified in prior work (Rouse, 1995; Reynolds, 2012; Mountjoy, 2019): namely, students who would have attended a private vocational college in the absence of a community college.

These results have meaningful policy implications, both for Michigan and for community college policies throughout the country.
Approximately 100,000 students graduate from Michigan public high schools in a given year; of these, about 23,000 do not live within a community college district. Based on this paper's estimates, reducing local community college tuition by $1,000 for these students would induce 253 more students to earn bachelor's degrees.33 Given that the average discounted lifetime premium to earning a bachelor's degree is about $300,000-$600,000 (?), the total discounted earnings benefit to students under such a policy would be between $76 million and $152 million. These figures far exceed the $5-$6 million cost of reducing tuition by $1,000 for all out-of-district students who attend community colleges.34 In fact, the income tax gains alone (assuming students continue to reside in Michigan) would total $3-$6 million under Michigan's current state income tax rate of 4.25%. Other policies that induce students to attend community colleges rather than not pursuing postsecondary education or attending lower quality private colleges, including the regulation of the for-profit industry and funding for new community college campuses, are likely to be similarly cost-effective and should continue to be a focus of education policy research. However, the findings of this paper are not without limitations.

33 Currently, about 6,828 (29.7%) of out-of-district students in each cohort earn a bachelor's degree. Increasing this percentage by 1.1pp (the estimated increase in bachelor's degree completion) to 30.8% would mean 7,081 students would complete a bachelor's degree, a difference of 253 students.

34 About 5,267 (22.9%) of out-of-district students attend community colleges each year. Increasing this percentage to 24.9% would bring the total to about 5,727. At $1,000 per student, the cost of implementing the proposed policy would be $5,727,000 plus administrative costs.
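The back-of-envelope fiscal calculation above can be reproduced in a few lines. This sketch uses only the figures reported in the text; the text's exact counts (6,828 and 7,081 bachelor's degree earners) come from the underlying microdata, while the rounded rates below give 6,831 and 7,084, with the same 253-student difference.

```python
# Back-of-envelope reproduction of the policy arithmetic reported above.
# All inputs are figures stated in the text; nothing here is new data.
out_of_district_grads = 23_000   # out-of-district graduates per cohort

# Bachelor's completion: 29.7% baseline, +1.1pp from a $1,000 tuition cut
baseline_ba = round(out_of_district_grads * 0.297)
new_ba = round(out_of_district_grads * 0.308)
extra_ba = new_ba - baseline_ba                  # 253 additional degrees

# Lifetime earnings benefit at a $300k-$600k discounted BA premium
benefit_low = extra_ba * 300_000                 # ~$76 million
benefit_high = extra_ba * 600_000                # ~$152 million

# Cost: community college attendance rises from 22.9% to 24.9% of the group
new_cc_enrollees = round(out_of_district_grads * 0.249)   # ~5,727 students
cost = new_cc_enrollees * 1_000                  # $1,000 discount per enrollee

# State income tax recapture at Michigan's 4.25% flat rate
tax_low, tax_high = 0.0425 * benefit_low, 0.0425 * benefit_high
print(extra_ba, benefit_low, benefit_high, cost, tax_low, tax_high)
```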
One limitation of this study is that the results are estimated from an empirical design that compares students living very near one another, and thus, does not address the role of distance in college choices. Given the documented relationship between college proximity and college attendance (Card, 1995; Currie and Moretti, 2003; Lapid, 2017), it is likely that rural students who live far from colleges face additional challenges in accessing higher education and may not respond to reduced tuition as strongly as their non-rural peers. Future work should seek to identify how reduced tuition policies differentially affect rural students and should investigate alternative policy interventions to increase college-going behavior among this population. Second, the tuition reduction studied in this paper does not include changes in marketing, mentoring, or college campuses. Policies that include such factors, such as broad free-tuition programs or the expansion of community college districts, may influence students in different ways and should continue to be rigorously evaluated as they are implemented.

CHAPTER 2

COMMUNITY COLLEGE PROGRAM CHOICES IN THE WAKE OF LOCAL JOB LOSSES

2.1 Introduction

The educational decisions that young people make can substantially affect their long-run labor market outcomes and overall economic well-being. The typical college graduate will earn more than double the typical high school graduate over her lifetime (Hershbein and Kearney, 2014), while also experiencing improved health, less reliance on social safety net programs, and fewer interactions with the criminal justice system (Oreopoulos and Salvanes, 2011).
Equally large earnings gaps exist among students with the same level of education who pursue different fields of study (Altonji et al., 2012), and a growing body of literature shows that students take these earnings gaps into account when selecting college majors (Montmarquette et al., 2002; Beffy et al., 2012; Long et al., 2015), particularly when provided with reliable information about the labor market (Wiswall and Zafar, 2015; Hastings et al., 2015; Baker et al., 2018). However, the vast majority of college major choice research focuses on the four-year college sector. The nearly ten million students who attend two-year community colleges (National Center for Education Statistics, 2018) also must decide which fields to study, and their decisions have similarly large implications for their labor market outcomes. For example, students who enroll in healthcare programs can expect to experience large earnings gains in the labor market, while students who select other programs may not earn more than their peers who do not enroll in postsecondary education (Bahr et al., 2015; Belfield and Bailey, 2017; Stevens et al., 2018; Grosz, 2018). In response to these earnings differences, policymakers have begun to introduce programs that aim to steer students into programs that align with local economies. Several states tie community colleges' appropriation funding to their ability to produce degrees in high-demand areas (Snyder and Boelscher, 2018), and some recent financial aid programs incentivize students to choose in-demand fields of study (Allen, 2019; Natanson, 2019). Yet, there is little evidence on the extent to which labor market opportunities affect community college students' program choices. In this paper, I use administrative data on the education decisions of recent high school graduates in Michigan to analyze how labor market conditions influence students' choices of community college programs.
Specifically, I consider how students' choices respond to local, occupation-specific job losses that alter the relative benefit of pursuing different programs. These types of job losses are likely to be particularly influential to community college students for several reasons. First, community college students tend to remain close to home when attending college and after graduating, making it likely that local labor demand shapes students' expected labor market prospects more than state or national demand.1 Second, community college programs are generally designed to take two years or less to complete. Thus, while four-year college students may consider longer-run labor market trends when choosing college majors, community college students may be more likely to consider short-term fluctuations in labor demand. Finally, many programs at the community college level are closely tied to specific occupations, such as nursing or welding, rather than the broad subjects that typically define majors at four-year colleges. As a result, the expected labor market opportunities associated with programs align closely with labor market opportunities in specific occupations. My empirical approach exploits plausibly exogenous variation in students' exposure to local job losses resulting from mass layoffs and plant closings. I further rely on the distribution of occupations across industries to create estimated measures of occupation-specific labor demand shocks that align closely with six broad groups of community college programs. Intuitively, these measures isolate job losses that affect the types of occupations community college graduates would expect to enter after completing their educational programs.
For example, hospitals employ a large number of healthcare workers with community college credentials, such as nurses and health assistants. Therefore, hospital closures should change the benefit to local students of enrolling in community college healthcare programs. In contrast, mass layoffs at prisons will mostly affect law enforcement professionals and, in turn, should alter the benefits of entering community college law enforcement programs. By comparing cohorts in the same county that were exposed to different local job losses as they exited high school, I show that students' program choices are sensitive to occupations' local labor market conditions. On average, an additional layoff per 10,000 working-age residents in a county reduces the share of the county's high school graduates enrolling in related community college programs by 0.8%. Correspondingly, a one standard deviation increase in layoff exposure reduces enrollment by 3.8%. This effect is most pronounced when layoffs occur in a student's county during her senior year of high school, and is driven by students substituting enrollment between community college programs, rather than forgoing higher education opportunities. To explain these substitution patterns, I leverage data on the skills required in different occupations from the U.S. Department of Labor's Occupational Information Network (O*NET) to create measures of skill similarity between community college programs.

1. The median distance a community college student travels to campus is only eight miles (Hillman and Weichman, 2016), and over 60% of community college graduates live within 50 miles of the college they attended (Sentz et al., 2018). In Michigan, I estimate that 66% of students who attend community colleges within six months of high school graduation attend one located in their county. This number is 86% for students who live in a county with a community college.
I then document that students primarily shift their enrollment into programs that require similar skills to the field affected by layoffs. Moreover, when occupations that do not have close substitutes experience negative employment shocks, students exhibit a lower degree of responsiveness. This finding suggests that students' ability to adapt to labor market changes depends on the set of available educational choices and further indicates that supply-side responses by colleges could alter the effects of local labor market downturns. These results contribute to two related lines of literature on how individuals make human capital investment decisions. First, the results add to a large body of empirical work on factors affecting what students study in college, particularly how expected wages affect students' college majors. Most prior work at the four-year college level finds that, to some extent, expected wages influence students' choices (Altonji et al., 2016). Consistent with this finding, a recent line of work shows that the composition of college majors changed following the Great Recession, with more students pursuing "recession-proof" majors (Shu, 2016; Liu et al., 2018; Ersoy, 2019). Choi et al. (2018) also show that the occurrence of "superstar" firms with abnormally high stock returns increases the number of four-year college students majoring in related fields. Related research at the community college level is limited, but two recent studies indicate that students attending these institutions are sensitive to expected labor market prospects. Baker et al. (2018) perform an information experiment and find that students' program choices respond to new information about labor market outcomes, particularly the salaries earned by previous graduates. Meanwhile, Grosz (2018) uses a shift-share approach to show that, in California, the distribution of community college program completions has kept pace with statewide employment composition changes.
He further shows that these trends are primarily due to changes in student demand rather than supply-side responses by colleges. I build on these findings by showing that exposure to job losses also affects students' choices across community college programs. In line with prior work, these effects are rather small in magnitude, suggesting that factors outside of the labor market play a substantial role in determining students' choices. Second, this research provides new evidence that local labor market shocks can affect education choices across a variety of margins. Several recent papers exploit mass layoffs and similar events to study how labor market conditions affect college enrollment (Charles et al., 2018; Hubbard, 2018; Foote and Grosz, 2019). They generally find that poor labor market conditions lead to an increase in college enrollment, and conversely, that economic booms decrease postsecondary enrollment and completion. A line of literature on the sensitivity of community college enrollment to the business cycle confirms this finding (Betts and McFarland, 1995; Hillman and Orians, 2013). However, few papers consider the occupation- or industry-specific nature of local labor market shocks. Two recent exceptions are Weinstein (2019), who finds that various industry-level shocks affect the composition of college majors at nearby four-year universities, and Huttunen and Riukula (2019), who find that Finnish children are less likely to enter the same field of study as their parent when their parent has been laid off. I find similar responses to local shocks among a previously unstudied population of students and also show that students shift enrollment towards programs that require similar skills, which has not been documented in prior work.

2.2 Conceptual Framework

This paper estimates how community-level job losses affect students' postsecondary choices, particularly at the community college level.
The basic economic intuition of this analysis is that job losses occurring through labor market shocks (e.g., mass layoffs and plant closings) represent changes in local labor demand, which in turn can affect students' expected benefits of pursuing different postsecondary education programs. To see the potential changes in students' decisions arising from a change in expected benefits, consider a simplified setting where student i decides between four different postsecondary options: (1) a community college vocational program that leads to a career in occupation group A (e.g., health), (2) a community college vocational program that leads to a career in occupation group B (e.g., business), (3) a four-year college program (leading to a bachelor's degree), or (4) directly entering the labor market.2 Each alternative is associated with an expected lifetime benefit, B_ij, where j denotes one of the choices. This expected benefit term is a function of student i's expected earnings in related occupations and the student's taste for the occupations and/or coursework. That is, B_ij = Y_ij + µ_ij, where Y_ij is an expected earnings term and µ_ij is a taste parameter. For example, the expected benefit to student i of pursuing a community college health program is a combination of the expected earnings in community college health occupations and how much a student expects to enjoy the nature of healthcare work and coursework. Each alternative is also associated with an expected cost, C_ij. Students choose the alternative that maximizes U_ij = U_i(B_ij − C_ij), where U_i is some increasing, concave function. That is, a student will choose alternative j if U_ij > U_ik for all k ≠ j, and the probability that student i chooses alternative j can be expressed as P_ij = P(U_ij > U_ik for all k ≠ j). Suppose that student i observes a plant closing or mass layoff while she is deciding which postsecondary option to pursue.
Her response to the shock will depend on how it affects the occupations associated with each alternative. Consider two extreme examples. In one, the labor market shock only affects community college health occupations and reduces the expected earnings of pursuing health programs by ε_1, while holding all other components of the model constant. In another, the labor market shock affects all occupations in the economy and reduces Y_ij by ε_2 for all alternatives. In the first example, the utility student i receives from entering a community college health program will decrease and, if the decline is large enough, she will choose a different postsecondary option. If the student has a strong taste for vocational education, that is, a high µ_ij term for the vocational program options, she will likely shift her enrollment into the other vocational program. If not, she may no longer enroll in college or may enroll in a four-year college program instead. In contrast, in the second example, the utility student i receives from each alternative will decrease and the student's choice should be less affected. These examples highlight that the anticipated effects of layoffs depend on the distribution of layoffs across different segments of the economy. Moreover, they show that labor market shocks can have large effects without inducing students to change whether or where they enroll in college. Namely, students can choose to enter other programs within the vocational community college sector.

2. Students may also choose to enroll in a non-vocational program at a community college. Because these programs are typically designed to assist students in transferring to four-year colleges, I implicitly consider them as part of option (3), a four-year college program.
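The first example above can be sketched numerically. The net benefits B_ij − C_ij below are hypothetical illustrative values (not estimates from the paper), and log utility stands in for the increasing, concave U_i; the shock ε_1 hits only the health option, and a student with a strong vocational taste substitutes into the other vocational program:

```python
import math

def choose(net_benefits):
    # U_ij = U_i(B_ij - C_ij) with U_i increasing and concave (here, log);
    # the student picks the alternative with the highest utility.
    utility = {j: math.log(x) for j, x in net_benefits.items()}
    return max(utility, key=utility.get)

# Hypothetical net benefits for the four options (arbitrary units). The high
# cc_business value reflects a strong taste for vocational education.
net = {"cc_health": 120, "cc_business": 118, "four_year": 115, "work": 100}
print(choose(net))  # cc_health

# A shock epsilon_1 that hits only health occupations lowers that option
# alone; the student shifts into the other vocational program.
shocked = dict(net, cc_health=120 - 30)
print(choose(shocked))  # cc_business
```

Reducing every option's net benefit by the same ε_2 instead would leave the ranking, and hence the choice, unchanged, mirroring the second example.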
Previous studies that only consider the effects of layoffs on college entry do not capture this response and potentially miss important labor market implications since the returns to a community college education vary significantly across programs.

2.3 Institutional Setting & Enrollment Data

The institutional setting for this analysis is the community college market in the state of Michigan. Michigan is home to 28 public community colleges, which together enroll more than 300,000 students annually (Michigan Community College Association, 2019). Local boards of trustees control and govern the colleges, but all institutions share two key features. First, all colleges are open enrollment institutions, meaning students can enroll regardless of academic preparation.3 Second, the colleges primarily confer certificates and associate degrees, which may either be vocational or non-vocational in nature.4 Vocational programs are designed to prepare students for immediate entry into the labor market and have direct links to specific occupations, whereas non-vocational programs typically consist of general education courses and prepare students to transfer to four-year colleges and universities.

2.3.1 Programs Offered by Michigan's Community Colleges

Due to the deregulated nature of Michigan's community college system, the state does not systematically track the programs offered by each college over time. However, in 2011 and 2013, the Department of Treasury published the "Michigan Postsecondary Handbook," which provides a listing of all programs offered by each of Michigan's community colleges and includes their degree level, number of credits, and six-digit Classification of Instructional Program (CIP) codes. The Workforce Development Agency also maintains an online database of all current programs offered by the state's community colleges. I use data from the handbooks and online database to classify programs into vocational and non-vocational categories, as well as to create the program groups that I use to analyze students' responses to job losses in related occupations. To begin, I match each CIP code listed in one of the program listings to its associated occupation code in the Standard Occupational Classification (SOC) system using a crosswalk developed by the Bureau of Labor Statistics (BLS) and National Center for Education Statistics (NCES).5 In the crosswalk, a CIP code is only matched to an occupation if "programs in the CIP category are preparation directly for entry into and performance in jobs in the SOC category" (National Center for Education Statistics, 2011). For example, physical therapy assistant programs (CIP 51.0806) are matched to physical therapy assistants (SOC 31-2021) and welding technology programs (CIP 48.0508) are matched to welders (SOC 51-4121). One limitation of the crosswalk is that CIP codes are constant across levels of education. As a result, some programs may be matched to occupations that are unlikely to be obtained by recent community college graduates.

3. Colleges may set admissions standards for select programs, but most programs do not have such requirements. For example, at Lansing Community College, one of the largest in the state, only 7 out of 209 programs use selective admissions (https://www.lcc.edu/academics/documents/pdf-policies/selective-admission-programs-criteria.pdf).

4. Since 2012, Michigan's community colleges have been able to confer bachelor's degrees in a small number of fields. However, as of 2016, community colleges had only awarded 116 bachelor's degrees (House Fiscal Agency, 2017).

5. The crosswalk can be accessed at: https://nces.ed.gov/ipeds/cipcode/resources.aspx?y=55.
For example, the CIP code for registered nursing (51.3801) is matched to the SOC codes for both registered nurses (29-1141), which is a career attainable by graduates of community college nursing programs, and postsecondary nursing instructors (25-1072), which requires an advanced degree. To ensure all programs are only mapped to attainable occupations, I further match the SOC occupation codes to data on job preparation requirements from O*NET and limit the occupation matches to those that require at least a high school diploma but not necessarily a bachelor's degree. I then define a program as a vocational program if it is matched to an occupation within this subset of attainable occupations. All other programs are considered non-vocational. These programs include general studies programs in which students take core classes that transfer to four-year colleges, pre-transfer programs in specific areas (such as pre-engineering), or academic programs that do not have close occupation links (such as foreign languages).6 Table B.1.1 provides summary statistics on the programs offered by Michigan's community colleges in 2011. On average, a college offers 117 unique academic programs, with 81% being vocational. The five most commonly offered vocational programs, according to broader four-digit CIP codes, are those in vehicle maintenance and repair technologies (CIP 47.06), industrial production technologies (CIP 15.06), allied health (CIP 51.09), criminal justice and corrections (CIP 43.01), and business administration (CIP 52.02). To analyze students' choices across this large set of programs, I create six broad groups of programs based on programs' matched occupations: business, health, skilled trades, STEM, law enforcement, and other.
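The matching and filtering steps above can be sketched as follows. The CIP and SOC codes are the ones cited in the text, but the miniature crosswalk and the education-requirement flags are illustrative stand-ins for the full BLS/NCES crosswalk and O*NET preparation data:

```python
# Miniature CIP-to-SOC crosswalk (illustrative subset, not the full file).
crosswalk = {
    "51.3801": ["29-1141", "25-1072"],  # registered nursing -> RNs, nursing instructors
    "48.0508": ["51-4121"],             # welding technology -> welders
    "16.0901": [],                      # French language: no direct occupation match
}

# Whether each occupation requires a bachelor's degree or more
# (stand-in for O*NET job preparation data).
requires_bachelors = {"29-1141": False, "25-1072": True, "51-4121": False}

def attainable_occupations(cip):
    """Keep only matches attainable without a bachelor's degree."""
    return [soc for soc in crosswalk[cip] if not requires_bachelors[soc]]

def is_vocational(cip):
    """A program is vocational if it retains at least one attainable match."""
    return bool(attainable_occupations(cip))

print(attainable_occupations("51.3801"))  # ['29-1141'] (instructor match dropped)
print(is_vocational("16.0901"))           # False -> classified non-vocational
```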
I create these groupings by combining programs that are matched to similar two-digit SOC occupation codes and, throughout the remainder of the text, refer to the occupations they contain as community college occupations.7 Table B.1.2 provides a list of the two-digit SOC codes contained within each group.

6. Any programs that explicitly state in their name that they are "pre-transfer" programs are considered non-vocational, regardless of whether an occupational match exists.

7. Programs can be matched to more than one detailed SOC occupation code, but 95% of programs are matched to only one two-digit SOC occupation code. For the 5% (22 programs) that are matched to more than one two-digit SOC code, I merge in data on occupational employment from the BLS Occupational Employment Series and assign programs to the occupation group of the matched occupation that had higher statewide employment in 2009.

2.3.2 Students Enrolled in Michigan's Vocational Programs

To analyze how enrollment in community college programs responds to job losses in related occupations, I rely on a student-level administrative dataset provided by the Michigan Department of Education (MDE) and Center for Educational Performance and Information (CEPI). The dataset contains high school academic records for all students who attended public high schools from 2009 to 2016 and further links students to college enrollment and completion records from the National Student Clearinghouse (NSC) and a state-run data repository (STARR). The high school academic records provide rich information on students' demographic characteristics and academic performance, including race/ethnicity, gender, standardized test scores, and census block of residence. All variables are measured during students' eleventh grade year, when they complete state standardized tests. The college link provided through the NSC and STARR contains all records of students' enrollments in colleges covered by either database, as well as information on the academic programs in which they enroll, the credits they complete, and the awards they receive. Like the information on colleges' program offerings, program enrollment is recorded using six-digit CIP codes each semester a student is enrolled in a postsecondary institution. I focus my analysis on high school graduates' first college enrollment and program choices within six months (180 days) of graduating from high school.8 This restriction ensures that the county in which a student resides during high school is a valid measure of her local labor market when she is deciding her postsecondary choice. Once students graduate from high school, I no longer observe where they reside, and therefore, cannot reasonably assume that the labor market shocks occurring in their high school county are the labor market shocks they actually observe. Moreover, by limiting enrollment choices to those occurring soon after high school graduation, I limit the possibility that supply-side responses by colleges drive my results. For example, it is unlikely that colleges can respond to labor market shocks by altering the programs or courses they offer, as these decisions are typically made months or years in advance.9 Table B.1.3 provides summary statistics on Michigan's high school graduates disaggregated by their first postsecondary education choices.

8. In order to focus on students who are likely to consider postsecondary education, I drop students enrolled in juvenile detention centers, adult education, or alternative education programs from the analysis. Results are robust to including these students.
A non-trivial share of students enroll in vocational and non-vocational community college programs each year: 9% and 14% of graduates, respectively.10 Students who enroll in vocational programs are more likely to be economically disadvantaged than students in non-vocational programs and also score lower on state standardized tests.11 They are also more likely to be male and a racial minority. Compared to their peers who do not enroll in college, they are less disadvantaged and more academically prepared. Table B.1.4 disaggregates the summary statistics by students' vocational program choices.12 Across the eight cohorts in the sample, about 24% of vocational students enroll in business programs, while 23% enroll in health programs, 8% enroll in the skilled trades, 13% enroll in STEM, 13% enroll in law enforcement, and 20% enroll in other programs, such as culinary arts or graphic design. There are some demographic differences across the program groups. For example, students who enroll in skilled trades programs are overwhelmingly white (84%) and male (94%). In contrast, students who enroll in health programs tend to be non-white (29%) and female (78%). There is less sorting across academic abilities: average math and reading test scores are similarly low across the programs, but nearly all students in each group graduate from high school on time.

2.4 Measuring Local Job Losses

In my empirical approach, I build on work by Hubbard (2018) and Foote and Grosz (2019) that uses the prevalence of mass layoffs and plant closings to proxy for changes in local labor demand. A key advantage of this type of data is that events are reported at the establishment level. Therefore, I can generate counts of reported job losses in small industries and small counties that are typically suppressed or imputed in county-level databases. For example, of 8,217 possible county-industry pairs in Michigan (83 counties, 99 three-digit NAICS subsectors), only 2,633 (32%) have a complete panel of employment data available in the BLS' Quarterly Census of Employment and Wages (QCEW) series.

9. Because Michigan does not provide annual information on the programs offered by each community college, I am unable to directly analyze whether colleges alter course or program offerings in response to local job losses. However, Grosz (2018) provides evidence that student demand is much more responsive to labor market trends than college supply.

10. 7.9% of community college students simultaneously enroll in a vocational and non-vocational program. I classify these students as enrolling in vocational programs. 6.3% of vocational students enroll in more than one six-digit CIP code. If a student enrolls in two programs and one of the programs is in the "other" category, I assign the student to the alternative program. Otherwise, I randomly assign the student to enroll in one of the programs they have selected. In Section 2.6, I show that the results are robust to dropping students who enroll in multiple program groups.

11. Students are classified as economically disadvantaged if they qualify for free or reduced-price meals under the National School Lunch Program, are in a household that receives food (SNAP) or cash (TANF) assistance, are homeless, are a migrant, or are in foster care.

12. To verify that program choices accurately capture students' educational experiences, I categorize community college courses into the same six occupation groups and tabulate the share of courses taken in different subject areas among students enrolled in different programs. Figure B.1.1 presents these results. The figures show that students who indicate enrollment in a given program group take disproportionately more courses, and earn disproportionately more credits, in the subject area of their program than students in other program groups.
Other data series, such as County Business Patterns, have similar limitations, which I detail in Appendix B. Layoff data are also advantageous because they represent sharp declines in local employment that are plausibly exogenous to students' educational choices, and are likely representative of the employment changes students observe through newspapers and other media outlets. My primary source of layoff data is a listing of all mass layoffs and plant closings reported to the Michigan Workforce Development Agency (WDA) under the federal Worker Adjustment and Retraining Notification (WARN) Act of 1989. The WARN Act requires employers with 100 or more employees to provide at least 60 days notice to employees ahead of large, permanent reductions in employment. Two types of events may trigger a WARN notice: (1) a plant closing affecting 50 or more employees at a single employment site, or (2) a mass layoff affecting either 500 or more employees or between 50 and 499 employees that account for at least one-third of the employer's workforce (U.S. Department of Labor, 2019). Employers must give written notice of the anticipated layoff to the employees' representative (e.g., a labor union), the chief local elected official (e.g., the mayor), and the state dislocated worker unit. If employers do not provide such notice, they are liable to provide each aggrieved employee with back pay and benefits for up to 60 days. Krolikowski and Lunsford (2020) offer additional information on the WARN Act and document its value as a labor market indicator. All WARN notices filed in Michigan are publicly available on the WDA's website. However, the WARN Act does not apply to government entities, which limits my ability to observe layoffs in law enforcement professions, one of Michigan's most popular community college program groups.
To overcome this limitation, I supplement the WARN data with a listing of correctional facility closures and corresponding staff reductions from Michigan's Senate Fiscal Agency (SFA). These events are analogous to plant closures in the private sector but particularly affect public law enforcement occupations, such as corrections officers.

2.4.1 Using WARN Data to Generate Occupation-Specific Layoff Exposure

The layoff data available from the WDA include a record of the date that each mass layoff or plant closing event was reported to the state, along with the name of the company, the city where the affected operation is located, and the number of affected workers.13 The correctional facility closure data available from the SFA include a record of the name of the correctional facility that closed, along with the year and number of affected full-time equivalent (FTE) workers. For each correctional facility closure, I find related local news articles to approximate the date the closure was announced and the county in which the correctional facility was located. Panel A of Figure B.1.2 plots the number of mass layoffs, plant closures, and correctional facility closings reported during each academic year from 2001 to 2017, where I define academic years as July 1 of year t to June 30 of year t + 1. For example, the 2005 academic year runs from July 1, 2005 to June 30, 2006. On average, there are about 75 layoff events each year, with 24 being mass layoffs, 50 being plant closures, and 1.4 being correctional facility closures. The total number of layoff events spiked to 193 during the 2008 academic year, when the Great Recession and automotive industry collapse hit Michigan especially hard. Panel B shows that the total number of job losses also spiked during 2008.

13. I drop 19 layoff events (1.35% of the sample) that do not provide sufficient geographic information to assign to a county.
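The academic-year convention above (July 1 of year t through June 30 of year t + 1) can be sketched as a small helper:

```python
from datetime import date

def academic_year(d: date) -> int:
    """Map a calendar date to its academic year under the July 1 convention."""
    return d.year if d.month >= 7 else d.year - 1

print(academic_year(date(2005, 7, 1)))    # 2005 (first day of the 2005 year)
print(academic_year(date(2006, 6, 30)))   # 2005 (last day of the 2005 year)
print(academic_year(date(2008, 10, 15)))  # 2008 (the spike year in the text)
```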
These layoffs occur throughout the state, in both rural and urban areas, which I highlight in Figure B.1.3 by plotting the average amount of per capita layoffs that occur in each county from 2001 to 2017. The layoff data do not contain information on the occupations of laid-off workers. Therefore, I estimate students' exposure to job losses in each community college occupation group by exploiting the fact that different occupations are concentrated in different industries. I first match all 1,024 entities that experience a layoff to their respective three-digit NAICS industry code using information from company websites and online business databases. There are 99 unique three-digit codes in the NAICS system, each of which represents a subsector of economic activity. I observe 72 of the 99 subsectors in the layoff data, with the three most common subsectors being transportation equipment manufacturers (21% of observations); general merchandise stores (6% of observations); and professional, scientific, and technical services (5% of observations). I then calculate the distribution of community college occupations across industries. Explicitly, let g denote one of the six program/occupation groups outlined in Appendix Table B.1.2 (for example, health or business) and k denote a three-digit NAICS industry (for example, hospitals or general merchandise stores). The share of industry k's employment that belongs to occupations in group g in year t can be calculated as:

α_gkt = Employment_gkt / Employment_kt    (2.1)

where Employment_gkt is the total employment in occupations in group g in industry k in year t and Employment_kt is total employment in industry k in year t. For example, if g is the health occupation group and k is the hospital subsector, then α will capture the share of employment in hospitals that belongs to health-related occupations that community college graduates can reasonably enter.
I calculate α_{gkt} for each year, occupation group, and industry using nationally-representative data from the BLS' Occupational Employment Series (OES) for non-government sectors and the American Community Survey (ACS) for government sectors.14 Continuing with the example from above, I find that, on average, community college health occupations account for 54.4% of employment in the hospital subsector. In contrast, community college health occupations account for only 1% of employment at general merchandise stores.15 As a result, layoffs that occur at hospitals should affect these occupations, and therefore the benefit of enrolling in community college health programs, much more than layoffs that occur at general merchandise stores.16 I operationalize this intuition by using the occupation-by-industry employment shares to estimate layoff exposure within a given occupation group, county, and academic year. Specifically, I estimate the number of layoffs in occupation group g in county c in academic year t as:

    \text{Layoffs}_{gct} = \sum_{k} \alpha_{gkt} \text{Layoffs}_{kct} \qquad (2.2)

where Layoffs_{kct} is the number of layoffs in industry k in county c in academic year t, which is identified in the mass layoff data. These measures take into account both the occupations that likely experience layoffs and the size of the layoff events occurring in a given county and year. For example, consider Kalamazoo County during the 2012 academic year. During this year, three firms reported mass layoffs: Hostess Brands, a food manufacturer (15 layoffs); International Paper, a paper manufacturer (77 layoffs); and OneWest Bank, a credit intermediary (168 layoffs).17 In this same year, community college business occupations, i.e., business occupations which community college graduates can enter, accounted for 6.7% of employment in food manufacturing, 10.9% of employment in paper manufacturing, and 44.5% of employment in credit intermediaries nationally. As such, a reasonable estimate of the number of business occupation layoffs reported under the WARN system in Kalamazoo County during the 2012-2013 academic year is 0.067(15) + 0.109(77) + 0.445(168) ≈ 84.18

2.4.2 Distribution of Layoffs Across Occupations

Table B.1.7 provides summary statistics on the layoffs occurring in Michigan counties between the 2001 and 2017 academic years.

14The BLS only began publishing state-specific estimates in 2012 and cautions that they are subject to more error than the national-level estimates. Nevertheless, I also construct the α values using Michigan-specific data and find a strong correlation with my preferred nationally-representative estimates. Appendix Figure B.1.4 plots the α values for each community college occupation group using 2016 national and 2016 Michigan data. The figure shows a strong correlation between the two measures, with a Pearson coefficient of 0.95.
15Appendix Table B.1.5 presents the three largest average values of α for each occupation group.
16In Appendix Table B.1.6, I compute the correlation between the α values across the six community college occupation groups. Most correlations are negative, indicating that different community college occupations are concentrated in different industries and, therefore, will be affected by different layoff events. Only two correlations are positive: business and STEM occupations, and health and other occupations.
17Note that the Hostess Brands layoff is below the 50 job loss threshold for required WARN reporting. Firms sometimes voluntarily report smaller layoffs, particularly when they are reporting simultaneous layoffs at facilities across the state. In Section 2.5.3, I repeat the empirical specifications only using layoffs that meet the 50 job loss threshold and obtain very similar results to the main specification.
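The exposure measure in equation (2.2) is a share-weighted sum of industry layoffs. A short sketch using the Kalamazoo figures from the text (the shares are the national business-occupation employment shares quoted above):

```python
# Industry layoff events in one county-year (the Kalamazoo County, 2012
# academic year example), paired with the business-occupation employment
# share (alpha) of each industry.
events = [
    ("food manufacturing", 15, 0.067),
    ("paper manufacturing", 77, 0.109),
    ("credit intermediaries", 168, 0.445),
]

# Equation (2.2): Layoffs_gct = sum over k of alpha_gkt * Layoffs_kct
business_layoffs = sum(share * layoffs for _, layoffs, share in events)
print(round(business_layoffs))  # 84
```

The same weighted sum, run over every occupation group and industry, produces the full set of county-by-year exposure measures.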
In addition to estimating the number of layoffs occurring in community college occupations, I use equations (2.1) and (2.2) to generate the number of layoffs occurring in low-skilled occupations that require less than an associate's degree and the number of layoffs occurring in high-skilled occupations that require more than an associate's degree. These layoff measures correspond to the types of occupations students would expect to enter if they did not pursue any postsecondary education or if they obtained four-year college degrees. Panel A presents summary statistics on the number of layoffs occurring per 10,000 working-age residents in a given county, year, and occupation group.19 On average, a county-year observation with 10,000 working-age residents experiences 5.3 layoffs in low-skilled occupations, 4.1 layoffs in middle-skill community college occupations, and 1.3 layoffs in high-skilled occupations. Among the community college occupations, 2.1 layoffs occur in the skilled trades, 1.0 occur in business, 0.5 occur in law enforcement, 0.3 occur in STEM, 0.2 occur in health, and 0.1 occur in other community college occupations. There is substantial variation in the number of layoffs occurring in different occupations, with the standard deviations for each category far exceeding the means. For example, the number of skilled trade layoffs occurring in a county ranges from 0 to nearly 96 per 10,000 working-age residents.

18To illustrate more examples of county layoffs, Appendix Table B.1.8 provides information on the three county-year pairs with the largest amount of per capita layoffs in each occupation group from 2001 to 2017.
19I define working-age residents as those aged 20 to 64 and obtain annual county-level estimates of this population from the Census Bureau's Population Estimates Program (https://www.census.gov/programs-surveys/popest.html). The average county-year observation in the data has 71,131 working-age residents.
Panel B then calculates the share of layoffs occurring in each category for county-year observations that experience non-zero layoffs. Across the time frame, 369 county-year observations (26%) experience layoffs. On average, 51% of layoffs are in low-skilled occupations, while about 37% occur in middle-skill occupations and 11% occur in high-skilled occupations. Figure B.1.5 further highlights the variation in layoffs across counties by plotting the layoffs that occur in each occupation group in each county between 2001 and 2017. I do not include counties that do not experience layoffs over this time frame and order all other counties by their average working-age population. The left-hand panel plots the total number of layoffs per 10,000 working-age residents in each occupation group, while the right-hand panel shows the share of layoffs occurring in each occupation group. The total number of layoffs varies substantially across counties, with both small and large counties experiencing a high number of local labor market shocks over the time frame. For example, the two counties that experience the most per capita layoffs are Ingham County, which is home to the state capital of Lansing and has about 200,000 residents, and Ontonagon County, which has only 4,000 residents. The share of layoffs occurring in each occupation group also varies considerably across counties, further emphasizing the importance of separating layoffs by the types of jobs they affect.

2.4.3 Potential Measurement Error

Because the layoff data do not contain information on the occupations of laid-off workers, the layoff measures I construct rely on the distribution of occupations across industries. Implicitly, these measures assume that layoffs in an occupation are proportional to its national employment shares in industries that experience layoffs.
Any deviation of layoffs from these proportions could lead to measurement error in the layoff terms, whereby I inaccurately classify layoffs as affecting one occupation group when, in reality, they affect another. For example, suppose that a hospital reports a mass layoff of 100 workers. Based on industry-by-occupation shares, I estimate that about 55 layoffs should affect community college health occupations, while only about 8 should affect community college business occupations. However, suppose that the hospital were to lay off only its billing or financial services department. This type of layoff would disproportionately affect business occupations rather than health occupations, causing me to overstate the effect of the event on health occupations and understate the effect on business occupations. More formally, suppose that a single layoff in occupation X occurs. Further, suppose that, with probability ε, I incorrectly classify this layoff as affecting occupation Y. Then, the estimated effect of the layoff on the probability that a student chooses program X will be:

    \hat{\delta}_{XX} = (1 - \varepsilon)\delta_{XX} + \varepsilon\delta_{XY}

where δXX is the true effect of layoffs in occupation group X on enrollment in group X programs and δXY is the true effect of layoffs in occupation group Y on enrollment in group X programs. Because δXX ≤ 0 (layoffs deter students from entering related programs) and δXY ≥ 0 (students substitute into other programs), the estimated response will be of a smaller magnitude than the true response and could even be positive if either ε or δXY is sufficiently large. Correspondingly, the estimated effect of the layoff on the probability that a student chooses program Y will then be:

    \hat{\delta}_{YX} = (1 - \varepsilon)\delta_{YX} + \varepsilon\delta_{YY}

where δYY is the true effect of layoffs in occupation group Y on enrollment in group Y programs and δYX is the true effect of layoffs in occupation group X on enrollment in group Y programs.
Because δYX ≥ 0 and δYY ≤ 0, the estimated term will be biased downward toward zero and could be negative if either ε or δYY is sufficiently large. Given the non-classical nature of this measurement error and the fact that ε is unknown, there is no straightforward way to empirically correct for it. However, there are circumstances where measurement error is less likely to occur. Specifically, plant and prison closures are likely to affect all jobs contained within a given facility and, therefore, should align more closely with the industry-by-occupation employment shares than layoffs that only affect a subset of jobs. In Section 2.5.3, I conduct the empirical analysis using only layoffs that are a result of facility closures and find quite similar results to my main specification.

2.5 Effect of Job Losses on Enrollment in Related Programs

The conceptual framework in Section 2.2 outlines two key outcomes of interest for the empirical analysis: (1) the effect of local job losses on enrollment in related community college programs, and (2) the corresponding substitution into other postsecondary options (including direct labor market entry) if students are indeed deterred from entering related programs.20 I begin by estimating the average effect of job losses on enrollment in related community college programs. Then, in Section 2.6, I consider heterogeneous effects across occupation groups and document how students substitute between postsecondary programs in response to job losses.
2.5.1 Empirical Approach

I create measures of program enrollment at the county-year-program level and estimate specifications of the following form:

    \text{Enroll}_{gct} = \alpha + \text{Layoffs}_{gct}\beta + X_{ct}\Gamma + \theta_{gc} + \delta_{gt} + \varepsilon_{gct} \qquad (2.3)

where Enroll_{gct} is the number of students from county c and cohort t who enroll in community college programs in group g, per 100 high school graduates, and Layoffs_{gct} is a vector of the number of layoffs per 10,000 working-age residents in occupation group g that may affect cohort t in county c. I consider two sources of variation in layoffs: timing and location. That is, I consider how students respond to layoffs that occur at different points during their pre-college years and that occur in different geographic areas. The vector X_{ct} contains time-varying county control variables that may affect students' enrollment choices, such as the average test scores of the cohort or the unemployment rate. θ_{gc} is a program-by-county fixed effect that accounts for unobserved differences in program preferences across counties. δ_{gt} is a program-by-cohort fixed effect that captures unobserved changes in program preferences over time. Finally, ε_{gct} is an idiosyncratic error term. Throughout the analysis, I cluster all standard errors at the county level. The fixed effects capture two important sources of unobserved heterogeneity: differences in preferences for community college programs across counties and across time. The vector of controls further accounts for changes in economic conditions across counties and time. As such, the identifying assumption for β to represent the causal effect of job losses on students' choices is that there are no unobserved changes in preferences at the county-program level that are correlated with job losses.

20In Appendix B.3, I further consider how related educational outcomes, such as delayed enrollment or program retention, respond to layoffs.
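The role of the two fixed effects in equation (2.3) can be illustrated with a small simulation. This is a minimal sketch on fully simulated data with a single scalar layoff measure (no controls, no clustering), showing only the within (double-demeaning) logic that removes program-by-county and program-by-cohort effects in a balanced panel:

```python
import numpy as np

# Simulated balanced panel: G program groups x C counties x T cohorts.
# All quantities are made up for illustration; this is not the paper's data.
rng = np.random.default_rng(0)
G, C, T = 6, 80, 15
beta = -0.012  # true effect of one layoff per 10,000 residents

layoffs = rng.exponential(scale=2.0, size=(G, C, T))  # Layoffs_gct
theta_gc = rng.normal(size=(G, C, 1))                 # program-by-county FE
delta_gt = rng.normal(size=(G, 1, T))                 # program-by-cohort FE
enroll = beta * layoffs + theta_gc + delta_gt + rng.normal(0, 0.005, (G, C, T))

def within(a):
    """Demean within (g, c) and (g, t); exact for a balanced panel."""
    return (a - a.mean(axis=2, keepdims=True)
              - a.mean(axis=1, keepdims=True)
              + a.mean(axis=(1, 2), keepdims=True))

x, y = within(layoffs), within(enroll)
beta_hat = (x * y).sum() / (x * x).sum()
print(round(beta_hat, 4))  # close to the true -0.012
```

The transformation zeroes out both fixed-effect components exactly, so the slope on the demeaned data recovers the true coefficient up to noise.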
This assumption rules out the possibility that, for example, firms lay off workers because they know the next cohort of high school graduates has different preferences for college education than the last cohort. While such a phenomenon seems unlikely, the assumption could be threatened if there are county-specific trends in occupation-specific job prospects and program preferences. Thus, I also estimate specifications that include county-by-program linear time trends. Similarly, layoffs may not represent true changes in occupation-specific employment conditions if job losses are absorbed by increased employment in nearby counties. For this reason, I estimate specifications that interact the cohort-by-program fixed effects with commuting zone (CZ) fixed effects to account for any unobservable changes in an occupation group's employment in a broader geographic region.21

2.5.2 Main Results

Table B.1.9 presents estimates of equation (2.3), measuring layoffs at different times during a cohort's academic career. Column (1) includes only layoffs occurring during a cohort's senior year of high school: the time period during which students must decide what educational program, if any, they will enter following graduation. The point estimate is negative and statistically significant, indicating that an additional layoff per 10,000 county residents during this year reduces enrollment in related programs by 0.012 students per 100 graduates, or about 0.012pp. There are several ways to interpret this estimate. At the mean enrollment rate of 1.5%, this estimate represents a 0.8% decrease in enrollment in related programs. Correspondingly, a one standard deviation increase in layoff exposure reduces enrollment in related programs by 3.83% of the mean.

21Commuting zones are groups of counties that reflect a local labor market (see: https://www.ers.usda.gov/data-products/commuting-zones-and-labor-market-areas/). Throughout the analysis, I use the 1990 CZ delineations.
Alternatively, doubling the amount of per capita layoff exposure the average county-cohort pair experiences reduces enrollment by about 0.6%. These estimates imply that, for the average county, 52 workers being laid off in a given occupation induces one fewer student to enroll in a related program.22 Columns (2) and (3) then add measures of layoffs occurring in earlier years. The estimate on layoffs occurring in a cohort's senior year of high school remains negative and statistically significant, but there is little effect of layoffs occurring prior to this year. The largest point estimate comes from layoffs occurring in students' sophomore year of high school, but this effect is about half the size of the effect of layoffs occurring in the senior year of high school and is not consistently statistically significant. These results indicate that students primarily respond to layoffs occurring in the year leading up to their postsecondary decision point. Such evidence is consistent with a growing literature highlighting the importance of salience in decision-making (Mullainathan, 2002; Gennaioli and Shleifer, 2010), and particularly, the sensitivity of college major choice to recent events (Xia, 2016; Patterson et al., 2019). Finally, Column (4) adds a measure of layoffs occurring in the year following a cohort's high school graduation. Because I restrict the analysis to students' first program choices within six months of high school graduation, including this measure serves as a natural placebo test: these layoffs have not occurred when students make their postsecondary choices, and thus, should not affect enrollment in related vocational programs. Indeed, I find that they do not. The point estimate on this variable is positive, but close to zero and statistically insignificant. Meanwhile, the estimate on layoffs occurring during a cohort's senior year of high school remains negative, statistically significant, and close to -0.012.
22I obtain this estimate by re-estimating equation (2.3) with the dependent variable scaled by the average number of graduates in a county and the independent variable scaled by the average number of working-age residents in a county. The β parameter then represents the effect of an additional layoff on enrollment in the average county. Thus, 1/β provides the number of layoffs needed to induce one student not to enroll in the average county.

Next, I consider how layoffs in other areas of the state affect students' program enrollment decisions. To do so, I estimate equation (2.3) without including the occupation group by cohort fixed effects (δgt), as this term absorbs any statewide changes in student preferences for a program, including the effects of statewide layoffs. Table B.1.10 presents these results. Column (1) includes only layoffs occurring during a cohort's senior year of high school within their own county. This specification produces a very similar estimate to the main specification in Table B.1.9, despite the lack of a program-by-year fixed effect. Column (2) then adds a measure of layoffs occurring in the rest of the state. The coefficient on this measure is close to zero and statistically insignificant, indicating that, on average, layoffs occurring elsewhere in the state do not affect students' program choices. Column (3) then separates this measure into layoffs occurring elsewhere in the county's commuting zone and layoffs occurring outside of the commuting zone. The coefficient on layoffs occurring elsewhere in the commuting zone is negative, indicating that students also respond to layoffs occurring outside of their county but in their general area of the state. However, the coefficient is smaller than the coefficient on county layoffs and is not statistically significant, again indicating that salience plays a role in students' decision-making process and that students primarily respond to layoffs that occur in their immediate local area.
2.5.3 Robustness

Figure B.1.6 presents several robustness checks of the main specification from column (1) in Table B.1.9: the effect of layoffs in a student's county during her senior year of high school on enrollment in related programs. First, Panel A shows how the results change when including different control variables in the X_{ct} vector. Including the number of layoffs occurring in low-skill and high-skill occupations, either together or separately, does not meaningfully change the estimated coefficient. Similarly, replacing the vector of covariates with a county-by-cohort fixed effect produces a nearly identical estimate. Next, Panel B estimates specifications with county-by-program linear time trends and program-by-year-by-CZ fixed effects.23 These specifications also produce similar estimates to the main specification, indicating that unobserved changes in local economic conditions are not driving the results. Panel C then shows how the estimates change when dropping events that are the result of mass layoffs rather than plant closings, or events that report fewer than the required 50 job losses. The estimates are similar when using all layoffs and when using only layoffs that are a result of closings. Moreover, the point estimate using only closings is slightly larger in magnitude, which is consistent with the expected effects of measurement error outlined in Section 2.4.3. I also find quite similar estimates when only including layoffs that reach the 50 job loss threshold, indicating that the voluntary reporting of smaller layoff events does not contaminate the main results. Finally, Panel D estimates non-linear specifications that can better handle fractional dependent variables.
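One transformation suited to fractional outcomes with zeros is the inverse hyperbolic sine. A quick numerical check of its two key properties (generic numbers, not the paper's data): it is defined at zero, and it tracks log(2x) for large x, which is why coefficients can be read as approximate semi-elasticities:

```python
import numpy as np

# The inverse hyperbolic sine: asinh(x) = log(x + sqrt(x^2 + 1)).
# Unlike log, it is defined at zero, and for large x it behaves like log(2x).
x = np.array([0.0, 0.5, 10.0, 1_000.0])
ihs = np.arcsinh(x)

print(ihs[0])                    # 0.0  (zeros are allowed)
print(ihs[3], np.log(2 * x[3]))  # nearly identical for large x
```

This is only a property check of the transformation itself, not the elasticity adjustments from Bellemare and Wichman (2019) used in the analysis.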
First, I estimate equation (2.3) using the inverse hyperbolic sine of a county's program enrollment as the dependent variable.24 I then estimate Poisson and fractional logit (Papke and Wooldridge, 1996) specifications.25,26 All specifications produce similar semi-elasticities to the main linear specification, providing evidence that functional form selection is not driving the results.

23In all specifications that include year-by-CZ fixed effects, Monroe County is dropped from the analysis because all other counties in its commuting zone are in Ohio.
24The inverse hyperbolic sine (IHS) function approximates the log function but allows values of zero (Burbidge et al., 1988). I use the transformations proposed by Bellemare and Wichman (2019) to estimate elasticities at the mean values of the dependent and independent variables.
25In the Poisson specification, the dependent variable remains the share of students from a given county and cohort who enroll in a given program (rather than a raw count variable). This specification may be interpreted in the same way as estimating a linear model with log program enrollment as the dependent variable, controlling for log total vocational enrollment, and restricting its coefficient to equal 1. However, like the IHS specification, the Poisson approach allows for the inclusion of dependent variables equal to zero. See Lindo et al. (2018) for more details.
26The fractional logit specification is analogous to estimating a standard logit demand specification where the dependent variable is the log of the enrollment share, but allows for the inclusion of county-program-years where no students enroll in a given program.

2.6 Substitution Effects

The results in Section 2.5 indicate that fewer students enroll in community college programs when exposed to related job losses. This response primarily occurs when the job losses take place in a student's own county during her senior year of high school.
In order to better understand how this response may affect students' longer-run outcomes, I now estimate how these job losses affect students' decisions to enroll in other postsecondary options.

2.6.1 Substitution out of Vocational Programs

I begin by estimating how layoffs in community college occupations affect students' decisions to enroll in vocational community college programs overall. To do so, I estimate the following equation:

    \text{VocationalEnroll}_{ct} = \alpha + \sum_{g=1}^{6} \beta_g \text{Layoffs}_{gc,t-1} + X_{ct}\Gamma + \theta_c + \delta_t + \varepsilon_{ct} \qquad (2.4)

where VocationalEnroll_{ct} is the number of students from county c and cohort t, per 100 graduates, who enroll in vocational programs at community colleges. The vector of layoff variables, Layoffs_{gc,t-1}, captures the number of layoffs, per 10,000 working-age residents, that occur in community college occupation group g in county c during cohort t's senior year of high school. As in equation (2.3), the vector X_{ct} contains time-varying county control variables that may affect students' choices, including the number of layoffs that occur in non-community college occupations. θ_c is a county fixed effect that absorbs county-specific preferences for different types of postsecondary education (as θ_{gc} does in the previous estimating equation) and δ_t is a cohort fixed effect that accounts for changing preferences over time (as δ_{gt} does in the previous estimating equation). ε_{ct} is the error term. I continue to cluster all standard errors at the county level. The β vector identifies how layoffs in different types of occupations affect students' decisions to enroll in related types of college programs. The identifying assumption is that, after controlling for secular trends through the cohort fixed effects, any within-county variation in layoffs is uncorrelated with within-county variation in unobserved college preferences.
As in Section 2.5, this assumption seems reasonable, but could be threatened if there are unobserved changes in preferences or economic opportunities over time. Therefore, I also estimate specifications with county-specific linear time trends or cohort dummies interacted with commuting zone fixed effects. Table B.1.11 presents the estimates of equation (2.4). Column (1) is the baseline specification, column (2) includes county-specific linear time trends, and column (3) includes cohort-by-CZ fixed effects. Across the three columns, the effects of layoffs are small and none are statistically significant at the 5% level.27 Moreover, in all specifications, I fail to reject the joint hypothesis that all six coefficients are equal to zero, indicating that layoffs in community college occupations do not affect enrollment in vocational programs. In Table B.1.13, I further consider whether layoffs in community college occupations affect the composition of students enrolling in vocational programs by regressing mean demographic values of vocational students on the vector of layoff measures. I find little evidence that layoffs affect who enrolls in vocational programs, and, in all specifications, I fail to reject the joint hypothesis that the coefficients on all community college layoff terms are equal to zero. Similarly, in Table B.1.14, I estimate how layoffs in community college occupations affect credit completion within vocational students' first year of community college enrollment. I find no evidence that layoffs affect total credit completion, nor completion of vocational versus non-vocational courses.28 Taken together, these findings show that layoffs in community college occupations do not dissuade students from enrolling in community colleges and pursuing vocational education programs, nor do they change students' intensity of enrollment. Thus, the response documented in Section 2.5 must come from students changing which types of vocational programs they pursue.

2.6.2 Substitution Between Vocational Programs

Because job losses do not deter students from entering vocational community college programs overall, I now consider how students substitute between vocational programs in response to layoffs.

27In Table B.1.12, I show that, overall, layoffs increase college enrollment. This finding is consistent with prior work showing that college enrollment increases when local economic conditions worsen. I further show that this increase in college enrollment is concentrated in programs that should lead to four-year college degrees, including non-vocational programs at community colleges, while layoffs slightly decrease enrollment in community college vocational programs. This finding is slightly different from Hubbard (2018), who also uses Michigan data and finds that layoffs predominantly increase enrollment in community colleges. However, he uses an earlier sample (2002-2011 academic years) and measures layoffs within a 30-mile radius of a student's high school rather than at the county level, which could explain the differences in our results.
28I use course codes and information from community college catalogs to divide all courses into vocational and non-vocational groups. I define vocational courses as those in the same fields that are included in the six vocational program groups of interest, while all other courses are considered non-vocational.
I restrict the sample to students who enroll in vocational programs and estimate the following system of six equations:

    \text{ProgramEnroll}_{jct} = \alpha + \sum_{g=1}^{6} \beta_g \text{Layoffs}_{gc,t-1} + X_{ct}\Gamma + \theta_c + \delta_t + \varepsilon_{ct} \qquad (2.5)

where ProgramEnroll_{jct} is enrollment in occupation group j among students from county c and cohort t, per 100 students enrolling in vocational programs, and Layoffs_{gc,t-1} is the number of layoffs in occupation group g in county c occurring in school year t − 1, per 10,000 working-age residents in the county.29 The vector X_{ct} contains the same variables as in equation (2.4), θ_c is a county fixed effect, δ_t is a cohort fixed effect, and ε_{ct} is the error term. I again cluster all standard errors at the county level. The coefficient β_g will represent the "own-layoff" effect when j = g and a "cross-layoff" effect when j ≠ g. As predicted in Section 2.2, the own-layoff terms should be negative because layoffs should deter students from enrolling in related programs. The cross-layoff terms should be positive since students would then substitute between programs, but could be negative if there is some measurement error. Moreover, because the dependent variable shares must sum to 100, the sum of a given β_g term across the six enrollment outcomes must equal 0. This restriction implies that any decrease in enrollment in a given program group due to related layoffs must be offset by students enrolling in other vocational community college programs.

29Because the same regressors appear in every equation and there are no cross-equation restrictions, estimating each equation separately is algebraically equivalent to jointly estimating the system using feasible generalized least squares (Wooldridge, 2010).
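The adding-up restriction is a mechanical property of least squares: because the outcome shares sum to 100 in every observation, the coefficient on any regressor must sum to zero across the six equations. A small simulation on made-up data, estimating equation by equation as footnote 29 permits, verifies this:

```python
import numpy as np

# Made-up data: one regressor (a layoff measure) and J outcome shares
# that sum to 100 in every observation, as in equation (2.5).
rng = np.random.default_rng(1)
n, J = 200, 6
x = rng.exponential(scale=1.0, size=n)

raw = np.abs(rng.random((n, J)) + 0.1 * np.outer(x, rng.normal(size=J)))
shares = 100 * raw / raw.sum(axis=1, keepdims=True)  # each row sums to 100

# Estimate each equation separately by OLS with an intercept.
X = np.column_stack([np.ones(n), x])
slopes = [np.linalg.lstsq(X, shares[:, j], rcond=None)[0][1] for j in range(J)]

print(round(abs(sum(slopes)), 8))  # 0.0: slopes sum to zero mechanically
```

Because the sum of the six dependent variables is the constant 100, regressing that sum on x yields a zero slope, and by linearity of OLS the per-equation slopes must add to it.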
The identifying assumption for the β_g terms to represent causal effects of layoffs on students' choices is that, conditional on all other variables, layoffs in occupation group g must be uncorrelated with unobservable determinants of enrollment in program group j. When j = g, this assumption imposes that occupation-specific layoffs are not correlated with changing preferences for corresponding programs within a county. When j ≠ g, the assumption is that occupation-specific layoffs are not correlated with changing preferences for other programs within a county. As in the previous sections, unobserved changes in preferences or economic opportunities could violate this assumption, so I again estimate specifications with county-specific linear time trends or cohort dummies interacted with commuting zone fixed effects. Table B.1.15 presents the substitution matrix created from estimating equation (2.5) for each of the six occupation groups.30 The bold diagonal terms represent the effect of an additional layoff per 10,000 county residents in occupation group g on enrollment in related programs. For example, an additional layoff per 10,000 county residents in business occupations reduces enrollment in business programs by 1.02 students per 100 enrollees, or by 1.02pp. An analogous increase in layoffs reduces enrollment in health programs by 0.61pp, in law enforcement programs by 0.15pp, and in other programs by 0.81pp, and by smaller but still negative amounts in the skilled trades and STEM. In the bottom panel of the table, I present the own-layoff elasticities at the mean values of both the dependent and independent variables. An additional layoff per 10,000 working-age county residents reduces enrollment in related programs by between 0.6% and 4.7%, with the largest statistically significant effects coming from the business and health fields.
Moving horizontally across the columns shows how layoffs induce students to substitute into other types of vocational programs. For example, an additional business layoff per 10,000 county residents increases enrollment in law enforcement programs by about 1.7pp. This coefficient shows that business layoffs induce students to substitute away from business programs and towards law enforcement programs. Similarly, students primarily substitute from health programs into other 30The sample consists of 657 (98.9%) county-cohort pairs where at least one student enrolls in vocational programs. Restricting the sample to counties that have non-zero vocational enrollment in every year of the data produces nearly identical results. 57 programs when there are health layoffs. In Table B.1.16, I further disaggregate the “other” category and find that most of the substitution occurs in social service programs, such as childcare, although there is also statistically significant substitution into arts and media programs and personal care and culinary programs. Although not statistically significant, the estimates further suggest that students substitute from law enforcement programs towards business, STEM, and health programs when there are law enforcement layoffs. 2.6.3 Explaining Substitution with Occupation Characteristics While it is interesting to document that health layoffs induce students to substitute towards pro- grams in the “other” category, this finding raises yet another question: why do students substitute towards these fields? Based on the conceptual framework presented in Section 2.2, students should substitute into their “next best” alternative program. Given that programs are closely tied to oc- cupations, the next best programs are likely to share similar occupation characteristics. 
For example, health programs and several programs in the other category, such as those training childcare professionals, focus on serving one's community and require a high level of person-to-person interaction, so it seems reasonable that students would substitute between these programs. To empirically assess the extent to which students substitute into similar programs, I use data on occupation characteristics from the U.S. Department of Labor's Occupational Information Network (O*NET), which contains a wealth of information on worker and job characteristics, including the skills required in different occupations. I characterize community college program groups using measures of three dimensions of skill requirements for related occupations: cognitive skills, social skills, and technical skills. The cognitive skill category contains ten measures of skills "that facilitate learning or the more rapid acquisition of knowledge," such as mathematics, reading comprehension, and writing. The social skills category contains six measures of skills that are "used to work with people to achieve goals," such as negotiation and service orientation. The technical skills category contains eleven measures of skills "used to design, set-up, operate, and correct malfunctions involving application of machines or technological systems," such as repairing and programming. For each occupation and skill measure, O*NET reports a standardized importance score and a standardized level score. Both measures range from 0 to 100, but each provides different information. The importance score describes how important a particular skill is to an occupation, with higher values indicating more importance. The level score characterizes the degree to which the skill is required to perform the occupation, with higher values indicating a higher requirement. I use these data elements to create a Euclidean distance measure that identifies program groups that require similar skills.
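To make the construction concrete, the importance-weighted Euclidean distance defined in equation (2.6) below can be sketched as follows; the skill vectors here are randomly generated stand-ins for the enrollment-weighted O*NET scores, and the normalization is done within a single shocked group for simplicity.

```python
import numpy as np

def skill_distance(importance_s, level_s, level_p):
    """Importance-weighted Euclidean distance between program group p and
    the shocked program group s, following equation (2.6)."""
    return np.sqrt(np.sum(importance_s * (level_p - level_s) ** 2))

# Toy inputs: 27 skill items, scored on a 0-100 scale. Real inputs would
# be enrollment-weighted averages of O*NET scores across each group's
# occupations; the group names here are just examples.
rng = np.random.default_rng(0)
n_skills = 27
importance_s = rng.uniform(0, 100, n_skills)
levels = {g: rng.uniform(0, 100, n_skills)
          for g in ["business", "health", "law_enforcement", "stem"]}

shocked = "business"
raw = {g: skill_distance(importance_s, levels[shocked], lv)
       for g, lv in levels.items()}

# Standardize so the least similar program group is at distance 1,
# as described in the text; the shocked group is at distance 0.
max_d = max(raw.values())
distance = {g: d / max_d for g, d in raw.items()}
```

Weighting the squared level gaps by the shocked group's importance scores means a program only counts as "far" from group s when it differs on the skills that actually matter for careers in group s.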
The measure is similar to that used by O*NET to identify similar careers but, to my knowledge, has not previously been used to identify similar college programs. I define the distance between program group p and program group s, which experiences the labor market shock, as:

\text{Distance}_{ps} = \sqrt{\sum_{j=1}^{27} \text{Importance}_{js} \left( \text{Level}_{jp} - \text{Level}_{js} \right)^2} \quad (2.6)

where Importance_js is the importance of skill j for program group s, Level_jp is the required level of skill j for program group p, and Level_js is the required level of skill j for program group s. As a result, the programs that are most similar to program group s, in terms of the skills that are most important for careers in group s, will have the lowest distance measures.31 I standardize the measures such that the least similar pair of program groups has a distance measure of 1.

31 To create level and importance measures for program groups, I create a weighted average of all occupations that belong to the group, where weights are proportional to the total enrollment of Michigan students over the time frame of the data. For example, nursing receives a high weight in the health program group because it is one of the most popular programs.

Figure B.1.7 plots the coefficients in Table B.1.15 against this skill distance measure. Each panel shows the effect of a different type of layoff on enrollment in each program group. For example, the upper left panel shows that business layoffs decrease enrollment in business programs but increase enrollment in law enforcement programs, which is the most similar program group to business. A similar pattern emerges in the second panel, where health layoffs decrease enrollment in health programs but increase enrollment in law enforcement and other programs, both of which are fairly similar to health. Layoffs in law enforcement and other community college occupations also induce students to enroll in similar programs. However, when there are layoffs in STEM
and skilled trades, students are not substantially deterred from enrolling in related programs. This lack of a response may be due to the lack of nearby substitutes in which students could enroll. For example, all of the non-STEM program groups have a distance measure of 0.5 or greater, indicating that they require quite different skills than STEM occupations do. This difference is not surprising, as STEM occupations tend to require much stronger mathematical skills than non-STEM occupations. Figure B.1.8 provides further evidence that students substitute into similar programs by pooling all of the substitution effects and plotting them against their respective skill distance measures. The largest substitution effects appear at the left end of the x-axis, indicating that students mostly substitute into programs that are similar to those affected by layoffs. Moving across the x-axis, there is a downward slope showing that students are less likely to enroll in programs that require substantially different skills. A simple linear fit of the data indicates that moving from the most similar to the most different program group reduces the substitution effect by 0.55, where I measure effect sizes as the impact of an additional layoff per 10,000 county residents on enrollment per 100 vocational students.32 In Appendix B.4, I consider substitution patterns between more narrowly-defined program groups and find that the general pattern of students substituting towards similar programs still holds.

2.6.4 Heterogeneity & Robustness

Figure B.1.10 considers heterogeneous responses to layoffs by re-estimating the system of equations in equation (2.5) using different subgroups of students. First, in Panel A, I consider how the effects vary across genders. Because there is substantial sorting across genders in community college programs, it is reasonable to think that male and female students may respond differently to layoffs in various fields.
Indeed, I find that the responses to health layoffs are predominantly driven by female students, who account for nearly 80% of enrollment in health programs. The responses to business, skilled trades, STEM, and law enforcement layoffs tend to come from male students, who make up the majority of enrollment in these programs. However, the estimates for these fields are noisier and are not significantly different between male and female students.

32 In Figure B.1.9, I re-create the figure using alternate measures of skill distance. The results are quite similar, with an additional layoff per 10,000 county residents reducing the effect size by 0.73 when using only differences in skill levels and by 0.62 when using only differences in skill importance.

In Panel B, I show how the effects vary across urban and rural counties.33 This type of heterogeneity is particularly relevant in Michigan because a majority of the state's residents reside in urban areas, but those areas comprise little of the state's land area. Moreover, there are documented differences in racial composition, political leanings, and educational attainment across rural and urban areas in the state (Citizens Research Council of Michigan, 2018). I find that the responses to layoffs are predominantly driven by rural counties, except for law enforcement layoffs, which mostly affect urban counties. This strong response could be the result of geographic preferences among students in rural areas to remain in their local communities, or of differences in information networks in these areas. For example, rural news outlets may have fewer events to cover and, therefore, may devote more attention to a local layoff or business closure. Layoffs in rural areas may also be better indicators of future labor market prospects than layoffs in urban areas, particularly if an occupation's employment is heavily concentrated in one firm that then closes or downsizes.
I next perform a series of robustness checks that test the sensitivity of the results to alternative specifications. First, because scaling the dependent variable by the number of vocational students in a given county and cohort may introduce heteroskedasticity, I estimate the substitution matrix using the refined weighting scheme proposed by Solon et al. (2015). Panel A of Figure B.1.11 presents the own-layoff effects using this approach. The point estimates and corresponding standard errors are quite similar with or without weights. Second, because layoffs may be more likely to occur when a county is on a downward economic trajectory, Panel B of Figure B.1.11 shows how the estimates change when including county-specific linear time trends. The results are also quite similar with and without trends. I also estimate specifications that include cohort-by-commuting zone fixed effects to capture changing economic conditions or program preferences that are unique to geographic regions within the state. Panel C shows how the results change when including this additional set of fixed effects. Again, the estimates are quite similar to the main specification. Panel D then shows how the results change when dropping the 2009 cohort, who graduated during the height of the Great Recession in Michigan and may have faced additional challenges in both accessing higher education and entering the labor market. The estimates are somewhat noisier when I do not include this cohort, but the effect sizes remain similar. Panel E further shows how the estimates change when I drop from the analysis any student who enrolls in more than one program group.

33 I define urban counties as those that the U.S. Census Bureau classifies as "mostly urban" and define all other counties as rural. A list of Michigan's urban and rural counties is available here: https://www.mlive.com/news/2016/12/michigans urban rural divide o.html
The results are nearly identical when restricting the sample in this way. Finally, because the dependent variable represents county-level enrollment shares, I estimate several alternative specifications that are designed to handle fractional data. As in Section 2.5.3, I first estimate inverse hyperbolic sine, Poisson, and fractional logit specifications. I then implement a fractional multinomial logit specification that jointly estimates all coefficients in Table B.1.15, while imposing that each enrollment outcome must fall between 0 and 100 and that the shares must sum to 100 (Buis, 2017). In Panel F of Figure B.1.11, I compare the results from these specifications to the estimated elasticities obtained from the main linear specification. The semi-elasticities are quite similar across the specifications, with an additional layoff per 10,000 working-age residents reducing enrollment in related programs by up to 5% and effects varying across fields of study.

2.7 Conclusion

More than 8 million students enroll in public community colleges in the United States each year, with many entering vocational programs that prepare them for a continually evolving labor market. The returns to these programs vary substantially by field of study, but there is little evidence on how students choose which programs to pursue. In this paper, I study the extent to which students' program choices respond to changes in local labor market conditions in related occupations. To do so, I match detailed administrative data on students' educational decisions with establishment-level data on plant closings and mass layoffs in the state of Michigan. While previous researchers have used similar data to study how local economic conditions affect college enrollment, I provide the first analysis in the literature that matches layoffs to corresponding academic programs and considers how they affect what students study once they enroll in college.
I find that local labor market shocks deter students from entering related programs at community colleges. Instead, students shift their enrollment into other types of vocationally-oriented community college programs. Using rich data on occupation characteristics, I document that students primarily substitute into programs that lead to occupations requiring similar skills. However, when layoffs occur in fields that do not have clear substitutes, such as STEM occupations and the skilled trades, students are less likely to shift their enrollment to alternative programs.

These results have several policy implications for Michigan's community colleges and national education policy efforts. For example, colleges should prepare for students to enter different programs when there are local labor market shocks. Providing community colleges with the resources to expand the supply of alternative programs, particularly those with high labor market returns, could be beneficial to students. High schools and colleges should also carefully consider the type of labor market information they provide students. I find that students are particularly sensitive to local labor market conditions. However, it is not clear whether this responsiveness is a result of the salience of local events or of geographic preferences. Ideally, educators would urge students to consider both local and non-local labor market opportunities to make informed choices that best align with their geographic preferences and constraints.

Nevertheless, these results also have limitations. First, the majority of local labor market shocks I observe come during the aftermath of the Great Recession in a state that was particularly affected by the collapse of the automotive industry. While this setting produces substantial variation in local labor market conditions, the results may not generalize to future cohorts or other areas of the country.
Additional work analyzing how students respond to local labor market shocks in other contexts would be a valuable contribution to the literature. Second, my results are limited in that they apply only to the decisions of recent high school graduates. Adults enrolling in community college programs, especially those who lose their jobs during local labor market downturns, may have different preferences for program characteristics and may respond quite differently to local labor market shocks than younger students who are enrolling in college for the first time. Understanding the choices of this population and evaluating interventions meant to promote their employment and earnings are important areas of both future research and public policy.

CHAPTER 3

DO HEALTH INSURANCE MANDATES SPILLOVER TO EDUCATION? EVIDENCE FROM MICHIGAN'S AUTISM INSURANCE MANDATE

3.1 Introduction

How policy decisions in one area spill over to other areas in which there are no direct policy connections is a core question in economics. These spillovers often are unintended by policymakers, but they can have large impacts on how individuals respond to policy changes and on the resulting social welfare effects of those policies. The opportunity for these unintended spillovers is particularly large in the United States, where a large array of different government organizations at the federal, state, and local levels enact separate policies that interact with one another in complex ways. These interactions mostly have been studied with respect to the social safety net in the US.1 Little research has addressed spillovers into education, particularly with respect to the health care system.2 This lack of research is surprising, since education and health together accounted for 25.2% of GDP in 2017.
Health and education are strongly linked through their central role in the development of human capital, and there also are direct policy linkages through the special education system that serves students with disabilities. In this paper, we provide one of the first analyses of how health care policies spill over to the education sector by examining the effect of Michigan's autism insurance mandate on the educational services received by, and achievement outcomes of, students diagnosed with Autism Spectrum Disorder (ASD). While our analysis contains broad lessons for how health care policies affect educational services and outcomes, the specific focus on students with ASD also is of high importance. ASD is one of the fastest-growing developmental disabilities in the United States. The ASD diagnosis rate among eight-year-olds increased from 6.7 per 1,000 students in 2000 to 16.8 per 1,000 in 2014.3 Among students 3-21 years old, the rate of special education primary disability identifications with ASD rose from 0.2% in 2000 to 1.2% in 2015 (a 500% increase).

1 See, for example, Elwell (2018), Ham and Shore-Sheppard (2005), Yelowitz (1995), Moffitt and Wolfe (1992), and Blank (1989). These studies all find evidence that changing one program affects participation in other programs.

2 Recent work by Benson (2018) estimates the effect of special education participation on Supplemental Security Income receipt and shows strong evidence of interactions among these programs. There also is some research that shows how direct health interventions in public schools affect student health and educational achievement, but these studies do not identify spillover effects across programs or policy areas (e.g., Lovenheim et al., 2016; Reback and Cox, 2018; Buckles and Hungerman, 2018).
The overall student disability rate declined slightly over this time period, from 13.3% to 13.2%.4 A recent study using self-reports from 2016 found that among children aged 3 to 17, 2.8% were diagnosed with ASD at some point (Xu et al., 2019). Students with an ASD diagnosis are growing in absolute terms (617,000 children in 2015) and are an increasing proportion of all students with disabilities (9.2% in 2015, up from 1.2% in 2000). Further, students with ASD are some of the most expensive students to teach. Children with ASD typically have substantial learning disabilities that require intensive therapy services throughout childhood as well as coordination between the health care system and the education system. These students cost schools $8,610 more than the average non-ASD student (Lavelle et al., 2014), while the cost to families varies dramatically by health insurance coverage but has been estimated to be as high as $47,000 per year in the US.5

Applied Behavioral Analysis (ABA) is the main therapy used to treat students with ASD. It is not a "cure" but has been shown to substantially improve symptoms through behavioral modification therapy (Peters-Scheffer et al., 2011; Dawson et al., 2010; Virués-Ortega, 2010; Howlin et al., 2009; Eldevik et al., 2009; Foxx, 2008). It is most effective when implemented early in life and when done intensively, often for at least 20 hours per week. Because of the high cost of these therapies, schools and families often lack the resources to provide sufficient services to ASD students. Until recently, ABA and other autism therapies were often excluded from health insurance because they were considered "experimental" and/or "educational."

3 Source: https://www.cdc.gov/ncbddd/autism/data.html.

4 These tabulations are taken from the 2017 Digest of Education Statistics, Table 204.30. https://nces.ed.gov/programs/digest/

5 Source: https://www.special-learning.com/article/funding_overview.

Even when a therapy is not excluded, the coverage is uneven across health insurance plans, both in terms of what is covered and the ages for which therapies are covered. The lack of coverage for autism services in many private health insurance plans highlights that what treatments students receive relies on the interaction between school-based services and the health insurance plan to which a family has access. The interaction of health insurance and school special education services is not unique to ASD, as these issues are present for all student disabilities.6 Currently, there is very little understanding of how the health insurance and special education systems interact in the production of education services for students with disabilities. The need for expensive extra-curricular treatments that are unevenly covered by health insurance plans makes students diagnosed with ASD an informative group through which to study this interaction. We provide the first analysis in the literature of how mandating private insurance coverage of ASD treatments such as ABA affects special education diagnoses (including ASD), the educational supports students with ASD receive, and the educational outcomes of students diagnosed with ASD. Beginning in October 2012, Michigan required that all private state-regulated insurance plans cover ASD treatment services through age 18. Self-insured plans, while not mandated to cover ASD therapy, were provided generous financial incentives to do so. Medicaid also provided coverage, but only for children under age 6 due to a lack of funding.7 The difference between the Medicaid and private insurance coverage forms the basis of our empirical strategy. We use administrative data on all public K-12 students in the State of Michigan from the 2009-2010 to the 2014-2015 school years.
The data are extremely rich and include not only traditional test scores, demographics, and schools attended but also specific disability diagnoses and the services students receive in school through their Individual Education Plans (IEPs). The data do not include information on the health insurance plan under which each student is covered, so we rely on the close overlap of Medicaid and free/reduced-price lunch (FRPL) eligibility. Between 2008 and 2016, tabulations from the American Community Survey show that only 31% of FRPL-eligible students had private insurance coverage, while 89% of those not eligible for FRPL had private insurance.8 We use this overlap to estimate intent-to-treat models that examine how outcomes among non-economically disadvantaged students9 (who are mostly covered by private insurance) change when the mandate is enacted in 2012, relative to economically disadvantaged students (who are less likely to be covered by private insurance). To further increase the accuracy of our proxies for insurance coverage, we restrict our sample to students who are economically disadvantaged in all years that we observe them in grades 2 through 8 and those never observed during those grades as economically disadvantaged, conditional on being observed for at least two years. We use this sample because students who are persistently eligible for FRPL are the most disadvantaged (Michelmore and Dynarski, 2017).

6 Given the strong positive correlation between health and education and the central role both play in the development of human capital, health policy is likely to have an impact on education for non-disabled students as well. For example, Cohodes et al. (2016) show that Medicaid expansions led to higher educational attainment among affected cohorts.

7 As of January 2016, Michigan began covering all youth in Medicaid up to age 18. Our analysis thus ends prior to the Medicaid expansion to focus on the private insurance mandate.
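The sample restriction described above (students always, or never, flagged as economically disadvantaged, conditional on at least two observed years) can be sketched with hypothetical student-year records; the variable names below are ours for illustration, not the administrative data's.

```python
import pandas as pd

# Sketch of the proxy-sample construction described above, using
# hypothetical student-year records; `ed` flags economic disadvantage.
df = pd.DataFrame({
    "student": [1, 1, 1, 2, 2, 3, 3, 4],
    "ed":      [1, 1, 1, 0, 0, 1, 0, 0],
})

status = df.groupby("student")["ed"].agg(["mean", "size"])

# Keep students observed at least twice who are either always or never
# flagged as disadvantaged; switchers and one-year students are dropped.
keep = status[(status["size"] >= 2) & status["mean"].isin([0.0, 1.0])].copy()
keep["always_ed"] = keep["mean"] == 1.0
# Here students 1 (always) and 2 (never) are kept; 3 and 4 are dropped.
```

Dropping the switchers is what sharpens the insurance proxy: students who move in and out of FRPL eligibility are the ones whose Medicaid versus private coverage is most ambiguous.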
In a difference-in-differences setting, we first show that the mandate has little effect on the likelihood of receiving an ASD special education identification in grades 2 through 8.10 This is interesting in its own right, as private insurance can cover diagnostic services. However, most medical diagnoses of ASD occur before the age of six (Fountain et al., 2011), which is likely why we find no effect on ASD identification.11 The lack of any meaningful effect on overall ASD incidence supports a triple-difference strategy when we examine education services and achievement. We estimate how outcomes among ASD students who are not economically disadvantaged change relative to ASD students who are, and how this change relates to the change in outcomes among non-ASD students who differ in disadvantage status. We find that the insurance mandate reduces the set of special education resources ASD students receive and induces students to be placed in less restrictive environments.

8 Note that insurance coverage rates may sum to more than 100%, as some people remain eligible for Medicaid while enrolled in private plans.

9 "Economic disadvantage" status refers to students in poverty; the vast majority of students qualify based on eligibility for free or reduced-price lunch.

10 Generally, free/reduced-price lunch eligibility is more accurately measured in primary grades than in secondary grades, which is one benefit of focusing on younger students.

11 While medical diagnosis and identification of ASD for education purposes are similar, they are not the same. Some students may be diagnosed but not have an IEP, or may have a different primary identification. Alternatively, while it is extremely likely that a child with an ASD identification also has a medical diagnosis, the latter is not a necessary condition for the former.
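In stylized form, the triple-difference comparison described above can be written as below; the notation is chosen here for exposition and need not match the chapter's exact specification:

```latex
Y_{it} = \beta \,(\mathit{NonDis}_i \times \mathit{ASD}_i \times \mathit{Post}_t)
       + \gamma_1 (\mathit{NonDis}_i \times \mathit{Post}_t)
       + \gamma_2 (\mathit{ASD}_i \times \mathit{Post}_t)
       + \gamma_3 (\mathit{NonDis}_i \times \mathit{ASD}_i)
       + \lambda_1 \mathit{NonDis}_i + \lambda_2 \mathit{ASD}_i + \lambda_3 \mathit{Post}_t
       + X_{it}'\delta + \varepsilon_{it}
```

Here NonDis_i proxies for private insurance coverage, Post_t indicates the post-mandate period, and the coefficient of interest, β, captures the differential post-mandate change in outcomes for non-disadvantaged ASD students relative to both disadvantaged ASD students and the corresponding gap among non-ASD students.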
Among ASD students who are never disadvantaged, the mandate causes a 6.4 percentage point (9.5%) reduction in placement in resource room (pull-out) or cognitive impairment programs (self-contained classes for students with cognitive impairments) and a 3.4 percentage point (26.4%) increase in placement in no special education program. Furthermore, these ASD students are 2.3 percentage points (17.7%) less likely to be assigned an ASD-specialized teacher consultant (who provides oversight and support to the students' teachers and helps develop instructional plans), though they are slightly more likely to receive any special education support services. Taken together, these measures indicate that the private insurance mandate led to lower special education resource provision for affected students in schools.

Our data do not allow us to observe the use of ASD therapy services outside of school. Such data would be useful in assessing the costs and benefits of this policy and whether our results indeed reflect crowd-out of special education services rather than students not requiring as many services in school. To provide some evidence on whether the mandate generates crowd-out versus reducing the need for in-school services, we examine student test scores, which yield insight into the extent to which the mandate supports or detracts from student learning. There could be a negative effect if service crowd-out is more than 100% or if the privately provided services are of lower quality. Conversely, student learning may increase if overall services increase and/or if the quality of services provided increases. Additionally, providing ABA outside of school may facilitate more inclusion of ASD students in general education classrooms, which some research suggests is productive for learning among students with disabilities (Ruijs and Peetsma, 2009), and may allow more time to focus on direct instruction.
The inability to observe privately provided services precludes a direct analysis of these mechanisms, but we are able to identify the net effect of the policy on achievement. The achievement results thus provide suggestive evidence of the mechanisms at work.

We find little evidence of a net change in reading or math test scores. In our preferred model that uses non-disabled students as a comparison group, the 95% confidence interval for math in standard deviation units is [-0.046, 0.056], and in reading it is [-0.039, 0.075]. We thus can rule out anything but modest-to-small changes in math and reading scores due to the mandate. This finding suggests either that crowd-out was complete (and thus total services did not change) or that any reduction in services is balanced by the effects of being in a more inclusive general education environment. Nonetheless, these results suggest there was no adverse effect on students' academic performance from the crowd-out.

We also examine heterogeneous effects by gender, race, and grade level. Our results do not vary much across groups, but we do find that girls are more likely to be removed from resource/cognitive programs and more likely to be placed in no special education programs than are boys. Effects are similar for White and Asian versus Black and Hispanic students, but we lack power to estimate precise effects for the latter group. We also find that the effects only begin to appear in grade 2. This grade heterogeneity is sensible, as nearly all students in Kindergarten and many in grade 1 are 6 years old or younger and therefore receive increased private ASD services under the mandate regardless of their health insurance status. In terms of test scores, we find little evidence of heterogeneous treatment effects.

This paper contributes to several different strands of research. The first is the small literature that examines the effects of health care policies on student achievement.
Most of the prior literature focuses on Medicaid (Cohodes et al., 2016; Levine and Schanzenbach, 2009) or examines direct health interventions in schools (Lovenheim et al., 2016; Reback and Cox, 2018; Buckles and Hungerman, 2018). To our knowledge, this is the first paper to examine the interaction between the health insurance and special education systems. More specifically, our study is the first to identify causal effects of autism insurance mandates on the educational services disabled students receive and their subsequent educational achievement.

The second literature to which we contribute is the crowd-out of public goods by private provision (Bergstrom et al., 1986). Crowd-out of public services by private provision has been documented in several contexts, including Medicaid (Cutler and Gruber, 1996; Gruber and Simon, 2008), charitable donations (Payne, 1998; Gruber and Hungerman, 2007; Andreoni and Payne, 2011), religion (Hungerman, 2005), and school funding (Gordon, 2004). Ours is the first analysis to show that private health insurance mandates crowd out special education services in public schools. This is an important contribution because special education is by design at the intersection of publicly provided education and often privately provided health care. That changes to private insurance can affect the services that disabled students receive in public schools is a novel finding that has implications for health insurance policies and the funding and provision of special education services.

We also contribute to a growing body of work on policies surrounding ASD students. ASD is a very expensive disability to treat, with current estimates in the US indicating that it costs about $17,000 per year to treat a student with ASD through health care and special education services (Lavelle et al., 2014).
There also is suggestive evidence that ASD leads to lower labor force attachment and earnings among parents (Cidav et al., 2012), although identifying causal estimates is difficult in this context. The lifetime cost of supporting a child with ASD, including potential labor force effects among parents, is between $1.4 and $2.4 million in the US (Buescher et al., 2014). A large literature has arisen that examines the causes of the rise in ASD (see, e.g., Hansen et al., 2015; Matson and Kozlowski, 2011), but to date very little work has been done on what school or health policies can support the academic development of ASD students and how best to deliver services to them in a cost-effective manner.

Finally, we present direct estimates of the effect of the Michigan insurance mandate on special education services and academic outcomes. These mandates are growing in prevalence: 46 states (plus D.C.) currently have some form of regulation that requires ASD services to be covered by health insurance plans. However, the scope of what is covered and the ages of children included in the regulations vary considerably across states.12 A small literature has arisen that examines these mandates. Mandell et al. (2016), Barry et al. (2017), and Candon et al. (2019) find that these mandates increase treatment prevalence and spending for those with private insurance: ASD mandates lead to a $77 increase in monthly spending on ASD services. Chatterji et al. (2015) use a triple-difference design similar to ours and find that autism insurance mandates have no effect on financial burden, access to care, or unmet need for services. These results are suggestive of full crowd-out of services due to the mandate. The Michigan mandate is among the most expansive in terms of what must be covered and the ages of children included.

12 Regulations for each state can be found at http://www.ncsl.org/research/health/autism-and-insurance-coverage-state-laws.aspx.
Thus, our analysis is informative with respect to the potential for these types of policies to impact educational services and outcomes among students with ASD.

3.2 Background

3.2.1 Autism Spectrum Disorder and Therapy Options

Autism spectrum disorder (ASD) is a developmental disability that generates problems with social, emotional, and communication skills (Centers for Disease Control, 2018). The categorization combines disorders that were previously viewed as distinct (Autism, Asperger's Syndrome, and Pervasive Developmental Disorder, Not Otherwise Specified) and that psychiatrists now consider variations of the same spectrum of conditions. Children with ASD show many different symptoms and typically exhibit some but not others. Common symptoms include difficulty with social interactions, delayed speech and inability to communicate verbally, repetitive behaviors, and stimming. These can start to appear as early as 18 months of age, and diagnoses can be obtained as early as 24 months (Centers for Disease Control, 2018). Even so, diagnosis this early is uncommon. The median age of first diagnosis in the US as of 2012 was 4.2 years, and only 46% of children with ASD had a full evaluation prior to 3 years of age (Baio et al., 2018). ASD is considered a lifelong disorder. While there is no cure, there are treatments that can help alleviate symptoms and improve the ability of individuals with ASD to perform well behaviorally, both in school and in society more broadly. Children with ASD usually receive a variety of therapeutic interventions.
These often include occupational therapy, speech therapy, sensory integration therapy, and Applied Behavior Analysis (ABA) therapy.13 Students may receive these services from private practitioners, through the special education system, which we discuss further in Section 3.2.3, or through a combination of private and public providers. Applied Behavior Analysis (or Early Intensive Behavioral Intervention), which involves using positive reinforcement and repetitive application of behavioral situations in which cause and effect are outlined, has become one of the most widely used strategies for addressing autism.14 As noted above, there is substantial experimental and observational research showing ABA to be effective at improving educational and behavioral outcomes for children. Further, providing intervention early, when the child is very young, has been shown to be more effective than starting later (Zwaigenbaum et al., 2015; Dawson et al., 2010; Granpeesheh et al., 2009; Corsello, 2005). ABA therapy is typically provided by licensed Board Certified Behavior Analysts (BCBAs), many of whom work outside of the public school system.15 However, many students also receive some form of ABA therapy in school. For example, in 2011, prior to the insurance reform, 59% of public school educators in Michigan reported using ABA therapies with students with ASD (Ferreri and Bolt, 2011). Data on the costs of these interventions are sparse, but the therapies are generally considered to be quite expensive. Total costs of treatment combined with opportunity costs (e.g., lost work by a parent or caretaker) have ranged from $17,000 per year in the US to $44,000 in the UK and $68,000 in Sweden (Järbrink, 2007; Knapp et al., 2009; Lavelle et al., 2014). Additionally, estimates of medical expenditures for individuals with ASD indicate that they exceed those for individuals without ASD by $4,110 to $6,200 per year, 4 to 6 times larger than average (Shimabukuro et al., 2008). Given these large costs, insurance coverage is potentially a very important factor in whether children receive treatment. Cost-benefit analyses have shown ABA interventions to be highly cost effective over the long run. Jacobson et al. (1998) find lifetime benefits for the individual of up to $1 million. Ganz (2007) estimates the lifetime social costs of untreated autism at $3.2 million as of 2003, though it is unclear how much this can be mitigated by therapeutic treatment.

13 While there is some research on how nutritional changes can help, these studies are largely observational or small-sample experiments and show limited evidence of impacts on symptoms (Mari-Bauset et al., 2014).
14 https://www.autismspeaks.org/applied-behavior-analysis-aba-0.
15 Less than 30% of job postings for BCBAs come from the education sector, while nearly 60% come from the health care and social assistance sectors. See: https://www.bacb.com/wp-content/uploads/2017/09/151009-burning-glass-report.pdf

3.2.2 The Michigan Autism Insurance Mandate

Until recently, treatments for Autism Spectrum Disorder beyond therapies for co-morbidities, like speech and occupational therapy, were not commonly covered by health insurance. As a result, states started mandating coverage for ABA and related therapies. Today, 46 states and the District of Columbia have some coverage requirements for autism services. Even in these states, however, coverage can be limited. Affected children often have to go through a time-consuming evaluation process, and access may be available only in a few locations with long wait lists. Furthermore, coverage mandates do not extend to all health insurance plans: since self-insured firms are covered under Federal law, states have little ability to mandate coverage in these cases.
In 2012, the state of Michigan passed a law that expanded access to insurance coverage for children with ASD, with implementation starting in October 2012. The law had three main pillars. First, for people covered under state-regulated insurance plans, mainly employer-sponsored plans at small or medium-sized employers and individually purchased plans, the law mandated coverage for "evidence-based behavioral health treatment" (typically ABA, pharmaceuticals, psychiatric care, psychological care, and other therapies) from birth through 18 years of age for children with diagnosed ASD.16 Coverage requirements are generous: the maximum annual benefit starts at $50,000 for children under six and decreases with age to a floor of $30,000 at age 18. Co-pays, deductibles, and co-insurance rates cannot exceed those required by the individual's insurance plan for physical illness.17 A difficulty states often face in ensuring widespread coverage under Autism insurance mandates is that only a subset of insurance plans is subject to state regulation. Self-insured plans, mostly used by large employers, are covered under Federal law and so are not typically subject to state mandates. As of 2011, 61% of Michiganders with employer-provided coverage were in self-insured plans (Fronstin, 2012). Michigan addressed this gap via Public Act 101 of 2012, which set up a reimbursement fund for self-insured plans that provided benefits in line with those required of regulated insurance plans. Plans were permitted to request up to 100% of the claims from their beneficiaries for reimbursement.18 While there are no data on how many self-insured firms provided coverage under this law, the very generous reimbursement likely led to high take-up.

16 Diagnosis by a physician is required, and insurance companies are permitted to require that the evaluations be done through designated evaluation centers (Peters et al., 2014).
17 Michigan Public Acts 99 and 100 of 2012.
It is worth noting that the Autism Alliance of Michigan maintains a list of self-funded firms in Michigan that offer the insurance benefit, including many of the largest employers in the state, such as General Motors, Ford, Meijer (a supermarket chain), and Beaumont Health System, along with the state government and most major universities.19 Children not covered under employer-provided or individually purchased insurance plans in Michigan are almost all covered under Medicaid, including those covered through the Children's Health Insurance Program (CHIP). In 2014, 58% of children aged 0 to 18 in Michigan were covered by private insurance, while 39% were covered under Medicaid and only 3% were uninsured.20 The Michigan reform provided insurance coverage for ASD to Medicaid beneficiaries as of April 2013, but the benefits were considerably less generous than under the private insurance mandate, a key aspect of the reform for our identification strategy. In particular, while pharmacy, psychiatric, psychological, and co-morbid therapies like speech therapy were already covered prior to the reform, the only ASD-specific therapy the law added to coverage is ABA. Other evidence-based therapies are not covered. This in itself is only a minor difference, as most therapy for ASD is based on ABA.

18 In FY2016, the fund ran out of money, and claim processing has been suspended since then. While it is possible some firms have since removed their benefits due to the lack of reimbursement, our data cover only through the 2015-16 school year and firms typically make insurance coverage decisions toward the end of the calendar year, so this is unlikely to affect our results.
19 A full list of self-insured employers with the ASD benefit can be found at https://autismallianceofmichigan.org/insurance-facts/. Data on the largest employers in Michigan are from https://www.zippia.com/advice/largest-companies-in-michigan/.
More importantly, underfunding of the Medicaid benefit led to coverage expiring once a child reaches age six.21 Generally, ABA therapy continues beyond this age, and many years of therapy are needed for benefits to emerge and be maintained. Further reducing the value of this benefit is that children are often diagnosed relatively close to the age cutoff. According to the most recent report available, the average age at first ASD diagnosis for Medicaid recipients across the US was 5.4 years in 2002-2004 (Mandell et al., 2010). This leaves virtually no time for therapy to have an effect before access is cut off. Even if age of diagnosis has improved, nationwide data regardless of insurance coverage show that the median age of diagnosis in 2012 was 4.2 years, again leaving little time to garner substantial benefits from therapy before reaching six years of age (Christensen et al., 2016).

20 Kaiser Family Foundation estimates based on the Census Bureau's March Current Population Survey, 2014-2017.

3.2.3 Special Education

Students deemed eligible for special education services are covered under the Individuals with Disabilities Education Act (IDEA). To receive such services, students first must be evaluated. An evaluation can be initiated by parents or the school and involves a review of the child's educational progress as well as factors potentially related to the suspected disability, such as health, vision and hearing, social and emotional development, academic performance, communication, and motor skills. Hence, eligibility is not based simply on standardized test scores or pure academic performance. Autism is classified under IDEA as a specific disability category. The evaluation process for students who may be on the Autism spectrum involves examining the student's existing academic record and behavioral outcomes, interviews with teachers and parents, and an assessment by specialists trained in ASD diagnosis and treatment.
These specialists can be provided by the school or from outside the school. Critically, under IDEA these assessments are to be provided at no cost to families, though parents can and often do use external assessments to inform the process. Once a child has been evaluated, parents meet with some combination of teachers, school administrators, school counselors, and specialists to determine whether the child qualifies for special education services. If so, they develop an individualized education plan (IEP) that specifies the educational environment and educational services the student will receive and the benchmarks that will be used to determine whether the student needs to continue receiving these services in the future. Typically, IEPs are updated every year, with a full reassessment every three years. The special education services agreed to in the IEP must be provided by the school at no cost to the family. However, the services a student receives outside of school can influence the in-school services on the IEP because the IEP is developed with direct input from the parents. It is reasonable to assume that parents and schools consider the sum total of therapies and services available to students when crafting an IEP, and parents of students with ASD often report needing to ask schools to provide more services than initially offered (Ferreri and Bolt, 2011). However, we are aware of no research that examines the extent to which outside therapies influence IEP services. IDEA also includes a Least Restrictive Environment (LRE) provision that requires that students be placed in the most general education setting possible.

21 As of January 2016, due to requirements of the Affordable Care Act, the age limit was increased to 21. For this reason, we focus our analysis on school years prior to 2015-2016.
This provision is designed to prevent special education students from being segregated from the rest of the school population, which could have negative consequences for educational and social/emotional development. As this discussion highlights, the special education process is complex and involves many participants and constituencies. Together with the LRE provision of IDEA, there is significant scope for non-school resources and factors to play a role in the specific education services students receive and the educational environment to which they are exposed. Research on the factors that influence how IEP plans are developed is thin; our paper is the first to empirically examine how external factors such as health insurance affect special education services, which is an important advancement in our understanding of how the special education system operates.

3.3 Data

3.3.1 Michigan Administrative K-12 Schooling Data

Our analysis relies on a student-level dataset provided by the Michigan Department of Education (MDE) and the Center for Education Performance and Information (CEPI). The dataset contains administrative educational records on all students enrolled in pre-K through grade 8 in Michigan's public schools from the 2009-10 to 2014-15 school years. These records provide rich information on students' demographic characteristics, disabilities, educational settings, special education programs and services, and achievement levels. Student demographic characteristics are reported by schools to MDE and include a student's race, gender, and eligibility for Limited English Proficiency (LEP) services. Our key demographic variable of interest is a student's "economically disadvantaged" status, which we use as a proxy for the student's insurance status.
A student is defined as economically disadvantaged if she qualifies for free or reduced-price meals under the National School Lunch Program, is in a household that receives food (SNAP) or cash (TANF) assistance, is homeless, is a migrant, or is in foster care. Students' economic disadvantage statuses are updated annually to reflect changes in families' economic circumstances.22 Typically, a student who qualifies under any of the latter criteria also qualifies for free/reduced-price lunch, and so we refer to this status interchangeably as "economically disadvantaged" or FRPL. For students with IEPs, which is synonymous with qualifying for special education, we obtain additional information on a student's primary disability, as defined on her IEP, and the special education resources provided to the student.23 The special education resources variables are classified into three distinct categories: (1) a student's special education program, (2) a student's educational setting, and (3) the special education support services received by a student. The program category contains the IEP-designated programs in which a student is enrolled. Programs are state-defined special education settings that must adhere to specific regulations. To be considered an ASD program, a classroom must not have more than 5 students and must be served by a state-endorsed teacher of students with ASD who has completed ASD-specific education and training. Therefore, not all schools or school districts offer all special education programs, and a student's program need not exactly correspond with her disability. For example, students with ASD are commonly enrolled in a "cognitive impairment" program, which has classrooms with up to 10 students and is designed to provide instruction to students with an array of learning disabilities. There are 14 specific types of special education programs in Michigan, and students can be enrolled in up to 3 of them. We focus on four categories that are the most relevant for ASD students: ASD-specific special education programs, cognitive impairment programs, resource programs (which usually involve pull-out time in a "resource room" with a special education specialist), and no program (i.e., in a general education classroom 100% of the time). It is also important to note that even if a student is attached to a program, he/she could spend any percentage of time below 100% in a general education classroom. So in some cases students are technically placed in a program but spend very little time in the actual program classroom. The educational setting category contains information on the primary educational setting in which a student receives his education. Our data include eight different measures of the education setting: enrollment in a special education school; whether students are in a general education classroom more than 80% of the time, 40-79% of the time, or less than 40% of the time; whether students spend any time in a general education classroom; and the proportion of the student's full-time equivalent (FTE) enrollment that is in a special education setting. Since the special education FTE rate is measured in the fall and spring, we use the average rate across the two semesters. For brevity, we focus on whether a student is enrolled in a separate special education school, whether she is enrolled in a general education classroom more than 80% of the time, and the percentage of FTE enrollment that is in a special education setting averaged across the year. We hypothesize ex ante that these education settings are the most likely to be associated with changes in ASD-related school services; however, we also show estimates for our other measures. The services category records any special education support services a student receives within an academic year. These services include therapies, such as speech, occupational, and physical therapy; work with school social workers and/or psychologists; special transportation to and from school; and assignment to teacher consultants who provide assistance to general education teachers. Teacher consultants in Michigan must have a master's degree in education or in a field related to special education as well as teaching experience in a special education classroom.

22 One concern with this measure is that some schools qualify for the Community Eligibility Provision (CEP) for free lunch, which allows all students in a school to qualify regardless of individual circumstances. However, Michigan still requires schools to collect family income information to determine individual FRPL eligibility for record-keeping purposes. Only 0.3% of observations in our data are in school-years with 100% FRPL eligibility, indicating that CEP does not affect our classifications.
23 Students may also receive services through the use of 504 plans, which typically provide access for students who are not classified under conditions recognized by the Individuals with Disabilities Education Act. Since autism is a category in IDEA and IEPs provide more legal protections and guarantees of educational (as opposed to simply disability-related) services, most children with ASD are covered under IEPs. While we do not observe 504 plans directly, these students would have an ASD identification but no data on services, which accounts for only 0.5% of students with ASD in our analysis sample.
Moreover, teacher consultants may be approved to work with special education students generally or may be approved to work with students with particular disabilities by meeting additional education and experience standards. There are 30 specific special education support service categories listed in the data, and up to 10 are recorded for a student. Our primary estimates examine ASD teaching consultants, language support, occupational therapy, social workers, and whether a student receives any support services. We focus on these support services because they are the most relevant for ASD students. We also examine non-ASD teaching consultants, physical therapy, and transportation services to demonstrate that services that are less important for ASD students are unaffected by the insurance mandate. Finally, we obtain information on students' test-taking behavior and achievement on standardized math and reading exams. MDE reports the type of exams (standard or special education) students took in a given year, as well as any special education accommodations used. Test scores are recorded for students who take standard or particular types of special education exams. However, the special education exams assess different material and are scored on a different scale than the standard exams. Thus, we only consider test scores for students who take standard exams. As we discuss below, we do not find that the mandate led to changes in regular exam taking by students with ASD relative to non-disabled students, allowing us to analyze the regular exams while avoiding sample selection concerns. We further standardize the scaled scores for these exams across all students in the state within a grade level and school year. Our main sample consists of students in grades 2-8 in school years 2009-2010 through 2014-2015.
While, as noted above, we have data for more grades and years, students in kindergarten and grade 1 are excluded because it is likely that publicly insured students in these grades also received increased access to private services through the Medicaid benefit, though we also look at grade-specific effects for the excluded grades. Further, the 2015-2016 school year is dropped because the age-six cutoff of Medicaid ASD benefits was removed in 2016. Table C.1.1 presents means of all analysis variables for several subsamples of these students: all students, ASD students, non-ASD special education students, and non-special education students.24 About 1% of the sample has an ASD identification; these students are more likely to be male and white than the sample overall and are less likely to be disadvantaged. ASD students also are less likely to be disadvantaged and more likely to be white and male than non-ASD special education students. Further, there are substantive differences in the programs, educational environments, and support services received by ASD and non-ASD disabled students.

3.3.2 Measuring Insurance Status

One of the central data challenges in this analysis is the inability to measure student health insurance status. The insurance mandate was only binding for students covered under a private health insurance plan. In some cases children may be uninsured, but this is very rare. According to the American Community Survey, 96% of individuals in Michigan under age 18 were covered by insurance in 2012, the year Michigan passed the Autism mandate.25 Thus, those students who do not have private insurance are almost certainly covered by Medicaid/CHIP. To proxy for private insurance coverage, we use a student's status as economically disadvantaged, which is primarily based on free/reduced-price lunch (FRPL) eligibility. The motivation for using this proxy is that the eligibility criteria for FRPL status overlap closely with eligibility for public health insurance. In order to qualify for public health insurance, children must be in families that earn less than 200% of the Federal Poverty Line.26 Eligibility for free or reduced-price lunches in schools is set at 185% of the poverty line. Table C.1.2 shows health insurance status by free/reduced-price lunch eligibility (top panel) and by family income as a percent of the federal poverty line (bottom panel) from the 2008-2016 American Community Survey among K-8th grade students in Michigan. The top panel shows that health insurance coverage is near-universal and varies little by whether students are eligible for free/reduced-price lunch. What does vary across these groups is the type of insurance students have. Almost 89% of ineligible students have private insurance, while 73% of eligible students receive Medicaid. While there is some overlap, FRPL eligibility is strongly correlated with whether students receive Medicaid. FRPL status is a somewhat noisy proxy for family income. Research using data from education records linked to tax data indicates that there is a wide range of family incomes among students in the same free/reduced-price lunch category (Domina et al., 2010). The bottom panel of Table C.1.2 shows that the poorest students, those whose families earn under 135% of the poverty line, are the most likely to be on Medicaid (81%). The percent on Medicaid declines with income, even within the eligible range. To strengthen our proxy for Medicaid eligibility, we use the fact that those who persistently receive free/reduced-price lunch are the most disadvantaged students (Michelmore and Dynarski, 2017). These students are likely to come from the bottom of the income distribution, and Table C.1.2 indicates that they are unlikely to have private insurance. Conversely, higher-income students, those above 250% of the poverty line, are very likely to have private insurance. Hence, our main analysis sample comprises students who receive free/reduced-price lunch in every observed year of school enrollment from grade 2 to 8 and students who do not receive free/reduced-price lunch in any observed year of school enrollment between grades 2 and 8. Students who receive free/reduced-price lunch in some years are excluded, as are students who are observed for only one year. For completeness, we show robustness checks that include the "sometimes free/reduced-price lunch" students. The estimates that include these students are qualitatively similar but attenuated, as expected given the use of a weaker treatment proxy.

24 The sample sizes of ASD students in the top and bottom panels do not match because a small number of students receive an ASD identification through a 504 plan but receive no special education services.
25 This is likely a lower bound of the child health insurance coverage rate because those who are eligible for Medicaid but not signed up would be signed up and receive treatment upon arrival at a hospital. They are therefore functionally covered by Medicaid even if they are not formally enrolled in the program.
26 Federal Medicaid eligibility is stricter, at 133% of the Federal Poverty Line. The Michigan Child Health Insurance Program (MCHIP) extends public insurance eligibility up to 200% of the poverty line for children of Michigan residents.
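As a concrete illustration of the sample restriction just described, the following sketch classifies students as "always FRPL" (the Medicaid proxy) or "never FRPL" (the private insurance proxy) from longitudinal records. The record layout and function names are hypothetical, not the actual CEPI variables:

```python
# Hypothetical sketch of the always/never FRPL restriction: keep students
# observed at least twice in grades 2-8 who are economically disadvantaged
# in every observed year or in none; drop mixed and single-year cases.
from collections import defaultdict

def classify_students(records):
    """records: iterable of (student_id, year, frpl_flag) tuples.
    Returns {student_id: 'always' | 'never'} for students kept in the
    main sample; single-observation and mixed-FRPL students are dropped."""
    by_student = defaultdict(list)
    for sid, year, frpl in records:
        by_student[sid].append(frpl)
    sample = {}
    for sid, flags in by_student.items():
        if len(flags) < 2:          # observed only one year: drop
            continue
        if all(flags):
            sample[sid] = "always"  # proxy for Medicaid coverage
        elif not any(flags):
            sample[sid] = "never"   # proxy for private insurance
        # mixed FRPL histories are excluded from the main sample
    return sample

records = [
    ("a", 2010, True), ("a", 2011, True),    # always FRPL -> kept
    ("b", 2010, False), ("b", 2011, False),  # never FRPL -> kept
    ("c", 2010, True), ("c", 2011, False),   # mixed -> dropped
    ("d", 2010, False),                      # one observation -> dropped
]
print(classify_students(records))  # {'a': 'always', 'b': 'never'}
```

The robustness checks mentioned above correspond to also retaining the mixed-history students as a third group.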
3.4 The Effect of the Autism Insurance Mandate on ASD and Special Education Incidence

3.4.1 Empirical Approach

Using data on students in grades 2-8 in Michigan from school years 2009-2014, as described in Section 3.3, we estimate difference-in-differences models that identify how the insurance mandate affects the likelihood that students are diagnosed with ASD or have any special education diagnosis. Our measure of special education is whether students have an individual education plan, and we designate a student as being diagnosed with ASD if the IEP lists ASD as the primary disability. Note that we only observe specific services, programs, and educational settings for students with IEPs. Further, in our main model we exclude any student who is identified as economically disadvantaged in some but not all years and condition on students being observed in grades 2-8 for at least two years. Our difference-in-differences model for ASD identification is of the following form:

\[
ASD_{igjt} = \beta_0 + \beta_1 NonDisadv_i + \beta_2 \left( PostMandate_t \times NonDisadv_i \right) + \Omega X_{it} + \gamma_{gt} + \delta_j + \varepsilon_{igjt}, \tag{3.1}
\]

where ASD_{igjt} is an indicator equal to 1 if student i in grade g and school j is identified as having ASD (or, in companion estimates, an alternative disability that generates an IEP) in year t, NonDisadv_i is an indicator that equals 1 if we never observe the student as economically disadvantaged in grades 2 through 8 (i.e., treated students), and PostMandate_t is an indicator that is equal to 1 in the 2012-2013 school year and beyond. The model includes school fixed effects (δ_j) as well as grade-by-year fixed effects (γ_{gt}). The inclusion of school fixed effects ensures that we are only comparing students in the same school, since special education implementation often varies considerably across schools. Furthermore, special education policies are usually defined at the district level, and the school fixed effects subsume school district fixed effects.
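Stripped of the fixed effects and controls, the coefficient on the interaction term in a two-group, two-period difference-in-differences model reduces to a double difference of group means. A minimal numerical sketch, with made-up numbers rather than the paper's data:

```python
# Minimal illustration of the difference-in-differences logic behind the
# interaction coefficient in equation (3.1): with two groups and two
# periods, OLS on the interaction term equals the double difference of
# means. All numbers below are invented for illustration.
def did_estimate(y_treat_pre, y_treat_post, y_ctrl_pre, y_ctrl_post):
    """(treated post - pre change) minus (control post - pre change)."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(y_treat_post) - mean(y_treat_pre)) - (
        mean(y_ctrl_post) - mean(y_ctrl_pre))

# Hypothetical ASD identification rates (per 100 students):
never_frpl_pre,  never_frpl_post  = [1.0, 1.2], [0.9, 1.1]   # treated group
always_frpl_pre, always_frpl_post = [1.0, 1.0], [1.1, 1.1]   # control group
beta2 = did_estimate(never_frpl_pre, never_frpl_post,
                     always_frpl_pre, always_frpl_post)
print(round(beta2, 3))  # -0.2: treated group declines relative to control
```

The full specification additionally absorbs school and grade-by-year fixed effects and clusters standard errors by district, which a regression package would handle; the double-difference intuition is unchanged.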
We show below that our estimates are similar when we only use school district fixed effects. Finally, we include controls for whether a student is white, male, or limited English proficient (X_{it}). Standard errors are clustered at the school district level, since students in the same district experience similar education environments and face the same special education evaluation and service provision practices. The coefficient of interest in equation (3.1) is β_2, which is the difference-in-differences estimate of how the ASD (or other disability) rate of non-disadvantaged children changes in 2012 relative to disadvantaged students. The core identifying assumption is that trends in special education diagnoses among disadvantaged students are a valid counterfactual for trends among non-disadvantaged children, conditional on the controls. It is important to stress that, for identification, it need not be the case that disadvantaged and non-disadvantaged students with a disability are treated similarly, only that their treatment would have changed in similar ways in the absence of the insurance mandates. This assumption can functionally be broken down into two pieces: (1) outcome trends between disadvantaged and non-disadvantaged children must be similar prior to 2012, and (2) there must be no shocks in 2012 that disproportionately affect students by disadvantage status. Using data prior to 2012, we generate direct evidence on whether there are pre-treatment relative trends. These figures are presented below in Section 3.4.2 and provide strong support for the assumption that ASD rates trend similarly across disadvantaged and non-disadvantaged students prior to 2012. The second assumption of no correlated shocks is more difficult to examine in the data, as such shocks are by definition unobserved.
Nonetheless, we are aware of no other state policy enacted during 2012 that would have disproportionately affected students across the SES distribution. The economy was recovering from the Great Recession, but this should be reflected in trends rather than in a 2012-specific shock. The Affordable Care Act individual mandate came into effect in 2012. However, it was effective January 1, 2012, while the ASD mandate went into effect in October 2012. Thus, any effects of the ACA mandate should be evident in the prior school year. We also do not believe it is plausible that the ACA affected these students: the ACA was focused on uninsured adults rather than children. Health insurance coverage among children was nearly universal prior to 2012 due to Medicaid and CHIP, and Medicaid rules did not change in Michigan during this period.27 To the extent that the mandate caused some parents of disadvantaged children to switch from Medicaid to private insurance in order to use the ASD service benefits, our estimates will understate the effect of the mandate.

3.4.2 Results

Table C.1.3 presents estimates of β_2 from equation (3.1). Panel A shows the baseline estimates. In column (1), we show estimates using ASD diagnosis as the dependent variable; while the point estimate is statistically different from zero at the 10% level, it is small in magnitude. The point estimate indicates that non-disadvantaged students experience a relative decline in ASD diagnoses of 0.045 percentage points after 2012. This is 4.5% relative to the mean, and the 95% confidence interval rules out an ASD increase among this group of more than 0.03 percentage points. Thus, our estimates indicate that the mandate led to at most a very small change in ASD special education identification. As discussed above, a core assumption underlying our approach is that there are no trends in disability diagnosis that differ across students who do and do not receive free/reduced-price lunch.
The top panel of Figure C.1.1 presents event study estimates of equation (3.1), where PostMandate_t × NonDisadv_i is replaced with a set of interactions between NonDisadv_i and year indicators. The figure demonstrates that there is no systematic change in the likelihood of being diagnosed with ASD across the two groups prior to 2012. Furthermore, the year-to-year changes are extremely small, even relative to the low baseline mean of 1%. The figure also shows that there is little post-2012 change in the ASD diagnosis likelihood across the groups, which is consistent with estimates in Table C.1.3.

Column (2) of Table C.1.3 presents estimates for the incidence of all other non-ASD disabilities. The estimate is positive, not statistically significant, and is small in magnitude. Taken at face value, it suggests that non-ASD diagnoses decreased by 0.04 percentage points (0.03% relative to the mean) among economically disadvantaged students relative to non-disadvantaged students post-2012. The bottom panel of Figure C.1.1 presents event study estimates for non-ASD disability incidence. The estimates in this specification are noisy, but similar to the ASD event study, there is little evidence of pre-2012 or post-2012 relative changes. Taken together, the panels of Figure C.1.1 support the use of the free/reduced price lunch students as a control group in this analysis.28

Although there is no aggregate change in non-ASD disability incidence, the remaining columns of Table C.1.3, Panel A present evidence of a shift in the composition of disabilities in this broad group. The mandate is associated with an increase in the prevalence of emotional and other health disabilities and a decline in the prevalence of speech disability.

27 Michigan expanded Medicaid for adults under the ACA in April 2014, but this expansion did not affect eligibility among children.
These relative changes in the composition of the special education groups complicate our preferred triple difference analysis, in which we compare changes in outcomes by poverty status among ASD versus non-ASD special education students when the mandate comes into effect. It is possible that outcomes in the non-ASD population are affected by this change in the composition of disabilities.

In Panel B of Table C.1.3, we show estimates that include a linear time trend interacted with non-poverty status. Including this control, the ASD effect shrinks substantially and is no longer statistically significantly different from zero at even the 10% level. Furthermore, the non-ASD disability effects are attenuated such that only emotional disability is significantly different from zero (at the 10% level). A comparison of Panel A and Panel B of Table C.1.3 shows that the relative shifts in diagnoses are due to linear secular trends by poverty status. Event studies that exclude this control also demonstrate this point: the changes in disability incidence are mostly driven by secular trends that appear prior to 2012.29 The linear non-disadvantaged trend accounts for these secular trends. Critically, we show below that the rest of our results and conclusions are robust to including linear non-poverty time trend controls. Hence, our results are not being driven by the small compositional changes that are evident in Panel A of Table C.1.3.

Panel C of Table C.1.3 demonstrates the robustness of our estimates to school district fixed effects. Since special education policies often are district-specific, it is not clear that school fixed effects are necessary or desirable. The results in Panel C are almost identical to those in Panel A, which further supports the validity of our empirical approach.

28 Note that we are unable to use non-disabled students as a control group, since they by definition are not diagnosed with ASD.
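The event-study variant of equation (3.1) used above can also be sketched as a toy computation. The yearly group means below are invented for illustration; the actual specification is a regression with fixed effects and controls, and 2011 is the omitted base year:

```python
# Hypothetical sketch of the event-study version of equation (3.1):
# PostMandate_t x NonDisadv_i is replaced by NonDisadv_i interacted with
# year indicators, giving each year its own relative gap. The yearly group
# means below are invented; 2011 is the omitted base year.

years = [2009, 2010, 2011, 2012, 2013, 2014]
mean_nondisadv = {2009: 1.04, 2010: 1.05, 2011: 1.05, 2012: 1.03, 2013: 1.02, 2014: 1.02}
mean_disadv    = {2009: 1.00, 2010: 1.01, 2011: 1.00, 2012: 1.01, 2013: 1.01, 2014: 1.02}

def event_study_gaps(treated, control, base=2011):
    """Year-specific treated-minus-control gap, normalized to the base year."""
    base_gap = treated[base] - control[base]
    return {t: round((treated[t] - control[t]) - base_gap, 3) for t in years}

gaps = event_study_gaps(mean_nondisadv, mean_disadv)
# Near-zero pre-2012 gaps support common trends; post-2012 gaps trace out
# the policy's dynamic effect.
```

Plotting these normalized gaps by year produces a figure in the style of Figure C.1.1: flat pre-period coefficients are the visual check on the common-trends assumption.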
3.5 The Effect of the Autism Insurance Mandate on Educational Services and Test Scores

3.5.1 Empirical Approach

Motivated by the finding that the insurance mandate has a negligible effect on ASD incidence, we employ triple difference models that compare changes in outcomes among students with ASD by free/reduced price lunch status to changes in outcomes among non-ASD students by free/reduced price lunch status. Specifically, we estimate models of the following form:

Y_igjt = β0 + β1 NonDisadv_i + β2 ASD_it + β3 PostMandate_t × NonDisadv_i + β4 PostMandate_t × ASD_it + β5 NonDisadv_i × ASD_it + β6 PostMandate_t × NonDisadv_i × ASD_it + Ω X_it + γ_gt + δ_j + ε_igjt,   (3.2)

where Y_igjt is an outcome for student i in grade g, school j, and year t. All other variables are as previously defined. In all the models, the vector (X_it) includes controls for whether a student is white, male, or limited English proficient. In models with test scores as the dependent variable, the vector also includes the lagged test score for the same subject. As with equation (3.1), standard errors are clustered at the school district level throughout.

For special education outcomes, we are restricted to using other special education students as one of the comparison groups in the triple-difference analysis. This is because each of the outcomes we analyze is only available for students with an IEP; students without an IEP do not receive special education services. Hence, all outcomes for students who are not in special education will have values of zero. One potential limitation is that since special education funding is largely fixed, districts may respond to higher (lower) needs in one group by reducing (increasing) services in other groups. There are several reasons to believe that this is not a first-order issue in our analysis.

29 These event study estimates are available from the authors upon request.
First, districts are not restricted to using only special education funding for special education services. In fact, it is very common for special education funding to be insufficient and for districts to use general funds to supplement the costs (Degrow, 2017). Since our results show a decline in ASD service provision due to the mandate, it is likely that most, if not all, of the money saved will go towards general education. Second, the fixed effects in our model ensure that we are comparing students in the same school. This ensures that differences in how schools (or districts) respond to the funding changes do not affect the estimates.

The main variable of interest in equation (3.2) is β6, which yields the triple difference estimate of the effect of the Michigan insurance mandate on student outcomes. The identification assumptions underlying this model are similar to those discussed above for equation (3.1). However, this model relaxes the common trends assumption somewhat: any differences in trends in outcomes across students who do and do not receive free/reduced price lunch must be similar for ASD and non-ASD students. Put differently, the non-ASD relative trends by free/reduced price lunch status need to be an accurate counterfactual for these trends among ASD students. Similar to the difference-in-differences model, there are two main sources of bias. The first is differential relative trends across treatment and control groups. In this case, ASD students who are/are not disadvantaged would have to exhibit different relative trends to non-ASD students prior to 2012. We present graphical evidence that such relative trends are not present. The second is contemporaneous and persistent shocks that differentially impact ASD students who are not eligible for free/reduced price lunch. We know of no reason to suspect that these shocks exist, especially since the ASD incidence rate does not change substantially when the mandate comes into effect.
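The triple-difference logic behind β6 reduces to a difference of two difference-in-differences comparisons. The following is a toy sketch with invented cell means, not the actual regression in equation (3.2):

```python
# Hypothetical sketch of the triple-difference (DDD) logic behind beta_6 in
# equation (3.2): the DiD by disadvantage status for ASD students, minus the
# same DiD for non-ASD students. All cell means below are invented.

cells = {
    # (group, disadvantage status, period): mean outcome
    # (e.g., share enrolled in a resource program)
    ("asd", "nondisadv", "pre"): 0.66,    ("asd", "nondisadv", "post"): 0.58,
    ("asd", "disadv", "pre"): 0.67,       ("asd", "disadv", "post"): 0.65,
    ("nonasd", "nondisadv", "pre"): 0.40, ("nonasd", "nondisadv", "post"): 0.39,
    ("nonasd", "disadv", "pre"): 0.42,    ("nonasd", "disadv", "post"): 0.41,
}

def did(group):
    """DiD within a disability group: non-disadvantaged vs. disadvantaged change."""
    d1 = cells[(group, "nondisadv", "post")] - cells[(group, "nondisadv", "pre")]
    d0 = cells[(group, "disadv", "post")] - cells[(group, "disadv", "pre")]
    return d1 - d0

def triple_diff():
    """DDD: the ASD-specific relative change net of the non-ASD relative change."""
    return did("asd") - did("nonasd")

print(round(triple_diff(), 3))  # -0.06
```

Netting out the non-ASD DiD is what relaxes the common-trends requirement: any relative trend by disadvantage status that is shared across disability groups differences out.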
The plausibility of the identification assumptions rests heavily on the composition of the control group. When estimating impacts on programs, educational settings, and support services, the sample consists of all students with a disability (i.e., with an individual education plan). The control group thus is students with a non-ASD disability. Since only students with an IEP receive special education services, non-ASD disabled students are a natural control group. When we estimate effects on test scores, we are able to consider both non-ASD disabled and non-disabled students as potential comparison groups. One important issue is that Michigan changed the format and structure of achievement exams, particularly those taken by students with disabilities, in 2014-15. Given this change, we restrict our achievement analysis to 2013-14 and earlier.

3.5.2 Results

A strength of our administrative education data, not commonly available in other states, is the detailed information on special education services. As discussed in Section 3.3.1, we consider three types of services: education programs, the educational setting, and special education support services. These categories are correlated with one another, but imperfectly so. For example, the program in which one is enrolled can affect special education support services and the extent to which students are in a general education setting. However, students can receive support services even if they are not enrolled in special education programs and if they are in a general education setting. Examining these three categories of educational inputs thus paints a rich picture of how the ASD insurance mandate affects the type of education students with ASD receive.

Table C.1.4 presents our baseline estimates for the main set of special education service outcomes that are most associated with ASD.
The special education program outcomes we examine are whether the student is in an ASD program (column 1), whether the student is in a resource or cognitive impairment program (column 2), and whether the student is in no special program (column 3).30 The program does not have to match the disability listed on the IEP, so students with ASD diagnoses can be in non-ASD focused programs, and attachment to cognitive impairment programs in particular is common for students with ASD. Each column of the table shows results from a separate regression, and the first row presents the triple difference coefficient of interest.

Column (1) shows that the ASD mandate increases the likelihood that students are placed in an ASD program by 3.1 percentage points, which is 15.7% of the ASD-specific mean (shown at the bottom of the table). However, the estimate is not statistically significantly different from zero. Enrollment in resource and cognitive programs declines substantially, by 6.4 percentage points (9.6%), and this estimate is statistically significant at the 5% level. The likelihood a student is enrolled in no special education program increases by a statistically significant 3.4 percentage points, which is a 4% drop in enrollment in any program (87% of students with ASD are enrolled in at least one special education program). That the percentage of ASD students not in any special education program rises substantially suggests that the insurance mandate leads to lower intensity of special education interventions and less placement in self-contained special education classrooms. These results thus reflect crowding out of special education services offered by public schools from the private insurance mandate.

As discussed in Section 3.5.1, one of the main assumptions under which our estimates are identified is that relative trends among free/reduced price lunch and non-free/reduced price lunch students are similar for ASD and non-ASD disabled students prior to 2012.
Figure C.1.3 presents evidence on the plausibility of this assumption by showing event study estimates of the mandate on program placement.31 For no outcome do we see any evidence of differential pre-2012 trends, which supports our identification strategy. Furthermore, there is a clear decline in the likelihood of being in a cognitive or resource program after 2012 and an increase in the likelihood of being enrolled in no program. These figures match the results in Table C.1.4 closely.

Columns (4)-(6) of Table C.1.4 present estimates of the effect of the mandate on students' educational settings. We focus on three outcomes: whether a student is placed in a special education school (column 4), whether the student is in a general education classroom more than 80% of the time (column 5), and the percentage of time students are in a special education classroom (column 6). The point estimates are small in magnitude and are not statistically significant at conventional levels. Thus it appears that while the special education environment students are attached to during their time outside the general education classroom is changing to less intensive environments, they are not spending substantially more or less time in a general education classroom. While our data do not allow us to dig deeper, there are some explanations that are consistent with this pattern.

30 The estimates for other programs are provided in Table C.1.5 and show no effect.
31 Specifically, we replace PostMandate_t × NonDisadv_i × ASD_it with NonDisadv_i × ASD_it interacted with a set of year dummies in equation (3.2). Year 2011 is excluded, so all estimates are relative to that year.
First, it may be the case that students who were attached to a cognitive impairment classroom were already spending more than 80% of their day in the general education classroom in the absence of the mandate, but with insurance coverage they leave the cognitive impairment program while still spending some time outside the general education classroom for therapeutic services. Second, we are limited to analyzing rather broad categories of general education classroom time, so it is possible these are masking some changes. For example, a student could have been attached to a resource room and spent 80% to 90% of time in the general education classroom, but with insurance the student now no longer uses the resource room and is in general education 100% of the day.32

Figure C.1.3 presents event study estimates for these outcomes. The event study models are less informative for these measures because education setting variables are only available beginning in 2010 and the FTE measures are only available beginning in 2011. Hence, we have fewer pre-treatment years for these outcomes with which to diagnose any selection on relative trends. Nonetheless, given the data available, the event study estimates support the validity of our empirical approach and match the findings in Table C.1.4 closely.

Our final set of educational input measures – special education support services – are shown in columns (7)-(11) of Table C.1.4. We focus on ASD teaching consultants (column 7), language support (column 8), occupational therapy (column 9), access to social workers (column 10), and whether students receive any support services (column 11).

32 In Table C.1.5 we also do not see shifts in the likelihood of being in a general education classroom less than 40% or between 40% and 79% of the time.
Column (7) shows that the insurance mandate reduced the likelihood that a student received an ASD teacher consultant by 2.3 percentage points (17.7% relative to the mean), and this estimate is statistically significant at the 5% level. Column (10) further shows that students are 2.1 percentage points (3% relative to the mean) less likely to be assigned a social worker. Columns (8) and (9) indicate that there is little change in students' access to language and occupational therapy services following the mandate. While these results are consistent with a crowd-out of special education services, we also find that the likelihood of receiving any special education support services increases by 2.0 percentage points, or about 2% of the mean. One possibility is that very marginal students are moved to having no program but given access to services they did not have before to compensate. Nonetheless, given the consistent patterns for the other measures, this also may simply be a spurious result that is a function of having multiple outcomes. Overall, the pattern is again consistent with crowd-out. Figure C.1.3 shows event studies for these outcomes and supports the identification assumption of common relative trends in special education support services prior to 2012 between ASD and non-ASD disabled children.

The results presented thus far suggest that ASD students who are likely to have private insurance receive fewer special education supports in more inclusive environments after the insurance mandate is passed. The effect of this change on student achievement is unclear. If the crowd-out we find is incomplete, overall support levels increase and student achievement is likely to increase. This is particularly the case if the quality of services provided by the private market is higher than that of services provided through the schools. However, if crowd-out is full (or more than full) or if support quality declines, student achievement should decline.
Alternatively, if the services are focused on one aspect of instruction for a student (e.g., math instead of ELA), that subject could worsen while the other subject, which may have been crowded out by extra time spent on the focus subject, could improve. Unfortunately, since we do not observe the instructional foci of the special education interventions, in aggregate we would likely see those effects offset. Hence, a finding of no impact on achievement (which we show below to be the case) could be indicative of these shifts in achievement across subjects.

In Table C.1.6, we focus on two achievement measures: standardized math test scores and standardized reading test scores for students in grades 4-8.33 Given that Michigan changed its exam structures in 2014-15, we exclude that year from the estimates, leaving us with two pre-mandate and two post-mandate years. An added complication when we analyze these outcomes is that it is unclear which control group is most appropriate. This closely relates to the issue that we can only measure test scores for students who take Michigan's traditional standardized exams. Special education specific exams do not assess the same material, are scored on a different scale, and experienced substantial changes over the time period. In Table C.1.7, we show that, when comparing ASD students to non-disabled students, there is no evidence that test-taking behavior was altered by the insurance mandate. However, when comparing ASD students to other disabled students, there is some evidence that the mandate induced fewer students to take traditional exams. Given these two complications, we show triple difference estimates using all three comparison groups – all non-ASD, non-Sped, and non-ASD Sped – but note that we prefer the estimates that compare ASD students to their non-disabled peers. Even so, the results are relatively similar across comparison groups.
The first three columns of Table C.1.6 show results for math scores and the second three show results for reading test scores, both of which are in standardized units. Across all columns, we find little evidence that academic achievement is affected by the private insurance mandate. For math, all of the estimates are very close to zero. Our preferred estimate, using all non-Sped students as the comparison group, is in column 2 and has a coefficient of 0.005 with a 95% confidence interval of [-0.046, 0.056]. Thus, we can rule out at the 95% level that the mandate changes math scores by more than 5-6% of a standard deviation.

The point estimates for reading (columns 4-6) are somewhat larger than for math but are also not statistically different from zero. In this case, the largest estimate is in column (4), using all non-ASD as a comparison group. The 95% confidence interval for reading with our preferred control group (column 5) is [-0.039, 0.075]. We thus can rule out anything larger than modest effects on test scores at the 95% level.

Figure C.1.2 shows event study estimates for math and reading using non-ASD special education students as the comparison group.34 The figures do not show any evidence of differential pre-2012 trends that would bias our triple difference estimates, though we acknowledge that we are limited in this assessment by the need to control for prior achievement, which restricts us to only having two years of pre-mandate testing data.

33 Students begin standardized testing in grade 3 but cannot be analyzed until grade 4 given our inclusion of lagged achievement. In these specifications, our sample consists of all students who we observe taking a regular exam for at least two years and are either never or always disadvantaged in the years in which we observe them.
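The "rule out" statements follow mechanically from the reported point estimates and confidence intervals. A sketch, using the math estimate reported above (coefficient 0.005, 95% CI [-0.046, 0.056], in standard-deviation units); the standard error here is backed out from the reported interval and is therefore approximate:

```python
# Normal-approximation 95% confidence interval: coefficient +/- 1.96 SEs.
# Effects outside the interval are rejected at the 5% level, which is the
# sense in which larger effects are "ruled out."

def ci95(coef, se):
    """95% confidence interval under the normal approximation."""
    return (round(coef - 1.96 * se, 3), round(coef + 1.96 * se, 3))

coef = 0.005
se = 0.026  # approximate: implied by the reported interval, (0.056 - 0.005) / 1.96

lo, hi = ci95(coef, se)
# Effects more positive than ~0.056 SD or more negative than ~-0.046 SD
# are rejected at the 5% level.
```

The same arithmetic applied to the reading estimate (column 5) yields the [-0.039, 0.075] interval quoted above.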
Taken together, the results from Table C.1.6 and Figure C.1.2 indicate that academic achievement is likely unaffected and certainly does not increase substantially due to the private insurance mandate. While, as noted above, improvements in one subject may be offset by lower scores in the other, it remains the case that in total we see little evidence of achievement impacts. Furthermore, it is important to note that this is an intention-to-treat effect; for those students who do have service changes (the treatment-on-the-treated effect) we cannot rule out sizable achievement impacts. Nonetheless, the analysis indicates that the mandate policy itself has at most very small impacts on achievement. This null result is an important finding given the changes in education services we document. Because we cannot observe services provided outside of school, we cannot determine whether our results indeed reflect crowd-out or just a reduced demand for services among students because their conditions improve due to increased access to private services. That student achievement does not substantially increase suggests our findings are most consistent with a crowd-out story, which is our preferred interpretation of the results. This interpretation also is consistent with the findings in Chatterji et al. (2015), who find no change in unmet need for services due to ASD mandates. That achievement does not substantially decrease in our setting further suggests that the shifting of responsibility to the private sector does not academically harm students with ASD. We caution, however, that these achievement analyses are short term, and so it is possible that cognitive improvements do not show up until more exposure time has elapsed.

34 Event studies for the other control groups are shown in Figures C.1.4 and C.1.5 and are very similar.
3.5.3 Heterogeneous Treatment Effects and Robustness Checks

We examine several sources of heterogeneity: gender, race, and grade.35 Table C.1.8 presents estimates of the effect of the mandate on ASD incidence for each of these different groups. We see no evidence of a change in diagnoses for girls or for White and Asian versus Black and Hispanic students. While there is evidence of reduced ASD incidence of 0.07 percentage points among boys due to the mandate, as with the overall estimates, this is very small relative to the mean rate of 1.7%. When we examine heterogeneity by grade in Panel B, there is a decline in ASD diagnosis in second grade that is significant at the 5% level. However, there is little evidence of any effects in higher grades, which suggests the mandate may shift the timing of diagnosis slightly to these later grades. It is worth noting that testing begins in grade 3, which may provide an impetus to identify students who would have been identified earlier in the absence of the mandate. Overall, there is little evidence that different groups experience a meaningful increase in ASD diagnosis, and there is no evidence of a positive shift along any dimension we examine.

We now turn to examining outcomes for these different groups. As shown in Table C.1.1, 85% of students with an ASD diagnosis are boys, and in general the condition is far more common among males: according to the Centers for Disease Control, boys are three times as likely to be diagnosed as girls nationwide.36 It thus is informative to examine effects separately by gender. Table C.1.9 shows triple difference estimates of educational service outcomes for boys (Panel A) and girls (Panel B). The direction of the estimates is similar for boys and girls, but the effects tend to be larger both in absolute value and relative to the ASD identification rate for girls.
Effects on resource/cognitive programs, no special education programs, and ASD teacher consultants are all larger for girls than for boys, but they are qualitatively similar across genders. Combined, the results show substantial crowd-out effects for both genders that are somewhat larger for girls. We also note, however, that among girls there is a statistically significant increase in the likelihood of receiving any special education services. This appears to operate through language services. Given that this is the only outcome across both genders consistent with an increase in school-based provision, we are unable to say whether this is a real effect or simply an artifact of multiple hypothesis testing.

Panels C and D show effects of the mandate for Black and Hispanic as well as White and Asian students, respectively. The crowd-out effects are most evident for White and Asian students, who make up nearly three-quarters of the ASD population in Michigan. The effects on Black and Hispanic students are quite noisy as a result. The results for Whites and Asians mirror the overall estimates quite closely, while the findings for Black and Hispanic students are qualitatively similar but imprecise, which limits our ability to draw strong conclusions for this group.

Table C.1.10 shows effects by grade, including kindergarten and 1st grade. Kindergarten can be interpreted as a specification check, as ASD services for the vast majority of these students are covered by both private and Medicaid insurance plans after 2012. Furthermore, many first grade students are under age 6 and would also be able to receive services through Medicaid.

35 When assessing heterogeneity by grade, our sample consists of all students in grades K-8 who we observe for at least two years and are either disadvantaged in all years in which we observe them or are never disadvantaged in the years in which we observe them.
36 https://www.cdc.gov/nchs/products/databriefs/db291.htm
Hence, if our identification strategy is valid, we would not expect to see significant impacts in these grades, and indeed that is what the table shows. The estimates in both of these grades are small, and none are statistically significantly different from zero at even the 10% level. These estimates suggest that we are not picking up unobserved shocks or trends that differentially influence outcomes among non-disadvantaged ASD students.

The remaining estimates in the table test for heterogeneous treatment effects in higher grades. Autism therapies like applied behavior analysis can have differential effects by age, and the ability of schools to alter special education services also can differ for older versus younger students. Resource and cognitive program reductions are largest for students in grades 2 through 6 and start to fade after 4th grade, though estimates never turn positive. Consistent with the resource/cognitive program effects, the effects on non-participation in special education programs are positive in grades 2 through 6. Further, the reduction in the use of ASD teacher consultants is concentrated in early elementary grades. For the other outcomes, the estimates are generally small and are not statistically significant regardless of grade level. The crowd-out effects are mostly concentrated in elementary rather than middle school, which one would expect given that ASD therapy is more effective, and hence more commonly used, when the child is younger.

We also examine test score effects along these dimensions of heterogeneity. Table C.1.11 shows reading and math score estimates by gender, race, and grade using non-disabled students as the comparison group. There are no strong statistically significant patterns across groups, though for girls and Black/Hispanic students the estimates are very imprecise due to there being relatively few ASD students in each of these groups.
When we look by grade level, the estimates are again very noisy, but only one estimate is significant at the 10% level, and they generally have a mix of positive and negative coefficients.

We next estimate a series of robustness checks that assess the validity of several data limitations and identifying assumptions. First, we examine the importance of including the linear time trend interacted with non-disadvantaged status. As discussed in Section 3.4, this control accounts for secular linear trends in special education incidence, particularly in the comparison group of non-ASD disabled students, and removes the composition changes we see within the comparison group in Table C.1.3. In Panel A of Table C.1.12, we present triple difference results that include this control. The estimates are virtually identical to baseline, which suggests that the composition changes have little effect on our program, educational setting, and special education service provision results. Panel A of Table C.1.13 shows a similar robustness check for math and reading test scores; the test score results are also robust to including this control.

In Panel B of Table C.1.12, we present estimates that include school district (rather than school) fixed effects. This robustness test assesses the stability of the estimates to not accounting for unobserved heterogeneity across schools within each district. The estimates are very similar to baseline, suggesting that this heterogeneity is not correlated with the treatment.

Panel C shows results that address the concern that ASD designation is endogenous to the mandate. While we find little evidence to suggest this is the case, the point estimate for ASD in Table C.1.3 is significant at the 10% level. To check if this is a concern, we use the pre-2012 ASD assignment of students in place of their contemporaneous identification for post-treatment years.
Hence, we assign everyone their pre-treatment ASD identification status.37 The estimates again are very similar to those in Table C.1.4, except that the reduction in the use of social workers and the increase in enrollment in no special education program become larger. Thus, if anything, we may be slightly underestimating the crowd-out effects. Nonetheless, since students in this analysis must be observed before 2012, the age profile in the post-treatment period skews older, and so we believe the baseline estimates are more accurate. Throughout this analysis, we have compared students who are always disadvantaged to students who never are in order to strengthen the proxy for Medicaid eligibility. In Table C.1.14, we show results using the full sample of students.38 Relative to our baseline sample, this sample adds those who sometimes receive free/reduced-price lunch or are observed only once in the data and identifies treatment as being non-disadvantaged in a given year. The sample sizes increase, and the point estimates are attenuated relative to the main results, as expected, since free/reduced-price lunch receipt is a worse proxy for Medicaid eligibility among the sometimes-eligible students (Michelmore and Dynarski, 2017; Domina et al., 2010). Nonetheless, the qualitative patterns do not change; the conclusion that the Autism insurance mandate led to crowd-out of education services is robust to including these students. Table C.1.16 shows that we obtain similar results for math and reading test scores when we include these students as well. In Panel B of these tables, we show that the estimates using the full sample of students are robust to including a linear year trend interacted with non-poverty status. Together, the results in Table C.1.12 and Tables C.1.6 through C.1.16 demonstrate that our results and conclusions are robust to the way in which we construct our analysis sample and to the use of linear time trends by non-poverty status.

37 The data are restricted to students who are observed at least once before 2012.
38 Table C.1.15 shows disability incidence estimates using the full sample of students, both with and without time trends by non-poverty status. The estimates are very similar to those in Table C.1.3.

The results thus far have focused on a set of special education services and outcomes that are most closely associated with ASD. As noted previously, in Table C.1.5, we present estimates for other services in our data that are less likely to be affected by the Autism insurance mandate. If we find effects on many of these outcomes, it is suggestive of bias in our main results. Specifically, we examine enrollment in another special education program, two categories of general education participation, any general education participation, non-ASD teacher consultant use, physical therapy services, and transportation services. None of the point estimates in Table C.1.7 is statistically significant at even the 10% level, and each estimate is close to zero. There is no evidence that these other service measures are affected by the insurance mandate, which supports the validity of our main findings; the services that change are those most closely aligned with the needs of ASD students. Finally, in Panel D of Table C.1.11, we show that our test score estimates are robust to including the 2014-15 testing year, when Michigan changed to a new exam format that could have affected ASD and non-ASD students differently. In general, the estimates are similar, with the exception of math relative to non-ASD special education students, which becomes negative and marginally significant. More importantly, however, these results are consistent with our overall conclusion that any achievement gains were at most modest.
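The pre-2012 status assignment used in this robustness check can be sketched in code. This is a toy illustration under assumed names: the record layout and the function `freeze_pre_mandate_status` are hypothetical, not the paper's actual data schema or code.

```python
# Hypothetical student-year records; the real analysis uses Michigan
# administrative data with contemporaneous ASD identification flags.
records = [
    {"student": 1, "year": 2010, "asd": 0},
    {"student": 1, "year": 2012, "asd": 1},
    {"student": 1, "year": 2014, "asd": 1},
    {"student": 2, "year": 2010, "asd": 1},
    {"student": 2, "year": 2014, "asd": 1},
]

def freeze_pre_mandate_status(rows, treatment_year=2012):
    """Replace post-mandate ASD status with each student's last pre-mandate
    status; students never observed before the mandate are dropped, as in
    the paper's footnote 37."""
    pre = {}
    for r in sorted(rows, key=lambda r: r["year"]):
        if r["year"] < treatment_year:
            pre[r["student"]] = r["asd"]  # keeps the latest pre-period value
    return [
        dict(r, asd_fixed=pre[r["student"]] if r["year"] >= treatment_year else r["asd"])
        for r in rows
        if r["student"] in pre
    ]

fixed = freeze_pre_mandate_status(records)
```

Under this rule, student 1's post-2012 years revert to the pre-period status of 0, which is the sense in which endogenous post-mandate identification is shut down.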
3.6 Conclusion

We present the first estimates in the literature of how a mandate that requires private insurance to cover therapeutic services for children diagnosed with Autism Spectrum Disorder affects the special education services students receive in public schools, as well as their educational achievement as measured by test scores. While we study Michigan's mandate, passed in 2012, 46 states and D.C. currently have some form of coverage mandate for ASD students. The prevalence of these mandates makes them important to study, but our results also provide more general insight into how health policy spills over to education services and outcomes. The close connection between health and education in the production of human capital underscores the relevance of studying such policy spillovers more broadly. Using administrative K-12 data on all 2nd through 8th grade students in the state of Michigan from the 2009-2010 to 2014-2015 school years, we estimate how the insurance mandate affected a wide range of special education services as well as student test scores. The data do not contain information on private insurance coverage, so we use the strong overlap between economic disadvantage and Medicaid (the near-universal alternative to private insurance among children in Michigan) to proxy for exposure to this mandate. Because we find little evidence that the incidence of ASD diagnoses is altered by the mandate, we estimate triple difference models that compare how services and outcomes change in 2012 among non-disadvantaged ASD students relative to disadvantaged ASD students, net of disadvantaged vs. non-disadvantaged differences among non-ASD students. Our main findings indicate that the ASD coverage mandate led to sizable declines in the special education services students receive.
ASD students who are not economically disadvantaged experienced declines in the likelihood of being placed in a resource or cognitive impairment special education program, the likelihood of being placed in any special education program, the likelihood of being given an ASD teaching consultant, and the likelihood of being provided in-school access to a social worker. However, test scores did not change on average. Taken together, we argue the evidence is most consistent with a crowd-out story, in which the private provision of ASD therapies reduces special education services in schools. This would generate the service reductions we document and would lead to no change in academic achievement, as we find. Our results are important in showing that supply-side health policies focused on health insurance have spillover effects on the education system that likely were unintended by policymakers. The findings from this paper suggest that the crowd-out of special education services largely undoes the intent of policymakers to help provide more therapy services to autistic children. Nonetheless, we see little evidence that the policy change was harmful to students, and there is potential for welfare enhancement if provision through the health insurance system is more efficient than through the education system or if it frees up instruction time for students. Still, the fact that these spillovers occur in this setting suggests that other health care policies, such as recommendations against teens taking anti-depressants or medical practices surrounding ADHD, also may affect the services students receive in schools and their academic achievement. Further understanding these interactions between health policies and schools, and how they affect students, is a ripe area for future research.
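The triple difference design summarized above can be written out as a regression sketch. The notation here is illustrative (i indexes students, s schools, t school years), not the paper's exact specification:

```latex
% DDD sketch: NonDis_i = 1 for never-disadvantaged students (the proxy for
% private insurance coverage), ASD_i = 1 for students identified with ASD,
% and Post_t = 1 for school years from 2012 onward. \delta is the spillover
% effect of interest; \gamma_s and \tau_t are school and year fixed effects.
y_{ist} = \delta\,(\mathit{NonDis}_i \times \mathit{ASD}_i \times \mathit{Post}_t)
        + \beta_1\,(\mathit{NonDis}_i \times \mathit{ASD}_i)
        + \beta_2\,(\mathit{NonDis}_i \times \mathit{Post}_t)
        + \beta_3\,(\mathit{ASD}_i \times \mathit{Post}_t)
        + \beta_4\,\mathit{NonDis}_i + \beta_5\,\mathit{ASD}_i
        + \gamma_s + \tau_t + \varepsilon_{ist}
```

The two-way interactions absorb the ASD-by-disadvantage, disadvantage-by-period, and ASD-by-period differences, so delta is identified from the post-2012 change among non-disadvantaged ASD students net of those comparisons.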
APPENDICES

APPENDIX A

CHAPTER 1 APPENDIX

A.1 Tables & Figures

Figure A.1.1: Identified Community College District Boundaries

Figure A.1.2: Washtenaw Community College District Analysis Sample

Figure A.1.3: Distribution of Border Pair Tuition Differentials
(Histogram; vertical axis: percent, 0 to 15; horizontal axis: tuition differential, $0 to $3,000.)

Figure A.1.4: Correlation Between Tuition Differentials and Area Characteristics
(a) % Economically Disadvantaged (correlation: -0.062)
(b) Math Test Scores: average 11th grade math score (correlation: 0.065)
(c) Local CC Size: local CC undergraduate enrollment (correlation: -0.094)
(d) Local CC Quality: local CC instructional spending per FTE (correlation: 0.139)

Figure A.1.5: Reduced Form Estimates with Alternative Bandwidths
(Effect of in-district status on local CC enrollment, 0 to 0.15, plotted against bandwidths of 0.1 to 4.0 miles to the border.)

Figure A.1.6: School Districts Overlapping Community College Districts

Table A.1.1: Mean Tuition Rates at Michigan Community Colleges, 2008-2016

                  Per Credit   Per Semester   Per Year    Annual/Income
In-District       $94.44       $1,133.28      $2,266.56   3.78%
Out-of-District   $155.39      $1,864.68      $3,729.36   6.22%
Difference        $60.95       $731.40        $1,462.80   2.44%

Notes: Tuition rates are provided by Michigan's Workforce Development Agency and converted into real 2016 dollars. All amounts are averaged across academic years 2008-2009 to 2015-2016. "Per semester" rates are calculated as the cost of 12 credits and "per year" rates are calculated as the cost of 24 credits. The final column, "Annual/Income," presents the "per year" estimates divided by $60,000, the approximate median household income of students attending Michigan's community colleges (Chetty et al., 2017).
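The derived columns of Table A.1.1 are simple multiples of the per-credit rates. A minimal sketch of the arithmetic, assuming the $60,000 median-income figure cited in the table notes (the function name is ours, not the dissertation's):

```python
# Reproduce the derived columns of Table A.1.1 from the per-credit rates.
MEDIAN_INCOME = 60_000  # approx. median household income of Michigan CC students

def tuition_row(per_credit: float) -> dict:
    """Per-semester (12 credits), per-year (24 credits), and income-share figures."""
    per_year = round(per_credit * 24, 2)
    return {
        "per_semester": round(per_credit * 12, 2),
        "per_year": per_year,
        "annual_over_income": round(per_year / MEDIAN_INCOME, 4),
    }

in_district = tuition_row(94.44)    # per_year $2,266.56, about 3.78% of income
out_district = tuition_row(155.39)  # per_year $3,729.36, about 6.22% of income
```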
Table A.1.2: Associate Degree Programs Offered by Community & Vocational Colleges

CIP   CIP Title                                                MI Comm.   Vocational   Diff.
                                                               Colleges   Colleges
1     Agriculture, Agriculture Operations, and Related         0.342      0.198        0.144
3     Natural Resources and Conservation                       0.258      0.006        0.252
4     Architecture and Related Services                        0.249      0.001        0.248
5     Area, Ethnic, Cultural, and Gender Studies               0.097      0.000        0.097
9     Communication, Journalism, and Related                   0.536      0.726        -0.191
10    Communications Technologies/Technicians and Support      0.615      0.724        -0.109
11    Computer and Information Sciences and Support            0.996      0.957        0.039
12    Personal and Culinary Services                           0.723      0.130        0.593
13    Education                                                0.864      0.735        0.129
14    Engineering                                              0.750      0.427        0.323
15    Engineering Technologies/Technicians                     0.999      0.778        0.221
16    Foreign Languages, Literatures, and Linguistics          0.371      0.302        0.069
19    Family and Consumer Sciences/Human Sciences              0.838      0.725        0.112
22    Legal Professions and Studies                            0.782      0.343        0.440
23    English Language and Literature/Letters                  0.346      0.006        0.339
24    Liberal Arts and Sciences, General Studies, Humanities   1.000      0.084        0.916
25    Library Science                                          0.191      0.000        0.190
26    Biological and Biomedical Sciences                       0.585      0.056        0.529
27    Mathematics and Statistics                               0.377      0.006        0.372
29    Military Technologies                                    0.000      0.001        -0.001
30    Multi/Interdisciplinary Studies                          0.429      0.007        0.422
31    Parks, Recreation, Leisure, and Fitness Studies          0.381      0.115        0.266
38    Philosophy and Religious Studies                         0.185      0.000        0.185
39    Theology and Religious Vocations                         0.018      0.000        0.017
40    Physical Sciences                                        0.416      0.006        0.411
41    Science Technologies/Technicians                         0.439      0.000        0.438
42    Psychology                                               0.397      0.007        0.389
43    Security and Protective Services                         0.996      0.939        0.056
44    Public Administration and Social Service Professions     0.483      0.010        0.473
45    Social Sciences                                          0.284      0.008        0.276
46    Construction Trades                                      0.567      0.016        0.551
47    Mechanic and Repair Technologies/Technicians             0.931      0.802        0.129
48    Precision Production                                     0.842      0.421        0.421
49    Transportation and Materials Moving                      0.264      0.745        -0.482
50    Visual and Performing Arts                               0.930      0.765        0.165
51    Health Professions and Related Clinical Sciences         1.000      0.945        0.055
52    Business, Management, Marketing, and Related             1.000      0.962        0.038
54    History                                                  0.231      0.011        0.220

Notes: All data comes from the U.S. Department of Education's College Scorecard. All variables are averaged across all 2009-2016 high school graduates who enroll in college within one year of high school graduation to reflect the characteristics of the colleges that students attend.

Table A.1.3: Baker College vs. Private Two-Year Colleges

                                      All Private Two-Years
Variable:                Baker        All          For-Profit   Not-For-Profit
                         College (1)  (2)          (3)          (4)
Avg. Net Price           $12,333      $16,320      $16,200      $17,172
Instruction $ per FTE    $4,010       $4,361       $3,908       $7,525
% Full-Time Faculty      0.103        0.481        0.457        0.580
200% Graduation Rate     0.168        0.682        0.688        0.635
% Liberal Arts Degrees   0.003        0.007        0.000        0.058
Median Earnings          $26,880      $26,129      $24,512      $35,725
Median Debt              $8,447       $8,927       $8,673       $10,637
Institutions             1            2,092        1,838        336

Notes: Data comes from the College Scorecard, averaged over all years 2009-2016.

Table A.1.4: Michigan's Traditional Four-Year Colleges

Variable:               Flagships (1)   Other Public (2)   Private (3)
Undergraduates          32,475          12,522             1,669
Avg. Net Price          $15,477         $13,245            $18,995
Instruction $ per FTE   $17,943         $8,448             $8,430
% Full-Time Faculty     0.840           0.640              0.685
200% Graduation Rate    0.850           0.512              0.521
Median Earnings         $55,220         $42,229            $41,938
Median Debt             $18,771         $14,170            $16,106

Notes: Data comes from the College Scorecard, averaged over all years 2009-2016.
Institutions (Table A.1.4): 2 Flagships, 13 Other Public, 26 Private.

Table A.1.5: Michigan's Community College Districts

Each community college is listed with the counties, school districts, or cities/townships that comprise its district (* denotes service district areas):

Alpena: Alpena
Bay de Noc: Delta, Dickinson*
Delta: Bay, Midland, Saginaw
Gogebic: Gogebic
Grand Rapids: Kent ISD: Byron Center, Caledonia, Cedar Springs, Comstock Park, East Grand Rapids, Forest Hills, Godfrey Lee, Godwin Heights, Grand Rapids, Grandville, Kelloggsville, Kenowa Hills, Kent City, Kentwood, Lowell, Northview, Rockford, Sparta, Thornapple Kellogg, Wyoming
Henry Ford: Dearborn
Jackson: Jackson
Kalamazoo Valley: Climax-Scotts, Comstock, Galesburg-Augusta, Gull Lake, Kalamazoo, Mattawan, Parchment, Portage, Schoolcraft, Vicksburg
Kellogg: Albion, Athens, Battle Creek, Harper Creek, Homer, Lakeview, Mar-Lee, Marshall, Pennfield, Tekonsha, Union City
Kirtland: C.O.O.R. ISD: Crawford-AuSable, Fairview Area, Houghton Lake, Mio-AuSable, Roscommon Area, West Branch-Rose City
Lake Michigan: Berrien; Covert, South Haven
Lansing: Bath, Dansville, Dewitt, East Lansing, Grand Ledge, Haslett, Holt/Diamondale, Lansing, Leslie, Mason, Okemos, Stockbridge, Waverly, Webberville, Williamston
Macomb: Macomb
Mid Michigan: Clare-Gladwin RESA: Beaverton, Clare, Farwell, Gladwin, Harrison
Monroe County: Monroe
Montcalm: Montcalm Area ISD: Carson City-Crystal, Central Montcalm, Greenville, Lakeview, Montabella, Tri County, Vestaburg
Mott: Genesee ISD: Atherton, Beecher, Bendle, Bentley, Carman-Ainsworth, Clio, Davison, Fenton, Flint, Flushing, Genesee, Goodrich, Grand Blanc, Kearsley, Lake Fenton, Lakeville, Linden, Montrose, Mt. Morris, Swartz Creek, Westwood Heights
Muskegon: Muskegon
North Central Michigan: Emmet
Northwestern Michigan: Grand Traverse
Oakland: Oakland
Schoolcraft: Clarenceville, Garden City, Livonia, Northville, Novi (part), Plymouth-Canton
Southwestern Michigan: Cass; Keeler, Hamilton
St. Clair County: Algonac, Capac, East China, Marysville, Memphis, Port Huron, Yale
Washtenaw: Washtenaw
Wayne County: Wayne; NOT INCLUDED: Dearborn, Garden City, Highland Park, Livonia, Northville, Plymouth, Canton (part)
West Shore: Bear Lake, Free Soil, Kaleva-Norman-Dickson, Ludington, Manistee, Mason County Central, Mason County Eastern, Onekama, Walkerville; Crystal, Elbridge, Weare

Notes: * denotes service district areas.

Table A.1.6: Descriptive Statistics, 2009-2016 High School Graduates

                              All Students                    Analysis Sample
Variable                      All       In        Out         All      In       Out

Panel A. Demographics
White                         0.760     0.719     0.906       0.851    0.814    0.911
Black                         0.150     0.189     0.015       0.081    0.110    0.034
Hispanic                      0.041     0.041     0.043       0.029    0.029    0.030
Male                          0.490     0.488     0.498       0.499    0.497    0.503
FRPL eligible                 0.333     0.337     0.320       0.300    0.315    0.278
Special education             0.082     0.082     0.085       0.081    0.078    0.086
English language learner      0.025     0.030     0.010       0.021    0.029    0.007
Resides in CC district        0.779     1.000     0.000       0.616    1.000    0.000

Panel B. High School Academics
Math standardized score       0.095     0.075     0.169       0.120    0.090    0.168
Reading standardized score    0.087     0.071     0.141       0.104    0.078    0.144
School of choice              0.096     0.094     0.104       0.124    0.120    0.130
On-time graduation            0.966     0.965     0.972       0.970    0.968    0.974
Dual enrollment in HS         0.095     0.088     0.121       0.108    0.102    0.117

Panel C. One-Year College Enrollment
Community college             0.294     0.314     0.226       0.295    0.314    0.265
Vocational college            0.031     0.027     0.046       0.035    0.031    0.043
Four-year college             0.407     0.411     0.393       0.375    0.373    0.378
Any college                   0.697     0.712     0.642       0.674    0.684    0.658

Observations                  734,928   572,581   162,347     64,667   39,814   24,853

Notes: The "All Students" sample includes all students who graduate from a traditional public high school in Michigan between 2009 and 2016, take the Michigan Merit Exam (MME), and have non-missing geographic and test score information. The "Analysis Sample" further restricts the sample to students who reside within two miles of a community college district boundary. Students who attend alternative education high schools or juvenile detention centers are not included in either sample.

Table A.1.7: First Stage Estimate of In-District Status on Tuition

Variable              No Controls   Distance Controls   All Controls
In-District Status    -1,797***     -1,814***           -1,795***
                      (269.0)       (240.0)             (240.6)
Observations          64,667        64,667              64,667
Partial F-Statistic   45.46         55.67               56.05
Adjusted R2           0.901         0.905               0.905

Notes: Each coefficient is estimated from a single regression and corresponds to λ in equation (1.2), representing the difference in local community college tuition faced by students residing inside of a community college district, as compared to students residing outside of a community college district. All standard errors are clustered at the boundary segment level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.

Table A.1.8: Balance Tests of Student Characteristics

Outcome:             White     Male      FRPL      SPED       ELL
                     (1)       (2)       (3)       (4)        (5)
In-District Effect   0.001     -0.004    -0.015    -0.009***  0.006
                     (0.010)   (0.005)   (0.012)   (0.003)    (0.006)
Observations         64,667    64,667    64,667    64,667     64,667
Mean                 0.851     0.499     0.300     0.081      0.021

Outcome:             Math Score   Reading Score   On-Time Grad   Dual Enroll   Pred. CC Enrollment
                     (6)          (7)             (8)            (9)           (10)
In-District Effect   0.012        0.015           -0.001         -0.008*       0.002
                     (0.013)      (0.012)         (0.003)        (0.004)       (0.001)
Observations         64,667       64,667          64,667         64,667
Mean                 0.120        0.104           0.970          0.108

Notes: The sample consists of all students who reside within two miles of the nearest community college district boundary segment and graduated from high school between 2009 and 2016. Each coefficient is estimated from a single regression that regresses the student characteristic of interest on a dummy variable for in-district status and the full set of boundary segment by year fixed effects.
The coefficients represent the average difference in characteristics among students who reside within two miles of the same community college district boundary and graduate from high school in the same year. All standard errors are clustered at the boundary segment level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01. For column (10), N = 64,667 and the mean is 0.295.

Table A.1.9: Balance Tests of Census Tract Characteristics

Outcome:             Median HH Income   Poverty Share   Mean 3rd Grade Math Score   2 Bedroom Rental Price
                     (1)                (2)             (3)                         (4)
In-District Effect   1,427              -0.003          -0.077                      13.08
                     (1,065)            (0.005)         (0.116)                     (12.69)
Observations         64,645             64,667          64,653                      46,927
Mean                 59,505             0.118           2.976                       748.58

Outcome:             Single Parent Share   Non-White Share   High-Paying Job Share   Job Growth 2004-2013
                     (5)                   (6)               (7)                     (8)
In-District Effect   0.001                 0.001             0.008                   0.004
                     (0.011)               (0.001)           (0.006)                 (0.009)
Observations         64,667                64,667            64,667
Mean                 0.142                 0.245             0.387

Notes: The sample consists of all students who reside within two miles of the nearest community college district boundary segment and graduated from high school between 2009 and 2016. Each coefficient is estimated from a single regression that regresses the census tract characteristic of interest on a dummy variable for in-district status and the full set of boundary segment by year fixed effects. The coefficients represent the average difference in characteristics among students who reside within two miles of the same community college district boundary and graduate from high school in the same year. All data comes from the Equality of Opportunity Project and is publicly available at: https://opportunityinsights.org/data/. All standard errors are clustered at the boundary segment level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.
For column (8) of Table A.1.9, N = 64,667 and the mean is 0.006.

Table A.1.10: Balance Tests of Distance to Postsecondary Institutions

Outcome:             Local CC    Public Four-Year   Private Four-Year   Private Vocational
                     (1)         (2)                (3)                 (4)
In-District Effect   -1.462***   0.023              -0.929**            -0.717
                     (0.178)     (0.432)            (0.442)             (0.468)
Observations         64,667      64,667             64,667              64,667
Mean                 10.30       19.93              23.73               19.50

Notes: The sample consists of all students who reside within two miles of the nearest community college district boundary segment and graduated from high school between 2009 and 2016. Each coefficient is estimated from a single regression that regresses the distance of interest on a dummy variable for in-district status and the full set of boundary segment by year fixed effects. The coefficients represent the average difference in distances among students who reside within two miles of the same community college district boundary and graduate from high school in the same year. All standard errors are clustered at the boundary segment level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.

Table A.1.11: Effect of In-District Status and Reduced Tuition on College Enrollment

Outcome:             Local CC   Non-Local CC   Vocational College   Four-Year College   Any College
                     (1)        (2)            (3)                  (4)                 (5)

Panel A. All Cohorts
In-District Effect   0.064***   -0.028***      -0.007***            -0.010              0.013**
                     (0.007)    (0.006)        (0.002)              (0.007)             (0.005)
Tuition Effect       0.035***   -0.015***      -0.004***            -0.005              0.007**
                     (0.004)    (0.004)        (0.001)              (0.003)             (0.003)
Observations         64,667     64,667         64,667               64,667              64,667
Mean                 0.209      0.089          0.035                0.375               0.674

Panel B. 2009-2011 Cohorts
In-District Effect   0.060***   -0.035***      -0.007**             -0.005              0.006
                     (0.010)    (0.007)        (0.003)              (0.008)             (0.008)
Tuition Effect       0.036***   -0.021***      -0.004***            -0.003              0.004
                     (0.006)    (0.006)        (0.001)              (0.005)             (0.004)
Observations         23,734     23,734         23,734               23,734              23,734
Mean                 0.225      0.096          0.040                0.368               0.691

Notes: The sample in Panel A consists of all students who reside within two miles of the nearest community college district boundary segment and graduated from high school between 2009 and 2016. Panel B further restricts the sample to students who graduated from high school between 2009 and 2011. In both panels, each coefficient is estimated from a single regression. The coefficients in the "in-district effect" rows correspond to δ in equation (1.1), representing the estimated change in the probability of an outcome due to a student residing in a community college district. The coefficients in the "tuition effect" rows correspond to −β ∗ 1000, where β is defined as in equation (1.3). These coefficients represent the estimated change in the probability of an outcome due to a $1,000 decrease in the annual tuition rate at a student's local community college. All regressions include controls for a student's race/ethnicity, gender, FRPL status, special education participation, ELL status, math and reading test scores, school of choice participation, on-time graduation, and dual enrollment experience, as well as the distance between the centroid of a student's census block of residence and the nearest campus of the local community college, the nearest vocational college, the nearest in-state public university, and the nearest in-state private four-year college. All standard errors are clustered at the boundary segment level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.
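The relationship between the "in-district effect" and "tuition effect" rows can be sanity-checked with a back-of-the-envelope ratio: dividing the reduced-form in-district effect by the roughly $1,800 first-stage tuition difference (Table A.1.7), expressed per $1,000. This is only an implied Wald-style ratio under our reading of the table notes; the published tuition-effect rows come from estimating equation (1.3) directly, so the match is approximate. The function name is ours:

```python
# Rescale a reduced-form in-district effect into a per-$1,000 tuition effect,
# i.e., the ratio implied by combining equations (1.1)-(1.3).
def tuition_effect(in_district_effect: float, first_stage_dollars: float) -> float:
    """Effect of a $1,000 tuition decrease implied by the in-district effect."""
    return in_district_effect / (abs(first_stage_dollars) / 1000)

# Local CC enrollment: 0.064 reduced form, -$1,795 first stage (all-controls
# column) gives roughly 0.036, close to the reported 0.035.
implied = tuition_effect(0.064, -1795)
```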
Table A.1.12: Heterogeneous Effects by Graduation Year

Variable:                      Local CC   Non-Local CC   Vocational College   Four-Year College   Any College
                               (1)        (2)            (3)                  (4)                 (5)
In-District Effect             0.066***   -0.024***      -0.007***            -0.009              0.019***
                               (0.008)    (0.006)        (0.002)              (0.007)             (0.006)
In-District x 2009-2011 Grad   -0.005     -0.010**       -0.001               -0.001              -0.017**
                               (0.008)    (0.004)        (0.003)              (0.007)             (0.007)
Observations                   64,667     64,667         64,667               64,667              64,667
Mean                           0.209      0.089          0.035                0.375               0.674

Notes: The sample consists of all students who reside within two miles of the nearest community college district boundary segment and graduated from high school between 2009 and 2016. The coefficients in the "in-district effect" rows correspond to δ in equation (1.1) for the 2012-2016 cohorts, representing the estimated change in the probability of an outcome due to a student residing in a community college district. The coefficients in the second row represent the difference in the in-district effect between the 2009-2011 cohorts and the 2012-2016 cohorts. All regressions include controls for a student's race/ethnicity, gender, FRPL status, special education participation, ELL status, math and reading test scores, school of choice participation, on-time graduation, and dual enrollment experience, as well as the distance between the centroid of a student's census block of residence and the nearest campus of the local community college, the nearest vocational college, the nearest in-state public university, and the nearest in-state private four-year college. All standard errors are clustered at the boundary segment level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.

Table A.1.13: Characteristics of Community and Vocational Colleges

Variable                 MI Community Colleges (1)   Vocational Colleges (2)   Difference (3)
Avg. Net Price           $5,325.38                   $14,004.62                -$8,679.24
Instruction $ per FTE    $4,993.05                   $3,897.80                 $1,095.25
% Full-Time Faculty      0.400                       0.213                     0.188
Transfer Rate            0.360                       0.111                     0.249
150% Graduation Rate     0.135                       0.196                     -0.061
% Liberal Arts Degrees   0.349                       0.006                     0.343
Median Earnings          $29,326.50                  $29,018.95                $307.55
Median Debt              $4,211.58                   $8,867.49                 -$4,655.91
Students                 191,394                     21,720                    -
Institutions             28                          144                       -

Notes: All data comes from the U.S. Department of Education's College Scorecard, except for the transfer rate variable, which is calculated on the full sample of Michigan's 2009-2011 high school graduates who enroll in community or vocational colleges within one year of high school graduation. All variables are averaged across all 2009-2016 high school graduates who enroll in college within one year of high school graduation to reflect the characteristics of the colleges that students attend.

Table A.1.14: Effect of In-District Status and Reduced Tuition on College Completion

Outcome:             Semesters   Credits     Transfer to   Certificate   Associate    Bachelor's
                     Completed   Completed   Four-Year     Completion    Completion   Completion
                     (1)         (2)         (3)           (4)           (5)          (6)
In-District Effect   0.344***    3.463***    0.011**       -0.003        0.005        0.018**
                     (0.097)     (1.302)     (0.005)       (0.004)       (0.005)      (0.008)
Tuition Effect       0.206***    2.069***    0.007**       -0.002        0.003        0.011**
                     (0.062)     (0.656)     (0.003)       (0.003)       (0.002)      (0.005)
Observations         23,734      23,734      23,734        23,734        23,734       23,734
Mean                 8.133       76.46       0.115         0.055         0.126

Notes: The sample consists of all students who reside within two miles of the nearest community college district boundary segment and graduated from high school between 2009 and 2011. Each coefficient is estimated from a single regression. The coefficients in the "in-district effect" rows correspond to δ in equation (1.1), representing the estimated change in the probability of an outcome due to a student residing in a community college district.
The coefficients in the "tuition effect" rows correspond to −β ∗ 1000, where β is defined as in equation (1.3). These coefficients represent the estimated change in the probability of an outcome due to a $1,000 decrease in the annual tuition rate at a student's local community college. All regressions include controls for a student's race/ethnicity, gender, FRPL status, special education participation, ELL status, math and reading test scores, school of choice participation, on-time graduation, and dual enrollment experience, as well as the distance between the centroid of a student's census block of residence and the nearest campus of the local community college, the nearest vocational college, the nearest in-state public university, and the nearest in-state private four-year college. All standard errors are clustered at the boundary segment level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01. For column (6), N = 23,734 and the mean is 0.316.

Table A.1.15: Distribution of Degree Completion Increases Across Majors

Outcome:             General Studies   Liberal Arts   Health    Business   Technical   Prof.     Other
                     (1)               (2)            (3)       (4)        (5)         (6)       (7)

Panel A. Associate Degree
In-District Effect   0.010***          -0.000         -0.002    -0.001     -0.002      -0.002    0.003*
                     (0.002)           (0.002)        (0.002)   (0.002)    (0.002)     (0.002)   (0.001)
Tuition Effect       0.006***          -0.000         -0.001    -0.001     -0.001      -0.001    0.002*
                     (0.002)           (0.001)        (0.002)   (0.001)    (0.001)     (0.001)   (0.001)
Observations         23,734            23,734         23,734    23,734     23,734      23,734    23,734
Mean                 0.033             0.010          0.023     0.013      0.016       0.016     0.016

Panel B. Bachelor's Degree
In-District Effect   0.002**           0.001          0.001     0.007*     -0.002      0.009**   0.001
                     (0.001)           (0.005)        (0.003)   (0.004)    (0.004)     (0.004)   (0.003)
Tuition Effect       0.001**           0.000          0.001     0.004*     -0.001      0.005***  0.001
                     (0.001)           (0.003)        (0.002)   (0.002)    (0.002)     (0.002)   (0.001)
Observations         23,734            23,734         23,734    23,734     23,734      23,734    23,734
Mean                 0.003             0.100          0.034     0.067      0.051       0.038     0.022

Notes: In both panels, the sample consists of all students who reside within two miles of the nearest community college district boundary segment and graduated from high school between 2009 and 2011. Each coefficient is estimated from a single regression. The coefficients in the "in-district effect" rows correspond to δ in equation (1.1), representing the estimated change in the probability of an outcome due to a student residing in a community college district. The coefficients in the "tuition effect" rows correspond to −β ∗ 1000, where β is defined as in equation (1.3). These coefficients represent the estimated change in the probability of an outcome due to a $1,000 decrease in the annual tuition rate at a student's local community college. All regressions include controls for a student's race/ethnicity, gender, FRPL status, special education participation, ELL status, math and reading test scores, school of choice participation, on-time graduation, and dual enrollment experience, as well as the distance between the centroid of a student's census block of residence and the nearest campus of the local community college, the nearest vocational college, the nearest in-state public university, and the nearest in-state private four-year college. All standard errors are clustered at the boundary segment level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.
Table A.1.16: Academic Program Categories

Major Group       Two-Digit CIP Codes and Titles
General Studies   24: Liberal Arts and Sciences, General Studies and Humanities
Liberal Arts      1: Agriculture, Agriculture Operations, and Related Sciences; 3: Natural Resources and Conservation; 5: Area, Ethnic, Cultural, and Gender Studies; 16: Foreign Languages, Literatures, and Linguistics; 23: English Language and Literatures; 26: Biological and Biomedical Sciences; 27: Mathematics and Statistics; 30: Multi/Interdisciplinary Studies; 38: Philosophy and Religious Studies; 40: Physical Sciences; 42: Psychology; 45: Social Sciences; 50: Visual and Performing Arts; 54: History
Health            51: Health Professions and Related Clinical Sciences
Business          52: Business, Management, Marketing, and Related Support Services
Technical         4: Architecture and Related Services; 10: Communications Technologies/Technicians and Support Services; 11: Computer and Information Sciences and Support Services; 14: Engineering; 15: Engineering Technologies/Technicians; 41: Science Technologies/Technicians; 46: Construction Trades; 47: Mechanic and Repair Technologies/Technicians; 48: Precision Production; 49: Transportation and Materials Moving
Professional      9: Communication, Journalism, and Related Programs; 12: Personal and Culinary Services; 13: Education; 19: Family and Consumer Sciences/Human Sciences; 22: Legal Professions and Studies; 25: Library Science; 31: Parks, Recreation, Leisure, and Fitness Studies; 43: Security and Protective Services; 44: Public Administration and Social Service Professions

Table A.1.17: Distribution of Bachelor's Degree Increases Across Professional Majors

Outcome:             Protective   Family & Consumer   Personal Care   Legal     Education &       Comm. &      Public   Parks, Rec.,
                     Service      Sciences            & Culinary      Studies   Library Science   Journalism   Admin.   Leisure, & Fitness
                     (1)          (2)                 (3)             (4)       (5)               (6)          (7)      (8)
In-District Effect   0.001        -0.001              0.000           0.000     0.004**           -0.000       0.002*   0.003**
                     (0.002)      (0.001)             (0.000)         (0.001)   (0.002)           (0.002)      (0.001)  (0.001)
Tuition Effect       0.001        -0.001              0.000           0.000     0.002**           -0.000       0.001    0.002*
                     (0.001)      (0.001)             (0.000)         (0.000)   (0.001)           (0.001)      (0.001)  (0.001)
Observations         23,734       23,734              23,734          23,734    23,734            23,734       23,734   23,734
Mean                 0.008        0.0002              0.018           0.001     0.004             0.011        0.017    0.008

Notes: The sample consists of all students who reside within two miles of the nearest community college district boundary segment and graduated from high school between 2009 and 2011. Each coefficient is estimated from a single regression. The coefficients in the "in-district effect" rows correspond to δ in equation (1.1), representing the estimated change in the probability of an outcome due to a student residing in a community college district. The coefficients in the "tuition effect" rows correspond to −β ∗ 1000, where β is defined as in equation (1.3). These coefficients represent the estimated change in the probability of an outcome due to a $1,000 decrease in the annual tuition rate at a student's local community college. All regressions include controls for a student's race/ethnicity, gender, FRPL status, special education participation, ELL status, math and reading test scores, school of choice participation, on-time graduation, and dual enrollment experience, as well as the distance between the centroid of a student's census block of residence and the nearest campus of the local community college, the nearest vocational college, the nearest in-state public university, and the nearest in-state private four-year college. All standard errors are clustered at the boundary segment level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.

Table A.1.18: Heterogeneity by Student Characteristics

                         One year enrollment:                                        Completion:
Outcome:                 Local      Non-Local   Voc.       Four-Year   Any          Assoc.    Bach.
                         (1)        (2)         (3)        (4)         (5)          (6)       (7)
Overall effect           0.064***   -0.028***   -0.007***  -0.010      0.013***     0.005     0.018**
                         (0.007)    (0.006)     (0.002)    (0.007)     (0.005)      (0.005)   (0.008)

Panel A.
FRPL Eligibility Ineligible Eligible 0.067*** (0.008) 0.056*** (0.009) -0.033*** (0.007) -0.015** (0.006) -0.007*** (0.002) -0.009** (0.003) -0.010 (0.008) -0.008 (0.008) 0.010 (0.006) 0.019*** (0.007) 0.006 (0.005) 0.002 (0.009) 0.015 (0.009) 0.028** (0.013) Ineligible = eligible? 0.244 0.026 0.647 0.859 0.312 0.686 0.436 Panel B. Gender Female Male 0.056*** (0.008) 0.072*** (0.008) -0.034*** (0.007) -0.022*** (0.008) -0.011*** (0.003) -0.004* (0.002) -0.005 (0.007) -0.014* (0.008) -0.001 (0.009) 0.026*** (0.007) 0.004 (0.008) 0.006 (0.006) 0.019* (0.011) 0.017* (0.010) Female = male? 0.008 0.138 0.033 0.206 0.017 0.832 0.913 Panel C. Test Score Bottom quartile Middle two quartiles Top quartile 0.074*** (0.011) 0.075*** (0.008) 0.029*** (0.010) -0.018** (0.009) -0.036*** (0.007) -0.19** (0.008) -0.011*** (0.004) -0.009*** (0.003) 0.001 (0.003) -0.023** (0.009) -0.003 (0.007) -0.008 (0.015) 0.021** (0.011) 0.017*** (0.007) -0.004 (0.008) 0.012 (0.010) -0.001 (0.007) 0.011 (0.0010) 0.008 (0.011) 0.026** (0.011) 0.014 (0.014) Bottom = middle? Top = middle? 0.934 0.000 0.031 0.017 0.677 0.022 0.061 0.702 0.730 0.029 0.363 0.288 0.150 0.488 64,667 64,667 64,667 23,734 64,667 64,667 N Notes: For outcomes (1)-(5), the sample consists of all students who reside within two miles of the nearest community college district boundary segment, graduated from high school between 2009 and 2016. For outcomes (6) and (7), the sample is further restricted to students who graduated from high school between 2009 and 2011, and students who earn postsecondary degrees in high school are dropped from the sample. Coefficients are estimated from regressions with interaction terms, as described in section 1.5.3. 
All regressions include controls for a student’s race/ethnicity, gender, FRPL status, special education participation, ELL status, math and reading test scores, school of choice participation, on-time graduation, and dual enrollment experience, as well as the distance between the centroid of a student’s census block of residence and the nearest campus of the local community college, the nearest vocational college, the nearest in-state public university, and the nearest in-state private four-year college. All standard errors are clustered at the boundary segment level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01. 23,734 126 Table A.1.19: Balance Tests of Student Characteristics, Varying Bandwidths Bandwidth: White Male (2) 0.001 (0.003) 4 Miles (N=145,775) -0.015 (0.013) (1) FRPL (3) -0.009 (0.016) SPED (4) -0.008* (0.004) ELL Math Reading On-Time (5) 0.004 (0.003) (6) 0.018 (0.016) 0.026** (0.012) -0.001 (0.003) (8) (7) 3 Miles (N=102,791) -0.008 (0.009) -0.000 (0.004) -0.012 (0.012) -0.009** (0.004) 0.004 (0.004) 0.022* (0.013) 0.023** (0.011) 2 Miles (N=64,667) 0.001 (0.010) -0.004 (0.005) -0.015 (0.012) -0.009*** (0.003) 0.006 (0.006) 0.012 (0.013) 0.015 (0.012) 1 Mile (N=31,541) 0.016 (0.013) 0.004 (0.007) -0.023 (0.019) -0.010** (0.005) 0.010 (0.010) 0.020 (0.016) 0.008 (0.015) 0.5 Miles (N=15,185) 0.020 (0.014) 0.005 (0.009) -0.032 (0.025) -0.009 (0.007) 0.008 (0.010) 0.017 (0.020) 0.026 (0.022) -0.001 (0.002) -0.001 (0.003) -0.002 (0.003) -0.003 (0.006) Dual (9) -0.007 (0.006) -0.007* (0.004) -0.008* (0.004) -0.011* (0.006) -0.014* (0.008) -0.011 (0.018) 0.001 (0.003) -0.015 (0.052) -0.014 (0.022) 0.051** (0.023) 0.054** (0.025) -0.023 0.1 Miles (0.028) (N=1,136) Notes: The sample consists of all students who reside within the specified distance of the nearest community college district boundary segment and graduated from high school between 2009 and 2016. 
Each coefficient is estimated from a single regression that regresses the student characteristic of interest on a dummy variable for in-district status and the full set of boundary segment by year fixed effects. The coefficients represent the average difference in characteristics among students who reside within the specified distance of the same community college district boundary and graduate from high school in the same year. All standard errors are clustered at the boundary segment level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01. -0.015 (0.024) 0.020 (0.070) 127 Table A.1.20: Local Community College Enrollment Results, Within Same School District Local CC Enrollment Bachelor’s Degree Main Strategy School District Main Strategy School District (1) (2) (3) In-District Effect Tuition Effect 0.064*** (0.007) 0.035*** (0.004) 0.050*** (0.014) 0.032*** (0.011) 0.018** (0.008) 0.011** (0.005) (4) 0.015 (0.022) 0.011 (0.015) 6,946 0.292 64,667 0.209 23,734 0.316 17,783 0.233 Observations Mean Notes: Columns (1) and (3) repeat the estimates for local community college enrollment and bachelor’s degree completion presented in Tables 5 and 7, respectively. Here, the sample consists of all students who reside within two miles of the nearest community college district boundary segment and graduated from high school between 2009 and 2016. Standard errors are cluster at the boundary segment level. Columns (2) and (4) present reduced form and 2SLS estimates on the sample of school districts that overlap community college districts (see Section 1.5.4). The sample consists of all students who reside in one of the overlapping school districts and graduated from high school between 2009 and 2016. In these columns, standard errors are clustered at the school district level. 
All regressions include controls for a student’s race/ethnicity, gender, FRPL status, special education participation, ELL status, math and reading test scores, school of choice participation, on-time graduation, and dual enrollment experience, as well as the distance between the centroid of a student’s census block of residence and the nearest campus of the local community college, the nearest vocational college, the nearest in-state public university, and the nearest in-state private four-year college. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.

Table A.1.21: Full Enrollment Results for Within Same School District Sample

Outcomes, by column: (1) Local CC; (2) Non-Local CC; (3) Vocational College; (4) Four-Year College; (5) Any College

In-District Effect: 0.050*** (0.014) | -0.020* (0.012) | -0.001 (0.005) | -0.016 (0.018) | 0.014 (0.016)
Tuition Effect: 0.032*** (0.011) | -0.013 (0.008) | -0.001 (0.003) | -0.010 (0.012) | 0.009 (0.010)
Observations: 17,783 in each column
Mean: 0.233 | 0.067 | 0.035 | 0.336 | 0.643

Notes: The sample consists of all students who reside within a school district that intersects a community college district and graduated from high school between 2009 and 2016. Each coefficient is estimated from a single regression that includes a full set of school district by graduation year fixed effects. The coefficients in the “in-district effect” row represent the estimated change in the probability of an outcome due to a student residing in a community college district. The coefficients in the “tuition effect” row represent the estimated change in the probability of an outcome due to a $1,000 decrease in the annual tuition rate at a student’s local community college. All regressions include controls for a student’s race/ethnicity, gender, FRPL status, special education participation, ELL status, math and reading test scores, school of choice participation, on-time graduation, and dual enrollment experience, as well as the distance between the centroid of a student’s census block of residence and the nearest campus of the local community college, the nearest vocational college, the nearest in-state public university, and the nearest in-state private four-year college. All standard errors are clustered at the school district level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.

Table A.1.22: Placebo Tests

Outcomes, by column — Local CC Enrollment: (1) Further In; (2) Further Out — Bachelor’s Degree: (3) Further In; (4) Further Out

Estimate: 0.005 (0.007) | 0.007* (0.004) | -0.012* (0.007) | -0.004 (0.008)
Observations: 94,582 | 50,527 | 33,676 | 19,390
Mean: 0.242 | 0.159 | 0.318 | 0.314

Notes: Each column reports the estimates of a placebo test that alters the boundaries of the community college districts. Columns (1) and (3) contract all community college districts by 2 miles; columns (2) and (4) expand all community college districts by 2 miles. Each sample consists of all students who reside within two miles of the nearest placebo community college district boundary segment and graduated from high school between 2009 and 2016. Each column then estimates δ from equation (1.1) using the constructed placebo community college district boundaries. All regressions include controls for a student’s race/ethnicity, gender, FRPL status, special education participation, ELL status, math and reading test scores, school of choice participation, on-time graduation, and dual enrollment experience, as well as the distance between the centroid of a student’s census block of residence and the nearest campus of the local community college, the nearest vocational college, the nearest in-state public university, and the nearest in-state private four-year college.
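The notes in this appendix repeatedly reference δ from equation (1.1) and a tuition effect equal to −β × 1000, with β from equation (1.3). As a reading aid, the boundary specification can be sketched as follows; the notation here is schematic (the subscripts are mine, inferred from the table notes rather than quoted from the chapter):

```latex
% Schematic boundary fixed-effects specification in the spirit of equation (1.1):
% outcome y for student i near boundary segment b in graduation cohort t.
y_{ibt} = \delta \,\mathrm{InDistrict}_{ib} + X_{ibt}'\gamma + \theta_{bt} + \varepsilon_{ibt}
% \theta_{bt}: boundary-segment-by-year fixed effects; X_{ibt}: the student and
% distance controls listed in the notes; standard errors clustered by segment.
```

Per the table notes, the reported tuition effect is then −β̂ × 1000, which rescales the boundary discontinuity so that estimates read as the change in the probability of an outcome from a $1,000 decrease in annual tuition at the local community college.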
All standard errors are clustered at the placebo boundary segment level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.

APPENDIX B

CHAPTER 2 APPENDIX

B.1 Figures & Tables

Figure B.1.1: Differences in Course-Taking and Credit Completion by CC Program Group
(a) Share of Courses Taken; (b) Share of Credits Completed
[Stacked-bar figure; program groups: Non-Voc., Business, Health, Trades, STEM, Law Enf., Other]
Notes: Each bar represents the share of courses taken or credits completed in different areas of study among students pursuing a program in the designated program group (e.g., business, health, etc.). The sample consists of all students who enroll in Michigan community colleges within six months of high school graduation. Only courses taken and credits completed within the first academic year following high school graduation are included.

Figure B.1.2: Labor Market Shocks in Michigan, 2001-2017
(a) Layoff Events; (b) Total Job Losses
[Time series by academic year, 2001-2017; series: Total Layoff Events, WARN Mass Layoffs, WARN Plant Closures, Prison Closures]

Figure B.1.3: Average Layoffs in Michigan Counties, 2001-2017
[County map; legend: Avg. Layoffs per 10,000 Working-Age Residents, in eight bins from 0.000-2.104 up to 29.334-47.984]

Figure B.1.4: Correlation Between National and State-Specific Industry Employment Shares, 2016
[Scatter plot of Michigan employment share against national employment share by occupation group: Business, Health, Trades, STEM, Law Enf., Other]

Figure B.1.5: Distribution of Layoffs by County, 2001-2017
[Left panel: layoffs per 10,000 residents by county; right panel: share of total layoffs by occupation group (Low Skill, CC Business, CC Health, CC Trades, CC STEM, CC Law Enf., CC Other, High Skill)]
Notes: The sample consists of the 66 (79.5%) Michigan counties that experience layoffs between 2001 and 2017. The left-hand panel shows the total number of layoffs in each type of occupation per 10,000 working-age residents (averaged over the time frame). The right-hand panel shows the share of total layoffs occurring in each type of occupation.

Figure B.1.6: Robustness Checks for Pooled Specification
(a) Different Control Variables; (b) Trends and CZ Fixed Effects; (c) Exclusion of Layoff Events; (d) Non-Linear Specifications (Semi-Elasticities)
[Coefficient plots comparing the main specification to: controls for non-CC layoffs and for high- and low-skill layoffs separately, and county-by-year fixed effects; county-by-program linear time trends and year-by-program-by-CZ fixed effects; only plant closings and only events with 50 or more job losses; inverse hyperbolic sine (IHS) and fractional logit specifications]

Figure B.1.7: Substitution into Program Groups Requiring Similar Skills
[Six panels plotting effect size against skill distance, one per layoff type: Business, Health, Skilled Trades, STEM, Law Enforcement, Other]

Figure B.1.8: Relationship Between Substitution Effects and Skill Distance
[Scatter plot of effect size against skill distance; n = 30, RMSE = 0.456, Effect = 0.384 − 0.548 × Similarity, R² = 7.3%]

Figure B.1.9: Alternative Measures of Skill Distance
(a) Differences in Skill Levels Only; (b) Differences in Skill Importance Only
[Scatter plots; (a): n = 30, RMSE = 0.432, Effect = 0.558 − 0.728 × SkillOnly, R² = 16.6%; (b): n = 30, RMSE = 0.441, Effect = 0.489 − 0.652 × ImpOnly, R² = 13.2%]

Figure B.1.10: Heterogeneous Own-Layoff Effects
(a) Heterogeneity by Gender; (b) Heterogeneity by County Urbanicity
[Own-layoff effect sizes by program group (Business, Health, Trades, STEM, Law Enf., Other), split male/female and urban/rural]

Figure B.1.11: Robustness Checks for Own-Layoff Effects
(a) Weighting for Heteroskedasticity; (b) County-Specific Time Trends; (c) Cohort-by-Commuting Zone Fixed Effects; (d) Dropping 2009 Cohort; (e) Dropping Students in Multiple Programs; (f) Non-Linear Specifications
[Own-layoff effect sizes by program group under each robustness check; panel (f) reports own-layoff semi-elasticities at the mean for linear, IHS, Poisson, fractional logit, and multinomial logit specifications]

Table B.1.1: Programs Offered by Michigan’s Community Colleges

Statistics, by column: (1) Mean; (2) S.D.; (3) Min.; (4) Max.

Panel A. All Programs
Total Programs: 116.54 | 67.18 | 41.00 | 319.00
Vocational Programs: 95.29 | 59.00 | 33.00 | 280.00
Non-Vocational Programs: 21.25 | 13.03 | 5.00 | 51.00
Share Vocational: 0.81 | 0.10 | 0.56 | 0.94

Panel B. Associate Programs
Total Programs: 59.75 | 30.11 | 10.00 | 142.00
Vocational Programs: 45.07 | 24.42 | 5.00 | 124.00
Non-Vocational Programs: 14.68 | 9.94 | 2.00 | 37.00
Share Vocational: 0.75 | 0.12 | 0.49 | 0.92

Panel C. Certificate Programs
Total Programs: 56.79 | – | – | 177.00
Vocational Programs: 50.21 | – | – | 158.00
Non-Vocational Programs: 6.57 | – | – | 21.00
Share Vocational: 0.88 | – | – | 1.00

Notes: The sample consists of Michigan’s 28 community colleges during the academic year 2011-2012. Vocational programs are defined as those which can be matched to an occupation that is attainable by community college graduates.
Non-vocational programs are all other programs offered by Michigan’s community colleges.
Table B.1.1, Panel C continued — S.D. and Min.: Total Programs 40.52 and 17.00; Vocational Programs 36.47 and 13.00; Non-Vocational Programs 5.45 and 0.00; Share Vocational 0.08 and 0.67.

Table B.1.2: Program Groups and Associated Occupation Codes

Program Group: SOC code and SOC Title
Business: 11 Management; 13 Business and Financial; 23 Legal; 41 Sales and Related; 43 Office and Administrative Support
Health: 29 Healthcare Practitioners and Technical; 31 Healthcare Support
Trades: 37 Building and Grounds Cleaning and Maintenance; 45 Farming, Fishing, and Forestry; 47 Construction and Extraction; 49 Installation, Maintenance, and Repair; 51 Production*; 53 Transportation and Material Moving**
STEM: 15 Computer and Mathematical; 17 Architecture and Engineering; 19 Life, Physical, and Social Science
Law Enf.: 33 Protective Service
Other: 21 Community and Social Service; 25 Education, Training, and Library; 27 Arts, Design, Entertainment, Sports, and Media; 35 Food Preparation and Serving Related; 39 Personal Care and Service

* Programs matched to the 3-digit code 51-3 (Food Processing Workers) are included in the “Other” group because they are generally part of Culinary Arts programs that are mostly matched to the 2-digit code 35 (Food Preparation and Serving Related). Results are robust to including these programs in either group.
** Programs matched to the 6-digit code 53-3011 (Ambulance Drivers and Attendants) are included in the “Health” group because they are generally part of Emergency Medical Services programs that are mostly matched to the 2-digit code 29 (Healthcare Practitioners and Technical). Results are robust to including these programs in either group.

Table B.1.3: Summary Statistics of Michigan’s High School Graduates

Samples, by column: (1) All Grads; (2) CC Voc.; (3) CC Non-Voc.; (4) Other College; (5) No College

White: 0.760 | 0.738 | 0.789 | 0.785 | 0.723
Black: 0.150 | 0.176 | 0.128 | 0.128 | 0.178
Hispanic: 0.041 | 0.046 | 0.040 | 0.027 | 0.057
Male: 0.490 | 0.537 | 0.465 | 0.443 | 0.543
Economically Disadvantaged: 0.333 | 0.366 | 0.324 | 0.222 | 0.461
English Language Learner: 0.025 | 0.039 | 0.036 | 0.010 | 0.035
Standardized Math Score: 0.095 | -0.165 | -0.028 | 0.532 | -0.305
Standardized Reading Score: 0.087 | -0.205 | -0.048 | 0.524 | -0.303
On-Time Graduation: 0.971 | 0.984 | 0.986 | 0.997 | 0.931
Students: 734,928 | 66,292 | 103,032 | 306,532 | 259,072
Share of Graduates: 1.000 | 0.090 | 0.140 | 0.417 | 0.353

Notes: The sample consists of all graduates of Michigan public high schools from 2009 to 2016 who have non-missing demographic and geographic information. College and program choices are defined as a student’s first enrollment choice within 6 months of graduating high school. For example, the sample in column (2) consists of all students who first enroll in vocational programs in Michigan’s community colleges within 6 months of high school graduation.

Table B.1.4: Summary Statistics of Vocational Students by Program

Programs, by column: (1) Business; (2) Health; (3) Trades; (4) STEM; (5) Law Enf.; (6) Other

Notes: The sample consists of all graduates of Michigan public high schools from 2009 to 2016 who have non-missing demographic and geographic information and enroll in a vocational program at one of the state’s community colleges within 6 months of high school graduation.
White: 0.747 | 0.705 | 0.837 | 0.759 | 0.750 | 0.704
Black: 0.169 | 0.203 | 0.088 | 0.146 | 0.171 | 0.213
Hispanic: 0.041 | 0.051 | 0.045 | 0.042 | 0.049 | 0.046
Male: 0.588 | 0.216 | 0.943 | 0.855 | 0.653 | 0.396
Economically Disadvantaged: 0.329 | 0.415 | 0.348 | 0.338 | 0.389 | 0.366
English Language Learner: 0.044 | 0.053 | 0.034 | 0.048 | 0.031 | 0.019
Standardized Math Score: -0.056 | -0.260 | -0.193 | 0.069 | -0.306 | -0.242
Standardized Reading Score: -0.162 | -0.231 | -0.398 | -0.072 | -0.316 | -0.162
On-Time Graduation: 0.987 | 0.984 | 0.978 | 0.984 | 0.984 | 0.984
Students: 12,979 | 16,082 | 15,080 | 8,476 | 8,288 | 5,387
Share of Vocational Students: 0.196 | 0.243 | 0.227 | 0.128 | 0.125 | 0.081

Table B.1.5: Industries with Highest Concentration of Occupation Groups

Occupation group: NAICS code, Industry Title, and α
Business: 524 Insurance Carriers and Related Activities (0.429); 522 Credit Intermediation and Related Activities (0.443); 425 Wholesale Electronic Markets and Agents and Brokers (0.470)
Health: 621 Ambulatory Health Care Services (0.414); 623 Nursing and Residential Care Facilities (0.508); 622 Hospitals (0.544)
Trades: 212 Mining (except Oil and Gas) (0.386); 811 Repair and Maintenance (0.449); 484 Truck Transportation (0.623)
STEM: 511 Publishing Industries (except Internet) (0.187); 516 Internet Publishing and Broadcasting (0.216); 518 Data Processing, Hosting, and Related Services (0.300)
Law Enforcement: 482 Rail Transportation (0.005); 921 Executive, Legislative, and Other General Government Support (0.010); 922 Justice, Public Order, and Safety Activities (0.411)
Other: 515 Broadcasting (except Internet) (0.228); 812 Personal and Laundry Services (0.313); 624 Social Assistance (0.369)

Table B.1.6: Correlation Between Occupation Composition Across Industries

          Business | Health | Trades | STEM | Law Enf. | Other
Business:  1.000
Health:   -0.133 | 1.000
Trades:   -0.258 | -0.212 | 1.000
STEM:      0.328 | -0.106 | -0.190 | 1.000
Law Enf.: -0.106 | -0.002 | -0.098 | -0.051 | 1.000
Other:    -0.138 | 0.071 | -0.360 | -0.011 | -0.026 | 1.000

Notes: Each cell displays a pairwise correlation between the industry employment shares for the occupation groups of interest. See Section 2.4.1 for more information.

Table B.1.7: Summary Statistics of Layoffs in Michigan, 2001-2017

Statistics, by column: (1) Mean; (2) S.D.; (3) Min.; (4) Max.

Panel A. Layoffs per 10,000 Working-Age Residents
Non-CC Low Skill: 5.250 | 16.395 | 0.000 | 290.3
CC Business: 1.024 | 2.991 | 0.000 | 45.75
CC Health: 0.210 | 2.647 | 0.000 | 88.23
CC Trades: 2.080 | 7.134 | 0.000 | 95.56
CC STEM: 0.307 | 0.991 | 0.000 | 14.98
CC Law Enf.: 0.518 | 6.302 | 0.000 | 138.9
CC Other: 0.106 | 0.596 | 0.000 | 14.10
Non-CC High Skill: 1.263 | 4.483 | 0.000 | 69.81
County-Year Obs.: 1,411

Panel B. Share of Total Layoffs (County-Year Pairs with Non-Zero Total Layoffs)
Non-CC Low Skill: 0.512 | 0.155 | 0.142 | 0.909
CC Business: 0.118 | 0.066 | 0.028 | 0.451
CC Health: 0.019 | 0.070 | 0.000 | 0.552
CC Skilled Trades: 0.173 | 0.120 | 0.000 | 0.648
CC STEM: 0.033 | 0.037 | 0.000 | 0.234
CC Law Enf.: 0.020 | 0.0844 | 0.000 | 0.432
CC Other: 0.015 | 0.029 | 0.000 | 0.219
Non-CC High Skill: 0.114 | 0.075 | 0.002 | 0.510
County-Year Obs.: 369

Notes: The sample consists of all county-year observations from 2001 to 2017. Layoffs in each category are estimated using local industry layoffs and national occupation-by-industry shares. See Section 2.4.1 for more details.

Table B.1.8: Largest Layoffs by Occupation Group, 2001-2017

Columns: County | Year | Size | Largest Related Layoff (Jobs Lost)

Business
Lake | 2005 | 27.88 | Michigan Youth Correctional Facility (204)
Iosco | 2008 | 29.02 | Kalitta Air (219)
Ontonagon | 2009 | 45.75 | SmurfitStone Container Corp. (150)

Health
Midland | 2015 | 13.95 | MidMichigan Health - Stratford Village (143)
Gladwin | 2015 | 29.72 | MidMichigan Health - Gladwin Pines (85)
Ontonagon | 2009 | 88.23 | Maple Manor Nursing Home (62)

Trades
Antrim | 2007 | 61.18 | Dura Automotive Systems (300)
Ontonagon | 2009 | 69.30 | SmurfitStone Container Corp. (150)
Wexford | 2010 | 95.56 | AAR Mobility Systems (282)

STEM
Antrim | 2007 | 61.18 | Dura Automotive Systems (300)
Ingham | 2004 | 9.987 | General Motors (3,975)
Midland | 2015 | 14.98 | Dow Chemical Company (700)

Law Enforcement
Lake | 2011 | 87.01 | Northlake Correctional Facility (146)
Arenac | 2009 | 131.2 | Standish Maximum Facility (281)
Lake | 2005 | 138.9 | Michigan Youth Correctional Facility (204)

Other
Oceana | 2008 | 6.03 | Double JJ Resort (150)
Hillsdale | 2012 | 7.45 | The Manor Residential Treatment Facility (140)
Ontonagon | 2009 | 14.10 | SmurfitStone Container Corp. (150)

Notes: Size is measured as the estimated number of layoffs per 10,000 working-age residents in the county.

Table B.1.9: Effect of Job Losses on Enrollment in Related Community College Programs

Outcome: enrollment in occupation group programs per 100 H.S. graduates, columns (1)-(4).

Layoffs per 10,000 measured in: the year following graduation; senior year of H.S.; junior year of H.S.; sophomore year of H.S.; freshman year of H.S.; 8th grade; 7th grade; 6th grade; and 5th grade.

Coefficients, in the order printed: 0.007 (0.005); -0.012** (0.006); -0.014** (0.007); -0.014** (0.007); -0.011* (0.006); -0.002 (0.004); -0.003 (0.005); -0.001 (0.005); -0.008** (0.004); -0.008* (0.004); -0.006 (0.004); -0.004 (0.004); -0.005 (0.004); -0.002 (0.004); -0.007 (0.005); -0.004 (0.004); 0.005 (0.005); 0.007 (0.006); -0.002 (0.004); -0.000 (0.004); 0.002 (0.005); 0.004 (0.005)

Outcome Mean: 1.57; County-Program-Year Obs.: 3,984; R-squared: 0.488 | 0.489 | 0.490 | 0.490

Notes: The unit of observation is a county-cohort-program triad. Outcomes are measured as the number of students who initially enroll in a given vocational program within 6 months of high school graduation per 100 graduates in the county. The coefficients in each column are estimated from a separate regression and represent variants of β in equation (2.3), the effect of an additional layoff per 10,000 working-age residents in a given occupation group on enrollment in corresponding programs.
All regressions include controls for the share of graduates that are white, male, and categorized as economically disadvantaged; average 11th grade math and reading test scores; and the county unemployment rate and logged size of the labor force during a cohort’s senior year of high school. All standard errors are clustered at the county level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.

Table B.1.10: Effect of Job Losses in Alternative Geographic Areas

Outcome: enrollment in occupation group programs per 100 vocational students, columns (1)-(3).

Layoffs per 10,000 in:
Own county, t-1: -0.012** (0.006) | -0.012** (0.006) | -0.012** (0.006)
Rest of state, t-1: – | 0.003 (0.012) | –
Rest of commuting zone, t-1: – | – | -0.008 (0.009)
State less commuting zone, t-1: – | – | 0.007 (0.013)
Outcome Mean: 1.57 in each column
County-Program-Year Obs.: 3,984 | 3,936 | 3,984
R-squared: 0.476 | 0.479 | 0.476

Notes: The unit of observation is a county-cohort-program triad. Outcomes are measured as the number of students who initially enroll in a given vocational program within 6 months of high school graduation per 100 graduates in the county. The coefficients in each column are estimated from a separate regression and represent variants of β in equation (2.3), the effect of an additional layoff per 10,000 working-age residents in a given occupation group on enrollment in corresponding programs. All regressions include controls for the share of graduates that are white, male, and categorized as economically disadvantaged; average 11th grade math and reading test scores; and the county unemployment rate and logged size of the labor force during a cohort’s senior year of high school. All standard errors are clustered at the county level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.
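Per the notes above, each coefficient in Tables B.1.9 and B.1.10 is a variant of β in equation (2.3): the effect of one additional layoff per 10,000 working-age residents in an occupation group on enrollment in the matching programs. A schematic restatement follows; the notation and, in particular, the fixed-effects terms shown are my inference from the county-cohort-program unit of observation and the listed controls, not quoted from the chapter:

```latex
% Schematic layoff-exposure specification in the spirit of equation (2.3):
% enrollment per 100 graduates in program group p, county c, cohort t.
E_{cpt} = \beta \,\mathrm{Layoffs}_{cp,\,t-1} + X_{ct}'\gamma
          + \mu_{cp} + \lambda_{pt} + \varepsilon_{cpt}
% Layoffs_{cp,t-1}: layoffs per 10,000 working-age residents in the occupation
% group matched to program group p; X_{ct}: cohort demographics, test scores,
% unemployment rate, and logged labor force; standard errors clustered by county.
```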
Table B.1.11: Effect of Community College Layoffs on Overall Vocational Program Enrollment

Outcome: vocational enrollment per 100 graduates, columns (1)-(3).

Layoffs per 10,000 in:
Business, t-1: 0.009 (0.013) | 0.016 (0.017) | 0.003 (0.012)
Health, t-1: 0.002 (0.005) | -0.006 (0.005) | 0.011* (0.006)
Skilled Trades, t-1: 0.002 (0.002) | 0.001 (0.004) | 0.003 (0.003)
STEM, t-1: 0.018 (0.015) | 0.001 (0.018) | 0.002 (0.014)
Law Enforcement, t-1: -0.000 (0.002) | -0.001 (0.002) | -0.000 (0.002)
Other, t-1: 0.012 (0.027) | 0.021 (0.024) | 0.015 (0.023)
P-Value for Joint Test: 0.351 | 0.607 | 0.314
County-Specific Trends: – | X | –
Year-by-CZ Fixed Effects: – | – | X
Outcome Mean: 9.40 in each column
County-Year Obs.: 664 | 664 | 656
R-squared: 0.761 | 0.671 | 0.809

Notes: The unit of observation is a county-cohort pair. Outcomes are measured as the number of students who enroll in vocational community college programs within 6 months of high school graduation, per 100 high school graduates in the county and cohort. The coefficients in each column are estimated from a separate regression and represent the β parameters in equation (2.4), the effect of an additional layoff per 10,000 working-age residents in a given occupation group on the outcome of interest. The numbers in brackets below the estimates are the estimated elasticities at the mean dependent and independent variable values. All regressions include controls for the share of graduates that are white, male, and categorized as economically disadvantaged; average 11th grade math and reading test scores; and the county unemployment rate, logged size of the labor force, and the number of layoffs per 10,000 working-age residents in non community college occupations during a cohort’s senior year of high school. All standard errors are clustered at the county level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.

Table B.1.12: Effect of Layoffs on College Enrollment Outcomes

Outcome: enrollment per 100 graduates in — (1) No Formal College; (2) CC Vocational Programs; (3) CC Non-Voc. Programs; (4) Four-Year Colleges.

Panel A. Total layoffs
All occupations, t-1: -0.013** (0.006) | -0.004* (0.002) | 0.005 (0.005) | 0.012** (0.005)
Outcome Mean: 39.60 | 9.40 | 12.56 | 38.44
County-Year Obs.: 664 in each column
R-Squared: 0.787 | 0.670 | 0.731 | 0.865

Panel B. Layoffs by skill group
Low-skill occupations, t-1: -0.004 (0.020) | -0.012 (0.013) | 0.019 (0.016) | -0.002 (0.022)
Community college occupations, t-1: -0.041 (0.035) | 0.004 (0.017) | 0.011 (0.021) | 0.026 (0.027)
High-skill occupations, t-1: 0.058 (0.077) | -0.002 (0.037) | -0.069 (0.052) | 0.012 (0.053)
Outcome Mean: 39.60 | 9.40 | 12.56 | 38.44
County-Year Obs.: 664 in each column
R-Squared: 0.788 | 0.670 | 0.732 | 0.865

Notes: The unit of observation is a county-cohort pair. Outcomes are measured as the number of students in each enrollment category within 6 months of high school graduation, per 100 high school graduates in the county and cohort. The coefficients in each column are estimated from a separate regression and represent the β parameters in equation (2.4), the effect of an additional layoff per 10,000 working-age residents in a given occupation group on the outcome of interest. All regressions include controls for the share of graduates that are white, male, and categorized as economically disadvantaged; average 11th grade math and reading test scores; and the county unemployment rate and logged size of the labor force during a cohort’s senior year of high school. All standard errors are clustered at the county level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.
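The notes to Table B.1.7 state that layoffs in each occupation category are estimated by combining local industry layoff counts with national occupation-by-industry employment shares (Section 2.4.1). The sketch below illustrates that apportionment step. All inputs are illustrative assumptions: the function name, the county population, and the share values are made up (the hospitals-to-health share loosely echoes the hospitals entry in Table B.1.5), and this is my reading of the construction, not the dissertation's code.

```python
# Sketch of the layoff-apportionment step described in the notes to Table B.1.7
# (Section 2.4.1): county-level industry layoffs are converted into occupation-
# group layoffs using national occupation-by-industry employment shares.
from collections import defaultdict


def occupation_layoffs(industry_layoffs, occ_shares, working_age_pop):
    """Return layoffs per 10,000 working-age residents by occupation group.

    industry_layoffs: {industry code: jobs lost in the county-year}
    occ_shares: {industry code: {occupation group: national employment share}}
    working_age_pop: county working-age population
    """
    totals = defaultdict(float)
    for industry, jobs_lost in industry_layoffs.items():
        # Apportion each industry's job losses across occupation groups.
        for group, share in occ_shares.get(industry, {}).items():
            totals[group] += jobs_lost * share
    # Scale to a rate per 10,000 working-age residents.
    return {group: 10_000 * n / working_age_pop for group, n in totals.items()}


# Hypothetical example: a 200-job hospital (NAICS 622) layoff in a county with
# 40,000 working-age residents; the shares here are illustrative only.
shares = {"622": {"CC Health": 0.544, "CC Business": 0.150, "Non-CC": 0.306}}
rates = occupation_layoffs({"622": 200}, shares, 40_000)
# rates["CC Health"] = 200 * 0.544 * 10,000 / 40,000 = 27.2 layoffs per 10,000
```

In practice, a measure like this would be summed over all layoff events in a county-year before entering the regressions as the t-1 exposure variable.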
Table B.1.13: Effect of Layoffs on Composition of Vocational Students

Outcomes, by column: (1) % White; (2) % Male; (3) % Econ. Dis.; (4) Avg. Math Score; (5) Avg. Read Score

Layoffs per 10,000 in:
Business, t-1: 0.007 (0.004) | -0.005 (0.009) | -0.005 (0.008) | 0.011 (0.008) | -0.003 (0.008)
Health, t-1: 0.004 (0.003) | 0.005 (0.003) | -0.002 (0.002) | 0.002 (0.002) | -0.001 (0.002)
Skilled Trades, t-1: -0.000 (0.001) | 0.001 (0.002) | -0.000 (0.001) | -0.002 (0.002) | -0.000 (0.002)
STEM, t-1: 0.008 (0.006) | -0.003 (0.009) | -0.007 (0.008) | -0.009 (0.008) | -0.005 (0.010)
Law Enforcement, t-1: 0.000 (0.001) | 0.002 (0.002) | 0.001 (0.001) | -0.001 (0.001) | -0.003 (0.002)
Other, t-1: -0.016 (0.012) | -0.001 (0.011) | -0.009 (0.006) | -0.010 (0.014) | 0.010 (0.011)
P-Value for Joint Test: 0.456 | 0.638 | 0.217 | 0.217 | 0.827
Outcome Mean, columns (1)-(3): 0.870 | 0.531 | 0.393
County-Year Obs.: 657
R-Squared, columns (1)-(3): 0.728 | 0.220 | 0.528

Notes: The unit of observation is a county-cohort pair. Outcomes are measured as the mean characteristic across all students who enroll in vocational programs. The coefficients in each column are estimated from a separate regression and represent the β parameters in equation (2.4), the effect of an additional layoff per 10,000 working-age residents in a given occupation group on the outcome of interest. All regressions include controls for the share of graduates that are white, male, and categorized as economically disadvantaged; average 11th grade math and reading test scores; and the county unemployment rate, logged size of the labor force, and the number of layoffs per 10,000 working-age residents in non community college occupations during a cohort’s senior year of high school. All standard errors are clustered at the county level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.
-0.067 657 0.474 -0.144 657 0.389 154 Table B.1.14: Effect of Layoffs on First-Year Course-Taking Layoffs per 10,000 in: Business, t-1 Health, t-1 Skilled Trades, t-1 STEM, t-1 Law Enforcement, t-1 Other, t-1 Total Credits (1) 0.007 (0.216) 0.019 (0.086) 0.019 (0.036) 0.044 (0.346) 0.034 (0.034) 0.140 (0.705) Vocational Credits (2) -0.082 (0.108) 0.029 (0.050) 0.000 (0.018) 0.006 (0.143) 0.009 (0.018) -0.150 (0.329) Non-Voc. Credits (3) 0.089 (0.152) -0.010 (0.049) 0.019 (0.025) 0.039 (0.233) 0.025 (0.021) 0.290 (0.397) P-Value for Joint Test 0.952 0.920 0.669 6.46 657 0.482 10.88 657 0.505 17.34 657 0.471 Outcome Mean County-Year Obs. R-Squared Notes: The unit of observation is a county-cohort pair. Outcomes are mea- sured as the mean number of credits completed in the first year of com- munity college enrollment across all students who enroll in vocational pro- grams. The coefficients in each column are estimated from a separate re- gression and represent the β parameters in equation (2.4), the effect of an additional layoff per 10,000 working age residents in a given occupation group on the outcome of interest. All regressions include controls for the share of graduates that are white, male, and categorized as economically disadvantaged; average 11th grade math and reading test scores; and the county unemployment rate, logged size of the labor force, and the number of layoffs per 10,000 working-age residents in non community college oc- cupations during a cohort’s senior year of high school. All standard errors are clustered at the county level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01. 
Table B.1.15: Substitution Between Community College Program Groups

Layoffs per 10,000 in: Business, t-1 Health, t-1 Skilled Trades, t-1 STEM, t-1 Law Enf., t-1 Other, t-1 Business (1) -1.025** (0.456) -0.120 (0.138) 0.067 (0.078) 0.212 (0.676) 0.076 (0.075) 0.753 (0.617) 0.164 (0.109) 0.206 (0.626) 0.078 (0.082) 0.072 (0.945) Enrollment per 100 Vocational Students in: STEM Law Enf. Trades Health (2) (3) (4) (5) -0.702 (0.682) -0.056 (0.449) -0.093 (0.280) 1.736*** (0.592) -0.610** (0.232) -0.281** (0.122) 0.164 (0.123) 0.250 (0.222) 0.030 (0.123) -0.086 (0.839) -0.088 (0.097) -0.253 (0.674) -0.014 (0.066) -0.124 (0.347) -0.048 (0.061) 0.143 (0.094) -0.153** (0.075) -0.344 (0.518) -0.688 (0.522) 1.014 (0.678) Other (6) 0.141 (0.347) 0.597*** (0.132) -0.159** (0.063) 0.044 (0.405) -0.097 (0.061) -0.807 (0.511) Own-layoff semi-elasticities (at mean): -0.047** (0.021) -0.029*** (0.011) -0.006 (0.007) -0.010 (0.029) -0.011** (0.005) -0.046 (0.029) 20.67 657 0.506 21.66 657 0.190 14.33 657 0.344 11.84 657 0.266 Outcome Mean County-Year Obs. R-squared

Notes: The unit of observation is a county-cohort pair. Outcomes are measured as the number of students who enroll in a given program within 6 months of high school graduation, per 100 students in the county and cohort who enroll in vocational programs. The coefficients in each column are estimated from a separate regression and represent the βj terms in equation (2.5), the effect of an additional layoff per 10,000 working-age residents in a given occupation group on the outcome of interest. All regressions include controls for the share of graduates that are white, male, and categorized as economically disadvantaged; average 11th grade math and reading test scores; and the county unemployment rate, logged size of the labor force, and the number of layoffs per 10,000 working-age residents in non-community college occupations during a cohort's senior year of high school. All standard errors are clustered at the county level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01. 13.74 657 0.258 17.75 657 0.353

Table B.1.16: Substitution Between Narrower Community College Programs

Enrollment per 100 Vocational Students in: Trades STEM Law Enf. Arts Personal & Culinary Business Health (1) -1.025** (0.456) (2) -0.702 (0.682) (3) (4) (5) -0.056 (0.449) -0.093 (0.280) 1.736*** (0.592) & Media (6) -0.303 (0.227) 0.107 (0.084) -0.124*** (0.039) 0.383 (0.316) Social Services (8) 0.440** (0.184) 0.346*** (0.073) -0.008 (0.031) 0.196 (0.195) 0.068 (0.053) -0.031 (0.371) (7) 0.004 (0.201) 0.144* (0.083) -0.027 (0.057) -0.535** (0.268) -0.088** (0.043) -0.123 (0.302) -0.610** (0.232) -0.281** (0.122) 0.164 (0.123) 0.250 (0.222) 0.030 (0.123) -0.086 (0.839) -0.088 (0.097) -0.253 (0.674) -0.014 (0.066) -0.124 (0.347) -0.048 (0.061) 0.143 (0.094) -0.153** (0.075) -0.077*** (0.027) -0.344 (0.518) -0.688 (0.522) 1.014 (0.678) -0.652 (0.404) Layoffs per 10,000 in: Business, t-1 Health, t-1 Skilled Trades, t-1 STEM, t-1 Law Enforcement, t-1 Other, t-1 -0.120 (0.138) 0.067 (0.078) 0.212 (0.676) 0.076 (0.075) 0.753 (0.617) 21.66 657 0.190 0.164 (0.109) 0.206 (0.626) 0.078 (0.082) 0.072 (0.945) 20.67 657 0.506 9.11 657 0.542 13.74 657 0.258 11.84 657 0.266 14.33 657 0.344 Outcome Mean Observations R-squared

Notes: The unit of observation is a county-cohort pair. Outcomes are measured as the number of students who enroll in a given program within 6 months of high school graduation, per 100 students in the county and cohort who enroll in vocational programs.
I define social service programs as those with 2-digit occupation codes of 21 (Community and Social Service) and 25 (Education, Training, and Library), plus childcare programs (SOC 39-9011); arts and media programs as those with the 2-digit occupation code 27 (Arts, Design, Entertainment, Sports, and Media); and personal care and culinary programs as those with the 2-digit codes 35 (Food Preparation and Serving) and 39 (Personal Care and Service), other than childcare, plus baking programs (SOC 51-3011). The coefficients in each column are estimated from a separate regression and represent the βj terms in equation (2.5), the effect of an additional layoff per 10,000 working-age residents in a given occupation group on the outcome of interest. All regressions include controls for the share of graduates that are white, male, and categorized as economically disadvantaged; average 11th grade math and reading test scores; and the county unemployment rate, logged size of the labor force, and the number of layoffs per 10,000 working-age residents in non-community college occupations during a cohort's senior year of high school. All standard errors are clustered at the county level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01. 5.26 657 0.322 3.39 657 0.313

B.2 Comparing Layoffs to Other Employment Data Sources

The estimated layoff measures used throughout the analysis are designed to capture changes in local labor demand in a given occupation group and county. They should not, however, be treated as the exact number of job losses in an occupation group and county because not all layoff events are required to be reported under the WARN Act and, among events that are required to be reported, there is non-compliance in reporting.
For example, in 2001, the federal government estimated that only about one quarter of events were required to be reported under the WARN Act and that, of those that were required to be reported, only one-third were reported to the correct government agencies (United States General Accounting Office, 2003). Nevertheless, to verify that these proxy measures capture true changes in employment over time and across counties, I compare county-by-industry layoffs to analogous employment data from two commonly used employment datasets: the Quarterly Census of Employment and Wages (QCEW) and the County Business Patterns (CBP). The QCEW is published quarterly by the Bureau of Labor Statistics and captures employment in more than 95% of U.S. jobs. However, a large share of its data at the county-by-industry level is suppressed due to privacy concerns. The CBP is released annually by the U.S. Census Bureau and captures the number of establishments and total employment during the week of March 12. Like the QCEW, many county-by-industry cells in the CBP are suppressed to prevent users from inferring information about individual firms. But in contrast to the QCEW, employment counts for some cells in the CBP can be imputed from establishment counts and higher-level geographic and industrial classifications. In the analyses that follow, I use the imputed data provided by Eckert et al. (2020) to maximize the coverage of Michigan's counties. I begin by comparing the county-by-industry employment counts provided by both the QCEW and CBP. Because the CBP data does not contain information on government employment, I restrict the sample to all non-government NAICS 3-digit sectors. I further restrict the sample to county-by-industry pairs that have non-zero employment counts in all years 2001-2016 in at least one of the datasets.
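The sample restriction just described — keeping county-industry pairs with non-zero employment in every year in at least one of the two datasets — can be sketched as a pandas groupby filter. The toy frame and column names below are hypothetical, not the actual data layout:

```python
import pandas as pd

# Hypothetical county-by-industry-by-year employment panel.
emp = pd.DataFrame({
    "county":   [1, 1, 1, 2, 2, 2],
    "industry": ["311", "311", "311", "311", "311", "311"],
    "year":     [2001, 2002, 2003, 2001, 2002, 2003],
    "qcew_emp": [100, 110, 90, 50, 0, 60],
    "cbp_emp":  [95, 108, 88, 0, 0, 0],
})

# Keep county-industry pairs with non-zero employment in every year in at
# least one of the two datasets.
def always_employed(g: pd.DataFrame) -> bool:
    return bool((g["qcew_emp"] > 0).all() or (g["cbp_emp"] > 0).all())

sample = emp.groupby(["county", "industry"]).filter(always_employed)
print(sample["county"].unique().tolist())  # [1]: county 2 has zero-employment years
```

County 2 is dropped because it has a zero in the QCEW series and is always zero in the CBP series, so neither dataset covers it in all years.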
Figure B.2.1, below, provides a simple scatterplot of employment counts in the two datasets for the 73% of observations (3,630 county-industry pairs) that contain employment information in both datasets. The two measures of employment are highly correlated, with a Pearson's coefficient of 0.95.

Figure B.2.1: Comparison of Employment Counts in QCEW & CBP

Then, with each dataset, I estimate regressions of the following form:

∆Employment_kct = α + β Layoffs_kc,t−1 + ε_kct   (B.1)

where ∆Employment_kct is the change in employment in industry k in county c between March of year t − 1 and March of year t, and Layoffs_kc,t−1 is the number of layoffs in industry k in county c between March of year t − 1 and March of year t.¹ The parameter of interest, β, captures the relationship between layoffs and year-over-year employment change in a given county and industry. If β is equal to -1, then, on average, an additional layoff is associated with an employment reduction of exactly one worker. If |β| is less than 1, then an additional layoff reduces employment by less than one worker on average, presumably because some laid-off workers find work at other firms in the same county and industry or other firms are increasing employment at the same time as the layoff. Alternatively, if |β| is greater than 1, then an additional layoff reduces employment by more than one worker on average, indicating that there are additional employment reductions, including changes in labor supply, that are not captured in the WARN data. Table B.2.1 presents the results of this specification using each dataset.

¹ The CBP provides employment counts as of March 12. To track corresponding employment changes in the QCEW, I use the first quarter, third month employment counts.

Table B.2.1: Relationship Between Estimated Layoffs & Employment Change

                                           (1)         (2)         (3)
Panel A. Quarterly Census of Employment & Wages (QCEW)
Layoffs in county and industry, t-1     -1.236***   -1.139***   -0.749***
                                        (0.322)     (0.312)     (0.266)
County, industry, and year FEs                         X           X
Interacted FEs                                                     X
County-Year-Industry Obs.                47,399      47,398      47,254

Panel B. County Business Patterns (CBP)
Layoffs in county and industry, t-1     -0.942***   -0.914***   -0.803***
                                        (0.196)     (0.196)     (0.202)
County, industry, and year FEs                         X           X
Interacted FEs                                                     X
County-Year-Industry Obs.                58,202      58,202      58,186

Notes: The sample consists of all county-by-industry pairs that have non-zero employment between 2001 and 2016 in either the QCEW or CBP dataset. All standard errors are clustered at the county level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.

Column (1) shows that an additional layoff is associated with an employment reduction of 1.2 workers in the QCEW and of 0.94 workers in the CBP data. Column (2) then adds county, industry, and year fixed effects to assess whether the negative relationship continues to hold after controlling for factors that may induce layoffs (e.g., overall economic downturns or industry-specific turnover patterns). When using either dataset, the estimated change in employment due to an additional layoff remains negative, statistically significant, and close to -1 when including these fixed effects. Finally, column (3) interacts these fixed effects to mimic the interacted fixed effects in the most saturated version of equation (2.3) in the main text. When controlling for county-by-year, county-by-sector, and sector-by-year effects, an additional layoff reduces employment by 0.75 workers (QCEW) to 0.8 workers (CBP). The estimates remain statistically significant, indicating that the layoff measures are indeed capturing changes in local employment counts. Finally, to ensure that the relationship is not driven by select industries, I estimate equation (B.1) separately for the ten NAICS 3-digit subsectors with the most layoffs in the WARN data.
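Equation (B.1) is a bivariate panel regression whose coefficient is read relative to -1, with standard errors clustered by county. A minimal NumPy sketch of estimating such a specification — the data are simulated with a true β of -1, and all variable names are hypothetical, not the dissertation's actual code:

```python
import numpy as np

# Simulated county-by-industry panel (hypothetical stand-in for QCEW/CBP
# data): each additional layoff removes about one worker (true beta = -1).
rng = np.random.default_rng(0)
n = 2000
county = rng.integers(0, 80, n)
layoffs = rng.poisson(3.0, n).astype(float)
d_emp = -1.0 * layoffs + rng.normal(0.0, 5.0, n)

# Equation (B.1): OLS of the employment change on lagged layoffs plus a constant.
X = np.column_stack([np.ones(n), layoffs])
coef, *_ = np.linalg.lstsq(X, d_emp, rcond=None)
alpha, beta = coef

# County-clustered standard errors: sum score contributions within counties
# (sandwich formula with cluster-level "meat").
resid = d_emp - X @ coef
XtX_inv = np.linalg.inv(X.T @ X)
meat = np.zeros((2, 2))
for c in np.unique(county):
    score = X[county == c].T @ resid[county == c]
    meat += np.outer(score, score)
beta_se = np.sqrt((XtX_inv @ meat @ XtX_inv)[1, 1])
print(round(beta, 1))  # close to the true value of -1 by construction
```

In practice one would add the fixed effects of columns (2) and (3) by demeaning or with dummy variables; the sketch keeps only the bivariate case for clarity.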
Figure B.2.2 presents these results. The estimated coefficients are overwhelmingly negative and do not vary substantially by dataset, again indicating that the layoff measures used throughout the paper capture true changes in local employment conditions.

Figure B.2.2: Relationship between Layoffs and Employment Changes, by Sector
(Sectors shown: 336 Transportation Equipment Manufacturing; 326 Plastics and Rubber Products Manufacturing; 452 General Merchandise Stores; 332 Fabricated Metal Product Manufacturing; 522 Credit Intermediation & Related Activities; 541 Professional, Scientific, and Technical Services; 481 Air Transportation; 333 Machinery Manufacturing; 331 Primary Metal Manufacturing; 561 Administrative and Support Services. Estimates plotted separately for the QCEW and CBP.)

B.3 Other Responses to Layoffs

To supplement the main analysis, I also analyze how layoffs affect two other educational outcomes of interest: the enrollment choices of students who delay community college entrance beyond the first six months after high school graduation and the retention rates of students once enrolled. For the first outcome, I restrict the sample to students who graduate from high school between 2009 and 2013 and enroll in vocational community college programs at some point before 2017 and re-estimate equation 2.3 in the main text for different enrollment timeframes.¹ Figure B.3.1 shows the estimated elasticity of program choice with respect to prior-year layoffs in related occupations. For enrollment within either six or twelve months of high school graduation, an additional layoff per 10,000 county working-age residents during a cohort's senior year of high school reduces enrollment in related programs by about 1%. This effect continues to hold when I control for layoffs occurring during students' freshman, sophomore, and junior years of high school.
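The semi-elasticities reported in these figures and tables scale a level coefficient by the mean of the dependent variable, so they can be read as approximate percent effects. A minimal illustration with hypothetical numbers (not estimates from the chapter):

```python
def semi_elasticity(beta: float, outcome_mean: float) -> float:
    """Proportional change in the outcome per one-unit change in the
    regressor, evaluated at the mean of the dependent variable."""
    return beta / outcome_mean

# Hypothetical values: a coefficient of -0.30 enrollees (per 100 graduates)
# per additional layoff per 10,000 residents, with a mean of 30 per 100.
effect = semi_elasticity(-0.30, 30.0)
print(round(effect, 4))  # -0.01, i.e., about a 1% reduction per layoff
```

This is the same arithmetic behind statements such as "an additional layoff per 10,000 working-age residents reduces enrollment in related programs by about 1%."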
Figure B.3.1: Effect of Layoffs on Program Choice for Later Enrollees
(Semi-elasticities with respect to prior-year layoffs for enrollment within 0-6 months, 0-12 months, and 1-2 years of graduation, shown with and without prior-layoff controls.)

¹ To control for time-varying county characteristics that I may not observe in my data, I include county-by-cohort fixed effects in these specifications.

When analyzing longer-run enrollment choices, I cannot observe where students live in the years following high school graduation and, therefore, implicitly assume that students remain living in the same county that they lived in during high school. Nevertheless, for students enrolling in vocational community college programs in the 1-2 years following graduation, I find similar effects of layoffs on program choices. Figure B.3.1 shows that an additional layoff per 10,000 working-age residents reduces enrollment in the following year by about 2%. The magnitude of this estimate suggests that older students may be even more responsive to local labor market shocks, which is an important topic for future work.

I also consider how layoffs affect program retention rates by including all cohorts and estimating equations of the following form:

Retention_gct = α + Layoffs_gct β + X_ct Γ + θ_gc + δ_gt + ε_gct   (B.1)

where Retention_gct is a measure of the year-over-year retention of students from county c enrolled in program group g in year t, Layoffs_gct is a measure of analogous layoffs, and all other terms are defined as in previous equations in the main text. My main measure of retention is the number of students from county c who were enrolled in program group g in year t − 1 and remain enrolled in the same program and community college in year t, per 100 students initially enrolled.² This measure equals the share of students who remain enrolled in the same college and program in the following year, multiplied by 100.
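The retention measure described above — the share of year t − 1 enrollees still in the same program and college in year t, times 100 — can be sketched with a self-merge on student-year records. The toy records and column names below are hypothetical (the real data also track county and handle the final observed year):

```python
import pandas as pd

# Hypothetical student-year records: one row per student per year, giving the
# college and program group of enrollment.
enroll = pd.DataFrame({
    "student": [1, 1, 2, 2, 3, 3, 4],
    "year":    [2012, 2013, 2012, 2013, 2012, 2013, 2012],
    "college": ["A", "A", "A", "B", "A", "A", "A"],
    "program": ["health", "health", "health", "health",
                "trades", "business", "trades"],
})

# Attach each student's enrollment one year ahead by shifting the year back.
nxt = enroll.rename(columns={"college": "college_next",
                             "program": "program_next"})
nxt = nxt.assign(year=nxt["year"] - 1)
panel = enroll.merge(nxt, on=["student", "year"], how="left")

# Retained: enrolled in the same program and college the following year;
# students with no next-year record count as not retained.
panel["retained"] = ((panel["college_next"] == panel["college"]) &
                     (panel["program_next"] == panel["program"]))

# Retention per 100 initially enrolled students, by program group and year.
retention = panel.groupby(["program", "year"])["retained"].mean().mul(100.0)
print(retention.loc[("health", 2012)])  # 50.0: one of two health enrollees stays
```

Student 2 switches colleges and student 3 switches programs, so only student 1 counts as retained in this toy example.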
I also calculate measures of students switching between programs and between colleges, graduating from programs, and not being observed in the data the following year. I measure layoffs as those that occur between July 1st of year t − 1 and June 30th of year t to capture layoffs that students observe throughout the year in which they are enrolled in a program. Table B.3.1 presents these results. Column (1) indicates that an additional layoff per 10,000 working-age residents reduces program retention by 0.26pp, or about 0.6%. This estimate is smaller than the decrease in initial program enrollment documented in my earlier results, which is consistent with the fact that students already enrolled in a program likely face a lower marginal cost of finishing. For example, they have likely already completed some of the coursework needed to earn a degree in the subject. I also estimate the effects of layoffs on retention separately for each program group using a modified version of the systems of equations setup.³ Table B.3.2 presents these results, which indicate that the largest elasticities come from students' responses to layoffs in STEM and other programs. Columns (2) through (5) of Table B.3.1 document what choices students make when layoffs deter them from continuing in vocational programs. While the estimates are imprecise, the largest coefficient appears in Column (5), which measures the share of students who were enrolled in a program in the prior year but are no longer formally enrolled in postsecondary education. In most cases, this means that a student has dropped out of her community college program without earning a degree.⁴ Given the large labor market returns to degree completion, this type of substitution effect may negatively impact students' longer-run outcomes and suggests that policies that assist students in switching between programs after local labor market shocks could improve student outcomes.

² In these calculations, I only consider enrollment in the college at which students earn the most credits during a given year. That is, if a student enrolls in two colleges within one year, she is assigned to enrollment only at the college in which she earns more credits.
³ Specifically, I regress a program's retention rate on the vector of layoffs occurring in each occupation group, county control variables, county fixed effects, and cohort fixed effects.
⁴ Students could also be enrolled in colleges not covered by the NSC data. However, these types of colleges make up less than 1% of U.S. postsecondary institutions overall (National Student Clearinghouse Research Center, 2017).

Table B.3.1: Effect of Layoffs on Retention in Related Programs

                                 Number per 100 Prior-Year Vocational Students:
                              Same        Different   Different    Earned      Not
                              Program     Program     College      Degree      Observed
Layoff measure:                 (1)         (2)         (3)          (4)         (5)
Layoffs per 10,000 in
occupation group              -0.264**    -0.034      -0.008        0.027       0.279**
                              (0.128)     (0.027)     (0.043)      (0.052)     (0.129)
Outcome Mean                   43.48       11.92       10.62         8.54       25.44
County-Program-Year Obs.       3,364       3,364       3,364        3,364       3,364
R-Squared                      0.246       0.300       0.270        0.374       0.276

Notes: The unit of observation is a county-year-program triad. Each coefficient is estimated from a separate regression and represents β in equation B.1, the effect of an additional layoff per 10,000 working-age residents in a given occupation group on retention in related programs. All regressions include controls for the share of graduates that are white, male, and categorized as economically disadvantaged; average 11th grade math and reading test scores; and the county unemployment rate, logged size of the labor force, and the number of layoffs per 10,000 working-age residents in non-community college occupations during a cohort's first year of college.
All standard errors are clustered at the county level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.

Table B.3.2: Own-Layoff Effects on Program Retention Rates

                            Retention per 100 Students in:
                         Business   Health    Trades     STEM     Law Enf.    Other
Layoff measure:            (1)        (2)       (3)       (4)       (5)        (6)
Layoffs per 10,000 in
own occupation group     -0.250     -0.082    -0.364    -1.307    -0.226     -3.600***
                         (0.546)    (0.275)   (0.246)   (0.951)   (0.204)    (1.358)
Outcome Mean              41.41      43.93     43.98     45.25     41.97      44.37
County-Year Obs.           566        566       560       554       560        558
R-Squared                 0.353      0.291     0.253     0.245     0.285      0.233

Notes: The unit of observation is a county-cohort pair. Each coefficient is estimated from a separate regression and represents the effect of an additional layoff per 10,000 working-age residents in a given occupation group on retention in related programs. All regressions include controls for the share of graduates that are white, male, and categorized as economically disadvantaged; average 11th grade math and reading test scores; and the county unemployment rate, logged size of the labor force, and the number of layoffs per 10,000 working-age residents in non-community college occupations during a cohort's first year of college. All standard errors are clustered at the county level. ∗ p < 0.10, ∗∗ p < 0.05, ∗∗∗ p < 0.01.

B.4 Substitution Between Narrower Program Groups

One limitation of the main analysis is that it combines multiple, potentially distinct programs into a single program group. To investigate substitution patterns between narrower program groups, I re-estimate the system of equations presented in equation (2.5) of the main text using enrollment in the two-digit occupation codes that comprise each program group as the dependent variables. For example, rather than estimating how business layoffs affect enrollment in business programs overall, I separately estimate how business layoffs affect enrollment in management, business and financial operations, legal, sales, and administrative support programs.
I present these own-layoff effects in Figure B.4.1.

Figure B.4.1: Effect of Layoffs on Enrollment in Narrower Program Groups
(Panels by program group: Business — 11 Management; 13 Business & Financial Operations; 23 Legal; 41 Sales & Related; 43 Office & Administrative Support. Health — 29 Healthcare Practitioners & Technical; 31 Healthcare Support. Skilled Trades — 37 Building, Grounds Cleaning & Maintenance; 45 Farming, Fishing, & Forestry; 47 Construction & Extraction; 49 Installation, Maintenance, & Repair; 51 Production; 53 Transportation & Material Moving. STEM — 15 Computer & Mathematical; 17 Architecture & Engineering; 19 Life, Physical, & Social Science. Law Enforcement — 33 Protective Service. Other — 21 Community & Social Service; 25 Education, Training, & Library; 27 Arts, Design, Entertainment, Sports, & Media; 35 Food Preparation & Serving Related; 39 Personal Care & Service.)

The results indicate that the reduction in business program enrollment is driven by students forgoing enrollment in management-related programs, such as business administration, and the reduction in healthcare programs is driven by students forgoing enrollment in healthcare practitioner programs, such as nursing. The reductions in enrollment in skilled trades programs are driven by programs in the installation, maintenance, and repair and production categories, which include auto mechanic and welding degrees. The responses to STEM and other layoffs are not substantially different across each program group's occupational categories.

Next, I analyze substitution patterns relative to the two-digit occupation code that experiences the largest own-layoff effect in each program group. For example, because the largest decrease in business program enrollment comes from the management group, I compare the skills of all other two-digit occupation codes to the skills needed for management occupations to see if students are substituting into similar programs.
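These comparisons rely on measuring how far apart two occupations' skill requirements are. One standard construction is the cosine distance between occupation-level skill vectors; the vectors below are made-up placeholders (not actual O*NET or chapter values), so this is only a sketch of the mechanics:

```python
import numpy as np

# Hypothetical skill-importance vectors for two occupation groups; each entry
# is the importance of one skill (e.g., math, communication, mechanical, care).
skills = {
    "management": np.array([0.9, 0.7, 0.2, 0.4]),
    "nursing":    np.array([0.5, 0.8, 0.9, 0.3]),
}

def skill_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine distance: 0 for identical skill mixes, larger for dissimilar ones."""
    cos_sim = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(1.0 - cos_sim)

d = skill_distance(skills["management"], skills["nursing"])
print(round(d, 2))  # 0.2 with these placeholder vectors
```

Programs whose occupations sit at a small distance from the layoff-affected occupation are the ones into which the substitution effects documented below are largest.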
Figure B.4.2 shows how the substitution patterns for each program group relate to the skill distance measures.

Figure B.4.2: Substitution into Narrower Program Groups Requiring Similar Skills
(Six panels — Business, Health, Skilled Trades, STEM, Law Enforcement, and Other layoffs — each plotting estimated effect sizes against skill distance, with program groups distinguished as Business, Health, Trades, STEM, Law Enf., and Other.)

Figure B.4.3 then plots the pooled substitution effects against the skill distance measures for all six program groups. As in Figure B.1.8 in the main text, the largest substitution effects occur at the start of the x-axis, and there is a downward slope, indicating that substitution effects are largest in the most similar programs and diminish as skill distance increases. However, the results are less precise when considering enrollment in smaller program categories.

Figure B.4.3: Relationship Between Substitution Effects & Skill Distance, Narrower Programs
(Fitted line: Effect = 0.099 − 0.157 × Similarity; n = 110; RMSE = 0.264; R² = 1.7%.)

APPENDIX C

CHAPTER 3 APPENDIX

C.1 Figures & Tables

Figure C.1.1: ASD and Non-ASD Special Education Incidence Event Studies
(Panels: Any ASD Incidence; Any Non-ASD Incidence.)
Notes: This figure plots event study estimates in which we replace PostMandate_t × NonDisadv_i with NonDisadv_i interacted with a set of year dummies in equation (3.1). The NonDisadv×t control is excluded because of collinearity with the event study variables. Year 2011 is excluded, so all estimates are relative to that year. Each point represents the point estimate and the bars extending from each point show the 95% confidence interval that is calculated from standard errors that are clustered at the school district level.

Figure C.1.2: Test Scores Event Studies, using Non-Disabled Control Group
(Panels: Math Test Score (vs. Non-Disabled Students); Reading Test Score (vs. Non-Disabled Students).)
Notes: This figure plots event study estimates in which we replace PostMandate_t × NonDisadv_i × ASD_it with NonDisadv_i × ASD_it interacted with a set of year dummies in equation (3.2). Year 2011 is excluded, so all estimates are relative to that year. Each point represents the point estimate and the bars extending from each point show the 95% confidence interval that is calculated from standard errors that are clustered at the school district level.

Figure C.1.3: Event Study Estimates of Main Outcomes
(Panels: ASD Program; Resource/Cog. Program; No Program; Special Ed School; Gen Ed 80% or more; Average Special Ed FTE; ASD Teacher Consultant; Speech Therapy; Occupational Therapy.)
Notes: This figure plots event study estimates in which we replace PostMandate_t × NonDisadv_i × ASD_it with NonDisadv_i × ASD_it interacted with a set of year dummies in equation (3.2). Year 2011 is excluded, so all estimates are relative to that year.
Each point represents the point estimate and the bars extending from each point show the 95% confidence interval that is calculated from standard errors that are clustered at the school district level.
(Additional panels of Figure C.1.3: Social Worker; Any.)

Figure C.1.4: Test Scores Event Studies, using All Non-ASD Control Group
(Panels: Math Test Score (vs. All Non-ASD Students); Reading Test Score (vs. All Non-ASD Students).)
Notes: This figure plots event study estimates in which we replace PostMandate_t × NonDisadv_i × ASD_it with NonDisadv_i × ASD_it interacted with a set of year dummies in equation (3.2). Year 2011 is excluded, so all estimates are relative to that year. Each point represents the point estimate and the bars extending from each point show the 95% confidence interval that is calculated from standard errors that are clustered at the school district level.

Figure C.1.5: Test Score Event Studies, using Non-ASD Special Ed Control Group
(Panels: Math Test Score (vs. Non-ASD Disabled Students); Reading Test Score (vs. Non-ASD Disabled Students).)
Notes: This figure plots event study estimates in which we replace PostMandate_t × NonDisadv_i × ASD_it with NonDisadv_i × ASD_it interacted with a set of year dummies in equation (3.2). Year 2011 is excluded, so all estimates are relative to that year. Each point represents the point estimate and the bars extending from each point show the 95% confidence interval that is calculated from standard errors that are clustered at the school district level.

Table C.1.1: Descriptive Tabulations of Analysis Variables

Variable                        All        ASD      Non-ASD       Non-
                                                    Special Ed.   Special Ed.
Demographics
White                          0.683      0.754      0.651         0.687
Male                           0.513      0.858      0.644         0.489
LEP                            0.055      0.030      0.053         0.055
Poverty                        0.509      0.434      0.691         0.482
Disability
ASD                            0.010
Any Non-ASD                    0.133
Cognitive                      0.011
Emotional                      0.009
Speech                         0.036
Learning Disability            0.055
Other Health                   0.016
Observations                 3,854,234   38,803     506,432      3,308,999

                                          ASD      Non-ASD
                                                   Special Ed.
Special Education Program
ASD                                      0.197      0.001
Resource                                 0.551      0.611
Cognitive                                0.119      0.090
Other                                    0.031      0.046
None                                     0.129      0.268
Education Setting
Special Ed. School                       0.056      0.018
Gen. Ed. > 80%                           0.455      0.594
Gen. Ed. 40-79%                          0.149      0.143
Gen. Ed. < 40%                           0.207      0.074
Average FTE                              0.353      0.194
Special Education Support Services
ASD Teaching Consultant                  0.130      0.003
Non-ASD Teaching Consultant              0.091      0.074
Language                                 0.790      0.478
Social Worker                            0.691      0.220
Occupational Therapy                     0.401      0.085
Physical Therapy                         0.031      0.027
Transportation                           0.041      0.009
Other Service                            0.042      0.023
Any Service                              0.943      0.674
Observations                             38,621     506,432

Notes: Authors' tabulations from data on students in grades 2-8 from the 2009-2010 to the 2014-2015 school years. The sample sizes for the ASD groups in the top and bottom panels differ slightly because a small number of students with an ASD diagnosis do not receive any special education services.

Table C.1.2: Overlap Between Free/Reduced Price Lunch and Medicaid in Michigan, by Family Income

By Free/Reduced Price Lunch Status:
FRPL Status       Percent Medicaid   Percent Private Insurance   Percent Insured
Eligible               72.93%               31.03%                   95.40%
Not Eligible           12.86%               88.53%                   97.54%

By Family Income as Percent of Poverty Line:
Family Income     Percent Medicaid   Percent Private Insurance   Percent Insured
≤ 135% FPL             81.05%               22.50%                   95.63%
135-185% FPL           50.87%               54.19%                   94.75%
185-250% FPL           29.13%               74.53%                   95.95%
250-350% FPL           14.60%               86.82%                   96.84%
≥ 350% FPL              5.16%               95.29%                   98.57%

Notes: Authors' tabulations from the 2008-2016 American Community Survey among children who were in grades K-8 at Michigan public schools (N = 84,477). "FPL" stands for Federal Poverty Line. Note that insurance counts may exceed 100% as some people remain eligible for Medicaid while enrolled in private plans.

Table C.1.3: The Effect of the ASD Insurance Mandate on Disability Incidence

Dependent Variable:        ASD          Non-ASD      Cognitive    Emotional    Speech       Learning     Other Health
                                        Disability   Disability   Disability   Disability   Disability   Disability
                           (1)          (2)          (3)          (4)          (5)          (6)          (7)
Panel A: Main Estimates
Non-disadv × Post-2012   -0.00045*    -0.00004      0.00001      0.00126***  -0.00283***   0.00399***  -0.00266***
                         (0.00024)    (0.00105)    (0.00032)    (0.00028)    (0.00065)    (0.00088)    (0.00043)

Panel B: Including Linear Time Trend Interacted with Non-Disadvantaged Status
Non-disadv × Post-2012   -0.00004     -0.00004     -0.00028      0.00051*    -0.00013     -0.00033      0.00034
                         (0.00022)    (0.00097)    (0.00031)    (0.00029)    (0.00064)    (0.00063)    (0.00039)

Panel C: School District Fixed Effects
Non-disadv × Post-2012   -0.00044*    -0.00013     -0.00005      0.00125***  -0.00303***   0.00418***  -0.00254***
                         (0.00023)    (0.00108)    (0.00031)    (0.00031)    (0.00066)    (0.00098)    (0.00041)

Incidence Rate: 0.010 (ASD); 0.133 (Non-ASD); 0.011 (Cognitive); 0.009 (Emotional)

Notes: Authors' estimates of equation (3.1) as described in the text using data on students in grades 2-8 from the 2009-2010 to the 2014-2015 school years. The sample includes only students who are always or never eligible for free/reduced price lunch. Each column is a separate regression; N=3,854,234. "Non-disadv" is an indicator for whether the student is never eligible for free/reduced price lunch in any observed year of schooling. Students who are eligible for free/reduced price lunch in only some years of schooling are excluded from the regression.
Table C.1.4: The Effect of the ASD Insurance Mandate on Special Education Services

Dependent variable (each column is a separate regression):
(1) ASD Program; (2) Resource or Cognitive Program; (3) No Sped; (4) Special Ed. School; (5) General Ed. > 80%; (6) Sped FTE Rate; (7) ASD Teacher Consultant; (8) Language Services; (9) Occupational Therapy Services; (10) Social Worker; (11) Any Sped Services.

                          (1)        (2)        (3)        (4)        (5)        (6)
Non-disadv*Post-2012*ASD  0.031      -0.064**   0.034***   0.004      0.000      -0.008
                          (0.035)    (0.030)    (0.011)    (0.004)    (0.015)    (0.009)
Non-disadv                0.002      -0.133***  0.143***   -0.001**   0.122***   -0.056***
                          (0.002)    (0.004)    (0.004)    (0.001)    (0.005)    (0.003)
ASD                       0.187***   -0.048     -0.104***  0.009*     -0.145***  0.115***
                          (0.0449)   (0.0367)   (0.0104)   (0.005)    (0.019)    (0.024)
Non-disadv*Post-2012      0.000      0.007      -0.012***  0.000      -0.031***  0.012***
                          (0.002)    (0.005)    (0.005)    (0.001)    (0.005)    (0.002)
Non-disadv*ASD            -0.062     0.152***   -0.109***  -0.006     -0.018     -0.001
                          (0.040)    (0.034)    (0.012)    (0.004)    (0.019)    (0.023)
Post-2012*ASD             -0.022     0.032      -0.008     -0.007     -0.017     0.005
                          (0.038)    (0.032)    (0.009)    (0.006)    (0.014)    (0.009)
Observations              545,053    545,053    545,053    455,751    455,751    332,372
ASD Mean                  0.197      0.670      0.129      0.056      0.455      0.353

                          (7)        (8)        (9)        (10)       (11)
Non-disadv*Post-2012*ASD  -0.023**   0.005      -0.004     -0.021*    0.020**
                          (0.011)    (0.013)    (0.012)    (0.013)    (0.010)
Non-disadv                0.000      0.101***   0.023***   -0.089***  0.055***
                          (0.001)    (0.004)    (0.003)    (0.005)    (0.005)
ASD                       0.125***   0.300***   0.257***   0.406***   0.255***
                          (0.016)    (0.015)    (0.014)    (0.019)    (0.011)
Non-disadv*Post-2012      -0.001     -0.020***  0.001      0.004      -0.015***
                          (0.001)    (0.004)    (0.003)    (0.006)    (0.005)
Non-disadv*ASD            0.039**    -0.068***  0.014      0.105***   -0.053***
                          (0.018)    (0.015)    (0.015)    (0.018)    (0.011)
Post-2012*ASD             -0.015     0.009      0.003      0.028**    -0.021**
                          (0.011)    (0.013)    (0.012)    (0.013)    (0.010)
Observations              545,053    545,053    545,053    545,053    545,053
ASD Mean                  0.130      0.790      0.401      0.691      0.943

Notes: Authors’ estimates of equation (3.2) as described in the text using data on students in grades 2-8 from the 2010-2011 to the 2014-2015 school years. The sample includes only students who are always or never eligible for free/reduced price lunch. Each column is a separate regression. “Non-disadv” is an indicator for the student never being eligible for free/reduced price lunch in any observed year of schooling, and “ASD” is an indicator for whether the student has an ASD diagnosis in that year. All regressions include controls for whether a student is white, male, and limited English proficient, as well as school and grade-by-year fixed effects. The final row of the table provides dependent variable means for the ASD sample. Standard errors are clustered at the school district level: *, **, *** indicate significance at the 10, 5, and 1 percent level, respectively.

Table C.1.5: The Effect of the ASD Insurance Mandate on Other Special Education Services

Dependent variable (each column is a separate regression):
(1) Other Sped Program; (2) Gen. Ed. 40-79%; (3) Gen. Ed. < 40%; (4) Any General Ed.; (5) Non-ASD Teacher Consultant; (6) Physical Therapy; (7) Transportation Services.

                          (1)        (2)        (3)        (4)        (5)        (6)        (7)
Non-disadv*Post-2012*ASD  -0.001     -0.005     -0.001     -0.006     0.003      -0.002     -0.001
                          (0.005)    (0.010)    (0.014)    (0.004)    (0.007)    (0.005)    (0.006)
Observations              545,053    455,751    455,751    455,751    545,053    545,053    545,053
ASD Mean                  0.031      0.149      0.207      0.811      0.091      0.031      0.041

Notes: Authors’ estimates of equation (3.2) as described in the text using data on students in grades 2-8 from the 2010-2011 to the 2014-2015 school years. The sample includes only students who are always or never eligible for free/reduced price lunch. Each column is a separate regression.
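The estimates above are triple-differences (Non-disadv x Post-2012 x ASD) coefficients from equation (3.2). As an illustrative sketch of the identifying comparison only (the data, group structure, and planted effect size below are simulated and hypothetical, not the chapter's), the triple difference can be computed directly from the eight cell means:

```python
# Triple-differences (DDD) from cell means, in the spirit of equation (3.2).
# All data below are simulated; TRUE_DDD is a hypothetical planted effect,
# not an estimate from the chapter.
import itertools
import random

random.seed(0)
TRUE_DDD = 0.05  # hypothetical effect loading on the treated cell only

def simulate_outcome(non_disadv, post, asd):
    # Additive main effects and two-way interactions; the planted effect
    # enters only the Non-disadv x Post-2012 x ASD cell.
    y = 0.2 + 0.1 * non_disadv + 0.05 * post + 0.3 * asd
    y += 0.02 * non_disadv * post + 0.04 * non_disadv * asd + 0.03 * post * asd
    y += TRUE_DDD * non_disadv * post * asd
    return y + random.gauss(0, 0.01)

# Mean outcome in each of the eight Non-disadv x Post x ASD cells.
cell_mean = {}
for nd, post, asd in itertools.product((0, 1), repeat=3):
    draws = [simulate_outcome(nd, post, asd) for _ in range(5000)]
    cell_mean[(nd, post, asd)] = sum(draws) / len(draws)

def did(nd):
    # Difference-in-differences (ASD vs. non-ASD, post vs. pre) within one
    # free/reduced price lunch group.
    return (cell_mean[(nd, 1, 1)] - cell_mean[(nd, 0, 1)]) - (
        cell_mean[(nd, 1, 0)] - cell_mean[(nd, 0, 0)]
    )

ddd = did(1) - did(0)  # triple difference; recovers TRUE_DDD up to noise
```

In the chapter, the analogous coefficient comes from a regression version of this comparison with student controls, school and grade-by-year fixed effects, and district-clustered standard errors; the cell-means arithmetic above only isolates the identifying variation.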
“Non-disadv” is an indicator for the student never being eligible for free/reduced price lunch, and “ASD” is an indicator for whether the student has an ASD diagnosis in that year. All regressions include the full set of controls listed in equation (3.2), including controls for whether a student is white, male, and limited English proficient, as well as school and grade-by-year fixed effects. The final row of the table provides dependent variable means for the ASD sample. Standard errors are clustered at the school district level: *, **, *** indicate significance at the 10, 5, and 1 percent level, respectively.

Table C.1.6: The Effect of the ASD Insurance Mandate on Test Scores

                                Math                                  Reading
Control Group:            All        Non-ASD    Non-ASD    All        Non-ASD    Non-ASD
                                     Non-Sped   Sped                  Non-Sped   Sped
                          (1)        (2)        (3)        (4)        (5)        (6)
Non-disadv*Post-2012*ASD  0.001      0.005      -0.010     0.021      0.018      0.004
                          (0.026)    (0.026)    (0.027)    (0.029)    (0.029)    (0.030)
Non-disadv                0.156***   0.144***   0.139***   0.197***   0.181***   0.157***
                          (0.005)    (0.005)    (0.006)    (0.004)    (0.004)    (0.007)
ASD                       -0.091***  -0.129***  0.112***   -0.097***  -0.166***  0.195***
                          (0.016)    (0.016)    (0.018)    (0.017)    (0.018)    (0.019)
Non-disadv*Post           0.014***   0.011**    0.033***   -0.002     0.001      0.014*
                          (0.005)    (0.005)    (0.007)    (0.003)    (0.003)    (0.007)
Non-disadv*ASD            -0.079***  -0.070***  -0.016     -0.186***  -0.168***  -0.143***
                          (0.018)    (0.018)    (0.020)    (0.022)    (0.022)    (0.024)
Post*ASD                  0.023      0.022      0.064***   0.014      0.025      0.029
                          (0.021)    (0.021)    (0.021)    (0.023)    (0.023)    (0.024)
Lagged Achievement        0.738***   0.731***   0.598***   0.644***   0.618***   0.564***
                          (0.006)    (0.006)    (0.014)    (0.003)    (0.003)    (0.010)
Observations              1,754,971  1,579,046  185,814    1,749,290  1,578,937  180,172

Notes: Authors’ estimates of equation (3.2) as described in the text using data on students in grades 2-8 from the 2009-2010 to the 2013-2014 school years. 2014-2015 is excluded because Michigan changed from the Michigan Educational Assessment Program (MEAP) to the M-STEP exam and restructured alternative examination options. The sample includes only students who are always or never eligible for free/reduced price lunch. Each column is a separate regression. “Non-disadv” is an indicator for the student never being eligible for free/reduced price lunch in any observed year of schooling, and “ASD” is an indicator for whether the student has an ASD diagnosis in that year. All regressions include controls for whether a student is white, male, and limited English proficient, as well as school and grade-by-year fixed effects. Standard errors are clustered at the school district level: *, **, *** indicate significance at the 10, 5, and 1 percent level, respectively.

Table C.1.7: The Effect of the ASD Insurance Mandate on Taking Regular Exams

                                Math                                  Reading
Control Group:            All        Non-ASD    Non-ASD    All        Non-ASD    Non-ASD
                                     Non-Sped   Sped                  Non-Sped   Sped
                          (1)        (2)        (3)        (4)        (5)        (6)
Non-disadv*Post-2012*ASD  -0.025     0.001      -0.047*    -0.030     -0.003     -0.050*
                          (0.024)    (0.024)    (0.025)    (0.024)    (0.024)    (0.026)
Non-disadv                0.028***   0.001*     0.058***   0.031***   0.000      0.065***
                          (0.001)    (0.000)    (0.006)    (0.001)    (0.000)    (0.006)
ASD                       -0.351***  -0.406***  -0.203***  -0.351***  -0.413***  -0.183***
                          (0.026)    (0.028)    (0.028)    (0.026)    (0.028)    (0.029)
Non-disadv*Post           0.017***   -0.000     0.046***   0.018***   -0.001***  0.045***
                          (0.001)    (0.000)    (0.007)    (0.001)    (0.000)    (0.007)
Non-disadv*ASD            0.057**    0.089***   0.017      0.048*     0.083***   0.005
                          (0.025)    (0.026)    (0.027)    (0.025)    (0.025)    (0.027)
Post*ASD                  -0.060**   -0.098***  0.039      -0.049**   -0.089***  0.057**
                          (0.025)    (0.024)    (0.025)    (0.025)    (0.025)    (0.026)
Observations              2,712,322  2,352,871  385,960    2,712,322  2,352,871  385,960

Notes: Authors’ estimates of equation (3.2) as described in the text using data on students in grades 2-8 from the 2009-2010 to the 2013-2014 school years. 2014-2015 is excluded because Michigan changed from the Michigan Educational Assessment Program (MEAP) to the M-STEP exam and restructured alternative examination options.
Table C.1.8: Heterogeneous Effects of the ASD Insurance Mandate on ASD Incidence

Panel A: By Gender and Race
                       Girls        Boys         White & Asian   Black & Hispanic
                       (1)          (2)          (3)             (4)
Non-disadv*Post-2012   -0.00020     -0.00073*    -0.00027        0.00034
                       (0.00016)    (0.00044)    (0.00024)       (0.00082)
Observations           1,878,120    1,976,114    3,272,013       582,221
Incidence Rate         0.003        0.017        0.011           0.007

Panel B: By Grade
                       KG         Grade 1    Grade 2      Grade 3    Grade 4    Grade 5    Grade 6    Grade 7    Grade 8
Non-disadv*Post-2012   -0.00036   -0.00082   -0.00204***  -0.00068   -0.00024   0.00121**  0.00037    -0.00048   -0.00065
                       (0.00051)  (0.00052)  (0.00055)    (0.00057)  (0.00058)  (0.00060)  (0.00063)  (0.00059)  (0.00054)
Observations           594,630    545,212    526,440      520,105    524,539    538,197    559,531    588,427    503,930
Incidence Rate         0.007      0.008      0.009        0.010      0.010      0.011      0.010      0.010      0.010

Notes: Authors’ estimates of equation (3.1) as described in the text using data on students in grades K-8 from the 2009-2010 to the 2014-2015 school years. The sample includes only students who are always or never eligible for free/reduced price lunch. Each column is a separate regression. “Non-disadv” is an indicator for the student never being eligible for free/reduced price lunch. All regressions include the full set of controls listed in equation (3.2), including controls for whether a student is white, male, and limited English proficient, as well as school and grade-by-year fixed effects. Standard errors are clustered at the school district level: *, **, *** indicate significance at the 10, 5, and 1 percent level, respectively.

Table C.1.9: The Effect of the ASD Insurance Mandate, by Gender and Race

Each cell reports the coefficient on Non-disadv*Post-2012*ASD from a separate regression, with the standard error in parentheses, followed by N and the ASD-sample dependent variable mean.

Panel A: Boys
  (1) ASD Program                     0.032 (0.034)     N = 359,165   mean 0.193
  (2) Resource or Cognitive Program   -0.057* (0.029)   N = 359,165   mean 0.662
  (3) No Sped                         0.029** (0.012)   N = 359,165   mean 0.130
  (4) Special Ed. School              0.006 (0.005)     N = 300,256   mean 0.062
  (5) General Ed. > 80%               0.000 (0.016)     N = 300,256   mean 0.529
  (6) Sped FTE Rate                   -0.008 (0.008)    N = 218,809   mean 0.349
  (7) ASD Teacher Consultant          -0.021* (0.011)   N = 359,165   mean 0.130
  (8) Language Services               -0.003 (0.014)    N = 359,165   mean 0.788
  (9) Occupational Therapy Services   -0.006 (0.013)    N = 359,165   mean 0.400
  (10) Social Worker                  -0.024** (0.012)  N = 359,165   mean 0.694
  (11) Any Sped Services              0.012 (0.010)     N = 359,165   mean 0.943

Panel B: Girls
  (1) ASD Program                     0.012 (0.047)     N = 185,888   mean 0.222
  (2) Resource or Cognitive Program   -0.092** (0.046)  N = 185,888   mean 0.644
  (3) No Sped                         0.066*** (0.023)  N = 185,888   mean 0.123
  (4) Special Ed. School              -0.005 (0.011)    N = 155,495   mean 0.076
  (5) General Ed. > 80%               -0.002 (0.031)    N = 155,495   mean 0.489
  (6) Sped FTE Rate                   -0.011 (0.029)    N = 113,563   mean 0.380
  (7) ASD Teacher Consultant          -0.031 (0.021)    N = 185,888   mean 0.131
  (8) Language Services               0.050* (0.026)    N = 185,888   mean 0.805
  (9) Occupational Therapy Services   0.009 (0.030)     N = 185,888   mean 0.410
  (10) Social Worker                  -0.022 (0.030)    N = 185,888   mean 0.676
  (11) Any Sped Services              0.050** (0.020)   N = 185,888   mean 0.944

Panel C: Black & Hispanic
  (1) ASD Program                     0.083 (0.091)     N = 82,393    mean 0.320
  (2) Resource or Cognitive Program   -0.058 (0.082)    N = 82,393    mean 0.579
  (3) No Sped                         -0.016 (0.034)    N = 82,393    mean 0.093
  (4) Special Ed. School              0.012 (0.012)     N = 58,337    mean 0.073
  (5) General Ed. > 80%               0.005 (0.049)     N = 58,337    mean 0.428
  (6) Sped FTE Rate                   0.019 (0.028)     N = 29,426    mean 0.396
  (7) ASD Teacher Consultant          -0.012 (0.025)    N = 82,393    mean 0.088
  (8) Language Services               0.012 (0.035)     N = 82,393    mean 0.849
  (9) Occupational Therapy Services   -0.001 (0.043)    N = 82,393    mean 0.387
  (10) Social Worker                  -0.007 (0.040)    N = 82,393    mean 0.662
  (11) Any Sped Services              -0.011 (0.032)    N = 82,393    mean 0.946

Panel D: White & Asian
  (4) Special Ed. School              0.001 (0.003)     N = 397,414   mean 0.063
  (5) General Ed. > 80%               0.009 (0.013)     N = 397,414   mean 0.533
  (8) Language Services               -0.010 (0.012)    N = 462,660   mean 0.784
  (9) Occupational Therapy Services   -0.005 (0.012)    N = 462,660   mean 0.403
  (10) Social Worker                  -0.020 (0.013)    N = 462,660   mean 0.695
  (11) Any Sped Services              0.014* (0.008)    N = 462,660   mean 0.942
  [The remaining Panel D estimates, whose column assignment to (1) ASD Program, (2) Resource or Cognitive Program, (3) No Sped, (6) Sped FTE Rate, and (7) ASD Teacher Consultant could not be recovered from the source extraction, are: -0.010 (0.009); -0.016* (0.009); -0.008 (0.009); -0.033** (0.014); 0.039*** (0.010). The corresponding column means are 0.183 (1), 0.668 (2), 0.133 (3), 0.350 (6, N = 302,946), and 0.135 (7).]

Notes: Authors’ estimates of equation (3.2) as described in the text using data on students in grades 2-8 from the 2009-2010 to the 2014-2015 school years. The sample includes only students who are always or never eligible for free/reduced price lunch. Each cell is a separate regression. “Non-disadv” is an indicator for the student never being eligible for free/reduced price lunch, and “ASD” is an indicator for whether the student has an ASD diagnosis in that year. All regressions include the full set of controls listed in equation (3.2), including controls for whether a student is white, male, and limited English proficient, as well as school and grade-by-year fixed effects. Standard errors are clustered at the school district level: *, **, *** indicate significance at the 10, 5, and 1 percent level, respectively.

Table C.1.10: The Effect of the ASD Insurance Mandate, by Grade

Each cell reports the coefficient on Non-disadv*Post-2012*ASD from a separate regression; standard errors in parentheses. Dependent variables by column: (1) ASD Program; (2) Resource or Cognitive Program; (3) No Sped; (4) Special Ed. School; (5) General Ed. > 80%; (6) Sped FTE Rate; (7) ASD Teacher Consultant; (8) Language Services; (9) Occupational Therapy Services; (10) Social Worker; (11) Any Sped Services.

Grade   (1)        (2)        (3)        (4)        (5)        (6)
KG      -0.006     0.027      -0.005     0.006      -0.001     -0.002
        (0.038)    (0.038)    (0.028)    (0.013)    (0.033)    (0.033)
1       0.045      -0.037     -0.017     -0.007     -0.032     0.004
        (0.041)    (0.041)    (0.028)    (0.012)    (0.032)    (0.025)
2       0.022      -0.072*    0.049**    -0.011     -0.021     0.012
        (0.046)    (0.042)    (0.025)    (0.010)    (0.031)    (0.031)
3       0.053      -0.10**    0.041*     0.013      -0.014     0.006
        (0.047)    (0.047)    (0.024)    (0.012)    (0.031)    (0.025)
4       0.055      -0.080*    0.029      -0.010*    -0.031     -0.001
        (0.044)    (0.042)    (0.021)    (0.006)    (0.033)    (0.021)
5       0.011      -0.040     0.018      -0.004     0.007      0.031
        (0.037)    (0.039)    (0.020)    (0.008)    (0.030)    (0.025)
6       0.004      -0.033     0.037**    0.002      0.013      -0.022
        (0.030)    (0.031)    (0.017)    (0.008)    (0.030)    (0.023)
7       0.020      -0.020     0.000      0.004      -0.015     -0.022
        (0.040)    (0.039)    (0.019)    (0.011)    (0.031)    (0.024)
8       0.022      -0.010     -0.004     0.010      0.020      -0.039*
        (0.025)    (0.032)    (0.021)    (0.009)    (0.032)    (0.022)

Grade   (7)        (8)        (9)        (10)       (11)
KG      -0.006     -0.014     -0.012     -0.049     -0.017
        (0.023)    (0.018)    (0.042)    (0.033)    (0.013)
1       -0.035     0.022      -0.018     -0.002     -0.001
        (0.025)    (0.021)    (0.032)    (0.033)    (0.015)
2       -0.047**   0.025      -0.037     -0.022     0.025
        (0.022)    (0.024)    (0.030)    (0.029)    (0.017)
3       -0.040*    0.008      -0.058**   -0.010     0.019
        (0.021)    (0.022)    (0.027)    (0.027)    (0.016)
4       -0.003     0.007      0.066**    -0.015     0.029
        (0.018)    (0.026)    (0.029)    (0.026)    (0.021)
5       -0.008     0.002      0.040      -0.041*    -0.002
        (0.018)    (0.026)    (0.026)    (0.024)    (0.018)
6       -0.011     -0.018     0.013      -0.020     0.007
        (0.018)    (0.025)    (0.024)    (0.023)    (0.017)
7       -0.015     -0.035     -0.029     0.001      0.013
        (0.020)    (0.026)    (0.023)    (0.026)    (0.018)
8       -0.021     -0.041     0.015      0.002      -0.013
        (0.022)    (0.028)    (0.023)    (0.025)    (0.021)

Notes: Authors’ estimates of equation (3.2) as described in the text using data on students in grades K-8 from the 2009-2010 to the 2014-2015 school years. The sample includes only students who are always or never eligible for free/reduced price lunch. Each cell is a separate regression and shows the estimate of the coefficient on the triple interaction term Non-disadv*Post-2012*ASD. All regressions include the full set of controls listed in equation (3.2), including controls for whether a student is white, male, and limited English proficient, as well as school and grade-by-year fixed effects. Standard errors are clustered at the school district level: *, **, *** indicate significance at the 10, 5, and 1 percent level, respectively.
Table C.1.11: Heterogeneous Effects of the ASD Insurance Mandate on Test Scores, using Non-Special Education Control Group

Panel A: Math, by Gender and Race
                       Girls      Boys       White/Asian   Black/Hisp.
                       (1)        (2)        (3)           (4)
Non-disadv*Post-2012   0.079      -0.005     0.006         -0.008
                       (0.070)    (0.028)    (0.028)       (0.091)
Observations           803,320    775,726    1,365,614     213,432

Panel B: Reading, by Gender and Race
                       Girls      Boys       White/Asian   Black/Hisp.
                       (1)        (2)        (3)           (4)
Non-disadv*Post-2012   0.061      0.008      0.035         -0.115
                       (0.087)    (0.030)    (0.031)       (0.108)
Observations           803,315    775,622    1,365,950     212,987

Panel C: Math, by Grade
                       Grade 4    Grade 5    Grade 6    Grade 7    Grade 8
                       (1)        (2)        (3)        (4)        (5)
Non-disadv*Post-2012   0.136*     -0.079     -0.013     0.006      0.019
                       (0.073)    (0.066)    (0.058)    (0.059)    (0.061)
Observations           298,060    300,174    309,489    326,170    343,020

Panel D: Reading, by Grade
                       Grade 4    Grade 5    Grade 6    Grade 7    Grade 8
                       (1)        (2)        (3)        (4)        (5)
Non-disadv*Post-2012   0.024      0.072      -0.069     0.039      0.069
                       (0.067)    (0.063)    (0.066)    (0.064)    (0.068)
Observations           298,012    299,973    309,512    326,192    343,089

Notes: Authors’ estimates of equation (3.2) as described in the text using data on students in grades 2-8 from the 2009-2010 to the 2013-2014 school years. 2014-2015 is excluded because Michigan changed from the Michigan Educational Assessment Program (MEAP) to the M-STEP exam and restructured alternative examination options. The sample includes only students who are always or never eligible for free/reduced price lunch. Each column is a separate regression. “Non-disadv” is an indicator for the student never being eligible for free/reduced price lunch, and “ASD” is an indicator for whether the student has an ASD diagnosis in that year. All regressions include the full set of controls listed in equation (3.2), including controls for whether a student is white, male, and limited English proficient, as well as school and grade-by-year fixed effects. Estimates also include controls for lagged test score. Standard errors are clustered at the school district level: *, **, *** indicate significance at the 10, 5, and 1 percent level, respectively.

Table C.1.12: The Effect of the ASD Insurance Mandate – Robustness Checks

Each cell reports the coefficient on Non-disadv*Post-2012*ASD from a separate regression, with the standard error in parentheses, followed by N and the ASD-sample dependent variable mean.

Panel A: Including Linear Time Trend Interacted with Non-Disadvantaged Status
  (1) ASD Program                     0.031 (0.035)     N = 545,053   mean 0.197
  (2) Resource or Cognitive Program   -0.064** (0.030)  N = 545,053   mean 0.670
  (3) No Sped                         0.034*** (0.011)  N = 545,053   mean 0.129
  (4) Special Ed. School              0.004 (0.004)     N = 455,751   mean 0.056
  (5) General Ed. > 80%               0.000 (0.015)     N = 455,751   mean 0.455
  (6) Sped FTE Rate                   -0.009 (0.009)    N = 332,372   mean 0.353
  (7) ASD Teacher Consultant          -0.023** (0.011)  N = 545,053   mean 0.130
  (8) Language Services               0.005 (0.013)     N = 545,053   mean 0.790
  (9) Occupational Therapy Services   -0.004 (0.012)    N = 545,053   mean 0.401
  (10) Social Worker                  -0.021* (0.012)   N = 545,053   mean 0.691
  (11) Any Sped Services              0.019* (0.010)    N = 545,053   mean 0.943

Panel B: School District Fixed Effects
  (1) ASD Program                     0.028 (0.036)     N = 545,053   mean 0.197
  (2) Resource or Cognitive Program   -0.068** (0.030)  N = 545,053   mean 0.670
  (3) No Sped                         0.041*** (0.011)  N = 545,053   mean 0.129
  (4) Special Ed. School              -0.006 (0.007)    N = 455,751   mean 0.056
  (5) General Ed. > 80%               0.014 (0.016)     N = 455,751   mean 0.455
  (6) Sped FTE Rate                   -0.013 (0.011)    N = 332,372   mean 0.353
  (7) ASD Teacher Consultant          -0.023** (0.011)  N = 545,053   mean 0.130
  (8) Language Services               -0.000 (0.014)    N = 545,053   mean 0.790
  (9) Occupational Therapy Services   -0.017 (0.013)    N = 545,053   mean 0.401
  (10) Social Worker                  -0.015 (0.014)    N = 545,053   mean 0.691
  (11) Any Sped Services              0.016 (0.010)     N = 545,053   mean 0.943

Panel C: Pre-Treatment Assignment of ASD Variable
  (1) ASD Program                     0.020 (0.032)     N = 422,974   mean 0.190
  (2) Resource or Cognitive Program   -0.072*** (0.028) N = 422,974   mean 0.661
  (3) No Sped                         0.049*** (0.011)  N = 422,974   mean 0.133
  (4) Special Ed. School              0.004 (0.004)     N = 352,054   mean 0.049
  (5) General Ed. > 80%               -0.011 (0.016)    N = 352,054   mean 0.534
  (6) Sped FTE Rate                   -0.002 (0.008)    N = 332,338   mean 0.338
  (7) ASD Teacher Consultant          -0.011 (0.011)    N = 252,537   mean 0.132
  (8) Language Services               0.001 (0.016)     N = 422,974   mean 0.779
  (9) Occupational Therapy Services   -0.012 (0.014)    N = 422,974   mean 0.373
  (10) Social Worker                  -0.041*** (0.013) N = 422,974   mean 0.700
  (11) Any Sped Services              0.019* (0.011)    N = 422,974   mean 0.939

Notes: Authors’ estimates of equation (3.2) as described in the text using data on students in grades 2-8 from the 2010-2011 to the 2014-2015 school years. The sample in Panel A includes only students who are always or never eligible for free/reduced price lunch, while the samples in Panels B and C include all students. Each cell is a separate regression. “Non-disadv” is an indicator for the student never being eligible for free/reduced price lunch, and “ASD” is an indicator for whether the student has an ASD diagnosis in that year. All regressions include the full set of controls listed in equation (3.2), including whether a student is white, male, and limited English proficient, as well as grade-by-year fixed effects. Estimates in Panels A and C include school fixed effects, while those in Panel B include school district fixed effects. Estimates in Panel A also control for a linear time trend interacted with non-disadvantaged status. Standard errors are clustered at the school district level: *, **, *** indicate significance at the 10, 5, and 1 percent level, respectively.

Table C.1.13: The Effect of the ASD Insurance Mandate on Test Scores – Robustness Checks

Each cell reports the coefficient on Non-disadv*Post-2012*ASD from a separate regression; standard errors in parentheses.

                                Math                                  Reading
Control Group:            All        Non-ASD    Non-ASD    All        Non-ASD    Non-ASD
                                     Non-Sped   Sped                  Non-Sped   Sped
                          (1)        (2)        (3)        (4)        (5)        (6)

Panel A: Including Linear Time Trend Interacted with Non-Disadvantaged Status
Non-disadv*Post-2012*ASD  0.001      0.005      -0.010     0.021      0.018      0.004
                          (0.026)    (0.026)    (0.027)    (0.029)    (0.029)    (0.030)
Observations              1,754,971  1,579,046  185,814    1,749,290  1,578,937  180,172

Panel B: School District Fixed Effects
Non-disadv*Post-2012*ASD  0.010      0.014      -0.001     0.029      0.027      0.014
                          (0.026)    (0.026)    (0.027)    (0.029)    (0.029)    (0.029)
Observations              1,754,971  1,579,046  185,814    1,749,290  1,578,937  180,172

Panel C: Pre-Treatment Assignment of ASD Variable
Non-disadv*Post-2012*ASD  0.000      0.002      -0.017     0.031      0.028      0.007
                          (0.027)    (0.027)    (0.028)    (0.029)    (0.029)    (0.032)
Observations              1,542,200  1,387,847  162,767    1,537,785  1,388,220  157,909

Panel D: Including 2014-15
Non-disadv*Post-2012*ASD  -0.019     -0.005     -0.045*    0.028      0.034      0.008
                          (0.024)    (0.024)    (0.025)    (0.026)    (0.026)    (0.028)
Observations              2,184,874  1,961,605  236,273    2,179,192  1,961,404  230,726

Notes: Authors’ estimates of equation (3.2) as described in the text using data on students in grades 2-8 from the 2009-2010 to the 2013-2014 school years, except where specified. The sample includes only students who are always or never eligible for free/reduced price lunch. Each cell is a separate regression. “Non-disadv” is an indicator for the student never being eligible for free/reduced price lunch, and “ASD” is an indicator for whether the student has an ASD diagnosis in that year. All regressions include the full set of controls listed in equation (3.2), including whether a student is white, male, and limited English proficient, as well as grade-by-year fixed effects. Estimates in Panels A, C, and D include school fixed effects, while those in Panel B include school district fixed effects. The estimates in Panel A also include a linear time trend interacted with non-disadvantaged status. Estimates also include controls for lagged test score. Standard errors are clustered at the school district level: *, **, *** indicate significance at the 10, 5, and 1 percent level, respectively.
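Every table in this appendix clusters standard errors at the school district level. A minimal sketch of why clustering matters, using a pairs-cluster bootstrap on simulated data (the district counts, shock sizes, and resampling scheme here are illustrative assumptions, not the chapter's procedure):

```python
# Illustrative pairs-cluster bootstrap: resample whole districts with
# replacement so the standard error reflects within-district correlation.
# All data are simulated, not the chapter's microdata.
import random

random.seed(1)

# 40 hypothetical districts, each with a district-level shock shared by
# all 50 of its students.
districts = []
for _ in range(40):
    shock = random.gauss(0, 0.5)
    districts.append([shock + random.gauss(0, 1.0) for _ in range(50)])

def overall_mean(clusters):
    values = [y for cluster in clusters for y in cluster]
    return sum(values) / len(values)

# Each bootstrap draw resamples districts, not individual students.
boot_means = []
for _ in range(500):
    resampled = [random.choice(districts) for _ in range(len(districts))]
    boot_means.append(overall_mean(resampled))

grand = sum(boot_means) / len(boot_means)
cluster_se = (
    sum((m - grand) ** 2 for m in boot_means) / (len(boot_means) - 1)
) ** 0.5
```

Because the district shock is shared within clusters, this cluster-level standard error is several times larger than the naive i.i.d. standard error one would get by treating the 2,000 students as independent, which is the motivation for district-level clustering in the tables above.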
Table C.1.14: The Effect of the ASD Insurance Mandate – No Sample Exclusion Based on Disadvantaged Status

Each cell reports the coefficient on Non-disadv*Post-2012*ASD from a separate regression, with the standard error in parentheses, followed by N and the ASD-sample dependent variable mean.

Panel A: Including All Students Who are Ever Observed Disadvantaged
  (1) ASD Program                     0.021 (0.030)     N = 705,616   mean 0.200
  (2) Resource or Cognitive Program   -0.041* (0.025)   N = 705,616   mean 0.672
  (3) No Sped                         0.020** (0.010)   N = 705,616   mean 0.125
  (4) Special Ed. School              0.002 (0.003)     N = 577,221   mean 0.056
  (5) General Ed. > 80%               -0.006 (0.014)    N = 577,221   mean 0.444
  (6) Sped FTE Rate                   -0.010 (0.007)    N = 421,418   mean 0.359
  (7) ASD Teacher Consultant          -0.017* (0.009)   N = 705,616   mean 0.129
  (8) Language Services               0.005 (0.011)     N = 705,616   mean 0.784
  (9) Occupational Therapy Services   0.003 (0.012)     N = 705,616   mean 0.395
  (10) Social Worker                  -0.011 (0.011)    N = 705,616   mean 0.688
  (11) Any Sped Services              0.012 (0.008)     N = 705,616   mean 0.940

Panel B: Including All Students Who are Ever Observed Disadvantaged and Linear Time Trend Interacted with Non-Disadvantaged Status
  (1) ASD Program                     0.021 (0.030)     N = 705,616   mean 0.200
  (2) Resource or Cognitive Program   -0.041* (0.025)   N = 705,616   mean 0.672
  (3) No Sped                         0.020** (0.010)   N = 705,616   mean 0.125
  (4) Special Ed. School              0.002 (0.003)     N = 577,221   mean 0.056
  (5) General Ed. > 80%               -0.006 (0.014)    N = 577,221   mean 0.444
  (6) Sped FTE Rate                   -0.010 (0.007)    N = 421,418   mean 0.359
  (7) ASD Teacher Consultant          -0.017* (0.009)   N = 705,616   mean 0.129
  (8) Language Services               0.005 (0.011)     N = 705,616   mean 0.784
  (9) Occupational Therapy Services   0.003 (0.012)     N = 705,616   mean 0.395
  (10) Social Worker                  -0.011 (0.011)    N = 705,616   mean 0.688
  (11) Any Sped Services              0.012 (0.008)     N = 705,616   mean 0.940

Notes: Authors’ estimates of equation (3.2) as described in the text using data on students in grades 2-8 from the 2010-2011 to the 2014-2015 school years. The sample includes all students. Each cell is a separate regression. “Non-disadv” is an indicator for the student never being eligible for free/reduced price lunch, and “ASD” is an indicator for whether the student has an ASD diagnosis in that year. All regressions include the full set of controls listed in equation (3.2), including whether a student is white, male, and limited English proficient, as well as school and grade-by-year fixed effects. The estimates in Panel B also include a linear time trend interacted with non-disadvantaged status. Standard errors are clustered at the school district level: *, **, *** indicate significance at the 10, 5, and 1 percent level, respectively.
Table C.1.15: The Effect of the ASD Insurance Mandate on Disability Incidence – No Sample Exclusion Based on Disadvantaged Status

Dependent variable (each column is a separate regression):
(1) ASD; (2) Non-ASD Disability; (3) Cognitive Disability; (4) Emotional Disability; (5) Speech Disability; (6) Learning Disability; (7) Other Health Disability.

                         (1)          (2)          (3)          (4)          (5)          (6)          (7)

Panel A: Excluding Linear Time Trend Interacted with Non-Disadvantaged Status
Non-disadv*Post-2012     -0.00032     0.00123      0.00004      0.00113***   -0.00215***  0.00420***   -0.00214***
                         (0.00024)    (0.00136)    (0.00028)    (0.00027)    (0.00053)    (0.00093)    (0.00040)

Panel B: Including Linear Time Trend Interacted with Non-Disadvantaged Status
Non-disadv*Post-2012     0.00009      -0.00050     -0.00038     0.00040      0.00012      -0.00113*    0.00056
                         (0.00026)    (0.00110)    (0.00030)    (0.00031)    (0.00061)    (0.00069)    (0.00039)

Incidence Rate           0.010        0.133        0.011        0.009        0.035        0.056        0.016

Notes: Authors’ estimates of equation (3.1) as described in the text using data on students in grades 2-8 from the 2009-2010 to the 2014-2015 school years. Each column is a separate regression; N = 4,970,113. “Non-disadv” is an indicator for the student never being eligible for free/reduced price lunch. All regressions include the full set of controls listed in equation (3.1), including whether a student is white, male, and limited English proficient, as well as school and grade-by-year fixed effects. Estimates in Panel B include the control for a linear time trend interacted with non-disadvantaged status. Standard errors are clustered at the school district level: *, **, *** indicate significance at the 10, 5, and 1 percent level, respectively.

Table C.1.16: The Effect of the ASD Insurance Mandate on Test Scores – No Sample Exclusion Based on Disadvantaged Status

                                Math                                  Reading
Control Group:            All        Non-ASD    Non-ASD    All        Non-ASD    Non-ASD
                                     Non-Sped   Sped                  Non-Sped   Sped
                          (1)        (2)        (3)        (4)        (5)        (6)

Panel A: Including All Students Who are Ever Observed Disadvantaged
Non-disadv*Post-2012*ASD  -0.011     -0.009     -0.023     0.014      0.010      -0.001
                          (0.023)    (0.023)    (0.024)    (0.026)    (0.026)    (0.026)
Observations              2,158,249  1,940,574  230,304    2,151,497  1,940,733  223,320

Panel B: Including All Students Who are Ever Observed Disadvantaged and Linear Time Trend Interacted with Non-Disadvantaged Status
Non-disadv*Post-2012*ASD  -0.012     -0.009     -0.023     0.014      0.010      -0.001
                          (0.023)    (0.023)    (0.024)    (0.026)    (0.026)    (0.026)
Observations              2,158,249  1,940,574  230,304    2,151,497  1,940,733  223,320

Notes: Authors’ estimates of equation (3.2) as described in the text using data on students in grades 2-8 from the 2009-2010 to the 2013-2014 school years, except where specified. The sample includes all students. Each cell is a separate regression. “Non-disadv” is an indicator for the student never being eligible for free/reduced price lunch, and “ASD” is an indicator for whether the student has an ASD diagnosis in that year. All regressions include the full set of controls listed in equation (3.2), including whether a student is white, male, and limited English proficient, as well as school and grade-by-year fixed effects. The estimates in Panel B include the linear time trend interacted with non-disadvantaged status. Estimates also include controls for lagged test score. Standard errors are clustered at the school district level: *, **, *** indicate significance at the 10, 5, and 1 percent level, respectively.

BIBLIOGRAPHY

Allen, J. (2019). Connecticut Community Colleges Offer SNAP Scholarship.
https://www.wnpr.org/post/connecticut-community-colleges-offer-snap-scholarship.

Altonji, J. G., P. Arcidiacono, and A. Maurel (2016). The Analysis of Field Choice in College and Graduate School: Determinants and Wage Effects. Handbook of the Economics of Education 5, 305–396.

Altonji, J. G., E. Blom, and C. Meghir (2012). Heterogeneity in Human Capital Investments: High School Curriculum, College Major, and Careers. Annual Review of Economics 4, 185–223.

Andreoni, J. and A. A. Payne (2011). Is Crowding Out Due Entirely to Fundraising? Evidence from a Panel of Charities. Journal of Public Economics 95(5-6), 334–343.

Andrews, R. J., S. A. Imberman, and M. F. Lovenheim (2017). Risky Business? The Effect of Majoring in Business on Earnings and Educational Attainment. NBER Working Paper No. 23575.

Armona, L., R. Chakrabarti, and M. F. Lovenheim (2018). How Does For-Profit College Attendance Affect Student Loans, Defaults and Labor Market Outcomes? NBER Working Paper No. 25042.

Bahr, P. R., S. Dynarski, B. Jacob, D. Kreisman, A. Sosa, and M. Wiederspan (2015). Labor Market Returns to Community College Awards: Evidence from Michigan. EPI Working Paper 01-2015.

Baio, J., et al. (2018). Prevalence of Autism Spectrum Disorder Among Children Aged 8 Years, Autism and Developmental Disabilities Monitoring Network. Morbidity and Mortality Weekly Report 67(6), 1–23.

Baker, R., E. Bettinger, B. Jacob, and I. Marinescu (2018). The Effect of Labor Market Information on Community College Students’ Major Choice. Economics of Education Review 65, 18–30.

Barry, C. L., A. J. Epstein, S. C. Marcus, A. Kennedy-Hendricks, M. K. Candon, M. Xie, and D. S. Mandell (2017). Effects of State Insurance Mandates on Health Care Use and Spending for Autism Spectrum Disorder. Health Affairs 36(10), 1754–1761.

Beffy, M., D. Fougere, and A. Maurel (2012). Choosing the Field of Study in Postsecondary Education: Do Expected Earnings Matter? The Review of Economics and Statistics 94(1), 334–347.
Belfield, C. and T. Bailey (2017). The Labor Market Returns to Sub-Baccalaureate College: A Review. CAPSEE Working Paper.
Bellemare, M. F. and C. J. Wichman (2019). Elasticities and the Inverse Hyperbolic Sine Transformation. Oxford Bulletin of Economics and Statistics.
Benson, C. (2018). Is Special Education a Pathway to Supplemental Security Income for Children? Working Paper.
Bergstrom, T., L. Blume, and H. Varian (1986). On the Private Provision of Public Goods. Journal of Public Economics 29(1), 25–49.
Betts, J. R. and L. L. McFarland (1995). Safe Port in a Storm: The Impact of Labor Market Conditions on Community College Enrollments. The Journal of Human Resources 30, 741–765.
Blank, R. (1989). The Effect of Medical Need and Medicaid on AFDC Participation. Journal of Human Resources 24(1), 54–87.
Buckles, K. S. and D. M. Hungerman (2018). The Incidental Fertility Effects of School Condom Distribution Programs. Journal of Policy Analysis and Management 37(3), 464–492.
Buescher, A. V., Z. Cidav, M. Knapp, and D. S. Mandell (2014). Costs of Autism Spectrum Disorders in the United Kingdom and the United States. JAMA Pediatrics 168(8), 721–728.
Buis, M. (2017). Fmlogit: Stata module fitting a fractional multinomial logit model by quasi maximum likelihood.
Burbidge, J. B., L. Magee, and A. L. Robb (1988). Alternative Transformations to Handle Extreme Values of the Dependent Variable. Journal of the American Statistical Association 83(401), 123–127.
Bureau of Labor Market Information and Strategic Initiatives (2014). Youth and Young Adults and the Michigan Labor Market. http://milmi.org/Portals/198/publications/Youth and Youth Adults and the Michigan Labor Market.pdf.
Caetano, G. and H. Macartney (2014). Quasi-Experimental Evidence of School Choice through Residential Sorting. Working Paper.
Candon, M. K., C. L. Barry, S. C. Marcus, A. J. Epstein, A. Kennedy-Hendricks, M. Xie, and D. S. Mandell (2019).
Insurance Mandates and Out-of-Pocket Spending for Children With Autism Spectrum Disorder. Pediatrics 143(1).
Card, D. (1995). Using Geographic Variation in College Proximity to Estimate the Return to Schooling. In L. Christofides, E. K. Grant, and R. Swidinsky (Eds.), Aspects of Labour Economics: Essays in Honour of John Vanderkamp. University of Toronto Press.
Carruthers, C. K. and W. F. Fox (2016). Aid for All: College Coaching, Financial Aid, and Postsecondary Persistence in Tennessee. Economics of Education Review 51, 97–112.
Cellini, S. R. (2009). Crowded Colleges and College Crowd-Out: The Impact of Public Subsidies on the Two-Year College Market. American Economic Journal: Economic Policy 1, 1–30.
Cellini, S. R. and C. Goldin (2014). Does Federal Student Aid Raise Tuition? New Evidence on For-Profit Colleges. American Economic Journal: Economic Policy 6, 174–206.
Cellini, S. R. and N. Turner (2018). Gainfully Employed? Assessing the Employment and Earnings of For-Profit College Students Using Administrative Data. Journal of Human Resources, forthcoming.
Centers for Disease Control (2018). Autism Spectrum Disorder (ASD). https://www.cdc.gov/ncbddd/autism/facts.html.
Charles, K. K., E. Hurst, and M. J. Notowidigdo (2018). Housing Booms and Busts, Labor Market Opportunities, and College Attendance. American Economic Review 108, 2947–2994.
Chatterji, P., S. L. Decker, and S. Markowitz (2015). The Effects of Mandated Health Insurance Benefits for Autism on Out-of-Pocket Costs and Access to Treatment. Journal of Policy Analysis and Management 34(2), 328–353.
Chetty, R., J. N. Friedman, E. Saez, N. Turner, and D. Yagan (2017). Mobility Report Cards: The Role of Colleges in Intergenerational Mobility. NBER Working Paper No. 23618.
Choi, D., D. Lou, and A. Mukherjee (2018). The Effect of Superstar Firms on College Major Choice. Working Paper.
Christensen, D. L., et al. (2016).
Prevalence and Characteristics of Autism Spectrum Disorder Among 4-year-old Children in the Autism and Developmental Disabilities Monitoring Network. Journal of Developmental & Behavioral Pediatrics 37(1), 1–8.
Cidav, Z., S. C. Marcus, and D. S. Mandell (2012). Implications of Childhood Autism for Parental Employment and Earnings. Pediatrics 129(4), 617–623.
Citizens Research Council of Michigan (2018). Exploring Michigan’s Urban/Rural Divide. https://www.michiganfoundations.org/sites/default/files/resources/rpt400 Exploring Michigans Urban-Rural Divide.pdf.
Cohodes, S. R., D. S. Grossman, S. A. Kleiner, and M. F. Lovenheim (2016). The Effect of Child Health Insurance Access on Schooling: Evidence from Public Insurance Expansions. Journal of Human Resources 51(3), 727–759.
Corsello, C. M. (2005). Early Intervention in Autism. Infants & Young Children 18(2), 74–85.
Cowen, J. M., B. Creed, and V. A. Keesler (2015). Dynamic participation in inter-district open enrollment: Evidence from Michigan 2006-2013. Michigan State University Education Policy Center Working Paper No. 49.
Currie, J. and E. Moretti (2003). Mother’s Education and the Intergenerational Transmission of Human Capital: Evidence from College Openings. The Quarterly Journal of Economics 118, 1495–1532.
Cutler, D. M. and J. Gruber (1996). Does Public Insurance Crowd out Private Insurance? The Quarterly Journal of Economics 111(2), 391–430.
Dawson, G., et al. (2010). Randomized, Controlled Trial of an Intervention for Toddlers with Autism: The Early Start Denver Model. Pediatrics 125(1).
Degrow, B. (2017). How School Funding Works in Michigan. Mackinac Center for Public Policy. https://www.mackinac.org/school-funding.
Deming, D. and S. Dynarski (2010). College Aid. In P. B. Levine and D. J. Zimmerman (Eds.), Targeting Investments in Children: Fighting Poverty When Resources are Limited. University of Chicago Press.
Deming, D. J., C. Goldin, and L. F. Katz (2012).
The For-Profit Postsecondary School Sector: Nimble Critters or Agile Predators? Journal of Economic Perspectives 26(1), 139–164.
Denning, J. T. (2017). College on the Cheap: Consequences of Community College Tuition Reductions. American Economic Journal: Economic Policy 9(2), 155–188.
Domina, T., N. Pharris-Ciurej, A. M. Penner, E. K. Penner, Q. Brummet, S. R. Porter, and T. Sanabria (2018). Is Free and Reduced-Price Lunch a Valid Measure of Educational Disadvantage? Educational Researcher 47(9), 539–555.
Eckert, F., T. C. Fort, P. K. Schott, and N. J. Yang (2020). Imputing Missing Values in the Census Bureau’s County Business Patterns. NBER Working Paper No. 26632.
Eldevik, S., et al. (2009). Meta-analysis of Early Intensive Behavioral Intervention for Children with Autism. Journal of Clinical Child & Adolescent Psychology 38(3), 439–450.
Elwell, J. (2018). The Effects of Expansions of Children’s Medicaid Eligibility on Program Participation and Labor Supply. Working Paper.
Ersoy, F. Y. (2019). Reshaping Aspirations: The Effects of the Great Recession on College Major Choice. Working Paper.
Ferreri, S. and S. Bolt (2011). Educating Michigan’s Students with Autism Spectrum Disorder (ASD): An Initial Exploration of Programming ‘The ASD Michigan Project’. Education Policy Center Special Report.
Foote, A. and M. Grosz (2019). The Effect of Local Labor Market Downturns on Postsecondary Enrollment and Program Choice. Education Finance and Policy, forthcoming.
Fountain, C., M. D. King, and P. S. Bearman (2011). Age of Diagnosis for Autism: Individual and Community Factors across 10 Birth Cohorts. Journal of Epidemiology and Community Health 65, 503–510.
Foxx, R. M. (2008). Applied Behavioral Analysis Treatment of Autism: The State of the Art. Child and Adolescent Psychiatric Clinics of North America 17(4), 821–834.
Fronstin, P. (2012). Self-Insured Health Plans: State Variation and Recent Trends by Firm Size.
Employee Benefit Research Institute Notes 33(11), 2–11.
Ganz, M. L. (2007). The Lifetime Distribution of the Incremental Societal Costs of Autism. Archives of Pediatrics & Adolescent Medicine 161(4), 343–349.
Gennaioli, N. and A. Shleifer (2010). What Comes to Mind. Quarterly Journal of Economics.
Goodman, J., M. Hurwitz, and J. Smith (2017). Access to 4-Year Public Colleges and Degree Completion. Journal of Labor Economics 35(3), 829–867.
Goodman, S. F. and A. H. Volz (2019). Attendance Spillovers between Public and For-Profit Colleges: Evidence from Statewide Variation in Appropriations for Higher Education. Education Finance and Policy, forthcoming.
Gordon, N. (2004). Do Federal Grants Boost School Spending? Evidence from Title I. Journal of Public Economics 88(9-10), 1771–1792.
Granpeesheh, D., et al. (2009). The Effects of Age and Treatment Intensity on Behavioral Intervention Outcomes for Children with Autism Spectrum Disorders. Research in Autism Spectrum Disorders 3(4), 1014–1022.
Grosz, M. (2018). Do Postsecondary Training Programs Respond to Changes in the Labor Market? Working Paper.
Gruber, J. and D. M. Hungerman (2007). Faith-based Charity and Crowd-out During the Great Depression. Journal of Public Economics 91(5-6), 1043–1069.
Gruber, J. and K. Simon (2008). Crowd-out 10 Years Later: Have Recent Public Insurance Expansions Crowded Out Private Health Insurance? Journal of Health Economics 27(2), 201–217.
Ham, J. C. and L. D. Shore-Sheppard (2005). Did Expanding Medicaid Affect Welfare Participation? Industrial and Labor Relations Review 58(3), 452–470.
Hansen, S. N., D. E. Schendel, and E. T. Parner (2015). Explaining the increase in the prevalence of autism spectrum disorders: the proportion attributable to changes in reporting practices. JAMA Pediatrics 169(1), 56–62.
Hastings, J., C. A. Neilson, and S. D. Zimmerman (2015). The Effects of Earnings Disclosure on College Enrollment Decisions. NBER Working Paper No. 21300.
Hershbein, B. and M. Kearney (2014). Major Decisions: What College Graduates Earn Over Their Lifetimes. https://www.hamiltonproject.org/papers/major decisions what graduates earn over their lifetimes/.
Hilliard, T. (2016). Autonomy and Innovation: Systemic Change in a Decentralized State. Technical report, Jobs for the Future.
Hillman, N. and T. Weichman (2016). Education Deserts: The Continued Significance of “Place” in the Twenty-First Century. Technical report, American Council on Education, Washington, DC.
Hillman, N. W. and E. L. Orians (2013). Community Colleges and Labor Market Conditions: How Does Enrollment Demand Change Relative to Local Unemployment Rates? Research in Higher Education 54(7), 765–780.
House Fiscal Agency (2017). Four-Year Degree Offerings at Michigan’s Community Colleges. https://www.house.mi.gov/hfa/PDF/CommunityColleges/CC FourYearDegrees memo Oct17.pdf.
Howlin, P., I. Magiati, and T. Charman (2009). Systematic Review of Early Intensive Behavioral Interventions for Children with Autism. American Journal of Intellectual and Developmental Disabilities 114(1), 23–41.
Hubbard, D. (2018). The Impact of Local Labor Market Shocks on College Choice: Evidence from Plant Closings in Michigan. Working Paper.
Hungerman, D. M. (2005). Are Church and State Substitutes? Evidence from the 1996 Welfare Reform. Journal of Public Economics 89(11-12), 2245–2267.
Huttunen, K. and K. Riukula (2019). Parental Job Loss and Children’s Careers. IZA Discussion Paper No. 12788.
Järbrink, K. (2007). The Economic Consequences of Autistic Spectrum Disorder Among Children in a Swedish Municipality. Autism 11(5), 453–463.
Jacobson, J. W., J. A. Mulick, and G. Green (1998). Cost-benefit Estimates for Early Intensive Behavioral Intervention for Young Children with Autism - General Model and Single State Case. Behavioral Interventions: Theory & Practice in Residential & Community-Based Clinical Programs 13(4), 201–226.
Keele, L., S. Lorch, M. Passarella, D.
Small, and R. Titiunik (2017). An Overview of Geographically Discontinuous Treatment Assignments with an Application to Children’s Health Insurance. In M. D. Cattaneo and J. C. Escanciano (Eds.), Regression Discontinuity Designs: Theory and Applications, Volume 38 of Advances in Econometrics. Emerald Group Publishing.
Knapp, M., R. Romeo, and J. Beecham (2009). Economic Cost of Autism in the UK. Autism 13(3), 317–336.
Krolikowski, P. M. and K. G. Lunsford (2020). Advance Layoff Notices and Labor Market Forecasting. Federal Reserve Bank of Cleveland Working Paper No. 20-03.
Lapid, P. A. (2017). Expanding College Access: The Impact of New Universities on Local Enrollment. Job Market Paper.
Lavelle, T. A., et al. (2014). Economic Burden of Childhood Autism Spectrum Disorders. Pediatrics 133(3), e520–e529.
Levine, P. B. and D. Schanzenbach (2009). The Impact of Children’s Public Health Insurance Expansions on Educational Outcomes. Forum for Health Economics & Policy 12(1).
Lindo, J. M., J. Schaller, and B. Hansen (2018). Caution! Men not at work: Gender-specific labor market conditions and child maltreatment. Journal of Public Economics 163, 77–98.
Liu, S., W. Sun, and J. V. Winters (2018). Up in STEM, Down in Business: Changing College Major Decisions with the Great Recession. Contemporary Economic Policy.
Long, M. C., D. Goldhaber, and N. Huntington-Klein (2015). Do completed college majors respond to changes in wages? Economics of Education Review 49, 1–14.
Lovenheim, M. F., R. Reback, and L. Wedenoja (2016). How Does Access to Health Care Affect Teen Fertility and High School Dropout Rates? Evidence from School-based Health Centers. NBER Working Paper No. 22030.
Ma, J. and S. Baum (2016). Trends in Community Colleges: Enrollment, Prices, Student Debt, and Completion. College Board Research Brief.
Mandell, D. S., C. L. Barry, M. Xie, K. Shea, K. Mullan, and A. J. Epstein (2016).
Effects of Autism Spectrum Disorder Insurance Mandates on the Treated Prevalence of Autism Spectrum Disorder. JAMA Pediatrics 170(9), 887–893.
Mandell, D. S., K. H. Morales, M. Xie, L. J. Lawer, A. C. Stahmer, and S. C. Marcus (2010). Age of Diagnosis Among Medicaid-Enrolled Children with Autism, 2001-2004. Psychiatric Services 61(8), 822–829.
Mari-Bauset, S., et al. (2014). Evidence of the Gluten-Free and Casein-Free Diet in Autism Spectrum Disorders: A Systematic Review. Journal of Child Neurology 29(12), 1718–1727.
Matson, J. L. and A. M. Kozlowski (2011). The increasing prevalence of autism spectrum disorders. Research in Autism Spectrum Disorders 5(1), 418–425.
McFarlin, I., B. McCall, and P. Martorell (2018). How Much Do Tuition Subsidies Promote College Access? Evidence from Community College Taxing Districts. Working Paper.
Michelmore, K. and S. Dynarski (2017). The Gap Within the Gap: Using Longitudinal Data to Understand Income Differences in Educational Outcomes. AERA Open 3(1), 1–18.
Michigan Center for Educational Performance & Information (2017). Michigan Community Colleges Activities Classification Structure (ACS) 2016-2017 Data Book & Companion. http://www.michigancc.net/acs/ACS%202016-17.pdf.
Michigan Community College Association (2019). Fast Facts. https://www.mcca.org/fast-facts.
Michigan House Fiscal Agency (2017). Budget Briefing: Community Colleges. https://www.house.mi.gov/hfa/PDF/Briefings/CC BudgetBriefing fy17-18.pdf.
Moffitt, R. and B. Wolfe (1992). The Effect of the Medicaid Program on Welfare Participation and Labor Supply. The Review of Economics and Statistics 74(4), 615–626.
Montmarquette, C., K. Cannings, and S. Mahseredjian (2002). How Do Young People Choose College Majors? Economics of Education Review 21, 543–556.
Mountjoy, J. (2019). Community Colleges and Upward Mobility. Working Paper.
Mullainathan, S. (2002). A Memory-Based Model of Bounded Rationality. Quarterly Journal of Economics.
Natanson, H. (2019).
Gov. Northam proposes making community college free for some job-seekers in Virginia. https://www.washingtonpost.com/local/education/gov-northam-proposes-making-community-college-free-for-some-job-seekers-in-virginia/2019/12/12/8f2a25fa-1cdc-11ea-8d58-5ac3600967a1 story.html.
National Center for Education Statistics (2011). Guidelines for Using the CIP to SOC Crosswalk. https://nces.ed.gov/ipeds/cipcode/resources.aspx?y=55.
National Center for Education Statistics (2018). Digest of Education Statistics, 2016, Table 308.10. https://nces.ed.gov/programs/digest/d17/tables/dt17 308.10.asp?current=yes.
National Student Clearinghouse Research Center (2017). Snapshot Report – Contribution of Two-Year Public Institutions to Bachelor’s Completions at Four-Year Institutions. https://nscresearchcenter.org/snapshotreport-twoyearcontributionfouryearcompletions26/.
Oreopoulos, P. and K. G. Salvanes (2011). Priceless: The Non-Pecuniary Benefits of Schooling. Journal of Economic Perspectives 25(1), 159–184.
Page, L. C. and J. Scott-Clayton (2016). Improving College Access in the United States: Barriers and Policy Responses. Economics of Education Review 51, 4–22.
Papke, L. E. and J. M. Wooldridge (1996). Econometric Methods for Fractional Response Variables with an Application to 401(K) Plan Participation Rates. Journal of Applied Econometrics 11, 619–632.
Patterson, R. W., N. G. Pope, and A. Feudo (2019). Timing Is Everything: Evidence from College Major Decisions. IZA Discussion Paper No. 12069.
Payne, A. A. (1998). Does the Government Crowd-out Private Donations? New Evidence from a Sample of Non-profit Firms. Journal of Public Economics 69(3), 323–345.
Peters, C., K. Lausch, and M. Udow-Phillips (2014). Autism Spectrum Disorder in Michigan. Center for Healthcare Research and Transformation Issue Brief.
Peters-Scheffer, N., R. Didden, H. Korzilius, and P. Sturmey (2011).
A Meta-Analytic Study on the Effectiveness of Comprehensive ABA-based Early Intervention Programs for Children with Autism Spectrum Disorders. Research in Autism Spectrum Disorders 5(1), 60–69.
Reback, R. and T. L. Cox (2018). Where Health Policy Meets Education Policy: School-based Health Centers in New York. Working Paper.
Reynolds, C. L. (2012). Where to attend? Estimating the effects of beginning college at a two-year institution. Economics of Education Review 31, 345–362.
Rouse, C. E. (1995). Democratization or Diversion? The Effect of Community Colleges on Educational Attainment. Journal of Business & Economic Statistics 13(2), 217–224.
Ruijs, N. M. and T. T. Peetsma (2009). Effects of inclusion on students with and without special educational needs reviewed. Educational Research Review 4(2), 67–79.
Sentz, R., M. Metsker, P. Linares, and J. Clemans (2018). How Your School Affects Where You Live. https://www.economicmodeling.com/how-your-school-affects-where-you-live/.
Shu, P. (2016). Innovating in Science and Engineering or “Cashing In” on Wall Street? Evidence on Elite STEM Talent. Harvard Business School Working Paper 16-067.
Smith, A. A. (2017). Replicating the Tennessee Promise. Inside Higher Ed. https://www.insidehighered.com/news/2017/03/02/replicating-tennessees-approach-free-community-college-takes-money-and-more.
Snyder, M. and S. Boelscher (2018). Driving Better Outcomes: Fiscal Year 2018 State Status & Typology Update. http://hcmstrategists.com/wp-content/uploads/2018/03/HCM DBO Document v3.pdf.
Solon, G., S. J. Haider, and J. M. Wooldridge (2015). What Are We Weighting For? Journal of Human Resources 50, 301–316.
Stevens, A. H., M. Kurlaender, and M. Grosz (2018). Career Technical Education and Labor Market Outcomes: Evidence from California Community Colleges. Journal of Human Resources, forthcoming.
United States General Accounting Office (2003).
The Worker Adjustment and Retraining Notification Act: Revising the Act Could Clarify Employer Responsibilities and Employee Rights. https://www.gao.gov/new.items/d031003.pdf.
U.S. Department of Labor (2019). Plant Closings & Layoffs. https://www.dol.gov/general/topic/termination/plantclosings.
Virués-Ortega, J. (2010). Applied Behavior Analytic Intervention for Autism in Early Childhood: Meta-analysis, Meta-regression and Dose-response Meta-analysis of Multiple Outcomes. Clinical Psychology Review 30(4), 387–399.
Weinstein, R. (2019). Local Labor Markets and Human Capital Investments. Working Paper.
Wiswall, M. and B. Zafar (2015). Determinants of College Major Choice: Identification using an Information Experiment. Review of Economic Studies 82(2), 791–824.
Wooldridge, J. M. (2010). Econometric Analysis of Cross Section and Panel Data (2nd ed.). Cambridge, MA: The MIT Press.
Xia, X. (2016). Forming wage expectations through learning: Evidence from college major choices. Journal of Economic Behavior & Organization 132.
Xu, G., et al. (2019). Prevalence and Treatment Patterns of Autism Spectrum Disorder in the United States, 2016. JAMA Pediatrics 173(2), 153–159.
Yelowitz, A. (1995). The Medicaid Notch, Labor Supply, and Welfare Participation: Evidence from Eligibility Expansions. The Quarterly Journal of Economics 110(4), 909–939.
Zwaigenbaum, L., et al. (2015). Early Intervention for Children with Autism Spectrum Disorder Under 3 Years of Age: Recommendations for Practice and Research. Pediatrics 136(S1), S60–S81.
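Supplementary note on the triple-difference design. The Non-disadv*Post-2012*ASD coefficient reported in the Chapter 3 appendix tables has the structure of a triple-difference (DDD) contrast: an outcome difference taken over time (post- vs. pre-mandate), across ASD status, and across disadvantaged status. The actual estimates come from the regression in equation (3.2) with fixed effects and controls, not from raw means; the sketch below is only a minimal illustration of the contrast itself, with invented cell means and a hypothetical `ddd` helper.

```python
# Minimal sketch of a triple-difference (DDD) contrast in group means.
# Cell means are keyed by (asd, nondisadv, post) indicators; all numbers
# here are illustrative, not the paper's data.

def ddd(cell_means):
    """Return the DDD contrast from a dict of cell means.

    cell_means[(asd, nondisadv, post)] -> mean outcome for that cell.
    """
    def dd(nondisadv):
        # Difference-in-differences within one disadvantage group:
        # (post - pre) change for ASD students minus the same change
        # for non-ASD students.
        return ((cell_means[(1, nondisadv, 1)] - cell_means[(1, nondisadv, 0)])
                - (cell_means[(0, nondisadv, 1)] - cell_means[(0, nondisadv, 0)]))

    # Triple difference: DD for non-disadvantaged (mandate-affected)
    # minus DD for disadvantaged (largely Medicaid, weakly affected).
    return dd(1) - dd(0)

# Illustrative cell means (e.g., standardized test scores).
means = {
    (1, 1, 0): 0.00, (1, 1, 1): -0.04,  # ASD, non-disadvantaged
    (0, 1, 0): 0.00, (0, 1, 1): -0.01,  # non-ASD, non-disadvantaged
    (1, 0, 0): 0.00, (1, 0, 1): -0.01,  # ASD, disadvantaged
    (0, 0, 0): 0.00, (0, 0, 1): 0.00,   # non-ASD, disadvantaged
}
print(round(ddd(means), 3))  # -0.02
```

In the paper the same contrast is estimated in one regression so that school and grade-by-year fixed effects and student controls can be absorbed, with standard errors clustered at the school district level.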