WHAT FACTORS HELP OR HINDER THE ACHIEVEMENT OF LOW SES STUDENTS? AN INTERNATIONAL COMPARISON USING TIMSS 2011 8TH GRADE SCIENCE DATA

By Justin L. Bruner

A DISSERTATION
Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Curriculum, Instruction, and Teacher Education – Doctor of Philosophy

2014

ABSTRACT

WHAT FACTORS HELP OR HINDER THE ACHIEVEMENT OF LOW SES STUDENTS? AN INTERNATIONAL COMPARISON USING TIMSS 2011 8TH GRADE SCIENCE DATA

By Justin L. Bruner

Focusing on science from a cross-country perspective, this study explores the relationship between 8th grade science achievement and student, teacher, and school characteristics. More specifically, this study will pay special attention to low socio-economic status (SES) students and seek to understand why some disadvantaged students are able to achieve higher than expected in science given their SES while other disadvantaged students are not able to achieve beyond what would be expected given their background. This study will explore the multi-level relationship between the characteristics of students, their teachers, their schools, and student achievement in science. While looking at students in classrooms and in schools, this work will create as precise a measure of student SES as possible by drawing on the recommendations of an expert panel commissioned for the National Assessment of Educational Progress (NAEP). The study uses the most recent cycle (2011) of the Trends in International Mathematics and Science Study (TIMSS) to strategically select a six-country sample from the 45 participating countries. This six-country sample was selected using country-level achievement and the standard deviation of that achievement, creating a sample that spans a range of both equality of achievement and level of achievement. This allows for making comparisons both across and within countries to better understand variations in the factors behind student performance, especially for disadvantaged students.

This paper builds on the existing research around socio-economic status (SES) and achievement by exploring in more detail the conditions in schools and classrooms around the world that might magnify or reduce the effect of SES on student achievement. The analysis looks at these questions: "What conditions help low SES students achieve higher than what would be expected given their SES?" and "What conditions hinder low SES students, leading them to achieve at or below what would be expected given their SES?" Investigating these questions will help to understand, in a global context, where disadvantaged students are being successful in their science classes, under what conditions, and as a result help to inform educational policy.

The results suggest that there are clear inequities in achievement and that these inequities may be further increased by other factors. These factors are present at all levels of analysis: the student level, the teacher/classroom level, and the school level. There are also variables that consistently had no impact on student science achievement at any level, and variables that were impactful only within specific countries. Overall, there are no silver bullets in these data that can do much on their own to help low SES students overcome their predicted achievement disadvantage. However, a combination of factors does appear to have the potential to do more.

Copyright by JUSTIN L. BRUNER 2014
I dedicate this work to my wife Lori for her unwavering love and support, without which I could not have completed this work. I also want to dedicate this in memory of my best friend Miikka, who was along for this journey but did not make it to the end.

ACKNOWLEDGEMENTS

One of my core beliefs is that an individual is only as good as the people around them. While this work will only have my name on it, there were numerous other people involved who deserve just as much credit as I get. Michigan State University is such a special place with so many special people that I could write another dissertation about it. First off, thank you to my committee members: Dr. Jack Schwille, Dr. Amita Chudgar, Dr. John Metzler, Dr. Kristin Phillips, and Dr. Peter Youngs. All of you provided me with such a generous amount of your time, support, and wonderful feedback. I am so appreciative of everything you have done for me. I want to be sure to acknowledge all my graduate instructors who provided me with so many wonderful course experiences and more knowledge than I will ever be able to use in my career. I want to thank all the hard-working support staff at MSU who make my experience and those of so many others possible by doing so much great work in the background. All the support staff at MSU have been nothing short of exceptional. I want to thank all of my graduate and undergraduate students that I had the privilege of teaching. I learned so much from all of you and I am so proud of all the great young teachers you became. Finally, I want to be sure to thank all of my fellow graduate students I was fortunate enough to work with during these years. I received everything from life counseling, to moral support, to feedback on ideas. I enjoyed having such great colleagues in the trenches with me. I want to end with a very sincere and humble thank you to everyone who has been any part of my life during the past five years. None of this would be possible without you.
TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
KEY TO SYMBOLS AND ABBREVIATIONS

Chapter 1 - Introduction, Purpose, and Significance
    Introduction
    Research Questions
    Background for Why This Study Matters
    Theoretical, TIMSS, and Analytical Frameworks of Study
    Dataset
    Limitations of the Study
    Organization of the Study

Chapter 2 – Literature Review
    International Reports
        World Bank and United Nations
        OECD and IEA Studies
    Socio-Economic Status
    Peer-Reviewed Journal Literature
        Studies with TIMSS and Other International Datasets
        Students
        Teachers
        School

Chapter 3 – Country Selection and Background
    Country Level Analysis and Selection
    Country Background Information
        Low Variation Countries: Finland, Republic of Korea, Chile
        High Variation Countries: Singapore, Ghana, United States

Chapter 4 - Methods
    Creating the Dataset
    Factor Analysis and Variable Selection
    Cleaning and Descriptive Analysis
    Setting up TIMSS for Proper Analysis and Reporting
    Missing Data
    Baseline Regressions
    Hierarchical Linear Modeling
    Binary Logistic Regressions

Chapter 5 - Results
    Correlations by Country (Chile, Finland, Ghana, Korea, Singapore, United States)
    Correlations Explained by Variable
    Baseline Regressions
        Student Regressions
        Teacher/Classroom Variables
        School Variables
    Hierarchical Linear Models
        Unconditional Model
        SES Model
        Full Model (Chile, Finland, Ghana, Korea, Singapore, United States)
        Variance Structure
    Logistic Regressions
        Odds of Low Achievement in Science
        Odds of Low SES
        Odds of Low Achievement Interacting Low SES with Model Variables

Chapter 6 – Discussion and Conclusion
    Inequalities
        SES
        Gender
    Factors that Enhance or Reduce this Effect
        Student Factors
        Non-Student Factors
            Positive Effects on Student Achievement
            Negative Effects on Student Achievement
        Country Specific Factors
        Factors with No Effect
    Recommendations and Conclusions
    Concluding Remarks

APPENDICES
    Appendix 1: Stata Code
    Appendix 2: Factor Analysis
    Appendix 3: Raw Data Outputs

BIBLIOGRAPHY

LIST OF TABLES

Table 4.1: Descriptive Statistics by Country
Table 4.2: Percentage of Missing Data by Country for Model Variables
Table 4.3: SES and Achievement Categories
Table 5.1: Correlations with Science Achievement
Table 5.2: Correlations with SES
Table 5.3: Student Regressions on Science Achievement
Table 5.4: Teacher and Classroom Variables on Student Achievement
Table 5.5: School Variables on Student Achievement
Table 5.6: Unconditional HLM
Table 5.7: SES only HLM
Table 5.8: Full Fixed Effects HLM
Table 5.9: Variance Structure
Table 5.10: Odds of Low Achievement in Science
Table 5.11: Odds of Low SES
Table 5.12: Odds of Low Achievement Given Interaction of Low SES with Variable
Table 6.1: Gender Comparison by Country
Table 6.2: Summary of Variable Significance by Country
Table 6.3: Country Representation of Achievement, SD, Gender, and Variance Location
Table A2.1: Student Factor Analysis Raw Outputs
Table A2.2: Student Item Correlations
Table A2.3: Teacher Factor Analysis Raw Outputs
Table A2.4: Teacher Item Correlations
Table A2.5: School Factor Analysis Raw Outputs
Table A2.6: School Item Correlations

LIST OF FIGURES

Figure 1.1: Analytic Framework Using a Production Function
Figure 3.1: Science Achievement vs. SD of Science Achievement by Country
Figure 3.2: 25th Percentile Science Achievement vs. SD by Country
Figure 3.3: 75th Percentile Science Achievement vs. SD by Country

KEY TO SYMBOLS AND ABBREVIATIONS

EFA: Education for All
HLM: Hierarchical Linear Modeling
IEA: International Association for the Evaluation of Educational Achievement
MDG: Millennium Development Goals
OECD: Organisation for Economic Co-operation and Development
PIRLS: Progress in International Reading Literacy Study
PISA: Programme for International Student Assessment
SES: Socio-Economic Status
TIMSS: Trends in International Mathematics and Science Study
UNDP: United Nations Development Program
UNESCO: United Nations Educational, Scientific, and Cultural Organization

Chapter 1 - Introduction, Purpose, and Significance

Introduction

Focusing on science from a cross-country perspective, this study explores the relationship between eighth grade science achievement and student, teacher, and school characteristics. More specifically, this study will pay special attention to low socio-economic status (SES) students and seek to understand why some disadvantaged students obtain higher than expected achievement in science given their SES while other disadvantaged students are not able to achieve beyond what would be expected given their background. This study will explore the multi-level relationship among the characteristics of students, their teachers, their schools, and student achievement in science. While looking at students in classrooms and in schools, this work will create as precise a measure of student SES as possible by drawing on the recommendations of an expert panel commissioned for the National Assessment of Educational Progress (NAEP). The study uses the most recent cycle (2011) of the Trends in International Mathematics and Science Study (TIMSS) to strategically select a six-country sample from the 45 participating countries. This six-country sample is selected using measures of country-level achievement and the standard deviation of that achievement, creating a sample that spans a range of both equality of achievement and level of achievement. This allows for making comparisons both across and within countries to better understand variations in the factors that affect student performance, especially for disadvantaged students. More detailed explanations of the country selection and analysis procedures can be found in chapters 3 and 4.

Research Questions

This paper will build on the existing research on socio-economic status (SES) and achievement by exploring in more detail the conditions in schools and classrooms around the world that might magnify or reduce the effect of SES on student achievement. The analysis will look at these questions: "What conditions help low SES students achieve higher than what would be expected given their SES?" and "What conditions hinder low SES students, leading them to achieve at or below what would be expected given their SES?" Investigating these questions will help to explain, in a global context, where disadvantaged students are being successful in their science classes and under what conditions, and as a result help to inform educational policy. More specifically, this study will explore a series of questions in detail as they pertain to students, teachers, and schools both within and across countries. These questions will address both the broader sample of all students as well as look exclusively at low SES students.
At the student level these specific questions are:
• What are the differences between countries in the strength of the effect of SES?
• How does the gender of a student impact student achievement differences between countries?
• To what extent do the TIMSS data support an argument that a student's achievement in science is helped or hindered by other individual characteristics given their SES?

At the teacher level these specific questions are:
• How much does "teacher quality" (level of teacher education, years of teacher experience, and being a science major) enhance student science achievement?
• What pedagogical choices in the science classroom made by teachers are positive predictors of student science achievement, when other classroom factors are taken into account?
• Do the actions of teachers towards students' social and emotional well-being have a positive or negative impact on science achievement in a multivariate analysis?

Finally, at the school level the questions are:
• How does being in a rural or urban school matter for the science achievement of students in each country?
• What are the effects on science achievement of schools that have differing proportions of economically disadvantaged students?

The main hypothesis to be tested is that greater inequality leads to decreased academic performance. At the country level, it is expected that countries with low variation in their achievement will also show a lower impact of student disadvantage. At the student level this could be as simple as girls having significantly lower achievement compared to boys. At the teacher level it could be that some groups of students are assigned to different types of teachers compared to other groups of students. At the school level it could be that some students attend more affluent schools than others based on their communities. The expectation is that a combination of all these factors matters to varying degrees. Furthermore, this will likely vary across countries, and what this variation looks like within each country is of interest.

Background for Why This Study Matters

Why does this study matter and why should it be conducted? The answer is two-pronged and lies in addressing issues of disadvantage in education and addressing the global shortage of science professionals. The Programme for International Student Assessment (PISA) 2009 report (Organisation for Economic Co-operation and Development (OECD), 2009) finds that the highest performing nations educate all their students to a high level, not just some of them. These high performing nations have the lowest impact of disadvantage on educational performance and the lowest proportion of students in the bottom levels of proficiency. The most famous example is Finland, where the school system is a part of a much larger social support system. The report also notes that poor performance in school does not automatically mean a student is disadvantaged but that being a disadvantaged student or attending a disadvantaged school does increase the likelihood of poor performance. By explaining the impact of disadvantage on test scores for students at the bottom of the performance distribution, researchers and policymakers can begin to address ways to raise performance at the bottom of the distribution, which will also even out the distribution of achievement and help raise overall national performance.
There has been a large body of research highlighting the fact that disadvantaged students are disproportionally impacted by elements of inequality both in and out of school (UNDP, 2014). Addressing this inequality of achievement is critical in raising the overall performance of a nation and of our increasingly global society. Another important element of this study has to do with the selection of science as a content domain in place of other content areas. It has been well documented that a declining number of students globally are deciding to study science beyond the compulsory requirements (Osborne et al., 2003). What makes this puzzling are the premiums enjoyed by a career in science or a related STEM (Science, Technology, Engineering, and Math) profession. The United States Department of Commerce estimates that STEM as a field will grow 18% through 2018 compared to 9.8% for non-STEM fields (Langdon et al., 2011). STEM workers also draw about 25% higher wages and have significantly lower unemployment rates (Langdon et al., 2011). Finally, STEM workers enjoy wage premiums compared to non-STEM workers at all degree levels (Langdon et al., 2011). So it would stand to reason that these jobs would be in high demand and highly competitive, right? Unfortunately that is not the case. In the United States, for example, 90% of the coveted H-1B visas that help fill labor shortages are for STEM occupations (Rothwell and Ruiz, 2013). This is also a problem in other countries around the world: in Finland, for example, the 2006 PISA report showed that Finnish students were significantly below average in their affect towards science despite having the top achievement scores (OECD, 2006). So why aren't students interested in entering these high-paying, secure, and growing jobs, especially given that the premiums enjoyed could help provide upward mobility to disadvantaged students while also filling a much-needed professional shortage?

The literature seems to indicate that there is not a single reason for this but rather a combination of factors such as gender, teachers, the quality of teaching, curriculum, and cultural differences. One pattern that appears to emerge is that students do appear to value science as a profession and see the benefits to society, but there appears to be a disconnect when it comes to school science that may discourage further participation or exploration in science (Osborne et al., 2003). In other words, what students perceive as possible or interesting with respect to science is often disconnected from the school science that takes place in the classroom as a result of the previously mentioned factors. Based on this perceived pattern, it becomes critical to observe the aspects of school science where students learn best. The review of the literature in chapter 2 will identify relevant literature and explore the interaction of these factors in more detail.

Theoretical, TIMSS, and Analytical Frameworks of Study

It is important to situate this study by highlighting the different frameworks that underpin the perspectives and analysis being undertaken. From a theoretical perspective, this study is grounded in Human Capital Theory, which was pioneered in the mid-20th century by scholars including Becker (1962) and Schultz (1961). It is also important to note that TIMSS has its own frameworks that it uses to guide the sampling and types of data collected within each cycle of the study, dedicating an entire volume to this (Mullis et al., 2009).
Finally, the analysis will be guided by an analytical framework of an education production function, where inputs are used to produce an output of some type (Bowles, 1970). The core idea behind human capital theory is that people have value to a society in the same way a machine might have value (capital) to a business. The skills, abilities, and talents of a collection of people provide a return and value to a society (Schultz, 1961). The more a society invests in these skills, abilities, and talents through education, the more human capital people accumulate, which in turn allows for increased contributions back to society (Becker, 1962). Simply put, Human Capital Theory posits a positive linear relationship between the amount of education (human capital) and the skills, abilities, and talents acquired. However, the problem within human capital theory is that not every person has the same opportunities or accumulates skills at the same rate due to factors such as attending schools with differing amounts of resources or having teachers of differing quality, to name some common examples. This imperfection of human capital theory, an unequal process in which individuals have differing outcomes, is at the heart of the motivation for this study. The goal of this study is to uncover factors, stemming from personal, classroom, or school characteristic differences, that magnify or reduce the ability of a person to accumulate skills.

TIMSS also creates its own frameworks that motivate its sampling and data collection (Mullis et al., 2009). There are three main layers that drive the framework for TIMSS: the intended curriculum, the implemented curriculum, and the attained curriculum (p. 75). The intended curriculum refers to the "national, social, and education context" where the policy decisions are made about education within a given structure. The implemented curriculum looks at the "school, teacher, and classroom context" where the curriculum is actually carried out. The last part, the attained curriculum, is a measure of "student outcomes and characteristics." The subsets of these three main layers of the TIMSS frameworks will be explained in more detail in chapter 3, which covers the TIMSS design more extensively.

Lastly is the analytical framework from which the results will be analyzed using a production function. The idea of a production function is not new, but its application to education has taken root in the past half-century, starting most commonly with Bowles (1970). The idea behind an education production function is that there are some inputs related to education that are combined through a process that in turn produces an output. Some common educational inputs are school resources, teacher characteristics, and family background (Hanushek, 2007). The process part of the education production function could simply be thought of as going to school and is very commonly referred to as a "black box" because of the difficulty in measuring all the interactions that take place for a student on a given school day. The output part of the function is whatever outcome comes out of the process (Bowles, 1970). Very commonly this is student achievement of some type. In the case of this study the "inputs" will be student, teacher, and school characteristics as measured by the TIMSS study and the output will be student science achievement. Figure 1.1 provides a diagram of the analytical framework.
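To make the framework concrete before the diagram, it can also be written compactly. The notation below is an illustrative sketch only; the symbols are introduced here for exposition and are not drawn from TIMSS or from the production function literature cited above.

```latex
% Achievement A of student i, taught by teacher/classroom j, in school k:
% student inputs S, teacher/classroom inputs T, and school inputs X pass
% through the "black box" f of schooling, plus an unobserved error term.
\[
A_{ijk} = f\left(S_{ijk},\, T_{jk},\, X_{k}\right) + \varepsilon_{ijk}
\]
```

Here S would collect the student factors shown in Figure 1.1 (attitude, SES, gender, parents), T the teacher/classroom factors, and X the school factors; the nested subscripts anticipate the multi-level models described in chapter 4.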
Figure 1.1 – Analytic Framework Using a Production Function. The figure diagrams the production function as inputs, process, and output: the inputs are Student Factors (attitude, SES, gender, parents), Teacher/Classroom Factors (teacher quality, teacher pedagogy, peer effects, support, expectations), and School Factors (climate, urban/rural location, resources, proportion of disadvantaged students); the process is attending school (the "black box"); and the output is student science achievement.

Dataset

The Trends in International Mathematics and Science Study (TIMSS) is conducted every four years by the International Association for the Evaluation of Educational Achievement (IEA) as a study of fourth and eighth grade students, their teachers, and their schools. TIMSS utilizes a two-stage sampling design that takes a sample of schools within a country and then a sample of students within those schools. Mathematics and science teachers of the students within the sampled schools are also included; they are not considered a representative sample on their own but rather representative of the students they teach (Foy et al., 2013). Achievement data in mathematics and science as well as background data are collected from the students. Teachers also complete a questionnaire that asks about their background, their working situation, and their teaching practices. Administrators or headmasters of participating schools also complete a questionnaire similar to the teachers', but with the questions geared towards the administrative and school level. All questionnaires are available for download from the TIMSS or IEA websites.

TIMSS has been conducted on four-year cycles since 1995 and measures the science and mathematics achievement of fourth and eighth grade students. TIMSS allows previously participating countries to monitor their progress over time through a new scaling metric implemented starting with the 2003 cycle. TIMSS was selected because it is the largest international dataset that contains data about science achievement and is focused exclusively on mathematics and science as subjects. This focus on mathematics and science allows future research to take a more in-depth look at specific subjects within mathematics and science as well as their curricula. The other advantage of TIMSS is that the achievement data are standardized and IRT-scaled in each study to an overall mean of 500 and a standard deviation of 100. This has been done consistently during each study cycle, so it allows for the tracking of movement over time. While TIMSS studies go back more than two cycles, the fixed TIMSS scaling started with the 2003 cycle, so TIMSS can only be compared using the three most recent cycles (Olson et al., 2008).

This study uses the 2011 eighth grade TIMSS science dataset for the analysis. Forty-five countries participated in the eighth grade 2011 science study. The whole sample consists of over 300,000 students in total and about 150-200 schools within each country (Martin et al., 2012). Eighth grade students had 217 response items to assess their achievement, divided equally between multiple-choice and open-response questions (Martin et al., 2012). However, each student completes only a small portion of the entire assessment, and IRT scaling is used to estimate achievement for the entire assessment (Martin et al., 2012). TIMSS uses frameworks for assessment, teaching, and collecting background information that guide the design of the instruments in the study.
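Because each student answers only a subset of items, the IRT scaling produces five plausible values of achievement rather than a single score, and analyses are run on each plausible value and then combined. The Stata fragment below is a minimal, illustrative sketch of that logic, anticipating the fuller treatment in chapter 4; the variable names pv1-pv5 and totwgt are placeholders rather than the actual TIMSS variable names.

```stata
* Minimal sketch: compute a weighted mean of science achievement once per
* plausible value and average the five estimates. pv1-pv5 (the five science
* plausible values) and totwgt (the total student weight) are placeholder
* names, not the actual TIMSS variable names.
matrix est = J(5, 1, .)
forvalues i = 1/5 {
    quietly mean pv`i' [pweight = totwgt]
    matrix b = e(b)
    matrix est[`i', 1] = b[1, 1]
}
* Final point estimate = average of the five plausible-value estimates.
matrix ones = J(1, 5, 1/5)
matrix combined = ones * est
matrix list combined
```

Standard errors would additionally need to combine sampling variance (from the TIMSS jackknife replication variables) with the imputation variance across the five estimates, which is the kind of detail taken up in chapter 4's discussion of setting up TIMSS for proper analysis and reporting.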
With respect to the eighth grade science content domains, 35% of the assessment is related to topics in biology, 20% in chemistry, 25% in physics, and 20% in Earth science (Mullis et al., 2009). With respect to the eighth grade cognitive domains, 35% of the eighth grade science assessment is dedicated to knowing content, 35% to applying content, and 30% to reasoning content (Mullis et al., 2009). These content and cognitive domains also break down further within each given domain; however, given that this study does not operate within any specific domain, those details have been omitted (see the TIMSS 2011 Assessment Frameworks for these further sub-domains).

For collecting background data, TIMSS uses a contextual framework to guide instrument design. These elements are explained in more detail here because they help guide the variable selection for the models used in the analysis of student achievement. This contextual framework consists of four broad areas, each with sub-domains just like the assessment frameworks: 1. national and community contexts, 2. school contexts, 3. classroom contexts, and 4. student characteristics and attitudes (Mullis et al., 2009). The national and community contexts consist of demographics and resources, the organization and structure of the education system, and the mathematics and science curriculum. The school contexts consist of school characteristics, school organization for instruction, school climate for learning, the teaching staff, school resources, and parent involvement. The classroom contexts consist of teacher education and development, teacher characteristics, classroom characteristics, instructional materials and technology, topics taught, instructional activities, and assessment. Finally, the student characteristics and attitudes consist of student demographics, home background, and student attitudes toward learning mathematics and science.

Limitations of the Study

As with any research, it is important to identify and highlight the limitations of the study. Some limitations are the result of the TIMSS study design itself, some are limitations of the analysis, and some are limitations of the findings. One concern with the TIMSS study itself has to do with the representativeness of the sample in terms of how the sampling is done. Prais (2007) presents a very strongly worded concern about this in the case of England, where sampling problems with both TIMSS and PISA led to England's results not being internationally comparable due to low participation rates. In Prais' opinion, the main problem came from low school participation rates, which required replacement schools to be used and, in turn, took away from the validity of the sampling as more schools needed to be replaced. The TIMSS organizers do put out a technical report with each cycle that details the sampling procedures and rates. However, in the main report sampling concerns are reduced to a footnote directing the reader to an appendix that charts the sampling and contains more footnotes. This is all done while still reporting the results from the country despite sampling representativeness concerns. A reader or researcher must therefore exercise great caution when selecting specific countries. Of the countries in this study, Singapore and the United States were flagged for only covering 90 to 95 percent of the population. The remaining countries did not have any flags.
Another important consideration and limitation of this study as it relates to the TIMSS design is that it is a snapshot of the educational situation within a country at a given point in time. While it is certainly possible to uncover patterns related to student achievement, this study will not have a longitudinal element where these patterns can be explored over time to measure changes as a result of different factors. However, due to the implementation of a standardized scoring scheme by IEA starting with TIMSS 2003, achievement scores can be compared over time, as can background items that are used on multiple cycles, but this will not be done here. TIMSS operates on four-year cycles and conducts the assessments with a four-year gap in student grades so that the fourth grade students in a given cycle will be the eighth grade students assessed in the next cycle. This offers exciting and potentially more powerful research possibilities; however, this study will be limited to a single cycle and will not exploit any of the abilities to compare over time. The fact that this is just a single point in time does limit the precision of the results in that they do not have any repeated measures.

Another limitation of this study is related to the size of the data and the fact that there are many variables, both background variables and achievement variables, that are not being used. This study will utilize only overall science achievement as the outcome measure; however, TIMSS offers many more achievement variables that could be useful for comparison. These variables include international benchmarking scores, specific science subject scores, and specific science skill scores. All of these variables could potentially provide greater insights beyond the general science achievement variable. With respect to the background variables, great care was taken to select variables that were deemed important based on the literature as well as on a factor analysis, which will be explained in detail in the later chapters. However, this still leaves out a majority of teacher, student, and school variables that may account for some of the unexplained variance. This unexplained variance could be the result of variables excluded from the model, but it could also be the result of factors for which no variable is available as a measure. There are also always purely random components of any analysis that will not be measured.

The final limitation of the study, and in some respects the most important, has to do with what policy recommendations are possible given the findings. That is to say that just because a particular finding is true in one country does not mean it can or should be automatically applied to another country. Steiner-Khamsi (2004) calls this "educational policy borrowing": one country implements a successful educational policy and another country wants to copy this idea in its own country. However, the frequent problem with policy borrowing is that it often does not account for the unique contexts within each country that may determine a policy's success or failure. Finland is a frequently targeted country for policy borrowing due to the combination of high equality and achievement in the Finnish system.
Pasi Sahlberg (2014) is quick to point out that there are elements of the Finnish system that could be utilized in other countries and there are elements of the system that are more difficult to replicate because they are embedded within the larger Finnish society. Keeping these unique contextual features in mind when making recommendations from this study will be very important, but it will also be impossible to say with certainty to what degree a particular recommendation is applicable to another country in which relevant contextual data has not been sufficiently studied.

Organization of the Study

Chapter 2 provides an in-depth review of the very rich field of research on international achievement, student background characteristics, teacher effectiveness, and school effectiveness. Research will be reviewed from a range of sources including international research organizations such as IEA and OECD, peer-reviewed literature, and non-governmental organizations (NGOs) such as the UN and World Bank. It would be impossible to cover everything from such a large body of research; therefore, the literature review will uncover patterns of findings as well as differences across these different research groups.

Chapter 3 will highlight and validate the selection process of the six countries as well as provide a brief background of the educational contexts of each country. Understanding how the countries are selected is critical because these countries have unique differences in their educational systems, and these unique differences will shape the results within each country. Therefore, providing an overview of each country in the sample will aid in interpreting the findings within a given country. For example, the question of why gender matters for some countries but not others can be explained by understanding the gender disparities in each country.

Once the selection of the countries has been explained and the background of each country profiled, chapter 4 will explain the methodology of the study. This will be a detailed explanation of the multiple analyses conducted and the deliberate process that was used. This chapter also highlights how the complex sampling of TIMSS is treated and provides supporting tables and the syntax used, so the study could be easily replicated within other countries or verified by other researchers.

The results of those methods are shown and explained in chapter 5, starting with the simplest and building towards the most complex. The results are reported by analysis type as well as by country. The reporting by analysis type allows for comparison between countries to uncover patterns or distinct differences across borders. The within-country reporting allows for in-depth exploration within a given country to explain findings that might be unique or important to that country.

Finally, chapter 6 will discuss the results along with the implications for policy as well as future research. The implications for policy will apply the findings to make recommendations that policy makers or stakeholders would want to consider when attempting to improve student achievement outcomes. This will include suggestions for changes at the student, teacher, and school levels. Some changes will be specific to a given country while others will be cross-national and specific to a given situation.
The future research section will use both the findings of the study and questions left unanswered by the study to suggest new avenues for research that build on this work. This could include complementary work with the same sample of countries or broadening this study to include a different or more specific sample of countries.

Chapter 2 - Literature Review

Since this study involves a range of research topics, it is important to review several different types of literature from multiple publication backgrounds. Literature related to SES and student achievement will be a key component of this review to see what the magnitude of the relationship between SES and achievement has been in previous work. Given the importance of SES to this study, a small section will be devoted to reviewing this relevant SES literature. Additionally, literature related to student factors (besides SES), teacher and classroom factors, as well as school factors in terms of how they relate to achievement will also be reviewed to examine what other factors from previous research are associated with student achievement. With respect to publication backgrounds, reports from international agencies like the United Nations and World Bank will be reviewed to draw on their cross-country comparisons. Also in this grouping will be relevant results from international achievement studies like TIMSS, PIRLS (Progress in International Reading Literacy Study), and PISA, specifically those that highlight relationships between outcomes and SES across countries. Finally, the peer-reviewed literature that generally comes from universities will be reviewed, drawing primarily on economic literature to look for additional confirmation of, or different findings from, the international reports as they relate to achievement, SES, teacher, and school characteristics.

International Reports

Before exploring the academic literature, it is important to explore the international literature to see what is being emphasized and discussed at a cross-country level. There are a handful of annual reports that are important to review due to their focus on education and inequality:
• Education for All (EFA) Global Monitoring Report (GMR) – published annually by UNESCO
• Human Development Report (HDR) – published annually by UNDP
• World Development Report (WDR) – published annually by the World Bank
It is also important to review the reports put out for the most recent international testing studies, TIMSS (2011) and PISA (2009). The PISA report will be useful in that it focuses on the impact of SES on achievement.

World Bank and United Nations

The most recent EFA report is focused on the role of teaching and learning as it pertains to improving the quality of education, an aim of this study as well. The report highlights the fact that globally millions of children are not meeting basic learning goals despite attending school and that the vast majority of these children are disadvantaged in some way (UNESCO, 2014). The report points out that teachers are one key component of this problem, although not the only one, because teachers often lack the complementary support that allows them to do their job properly. The main problems the report identifies related to teachers are ensuring that they are properly trained, motivated to teach, enjoy teaching, and can support weak learners (UNESCO, 2014).
More specifically, the report does an excellent job highlighting some of the aspects of working with disadvantaged students as they pertain to teachers, as well as why this is an important aspect to study. According to UNESCO, of the 650 million primary school students globally, 250 million do not meet basic reading and math requirements and 120 million have not made it past grade 4 (UNESCO, 2014). This disparity is also concentrated, with the majority of these students living in Africa and South Asia, according to the report. What is also important to note as it relates to this study is that there are disparities with respect to learning not just between countries but also within countries, most commonly along the lines of gender, urban/rural location, ethnicity, and family wealth (UNESCO, 2014). One way to combat this, the report argues, is to include elements of teacher quality as part of an education strategy that specifically focuses on disadvantaged students. The report notes that in the majority of countries, there is a lack of planning to improve the training of teachers or enhance the training of current teachers (UNESCO, 2014). In 33 countries, fewer than 75% of teachers were properly certified to national standards, and teachers were also found to lack proper content training (UNESCO, 2012). Furthermore, there is a lack of recruitment and incentives for teachers to work with disadvantaged populations where shortages are present. Finally, the last major aspect of the report notes that it is not uncommon for teachers to leave the profession, especially those working in difficult circumstances, so it is important to provide the right incentives to retain teachers in the long term (UNESCO, 2014). Globally, the pupil-teacher ratio is 24 to 1, down from 26 to 1 previously, so class sizes are declining but still large (UNESCO, 2012). However, it is very important to note that children who are disadvantaged benefit the most from reduced class size (UNESCO, 2005).

While an older report, the 2005 EFA report focuses on education quality and the implications of a quality education, so it is relevant to this study. The report aggregates some of the research around earnings and estimates that a one standard deviation increase in mathematics achievement leads to about a 12% increase in lifetime earnings (UNESCO, 2005). This reinforces the importance of improving disadvantaged students' performance since they are more likely to perform at a lower level. The report also highlights spillover and social effects of increasing education, especially in developing countries, noting that with each additional year of education, the likelihood of HIV/AIDS infection decreased while condom use increased (UNESCO, 2005). Inequality and disadvantage can also take on a gender theme, as there are 68 countries globally where girls are disadvantaged when it comes to primary education enrollment (UNESCO, 2012). However, when girls are able to attend school they can be just as, if not more, successful than boys, as evidenced by the fact that girls attain higher achievement in reading compared to boys while boys attain higher achievement in mathematics and science (UNESCO, 2012). This is consistent with research in other reports exploring gender differences by subject.
The report is also careful to point out that just attending school is not a guarantee of success regardless of gender and estimates, as the previous report does, that 130 million fourth grade students are not meeting basic learning requirements across all subjects (UNESCO, 2012). The HDR and WDR are more development-based than education-based but do a very good job of highlighting the spillover effects between education and other sectors. The 2006 HDR notes that of 73 countries with available data, 53, comprising 80% of the world's population, have seen some type of inequality (economic, political, or socio-cultural) rise in the past decade. This manifests itself across different fields: infant mortality is three times higher for less-educated mothers, and simply reducing the gender disparity in education will also help reduce the number of undernourished children as mothers gain more education (UNDP, 2005). Where you live also matters as it pertains to disadvantage, as rural households are more likely to be less educated, have lower incomes, and have lower-quality teachers (World Bank, 2006). These international reports clearly show some global trends with respect to education but also some unique country differences as well. The most common trends highlight that inequality is clearly impacting educational metrics including enrollment and performance. There are also gender disparities present where girls are not given the same opportunities as boys despite the evidence of numerous benefits of girls getting educated. There is also great variation in the quality of teachers, which impacts the performance of their students. Finally, the reports show that these trends are not present everywhere, as shown by a 2005 EFA report highlighting four countries (Finland, Canada, Cuba, and Korea) that go against international trends, so it is possible to identify and rectify the sources of inequity. Since two of these four countries are included in the sample, they will be profiled in greater detail in the next chapter.

OECD and IEA Studies

One of the highlights of the 2009 PISA study is that low socio-economic status is not an automatic indication of low academic performance. In the report, students who come from the bottom quarter of SES but score in the top quarter of the exam are labeled as "resilient" students. These are the students of most interest in this study, in trying to explain what exactly makes them "resilient." The explanation will most likely be a combination of individual, classroom, school, and country effects, although to what extent is unclear. Consistent with other work, PISA 2009 finds effects for schools, communities, and family background on student performance, but these do appear to vary by country, highlighting the importance of cross-national comparison. Furthermore, the best-performing countries overall also have the smallest gaps in performance because a smaller proportion of their students have very low performance (OECD, 2009). In other words, by eliminating extreme lows in student achievement, the overall average achievement of the country is raised. In cases where disadvantage is present, the report has findings that are in line with previous research. It finds that disadvantaged students get access to lower-quality teachers and that students who attend schools with more affluent peers are likely to have better educational performance (OECD, 2009).
Additionally, single-parent, rural, and immigrant students are more likely to have lower performance (OECD, 2009). The news is not all bad: positive attitudes by students and teachers with respect to each other and the subject content have a positive relationship to score outcomes (OECD, 2009). This finding is also present in the peer-reviewed journal literature in the next section. The 2011 TIMSS science report also shows SES effects on science achievement, with schools that have more affluent students showing higher scores than schools with more disadvantaged students. The eighth grade difference is almost 50 points, which equates to half a standard deviation. There is an almost equal distribution of these types of schools across countries in the 2011 TIMSS study, with about one third of schools listed as "affluent," about one third listed as neither, and about one third listed as "disadvantaged" for eighth grade students (Martin et al., 2012). One unique aspect is that the TIMSS and PIRLS 2011 studies overlap and therefore share some additional home and background information that would not otherwise be collected. These data are only for the fourth grade students in the sample, so they cannot be used for this study, but the findings are important to note. Not surprisingly, a major factor that matters is parent involvement and the home situation. Children whose parents read to them or work with them on their mathematics/science perform better in all subjects (Mullis et al., 2012; Martin et al., 2012). The other common finding is the importance of the skills a child possesses before they enter schooling and whether they attended a pre-primary institution. Children who had basic literacy skills or could do simple numeracy before entering school had significantly higher scores (Mullis et al., 2012; Mullis et al., 2012). Finally, having resources at home to support the subject area learning, such as books at home or science materials, has a positive impact on scores (Mullis et al., 2012; Martin et al., 2012).

There are also some important findings about the schools and the teachers that impact test performance. Interestingly, school resources seem to matter in that a better-resourced school will most likely have higher test scores across all subjects (Mullis et al., 2012; Mullis et al., 2012; Martin et al., 2012). The culture of the school and how it is organized also seem to impact performance across all subjects. Schools that emphasize academic success and are more clearly organized are also more likely to have higher performance (Mullis et al., 2012; Mullis et al., 2012; Martin et al., 2012). From a teacher standpoint, the biggest impacts on test scores are teacher background characteristics, teacher attitude toward the subject, and teacher attitude toward their students, all three of which have a positive association (Mullis et al., 2012; Mullis et al., 2012; Martin et al., 2012).

The reports also explore student-level characteristics and again, elements of disadvantage begin to take shape. Connecting back to some of the literature on human development, students who have better nutrition are more likely to have a positive attitude, be more engaged, and in turn, have better test performance (Mullis et al., 2012; Mullis et al., 2012; Martin et al., 2012). This is more likely a problem for students from a less-resourced family as opposed to a more well-off family. Another potential source of disadvantage is the language of the test and how that connects to the native language of the student.
Not surprisingly, students who do not speak the language of the test as their native language are more likely to perform poorly on the PIRLS/TIMSS assessments (Mullis et al., 2012; Mullis et al., 2012; Martin et al., 2012). Finally, while not necessarily a source of disadvantage, student attitude, engagement, and confidence all impact test performance (Mullis et al., 2012; Mullis et al., 2012; Martin et al., 2012). It is unclear from the reports whether this is more pronounced for disadvantaged students or whether it holds across all student groups.

In addition to looking at the reports put out by the IEA and OECD, it is also important to review some of the research coming from the most recent IEA meetings to help position this study within the current work being conducted. First, Hedges (2013) looks at the relationship between a country's GDP per capita (which he admits is an imperfect measure) and its achievement by comparing the achievement of those above and those below the GDP per capita. Hedges does find relationships between achievement and the amount of inequality between countries, but these vary between rich and poor countries. He finds that in rich countries, inequality is weakly related to achievement, but in poorer countries it is negatively related to achievement. Hedges' work helps support the selection and grouping of countries using their achievement and the variation within that achievement. Poon, Ng, and Lim (2013) presented on the differences in performance by SES in Singapore using TIMSS and PIRLS 2011 data. They explore the relationship between SES and performance in Singapore against other countries using home resources as their SES proxy. Their analysis focused on the strength (R2) and slope (B1) of the relationship between SES and achievement, similar to what the TIMSS, PIRLS, and PISA international reports do. In Singapore they find that an equal number of students from each SES quartile are represented in the top achievement quartile and that the slope of this line is similar to the international line. Their work helps highlight the different ways of exploring SES and achievement quartiles. Finally, Klieme (2013) presents on the challenges of large-scale data sets in capturing the quality of schools and teaching to inform policies beyond achievement. He argues for more context-specific additions to the general IEA assessments that are built on existing theoretical frameworks for school and teacher quality. Klieme also argues for more adaptive testing, which can help to reduce the amount of error in measuring achievement results. While the TIMSS dataset does have limitations, the concerns raised by Klieme cannot presently be addressed, as the IEA does not offer adaptive testing, but they do highlight the importance of considering context with respect to any findings. The TIMSS Encyclopedia (Mullis et al., 2012) will be an essential support in helping to contextualize the findings.

So what should be taken away from this large body of international research, of which this review has only scratched the surface? First, there are clearly several elements of inequality that impact test performance. Students, parents, the home, the school, and teachers all play a role, as has been hypothesized and demonstrated in previous research. Gender continues to be a critical component of predicting achievement. It also appears that student attitude, motivation, and confidence can help.
A parent's level of education and their level of involvement at home both impact their child's test score. There are also clearly peer effects from attending school with higher-achieving peers and attending a school that emphasizes learning. Finally, what makes a good teacher is unclear, but job satisfaction and attitude towards teaching can matter as well. All of these are important variables to explore in the analysis.

Socio-Economic Status

Given the importance of SES to this study, it is important to review the literature around SES and achievement. The 2012 EFA Report confirms that higher SES leads to higher educational outcomes (UNESCO, 2012). In education, Coleman's research in 1966 and the numerous follow-up studies show that socio-economic status is one of the biggest sources of disadvantage related to achievement outcomes. It also carries over to careers after school, as background, race, and gender are major barriers to students entering STEM fields (Griffith, 2010). The international literature was very clear as to the impact of SES on student achievement, so it is of great importance to understand what goes into the definition of SES. The review of the literature below will show that SES is a multi-dimensional measure that comprises many elements of an individual's background. Both the PISA and TIMSS studies have shown that SES impacts achievement at multiple levels of the educational process, but SES can be measured in multiple ways. A single variable such as free or reduced lunch or books in the home is often used as an SES measure but can be unreliable, especially across cultures where the selected measure might relate to SES to different degrees. What exactly is the best way to measure SES? That is the exact question that the panel commissioned by the IES for the NAEP study set out to answer (Cowan et al., 2012). The panel made a series of recommendations, some of which will be applied to this study, with the hope of providing a better analysis of the impact of SES on achievement. The panel suggests using a "big three" measure that includes family income, parental educational attainment, and parental occupational status combined into a composite variable. Additionally, the panel suggests an expanded measure of SES could include neighborhood or school SES. With respect to using a single or composite measure of SES, the panel suggests that the advantages of using a composite measure outweigh the disadvantages. One main issue the panel highlights is correctly weighting the SES composite to create a proper measure based on the data being used. This study, due to limitations in the data, will apply some, but not all, of these recommendations regarding measuring SES to the dataset. Family income will be approximated by the variable "books in home" as reported in the student questionnaire. Parental educational attainment will also be used for both parents. Unfortunately parental occupational status is not available in the dataset, so this cannot be included in the measure of SES. TIMSS also has a variable from the school questionnaire that asks about the percentage of economically disadvantaged students in the school, so this can be used to estimate school-level SES. Since not all of the factors of SES are available for use, it is possible that the effect of SES will be understated.
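As an illustration of how such a composite might be assembled, the Stata fragment below standardizes the available proxies, averages them, and then flags the bottom SES quartile. It is a minimal sketch only: the variable names (books, momed, daded) are hypothetical placeholders rather than the actual TIMSS item names, and the weighting and within-country handling actually used are the subject of chapter 4 and Appendix 1.

```stata
* Minimal sketch, not the study's actual code: build an equal-weight SES
* composite from the three available proxies and flag the bottom quartile.
* books, momed, and daded are hypothetical placeholder names for the
* books-in-home and parental education variables.
egen z_books = std(books)
egen z_momed = std(momed)
egen z_daded = std(daded)
egen ses = rowmean(z_books z_momed z_daded)

* Flag low SES students (bottom quartile of the composite); in practice the
* quartiles would be formed within each country rather than on the pooled sample.
xtile ses_q = ses, nq(4)
gen low_ses = (ses_q == 1) if !missing(ses_q)
```

A parallel quartile split on achievement would then allow low SES students with top-quartile scores, the "resilient" students discussed above, to be identified.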
A different "big three" SES measure has also been explored in the academic literature. Marks, Cresswell, and Ainley (2006) use PISA 2000 to look at the impact of material, social, and cultural resources on achievement. The fact that all three of these resources are considered part of SES and contribute differently to achievement reinforces the importance of using a comprehensive SES measure. As hypothesized previously, they find that the magnitude of the effect varies by country, but on average 60% of the variation in achievement is due to the combination of these three factors. Another important finding of their work was that the effect held regardless of the subject for which achievement was measured (reading vs. mathematics vs. science). Since the effect of SES appears relatively consistent by subject, this paper will use just science as an achievement outcome; however, comparing math or reading achievement would be a useful avenue of future research to check for subject effects.

The academic literature also reinforces this impact, but it is also important to understand the generational factors that affect students as a result of where they live or who their parents are. Oreopoulos, Page, and Stevens (2006) attempt to understand some of the intergenerational effects of schooling and find that a one-year increase in the education of either parent reduces the probability that a child repeats a grade by 2 to 4 percentage points. This highlights the need to include a parental education variable within an SES measure. Archer et al. (2012), using data from the United Kingdom, also find that SES factors impact how students engage with science as a subject as well as how they see themselves working in a science-related career later in life. The results from their mixed-methods study show that middle-class families were able to better leverage their children's interest in science by providing them with additional opportunities, while lower-class families were not able to provide these additional opportunities. Archer et al. argue that this is because lower-class families do not have the financial means or science-specific knowledge required to foster these additional opportunities in science for their children. This work highlights a class divide along SES lines within science in which lower-SES students are at a disadvantage in undertaking additional opportunities within science as a discipline.

So how might achievement and SES gaps be changing? Reardon (2011) analyzes income-based achievement gaps over time using United States datasets. He finds that achievement gaps between students from families at the 10th and 90th percentiles of income have in fact increased in the United States. If these gaps are widening, then this study becomes critical for understanding how to lessen the gap by detecting where disadvantaged students are being successful or what might be contributing to the increasing gaps.

Peer-Reviewed Journal Literature

The peer-reviewed journal literature confirms much of what was found in the international reports in that science achievement is a combination of student, teacher, and school characteristics. The peer-reviewed journal literature review will take a more science-specific focus to try to highlight some of the unique aspects of science achievement with respect to students, teachers, and schools.
Studies with TIMSS and Other International Datasets

To start, it is important to highlight some of the major studies using TIMSS and other international datasets to see what other researchers have been able to learn from their analyses. One important finding in the Finnish case is that high exam scores do not automatically mean high engagement or interest. Finland, for example, has very high scores on international exams; however, this does not mean that students are necessarily interested or engaged in science. Lavonen and Laaksonen (2009) explore this idea using PISA 2006. They find that demonstrations, practical work, and the ability to make conclusions are the strongest predictors of Finnish student performance in science. Despite the high marks on international tests, Finland suffers from low entry into science careers just like many other countries in the world. In PISA 2006, Finnish students' interest in science careers was lower than the OECD average (Lavonen & Laaksonen, 2009). This highlights the divide between student achievement on tests and students being interested in science to the point of taking up a career in science. Finnish students are generally not interested in the ways in which scientists design experiments or what is required for scientific explanations, nor are they interested in inquiry or debate, but they are interested in how science can improve people's conditions or work for the protection of the greater good (Lavonen & Laaksonen, 2009). This indicates that teachers may need to better connect science to the outside world and place less of an emphasis on specific details of scientific processes.

At the teacher level, what makes a good teacher is difficult to pin down, especially when it comes to working with disadvantaged students. Akiba et al. (2007) use TIMSS 2003 data to find that having a high-quality teacher (certification, a math major, an education major, and more than three years' experience) leads to higher student achievement in mathematics, but that there are significant gaps along SES lines in access to higher-quality teachers. They argue for teacher policies that recruit the top candidates into the teaching profession and develop them so that they will stay and become successful. This is similar to the model used in Finland, where entry into teaching is very competitive.

Students

Gender has been highlighted as a major source of disadvantage despite overwhelming evidence that investing in girls has numerous benefits. Schultz (2002) argues for more investment in girls' education because girls can show the greatest marginal gains for the same level of investment, so it makes financial sense. He also points out that child health and schooling outcomes are more closely related to those of the mother than the father, so it is important from a generational perspective as well. From an international perspective, Beaman et al. (2012) look at a female leadership policy in India where certain village council spots are reserved for women every few years. They found a positive influence on girls' educational achievement when councils were headed by women. Both parents' and girls' aspirations also increased with a female village leader present, and girls spent less time on chores and more time on school during these times. Lloyd et al. (2006) examine dropout rates in Pakistan during two different years. They find that both household and school factors matter.
Girls show high dropout rates depending especially on which school they attend, the affluence of their household, and whether their mother attended school. A loss in household income significantly increases the risk of dropping out, most likely because the parents need the child to work or can no longer afford the fees to send the child to school. These are examples of the important role parent occupation and income can play in measuring SES; however, this measure is not available for this study.

Focusing in on science and gender, one consistent finding is the enormous difference between girls and boys across a whole range of science measures, so it is important to highlight girls' experiences with science both in and out of school. The gender ratio for participation and achievement is biased toward boys in physics and chemistry, although biology is more even, sometimes even biased toward girls (Osborne et al., 2003). There are many reasons attributed to this, such as differences in class experiences (Griffith, 2010), lack of opportunities to actively participate in science (Osborne et al., 2003; Uitto et al., 2006), or communities that reinforce traditional roles of women and science (Riegle-Crumb & Moore, 2013). The community and culture also impact girls and their science participation. Riegle-Crumb and Moore (2013) try to uncover why fewer women take physics courses in the United States. They find that in communities where women are more present in STEM fields, the physics course-taking gender gap is reduced. They also use international gender parity indices and show that reduced gender disparity within a nation also reduces female disadvantage in the sciences. Griffith (2010) also finds these community and role-model effects in her work in the United States: having more female and minority graduate and undergraduate students helps improve the persistence rates for females and minorities at the university level.

Griffith (2010) examines why women and minorities in college have much lower persistence rates than non-minority, male students. She argues that much of the difference is explained by differences in their preparation and educational experiences. Taking more AP classes in STEM, according to Griffith, has a positive impact on the persistence rates of women and minorities. Receiving low grades early in the program, not surprisingly, has a negative impact on persistence (Griffith, 2010). This finding reinforces the importance of teaching and success in science classes to persistence within science as a field. Griffith also finds that how much a university devotes to research matters: universities that devote more resources to research show lower persistence rates, implying that less attention is given to teaching. She also finds that having more undergraduates relative to graduate students helps persistence rates, as does increased spending on teaching over research. While this study will not be able to measure persistence rates, these findings highlight the long-term importance of any potential gender findings from this work for longer-term questions about girls and science careers.

Further highlighting the importance of the student-teacher relationship, Askell-Williams and Lawson (2001) very simply but effectively asked students in Australia what the components of interesting class lessons were across all subjects.
Their argument was that student input was needed in shaping teaching policy and to aid in teacher development. They highlighted three main clusters that students positively associated with interest and learning:

1. Teachers: their personal qualities in how they relate to students and their professional qualities in terms of what they know about their subject;
2. Individual learning: students' situational interest, their individual achievement, and their self-efficacy; and
3. Social learning: the chance to interact with other students, work together with them, and have a positive class environment.

Askell-Williams and Lawson show that students are able to differentiate between student and teacher behaviors, and they find that there is a gap between teachers' actions towards student learning and actual student learning. The data available in TIMSS allow for the study of a student's situational interest in science as a subject as well as their self-efficacy in science. This appears to be an important area to explore based on the literature, specifically as it pertains to low-SES students, because it is something that could be changed more easily than other factors. If, in fact, low-SES students are disproportionately likely to have a negative affect toward science, and a negative affect toward science is related to achievement in science, then this opens an avenue to explore what might be causing the relationship between low SES and a negative affect toward science.

Teachers

Building off the international findings of Akiba et al., Lankford et al. (2002) highlight how low-income, low-achieving, urban, non-white students are more likely to have the least skilled teachers in the United States. Boyd et al. (2009) find wide variation among teachers in New York City schools depending on where they were prepared and that this has a significant impact on student achievement. While it may be difficult to define the exact characteristics, the literature is very clear that disadvantaged students are more likely to have lower-quality teachers.

The existing literature has focused mainly on individual student characteristics and less on how what teachers do impacts students. Science teachers, and teachers more generally, are an important component of the student experiences highlighted so far, in that the choices teachers make in the classroom will impact students. The quality of teaching is a significant determinant of attitudes toward, and achievement in, school science. Positive attitudes in science are associated with higher levels of involvement, rapport with the teacher, relationships with classmates, use of a range of teaching strategies, and new learning activities (Osborne et al., 2003). Osborne et al. (2003) also find that engagement was raised by pupils having autonomy and more control over their learning within their science classes. The clearest example of this importance comes from Cooper's (2013) work, which looks across all subjects in a single school but highlights some science-specific findings. In her sample, 71% of the variance in engagement was at the class level or student-by-class level while only the remaining 29% was at the student level alone. Cooper finds that teaching practices explained a large proportion of engagement across classes, reducing student residuals by 44%, class residuals by 77%, and student-by-class residuals by 41%.
What is most astonishing is that race, parent education, class level, and class period were not significant when accounting for teaching practices. While this study measures achievement rather than engagement, this finding possibly highlights some ways in which the impact of disadvantage can be lessened through teaching. The qualitative portion of Cooper's work helps to highlight exactly what this looks like, especially in science classes. A physics class was used as the exemplary model where the teacher utilized all three types of teaching being tested (connected instruction, academic rigor, and lively teaching). In this class, students saw physics as being relevant to their lives, the teacher made content easy to understand, and the teacher related to students by using humor. The teacher was a 39-year veteran with a physics major whom the students also saw as knowledgeable about the subject. To contrast this, Cooper profiled another physics class where only lively teaching was used. Students reported liking the teacher but felt disconnected from the content and did not show a high level of rigor or personal investment in what they were doing. Finally, Cooper profiles a biology class where the teacher, a trained biologist, uses a high amount of rigor and lively teaching but not connected instruction. These students reported liking the teacher and understanding the content; however, they did not see how it was relevant to their lives. Cooper's work shows that the pedagogy teachers use, along with their credentials, has an impact on student experiences in the classroom.

Drawing on the Finnish context, Lavonen and Laaksonen (2009) report that for Finnish teachers, the most useful science practices were frequent use of teacher demonstrations, assigning practical work in the classroom, and the opportunity for students to draw their own conclusions. They also report that teachers do a good job of supporting students, as evidenced by high student self-efficacy and self-concept in science. This again supports the importance of teacher pedagogy and promoting student self-confidence in science.

Based on this review of the literature, teacher background characteristics as well as the choices teachers make in the classroom both impact outcomes in the science classroom. A major in science and years of experience appear to be important background characteristics. In the classroom, providing open-ended experiences, relating content to students, and giving students examples all seem to be important practices. These findings indicate that teacher actions, whether pedagogical or attitudinal, can positively impact students. It is important for this study to explore these characteristics as they relate to low-SES students. If, in fact, low-SES students are more likely to have teachers who make disadvantageous pedagogical choices or foster poor attitudes, then understanding the impact these teacher factors might have on student achievement becomes another important aspect of this study.

School

Schools as institutions are frequently studied in a variety of ways to see how their structure impacts individual student performance. The international literature showed that well-resourced schools do matter and that peer effects can take place within schools that lower or raise student performance depending on the peers. The peer-reviewed literature is in less agreement with these findings.
Greenwald, Hedges, and Laine (1996) analyze 60 different studies of school resources and find that, overall, increased resources, small class sizes, and teacher characteristics improve student achievement in most cases, but not all. Glewwe et al. (2009) conducted a randomized trial in rural Kenya and found that textbooks did not raise test scores overall in English, math, and science, but did for the best students who knew how to use them. This suggests that resources can be beneficial if schools and students know how to use them to improve student performance. It would not be unreasonable to assume that more affluent schools are better at leveraging resources than less affluent schools.

Class size within a school is also frequently researched as a way to boost student achievement. Dee and West (2011) find that small class sizes greatly impact the non-cognitive development of eighth graders in the United States across all subjects, especially in urban settings where more disadvantaged children are likely to live. Angrist and Lavy (1999) study class size in Israeli schools that have a cap of 40 students (Maimonides' rule) and find positive effects for fourth and fifth grade students but no effect for third grade students with respect to mathematics and reading. Krueger (2003) studies teachers and students from kindergarten to third grade in the United States and finds that small class size effects in mathematics and reading are largest for minority students and students on free and reduced lunch. Michaelowa (2001) looks at five Francophone African countries using PASEC data and finds that a variety of factors affect achievement in mathematics and reading. Hoxby (2000) finds no effect of class size on achievement in mathematics and reading when looking at 649 elementary schools. The effect of class size appears to be highly dependent on the local needs of the country, hence the wide variation observed. The literature seems to conclude that class size generally matters for the most disadvantaged students who need the extra attention but is inconclusive for other students. Class size is not included in this model due to the inconsistency and highly contextualized nature of the effect; however, it is a variable that may be worth exploring at a later date.

Another body of school literature has to do with peer effects: does attending a school with higher-achieving (or lower-achieving) peers raise (or lower) a student's own performance? In Kenya, peer effects are found in both lower- and higher-achievement tracks for mathematics and reading (Duflo, Dupas, & Kremer, 2012). This may be due to both working with like-ability peers and teachers teaching toward the middle of the distribution, which helps all the students in the classroom rather than just some of them (Duflo, Dupas, & Kremer, 2012). Angrist and Lang (2004) look at the METCO desegregation program in Boston, where students are sent from Boston into the suburban areas. They find little to no effect on either the current or incoming students as a result of the program, meaning that adding lower-achieving students from Boston did not lower the performance of students in the receiving schools in mathematics and language arts. Imberman, Kugler, and Sacerdote (2012) look at the impact of Katrina evacuees being moved into surrounding school districts in the aftermath of the natural disaster. They find little to no impact on incumbent students' achievement overall in mathematics and language arts.
In cases where there was an impact, they find that higher-achieving students placed with higher-achieving peers improve, while lower-achieving students placed with lower-achieving peers worsen. The results on peer effects are mixed, but it appears that, for students in disadvantaged situations, peer effects do not change the achievement of advantaged peers, yet they can magnify the low achievement of disadvantaged students when those students are grouped with other disadvantaged peers.

The school literature highlights some important characteristics for this study to consider. The international report literature highlighted the importance of school climate and organization on achievement. It also highlighted the urban and rural school differences that are present in some countries. The peer-reviewed journals also highlighted class size but with inconsistent findings, so that is not something that will be considered for this study, although it could be important for a future study. Both sets of literature highlighted the importance of peer effects and how the composition of the school influences individual student performance. These factors will be considered as they relate to low-SES students, with respect to how their individual performance is impacted by the climate of the school, where the school is located, and the types of peers in the school.

Overall, the literature provides a good basis of variables to explore for the study. At the student level, gender and SES will be critical aspects to study. Additionally, it will be important to look at student attitudes and affect as they relate to science achievement. At the teacher level, teacher background, attitudes, and pedagogy are important characteristics based on the literature. At the school level, the composition of the student body, the climate of the school, and the location of the school all impact achievement according to the literature. What appears to be largely missing from the literature, and where this study will pick up, is the interaction of these factors with SES and achievement. The literature consistently finds, for example, that student or teacher attitudes impact student achievement. What the literature does not say is whether low-SES students are more likely to have a poorer attitude and how that might, if at all, impact their achievement. This study will help to highlight not only the effect of these factors on low achievement but also how SES intersects with each factor and achievement. The next section will give some background on the country selection, a rationale for the selection, and profiles of the countries selected.

Chapter 3 - Country Selection and Background

This chapter will cover the selection of the countries to be used in the analysis and then provide some detailed descriptions of each country. The selection of the countries was done in a very purposeful manner so that the sample contained some very stark differences in achievement and the standard deviation of that achievement; thus, it is important to highlight that process and the rationale that went into the country selection. The background portion of this chapter will focus on the political and structural educational systems in each country, including governance, teacher certification, and school structure. This portion is intended to provide background for the results that will come from the analyses so the findings have some context.
Country Level Analysis and Selection

The first step for the study was to systematically draw a sample of countries aligned with the idea of exploring differences in the achievement of disadvantaged students relative to the population. This required first looking at broad differences between countries with respect to their achievement and their level of achievement inequality. To do this, a simple scatter plot was made that plotted the science achievement score for each country against the standard deviation (achievement inequality) of that country. Figure 3.1 shows this graph, and a clear negative relationship emerges between the amount of achievement inequality (measured by the SD) and the level of achievement: as a general trend, as the variation in science achievement increases, science achievement decreases. To see whether this trend holds across achievement groups, the same graph was replicated using each country's 75th percentile of science achievement against the SD, and again using each country's 25th percentile of science achievement against the SD. These can be seen in Figures 3.2 and 3.3. The same negative pattern continues in both groups, but what is very striking is the magnitude of this relationship: the slope for the 25th-percentile students' achievement is almost four times greater than that of the 75th-percentile students. This suggests that lower-achieving students have their achievement depressed by other factors in higher-inequality countries, a very important first finding that helps to justify the structure of this analysis.

[Figure 3.1 – Science Achievement vs. SD of Science Achievement by Country. Source: TIMSS 2011 8th grade data]

[Figure 3.2 – 25th Percentile Science Achievement vs. SD by Country. Source: TIMSS 2011 8th grade data]

[Figure 3.3 – 75th Percentile Science Achievement vs. SD by Country. Source: TIMSS 2011 8th grade data]

Based on these graphs, a sample of countries was selected. It was decided to divide the graphs into quadrants and select one country from each quadrant as well as two near the middle, for a total of six countries in the sample. Finland represents the high-achievement, low-variation quadrant, with Chile representing low achievement and low disparities. Singapore represents the high-achievement, high-disparity country, while Ghana represents the low-achievement, high-disparity country. Nested in a group of countries toward the middle point of the four quadrants are the United States and the Republic of Korea, which will also be included. This sample provides very stark contrasts in achievement and disparities in that achievement. Finland represents the ideal combination by having higher achievement but also low variation within that achievement, meaning many students are brought to high levels of achievement through equitable outcomes. Chile shows that low variation does not guarantee high achievement.
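The selection logic above can be illustrated with a small sketch in Python. The table of values below is made up purely for illustration (the real analysis uses the TIMSS 2011 country-level results and their plausible values), and splitting the scatter around the cross-country medians is just one simple way to operationalize the quadrants described above.

```python
# Illustrative sketch of the country-selection logic: slopes of 25th/75th
# percentile achievement on the SD, plus a simple quadrant assignment.
# All numbers are invented placeholders, not TIMSS 2011 estimates.
import numpy as np
import pandas as pd

countries = pd.DataFrame({
    "country": ["Singapore", "Finland", "Korea", "USA", "Chile", "Ghana"],
    "mean_sci": [590, 552, 560, 525, 461, 306],
    "sd":       [100, 65, 80, 80, 70, 110],
    "p25":      [520, 508, 505, 470, 412, 230],
    "p75":      [665, 598, 618, 582, 510, 380],
})

# Compare how steeply low and high achievers fall as inequality (SD) rises
slope_p25 = np.polyfit(countries["sd"], countries["p25"], 1)[0]
slope_p75 = np.polyfit(countries["sd"], countries["p75"], 1)[0]
print(f"slope for 25th percentile: {slope_p25:.2f}, for 75th percentile: {slope_p75:.2f}")

# Quadrants around the medians of mean achievement and SD
med_mean, med_sd = countries["mean_sci"].median(), countries["sd"].median()
ach = pd.Series(np.where(countries["mean_sci"] >= med_mean, "high-achievement", "low-achievement"))
var = pd.Series(np.where(countries["sd"] >= med_sd, "high-variation", "low-variation"))
countries["quadrant"] = ach + " / " + var
print(countries[["country", "quadrant"]])
```

With only six invented rows the fitted slopes are meaningless; the point is simply the mechanics of comparing the two slopes and binning countries into quadrants for selection.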
Country Background Information

It is critical to situate each country within its given context so as to provide some background for interpreting the results that are found. To aid in doing this, TIMSS provides an encyclopedia in which a specialist or official in each country provides background about that country's education system. Other outside encyclopedias and sources will also be used, where available, to complement the TIMSS encyclopedia, including the TEDS-M encyclopedia, which, although it relates to the preparation of mathematics teachers, has numerous applications to teachers in this sample independent of subject. The country background section highlights how the educational structures might be contributing to any findings. These structures include the teachers, students, families, and governance within each country. What helps bind these structures together is the idea of equity in opportunities and outcomes as first put forth in the 2009 PISA report on overcoming social background (OECD, 2010). The report argues that the top-performing countries educate all their students to a high level of performance by providing equal opportunities and having a low impact of external factors like SES on achievement. This high performance is a function of the various educational structures interacting to equalize outcomes.

Low Variation Countries

Finland

Finland represents high achievement but low variation in that achievement for the sample in this study. That is to say, the unique contribution of the Finnish system to this study is that it provides high-quality education with comparatively small differences between students. This is a process that has not happened overnight but has actually taken many decades to come to fruition. Starting in the 1970s, Finland moved from a tracked system to a comprehensive system where basic education is available for all students (Hautamaki et al., 2008). In 1985, the educational system was reformed to become more decentralized so that more authority was delegated to the municipalities (Hautamaki et al., 2008). During the 1990s, Finland undertook a major curricular reform (the LUMA program) focused on developing a more comprehensive mathematics and science curriculum (Kupari & Vettenranta, 2012). LUMA was a large-scale investment by the Finnish government in education, and Finnish teachers were able to take thousands of credit hours of additional training (Hautamaki et al., 2008). One outcome of this series of reforms is that the Finnish government has taken extreme care to ensure that everyone has an equal opportunity to receive a compulsory education (Kupari & Vettenranta, 2012). The system in Finland is now structured so that the national government broadly sets the curriculum through the National Core Curriculum for Basic Education while the local government has autonomy to implement the national curriculum and content (Hautamaki et al., 2008). The goal of the National Core Curriculum is to establish common guidelines for student experiences (Hautamaki et al., 2008). The school systems are now organized by municipalities, which are responsible for making these local adjustments to the national rules (Kupari & Vettenranta, 2012). One example of local autonomy in Finland is that schools and teachers have the authority to choose the textbooks and supporting lessons that they feel work best for their students (Kupari & Vettenranta, 2012).
More broadly, Finland is a small but wealthy country where the majority of the population of 5.2 million people are of Finnish descent and speak Finnish, although Swedish is an officially recognized minority language (CIA, 2013). Finland distributes its $35,900 GDP per capita very equally, as evidenced by a Gini index that ranks 131st in the world (CIA, 2014). Finland is an older country with respect to age, as only 15% of the population is age 14 or younger, leading to a youth dependency ratio of 26% and a median age of 43 years (CIA, 2013). Finland spends 6.8% of its GDP on education, one of the highest rates in the world, so it is no surprise that 100% of the population is literate and school life expectancy extends to 17 years of age (CIA, 2014).

One of the strongest assets of the Finnish education system is the strength of its teachers. In 1975, Finland undertook major teacher education reforms by moving teacher training from teacher colleges to universities, which created a culture of research-based teacher training that is still prevalent today (Hautamaki et al., 2008). In Finland, a master's degree is required for teacher certification: classroom teachers earn their master's in education while specialist teachers earn their master's in their subject area (Hautamaki et al., 2008). Classroom teachers in Finland teach all subjects from grades one to six, while specialist teachers teach only the subject for which they are certified from grade seven and up (Kupari & Vettenranta, 2012). In addition to amassing 300 credit hours for their master's program, teachers must participate in in-service training, which is facilitated at the local level (Hautamaki et al., 2008). The requirement of a master's degree also helps facilitate teachers continuing their education into an additional master's degree or a doctorate (Hautamaki et al., 2008). As a result, teaching as a profession in Finland is extremely competitive, with acceptance rates into teaching programs averaging 10 to 15 percent each year (Hautamaki et al., 2008).

Basic education in Finland starts at about seven years old; however, children do have the option of attending a free year of pre-primary schooling (Kupari & Vettenranta, 2012). Enrollment in this is very low compared to other countries, and only the kindergarten year before grade one is highly attended (Hautamaki et al., 2008). Basic or compulsory education lasts nine years, through grade nine. While enrolled in school, students receive complementary social supports such as free meals in school as well as medical and dental care (Hautamaki et al., 2008). In Finland, science is taught as an integrated subject in grades one to four called "Environmental and Natural Studies" (Kupari & Vettenranta, 2012). In grades five to nine it is taught as separate subjects: biology, geography (earth science in the United States), physics, chemistry, and health (Kupari & Vettenranta, 2012). In biology students focus on "life and natural phenomena," in geography students learn about the "world and regional phenomena," in physics students learn about the "nature of physics," in chemistry students learn about the "nature of chemical information and scientific thinking," and in health students learn about "well-being and safety" (Kupari & Vettenranta, 2012).
Students in Finland are usually 15 years old in 8th grade. It is difficult to put an exact number on how many hours of science per week a Finnish student receives because Finnish students have a degree of autonomy in designing their schedules, but science lessons in Finland are typically 75 minutes (based on my own experiences in Finland). This combination of reforms, investments in education, and a professionalized teacher workforce has Finland consistently ranking near the top of international achievement exams in recent years. This has drawn attention to understanding the Finnish educational system in greater detail and what specifically drives its success. The Finnish educational system is unique in that it takes great care to ensure that every student has equality of opportunity, supported by a generous social structure.

Republic of Korea

In the Republic of Korea, the population of almost 50 million is almost entirely of Korean descent, and Korean is the main spoken language (CIA, 2013). Korea is also an older country, with only 14% of the population aged 14 years or younger, a median age of 40 years, and a youth dependency ratio of 20% (CIA, 2013). Korea is also a wealthy country with a GDP per capita of $33,200, and this income is generally equally distributed, as Korea's Gini index ranks 112th in the world (CIA, 2013). Korea spends 5% of its GDP annually on education, has almost universal literacy at 98%, and has a school life expectancy of 17 years (CIA, 2013). Korea's inclusion in this study represents average to above-average achievement with average variation in that achievement; of all the countries in the sample, it has shown the most consistent increases in achievement over time.

Similar to Finland, Korea has undertaken a series of rapid reforms and made a significant investment in its education system over the past half century (UNESCO, 2005). Unlike Finland, a large part of Korea's success is also due to a constant cycle of evaluation. This constant evaluation and assessment of the educational system drives many of the reforms. Starting in 1945, Korea moved toward a compulsory basic education system by expanding the educational system and creating teacher education colleges (OECD, 2014). Then, 30 years later, the government implemented 10 reforms so Koreans could "lead the 21st Century" (OECD, 2014). This included updating almost all aspects of the educational system but also putting an emphasis on science education (OECD, 2014). Starting in 1991, the Korean government delegated many budget and administrative decisions to the local level, and while the curriculum is still written at the national level, it is designed to allow for local flexibility (Cho et al., 2012). One interesting aspect of the Korean education system is that, at a national level, education is housed in the Ministry of Education, Science, and Technology (MEST) (Cho et al., 2012). This reinforces the importance the government places on the relationship between education, science, and technology. The Korean national curriculum is frequently revised, with revisions taking place most recently in 2007 and 2011 (Cho et al., 2012). The MEST also began work to introduce a STEAM (science, technology, engineering, arts, and math) curriculum at the elementary and middle school levels (Cho et al., 2012).
The science curriculum for grades three to ten comes from the national common basic curriculum and is centered mainly on inquiry-based learning (Cho et al., 2012). For eighth grade students, science concepts center on energy, material, life, and the earth (Cho et al., 2012). Presently, student progress is assessed annually through the National Assessment of Educational Achievement (NAEA), which started in 2008 (OECD, 2014). The results of the examination are made public and are used to compare schools and regional governments against each other as well as over time (OECD, 2014). Overall, Korea has been lauded in PISA for the below-average impact of SES on student achievement and for giving all students equal access to learning opportunities (OECD, 2014).

In Korea, elementary school consists of grades one to six, middle school is grades seven to nine, and high school is grades ten to twelve (Cho et al., 2012). There is also preschool education available for children aged three to five. Students in Korea are generally 14 years old during 8th grade (Cho et al., 2012). For students who are gifted in math and/or science, the Korean government provides 23 special high schools nationwide where these students can study advanced topics in math or science (Cho et al., 2012). In Korea, science is taught in grades three to ten, and students have the opportunity to pursue gifted classes as they progress or supplementary classes if they are struggling (Cho et al., 2012). Students in Korea receive 102 instructional hours in science annually in grades three to seven and 136 instructional hours in grades eight to ten (Cho et al., 2012).

Teacher training and certification is very centralized at the national level in Korea. Teachers in Korea are subject specialists, usually starting in grade seven, and are licensed at the national level (Cho et al., 2012). Teachers are most often trained at four-year universities in Korea, where they study their subject area and take teacher education courses (Cho et al., 2012). In order to receive their certification, teachers must also complete a teaching practicum, pass a national examination, and pass an interview with officials (Cho et al., 2012). Once a teacher has reached three years of experience, they are required to complete 180 hours of professional development either at a university or through a distance learning program (Cho et al., 2012). Starting in 2000, teachers must also take part in an appraisal system that promotes and pays based on ability using nationally created criteria (Cho et al., 2012).

In summary, Korea is the product of many years of reforms and a constant cycle of evaluation with an eye toward improvement in all facets of the educational process. At a national level this involves constantly updating the curriculum, at a teacher level this includes constantly appraising teachers to make promotion and pay decisions, and at a student level this involves assessing students on a national examination each year. This continuous and holistic process is what drives Korea's success.

Chile

Chile is the only South American country in the sample for this study and represents a country that has low achievement but also low disparities in that achievement, so it is of interest because its low achievement appears, on the surface, to be evenly distributed.
This is in contrast with income in Chile, where GDP per capita is $19,100 but is very unequally distributed, as Chile's Gini index is the 14th highest in the world (CIA, 2013). There are just over 17 million people in Chile, and almost all of them speak Spanish and are considered white and non-indigenous (CIA, 2013). Chile is a middle-aged country: 21% of the population is age 14 or younger, with a youth dependency ratio of 30.2% and a median age of 33.3 years (CIA, 2013). With respect to schooling, Chile spends about 4.5% of its GDP on education (90th highest globally), has a literacy rate of 98.6%, and has a school life expectancy until age 15 (CIA, 2013).

Chile is of interest to researchers because it has a long history of reforms within education. Chile claims the oldest teacher training school in Latin America, dating to 1840, but the recent years of military dictatorship damaged the educational system in Chile (Davidson, 2013). In Chile, there are three different types of schools: municipal, subsidized private, and fully private, of which about 47% are municipal, 47% are subsidized private, and 5% are fully private (Gubler, 2012). In recent history, an increasing number of students have moved from the municipal system to the subsidized private system because private schools in Chile are generally considered to have better teachers and better-performing students (Davidson, 2013). Schools in Chile are funded nationally but administered locally. The funding formula is generally a fixed amount of money per student at the school; however, this amount can vary based on the SES of the school, with higher amounts going to low-SES schools in an attempt to equalize outcomes (Gubler, 2012). This indicates that Chile is aware of inequalities that are present in the country and is attempting to compensate for them in some way. Chile does use a national assessment system in grades four, eight, and ten to monitor progress, and while the results of these assessments are reported publicly at the school, regional, and national levels, they do not have consequences for student progression or teacher administrative decisions (Gubler, 2012).

Recently in Chile, concerns about low student achievement have been tied to a lack of teacher knowledge, among other factors (Davidson, 2013). Teachers in Chile are somewhat unique in that students are usually taught by generalist teachers up to grade eight; however, teachers for the later grades (five to eight) are prepared to be more specialized than teachers of the lower grades (Davidson, 2013). There are also now some schools where full specialist teachers teach grade seven or eight students (Gubler, 2012). Specialist teachers in Chile teach only the subjects in which they are certified, such as science (Gubler, 2012). There is a growing range of certification options available for teachers in Chile: in 2006, there were 16 public universities that certified teachers, 22 private ones, and 5 professional institutions (Davidson, 2013). In recent years, due to the low cost of starting a teacher preparation unit within a university, private universities have taken over preparing the majority of teacher candidates from the public universities (Davidson, 2013).
There are no national requirements for entry into a teacher education program or for teacher certification; these are largely left up to the universities, and the traditional certification route to teaching in Chile takes about five years (Gubler, 2012; Davidson, 2013). Teacher preparation programs in Chile consist of a combination of subject knowledge, pedagogy training, and field experiences (Davidson, 2013). For ongoing teacher monitoring, teachers must complete a nationally designed assessment, which is not public but allows teachers to earn more money if they score high enough on the evaluation (Gubler, 2012).

Eighth-grade students in Chile are generally age 13 and can attend one year of preschool and one year of kindergarten prior to starting grade one (Gubler, 2012). In 2010, an estimated 76% of students attended preschool and 90% attended kindergarten (Gubler, 2012). The science curriculum in Chile is organized by grades one to four, five to eight, and nine to twelve. The Ministry of Education is responsible for writing the curriculum and providing support materials such as textbooks, but it is up to local officials to oversee the implementation of the resources (Gubler, 2012). In grades one to four, students take a science course called "Natural, Social, and Cultural Comprehension of the Environment," which includes social science as well as natural science (Gubler, 2012). In grades five to eight, science is called "Comprehension of the Natural World" and is taught as an integrated natural science (Gubler, 2012). In grades one and two, students in Chile receive five hours of science instruction per week; in grades three and four it is six hours per week, and in grades five to eight it is four hours per week.

Chile represents a unique country case due to the combination of low achievement and low variation in that achievement. Despite Chile's long history of teacher training, overall student achievement has suffered due to inconsistent changes in teacher development, including an increasing shift of teacher training toward newer, private programs. There has also been a shift of students from the municipal sector to the private sector, where the perception is that students will attend better schools with better-prepared teachers. While Chile has large social inequalities, it appears that, with respect to education, these inequalities are not as pronounced despite variations in schooling and teachers.

High Variation Countries

Singapore

Singapore is the highest-achieving country for eighth grade science in the 2011 TIMSS study, and it was selected for this study on the basis of that high achievement combined with a high degree of variation in that achievement. Singapore will be of interest to study because it defies the 2009 PISA idea that high achievement goes together with equality in achievement. Singapore is also unique in that it is by far the smallest country in the sample geographically, which could be a reason for the larger variation in achievement. Singapore places a high value on education as a result of having very limited amounts of land and natural resources (Chin et al., 2012). Singapore is a city-state of about 5.5 million people located in Southeast Asia; the population is mostly Chinese in background but also has Malay and Indian minorities, so the official languages in Singapore include Mandarin, English, and Malay (CIA, 2013).
Singapore, like Korea and Finland, also has an older population, with a median age of 34 years, a youth dependency ratio of 21%, and 13% of the population under 15 years old (CIA, 2013). Singapore is one of the wealthiest places in the world, with a GDP per capita of $62,400, but it is also very unequal, as its Gini index ranks 32nd in the world (CIA, 2013). Singapore spends only 3% of its GDP on education but still has a 96% literacy rate (CIA, 2013).

Similar to Korea, Singapore has also undertaken a series of educational reforms in recent years to improve achievement. In 1997, Singapore started the "Thinking Schools, Learning Nation" (TSLN) program, which aimed to move from a more central system to one that is more flexible and local (Chin et al., 2012). This move towards a more flexible system created more choices for students in terms of their curriculum, their schooling, and their interests and abilities (Chin et al., 2012). The result is an education structure in Singapore that is very complex, with students having multiple pathways during the education process. For lower secondary school (grades seven to eight), students have the option of attending a mainstream government school or a specialized school if their examination scores merit consideration (Chin et al., 2012). Within their lower and upper secondary education, students can attend express, academic, technical, or prevocational schools (Chin et al., 2012). There are also special school options for students with special abilities in science that offer a more intensive and customized science curriculum (Chin et al., 2012). The aim behind all these choices is to help match student abilities and interests to their educational experiences (Chin et al., 2012). Furthermore, the TSLN also devolved more decision-making control toward teachers and school administrators (Chin et al., 2012). This reform, while creating more choices, could also have introduced a range of variation into the educational system with respect to achievement, which could be why Singapore has such high variation in its scores.

Primary education in Singapore starts with grade one and ends with grade six, after which students complete an exit examination. Students then attend a range of different lower secondary schools, generally lasting from grade seven to grade eight, depending on their examination scores (Chin et al., 2012). Students in Singapore are generally 14 years old in eighth grade (Mullis et al., 2012). Science is required for all students from grades three to eight in Singapore (Chin et al., 2012). Science in Singapore is organized heavily around cross-cutting ideas and themes rather than specific scientific disciplines. Primary science is focused on the themes of diversity, cycles, energy, interactions, and systems. Lower secondary science continues with these themes but adds the themes of models and systems, measurement, and science and technology (Chin et al., 2012). Continuing with this idea, Singapore has a national science framework that is centered around science as inquiry (Chin et al., 2012). This is supported by teaching scientific skills and processes, scientific ethics and attitudes, and scientific knowledge, understanding, and education. Students in Singapore learn about science in daily life, science in society, and science and the environment. In this framework, the student's role is to be the inquirer and the teacher's role is to be the leader of the inquiry (Chin et al., 2012).
Singapore is unique in that there is just a single institution responsible for the preparation of all the teachers in the country, the National Institute of Education (NIE) (Wong et al., 2013). There are many different pathways into teaching, but the majority of teachers in Singapore have a university degree in their discipline and then attend an additional one-year teacher training program at the NIE (Chin et al., 2012). This teacher training program is aligned with the national curriculum and is also designed to ease teachers into the classroom by providing them with mentoring support as well as a reduced workload (Chin et al., 2012). Teacher education students in Singapore take classes in academic content and curriculum, complete a teaching practicum, and must also complete a language enhancement course (Wong et al., 2013). The language portion is unique to Singapore and helps teachers develop their skills in English (the official language of instruction) by emphasizing how to use English for academic purposes (Wong et al., 2013). Once fully certified, teachers in Singapore are entitled to 100 hours of professional development each year, which can include specialized training or working towards an advanced degree (Chin et al., 2012).

At the primary level in Singapore, teachers are generalists; from grade seven, teachers are usually specialists who focus on only two subjects within their discipline, which coincides with students starting their lower secondary education (Chin et al., 2012). Secondary teachers in Singapore must hold a major in their subject discipline in order to teach (Chin et al., 2012). Teaching in Singapore is a highly competitive profession, as teachers are recruited from the top third of each graduating cohort, similar to Finland (Chin et al., 2012).

Singapore follows a pattern similar to Finland and Korea in what drives its success. Singapore has undertaken major reforms within its system that devolved some central control to a more local system. The teacher workforce in Singapore is highly competitive and professionalized, leading to a strong teaching force. Finally, the system has focused heavily on providing students choices in their educational aspirations with the aim of providing a more tailored educational experience for students.

Ghana

Ghana bears many of the hallmarks and challenges of a developing country: there is great demand for resources within Ghana but difficulty finding and effectively distributing enough resources for everyone. Ghana represents low achievement and high variation in that achievement within this work. This is an interesting combination to study because the averages indicate that there is overall low achievement but also inequality within that low achievement. Ghana is the youngest and most diverse country in the sample, as the approximately 26 million people in Ghana come from several different ethnic backgrounds and, as a result, speak about 80 different languages (CIA, 2013; Anumel, 2012). This diversity highlights one of the challenges Ghana faces in providing education. Almost 39% of the population in Ghana is age 14 or younger, and since the median age in Ghana is only 20 years, the youth dependency ratio is a staggering 65.6% (CIA, 2013).
GDP per capita in Ghana is very low at around $3,500 per person, and that small sum is also very unequally distributed, as Ghana is the 62nd most unequal country in the world as measured by its Gini index (CIA, 2013). Ghana spends one of the highest percentages of GDP on education in the world at 8.1%, but literacy remains low at 71.5% and school life expectancy is only 12 years (CIA, 2013). Despite the low achievement overall, Ghana has been improving its country-level achievement since the 2003 cycle of TIMSS.

Since 2004, Ghana has seen a surge in educational enrollments, which has stressed the infrastructure, as there is a need for more resources such as textbooks, qualified teachers, and updated facilities (Anumel, 2012). Basic education in Ghana is now compulsory and free of tuition fees (although there are still other fees involved) as a result of a new law. Level one basic education in Ghana includes two years of preschool, two years of kindergarten, six years of primary school (grades one to six), and three years of junior high school (grades seven to nine) (Anumel, 2012). Primary school in Ghana begins at age six, but eighth grade students in Ghana are the oldest in the sample, averaging 16 years old, which is likely due to late enrollment, grade repetition, and spending time away from school to work at home (Mullis et al., 2012). At the conclusion of basic education, students take a national examination that allows them to enter a secondary high school or a technical/vocational program depending on their scores (Anumel, 2012). In Ghana, science is taught as an integrated subject during basic education, although the content and skill emphasis shifts between lower primary, upper primary, and junior high grades (Anumel, 2012). In primary school grades one to three, students learn natural science that is mostly biology- and geology-based; then, in upper primary, students learn more basic concepts of chemistry and physics (Anumel, 2012). These concepts are then carried forward into junior high school (Anumel, 2012).

Despite having a shortage of resources, educational administration in Ghana is by far the most layered in the sample, with six different levels (listed from most central to most local): ministry, national, regional, district, circuit, and school (Anumel, 2012). The Ministry of Education in Ghana sets the educational policies to be carried out under the supervision of the national Ghana Education Service (GES), which is charged with implementing the policies and providing the guidelines for inspection and compliance at the lower levels (Anumel, 2012). These inspections are primarily the responsibility of the regional and district levels (Anumel, 2012). Ghana's most recent educational strategy plan, from 2010, emphasizes the importance of mathematics, science, and technology within education in Ghana (Anumel, 2012). The goal of this policy is to help accelerate the development of Ghana through a strong emphasis on these subjects when children attend school (Anumel, 2012). This policy has led to a number of changes within Ghana, including the appointment of a STEM coordinator in each of the 170 districts within Ghana, the creation of a National Science Education Unit, and the implementation of Science Resource Centers within each region to support teachers and give students greater access to science resources (Anumel, 2012). It is not clear what the impact of these policy changes has been to date.
Teachers in Ghana are trained through one of the 38 officially recognized Colleges of Education, and teaching requires a degree from a recognized two- or four-year program (Anumel, 2012). Unfortunately, Ghana suffers from a shortage of mathematics and science teachers, especially in rural areas, although the government is trying to offer incentives for teachers to work there (Anumel, 2012). Teachers in Ghana are usually generalists from grades one to six and therefore are not required to have any specialized mathematics or science training (Anumel, 2012). Teachers in grades seven to nine are expected to be subject specialists, but this is not always the case because of the overall shortage of science teachers, particularly in rural areas (Anumel, 2012). It is very clear from looking at the context within Ghana that many challenges facing the educational system are driving much of the low student achievement. There has been a huge influx of students in recent years as a result of abolishing school fees, there is a shortage of qualified math and science teachers, and there are not enough resources to support such a large percentage of the population in the school system.

United States

The United States represents one of the most decentralized educational systems in the sample. This decentralization leads to a great deal of variation in the educational systems by state and, to a lesser degree, by locality. Unlike some of the other countries in the sample, the United States is just now beginning to undertake major educational reforms at the federal level that aim to make some aspects of the educational process more centralized and standardized, specifically teacher qualifications and student achievement. The United States also represents one of the two "middle" countries selected for the sample, in that it is close to average in both achievement and the variation of that achievement on the 2011 TIMSS eighth grade science results. The United States is by far the largest country in the sample, with 318 million people from a range of different backgrounds. It is these differences in backgrounds that drive a majority of the differences in student achievement in the United States. Of these people, a small majority are considered White, with Hispanic and Black as large minority groups (CIA, 2013). English is the majority language, but about 11% of the population speaks Spanish as their primary language (CIA, 2013). The United States is also a very wealthy country, with a GDP per capita of $53,000, but it is also a very unequal country, with a Gini index that ranks 41st in the world (CIA, 2013). The population of the United States is also older, with a median age of 38 years, a youth dependency ratio of 30%, and 19% of the population aged 14 years or younger (CIA, 2013). The United States spends 5.4% of its GDP on education, has almost universal literacy at 99%, and has a school life expectancy of 17 years (CIA, 2013). It is estimated that only about 10 percent of educational funding each year comes from the federal government (Sen et al., 2012). Within each state, schools are generally administered and operated at a local level by an elected or appointed school board or official, but that does not mean the federal government has no role.
One main role of the federal government in education in the United States is providing supplemental funding to help poor families/communities or students with disabilities, which mainly comes from the Elementary and Secondary Education Act (ESEA), authorized in the 1960s (Sen et al., 2012). In 2002 under President George W. Bush, ESEA was reauthorized as "No Child Left Behind" (NCLB), which made federal funding conditional on educational performance and teacher qualifications; one such requirement was that all teachers in grade six and up needed to be specialists in the subjects they teach (Sen et al., 2012). NCLB also required states to regularly test students for the purposes of measuring school, district, and state progress in learning. States not meeting these requirements were subject to penalties, including loss of federal money and additional oversight. In 2009 under President Obama, NCLB was modified by providing additional money in a "Race to the Top" fund. This fund provides additional money to states that meet a range of criteria, including using more student data to improve performance, focusing on STEM fields, and providing assistance to low-performing schools (Sen et al., 2012). There is no national science curriculum in the United States. However, a group of stakeholders has recently created the Next Generation Science Standards (NGSS), which are being offered to each state for adoption with the goal that widespread adoption will, in turn, create a national science framework. The NGSS structure science content around "Disciplinary Core Ideas" that carry over across grade levels and "Cross-Cutting Concepts" that are important to all science subjects (NGSS, 2014; for more detailed information on NGSS, visit http://nextgenscience.org). There are seven crosscutting concepts: patterns; cause and effect; scale, proportion, and quantity; systems and system models; energy and matter; structure and function; and stability and change (NGSS, 2014). As previously noted, the school structure varies by state, but generally students attend elementary school from grades one to five, middle or junior high school from grades six to eight, and high school from grades nine to twelve. Eighth grade students in the United States are usually 14 years of age and have had about 8-10 years of schooling, depending on their participation in preschool and kindergarten. Since states are responsible for their own curricula, it is difficult to say exactly which topics are covered in science by eighth grade, but students will generally have an integrated science course covering four main areas: earth science, biology, chemistry, and physics (Sen et al., 2012). Students in the United States generally start learning science as soon as they enter primary school; however, the hours dedicated to science vary largely by state. Teacher certification is also largely a state responsibility and varies a great deal between states (Youngs & Grogan, 2013). This is also true of the way teachers are trained: there are many differences between states, which generally partner with universities in their jurisdiction for teacher training (Youngs & Grogan, 2013).
The "No Child Left Behind" act of 2002 required all teachers to be "highly qualified" within four years of the passage of the law, meaning that an individual has passed an examination and holds a background in the subject they are teaching (Youngs & Grogan, 2013). It is also possible for teachers to obtain certification through a "non-traditional" path, which usually means that the individual possesses a four-year university degree and receives their teacher training somewhere outside a traditional university setting. Youngs and Grogan (2013) calculate that 77 percent of districts require full state licensure to teach and that 66 percent require graduation from a state-approved teacher preparation program. The United States has no national requirements around teacher practicum or field experiences (Youngs & Grogan, 2013). Within the United States educational system, two main patterns emerge: diversity and decentralization. The United States has a very diverse population with respect to individual backgrounds, which drives some of the inequality in achievement in the system. The standards, teacher training, teacher certification, and hours of instruction vary largely by state. This creates several parallel systems across the country, as each state generally operates its own educational system. Recent reforms have attempted to centralize some aspects of education, specifically teacher certification and baseline student knowledge, with mixed results. The next chapter will detail the methods used to analyze the countries just reviewed.

Chapter 4 - Methods

This chapter describes the process used to create the data, select the variables, and conduct the analysis with enough specificity that the reader is able to replicate the process. The appendix contains the syntax and raw output mentioned throughout this chapter and the next for supporting reference. Any publicly available documentation such as guides, reports, or background information is not included in the appendix but can be downloaded from the TIMSS website.

Creating the Dataset

The TIMSS data are publicly available for download in raw form from the IEA website, as is the supporting documentation, including user's guides, a country encyclopedia, questionnaires, frameworks, and technical reports. All supporting documentation has been utilized in some capacity for this analysis. The raw data can be merged into a usable dataset manually with any statistics program or by using the free IDB data analyzer provided by the IEA, which also comes with a user manual to ensure proper use (TIMSS 2011 User Guide for the International Database; Foy et al., 2013). The IDB analyzer allows a user to merge files or do basic descriptive analysis using any IEA dataset. This study used the IDB tool to merge the data to avoid any errors that might come from attempting a manual merge. Using the tool ensures that the data are merged with all the correct variables, weights, and plausible values as designed by the IEA. The datasets were then output as SPSS files and converted into Stata files using the StatTransfer tool, again to ensure no errors or loss of data during the conversion. A full eighth grade science dataset was made for students, teachers, and schools separately, as well as a large dataset that included all three groups. Initially, all 45 participating countries were included; however, this was narrowed down to six countries based on the analysis detailed in the previous sections.
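After conversion, a quick structural check of each file can be run in Stata. The sketch below is illustrative only; the file name and ID variables are assumptions rather than the actual TIMSS file names.

    * Hypothetical post-conversion check; file and variable names are assumptions.
    use student_chl.dta, clear
    describe, short                      // confirm the variable count survived the conversion
    isid idcntry idstud                  // each student record should appear exactly once
    duplicates report idcntry idschool idclass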
Factor Analysis and Variable Selection

Variables had to be selected at the student, teacher, and school levels to help explain achievement differences. To do this, variables were selected based on the previous literature ("essential" variables commonly understood to impact student achievement) and on the results of a factor analysis of the variables at each of the following levels of the dataset: student, teacher/classroom, and school. At the student level, the main variable that underpins this study is socio-economic status (SES), so it is important to create as strong a measure as possible to represent this for each student. Previous studies frequently use a single proxy variable, such as the number of books in the child's home, to attempt to capture SES; however, this has become contested in recent years, to the extent that the National Center for Education Statistics (NCES) in the United States commissioned a panel of experts to recommend the best way to capture SES (Cowan et al., 2012). The panel recommended creating a measure that is as comprehensive as possible, in that it would use as much available information as possible in place of a single proxy measure and also take into account the population being studied (Cowan et al., 2012). The SES measure for this study was created in this vein; however, it should be noted that one weakness of TIMSS compared to other international education studies like PISA is that its SES data are not as comprehensive. PISA provides an SES composite variable in its dataset called the "index of economic, social, and cultural status" (ESCS) (OECD, 2009). In order to approximate the type of measure used in PISA and follow the recommendations of the NCES panel, four variables from TIMSS were standardized and combined into a single SES measure: number of books in the home, highest level of father's education, highest level of mother's education, and a composite variable provided in the dataset called "Home Study Supports." The resulting measure takes into account all the available information within the TIMSS dataset to make as comprehensive an SES measure as possible. The Stata syntax for this process is available in the appendix in the "student cleaning" section. Another student-level variable considered essential is gender, given that gender disparities are well noted worldwide and especially prevalent within some countries with respect to access and opportunities in education (UNESCO, 2003). Furthermore, gender disparities are particularly exaggerated within the sciences in favor of boys with respect to achievement and affect (Lavonen & Laaksonen, 2009). Beyond these two essentials at the student level, the remaining variables were selected through a factor analysis supported by the literature, with the results then fit into the contextual frameworks used by TIMSS. First, it is clear from the literature that student attitudinal and confidence measures are very important, so these made up a large part of the factor analysis; four of the six factors capture them. These factors (new variables) are: interest and enjoyment of learning science, value of science, positive science affect, and negative science affect. The variables that loaded on each of these factors are listed below along with the corresponding alpha values.
Another factor that is included is called "bullying," which has received a great deal of attention in the United States in recent years. This variable mainly captures negative student experiences with their peers. The final variable created from this analysis is called "parent involvement," which looks at how frequently students and parents interact regarding the student's schooling. The results of the full factor analysis can be seen in more detail in appendix 2.

Factor 1 – Interest and Enjoyment of Science
• 17A – I enjoy learning science
• 17C – I read about science in my spare time
• 17E – I learn many interesting things in science
• 17F – I like science
• 18C – My teacher is easy to understand
• 18D – I am interested in what my teacher says
• 18E – My teacher gives me interesting things to do
Alpha = .8967

Factor 2 – Value of Science
• 17G – It is important to do well in science
• 19J – I think learning science will help me in my daily life
• 19K – I need science to learn other school subjects
• 19L – I need to do well in science to get into the university of my choice
• 19M – I need to do well in science to get the job I want
• 19N – I would like a job that involves using science
Alpha = .8835

Factor 3 – Positive Science Affect
• 18A – I know what my teacher expects me to do
• 19A – I usually do well in science
• 19D – I learn things quickly in science
• 19F – I am good at working out different science problems
• 19G – My teacher thinks I can do well in science with difficult materials
• 19H – My teacher tells me I am good at science
Alpha = .885

Factor 4 – Negative Science Affect
• 17D – Science is boring
• 19B – Science is more difficult for me than some of my classmates
• 19C – Science is not one of my strengths
• 19E – Science makes me confused and nervous
• 19I – Science is harder for me than any other subject
Alpha = .8292

Factor 5 – Bullied
• 13A – I was made fun of or called names
• 13B – I was left out of games or activities by other students
• 13C – Someone spread lies about me
• 13D – Something was stolen from me
• 13E – I was hit or hurt by other students
• 13F – I was made to do things I didn't want to by other students
Alpha = .7674

Factor 6 – Parent Involvement
• 11A – My parents ask me what I am learning in school
• 11B – I talk about my schoolwork with my parents
• 11C – My parents make sure that I set aside time for my homework
• 11D – My parents check if I do my homework
Alpha = .7860

Other variables of interest (not grouped)
• SES (derived)
• Gender (girl = 1)

However, not surprisingly, "positive science affect," "interest and enjoyment of science," and "value of science" showed high levels of inter-correlation (>.5) in the bivariate analysis of the factors. This could impact the stability of the regression models in that a slight change in one factor could lead to changes in the other factors as well, which might influence the whole model. Given that they had the smallest factor loadings and measure concepts very similar to interest and enjoyment of science, positive science affect and value of science are excluded from the final model. Appendix 2 shows the results of the correlations. Overall, this leaves six final variables at the student level: gender, SES, interest and enjoyment of science, negative science affect, whether a student was bullied, and the level of parent involvement.
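To illustrate the SES composite described earlier, a minimal Stata sketch of the standardize-and-combine approach is shown below. The variable names are placeholders rather than the actual TIMSS item codes, and the exact syntax used is in the appendix.

    * Minimal sketch of the SES composite; books, mom_ed, dad_ed, and home_support
    * are placeholder names, not TIMSS variable codes.
    foreach v of varlist books mom_ed dad_ed home_support {
        egen z_`v' = std(`v')            // standardize each component
    }
    egen ses = rowmean(z_books z_mom_ed z_dad_ed z_home_support)
    label variable ses "Composite SES from standardized components"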
The three derived variables (interest and enjoyment of science, bullying, and parent involvement) are also in alignment with the contextual frameworks of the TIMSS study in that they are measures of parent involvement, student attitudes toward science, and school climate. This alignment is important so that this analysis takes into account the framework TIMSS uses to create its variables. Next, the same process was repeated for teachers by selecting essential variables based on the literature and then creating composite variables that match the TIMSS frameworks and the results of the factor analysis. Frequently cited variables along the lines of "teacher quality" include the education level of the teacher and how many years they have been teaching, with the assumption that as these increase, so does the ability of the teacher. The number of hours of instruction is another often-used variable to help explain some classroom differences. Finally, a simple dummy variable indicating whether a teacher is a science major will be included to see if this influences student outcomes to any degree. These four variables will be included as the "essential" variables to explain classroom and teacher differences. The factor analysis results for the teacher variables are shown in appendix 2, where six main factors were identified; however, the explained variance proportions of these six summed to more than one, so the model will be reduced by one factor. The identified composite variables, sub-variables, and alpha scores are as follows:

Factor 1 – Teacher Cooperation (How often do you have the following interactions with other teachers?)
• 10A – Discuss how to teach a particular topic
• 10B – Collaborate in planning and preparing instructional materials
• 10C – Share what I have learned about my teaching experiences
• 10E – Work together to try new ideas
Alpha = .81

Factor 2 – Classroom Learning Limitations (In your view, to what extent do the following limit how you teach this class?)
• 15A – Students lack prerequisite knowledge
• 15B – Students suffering from lack of basic nutrition
• 15C – Students suffering from not enough sleep
• 15D – Students with special needs
• 15E – Disruptive students
• 15F – Uninterested students
Alpha = .7545

Factor 3 – Inquiry Teaching (In teaching science to the students in this class, how often do you usually ask them to do the following?)
• 19A – Observe natural phenomena and describe what they see
• 19B – Watch me demonstrate an experiment or investigation
• 19C – Design or plan experiments or investigations
• 19D – Conduct experiments or investigations
Alpha = .7669

Factor 4 – Teacher Quality
• 04 – Level of teacher education
• 12 – Class size
Alpha = .6606

Factor 5 – Expectations (How would you characterize the following within your school?)
• 6D – Teachers' expectations for student achievement
• 6E – Parental support for student achievement
• 6H – Students' desire to do well in school
Alpha = .6878

Factor 6 – Teachers Supporting Students (How often do you do the following in this class?)
• 14D – Encourage all students to improve their performance
• 14E – Praise students for good effort
Alpha = .7406

Variables not grouped, to consider
• 17 – Hours of science instruction
• 01 – Teacher years of teaching
• Science major

Of these six factors, the "teacher quality" variable will be removed because it has the lowest reliability and because the level of teacher education has already been identified as a variable to include in the model.
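For reference, the factor extraction and reliability checks reported above can be run in Stata along the following lines. The item names (q10a through q10e) are hypothetical stand-ins for the questionnaire items, and the principal-component factoring with a varimax rotation shown here is an assumption rather than necessarily the exact options used.

    * Illustrative sketch only; q10a-q10e stand in for the questionnaire items.
    factor q10a q10b q10c q10e, pcf      // principal-component factoring
    rotate, varimax                      // rotate for interpretability
    alpha q10a q10b q10c q10e, item      // Cronbach's alpha for the factor's items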
Removing teacher quality leaves five factor variables to be combined with the four variables already identified, for nine teacher-level variables in total. These nine variables also align with the contextual frameworks used within TIMSS by capturing the organization of the education system, the school climate for learning, the teaching staff, the school resources, teacher education and development, teacher characteristics, classroom characteristics, instructional activities, and assessment. Again, it is important to keep the variables of the analysis in alignment with the TIMSS frameworks to help enhance validity. The correlations shown in appendix 2 also indicate that, unlike the student-level case, no variables need to be omitted for being too highly correlated with other independent variables. Finally, the school-level analysis was conducted to identify any relevant school-level predictors of importance. There are two main variables to include based on the literature: the level of urbanization and the affluence of the school. Urban/rural differences are important to consider in that, in many countries, the quality and amount of resources available depend on the location of the school. In the United States, urban schools are frequently the lowest performing compared to suburban schools, but in many developing countries urban schools are the best schools. The affluence of the school is important for accounting for the SES of the school or community, in that poorer students are generally more likely to go to school with other poor children, given the affluence of the surrounding community. This variable will also help to highlight any peer effects that might be present at a school level, whether positive or negative. The factor analysis of the school variables identified two main factors to be included, as follows, with the full analysis available in appendix 2:

Factor 1 – Negative Attitude about Schooling (To what degree is each of the following a problem among students at your school?)
• 12A – Arriving late at school
• 12B – Absenteeism
Alpha = .7893

Factor 2 – School/Parent Contact (How often does your school do the following for parents concerning individual students?)
• 10AA – Inform parents about their child's progress
• 10CB – Inform parents about school accomplishments
• 10CG – Organize workshops or seminars for parents on learning or pedagogical issues
Alpha = .6095

Additional variables to consider that are not grouped
• 3A – Approximately what percentage of students come from economically disadvantaged backgrounds
• 05B – Urban/rural

This gives the analysis a total of four school variables to be included in the models. The negative attitude about schooling variable helps to capture the framework components of student attitudes and the school climate for learning, while the school/parent contact variable helps to identify parent involvement with the school. The school SES and school location variables address the framework components concerning school characteristics and school resources. The correlations (appendix 2) show no concern with respect to variables being too highly related, so all the variables are kept. Next, factor scores were created based on the weighting of each initial variable that comprised each factor. Using the factor weights, a percentage was calculated for how much each variable contributes to the composite factor variable.
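In practice this amounts to a weighted average whose weights sum to one, which keeps each composite on the original response scale. A minimal sketch follows, with hypothetical item names and made-up weights standing in for the percentages derived from the factor loadings.

    * Illustrative weighted composite that stays on the original 1-4 Likert scale;
    * item names and weights are placeholders.
    local w1 = 0.30
    local w2 = 0.25
    local w3 = 0.25
    local w4 = 0.20
    gen bullied = `w1'*q13a + `w2'*q13b + `w3'*q13c + `w4'*q13d   // weights sum to 1
    summarize bullied                                              // range should remain 1 to 4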
The weights were applied in this way to preserve the one-to-four scaling that comes from the Likert scales used in the TIMSS questionnaires. This way, the composite variables' raw values can be interpreted in relation to the scales used in the questionnaires. Appendix 2 shows the results of this process for each of the composite variables created by the factor analysis, accounting for how much each variable contributes to the larger composite variable.

Cleaning and Descriptive Analysis

After the creation of the composite variables, the dataset needed to be cleaned and rescaled so that the outputs could be interpreted in a logical way and to prevent any errors from incorrect codes. This included steps like making gender a 0/1 variable instead of the 1/2 variable in the original dataset, or rescaling questions so that "never" is coded as 1 instead of 4 so that responses move in ascending order of frequency. The full Stata syntax for this process is available in appendix 1 for verification as three separate files, one for each group. The descriptive analysis was conducted using Stata 13.1 and the IEA IDB analyzer. Table 4.1 shows the initial descriptive statistics.

Table 4.1 – Descriptive Statistics by Country

Chile (N = 5835)
Variable                    N      Mean     SD       Min      Max
Sci Achievement             5835   461.47   73.38    211.27   727.95
SES                         4968   0.01     1.84     -4.92    3.75
Girl                        5819   0.54     0.5      0        1
Interest and Enjoyment      5660   3.2      0.67     1        4
Negative Science Affect     5670   2.09     0.73     1        4
Bullied                     5673   1.56     0.52     1        4
Parent Involvement          5714   3.14     0.73     1        4

Finland (N = 4266)
Variable                    N      Mean     SD       Min      Max
Sci Achievement             4266   552.35   65.17    251.56   799.23
SES                         3020   0.04     1.73     -6.46    2.68
Girl                        4227   0.49     0.5      0        1
Interest and Enjoyment      4214   2.5      0.57     1        4
Negative Science Affect     4212   2.13     0.54     1        4
Bullied                     4166   1.44     0.49     4        4
Parent Involvement          4205   2.47     0.77     1        4

Ghana (N = 7323)
Variable                    N      Mean     SD       Min      Max
Sci Achievement             7323   305.99   112.32   5        697.31
SES                         6059   0.08     1.85     -3.05    5.12
Girl                        7294   0.48     0.5      0        1
Interest and Enjoyment      6501   3.62     0.46     1        4
Negative Science Affect     6464   2.23     0.87     1        4
Bullied                     6408   2.2      0.72     1        4
Parent Involvement          6684   3.22     0.82     1        4

Korea (N = 5166)
Variable                    N      Mean     SD       Min      Max
Sci Achievement             5166   560.16   77.3     249.02   826.13
SES                         4353   0.05     1.78     -6.7     2.21
Girl                        5165   0.52     0.5      0        1
Interest and Enjoyment      5146   2.44     0.67     1        4
Negative Science Affect     5127   2.37     0.67     1        4
Bullied                     5136   1.51     0.53     1        4
Parent Involvement          5157   2.28     0.79     1        4

Singapore (N = 5927)
Variable                    N      Mean     SD       Min      Max
Sci Achievement             5927   589.99   96.73    233.81   876.37
SES                         4594   0.01     1.83     -5.04    3.23
Girl                        5924   0.49     0.5      0        1
Interest and Enjoyment      5888   3.09     0.65     1        4
Negative Science Affect     5871   2.19     0.75     1        4
Bullied                     5865   1.7      0.61     1        4
Parent Involvement          5906   2.59     0.84     1        4

USA (N = 10477)
Variable                    N      Mean     SD       Min      Max
Sci Achievement             10477  499.1    84.38    219.22   801.51
SES                         7881   0.06     1.83     -5.61    2.66
Girl                        10439  0.51     0.5      0        1
Interest and Enjoyment      10108  2.93     0.77     1        4
Negative Science Affect     10075  2.06     0.8      1        4
Bullied                     10178  1.57     0.59     1        4
Parent Involvement          10314  0.06     1.83     -5.61    2.66

Source: TIMSS 2011 8th Grade Science Data

Setting up TIMSS for Proper Analysis and Reporting

Due to the sampling methods used in TIMSS and the use of plausible values as achievement scores, a conventional analysis cannot be run without accounting for these aspects of the design whenever achievement data are used. TIMSS uses Item Response Theory (IRT) to impute a full test score even though each student completes only a subset of the whole exam. As a result, the TIMSS dataset provides five plausible values as achievement outcomes for science overall, for specific science subjects, and for specific science skills.
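As the following paragraphs describe, every achievement model has to respect this design: the data are survey-set with the jackknife structure, and each model is run once per plausible value. A minimal Stata sketch of that workflow is shown below; the variable names (jkzone, jkrep, totwgt, bsssci01-bsssci05, and the composite predictors) and the exact svyset specification are assumptions for illustration rather than the precise setup used, which is given in appendix 1.

    * Sketch of the survey setup and plausible-value workflow; names are assumed.
    svyset jkrep [pweight = totwgt], strata(jkzone) vce(jackknife)
    forvalues p = 1/5 {
        svy jackknife: regress bsssci0`p' ses girl intenj negsci bully parent
    }
    * The five sets of coefficients are then averaged outside Stata for reporting.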
In order to obtain the most accurate achievement estimates, separate regressions need to be run for each of the five plausible values and then the results averaged together for a final output (Martin and Mullis, 2012). Each regression output was put into an Excel spreadsheet, and the "average" function was used to compute the average of the five regressions for the tables. Additionally, the proper weighting must be used to ensure that the representativeness of the sampling is preserved and standard errors are accurately calculated. At the student level, TIMSS provides three different student weights: a total weight, which sums to the national population of a country; a house weight, which sums to the national student sample size; and a senate weight, which gives each country equal emphasis regardless of size (Martin and Mullis, 2012). In this analysis, the total weight was used in all cases because the countries are analyzed individually, never within the same regression. Teacher and school weights are provided for use when just the teacher or school data are used. Finally, TIMSS uses the Jackknife Repeated Replication technique to provide unbiased estimates of the sampling errors of means, totals, and percentages (Martin and Mullis, 2012). This method uses schools as the primary sampling unit (PSU) and replicates the selection of schools based on the sampling design of the study, in which two schools are assigned to each of the 75 zones within a country for a total sample of 150 schools per country. The teachers and students are then embedded within this sample of schools. Schools that are selected but do not participate are replaced with backup schools, and the degree to which this takes place is reported by TIMSS in the report appendix. In order to account for all these factors, which are critical for proper analysis and reporting, Stata offers a survey mode for analyzing data structured like TIMSS. Guides provided by StataCorp (2013) and UCLA statistical consulting (2013) were used to ensure proper specification of each of the models. Prior to conducting any analysis, the user must tell Stata the parameters of the data using the "svyset" command. Here Stata was told to use jackknife sampling by indicating the variables that specify this in the TIMSS dataset and also which weight to use, in this case the total weight variable for students. Full syntax is available in appendix 1. Once this has been done, Stata can be used as it normally would be by putting the survey prefix before any estimation command, for example: "svy jackknife: reg DV IV". This survey setup process was conducted three separate times, once for the student data, once for the teacher data, and once for the school data, each time after the cleaning was completed.

Missing Data

Given the size and scope of the TIMSS dataset, there are missing data that must be accounted for. There is no ideal method for handling missing data, and the methods that should be used depend largely on the situation and the data. Allison (2001) provides an excellent discussion of the choices researchers have when dealing with missing data. One choice that needs to be made is whether to ignore or delete missing cases or whether to attempt to impute them using various methods. Both choices involve making assumptions about the data that must be supported in some fashion. This work takes the position that a variable missing more than ten percent of its data should be noted when making interpretations.
Table 4.2 below shows the amount of missing data for each variable by country for the overall six-country sample. From this table, four different blocks with large amounts of missing data emerge: students in Ghana, teachers in the United States, schools in the United States, and schools in Chile. In Ghana, four student variables have missing response rates above ten percent: SES, interest and enjoyment of science, negative science affect, and bullying. In the United States, the missing percentage from teachers ranges from thirty to forty-five percent, so it is clear that a large number of teachers in the United States did not complete their questionnaire. The schools in the United States also have missing percentages at or above ten percent. Also at a school level, Chile had missing responses at or just above ten percent, so these estimates will need to be interpreted carefully as well. Since the analysis is done at a country level, these four instances of high amounts of missing data will be noted when discussing the results later on.

Table 4.2 – Percentage of Missing Data by Country for Model Variables

Variable                       Chile    Finland   Ghana    Korea    Singapore   United States
Gender                         0.27%    0.91%     0.40%    0.02%    0.05%       0.36%
SES                            9.12%    3.38%     14.04%   0.66%    0.47%       2.08%
Interest/Enjoyment             3.00%    1.22%     11.22%   0.39%    0.66%       3.52%
Neg. Sci Affect                2.83%    1.27%     11.73%   0.75%    0.94%       3.84%
Bully                          2.78%    2.34%     12.49%   0.58%    1.05%       2.85%
Parent Inv.                    2.07%    1.43%     8.73%    0.17%    0.35%       1.56%
Teacher Experience             6.41%    5.45%     7.70%    4.48%    1.20%       30.74%
Teacher Education              5.35%    5.00%     3.95%    5.05%    1.20%       30.43%
Hours Science                  7.56%    9.72%     7.48%    5.95%    2.13%       45.69%
Science Major                  7.97%    5.90%     7.20%    4.73%    2.73%       32.45%
Teacher Cooperation            6.34%    6.02%     2.65%    4.48%    3.04%       31.33%
Limitations                    7.63%    1.03%     3.56%    5.76%    2.99%       43.44%
Inquiry Teaching               6.53%    9.82%     2.02%    6.13%    3.04%       42.95%
Expect for Achievement         5.54%    5.54%     2.83%    5.04%    2.31%       30.78%
Teacher Support                6.02%    9.29%     2.95%    4.48%    10.71%      42.59%
Per. Econ. Disadv. Students    14.93%   15.78%    0.76%    2.71%    1.48%       2.95%
Urban/Rural School             10.52%   7.92%     0.37%    2.09%    0.00%       10.50%
Neg. School Climate            10.75%   8.65%     0.42%    4.14%    4.20%       11.59%
School Parent Involvement      12.00%   8.23%     2.58%    2.09%    4.79%       12.35%

Notes – Source: TIMSS 2011 8th Grade Science Data

Given that the missing data have now been described in detail, some decisions were made about how to handle them. First, Allison (2001) says a researcher should make assumptions about whether the data are missing completely at random (MCAR) or missing at random (MAR). The MCAR assumption, that there is no pattern to the missing data, is very strong (Allison, 2001) and cannot be satisfied in the four cases of concern identified above, since the data are missing in blocks (students, teachers, schools). However, the assumption for MAR is much weaker; in this case the researcher must assume that the missing data can be explained using the other information in the data (Allison, 2001). In this instance, the data are missing in related blocks (students, teachers, schools) within these countries. To test whether the missing data might be related to science achievement, a correlation was run between achievement in each country and an indicator of missingness for each variable. In all of the cases of concern, there were no correlations between the achievement score and whether a variable was missing that were both significant and large (>.10).
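A check of this kind can be sketched in Stata as follows. The variable names are placeholders, and the simple unweighted correlation shown here is only illustrative of the approach.

    * Illustrative missingness check; ses and bsssci01 are placeholder names.
    gen byte miss_ses = missing(ses)
    pwcorr bsssci01 miss_ses, sig        // correlation between achievement and missingness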
In all cases, the pattern appears to reflect respondents within each country not taking care to answer the surveys completely, rather than a certain achievement group being significantly more likely not to respond. Given that the data are assumed to be MAR for these large cases, the choice now needs to be made whether to delete or impute missing data. Given the complex sampling that goes into creating the TIMSS data, attempting to impute data properly is very risky and may make the problem worse, so it was decided not to impute. This leaves deletion, for which there are two options: listwise deletion or pairwise deletion (Allison, 2001). Listwise deletion would delete any case with missing data, and while this would leave a dataset with no missing values, it would also reduce the sample size the most and give the largest standard errors. For this study, pairwise deletion was used, which only eliminates a case when the data needed for a specific analysis are missing. Allison (2001) states that this method lends itself well to regression, and pairwise deletion also makes sense here in that it uses as much of the data as possible.

Baseline Regressions

Now that the dataset has been cleaned and specified correctly, the analysis can begin by running multivariate regressions using the retained variables to get a baseline indication of the importance of each variable at each level and within each country, controlling for the other retained independent variables. It is important to note that the regressions were run separately for each of the six countries, and within each country three separate regressions were run, one for each of the three groups (students, teachers, schools), for a total of eighteen separate regressions. Also, because science achievement is the outcome, recall that the five plausible values have to be used and then averaged together for a final reported outcome. Therefore, within each of the eighteen regressions, five "sub-regressions" were run (one for each plausible value) and then averaged together. The averaged regression outputs will be reported in the next chapter, and the full Stata syntax for the regressions and sub-regressions is available in the appendix. First are the student regressions, which contain the six variables of gender, SES, interest and enjoyment of science, negative science affect, whether a student was bullied, and the level of parent involvement. The full regression equation for students in each country and plausible value is as follows:

SCIENCEACHIEVEMENT(PV1-5)i = α + β1SESi + β2FEMALEi + β3INTERESTENJOYi + β4NEGSCIAFFECTi + β5BULLIEDi + β6PARENTi + εi

where science achievement is the science achievement of student i. There is no notation for country because a separate regression is run for each country. Alpha is the intercept term and epsilon represents the random error term of student achievement. Beta 1 represents the predicted change in achievement based on a one unit change in SES. Beta 2 represents the predicted change in achievement if the student is female (1 = female, 0 = male). Beta 3 represents the predicted change in achievement based on a one unit change in the level of interest and enjoyment a student has towards science. Beta 4 represents the predicted change in achievement based on a one unit change in the degree to which a student has a negative affect about science.
Beta 5 represents the predicted change in achievement based on a one unit change in the extent to which a student reports being bullied at school. Beta 6 represents the predicted change in achievement based on a one unit change in the degree to which a parent is involved in their child's studies. Next, the same process was repeated using the teacher variables, with student science achievement as the outcome and teacher characteristics as predictors. The equations were again separated by country and utilized the plausible values for achievement. The teacher variables were: years of teaching, amount of teacher education, whether the teacher was a science major, hours of science instruction, how much teachers cooperate, limitations of the classroom, how often the teacher uses inquiry-based teaching, expectations for students, and supporting students. The full teacher regression for each country is as follows:

SCIENCEACHIEVEMENT(PV1-5)ij = α + β1EXPERIENCEij + β2EDUCATIONij + β3SCIENCEMAJORij + β4HOURSSCIENCEij + β5COOPERATIONij + β6LIMITATIONSij + β7INQUIRYij + β8STUEXPECTATIONSij + β9STUSUPPORTij + εij

where science achievement is again the science achievement of student i in classroom j, alpha is the intercept term, and epsilon is the random error term of student science achievement. Beta 1 represents the predicted change in achievement based on a one unit change in the number of years a teacher has been teaching. Beta 2 represents the predicted change in achievement based on a one unit change in the number of years of education a teacher has. Beta 3 represents the predicted change in achievement based on whether a teacher was a science major (1 = science major, 0 = otherwise). Beta 4 represents the predicted change in achievement based on a one unit change in the number of hours of science taught per week in the classroom. Beta 5 represents the predicted change in achievement based on a one unit change in the degree to which a teacher reports cooperating with other teachers. Beta 6 represents the predicted change in achievement based on a one unit change in the limitations to learning present within a classroom. Beta 7 represents the predicted change in achievement based on a one unit change in the degree to which a teacher uses inquiry-based teaching. Beta 8 represents the predicted change in achievement based on a one unit change in the degree to which there are high expectations for students. Beta 9 represents the predicted change in achievement based on a one unit change in the degree to which teachers report supporting students. Finally, this process was repeated for the school-level analysis following the same outline as the student and teacher analyses. Student science achievement continues to be the outcome, and there are four school-level variables: percent of economically disadvantaged students, how urban or rural the school is, the presence of a negative school climate, and the amount of parent/school contact. The full regression equation appears as follows:

SCIENCEACHIEVEMENT(PV1-5)ij = α + β1PERCENTDISADVANTAGEDij + β2URBANij + β3NEGATIVECLIMATEij + β4PARENTSCHOOLij + εij

where student science achievement is the outcome for student i in school j, alpha is the intercept term, and epsilon is the random error term.
Beta 1 represents the predicted change in achievement based on a one unit change in the percentage of disadvantaged students (represented categorically) within a school, where a higher value indicates a higher percentage of disadvantaged students. Beta 2 represents the predicted change in achievement based on a one unit change in how urban a school is, with a higher value indicating a more urban school and a lower value indicating a more rural school. Beta 3 represents the predicted change in achievement based on a one unit change in the degree to which a negative school climate is present. Beta 4 represents the predicted change in achievement based on a one unit change in the degree to which parents and the school are in contact. The next chapter reports the findings of these regressions.

Hierarchical Linear Modeling

The next stage of the analysis was done using Hierarchical Linear Modeling (HLM) to test for, and account for, the nested structure of the data, in which students are nested within classrooms and schools. This analysis drew on the work of Raudenbush and Bryk (2002) for specifying and analyzing hierarchical linear models, as well as a summer 2013 workshop by Raudenbush and Bryk that the author attended. As with the previous analyses, the same hierarchical model was run for each country to allow for comparisons across countries. The outcome variable is science achievement; students are considered at level one of the model, and teachers and schools are considered at level two. First, an unconditional model is run to determine whether the TIMSS data show enough nesting to suggest that HLM is appropriate for the analysis. Then a full model is run that treats all the level two predictors as fixed and includes all the student, teacher, and school variables. Stata 13 has some severe limitations with respect to running HLM, and these limitations are magnified with survey data such as TIMSS. Therefore, the HLM 7.1 software by Raudenbush, Bryk, and Congdon (2013) was used to conduct this part of the analysis. HLM 7.1 allows for much greater customization of the models and can simultaneously account for plausible values as the outcome and for the sampling weights without having to run multiple models, as is necessary in Stata. The drawback is that the HLM 7.1 software is very sensitive to the data input for analysis, so some additional steps need to be taken to ensure the analysis is properly set up. All the data must be cleaned and organized in another statistics program prior to importing them into HLM 7.1. HLM 7.1 can read Stata files; however, it frequently produces unexplained errors when doing so, so this study instead used the StatTransfer program (2013) to convert the Stata files into SPSS format, which HLM 7.1 reads more cleanly. The first step was to combine the three levels of analysis into a single dataset. This was done using the IDB analyzer, again to ensure the proper weights, plausible values, and sampling variables were carried into the dataset. Next, the same setup files used for the initial cleaning and regressions were used in Stata to ensure the single dataset with all three levels is consistent with the separate single-level datasets used previously. Then, the dataset must be sorted by the ID variables at each level; if not, the HLM 7.1 program does not read the data correctly (Raudenbush and Bryk, 2002).
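A minimal sketch of this sorting step in Stata, with assumed ID variable and file names, is:

    * Sorting before export to HLM 7.1; ID variable and file names are assumptions.
    use timss_merged.dta, clear
    sort idcntry idschool idclass idstud
    save timss_merged_sorted.dta, replace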
In Stata this is a simple sort command, as sketched above. Next, the data were moved from Stata to SPSS using the StatTransfer tool. Finally, the last cleaning step involved separating out the dataset by country, which was done using the sort cases and save commands in SPSS. The result of this process is six datasets, one per country, each containing the student, teacher, and school variables in a single file. Now the data can be loaded into HLM 7.1 for analysis. First, for each country, a simple unconditional model was run to verify that the data did have a nested structure, as evidenced by the chi-squared test of the variance components. The unconditional model is as follows:

Level 1: Yij = αj + εij
Level 2: αj = γ00 + μj
Combined model: Yij = γ00 + μj + εij

Here Y represents the science achievement of student i in school/teacher j. Just as in the standard regressions, there is no notation for the country since a separate model is run for each country. Alpha is the level 1 intercept of student achievement, epsilon is the student-level error term, mu is the teacher/school-level error term, and gamma is the level 2 intercept. This model also shows how much variance is at the student level in each country and how much is at the teacher/school level without any predictors present. Once the unconditional model was run for each country, an SES model was run that contained an SES predictor at both the student and school levels. Student SES was used as the predictor at level 1, and the percentage of disadvantaged students in a school was used as the predictor at level 2. This was done to see what the reduction in variance was at each level when adding SES into the equation. The SES model is as follows:

Level-1 model: Yij = β0 + β1(SES) + ρ
Level-2 model: β0 = γ00 + γ01(ECDISA) + μ0; β1 = γ10
Full model: Yij = γ00 + γ01(ECDISA) + γ10(SES) + μ0 + ρ

Finally, a full model with fixed level 1 predictors was run for each country using the same variables that were used in the baseline regressions. This helps measure how much additional variance is explained beyond SES by adding these predictors. The difference here is that all three groups are combined into a single model, which means that achievement can be more accurately predicted because all the groups are accounted for at the same time. This fixed model looks as follows:

Level-1 model: Yij = β0 + β1(GIRL) + β2(SES) + β3(INTENJ) + β4(NEGSCI) + β5(BULLY) + β6(PARENT) + ρ
Level-2 model: β0 = γ00 + γ01(TEAEXP) + γ02(TEAEDU) + γ03(SCIHRS) + γ04(ECDISA) + γ05(URBAN) + γ06(SCIMAJ) + γ07(TECOOP) + γ08(TELIMI) + γ09(INQUIRY) + γ0,10(EXPECT) + γ0,11(TESUPP) + γ0,12(TEEVAL) + γ0,13(SCHCLI) + γ0,14(SCHPAR) + μ0; β1 = γ10; β2 = γ20; β3 = γ30; β4 = γ40; β5 = γ50; β6 = γ60
Full model: Yij = γ00 + γ01(TEAEXP) + γ02(TEAEDU) + γ03(SCIHRS) + γ04(ECDISA) + γ05(URBAN) + γ06(SCIMAJ) + γ07(TECOOP) + γ08(TELIMI) + γ09(INQUIRY) + γ0,10(EXPECT) + γ0,11(TESUPP) + γ0,12(TEEVAL) + γ0,13(SCHCLI) + γ0,14(SCHPAR) + γ10(GIRL) + γ20(SES) + γ30(INTENJ) + γ40(NEGSCI) + γ50(BULLY) + γ60(PARENT) + μ0 + ρ

Binary Logistic Regressions

Given that the OLS regressions have provided a baseline understanding of the variables and the HLM has accounted for all the variables in a single multilevel equation, the final step was to focus on the population of interest: low SES students and the variation in their achievement. To do this, variables were needed to label students based on their level of SES.
Furthermore, because of the interest in how this achievement differs, students were also grouped by their levels of achievement. This created categorical variables for the level of student SES as well as the level of student science achievement. These variables can then be used in logistic regressions, which are supported by Stata 13 in survey mode, using the same setup commands as the baseline regressions. To create the categorical indicators for SES and achievement, each country was given its own student dataset from the larger dataset. This was to ensure that the categories were based on the within-country data rather than the between-country data. Using the between-country data would bias the categories towards the countries with the larger sample sizes, making it impossible to account for the effects of these variables within each country. Both variables were created by dividing the students into three categories (low, medium, and high) with respect to SES and science achievement. This puts each student in each country into one category of a 3 x 3 matrix, as shown in table 4.3 below:

Table 4.3 – SES and Achievement Categories

                       Bottom Third of Science Achievement   Middle Third of Science Achievement   Top Third of Science Achievement
Bottom third of SES    Low Achievement, Low SES              Middle Achievement, Low SES           High Achievement, Low SES
Middle third of SES    Low Achievement, Middle SES           Middle Achievement, Middle SES        High Achievement, Middle SES
Top third of SES       Low Achievement, High SES             Middle Achievement, High SES          High Achievement, High SES

Note: Categories were created within each of the six countries rather than across the pooled six-country sample.

Once the students were grouped, flag variables were used to indicate low achievement (0 or 1) and low SES (0 or 1). These flags are used as the outcome measures in the analysis to help explain the odds of low achievement and low SES given the other variables. The output is in odds ratios, allowing comparison of the odds of being in the low achievement group versus not being in that group, controlling for other variables. The same can be done for SES to look at the odds of being low SES versus not, given the other variables of interest. This helps to highlight whether low SES students are more likely to have certain characteristics. The goal was to find mitigating or compensating variables that help explain achievement differences among students of similar SES, or factors that might make achievement for low SES students worse. Without foreshadowing the results too much, this part of the analysis was restricted to variables that showed a consistent effect on student achievement in the previous analyses. The idea of this analysis is to uncover factors related to helping or hurting low-SES students, so variables that did not show a consistent effect in the previous analyses were omitted. Additionally, the variables were transformed to represent a positive, compensating effect. For example, negative science affect was transformed and scaled in a positive way so that having a smaller negative affect now takes on a higher value. This allows for a more logical interpretation of the results when trying to identify compensatory factors.
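A minimal Stata sketch of this grouping, flagging, and reverse-coding, run within a single country file, is shown below. The variable names are placeholders, the terciles are illustrated with a single plausible value, and the unweighted xtile call is an assumption rather than the exact approach used (which is in appendix 1).

    * Illustrative sketch; ses, bsssci01, and negsci are placeholder names.
    xtile ses_group = ses, nquantiles(3)          // 1 = bottom tercile, 3 = top tercile
    xtile ach_group = bsssci01, nquantiles(3)
    gen byte lowses = (ses_group == 1) if !missing(ses_group)
    gen byte lowach = (ach_group == 1) if !missing(ach_group)
    gen possci = 5 - negsci                        // reverse a 1-4 scale so higher = more positive
    svy jackknife: logit lowach ses girl intenj possci, or   // odds ratios for low achievement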
The variables included, with their transformations, are:
• High interest and enjoyment
• Positive science affect
• Percentage of Economically Advantaged Peers
• Positive School Climate
• Experienced Teacher
• Teacher with Science Major
• Low Classroom Limitations to Learning
• Inquiry Based Teaching
• High Expectations for Student Achievement

The logistic equation looks very similar to the regular regression equations; the main differences are that the outcome variable is now the (log) odds of having low science achievement compared to not having low science achievement and that the set of variables is more restricted. When computing the odds of low SES, the same equation is used, with low SES and low achievement switching places. The equation is:

ln((Prob. of low science achievement group) / (1 - Prob. of low science achievement group))ij = α + β1SESij + β2GIRLij + β3INTERESTENJOYij + β4POSSCIAFFECTij + β5ECONADVPEERSij + β6POSSCHCLIMATEij + β7EXPERTEACHij + β8SCIMAJij + β9LOWLIMITATIONSij + β10INQUIRYij + β11HIGHEXPECTij + ε

Here i still represents the person and j the country. The interpretations are a bit complex and will be explained as they relate to the variables in the next chapter, but the idea is to measure the odds of a student belonging to a particular achievement group as a specific predictor changes, holding all the other variables constant. For example, one type of question that can be explored in this analysis is how a student's odds of being in the low achievement group, compared to the higher achievement groups, change if the student is a girl rather than a boy, holding all other variables constant. This chapter has summarized the methods used to create the data and to conduct the different analyses. To start, the data were assembled using the IEA IDB analyzer to create a properly structured dataset. Then descriptive statistics and bivariate correlations were run for each of the variables. The next step was to run a multivariate regression for each level of the analysis to account for the relationship between variables at each level. After the regressions, an HLM was run to allow all variables to be in a single model and to account for variance differences between the groups. Finally, a logistic regression was run for the variables that showed a strong effect in the previous analyses to detect any compensatory or penalizing effects for low SES students and their achievement. The next chapter will present and discuss the results of the various analyses.

Chapter 5 - Results

This chapter presents the results of the correlations, OLS regressions, logistic regressions, and HLM. The results are presented in four sections, each corresponding to an analysis type. Within each section, the results are discussed in a within-country context and also in an international context by variable. The implications for policy and interpretations are reserved for the next and final chapter, which contains the discussion of all of the findings.

Correlations by Country

Table 5.1 below shows the results of the simple bivariate correlations of each variable with achievement within each country. Table 5.2 shows the results of the bivariate correlations with SES within each country. The correlation is listed in the first cell and the significance of the correlation is in parentheses underneath.
Chile

At a student level, the variables in Chile are all significantly correlated with science achievement, with the exception of interest and enjoyment of science. It is worth noting that Chile is the only country in which interest and enjoyment of science is not significantly correlated with achievement. Being a girl in Chile has a negative .09 association with science achievement, while SES shows the strongest positive correlation with science achievement of any country at .51. This suggests that students in Chile may have a strong predetermination of their achievement before even setting foot in the classroom. The strongest student-level correlation in Chile that is within a student's control is negative science affect, which shows a negative correlation of .28, although this is the weakest such correlation of all the countries. When exploring the correlations with SES in Chile, negative science affect shows the strongest correlation at .12, meaning that as the SES of a student increases, their negative science affect decreases. This pattern is also true for bullying, where Chile shows the strongest correlation of bullying with SES at a negative .11. At a teacher and classroom level, being in a classroom where there are high expectations for student achievement and having a teacher who is a science major are both positively associated with science achievement, at .38 and .25 respectively. These are the strongest correlations for these variables of any country in the sample, and both are also positively correlated with SES, suggesting these factors are related to SES. The strongest negative associations at a teacher-classroom level in Chile are being in a classroom where the teacher reports limitations and the hours spent teaching science each week, with correlations of -.37 and -.19. These are also the strongest correlations for these variables of any country. SES is also negatively associated with hours of science in Chile, suggesting that there may be a remedial aspect to science teaching in Chile in which lower achieving students receive additional science instruction. Classroom limitations to learning also have their strongest negative correlation with SES in Chile compared to any country, at -.31, meaning that lower SES is correlated with being in a classroom with more limitations. At a school level, all four predictors are significantly correlated with achievement, and except for being in an urban school, the school predictors have the strongest correlations with achievement of all the countries in the sample. The SES correlations help to shed some additional light on these relationships. The percentage of economically disadvantaged students is also negatively correlated with SES, which would be expected, and this is also the strongest such correlation of all the countries. Being in a negative school climate also shows a negative association with SES; however, being in an urban school is positively associated with SES. This means that in Chile, lower-SES students appear more likely to be in rural schools with other low-SES students and in schools with a negative climate.

Finland

In Finland, there are some very interesting correlations at all levels. With the exception of Ghana (an outlier discussed in the next section), SES in Finland shows the weakest correlation with science achievement of all the countries, although it is still high at .38.
Finland is also the only country where being a girl has a positive association with science achievement, although it is a very weak association. These correlations seem consistent with the egalitarian structure that the Finnish system is known for. Finnish students also show a strong negative correlation between a negative science affect and achievement, as well as a positive correlation between interest and enjoyment of science and achievement, which is in line with the previous literature. The correlation of interest and enjoyment of .32 is the second strongest of all the countries, and the correlation of negative science affect is the strongest at -.43. Both of these variables also show the strongest correlations with SES of any country, at .23 and -.22, suggesting that the effects of these variables in Finland take on a strong SES component. The teacher/classroom variables in Finland do show significant but not necessarily strong correlations with achievement, as all of the correlations are lower than .1, suggesting a weak relationship or none at all. Finnish teachers show the strongest correlation between the frequency of using inquiry-based teaching and achievement at .09, and this variable also carries a modest SES component with a correlation of .05. This could mean that there are some differences in how teachers in Finland are teaching science and that these differences have only a weak SES component. Finland also has the second highest correlation of a teacher being a science major with student achievement at .09, but the correlation of this variable with SES is only .04, indicating that these teachers are likely distributed evenly across students of different SES. Supporting the relatively egalitarian nature of Finnish education, Finland does not show the strongest correlations of SES with any of the teacher or classroom variables. This equality continues at the school level where, in comparison with other countries, Finland has the weakest correlation of achievement with the percentage of economically disadvantaged students in a school at -.08 and the second weakest with a negative school climate at -.11. The same is true for the correlations of these variables with SES in Finland; in short, there is essentially no relationship. Overall, the correlations suggest that Finland has the weakest association of SES with achievement and of SES with the other variables in the study.

Ghana

Ghana shows significant relationships between student achievement and all the student-level variables. Ghana shows the strongest correlation of being a girl with achievement of all of the countries in the sample, and it is a negative association, suggesting that girls have much lower science achievement compared to boys. Negative peer treatment of students also appears to be a concern in Ghana, as bullying shows its strongest negative correlation with achievement in Ghana. However, the correlation of bullying with SES is positive, meaning that higher-SES students in Ghana report more instances of bullying, so the overall pattern of bullying with achievement and SES is somewhat inconsistent. Ghana also has the weakest association of SES with achievement, but this is more likely due to poor measurement of the variable within Ghana than to the actual reality. SES in Ghana appears to have a skewed distribution, some extreme cases (high SES and achievement of 5), and some missing cases. Further illustrating this point are the correlations of interest and enjoyment of science and of a negative science affect.
When correlating these variables with achievement, Ghana shows a pattern similar to the other countries. However, when correlating them with SES, interest and enjoyment is not significantly correlated and negative science affect has the weakest correlation with SES of any country. At the teacher/classroom level, Ghana shows the second strongest correlations of teacher experience and of level of teacher education with achievement, at .08 and .13 respectively. These are also positively correlated with SES, indicating that in Ghana, higher-SES students generally have more experienced and more educated teachers. Being in a classroom where a teacher reports more limitations to learning is negatively associated with achievement in Ghana at -.19, which is in the middle of the sample of countries, and the association with SES is -.08, which is the weakest of the sample. This indicates that lower-SES students in Ghana are slightly more likely to be in a classroom with limitations to learning, although the degree of the association appears to be weaker compared to other countries. Ghana also shows a modest correlation of .14 between expectations for student achievement and achievement. This is correlated at .11 with SES, which is also in the middle of the sample. At the school level, all variables are significantly correlated with achievement, with three of the four variables having a correlation stronger than .2 and school-parent contact being the exception. Ghana shows the strongest correlation of being in an urban school with achievement at .23, and this variable also has the strongest correlation with SES in the sample at .25. This indicates that the urban schools in Ghana are more likely to be higher achieving and to contain higher-SES students. Ghana also shows the second highest correlation of achievement with a negative school climate, but this is not as strongly correlated with SES. The correlation between the percentage of economically disadvantaged students in a school and achievement is nearly identical to its correlation with SES.

Korea

Korea is unique in that three of its six student variables are correlated with achievement more strongly than .4: SES, interest and enjoyment of science, and a negative science affect. Interest and enjoyment of science and negative science affect also have significant correlations with SES, and Korea shows some of the strongest correlations of these variables with SES among the countries studied. Being a girl in Korea has a small negative correlation with achievement of -.04. Parent involvement has the strongest association with science achievement in Korea of all the countries at .19, and a correlation with SES of .32, which is also the highest of all countries. This is a positive relationship, indicating that parent involvement is important in Korea as it relates to achievement and that there is an SES component to it as well. Teacher/classroom variables in Korea are very weakly correlated with achievement and often not significant. The strongest correlation with achievement is expectations for student achievement at .08, but this is the weakest correlation of all the countries for this variable. The correlation of student expectations with SES is also .08, which is the strongest correlation of the teacher/classroom variables in Korea with SES. This suggests that there are not many variables at the teacher/classroom level in Korea that are associated with achievement.
At a school level, the correlations between achievement and the school-level variables in Korea are all significant but comparatively weaker than other countries. The strongest correlation with achievement at a school level in Korea was the percentage of economically disadvantaged students which was a -.16 and also significantly correlated 68 with SES which would be expected. Korea also shows the second strongest correlation between parent-school contact and achievement at a .06 which also aligns with the strong correlation of parent involvement at the student level. Together, this suggests a large role for parents in Korean student achievement. Singapore Singapore is the only country that does not show a significant correlation between gender and science achievement which is especially important, given that Singapore is the highest achieving country of the sample. The fact that gender is not significantly associated with achievement may contribute to Singapore’s high level of achievement. Singapore’s correlation of SES with achievement is .42 which is the third strongest of the countries in the sample. The correlations of interest and enjoyment of science and a negative science affect in Singapore are the second weakest of the sample. These variables are also only moderately correlated with SES at a .12 and -.15 respectively. Parent involvement in Singapore is not strongly correlated with achievement at a .08 but, it is more strongly correlated with SES at a .22. This suggests that Singapore may have a similar, although weaker, pattern between parent involvement, achievement, and SES compared to Korea. In Singapore, all of the teacher/classroom variables are significantly correlated with achievement except teacher cooperation indicating that there is a strong relationship between teacher/classroom variables and student achievement. Level of teacher education (.19) has the strongest correlation with achievement in Singapore which also takes on the strongest correlation with SES of all the countries at .11, this suggests that Singapore may not deploy their more-educated teachers very equally along SES lines. Classroom limitations (-.35) and expectations for student achievement (.36) have the second highest correlations with student achievement in Singapore. Classroom limitations and expectations for student achievement have the second highest correlations with SES in Singapore at a -.16 and .24 respectively, indicating that there is also some presence of SES differences at the teacher/classroom level. Since Singapore is a city-state, the urban variable is not applicable but the remaining three school-level variables have statistically significant relationships. The strongest association with science achievement in Singapore at a school level is the proportion of economically disadvantaged students at a school which is correlated with achievement at a -.33, this is the second strongest correlation for that variable. United States 69 In the United States, SES has the strongest association with science achievement at a .43 and it is the second strongest association of SES in the country sample. Interest and enjoyment of science and having a negative science affect are both moderately correlated with science achievement in the United States with respect to other countries at a .22 and -.34 respectively. These variables are also both slightly correlated with SES at .14 and -.19. The correlation between negative science affect and SES is the second strongest in the sample. 
Parent involvement and bullying do not have a significant relationship with science achievement in the United States but they are significantly correlated with SES. At the teacher-classroom level, the United States shows the strongest correlation between achievement and teacher experience at .13 and this is also the strongest correlation with SES at .15. This suggests that higher-SES kids are slightly more likely to get more experienced teachers in the United States. Compared to other countries, the United States shows moderate correlations of classroom limitations and expectations for student achievement at -.19 and .27, which also have correlations with SES at -.15 and .28. This indicates that there are classroom differences by SES and achievement in the United States as it pertains to these variables. As with the teacher-classroom variables, the school variables in the United States show relatively moderate correlations compared to other countries. All school variables are significantly correlated with achievement, the strongest being the percentage of economically disadvantaged students at -.32. From an SES standpoint, a negative school climate is correlated with SES at a -.14, which is the second strongest correlation from that standpoint. As a whole, this indicates that school variables in the United States, just like teacher/classroom variables, have some statistically significant effects on achievement but are less so in size compared to the other countries in the sample. Correlations Explained by Variable When looking at these results variable by variable instead of country by country, we see that among the student-level variables SES shows by far the strongest overall correlation with science achievement ranging from .18 to .51 but having a negative science affect is also strongly correlated which ranges from .28 to .43. Interest and enjoyment of science also shows a consistently strong relationship to science achievement with Korea topping out at a .44 correlation. Gender, bullying, and parent involvement all show weaker relationships to achievement and varying levels of significance depending on the country. Gender and bullying are strongest in Ghana while parent involvement is strongest in Korea. When comparing the correlations with SES, interest and enjoyment of science and negative science affect appear to have a generally strong overall correlation depending on the country. This 70 indicates that as a whole these variables may vary to a large degree along SES lines. Bullying and parent involvement with respect to SES are largely country-specific but there are cases where there are associations with SES such as Ghana with bullying and Korea with parent involvement. As a block, the teacher/classroom variables show the weakest correlation with science achievement. A teacher reporting limitations to the classroom and reporting an environment where there are strong expectations for student achievement show the strongest correlations with achievement. Student expectations are significantly correlated in all the countries while classroom limitations are significantly correlated in all countries except Korea. These variables also show the strongest correlations with SES as well. This may indicate that the limitations and expectations in a classroom can vary to a large degree depending on SES which also impacts achievement. 
Teachers who support their students and teachers who report frequently using inquiry-based teaching practices in their classrooms surprisingly show the weakest correlations with student achievement. Two of the traditional "teacher quality" variables, experience and level of education, generally show weak correlations with science achievement. Teacher experience is moderately correlated with achievement in the United States, and teacher education is moderately correlated in Ghana and Singapore, which are, interestingly, the two representations of inequality in the study. Those variables are also significantly correlated with SES in Ghana and Singapore. In Chile, the number of hours of science instruction shows a counterintuitive moderate negative correlation with achievement, while having a teacher with a science major, or a teacher who reports cooperating with other teachers, shows a moderate positive correlation with achievement.

The school-level variables are in many ways a reflection of the student-level variables in that SES and negative attitudes again have the strongest correlations with science achievement. The proportion of disadvantaged students in a school shows the strongest correlation with achievement as a block and is significant in every country. Being in a negative school climate is also strongly and negatively associated with achievement and is significant in every country, as is the association between a negative school climate and SES. Not surprisingly, being in an urban school is most strongly correlated with science achievement in Ghana and Chile, where delivery of educational services to rural areas can be difficult. SES is also most strongly correlated with urban schools in these countries, indicating that wealthier areas in these countries are more urban and that this is reflected in the schools. The amount of school-parent contact has the weakest association with science achievement at the school level and is also not strongly associated with SES except in Korea and Singapore.

Table 5.1 – Correlations with Science Achievement (significance in parentheses)

Variables: Girl / SES / Interest-Enjoyment / Negative Science Affect / Bullied / Parent Involvement / Teacher Experience / Teacher Education / Hrs. Science/wk

Chile: -0.0869 (0) / 0.5049 (0) / 0.0177 (.1824) / -0.2793 (0) / -0.1111 (0) / -0.0586 (0) / -0.0381 (.0049) / 0.0276 (.0405) / -0.1933 (0)
Finland: 0.0381 (.0133) / 0.3851 (0) / 0.3178 (0) / -0.4284 (0) / -0.043 (.0055) / -0.0357 (.0205) / 0.0419 (.001) / 0.0138 (.1877) / 0.0109 (.3094)
Ghana: -0.1465 (0) / 0.1752 (0) / 0.232 (0) / -0.4195 (0) / -0.1424 (0) / 0.0816 (0) / 0.0755 (0) / 0.1296 (0) / -0.0349 (.0028)
Korea: -0.0376 (.007) / 0.4103 (0) / 0.4385 (0) / -0.4263 (0) / 0.0155 (.2679) / 0.1887 (0) / 0.0369 (.0048) / 0.0151 (.251) / -0.0128 (.3322)
Singapore: -0.0021 (.8727) / 0.418 (0) / 0.2162 (0) / -0.312 (0) / -0.0951 (0) / 0.0798 (0) / -0.0769 (0) / 0.1893 (0) / 0.0504 (.001)
United States: -0.0638 (0) / 0.4309 (0) / 0.2203 (0) / -0.3401 (0) / -0.0056 (.5722) / 0.0105 (.287) / 0.1276 (0) / 0.0251 (.0319) / -0.0229 (.0838)

Source: TIMSS 2011 8th Grade

Table 5.1 – (cont'd)

Variables: Science Major / Teacher Cooperation / Class Limitations / Inquiry Teaching / Student Expectations / Supporting Students / Econ. Disadv. Students / Urban School / Neg. School Climate / School-Parent Contact

Chile: 0.2512 (0) / 0.1208 (0) / -0.37 (0) / 0.0014 (.9186) / 0.383 (0) / -0.0628 (0) / -0.4179 (0) / 0.1681 (0) / -0.3463 (0) / 0.0653 (0)
Finland: 0.0926 (0) / 0.0088 (.4033) / -0.1956 (0) / 0.0904 (0) / 0.1493 (0) / 0.012 (.2637) / -0.0788 (0) / -0.0045 (.7763) / -0.1097 (0) / 0.0152 (.343)
Ghana: -0.008 (.4927) / 0.0487 (0) / -0.19 (0) / 0.0387 (.0007) / 0.1359 (0) / 0.0069 (.5424) / -0.2031 (0) / 0.232 (0) / -0.2163 (0) / 0.0572 (0)
Korea: 0.0041 (.7572) / 0.0264 (.0438) / -0.0105 (.4269) / 0.0298 (.0242) / 0.0819 (0) / -0.0108 (.4099) / -0.1565 (0) / 0.0937 (0) / -0.0768 (0) / 0.0566 (.0001)
Singapore: -0.0253 (.0543) / 0.0069 (.5992) / -0.3479 (0) / 0.0836 (0) / 0.3599 (0) / -0.0459 (.0004) / -0.3256 (0) / 1 (0) / -0.1567 (0) / 0.0491 (.0002)
United States: -0.0227 (.0562) / 0.0521 (0) / -0.1871 (0) / 0.0646 (0) / 0.2719 (0) / 0.032 (.0128) / -0.3171 (0) / -0.0983 (0) / -0.1514 (0) / -0.0216 (.0384)

Source: TIMSS 2011 8th Grade

Table 5.2 – Correlations with SES (significance in parentheses)

Variables: Interest-Enjoyment / Negative Science Affect / Bullied / Parent Involvement / Teacher Experience / Teacher Education / Hrs. Science/wk / Science Major

Chile: -0.0508 (.003) / -0.1183 (0) / -0.1103 (0) / 0.0094 (.4983) / -0.0235 (.0975) / 0.0305 (.0307) / -0.1356 (0) / 0.2374 (0)
Finland: 0.2308 (0) / -0.2236 (0) / -0.0101 (.3335) / 0.1104 (0) / 0.0062 (.5582) / -0.0106 (.3194) / -0.0092 (.3194) / 0.0392 (.0002)
Ghana: 0.0137 (.2817) / -0.097 (0) / 0.0791 (0) / 0.2513 (0) / 0.143 (0) / 0.0905 (0) / 0.0453 (0) / 0.067 (0)
Korea: 0.1693 (0) / -0.1491 (0) / -0.0132 (.3065) / 0.3251 (0) / 0.0052 (.6905) / 0.0418 (.0015) / -0.0482 (.0003) / -0.0114 (.3869)
Singapore: 0.12 (0) / -0.1452 (0) / -0.0471 (.0003) / 0.217 (0) / -0.0272 (.0376) / 0.1118 (0) / 0.0073 (.5802) / 0.0061 (.6453)
United States: 0.1443 (0) / -0.1872 (0) / 0.025 (0) / 0.1791 (0) / 0.1446 (0) / 0.0258 (.0287) / -0.0298 (.0253) / 0.0031 (.7956)

Table 5.2 – (cont'd)

Variables: Teacher Cooperation / Class Limitations / Inquiry Teaching / Student Expectations / Supporting Students / Econ. Disadv. Students / Urban School / Neg. School Climate / School-Parent Contact

Chile: 0.0969 (0) / -0.3109 (0) / 0.015 (.2916) / 0.3382 (0) / -0.0782 (0) / -0.4535 (0) / 0.1857 (0) / -0.2846 (0) / 0 (.9998)
Finland: 0.0283 (.0080) / -0.1077 (0) / 0.0542 (0) / 0.109 (0) / -0.0003 (.9792) / -0.0521 (0) / 0.0852 (0) / -0.0507 (0) / 0.0244 (.0245)
Ghana: 0.0455 (.002) / -0.0831 (0) / -0.0125 (.3066) / 0.1104 (0) / 0.0102 (.4081) / -0.204 (0) / 0.2482 (0) / -0.113 (0) / 0.018 (.1433)
Korea: 0.0455 (.0005) / 0.0119 (.3674) / 0.0268 (.0433) / 0.0829 (0) / -0.0144 (.2740) / -0.2104 (0) / 0.125 (0) / -0.0296 (.0238) / 0.0855 (0)
Singapore: 0.0457 (.0005) / -0.164 (0) / 0.075 (0) / 0.2352 (0) / -0.0094 (.4719) / -0.219 (0) / 1 (0) / -0.1155 (0) / 0.0671 (0)
United States: -0.0086 (.4689) / -0.1476 (0) / 0.0646 (0) / 0.281 (0) / 0.0184 (.1554) / -0.3413 (0) / -0.0218 (.0371) / -0.1391 (0) / -0.0061 (.5627)

Source: TIMSS 2011 8th Grade Data

Baseline Regressions

Table 5.3 shows the results of the student regressions on achievement based on the averages of the five plausible values. The student variables are: SES, gender, interest and enjoyment of science, negative science affect, bullying, and parent involvement. A more detailed table that shows the results for each plausible value individually can be found in Appendix 3. This section discusses each level separately, since the regressions were run that way, with results reported at both the country level and the variable level. It is important to note that these regressions are meant to serve as baseline indicators, building on the results of the correlation analysis and setting the stage for the HLM analysis, which will account for the nested structure of the data.
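As a rough sketch of how the baseline student-level regressions described above might be estimated, the Python fragment below runs a weighted least squares regression once per science plausible value (SCIPV1–SCIPV5, weighted by TOTWGT as in the table notes) and averages the coefficients. It is a simplification of the dissertation's procedure: the file name and predictor column names are hypothetical, the model is fit within a single country, and the jackknife (JRR) standard errors reported in Table 5.3 are not reproduced.

import pandas as pd
import statsmodels.api as sm

# Hypothetical within-country student file with the composite predictors already built.
df = pd.read_csv("timss_2011_g8_chile_students.csv")

predictors = ["GIRL", "SES", "INTEREST_ENJOY", "NEG_SCI_AFFECT", "BULLIED", "PARENT_INVOLVE"]
X = sm.add_constant(df[predictors])

# Fit one weighted OLS per plausible value and average the coefficients,
# following the usual plausible-value convention.
results = [sm.WLS(df[f"SCIPV{i}"], X, weights=df["TOTWGT"]).fit() for i in range(1, 6)]
avg_coefs = pd.concat([r.params for r in results], axis=1).mean(axis=1)
print(avg_coefs)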
As a reminder in the student data, girl is a dummy variable, SES is a standardized composite variable, and the remainder of the variables are composite variables taking values of 1 to 4 just like the Likert questionnaire given where 1 is the most negative response such as “strongly disagree” and 4 is the most positive such as “strongly agree.” In the teacher data, science major is a dummy variable, hours of science instruction is a continuous variable, and the remaining variables are ordinal based off the questionnaires. In the school questionnaire, all four variables are ordinal based off the questionnaires. Student Regressions As a country, Korea has the highest r-squared value, therefore explaining the most variance at the student level as a country at .33 while Ghana explains the least at .23. In Chile, girls pay about a fifth of a standard deviation penalty on average in their science achievement due to their gender, however, the strongest penalty in Chile is paid by students that have a negative science affect which costs the students about a quarter of a standard deviation on average for a one point change in negative science affect. In Chile SES has the strongest correlation with achievement compared to the other countries. However, when controlling for other variables Singapore now has the strongest impact of SES at about a fifth of a standard deviation increase in achievement for each standard deviation increase in SES. In Finland, SES is not the strongest contributor towards increasing science achievement but instead an interest and enjoyment of science is the strongest one. Conversely, having a negative science affect in Finland on average takes away the most from achievement. In Finland, a one-point change in interest and enjoyment of science and negative science affect could possibly change average student achievement by about half a standard deviation. Finland now shows the strongest impact of parent involvement on achievement where a one-point increase in the 76 level of parent involvement results, when making strong assumptions about the adequacy of this model, in about an 11 point decrease in achievement on average. This unexpected finding could mean that parents in Finland are only becoming more involved when their child is struggling academically. As mentioned while Ghana shows the smallest r-squared, it also has three of the most extreme student predictors of science achievement of the sample. Being a girl in Ghana means that a student’s average science achievement will be about 22 points lower on average compared to boys. Also, students in Ghana that have a negative science affect will score about 45 points lower on average, almost half a standard deviation. On the opposite side, students in Ghana show the strongest relationship of interest and enjoyment on science achievement which adds about 30 points on average to their achievement. All together, wiping out gender disparities and then increasing the two student attitudinal variables in Ghana by one point each could conceivably swing student achievement by almost an entire standard deviation or 100 points. In Korea, interest and enjoyment of science adds about 25 points to a student’s science achievement while having a negative science affect wipes out 25 points on average, resulting in about a half a standard deviation swing on average for a one point change in each variable. 
Parent involvement, which has the strongest bivariate correlation with achievement in Korea, is now no longer significant, illustrating the fact that bivariate correlations are no more than a starting point in the analysis since subsequent analyses can show that in controlling for possible confounding variables, variables which are strong in a bivariate relationship can be shown to have no statistical effects. Singapore, the high-achieving but unequal country in the sample, has the strongest impact of SES on science achievement at about 21 points on average controlling for the other variables. A one-point change in negative science affect will decrease achievement by 29 points on average in Singapore or just over a quarter of a standard deviation. Bullying and parent involvement also appear to take away about a tenth of a standard deviation from achievement in Singapore on average controlling for the other variables. Lastly, the United States shows the second strongest impact of SES on science achievement at 18 points on average. It has the weakest positive impact of interest and enjoyment of only a five-point increase in achievement for each increase in interest and enjoyment. The United States also shows the smallest impact of negative science affect on student achievement at a 24-point decrease in achievement for each increase in negative affect. As a variable, SES and interest and enjoyment of science equally show the largest positive statistical effect on science achievement when controlling for other variables. In Chile, Singapore, and the United States SES has the 77 stronger positive impact. In Finland, Ghana, and Korea, interest and enjoyment shows the higher impact. This may indicate some unique contextual differences at the student level. On the negative side of the achievement equation, having a negative science affect was the strongest variable in reducing student achievement in all six countries. With the control variables in these regressions, bullying and parent involvement had little to no impact on achievement with a few exceptions. Gender was very country-specific with girls being penalized in Chile, Ghana, and United States, but having no relationship in the highest performing countries: Finland, Korea, and Singapore. 78 Table 5.3 – Student Regressions on Science Achievement Chile Avg Sig Finland Avg Sig Ghana Avg Sig Korea Avg Sig Singapore Avg Negative Science Affect Bullied Parent Involvement P-value R2 0 0.2746 0 0.3006 5.61 0 0.2293 Intercept Girl SES Interest and Enjoyment 581.09 -20.69 16.53 -5.73 -25.48 -6.28 -4.48 *** *** *** * *** *** ** 629.19 1.81 11.20 12.36 -36.75 1.64 -10.94 *** *** *** *** *** 324.80 -22.32 7.13 30.20 -45.69 -9.70 *** *** *** *** *** ** 559.26 3.54 13.69 25.92 -25.87 2.59 -0.56 0 0.3286 *** *** *** 21.18 6.86 -28.93 -8.94 -4.26 0 0.275 *** * *** ** ** -2.12 -9.42 0 0.2811 *** 666.91 -3.9 Sig *** Avg 604.47 -9.23 17.63 4.83 -24.17 Sig *** *** *** ** *** USA Notes: Source TIMSS 2011 8th Grade, SCIPV1-5, TOTWGT, JRR, *=.05, **=.01, ***<.001 79 *** Teacher/Classroom Variables As evidenced by the weak correlations, the teacher/classroom variables as a group provide the poorest predictions of student achievement in the regressions as shown in table 5.4. Singapore showed the largest impact of teacher/classroom variables with an r-squared of .23 while Korea showed almost no impact of teacher/classroom variables with an r-squared of .01. 
Chile is unique in that it is the only country where hours of science instruction significantly matter but increasing the number of hours of instruction actually decreases science achievement. This could be the result of a remedial effect where students who are struggling are given additional instructional time but that is not clear. Chile also shows the strongest relationship with having a teacher that has a science major on student achievement. Students who have teachers who report classroom limitations and high expectations for achievement are also significantly predictive of student science achievement in Chile. By changing the amount of classroom limitations and student expectations by one point each, student achievement in Chile can change by about half a standard deviation. Finland is known for their strong teaching force so the results of this regression are somewhat surprising. Overall, teachers/classrooms in Finland have the second lowest r-squared value of the sample which may be a result of the restricted variance of the teacher variables. What is interesting to note is that Finland is the only country where students of teachers that report using inquiry based teaching have a significantly positive relationship with their achievement although it is only about 8 points on average. While this variable only asks how often teachers do this irrespective of quality, there appear to be a group of teachers in Finland who are able to leverage higher achievement from their students through their instructional practices. Changing classroom limitations, the use of inquiry-based teaching, and student expectations for achievement by one point each might result in about a half a standard deviation change in achievement in Finland. Ghana is the only country where students who have a teacher that has a higher level of education is a significant predictor of their achievement, adding about 13 points on average for each increase. Students of teachers that report having higher classroom limitations are likely to have achievement that is just less than a half a standard deviation on average. In Ghana, students of teachers who report that there are higher expectations for student achievement are likely to have achievement that is a quarter of a standard deviation higher on average. In Ghana, a one-point change in each of these variables results in almost a whole standard deviation change in achievement. 80 As noted previously, Korea has the lowest relationship of teacher/classroom variables to student achievement. What is interesting about Korea is that it has no variables in the model that significantly deduct from student achievement. Of the three variables that are significant, all increase student achievement but even when combined, a one-point change in each can only increase student achievement by a quarter of a standard deviation. Students of teachers that have more experience are likely to have marginally higher achievement. Students of teachers that have a science major and whose teachers report higher expectations for student achievement are likely to have achievement that is about 12 points higher each on average. Singapore is the strongest country of teacher/classroom variables predicting achievement but this comes from only two very strong variables. Students of teachers who report having classroom limitations have science achievement that is a half of a standard deviation lower compared to those that do not. 
Conversely, students who have teachers who report higher expectations for student achievement are likely to have science achievement just under a half a standard deviation higher as a result. Both of these results are by far the strongest of any country for these two variables and when combined a one-point change in each variable could change student achievement by over an entire standard deviation. The United States also only shows two significant teacher/classroom variables. Students of teachers who have more experience on average have very marginal higher science achievement. Students of teachers who report high expectations for student achievement have about a quarter of a standard deviation higher science achievement on average for each increase, this is second only to Singapore. Similar to Korea, a one-point change in these variables results in only a quarter of standard deviation change in student achievement. In summary, the two most unequal countries in the sample, Ghana and Singapore, show the greatest impact of teacher variables with changes of up to an entire standard deviation. Conversely, the United States and Korea, the two middle inequality countries, show the lowest impact at only about a quarter a standard deviation change from their teacher variables. Chile and Finland, the two more equal countries, show about a half a standard deviation change from their teacher variables. 81 Table 5.4 – Teacher and Classroom Variables on Student Achievement Chile Avg Sig Finland Avg Sig Ghana Avg Sig Korea Avg Sig Singapore Avg Cons. Teacher Yrs of Teaching Teacher Level of Ed. Sci Major Hours of Science Instruct. Teacher Coop. Classroom Limits Using Inquiry Student Expect. Teacher Support Students p-value R2 558.03 -0.13 -4.13 13.29 -8.56 -3.5 -20.08 -9.66 22.58 -8.12 0 0.1692 ** ** 8.69 0.32 -0.43 0 0.0461 12.27 0.0043 0.0893 -2.81 0.0073 0.0136 -13.12 0 0.2304 -0.82 0 0.1006 *** 548.79 0.1 2.67 *** *** -2.17 * 256.88 1.73 13.8 *** * * 496.17 0.36 -0.51 *** * 558.53 0.3 Sig *** Avg 458.46 0.61 Sig *** * -26.18 -5.94 -5.06 *** -23.96 8.27 8.36 *** ** * -42.49 5.47 24.9 ** 12.75 -1.14 0.24 1.92 * 5.84 * 15.51 -15.3 12.30 *** 4.03 -10.34 -58.93 6.41 *** 45.77 *** USA 2.64 1.93 -1.41 -101 -9.04 1.76 28.69 *** Notes: Source TIMSS 2011 8th Grade, SCIPV1-5, SCIWGT, JRR, *=.05, **=.01, ***<.001 82 School Variables Just as in the case of the student vs. school correlations, the student vs. school regressions expectedly follow the same pattern as shown in table 5.5. These regressions account for very little of the variance. As a country, Chile has the strongest r-squared value at .16 while Finland has the lowest at .01. In Chile, students in schools where the percentage of disadvantaged students is high and being in a negative school climate are significantly likely to have lower science achievement of 17 and 13 points respectively on average to where a one-point change in each would change student achievement by about a third of a standard deviation. Finnish students who attend schools with a negative school climate are significantly likely to score 11 points lower on average for each change in the school climate but in line with Finnish culture based on the previous correlations, the percentage of disadvantaged students has no significant impact on student science achievement. 
Students in Ghana are the most impacted by attending a school with a negative school climate which might decrease their achievement by a quarter of a standard deviation on average for each increase in the negative climate. Students in Ghana also have the strongest impact of attending an urban school which is predictive of an increase in their achievement by about 18 points on average the more urban the school. For each increase in the proportion of disadvantaged students in a school in Ghana, student achievement could possibly decrease by 13 points on average. When combined, a one-point change in each of these three variables in Ghana would lead us to tentatively predicting just over a half a standard deviation change in achievement. Korea follows a similar pattern to Ghana in that students who attend an urban school are likely to have higher achievement but to a lesser extent of 6 points on average. Korea also shows the smallest statistically significant impact of a negative school climate on student achievement at a reduction of 10 points on average for each increase. The proportion of economically disadvantaged students predicts achievement to change by about 11 points on average for each change in the proportion. All together a one-point change in each of these variables in Korea gives us a quarter of a standard deviation change in student achievement. In line with the correlations analysis, Singapore shows a strong impact of school variables on student science achievement. Students in Singapore are the most impacted by attending a school with a high proportion of disadvantaged students which in terms of the regression costs them about 41 points on average for each change in the proportion. These students are also impacted by a negative school climate which could decrease student 83 achievement by about a quarter of a standard deviation for each change in the school climate. When combined, a one-point change in each of these variables shows up as about a two-thirds change in student achievement. Finally, the United States is unique in that it is the only country where students who attend an urban school have lower achievement which is a unique contextual feature of the United States because of the lower quality of urban schools relative to suburban ones. The United States is also the only country where a negative school climate is not a significant predictor of student achievement. Why this is remains unclear but it could be a result of the variance in school climate being absorbed into other variables like urban and proportion of disadvantaged students. The United States also shows the second highest impact of the proportion of disadvantaged students on achievement after Singapore where a one-point change results in a 22-point change in student achievement. Overall, student achievement in the United States at the school level could change by about a quarter of a standard deviation as a result of the significant school variables. As a whole, the percentage of disadvantaged students in a school was the strongest predictor of student achievement with a negative school climate the second strongest. Attending an urban school was significant in all countries but Finland, but the effect was country specific where Ghana had a strong positive effect and the United States a negative effect. Once other variables were controlled, school-parent contact as a variable did not significantly matter in any country. 
At a country level, Ghana and Singapore again showed the most variation in school-level predictors while Finland showed the least variation as a result of school-level predictors. 84 Table 5.5 – School Variables on Student Achievement Chile Avg Sig Finland Avg Sig Ghana Avg Constant Percent Disadvantaged Students Urban School Negative school climate School Parent Contact p-value R2 502.52 -17.49 5.22 -13.39 6.28 0 0.1608 *** *** * *** 575.12 -6.09 1.71 -11.3 3.26 0.0092 0.013 -7.42 0 0.1141 2.61 0 0.0376 2.47 0 0.1243 0.49 0 0.1112 *** ** 388.05 -12.71 17.86 -27.83 *** ** *** *** 568.22 -11.08 5.99 -9.55 *** *** *** *** 699.38 -41.34 -25.68 Sig *** *** ** Avg 617.6 -21.64 -6.04 Sig *** *** ** Sig Korea Avg Sig Singapore Avg USA Notes: Source TIMSS 2011 8th Grade, SCIPV1-5, TOTWGT, JRR, *=.05, **=.01, ***<.001 85 -5.66 Hierarchical Linear Models This section will cover the results of the Hierarchical Linear Models that were run. There were three models that were run: an unconditional model with no predictors, an SES model with only an SES predictor at each level, and a full fixed-effects model that contained all the variables in a single model. The unconditional and SES models will be covered without subheadings due to their size but the full model will be covered both at a country level and at a variable level to allow for reporting both within and between countries. A final section will discuss the variance structure between the different models. It should be noted that at the first level are the student predictors and at the second level are the teacher/classroom/school predictors. The full outputs can be seen in Appendix 3. Unconditional Model The purpose of the unconditional model is to test to verify that the data structure is nested and to get a sense of how the variance is divided between the levels. A separate model was run for each country just as with each other analysis and the results are reported in Table 5.6. It shows that for each country HLM is appropriate as the p-value of the chi-squared test is zero for each country. Korea shows the highest percentage of the variance in science achievement at the student level with over 90 percent which may explain some of the high correlations seen at the student level and why Korea had the highest r-squared at the student-level regression. On the other end, Singapore shows the highest percentage of the variance in science achievement at the teacher/classroom/school level with about 75% of the variance at this level. This aligns with Singapore showing some of the strongest school and teacher/classroom correlations and with Singapore having the highest r-squared at both the teacher/classroom and school levels. Singapore is the only country of the group where more of the variation is at the second level. Given the range of school choices available within the Singapore system, this is not surprising. The United States is the most balanced country of the group with about 55 percent of the variance in student achievement at the student level and about 45 percent at the teacher/classroom/school level. Finland, Chile, and Ghana show the largest source of the variance in science achievement at the student level as well with approximately 80 percent, 65 percent, and 60 percent respectively. 
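To illustrate how an unconditional two-level model of this kind partitions the variance, here is a minimal sketch using statsmodels MixedLM, fit with full maximum likelihood as in the table notes (FML). It is a simplification that uses a single plausible value and ignores the survey weights and robust standard errors; the file name and the school identifier column (IDSCHOOL) are assumptions for illustration.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical within-country file with a school/classroom grouping identifier.
df = pd.read_csv("timss_2011_g8_korea.csv")

# Unconditional (intercept-only) two-level model.
m0 = smf.mixedlm("SCIPV1 ~ 1", df, groups=df["IDSCHOOL"]).fit(reml=False)

between = m0.cov_re.iloc[0, 0]   # level-2 (teacher/classroom/school) variance
within = m0.scale                # level-1 (student) variance
icc = between / (between + within)
print(f"Student-level share: {within / (between + within):.2%}, ICC: {icc:.2f}")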
Table 5.6 – Unconditional HLM

Country: Mean Achievement (SE) / Model p-value / Student Variance (share of total) / Teacher-School Variance (share of total)

Chile: 459.0995 (4.0775) / 0 / 3331.0200 (64.89%) / 1802.3591 (35.11%)
Finland: 551.7991 (1.7191) / 0 / 3258.7257 (79.93%) / 818.3886 (20.07%)
Ghana: 311.4986 (7.4403) / 0 / 7425.3885 (60.27%) / 4895.0752 (39.73%)
Korea: 558.9538 (2.3417) / 0 / 5431.9957 (91.87%) / 480.4833 (8.13%)
Singapore: 589.6599 (5.2019) / 0 / 2228.6163 (24.49%) / 6871.3503 (75.51%)
United States: 530.6035 (3.0308) / 0 / 3273.6286 (54.22%) / 2763.5914 (45.78%)

Notes: Source TIMSS 2011 8th Grade, SCIPV1-5, SCIWGT, FML, Robust SE, *=.05, **=.01, ***<.001

SES Model

Table 5.7 shows the results from the SES model, with standard errors in parentheses. At the student level, student SES was used as the only predictor, and at the teacher/classroom/school level the proportion of disadvantaged students in a school was used as the only predictor of student achievement. All models continue to be significant, with chi-squared tests at zero for each country. The variance percentages remain essentially the same as in the unconditional model when SES is added as a predictor. Korea shows the strongest relationship of student SES to achievement at about 12 points, which is only a slight decrease from the 13 points in the regression model. Finland, which had the second strongest representation of variance in science achievement at the student level, has the second strongest relationship at 9 points. One interesting finding is that Ghana shows no relationship between student SES and science achievement in this model. As previously noted, this is most likely due to measurement error rather than the actual situation. Chile has the third strongest impact of SES at the student level with 7 points, the United States is next with 6.5 points, and Singapore has the smallest impact of student SES on science achievement at about 3 points. It is worth pointing out that the three countries with the strongest relationships with student SES are also the three countries with the lowest standard deviations in their country achievement scores, as selected for the sample.

Not surprisingly, the pattern of the level-two coefficients is roughly the reverse of the level-one pattern. Just as in the unconditional model, Singapore shows the strongest relationship at the second level of the model, with science achievement decreasing by just under half a standard deviation on average as the percentage of disadvantaged students at a school increases. This is more than double the next strongest country, Ghana, at about 19 points. This is followed by the United States and Chile at about 15 points each, with Finland and Korea showing the lowest impact of SES at the second level. At this level, the countries with the higher standard deviations in the sample (Singapore, Ghana, and the United States) show the strongest relationships with SES at the second level. This pattern of the relationship between the impact of SES at each level and the level of inequality is something that will be important to monitor moving forward.
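To show how the SES model extends the unconditional model, and how the variance reductions reported later in Table 5.9 can be obtained, here is a minimal sketch under the same assumptions as the previous fragment: a single plausible value, no weights or robust standard errors, and hypothetical column names (SES standardized at the student level, PCT_DISADV merged onto student rows from the school file).

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("timss_2011_g8_korea.csv")  # hypothetical within-country file

# Unconditional model and SES-only model (student SES at level 1,
# percent disadvantaged students at level 2), both with full ML.
m0 = smf.mixedlm("SCIPV1 ~ 1", df, groups=df["IDSCHOOL"]).fit(reml=False)
m1 = smf.mixedlm("SCIPV1 ~ SES + PCT_DISADV", df, groups=df["IDSCHOOL"]).fit(reml=False)

# Proportional reduction in variance at each level relative to the unconditional model.
reduction_l1 = (m0.scale - m1.scale) / m0.scale
reduction_l2 = (m0.cov_re.iloc[0, 0] - m1.cov_re.iloc[0, 0]) / m0.cov_re.iloc[0, 0]
print(f"Level-1 variance reduction: {reduction_l1:.2%}, level-2: {reduction_l2:.2%}")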
Table 5.7 – SES only HLM

Country: Mean Achievement (SE) / Student SES (SE) / Percent Disadvantaged (SE) / Model p-value

Chile: 508.9109*** (8.8659) / 6.9365*** (.8012) / -14.9693*** (2.6899) / 0
Finland: 561.7981*** (3.5287) / 9.3808*** (.4187) / -5.4012*** (1.7847) / 0
Ghana: 378.9247*** (34.5256) / 0.2769 (1.529) / -18.9977* (9.2607) / 0
Korea: 570.3911*** (3.5423) / 11.9066*** (.5662) / -4.6277*** (1.6335) / 0
Singapore: 661.6084*** (13.0315) / 2.6286*** (.3954) / -41.9171*** (6.9441) / 0
United States: 576.3826*** (7.7461) / 6.4939*** (.5681) / -15.4053*** (2.4351) / 0

Notes: Source TIMSS 2011 8th Grade, SCIPV1-5, SCIWGT, FML, Robust SE, *=.05, **=.01, ***<.001

Full Model

Chile

In Chile, the student-level predictors continue to be the strongest group for determining science achievement. Being a girl in Chile and having a negative science affect each reduce average achievement by about one quarter of a standard deviation per unit change. Chile shows the second strongest impact of being a girl in the country sample, after Ghana. Student SES also predicts about a six-point increase in achievement for each standard deviation increase in SES, which represents the third highest impact in the country sample. One interesting anomaly is that Chile is the only country where interest and enjoyment of science is not a significant predictor of achievement. This finding will need to be explored further. For the teacher/classroom variables at the second level, students in Chile whose teachers report classroom limitations will on average have achievement that is ten points lower for each level of increase in the limitations, the third strongest impact in the sample. In a similar fashion, students whose teachers report higher expectations for student achievement will on average have ten-point higher achievement for each increase in expectations. Hours of science instruction also continues to be negatively associated with student achievement in Chile. Overall, the significant teacher/classroom variables in Chile add up to about a quarter of a standard deviation in achievement for a single-point change in each variable. For the school-level variables at the second level, students who attend schools with higher percentages of disadvantaged students and schools with a negative school climate have on average 9 and 7 points lower achievement, respectively, for each increase. These are the only school-level variables that are significant in Chile.

Finland

Students in Finland continue to have their achievement strongly linked to attitudinal and affective measures. For each increase in the level of interest and enjoyment, Finnish students increase their achievement by about 12 points on average. Conversely, for each increase in a negative science affect, Finnish students' achievement is likely to decrease by about 34 points, which is the strongest effect in the entire country sample. These two variables combine to create about a half a standard deviation difference in student achievement for a single change in each. For each standard deviation increase in SES, a Finnish student will increase their achievement by about 7 points on average, the second strongest effect after Korea. Finland is one of only two countries where gender does not matter. At the teacher/classroom level, classroom limitations, student expectations, and inquiry-based teaching are the strongest predictors of achievement in Finland.
For students whose teachers report an increase in classroom limitations, their achievement decreases by about 11 points on average for each increase in limitations. For each increase in teachers reporting higher student expectations for achievement, a Finnish student’s achievement increases by about 8 points. Finally, students who have teachers who report using inquiry-based teaching will increase their achievement by about 6 points on average for each increase. These three factors combine together to create about a quarter of a standard deviation difference in student achievement on average with a single point change in each. Being in an urban school is the only significant school predictor at the second level in Finland which results in a decrease of about 3 points for each increase in how urban a school is. This is the first time that being an urban school has been a significant predictor of science achievement in Finland. It is interesting to note that the percentage of disadvantaged students and being in a negative school climate are not significant in Finland which reaffirms the idea that Finland has a relatively more egalitarian system. Ghana Students in Ghana continue not to be impacted by SES. According to this model, however, Ghana shows the largest effects of gender, interest and enjoyment of science, and a negative science affect. Being a girl in Ghana means that your science achievement will be about a quarter of a standard deviation lower. For each increase in the level of interest and enjoyment of science in Ghana, students will increase their achievement by about 37 points on average. In the opposite case, for each increase in a student having a negative science affect that student’s achievement will decrease by about 35 points. These three variables combine to have over a standard deviation difference in achievement. Being bullied in Ghana also significantly decreases student achievement by about 9 points on average. At the second level, Ghana does not have any teacher/classroom variables that are significantly related to achievement, which is very surprising. This could be a problem with the model or a problem with not enough variation to pick up any effects. At the school level, going to an urban school in Ghana is a significant predictor of achievement and the strongest relationship of any country. For each increase in how urban a school is, students in Ghana will increase their achievement by about 14 points on average. This is consistent with the literature where developing countries have better services in urban areas compared to rural ones. Students who attend a school with a 91 negative school climate will see their achievement decrease by about a quarter of a standard deviation on average for each increase in the negativity of the school climate. This may be an indication of safety concerns for students. Travelling to school is known to be an issue in some developing countries as many children have long walks to and from school each day, however, it is not clear that this is the reason from the data. Korea When controlling for all the other factors in the model, Korea shows the largest impact of student SES on achievement with a one standard deviation increase in SES leading to about a 10-point increase in student achievement on average. For each increase in student interest and enjoyment of science in Korea, student achievement will increase by about 37 points on average which is the second highest impact after Ghana. 
Korean students who have a negative science affect will see about a 27-point decrease in their science achievement on average for each increase in their negative affect. These three factors account for about three quarters of a standard deviation difference in student achievement. Gender does not significantly impact achievement in Korea just as in Finland. At the second level of the model, students who have a teacher who has a science major will have about 12 points higher achievement on average. Korean students who have teachers who report a climate that has higher expectations for student achievement also could score about a five-point increase in achievement for each increase in expectations. Korea is also the only country where teacher experience is a significant predictor of student achievement but this has a very small effect on achievement. The only school predictor that is significant in Korea is the percentage of disadvantaged students at a school but Korea has the smallest effect of this of the countries where it is significant with student achievement decreasing by about 5 points on average for each increase in the proportion of disadvantaged students. These small impacts on student achievement continue to support Korea’s position as having a larger amount of the variance in student achievement at the student level. Singapore At the student level, Singapore shows the weakest effect of any country of student SES on achievement with a one standard deviation increase in SES leading to only a 2-point increase in student achievement. This may be due in large part to the fact that a majority of the variance in science achievement in Singapore is at the second level. Girls in Singapore have achievement that is about 14 points lower on average while students who show a negative science affect decrease their achievement by about 23 points on average for each increase in negative 92 affect. Students in Singapore also show the smallest significant effect of interest and enjoyment of science by only increasing achievement by 6 points on average for each increase of interest and enjoyment. All together, these effects account for about a half a standard deviation of student achievement. At the second level, Singapore shows an effect of classroom limitations that is literally six times stronger than the next country. Students in Singapore who are in classrooms where their teachers report an increase in limitations will have their achievement decrease by 61 points on average for each increase in the limitations. Students who have teachers who report a climate that has high expectations for student achievement will have about 28 points higher achievement for each increase, almost double the next highest country. At the school level, students who attend schools with more disadvantaged students will see their achievement decrease by about 30 points on average for each increase in the amount of disadvantaged students. Attending a school with a negative school climate also costs students about 17 points on average for each increase in the negative climate. The significant teacher variables account for about a full standard deviation of student achievement and the significant school variables account for another half a standard deviation. These results clearly support the high proportion in variation of student achievement being at the school level in Singapore. United States At the student level, the United States shows relatively moderate effects relative to other countries in the sample. 
Student achievement increases by about 6 points on average for each standard deviation increase in SES. Being a girl in the United States decreases average achievement by about 13 points while students who have a negative science affect could see a decrease by about 18 points on average for each increase in negative affect. Students in the United States who report higher interest and enjoyment of science will see their achievement increase by about 8 points on average for each increase in the level of interest and enjoyment. These variables as a group account for a little under a half a standard deviation in achievement in the United States. Some unexpected results are that parent involvement is negatively associated with achievement in the United States while students who report being bullied more have higher achievement. The parent involvement finding could suggest that parents are only involved when their child struggles. The bullying could follow the stereotype of smarter kids being picked on in the United States. At the second level, the United States only has one variable that significantly predicts student achievement: the percentage of disadvantaged students. For each increase in the proportion of disadvantaged students at the 93 school, students in the United States will likely on average have their achievement drop by about 11 points for each increase in the proportion of disadvantaged students. There is not a single teacher variable in the United States that significantly predicts student achievement which is similar to Ghana and again could be due to the model not picking up important teacher characteristics. 94 Table 5.8 – Full Fixed Effects HLM Chile Finland Ghana Korea Singapore United States Bullied Parent Involvement Percent Disadvantaged Students -24.3595*** -1.3884 -2.0563 -8.9878*** (2.1132) (3.412) (1.806) (2.4705) 11.5638*** -33.5528*** 2.8246 -10.2549*** -2.9579 (1.9297) (1.9743) (2.5821) (2.0218) (1.2305) (1.6978) -0.6331 -24.4696*** 36.6856*** -34.1748*** -9.1949*** 1.1018 -8.8228 (89.0171) (.9715) (3.4875) (5.6989) (2.4141) (2.5537) (3.2191) (6.4332) 570.354*** 9.6991*** 1.1199 26.2152*** -27.1057*** 5.7886* -1.5307 -4.7259*** (29.4229) (.5424) (2.8046) (2.4825) (2.5458) (2.4434) (1.5119) (1.3521) 766.7918*** 1.8767*** -14.0443*** 5.8308* -22.8803*** 4.1253* -1.6155 -29.4777*** (75.2226) (.3567) (2.3996) (2.606) (1.4395) (1.7461) (1.0059) (5.9998) 613.7744*** 5.5289*** -13.1672*** 7.5471*** -18.1501*** 3.9972* -6.6945*** -10.5519*** (51.2574) (.5662) (2.1651) (2.0222) (1.8759) (1.7574) (1.2807) (2.9429) Mean Achievement Interest Enjoyment Neg. Sci Affect Student SES Girl 576.4519*** 6.1112*** (52.3099) (.748) -23.5513*** -2.6882 (2.5637) (2.8067) 578.5832*** 7.3506*** 3.0453 (35.6129) (.3814) 315.3677*** Notes: Source TIMSS 2011 8th Grade, SCIPV1-5, SCIWGT, FML, Robust SE, *=.05, **=.01, ***<.001 95 Table 5.8 (cont’d) Chile Finland Ghana Korea Singapore United States Urban School Neg. 
School Climate School Parent Contact Teacher Experience Teacher Education Hours Science Week Science Major Teacher Cooperation 3.412 -6.7876* 6.4242 0.0392 -0.1232 -5.0513* 8.5333 -0.2466 (2.3292) (3.4031) (-5.4442) (.2016) (6.7358) (2.8527) (5.1519) (3.7345) -3.2734** -3.3095 3.5667 0.0501 5.5724 1.3851 5.1847 -1.8305 (1.1625) (2.7699) (2.4263) (.1285) (4.3621) (1.1858) (3.3274) (1.9590) 14.2426** -24.167** -9.7565 0.4234 0.6297 -1.5634 -21.4241 -7.3811 (4.9245) (7.9008) (9.8175) (.7567) (5.9829) (3.9941) (12.2386) (8.3973) 2.4424 -3.528 -0.4955 .5774*** -4.9145 -0.4889 12.1249* -0.3466 (1.4686) (1.9096) (3.1789) (.1165) (3.0086) (1.2665) (5.3125) (3.3498) 0 -17.4367* 1.9719 0.4412 11.2468 3.4796 -16.3788 -7.6528 0 (7.7748) (8.1236) (.4031) (9.0694) (3.2156) (17.1782) (7.6839) -1.3736 -3.0423 -5.1492 0.4872 0.2468 -1.2824 5.2577 1.2092 (-2.2782) (4.1574) (4.3237) (.2725) (4.9949) (1.3511) (5.8978) (3.1618) 96 Table 5.8 (cont’d) Chile Finland Ghana Korea Singapore United States Classroom Limitations Inquiry Teaching Student Expectations Teacher Support Students -10.5844* -3.0278 10.1944* -0.1143 (4.993) (4.952) (4.7715) (4.3817) -11.1439** 5.7286* 8.1642** 0.0355 (3.6561) (2.5685) (2.8081) (2.3322) -2.9046 7.4159 14.2222 3.0472 (13.8922) (7.7125) (8.3973) (11.3495) -2.4179 0.9539 5.0856* -3.5528 (2.0911) (3.5299) (2.1715) (1.8752) -60.59*** 4.276 27.9078*** -15.7007* (9.7054) (9.7147) (7.2217) (6.5751) -9.5925 0.2345 7.6744 0.3285 (7.4364) (4.8826) (4.7501) (6.4761) 97 Variance Structure Table 5.9 shows the variance structure of the models with respect to how the variance is divided between the two levels and how much the variance decreased as a result of each new model. The percentages are the decreases in the variance of that model for a given level over the previous model. The ICC gives the proportion of the variance that is at the second level compared to the first level. In a sign of the goodness of fit of the models, in all cases for all countries the variance decreases or remains the same as each model becomes more complicated. The ICC also decreases in each case as well within each country and across the models. As with the unconditional models, Singapore shows the highest percentage of the variance at the second level and Korea the highest percentage at the first level for both the SES and full models. When moving from the unconditional model to the SES model, Korea shows the biggest decrease in the variance at both the first and second levels. When moving from the SES to the full models, Korea again shows the biggest decrease in the variances at both levels although at the second level the decrease is equal with Ghana. Ghana shows the smallest change (none) at the first level and at the second level moving from the unconditional to the SES model which is consistent with the absence of an SES effect in Ghana. The United States shows the smallest reduction in variance at both levels moving from the SES to the full model which is also consistent with the lack of significant findings in the United States. Overall, there does appear to be a pattern between where the higher proportion of the variance in achievement is located and the category from which the country was selected for inclusion, specifically the size of the SD of achievement in the country. Singapore, Ghana, and the United States are the three most unequal countries in the sample based on this and these are the three countries with the highest proportion of the variance at the second level. 
The other three countries: Korea, Finland, and Chile, all have the highest amount of their variation in science achievement at the student level and these are the countries with the lowest SD in their student achievement. 98 Table 5.9 – Variance Structure Chile Student Variance Teacher/ School Variance Student Variance Teacher/ School Variance Student Variance Teacher/ School Variance 3331.0200 1802.3591 3202.2778 877.6688 2811.7621 512.6146 3.86% 51.30% 12.19% 41.59% Variance Reduction ICC Finland 0.35 3258.7257 818.3886 Variance Reduction ICC Ghana 7425.3885 4895.0752 ICC 480.4833 Variance Reduction ICC Singapore 2228.6163 6871.3503 ICC 400.8746 10.99% 35.16% 16.46% 24.45% 0.15 3273.6286 2763.5914 0.14 7425.7708 4562.4511 5952.5242 2334.979 0.00% 6.80% 19.84% 48.82% 0.38 0.28 4738.18 164.0572 3766.3122 84.3395 12.77% 65.86% 20.51% 48.59% 0.03 0.02 2204.9029 5512.6612 1778.8315 3931.53 1.06% 19.8% 19.32% 28.68% 0.76 Variance Reduction ICC 2423.1212 0.08 Variance Reduction United States 530.6341 0.40 5431.9957 0.15 2900.5822 0.20 Variance Reduction Korea 0.22 0.71 0.69 3142.0739 1819.8157 2794.7455 1558.8103 4.02% 34.2% 11.05% 14.34% 0.46 0.37 99 0.36 Logistic Regressions The purpose of the logistic analysis was to attempt to key in on low-SES students and the odds of their differences in achievement as it relates to other variables. The goal was to uncover some potential compensatory effects that could help explain how low-SES students could achieve higher than their predicted achievement taking what was learned from the previous analyses and some hypotheses about what might matter. Nine variables were identified that consistently produced a strong effect on science achievement across the countries or were of interest to study: • Gender • Interest and enjoyment of science • Science affect • Percentage of disadvantaged peers • School climate • Teacher experience • Teacher with a science major • Classroom limitations to learning • Expectations for student achievement The tables for each variable can be seen in Appendix 4 which indicate that of the 54 individual regressions run, only six showed cases where low-SES students were not significantly more likely to have lower achievement controlling for the variable of interest and five of these six cases were in Ghana where it has been seen that the SES variable is questionable as constructed. This reinforces the impact of SES on achievement. Odds of Low Achievement in Science Table 5.10 shows the results of the logistic regression where the outcome variable was being in the low achieving group (0 or 1). In every country except Ghana, low-SES students show significantly higher odds of being in the low achieving group compared to higher-SES students when controlling for the other variables in the regression. This reinforces the relationship between SES and achievement. In Chile and Ghana, girls have greater odds of having low achievement in science compared to boys holding the other variables constant which has been consistent across models. These odds are 1.6 and 1.74 greater respectively. The strongest variable in this model for predicting whether a student has higher odds of being low achieving is science affect. In every country, students 100 who have a positive science affect have significantly lower odds of being low achieving compared to students with a more negative affect. The strongest odds of this are in Ghana at 1.77 greater odds and the weakest odds are in Singapore at 1.5 greater odds. 
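To make the reporting of these odds concrete (an interpretive note; the value below is illustrative and assumes the table entries are odds ratios): a protective factor appears in Table 5.10 as an odds ratio below one, and the text reports its reciprocal. For example, an odds ratio of about OR = 0.57 for positive science affect implies 1/OR = 1/0.57 ≈ 1.77, that is, roughly 1.77 times lower odds of being in the low-achieving group.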
The second strongest variable for predicting the odds of low achievement in this model is attending a school with a lower proportion of disadvantaged peers, which is significant in every country except Finland. This variable captures the SES effect at the school level, and the fact that it is not significant in Finland continues the pattern of equality in Finland. The strongest effect is in Singapore, where students who attend schools with lower proportions of disadvantaged students have 1.62 times lower odds of having low achievement compared to students who attend schools with greater proportions of disadvantaged students. Interest and enjoyment of science is also significantly related to low achievement in every country except Chile, which has been a consistent exception, and Singapore. Korea shows the strongest effect, with students who have high interest and enjoyment having 1.5 times lower odds of low achievement compared to students who have lower levels of interest and enjoyment. There are also some country-specific results from this model. In Ghana and Korea, students who attend a school with a positive school climate have significantly lower odds of low achievement compared to students who attend a school with a negative school climate; these odds are 1.5 and 1.17 times lower, respectively. In Finland and Singapore, students whose teachers report fewer limitations have significantly lower odds of low achievement compared to students whose teachers report more limitations. This effect is especially strong in Singapore, at 1.7 times lower odds, compared to 1.22 times lower odds in Finland. Similarly, students whose teachers report high expectations for student achievement are significantly less likely to have low achievement in Chile, Finland, and Korea compared to students whose teachers report low expectations. Finally, the three teacher-specific variables (teacher experience, having a teacher with a science major, and having a teacher who uses inquiry-based practices) had no effect on student achievement in these logistic regressions when controlling for the other independent variables.
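To make the model behind Table 5.10 concrete, here is a minimal Stata sketch of how one of these country-level logistic regressions could be specified. This is an illustration, not the exact code used for the analysis: the variable names (lowach, lowses, girl, intenjoy, posaffect, lessdisadv, posclimate, expteach, scimajor, lowlimit, inquiry, highexpect) are hypothetical placeholders for the dichotomized indicators described above, the country code shown is arbitrary, and a full TIMSS analysis would also combine all five science plausible values and use the jackknife replicate weights rather than rely on a single run with clustered standard errors.

*Sketch only: odds of low achievement for one country (cf. Table 5.10); all variable names are placeholders*
logit lowach lowses girl intenjoy posaffect lessdisadv posclimate ///
    expteach scimajor lowlimit inquiry highexpect ///
    if idcntry==152 [pweight=totwgt], or vce(cluster idschool)
*the or option reports odds ratios; ratios below one mean lower odds of being in the low achieving group*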
101 Table 5.10 – Odds of Low Achievement in Science Chile Finland Ghana Korea Singapore USA Low SES Girl 2.6038*** (.2816) 2.72*** (.2516) 1.0865 (.1343) 2.876*** (.2166) 2.4014*** (.2109) 2.6207*** (.2153) Experienced Teacher 1.598*** (.1419) 0.9134 (.0742) 1.7381*** (.1397) 1.1401 (.0836) 1.1272 (.0947) 1.1702* (.0743) Science Major 0.9899 0.7762 (.1515) (.1213) 1.0316 0.7792 Finland (.099) (.1011) 0.7057 1.0344 Ghana (.144) (.1956) 0.8776 0.7282 Korea (.1113) (.275) 0.9219 1.2771 Singapore (.2165) (.7237) 0.9068 0.8548 USA (.1335) (.1308) Source: TIMSS 2011 8th grade science data Chile Interest Enjoyment Positive Science Affect Less Disadvantaged Peers 0.9979 (.1089) .7277** (.0732) .6171*** (.0731) 0.4498*** (.0487) 1.0029 (.1124) .8076* (.0795) Low Limitations 0.4259*** (.0429) .3804*** (.0471) .2276*** (.0271) .3059*** (.0272) 0.4948*** (.0536) .4404*** (.0395) Inquiry Teaching 0.4712*** (.0859) 0.9074 (.1184) .5273* (.1382) .7081*** (.0747) 0.3804*** (.0913) .3986*** (.0838) High Student Expectations 0.8337 (.1434) .7771** (.0631) 0.7264 (.1792) 1.0757 (.1093) 0.2949*** (.0715) 0.9782 (.1292) 1.0235 (.1585) 0.8634 (.0845) 0.8946 (.1985) 0.981 (.103) 0.7821 (.1883) 0.8787 (.1377) .5675** (.1155) .8157* (.0802) (.7867) (.1985) .7587** (.0659) 0.6283 (.1536) 0.8986 (.1617) 102 Positive School Climate 0.7302 (.1254) 0.9762 (.1384) .4959*** (.0943) .8256* (.0774) 0.8159 (.3657) 0.8527 (.1762) Odds of Low SES Table 5.11 shows the odds of low SES given the same variables as in the achievement analysis. Here the analysis is a little finer grained as there are not as many significant relationships to sort through. This indicates that the relationship of these variables and achievement may be stronger than it is between these variables and SES. The only variable that shows a consistent relationship across countries with low SES is low achievement. Not surprisingly, low achieving students have greater odds of being low SES than students with high achievement. Also not surprisingly, students who attend schools with more economically advantaged students are less likely to be low SES compared to those students who attend schools with more economically disadvantaged students. This is significant in every country except Ghana. Interest and enjoyment is a significant predictor of low SES in four of the six countries with Chile and Ghana being the exceptions. The analysis shows that students with high levels of interest and enjoyment have lower odds of being low SES compared to students who have lower levels of interest and enjoyment. This is strongest in Finland where students with high interest and enjoyment have 1.32 lower odds of being low SES compared to students with lower levels of interest and enjoyment. Furthermore, students that have a positive science affect have significantly lower odds of being low SES compared to students that have a more negative science affect in four of the six countries with Ghana and Korea being the exceptions in this case. This effect is again strongest in Finland where students with a positive science affect have 1.29 lower odds of being low SES compared to students with a negative science affect. These are very interesting findings in that there appear to be some relationships between a student’s attitude in science and the odds of them being low SES. Having an experienced teacher is significant in half the countries but the direction is inconsistent. 
In Ghana and the United States, students who have an experienced teacher are less likely to be low SES compared to those students who have a less experienced teacher. However, in Chile the opposite is true. This could mean that experienced teachers in Chile are purposely sorted toward low SES students or that they are given some type of incentive to work with them. The exact reason is not clear from the data but this finding will need further investigation to determine if there is a reason or if this is just a coincidence. Three variables had no relationship in predicting low SES: a positive school climate, a teacher that uses inquiry based teaching, and being in a classroom with low amounts of limitations. These three variables appear to operate mostly independently of SES. In Chile, as continues to be a country-specific pattern, students who have a 103 teacher with a science major have 1.34 lower odds of being low SES compared to students who have a teacher that does not have a science major. Lastly, in Singapore students who have a teacher that reports high expectations for student achievement have 1.28 lower odds of being low SES compared to students that have a teacher that does not report high expectations for student achievement. 104 Table 5.11 Odds of Low SES Low Achievement Chile Finland Ghana Korea Singapore USA 1.2517*** (.2719) 2.7496*** (.2481) 1.0625 (.1297) 2.8433*** (.2157) 2.3539*** (.2069) 2.5945*** (.2134) Science Major 0.6637* (.1089) 0.9547 Finland (.1018) 0.8445 Ghana (.1511) 1.2685 Korea (.2905) 0.9733 Singapore (.1964) 0.8858 USA (.1091) Source: TIMSS 2011 8th grade data Chile Interest Enjoyment Positive Science Affect Less Disadvantaged Peers 0.9853 (.0958) .6842*** (.0693) 0.9864 (.0868) .8112* (.0666) .8247** (.0576) .8269* (.0625) Low Limitations 0.7548** (.0716) .7135** (.0747) 0.8271 (.0847) 0.9345 (.0774) .8456* (0702) .7697*** (.0533) Inquiry Teaching 0.3852*** (.0594) .7759* (.0808) 0.7349 (.1839) .5173*** (.0515) .5886*** (.0426) .5651*** (.0671) High Student Expectations 0.9019 (.1737) 0.8921 (.0888) 0.8827 (.1982) 1.0756 (.1048) 0.8517 (.0708) 0.8703 (.0819) 1.0301 (.1683) 0.8992 (.0873) 0.9131 (.1757) 0.9926 (.0952) 1.0254 (.0855) 0.9086 (.0838) 1.0111 (.2138) 0.9645 (.1233) 0.8648 (.2013) 0.9112 (.1108) 0.7214*** (.0609) 0.8835 (.1009) 105 Positive School Climate Experienced Teacher 0.9841 (.1438) 0.8787 (.1018) 0.6816 (.1332) 1.1034 (.1047) 1.0122 (.1341) 1.0153 (.1237) 1.3596* (.1984) 0.9533 (.0697) 0.5484** (.0928) 0.9624 (.0794) 0.9762 (.0876) .7701** (.0707) Odds of Low Achievement Interacting Low SES with Model Variables Table 5.12 shows the results of the interaction terms of being a low SES student with a potential compensatory variable from each of the regressions with the odds ratios reported and the standard errors below. In each cell, a regression was run that contained odds of low achievement as the outcome. The predictors were being a low SES student, the variable listed, and the interaction of the variable and low SES. The interaction term is what is reported in the table from each of these regressions. The goal of this analysis was to find significant interaction terms with odds lower than one. This would indicate that given low SES, the variable effect makes the odds of low achievement less than one compared with high SES. Of the cases where the interaction is significant, every single case still shows low SES students more likely to have lower achievement than high SES students even in the presence of a potential compensatory effect. 
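As a concrete illustration of how each cell of Table 5.12 could be produced, the following Stata sketch interacts the low-SES indicator with one candidate compensatory variable (high expectations is used as the example). The same caveats apply as in the earlier sketch: lowach, lowses, and highexpect are hypothetical placeholder names for dichotomized indicators, the country code is illustrative, and the full analysis would pool the five plausible values with jackknife replication.

*Sketch only: one cell of Table 5.12, interaction of low SES with a candidate compensatory variable*
logit lowach i.lowses##i.highexpect if idcntry==152 ///
    [pweight=totwgt], or vce(cluster idschool)
*the odds ratio on the interaction term 1.lowses#1.highexpect is what each cell reports;*
*a significant ratio below one would indicate a compensatory effect for low-SES students*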
The implications of this finding will be discussed in the next section but the results indicate that there is not a single factor strong enough to help SES students achieve at the same odds as their high SES peers. This means that there is not a variable that on its own can act as a silver bullet to compensate for the effects of low SES on achievement. 106 Table 5.12 – Odds of Low Achievement Given Interaction of Low SES with Variable Chile Finland Ghana Korea Singapore USA Girl Interest Enjoyment Positive Science Affect Less Disadvantaged Peers Positive School Climate Experienced Teacher Science Major Low Limitations High Student Expectations 1.3893 1.3472 1.3356 1.5701 1.5884 0.7303 1.371 1.2783 1.4664 (.042) (.103) (.082) (0) (.019) (.131) (.081) (.286) (.132) 1.2542 1.771 1.0885 0.9339 0.9905 1.2036 1.2539 1.2102 1.0252 (.143) (.002) (.658) (.702) (.952) (.079) (.086) (.130) (.870) 1.0172 0.7723 2.14 1.7412 1.6266 1.1507 0.8263 1.2532 1.2665 (.911) (.088) (0) (.033) (.01) (.526) (.329) (.342) (.265) 1.3271 1.6653 2.3119 1.178 0.9343 0.9364 1.5681 0.9065 1.291 (.03) (.011) (0) (0) (.664) (.635) (.379) (.476) (.146) 1.2896 1.2681 1.196 1.2754 1.4719 1.3032 0.9858 0.8265 1.1821 (.02) (.107) (.209) (.116) (.273) (.191) (.971) (.402) (.358) 1.5596 1.0457 1.3417 1.1382 0.9069 1.4 0.9248 1.1811 1.391 (0) (.653) (.033) (.330) (.517) (.014) (.653) (.457) (0) Source: TIMSS 2011 8th grade data 107 In summary, many of the findings in the literature are affirmed. These include the large effects of gender and SES on student achievement. Also at the student level are effects of student attitudes toward science on science achievement. At the teacher level, the effect of teacher attitudes and limitations of the classroom are also found to have relationships with science achievement. At the school level, the proportion of economically disadvantaged students in a school shows relationships with science achievement in almost every instance. Whether a school is urban or rural appears to be a contextual effect. Some findings from this analysis are different from the literature. These models show little to no effect of parent involvement on student achievement. This could be a function of other variables taking away any effect that may not have been accounted for in the analyses done in the literature. The main candidate for which variable would take away the parent effect would be SES but this is just speculation. Another major difference from the literature is that the teacher quality variables also showed little to no effect on student achievement. This could again be due to the controls in the model or it could be that there was not enough variation in these factors as recorded in the dataset. Of interest in this analysis is the effect of SES as it relates to these factors. There appears to be a strong attitudinal component as it relates to SES with students with less favorable attitudes towards science showing greater odds of being low SES. Given the magnitude of the relationship between science achievement and attitude towards science this could be a big concern. The implications of this result and the others will be discussed in the next chapter. 108 Chapter 6 – Discussion and Conclusion This study started with the full sample of the 2011 TIMSS eight grade science dataset. After plotting the countries on the basis of their overall achievement score in science and the standard deviation of that score, a clear negative relationship was observed. 
Furthermore, when making the same plot for the 25th and 75th percentiles it was noted that the slope for the 25th percentile was significantly steeper indicating that the lower scoring students were more impacted by changes in the SD (inequality) of the test scores. As a result, a six-country sample was drawn with four countries representing each high/low combination of achievement and SD and two additional countries that were in the middle of the combinations. The motivation behind this selection was to explore any potential relationships between achievement and the standard deviation of that achievement, specifically as it relates to the SES of students. Starting with this sample, the aim of the study was to build on the existing research around socio-economic status (SES) and achievement by exploring in more detail the conditions in schools and classrooms around the world that might magnify or reduce the effect of SES on student achievement. More specifically, the analysis looked at the questions of: “What conditions help low SES students achieve higher than what would be expected given their SES?” and “What conditions hinder low SES students in the sense that they achieve at or below what would be expected given their SES?” For this study, SES is important not only for the financial, human, cultural, and social capital that are to the advantage or disadvantage of young people, but also because it acts, in part, as a proxy for the other prior effect of inequalities students have experienced. With TIMSS, the study is only a single snapshot in time but it would be reasonable to conclude that assuming a student’s SES remained fairly static, that it embodies correlated effects of other variables from previous years. Variables were selected that had a basis in the existing literature as well as an empirical basis from preliminary analysis of the TIMSS 2011 dataset that utilized a range of factor analyses. The results from the previous chapter suggest that there are clearly inequities in achievement and that these inequities may be further increased by other factors. These factors are present at all levels of analysis: the student level, the teacher/classroom level, and the school level. There are also variables that consistently had no impact at all levels with respect to student science achievement and there are also variables that were impactful but only within specific countries. Overall, as noted in the logistic regressions, there are no silver bullets present in these data that can do much on their 109 own to help low-SES students overcome their predicted achievement disadvantage. However, there does appear to be the potential for a combination of factors being able to do more. Inequalities SES SES is at the heart of this analysis and this study has demonstrated the strength of this relationship. Using an HLM analysis, this study looked at the impact of individual SES as well as school SES on science achievement. With the exception of Ghana, every country has a significant relationship with SES at the student level in the HLM analysis. Ghana does not show a relationship at either level in the HLM analysis due to the SES data not being normally distributed and some large inconsistencies in the relationship between achievement and SES that would seem to defy reason. At the school level, four countries showed an impact of SES on student achievement with Finland and Ghana being the exceptions. 
The Finnish exception is believable, given what we know about the relatively egalitarian structure of their school and social system. There appears to be a reciprocal relationship between a lack of SES impact at one level showing an impact at the other level instead. Korea and Finland show the strongest impact of SES on achievement at the student level but they show little or no impact of SES at the school level. This would suggest that the majority of differences in achievement in these countries are due to differences between students rather than schools and the HLM variance analysis supports this conclusion. Finland and Korea have the least amount of variance in student achievement at the school level. This would suggest that Finland and Korea have schools that are very similar to each other. On the other extreme is Singapore which shows the smallest impact of student SES on achievement but the largest at the school level at three times the impact in SES in the country where the relationship is the next strongest. The HLM variance analysis supports this as Singapore has the highest percentage of variance in student achievement at the school level, twice that of the next highest country the United States. This is not surprising given the numerous schooling options in Singapore which are very different from one another and produce all this differentiation. Chile and the United States show moderate but significant effects of SES on achievement at both levels that are more balanced. However, the United States shows over twice the percentage of variance in science achievement at the school level compared to Chile where more of the variance is at the student level. In the United States, this is most likely due to the differences between the generally poorer and lower-performing schools against the wealthier and better-performing suburban schools. 110 Gender In addition to SES, another very large source of inequality is related to gender, especially in science achievement. Therefore, when exploring issues of equity, gender is another very helpful variable to help highlight the magnitude of inequity within a country. In countries where gender has a strong relationship with achievement, it would be misleading not to include gender in the models. The literature reviewed was very clear about the disadvantage girls face worldwide in science and this study also reinforces that finding. Table 6.1 shows the mean achievement scores for each country by gender and a t-test of their differences. Based on the t-tests both Chile and Ghana are above the international average for the difference with Singapore showing no difference and Finland showing a difference in favor of girls. In the main analysis, following a similar pattern to student SES, girls face a significant penalty in their achievement for their gender in all countries except Korea and Finland. This finding is important in that Korea and Finland are two of the higher achieving and more equal (in achievement SD terms) countries in this sample. It is interesting to note that the two countries with the highest impact of gender on achievement, Ghana and Chile at about a quarter of a standard deviation in each country, are the lowest achieving of the sample. When essentially half of the student population is facing a deficit it would seem to make sense that the overall scores of a country would suffer. 
The United States and Singapore show a moderate penalty of gender on science achievement but the difference is that Singapore has such high mean achievement compared to the other countries that the impact is not as apparent in the country scores. Given these interesting relationships in the findings, a future study should look at the impact of gender and other factors on student achievement, similar to this with SES. 111 Table 6.1 – Gender Comparison by Country Mean SD N Chile Boy 480.79 78.05 2697 Girl 467.27 76.74 3122 Finland Boy 550.11 68.87 2166 Girl 555.16 63.33 2061 Ghana Boy 324.92 112.67 3802 Girl 291.87 110.15 3492 Korea Boy 564.76 80.17 2502 Girl 558.97 73.69 2663 Singapore Boy 585.67 103.47 2992 Girl 585.26 90.86 2932 USA Boy 529.87 83.97 5164 Girl 519.51 78.16 5275 Int. Avg. Boy 498.12 128.57 19323 Girl 489.49 130.17 19545 Source: TIMSS 2011 8th Grade 112 t-value p-value 6.65 0 -2.48 0.007 12.67 0 2.7 0.003 0.16 0.5643 6.53 0 6.57 0 Factors that Enhance or Reduce this Effect Table 6.2 summarizes the effects of each variable by country within each analysis. Knowing now that SES has largely impacted science achievement on its own, the next question becomes what factors consistently seem to enhance or compensate for the effect of SES on achievement. With the introduction of other variables the SES effect will be reduced or possibly enhanced which is what the study is trying to capture. To answer these questions, the HLM analysis needs to be taken in conjunction with the logistic regression analysis. For example, looking at the cells under the negative science affect variable shows that every single analysis in every country shows a significant result with achievement. One thing that is clear is that there are factors at the student, teacher, classroom, and school levels that matter with respect to the effect of SES on achievement. 113 Table 6.2 – Summary of Variable Significance by Country Correlations OLS Regression Logit HLM Chile Sig. Sig. Sig. Sig. Finland Sig. Sig. Sig. Sig. Ghana Sig. Sig. Sig. NS Korea Sig. Sig. Sig. NS Singapore Sig. Sig. Sig. Sig. United States Sig. Sig. Sig. Sig. Chile Sig. Sig. Sig. Sig. Finland Sig. NS NS NS Ghana Sig. Sig. Sig. Sig. Korea Sig. NS NS NS Singapore NS NS NS Sig. United States Sig. Sig. Sig. Sig. Chile NS Sig. NS NS Finland Sig. Sig. Sig. Sig. Ghana Sig. Sig. Sig. Sig. Korea Sig. Sig. Sig. Sig. Singapore Sig. Sig. NS Sig. United States Sig. Sig. Sig. Sig. Chile Sig. Sig. Sig. Sig. Finland Sig. Sig. Sig. Sig. Ghana Sig. Sig. Sig. Sig. Korea Sig. Sig. Sig. Sig. Singapore Sig. Sig. Sig. Sig. SES Girl Interest/Enjoyment Neg. Sci. Affect 114 Table 6.2 (cont’d) United States Sig. Sig. Sig. Sig. Chile Sig. Sig. N/A NS Finland Sig. NS N/A NS Ghana Sig. Sig. N/A Sig. Korea NS NS N/A Sig. Singapore Sig. Sig. N/A Sig. United States NS NS N/A Sig. Chile Sig. Sig. N/A NS Finland Sig. Sig. N/A Sig. Ghana Sig. NS N/A NS Korea Sig. NS N/A NS Singapore Sig. Sig. N/A NS United States NS Sig. N/A Sig. Bullied Parent Involvement Source: TIMSS 2011 8th Grade 115 Table 6.2 (cont’d) Teacher Experience Chile Sig. NS NS NS Finland Sig. NS NS NS Ghana Sig. Sig. NS NS Korea Sig. Sig. NS Sig. Singapore Sig. NS NS NS United States Sig. NS NS NS Chile Sig. NS N/A NS Finland NS NS N/A NS Ghana Sig. Sig. N/A NS Korea NS NS N/A NS Singapore Sig. NS N/A NS United States Sig. NS N/A NS Chile Sig. Sig. N/A Sig. Finland NS NS N/A NS Ghana Sig. NS N/A NS Korea NS NS N/A NS Singapore Sig. NS N/A NS United States NS NS N/A NS Chile Sig. Sig. NS NS Finland Sig. Sig. 
NS NS Ghana NS NS NS NS Korea NS Sig. NS NS Singapore NS NS NS NS United States NS NS NS NS Teacher Education Hrs. Sci./week Science Major Teacher Cooperation 116 Table 6.2 (cont’d) Chile Sig. NS N/A NS Finland NS NS N/A NS Ghana Sig. NS N/A NS Korea Sig. NS N/A NS Singapore NS NS N/A NS United States Sig. NS N/A NS Chile Sig. Sig. NS Sig. Finland Sig. Sig. Sig. Sig. Ghana Sig. Sig. NS NS Korea NS NS NS NS Singapore Sig. Sig. Sig. Sig. United States Sig. NS NS NS Chile NS NS NS NS Finland Sig. Sig. NS Sig. Ghana Sig. NS NS NS Korea Sig. NS NS NS Singapore Sig. NS NS NS United States Sig. NS NS NS Chile Sig. Sig. Sig. Sig. Finland Sig. Sig. Sig. Sig. Ghana Sig. Sig. NS NS Korea Sig. Sig. Sig. Sig. Singapore Sig. Sig. NS Sig. United States Sig. Sig. NS NS Sig. NS N/A NS Classroom Limitations Inquiry Teaching Student Expectations Supporting Students Chile 117 Table 6.2 (cont’d) Finland NS NS N/A NS Ghana NS NS N/A NS Korea NS NS N/A NS Singapore Sig. NS N/A Sig. United States Sig. NS N/A NS Chile Sig. Sig. Sig. Sig. Finland Sig. NS NS NS Ghana Sig. Sig. Sig. NS Korea Sig. Sig. Sig. Sig. Singapore Sig. Sig. Sig. Sig. United States Sig. Sig. Sig. Sig. Chile Sig. NS N/A NS Finland NS NS N/A Sig. Ghana Sig. Sig. N/A Sig. Korea Sig. Sig. N/A NS Singapore NA NA N/A NA United States Sig. Sig. N/A NS Chile Sig. Sig. NS Sig. Finland Sig. Sig. NS NS Ghana Sig. Sig. Sig. Sig. Korea Sig. Sig. Sig. NS Singapore Sig. Sig. NS Sig. United States Sig. Sig. NS NS Chile Sig. NS N/A NS Finland NS NS N/A NS Percent Econ. Disadvantaged Urban School Neg. Sch. Climate School Parent Contact 118 Table 6.2 (cont’d) Ghana Sig. NS N/A NS Korea Sig. NS N/A NS Singapore Sig. NS N/A NS United States Sig. NS N/A NS 119 Student Factors The literature indicates that students globally are less interested in science than other subjects for a variety of reasons. This study has reinforced that finding since interest and enjoyment of science is the strongest variable associated with increases in student achievement in science. The size of the effect varies between the different countries. This variable is a composite of questions about students enjoying experiences in science, showing an interest in science content, and how they experience science through their teacher. Interestingly, this effect is the strongest of the student variables compared to other countries in the 2 of the 3 countries where most of the variance is at the student level: Korea and Finland (with Ghana being the 3rd). When looking at the logistic regressions, in all three of these countries having an interest and enjoyment in science means a student is significantly less likely to have low achievement in the full logistic regression compared to students with a lower interest and enjoyment of science. This model also included controls for gender and SES. Students with a high level of interest and enjoyment in science have 1.55 lower odds of having low achievement compared to students with low levels of interest and enjoyment in Korea. These odds are 1.38 in Ghana and 1.27 in Finland. Furthermore in Finland, Korea, Singapore, and the United States, students with high levels of interest and enjoyment are significantly less likely to be low SES compared to students with lower levels of interest and enjoyment. This highlights the importance of finding ways to promote student interest in science as a subject in so far as it appears to have a substantial impact on student achievement and is often biased along SES lines. 
Finding a way to raise interest could also affect students' uptake of careers in science, where there are many labor shortages. An even stronger and more consistent student predictor is whether a student has a negative affect toward science as a subject. This variable is a composite of student responses about negative experiences in science, including difficulty, negative feelings, and lack of confidence, and its correlation with science achievement ranged from .2 to .45. In the full HLM model this is significant in every country, with the strongest effects again in Finland, Korea, and Ghana, where a single point change in affect corresponded with a 34-, 27-, and 34-point change in science achievement respectively. When combined with interest and enjoyment in science, a one-unit change in each of negative science affect and interest and enjoyment of science accounts for about two-thirds of a standard deviation difference in science achievement in Ghana and about half a standard deviation in Korea and Finland, when including controls for gender and SES. A one-unit increase in negative science affect is predictive of lower achievement by just under a quarter of a standard deviation in Chile, the United States, and Singapore. In the logistic regressions, students with a positive science affect have between 1.5 and 1.77 times lower odds of having low achievement compared to students with a more negative science affect. These odds are strongest in Ghana and weakest in Singapore. Furthermore, students in Chile, Finland, Singapore, and the United States who have a positive science affect have significantly lower odds of being low SES compared to students who have a more negative science affect. The results indicate that building better student relationships with science as a subject and improving student experiences in science classrooms have good potential for improving achievement, especially for low-SES students, who appear more disposed to negative feelings about science. Non-Student Factors A second important finding is how few school, teacher, or classroom variables were consistently associated with student achievement to the degree that some of the student variables were. However, there were some factors that showed a moderate effect in most countries, but not all. The only variable that consistently showed this effect was at the school level: students who attend a school with a negative school climate are significantly more likely to have lower achievement in Chile, Ghana, and Singapore. However, in the full logistic regression this effect only holds in Ghana, where the impact is strongest and students in a positive school climate have greater odds of higher achievement compared to students in a more negative school climate. It is important to note that in no country were low-SES students more likely to attend a school with a negative climate. At the teacher and classroom level, as discussed in the following sections, there were two main factors that appeared to affect science achievement, one that helped and one that hindered. Positive Effects on Student Achievement One factor that appears to help improve student achievement at the classroom level is when teachers report that there are high expectations for student achievement. This was significant in four of the six countries in the HLM model, with Ghana and Korea being the exceptions.
This is based on teacher perceptions concerning whether the teachers, parents, and students have expectations for high student achievement. In a continued pattern at the second level, this effect is strongest in Singapore where second-level variables explain most of the variance in achievement and a one-unit change in expectations accounts for just over a quarter of a standard deviation change in achievement on average. In the individual logistic regressions, students whose teachers report higher expectations have lower odds of having lower achievement compared with students whose teachers report lower expectations but in the full logistic regression the effect only holds in Chile and Finland. This suggests that in the other countries expectations 121 for student achievement are confounded with other factors rather than having a strong independent effect on their own. Negative Effects on Student Achievement The main factor that hindered student achievement was teachers who reported a high degree of classroom learning obstacles which is based on such variables as classroom behavior, lack of student nutrition, and students lacking prerequisite knowledge. This was a factor in half of the countries using the HLM model: Singapore, Finland, and Chile. In Singapore, this was an enormous factor with an average single unit change on the Likert scale for classroom obstacles reducing achievement by over half a standard deviation. In Finland and Chile, this effect was lower at about 11 points each but still significant. When looking at the individual logistic regressions this effect still holds but in the full logistic regressions the effect disappears in Chile with respect to achievement. These limitations are all things that are more likely to affect low-SES children such as not having enough to eat, lacking sleep, or lacking the proper prerequisite knowledge that could spill over into the classroom, however, the logistic regression indicates that low SES kids are not significantly more likely to be in these types of classrooms in any country. In general, when summarized across countries, the data confirms the strong impact of SES on achievement at both the student and school level. Gender is also a factor in determining student achievement especially in the lower-achieving countries like Chile and Ghana. The strongest block of variables that predict student achievement are the student-level variables. Specifically, the level of interest and enjoyment a student experiences in science class as well as the degree of negativity of a student’s affect about science show the strongest relationships, which in many countries are affected by SES as well. At the teacher/classroom level, having high expectations for students is a strong predictor of achievement while being in a classroom where the teacher encounters many obstacles to learning has a negative relationship with achievement, even when controlling for other important factors. Expectations for student achievement takes on an SES component, but classroom obstacles do not appear to have an SES component to them. Finally, at the school-level, attending a school that has a negative school climate is a significant predictor of student achievement, but does not appear to have an SES component to it. Country Specific Factors While there were some variables which tended to be generally related to student achievement in the countries studied, there were also some specific variables that uniquely mattered in a specific country or two. 
They are important to highlight because they illustrate the fact that unique contextual factors can also turn out to be statistically significant in these analyses. One of the most salient examples of this is the impact of being a student in an urban school in Ghana. Many developing countries have stark urban/rural differences in the quality of education and Ghana is no exception. According to the full HLM model, a student in Ghana is on average likely to have about 14 points higher achievement with each increase in how urban their school is. This relates back to SES in that the rural areas of Ghana, and of many other developing countries, tend to be the poorest areas of the country. Again, the absence of an SES effect in Ghana may be the result of measurement problems. Another country-specific effect, although small, is related to Finland and its teachers. Finland has become famous for the high quality of its teachers in terms of their preparation and the results of their students, as can be noted in any of the PISA reports as well as some of the IEA reports. Finland is the only country where teachers who report using science pedagogy (inquiry-based teaching methods) with their students see increases in student achievement. This indicates that some teachers in Finland are able to leverage these practices at a level that can generate positive gains in student achievement while others cannot. The variable asks how often teachers use these practices with their students but makes no allowance for the quality of these practices, so the fact that this is significant in Finland suggests that there are probably differences among teachers in how well they implement these practices. In Korea, two unique teacher features were the level of teacher experience and having a teacher with a science major; both have a positive impact on science achievement. The teacher experience factor was very small but significant, whereas having a teacher with a science major had a much stronger effect. These findings highlight a unique aspect of how teachers are certified and continually learn within Korea. It appears that the professional development and ongoing work that Korean teachers undertake provide some unique benefit to their ability to improve student achievement. It also appears that teachers who have a science major are able to transform their subject expertise into better achievement for their students, so Korea may want to study this effect in more detail. In the other countries, there does not seem to be enough variation in the quality of science teacher preparation to make a difference in student achievement. Chile is notable for one significant finding and one lack of a significant finding. Chile is the only country where interest and enjoyment of science is not a significant predictor of student science achievement. The natural question is why this is the case. One hypothesis has to do with the fact that, according to the TIMSS Encyclopedia, Chile has a high-stakes examination in eighth grade. This might simply be a grade effect in that teachers are focusing on teaching to the test and putting less emphasis on making science instruction interesting. The fact that Chile does have a significant effect for negative science affect means this hypothesis could stand. The other unique finding in Chile is that it is the only country where the hours of science instruction significantly predict achievement, and the relationship is negative.
On the surface, it would seem to make sense that the more time spent on teaching science that achievement would increase. However, given the lower achievement of Chile as a country this may be because of remedial teaching or the fact that it takes longer for students to grasp concepts. Recall that Singapore is the only country where the majority of the variance in student achievement is at the school level. This is very clearly a function of the range of schooling options in the country. It also has implications for the future in that if Singapore wants to reduce the variance in achievement, it would be wiser to target the school rather than the student level. According to the TIMSS Encyclopedia, students in Singapore have seven different school options for grade eight so we would expect to see this much variance. After grade six, which is the completion of the primary schooling cycle, students in Singapore take an examination to help determine the next type of schooling they will attend. The sample of students in Singapore used in this data could be attending any of the seven different types of schools. The United States is notable for the fact that it does not have any teacher/classroom variables that are significantly related to achievement and only a single (SES) school level variable is related to science achievement. This means that what drives the variation in student achievement at the school, teacher, and classroom levels in the United States is unclear. Therefore, the only recommendations for student achievement that can be used as a result of the study have to do with the student level. Factors with No Effect At the student level it was important to note that all variables had at least some impact on science achievement. However, at the school and teacher/classroom level, this is not the case. At the school level, in this data, the degree of school and parent contact appears to have no relationship with science achievement. This could be due to the fact that the school can have both negative and positive contact with a parent regarding their child. Sometimes contact is to discuss a child’s poor achievement whereas parents can also contact teachers to increase the achievement of students who are already doing well. It may be worth investigating in more detail this difference to see if in fact relationships do emerge when the purpose of the contact is taken into account. This variable showed a 124 great level of significance in the correlation analysis, but it was not significant in any case in the regression or HLM analysis. This indicates that the significance in the bivariate correlations was most likely due to confounding factors which were accounted for in the other analyses. At the teacher/classroom level, there were also a few variables that showed no impact on science achievement. The first was the level of teacher education. This is most likely due to the fact that this has received a great deal of attention over the years and the majority of countries have implemented some type of teacher reforms in recent times. It is an indication that the variation in the amount of education teachers are receiving is decreasing. However, it could still be the case that disadvantaged students are more likely to be assigned to less well-educated teachers so this might need to be explored further. Initially, teacher level of education significantly correlated with achievement in four of the six countries. 
However, when controls were put in place in the regressions and HLM, the significance disappeared indicating that unaccounted for factors may have been contributing to the significance of the bivariate correlation. Another teacher variable that showed no impact was the degree to which teachers cooperated with one another. This could be related to how teachers used their planning and free time with the result that they just do not work together enough. To understand this in greater detail would require a type of control/treatment randomized trial with collaboration or a study of what happens when teachers do collaborate. It would seem reasonable to expect that teachers working together would be a positive impact but how much is unclear. Similar to the other variables that did not show an effect, teacher cooperation was significant in the correlations but was no longer so when other controls were put into a regression. Recommendations and Conclusions 1. Address deficits of SES and disadvantage found more universally across different contexts. Regardless of the country in the sample, it was clear that SES and disadvantage made a big impact on student achievement as they were the two most strongly linked variables in the model. The hope of this study was to identify factors within the students, in the classroom, or in the school that could help directly eliminate these effects. While it was not possible to identify specific factors that can directly compensate for the effects of SES, it was clear that there are promising relationships between SES, student achievement, and other factors that might be helpful in enhancing or reducing the effects. 125 At the student level, SES is negatively correlated with having a negative affect of science from .1 to .22, and in the logistic regressions, low-SES kids are significantly more likely to have a negative science affect compared to having a positive affect. The HLM analysis shows the impact of this on science achievement is between 18 and 34 points for each unit change in affect depending on the country, in other words, about a quarter of a standard deviation in achievement average. Interest and enjoyment of science produced a similar pattern in that lowSES kids were less likely to have a high interest and enjoyment of science according to the logistic regressions. The correlations range from a .01 to a .23 meaning that increases in SES are associated with increases in levels of interest and enjoyment. The HLM analysis reports that a single change in interest and enjoyment corresponds with a 6 to 37 point change in student achievement depending on the country. The implications for policy are clear within an educational realm, that while it is important to get all kids interested in science as a subject and ensure that they have positive experiences, low SES students should be targeted as they appear to be more strongly linked to negative experiences and it is reasonable to conclude their achievement suffers as a result. At a classroom and teacher level there are similar patterns and opportunities that will not only benefit all students, but low-SES students in particular. The correlation between students of teachers who report high expectations for student achievement and SES are positive and significant in every country ranging from .08 to .33 meaning that as the SES of a student increases, so does the expectations for achievement of that student. 
The logistic regressions show that low-SES students have significantly greater odds of having a teacher report low expectations in half of the countries in the sample. In the HLM analysis, a one-point change in expectations might result in an 8 to 28 point change in student achievement. Since this variable includes student, parent, and teacher expectations as components, it highlights the importance of partnerships among them with respect to student achievement. The obstacles to learning variable shows another trend: SES is negatively associated with it, with correlations from .08 to .31, meaning that as the SES of a student increases, the teacher will generally report fewer obstacles in their classroom. In the countries where this is a significant predictor of achievement the effect is about 10 points; in Singapore, however, the effect is 60 points. Generally speaking, teachers are reporting that these obstacles take away from their ability to teach effectively because they must spend time on things that are not related to learning, and the analysis shows that this does affect achievement. Finally, at the school level, low-SES students overwhelmingly have greater odds of attending a school with other disadvantaged students. Not surprisingly, this affects achievement, most likely due to a combination of resources, economic factors of the community, and a lack of desirability to work in these schools. There is a range of policy options to try to combat this, such as investing in the community, providing incentives to work in these schools, or possibly reassigning these students to different schools. Targeting these schools would appear to be desirable from the standpoint of being able to reach a great number of disadvantaged students at once. 2. Explore where the variation in achievement is located and the magnitude of the disadvantage to determine where action will be most impactful The biggest takeaway from the country-level analysis is that while there are certainly some problems that cross borders, there are also some country-specific factors that affect achievement, in addition to variation in the magnitude of the impact of certain factors on SES. When making any policy decision, it is important to consider these factors as opposed to copying from another country. For example, the source of most of the variation in science achievement in Singapore is at the school and teacher level, a product of the structure of the system there. Therefore, in undertaking any policies aimed at reducing the variation in science achievement, it would make sense for Singapore to target schools and classrooms, where they will have the most impact in reducing variation. To give an idea of the magnitude of the impact this could have, a one-point change in each of the five significant variables in Singapore would yield about a standard deviation and a half change in achievement. Conversely, in Korea most of the variation is at the student level, so policies there could be more effective when they target individual student characteristics. However, Korea has a much lower standard deviation in achievement, so a change of one point in each of the four significant student variables would yield a much more modest effect of about three-quarters of a standard deviation change in achievement. Contrast this with Ghana, which has a much more unequal distribution of achievement.
A one-point change in each of their significant student variables would yield over a standard deviation change in their student achievement. This highlights the importance for policy makers of understanding not only where the variation is located but also the degree of the inequality. There is one final point on the relationship between the overall achievement of a country and the standard deviation of that achievement, since that was a motivating factor behind the country selection represented in Table 6.3. Some clear patterns have emerged from this sample linking whether the variation in achievement is located at the student or the school level and whether girls face a deficit in achievement. Recall that the two high-achieving countries were Finland and Singapore, with Chile and Ghana as the low-achieving countries. The equal distribution in achievement was represented by Finland and Chile while the unequal distribution was represented by Ghana and Singapore. The low SD (more equal) countries in the sample show more of their variance at the student level while the higher SD (more unequal) countries show a higher amount of their variance at the school level relative to the other countries in the sample. This pattern indicates that when the variance in achievement is located more at the school level, the overall achievement will tend to show more variation (be more unequal). This makes sense given that inequality between schools will impact a greater number of students than inequality between individual students. The other part of this pattern is that the higher-achieving countries do not show an impact of girls having lower achievement in science, whereas the two lower-achieving countries do show this impact. The implication is that if half of the student population is disadvantaged on the basis of their gender, it stands to reason that overall country achievement would suffer. These findings are not meant to be causal claims but rather patterns that should be investigated further with more countries. On the surface, there does appear to be some relationship between the level of achievement in a country, the scale of the inequality, and the magnitude of the variables that contribute to that inequality.

Table 6.3 – Country Representation of Achievement, SD, Gender, and Variance Location

                                              Low SD & Variance at Student Level    High SD & Variance at School Level
High Achievement & Girls Not Disadvantaged    Finland                               Singapore
Low Achievement & Girls Disadvantaged         Chile                                 Ghana

Concluding Remarks Quite simply, a sum is only as strong as the parts that make it up. When students within a country are penalized because of their lack of wealth or their gender, it makes the country as a whole weaker because it is not drawing skilled workers from the greatest proportion of the labor force. This is the fundamental human capital argument of this study: large proportions of a nation are not being served to the degree that others are with respect to their education because of factors beyond their control, and as a result this could have long-term implications for the quality of the labor force. Science is an especially powerful example of this; as noted previously in the literature review, the United States imports the majority of its workers in science fields.
Concluding Remarks

Quite simply, a sum is only as strong as the parts that make it up. When students within a country are penalized because of their lack of wealth or their gender, it makes the country as a whole weaker because it is not drawing skilled workers from the greatest possible proportion of the labor force. This is the fundamental human capital argument of this study: large proportions of a nation are not being served to the degree that others are with respect to their education because of factors beyond their control, and as a result the quality of the labor force may suffer in the long term. Science is an especially powerful example of this; as noted in the literature review, the United States imports the majority of its workers in science fields. Imagine what might happen if schools and communities were able to level out some of these disadvantages so that the whole human capital pool is utilized and all students receive equal experiences in science. Generally speaking, females make up about half of the population in a school, yet their science achievement is significantly lower in many cases. Furthermore, wealth is generally distributed disproportionately, so that a greater proportion of the population is less wealthy. To this end, it is reasonable to expect that many of the shortages in science and other fields would be reduced if these disadvantages could be eliminated. This study has attempted, in a global context, to highlight not only the magnitude of the problem but also some possible ways forward that would benefit not only students but the larger society as well.

APPENDICES

APPENDIX

Appendix 1 – Stata Code

Stata code for student variable cleaning

***Cleaning and recoding Student Data Files TIMSS 2011***
*Data has already been merged using the IEA IDB Analyzer*
clear
set more off
use "C:\Users\educ.brunerju\Google Drive\Dissertation\TIMSS\Datasets\2011\TIMSS2011StuBackCountryFilter.dta"

*reorder the dataset*
sort idcntry
order id* it* BS* bs* WGT* totwgt houwgt senwgt jkzone jkrep

*recode gender for boy=0*
replace itsex=0 if itsex==2
label define itsex 1 "girl" 0 "boy" 9 "OMITTED OR INVALID", replace
tab itsex
replace BSBG01=0 if BSBG01==2
label define BSBG01 1 "girl" 0 "boy" 9 "OMITTED OR INVALID", replace
tab BSBG01

*recode language of test so it is reverse coded, never=0, always=3*
replace BSBG03=0 if BSBG03==4
replace BSBG03=5 if BSBG03==3
replace BSBG03=3 if BSBG03==1
replace BSBG03=1 if BSBG03==5
label define BSBG03 0 "never" 1 "sometimes" 2 "almost always" 3 "always", replace
tab BSBG03

*recode possessions to yes=1, no=0*
replace BSBG05A=0 if BSBG05A==2
label define BSBG05A 0 "no" 1 "yes", replace
tab BSBG05A
replace BSBG05B=0 if BSBG05B==2
label define BSBG05B 0 "no" 1 "yes", replace
tab BSBG05B
replace BSBG05C=0 if BSBG05C==2
label define BSBG05C 0 "no" 1 "yes", replace
tab BSBG05C
replace BSBG05D=0 if BSBG05D==2
label define BSBG05D 0 "no" 1 "yes", replace
tab BSBG05D
replace BSBG05E=0 if BSBG05E==2
label define BSBG05E 0 "no" 1 "yes", replace
tab BSBG05E
replace BSBG05F=0 if BSBG05F==2
label define BSBG05F 0 "no" 1 "yes", replace
tab BSBG05F
replace BSBG05G=0 if BSBG05G==2
label define BSBG05G 0 "no" 1 "yes", replace
tab BSBG05G
replace BSBG05H=0 if BSBG05H==2
label define BSBG05H 0 "no" 1 "yes", replace
tab BSBG05H
replace BSBG05I=0 if BSBG05I==2
label define BSBG05I 0 "no" 1 "yes", replace
tab BSBG05I
replace BSBG05J=0 if BSBG05J==2
label define BSBG05J 0 "no" 1 "yes", replace
tab BSBG05J
replace BSBG05K=0 if BSBG05K==2
label define BSBG05K 0 "no" 1 "yes", replace
tab BSBG05K

*recode parent education so IDK is 0 instead of 8 and relabel*
replace BSBG06A=0 if BSBG06A==8
label define BSBG06A 0 "I don't know" 1 "some or did not go to school lv1" 2 "Lower Secondary lv2" 3 "Upper Secondary lv3" ///
4 "Post-Sec Non-Tertiary lv4" 5 "Short Tertiary lv5b" 6 "Bachelor's lv5a" 7 "Beyond Bachelor's lv7lv8", replace
tab BSBG06A
replace BSBG06B=0 if BSBG06B==8
label define BSBG06B 0 "I don't know" 1 "some or did not go to school lv1" 2 "Lower Secondary lv2" 3 "Upper Secondary lv3" ///
4 "Post-Sec Non-Tertiary lv4" 5 "Short Tertiary lv5b" 6 "Bachelor's lv5a" 7 "Beyond Bachelor's lv7lv8", replace
tab BSBG06B

*recode and label student education
expectations* replace BSBG07=0 if BSBG07==7 label define BSBG07 0 "I don't know" 1 "Lower Secondary lv2" 2 " Upper Secondary lv3" /// 3 "Post-Sec Non-Tertiary lv4" 4 "Short Tertiary lv5b" 5 "Bachelor's lv5a" 6 "Beyond Bachelor's lv7lv8", replace tab BSBG07 *Recode student birth questions* replace BSBG08A=0 if BSBG08A==2 label define BSBG08A 0 "no" 1 "yes", replace tab BSBG08A replace BSBG08B=0 if BSBG08B==2 label define BSBG08B 0 "no" 1 "yes", replace tab BSBG08B replace BSBG09A=0 if BSBG09A==2 133 label define BSBG09A 0 "no" 1 "yes", replace tab BSBG09A replace BSBG09B=4 if BSBG09B==1 replace BSBG09B=1 if BSBG09B==3 replace BSBG09B=3 if BSBG09B==4 label define BSBG09B 1 "younger than 5 years old" 2 "5 to 10 years old" 3 "older than 10 years old", replace tab BSBG09B *recode how often questions 10 and 11* replace BSBG10A=5 if BSBG10A==1 replace BSBG10A=6 if BSBG10A==2 replace BSBG10A=2 if BSBG10A==3 replace BSBG10A=1 if BSBG10A==4 replace BSBG10A=4 if BSBG10A==5 replace BSBG10A=3 if BSBG10A==6 label define BSBG10A 1 "never or almost never" 2 "once or twice a month" 3 "Once or twice a week" 4 "every day or almost every day", replace tab BSBG10A replace BSBG10B=5 if BSBG10B==1 replace BSBG10B=6 if BSBG10B==2 replace BSBG10B=2 if BSBG10B==3 replace BSBG10B=1 if BSBG10B==4 replace BSBG10B=4 if BSBG10B==5 replace BSBG10B=3 if BSBG10B==6 label define BSBG10B 1 "never or almost never" 2 "once or twice a month" 3 "Once or twice a week" 4 "every day or almost every day", replace tab BSBG10B replace BSBG10C=5 if BSBG10C==1 replace BSBG10C=6 if BSBG10C==2 134 replace BSBG10C=2 if BSBG10C==3 replace BSBG10C=1 if BSBG10C==4 replace BSBG10C=4 if BSBG10C==5 replace BSBG10C=3 if BSBG10C==6 label define BSBG10C 1 "never or almost never" 2 "once or twice a month" 3 "Once or twice a week" 4 "every day or almost every day", replace tab BSBG10C replace BSBG11A=5 if BSBG11A==1 replace BSBG11A=6 if BSBG11A==2 replace BSBG11A=2 if BSBG11A==3 replace BSBG11A=1 if BSBG11A==4 replace BSBG11A=4 if BSBG11A==5 replace BSBG11A=3 if BSBG11A==6 label define BSBG11A 1 "never or almost never" 2 "once or twice a month" 3 "Once or twice a week" 4 "every day or almost every day", replace tab BSBG11A replace BSBG11B=5 if BSBG11B==1 replace BSBG11B=6 if BSBG11B==2 replace BSBG11B=2 if BSBG11B==3 replace BSBG11B=1 if BSBG11B==4 replace BSBG11B=4 if BSBG11B==5 replace BSBG11B=3 if BSBG11B==6 label define BSBG11B 1 "never or almost never" 2 "once or twice a month" 3 "Once or twice a week" 4 "every day or almost every day", replace tab BSBG11B replace BSBG11C=5 if BSBG11C==1 replace BSBG11C=6 if BSBG11C==2 replace BSBG11C=2 if BSBG11C==3 135 replace BSBG11C=1 if BSBG11C==4 replace BSBG11C=4 if BSBG11C==5 replace BSBG11C=3 if BSBG11C==6 label define BSBG11C 1 "never or almost never" 2 "once or twice a month" 3 "Once or twice a week" 4 "every day or almost every day", replace tab BSBG11C replace BSBG11D=5 if BSBG11D==1 replace BSBG11D=6 if BSBG11D==2 replace BSBG11D=2 if BSBG11D==3 replace BSBG11D=1 if BSBG11D==4 replace BSBG11D=4 if BSBG11D==5 replace BSBG11D=3 if BSBG11D==6 label define BSBG11D 1 "never or almost never" 2 "once or twice a month" 3 "Once or twice a week" 4 "every day or almost every day", replace tab BSBG11D *recode questions 12-19 so disagree is lowest value and agree a lot is highest* *question 12* replace BSBG12A=5 if BSBG12A==1 replace BSBG12A=6 if BSBG12A==2 replace BSBG12A=2 if BSBG12A==3 replace BSBG12A=1 if BSBG12A==4 replace BSBG12A=4 if BSBG12A==5 replace BSBG12A=3 if BSBG12A==6 label define BSBG12A 1 
"Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBG12A replace BSBG12B=5 if BSBG12B==1 replace BSBG12B=6 if BSBG12B==2 replace BSBG12B=2 if BSBG12B==3 136 replace BSBG12B=1 if BSBG12B==4 replace BSBG12B=4 if BSBG12B==5 replace BSBG12B=3 if BSBG12B==6 label define BSBG12B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBG12B replace BSBG12C=5 if BSBG12C==1 replace BSBG12C=6 if BSBG12C==2 replace BSBG12C=2 if BSBG12C==3 replace BSBG12C=1 if BSBG12C==4 replace BSBG12C=4 if BSBG12C==5 replace BSBG12C=3 if BSBG12C==6 label define BSBG12C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBG12C *question 13* replace BSBG13A=5 if BSBG13A==1 replace BSBG13A=6 if BSBG13A==2 replace BSBG13A=2 if BSBG13A==3 replace BSBG13A=1 if BSBG13A==4 replace BSBG13A=4 if BSBG13A==5 replace BSBG13A=3 if BSBG13A==6 label define BSBG13A 1 "Never" 2 "A few times a year" 3 "Once or twice a month" 4 "At least once a week", replace tab BSBG13A replace BSBG13B=5 if BSBG13B==1 replace BSBG13B=6 if BSBG13B==2 replace BSBG13B=2 if BSBG13B==3 replace BSBG13B=1 if BSBG13B==4 replace BSBG13B=4 if BSBG13B==5 137 replace BSBG13B=3 if BSBG13B==6 label define BSBG13B 1 "Never" 2 "A few times a year" 3 "Once or twice a month" 4 "At least once a week", replace tab BSBG13B replace BSBG13C=5 if BSBG13C==1 replace BSBG13C=6 if BSBG13C==2 replace BSBG13C=2 if BSBG13C==3 replace BSBG13C=1 if BSBG13C==4 replace BSBG13C=4 if BSBG13C==5 replace BSBG13C=3 if BSBG13C==6 label define BSBG13C 1 "Never" 2 "A few times a year" 3 "Once or twice a month" 4 "At least once a week", replace tab BSBG13C replace BSBG13D=5 if BSBG13D==1 replace BSBG13D=6 if BSBG13D==2 replace BSBG13D=2 if BSBG13D==3 replace BSBG13D=1 if BSBG13D==4 replace BSBG13D=4 if BSBG13D==5 replace BSBG13D=3 if BSBG13D==6 label define BSBG13D 1 "Never" 2 "A few times a year" 3 "Once or twice a month" 4 "At least once a week", replace tab BSBG13D replace BSBG13E=5 if BSBG13E==1 replace BSBG13E=6 if BSBG13E==2 replace BSBG13E=2 if BSBG13E==3 replace BSBG13E=1 if BSBG13E==4 replace BSBG13E=4 if BSBG13E==5 replace BSBG13E=3 if BSBG13E==6 138 label define BSBG13E 1 "Never" 2 "A few times a year" 3 "Once or twice a month" 4 "At least once a week", replace tab BSBG13E replace BSBG13F=5 if BSBG13F==1 replace BSBG13F=6 if BSBG13F==2 replace BSBG13F=2 if BSBG13F==3 replace BSBG13F=1 if BSBG13F==4 replace BSBG13F=4 if BSBG13F==5 replace BSBG13F=3 if BSBG13F==6 label define BSBG13F 1 "Never" 2 "A few times a year" 3 "Once or twice a month" 4 "At least once a week", replace tab BSBG13F *Question 14* replace BSBM14A=5 if BSBM14A==1 replace BSBM14A=6 if BSBM14A==2 replace BSBM14A=2 if BSBM14A==3 replace BSBM14A=1 if BSBM14A==4 replace BSBM14A=4 if BSBM14A==5 replace BSBM14A=3 if BSBM14A==6 label define BSBM14A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM14A replace BSBM14B=5 if BSBM14B==1 replace BSBM14B=6 if BSBM14B==2 replace BSBM14B=2 if BSBM14B==3 replace BSBM14B=1 if BSBM14B==4 replace BSBM14B=4 if BSBM14B==5 replace BSBM14B=3 if BSBM14B==6 label define BSBM14B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace 139 tab BSBM14B replace BSBM14C=5 if BSBM14C==1 replace BSBM14C=6 if BSBM14C==2 replace BSBM14C=2 if BSBM14C==3 replace BSBM14C=1 if BSBM14C==4 replace BSBM14C=4 if BSBM14C==5 replace BSBM14C=3 if BSBM14C==6 label define BSBM14C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a 
little" 4 "Agree a lot", replace tab BSBM14C replace BSBM14D=5 if BSBM14D==1 replace BSBM14D=6 if BSBM14D==2 replace BSBM14D=2 if BSBM14D==3 replace BSBM14D=1 if BSBM14D==4 replace BSBM14D=4 if BSBM14D==5 replace BSBM14D=3 if BSBM14D==6 label define BSBM14D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM14D replace BSBM14E=5 if BSBM14E==1 replace BSBM14E=6 if BSBM14E==2 replace BSBM14E=2 if BSBM14E==3 replace BSBM14E=1 if BSBM14E==4 replace BSBM14E=4 if BSBM14E==5 replace BSBM14E=3 if BSBM14E==6 label define BSBM14E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM14E replace BSBM14F=5 if BSBM14F==1 replace BSBM14F=6 if BSBM14F==2 replace BSBM14F=2 if BSBM14F==3 140 replace BSBM14F=1 if BSBM14F==4 replace BSBM14F=4 if BSBM14F==5 replace BSBM14F=3 if BSBM14F==6 label define BSBM14F 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM14F *Question 15* replace BSBM15A=5 if BSBM15A==1 replace BSBM15A=6 if BSBM15A==2 replace BSBM15A=2 if BSBM15A==3 replace BSBM15A=1 if BSBM15A==4 replace BSBM15A=4 if BSBM15A==5 replace BSBM15A=3 if BSBM15A==6 label define BSBM15A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM15A replace BSBM15B=5 if BSBM15B==1 replace BSBM15B=6 if BSBM15B==2 replace BSBM15B=2 if BSBM15B==3 replace BSBM15B=1 if BSBM15B==4 replace BSBM15B=4 if BSBM15B==5 replace BSBM15B=3 if BSBM15B==6 label define BSBM15B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM15B replace BSBM15C=5 if BSBM15C==1 replace BSBM15C=6 if BSBM15C==2 replace BSBM15C=2 if BSBM15C==3 replace BSBM15C=1 if BSBM15C==4 replace BSBM15C=4 if BSBM15C==5 replace BSBM15C=3 if BSBM15C==6 141 label define BSBM15C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM15C replace BSBM15D=5 if BSBM15D==1 replace BSBM15D=6 if BSBM15D==2 replace BSBM15D=2 if BSBM15D==3 replace BSBM15D=1 if BSBM15D==4 replace BSBM15D=4 if BSBM15D==5 replace BSBM15D=3 if BSBM15D==6 label define BSBM15D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM15D replace BSBM15E=5 if BSBM15E==1 replace BSBM15E=6 if BSBM15E==2 replace BSBM15E=2 if BSBM15E==3 replace BSBM15E=1 if BSBM15E==4 replace BSBM15E=4 if BSBM15E==5 replace BSBM15E=3 if BSBM15E==6 label define BSBM15E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM15E *Question 16* replace BSBM16A=5 if BSBM16A==1 replace BSBM16A=6 if BSBM16A==2 replace BSBM16A=2 if BSBM16A==3 replace BSBM16A=1 if BSBM16A==4 replace BSBM16A=4 if BSBM16A==5 replace BSBM16A=3 if BSBM16A==6 label define BSBM16A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM16A replace BSBM16B=5 if BSBM16B==1 142 replace BSBM16B=6 if BSBM16B==2 replace BSBM16B=2 if BSBM16B==3 replace BSBM16B=1 if BSBM16B==4 replace BSBM16B=4 if BSBM16B==5 replace BSBM16B=3 if BSBM16B==6 label define BSBM16B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM16B replace BSBM16C=5 if BSBM16C==1 replace BSBM16C=6 if BSBM16C==2 replace BSBM16C=2 if BSBM16C==3 replace BSBM16C=1 if BSBM16C==4 replace BSBM16C=4 if BSBM16C==5 replace BSBM16C=3 if BSBM16C==6 label define BSBM16C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM16C replace BSBM16D=5 if BSBM16D==1 replace BSBM16D=6 if 
BSBM16D==2 replace BSBM16D=2 if BSBM16D==3 replace BSBM16D=1 if BSBM16D==4 replace BSBM16D=4 if BSBM16D==5 replace BSBM16D=3 if BSBM16D==6 label define BSBM16D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM16D replace BSBM16E=5 if BSBM16E==1 replace BSBM16E=6 if BSBM16E==2 replace BSBM16E=2 if BSBM16E==3 replace BSBM16E=1 if BSBM16E==4 replace BSBM16E=4 if BSBM16E==5 143 replace BSBM16E=3 if BSBM16E==6 label define BSBM16E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM16E replace BSBM16F=5 if BSBM16F==1 replace BSBM16F=6 if BSBM16F==2 replace BSBM16F=2 if BSBM16F==3 replace BSBM16F=1 if BSBM16F==4 replace BSBM16F=4 if BSBM16F==5 replace BSBM16F=3 if BSBM16F==6 label define BSBM16F 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM16F replace BSBM16G=5 if BSBM16G==1 replace BSBM16G=6 if BSBM16G==2 replace BSBM16G=2 if BSBM16G==3 replace BSBM16G=1 if BSBM16G==4 replace BSBM16G=4 if BSBM16G==5 replace BSBM16G=3 if BSBM16G==6 label define BSBM16G 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM16G replace BSBM16H=5 if BSBM16H==1 replace BSBM16H=6 if BSBM16H==2 replace BSBM16H=2 if BSBM16H==3 replace BSBM16H=1 if BSBM16H==4 replace BSBM16H=4 if BSBM16H==5 replace BSBM16H=3 if BSBM16H==6 label define BSBM16H 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM16H replace BSBM16I=5 if BSBM16I==1 144 replace BSBM16I=6 if BSBM16I==2 replace BSBM16I=2 if BSBM16I==3 replace BSBM16I=1 if BSBM16I==4 replace BSBM16I=4 if BSBM16I==5 replace BSBM16I=3 if BSBM16I==6 label define BSBM16I 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM16I replace BSBM16J=5 if BSBM16J==1 replace BSBM16J=6 if BSBM16J==2 replace BSBM16J=2 if BSBM16J==3 replace BSBM16J=1 if BSBM16J==4 replace BSBM16J=4 if BSBM16J==5 replace BSBM16J=3 if BSBM16J==6 label define BSBM16J 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM16J replace BSBM16K=5 if BSBM16K==1 replace BSBM16K=6 if BSBM16K==2 replace BSBM16K=2 if BSBM16K==3 replace BSBM16K=1 if BSBM16K==4 replace BSBM16K=4 if BSBM16K==5 replace BSBM16K=3 if BSBM16K==6 label define BSBM16K 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM16K replace BSBM16L=5 if BSBM16L==1 replace BSBM16L=6 if BSBM16L==2 replace BSBM16L=2 if BSBM16L==3 replace BSBM16L=1 if BSBM16L==4 replace BSBM16L=4 if BSBM16L==5 145 replace BSBM16L=3 if BSBM16L==6 label define BSBM16L 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM16L replace BSBM16M=5 if BSBM16M==1 replace BSBM16M=6 if BSBM16M==2 replace BSBM16M=2 if BSBM16M==3 replace BSBM16M=1 if BSBM16M==4 replace BSBM16M=4 if BSBM16M==5 replace BSBM16M=3 if BSBM16M==6 label define BSBM16M 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM16M replace BSBM16N=5 if BSBM16N==1 replace BSBM16N=6 if BSBM16N==2 replace BSBM16N=2 if BSBM16N==3 replace BSBM16N=1 if BSBM16N==4 replace BSBM16N=4 if BSBM16N==5 replace BSBM16N=3 if BSBM16N==6 label define BSBM16N 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBM16N *Question 17* replace BSBS17A=5 if BSBS17A==1 replace BSBS17A=6 if BSBS17A==2 replace BSBS17A=2 if BSBS17A==3 replace BSBS17A=1 if BSBS17A==4 replace BSBS17A=4 if BSBS17A==5 replace 
BSBS17A=3 if BSBS17A==6 label define BSBS17A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS17A 146 replace BSBS17B=5 if BSBS17B==1 replace BSBS17B=6 if BSBS17B==2 replace BSBS17B=2 if BSBS17B==3 replace BSBS17B=1 if BSBS17B==4 replace BSBS17B=4 if BSBS17B==5 replace BSBS17B=3 if BSBS17B==6 label define BSBS17B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS17B replace BSBS17C=5 if BSBS17C==1 replace BSBS17C=6 if BSBS17C==2 replace BSBS17C=2 if BSBS17C==3 replace BSBS17C=1 if BSBS17C==4 replace BSBS17C=4 if BSBS17C==5 replace BSBS17C=3 if BSBS17C==6 label define BSBS17C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS17C replace BSBS17D=5 if BSBS17D==1 replace BSBS17D=6 if BSBS17D==2 replace BSBS17D=2 if BSBS17D==3 replace BSBS17D=1 if BSBS17D==4 replace BSBS17D=4 if BSBS17D==5 replace BSBS17D=3 if BSBS17D==6 label define BSBS17D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS17D replace BSBS17E=5 if BSBS17E==1 replace BSBS17E=6 if BSBS17E==2 replace BSBS17E=2 if BSBS17E==3 replace BSBS17E=1 if BSBS17E==4 147 replace BSBS17E=4 if BSBS17E==5 replace BSBS17E=3 if BSBS17E==6 label define BSBS17E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS17E replace BSBS17F=5 if BSBS17F==1 replace BSBS17F=6 if BSBS17F==2 replace BSBS17F=2 if BSBS17F==3 replace BSBS17F=1 if BSBS17F==4 replace BSBS17F=4 if BSBS17F==5 replace BSBS17F=3 if BSBS17F==6 label define BSBS17F 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS17F replace BSBS17G=5 if BSBS17G==1 replace BSBS17G=6 if BSBS17G==2 replace BSBS17G=2 if BSBS17G==3 replace BSBS17G=1 if BSBS17G==4 replace BSBS17G=4 if BSBS17G==5 replace BSBS17G=3 if BSBS17G==6 label define BSBS17G 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS17G *Question 18* replace BSBS18A=5 if BSBS18A==1 replace BSBS18A=6 if BSBS18A==2 replace BSBS18A=2 if BSBS18A==3 replace BSBS18A=1 if BSBS18A==4 replace BSBS18A=4 if BSBS18A==5 replace BSBS18A=3 if BSBS18A==6 label define BSBS18A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace 148 tab BSBS18A replace BSBS18B=5 if BSBS18B==1 replace BSBS18B=6 if BSBS18B==2 replace BSBS18B=2 if BSBS18B==3 replace BSBS18B=1 if BSBS18B==4 replace BSBS18B=4 if BSBS18B==5 replace BSBS18B=3 if BSBS18B==6 label define BSBS18B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS18B replace BSBS18C=5 if BSBS18C==1 replace BSBS18C=6 if BSBS18C==2 replace BSBS18C=2 if BSBS18C==3 replace BSBS18C=1 if BSBS18C==4 replace BSBS18C=4 if BSBS18C==5 replace BSBS18C=3 if BSBS18C==6 label define BSBS18C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS18C replace BSBS18D=5 if BSBS18D==1 replace BSBS18D=6 if BSBS18D==2 replace BSBS18D=2 if BSBS18D==3 replace BSBS18D=1 if BSBS18D==4 replace BSBS18D=4 if BSBS18D==5 replace BSBS18D=3 if BSBS18D==6 label define BSBS18D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS18D replace BSBS18E=5 if BSBS18E==1 replace BSBS18E=6 if BSBS18E==2 replace BSBS18E=2 if BSBS18E==3 149 replace BSBS18E=1 if BSBS18E==4 replace BSBS18E=4 if BSBS18E==5 replace BSBS18E=3 if BSBS18E==6 label define BSBS18E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree 
a lot", replace tab BSBS18E *Question 19* replace BSBS19A=5 if BSBS19A==1 replace BSBS19A=6 if BSBS19A==2 replace BSBS19A=2 if BSBS19A==3 replace BSBS19A=1 if BSBS19A==4 replace BSBS19A=4 if BSBS19A==5 replace BSBS19A=3 if BSBS19A==6 label define BSBS19A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS19A replace BSBS19B=5 if BSBS19B==1 replace BSBS19B=6 if BSBS19B==2 replace BSBS19B=2 if BSBS19B==3 replace BSBS19B=1 if BSBS19B==4 replace BSBS19B=4 if BSBS19B==5 replace BSBS19B=3 if BSBS19B==6 label define BSBS19B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS19B replace BSBS19C=5 if BSBS19C==1 replace BSBS19C=6 if BSBS19C==2 replace BSBS19C=2 if BSBS19C==3 replace BSBS19C=1 if BSBS19C==4 replace BSBS19C=4 if BSBS19C==5 replace BSBS19C=3 if BSBS19C==6 150 label define BSBS19C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS19C replace BSBS19D=5 if BSBS19D==1 replace BSBS19D=6 if BSBS19D==2 replace BSBS19D=2 if BSBS19D==3 replace BSBS19D=1 if BSBS19D==4 replace BSBS19D=4 if BSBS19D==5 replace BSBS19D=3 if BSBS19D==6 label define BSBS19D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS19D replace BSBS19E=5 if BSBS19E==1 replace BSBS19E=6 if BSBS19E==2 replace BSBS19E=2 if BSBS19E==3 replace BSBS19E=1 if BSBS19E==4 replace BSBS19E=4 if BSBS19E==5 replace BSBS19E=3 if BSBS19E==6 label define BSBS19E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS19E replace BSBS19F=5 if BSBS19F==1 replace BSBS19F=6 if BSBS19F==2 replace BSBS19F=2 if BSBS19F==3 replace BSBS19F=1 if BSBS19F==4 replace BSBS19F=4 if BSBS19F==5 replace BSBS19F=3 if BSBS19F==6 label define BSBS19F 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS19F replace BSBS19G=5 if BSBS19G==1 replace BSBS19G=6 if BSBS19G==2 151 replace BSBS19G=2 if BSBS19G==3 replace BSBS19G=1 if BSBS19G==4 replace BSBS19G=4 if BSBS19G==5 replace BSBS19G=3 if BSBS19G==6 label define BSBS19G 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS19G replace BSBS19H=5 if BSBS19H==1 replace BSBS19H=6 if BSBS19H==2 replace BSBS19H=2 if BSBS19H==3 replace BSBS19H=1 if BSBS19H==4 replace BSBS19H=4 if BSBS19H==5 replace BSBS19H=3 if BSBS19H==6 label define BSBS19H 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS19H replace BSBS19I=5 if BSBS19I==1 replace BSBS19I=6 if BSBS19I==2 replace BSBS19I=2 if BSBS19I==3 replace BSBS19I=1 if BSBS19I==4 replace BSBS19I=4 if BSBS19I==5 replace BSBS19I=3 if BSBS19I==6 label define BSBS19I 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS19I replace BSBS19J=5 if BSBS19J==1 replace BSBS19J=6 if BSBS19J==2 replace BSBS19J=2 if BSBS19J==3 replace BSBS19J=1 if BSBS19J==4 replace BSBS19J=4 if BSBS19J==5 replace BSBS19J=3 if BSBS19J==6 152 label define BSBS19J 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS19J replace BSBS19K=5 if BSBS19K==1 replace BSBS19K=6 if BSBS19K==2 replace BSBS19K=2 if BSBS19K==3 replace BSBS19K=1 if BSBS19K==4 replace BSBS19K=4 if BSBS19K==5 replace BSBS19K=3 if BSBS19K==6 label define BSBS19K 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS19K replace BSBS19L=5 if BSBS19L==1 replace BSBS19L=6 if BSBS19L==2 replace BSBS19L=2 if 
BSBS19L==3 replace BSBS19L=1 if BSBS19L==4 replace BSBS19L=4 if BSBS19L==5 replace BSBS19L=3 if BSBS19L==6 label define BSBS19L 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS19L replace BSBS19M=5 if BSBS19M==1 replace BSBS19M=6 if BSBS19M==2 replace BSBS19M=2 if BSBS19M==3 replace BSBS19M=1 if BSBS19M==4 replace BSBS19M=4 if BSBS19M==5 replace BSBS19M=3 if BSBS19M==6 label define BSBS19M 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS19M replace BSBS19N=5 if BSBS19N==1 replace BSBS19N=6 if BSBS19N==2 153 replace BSBS19N=2 if BSBS19N==3 replace BSBS19N=1 if BSBS19N==4 replace BSBS19N=4 if BSBS19N==5 replace BSBS19N=3 if BSBS19N==6 label define BSBS19N 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBS19N *Recode questions 20A and 21A so they ascend* *Question 20* replace BSBM20A=6 if BSBM20A==1 replace BSBM20A=7 if BSBM20A==2 replace BSBM20A=8 if BSBM20A==4 replace BSBM20A=10 if BSBM20A==5 replace BSBM20A=5 if BSBM20A==6 replace BSBM20A=4 if BSBM20A==7 replace BSBM20A=2 if BSBM20A==8 replace BSBM20A=1 if BSBM20A==10 label define BSBM20A 1 "Never" 2 "Less than once a week" 3 "1 or 2 times a week" 4 "3 or 4 times a week" 5 "Every day", replace tab BSBM20A *Question 21* replace BSBS21A=6 if BSBS21A==1 replace BSBS21A=7 if BSBS21A==2 replace BSBS21A=8 if BSBS21A==4 replace BSBS21A=10 if BSBS21A==5 replace BSBS21A=5 if BSBS21A==6 replace BSBS21A=4 if BSBS21A==7 replace BSBS21A=2 if BSBS21A==8 replace BSBS21A=1 if BSBS21A==10 154 label define BSBS21A 1 "Never" 2 "Less than once a week" 3 "1 or 2 times a week" 4 "3 or 4 times a week" 5 "Every day", replace tab BSBS21A ***End student questionnaire*** ***Start separate Science Subjects Questionnaire*** *Recode question 17 so no=0* replace BSBB17=0 if BSBB17==2 label define BSBB17 0 "no" 1 "yes", replace tab BSBB17 *Recode 18-20 for ascending agreement scale* *Question 18* replace BSBB18A=5 if BSBB18A==1 replace BSBB18A=6 if BSBB18A==2 replace BSBB18A=2 if BSBB18A==3 replace BSBB18A=1 if BSBB18A==4 replace BSBB18A=4 if BSBB18A==5 replace BSBB18A=3 if BSBB18A==6 label define BSBB18A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB18A replace BSBB18B=5 if BSBB18B==1 replace BSBB18B=6 if BSBB18B==2 replace BSBB18B=2 if BSBB18B==3 replace BSBB18B=1 if BSBB18B==4 replace BSBB18B=4 if BSBB18B==5 replace BSBB18B=3 if BSBB18B==6 label define BSBB18B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB18B 155 replace BSBB18C=5 if BSBB18C==1 replace BSBB18C=6 if BSBB18C==2 replace BSBB18C=2 if BSBB18C==3 replace BSBB18C=1 if BSBB18C==4 replace BSBB18C=4 if BSBB18C==5 replace BSBB18C=3 if BSBB18C==6 label define BSBB18C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB18C replace BSBB18D=5 if BSBB18D==1 replace BSBB18D=6 if BSBB18D==2 replace BSBB18D=2 if BSBB18D==3 replace BSBB18D=1 if BSBB18D==4 replace BSBB18D=4 if BSBB18D==5 replace BSBB18D=3 if BSBB18D==6 label define BSBB18D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB18D replace BSBB18E=5 if BSBB18E==1 replace BSBB18E=6 if BSBB18E==2 replace BSBB18E=2 if BSBB18E==3 replace BSBB18E=1 if BSBB18E==4 replace BSBB18E=4 if BSBB18E==5 replace BSBB18E=3 if BSBB18E==6 label define BSBB18E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB18E replace BSBB18F=5 if 
BSBB18F==1 replace BSBB18F=6 if BSBB18F==2 replace BSBB18F=2 if BSBB18F==3 replace BSBB18F=1 if BSBB18F==4 156 replace BSBB18F=4 if BSBB18F==5 replace BSBB18F=3 if BSBB18F==6 label define BSBB18F 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB18F replace BSBB18G=5 if BSBB18G==1 replace BSBB18G=6 if BSBB18G==2 replace BSBB18G=2 if BSBB18G==3 replace BSBB18G=1 if BSBB18G==4 replace BSBB18G=4 if BSBB18G==5 replace BSBB18G=3 if BSBB18G==6 label define BSBB18G 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB18G *Question 19* replace BSBB19A=5 if BSBB19A==1 replace BSBB19A=6 if BSBB19A==2 replace BSBB19A=2 if BSBB19A==3 replace BSBB19A=1 if BSBB19A==4 replace BSBB19A=4 if BSBB19A==5 replace BSBB19A=3 if BSBB19A==6 label define BSBB19A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB19A replace BSBB19B=5 if BSBB19B==1 replace BSBB19B=6 if BSBB19B==2 replace BSBB19B=2 if BSBB19B==3 replace BSBB19B=1 if BSBB19B==4 replace BSBB19B=4 if BSBB19B==5 replace BSBB19B=3 if BSBB19B==6 label define BSBB19B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace 157 tab BSBB19B replace BSBB19C=5 if BSBB19C==1 replace BSBB19C=6 if BSBB19C==2 replace BSBB19C=2 if BSBB19C==3 replace BSBB19C=1 if BSBB19C==4 replace BSBB19C=4 if BSBB19C==5 replace BSBB19C=3 if BSBB19C==6 label define BSBB19C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB19C replace BSBB19D=5 if BSBB19D==1 replace BSBB19D=6 if BSBB19D==2 replace BSBB19D=2 if BSBB19D==3 replace BSBB19D=1 if BSBB19D==4 replace BSBB19D=4 if BSBB19D==5 replace BSBB19D=3 if BSBB19D==6 label define BSBB19D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB19D replace BSBB19E=5 if BSBB19E==1 replace BSBB19E=6 if BSBB19E==2 replace BSBB19E=2 if BSBB19E==3 replace BSBB19E=1 if BSBB19E==4 replace BSBB19E=4 if BSBB19E==5 replace BSBB19E=3 if BSBB19E==6 label define BSBB19E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB19E *Question 20* replace BSBB20A=5 if BSBB20A==1 replace BSBB20A=6 if BSBB20A==2 158 replace BSBB20A=2 if BSBB20A==3 replace BSBB20A=1 if BSBB20A==4 replace BSBB20A=4 if BSBB20A==5 replace BSBB20A=3 if BSBB20A==6 label define BSBB20A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB20A replace BSBB20B=5 if BSBB20B==1 replace BSBB20B=6 if BSBB20B==2 replace BSBB20B=2 if BSBB20B==3 replace BSBB20B=1 if BSBB20B==4 replace BSBB20B=4 if BSBB20B==5 replace BSBB20B=3 if BSBB20B==6 label define BSBB20B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB20B replace BSBB20C=5 if BSBB20C==1 replace BSBB20C=6 if BSBB20C==2 replace BSBB20C=2 if BSBB20C==3 replace BSBB20C=1 if BSBB20C==4 replace BSBB20C=4 if BSBB20C==5 replace BSBB20C=3 if BSBB20C==6 label define BSBB20C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB20C replace BSBB20D=5 if BSBB20D==1 replace BSBB20D=6 if BSBB20D==2 replace BSBB20D=2 if BSBB20D==3 replace BSBB20D=1 if BSBB20D==4 replace BSBB20D=4 if BSBB20D==5 replace BSBB20D=3 if BSBB20D==6 159 label define BSBB18D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB20D replace BSBB20E=5 if BSBB20E==1 replace BSBB20E=6 if BSBB20E==2 replace BSBB20E=2 if BSBB20E==3 replace BSBB20E=1 if 
BSBB20E==4 replace BSBB20E=4 if BSBB20E==5 replace BSBB20E=3 if BSBB20E==6 label define BSBB20E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB20E replace BSBB20F=5 if BSBB20F==1 replace BSBB20F=6 if BSBB20F==2 replace BSBB20F=2 if BSBB20F==3 replace BSBB20F=1 if BSBB20F==4 replace BSBB20F=4 if BSBB20F==5 replace BSBB20F=3 if BSBB20F==6 label define BSBB20F 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB20F replace BSBB20G=5 if BSBB20G==1 replace BSBB20G=6 if BSBB20G==2 replace BSBB20G=2 if BSBB20G==3 replace BSBB20G=1 if BSBB20G==4 replace BSBB20G=4 if BSBB20G==5 replace BSBB20G=3 if BSBB20G==6 label define BSBB20G 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB20G replace BSBB20H=5 if BSBB20H==1 replace BSBB20H=6 if BSBB20H==2 160 replace BSBB20H=2 if BSBB20H==3 replace BSBB20H=1 if BSBB20H==4 replace BSBB20H=4 if BSBB20H==5 replace BSBB20H=3 if BSBB20H==6 label define BSBB20H 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB20H replace BSBB20I=5 if BSBB20I==1 replace BSBB20I=6 if BSBB20I==2 replace BSBB20I=2 if BSBB20I==3 replace BSBB20I=1 if BSBB20I==4 replace BSBB20I=4 if BSBB20I==5 replace BSBB20I=3 if BSBB20I==6 label define BSBB20I 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB20I replace BSBB20J=5 if BSBB20J==1 replace BSBB20J=6 if BSBB20J==2 replace BSBB20J=2 if BSBB20J==3 replace BSBB20J=1 if BSBB20J==4 replace BSBB20J=4 if BSBB20J==5 replace BSBB20J=3 if BSBB20J==6 label define BSBB20J 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB20J replace BSBB20K=5 if BSBB20K==1 replace BSBB20K=6 if BSBB20K==2 replace BSBB20K=2 if BSBB20K==3 replace BSBB20K=1 if BSBB20K==4 replace BSBB20K=4 if BSBB20K==5 replace BSBB20K=3 if BSBB20K==6 161 label define BSBB20K 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB20K replace BSBB20L=5 if BSBB20L==1 replace BSBB20L=6 if BSBB20L==2 replace BSBB20L=2 if BSBB20L==3 replace BSBB20L=1 if BSBB20L==4 replace BSBB20L=4 if BSBB20L==5 replace BSBB20L=3 if BSBB20L==6 label define BSBB20L 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB20L replace BSBB20M=5 if BSBB20M==1 replace BSBB20M=6 if BSBB20M==2 replace BSBB20M=2 if BSBB20M==3 replace BSBB20M=1 if BSBB20M==4 replace BSBB20M=4 if BSBB20M==5 replace BSBB20M=3 if BSBB20M==6 label define BSBB20M 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB20M replace BSBB20N=5 if BSBB20N==1 replace BSBB20N=6 if BSBB20N==2 replace BSBB20N=2 if BSBB20N==3 replace BSBB20N=1 if BSBB20N==4 replace BSBB20N=4 if BSBB20N==5 replace BSBB20N=3 if BSBB20N==6 label define BSBB20N 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBB20N *Question 21 recode so no=0* replace BSBE21=0 if BSBE21==2 162 label define BSBE21 0 "no" 1 "yes", replace tab BSBE21 *Question 22-24 recode for ascending order responses* *Question 22* replace BSBE22A=5 if BSBE22A==1 replace BSBE22A=6 if BSBE22A==2 replace BSBE22A=2 if BSBE22A==3 replace BSBE22A=1 if BSBE22A==4 replace BSBE22A=4 if BSBE22A==5 replace BSBE22A=3 if BSBE22A==6 label define BSBE22A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE22A replace BSBE22B=5 if BSBE22B==1 replace BSBE22B=6 if BSBE22B==2 
replace BSBE22B=2 if BSBE22B==3 replace BSBE22B=1 if BSBE22B==4 replace BSBE22B=4 if BSBE22B==5 replace BSBE22B=3 if BSBE22B==6 label define BSBE22B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE22B replace BSBE22C=5 if BSBE22C==1 replace BSBE22C=6 if BSBE22C==2 replace BSBE22C=2 if BSBE22C==3 replace BSBE22C=1 if BSBE22C==4 replace BSBE22C=4 if BSBE22C==5 replace BSBE22C=3 if BSBE22C==6 label define BSBE22C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE22C 163 replace BSBE22D=5 if BSBE22D==1 replace BSBE22D=6 if BSBE22D==2 replace BSBE22D=2 if BSBE22D==3 replace BSBE22D=1 if BSBE22D==4 replace BSBE22D=4 if BSBE22D==5 replace BSBE22D=3 if BSBE22D==6 label define BSBE22D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE22D replace BSBE22E=5 if BSBE22E==1 replace BSBE22E=6 if BSBE22E==2 replace BSBE22E=2 if BSBE22E==3 replace BSBE22E=1 if BSBE22E==4 replace BSBE22E=4 if BSBE22E==5 replace BSBE22E=3 if BSBE22E==6 label define BSBE22E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE22E replace BSBE22F=5 if BSBE22F==1 replace BSBE22F=6 if BSBE22F==2 replace BSBE22F=2 if BSBE22F==3 replace BSBE22F=1 if BSBE22F==4 replace BSBE22F=4 if BSBE22F==5 replace BSBE22F=3 if BSBE22F==6 label define BSBE22F 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE22F replace BSBE22G=5 if BSBE22G==1 replace BSBE22G=6 if BSBE22G==2 replace BSBE22G=2 if BSBE22G==3 replace BSBE22G=1 if BSBE22G==4 164 replace BSBE22G=4 if BSBE22G==5 replace BSBE22G=3 if BSBE22G==6 label define BSBE22G 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE22G *Question 23* replace BSBE23A=5 if BSBE23A==1 replace BSBE23A=6 if BSBE23A==2 replace BSBE23A=2 if BSBE23A==3 replace BSBE23A=1 if BSBE23A==4 replace BSBE23A=4 if BSBE23A==5 replace BSBE23A=3 if BSBE23A==6 label define BSBE23A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE23A replace BSBE23B=5 if BSBE23B==1 replace BSBE23B=6 if BSBE23B==2 replace BSBE23B=2 if BSBE23B==3 replace BSBE23B=1 if BSBE23B==4 replace BSBE23B=4 if BSBE23B==5 replace BSBE23B=3 if BSBE23B==6 label define BSBE23B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE23B replace BSBE23C=5 if BSBE23C==1 replace BSBE23C=6 if BSBE23C==2 replace BSBE23C=2 if BSBE23C==3 replace BSBE23C=1 if BSBE23C==4 replace BSBE23C=4 if BSBE23C==5 replace BSBE23C=3 if BSBE23C==6 label define BSBE23C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace 165 tab BSBE23C replace BSBE23D=5 if BSBE23D==1 replace BSBE23D=6 if BSBE23D==2 replace BSBE23D=2 if BSBE23D==3 replace BSBE23D=1 if BSBE23D==4 replace BSBE23D=4 if BSBE23D==5 replace BSBE23D=3 if BSBE23D==6 label define BSBE23D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE23D replace BSBE23E=5 if BSBE23E==1 replace BSBE23E=6 if BSBE23E==2 replace BSBE23E=2 if BSBE23E==3 replace BSBE23E=1 if BSBE23E==4 replace BSBE23E=4 if BSBE23E==5 replace BSBE23E=3 if BSBE23E==6 label define BSBE23E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE23E *Question 24* replace BSBE24A=5 if BSBE24A==1 replace BSBE24A=6 if BSBE24A==2 replace BSBE24A=2 if BSBE24A==3 replace BSBE24A=1 if BSBE24A==4 replace BSBE24A=4 if BSBE24A==5 replace 
BSBE24A=3 if BSBE24A==6 label define BSBE24A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE24A replace BSBE24B=5 if BSBE24B==1 replace BSBE24B=6 if BSBE24B==2 166 replace BSBE24B=2 if BSBE24B==3 replace BSBE24B=1 if BSBE24B==4 replace BSBE24B=4 if BSBE24B==5 replace BSBE24B=3 if BSBE24B==6 label define BSBE24B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE24B replace BSBE24C=5 if BSBE24C==1 replace BSBE24C=6 if BSBE24C==2 replace BSBE24C=2 if BSBE24C==3 replace BSBE24C=1 if BSBE24C==4 replace BSBE24C=4 if BSBE24C==5 replace BSBE24C=3 if BSBE24C==6 label define BSBE24C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE24C replace BSBE24D=5 if BSBE24D==1 replace BSBE24D=6 if BSBE24D==2 replace BSBE24D=2 if BSBE24D==3 replace BSBE24D=1 if BSBE24D==4 replace BSBE24D=4 if BSBE24D==5 replace BSBE24D=3 if BSBE24D==6 label define BSBE24D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE24D replace BSBE24E=5 if BSBE24E==1 replace BSBE24E=6 if BSBE24E==2 replace BSBE24E=2 if BSBE24E==3 replace BSBE24E=1 if BSBE24E==4 replace BSBE24E=4 if BSBE24E==5 replace BSBE24E=3 if BSBE24E==6 167 label define BSBE24E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE24E replace BSBE24F=5 if BSBE24F==1 replace BSBE24F=6 if BSBE24F==2 replace BSBE24F=2 if BSBE24F==3 replace BSBE24F=1 if BSBE24F==4 replace BSBE24F=4 if BSBE24F==5 replace BSBE24F=3 if BSBE24F==6 label define BSBE24F 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE24F replace BSBE24G=5 if BSBE24G==1 replace BSBE24G=6 if BSBE24G==2 replace BSBE24G=2 if BSBE24G==3 replace BSBE24G=1 if BSBE24G==4 replace BSBE24G=4 if BSBE24G==5 replace BSBE24G=3 if BSBE24G==6 label define BSBE24G 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE24G replace BSBE24H=5 if BSBE24H==1 replace BSBE24H=6 if BSBE24H==2 replace BSBE24H=2 if BSBE24H==3 replace BSBE24H=1 if BSBE24H==4 replace BSBE24H=4 if BSBE24H==5 replace BSBE24H=3 if BSBE24H==6 label define BSBE24H 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE24H replace BSBE24I=5 if BSBE24I==1 replace BSBE24I=6 if BSBE24I==2 168 replace BSBE24I=2 if BSBE24I==3 replace BSBE24I=1 if BSBE24I==4 replace BSBE24I=4 if BSBE24I==5 replace BSBE24I=3 if BSBE24I==6 label define BSBE24I 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE24I replace BSBE24J=5 if BSBE24J==1 replace BSBE24J=6 if BSBE24J==2 replace BSBE24J=2 if BSBE24J==3 replace BSBE24J=1 if BSBE24J==4 replace BSBE24J=4 if BSBE24J==5 replace BSBE24J=3 if BSBE24J==6 label define BSBE24J 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE24J replace BSBE24K=5 if BSBE24K==1 replace BSBE24K=6 if BSBE24K==2 replace BSBE24K=2 if BSBE24K==3 replace BSBE24K=1 if BSBE24K==4 replace BSBE24K=4 if BSBE24K==5 replace BSBE24K=3 if BSBE24K==6 label define BSBE24K 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE24K replace BSBE24L=5 if BSBE24L==1 replace BSBE24L=6 if BSBE24L==2 replace BSBE24L=2 if BSBE24L==3 replace BSBE24L=1 if BSBE24L==4 replace BSBE24L=4 if BSBE24L==5 replace BSBE24L=3 if BSBE24L==6 169 label define BSBE24L 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", 
replace tab BSBE24L replace BSBE24M=5 if BSBE24M==1 replace BSBE24M=6 if BSBE24M==2 replace BSBE24M=2 if BSBE24M==3 replace BSBE24M=1 if BSBE24M==4 replace BSBE24M=4 if BSBE24M==5 replace BSBE24M=3 if BSBE24M==6 label define BSBE24M 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE24M replace BSBE24N=5 if BSBE24N==1 replace BSBE24N=6 if BSBE24N==2 replace BSBE24N=2 if BSBE24N==3 replace BSBE24N=1 if BSBE24N==4 replace BSBE24N=4 if BSBE24N==5 replace BSBE24N=3 if BSBE24N==6 label define BSBE24N 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBE24N *Question 25* replace BSBC25=0 if BSBC25==2 label define BSBC25 0 "no" 1 "yes", replace tab BSBC25 *Question 26* replace BSBC26A=5 if BSBC26A==1 replace BSBC26A=6 if BSBC26A==2 replace BSBC26A=2 if BSBC26A==3 replace BSBC26A=1 if BSBC26A==4 replace BSBC26A=4 if BSBC26A==5 170 replace BSBC26A=3 if BSBC26A==6 label define BSBC26A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC26A replace BSBC26B=5 if BSBC26B==1 replace BSBC26B=6 if BSBC26B==2 replace BSBC26B=2 if BSBC26B==3 replace BSBC26B=1 if BSBC26B==4 replace BSBC26B=4 if BSBC26B==5 replace BSBC26B=3 if BSBC26B==6 label define BSBC26B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC26B replace BSBC26C=5 if BSBC26C==1 replace BSBC26C=6 if BSBC26C==2 replace BSBC26C=2 if BSBC26C==3 replace BSBC26C=1 if BSBC26C==4 replace BSBC26C=4 if BSBC26C==5 replace BSBC26C=3 if BSBC26C==6 label define BSBC26C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC26C replace BSBC26D=5 if BSBC26D==1 replace BSBC26D=6 if BSBC26D==2 replace BSBC26D=2 if BSBC26D==3 replace BSBC26D=1 if BSBC26D==4 replace BSBC26D=4 if BSBC26D==5 replace BSBC26D=3 if BSBC26D==6 label define BSBC26D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC26D replace BSBC26E=5 if BSBC26E==1 171 replace BSBC26E=6 if BSBC26E==2 replace BSBC26E=2 if BSBC26E==3 replace BSBC26E=1 if BSBC26E==4 replace BSBC26E=4 if BSBC26E==5 replace BSBC26E=3 if BSBC26E==6 label define BSBC26E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC26E replace BSBC26F=5 if BSBC26F==1 replace BSBC26F=6 if BSBC26F==2 replace BSBC26F=2 if BSBC26F==3 replace BSBC26F=1 if BSBC26F==4 replace BSBC26F=4 if BSBC26F==5 replace BSBC26F=3 if BSBC26F==6 label define BSBC26F 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC26F replace BSBC26G=5 if BSBC26G==1 replace BSBC26G=6 if BSBC26G==2 replace BSBC26G=2 if BSBC26G==3 replace BSBC26G=1 if BSBC26G==4 replace BSBC26G=4 if BSBC26G==5 replace BSBC26G=3 if BSBC26G==6 label define BSBC26G 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC26G *Question 27* replace BSBC27A=5 if BSBC27A==1 replace BSBC27A=6 if BSBC27A==2 replace BSBC27A=2 if BSBC27A==3 replace BSBC27A=1 if BSBC27A==4 172 replace BSBC27A=4 if BSBC27A==5 replace BSBC27A=3 if BSBC27A==6 label define BSBC27A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC27A replace BSBC27B=5 if BSBC27B==1 replace BSBC27B=6 if BSBC27B==2 replace BSBC27B=2 if BSBC27B==3 replace BSBC27B=1 if BSBC27B==4 replace BSBC27B=4 if BSBC27B==5 replace BSBC27B=3 if BSBC27B==6 label define BSBC27B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", 
replace tab BSBC27B replace BSBC27C=5 if BSBC27C==1 replace BSBC27C=6 if BSBC27C==2 replace BSBC27C=2 if BSBC27C==3 replace BSBC27C=1 if BSBC27C==4 replace BSBC27C=4 if BSBC27C==5 replace BSBC27C=3 if BSBC27C==6 label define BSBC27C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC27C replace BSBC27D=5 if BSBC27D==1 replace BSBC27D=6 if BSBC27D==2 replace BSBC27D=2 if BSBC27D==3 replace BSBC27D=1 if BSBC27D==4 replace BSBC27D=4 if BSBC27D==5 replace BSBC27D=3 if BSBC27D==6 label define BSBC27D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC27D 173 replace BSBC27E=5 if BSBC27E==1 replace BSBC27E=6 if BSBC27E==2 replace BSBC27E=2 if BSBC27E==3 replace BSBC27E=1 if BSBC27E==4 replace BSBC27E=4 if BSBC27E==5 replace BSBC27E=3 if BSBC27E==6 label define BSBC27E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC27E *Question 28* replace BSBC28A=5 if BSBC28A==1 replace BSBC28A=6 if BSBC28A==2 replace BSBC28A=2 if BSBC28A==3 replace BSBC28A=1 if BSBC28A==4 replace BSBC28A=4 if BSBC28A==5 replace BSBC28A=3 if BSBC28A==6 label define BSBC28A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC28A replace BSBC28B=5 if BSBC28B==1 replace BSBC28B=6 if BSBC28B==2 replace BSBC28B=2 if BSBC28B==3 replace BSBC28B=1 if BSBC28B==4 replace BSBC28B=4 if BSBC28B==5 replace BSBC28B=3 if BSBC28B==6 label define BSBC28B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC28B replace BSBC28C=5 if BSBC28C==1 replace BSBC28C=6 if BSBC28C==2 replace BSBC28C=2 if BSBC28C==3 174 replace BSBC28C=1 if BSBC28C==4 replace BSBC28C=4 if BSBC28C==5 replace BSBC28C=3 if BSBC28C==6 label define BSBC28C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC28C replace BSBC28D=5 if BSBC28D==1 replace BSBC28D=6 if BSBC28D==2 replace BSBC28D=2 if BSBC28D==3 replace BSBC28D=1 if BSBC28D==4 replace BSBC28D=4 if BSBC28D==5 replace BSBC28D=3 if BSBC28D==6 label define BSBC28D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC28D replace BSBC28E=5 if BSBC28E==1 replace BSBC28E=6 if BSBC28E==2 replace BSBC28E=2 if BSBC28E==3 replace BSBC28E=1 if BSBC28E==4 replace BSBC28E=4 if BSBC28E==5 replace BSBC28E=3 if BSBC28E==6 label define BSBC28E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC28E replace BSBC28F=5 if BSBC28F==1 replace BSBC28F=6 if BSBC28F==2 replace BSBC28F=2 if BSBC28F==3 replace BSBC28F=1 if BSBC28F==4 replace BSBC28F=4 if BSBC28F==5 replace BSBC28F=3 if BSBC28F==6 label define BSBC28F 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace 175 tab BSBC28F replace BSBC28G=5 if BSBC28G==1 replace BSBC28G=6 if BSBC28G==2 replace BSBC28G=2 if BSBC28G==3 replace BSBC28G=1 if BSBC28G==4 replace BSBC28G=4 if BSBC28G==5 replace BSBC28G=3 if BSBC28G==6 label define BSBC28G 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC28G replace BSBC28H=5 if BSBC28H==1 replace BSBC28H=6 if BSBC28H==2 replace BSBC28H=2 if BSBC28H==3 replace BSBC28H=1 if BSBC28H==4 replace BSBC28H=4 if BSBC28H==5 replace BSBC28H=3 if BSBC28H==6 label define BSBC28H 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC28H replace BSBC28I=5 if BSBC28I==1 replace BSBC28I=6 if BSBC28I==2 replace BSBC28I=2 if BSBC28I==3 
replace BSBC28I=1 if BSBC28I==4 replace BSBC28I=4 if BSBC28I==5 replace BSBC28I=3 if BSBC28I==6 label define BSBC28I 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC28I replace BSBC28J=5 if BSBC28J==1 replace BSBC28J=6 if BSBC28J==2 replace BSBC28J=2 if BSBC28J==3 176 replace BSBC28J=1 if BSBC28J==4 replace BSBC28J=4 if BSBC28J==5 replace BSBC28J=3 if BSBC28J==6 label define BSBC28J 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC28J replace BSBC28K=5 if BSBC28K==1 replace BSBC28K=6 if BSBC28K==2 replace BSBC28K=2 if BSBC28K==3 replace BSBC28K=1 if BSBC28K==4 replace BSBC28K=4 if BSBC28K==5 replace BSBC28K=3 if BSBC28K==6 label define BSBC28K 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC28K replace BSBC28L=5 if BSBC28L==1 replace BSBC28L=6 if BSBC28L==2 replace BSBC28L=2 if BSBC28L==3 replace BSBC28L=1 if BSBC28L==4 replace BSBC28L=4 if BSBC28L==5 replace BSBC28L=3 if BSBC28L==6 label define BSBC28L 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC28L replace BSBC28M=5 if BSBC28M==1 replace BSBC28M=6 if BSBC28M==2 replace BSBC28M=2 if BSBC28M==3 replace BSBC28M=1 if BSBC28M==4 replace BSBC28M=4 if BSBC28M==5 replace BSBC28M=3 if BSBC28M==6 label define BSBC28M 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace 177 tab BSBC28M replace BSBC28N=5 if BSBC28N==1 replace BSBC28N=6 if BSBC28N==2 replace BSBC28N=2 if BSBC28N==3 replace BSBC28N=1 if BSBC28N==4 replace BSBC28N=4 if BSBC28N==5 replace BSBC28N=3 if BSBC28N==6 label define BSBC28N 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBC28N *Question 29* replace BSBP29=0 if BSBP29==2 label define BSBP29 0 "no" 1 "yes", replace tab BSBP29 *Question 30* replace BSBP30A=5 if BSBP30A==1 replace BSBP30A=6 if BSBP30A==2 replace BSBP30A=2 if BSBP30A==3 replace BSBP30A=1 if BSBP30A==4 replace BSBP30A=4 if BSBP30A==5 replace BSBP30A=3 if BSBP30A==6 label define BSBP30A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP30A replace BSBP30B=5 if BSBP30B==1 replace BSBP30B=6 if BSBP30B==2 replace BSBP30B=2 if BSBP30B==3 replace BSBP30B=1 if BSBP30B==4 replace BSBP30B=4 if BSBP30B==5 replace BSBP30B=3 if BSBP30B==6 178 label define BSBP30B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP30B replace BSBP30C=5 if BSBP30C==1 replace BSBP30C=6 if BSBP30C==2 replace BSBP30C=2 if BSBP30C==3 replace BSBP30C=1 if BSBP30C==4 replace BSBP30C=4 if BSBP30C==5 replace BSBP30C=3 if BSBP30C==6 label define BSBP30C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP30C replace BSBP30D=5 if BSBP30D==1 replace BSBP30D=6 if BSBP30D==2 replace BSBP30D=2 if BSBP30D==3 replace BSBP30D=1 if BSBP30D==4 replace BSBP30D=4 if BSBP30D==5 replace BSBP30D=3 if BSBP30D==6 label define BSBP30D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP30D replace BSBP30E=5 if BSBP30E==1 replace BSBP30E=6 if BSBP30E==2 replace BSBP30E=2 if BSBP30E==3 replace BSBP30E=1 if BSBP30E==4 replace BSBP30E=4 if BSBP30E==5 replace BSBP30E=3 if BSBP30E==6 label define BSBP30E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP30E replace BSBP30F=5 if BSBP30F==1 replace BSBP30F=6 if BSBP30F==2 179 replace BSBP30F=2 if BSBP30F==3 replace 
BSBP30F=1 if BSBP30F==4 replace BSBP30F=4 if BSBP30F==5 replace BSBP30F=3 if BSBP30F==6 label define BSBP30F 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP30F replace BSBP30G=5 if BSBP30G==1 replace BSBP30G=6 if BSBP30G==2 replace BSBP30G=2 if BSBP30G==3 replace BSBP30G=1 if BSBP30G==4 replace BSBP30G=4 if BSBP30G==5 replace BSBP30G=3 if BSBP30G==6 label define BSBP30G 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP30G *Question 31* replace BSBP31A=5 if BSBP31A==1 replace BSBP31A=6 if BSBP31A==2 replace BSBP31A=2 if BSBP31A==3 replace BSBP31A=1 if BSBP31A==4 replace BSBP31A=4 if BSBP31A==5 replace BSBP31A=3 if BSBP31A==6 label define BSBP31A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP31A replace BSBP31B=5 if BSBP31B==1 replace BSBP31B=6 if BSBP31B==2 replace BSBP31B=2 if BSBP31B==3 replace BSBP31B=1 if BSBP31B==4 replace BSBP31B=4 if BSBP31B==5 180 replace BSBP31B=3 if BSBP31B==6 label define BSBP31B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP31B replace BSBP31C=5 if BSBP31C==1 replace BSBP31C=6 if BSBP31C==2 replace BSBP31C=2 if BSBP31C==3 replace BSBP31C=1 if BSBP31C==4 replace BSBP31C=4 if BSBP31C==5 replace BSBP31C=3 if BSBP31C==6 label define BSBP31C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP31C replace BSBP31D=5 if BSBP31D==1 replace BSBP31D=6 if BSBP31D==2 replace BSBP31D=2 if BSBP31D==3 replace BSBP31D=1 if BSBP31D==4 replace BSBP31D=4 if BSBP31D==5 replace BSBP31D=3 if BSBP31D==6 label define BSBP31D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP31D replace BSBP31E=5 if BSBP31E==1 replace BSBP31E=6 if BSBP31E==2 replace BSBP31E=2 if BSBP31E==3 replace BSBP31E=1 if BSBP31E==4 replace BSBP31E=4 if BSBP31E==5 replace BSBP31E=3 if BSBP31E==6 label define BSBP31E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP31E *Question 32* 181 replace BSBP32A=5 if BSBP32A==1 replace BSBP32A=6 if BSBP32A==2 replace BSBP32A=2 if BSBP32A==3 replace BSBP32A=1 if BSBP32A==4 replace BSBP32A=4 if BSBP32A==5 replace BSBP32A=3 if BSBP32A==6 label define BSBP32A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP32A replace BSBP32B=5 if BSBP32B==1 replace BSBP32B=6 if BSBP32B==2 replace BSBP32B=2 if BSBP32B==3 replace BSBP32B=1 if BSBP32B==4 replace BSBP32B=4 if BSBP32B==5 replace BSBP32B=3 if BSBP32B==6 label define BSBP32B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP32B replace BSBP32C=5 if BSBP32C==1 replace BSBP32C=6 if BSBP32C==2 replace BSBP32C=2 if BSBP32C==3 replace BSBP32C=1 if BSBP32C==4 replace BSBP32C=4 if BSBP32C==5 replace BSBP32C=3 if BSBP32C==6 label define BSBP32C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP32C replace BSBP32D=5 if BSBP32D==1 replace BSBP32D=6 if BSBP32D==2 replace BSBP32D=2 if BSBP32D==3 replace BSBP32D=1 if BSBP32D==4 182 replace BSBP32D=4 if BSBP32D==5 replace BSBP32D=3 if BSBP32D==6 label define BSBP32D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP32D replace BSBP32E=5 if BSBP32E==1 replace BSBP32E=6 if BSBP32E==2 replace BSBP32E=2 if BSBP32E==3 replace BSBP32E=1 if BSBP32E==4 replace BSBP32E=4 if BSBP32E==5 replace BSBP32E=3 if BSBP32E==6 label define 
BSBP32E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP32E replace BSBP32F=5 if BSBP32F==1 replace BSBP32F=6 if BSBP32F==2 replace BSBP32F=2 if BSBP32F==3 replace BSBP32F=1 if BSBP32F==4 replace BSBP32F=4 if BSBP32F==5 replace BSBP32F=3 if BSBP32F==6 label define BSBP32F 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP32F replace BSBP32G=5 if BSBP32G==1 replace BSBP32G=6 if BSBP32G==2 replace BSBP32G=2 if BSBP32G==3 replace BSBP32G=1 if BSBP32G==4 replace BSBP32G=4 if BSBP32G==5 replace BSBP32G=3 if BSBP32G==6 label define BSBP32G 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP32G 183 replace BSBP32H=5 if BSBP32H==1 replace BSBP32H=6 if BSBP32H==2 replace BSBP32H=2 if BSBP32H==3 replace BSBP32H=1 if BSBP32H==4 replace BSBP32H=4 if BSBP32H==5 replace BSBP32H=3 if BSBP32H==6 label define BSBP32H 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP32H replace BSBP32I=5 if BSBP32I==1 replace BSBP32I=6 if BSBP32I==2 replace BSBP32I=2 if BSBP32I==3 replace BSBP32I=1 if BSBP32I==4 replace BSBP32I=4 if BSBP32I==5 replace BSBP32I=3 if BSBP32I==6 label define BSBP32I 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP32I replace BSBP32J=5 if BSBP32J==1 replace BSBP32J=6 if BSBP32J==2 replace BSBP32J=2 if BSBP32J==3 replace BSBP32J=1 if BSBP32J==4 replace BSBP32J=4 if BSBP32J==5 replace BSBP32J=3 if BSBP32J==6 label define BSBP32J 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP32J replace BSBP32K=5 if BSBP32K==1 replace BSBP32K=6 if BSBP32K==2 replace BSBP32K=2 if BSBP32K==3 replace BSBP32K=1 if BSBP32K==4 184 replace BSBP32K=4 if BSBP32K==5 replace BSBP32K=3 if BSBP32K==6 label define BSBP32K 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP32K replace BSBP32L=5 if BSBP32L==1 replace BSBP32L=6 if BSBP32L==2 replace BSBP32L=2 if BSBP32L==3 replace BSBP32L=1 if BSBP32L==4 replace BSBP32L=4 if BSBP32L==5 replace BSBP32L=3 if BSBP32L==6 label define BSBP32L 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP32L replace BSBP32M=5 if BSBP32M==1 replace BSBP32M=6 if BSBP32M==2 replace BSBP32M=2 if BSBP32M==3 replace BSBP32M=1 if BSBP32M==4 replace BSBP32M=4 if BSBP32M==5 replace BSBP32M=3 if BSBP32M==6 label define BSBP32M 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP32M replace BSBP32N=5 if BSBP32N==1 replace BSBP32N=6 if BSBP32N==2 replace BSBP32N=2 if BSBP32N==3 replace BSBP32N=1 if BSBP32N==4 replace BSBP32N=4 if BSBP32N==5 replace BSBP32N=3 if BSBP32N==6 label define BSBP32N 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BSBP32N 185 *Question 33AA-AE recode for ascending order* *AA* replace BSBM33AA=6 if BSBM33AA==1 replace BSBM33AA=7 if BSBM33AA==2 replace BSBM33AA=8 if BSBM33AA==4 replace BSBM33AA=10 if BSBM33AA==5 replace BSBM33AA=5 if BSBM33AA==6 replace BSBM33AA=4 if BSBM33AA==7 replace BSBM33AA=2 if BSBM33AA==8 replace BSBM33AA=1 if BSBM33AA==10 label define BSBM33AA 1 "Never" 2 "Less than once a week" 3 "1 or 2 times a week" 4 "3 or 4 times a week" 5 "Every day", replace tab BSBM33AA *AB* replace BSBB33AB=6 if BSBB33AB==1 replace BSBB33AB=7 if BSBB33AB==2 replace BSBB33AB=8 if BSBB33AB==4 replace BSBB33AB=10 if BSBB33AB==5 replace BSBB33AB=5 if BSBB33AB==6 
replace BSBB33AB=4 if BSBB33AB==7 replace BSBB33AB=2 if BSBB33AB==8 replace BSBB33AB=1 if BSBB33AB==10 label define BSBB33AB 1 "Never" 2 "Less than once a week" 3 "1 or 2 times a week" 4 "3 or 4 times a week" 5 "Every day", replace tab BSBB33AB *AC* replace BSBE33AC=6 if BSBE33AC==1 replace BSBE33AC=7 if BSBE33AC==2 186 replace BSBE33AC=8 if BSBE33AC==4 replace BSBE33AC=10 if BSBE33AC==5 replace BSBE33AC=5 if BSBE33AC==6 replace BSBE33AC=4 if BSBE33AC==7 replace BSBE33AC=2 if BSBE33AC==8 replace BSBE33AC=1 if BSBE33AC==10 label define BSBE33AC 1 "Never" 2 "Less than once a week" 3 "1 or 2 times a week" 4 "3 or 4 times a week" 5 "Every day", replace tab BSBE33AC *AD* replace BSBC33AD=6 if BSBC33AD==1 replace BSBC33AD=7 if BSBC33AD==2 replace BSBC33AD=8 if BSBC33AD==4 replace BSBC33AD=10 if BSBC33AD==5 replace BSBC33AD=5 if BSBC33AD==6 replace BSBC33AD=4 if BSBC33AD==7 replace BSBC33AD=2 if BSBC33AD==8 replace BSBC33AD=1 if BSBC33AD==10 label define BSBC33AD 1 "Never" 2 "Less than once a week" 3 "1 or 2 times a week" 4 "3 or 4 times a week" 5 "Every day", replace tab BSBC33AD *AE* replace BSBP33AE=6 if BSBP33AE==1 replace BSBP33AE=7 if BSBP33AE==2 replace BSBP33AE=8 if BSBP33AE==4 replace BSBP33AE=10 if BSBP33AE==5 replace BSBP33AE=5 if BSBP33AE==6 replace BSBP33AE=4 if BSBP33AE==7 187 replace BSBP33AE=2 if BSBP33AE==8 replace BSBP33AE=1 if BSBP33AE==10 label define BSBP33AE 1 "Never" 2 "Less than once a week" 3 "1 or 2 times a week" 4 "3 or 4 times a week" 5 "Every day", replace tab BSBP33AE ***Recode the catagorical index variables so they ascend logically*** replace bsdgher=4 if bsdgher==1 replace bsdgher=1 if bsdgher==3 replace bsdgher=3 if bsdgher==4 label define BSDGHER 1 "Few Resources" 2 "Some Resources" 3 "Many Resources", replace tab bsdgher replace bsdgslm=4 if bsdgslm==1 replace bsdgslm=1 if bsdgslm==3 replace bsdgslm=3 if bsdgslm==4 label define BSDGSLM 1 "Do not like learning mathematics" 2 "Somewhat like learning mathematics" 3 "Like learning mathematics", replace tab bsdgslm replace bsdgsls=4 if bsdgsls==1 replace bsdgsls=1 if bsdgsls==3 replace bsdgsls=3 if bsdgsls==4 label define BSDGSLS 1 "Do not like learning science" 2 "Somewhat like learning science" 3 "Like learning science", replace tab bsdgsls replace bsdgslb=4 if bsdgslb==1 replace bsdgslb=1 if bsdgslb==3 188 replace bsdgslb=3 if bsdgslb==4 label define BSDGSLB 1 "Do not like learning biology" 2 "Somewhat like learning biology" 3 "Like learning biology", replace tab bsdgslb replace bsdgslc=4 if bsdgslc==1 replace bsdgslc=1 if bsdgslc==3 replace bsdgslc=3 if bsdgslc==4 label define BSDGSLC 1 "Do not like learning chemistry" 2 "Somewhat like learning chemistry" 3 "Like learning chemistry", replace tab bsdgslc replace bsdgslp=4 if bsdgslp==1 replace bsdgslp=1 if bsdgslp==3 replace bsdgslp=3 if bsdgslp==4 label define BSDGSLP 1 "Do not like learning physics" 2 "Somewhat like learning physics" 3 "Like learning physics", replace tab bsdgslp replace bsdgsle=4 if bsdgsle==1 replace bsdgsle=1 if bsdgsle==3 replace bsdgsle=3 if bsdgsle==4 label define BSDGSLE 1 "Do not like learning earth science" 2 "Somewhat like learning earth science" 3 "Like learning earth science", replace tab bsdgsle replace bsdgsvm=4 if bsdgsvm==1 replace bsdgsvm=1 if bsdgsvm==3 189 replace bsdgsvm=3 if bsdgsvm==4 label define BSDGSVM 1 "Do not value learning mathematics" 2 "Somewhat value learning mathematics" 3 "Value learning mathematics", replace tab bsdgsvm replace bsdgsvs=4 if bsdgsvs==1 replace bsdgsvs=1 if bsdgsvs==3 replace bsdgsvs=3 if 
bsdgsvs==4 label define BSDGSVS 1 "Do not value learning science" 2 "Somewhat value learning science" 3 "value learning science", replace tab bsdgsvs replace bsdgsvb=4 if bsdgsvb==1 replace bsdgsvb=1 if bsdgsvb==3 replace bsdgsvb=3 if bsdgsvb==4 label define BSDGSVB 1 "Do not value learning biology" 2 "Somewhat value learning biology" 3 "value learning biology", replace tab bsdgsvb replace bsdgsvc=4 if bsdgsvc==1 replace bsdgsvc=1 if bsdgsvc==3 replace bsdgsvc=3 if bsdgsvc==4 label define BSDGSVC 1 "Do not value learning chemistry" 2 "Somewhat value learning chemistry" 3 "value learning chemistry", replace tab bsdgsvc replace bsdgsvp=4 if bsdgsvp==1 replace bsdgsvp=1 if bsdgsvp==3 190 replace bsdgsvp=3 if bsdgsvp==4 label define BSDGSVP 1 "Do not value learning physics" 2 "Somewhat value learning physics" 3 "value learning physics", replace tab bsdgsvp replace bsdgsve=4 if bsdgsve==1 replace bsdgsve=1 if bsdgsve==3 replace bsdgsve=3 if bsdgsve==4 label define BSDGSVE 1 "Do not value learning earth science" 2 "Somewhat value learning earth science" 3 "value learning earth science", replace tab bsdgsve replace bsdgscm=4 if bsdgscm==1 replace bsdgscm=1 if bsdgscm==3 replace bsdgscm=3 if bsdgscm==4 label define BSDGSCM 1 "Not confident with mathematics" 2 "Somewhat confident with mathematics" 3 "Confident with mathematics", replace tab bsdgscm replace bsdgscs=4 if bsdgscs==1 replace bsdgscs=1 if bsdgscs==3 replace bsdgscs=3 if bsdgscs==4 label define BSDGSCS 1 "Not confident with science" 2 "Somewhat confident with science" 3 "Confident with science", replace tab bsdgscs replace bsdgscb=4 if bsdgscb==1 replace bsdgscb=1 if bsdgscb==3 191 replace bsdgscb=3 if bsdgscb==4 label define BSDGSCB 1 "Not confident with biology" 2 "Somewhat confident with biology" 3 "Confident with biology", replace tab bsdgscb replace bsdgscc=4 if bsdgscc==1 replace bsdgscc=1 if bsdgscc==3 replace bsdgscc=3 if bsdgscc==4 label define BSDGSCC 1 "Not confident with chemistry" 2 "Somewhat confident with chemistry" 3 "Confident with chemistry", replace tab bsdgscc replace bsdgscp=4 if bsdgscp==1 replace bsdgscp=1 if bsdgscp==3 replace bsdgscp=3 if bsdgscp==4 label define BSDGSCP 1 "Not confident with physics" 2 "Somewhat confident with physics" 3 "Confident with physics", replace tab bsdgscp replace bsdgsce=4 if bsdgsce==1 replace bsdgsce=1 if bsdgsce==3 replace bsdgsce=3 if bsdgsce==4 label define BSDGSCE 1 "Not confident with earth science" 2 "Somewhat confident with earth science" 3 "Confident with earth science", replace tab bsdgsce replace bsdgeml=4 if bsdgeml==1 replace bsdgeml=1 if bsdgeml==3 192 replace bsdgeml=3 if bsdgeml==4 label define BSDGEML 1 "Not engaged in mathematics lessons" 2 "Somewhat engaged in mathematics lessons" 3 "Engaged in mathematics lessons", replace tab bsdgeml replace bsdgesl=4 if bsdgesl==1 replace bsdgesl=1 if bsdgesl==3 replace bsdgesl=3 if bsdgesl==4 label define BSDGESL 1 "Not engaged in science lessons" 2 "Somewhat engaged in science lessons" 3 "Engaged in science lessons", replace tab bsdgesl replace bsdgebl=4 if bsdgebl==1 replace bsdgebl=1 if bsdgebl==3 replace bsdgebl=3 if bsdgebl==4 label define BSDGEBL 1 "Not engaged in biology lessons" 2 "Somewhat engaged in biology lessons" 3 "Engaged in biology lessons", replace tab bsdgebl replace bsdgecl=4 if bsdgecl==1 replace bsdgecl=1 if bsdgecl==3 replace bsdgecl=3 if bsdgecl==4 label define BSDGECL 1 "Not engaged in chemistry lessons" 2 "Somewhat engaged in chemistry lessons" 3 "Engaged in chemistry lessons", replace tab bsdgecl 
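*Aside (illustrative only, not run): the three-step swap pattern used throughout this
*section relies on a temporary out-of-range value to reverse a reversed scale. Stata's
*-recode- command can express the same mapping in one line without a temporary value.
*A minimal sketch for one of the 3-category indices and one of the 4-point agreement
*items appearing in this do-file would be:
*recode bsdgecl (1=3) (3=1)
*recode BSBP30G (1=4) (2=3) (3=2) (4=1)
*The label define and label values steps would still follow as above.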
replace bsdgepl=4 if bsdgepl==1
replace bsdgepl=1 if bsdgepl==3
replace bsdgepl=3 if bsdgepl==4
label define BSDGEPL 1 "Not engaged in physics lessons" 2 "Somewhat engaged in physics lessons" 3 "Engaged in physics lessons", replace
tab bsdgepl
replace bsdgeel=4 if bsdgeel==1
replace bsdgeel=1 if bsdgeel==3
replace bsdgeel=3 if bsdgeel==4
label define BSDGEEL 1 "Not engaged in earth science lessons" 2 "Somewhat engaged in earth science lessons" 3 "Engaged in earth science lessons", replace
tab bsdgeel
replace bsdgedup=9 if bsdgedup==1
replace bsdgedup=7 if bsdgedup==2
replace bsdgedup=8 if bsdgedup==4
replace bsdgedup=10 if bsdgedup==5
replace bsdgedup=5 if bsdgedup==9
replace bsdgedup=4 if bsdgedup==7
replace bsdgedup=2 if bsdgedup==8
replace bsdgedup=1 if bsdgedup==10
label define BSDGEDUP 1 "Some Primary, Lower Secondary, or No School" 2 "Lower Secondary" 3 "Upper Secondary" 4 "Post-Secondary but not University" 5 "University or Higher", replace
tab bsdgedup
replace bsdmwkhw=4 if bsdmwkhw==1
replace bsdmwkhw=1 if bsdmwkhw==3
replace bsdmwkhw=3 if bsdmwkhw==4
label define BSDMWKHW 1 "45 Minutes or less" 2 "Between 45 minutes and 3 hours" 3 "3 hours or more", replace
tab bsdmwkhw
replace bsdswkhw=4 if bsdswkhw==1
replace bsdswkhw=1 if bsdswkhw==3
replace bsdswkhw=3 if bsdswkhw==4
label define BSDSWKHW 1 "45 Minutes or less" 2 "Between 45 minutes and 3 hours" 3 "3 hours or more", replace
tab bsdswkhw
replace bsdbwkhw=4 if bsdbwkhw==1
replace bsdbwkhw=1 if bsdbwkhw==3
replace bsdbwkhw=3 if bsdbwkhw==4
label define BSDBWKHW 1 "45 Minutes or less" 2 "Between 45 minutes and 3 hours" 3 "3 hours or more", replace
tab bsdbwkhw
replace bsdcwkhw=4 if bsdcwkhw==1
replace bsdcwkhw=1 if bsdcwkhw==3
replace bsdcwkhw=3 if bsdcwkhw==4
label define BSDCWKHW 1 "45 Minutes or less" 2 "Between 45 minutes and 3 hours" 3 "3 hours or more", replace
tab bsdcwkhw
replace bsdpwkhw=4 if bsdpwkhw==1
replace bsdpwkhw=1 if bsdpwkhw==3
replace bsdpwkhw=3 if bsdpwkhw==4
label define BSDPWKHW 1 "45 Minutes or less" 2 "Between 45 minutes and 3 hours" 3 "3 hours or more", replace
tab bsdpwkhw
replace bsdewkhw=4 if bsdewkhw==1
replace bsdewkhw=1 if bsdewkhw==3
replace bsdewkhw=3 if bsdewkhw==4
label define BSDEWKHW 1 "45 Minutes or less" 2 "Between 45 minutes and 3 hours" 3 "3 hours or more", replace
tab bsdewkhw
*Create composite variables*
*Students*
*Create SES variable - Books in home, parent ed, home educational resources
*Need to standardize because the scales are different
*Also do factor analysis to weight correctly
bysort idcntry: egen mBSBG04=mean(BSBG04)
bysort idcntry: egen sdBSBG04=sd(BSBG04)
gen zBSBG04=(BSBG04-mBSBG04)/sdBSBG04
bysort idcntry: egen mBSBG06A=mean(BSBG06A)
bysort idcntry: egen sdBSBG06A=sd(BSBG06A)
gen zBSBG06A=(BSBG06A-mBSBG06A)/sdBSBG06A
bysort idcntry: egen mBSBG06B=mean(BSBG06B)
bysort idcntry: egen sdBSBG06B=sd(BSBG06B)
gen zBSBG06B=(BSBG06B-mBSBG06B)/sdBSBG06B
bysort idcntry: egen mbsbgher=mean(bsbgher)
bysort idcntry: egen sdbsbgher=sd(bsbgher)
gen zbsbgher=(bsbgher-mbsbgher)/sdbsbgher
gen SES=zBSBG04+zBSBG06A+zBSBG06B+zbsbgher
replace SES=. if zBSBG04==.|zBSBG06A==.|zBSBG06B==.|zbsbgher==.
drop mBS* mbs* sdBS* sdbs* zBS* zbs*
*Create parent involvement variable*
*Mean variable, same scale
*gen a11flag=1 if BSBG11A!=.
*gen b11flag=1 if BSBG11B!=.
*gen c11flag=1 if BSBG11C!=.
*gen d11flag=1 if BSBG11D!=.
*egen parinvden=rowtotal(a11flag - d11flag) *egen parinvnum=rowtotal(BSBG11A - BSBG11D) *gen parinv= parinvnum/parinvden *drop a11flag - d11flag parinvnum parinvden *standardized (not used right now) *bysort idcntry: egen mBSBG11A=mean(BSBG11A) *bysort idcntry: egen sdBSBG11A=sd(BSBG11A) *gen zBSBG11A=(BSBG11A-mBSBG11A)/sdBSBG11A *bysort idcntry: egen mBSBG11B=mean(BSBG11B) *bysort idcntry: egen sdBSBG11B=sd(BSBG11B) *gen zBSBG11B=(BSBG11B-mBSBG11B)/sdBSBG11B *bysort idcntry: egen mBSBG11C=mean(BSBG11C) *bysort idcntry: egen sdBSBG11C=sd(BSBG11C) *gen zBSBG11C=(BSBG11C-mBSBG11A)/sdBSBG11C *bysort idcntry: egen mBSBG11D=mean(BSBG11D) *bysort idcntry: egen sdBSBG11D=sd(BSBG11D) *gen zBSBG11D=(BSBG11D-mBSBG11D)/sdBSBG11D *gen BSBG11all=zBSBG11A+zBSBG11B+zBSBG11C+zBSBG11D *drop mBSB* sdBSB* zBSB* *Create attitude variable* *Need to combine Finland into a single science variable* 197 *gen finlike=(bsdgslb+bsdgslc+bsdgslp+bsdgsle)/4 if idcntry==246 *replace finlike=. if (bsdgslb==.)|(bsdgslc==.)|(bsdgslp==.)|(bsdgsle==.) *replace finlike=1 if finlike==1.25|finlike==1.5 *replace finlike=2 if finlike==1.75|finlike==2.25 *replace finlike=3 if finlike==2.5|finlike==2.75 *replace bsdgsls=finlike if idcntry==246 *gen finval=(bsdgsvb+bsdgsvc+bsdgsvp+bsdgsve)/4 if idcntry==246 *replace finval=. if (bsdgsvb==.)|(bsdgsvc==.)|(bsdgsvp==.)|(bsdgsve==.) *replace finval=1 if finval==1.25|finval==1.5 *replace finval=2 if finval==1.75|finval==2.25 *replace finval=3 if finval==2.5|finval==2.75 *replace bsdgsvs=finval if idcntry==246 *gen fincon=(bsdgscb+bsdgscc+bsdgscp+bsdgsce)/4 if idcntry==246 *replace fincon=. if (bsdgscb==.)|(bsdgscc==.)|(bsdgscp==.)|(bsdgsce==.) *replace fincon=1 if fincon==1.25|fincon==1.5 *replace fincon=2 if fincon==1.75|fincon==2.25 *replace fincon=3 if fincon==2.5|fincon==2.75 *replace bsdgscs=fincon if idcntry==246 *gen fineng=(bsdgebl+bsdgecl+bsdgepl+bsdgeel)/4 if idcntry==246 *replace fineng=. if (bsdgebl==.)|(bsdgecl==.)|(bsdgepl==.)|(bsdgeel==.) *replace fineng=1 if fineng==1.25|fineng==1.5 *replace fineng=2 if fineng==1.75|fineng==2.25 *replace fineng=3 if fineng==2.5|fineng==2.75 *replace bsdgesl=fineng if idcntry==246 *Now create the variable* *Mean variable, same scale* *gen slsflag=1 if bsdgsls!=. 198 *gen svsflag=1 if bsdgsvs!=. *gen scsflag=1 if bsdgscs!=. *gen eslflag=1 if bsdgesl!=. 
*egen attaffden=rowtotal(slsflag - eslflag) *egen attaffnum=rowtotal(bsdgsls - bsdgesl) *gen attaff= attaffnum/attaffden *drop slsflag - eslflag attaffden attaffnum *Standardized, not used right now* *bysort idcntry: egen mbsdgsls=mean(bsdgsls) *bysort idcntry: egen sdbsdgsls=sd(bsdgsls) *gen zbsdgsls=(bsdgsls-mbsdgsls)/sdbsdgsls *bysort idcntry: egen mbsdgsvs=mean(bsdgsvs) *bysort idcntry: egen sdbsdgsvs=sd(bsdgsvs) *gen zbsdgsvs=(bsdgsvs-mbsdgsvs)/sdbsdgsvs *bysort idcntry: egen mbsdgscs=mean(bsdgscs) *bysort idcntry: egen sdbsdgscs=sd(bsdgscs) *gen zbsdgscs=(bsdgscs-mbsdgscs)/sdbsdgscs *bysort idcntry: egen mbsdgesl=mean(bsdgesl) *bysort idcntry: egen sdbsdgesl=sd(bsdgesl) *gen zbsdgesl=(bsdgesl-mbsdgesl)/sdbsdgesl *gen attaff=zbsdgsls+zbsdgsvs+zbsdgscs+zbsdgscs ***Combine Separate Science questions from Finland into a single question egen fin17a=rowmean(BSBB18A BSBE22A BSBC26A BSBP30A) egen fin17b=rowmean(BSBB18B BSBE22B BSBC26B BSBP30B) egen fin17c=rowmean(BSBB18C BSBE22C BSBC26C BSBP30C) egen fin17d=rowmean(BSBB18D BSBE22D BSBC26D BSBP30D) 199 egen fin17e=rowmean(BSBB18E BSBE22E BSBC26E BSBP30E) egen fin17f=rowmean(BSBB18F BSBE22F BSBC26F BSBP30F) egen fin17g=rowmean(BSBB18G BSBE22G BSBC26G BSBP30G) egen fin18a=rowmean(BSBB19A BSBE23A BSBC27A BSBP31A) egen fin18b=rowmean(BSBB19B BSBE23B BSBC27B BSBP31B) egen fin18c=rowmean(BSBB19C BSBE23C BSBC27C BSBP31C) egen fin18d=rowmean(BSBB19D BSBE23D BSBC27D BSBP31D) egen fin18e=rowmean(BSBB19E BSBE23E BSBC27E BSBP31E) egen fin19a=rowmean(BSBB20A BSBE24A BSBC28A BSBP32A) egen fin19b=rowmean(BSBB20B BSBE24B BSBC28B BSBP32B) egen fin19c=rowmean(BSBB20C BSBE24C BSBC28C BSBP32C) egen fin19d=rowmean(BSBB20D BSBE24D BSBC28D BSBP32D) egen fin19e=rowmean(BSBB20E BSBE24E BSBC28E BSBP32E) egen fin19f=rowmean(BSBB20F BSBE24F BSBC28F BSBP32F) egen fin19g=rowmean(BSBB20G BSBE24G BSBC28G BSBP32G) egen fin19h=rowmean(BSBB20H BSBE24H BSBC28H BSBP32H) egen fin19i=rowmean(BSBB20I BSBE24I BSBC28I BSBP32I) egen fin19j=rowmean(BSBB20J BSBE24J BSBC28J BSBP32J) egen fin19k=rowmean(BSBB20K BSBE24K BSBC28K BSBP32K) egen fin19l=rowmean(BSBB20L BSBE24L BSBC28L BSBP32L) egen fin19m=rowmean(BSBB20M BSBE24M BSBC28M BSBP32M) egen fin19n=rowmean(BSBB20N BSBE24N BSBC28N BSBP32N) replace BSBS17A=fin17a if idcntry==246 replace BSBS17B=fin17b if idcntry==246 replace BSBS17C=fin17c if idcntry==246 200 replace BSBS17D=fin17d if idcntry==246 replace BSBS17E=fin17e if idcntry==246 replace BSBS17F=fin17f if idcntry==246 replace BSBS17G=fin17g if idcntry==246 replace BSBS18A=fin18a if idcntry==246 replace BSBS18B=fin18b if idcntry==246 replace BSBS18C=fin18c if idcntry==246 replace BSBS18D=fin18d if idcntry==246 replace BSBS18E=fin18e if idcntry==246 replace BSBS19A=fin19a if idcntry==246 replace BSBS19B=fin19b if idcntry==246 replace BSBS19C=fin19c if idcntry==246 replace BSBS19D=fin19d if idcntry==246 replace BSBS19E=fin19e if idcntry==246 replace BSBS19F=fin19f if idcntry==246 replace BSBS19G=fin19g if idcntry==246 replace BSBS19H=fin19h if idcntry==246 replace BSBS19I=fin19i if idcntry==246 replace BSBS19J=fin19j if idcntry==246 replace BSBS19K=fin19k if idcntry==246 replace BSBS19L=fin19l if idcntry==246 replace BSBS19M=fin19m if idcntry==246 replace BSBS19N=fin19n if idcntry==246 drop fin* ***Create percentles for SES and achievement** 201 *SES Whole Sample* xtile SESperall=SES, n(3) label define SESperall 1 "Low SES" 2 "Medium SES" 3 "High SES", replace label values SESperall SESperall tab SESperall *SES within country* ***Need to create separate datasets for each country to 
create within country *percentiles, see separate do file for this creation merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Chile.dta", nogenerate update replace merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Finland.dta", nogenerate update replace merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Ghana.dta", nogenerate update replace merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Korea.dta", nogenerate update replace merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Singapore.dta", nogenerate update replace merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\USA.dta", nogenerate update replace label define SESper 1 "Low SES" 2 "Medium SES" 3 "High SES", replace label values SESper SESper label define achper 1 "Low Achievement" 2 "Medium Achievement" 3 "High Achievement", replace label values achper achper ********************************************************************************** 202 ***Put dataset in Survey Mode*** svyset jkrep [pweight=totwgt], strata(jkzone) vce(jackknife) mse singleunit(missing) ***svy analysis*** *Compare student level achievement differences within all low SES groups *svy jackknife, subpop(if SESper==1) : mlogit achper BSBG01 attaff parinv, baseoutcome(2) rrr *Compare student level SES differences across all students in sample *svy jackknife : mlogit SESper BSBG01 attaff parinv BSSSCI01, baseoutcome(2) rrr ***PV commands*** *pv BSBG01 if SESper==1 [aw=totwgt], pv(BSSSCI*) jkzone(jkzone) jkrep(jkrep) jrr timss 203 Stata code for teacher variable cleaning ***Dataset generated using IEA IDB analyzer* *TIMSS 2011 Science Teacher data cleaning* *For country filtered dataset clear set more off use "C:\Users\educ.brunerju\Google Drive\Dissertation\TIMSS\Datasets\2011\TIMSS2011SciTeachCountryFilter.dta" *reorder dataset* sort idcntry order id* BT* bt* BS* *Recode gender so male =0 replace BTBG02=0 if BTBG02==2 label define BTBG02 0 "male" 1 "female", replace tab BTBG02 *Relabel Q4 to include education with level* label define BTBG04 1 "Did not complete Upper Secondary lv3" 2 "Upper Secondary lv3" 3 "Post-Sec NonTertiary lv4" /// 4 "Short Tertiary lv5b" 5 "Bachelor's lv5a" 6 "Second Bachelor's or higher lv5a/6/7", replace tab BTBG04 *Recode question 5 so no=0* replace BTBG05A=0 if BTBG05A==2 label define BTBG05A 0 "no" 1 "yes", replace tab BTBG05A replace BTBG05B=0 if BTBG05B==2 label define BTBG05B 0 "no" 1 "yes", replace tab BTBG05B 204 replace BTBG05C=0 if BTBG05C==2 label define BTBG05C 0 "no" 1 "yes", replace tab BTBG05C replace BTBG05D=0 if BTBG05D==2 label define BTBG05D 0 "no" 1 "yes", replace tab BTBG05D replace BTBG05E=0 if BTBG05E==2 label define BTBG05E 0 "no" 1 "yes", replace tab BTBG05E replace BTBG05F=0 if BTBG05F==2 label define BTBG05F 0 "no" 1 "yes", replace tab BTBG05F replace BTBG05G=0 if BTBG05G==2 label define BTBG05G 0 "no" 1 "yes", replace tab BTBG05G replace BTBG05H=0 if BTBG05H==2 label define BTBG05H 0 "no" 1 "yes", replace tab BTBG05H replace BTBG05I=0 if BTBG05I==2 label define BTBG05I 0 "no" 1 "yes", replace tab BTBG05I 205 *Rescale question 6 to move in ascending order* replace BTBG06A=6 if BTBG06A==1 replace BTBG06A=7 if BTBG06A==2 replace BTBG06A=8 if BTBG06A==4 replace BTBG06A=10 if BTBG06A==5 replace BTBG06A=5 if 
BTBG06A==6 replace BTBG06A=4 if BTBG06A==7 replace BTBG06A=2 if BTBG06A==8 replace BTBG06A=1 if BTBG06A==10 label define BTBG06A 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BTBG06A replace BTBG06B=6 if BTBG06B==1 replace BTBG06B=7 if BTBG06B==2 replace BTBG06B=8 if BTBG06B==4 replace BTBG06B=10 if BTBG06B==5 replace BTBG06B=5 if BTBG06B==6 replace BTBG06B=4 if BTBG06B==7 replace BTBG06B=2 if BTBG06B==8 replace BTBG06B=1 if BTBG06B==10 label define BTBG06B 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BTBG06B replace BTBG06C=6 if BTBG06C==1 replace BTBG06C=7 if BTBG06C==2 replace BTBG06C=8 if BTBG06C==4 replace BTBG06C=10 if BTBG06C==5 206 replace BTBG06C=5 if BTBG06C==6 replace BTBG06C=4 if BTBG06C==7 replace BTBG06C=2 if BTBG06C==8 replace BTBG06C=1 if BTBG06C==10 label define BTBG06C 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BTBG06C replace BTBG06D=6 if BTBG06D==1 replace BTBG06D=7 if BTBG06D==2 replace BTBG06D=8 if BTBG06D==4 replace BTBG06D=10 if BTBG06D==5 replace BTBG06D=5 if BTBG06D==6 replace BTBG06D=4 if BTBG06D==7 replace BTBG06D=2 if BTBG06D==8 replace BTBG06D=1 if BTBG06D==10 label define BTBG06D 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BTBG06D replace BTBG06E=6 if BTBG06E==1 replace BTBG06E=7 if BTBG06E==2 replace BTBG06E=8 if BTBG06E==4 replace BTBG06E=10 if BTBG06E==5 replace BTBG06E=5 if BTBG06E==6 replace BTBG06E=4 if BTBG06E==7 replace BTBG06E=2 if BTBG06E==8 replace BTBG06E=1 if BTBG06E==10 label define BTBG06E 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BTBG06E 207 replace BTBG06F=6 if BTBG06F==1 replace BTBG06F=7 if BTBG06F==2 replace BTBG06F=8 if BTBG06F==4 replace BTBG06F=10 if BTBG06F==5 replace BTBG06F=5 if BTBG06F==6 replace BTBG06F=4 if BTBG06F==7 replace BTBG06F=2 if BTBG06F==8 replace BTBG06F=1 if BTBG06F==10 label define BTBG06F 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BTBG06F replace BTBG06G=6 if BTBG06G==1 replace BTBG06G=7 if BTBG06G==2 replace BTBG06G=8 if BTBG06G==4 replace BTBG06G=10 if BTBG06G==5 replace BTBG06G=5 if BTBG06G==6 replace BTBG06G=4 if BTBG06G==7 replace BTBG06G=2 if BTBG06G==8 replace BTBG06G=1 if BTBG06G==10 label define BTBG06G 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BTBG06G replace BTBG06H=6 if BTBG06H==1 replace BTBG06H=7 if BTBG06H==2 replace BTBG06H=8 if BTBG06H==4 replace BTBG06H=10 if BTBG06H==5 replace BTBG06H=5 if BTBG06H==6 208 replace BTBG06H=4 if BTBG06H==7 replace BTBG06H=2 if BTBG06H==8 replace BTBG06H=1 if BTBG06H==10 label define BTBG06H 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BTBG06H *Recode question 7 to move in ascending order* replace BTBG07A=5 if BTBG07A==1 replace BTBG07A=6 if BTBG07A==2 replace BTBG07A=2 if BTBG07A==3 replace BTBG07A=1 if BTBG07A==4 replace BTBG07A=4 if BTBG07A==5 replace BTBG07A=3 if BTBG07A==6 label define BTBG07A 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BTBG07A replace BTBG07B=5 if BTBG07B==1 replace BTBG07B=6 if BTBG07B==2 replace BTBG07B=2 if BTBG07B==3 replace BTBG07B=1 if BTBG07B==4 replace BTBG07B=4 if BTBG07B==5 replace BTBG07B=3 if BTBG07B==6 label define BTBG07B 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BTBG07B replace BTBG07C=5 if BTBG07C==1 replace BTBG07C=6 if BTBG07C==2 replace BTBG07C=2 if BTBG07C==3 209 replace BTBG07C=1 if BTBG07C==4 replace BTBG07C=4 if BTBG07C==5 replace BTBG07C=3 if 
BTBG07C==6 label define BTBG07C 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BTBG07C replace BTBG07D=5 if BTBG07D==1 replace BTBG07D=6 if BTBG07D==2 replace BTBG07D=2 if BTBG07D==3 replace BTBG07D=1 if BTBG07D==4 replace BTBG07D=4 if BTBG07D==5 replace BTBG07D=3 if BTBG07D==6 label define BTBG07D 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BTBG07D replace BTBG07E=5 if BTBG07E==1 replace BTBG07E=6 if BTBG07E==2 replace BTBG07E=2 if BTBG07E==3 replace BTBG07E=1 if BTBG07E==4 replace BTBG07E=4 if BTBG07E==5 replace BTBG07E=3 if BTBG07E==6 label define BTBG07E 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BTBG07E *Recode question 9A so no =0* replace BTBG09AA=0 if BTBG09AA==2 label define BTBG09AA 0 "no" 1 "yes", replace tab BTBG09AA 210 replace BTBG09AB=0 if BTBG09AB==2 label define BTBG09AB 0 "no" 1 "yes", replace tab BTBG09AB replace BTBG09AC=0 if BTBG09AC==2 label define BTBG09AC 0 "no" 1 "yes", replace tab BTBG09AC *recode question 9B so scale ascends* replace BTBG09BA=5 if BTBG09BA==1 replace BTBG09BA=6 if BTBG09BA==2 replace BTBG09BA=2 if BTBG09BA==3 replace BTBG09BA=1 if BTBG09BA==4 replace BTBG09BA=4 if BTBG09BA==5 replace BTBG09BA=3 if BTBG09BA==6 label define BTBG09BA 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BTBG09BA replace BTBG09BB=5 if BTBG09BB==1 replace BTBG09BB=6 if BTBG09BB==2 replace BTBG09BB=2 if BTBG09BB==3 replace BTBG09BB=1 if BTBG09BB==4 replace BTBG09BB=4 if BTBG09BB==5 replace BTBG09BB=3 if BTBG09BB==6 label define BTBG09BB 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BTBG09BB replace BTBG09BC=5 if BTBG09BC==1 211 replace BTBG09BC=6 if BTBG09BC==2 replace BTBG09BC=2 if BTBG09BC==3 replace BTBG09BC=1 if BTBG09BC==4 replace BTBG09BC=4 if BTBG09BC==5 replace BTBG09BC=3 if BTBG09BC==6 label define BTBG09BC 1 "Disagree a lot" 2 "Disagree a little" 3 "Agree a little" 4 "Agree a lot", replace tab BTBG09BC *Recode question 14 to be in ascending order* replace BTBG14A=5 if BTBG14A==1 replace BTBG14A=6 if BTBG14A==2 replace BTBG14A=2 if BTBG14A==3 replace BTBG14A=1 if BTBG14A==4 replace BTBG14A=4 if BTBG14A==5 replace BTBG14A=3 if BTBG14A==6 label define BTBG14A 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBG14A replace BTBG14B=5 if BTBG14B==1 replace BTBG14B=6 if BTBG14B==2 replace BTBG14B=2 if BTBG14B==3 replace BTBG14B=1 if BTBG14B==4 replace BTBG14B=4 if BTBG14B==5 replace BTBG14B=3 if BTBG14B==6 label define BTBG14B 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBG14B 212 replace BTBG14C=5 if BTBG14C==1 replace BTBG14C=6 if BTBG14C==2 replace BTBG14C=2 if BTBG14C==3 replace BTBG14C=1 if BTBG14C==4 replace BTBG14C=4 if BTBG14C==5 replace BTBG14C=3 if BTBG14C==6 label define BTBG14C 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBG14C replace BTBG14D=5 if BTBG14D==1 replace BTBG14D=6 if BTBG14D==2 replace BTBG14D=2 if BTBG14D==3 replace BTBG14D=1 if BTBG14D==4 replace BTBG14D=4 if BTBG14D==5 replace BTBG14D=3 if BTBG14D==6 label define BTBG14D 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBG14D replace BTBG14E=5 if BTBG14E==1 replace BTBG14E=6 if BTBG14E==2 replace BTBG14E=2 if BTBG14E==3 replace BTBG14E=1 if BTBG14E==4 replace 
BTBG14E=4 if BTBG14E==5 replace BTBG14E=3 if BTBG14E==6 213 label define BTBG14E 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBG14E replace BTBG14F=5 if BTBG14F==1 replace BTBG14F=6 if BTBG14F==2 replace BTBG14F=2 if BTBG14F==3 replace BTBG14F=1 if BTBG14F==4 replace BTBG14F=4 if BTBG14F==5 replace BTBG14F=3 if BTBG14F==6 label define BTBG14F 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBG14F *Recode question 16 to move in ascending order* replace BTBG16A=6 if BTBG16A==1 replace BTBG16A=7 if BTBG16A==2 replace BTBG16A=8 if BTBG16A==4 replace BTBG16A=10 if BTBG16A==5 replace BTBG16A=5 if BTBG16A==6 replace BTBG16A=4 if BTBG16A==7 replace BTBG16A=2 if BTBG16A==8 replace BTBG16A=1 if BTBG16A==10 label define BTBG16A 1 "Never" 2 "1-3 times per year" 3 "4-6 times per year" 4 "Once or twice a month" 5 "At least once a week", replace tab BTBG16A replace BTBG16B=6 if BTBG16B==1 214 replace BTBG16B=7 if BTBG16B==2 replace BTBG16B=8 if BTBG16B==4 replace BTBG16B=10 if BTBG16B==5 replace BTBG16B=5 if BTBG16B==6 replace BTBG16B=4 if BTBG16B==7 replace BTBG16B=2 if BTBG16B==8 replace BTBG16B=1 if BTBG16B==10 label define BTBG16B 1 "Never" 2 "1-3 times per year" 3 "4-6 times per year" 4 "Once or twice a month" 5 "At least once a week", replace tab BTBG16B *Recode question 18 move in ascending order* replace BTBS18A=4 if BTBS18A==1 replace BTBS18A=1 if BTBS18A==3 replace BTBS18A=3 if BTBS18A==4 label define BTBS18A 1 "Not confident" 2 "Somewhat confident" 3 "Very confident", replace tab BTBS18A replace BTBS18B=4 if BTBS18B==1 replace BTBS18B=1 if BTBS18B==3 replace BTBS18B=3 if BTBS18B==4 label define BTBS18B 1 "Not confident" 2 "Somewhat confident" 3 "Very confident", replace tab BTBS18B replace BTBS18C=4 if BTBS18C==1 replace BTBS18C=1 if BTBS18C==3 replace BTBS18C=3 if BTBS18C==4 label define BTBS18C 1 "Not confident" 2 "Somewhat confident" 3 "Very confident", replace 215 tab BTBS18C replace BTBS18D=4 if BTBS18D==1 replace BTBS18D=1 if BTBS18D==3 replace BTBS18D=3 if BTBS18D==4 label define BTBS18D 1 "Not confident" 2 "Somewhat confident" 3 "Very confident", replace tab BTBS18D replace BTBS18E=4 if BTBS18E==1 replace BTBS18E=1 if BTBS18E==3 replace BTBS18E=3 if BTBS18E==4 label define BTBS18E 1 "Not confident" 2 "Somewhat confident" 3 "Very confident", replace tab BTBS18E *Recode question 19 to move in ascending order* replace BTBS19A=5 if BTBS19A==1 replace BTBS19A=6 if BTBS19A==2 replace BTBS19A=2 if BTBS19A==3 replace BTBS19A=1 if BTBS19A==4 replace BTBS19A=4 if BTBS19A==5 replace BTBS19A=3 if BTBS19A==6 label define BTBS19A 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBS19A replace BTBS19B=5 if BTBS19B==1 replace BTBS19B=6 if BTBS19B==2 replace BTBS19B=2 if BTBS19B==3 216 replace BTBS19B=1 if BTBS19B==4 replace BTBS19B=4 if BTBS19B==5 replace BTBS19B=3 if BTBS19B==6 label define BTBS19B 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBS19B replace BTBS19C=5 if BTBS19C==1 replace BTBS19C=6 if BTBS19C==2 replace BTBS19C=2 if BTBS19C==3 replace BTBS19C=1 if BTBS19C==4 replace BTBS19C=4 if BTBS19C==5 replace BTBS19C=3 if BTBS19C==6 label define BTBS19C 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBS19C replace BTBS19D=5 if BTBS19D==1 replace BTBS19D=6 if BTBS19D==2 replace BTBS19D=2 if BTBS19D==3 replace BTBS19D=1 if 
BTBS19D==4 replace BTBS19D=4 if BTBS19D==5 replace BTBS19D=3 if BTBS19D==6 label define BTBS19D 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBS19D replace BTBS19E=5 if BTBS19E==1 217 replace BTBS19E=6 if BTBS19E==2 replace BTBS19E=2 if BTBS19E==3 replace BTBS19E=1 if BTBS19E==4 replace BTBS19E=4 if BTBS19E==5 replace BTBS19E=3 if BTBS19E==6 label define BTBS19E 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBS19E replace BTBS19F=5 if BTBS19F==1 replace BTBS19F=6 if BTBS19F==2 replace BTBS19F=2 if BTBS19F==3 replace BTBS19F=1 if BTBS19F==4 replace BTBS19F=4 if BTBS19F==5 replace BTBS19F=3 if BTBS19F==6 label define BTBS19F 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBS19F replace BTBS19G=5 if BTBS19G==1 replace BTBS19G=6 if BTBS19G==2 replace BTBS19G=2 if BTBS19G==3 replace BTBS19G=1 if BTBS19G==4 replace BTBS19G=4 if BTBS19G==5 replace BTBS19G=3 if BTBS19G==6 label define BTBS19G 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBS19G 218 replace BTBS19H=5 if BTBS19H==1 replace BTBS19H=6 if BTBS19H==2 replace BTBS19H=2 if BTBS19H==3 replace BTBS19H=1 if BTBS19H==4 replace BTBS19H=4 if BTBS19H==5 replace BTBS19H=3 if BTBS19H==6 label define BTBS19H 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBS19H replace BTBS19I=5 if BTBS19I==1 replace BTBS19I=6 if BTBS19I==2 replace BTBS19I=2 if BTBS19I==3 replace BTBS19I=1 if BTBS19I==4 replace BTBS19I=4 if BTBS19I==5 replace BTBS19I=3 if BTBS19I==6 label define BTBS19I 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBS19I replace BTBS19J=5 if BTBS19J==1 replace BTBS19J=6 if BTBS19J==2 replace BTBS19J=2 if BTBS19J==3 replace BTBS19J=1 if BTBS19J==4 replace BTBS19J=4 if BTBS19J==5 replace BTBS19J=3 if BTBS19J==6 219 label define BTBS19J 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBS19J replace BTBS19K=5 if BTBS19K==1 replace BTBS19K=6 if BTBS19K==2 replace BTBS19K=2 if BTBS19K==3 replace BTBS19K=1 if BTBS19K==4 replace BTBS19K=4 if BTBS19K==5 replace BTBS19K=3 if BTBS19K==6 label define BTBS19K 1 "Never" 2 "Some lessons" 3 "About half the lessons" 4 "Every or almost every lesson", replace tab BTBS19K *Recode question 20 to ascending order* replace BTBS20A=4 if BTBS20A==1 replace BTBS20A=1 if BTBS20A==3 replace BTBS20A=3 if BTBS20A==4 label define BTBS20A 1 "Not Used" 2 "Supplement" 3 "Basis for instruction", replace tab BTBS20A replace BTBS20B=4 if BTBS20B==1 replace BTBS20B=1 if BTBS20B==3 replace BTBS20B=3 if BTBS20B==4 label define BTBS20B 1 "Not Used" 2 "Supplement" 3 "Basis for instruction", replace tab BTBS20B replace BTBS20C=4 if BTBS20C==1 220 replace BTBS20C=1 if BTBS20C==3 replace BTBS20C=3 if BTBS20C==4 label define BTBS20C 1 "Not Used" 2 "Supplement" 3 "Basis for instruction", replace tab BTBS20C replace BTBS20D=4 if BTBS20D==1 replace BTBS20D=1 if BTBS20D==3 replace BTBS20D=3 if BTBS20D==4 label define BTBS20D 1 "Not Used" 2 "Supplement" 3 "Basis for instruction", replace tab BTBS20D replace BTBS20E=4 if BTBS20E==1 replace BTBS20E=1 if BTBS20E==3 replace BTBS20E=3 if BTBS20E==4 label define BTBS20E 1 "Not Used" 2 "Supplement" 3 "Basis for instruction", replace tab BTBS20E *Recode question 21a and 21b so no=0* replace BTBS21A=0 if BTBS21A==2 label define 
BTBS21A 0 "no" 1 "yes", replace tab BTBS21A replace BTBS21B=0 if BTBS21B==2 label define BTBS21B 0 "no" 1 "yes", replace tab BTBS21B *Recode so 21c moves in ascending order* replace BTBS21CA=5 if BTBS21CA==1 221 replace BTBS21CA=6 if BTBS21CA==2 replace BTBS21CA=2 if BTBS21CA==3 replace BTBS21CA=1 if BTBS21CA==4 replace BTBS21CA=4 if BTBS21CA==5 replace BTBS21CA=3 if BTBS21CA==6 label define BTBS21CA 1 "Never or almost never" 2 "Once or twice a month" 3 "Once or twice a week" 4 "Every or almost every day", replace tab BTBS21CA replace BTBS21CB=5 if BTBS21CB==1 replace BTBS21CB=6 if BTBS21CB==2 replace BTBS21CB=2 if BTBS21CB==3 replace BTBS21CB=1 if BTBS21CB==4 replace BTBS21CB=4 if BTBS21CB==5 replace BTBS21CB=3 if BTBS21CB==6 label define BTBS21CB 1 "Never or almost never" 2 "Once or twice a month" 3 "Once or twice a week" 4 "Every or almost every day", replace tab BTBS21CB replace BTBS21CC=5 if BTBS21CC==1 replace BTBS21CC=6 if BTBS21CC==2 replace BTBS21CC=2 if BTBS21CC==3 replace BTBS21CC=1 if BTBS21CC==4 replace BTBS21CC=4 if BTBS21CC==5 replace BTBS21CC=3 if BTBS21CC==6 label define BTBS21CC 1 "Never or almost never" 2 "Once or twice a month" 3 "Once or twice a week" 4 "Every or almost every day", replace tab BTBS21CC 222 replace BTBS21CD=5 if BTBS21CD==1 replace BTBS21CD=6 if BTBS21CD==2 replace BTBS21CD=2 if BTBS21CD==3 replace BTBS21CD=1 if BTBS21CD==4 replace BTBS21CD=4 if BTBS21CD==5 replace BTBS21CD=3 if BTBS21CD==6 label define BTBS21CD 1 "Never or almost never" 2 "Once or twice a month" 3 "Once or twice a week" 4 "Every or almost every day", replace tab BTBS21CD replace BTBS21CE=5 if BTBS21CE==1 replace BTBS21CE=6 if BTBS21CE==2 replace BTBS21CE=2 if BTBS21CE==3 replace BTBS21CE=1 if BTBS21CE==4 replace BTBS21CE=4 if BTBS21CE==5 replace BTBS21CE=3 if BTBS21CE==6 label define BTBS21CE 1 "Never or almost never" 2 "Once or twice a month" 3 "Once or twice a week" 4 "Every or almost every day", replace tab BTBS21CE *Recode 22 so it moves in ascending order* replace BTBS22AA=4 if BTBS22AA==1 replace BTBS22AA=1 if BTBS22AA==3 replace BTBS22AA=3 if BTBS22AA==4 label define BTBS22AA 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22AA 223 replace BTBS22AB=4 if BTBS22AB==1 replace BTBS22AB=1 if BTBS22AB==3 replace BTBS22AB=3 if BTBS22AB==4 label define BTBS22AB 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22AB replace BTBS22AC=4 if BTBS22AC==1 replace BTBS22AC=1 if BTBS22AC==3 replace BTBS22AC=3 if BTBS22AC==4 label define BTBS22AC 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22AC replace BTBS22AD=4 if BTBS22AD==1 replace BTBS22AD=1 if BTBS22AD==3 replace BTBS22AD=3 if BTBS22AD==4 label define BTBS22AD 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22AD replace BTBS22AE=4 if BTBS22AE==1 replace BTBS22AE=1 if BTBS22AE==3 replace BTBS22AE=3 if BTBS22AE==4 label define BTBS22AE 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22AE 224 replace BTBS22AF=4 if BTBS22AF==1 replace BTBS22AF=1 if BTBS22AF==3 replace BTBS22AF=3 if BTBS22AF==4 label define BTBS22AF 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22AF replace BTBS22AG=4 if BTBS22AG==1 replace BTBS22AG=1 if 
BTBS22AG==3 replace BTBS22AG=3 if BTBS22AG==4 label define BTBS22AG 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22AG replace BTBS22BA=4 if BTBS22BA==1 replace BTBS22BA=1 if BTBS22BA==3 replace BTBS22BA=3 if BTBS22BA==4 label define BTBS22BA 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22BA replace BTBS22BB=4 if BTBS22BB==1 replace BTBS22BB=1 if BTBS22BB==3 replace BTBS22BB=3 if BTBS22BB==4 label define BTBS22BB 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22BB 225 replace BTBS22BC=4 if BTBS22BC==1 replace BTBS22BC=1 if BTBS22BC==3 replace BTBS22BC=3 if BTBS22BC==4 label define BTBS22BC 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22BC replace BTBS22BD=4 if BTBS22BD==1 replace BTBS22BD=1 if BTBS22BD==3 replace BTBS22BD=3 if BTBS22BD==4 label define BTBS22BD 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22BD replace BTBS22CA=4 if BTBS22CA==1 replace BTBS22CA=1 if BTBS22CA==3 replace BTBS22CA=3 if BTBS22CA==4 label define BTBS22CA 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22CA replace BTBS22CB=4 if BTBS22CB==1 replace BTBS22CB=1 if BTBS22CB==3 replace BTBS22CB=3 if BTBS22CB==4 label define BTBS22CB 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22CB 226 replace BTBS22CC=4 if BTBS22CC==1 replace BTBS22CC=1 if BTBS22CC==3 replace BTBS22CC=3 if BTBS22CC==4 label define BTBS22CC 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22CC replace BTBS22CD=4 if BTBS22CD==1 replace BTBS22CD=1 if BTBS22CD==3 replace BTBS22CD=3 if BTBS22CD==4 label define BTBS22CD 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22CD replace BTBS22CE=4 if BTBS22CE==1 replace BTBS22CE=1 if BTBS22CE==3 replace BTBS22CE=3 if BTBS22CE==4 label define BTBS22CE 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22CE replace BTBS22DA=4 if BTBS22DA==1 replace BTBS22DA=1 if BTBS22DA==3 replace BTBS22DA=3 if BTBS22DA==4 label define BTBS22DA 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22DA 227 replace BTBS22DB=4 if BTBS22DB==1 replace BTBS22DB=1 if BTBS22DB==3 replace BTBS22DB=3 if BTBS22DB==4 label define BTBS22DB 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22DB replace BTBS22DC=4 if BTBS22DC==1 replace BTBS22DC=1 if BTBS22DC==3 replace BTBS22DC=3 if BTBS22DC==4 label define BTBS22DC 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22DC replace BTBS22DD=4 if BTBS22DD==1 replace BTBS22DD=1 if BTBS22DD==3 replace BTBS22DD=3 if BTBS22DD==4 label define BTBS22DD 1 "Not yet taught or just introduced" 2 "Mostly taught this year" 3 "Mostly taught before this year", replace tab BTBS22DD *Recode 24c so in ascending order* replace BTBS24CA=4 if BTBS24CA==1 replace BTBS24CA=1 if BTBS24CA==3 replace BTBS24CA=3 if BTBS24CA==4 
label define BTBS24CA 1 "Never or almost never" 2 "Sometimes" 3 "Always or almost always", replace tab BTBS24CA 228 replace BTBS24CB=4 if BTBS24CB==1 replace BTBS24CB=1 if BTBS24CB==3 replace BTBS24CB=3 if BTBS24CB==4 label define BTBS24CB 1 "Never or almost never" 2 "Sometimes" 3 "Always or almost always", replace tab BTBS24CB replace BTBS24CC=4 if BTBS24CC==1 replace BTBS24CC=1 if BTBS24CC==3 replace BTBS24CC=3 if BTBS24CC==4 label define BTBS24CC 1 "Never or almost never" 2 "Sometimes" 3 "Always or almost always", replace tab BTBS24CC replace BTBS24CD=4 if BTBS24CD==1 replace BTBS24CD=1 if BTBS24CD==3 replace BTBS24CD=3 if BTBS24CD==4 label define BTBS24CD 1 "Never or almost never" 2 "Sometimes" 3 "Always or almost always", replace tab BTBS24CD replace BTBS24CE=4 if BTBS24CE==1 replace BTBS24CE=1 if BTBS24CE==3 replace BTBS24CE=3 if BTBS24CE==4 label define BTBS24CE 1 "Never or almost never" 2 "Sometimes" 3 "Always or almost always", replace tab BTBS24CE *Recode question 25 for ascending order* replace BTBS25A=4 if BTBS25A==1 replace BTBS25A=1 if BTBS25A==3 229 replace BTBS25A=3 if BTBS25A==4 label define BTBS25A 1 "Little or no emphasis" 2 "Some emphasis" 3 "Major emphasis", replace tab BTBS25A replace BTBS25B=4 if BTBS25B==1 replace BTBS25B=1 if BTBS25B==3 replace BTBS25B=3 if BTBS25B==4 label define BTBS25B 1 "Little or no emphasis" 2 "Some emphasis" 3 "Major emphasis", replace tab BTBS25B replace BTBS25C=4 if BTBS25C==1 replace BTBS25C=1 if BTBS25C==3 replace BTBS25C=3 if BTBS25C==4 label define BTBS25C 1 "Little or no emphasis" 2 "Some emphasis" 3 "Major emphasis", replace tab BTBS25C *Recode question 26 for ascending order* replace BTBS26=6 if BTBS26==1 replace BTBS26=7 if BTBS26==2 replace BTBS26=8 if BTBS26==4 replace BTBS26=10 if BTBS26==5 replace BTBS26=5 if BTBS26==6 replace BTBS26=4 if BTBS26==7 replace BTBS26=2 if BTBS26==8 replace BTBS26=1 if BTBS26==10 label define BTBS26 1 "Never" 2 "A few times a year" 3 "About once a month" 4 "About every two weeks" 5 "About once a week", replace tab BTBS26 230 *Recode question 27 for ascending order* replace BTBS27A=4 if BTBS27A==1 replace BTBS27A=1 if BTBS27A==3 replace BTBS27A=3 if BTBS27A==4 label define BTBS27A 1 "Never or almost never" 2 "Sometimes" 3 "Always or almost always", replace tab BTBS27A replace BTBS27B=4 if BTBS27B==1 replace BTBS27B=1 if BTBS27B==3 replace BTBS27B=3 if BTBS27B==4 label define BTBS27B 1 "Never or almost never" 2 "Sometimes" 3 "Always or almost always", replace tab BTBS27B replace BTBS27C=4 if BTBS27C==1 replace BTBS27C=1 if BTBS27C==3 replace BTBS27C=3 if BTBS27C==4 label define BTBS27C 1 "Never or almost never" 2 "Sometimes" 3 "Always or almost always", replace tab BTBS27C replace BTBS27D=4 if BTBS27D==1 replace BTBS27D=1 if BTBS27D==3 replace BTBS27D=3 if BTBS27D==4 label define BTBS27D 1 "Never or almost never" 2 "Sometimes" 3 "Always or almost always", replace tab BTBS27D *Recode question 28 so no=0* replace BTBS28A=0 if BTBS28A==2 231 label define BTBS28A 0 "no" 1 "yes", replace tab BTBS28A replace BTBS28B=0 if BTBS28B==2 label define BTBS28B 0 "no" 1 "yes", replace tab BTBS28B replace BTBS28C=0 if BTBS28C==2 label define BTBS28C 0 "no" 1 "yes", replace tab BTBS28C replace BTBS28D=0 if BTBS28D==2 label define BTBS28D 0 "no" 1 "yes", replace tab BTBS28D replace BTBS28E=0 if BTBS28E==2 label define BTBS28E 0 "no" 1 "yes", replace tab BTBS28E replace BTBS28F=0 if BTBS28F==2 label define BTBS28F 0 "no" 1 "yes", replace tab BTBS28F replace BTBS28G=0 if BTBS28G==2 label define BTBS28G 0 "no" 1 "yes", 
replace tab BTBS28G *Recode question 29 so not applicable=99, remaining questions in ascending order* 232 replace BTBS29AA=99 if BTBS29AA==1 replace BTBS29AA=5 if BTBS29AA==2 replace BTBS29AA=2 if BTBS29AA==3 replace BTBS29AA=1 if BTBS29AA==4 replace BTBS29AA=3 if BTBS29AA==5 label define BTBS29AA 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29AA replace BTBS29AB=99 if BTBS29AB==1 replace BTBS29AB=5 if BTBS29AB==2 replace BTBS29AB=2 if BTBS29AB==3 replace BTBS29AB=1 if BTBS29AB==4 replace BTBS29AB=3 if BTBS29AB==5 label define BTBS29AB 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29AB replace BTBS29AC=99 if BTBS29AC==1 replace BTBS29AC=5 if BTBS29AC==2 replace BTBS29AC=2 if BTBS29AC==3 replace BTBS29AC=1 if BTBS29AC==4 replace BTBS29AC=3 if BTBS29AC==5 label define BTBS29AC 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29AC replace BTBS29AD=99 if BTBS29AD==1 233 replace BTBS29AD=5 if BTBS29AD==2 replace BTBS29AD=2 if BTBS29AD==3 replace BTBS29AD=1 if BTBS29AD==4 replace BTBS29AD=3 if BTBS29AD==5 label define BTBS29AD 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29AD replace BTBS29AE=99 if BTBS29AE==1 replace BTBS29AE=5 if BTBS29AE==2 replace BTBS29AE=2 if BTBS29AE==3 replace BTBS29AE=1 if BTBS29AE==4 replace BTBS29AE=3 if BTBS29AE==5 label define BTBS29AE 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29AE replace BTBS29AF=99 if BTBS29AF==1 replace BTBS29AF=5 if BTBS29AF==2 replace BTBS29AF=2 if BTBS29AF==3 replace BTBS29AF=1 if BTBS29AF==4 replace BTBS29AF=3 if BTBS29AF==5 label define BTBS29AF 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29AF replace BTBS29AG=99 if BTBS29AG==1 replace BTBS29AG=5 if BTBS29AG==2 234 replace BTBS29AG=2 if BTBS29AG==3 replace BTBS29AG=1 if BTBS29AG==4 replace BTBS29AG=3 if BTBS29AG==5 label define BTBS29AG 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29AG replace BTBS29BA=99 if BTBS29BA==1 replace BTBS29BA=5 if BTBS29BA==2 replace BTBS29BA=2 if BTBS29BA==3 replace BTBS29BA=1 if BTBS29BA==4 replace BTBS29BA=3 if BTBS29BA==5 label define BTBS29BA 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29BA replace BTBS29BB=99 if BTBS29BB==1 replace BTBS29BB=5 if BTBS29BB==2 replace BTBS29BB=2 if BTBS29BB==3 replace BTBS29BB=1 if BTBS29BB==4 replace BTBS29BB=3 if BTBS29BB==5 label define BTBS29BB 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29BB replace BTBS29BC=99 if BTBS29BC==1 replace BTBS29BC=5 if BTBS29BC==2 replace BTBS29BC=2 if BTBS29BC==3 235 replace BTBS29BC=1 if BTBS29BC==4 replace BTBS29BC=3 if BTBS29BC==5 label define BTBS29BC 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29BC replace BTBS29BD=99 if BTBS29BD==1 replace BTBS29BD=5 if BTBS29BD==2 replace BTBS29BD=2 if BTBS29BD==3 replace BTBS29BD=1 if BTBS29BD==4 replace BTBS29BD=3 if BTBS29BD==5 label define BTBS29BD 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29BD replace BTBS29CA=99 if BTBS29CA==1 replace BTBS29CA=5 if BTBS29CA==2 replace BTBS29CA=2 if BTBS29CA==3 replace 
BTBS29CA=1 if BTBS29CA==4 replace BTBS29CA=3 if BTBS29CA==5 label define BTBS29CA 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29CA replace BTBS29CB=99 if BTBS29CB==1 replace BTBS29CB=5 if BTBS29CB==2 replace BTBS29CB=2 if BTBS29CB==3 replace BTBS29CB=1 if BTBS29CB==4 236 replace BTBS29CB=3 if BTBS29CB==5 label define BTBS29CB 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29CB replace BTBS29CC=99 if BTBS29CC==1 replace BTBS29CC=5 if BTBS29CC==2 replace BTBS29CC=2 if BTBS29CC==3 replace BTBS29CC=1 if BTBS29CC==4 replace BTBS29CC=3 if BTBS29CC==5 label define BTBS29CC 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29CC replace BTBS29CD=99 if BTBS29CD==1 replace BTBS29CD=5 if BTBS29CD==2 replace BTBS29CD=2 if BTBS29CD==3 replace BTBS29CD=1 if BTBS29CD==4 replace BTBS29CD=3 if BTBS29CD==5 label define BTBS29CD 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29CD replace BTBS29CE=99 if BTBS29CE==1 replace BTBS29CE=5 if BTBS29CE==2 replace BTBS29CE=2 if BTBS29CE==3 replace BTBS29CE=1 if BTBS29CE==4 replace BTBS29CE=3 if BTBS29CE==5 237 label define BTBS29CE 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29CE replace BTBS29DA=99 if BTBS29DA==1 replace BTBS29DA=5 if BTBS29DA==2 replace BTBS29DA=2 if BTBS29DA==3 replace BTBS29DA=1 if BTBS29DA==4 replace BTBS29DA=3 if BTBS29DA==5 label define BTBS29DA 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29DA replace BTBS29DB=99 if BTBS29DB==1 replace BTBS29DB=5 if BTBS29DB==2 replace BTBS29DB=2 if BTBS29DB==3 replace BTBS29DB=1 if BTBS29DB==4 replace BTBS29DB=3 if BTBS29DB==5 label define BTBS29DB 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29DB replace BTBS29DC=99 if BTBS29DC==1 replace BTBS29DC=5 if BTBS29DC==2 replace BTBS29DC=2 if BTBS29DC==3 replace BTBS29DC=1 if BTBS29DC==4 replace BTBS29DC=3 if BTBS29DC==5 238 label define BTBS29DC 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29DC replace BTBS29DD=99 if BTBS29DD==1 replace BTBS29DD=5 if BTBS29DD==2 replace BTBS29DD=2 if BTBS29DD==3 replace BTBS29DD=1 if BTBS29DD==4 replace BTBS29DD=3 if BTBS29DD==5 label define BTBS29DD 1 "Not well prepared" 2 "Somewhat prepared" 3 "Very well prepared" 99 "Not applicable", replace tab BTBS29DD ***Recode composite variables*** replace BTDS05=6 if BTDS05==1 replace BTDS05=7 if BTDS05==2 replace BTDS05=8 if BTDS05==4 replace BTDS05=10 if BTDS05==5 replace BTDS05=5 if BTDS05==6 replace BTDS05=4 if BTDS05==7 replace BTDS05=2 if BTDS05==8 replace BTDS05=1 if BTDS05==10 label define BTDS05 1 "No post secondary education" 2 "All other majors" 3 "Major in Sci but not Ed" 4 "Major in Ed but not sci" 5 "Major in Ed and Sci", replace tab BTDS05 replace BTDG01=5 if BTDG01==1 replace BTDG01=6 if BTDG01==2 239 replace BTDG01=2 if BTDG01==3 replace BTDG01=1 if BTDG01==4 replace BTDG01=4 if BTDG01==5 replace BTDG01=3 if BTDG01==6 label define BTDG01 1 "Less than 5 years" 2 "At least 5 but less than 10 years" 3 "At least 10 but less than 20 years" 4 "20 years or more", replace tab BTDG01 replace btdgeas=4 if btdgeas==1 replace btdgeas=1 if btdgeas==3 replace btdgeas=3 if btdgeas==4 label define BTDGEAS 1 "Medium Emphasis" 2 
"High Emphasis" 3 "Very High Emphasis", replace tab btdgeas replace btdgsos=4 if btdgsos==1 replace btdgsos=1 if btdgsos==3 replace btdgsos=3 if btdgsos==4 label define BTDGSOS 1 "Not safe and orderly" 2 "Somewhat safe and orderly" 3 "Safe and orderly", replace tab btdgsos replace btdscts=0 if btdscts==2 label define BTDSCTS 0 "Somewhat confident" 1 "Confident", replace tab btdscts replace btdgtcs=4 if btdgtcs==1 replace btdgtcs=1 if btdgtcs==3 replace btdgtcs=3 if btdgtcs==4 label define BTDGTCS 1 "Less than satisfied" 2 "Somewhat satisfied" 3 "Satisfied", replace 240 tab btdgtcs replace btdgcit=4 if btdgcit==1 replace btdgcit=1 if btdgcit==3 replace btdgcit=3 if btdgcit==4 label define BTDGCIT 1 "Sometimes collaborative" 2 "Collaborative" 3 "Very Collaborative", replace tab btdgcit replace btdgies=4 if btdgies==1 replace btdgies=1 if btdgies==3 replace btdgies=3 if btdgies==4 label define BTDGIES 1 "Some lessons" 2 "About half the lessons" 3 "Most lessons", replace tab btdgies replace btdsesi=0 if btdsesi==2 label define BTDSESI 0 "Less than half the lessons" 1 "About half the lessons or more", replace tab btdsesi *Create a science major variable* gen scimaj=. replace scimaj=0 if BTDS05==1|BTDS05==2 replace scimaj=1 if BTDS05==3|BTDS05==4|BTDS05==5 label define scimaj 0 "Not a science major" 1 "Science major", replace label values scimaj scimaj tab scimaj ******************************************************************************** *Merge in student country level variables 241 merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Chile.dta", nogenerate update replace merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Finland.dta", nogenerate update replace merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Ghana.dta", nogenerate update replace merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Korea.dta", nogenerate update replace merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Singapore.dta", nogenerate update replace merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\USA.dta", nogenerate update replace label define achper 1 "Low Ach" 2 "Med Ach" 3 "High Ach", replace label values achper achper tab achper label define SESper 1 "Low SES" 2 "Med SES" 3 "High SES", replace label values SESper SESper tab SESper *Put dataset in survey mode for teacher descriptives* svyset jkrep [pweight=sciwgt], strata(jkzone) vce(jackknife) mse singleunit(missing) *Generate descriptives for teacher data* *svy jackknife: mean prepar instru prodev practi scimaj BTBG04 BTDG01, over(idcntry) ****Teacher analysis *Compare low SES student acheievement differences using teacher characteristics 242 *svy jackknife, subpop(if SESper==1) : mlogit achper BTBG04 BTDG01 prepar practi scimaj prodev, baseoutcome(2) rrr *Compare student SES differences by teacher characteristics *svy jackknife : mlogit SESper BSSSCI01 BTBG04 BTDG01 prepar practi scimaj prodev, baseoutcome(2) rrr 243 Stata code for school cleaning analysis ***TIMSS 2011 8th grade school dataset cleaning*** *Dataset generated using IEA IDB analyzer* clear set more off use "C:\Users\educ.brunerju\Google Drive\Dissertation\TIMSS\Datasets\2011\SchBackCountryFilter.dta" *reorder the dataset* sort idcntry order id* BC* bc* *Recode question 4 native language 
test for ascending order* replace BCBG04=6 if BCBG04==1 replace BCBG04=7 if BCBG04==2 replace BCBG04=8 if BCBG04==4 replace BCBG04=10 if BCBG04==5 replace BCBG04=5 if BCBG04==6 replace BCBG04=4 if BCBG04==7 replace BCBG04=2 if BCBG04==8 replace BCBG04=1 if BCBG04==10 label define BCBG04 1 "25% or less" 2 "26% to 50%" 3 "51 to 75%" 4 "76% to 90%" 5 "More than 90%", replace tab BCBG04 *Recode question 5 for ascending order* replace BCBG05A=11 if BCBG05A==1 replace BCBG05A=12 if BCBG05A==2 replace BCBG05A=13 if BCBG05A==3 replace BCBG05A=14 if BCBG05A==4 replace BCBG05A=15 if BCBG05A==5 replace BCBG05A=16 if BCBG05A==6 replace BCBG05A=1 if BCBG05A==16 244 replace BCBG05A=2 if BCBG05A==15 replace BCBG05A=3 if BCBG05A==14 replace BCBG05A=4 if BCBG05A==13 replace BCBG05A=5 if BCBG05A==12 replace BCBG05A=6 if BCBG05A==11 label define BCBG05A 1 "3,000 people or fewer" 2 "3,001 to 15,000 people" 3 "15,001 to 50,000 people" /// 4 "50,001 to 100,000 people" 5 "100,001 to 500,000 people" 6 "More than 500,000 people", replace tab BCBG05A replace BCBG05B=6 if BCBG05B==1 replace BCBG05B=7 if BCBG05B==2 replace BCBG05B=8 if BCBG05B==4 replace BCBG05B=10 if BCBG05B==5 replace BCBG05B=5 if BCBG05B==6 replace BCBG05B=4 if BCBG05B==7 replace BCBG05B=2 if BCBG05B==8 replace BCBG05B=1 if BCBG05B==10 label define BCBG05B 1 "Remote rural" 2 "Small town or village" 3 "Medium city or large town" 4 "Suburban" 5 "Urban", replace tab BCBG05B replace BCBG05C=4 if BCBG05C==1 replace BCBG05C=1 if BCBG05C==3 replace BCBG05C=3 if BCBG05C==4 label define BCBG05C 1 "Low" 2 "Medium" 3 "High", replace tab BCBG05C *Recode 6c for ascending order* 245 replace BCBG06C=11 if BCBG06C==1 replace BCBG06C=12 if BCBG06C==2 replace BCBG06C=13 if BCBG06C==3 replace BCBG06C=14 if BCBG06C==4 replace BCBG06C=15 if BCBG06C==5 replace BCBG06C=16 if BCBG06C==6 replace BCBG06C=1 if BCBG06C==16 replace BCBG06C=2 if BCBG06C==15 replace BCBG06C=3 if BCBG06C==14 replace BCBG06C=4 if BCBG06C==13 replace BCBG06C=5 if BCBG06C==12 replace BCBG06C=6 if BCBG06C==11 label define BCBG06C 1 "Other" 2 "4 days" 3 "4.5 days" 4 "5 days" 5 "5.5 days" 6 "6 days", replace tab BCBG06C *Recode question 8 so no=0* replace BCBG08A=0 if BCBG08A==2 label define BCBG08A 0 "no" 1 "yes", replace tab BCBG08A replace BCBG08B=0 if BCBG08B==2 label define BCBG08B 0 "no" 1 "yes", replace tab BCBG08B *Recode question 11 to be in ascending order* replace BCBG11A=6 if BCBG11A==1 replace BCBG11A=7 if BCBG11A==2 replace BCBG11A=8 if BCBG11A==4 replace BCBG11A=10 if BCBG11A==5 246 replace BCBG11A=5 if BCBG11A==6 replace BCBG11A=4 if BCBG11A==7 replace BCBG11A=2 if BCBG11A==8 replace BCBG11A=1 if BCBG11A==10 label define BCBG11A 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BCBG11A replace BCBG11B=6 if BCBG11B==1 replace BCBG11B=7 if BCBG11B==2 replace BCBG11B=8 if BCBG11B==4 replace BCBG11B=10 if BCBG11B==5 replace BCBG11B=5 if BCBG11B==6 replace BCBG11B=4 if BCBG11B==7 replace BCBG11B=2 if BCBG11B==8 replace BCBG11B=1 if BCBG11B==10 label define BCBG11B 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BCBG11B replace BCBG11C=6 if BCBG11C==1 replace BCBG11C=7 if BCBG11C==2 replace BCBG11C=8 if BCBG11C==4 replace BCBG11C=10 if BCBG11C==5 replace BCBG11C=5 if BCBG11C==6 replace BCBG11C=4 if BCBG11C==7 replace BCBG11C=2 if BCBG11C==8 replace BCBG11C=1 if BCBG11C==10 label define BCBG11C 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BCBG11C 247 replace BCBG11D=6 if BCBG11D==1 replace BCBG11D=7 if BCBG11D==2 replace BCBG11D=8 
if BCBG11D==4 replace BCBG11D=10 if BCBG11D==5 replace BCBG11D=5 if BCBG11D==6 replace BCBG11D=4 if BCBG11D==7 replace BCBG11D=2 if BCBG11D==8 replace BCBG11D=1 if BCBG11D==10 label define BCBG11D 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BCBG11D replace BCBG11E=6 if BCBG11E==1 replace BCBG11E=7 if BCBG11E==2 replace BCBG11E=8 if BCBG11E==4 replace BCBG11E=10 if BCBG11E==5 replace BCBG11E=5 if BCBG11E==6 replace BCBG11E=4 if BCBG11E==7 replace BCBG11E=2 if BCBG11E==8 replace BCBG11E=1 if BCBG11E==10 label define BCBG11E 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BCBG11E replace BCBG11F=6 if BCBG11F==1 replace BCBG11F=7 if BCBG11F==2 replace BCBG11F=8 if BCBG11F==4 replace BCBG11F=10 if BCBG11F==5 replace BCBG11F=5 if BCBG11F==6 248 replace BCBG11F=4 if BCBG11F==7 replace BCBG11F=2 if BCBG11F==8 replace BCBG11F=1 if BCBG11F==10 label define BCBG11F 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BCBG11F replace BCBG11G=6 if BCBG11G==1 replace BCBG11G=7 if BCBG11G==2 replace BCBG11G=8 if BCBG11G==4 replace BCBG11G=10 if BCBG11G==5 replace BCBG11G=5 if BCBG11G==6 replace BCBG11G=4 if BCBG11G==7 replace BCBG11G=2 if BCBG11G==8 replace BCBG11G=1 if BCBG11G==10 label define BCBG11G 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BCBG11G replace BCBG11H=6 if BCBG11H==1 replace BCBG11H=7 if BCBG11H==2 replace BCBG11H=8 if BCBG11H==4 replace BCBG11H=10 if BCBG11H==5 replace BCBG11H=5 if BCBG11H==6 replace BCBG11H=4 if BCBG11H==7 replace BCBG11H=2 if BCBG11H==8 replace BCBG11H=1 if BCBG11H==10 label define BCBG11H 1 "Very Low" 2 "Low" 3 "Medium" 4 "High" 5 "Very High", replace tab BCBG11H 249 *Recode question 13 so no=0* replace BCBG13A=0 if BCBG13A==2 label define BCBG13A 0 "no" 1 "yes", replace tab BCBG13A replace BCBG13B=0 if BCBG13B==2 label define BCBG13B 0 "no" 1 "yes", replace tab BCBG13B replace BCBG13C=0 if BCBG13C==2 label define BCBG13C 0 "no" 1 "yes", replace tab BCBG13C replace BCBG13D=0 if BCBG13D==2 label define BCBG13D 0 "no" 1 "yes", replace tab BCBG13D *Recode question 14 so no=0* replace BCBG14A=0 if BCBG14A==2 label define BCBG14A 0 "no" 1 "yes", replace tab BCBG14A replace BCBG14B=0 if BCBG14B==2 label define BCBG14B 0 "no" 1 "yes", replace tab BCBG14B replace BCBG14C=0 if BCBG14C==2 label define BCBG14C 0 "no" 1 "yes", replace tab BCBG14C replace BCBG14D=0 if BCBG14D==2 label define BCBG14D 0 "no" 1 "yes", replace tab BCBG14D 250 *Recode question 16 so no=0" replace BCBG16A=0 if BCBG16A==2 label define BCBG16A 0 "no" 1 "yes", replace tab BCBG16A replace BCBG16B=0 if BCBG16B==2 label define BCBG16B 0 "no" 1 "yes", replace tab BCBG16B replace BCBG16C=0 if BCBG16C==2 label define BCBG16C 0 "no" 1 "yes", replace tab BCBG16C *Recode composite variables so they are in ascending order* replace bcdgeas=4 if bcdgeas==1 replace bcdgeas=1 if bcdgeas==3 replace bcdgeas=3 if bcdgeas==4 label define BCDGEAS 1 "Medium Emphasis" 2 "High Emphasis" 3 "Very High Emphasis", replace tab bcdgeas replace bcdgcmp=5 if bcdgcmp==1 replace bcdgcmp=6 if bcdgcmp==2 replace bcdgcmp=2 if bcdgcmp==3 replace bcdgcmp=1 if bcdgcmp==4 replace bcdgcmp=4 if bcdgcmp==5 replace bcdgcmp=3 if bcdgcmp==6 label define BCDGCMP 1 "No computers available" 2 "1 computer for 6 or more students" 3 "1 computer for 3-5 students" 4 "1 computer for 1-2 students", replace tab bcdgcmp 251 replace BCDG03=4 if BCDG03==1 replace BCDG03=1 if BCDG03==3 replace BCDG03=3 if BCDG03==4 label define BCDG03 1 "More disadvantaged" 2 "Neither more affluent 
or disadvantaged" 3 "More affluent", replace tab BCDG03 *Create teacher assessment composite* gen a14flag=1 if BCBG14A!=. gen b14flag=1 if BCBG14B!=. gen c14flag=1 if BCBG14C!=. egen teevalden=rowtotal(a14flag - c14flag) egen teevalnum=rowtotal(BCBG14A - BCBG14C) gen teeval= teevalnum/teevalden drop a14flag - c14flag teevalnum teevalden ******************************************************************************** *Merge in student country level variables merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Chile.dta", nogenerate update replace merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Finland.dta", nogenerate update replace merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Ghana.dta", nogenerate update replace merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Korea.dta", nogenerate update replace merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Singapore.dta", nogenerate update replace 252 merge m:m idcntry idstud using "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\USA.dta", nogenerate update replace label define achper 1 "Low Ach" 2 "Middle Ach" 3 "High Ach", replace label values achper achper tab achper label define SESper 1 "Low SES" 2 "Middle SES" 3 "High SES", replace label values SESper SESper tab SESper *Put in survey mode for school* svyset jkrep [pweight=totwgt], strata(jkzone) vce(jackknife) mse singleunit(missing) ******************************************************************************* ***School Analysis*** *Student SES differences by school characteristics* *svy jackknife : mlogit SESper BSSSCI01 bcbgdas bcbgeas bcbgsrs BCBG03A teeval, baseoutcome(2) rrr *Low SES student achievement differences by school characteristics* *svy jackknife, subpop(if SESper==1) : mlogit achper bcbgdas bcbgeas bcbgsrs BCBG03A teeval, baseoutcome(2) rrr 253 Stata code for student factor analysis ***Creating student composites using factors analysis*** set more off *Run student cleaning file first* factor BSBG11* BSBG13* BSBS17* BSBS18* BSBS19* rotate predict Factor1-Factor6 *Student compsite variables reliability checks based on factor analysis alpha BSBS17A BSBS17C BSBS17E BSBS17F BSBS18C BSBS18D BSBS18E, std item alpha BSBS17G BSBS19J BSBS19K BSBS19L BSBS19M BSBS19N, std item alpha BSBS18A BSBS19A BSBS19D BSBS19F BSBS19G BSBS19H, std item alpha BSBS17D BSBS19B BSBS19C BSBS19E BSBS19I, std item alpha BSBG13*, std item alpha BSBG11*, std item *Variable 1 - Interest and Enjoyment of Science gen intenj=. replace intenj=((BSBS17A*.18)+(BSBS17C*.11)+(BSBS17E*.16)+(BSBS17F*.19)+(BSBS18C*.10)+(BSBS18D*.14)+(B SBS18E*.12)) replace intenj=. if BSBS17A==.|BSBS17C==.|BSBS17E==.|BSBS17F==.|BSBS18C==.|BSBS18D==.|BSBS18E==. *Variable 2 - Value of Science gen valsci=. replace valsci=((BSBS17G*.13)+ (BSBS19J*.15)+ (BSBS19K*.15)+ (BSBS19L*.19)+ (BSBS19M*.21)+ (BSBS19N*.16)) 254 replace valsci=. if BSBS17G==.| BSBS19J==.| BSBS19K==.| BSBS19L==.| BSBS19M==.| BSBS19N==. *Variable 3 - Positive Science Affect gen possci=. replace possci=((BSBS18A*.12)+(BSBS19A*.17)+(BSBS19D*.16)+(BSBS19F*.17)+(BSBS19G*.19)+(BSBS19H*.19)) replace possci=. if BSBS18A==.| BSBS19A==.| BSBS19D==.| BSBS19F==.| BSBS19G==.| BSBS19H==. *Variable 4 - Negative Science Affect gen negsci=. 
replace negsci=((BSBS17D*.15)+(BSBS19B*.22)+(BSBS19C*.2)+(BSBS19E*.21)+(BSBS19I*.22)) replace negsci=. if BSBS17D==.| BSBS19B==.| BSBS19C==.| BSBS19E==.| BSBS19I==. *Variable 5 - Bullied gen bully=. replace bully=((BSBG13A*.17)+(BSBG13B*.17)+(BSBG13C*.17)+(BSBG13D*.15)+(BSBG13E*.18)+(BSBG13F*.16)) replace bully=. if BSBG13A==.|BSBG13B==.|BSBG13C==.|BSBG13D==.|BSBG13E==.|BSBG13F==. *Variable 6 - Parent Involvement gen parent=. replace parent=((BSBG11A*.26)+(BSBG11B*.24)+(BSBG11C*.26)+(BSBG11D*.24)) replace parent=. if BSBG11A==.|BSBG11B==.|BSBG11C==.|BSBG11D==. drop Factor* 255 Stata code for teacher factor analysis ***Creating teacher composites using factors analysis*** *Run teacher cleaning file first* set more off factor BTBG01 BTBG04 scimaj BTBG12 BTBS17A BTBG06D BTBG06E BTBG06H BTBG10A BTBG10B /// BTBG10C BTBG10D BTBG10E BTBG14D BTBG14E BTBG15A BTBG15B BTBG15C BTBG15D BTBG15E BTBG15F BTBS19A /// BTBS19B BTBS19C BTBS19D BTBS19G BTBS19H BTBS19I rotate predict Factor1-Factor6 *Teacher compsite variables reliability checks based on factor analysis alpha BTBG10A BTBG10B BTBG10C BTBG10E, std item alpha BTBG15A BTBG15B BTBG15C BTBG15D BTBG15E BTBG15F, std item alpha BTBS19A BTBS19B BTBS19C BTBS19D, std item alpha BTBG06D BTBG06E BTBG06H, std item alpha BTBG14D BTBG14E, std item *Variable 1 - Teacher Cooperation gen tecoop=. replace tecoop=((BTBG10A*.26)+(BTBG10B*.26)+(BTBG10C*.24)+(BTBG10E*.25)) replace tecoop=. if BTBG10A==.| BTBG10B==.| BTBG10C==.| BTBG10E==. *Variable 2 - Classroom Limitations gen telimi=. replace telimi=((BTBG15A*.15)+(BTBG15B*.13)+(BTBG15C*.17)+(BTBG15D*.16)+(BTBG15E*.19)+(BTBG15F*.2)) replace telimi=. if BTBG15A==.| BTBG15B==.| BTBG15C==.| BTBG15D==.| BTBG15E==.| BTBG15F==. 256 *Variable 3 - Science Pedagogy gen sciped=. replace sciped=((BTBS19A*.18)+(BTBS19B*.25)+(BTBS19C*.29)+(BTBS19D*.28)) replace sciped=. if BTBS19A==.| BTBS19B==.| BTBS19C==.| BTBS19D==. *Variable 4 - Expectations gen expect=. replace expect=((BTBG06D*.29)+(BTBG06E*.34)+(BTBG06H*.37)) replace expect=. if BTBG06D==.| BTBG06E==.| BTBG06H==. *Factor 5 - Teachers supporting students gen tesupp=. replace tesupp=((BTBG14D*.5)+(BTBG14E*.5)) replace tesupp=. if BTBG14D==.| BTBG14E==. drop Factor* 257 Stata code for school factor analysis ***Creating school composites using factors analysis*** *Run school cleaning file first* set more off factor BCBG03A BCBG05B BCBG10AA BCBG10CB BCBG10CG BCBG11E BCBG12AA BCBG12AB BCBG12AH BCBG14A BCBG15B BCBG17B BCBG17D rotate predict Factor1 Factor2 *School compsite variables reliability checks based on factor analysis alpha BCBG12AA BCBG12AB, std item alpha BCBG10AA BCBG10CB BCBG10CG, std item *Create composite variables based on weighting, reduce by factor of 1 to preserve likert scaling *Sum each varible then divide variable weight by sum to get percentage of contribution *Variable 1 - School Climate gen schcli=. replace schcli=((BCBG12AA*.5)+(BCBG12AB*.5)) replace schcli=. if BCBG12AA==.|BCBG12AB==. *Variable 2 - School Parent Involvement gen schpar=. replace schpar=((BCBG10AA*.3)+(BCBG10CB*.31)+(BCBG10CG*.39)) replace schpar=. if BCBG10AA==.|BCBG10CB==.|BCBG10CG==. 
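*[Illustrative aside, not part of the original do-file] The composite weights used
*above were entered by hand; the preceding comments describe them as each item's
*rotated loading divided by the sum of the loadings on its factor, so that the
*weights sum to one and the composite keeps the original Likert scaling. Below is a
*minimal sketch of how such weights could be recovered, assuming the -factor-/-rotate-
*results from this file are still the active estimates; the matrix name W and the
*scalar wsum are arbitrary, and "Factor2" reflects the factor these items load on in
*the output reported in Table A2.5 (adjust the column name if the factor order differs).
matrix W = e(r_L)
scalar wsum = W["BCBG10AA","Factor2"] + W["BCBG10CB","Factor2"] + W["BCBG10CG","Factor2"]
foreach v in BCBG10AA BCBG10CB BCBG10CG {
	display "`v' weight = " %5.3f W["`v'","Factor2"]/wsum
}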
drop Factor*

Stata code for creating percentiles

***Used to create percentiles within each country***
clear

*Chile
use "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\SESall.dta"
drop if idcntry!=152
xtile achper=BSSSCI01, n(3)
label define achper 1 "Low Science Achievement" 2 "Middle Science Achievement" ///
3 "High Science Achievement", replace
label values achper achper
tab achper
xtile SESper=SES, n(3)
label define SESper 1 "Low SES" 2 "Middle SES" 3 "High SES", replace
label values SESper SESper
tab SESper
save "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Chile.dta", replace

*Finland
clear
use "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\SESall.dta"
drop if idcntry!=246
xtile achper=BSSSCI01, n(3)
label define achper 1 "Low Science Achievement" 2 "Middle Science Achievement" ///
3 "High Science Achievement", replace
label values achper achper
tab achper
xtile SESper=SES, n(3)
label define SESper 1 "Low SES" 2 "Middle SES" 3 "High SES", replace
label values SESper SESper
tab SESper
save "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Finland.dta", replace

*Ghana
clear
use "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\SESall.dta"
drop if idcntry!=288
xtile achper=BSSSCI01, n(3)
label define achper 1 "Low Science Achievement" 2 "Middle Science Achievement" ///
3 "High Science Achievement", replace
label values achper achper
tab achper
xtile SESper=SES, n(3)
label define SESper 1 "Low SES" 2 "Middle SES" 3 "High SES", replace
label values SESper SESper
tab SESper
save "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Ghana.dta", replace

*Korea
clear
use "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\SESall.dta"
drop if idcntry!=410
xtile achper=BSSSCI01, n(3)
label define achper 1 "Low Science Achievement" 2 "Middle Science Achievement" ///
3 "High Science Achievement", replace
label values achper achper
tab achper
xtile SESper=SES, n(3)
label define SESper 1 "Low SES" 2 "Middle SES" 3 "High SES", replace
label values SESper SESper
tab SESper
save "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Korea.dta", replace

*Singapore
clear
use "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\SESall.dta"
drop if idcntry!=702
xtile achper=BSSSCI01, n(3)
label define achper 1 "Low Science Achievement" 2 "Middle Science Achievement" ///
3 "High Science Achievement", replace
label values achper achper
tab achper
xtile SESper=SES, n(3)
label define SESper 1 "Low SES" 2 "Middle SES" 3 "High SES", replace
label values SESper SESper
tab SESper
save "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\Singapore.dta", replace

*USA
clear
use "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\SESall.dta"
drop if idcntry!=840
xtile achper=BSSSCI01, n(3)
label define achper 1 "Low Science Achievement" 2 "Middle Science Achievement" ///
3 "High Science Achievement", replace
label values achper achper
tab achper
xtile SESper=SES, n(3)
label define SESper 1 "Low SES" 2 "Middle SES" 3 "High SES", replace
label values SESper SESper
tab SESper
save "C:\Users\educ.brunerju\Google Drive\Dissertation\Do files\Percentiles\USA.dta", replace

Stata file for descriptive statistics and correlations

***Descriptive Statistics***
***Need to run cleaning and factor files first***
***SPSS and IEA analyzer were used for achievement statistics***
*student by idcntry:
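*[Illustrative aside, not part of the original file] The "by idcntry" note above can
*be made explicit: with the data sorted on idcntry, the same student descriptives can
*be produced separately for each of the six countries using the by: prefix. A minimal
*sketch (the pooled versions follow below, as in the original):
sort idcntry
by idcntry: summarize BSBG01 intenj negsci bully parent SES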
sum BSBG01 intenj negsci bully parent SES pwcorr BSSSCI01 BSBG01 intenj negsci bully parent SES, sig *teacher by idcntry: sum BTBG01 BTBG04 BTBS17A scimaj tecoop telimi sciped expect tesupp pwcorr BTBG01 BTBG04 BTBS17A scimaj tecoop telimi sciped expect tesupp, sig *school by idcntry: sum BCBG03A BCBG05B schcli schpar pwcorr BCBG03A BCBG05B schcli schpar, sig 263 Stata file for OLS regressions ***Simple Linear Regressions*** *Use corresponding setup file first and student factor analysis second then make sure survey mode is set* *There will always be two regressions per country, one whole sample and one low SES only* *Student Regressions Achievement* *Chile svy jackknife, subpop(if idcntry==152): reg BSSSCI01 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==152): reg BSSSCI02 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==152): reg BSSSCI03 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==152): reg BSSSCI04 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==152): reg BSSSCI05 BSBG01 intenj negsci bully parent *Finland svy jackknife, subpop(if idcntry==246): reg BSSSCI01 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==246): reg BSSSCI02 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==246): reg BSSSCI03 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==246): reg BSSSCI04 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==246): reg BSSSCI05 BSBG01 intenj negsci bully parent *Ghana svy jackknife, subpop(if idcntry==288): reg BSSSCI01 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==288): reg BSSSCI02 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==288): reg BSSSCI03 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==288): reg BSSSCI04 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==288): reg BSSSCI05 BSBG01 intenj negsci bully parent *Korea 264 svy jackknife, subpop(if idcntry==410): reg BSSSCI01 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==410): reg BSSSCI02 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==410): reg BSSSCI03 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==410): reg BSSSCI04 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==410): reg BSSSCI05 BSBG01 intenj negsci bully parent *Singapore svy jackknife, subpop(if idcntry==702): reg BSSSCI01 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==702): reg BSSSCI02 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==702): reg BSSSCI03 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==702): reg BSSSCI04 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==702): reg BSSSCI05 BSBG01 intenj negsci bully parent *USA svy jackknife, subpop(if idcntry==840): reg BSSSCI01 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==840): reg BSSSCI02 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==840): reg BSSSCI03 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==840): reg BSSSCI04 BSBG01 intenj negsci bully parent svy jackknife, subpop(if idcntry==840): reg BSSSCI05 BSBG01 intenj negsci bully parent ***School Regressions*** *Run school setup file first and factor analysis second* *Chile svy jackknife, subpop(if idcntry==152):reg BSSSCI01 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==152):reg BSSSCI02 BCBG03A 
BCBG05B schcli schpar svy jackknife, subpop(if idcntry==152):reg BSSSCI03 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==152):reg BSSSCI04 BCBG03A BCBG05B schcli schpar 265 svy jackknife, subpop(if idcntry==152):reg BSSSCI05 BCBG03A BCBG05B schcli schpar *Finland svy jackknife, subpop(if idcntry==246):reg BSSSCI01 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==246):reg BSSSCI02 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==246):reg BSSSCI03 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==246):reg BSSSCI04 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==246):reg BSSSCI05 BCBG03A BCBG05B schcli schpar *Ghana svy jackknife, subpop(if idcntry==288):reg BSSSCI01 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==288):reg BSSSCI02 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==288):reg BSSSCI03 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==288):reg BSSSCI04 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==288):reg BSSSCI05 BCBG03A BCBG05B schcli schpar *Korea svy jackknife, subpop(if idcntry==410):reg BSSSCI01 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==410):reg BSSSCI02 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==410):reg BSSSCI03 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==410):reg BSSSCI04 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==410):reg BSSSCI05 BCBG03A BCBG05B schcli schpar *Signapore svy jackknife, subpop(if idcntry==702):reg BSSSCI01 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==702):reg BSSSCI02 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==702):reg BSSSCI03 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==702):reg BSSSCI04 BCBG03A BCBG05B schcli schpar 266 svy jackknife, subpop(if idcntry==702):reg BSSSCI05 BCBG03A BCBG05B schcli schpar *USA svy jackknife, subpop(if idcntry==840):reg BSSSCI01 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==840):reg BSSSCI02 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==840):reg BSSSCI03 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==840):reg BSSSCI04 BCBG03A BCBG05B schcli schpar svy jackknife, subpop(if idcntry==840):reg BSSSCI05 BCBG03A BCBG05B schcli schpar ***Teacher Regressions*** *Run teacher setup file first* *Chile svy jackknife, subpop(if idcntry==152):reg BSSSCI01 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==152):reg BSSSCI02 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==152):reg BSSSCI03 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==152):reg BSSSCI04 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==152):reg BSSSCI05 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp *Finland svy jackknife, subpop(if idcntry==246):reg BSSSCI01 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp 267 svy jackknife, subpop(if idcntry==246):reg BSSSCI02 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==246):reg BSSSCI03 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==246):reg BSSSCI04 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==246):reg BSSSCI05 BTBG01 BTBG04 
scimaj BTBS17A tecoop telimi sciped expect tesupp *Ghana svy jackknife, subpop(if idcntry==288):reg BSSSCI01 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==288):reg BSSSCI02 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==288):reg BSSSCI03 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==288):reg BSSSCI04 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==288):reg BSSSCI05 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp *Korea svy jackknife, subpop(if idcntry==410):reg BSSSCI01 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==410):reg BSSSCI02 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==410):reg BSSSCI03 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp 268 svy jackknife, subpop(if idcntry==410):reg BSSSCI04 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==410):reg BSSSCI05 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp *Singapore svy jackknife, subpop(if idcntry==702):reg BSSSCI01 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==702):reg BSSSCI02 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==702):reg BSSSCI03 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==702):reg BSSSCI04 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==702):reg BSSSCI05 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp *USA svy jackknife, subpop(if idcntry==840):reg BSSSCI01 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==840):reg BSSSCI02 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==840):reg BSSSCI03 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==840):reg BSSSCI04 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp svy jackknife, subpop(if idcntry==840):reg BSSSCI05 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp 269 Stata code for binary regression cleaning and analysis ***File for running binary regressions*** clear set more off use "C:\Users\educ.brunerju\Google Drive\Dissertation\Outputs\Logistic Regressions\Binary Regressions\AllLevelsCountryVarFilter.dta" drop teeval ***Create low achievement and SES flags*** gen lowses=. replace lowses=1 if SESper==1 replace lowses=0 if SESper==2|SESper==3 gen lowach=. 
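*Note: SESper and achper are the within-country SES and achievement terciles built
*in the percentiles do-file; lowses and lowach flag membership in the bottom
*tercile (1 = lowest third, 0 = middle or top third, missing otherwise).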
replace lowach=1 if achper==1 replace lowach=0 if achper==2|achper==3 ***Create percentiles for other variables*** global raw intenj negsci bully parent tecoop telimi sciped expect tesupp /// schcli schpar urban ecdisa scihrs teaedu teaexp foreach var of global raw{ xtile perchi_`var'=`var' if idcntry==152, n(3) } foreach var of global raw{ xtile perfin_`var'=`var' if idcntry==246, n(3) } 270 foreach var of global raw{ xtile pergha_`var'=`var' if idcntry==288, n(3) } foreach var of global raw{ xtile perkor_`var'=`var' if idcntry==410, n(3) } foreach var of global raw{ xtile persin_`var'=`var' if idcntry==702, n(3) } foreach var of global raw{ xtile perusa_`var'=`var' if idcntry==840, n(3) } gen perintenj = max(perchi_intenj, perfin_intenj, pergha_intenj, perkor_intenj, persin_intenj, perusa_intenj) gen pernegsci = max(perchi_negsci, perfin_negsci, pergha_negsci, perkor_negsci, persin_negsci, perusa_negsci) gen perbully = max(perchi_bully, perfin_bully, pergha_bully, perkor_bully, persin_bully, perusa_bully) gen perparent = max(perchi_parent, perfin_parent, pergha_parent, perkor_parent, persin_parent, perusa_parent) gen pertecoop = max(perchi_tecoop, perfin_tecoop, pergha_tecoop, perkor_tecoop, persin_tecoop, perusa_tecoop) gen pertelimi = max(perchi_telimi, perfin_telimi, pergha_telimi, perkor_telimi, persin_telimi, perusa_telimi) gen persciped = max(perchi_sciped, perfin_sciped, pergha_sciped, perkor_sciped, persin_sciped, perusa_sciped) gen perexpect = max(perchi_expect, perfin_expect, pergha_expect, perkor_expect, persin_expect, perusa_expect) gen pertesupp = max(perchi_tesupp, perfin_tesupp, pergha_tesupp, perkor_tesupp, persin_tesupp, perusa_tesupp) gen perschcli = max(perchi_schcli, perfin_schcli, pergha_schcli, perkor_schcli, persin_schcli, perusa_schcli) gen perschpar = max(perchi_schpar, perfin_schpar, pergha_schpar, perkor_schpar, persin_schpar, perusa_schpar) gen perurban = max(perchi_urban, perfin_urban, pergha_urban, perkor_urban, persin_urban, perusa_urban) 271 gen perecdisa = max(perchi_ecdisa, perfin_ecdisa, pergha_ecdisa, perkor_ecdisa, persin_ecdisa, perusa_ecdisa) gen perscihrs = max(perchi_scihrs, perfin_scihrs, pergha_scihrs, perkor_scihrs, persin_scihrs, perusa_scihrs) gen perteaedu = max(perchi_teaedu, perfin_teaedu, pergha_teaedu, perkor_teaedu, persin_teaedu, perusa_teaedu) gen perteaexp = max(perchi_teaexp, perfin_teaexp, pergha_teaexp, perkor_teaexp, persin_teaexp, perusa_teaexp) drop perchi* perfin* pergha* perkor* persin* perusa* *Create flags for each variable global percentile perintenj pernegsci perbully perparent pertecoop /// pertelimi persciped perexpect pertesupp perschcli perschpar perurban perecdisa perscihrs perteaedu perteaexp foreach var of global percentile{ gen low_`var'=. replace low_`var'=1 if `var'==1 replace low_`var'=0 if `var'==2|`var'==3 } foreach var of global percentile{ gen hi_`var'=. 
replace hi_`var'=1 if `var'==3 replace hi_`var'=0 if `var'==1|`var'==2 } *Create interaction terms with low SES* global low low_perintenj low_pernegsci low_perbully low_perparent low_pertecoop /// low_pertelimi low_persciped low_perexpect low_pertesupp low_perschcli low_perschpar /// low_perurban low_perecdisa low_perscihrs low_perteaedu low_perteaexp girl scimaj foreach var of global low{ 272 gen sesx_`var'=lowses*`var' } global high hi_perintenj hi_pernegsci hi_perbully hi_perparent hi_pertecoop /// hi_pertelimi hi_persciped hi_perexpect hi_pertesupp hi_perschcli hi_perschpar hi_perurban /// hi_perecdisa hi_perscihrs hi_perteaedu hi_perteaexp foreach var of global high{ gen sesx_`var'=lowses*`var' } ***Binary Regressions Analysis*** *Run Binary Regression Cleaning File First* set more off sort idcntry *Put dataset in survey mode for teacher descriptives* svyset jkrep [pweight=sciwgt], strata(jkzone) vce(jackknife) mse singleunit(missing) *Run Individual Regessions* svy jackknife, subpop(if idcntry==152) : logistic lowach lowses girl sesx_girl svy jackknife, subpop(if idcntry==246) : logistic lowach lowses girl sesx_girl svy jackknife, subpop(if idcntry==288) : logistic lowach lowses girl sesx_girl svy jackknife, subpop(if idcntry==410) : logistic lowach lowses girl sesx_girl svy jackknife, subpop(if idcntry==702) : logistic lowach lowses girl sesx_girl svy jackknife, subpop(if idcntry==840) : logistic lowach lowses girl sesx_girl svy jackknife, subpop(if idcntry==152) : logistic lowach lowses hi_perintenj sesx_hi_perintenj svy jackknife, subpop(if idcntry==246) : logistic lowach lowses hi_perintenj sesx_hi_perintenj svy jackknife, subpop(if idcntry==288) : logistic lowach lowses hi_perintenj sesx_hi_perintenj 273 svy jackknife, subpop(if idcntry==410) : logistic lowach lowses hi_perintenj sesx_hi_perintenj svy jackknife, subpop(if idcntry==702) : logistic lowach lowses hi_perintenj sesx_hi_perintenj svy jackknife, subpop(if idcntry==840) : logistic lowach lowses hi_perintenj sesx_hi_perintenj svy jackknife, subpop(if idcntry==152) : logistic lowach lowses low_pernegsci sesx_low_pernegsci svy jackknife, subpop(if idcntry==246) : logistic lowach lowses low_pernegsci sesx_low_pernegsci svy jackknife, subpop(if idcntry==288) : logistic lowach lowses low_pernegsci sesx_low_pernegsci svy jackknife, subpop(if idcntry==410) : logistic lowach lowses low_pernegsci sesx_low_pernegsci svy jackknife, subpop(if idcntry==702) : logistic lowach lowses low_pernegsci sesx_low_pernegsci svy jackknife, subpop(if idcntry==840) : logistic lowach lowses low_pernegsci sesx_low_pernegsci svy jackknife, subpop(if idcntry==152) : logistic lowach lowses low_perecdisa sesx_low_perecdisa svy jackknife, subpop(if idcntry==246) : logistic lowach lowses low_perecdisa sesx_low_perecdisa svy jackknife, subpop(if idcntry==288) : logistic lowach lowses low_perecdisa sesx_low_perecdisa svy jackknife, subpop(if idcntry==410) : logistic lowach lowses low_perecdisa sesx_low_perecdisa svy jackknife, subpop(if idcntry==702) : logistic lowach lowses low_perecdisa sesx_low_perecdisa svy jackknife, subpop(if idcntry==840) : logistic lowach lowses low_perecdisa sesx_low_perecdisa svy jackknife, subpop(if idcntry==152) : logistic lowach lowses hi_perurban sesx_hi_perurban svy jackknife, subpop(if idcntry==246) : logistic lowach lowses hi_perurban sesx_hi_perurban svy jackknife, subpop(if idcntry==288) : logistic lowach lowses hi_perurban sesx_hi_perurban svy jackknife, subpop(if idcntry==410) : logistic lowach lowses hi_perurban 
sesx_hi_perurban svy jackknife, subpop(if idcntry==702) : logistic lowach lowses hi_perurban sesx_hi_perurban svy jackknife, subpop(if idcntry==840) : logistic lowach lowses hi_perurban sesx_hi_perurban svy jackknife, subpop(if idcntry==152) : logistic lowach lowses low_perschcli sesx_low_perschcli svy jackknife, subpop(if idcntry==246) : logistic lowach lowses low_perschcli sesx_low_perschcli svy jackknife, subpop(if idcntry==288) : logistic lowach lowses low_perschcli sesx_low_perschcli 274 svy jackknife, subpop(if idcntry==410) : logistic lowach lowses low_perschcli sesx_low_perschcli svy jackknife, subpop(if idcntry==702) : logistic lowach lowses low_perschcli sesx_low_perschcli svy jackknife, subpop(if idcntry==840) : logistic lowach lowses low_perschcli sesx_low_perschcli svy jackknife, subpop(if idcntry==152) : logistic lowach lowses hi_perteaexp sesx_hi_perteaexp svy jackknife, subpop(if idcntry==246) : logistic lowach lowses hi_perteaexp sesx_hi_perteaexp svy jackknife, subpop(if idcntry==288) : logistic lowach lowses hi_perteaexp sesx_hi_perteaexp svy jackknife, subpop(if idcntry==410) : logistic lowach lowses hi_perteaexp sesx_hi_perteaexp svy jackknife, subpop(if idcntry==702) : logistic lowach lowses hi_perteaexp sesx_hi_perteaexp svy jackknife, subpop(if idcntry==840) : logistic lowach lowses hi_perteaexp sesx_hi_perteaexp svy jackknife, subpop(if idcntry==152) : logistic lowach lowses scimaj sesx_scimaj svy jackknife, subpop(if idcntry==246) : logistic lowach lowses scimaj sesx_scimaj svy jackknife, subpop(if idcntry==288) : logistic lowach lowses scimaj sesx_scimaj svy jackknife, subpop(if idcntry==410) : logistic lowach lowses scimaj sesx_scimaj svy jackknife, subpop(if idcntry==702) : logistic lowach lowses scimaj sesx_scimaj svy jackknife, subpop(if idcntry==840) : logistic lowach lowses scimaj sesx_scimaj svy jackknife, subpop(if idcntry==152) : logistic lowach lowses low_pertelimi sesx_low_pertelimi svy jackknife, subpop(if idcntry==246) : logistic lowach lowses low_pertelimi sesx_low_pertelimi svy jackknife, subpop(if idcntry==288) : logistic lowach lowses low_pertelimi sesx_low_pertelimi svy jackknife, subpop(if idcntry==410) : logistic lowach lowses low_pertelimi sesx_low_pertelimi svy jackknife, subpop(if idcntry==702) : logistic lowach lowses low_pertelimi sesx_low_pertelimi svy jackknife, subpop(if idcntry==840) : logistic lowach lowses low_pertelimi sesx_low_pertelimi svy jackknife, subpop(if idcntry==152) : logistic lowach lowses hi_persciped sesx_hi_persciped svy jackknife, subpop(if idcntry==246) : logistic lowach lowses hi_persciped sesx_hi_persciped svy jackknife, subpop(if idcntry==288) : logistic lowach lowses hi_persciped sesx_hi_persciped 275 svy jackknife, subpop(if idcntry==410) : logistic lowach lowses hi_persciped sesx_hi_persciped svy jackknife, subpop(if idcntry==702) : logistic lowach lowses hi_persciped sesx_hi_persciped svy jackknife, subpop(if idcntry==840) : logistic lowach lowses hi_persciped sesx_hi_persciped svy jackknife, subpop(if idcntry==152) : logistic lowach lowses hi_perexpect sesx_hi_perexpect svy jackknife, subpop(if idcntry==246) : logistic lowach lowses hi_perexpect sesx_hi_perexpect svy jackknife, subpop(if idcntry==288) : logistic lowach lowses hi_perexpect sesx_hi_perexpect svy jackknife, subpop(if idcntry==410) : logistic lowach lowses hi_perexpect sesx_hi_perexpect svy jackknife, subpop(if idcntry==702) : logistic lowach lowses hi_perexpect sesx_hi_perexpect svy jackknife, subpop(if idcntry==840) : logistic lowach lowses 
hi_perexpect sesx_hi_perexpect *Run regression with variables as a block* svy jackknife, subpop(if idcntry==152) : logistic lowach lowses girl hi_perintenj low_pernegsci low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect svy jackknife, subpop(if idcntry==246) : logistic lowach lowses girl hi_perintenj low_pernegsci low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect svy jackknife, subpop(if idcntry==288) : logistic lowach lowses girl hi_perintenj low_pernegsci low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect svy jackknife, subpop(if idcntry==410) : logistic lowach lowses girl hi_perintenj low_pernegsci low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect svy jackknife, subpop(if idcntry==702) : logistic lowach lowses girl hi_perintenj low_pernegsci low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect svy jackknife, subpop(if idcntry==840) : logistic lowach lowses girl hi_perintenj low_pernegsci low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect svy jackknife, subpop(if idcntry==152) : logistic lowses lowach hi_perintenj low_pernegsci low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect 276 svy jackknife, subpop(if idcntry==246) : logistic lowses lowach hi_perintenj low_pernegsci low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect svy jackknife, subpop(if idcntry==288) : logistic lowses lowach hi_perintenj low_pernegsci low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect svy jackknife, subpop(if idcntry==410) : logistic lowses lowach hi_perintenj low_pernegsci low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect svy jackknife, subpop(if idcntry==702) : logistic lowses lowach hi_perintenj low_pernegsci low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect svy jackknife, subpop(if idcntry==840) : logistic lowses lowach hi_perintenj low_pernegsci low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect 277 Appendix 2 – Factor Analysis Table A2.1 - Student Factor Analysis Raw Outputs Factor analysis/correlation Method: principal factors Rotation: (unrotated) Number of obs = Retained factors = Number of params = 30720 13 390 Factor Eigenvalue Difference Proportion Cumulative Factor1 Factor2 Factor3 Factor4 Factor5 Factor6 Factor7 Factor8 Factor9 Factor10 Factor11 Factor12 Factor13 Factor14 Factor15 Factor16 Factor17 Factor18 Factor19 Factor20 Factor21 Factor22 Factor23 Factor24 Factor25 Factor26 Factor27 Factor28 Factor29 Factor30 Factor31 Factor32 Factor33 Factor34 Factor35 Factor36 10.88462 2.69460 1.71738 1.46246 1.00168 0.80193 0.44665 0.27508 0.17527 0.12358 0.05874 0.03231 0.01915 -0.00263 -0.01747 -0.03125 -0.04160 -0.04761 -0.06025 -0.06563 -0.07432 -0.07835 -0.08643 -0.08882 -0.09510 -0.10137 -0.11493 -0.11999 -0.12641 -0.13036 -0.13296 -0.13983 -0.15627 -0.17169 -0.17895 -0.19577 8.19003 0.97722 0.25491 0.46078 0.19975 0.35529 0.17157 0.09980 0.05170 0.06484 0.02643 0.01316 0.02179 0.01484 0.01378 0.01035 0.00601 0.01264 0.00539 0.00869 0.00403 0.00808 0.00239 0.00628 0.00627 0.01357 0.00505 0.00643 0.00394 0.00260 0.00687 0.01644 0.01542 0.00726 0.01682 . 
0.6243 0.1545 0.0985 0.0839 0.0575 0.0460 0.0256 0.0158 0.0101 0.0071 0.0034 0.0019 0.0011 -0.0002 -0.0010 -0.0018 -0.0024 -0.0027 -0.0035 -0.0038 -0.0043 -0.0045 -0.0050 -0.0051 -0.0055 -0.0058 -0.0066 -0.0069 -0.0073 -0.0075 -0.0076 -0.0080 -0.0090 -0.0098 -0.0103 -0.0112 0.6243 0.7788 0.8773 0.9612 1.0187 1.0647 1.0903 1.1060 1.1161 1.1232 1.1266 1.1284 1.1295 1.1294 1.1284 1.1266 1.1242 1.1214 1.1180 1.1142 1.1100 1.1055 1.1005 1.0954 1.0900 1.0841 1.0776 1.0707 1.0634 1.0559 1.0483 1.0403 1.0313 1.0215 1.0112 1.0000 LR test: independent vs. saturated: chi2(630) = 5.7e+05 Prob>chi2 = 0.0000 278 Table A2.1 (cont’d) Factor analysis/correlation Method: principal factors Rotation: orthogonal varimax (Kaiser off) Number of obs = Retained factors = Number of params = 30720 13 390 Factor Variance Difference Proportion Cumulative Factor1 Factor2 Factor3 Factor4 Factor5 Factor6 Factor7 Factor8 Factor9 Factor10 Factor11 Factor12 Factor13 4.14364 3.58137 3.37016 2.93905 2.04264 1.90082 0.86017 0.27798 0.25521 0.18836 0.05891 0.04187 0.03326 0.56227 0.21121 0.43111 0.89642 0.14182 1.04064 0.58219 0.02278 0.06684 0.12946 0.01703 0.00862 . 0.2377 0.2054 0.1933 0.1686 0.1172 0.1090 0.0493 0.0159 0.0146 0.0108 0.0034 0.0024 0.0019 0.2377 0.4431 0.6364 0.8049 0.9221 1.0311 1.0804 1.0964 1.1110 1.1218 1.1252 1.1276 1.1295 LR test: independent vs. saturated: chi2(630) = 5.7e+05 Prob>chi2 = 0.0000 Rotated factor loadings (pattern matrix) and unique variances Variable Factor1 Factor2 Factor3 Factor4 Factor5 Factor6 BSBG11A BSBG11B BSBG11C BSBG11D BSBG13A BSBG13B BSBG13C BSBG13D BSBG13E BSBG13F BSBS17A BSBS17B BSBS17C BSBS17D BSBS17E BSBS17F BSBS17G BSBS18A BSBS18B BSBS18C BSBS18D BSBS18E BSBS19A BSBS19B BSBS19C BSBS19D BSBS19E BSBS19F BSBS19G BSBS19H BSBS19I BSBS19J BSBS19K BSBS19L BSBS19M BSBS19N 0.0851 0.1013 0.1155 0.0800 0.0562 0.0656 -0.0008 0.0559 0.0303 0.0446 0.7188 -0.4531 0.4514 -0.4896 0.6432 0.7385 0.3717 0.3022 -0.2154 0.3955 0.5750 0.4685 0.3359 -0.1084 -0.2025 0.4001 -0.1450 0.3166 0.2315 0.2399 -0.1743 0.3865 0.2775 0.1697 0.1831 0.3299 0.0834 0.0852 0.1072 0.1029 0.0545 0.0781 0.0181 0.0768 0.0493 0.0821 0.2946 -0.1857 0.3296 -0.1554 0.2803 0.3122 0.5237 0.2070 -0.0932 0.1725 0.2559 0.2274 0.2245 -0.0442 -0.1346 0.2230 -0.0273 0.2276 0.2186 0.2250 -0.0361 0.6059 0.5980 0.7606 0.8107 0.6204 0.1303 0.0969 0.1568 0.1171 0.0105 0.0573 0.0262 0.0562 0.0457 0.0556 0.3142 -0.0657 0.2661 -0.0505 0.2321 0.3160 0.1892 0.4334 -0.0182 0.3660 0.3041 0.3708 0.5987 -0.2047 -0.2527 0.5718 -0.1599 0.6271 0.6988 0.6859 -0.1667 0.1733 0.1723 0.1812 0.1564 0.2480 -0.0095 -0.0277 -0.0179 0.0128 0.0386 0.0906 0.0453 0.0652 0.0450 0.1186 -0.2248 0.4728 -0.0158 0.4833 -0.1466 -0.2342 -0.0702 -0.0772 0.3272 -0.1312 -0.1048 -0.0631 -0.3194 0.7039 0.6434 -0.2947 0.6624 -0.2277 -0.1423 -0.1432 0.7171 -0.0563 0.0012 -0.0796 -0.0668 -0.1156 0.0096 -0.0374 0.0086 0.0566 0.5925 0.5706 0.5650 0.5109 0.6225 0.5484 0.0628 0.0173 0.1850 0.0461 0.0043 0.0556 0.0029 -0.0102 0.0880 -0.0165 0.0067 0.0160 0.0315 0.0731 -0.0004 0.0338 0.0955 0.0503 0.0315 0.0513 0.0577 0.0444 0.0625 0.0233 0.0421 0.0800 0.6766 0.6296 0.6628 0.6260 0.0039 0.0247 0.0156 0.0293 -0.0098 0.0037 0.0780 -0.0397 0.0895 -0.0117 0.0931 0.0653 0.1218 0.1731 -0.0583 0.0864 0.1285 0.1582 0.1187 -0.0018 -0.0022 0.0870 -0.0245 0.0972 0.1175 0.1273 0.0059 0.0829 0.0893 0.0782 0.0545 0.0437 279 Proposed model (six factors) - - - - - - - Factor 1 – Interest and Enjoyment of Science o 17A – I enjoy learning science o 17C – I read about science in my spare time o 17E 
– I learn many interesting things in science
o 17F – I like science
o 18C – My teacher is easy to understand
o 18D – I am interested in what my teacher says
o 18E – My teacher gives me interesting things to do
   Alpha - .8967
- Factor 2 – Value of science
o 17G – It is important to do well in science
o 19J – I think learning science will help me in my daily life
o 19K – I need science to learn other school subjects
o 19L – I need to do well in science to get into the university of my choice
o 19M – I need to do well in science to get the job I want
o 19N – I would like a job that involves using science
   Alpha - .8835
- Factor 3 – Positive Science Affect
o 18A – I know what my teacher expects me to do
o 19A – I usually do well in science
o 19D – I learn things quickly in science
o 19F – I am good at working out different science problems
o 19G – My teacher thinks I can do well in science with difficult materials
o 19H – My teacher tells me I am good at science
   Alpha - .885
- Factor 4 – Negative Science Affect
o 17D – Science is boring
o 19B – Science is more difficult for me than some of my classmates
o 19C – Science is not one of my strengths
o 19E – Science makes me confused and nervous
o 19I – Science is harder for me than any other subject
   Alpha - .8292
- Factor 5 – Bullied
o 13A – I was made fun of or called names
o 13B – I was left out of games or activities by other students
o 13C – Someone spread lies about me
o 13D – Something was stolen from me
o 13E – I was hit or hurt by other students
o 13F – I was made to do things I didn’t want to by other students
   Alpha - .7674
- Factor 6 – Parent Involvement
o 11A – My parents ask me what I am learning in school
o 11B – I talk about my schoolwork with my parents
o 11C – My parents make sure that I set aside time for my homework
o 11D – My parents check if I do my homework
   Alpha - .7860
- Other variables of interest not grouped
o SES (derived)
o Gender (girl=1)

Table A2.2 – Student Item Correlations (pairwise correlations, significance levels in parentheses)

            BSSSCI01          intenj            valsci            negsci            bully             parent            possci
BSSSCI01     1.0000
intenj      -0.1738 (0.0000)   1.0000
valsci      -0.1294 (0.0000)   0.6788 (0.0000)   1.0000
negsci      -0.2523 (0.0000)  -0.4361 (0.0000)  -0.2748 (0.0000)   1.0000
bully       -0.3326 (0.0000)   0.1560 (0.0000)   0.1632 (0.0000)   0.1234 (0.0000)   1.0000
parent      -0.1900 (0.0000)   0.3190 (0.0000)   0.2696 (0.0000)  -0.0888 (0.0000)   0.0964 (0.0000)   1.0000
possci      -0.1372 (0.0000)   0.7553 (0.0000)   0.5854 (0.0000)  -0.4668 (0.0000)   0.1303 (0.0000)   0.3386 (0.0000)   1.0000

Because of the high correlations among positive science affect, interest/enjoyment of science, and valuing science, some of these variables needed to be dropped. Valuing science and positive science affect have the weakest correlations with achievement and are also the most weakly loaded factors, so they are omitted from the models.

Table A2.3 - Teacher Factor Analysis Raw Outputs

Factor analysis/correlation
Method: principal factors
Rotation: orthogonal varimax (Kaiser off)
Number of obs = 32960
Retained factors = 13
Number of params = 286

Factor      Variance    Difference   Proportion   Cumulative
Factor1     2.47724     0.32780      0.2445       0.2445
Factor2     2.14944     0.20079      0.2121       0.4566
Factor3     1.94866     0.25579      0.1923       0.6490
Factor4     1.69286     0.45078      0.1671       0.8160
Factor5     1.24208     0.16310      0.1226       0.9386
Factor6     1.07898     0.10012      0.1065       1.0451
Factor7     0.97886     0.70777      0.0966       1.1417
Factor8     0.27109     0.05636      0.0268       1.1685
Factor9     0.21473     0.02078      0.0212       1.1897
Factor10    0.19395     0.13307      0.0191       1.2088
Factor11    0.06088     0.00189      0.0060       1.2148
Factor12    0.05899     0.05611      0.0058       1.2206
Factor13    0.00288     .            0.0003       1.2209

LR test: independent vs.
saturated: chi2(378) = 2.7e+05 Prob>chi2 = 0.0000 Rotated factor loadings (pattern matrix) and unique variances Variable Factor1 Factor2 Factor3 Factor4 Factor5 Factor6 BTBG01 BTBG04 scimaj BTBG12 BTBS17A BTBG06D BTBG06E BTBG06H BTBG10A BTBG10B BTBG10C BTBG10D BTBG10E BTBG14D BTBG14E BTBG15A BTBG15B BTBG15C BTBG15D BTBG15E BTBG15F BTBS19A BTBS19B BTBS19C BTBS19D BTBS19G BTBS19H BTBS19I -0.0695 -0.0360 0.0246 -0.0435 -0.0166 0.1629 0.0892 0.1277 0.7412 0.7605 0.6819 0.3909 0.7161 0.1456 0.1667 -0.0413 0.0721 -0.0125 -0.0075 -0.0363 -0.0244 0.1187 0.1243 0.1538 0.1579 0.1132 0.0804 0.0628 0.0368 0.0580 -0.0932 -0.0076 0.0446 -0.1146 -0.2259 -0.2751 -0.0423 0.0383 -0.0620 -0.1117 -0.0013 -0.0050 -0.0551 0.5185 0.4259 0.5768 0.5411 0.6450 0.6865 -0.0126 -0.0230 -0.0221 -0.0471 -0.0007 -0.0442 -0.0553 0.0689 -0.1427 -0.1111 0.0407 0.0763 0.1234 -0.0613 0.1001 0.1105 0.0931 0.0757 0.1448 0.1263 0.1578 0.1349 0.0270 0.0839 -0.0349 -0.0073 -0.0707 -0.0459 0.4583 0.6196 0.7108 0.6905 0.3123 0.1904 0.2051 -0.2764 -0.6898 -0.1040 0.6448 0.2279 0.1888 -0.2170 0.1072 -0.1329 0.0005 -0.0273 0.4293 0.2540 0.1235 0.0732 0.1595 0.2653 -0.0342 -0.0759 -0.0581 -0.0893 0.0537 0.1143 0.2133 -0.0916 0.2633 0.1347 0.0934 -0.0248 0.0079 0.0681 0.0377 -0.0433 0.5225 0.6144 0.6560 0.0340 0.0875 0.0307 0.0243 0.0925 -0.0094 0.0203 -0.1753 -0.0934 -0.1069 -0.0426 -0.1375 -0.2119 0.0368 -0.0001 0.0398 0.0480 0.0938 0.0457 0.0232 -0.0020 -0.1165 -0.1188 0.0685 0.1525 0.1898 -0.0780 -0.0096 0.0674 0.0547 0.0810 0.0170 0.0740 0.6455 0.6389 0.0064 0.0911 0.0102 0.0475 -0.0795 -0.0522 0.1207 0.0876 0.1293 0.0481 0.0548 0.1521 0.2219 282 Proposed model (six factors) - - - - - - - Factor 1 – Teacher Cooperation (How often do you have the following interactions with other teachers?) o 10A – Discuss how to teach a particular topic o 10B - Collaborate in planning and preparing instructional materials o 10C – Share what I have learned about my teaching experiences o 10E – Work together to try new ideas  Alpha-.81 Factor 2 – Teacher limitations (In your view, to what extent do the following limit how you teach this class?) 
o 15A – Students lack prerequisite knowledge o 15B - Students suffering from lack of basic nutrition o 15C – Students suffering from not enough sleep o 15D – Students with special needs o 15E – Disruptive Students o 15F – Uninterested Students  Alpha - .7545 Factor 3 – Science Pedagogy (In teaching science to the students in this class, how often do you usually ask them to do the following) o 19A – Observe natural phenomena and describe what they see o 19B – Watch me demonstrate an experiment or investigation o 19C – Design or plan experiments or investigations o 19D – Conduct experiments or investigations  Alpha - .7669 Factor 4 – Teacher Quality o 04 – Level of teacher education o 12 – Class size  Alpha - .6606 Factor 5 – Expectations (how would you characterize the following within your school) o 6D – Teachers expectations for student achievement o 6E – Parental support for students achievement o 6H – Students desire to do well in school  Alpha - .6878 Factor 6 – Teachers supporting students (How often do you do the following in this class) o 14D – Encourage all students to improve their performance o 14E – Praise students for good effort  Alpha - .7406 Not grouped variables to consider o 17 – Hours of science instruction o 01 – Teacher years of teaching o Science Major 283 Table A2.4 – Teacher Item Correlations teint telim sciped tequal stuexp stusup teint 1.0000 telim -0.0424 0.0000 1.0000 sciped 0.2632 0.0000 -0.0328 0.0000 1.0000 tequal -0.0384 0.0000 -0.0125 0.0124 0.0572 0.0000 1.0000 stuexp 0.1993 0.0000 -0.3154 0.0000 0.1518 0.0000 -0.0062 0.2091 1.0000 stusup 0.2396 0.0000 0.0214 0.0000 0.2990 0.0000 0.1129 0.0000 0.0668 0.0000 1.0000 BSSSCI01 -0.0366 0.0000 -0.1831 0.0000 -0.1638 0.0000 -0.1695 0.0000 0.1451 0.0000 -0.2212 0.0000 284 BSSSCI01 1.0000 Table A2.5 - School Factor Analysis Raw Outputs . rotate Factor analysis/correlation Method: principal factors Rotation: orthogonal varimax (Kaiser off) Number of obs = Retained factors = Number of params = 33829 6 63 Factor Variance Difference Proportion Cumulative Factor1 Factor2 Factor3 Factor4 Factor5 Factor6 1.63864 1.21360 0.67555 0.60494 0.40064 0.00123 0.42503 0.53806 0.07061 0.20430 0.39941 . 0.4795 0.3551 0.1977 0.1770 0.1172 0.0004 0.4795 0.8346 1.0322 1.2093 1.3265 1.3268 LR test: independent vs. 
saturated: chi2(78) = 8.2e+04 Prob>chi2 = 0.0000 Rotated factor loadings (pattern matrix) and unique variances Variable Factor1 Factor2 Factor3 Factor4 Factor5 Factor6 BCBG03A BCBG05B BCBG10AA BCBG10CG BCBG10CB BCBG11E BCBG12AA BCBG12AB BCBG12AH BCBG14A BCBG15B BCBG17B BCBG17D 0.3045 -0.0954 -0.1416 -0.1534 -0.1530 -0.3691 0.7491 0.7556 0.3360 -0.1232 0.2646 -0.0530 -0.0271 -0.0127 0.2411 0.4804 0.4992 0.6174 0.1520 -0.0726 -0.1140 0.2993 0.2735 -0.0595 0.2746 0.0967 0.5539 -0.1694 -0.1149 -0.1207 0.0045 -0.4644 0.0561 0.1560 0.0385 0.2106 0.1249 -0.0094 0.0868 0.1282 0.1096 -0.0530 0.2008 0.1159 0.1045 -0.0439 -0.0011 -0.1367 0.2665 0.0392 0.4176 0.4914 -0.1207 0.4531 -0.0680 0.2606 0.0929 -0.0318 0.0142 -0.0858 -0.0174 0.2548 -0.0390 0.1497 0.0427 -0.0017 0.0021 0.0080 -0.0063 -0.0015 -0.0039 0.0054 -0.0038 -0.0038 -0.0152 0.0265 0.0095 -0.0042 Variable BCBG03A BCBG05B BCBG10AA BCBG10CG BCBG10CB BCBG11E BCBG12AA BCBG12AB BCBG12AH BCBG14A BCBG15B BCBG17B BCBG17D Uniqueness 0.5693 0.6867 0.7284 0.6043 0.5733 0.6130 0.4283 0.3844 0.7771 0.7295 0.9071 0.7248 0.7391 285 Proposed Model (two factors) - - - Factor 1 – Student attendance at school (To what degree is each of the following a problem among students at your school) o 12A – Arriving late at school o 12B – Absenteeism  Alpha - .7893 Factor 2 – School/Parent Contact (How often does your school do the following for parents concerning individual students) o 10AA – Inform parents about their child’s progress o 10CB – Inform parents about school accomplishments o 10CG – Organize workshops or seminars for parents on learning or pedagogical issues  Alpha .6095 Additional variables to consider that are not loaded o 3A – Approximately what percentage of students come from economically disadvantaged backgrounds o 05B – Urban/Rural 286 Table A2.6 – School Item Correlations BSSSCI01 schpar schatt BCBG03A BSSSCI01 1.0000 schpar 0.3108 0.0000 1.0000 schatt -0.3480 0.0000 -0.2481 0.0000 1.0000 BCBG03A -0.4778 0.0000 -0.1035 0.0000 0.3194 0.0000 1.0000 BCBG05B 0.2701 0.0000 0.2978 0.0000 -0.1427 0.0000 -0.2275 0.0000 287 BCBG05B 1.0000 Appendix 3 – Raw Data Outputs Student Regressions name: log: C:\Users\educ.brunerju\Google Drive\Dissertation\Outputs\Baseline Regressions.smcl log type: smcl opened on: 29 May 2014, 09:58:49 . do "C:\Users\EDUC~1.BRU\AppData\Local\Temp\STD00000000.tmp" . *Student Regressions Achievement* . *Chile . svy jackknife, subpop(if idcntry==152): reg BSSSCI01 BSBG01 SES intenj negsci bully parent (running regress on estimation sample) Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 .................................................. 50 .................................................. 100 .................................................. 150 Survey: Linear regression Number of strata = Number of PSUs 75 = Subpop. no. of obs = 150 Number of obs Population size = 37782 = 4642235.2 4623 288 Subpop. size = 198123.83 Replications = Design df F( 6, 70) Prob > F R-squared 150 = 75 = 119.08 = 0.0000 = 0.2725 Jknife * BSSSCI01 Coef. Std. Err. t P>t BSBG01 -20.06898 2.707004 SES [95% Conf. 
Interval] -7.41 0.000 16.94593 .9703669 17.46 0.000 -25.46161 -14.67636 15.01286 18.879 intenj -5.015054 2.353278 -2.13 0.036 -9.703023 -.3270847 negsci -29.11545 -21.35275 -25.2341 1.94837 -12.95 0.000 bully -4.637225 2.140848 -2.17 0.033 parent -3.947207 1.535114 -2.57 0.012 _cons 56.57 0.000 573.0947 10.13087 -8.902013 -.3724364 -7.005311 -.8891038 552.913 593.2764 . svy jackknife, subpop(if idcntry==152): reg BSSSCI02 BSBG01 SES intenj negsci bully parent (running regress on estimation sample) Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 289 .................................................. 50 .................................................. 100 .................................................. 150 Survey: Linear regression Number of strata = Number of PSUs 75 = Subpop. no. of obs = Number of obs 150 = 198123.83 Replications = F( 6, 70) Prob > F R-squared = = 4642235.2 150 75 = 126.58 = 0.0000 = Population size 37782 4623 Subpop. size Design df = 0.2852 Jknife * BSSSCI02 Coef. Std. Err. t P>t BSBG01 -22.38882 2.794357 SES [95% Conf. Interval] -8.01 0.000 16.47378 1.008912 16.33 0.000 14.46392 intenj -6.793504 1.970009 -3.45 0.001 negsci -26.77879 1.660283 -16.13 0.000 bully -5.57936 1.923 -2.90 0.005 parent -4.086366 1.399593 _cons 585.8172 9.04947 -27.95547 -16.82218 18.48363 -10.71796 -2.869046 -30.08624 -23.47134 -9.410173 -1.748547 -2.92 0.005 64.73 0.000 -6.874498 -1.298234 567.7898 603.8447 290 . svy jackknife, subpop(if idcntry==152): reg BSSSCI03 BSBG01 SES intenj negsci bully parent (running regress on estimation sample) Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 .................................................. 50 .................................................. 100 .................................................. 150 Survey: Linear regression Number of strata = Number of PSUs 75 = Subpop. no. of obs = 150 = 198123.83 Replications = F( 6, 70) Prob > F R-squared = 75 125.49 = 0.0000 37782 = 4642235.2 150 = = Population size = 4623 Subpop. size Design df Number of obs 0.2762 Jknife * 291 BSSSCI03 Coef. Std. Err. t P>t BSBG01 -18.81001 2.934812 SES intenj 16.26861 1.04905 [95% Conf. Interval] -6.41 0.000 15.51 0.000 14.17879 -7.42007 2.089272 -3.55 0.001 -2.48 0.015 18.35842 -11.58211 -3.258027 negsci -26.43442 1.738695 -15.20 0.000 bully -5.42642 2.184549 -24.65646 -12.96357 -29.89808 -22.97076 -9.778264 -1.074576 parent -4.672847 1.544743 -3.02 0.003 -7.750133 -1.595561 _cons 59.71 0.000 568.3072 587.9227 9.846589 607.5381 . svy jackknife, subpop(if idcntry==152): reg BSSSCI04 BSBG01 SES intenj negsci bully parent (running regress on estimation sample) Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. 
Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 .................................................. 50 .................................................. 100 .................................................. 150 Survey: Linear regression Number of strata = Number of PSUs 75 = 150 Number of obs Population size = 37782 = 4642235.2 292 Subpop. no. of obs = 4623 Subpop. size = 198123.83 Replications = Design df F( 6, 70) 150 = 75 = 132.09 = 0.0000 Prob > F R-squared = 0.2691 Jknife * BSSSCI04 Coef. Std. Err. t P>t BSBG01 -20.04766 2.732757 SES 16.5284 .989414 [95% Conf. Interval] -7.34 0.000 16.71 0.000 -25.4916 -14.60373 14.55738 18.49941 intenj -5.378383 2.279144 -2.36 0.021 -9.91867 -.8380961 negsci -25.18912 1.884253 -13.37 0.000 -28.94274 -21.43549 bully -6.841283 2.156721 -3.17 0.002 -11.13769 -2.544875 parent -4.65174 -3.39 0.001 -7.388231 -1.915249 _cons 579.0978 10.14069 1.37367 57.11 0.000 558.8965 599.2991 . svy jackknife, subpop(if idcntry==152): reg BSSSCI05 BSBG01 SES intenj negsci bully parent (running regress on estimation sample) Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. 293 Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 .................................................. 50 .................................................. 100 .................................................. 150 Survey: Linear regression Number of strata = Number of PSUs 75 = Subpop. no. of obs = 150 = 198123.83 Replications = F( 6, 70) Prob > F R-squared = Population size 37782 = 4642235.2 150 75 = 151.12 = 0.0000 = = 4623 Subpop. size Design df Number of obs 0.2701 Jknife * BSSSCI05 Coef. Std. Err. t P>t BSBG01 -22.13961 2.704158 SES [95% Conf. Interval] -8.19 0.000 16.45274 1.065931 15.44 0.000 -27.52657 -16.75265 14.32929 18.57618 intenj -4.021312 2.272029 -1.77 0.081 -8.547426 negsci -27.48844 -20.08077 -23.7846 1.85926 -12.79 0.000 bully -8.887506 2.176902 parent -5.071338 1.466989 -4.08 0.000 -3.46 0.001 .5048025 -13.22412 -4.550895 -7.993729 -2.148947 294 _cons 579.5177 9.514023 60.91 0.000 560.5648 598.4706 . *Finland . svy jackknife, subpop(if idcntry==246): reg BSSSCI01 BSBG01 SES intenj negsci bully parent (running regress on estimation sample) Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 .................................................. 50 .................................................. 100 .................................................. 150 Survey: Linear regression Number of strata = Number of PSUs 75 = Subpop. no. of obs = 150 = 40464.76 Replications = F( 6, 70) Prob > F R-squared = 75 179.39 = 0.0000 37688 = 4677857 150 = = Population size = 2960 Subpop. size Design df Number of obs 0.2961 295 Jknife * BSSSCI01 BSBG01 SES intenj Coef. Std. Err. .7482848 11.37239 t P>t 2.23247 .754742 [95% Conf. 
. *Finland

. svy jackknife, subpop(if idcntry==246): reg BSSSCI01 BSBG01 SES intenj negsci bully parent
    (Number of obs = 37688, population size = 4677857; subpop. no. of obs = 2960,
     subpop. size = 40464.76; 75 strata, 150 PSUs, 150 jackknife replications, design df = 75;
     F(6,70) = 179.39, Prob > F = 0.0000, R-squared = 0.2961)
    Coef. (jackknife SE): BSBG01 .7482848 (2.23247), SES 11.37239 (.754742),
        intenj 11.50835 (3.554757), negsci -36.61739 (3.412578), bully 1.421334 (2.50575),
        parent -11.27617 (1.567929), _cons 633.2213 (14.96755)

. svy jackknife, subpop(if idcntry==246): reg BSSSCI02 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 197.41, Prob > F = 0.0000, R-squared = 0.3094)
    Coef. (jackknife SE): BSBG01 2.913435 (2.375028), SES 11.22026 (.656365),
        intenj 12.61926 (3.544626), negsci -38.01772 (3.379303), bully .5999736 (2.153869),
        parent -10.12552 (1.573057), _cons 630.4737 (15.04265)

. svy jackknife, subpop(if idcntry==246): reg BSSSCI03 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 195.11, Prob > F = 0.0000, R-squared = 0.3027)
    Coef. (jackknife SE): BSBG01 2.638137 (2.180259), SES 11.30102 (.6838147),
        intenj 13.41312 (3.23658), negsci -33.5465 (2.955286), bully 1.733491 (2.106703),
        parent -10.02319 (1.409583), _cons 615.9686 (13.19392)
. svy jackknife, subpop(if idcntry==246): reg BSSSCI04 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 173.07, Prob > F = 0.0000, R-squared = 0.2996)
    Coef. (jackknife SE): BSBG01 2.221633 (2.263941), SES 10.99661 (.7583426),
        intenj 11.63287 (3.284166), negsci -38.83245 (3.23107), bully .2651028 (2.429517),
        parent -11.51897 (1.668631), _cons 638.6125 (13.18432)

. svy jackknife, subpop(if idcntry==246): reg BSSSCI05 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 227.09, Prob > F = 0.0000, R-squared = 0.2953)
    Coef. (jackknife SE): BSBG01 .5536185 (2.15009), SES 11.12215 (.7538574),
        intenj 12.64845 (3.528667), negsci -36.73649 (3.226575), bully 4.200884 (2.435908),
        parent -11.76548 (1.482656), _cons 627.6784 (13.5762)
. *Ghana

. svy jackknife, subpop(if idcntry==288): reg BSSSCI01 BSBG01 SES intenj negsci bully parent
    (Number of obs = 36385, population size = 4560577.3; subpop. no. of obs = 4714,
     subpop. size = 260840.55; 75 strata, 150 PSUs, 150 jackknife replications, design df = 75;
     F(6,70) = 74.56, Prob > F = 0.0000, R-squared = 0.2380)
    Coef. (jackknife SE): BSBG01 -24.81438 (3.006217), SES 7.607394 (1.979316),
        intenj 32.14731 (7.160592), negsci -46.58742 (2.987616), bully -8.670727 (3.460629),
        parent 4.59646 (3.391579), _cons 321.5432 (30.31594)

. svy jackknife, subpop(if idcntry==288): reg BSSSCI02 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 66.40, Prob > F = 0.0000, R-squared = 0.2206)
    Coef. (jackknife SE): BSBG01 -20.58483 (3.665683), SES 6.735979 (1.941842),
        intenj 30.72411 (7.061667), negsci -44.73695 (2.874884), bully -11.50457 (3.680082),
        parent 6.759218 (3.29987), _cons 319.9602 (29.69142)

. svy jackknife, subpop(if idcntry==288): reg BSSSCI03 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 57.04, Prob > F = 0.0000, R-squared = 0.2257)
    Coef. (jackknife SE): BSBG01 -22.74647 (3.425234), SES 6.629172 (2.009092),
        intenj 30.37033 (6.97272), negsci -44.89786 (3.210201), bully -10.45604 (3.445425),
        parent 5.324097 (3.425959), _cons 325.8532 (29.95617)
. svy jackknife, subpop(if idcntry==288): reg BSSSCI04 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 62.05, Prob > F = 0.0000, R-squared = 0.2323)
    Coef. (jackknife SE): BSBG01 -23.50367 (3.136024), SES 7.912912 (1.91433),
        intenj 32.94539 (6.935481), negsci -44.96127 (3.161739), bully -9.94121 (3.14259),
        parent 4.98581 (2.96901), _cons 316.036 (29.83308)

. svy jackknife, subpop(if idcntry==288): reg BSSSCI05 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 62.33, Prob > F = 0.0000, R-squared = 0.2299)
    Coef. (jackknife SE): BSBG01 -19.96862 (3.31544), SES 6.758408 (1.940117),
        intenj 24.81147 (6.875636), negsci -47.2682 (3.08427), bully -7.901061 (3.332779),
        parent 6.414811 (3.023509), _cons 340.6003 (29.9672)
. *Korea

. svy jackknife, subpop(if idcntry==410): reg BSSSCI01 BSBG01 SES intenj negsci bully parent
    (Number of obs = 38114, population size = 4587506.1; subpop. no. of obs = 4286,
     subpop. size = 523257.02; 75 strata, 150 PSUs, 150 jackknife replications, design df = 75;
     F(6,70) = 351.97, Prob > F = 0.0000, R-squared = 0.3380)
    Coef. (jackknife SE): BSBG01 1.192351 (2.219052), SES 13.7865 (.5893338),
        intenj 26.54169 (2.536473), negsci -26.06279 (2.567937), bully 1.627431 (2.449957),
        parent -.7786903 (1.105136), _cons 561.6737 (11.41851)

. svy jackknife, subpop(if idcntry==410): reg BSSSCI02 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 279.12, Prob > F = 0.0000, R-squared = 0.3321)
    Coef. (jackknife SE): BSBG01 3.595104 (2.302305), SES 14.05179 (.6123408),
        intenj 25.41683 (2.478111), negsci -26.10905 (2.520986), bully 2.980976 (2.489545),
        parent -1.779268 (1.091722), _cons 563.2366 (10.82825)

. svy jackknife, subpop(if idcntry==410): reg BSSSCI03 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 412.49, Prob > F = 0.0000, R-squared = 0.3201)
    Coef. (jackknife SE): BSBG01 3.403234 (1.963387), SES 13.63658 (.6723077),
        intenj 25.25946 (2.520173), negsci -26.0015 (2.652854), bully 1.896411 (2.37787),
        parent -1.163524 (1.133571), _cons 563.6669 (11.43256)
. svy jackknife, subpop(if idcntry==410): reg BSSSCI04 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 282.15, Prob > F = 0.0000, R-squared = 0.3169)
    Coef. (jackknife SE): BSBG01 5.01334 (2.125696), SES 13.08141 (.6269355),
        intenj 26.50207 (2.693585), negsci -24.75348 (2.785835), bully 3.038812 (2.533175),
        parent .3454188 (1.074346), _cons 551.2104 (12.26495)

. svy jackknife, subpop(if idcntry==410): reg BSSSCI05 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 326.18, Prob > F = 0.0000, R-squared = 0.3358)
    Coef. (jackknife SE): BSBG01 4.5002 (2.320843), SES 13.89796 (.6154728),
        intenj 25.91023 (2.351143), negsci -26.45503 (2.42552), bully 3.415193 (2.239393),
        parent .5450708 (.9687879), _cons 556.5002 (9.966294)
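To compare a country's five plausible-value runs side by side rather than reading five separate
tables, the fitted models can be stored and tabulated together. A minimal sketch under the same
assumptions as the loop above; the stored names pv1 to pv5 are illustrative only:

    foreach pv of numlist 1/5 {
        * fit the student-level model for Korea on plausible value `pv' and store it
        quietly svy jackknife, subpop(if idcntry==410): ///
            regress BSSSCI0`pv' BSBG01 SES intenj negsci bully parent
        estimates store pv`pv'
    }
    * list coefficients and jackknife standard errors for all five runs in one table
    estimates table pv1 pv2 pv3 pv4 pv5, b(%9.3f) se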
. *Singapore

. svy jackknife, subpop(if idcntry==702): reg BSSSCI01 BSBG01 SES intenj negsci bully parent
    (Number of obs = 37560, population size = 4683378; subpop. no. of obs = 4493,
     subpop. size = 38291.883; 75 strata, 150 PSUs, 150 jackknife replications, design df = 75;
     F(6,70) = 156.09, Prob > F = 0.0000, R-squared = 0.2712)
    Coef. (jackknife SE): BSBG01 -4.254839 (2.736275), SES 21.39965 (1.202567),
        intenj 6.565951 (3.192757), negsci -28.80333 (2.695653), bully -8.004555 (2.476899),
        parent -4.285026 (1.588201), _cons 666.0989 (14.7759)

. svy jackknife, subpop(if idcntry==702): reg BSSSCI02 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 139.00, Prob > F = 0.0000, R-squared = 0.2714)
    Coef. (jackknife SE): BSBG01 -4.88458 (2.699434), SES 21.23389 (1.163279),
        intenj 5.157671 (3.23402), negsci -29.24084 (2.743594), bully -7.528502 (2.563733),
        parent -5.010462 (1.562349), _cons 672.6422 (14.78809)

. svy jackknife, subpop(if idcntry==702): reg BSSSCI03 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 140.21, Prob > F = 0.0000, R-squared = 0.2760)
    Coef. (jackknife SE): BSBG01 -2.401738 (2.788269), SES 20.93624 (1.200636),
        intenj 6.679197 (3.332748), negsci -29.61844 (2.706125), bully -8.120476 (2.402108),
        parent -4.018848 (1.575273), _cons 666.1555 (15.42895)
. svy jackknife, subpop(if idcntry==702): reg BSSSCI04 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 128.99, Prob > F = 0.0000, R-squared = 0.2720)
    Coef. (jackknife SE): BSBG01 -3.820324 (2.601545), SES 21.09088 (1.223275),
        intenj 8.165681 (3.365731), negsci -28.07377 (2.884519), bully -10.6814 (2.596567),
        parent -4.163413 (1.541022), _cons 663.7018 (15.60309)

. svy jackknife, subpop(if idcntry==702): reg BSSSCI05 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 168.75, Prob > F = 0.0000, R-squared = 0.2844)
    Coef. (jackknife SE): BSBG01 -4.115481 (2.612605), SES 21.2491 (1.112392),
        intenj 7.761001 (3.155808), negsci -28.93866 (2.756288), bully -10.35453 (2.302693),
        parent -3.836538 (1.444678), _cons 666.1575 (14.36448)
. *USA

. svy jackknife, subpop(if idcntry==840): reg BSSSCI01 BSBG01 SES intenj negsci bully parent
    (Number of obs = 35919, population size = 3724905.2; subpop. no. of obs = 7402,
     subpop. size = 2339025.1; 75 strata, 150 PSUs, 150 jackknife replications, design df = 75;
     F(6,70) = 184.71, Prob > F = 0.0000, R-squared = 0.2827)
    Coef. (jackknife SE): BSBG01 -7.993755 (2.249543), SES 17.97701 (.6970517),
        intenj 5.447141 (1.810806), negsci -23.45706 (1.874042), bully -1.969878 (1.878659),
        parent -9.061249 (1.181402), _cons 599.4116 (9.349636)

. svy jackknife, subpop(if idcntry==840): reg BSSSCI02 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 186.22, Prob > F = 0.0000, R-squared = 0.2838)
    Coef. (jackknife SE): BSBG01 -9.741113 (2.104536), SES 17.35684 (.6759887),
        intenj 4.791123 (1.852325), negsci -24.59783 (1.941488), bully -1.418901 (1.606224),
        parent -9.226321 (1.130527), _cons 603.9715 (8.783501)

. svy jackknife, subpop(if idcntry==840): reg BSSSCI03 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 173.69, Prob > F = 0.0000, R-squared = 0.2854)
    Coef. (jackknife SE): BSBG01 -9.932611 (2.287356), SES 17.55112 (.6445489),
        intenj 5.022694 (1.816412), negsci -24.31915 (1.923206), bully -2.507032 (1.664471),
        parent -9.836347 (1.103202), _cons 606.2754 (9.089262)
. svy jackknife, subpop(if idcntry==840): reg BSSSCI04 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 157.36, Prob > F = 0.0000, R-squared = 0.2807)
    Coef. (jackknife SE): BSBG01 -8.813076 (2.237044), SES 17.69143 (.6829283),
        intenj 4.53578 (1.852019), negsci -24.64881 (2.024052), bully -2.140738 (1.680984),
        parent -9.717997 (1.135991), _cons 606.3896 (9.715762)

. svy jackknife, subpop(if idcntry==840): reg BSSSCI05 BSBG01 SES intenj negsci bully parent
    (F(6,70) = 168.66, Prob > F = 0.0000, R-squared = 0.2731)
    Coef. (jackknife SE): BSBG01 -9.679217 (2.189395), SES 17.55194 (.6808323),
        intenj 4.335434 (1.92541), negsci -23.83243 (2.038693), bully -2.58355 (1.857594),
        parent -9.270511 (1.197096), _cons 606.313 (10.25266)

end of do-file

. log close
       log:  C:\Users\educ.brunerju\Google Drive\Dissertation\Outputs\Baseline Regressions.smcl
  log type:  smcl
 closed on:  29 May 2014, 10:06:54
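The runs in this log report each plausible value separately. If a single pooled estimate per
coefficient is wanted, one standard approach for plausible values (Rubin's combining rules)
averages the five coefficients and adds the between-plausible-value variance to the average
sampling variance. A minimal sketch, assuming the five coefficients and their squared standard
errors for one predictor have already been saved in the hypothetical locals b1 to b5 and v1 to v5:

    local M = 5
    * pooled point estimate: mean of the five plausible-value coefficients
    local bbar = (`b1' + `b2' + `b3' + `b4' + `b5') / `M'
    * average sampling variance across the five runs
    local U = (`v1' + `v2' + `v3' + `v4' + `v5') / `M'
    * between-plausible-value (imputation) variance
    local B = ((`b1'-`bbar')^2 + (`b2'-`bbar')^2 + (`b3'-`bbar')^2 + ///
               (`b4'-`bbar')^2 + (`b5'-`bbar')^2) / (`M' - 1)
    * total variance and pooled standard error
    local V = `U' + (1 + 1/`M') * `B'
    display "pooled coefficient = " `bbar' "    pooled SE = " sqrt(`V')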
Teacher Regressions

       log:  C:\Users\educ.brunerju\Google Drive\Dissertation\Outputs\Baseline Regressions\Teacher Regressions.smcl
  log type:  smcl
 opened on:  29 May 2014, 15:02:02

. do "C:\Users\EDUC~1.BRU\AppData\Local\Temp\STD00000000.tmp"

. ***Teacher Regressions***
. *Run teacher setup file first*

. *Chile

. svy jackknife, subpop(if idcntry==152): reg BSSSCI01 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (Number of obs = 44837, population size = 4614635.5; subpop. no. of obs = 4854,
     subpop. size = 211760.94; 75 strata, 150 PSUs, 150 jackknife replications, design df = 75;
     F(9,67) = 12.32, Prob > F = 0.0000, R-squared = 0.1665)
    Coef. (jackknife SE): BTBG01 -.103439 (.2351395), BTBG04 -4.223781 (9.89866),
        scimaj 13.68661 (5.390349), BTBS17A -8.801104 (2.88735), tecoop -3.122058 (4.157605),
        telimi -21.46446 (4.846524), sciped -9.871942 (6.302062), expect 21.19333 (6.121532),
        tesupp -7.630526 (6.767426), _cons 564.0643 (51.1417)

. svy jackknife, subpop(if idcntry==152): reg BSSSCI02 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (F(9,67) = 13.03, Prob > F = 0.0000, R-squared = 0.1599)
    Coef. (jackknife SE): BTBG01 -.0794398 (.2369678), BTBG04 -4.512207 (9.847903),
        scimaj 12.66583 (5.308011), BTBS17A -7.539681 (2.853686), tecoop -4.324469 (4.396334),
        telimi -19.38341 (4.566652), sciped -8.932595 (6.368012), expect 22.92777 (6.043442),
        tesupp -8.037058 (7.014322), _cons 552.367 (52.01206)
. svy jackknife, subpop(if idcntry==152): reg BSSSCI03 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (F(9,67) = 14.92, Prob > F = 0.0000, R-squared = 0.1794)
    Coef. (jackknife SE): BTBG01 -.1975216 (.2403093), BTBG04 -2.931022 (9.687925),
        scimaj 11.47308 (5.251864), BTBS17A -8.934139 (2.876316), tecoop -3.157683 (4.064659),
        telimi -20.27072 (4.578145), sciped -10.49346 (6.256884), expect 23.3257 (5.807074),
        tesupp -8.707504 (6.426991), _cons 558.4794 (50.84802)

. svy jackknife, subpop(if idcntry==152): reg BSSSCI04 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (F(9,67) = 13.78, Prob > F = 0.0000, R-squared = 0.1664)
    Coef. (jackknife SE): BTBG01 -.1292141 (.2367481), BTBG04 -4.235996 (9.582776),
        scimaj 14.50533 (5.269538), BTBS17A -8.822406 (2.762236), tecoop -3.693285 (4.176894),
        telimi -20.0373 (4.429035), sciped -9.104341 (6.241057), expect 21.97417 (5.923474),
        tesupp -8.099709 (6.601167), _cons 557.9845 (50.6328)
. svy jackknife, subpop(if idcntry==152): reg BSSSCI05 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (F(9,67) = 12.36, Prob > F = 0.0000, R-squared = 0.1738)
    Coef. (jackknife SE): BTBG01 -.1403861 (.2571031), BTBG04 -4.768465 (9.767599),
        scimaj 14.11784 (5.637617), BTBS17A -8.729816 (2.794559), tecoop -3.191448 (4.380954),
        telimi -19.27758 (4.747959), sciped -10.00079 (6.454712), expect 23.50633 (6.198916),
        tesupp -8.155994 (6.605586), _cons 557.2522 (51.40913)
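The same teacher-level specification is then repeated for every country and plausible value. A
minimal sketch of the full set of runs as one nested loop, assuming the teacher setup file
mentioned above has been run and the data are svyset accordingly; 152, 246, 288, 410, 702, and
840 are the idcntry codes used in this appendix:

    foreach cty of numlist 152 246 288 410 702 840 {
        foreach pv of numlist 1/5 {
            * teacher-level model for country `cty' on plausible value `pv'
            svy jackknife, subpop(if idcntry==`cty'): ///
                regress BSSSCI0`pv' BTBG01 BTBG04 scimaj BTBS17A ///
                    tecoop telimi sciped expect tesupp
        }
    }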
. *Finland

. svy jackknife, subpop(if idcntry==246): reg BSSSCI01 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (Number of obs = 44244, population size = 4645698; subpop. no. of obs = 8060,
     subpop. size = 49542.548; 75 strata, 150 PSUs, 150 jackknife replications, design df = 75;
     F(9,67) = 8.24, Prob > F = 0.0000, R-squared = 0.0505)
    Coef. (jackknife SE): BTBG01 .1372539 (.1448218), BTBG04 3.168828 (2.642956),
        scimaj 9.077452 (3.674292), BTBS17A .548512 (1.247877), tecoop -2.924588 (2.56687),
        telimi -25.07404 (4.830418), sciped 8.773994 (3.175608), expect 8.67794 (3.897207),
        tesupp .0053685 (2.477476), _cons 545.7593 (26.64041)

. svy jackknife, subpop(if idcntry==246): reg BSSSCI02 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (F(9,67) = 7.58, Prob > F = 0.0000, R-squared = 0.0452)
    Coef. (jackknife SE): BTBG01 .0963649 (.1350506), BTBG04 2.794673 (2.697723),
        scimaj 8.412535 (3.615806), BTBS17A -.0707089 (1.117745), tecoop -1.989056 (2.281924),
        telimi -23.95094 (4.902653), sciped 8.114568 (3.023533), expect 8.727108 (3.740863),
        tesupp -1.662232 (2.483238), _cons 551.7296 (27.61962)

. svy jackknife, subpop(if idcntry==246): reg BSSSCI03 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (F(9,67) = 8.41, Prob > F = 0.0000, R-squared = 0.0448)
    Coef. (jackknife SE): BTBG01 .0947948 (.1294615), BTBG04 3.522536 (2.331784),
        scimaj 7.893122 (3.571821), BTBS17A .4710285 (1.045472), tecoop -1.965512 (2.234575),
        telimi -23.00898 (4.585051), sciped 8.141941 (2.775263), expect 7.876465 (3.498905),
        tesupp -.088705 (2.429177), _cons 541.0647 (24.13118)
. svy jackknife, subpop(if idcntry==246): reg BSSSCI04 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (F(9,67) = 7.47, Prob > F = 0.0000, R-squared = 0.0476)
    Coef. (jackknife SE): BTBG01 .1030642 (.1389839), BTBG04 2.99924 (2.487898),
        scimaj 9.223801 (3.829512), BTBS17A .5180863 (1.180087), tecoop -1.796017 (2.526786),
        telimi -25.05099 (5.060887), sciped 8.736063 (3.07222), expect 7.724976 (3.944674),
        tesupp .0107474 (2.589395), _cons 547.9108 (26.63321)

. svy jackknife, subpop(if idcntry==246): reg BSSSCI05 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (F(9,67) = 6.76, Prob > F = 0.0000, R-squared = 0.0426)
    Coef. (jackknife SE): BTBG01 .066046 (.1313557), BTBG04 .8751006 (2.250577),
        scimaj 8.855882 (3.726044), BTBS17A .1172353 (1.022318), tecoop -2.197796 (2.373627),
        telimi -22.73756 (4.908266), sciped 7.597769 (2.969108), expect 8.800117 (3.874989),
        tesupp -.42859 (2.343429), _cons 557.524 (25.90179)
. *Ghana

. svy jackknife, subpop(if idcntry==288): reg BSSSCI01 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (Number of obs = 44111, population size = 4579557.1; subpop. no. of obs = 6219,
     subpop. size = 321057.2; 75 strata, 150 PSUs, 150 jackknife replications, design df = 75;
     F(9,67) = 3.05, Prob > F = 0.0040, R-squared = 0.0863)
    Coef. (jackknife SE): BTBG01 1.613637 (.8773043), BTBG04 13.63518 (6.394819),
        scimaj -23.56386 (16.03886), BTBS17A -5.862794 (4.047998), tecoop -4.935656 (9.277783),
        telimi -42.22349 (16.24949), sciped 5.865848 (8.784197), expect 25.1927 (9.611151),
        tesupp 10.7636 (12.65611), _cons 259.0061 (88.45847)

. svy jackknife, subpop(if idcntry==288): reg BSSSCI02 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (F(9,67) = 3.20, Prob > F = 0.0028, R-squared = 0.0902)
    Coef. (jackknife SE): BTBG01 1.638068 (.8661183), BTBG04 13.85823 (6.098564),
        scimaj -28.36456 (15.80458), BTBS17A -5.964633 (4.044299), tecoop -5.580489 (8.805568),
        telimi -43.11183 (15.80971), sciped 7.210351 (8.591692), expect 25.57212 (9.834785),
        tesupp 11.94642 (12.99192), _cons 254.3069 (89.31273)

. svy jackknife, subpop(if idcntry==288): reg BSSSCI03 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (F(9,67) = 2.84, Prob > F = 0.0068, R-squared = 0.0853)
    Coef. (jackknife SE): BTBG01 1.686454 (.8899818), BTBG04 13.80733 (6.113694),
        scimaj -27.34405 (16.2011), BTBS17A -5.643439 (3.832261), tecoop -4.189543 (9.043179),
        telimi -40.33224 (15.90515), sciped 5.476536 (8.695406), expect 23.82173 (9.443515),
        tesupp 10.33839 (12.75605), _cons 261.9277 (88.45735)
. svy jackknife, subpop(if idcntry==288): reg BSSSCI04 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (F(9,67) = 2.94, Prob > F = 0.0053, R-squared = 0.0916)
    Coef. (jackknife SE): BTBG01 1.940929 (.9040403), BTBG04 14.09394 (6.377873),
        scimaj -25.1757 (16.21863), BTBS17A -6.102465 (4.211631), tecoop -5.645781 (9.035567),
        telimi -43.7793 (16.56989), sciped 5.47692 (8.707554), expect 24.51909 (9.862544),
        tesupp 14.70795 (13.08495), _cons 249.9422 (90.12842)

. svy jackknife, subpop(if idcntry==288): reg BSSSCI05 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (F(9,67) = 3.26, Prob > F = 0.0024, R-squared = 0.0933)
    Coef. (jackknife SE): BTBG01 1.777205 (.8742939), BTBG04 13.30656 (6.035282),
        scimaj -26.46248 (16.07498), BTBS17A -6.1374 (4.172604), tecoop -4.933553 (8.840237),
        telimi -43.02695 (16.32765), sciped 3.355201 (8.741395), expect 25.39382 (9.477668),
        tesupp 13.59633 (12.79724), _cons 259.2335 (88.58318)
. *Korea

. svy jackknife, subpop(if idcntry==410): reg BSSSCI01 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (Number of obs = 45191, population size = 4587604.4; subpop. no. of obs = 5487,
     subpop. size = 564592.19; 75 strata, 150 PSUs, 150 jackknife replications, design df = 75;
     F(9,67) = 2.70, Prob > F = 0.0095, R-squared = 0.0139)
    Coef. (jackknife SE): BTBG01 .3064962 (.1841181), BTBG04 -.2410047 (5.017108),
        scimaj 11.71431 (5.232119), BTBS17A -1.917661 (1.841615), tecoop .04889 (3.021769),
        telimi 2.468101 (3.191902), sciped 6.898243 (4.245226), expect 12.13128 (3.177191),
        tesupp -3.083097 (3.172207), _cons 497.1696 (39.01015)

. svy jackknife, subpop(if idcntry==410): reg BSSSCI02 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (F(9,67) = 3.18, Prob > F = 0.0029, R-squared = 0.0152)
    Coef. (jackknife SE): BTBG01 .3689674 (.1883007), BTBG04 -1.129781 (5.127893),
        scimaj 13.3933 (4.039801), BTBS17A -1.142787 (1.737476), tecoop .4597979 (2.947332),
        telimi 2.710189 (3.368321), sciped 6.758775 (4.157994), expect 12.77817 (3.245275),
        tesupp -2.531212 (3.176069), _cons 491.3677 (39.95723)

. svy jackknife, subpop(if idcntry==410): reg BSSSCI03 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    (F(9,67) = 2.79, Prob > F = 0.0077, R-squared = 0.0131)
    Coef. (jackknife SE): BTBG01 .3701896 (.1671278), BTBG04 -.4745464 (4.862812),
        scimaj 9.645788 (6.301432), BTBS17A -1.378606 (1.689244), tecoop .2772389 (3.175286),
        telimi 1.045434 (3.198116), sciped 4.901911 (4.284261), expect 12.22665 (3.218127),
        tesupp -2.9252 (3.137101), _cons 504.6015 (38.54088)
. *Korea
. svy jackknife, subpop(if idcntry==410):reg BSSSCI01 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==410):reg BSSSCI02 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==410):reg BSSSCI03 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==410):reg BSSSCI04 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==410):reg BSSSCI05 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp

Korea (idcntry==410), teacher-level models for plausible values 1-5: survey jackknife linear regression. Design: 75 strata, 150 PSUs, 150 replications, design df = 75; number of obs = 45191 (population size = 4587604.4); subpop. no. of obs = 5487 (subpop. size = 564592.19).
   BSSSCI01: F(9, 67) = 2.70, Prob > F = 0.0095, R-squared = 0.0139
   BSSSCI02: F(9, 67) = 3.18, Prob > F = 0.0029, R-squared = 0.0152
   BSSSCI03: F(9, 67) = 2.79, Prob > F = 0.0077, R-squared = 0.0131
   BSSSCI04: F(9, 67) = 2.58, Prob > F = 0.0128, R-squared = 0.0112
   BSSSCI05: F(9, 67) = 3.10, Prob > F = 0.0036, R-squared = 0.0144
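The same teacher-level model is estimated once for each of the five plausible values. The log writes the five calls out individually; an equivalent compact form, shown only as an illustration for the Korea block above, would be:

    forvalues pv = 1/5 {
        svy jackknife, subpop(if idcntry==410): ///
            reg BSSSCI0`pv' BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
    }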
. *Singapore
. svy jackknife, subpop(if idcntry==702):reg BSSSCI01 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==702):reg BSSSCI02 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==702):reg BSSSCI03 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==702):reg BSSSCI04 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==702):reg BSSSCI05 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp

Singapore (idcntry==702), teacher-level models for plausible values 1-5: survey jackknife linear regression. Design: 75 strata, 150 PSUs, 150 replications, design df = 75; number of obs = 45194 (population size = 4648951.3); subpop. no. of obs = 5303 (subpop. size = 45102).
   BSSSCI01: F(9, 67) = 13.57, Prob > F = 0.0000, R-squared = 0.2295
   BSSSCI02: F(9, 67) = 14.17, Prob > F = 0.0000, R-squared = 0.2308
   BSSSCI03: F(9, 67) = 14.44, Prob > F = 0.0000, R-squared = 0.2325
   BSSSCI04: F(9, 67) = 14.40, Prob > F = 0.0000, R-squared = 0.2290
   BSSSCI05: F(9, 67) = 15.10, Prob > F = 0.0000, R-squared = 0.2300
. *USA
. svy jackknife, subpop(if idcntry==840):reg BSSSCI01 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==840):reg BSSSCI02 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==840):reg BSSSCI03 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==840):reg BSSSCI04 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==840):reg BSSSCI05 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp

United States (idcntry==840), teacher-level models for plausible values 1-5: survey jackknife linear regression. Design: 75 strata, 150 PSUs, 150 replications, design df = 75; number of obs = 40566 (population size = 2961191.9); subpop. no. of obs = 5130 (subpop. size = 1575311.9).
   BSSSCI01: F(9, 67) = 6.25, Prob > F = 0.0000, R-squared = 0.1002
   BSSSCI02: F(9, 67) = 6.16, Prob > F = 0.0000, R-squared = 0.0999
   BSSSCI03: F(9, 67) = 6.47, Prob > F = 0.0000, R-squared = 0.1020
   BSSSCI04: F(9, 67) = 5.86, Prob > F = 0.0000, R-squared = 0.1011
   BSSSCI05: F(9, 67) = 6.31, Prob > F = 0.0000, R-squared = 0.1000
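A note on the recurring subpop() syntax: restricting the sample with subpop(if idcntry==840) keeps every stratum and PSU available to the jackknife, so the replicate variances reflect the full survey design, whereas an ordinary if qualifier would discard that design information before the replicates are formed. An illustrative contrast only (the model is shortened for readability):

    * subpop() keeps the whole design in the variance calculation
    svy jackknife, subpop(if idcntry==840): reg BSSSCI01 BTBG01 BTBG04

    * an "if" restriction (not used in these runs) would drop the rest of
    * the design before replication:
    * svy jackknife: reg BSSSCI01 BTBG01 BTBG04 if idcntry==840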
.
. svy jackknife, subpop(if idcntry==840):reg BSSSCI01 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==840):reg BSSSCI02 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==840):reg BSSSCI03 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==840):reg BSSSCI04 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
. svy jackknife, subpop(if idcntry==840):reg BSSSCI05 BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp

United States (idcntry==840), teacher-level models for plausible values 1-5, re-estimated on a different estimation sample: survey jackknife linear regression. Design: 75 strata, 150 PSUs, 150 replications, design df = 75; number of obs = 44399 (population size = 4217396.4); subpop. no. of obs = 1335 (subpop. size = 402016.3).
   BSSSCI01: F(9, 67) = 3.34, Prob > F = 0.0020, R-squared = 0.0350
   BSSSCI02: F(9, 67) = 2.27, Prob > F = 0.0275, R-squared = 0.0329
   BSSSCI03: F(9, 67) = 3.36, Prob > F = 0.0019, R-squared = 0.0370
   BSSSCI04: F(9, 67) = 2.75, Prob > F = 0.0085, R-squared = 0.0348
   BSSSCI05: F(9, 67) = 2.39, Prob > F = 0.0205, R-squared = 0.0289
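Each teacher-level model above is fit separately to the five plausible values BSSSCI01-BSSSCI05. In TIMSS-style analyses the five sets of estimates are then pooled with Rubin's combining rules; that pooling code is not part of this log, so the following is only a hypothetical sketch for a single coefficient (telimi in the Ghana model, chosen arbitrarily), with all names used as placeholders:

    * Hypothetical sketch only: pool one coefficient across the five
    * plausible values with Rubin's rules.
    local M = 5
    scalar qbar = 0
    scalar ubar = 0
    forvalues i = 1/`M' {
        quietly svy jackknife, subpop(if idcntry==288): ///
            reg BSSSCI0`i' BTBG01 BTBG04 scimaj BTBS17A tecoop telimi sciped expect tesupp
        scalar q`i' = _b[telimi]          // point estimate for plausible value `i'
        scalar u`i' = _se[telimi]^2       // its squared standard error
        scalar qbar = qbar + q`i'/`M'     // running mean of the estimates
        scalar ubar = ubar + u`i'/`M'     // running mean of the within variances
    }
    scalar bvar = 0
    forvalues i = 1/`M' {
        scalar bvar = bvar + (q`i' - qbar)^2/(`M' - 1)   // between-PV variance
    }
    scalar vtot = ubar + (1 + 1/`M')*bvar                // Rubin total variance
    display "pooled coef = " qbar "   pooled SE = " sqrt(vtot)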
.
end of do-file

. log close
      name:
       log:  C:\Users\educ.brunerju\Google Drive\Dissertation\Outputs\Baseline Regressions\Teacher Regressions.smcl
  log type:  smcl
 closed on:  29 May 2014, 15:12:41


School Regressions

      name:
       log:  C:\Users\educ.brunerju\Google Drive\Dissertation\Outputs\Baseline Regressions\School Regressions.smcl
  log type:  smcl
 opened on:  29 May 2014, 16:05:46

. do "C:\Users\EDUC~1.BRU\AppData\Local\Temp\STD00000000.tmp"
. ***School Regressions***
. *Run school setup file first and factor analysis second*

. *Chile
. svy jackknife, subpop(if idcntry==152):reg BSSSCI01 BCBG03A BCBG05B schcli schpar
. svy jackknife, subpop(if idcntry==152):reg BSSSCI02 BCBG03A BCBG05B schcli schpar
. svy jackknife, subpop(if idcntry==152):reg BSSSCI03 BCBG03A BCBG05B schcli schpar
. svy jackknife, subpop(if idcntry==152):reg BSSSCI04 BCBG03A BCBG05B schcli schpar
. svy jackknife, subpop(if idcntry==152):reg BSSSCI05 BCBG03A BCBG05B schcli schpar

Chile (idcntry==152), school-level models for plausible values 1-5: survey jackknife linear regression. Here, as in every model in this log, Stata notes that some subpopulation observations were dropped during estimation, most likely because of missing values in the model variables. Design: 75 strata, 150 PSUs, 150 replications, design df = 75; number of obs = 37955 (population size = 4656707.4); subpop. no. of obs = 4796 (subpop. size = 212596.06).
   BSSSCI01: F(4, 72) = 31.63, Prob > F = 0.0000, R-squared = 0.1561
   BSSSCI02: F(4, 72) = 32.67, Prob > F = 0.0000, R-squared = 0.1596
   BSSSCI03: F(4, 72) = 34.63, Prob > F = 0.0000, R-squared = 0.1686
   BSSSCI04: F(4, 72) = 30.97, Prob > F = 0.0000, R-squared = 0.1572
   BSSSCI05: F(4, 72) = 31.00, Prob > F = 0.0000, R-squared = 0.1627
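The header of this log notes that a school setup file and a factor analysis are run first; the school-level composites entered in these models (schcli, schpar) are presumably produced at that step. A hypothetical illustration of deriving one such composite, using placeholder item names (q1-q5) rather than the TIMSS variable names actually used:

    * Placeholder items; the real items come from the TIMSS school
    * questionnaire and are prepared in the setup do-files referenced above.
    factor q1 q2 q3 q4 q5, pcf       // principal-component factoring
    predict schcli                   // save the first factor score as the composite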
. *Finland
. svy jackknife, subpop(if idcntry==246):reg BSSSCI01 BCBG03A BCBG05B schcli schpar
. svy jackknife, subpop(if idcntry==246):reg BSSSCI02 BCBG03A BCBG05B schcli schpar
. svy jackknife, subpop(if idcntry==246):reg BSSSCI03 BCBG03A BCBG05B schcli schpar
. svy jackknife, subpop(if idcntry==246):reg BSSSCI04 BCBG03A BCBG05B schcli schpar
. svy jackknife, subpop(if idcntry==246):reg BSSSCI05 BCBG03A BCBG05B schcli schpar

Finland (idcntry==246), school-level models for plausible values 1-5: survey jackknife linear regression. In each model, 5 strata are omitted because they contain no subpopulation members, so the design reduces to 70 strata and 140 PSUs (140 replications, design df = 70); number of obs = 35864 (population size = 4396071.3); subpop. no. of obs = 3549 (subpop. size = 49197.513).
   BSSSCI01: F(4, 67) = 4.79, Prob > F = 0.0018, R-squared = 0.0159
   BSSSCI02: F(4, 67) = 3.67, Prob > F = 0.0092, R-squared = 0.0127
   BSSSCI03: F(4, 67) = 3.23, Prob > F = 0.0175, R-squared = 0.0106
   BSSSCI04: F(4, 67) = 4.44, Prob > F = 0.0030, R-squared = 0.0139
   BSSSCI05: F(4, 67) = 3.36, Prob > F = 0.0145, R-squared = 0.0119
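The omitted Finnish strata explain why the Finland runs report 140 replications and design df = 70 rather than the 150 and 75 seen elsewhere. A quick way to see how a subpopulation is spread across the declared design, shown only as a sketch:

    * How many sampling units and observations each stratum contributes
    * for the Finnish subpopulation
    svydescribe if idcntry==246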
. *Ghana
. svy jackknife, subpop(if idcntry==288):reg BSSSCI01 BCBG03A BCBG05B schcli schpar
. svy jackknife, subpop(if idcntry==288):reg BSSSCI02 BCBG03A BCBG05B schcli schpar
. svy jackknife, subpop(if idcntry==288):reg BSSSCI03 BCBG03A BCBG05B schcli schpar
. svy jackknife, subpop(if idcntry==288):reg BSSSCI04 BCBG03A BCBG05B schcli schpar
. svy jackknife, subpop(if idcntry==288):reg BSSSCI05 BCBG03A BCBG05B schcli schpar

Ghana (idcntry==288), school-level models for plausible values 1-5: survey jackknife linear regression. Design: 75 strata, 150 PSUs, 150 replications, design df = 75; number of obs = 38718 (population size = 4684080.7); subpop. no. of obs = 7047 (subpop. size = 384343.95).
   BSSSCI01: F(4, 72) = 23.23, Prob > F = 0.0000, R-squared = 0.1140
   BSSSCI02: F(4, 72) = 20.50, Prob > F = 0.0000, R-squared = 0.1092
   BSSSCI03: F(4, 72) = 20.80, Prob > F = 0.0000, R-squared = 0.1133
   BSSSCI04: F(4, 72) = 22.78, Prob > F = 0.0000, R-squared = 0.1180
   BSSSCI05: F(4, 72) = 21.37, Prob > F = 0.0000, R-squared = 0.1159
of obs = 148 Number of obs Population size = 38270 = 4610715.3 4920 405 Subpop. size = 602292.51 Replications = Design df F( 4, 71) Prob > F R-squared 148 = 74 = 25.11 = 0.0000 = 0.0386 Jknife * BSSSCI02 Coef. Std. Err. t P>t [95% Conf. Interval] BCBG03A -11.25656 1.517548 -7.42 0.000 -14.28034 -8.23278 BCBG05B 4.32 0.000 3.173077 8.59456 5.883818 1.360443 schcli -9.793105 2.160117 -4.53 0.000 -14.09723 -5.488977 schpar 2.80734 3.063133 0.92 0.362 -3.296085 8.910766 _cons 568.7605 12.65729 44.94 0.000 543.5403 593.9807 Note: 1 stratum omitted because it contains no subpopulation members. . svy jackknife, subpop(if idcntry==410):reg BSSSCI03 BCBG03A BCBG05B schcli schpar (running regress on estimation sample) Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 406 ....................ss............................ 50 .................................................. 100 .................................................. 150 Survey: Linear regression Number of strata = Number of PSUs 74 = 148 Subpop. no. of obs = = 602292.51 Replications = F( 4, 71) Prob > F R-squared = Population size 38270 = 4610715.3 148 74 = 22.97 = 0.0000 = = 4920 Subpop. size Design df Number of obs 0.0372 Jknife * BSSSCI03 Coef. Std. Err. BCBG03A -11.226 1.629396 BCBG05B 5.725901 schcli -9.959497 t P>t 1.44549 2.21665 [95% Conf. Interval] -6.89 0.000 3.96 0.000 -4.49 0.000 schpar 1.459976 3.220772 0.45 0.652 _cons 574.2449 13.46203 42.66 0.000 -14.47264 -7.979355 2.845699 8.606103 -14.37627 -5.542726 -4.957552 7.877504 547.4213 601.0686 Note: 1 stratum omitted because it contains no subpopulation members. 407 . svy jackknife, subpop(if idcntry==410):reg BSSSCI04 BCBG03A BCBG05B schcli schpar (running regress on estimation sample) Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 ....................ss............................ 50 .................................................. 100 .................................................. 150 Survey: Linear regression Number of strata = Number of PSUs 74 = Subpop. no. of obs = 148 = 602292.51 Replications = F( 4, 71) Prob > F R-squared = 38270 = 4610715.3 148 74 = 19.74 = 0.0000 = Population size = 4920 Subpop. size Design df Number of obs 0.0333 Jknife * BSSSCI04 Coef. Std. Err. t P>t [95% Conf. Interval] 408 BCBG03A -10.49805 1.577041 -6.66 0.000 -13.64038 -7.355731 BCBG05B 3.77 0.000 2.605287 5.530206 1.467933 schcli -8.854558 2.110974 -4.19 0.000 schpar 2.867396 2.984835 0.96 0.340 _cons 566.1108 13.12098 43.15 0.000 -13.06077 8.455126 -4.64835 -3.080018 8.814809 539.9667 592.2549 Note: 1 stratum omitted because it contains no subpopulation members. . svy jackknife, subpop(if idcntry==410):reg BSSSCI05 BCBG03A BCBG05B schcli schpar (running regress on estimation sample) Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. 
If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 ....................ss............................ 50 .................................................. 100 .................................................. 150 Survey: Linear regression Number of strata = Number of PSUs 74 = Subpop. no. of obs = Subpop. size 148 Number of obs Population size = 38270 = 4610715.3 4920 = 602292.51 409 Replications Design df F( 4, 71) Prob > F R-squared = 148 = 74 = 25.16 = 0.0000 = 0.0385 Jknife * BSSSCI05 Coef. Std. Err. t P>t [95% Conf. Interval] BCBG03A -11.28273 1.597077 -7.06 0.000 -14.46498 -8.100488 BCBG05B 4.00 0.000 3.021534 6.020157 1.504922 schcli -9.657042 2.17003 -4.45 0.000 schpar 3.173776 2.976187 1.07 0.290 _cons 567.0377 12.15605 46.65 0.000 9.018779 -13.98092 -5.333164 -2.756405 9.103957 542.8162 591.2591 . *Signapore . svy jackknife, subpop(if idcntry==702):reg BSSSCI01 BCBG03A BCBG05B schcli schpar (running regress on estimation sample) Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 .................................................. 50 .................................................. 100 410 .................................................. 150 Survey: Linear regression Number of strata = Number of PSUs 75 = Subpop. no. of obs = 150 = 47556 Replications = 150 F( 3, 73) Prob > F R-squared = = Population size 38673 = 4692642.1 5606 Subpop. size Design df Number of obs 75 = 18.63 = 0.0000 = 0.1235 Jknife * BSSSCI01 Coef. Std. Err. BCBG03A -41.7896 6.220093 BCBG05B 0 (omitted) schcli -25.63588 8.619678 t P>t [95% Conf. Interval] -6.72 0.000 -54.18066 -29.39854 -2.97 0.004 -42.80716 -8.464597 -14.27083 18.23018 schpar 1.979675 8.157467 0.24 0.809 _cons 701.3056 34.21971 20.49 0.000 633.1365 769.4748 . svy jackknife, subpop(if idcntry==702):reg BSSSCI02 BCBG03A BCBG05B schcli schpar (running regress on estimation sample) 411 Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 .................................................. 50 .................................................. 100 .................................................. 150 Survey: Linear regression Number of strata = Number of PSUs 75 = Subpop. no. of obs = 150 = 47556 Replications = 150 F( 3, 73) Prob > F R-squared = 38673 = 4692642.1 75 = 18.79 = 0.0000 = Population size = 5606 Subpop. size Design df Number of obs 0.1254 Jknife * BSSSCI02 Coef. Std. Err. BCBG03A -41.47554 6.206029 t P>t [95% Conf. Interval] -6.68 0.000 -53.83858 -29.11249 412 BCBG05B 0 (omitted) schcli -25.75023 8.495488 -3.03 0.003 -42.67411 -8.826353 schpar 2.565665 8.13032 0.32 0.753 -13.63076 18.76209 _cons 699.3116 34.09518 20.51 0.000 631.3905 767.2327 . 
. svy jackknife, subpop(if idcntry==702):reg BSSSCI03 BCBG03A BCBG05B schcli schpar
(running regress on estimation sample)
Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations.
Jackknife replications (150)

Survey: Linear regression
Number of strata   = 75      Number of PSUs  = 150
Number of obs      = 38673   Population size = 4692642.1
Subpop. no. of obs = 5606    Subpop. size    = 47556
Replications       = 150     Design df       = 75
F(3, 73) = 18.64             Prob > F = 0.0000             R-squared = 0.1251

BSSSCI03 (Jknife *)      Coef.    Std. Err.       t    P>|t|     [95% Conf. Interval]
BCBG03A              -41.13202    6.102209    -6.74    0.000     -53.28824    -28.9758
BCBG05B                      0   (omitted)
schcli                -26.0367    8.253861    -3.15    0.002     -42.47923   -9.594164
schpar                2.173769    7.842944     0.28    0.782     -13.45018    17.79771
_cons                 700.5823    33.21713    21.09    0.000      634.4104    766.7542

. svy jackknife, subpop(if idcntry==702):reg BSSSCI04 BCBG03A BCBG05B schcli schpar
(running regress on estimation sample)
Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations.
Jackknife replications (150)

Survey: Linear regression
Number of strata   = 75      Number of PSUs  = 150
Number of obs      = 38673   Population size = 4692642.1
Subpop. no. of obs = 5606    Subpop. size    = 47556
Replications       = 150     Design df       = 75
F(3, 73) = 19.13             Prob > F = 0.0000             R-squared = 0.1244

BSSSCI04 (Jknife *)      Coef.    Std. Err.       t    P>|t|     [95% Conf. Interval]
BCBG03A               -41.4095    6.228698    -6.65    0.000      -53.8177    -29.0013
BCBG05B                      0   (omitted)
schcli               -25.76281    8.251842    -3.12    0.003     -42.20132   -9.324296
schpar                2.996754    8.022574     0.37    0.710     -12.98503    18.97854
_cons                 697.8335    33.93101    20.57    0.000      630.2395    765.4276

. svy jackknife, subpop(if idcntry==702):reg BSSSCI05 BCBG03A BCBG05B schcli schpar
(running regress on estimation sample)
Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations.
Jackknife replications (150)

Survey: Linear regression
Number of strata   = 75      Number of PSUs  = 150
Number of obs      = 38673   Population size = 4692642.1
Subpop. no. of obs = 5606    Subpop. size    = 47556
Replications       = 150     Design df       = 75
F(3, 73) = 18.24             Prob > F = 0.0000             R-squared = 0.1229
BSSSCI05 (Jknife *)      Coef.    Std. Err.       t    P>|t|     [95% Conf. Interval]
BCBG03A              -40.91272    6.173915    -6.63    0.000     -53.21179   -28.61365
BCBG05B                      0   (omitted)
schcli               -25.22685     8.66054    -2.91    0.005     -42.47953   -7.974164
schpar                2.661979    8.031258     0.33    0.741     -13.33711    18.66106
_cons                 697.8837    34.43576    20.27    0.000      629.2841    766.4832

.
. *USA
. svy jackknife, subpop(if idcntry==840):reg BSSSCI01 BCBG03A BCBG05B schcli schpar
(running regress on estimation sample)
Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations.
Jackknife replications (150)

Survey: Linear regression
Number of strata   = 75      Number of PSUs  = 150
Number of obs      = 37489   Population size = 4234548.2
Subpop. no. of obs = 8972    Subpop. size    = 2848668.2
Replications       = 150     Design df       = 75
F(4, 72) = 32.26             Prob > F = 0.0000             R-squared = 0.1142

BSSSCI01 (Jknife *)      Coef.    Std. Err.       t    P>|t|     [95% Conf. Interval]
BCBG03A              -22.18278    2.416457    -9.18    0.000     -26.99661   -17.36895
BCBG05B              -6.386407    2.002125    -3.19    0.002     -10.37484   -2.397969
schcli               -5.225859    4.968709    -1.05    0.296     -15.12403    4.672317
schpar                .2803225    4.655729     0.06    0.952     -8.994365     9.55501
_cons                 620.5221    16.64404    37.28    0.000      587.3655    653.6788

. svy jackknife, subpop(if idcntry==840):reg BSSSCI02 BCBG03A BCBG05B schcli schpar
(running regress on estimation sample)
Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations.
Jackknife replications (150)

Survey: Linear regression
Number of strata   = 75      Number of PSUs  = 150
Number of obs      = 37489   Population size = 4234548.2
Subpop. no. of obs = 8972    Subpop. size    = 2848668.2
Replications       = 150     Design df       = 75
F(4, 72) = 30.21             Prob > F = 0.0000             R-squared = 0.1099

BSSSCI02 (Jknife *)      Coef.    Std. Err.       t    P>|t|     [95% Conf. Interval]
BCBG03A              -21.55983    2.387367    -9.03    0.000     -26.31571   -16.80395
BCBG05B              -5.709709    1.948047    -2.93    0.004     -9.590417   -1.829001
schcli               -5.631669    4.789146    -1.18    0.243     -15.17214      3.9088
schpar                .4222875    4.447653     0.09    0.925     -8.437892    9.282467
_cons                 615.7627     16.3883    37.57    0.000      583.1155    648.4099
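A note on the estimation pattern used throughout this appendix: the country restriction is imposed with subpop() rather than with an if qualifier so that the full multi-country survey design is retained for jackknife variance estimation, which is why the full-sample strata, PSU, and population figures are reported alongside the subpopulation counts in every header. A minimal illustration of the distinction, using the same variables as the listings above:

* design-correct subpopulation estimation: the full design is kept for variance estimation
svy jackknife, subpop(if idcntry==840): regress BSSSCI01 BCBG03A BCBG05B schcli schpar

* not equivalent: an if qualifier removes the other countries from the design itself
svy jackknife: regress BSSSCI01 BCBG03A BCBG05B schcli schpar if idcntry==840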
. svy jackknife, subpop(if idcntry==840):reg BSSSCI03 BCBG03A BCBG05B schcli schpar
(running regress on estimation sample)
Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations.
Jackknife replications (150)

Survey: Linear regression
Number of strata   = 75      Number of PSUs  = 150
Number of obs      = 37489   Population size = 4234548.2
Subpop. no. of obs = 8972    Subpop. size    = 2848668.2
Replications       = 150     Design df       = 75
F(4, 72) = 31.10             Prob > F = 0.0000             R-squared = 0.1152

BSSSCI03 (Jknife *)      Coef.    Std. Err.       t    P>|t|     [95% Conf. Interval]
BCBG03A              -21.85046    2.327879    -9.39    0.000     -26.48783   -17.21308
BCBG05B              -6.302009     1.92792    -3.27    0.002     -10.14262   -2.461395
schcli               -5.556843    4.765216    -1.17    0.247     -15.04964    3.935954
schpar                1.299022    4.606766     0.28    0.779     -7.878127    10.47617
_cons                 616.7083    17.15721    35.94    0.000      582.5294    650.8872

. svy jackknife, subpop(if idcntry==840):reg BSSSCI04 BCBG03A BCBG05B schcli schpar
(running regress on estimation sample)
Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations.
Jackknife replications (150)

Survey: Linear regression
Number of strata   = 75      Number of PSUs  = 150
Number of obs      = 37489   Population size = 4234548.2
Subpop. no. of obs = 8972    Subpop. size    = 2848668.2
Replications       = 150     Design df       = 75
F(4, 72) = 29.02             Prob > F = 0.0000             R-squared = 0.1092

BSSSCI04 (Jknife *)      Coef.    Std. Err.       t    P>|t|     [95% Conf. Interval]
BCBG03A              -21.50212    2.398055    -8.97    0.000     -26.27929   -16.72495
BCBG05B               -5.78973    1.924021    -3.01    0.004     -9.622575   -1.956884
schcli               -5.429032    4.853628    -1.12    0.267     -15.09796    4.239891
schpar                .6670165     4.81319     0.14    0.890     -8.921349    10.25538
_cons                 615.2379    16.82368    36.57    0.000      581.7235    648.7524

. svy jackknife, subpop(if idcntry==840):reg BSSSCI05 BCBG03A BCBG05B schcli schpar
(running regress on estimation sample)
Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations.
Jackknife replications (150)

Survey: Linear regression
Number of strata   = 75      Number of PSUs  = 150
Number of obs      = 37489   Population size = 4234548.2
Subpop. no. of obs = 8972    Subpop. size    = 2848668.2
Replications       = 150     Design df       = 75
F(4, 72) = 29.12             Prob > F = 0.0000             R-squared = 0.1075

BSSSCI05 (Jknife *)      Coef.    Std. Err.       t    P>|t|     [95% Conf. Interval]
BCBG03A              -21.13326    2.404006    -8.79    0.000     -25.92228   -16.34423
BCBG05B              -5.706257    1.958821    -2.91    0.005     -9.608427   -1.804086
schcli               -6.442937      4.9446    -1.30    0.197     -16.29309    3.407211
schpar               -.1942855    4.440976    -0.04    0.965     -9.041164    8.652593
_cons                 619.7497    16.87054    36.74    0.000      586.1419    653.3575

end of do-file

. log close
      name:
       log:  C:\Users\educ.brunerju\Google Drive\Dissertation\Outputs\Baseline Regressions\School Regressions.smcl
  log type:  smcl
 closed on:  29 May 2014, 16:13:09
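For orientation, the do-file that produced the log above amounts to a survey design declaration followed by one jackknife regression per plausible value within each country subpopulation. The sketch below is illustrative only: the design and weight variable names (jkczone, jkcrep, schwgt) are placeholders assumed here, not taken from the log, and would need to be replaced with whatever variables the analysis file was actually svyset with.

* illustrative sketch only -- design and weight variable names are assumed placeholders
svyset jkcrep [pweight=schwgt], strata(jkczone) vce(jackknife) mse

* one regression per country and plausible value; the remaining countries follow the same pattern
foreach c in 410 702 840 {
    foreach pv in BSSSCI01 BSSSCI02 BSSSCI03 BSSSCI04 BSSSCI05 {
        svy jackknife, subpop(if idcntry==`c'): regress `pv' BCBG03A BCBG05B schcli schpar
    }
}

The five sets of estimates for each country are then combined across plausible values as described after the Korea regressions above.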
HLM Outputs

Program: HLM 7 Hierarchical Linear and Nonlinear Modeling
Authors: Stephen Raudenbush, Tony Bryk, & Richard Congdon
Publisher: Scientific Software International, Inc. (c) 2000
techsupport@ssicentral.com
www.ssicentral.com
-------------------------------------------------------------------------------
Module: HLM2.EXE (7.01.21202.1001)
Date:   7 July 2014, Monday
Time:   9:43:46
-------------------------------------------------------------------------------
Specifications for this HLM2 run

Problem Title: no title
The data source for this run        = Chile
The command file for this run       = C:\Users\EDUC~1.BRU\AppData\Local\Temp\whlmtemp.hlm
Output file name                    = C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg
The maximum number of level-1 units = 3370
The maximum number of level-2 units = 131
The maximum number of iterations    = 100
Method of estimation: full maximum likelihood

This is part of a plausible value analysis using the following variables:
BSSSCI01 BSSSCI02 BSSSCI03 BSSSCI04 BSSSCI05

Weighting Specification
-----------------------
             Weighting?    Weight Variable Name    Normalized?
Level 1      yes           SCIWGT                  yes
Level 2      yes           SCIWGT                  yes
Precision    no

Summary of the model specified (in hierarchical format)
-------------------------------------------------------
Level-1 Model:  BSSSCI01 = B0 + r
Level-2 Model:  B0 = G00 + u0
Mixed Model:    BSSSCI01 = G00 + u0 + r

THE AVERAGED RESULTS FOR THIS PLAUSIBLE VALUE RUN

sigma^2 = 3331.02009          Standard Error of sigma^2 = 117.61816
tau INTRCPT1,B0 = 1802.35909  Standard error of tau INTRCPT1,B0 = 255.32926

Random level-1 coefficient    Reliability estimate
INTRCPT1, G0                  0.928

The outcome variables are: BSSSCI01, BSSSCI02, BSSSCI03, BSSSCI04, BSSSCI05

Final estimation of fixed effects:
Fixed Effect          Coefficient    Standard Error    T-ratio    Approx. d.f.    P-value
For INTRCPT1, B0
  INTRCPT2, G00       459.099527     3.989363          115.081    130             0.000

Final estimation of fixed effects (with robust standard errors):
Fixed Effect          Coefficient    Standard Error    T-ratio    Approx. d.f.    P-value
For INTRCPT1, B0
  INTRCPT2, G00       459.099527     4.077491          112.594    130             0.000

Final estimation of variance components:
Random Effect     Standard Deviation    Variance Component    df     Chi-square    P-value
INTRCPT1, u0      42.45420              1802.35909            130    1969.32619    0.000
level-1,  r       57.71499              3331.02009

Program: HLM 7 Hierarchical Linear and Nonlinear Modeling
Authors: Stephen Raudenbush, Tony Bryk, & Richard Congdon
Publisher: Scientific Software International, Inc.
(c) 2000 techsupport@ssicentral.com www.ssicentral.com ------------------------------------------------------------------------------Module: HLM2.EXE (7.01.21202.1001) Date: 30 June 2014, Monday Time: 12:44:32 ------------------------------------------------------------------------------Specifications for this HLM2 run Problem Title: no title The data source for this run = Chile The command file for this run = C:\Users\EDUC~1.BRU\AppData\Local\Temp\whlmtemp.hlm Output file name = C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg The maximum number of level-1 units = 3370 The maximum number of level-2 units = 131 The maximum number of iterations = 100 Method of estimation: full maximum likelihood This is part of a plausible value analysis using the following variables: BSSSCI01 BSSSCI02 BSSSCI03 428 BSSSCI04 BSSSCI05 Weighting Specification ----------------------Weight Variable Weighting? Name Normalized? Level 1 yes SCIWGT yes Level 2 yes SCIWGT yes Precision no Summary of the model specified (in hierarchical format) --------------------------------------------------Level-1 Model BSSSCI01 = B0 + B1*(SES) + r Level-2 Model B0 = G00 + G01*(ECDISA) + u0 B1 = G10 Mixed Model BSSSCI01 = G00 + G01*ECDISA + G10*SES + u0+ r THE AVERAGED RESULTS FOR THIS PLAUSIBLE VALUE RUN sigma^2 = 3202.27788 Standard Error of sigma^2 = 110.77201 tau INTRCPT1,B0 877.66884 Standard error of tau 429 INTRCPT1,B0 146.80772 ---------------------------------------------------Random level-1 coefficient Reliability estimate ---------------------------------------------------INTRCPT1, G0 0.869 ---------------------------------------------------- 430 The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects: ---------------------------------------------------------------------------Standard Fixed Effect Coefficient Error Approx. T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 508.910984 7.760544 ECDISA, G01 For 65.577 129 0.000 -14.969364 2.396952 -6.245 129 0.000 6.936489 0.720723 9.624 18 0.000 SES slope, B1 INTRCPT2, G10 ---------------------------------------------------------------------------- The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects (with robust standard errors) ---------------------------------------------------------------------------Standard Fixed Effect Coefficient Error Approx. T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 ECDISA, G01 For 508.910984 8.865948 57.401 129 0.000 -14.969364 2.689959 -5.565 129 0.000 6.936489 0.801222 8.657 28 0.000 SES slope, B1 INTRCPT2, G10 ---------------------------------------------------------------------------- 431 Final estimation of variance components: ----------------------------------------------------------------------------Random Effect Standard Deviation Variance df Chi-square P-value Component ----------------------------------------------------------------------------INTRCPT1, level-1, u0 r 29.62548 877.66884 129 1055.48710 0.000 56.58867 3202.27788 ----------------------------------------------------------------------------- Program: HLM 7 Hierarchical Linear and Nonlinear Modeling Authors: Stephen Raudenbush, Tony Bryk, & Richard Congdon Publisher: Scientific Software International, Inc. 
(c) 2000 techsupport@ssicentral.com www.ssicentral.com ------------------------------------------------------------------------------Module: HLM2.EXE (7.01.21202.1001) Date: 7 July 2014, Monday Time: 14:19:44 ------------------------------------------------------------------------------Specifications for this HLM2 run Problem Title: no title The data source for this run = Chile The command file for this run = C:\Users\EDUC~1.BRU\AppData\Local\Temp\whlmtemp.hlm Output file name = C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg The maximum number of level-1 units = 3370 432 The maximum number of level-2 units = 131 The maximum number of iterations = 100 Method of estimation: full maximum likelihood This is part of a plausible value analysis using the following variables: BSSSCI01 BSSSCI02 BSSSCI03 BSSSCI04 BSSSCI05 Weighting Specification ----------------------Weight Variable Weighting? Name Normalized? Level 1 yes SCIWGT yes Level 2 yes SCIWGT yes Precision no Summary of the model specified (in hierarchical format) --------------------------------------------------Level-1 Model BSSSCI01 = B0 + B1*(GIRL) + B2*(SES) + B3*(INTENJ) + B4*(NEGSCI) + B5*(BULLY) + B6*(PARENT) + r Level-2 Model B0 = G00 + G01*(TEAEXP) + G02*(TEAEDU) + G03*(SCIHRS) + G04*(ECDISA) 433 + G05*(URBAN) + G06*(SCIMAJ) + G07*(TECOOP) + G08*(TELIMI) + G09*(SCIPED) + G010*(EXPECT) + G011*(TESUPP) + G012*(SCHCLI) + G013*(SCHPAR) + u0 B1 = G10 B2 = G20 B3 = G30 B4 = G40 B5 = G50 B6 = G60 Mixed Model BSSSCI01 = G00 + G01*TEAEXP + G02*TEAEDU + G03*SCIHRS + G04*ECDISA + G05*URBAN + G06*SCIMAJ + G07*TECOOP + G08*TELIMI + G09*SCIPED + G010*EXPECT + G011*TESUPP + G012*SCHCLI + G013*SCHPAR + G10*GIRL + G20*SES + G30*INTENJ + G40*NEGSCI + G50*BULLY + G60*PARENT + u0+ r THE AVERAGED RESULTS FOR THIS PLAUSIBLE VALUE RUN sigma^2 = 2811.76214 Standard Error of sigma^2 = 109.04670 Tau INTRCPT1,B0 512.61461 Standard error of tau INTRCPT1,B0 94.09324 434 ---------------------------------------------------Random level-1 coefficient Reliability estimate ---------------------------------------------------INTRCPT1, G0 0.816 ---------------------------------------------------The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects: ---------------------------------------------------------------------------Standard Fixed Effect Coefficient Error Approx. T-ratio d.f. 
P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 576.451919 48.307117 11.933 TEAEXP, G01 0.039249 0.198565 0.198 TEAEDU, G02 -0.123200 6.221525 -0.020 117 117 117 0.000 0.844 0.984 SCIHRS, G03 -5.051270 2.410310 -2.096 117 0.038 ECDISA, G04 -8.987758 2.202899 -4.080 117 0.000 URBAN, G05 3.412620 1.908756 1.788 117 0.076 SCIMAJ, G06 8.533368 5.363488 1.591 101 0.115 TECOOP, G07 -0.246567 3.412232 -0.072 117 0.943 TELIMI, G08 -10.584459 4.594102 -2.304 117 0.023 SCIPED, G09 -3.027814 4.500364 -0.673 117 0.502 EXPECT, G010 10.194370 4.110887 TESUPP, G011 SCHCLI, G012 SCHPAR, G013 For 2.480 117 0.015 -0.114289 5.800147 -0.020 117 0.984 -6.787578 3.229705 -2.102 117 0.038 6.424158 4.856533 1.323 117 0.188 GIRL slope, B1 435 INTRCPT2, G10 For -23.551279 2.209859 -10.657 106 0.000 SES slope, B2 INTRCPT2, G20 6.111289 0.670476 9.115 20 0.000 -2.688175 2.670238 -1.007 11 0.336 For INTENJ slope, B3 INTRCPT2, G30 For NEGSCI slope, B4 INTRCPT2, G40 For -24.359498 1.878705 -12.966 35 0.000 -1.388432 3.210457 -0.432 8 0.677 -2.056319 1.625476 -1.265 32 0.215 BULLY slope, B5 INTRCPT2, G50 For PARENT slope, B6 INTRCPT2, G60 ---------------------------------------------------------------------------The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects (with robust standard errors) ---------------------------------------------------------------------------Standard Fixed Effect Coefficient Error Approx. T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 576.451919 52.309990 11.020 TEAEXP, G01 0.039249 0.201621 0.195 TEAEDU, G02 -0.123200 6.735827 -0.018 117 117 117 0.000 0.846 0.985 SCIHRS, G03 -5.051270 2.852768 -1.771 117 0.079 ECDISA, G04 -8.987758 2.470549 -3.638 117 0.000 URBAN, G05 3.412620 2.329290 1.465 117 0.146 SCIMAJ, G06 8.533368 5.151860 1.656 86 436 0.101 TECOOP, G07 -0.246567 3.734517 -0.066 117 0.947 TELIMI, G08 -10.584459 4.993043 -2.120 117 0.036 SCIPED, G09 -3.027814 4.952031 -0.611 117 0.542 EXPECT, G010 10.194370 4.771459 2.137 117 0.035 TESUPP, G011 -0.114289 4.381691 -0.026 117 0.979 SCHCLI, G012 -6.787578 3.403172 -1.994 117 0.048 6.424158 5.444246 1.180 117 0.240 -23.551279 2.563740 -9.186 192 0.000 6.111289 0.748017 8.170 31 0.000 -2.688175 2.806724 -0.958 14 0.354 SCHPAR, G013 For GIRL slope, B1 INTRCPT2, G10 For SES slope, B2 INTRCPT2, G20 For INTENJ slope, B3 INTRCPT2, G30 For NEGSCI slope, B4 INTRCPT2, G40 For -24.359498 2.113220 -11.527 57 0.000 BULLY slope, B5 INTRCPT2, G50 -1.388432 3.411741 -0.407 11 0.692 -2.056319 1.806072 -1.139 48 0.261 For PARENT slope, B6 INTRCPT2, G60 ---------------------------------------------------------------------------Final estimation of variance components: ----------------------------------------------------------------------------Random Effect Standard Deviation Variance df Chi-square P-value Component ----------------------------------------------------------------------------INTRCPT1, level-1, u0 r 22.64099 512.61461 117 752.96522 53.02605 2811.76214 ----------------------------------------------------------------------------- 437 0.000 Program: HLM 7 Hierarchical Linear and Nonlinear Modeling Authors: Stephen Raudenbush, Tony Bryk, & Richard Congdon Publisher: Scientific Software International, Inc. 
(c) 2000 techsupport@ssicentral.com www.ssicentral.com ------------------------------------------------------------------------------Module: HLM2.EXE (7.01.21202.1001) Date: 7 July 2014, Monday Time: 10: 9:55 ------------------------------------------------------------------------------- Specifications for this HLM2 run Problem Title: no title The data source for this run = Finland The command file for this run = C:\Users\EDUC~1.BRU\AppData\Local\Temp\whlmtemp.hlm Output file name = C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg The maximum number of level-1 units = 6307 The maximum number of level-2 units = 549 The maximum number of iterations = 100 Method of estimation: full maximum likelihood This is part of a plausible value analysis using the following variables: BSSSCI01 438 BSSSCI02 BSSSCI03 BSSSCI04 BSSSCI05 Weighting Specification ----------------------Weight Variable Weighting? Name Normalized? Level 1 yes SCIWGT yes Level 2 yes SCIWGT yes Precision no Summary of the model specified (in hierarchical format) --------------------------------------------------Level-1 Model BSSSCI01 = B0 + r Level-2 Model B0 = G00 + u0 Mixed Model BSSSCI01 = G00 + u0+ r THE AVERAGED RESULTS FOR THIS PLAUSIBLE VALUE RUN sigma^2 = 3258.72568 Standard Error of sigma^2 = 137.18406 tau INTRCPT1,B0 818.38869 Standard error of tau INTRCPT1,B0 118.42013 439 ---------------------------------------------------Random level-1 coefficient Reliability estimate ---------------------------------------------------INTRCPT1, G0 0.677 ---------------------------------------------------The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects: ---------------------------------------------------------------------------Standard Fixed Effect Coefficient Error Approx. T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 551.799155 1.601175 344.621 176 0.000 ---------------------------------------------------------------------------- The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects (with robust standard errors) ---------------------------------------------------------------------------Standard Fixed Effect Coefficient Error Approx. T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 551.799155 1.719190 320.965 234 ---------------------------------------------------------------------------Final estimation of variance components: 440 0.000 ----------------------------------------------------------------------------Random Effect Standard Deviation Variance df Chi-square P-value Component ----------------------------------------------------------------------------INTRCPT1, level-1, u0 r 28.60749 818.38869 548 1999.30334 0.000 57.08525 3258.72568 ----------------------------------------------------------------------------- Program: HLM 7 Hierarchical Linear and Nonlinear Modeling Authors: Stephen Raudenbush, Tony Bryk, & Richard Congdon Publisher: Scientific Software International, Inc. 
(c) 2000 techsupport@ssicentral.com www.ssicentral.com ------------------------------------------------------------------------------Module: HLM2.EXE (7.01.21202.1001) Date: 30 June 2014, Monday Time: 12:52: 4 ------------------------------------------------------------------------------- Specifications for this HLM2 run Problem Title: no title The data source for this run = Finland The command file for this run = C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\Finland SES.hlm Output file name = C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg The maximum number of level-1 units = 6307 441 The maximum number of level-2 units = 549 The maximum number of iterations = 100 Method of estimation: full maximum likelihood This is part of a plausible value analysis using the following variables: BSSSCI01 BSSSCI02 BSSSCI03 BSSSCI04 BSSSCI05 Weighting Specification ----------------------Weight Variable Weighting? Name Normalized? Level 1 yes SCIWGT yes Level 2 yes SCIWGT yes Precision no Summary of the model specified (in hierarchical format) --------------------------------------------------Level-1 Model BSSSCI01 = B0 + B1*(SES) + r Level-2 Model B0 = G00 + G01*(ECDISA) + u0 B1 = G10 Mixed Model BSSSCI01 = G00 + G01*ECDISA 442 + G10*SES + u0+ r THE AVERAGED RESULTS FOR THIS PLAUSIBLE VALUE RUN sigma^2 = 2900.58218 Standard Error of sigma^2 = 123.60526 tau INTRCPT1,B0 530.63406 Standard error of tau INTRCPT1,B0 98.62077 ---------------------------------------------------Random level-1 coefficient Reliability estimate ---------------------------------------------------INTRCPT1, G0 0.613 ---------------------------------------------------The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects: ---------------------------------------------------------------------------Standard Fixed Effect Coefficient Error Approx. T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 ECDISA, G01 For 561.798053 3.397494 165.357 -5.401219 1.771841 -3.048 201 547 0.000 0.002 SES slope, B1 INTRCPT2, G10 9.380816 0.405592 23.129 24 0.000 ---------------------------------------------------------------------------- The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 443 Final estimation of fixed effects (with robust standard errors) ---------------------------------------------------------------------------Standard Fixed Effect Approx. Coefficient Error T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 ECDISA, G01 For 561.798053 3.528718 159.207 -5.401219 1.784744 -3.026 234 547 0.000 0.003 SES slope, B1 INTRCPT2, G10 9.380816 0.418694 22.405 27 0.000 ---------------------------------------------------------------------------Final estimation of variance components: ----------------------------------------------------------------------------Random Effect Standard Deviation Variance df Chi-square P-value Component ----------------------------------------------------------------------------INTRCPT1, level-1, u0 r 23.03550 530.63406 547 1658.84110 0.000 53.85705 2900.58218 ----------------------------------------------------------------------------- Program: HLM 7 Hierarchical Linear and Nonlinear Modeling Authors: Stephen Raudenbush, Tony Bryk, & Richard Congdon Publisher: Scientific Software International, Inc. 
(c) 2000 techsupport@ssicentral.com www.ssicentral.com 444 ------------------------------------------------------------------------------Module: HLM2.EXE (7.01.21202.1001) Date: 7 July 2014, Monday Time: 12:47:15 ------------------------------------------------------------------------------Specifications for this HLM2 run Problem Title: no title The data source for this run = Finland The command file for this run = C:\Users\EDUC~1.BRU\AppData\Local\Temp\whlmtemp.hlm Output file name = C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg The maximum number of level-1 units = 6307 The maximum number of level-2 units = 549 The maximum number of iterations = 100 Method of estimation: full maximum likelihood This is part of a plausible value analysis using the following variables: BSSSCI01 BSSSCI02 BSSSCI03 BSSSCI04 BSSSCI05 Weighting Specification ----------------------Weight Variable Weighting? Name Normalized? Level 1 yes SCIWGT yes Level 2 yes SCIWGT yes Precision no 445 Summary of the model specified (in hierarchical format) --------------------------------------------------Level-1 Model BSSSCI01 = B0 + B1*(GIRL) + B2*(SES) + B3*(INTENJ) + B4*(NEGSCI) + B5*(BULLY) + B6*(PARENT) + r Level-2 Model B0 = G00 + G01*(TEAEXP) + G02*(TEAEDU) + G03*(SCIHRS) + G04*(ECDISA) + G05*(URBAN) + G06*(SCIMAJ) + G07*(TECOOP) + G08*(TELIMI) + G09*(SCIPED) + G010*(EXPECT) + G011*(TESUPP) + G012*(SCHCLI) + G013*(SCHPAR) + u0 B1 = G10 B2 = G20 B3 = G30 B4 = G40 B5 = G50 B6 = G60 Mixed Model BSSSCI01 = G00 + G01*TEAEXP + G02*TEAEDU + G03*SCIHRS + G04*ECDISA + G05*URBAN + G06*SCIMAJ + G07*TECOOP + G08*TELIMI + G09*SCIPED + G010*EXPECT + G011*TESUPP + G012*SCHCLI + G013*SCHPAR + G10*GIRL + G20*SES + G30*INTENJ 446 + G40*NEGSCI + G50*BULLY + G60*PARENT + u0+ r 447 THE AVERAGED RESULTS FOR THIS PLAUSIBLE VALUE RUN sigma^2 = 2423.12128 Standard Error of sigma^2 = 96.58805 tau INTRCPT1,B0 400.87485 Standard error of tau INTRCPT1,B0 77.85057 ---------------------------------------------------Random level-1 coefficient Reliability estimate ---------------------------------------------------INTRCPT1, G0 0.591 ---------------------------------------------------The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects: ---------------------------------------------------------------------------Standard Fixed Effect Coefficient Error Approx. T-ratio d.f. 
P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 578.583185 27.082247 21.364 68 0.000 TEAEXP, G01 0.050136 0.120540 0.416 381 0.678 TEAEDU, G02 5.572373 2.885863 1.931 53 0.059 SCIHRS, G03 1.385063 1.073677 1.290 535 0.198 ECDISA, G04 -2.957957 1.673765 -1.767 233 0.078 URBAN, G05 -3.273412 1.090440 -3.002 535 0.003 SCIMAJ, G06 5.184725 2.898109 1.789 TECOOP, G07 -1.830504 1.759240 -1.041 535 0.299 -11.143894 3.196688 -3.486 126 0.001 TELIMI, G08 535 448 0.074 SCIPED, G09 5.728647 2.187462 2.619 230 0.009 EXPECT, G010 8.164263 2.646398 3.085 329 0.002 TESUPP, G011 0.035537 2.202477 0.016 205 0.987 SCHCLI, G012 -3.309461 2.906234 -1.139 32 0.263 3.566717 2.661830 1.340 535 3.045319 1.866992 1.631 16 0.122 7.350557 0.353284 20.806 51 0.000 11.563821 1.664175 6.949 97 0.000 SCHPAR, G013 For GIRL slope, B1 INTRCPT2, G10 For 0.181 SES slope, B2 INTRCPT2, G20 For INTENJ slope, B3 INTRCPT2, G30 For NEGSCI slope, B4 INTRCPT2, G40 For -33.552841 2.281258 -14.708 12 0.000 BULLY slope, B5 INTRCPT2, G50 2.824617 1.854781 1.523 16 -10.254960 1.132836 -9.052 0.147 For PARENT slope, B6 INTRCPT2, G60 22 0.000 ---------------------------------------------------------------------------The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects (with robust standard errors) ---------------------------------------------------------------------------Standard Fixed Effect Coefficient Error Approx. T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 TEAEXP, G01 578.583185 35.612906 0.050136 0.128496 16.246 0.390 206 492 449 0.000 0.697 TEAEDU, G02 5.572373 4.362112 1.277 278 0.203 SCIHRS, G03 1.385063 1.185818 1.168 535 0.243 ECDISA, G04 -2.957957 1.697779 -1.742 247 0.083 URBAN, G05 -3.273412 1.162532 -2.816 535 0.005 SCIMAJ, G06 5.184725 3.327474 1.558 TECOOP, G07 -1.830504 1.959028 -0.934 535 0.351 TELIMI, G08 -11.143894 3.656097 -3.048 216 0.003 SCIPED, G09 5.728647 2.568497 2.230 535 0.120 437 0.026 EXPECT, G010 8.164263 2.808128 2.907 417 0.004 TESUPP, G011 0.035537 2.332215 0.015 258 0.988 SCHCLI, G012 -3.309461 2.769974 -1.195 26 0.243 3.566717 2.426313 1.470 535 3.045319 1.929705 1.578 18 0.132 7.350557 0.381465 19.269 69 0.000 11.563821 1.974260 5.857 194 0.000 20 0.000 SCHPAR, G013 For GIRL slope, B1 INTRCPT2, G10 For 0.142 SES slope, B2 INTRCPT2, G20 For INTENJ slope, B3 INTRCPT2, G30 For NEGSCI slope, B4 INTRCPT2, G40 For -33.552841 2.582107 -12.994 BULLY slope, B5 INTRCPT2, G50 2.824617 2.021789 1.397 23 -10.254960 1.230493 -8.334 0.176 For PARENT slope, B6 INTRCPT2, G60 31 0.000 ---------------------------------------------------------------------------Final estimation of variance components: ----------------------------------------------------------------------------Random Effect Standard Variance df Chi-square P-value 450 Deviation Component ----------------------------------------------------------------------------INTRCPT1, level-1, u0 r 20.02186 400.87485 535 1565.49014 0.000 49.22521 2423.12128 ----------------------------------------------------------------------------- Program: HLM 7 Hierarchical Linear and Nonlinear Modeling Authors: Stephen Raudenbush, Tony Bryk, & Richard Congdon Publisher: Scientific Software International, Inc. 
(c) 2000 techsupport@ssicentral.com www.ssicentral.com ------------------------------------------------------------------------------Module: HLM2.EXE (7.01.21202.1001) Date: 7 July 2014, Monday Time: 10:20:19 ------------------------------------------------------------------------------- Specifications for this HLM2 run Problem Title: no title The data source for this run = Ghana.mdm The command file for this run = C:\Users\EDUC~1.BRU\AppData\Local\Temp\whlmtemp.hlm Output file name = C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg The maximum number of level-1 units = 3948 The maximum number of level-2 units = 132 451 The maximum number of iterations = 100 Method of estimation: full maximum likelihood This is part of a plausible value analysis using the following variables: BSSSCI01 BSSSCI02 BSSSCI03 BSSSCI04 BSSSCI05 Weighting Specification ----------------------Weight Variable Weighting? Name Normalized? Level 1 yes SCIWGT yes Level 2 yes SCIWGT yes Precision no Summary of the model specified (in hierarchical format) --------------------------------------------------Level-1 Model BSSSCI01 = B0 + r Level-2 Model B0 = G00 + u0 Mixed Model BSSSCI01 = G00 + u0+ r THE AVERAGED RESULTS FOR THIS PLAUSIBLE VALUE RUN sigma^2 = 7425.38848 452 Standard Error of sigma^2 = 263.26224 tau INTRCPT1,B0 4895.07523 Standard error of tau INTRCPT1,B0 686.61104 ---------------------------------------------------Random level-1 coefficient Reliability estimate ---------------------------------------------------INTRCPT1, G0 0.944 ---------------------------------------------------- 453 The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects: ---------------------------------------------------------------------------Standard Fixed Effect Approx. Coefficient Error T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 311.498557 6.390832 48.741 131 0.000 ---------------------------------------------------------------------------The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects (with robust standard errors) ---------------------------------------------------------------------------Standard Fixed Effect Approx. Coefficient Error T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 311.498557 7.440326 41.866 131 0.000 ---------------------------------------------------------------------------Final estimation of variance components: ----------------------------------------------------------------------------Random Effect Standard Deviation Variance df Chi-square P-value Component ----------------------------------------------------------------------------INTRCPT1, level-1, u0 r 69.96481 4895.07523 131 2705.94121 86.17069 7425.38848 ----------------------------------------------------------------------------- 454 0.000 Program: HLM 7 Hierarchical Linear and Nonlinear Modeling Authors: Stephen Raudenbush, Tony Bryk, & Richard Congdon Publisher: Scientific Software International, Inc. 
(c) 2000 techsupport@ssicentral.com www.ssicentral.com ------------------------------------------------------------------------------Module: HLM2.EXE (7.01.21202.1001) Date: 30 June 2014, Monday Time: 13: 0:51 ------------------------------------------------------------------------------Specifications for this HLM2 run Problem Title: no title The data source for this run = Ghana.mdm The command file for this run = C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\Ghana SES.hlm Output file name = C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg The maximum number of level-1 units = 3948 The maximum number of level-2 units = 132 The maximum number of iterations = 100 Method of estimation: full maximum likelihood This is part of a plausible value analysis using the following variables: BSSSCI01 BSSSCI02 BSSSCI03 BSSSCI04 BSSSCI05 Weighting Specification ----------------------- 455 Weight Variable Weighting? Name Normalized? Level 1 yes SCIWGT yes Level 2 yes SCIWGT yes Precision no Summary of the model specified (in hierarchical format) --------------------------------------------------Level-1 Model BSSSCI01 = B0 + B1*(SES) + r Level-2 Model B0 = G00 + G01*(ECDISA) + u0 B1 = G10 Mixed Model BSSSCI01 = G00 + G01*ECDISA + G10*SES + u0+ r THE AVERAGED RESULTS FOR THIS PLAUSIBLE VALUE RUN sigma^2 = 7425.77077 Standard Error of sigma^2 = 263.62216 tau INTRCPT1,B0 4562.45109 Standard error of tau INTRCPT1,B0 649.32233 ---------------------------------------------------Random level-1 coefficient Reliability estimate ---------------------------------------------------INTRCPT1, G0 0.941 ---------------------------------------------------- 456 The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects: ---------------------------------------------------------------------------Standard Fixed Effect Approx. Coefficient Error T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 378.924734 25.204945 ECDISA, G01 For 15.034 130 0.000 -18.997719 6.878783 -2.762 130 0.007 0.276931 0.887405 0.312 20 0.758 SES slope, B1 INTRCPT2, G10 ---------------------------------------------------------------------------The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects (with robust standard errors) ---------------------------------------------------------------------------Standard Fixed Effect Approx. Coefficient Error T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 ECDISA, G01 For 378.924734 34.525608 10.975 130 0.000 -18.997719 9.260725 -2.051 130 0.042 0.276931 1.052858 0.263 39 0.794 SES slope, B1 INTRCPT2, G10 ---------------------------------------------------------------------------Final estimation of variance components: ----------------------------------------------------------------------------Random Effect Standard Variance df Chi-square P-value 457 Deviation Component ----------------------------------------------------------------------------INTRCPT1, level-1, u0 r 67.54592 4562.45109 130 2423.79813 0.000 86.17291 7425.77077 ----------------------------------------------------------------------------Program: HLM 7 Hierarchical Linear and Nonlinear Modeling Authors: Stephen Raudenbush, Tony Bryk, & Richard Congdon Publisher: Scientific Software International, Inc. 
(c) 2000 techsupport@ssicentral.com www.ssicentral.com ------------------------------------------------------------------------------Module: HLM2.EXE (7.01.21202.1001) Date: 7 July 2014, Monday Time: 12:14:41 ------------------------------------------------------------------------------Specifications for this HLM2 run Problem Title: no title The data source for this run = Ghana.mdm The command file for this run = C:\Users\EDUC~1.BRU\AppData\Local\Temp\whlmtemp.hlm Output file name = C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg The maximum number of level-1 units = 3948 The maximum number of level-2 units = 132 The maximum number of iterations = 100 Method of estimation: full maximum likelihood This is part of a plausible value analysis using the following variables: BSSSCI01 458 BSSSCI02 BSSSCI03 BSSSCI04 BSSSCI05 Weighting Specification ----------------------Weight Variable Weighting? Name Normalized? Level 1 yes SCIWGT yes Level 2 yes SCIWGT yes Precision no Summary of the model specified (in hierarchical format) --------------------------------------------------Level-1 Model BSSSCI01 = B0 + B1*(GIRL) + B2*(SES) + B3*(INTENJ) + B4*(NEGSCI) + B5*(BULLY) + B6*(PARENT) + r Level-2 Model B0 = G00 + G01*(TEAEXP) + G02*(TEAEDU) + G03*(SCIHRS) + G04*(ECDISA) + G05*(URBAN) + G06*(SCIMAJ) + G07*(TECOOP) + G08*(TELIMI) + G09*(SCIPED) + G010*(EXPECT) + G011*(TESUPP) + G012*(SCHCLI) + G013*(SCHPAR) + u0 B1 = G10 B2 = G20 B3 = G30 B4 = G40 B5 = G50 B6 = G60 Mixed Model 459 BSSSCI01 = G00 + G01*TEAEXP + G02*TEAEDU + G03*SCIHRS + G04*ECDISA + G05*URBAN + G06*SCIMAJ + G07*TECOOP + G08*TELIMI + G09*SCIPED + G010*EXPECT + G011*TESUPP + G012*SCHCLI + G013*SCHPAR + G10*GIRL + G20*SES + G30*INTENJ + G40*NEGSCI + G50*BULLY + G60*PARENT + u0+ r THE AVERAGED RESULTS FOR THIS PLAUSIBLE VALUE RUN sigma^2 = 5952.52424 Standard Error of sigma^2 = 243.36111 tau INTRCPT1,B0 2334.97904 Standard error of tau INTRCPT1,B0 326.28416 ---------------------------------------------------Random level-1 coefficient Reliability estimate ---------------------------------------------------INTRCPT1, G0 0.912 ---------------------------------------------------The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects: ---------------------------------------------------------------------------Standard Fixed Effect Coefficient Error Approx. T-ratio d.f. 
P-value 460 ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 315.367735 88.496891 3.564 118 0.001 TEAEXP, G01 0.423419 0.848523 0.499 118 0.619 TEAEDU, G02 0.629728 5.475541 0.115 118 0.909 SCIHRS, G03 -1.563395 3.528957 -0.443 118 0.659 ECDISA, G04 -8.822848 5.755311 -1.533 118 0.128 URBAN, G05 14.242551 4.643295 3.067 118 0.003 SCIMAJ, G06 -21.424159 10.583930 -2.024 118 0.045 TECOOP, G07 -7.381062 6.778240 -1.089 118 0.278 TELIMI, G08 -2.904609 10.904813 -0.266 118 0.790 SCIPED, G09 7.415945 6.721217 1.103 118 0.272 EXPECT, G010 14.222256 7.031427 2.023 118 0.045 TESUPP, G011 3.047235 12.459339 0.245 118 0.807 SCHCLI, G012 -24.167039 6.709174 -3.602 118 0.000 -9.756530 9.625500 -1.014 118 0.313 -24.469565 2.931292 -8.348 68 0.000 -0.633118 0.867415 -0.730 14 0.477 36.685601 4.659003 7.874 12 0.000 SCHPAR, G013 For GIRL slope, B1 INTRCPT2, G10 For SES slope, B2 INTRCPT2, G20 For INTENJ slope, B3 INTRCPT2, G30 For NEGSCI slope, B4 INTRCPT2, G40 For -34.174805 1.869863 -18.277 61 0.000 BULLY slope, B5 INTRCPT2, G50 -9.194980 2.143072 -4.291 92 0.000 1.101837 2.750540 0.401 11 0.696 For PARENT slope, B6 INTRCPT2, G60 461 ---------------------------------------------------------------------------The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects (with robust standard errors) ---------------------------------------------------------------------------Standard Fixed Effect Coefficient Error Approx. T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 315.367735 89.017121 3.543 118 0.001 TEAEXP, G01 0.423419 0.756708 0.560 118 0.577 TEAEDU, G02 0.629728 5.982978 0.105 118 0.916 SCIHRS, G03 -1.563395 3.994129 -0.391 118 0.696 ECDISA, G04 -8.822848 6.433232 -1.371 118 0.173 URBAN, G05 14.242551 4.924537 2.892 118 0.005 SCIMAJ, G06 -21.424159 12.398559 -1.728 118 0.087 TECOOP, G07 -7.381062 8.397324 -0.879 118 0.381 TELIMI, G08 -2.904609 13.892226 -0.209 118 0.835 SCIPED, G09 7.415945 7.712477 0.962 118 0.338 EXPECT, G010 14.222256 8.332377 1.707 118 0.090 TESUPP, G011 3.047235 11.349524 0.268 118 0.789 SCHCLI, G012 -24.167039 7.900781 -3.059 118 0.003 -9.756530 9.817542 -0.994 118 0.322 -24.469565 3.487451 -7.016 136 0.000 -0.633118 0.971526 -0.652 23 SCHPAR, G013 For GIRL slope, B1 INTRCPT2, G10 For SES slope, B2 INTRCPT2, G20 For INTENJ slope, B3 462 0.521 INTRCPT2, G30 36.685601 5.698973 6.437 28 0.000 For NEGSCI slope, B4 INTRCPT2, G40 For -34.174805 2.414088 -14.156 170 0.000 BULLY slope, B5 INTRCPT2, G50 -9.194980 2.553662 -3.601 186 1.101837 3.219091 0.342 21 0.000 For PARENT slope, B6 INTRCPT2, G60 0.736 ---------------------------------------------------------------------------Final estimation of variance components: ----------------------------------------------------------------------------Random Effect Standard Deviation Variance df Chi-square P-value Component ----------------------------------------------------------------------------INTRCPT1, level-1, u0 r 48.32162 2334.97904 118 1594.44530 0.000 77.15260 5952.52424 ----------------------------------------------------------------------------- Program: HLM 7 Hierarchical Linear and Nonlinear Modeling Authors: Stephen Raudenbush, Tony Bryk, & Richard Congdon Publisher: Scientific Software International, Inc. 
(c) 2000 techsupport@ssicentral.com www.ssicentral.com ------------------------------------------------------------------------------Module: HLM2.EXE (7.01.21202.1001) Date: 7 July 2014, Monday Time: 10:28:55 ------------------------------------------------------------------------------Specifications for this HLM2 run 463 Problem Title: no title The data source for this run = Korea The command file for this run = C:\Users\EDUC~1.BRU\AppData\Local\Temp\whlmtemp.hlm Output file name = C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg The maximum number of level-1 units = 5097 The maximum number of level-2 units = 171 The maximum number of iterations = 100 Method of estimation: full maximum likelihood This is part of a plausible value analysis using the following variables: BSSSCI01 BSSSCI02 BSSSCI03 BSSSCI04 BSSSCI05 Weighting Specification ----------------------Weight Variable Weighting? Name Normalized? Level 1 yes SCIWGT yes Level 2 yes SCIWGT yes Precision no Summary of the model specified (in hierarchical format) --------------------------------------------------Level-1 Model BSSSCI01 = B0 + r Level-2 Model B0 = G00 + u0 464 Mixed Model BSSSCI01 = G00 + u0+ r THE AVERAGED RESULTS FOR THIS PLAUSIBLE VALUE RUN sigma^2 = 5431.99565 Standard Error of sigma^2 = 127.05564 tau INTRCPT1,B0 480.48329 Standard error of tau INTRCPT1,B0 79.44763 ---------------------------------------------------Random level-1 coefficient Reliability estimate ---------------------------------------------------INTRCPT1, G0 0.710 ---------------------------------------------------- 465 The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects: ---------------------------------------------------------------------------Standard Fixed Effect Approx. Coefficient Error T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 558.953839 2.044810 273.353 170 0.000 ---------------------------------------------------------------------------The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects (with robust standard errors) ---------------------------------------------------------------------------Standard Fixed Effect Approx. Coefficient Error T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 558.953839 2.341682 238.698 170 0.000 ---------------------------------------------------------------------------Final estimation of variance components: ----------------------------------------------------------------------------Random Effect Standard Deviation Variance df Chi-square P-value Component ----------------------------------------------------------------------------INTRCPT1, level-1, u0 r 21.91993 480.48329 170 599.97622 73.70207 5431.99565 ----------------------------------------------------------------------------- 466 0.000 Program: HLM 7 Hierarchical Linear and Nonlinear Modeling Authors: Stephen Raudenbush, Tony Bryk, & Richard Congdon Publisher: Scientific Software International, Inc. 
(c) 2000 techsupport@ssicentral.com www.ssicentral.com ------------------------------------------------------------------------------Module: HLM2.EXE (7.01.21202.1001) Date: 30 June 2014, Monday Time: 12:56:17 ------------------------------------------------------------------------------Specifications for this HLM2 run Problem Title: no title The data source for this run = Korea The command file for this run = C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\Korea SES.hlm Output file name = C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg The maximum number of level-1 units = 5097 The maximum number of level-2 units = 171 The maximum number of iterations = 100 Method of estimation: full maximum likelihood This is part of a plausible value analysis using the following variables: BSSSCI01 BSSSCI02 BSSSCI03 BSSSCI04 BSSSCI05 Weighting Specification ----------------------- 467 Weight Variable Weighting? Name Normalized? Level 1 yes SCIWGT yes Level 2 yes SCIWGT yes Precision no Summary of the model specified (in hierarchical format) --------------------------------------------------Level-1 Model BSSSCI01 = B0 + B1*(SES) + r Level-2 Model B0 = G00 + G01*(ECDISA) + u0 B1 = G10 Mixed Model BSSSCI01 = G00 + G01*ECDISA + G10*SES + u0+ r THE AVERAGED RESULTS FOR THIS PLAUSIBLE VALUE RUN sigma^2 = 4738.18003 Standard Error of sigma^2 = 110.41088 tau INTRCPT1,B0 164.05717 Standard error of tau INTRCPT1,B0 36.01461 ---------------------------------------------------Random level-1 coefficient Reliability estimate ---------------------------------------------------INTRCPT1, G0 0.501 ---------------------------------------------------- 468 The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects: ---------------------------------------------------------------------------Standard Fixed Effect Approx. Coefficient Error T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 570.391132 3.618833 157.617 ECDISA, G01 For -4.627658 1.530492 -3.024 169 169 0.000 0.003 SES slope, B1 INTRCPT2, G10 11.906630 0.496439 23.984 43 0.000 ---------------------------------------------------------------------------The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects (with robust standard errors) ---------------------------------------------------------------------------Standard Fixed Effect Approx. Coefficient Error T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 ECDISA, G01 For 570.391132 3.542265 161.024 -4.627658 1.633512 -2.833 169 169 0.000 0.005 SES slope, B1 INTRCPT2, G10 11.906630 0.566205 21.029 73 0.000 ---------------------------------------------------------------------------Final estimation of variance components: ----------------------------------------------------------------------------Random Effect Standard Variance df Chi-square P-value 469 Deviation Component ----------------------------------------------------------------------------INTRCPT1, level-1, u0 r 12.80848 164.05717 169 376.03687 0.000 68.83444 4738.18003 ----------------------------------------------------------------------------- Program: HLM 7 Hierarchical Linear and Nonlinear Modeling Authors: Stephen Raudenbush, Tony Bryk, & Richard Congdon Publisher: Scientific Software International, Inc. 
(c) 2000 techsupport@ssicentral.com www.ssicentral.com ------------------------------------------------------------------------------Module: HLM2.EXE (7.01.21202.1001) Date: 7 July 2014, Monday Time: 11:57:21 ------------------------------------------------------------------------------Specifications for this HLM2 run Problem Title: no title The data source for this run = Korea The command file for this run = C:\Users\EDUC~1.BRU\AppData\Local\Temp\whlmtemp.hlm Output file name = C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg The maximum number of level-1 units = 5097 The maximum number of level-2 units = 171 The maximum number of iterations = 100 Method of estimation: full maximum likelihood This is part of a plausible value analysis using the following variables: BSSSCI01 BSSSCI02 470 BSSSCI03 BSSSCI04 BSSSCI05 Weighting Specification ----------------------Weight Variable Weighting? Name Normalized? Level 1 yes SCIWGT yes Level 2 yes SCIWGT yes Precision no Summary of the model specified (in hierarchical format) --------------------------------------------------Level-1 Model BSSSCI01 = B0 + B1*(GIRL) + B2*(SES) + B3*(INTENJ) + B4*(NEGSCI) + B5*(BULLY) + B6*(PARENT) + r Level-2 Model B0 = G00 + G01*(TEAEXP) + G02*(TEAEDU) + G03*(SCIHRS) + G04*(ECDISA) + G05*(URBAN) + G06*(SCIMAJ) + G07*(TECOOP) + G08*(TELIMI) + G09*(SCIPED) + G010*(EXPECT) + G011*(TESUPP) + G012*(SCHCLI) + G013*(SCHPAR) + u0 B1 = G10 B2 = G20 B3 = G30 B4 = G40 B5 = G50 B6 = G60 471 Mixed Model BSSSCI01 = G00 + G01*TEAEXP + G02*TEAEDU + G03*SCIHRS + G04*ECDISA + G05*URBAN + G06*SCIMAJ + G07*TECOOP + G08*TELIMI + G09*SCIPED + G010*EXPECT + G011*TESUPP + G012*SCHCLI + G013*SCHPAR + G10*GIRL + G20*SES + G30*INTENJ + G40*NEGSCI + G50*BULLY + G60*PARENT + u0+ r 472 THE AVERAGED RESULTS FOR THIS PLAUSIBLE VALUE RUN sigma^2 = 3766.31218 Standard Error of sigma^2 = 92.34341 tau INTRCPT1,B0 84.33946 Standard error of tau INTRCPT1,B0 23.84355 ---------------------------------------------------Random level-1 coefficient Reliability estimate ---------------------------------------------------INTRCPT1, G0 0.398 ---------------------------------------------------The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects: ---------------------------------------------------------------------------Standard Fixed Effect Coefficient Error Approx. T-ratio d.f. 
P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 570.354085 25.223746 22.612 TEAEXP, G01 0.577484 0.116942 4.938 TEAEDU, G02 -4.914471 2.609988 -1.883 157 157 157 0.000 0.000 0.062 SCIHRS, G03 -0.488993 1.447489 -0.338 97 ECDISA, G04 -4.725889 1.286353 -3.674 157 0.000 URBAN, G05 2.442476 1.209400 2.020 157 0.045 SCIMAJ, G06 12.124984 8.778514 1.381 157 0.169 TECOOP, G07 -0.346606 2.134910 -0.162 157 0.871 TELIMI, G08 -2.417949 2.172129 -1.113 134 473 0.736 0.268 SCIPED, G09 0.953994 2.842314 0.336 157 0.738 EXPECT, G010 5.085626 2.204946 2.306 157 0.022 TESUPP, G011 -3.552776 1.791380 -1.983 157 0.049 SCHCLI, G012 -3.528025 1.673527 -2.108 157 0.037 SCHPAR, G013 For -0.495552 2.804392 -0.177 157 1.119995 2.758776 0.406 17 0.690 9.699059 0.489712 19.806 30 0.000 26.215209 1.829080 14.332 4335 0.000 -27.105737 1.859344 -14.578 222 0.000 GIRL slope, B1 INTRCPT2, G10 For 0.860 SES slope, B2 INTRCPT2, G20 For INTENJ slope, B3 INTRCPT2, G30 For NEGSCI slope, B4 INTRCPT2, G40 For BULLY slope, B5 INTRCPT2, G50 5.788569 2.009114 2.881 62 0.005 -1.530735 1.554407 -0.985 23 0.335 For PARENT slope, B6 INTRCPT2, G60 ---------------------------------------------------------------------------The outcome variables are: BSSSCI01,BSSSCI02,BSSSCI03,BSSSCI04,BSSSCI05 Final estimation of fixed effects (with robust standard errors) ---------------------------------------------------------------------------Standard Fixed Effect Coefficient Error Approx. T-ratio d.f. P-value ---------------------------------------------------------------------------For INTRCPT1, B0 INTRCPT2, G00 570.354085 29.422995 19.385 474 157 0.000 TEAEXP, G01 0.577484 0.116463 4.959 TEAEDU, G02 -4.914471 3.008551 -1.634 157 0.000 157 0.104 SCIHRS, G03 -0.488993 1.266516 -0.386 57 0.701 ECDISA, G04 -4.725889 1.352109 -3.495 157 0.001 URBAN, G05 2.442476 1.468547 1.663 157 0.098 SCIMAJ, G06 12.124984 5.312479 2.282 46 0.027 TECOOP, G07 -0.346606 3.349880 -0.103 157 0.918 TELIMI, G08 -2.417949 2.091056 -1.156 115 0.250 SCIPED, G09 0.953994 3.529957 0.270 157 0.787 EXPECT, G010 5.085626 2.171504 2.342 157 0.020 TESUPP, G011 -3.552776 1.875193 -1.895 157 0.060 SCHCLI, G012 -3.528025 1.909585 -1.848 157 0.067 SCHPAR, G013 For -0.495552 3.178928 -0.156 157 1.119995 2.804585 0.399 18 0.694 9.699059 0.542378 17.882 45 0.000 26.215209 2.482477 10.560 4920 0.000 -27.105737 2.544752 -10.652 779 0.000 GIRL slope, B1 INTRCPT2, G10 For 0.876 SES slope, B2 INTRCPT2, G20 For INTENJ slope, B3 INTRCPT2, G30 For NEGSCI slope, B4 INTRCPT2, G40 For BULLY slope, B5 INTRCPT2, G50 5.788569 2.443429 2.369 136 0.019 -1.530735 1.511931 -1.012 20 0.323 For PARENT slope, B6 INTRCPT2, G60 ---------------------------------------------------------------------------Final estimation of variance components: ----------------------------------------------------------------------------- 475 Random Effect Standard Deviation Variance df Chi-square P-value Component ----------------------------------------------------------------------------INTRCPT1, level-1, u0 r 9.18365 84.33946 157 313.88056 0.000 61.37029 3766.31218 ----------------------------------------------------------------------------- Program: HLM 7 Hierarchical Linear and Nonlinear Modeling Authors: Stephen Raudenbush, Tony Bryk, & Richard Congdon Publisher: Scientific Software International, Inc. 
-------------------------------------------------------------------------------
HLM2 run: Singapore, unconditional (intercept-only) model
Module: HLM2.EXE (7.01.21202.1001); Date: 7 July 2014 (Monday), 10:36:09
-------------------------------------------------------------------------------
Problem title: no title
Data source: Singapore.mdm
Command file: C:\Users\EDUC~1.BRU\AppData\Local\Temp\whlmtemp.hlm
Output file: C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg
Maximum number of level-1 units = 4835; maximum number of level-2 units = 277; maximum iterations = 100
Method of estimation: full maximum likelihood
Plausible value analysis over BSSSCI01-BSSSCI05; weighting: SCIWGT (normalized) at levels 1 and 2; no precision weighting

Level-1 Model:  BSSSCI01 = B0 + r
Level-2 Model:  B0 = G00 + u0
Mixed Model:    BSSSCI01 = G00 + u0 + r

Averaged results for this plausible value run
  sigma^2 = 2228.61632 (standard error 50.00110)
  tau (INTRCPT1, B0) = 6871.35029 (standard error 606.79273)
  Reliability estimate of INTRCPT1, B0 = 0.982

Final estimation of fixed effects
  Fixed effect                      Coefficient    SE         T-ratio   d.f.   P-value
  INTRCPT2, G00 (model-based SE)     589.659957    5.036686   117.073    276    0.000
  INTRCPT2, G00 (robust SE)          589.659957    5.201987   113.353    276    0.000

Final estimation of variance components
  Random effect      Std. Dev.    Variance      d.f.   Chi-square      P-value
  INTRCPT1, u0        82.89361    6871.35029     276    14729.38623     0.000
  level-1, r          47.20822    2228.61632
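From the variance components of this unconditional model, the intraclass correlation for Singapore is approximately

$$
\rho = \frac{\tau}{\tau + \sigma^2} = \frac{6871.35}{6871.35 + 2228.62} \approx 0.755,
$$

that is, roughly three quarters of the variance in 8th grade science achievement in this sample lies between the level-2 units (classrooms/schools) rather than within them.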
-------------------------------------------------------------------------------
HLM2 run: Singapore, SES-only model
Module: HLM2.EXE (7.01.21202.1001); Date: 30 June 2014 (Monday), 13:06:56
-------------------------------------------------------------------------------
Problem title: no title
Data source: Singapore.mdm
Command file: C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\Singapore SES.hlm
Output file: C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg
Maximum number of level-1 units = 4835; maximum number of level-2 units = 277; maximum iterations = 100
Method of estimation: full maximum likelihood
Plausible value analysis over BSSSCI01-BSSSCI05; weighting: SCIWGT (normalized) at levels 1 and 2; no precision weighting

Level-1 Model:  BSSSCI01 = B0 + B1*(SES) + r
Level-2 Model:  B0 = G00 + G01*(ECDISA) + u0;  B1 = G10
Mixed Model:    BSSSCI01 = G00 + G01*ECDISA + G10*SES + u0 + r

Averaged results for this plausible value run
  sigma^2 = 2204.90291 (standard error 49.69692)
  tau (INTRCPT1, B0) = 5512.66117 (standard error 491.24175)
  Reliability estimate of INTRCPT1, B0 = 0.977

Final estimation of fixed effects (model-based standard errors on the left, robust standard errors on the right)
  Fixed effect       Coefficient     SE         T-ratio  d.f.  P-value |  Robust SE   T-ratio  d.f.  P-value
  INTRCPT2, G00       661.608441   11.682609    56.632    275   0.000  |  13.031453    50.770   275   0.000
  ECDISA, G01         -41.917142    6.284927    -6.669    275   0.000  |   6.944101    -6.036   275   0.000
  SES slope, G10        2.628566    0.371672     7.072     66   0.000  |   0.395434     6.647    85   0.000

Final estimation of variance components
  Random effect      Std. Dev.    Variance      d.f.   Chi-square      P-value
  INTRCPT1, u0        74.24730    5512.66117     275    11937.28376     0.000
  level-1, r          46.95639    2204.90291
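Set against the unconditional Singapore model above, this SES-only model gives a rough sense of how much of the between-classroom variance the two SES measures account for:

$$
\frac{\tau_{\text{null}} - \tau_{\text{SES}}}{\tau_{\text{null}}} = \frac{6871.35 - 5512.66}{6871.35} \approx 0.198,
$$

so student SES and school economic disadvantage (ECDISA) together are associated with about 20% of the between-classroom variance, while the within-classroom variance is essentially unchanged (2228.62 versus 2204.90).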
-------------------------------------------------------------------------------
HLM2 run: Singapore, full two-level model
Module: HLM2.EXE (7.01.21202.1001); Date: 7 July 2014 (Monday), 11:21:36
-------------------------------------------------------------------------------
Problem title: no title
Data source: Singapore.mdm
Command file: C:\Users\EDUC~1.BRU\AppData\Local\Temp\whlmtemp.hlm
Output file: C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg
Maximum number of level-1 units = 4835; maximum number of level-2 units = 277; maximum iterations = 100
Method of estimation: full maximum likelihood
Plausible value analysis over BSSSCI01-BSSSCI05; weighting: SCIWGT (normalized) at levels 1 and 2; no precision weighting

Level-1 Model
  BSSSCI01 = B0 + B1*(GIRL) + B2*(SES) + B3*(INTENJ) + B4*(NEGSCI) + B5*(BULLY) + B6*(PARENT) + r

Level-2 Model
  B0 = G00 + G01*(TEAEXP) + G02*(TEAEDU) + G03*(SCIHRS) + G04*(ECDISA) + G05*(SCIMAJ) + G06*(TECOOP)
       + G07*(TELIMI) + G08*(SCIPED) + G09*(EXPECT) + G010*(TESUPP) + G011*(SCHCLI) + G012*(SCHPAR) + u0
  B1 = G10;  B2 = G20;  B3 = G30;  B4 = G40;  B5 = G50;  B6 = G60

Mixed Model
  BSSSCI01 = G00 + G01*TEAEXP + G02*TEAEDU + G03*SCIHRS + G04*ECDISA + G05*SCIMAJ + G06*TECOOP
             + G07*TELIMI + G08*SCIPED + G09*EXPECT + G010*TESUPP + G011*SCHCLI + G012*SCHPAR
             + G10*GIRL + G20*SES + G30*INTENJ + G40*NEGSCI + G50*BULLY + G60*PARENT + u0 + r

Averaged results for this plausible value run (outcomes BSSSCI01-BSSSCI05)
  sigma^2 = 1778.83145 (standard error 47.63600)
  tau (INTRCPT1, B0) = 3931.53001 (standard error 356.92364)
  Reliability estimate of INTRCPT1, B0 = 0.975

Final estimation of fixed effects (model-based standard errors on the left, robust standard errors on the right)
  Fixed effect        Coefficient     SE          T-ratio  d.f.   P-value |  Robust SE    T-ratio  d.f.   P-value
  INTRCPT2, G00        766.791812   73.882193    10.379    264    0.000   |  75.222638    10.194    264    0.000
  TEAEXP, G01            0.441284    0.464262     0.951    264    0.343   |   0.403105     1.095    264    0.275
  TEAEDU, G02           11.246833    8.909605     1.262    264    0.208   |   9.069377     1.240    264    0.216
  SCIHRS, G03            3.479631    3.264933     1.066    264    0.288   |   3.215632     1.082    264    0.280
  ECDISA, G04          -29.477039    5.771593    -5.107    264    0.000   |   5.999843    -4.913    264    0.000
  SCIMAJ, G05          -16.378815   18.795340    -0.871    264    0.384   |  17.178215    -0.953    264    0.341
  TECOOP, G06           -7.652809    7.080278    -1.081    264    0.281   |   7.683957    -0.996    264    0.320
  TELIMI, G07          -60.590019    9.650195    -6.279    264    0.000   |   9.705447    -6.243    264    0.000
  SCIPED, G08            4.276034    8.891730     0.481    264    0.631   |   9.714719     0.440    264    0.660
  EXPECT, G09           27.907848    6.928773     4.028    264    0.000   |   7.221691     3.864    264    0.000
  TESUPP, G010         -15.700658    6.824568    -2.301    264    0.022   |   6.575098    -2.388    264    0.018
  SCHCLI, G011         -17.436689    8.506720    -2.050    264    0.041   |   7.774794    -2.243    264    0.026
  SCHPAR, G012           1.971958    7.332997     0.269    264    0.788   |   8.123662     0.243    264    0.808
  GIRL slope, G10      -14.044347    2.425032    -5.791      9    0.000   |   2.399595    -5.853      9    0.000
  SES slope, G20         1.876755    0.326981     5.740    128    0.000   |   0.356702     5.261    181    0.000
  INTENJ slope, G30      5.830833    1.985292     2.937     12    0.012   |   2.060575     2.830     14    0.013
  NEGSCI slope, G40    -22.880320    1.290969   -17.723     56    0.000   |   1.439539   -15.894     87    0.000
  BULLY slope, G50       4.125330    1.694033     2.435     11    0.033   |   1.746127     2.363     12    0.036
  PARENT slope, G60     -1.615503    0.938409    -1.722     40    0.093   |   1.005936    -1.606     53    0.114

Final estimation of variance components
  Random effect      Std. Dev.    Variance      d.f.   Chi-square      P-value
  INTRCPT1, u0        62.70191    3931.53001     264    10528.29312     0.000
  level-1, r          42.17620    1778.83145
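Each run above reports results averaged over the five plausible values BSSSCI01 through BSSSCI05. As a sketch of the standard combining rules for plausible values (Rubin-style multiple-imputation formulas, which plausible-value routines of this kind follow in essence; the exact formulas HLM applies are documented in its manual), with M = 5:

$$
\bar{\theta} = \frac{1}{M}\sum_{m=1}^{M}\hat{\theta}_m, \qquad
V(\bar{\theta}) = \bar{U} + \Bigl(1+\tfrac{1}{M}\Bigr)B, \qquad
\bar{U} = \frac{1}{M}\sum_{m=1}^{M}U_m, \qquad
B = \frac{1}{M-1}\sum_{m=1}^{M}\bigl(\hat{\theta}_m-\bar{\theta}\bigr)^2,
$$

where $\hat{\theta}_m$ is an estimate from the run on plausible value m and $U_m$ is its sampling variance.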
-------------------------------------------------------------------------------
HLM2 run: USA, unconditional (intercept-only) model
Module: HLM2.EXE (7.01.21202.1001); Date: 30 June 2014 (Monday), 13:09:30
-------------------------------------------------------------------------------
Problem title: no title
Data source: USA
Command file: C:\Users\EDUC~1.BRU\AppData\Local\Temp\whlmtemp.hlm
Output file: C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg
Maximum number of level-1 units = 4293; maximum number of level-2 units = 424; maximum iterations = 100
Method of estimation: full maximum likelihood
Plausible value analysis over BSSSCI01-BSSSCI05; weighting: SCIWGT (normalized) at levels 1 and 2; no precision weighting

Level-1 Model:  BSSSCI01 = B0 + r
Level-2 Model:  B0 = G00 + u0
Mixed Model:    BSSSCI01 = G00 + u0 + r

Averaged results for this plausible value run
  sigma^2 = 3273.62864 (standard error 85.89667)
  tau (INTRCPT1, B0) = 2763.59137 (standard error 229.26274)
  Reliability estimate of INTRCPT1, B0 = 0.829

Final estimation of fixed effects
  Fixed effect                      Coefficient    SE         T-ratio   d.f.   P-value
  INTRCPT2, G00 (model-based SE)     530.603533    2.981983   177.936    311    0.000
  INTRCPT2, G00 (robust SE)          530.603533    3.030834   175.068    332    0.000

Final estimation of variance components
  Random effect      Std. Dev.    Variance      d.f.   Chi-square     P-value
  INTRCPT1, u0        52.56987    2763.59137     423    4095.80214     0.000
  level-1, r          57.21563    3273.62864
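The same intraclass-correlation calculation as for Singapore gives a markedly smaller between-classroom share for the United States:

$$
\rho = \frac{2763.59}{2763.59 + 3273.63} \approx 0.458,
$$

so about 46% of the variance in science achievement lies between level-2 units in the U.S. sample, compared with roughly 76% in Singapore.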
-------------------------------------------------------------------------------
HLM2 run: USA, SES-only model
Module: HLM2.EXE (7.01.21202.1001); Date: 30 June 2014 (Monday), 13:08:37
-------------------------------------------------------------------------------
Problem title: no title
Data source: USA
Command file: C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\USA SES.hlm
Output file: C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg
Maximum number of level-1 units = 4293; maximum number of level-2 units = 424; maximum iterations = 100
Method of estimation: full maximum likelihood
Plausible value analysis over BSSSCI01-BSSSCI05; weighting: SCIWGT (normalized) at levels 1 and 2; no precision weighting

Level-1 Model:  BSSSCI01 = B0 + B1*(SES) + r
Level-2 Model:  B0 = G00 + G01*(ECDISA) + u0;  B1 = G10
Mixed Model:    BSSSCI01 = G00 + G01*ECDISA + G10*SES + u0 + r

Averaged results for this plausible value run
  sigma^2 = 3142.07398 (standard error 85.72416)
  tau (INTRCPT1, B0) = 1819.81571 (standard error 160.76977)
  Reliability estimate of INTRCPT1, B0 = 0.778

Final estimation of fixed effects (model-based standard errors on the left, robust standard errors on the right)
  Fixed effect       Coefficient     SE         T-ratio  d.f.  P-value |  Robust SE   T-ratio  d.f.  P-value
  INTRCPT2, G00       576.382569    7.115401    81.005    422   0.000  |   7.746060    74.410   422   0.000
  ECDISA, G01         -15.405340    2.226698    -6.918    422   0.000  |   2.435081    -6.326   422   0.000
  SES slope, G10        6.493992    0.538833    12.052     24   0.000  |   0.568105    11.431    30   0.000

Final estimation of variance components
  Random effect      Std. Dev.    Variance      d.f.   Chi-square     P-value
  INTRCPT1, u0        42.65930    1819.81571     422    2833.38380     0.000
  level-1, r          56.05421    3142.07398
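Substituting the estimated coefficients into the mixed model (and ignoring the random effects) turns this run into a single prediction equation:

$$
\widehat{\text{BSSSCI}}_{ij} \approx 576.38 \;-\; 15.41\,\text{ECDISA}_j \;+\; 6.49\,\text{SES}_{ij},
$$

so, holding school-level economic disadvantage constant, each additional point on the student SES index is associated with roughly 6.5 additional points of predicted science achievement in the U.S. sample.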
-------------------------------------------------------------------------------
HLM2 run: USA, full two-level model
Module: HLM2.EXE (7.01.21202.1001); Date: 7 July 2014 (Monday), 11:02:02
-------------------------------------------------------------------------------
Problem title: no title
Data source: USA
Command file: C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\USA Full FE.hlm
Output file: C:\Users\educ.brunerju\Google Drive\Dissertation\HLM Stuff\Datasets\hlm2.avg
Maximum number of level-1 units = 4293; maximum number of level-2 units = 424; maximum iterations = 100
Method of estimation: full maximum likelihood
Plausible value analysis over BSSSCI01-BSSSCI05; weighting: SCIWGT (normalized) at levels 1 and 2; no precision weighting

Level-1 Model
  BSSSCI01 = B0 + B1*(GIRL) + B2*(SES) + B3*(INTENJ) + B4*(NEGSCI) + B5*(BULLY) + B6*(PARENT) + r

Level-2 Model
  B0 = G00 + G01*(TEAEXP) + G02*(TEAEDU) + G03*(SCIHRS) + G04*(ECDISA) + G05*(URBAN) + G06*(SCIMAJ)
       + G07*(TECOOP) + G08*(TELIMI) + G09*(SCIPED) + G010*(EXPECT) + G011*(TESUPP) + G012*(SCHCLI)
       + G013*(SCHPAR) + u0
  B1 = G10;  B2 = G20;  B3 = G30;  B4 = G40;  B5 = G50;  B6 = G60

Mixed Model
  BSSSCI01 = G00 + G01*TEAEXP + G02*TEAEDU + G03*SCIHRS + G04*ECDISA + G05*URBAN + G06*SCIMAJ
             + G07*TECOOP + G08*TELIMI + G09*SCIPED + G010*EXPECT + G011*TESUPP + G012*SCHCLI
             + G013*SCHPAR + G10*GIRL + G20*SES + G30*INTENJ + G40*NEGSCI + G50*BULLY + G60*PARENT
             + u0 + r

Averaged results for this plausible value run (outcomes BSSSCI01-BSSSCI05)
  sigma^2 = 2794.74552 (standard error 90.88238)
  tau (INTRCPT1, B0) = 1558.81030 (standard error 144.01110)
  Reliability estimate of INTRCPT1, B0 = 0.772

Final estimation of fixed effects (model-based standard errors on the left, robust standard errors on the right)
  Fixed effect        Coefficient     SE          T-ratio  d.f.   P-value |  Robust SE    T-ratio  d.f.   P-value
  INTRCPT2, G00        613.774381   47.499979    12.922    410    0.000   |  51.257409    11.974    410    0.000
  TEAEXP, G01            0.487220    0.257046     1.895    410    0.059   |   0.272510     1.788    410    0.075
  TEAEDU, G02            0.246797    4.775731     0.052    410    0.959   |   4.994943     0.049    410    0.961
  SCIHRS, G03           -1.282413    1.370459    -0.936    175    0.351   |   1.351118    -0.949    165    0.344
  ECDISA, G04          -10.551848    2.615848    -4.034    410    0.000   |   2.942921    -3.586    410    0.000
  URBAN, G05            -1.373566    2.152438    -0.638    410    0.524   |   2.278231    -0.603    410    0.547
  SCIMAJ, G06            5.257745    5.451392     0.964    410    0.335   |   5.897769     0.891    410    0.373
  TECOOP, G07            1.209151    3.003473     0.403    410    0.687   |   3.161753     0.382    410    0.702
  TELIMI, G08           -9.592458    6.536746    -1.467    410    0.143   |   7.436414    -1.290    410    0.198
  SCIPED, G09            0.234517    4.550110     0.052    255    0.959   |   4.882622     0.048    338    0.962
  EXPECT, G010           7.674429    4.472945     1.716    410    0.087   |   4.750119     1.616    410    0.107
  TESUPP, G011           0.328498    5.868724     0.056    410    0.955   |   6.476060     0.051    410    0.960
  SCHCLI, G012          -3.042320    4.148908    -0.733    410    0.464   |   4.157447    -0.732    410    0.465
  SCHPAR, G013          -5.149188    4.615936    -1.116    410    0.265   |   4.323795    -1.191    410    0.234
  GIRL slope, G10      -13.167173    1.935358    -6.803     83    0.000   |   2.165100    -6.082    131    0.000
  SES slope, G20         5.528931    0.523309    10.565     23    0.000   |   0.566219     9.765     32    0.000
  INTENJ slope, G30      7.547119    1.573355     4.797    126    0.000   |   2.022223     3.732    345    0.000
  NEGSCI slope, G40    -18.150088    1.609204   -11.279     45    0.000   |   1.875876    -9.676     84    0.000
  BULLY slope, G50       3.997272    1.740579     2.297     63    0.025   |   1.757412     2.275     65    0.026
  PARENT slope, G60     -6.694493    1.236020    -5.416     55    0.000   |   1.280673    -5.227     63    0.000

Final estimation of variance components
  Random effect      Std. Dev.    Variance      d.f.   Chi-square     P-value
  INTRCPT1, u0        39.48177    1558.81030     410    2744.67784     0.000
  level-1, r          52.86535    2794.74552


Logistic Regressions Output

      name:
       log:  C:\Users\educ.brunerju\Google Drive\Dissertation\Outputs\Logistic Regressions\Binary Regressions\Full Binary Regressions V2.smcl
  log type:  smcl
 opened on:  17 Jul 2014, 13:48:21

. do "C:\Users\EDUC~1.BRU\AppData\Local\Temp\STD00000000.tmp"

. *Run regression with variables as a block*
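The jackknifed logistic regressions reproduced below all take the same general form. As a reading aid, for a model with predictors $x_1,\dots,x_k$, Stata's logistic command reports $e^{\beta}$ in the Odds Ratio column:

$$
\log\!\frac{\Pr(\text{lowach}_i = 1)}{1-\Pr(\text{lowach}_i = 1)} = \beta_0 + \beta_1 x_{1i} + \dots + \beta_k x_{ki},
\qquad \text{Odds Ratio}_k = e^{\beta_k},
$$

so an odds ratio above 1 indicates higher odds of being a low achiever and an odds ratio below 1 indicates lower odds, with standard errors computed from the survey jackknife replication scheme specified for these data.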
. svy jackknife, subpop(if idcntry==152) : logistic lowach lowses girl hi_perintenj low_pernegsci
>     low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect
(running logistic on estimation sample)
Note: Some subpopulation observations were dropped during estimation. This is most likely because
      of missing values in the model variables. If there are insufficient observations to compute
      jackknife standard errors, consider changing the subpopulation to exclude these observations.
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 43612       Subpop. no. of obs = 3629
Number of PSUs   = 150    Population size = 4566498.3   Subpop. size       = 163623.79
Replications     = 150    Design df       = 75          F(11, 65) = 22.85;  Prob > F = 0.0000

  lowach          Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses            2.603782              .2815525    8.85   0.000    2.099196   3.229654
  girl              1.597955              .1418904    5.28   0.000    1.338884   1.907157
  hi_perintenj      .9979255              .1089499   -0.02   0.985    .8028662   1.240375
  low_pernegsci     .4259289              .0429848   -8.46   0.000    .3483576   .5207736
  low_perecdisa     .4711922              .0859693   -4.12   0.000    .3276041   .6777147
  low_perschcli      .730163              .1245166   -1.84   0.069    .5198547   1.025552
  hi_perteaexp      .9899339              .1515513   -0.07   0.947    .7297212   1.342936
  scimaj            .7761666              .1212908   -1.62   0.109    .5685353   1.059626
  low_pertelimi     .8337302              .1434053   -1.06   0.294    .5918536   1.174456
  hi_persciped      1.023555              .1585893    0.15   0.881    .7517329   1.393666
  hi_perexpect      .5675416              .1155262   -2.78   0.007    .3783466    .851345
  _cons             .7977532               .141728   -1.27   0.207    .5599707   1.136506

. svy jackknife, subpop(if idcntry==246) : logistic lowach lowses girl hi_perintenj low_pernegsci
>     low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect
(running logistic on estimation sample; some subpopulation observations dropped because of missing
values, as noted above)
Jackknife replications (150): completed; some replications flagged "ss".

Survey: Logistic regression
Number of strata = 71     Number of obs   = 41022       Subpop. no. of obs = 6690
Number of PSUs   = 142    Population size = 4401996.4   Subpop. size       = 42342.586
Replications     = 142    Design df       = 71          F(11, 61) = 31.59;  Prob > F = 0.0000

  lowach          Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses            2.720005               .251612   10.82   0.000    2.261856   3.270955
  girl              .9133711              .0741693   -1.12   0.268    .7770842   1.073906
  hi_perintenj       .727745              .0732305   -3.16   0.002    .5954437   .8894422
  low_pernegsci     .3803604              .0471107   -7.80   0.000    .2971251   .4869129
  low_perecdisa     .9073529                .11484   -0.77   0.445    .7049776   1.167823
  low_perschcli     .9761785              .1384062   -0.17   0.865    .7357842   1.295114
  hi_perteaexp       1.03161              .0990014    0.32   0.747     .851945   1.249165
  scimaj            .7791531              .1011828   -1.92   0.059    .6014053   1.009435
  low_pertelimi     .7768338               .063142   -3.10   0.003    .6608526   .9137589
  hi_persciped      .8633855              .0844931   -1.50   0.138    .7103296   1.049421
  hi_perexpect      .8156947              .0801899   -2.07   0.042    .6704963   .9923364
  _cons             .7487175              .1387112   -1.56   0.123    .5174711   1.083303
Note: 4 strata omitted because they contain no subpopulation members.

. svy jackknife, subpop(if idcntry==288) : logistic lowach lowses girl hi_perintenj low_pernegsci
>     low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 42828       Subpop. no. of obs = 4936
Number of PSUs   = 150    Population size = 4516856.5   Subpop. size       = 258356.57
Replications     = 150    Design df       = 75          F(11, 65) = 31.24;  Prob > F = 0.0000

  lowach          Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses            1.086542              .1342734    0.67   0.504    .8494375    1.38983
  girl              1.738085              .1396937    6.88   0.000    1.480936   2.039886
  hi_perintenj      .6170558              .0730674   -4.08   0.000    .4873919   .7812149
  low_pernegsci     .2275501              .0271079  -12.43   0.000     .179478   .2884981
  low_perecdisa     .5273216              .1382659   -2.44   0.017    .3127716   .8890451
  low_perschcli     .4958596              .0942739   -3.69   0.000    .3395258   .7241769
  hi_perteaexp      .7057209              .1440102   -1.71   0.092     .469989   1.059689
  scimaj            1.034431              .1955678    0.18   0.858    .7098002   1.507534
  low_pertelimi     .7263783              .1792108   -1.30   0.199    .4443369   1.187445
  hi_persciped      .8946399              .1883306   -0.53   0.598    .5881982   1.360733
  hi_perexpect      .7866539              .1984781   -0.95   0.345    .4758808   1.300377
  _cons             1.090972              .2026627    0.47   0.641     .753527   1.579533

. svy jackknife, subpop(if idcntry==410) : logistic lowach lowses girl hi_perintenj low_pernegsci
>     low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed; some replications flagged "ss".

Survey: Logistic regression
Number of strata = 74     Number of obs   = 44406       Subpop. no. of obs = 5320
Number of PSUs   = 148    Population size = 4518328.4   Subpop. size       = 551142.35
Replications     = 148    Design df       = 74          F(11, 64) = 70.31;  Prob > F = 0.0000

  lowach          Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses            2.875947              .2165541   14.03   0.000    2.475263   3.341492
  girl                1.1401              .0836126    1.79   0.078    .9850995    1.31949
  hi_perintenj       .449755                 .0487   -7.38   0.000    .3624724   .5580552
  low_pernegsci     .3059572              .0272073  -13.32   0.000    .2562768   .3652684
  low_perecdisa     .7018147               .074654   -3.33   0.001    .5677703   .8675055
  low_perschcli     .8256427              .0774189   -2.04   0.045    .6849359   .9952549
  hi_perteaexp      .8776363              .1113799   -1.03   0.307    .6815439   1.130148
  scimaj            .7282082              .2750007   -0.84   0.404    .3431345   1.545421
  low_pertelimi      1.07575              .1093506    0.72   0.475    .8785125    1.31727
  hi_persciped       .981042               .103076   -0.18   0.856    .7957324   1.209506
  hi_perexpect      .7587286              .0658769   -3.18   0.002     .638193   .9020297
  _cons             .9575639              .3812858   -0.11   0.914    .4331115   2.117073
Note: 1 stratum omitted because it contains no subpopulation members.

. svy jackknife, subpop(if idcntry==702) : logistic lowach lowses girl hi_perintenj low_pernegsci
>     low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 45045       Subpop. no. of obs = 5154
Number of PSUs   = 150    Population size = 4647811.7   Subpop. size       = 43962.354
Replications     = 150    Design df       = 75          F(11, 65) = 23.12;  Prob > F = 0.0000

  lowach          Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses            2.401415              .2109231    9.97   0.000    2.015941   2.860596
  girl              1.127129              .0946513    1.43   0.158    .9535016   1.332373
  hi_perintenj      1.002943              .1124445    0.03   0.979    .8021935   1.253929
  low_pernegsci     .4947989              .0536055   -6.49   0.000    .3987486   .6139858
  low_perecdisa     .3804092              .0912717   -4.03   0.000    .2358702   .6135203
  low_perschcli     .8159184              .3657174   -0.45   0.651     .334086   1.992669
  hi_perteaexp      .9219633              .2165158   -0.35   0.730    .5774812   1.471938
  scimaj            1.277152              .7237448    0.43   0.667    .4130175   3.949267
  low_pertelimi     .2949365              .0715844   -5.03   0.000    .1818632   .4783131
  hi_persciped      .7821426               .188289   -1.02   0.311    .4841858   1.263455
  hi_perexpect      .6283273              .1535767   -1.90   0.061    .3861202   1.022467
  _cons             .9561939              .6766593   -0.06   0.950    .2335115   3.915468

. svy jackknife, subpop(if idcntry==840) : logistic lowach lowses girl hi_perintenj low_pernegsci
>     low_perecdisa low_perschcli hi_perteaexp scimaj low_pertelimi hi_persciped hi_perexpect
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 40239       Subpop. no. of obs = 4803
Number of PSUs   = 150    Population size = 2871865.5   Subpop. size       = 1485985.4
Replications     = 150    Design df       = 75          F(11, 65) = 27.89;  Prob > F = 0.0000

  lowach          Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses            2.620721              .2153426   11.73   0.000    2.225007   3.086813
  girl              1.170221              .0743233    2.47   0.016    1.031145   1.328055
  hi_perintenj      .8075818              .0795272   -2.17   0.033    .6637268   .9826158
  low_pernegsci     .4404034              .0395318   -9.14   0.000    .3682915   .5266351
  low_perecdisa     .3986233              .0837989   -4.38   0.000    .2622336   .6059502
  low_perschcli      .852761              .1761658   -0.77   0.443    .5650675   1.286928
  hi_perteaexp        .90675              .1334905   -0.66   0.508    .6762698    1.21578
  scimaj            .8548171              .1308181   -1.03   0.309    .6301913   1.159509
  low_pertelimi     .9782454              .1291845   -0.17   0.868    .7519643   1.272619
  hi_persciped       .878746              .1376686   -0.83   0.412    .6431664   1.200614
  hi_perexpect      .8986847              .1616839   -0.59   0.554    .6279935   1.286055
  _cons             .6501179              .1391304   -2.01   0.048    .4244643   .9957333

end of do-file

. log close
      name:
       log:  C:\Users\educ.brunerju\Google Drive\Dissertation\Outputs\Logistic Regressions\Binary Regressions\Full Binary Regressions V2.smcl
  log type:  smcl
 closed on:  17 Jul 2014, 13:58:17

      name:
       log:  C:\Users\educ.brunerju\Google Drive\Dissertation\Outputs\Logistic Regressions\New Regressions Richard\Binary Regressions.smcl
  log type:  smcl
 opened on:  9 Jul 2014, 15:20:52

. do "C:\Users\EDUC~1.BRU\AppData\Local\Temp\STD00000000.tmp"

. ***Binary Regressions Analysis***
. *Run Binary Regression Cleaning File First*
. set more off
. sort idcntry

. *Put dataset in survey mode for teacher descriptives*
. svyset jkrep [pweight=sciwgt], strata(jkzone) vce(jackknife) mse singleunit(missing)

      pweight: sciwgt
          VCE: jackknife
          MSE: on
  Single unit: missing
     Strata 1: jkzone
         SU 1: jkrep
        FPC 1:

. *Run Regressions*

. svy jackknife, subpop(if idcntry==152) : logistic lowach lowses girl sesx_girl
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 45286       Subpop. no. of obs = 5303
Number of PSUs   = 150    Population size = 4630370     Subpop. size       = 227495.41
Replications     = 150    Design df       = 75          F(3, 73) = 65.03;  Prob > F = 0.0000

  lowach          Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses            2.807334              .3160159    9.17   0.000    2.243391   3.513042
  girl              1.292482              .1326739    2.50   0.015    1.053454   1.585746
  sesx_girl         1.389321              .2206036    2.07   0.042    1.012577   1.906238
  _cons             .3028317              .0265791  -13.61   0.000    .2542538   .3606908

. svy jackknife, subpop(if idcntry==246) : logistic lowach lowses girl sesx_girl
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.
Survey: Logistic regression
Number of strata = 75     Number of obs   = 45492       Subpop. no. of obs = 9308
Number of PSUs   = 150    Population size = 4652258.5   Subpop. size       = 56103.064
Replications     = 150    Design df       = 75          F(3, 73) = 68.75;  Prob > F = 0.0000

  lowach          Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses            2.701089              .2635933   10.18   0.000    2.223873    3.28071
  girl              .9334776              .0769163   -0.84   0.406    .7921672   1.099996
  sesx_girl         1.254234              .1919049    1.48   0.143    .9247072    1.70119
  _cons             .3323125              .0286246  -12.79   0.000    .2799137   .3945202

. svy jackknife, subpop(if idcntry==288) : logistic lowach lowses girl sesx_girl
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 44723       Subpop. no. of obs = 6831
Number of PSUs   = 150    Population size = 4601232.4   Subpop. size       = 342732.48
Replications     = 150    Design df       = 75          F(3, 73) = 27.30;  Prob > F = 0.0000

  lowach          Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses            1.307595              .1649361    2.13   0.037    1.017056   1.681132
  girl              1.611611              .1449829    5.30   0.000    1.347191    1.92793
  sesx_girl         1.017277              .1546809    0.11   0.911    .7514302   1.377178
  _cons             .3380947               .031134  -11.78   0.000    .2814289   .4061701

. svy jackknife, subpop(if idcntry==410) : logistic lowach lowses girl sesx_girl
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 45778       Subpop. no. of obs = 6074
Number of PSUs   = 150    Population size = 4649937.4   Subpop. size       = 626925.12
Replications     = 150    Design df       = 75          F(3, 73) = 122.81;  Prob > F = 0.0000

  lowach          Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses            2.774208               .273179   10.36   0.000    2.280059   3.375453
  girl              1.094915              .0999507    0.99   0.324    .9128583   1.313281
  sesx_girl         1.327114              .1697954    2.21   0.030     1.02853   1.712378
  _cons             .3088161               .020479  -17.72   0.000    .2705997   .3524297

. svy jackknife, subpop(if idcntry==702) : logistic lowach lowses girl sesx_girl
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 45790       Subpop. no. of obs = 5899
Number of PSUs   = 150    Population size = 4653827     Subpop. size       = 49977.699
Replications     = 150    Design df       = 75          F(3, 73) = 66.37;  Prob > F = 0.0000

  lowach          Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses            2.883582              .2861411   10.67   0.000    2.366364   3.513848
  girl              .9553725              .0829726   -0.53   0.601    .8035909   1.135822
  sesx_girl         1.289549              .1382634    2.37   0.020    1.041542    1.59661
  _cons             .3120784              .0315778  -11.51   0.000     .255107   .3817729

. svy jackknife, subpop(if idcntry==840) : logistic lowach lowses girl sesx_girl
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 45605       Subpop. no. of obs = 10169
Number of PSUs   = 150    Population size = 4589976.3   Subpop. size       = 3204096.2
Replications     = 150    Design df       = 75          F(3, 73) = 124.49;  Prob > F = 0.0000

  lowach          Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses            2.576002              .2108746   11.56   0.000    2.188382   3.032279
  girl              1.042792              .0771143    0.57   0.573    .8999524   1.208304
  sesx_girl         1.559577               .174159    3.98   0.000    1.248516   1.948138
  _cons              .303161              .0196482  -18.41   0.000    .2664412   .3449412

. svy jackknife, subpop(if idcntry==152) : logistic lowach lowses hi_perintenj sesx_hi_perintenj
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 45151       Subpop. no. of obs = 5168
Number of PSUs   = 150    Population size = 4624086.7   Subpop. size       = 221212.15
Replications     = 150    Design df       = 75          F(3, 73) = 65.54;  Prob > F = 0.0000

  lowach               Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses                 3.014108              .2813808   11.82   0.000    2.502605   3.630157
  hi_perintenj            .693664              .0956137   -2.65   0.010    .5271047   .9128544
  sesx_hi_perintenj      1.347166              .2430472    1.65   0.103    .9404475   1.929779
  _cons                   .380698              .0288981  -12.72   0.000    .3272713   .4428465

. svy jackknife, subpop(if idcntry==246) : logistic lowach lowses hi_perintenj sesx_hi_perintenj
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.
Survey: Logistic regression
Number of strata = 75     Number of obs   = 45462       Subpop. no. of obs = 9278
Number of PSUs   = 150    Population size = 4652061.3   Subpop. size       = 55905.854
Replications     = 150    Design df       = 75          F(3, 73) = 95.37;  Prob > F = 0.0000

  lowach               Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses                 2.378885              .2203434    9.36   0.000    1.978055   2.860938
  hi_perintenj            .375586              .0406482   -9.05   0.000    .3027448   .4659529
  sesx_hi_perintenj      1.771006              .3124534    3.24   0.002    1.246185   2.516851
  _cons                   .438814              .0343331  -10.53   0.000    .3754826   .5128274

. svy jackknife, subpop(if idcntry==288) : logistic lowach lowses hi_perintenj sesx_hi_perintenj
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 44065       Subpop. no. of obs = 6173
Number of PSUs   = 150    Population size = 4570653.8   Subpop. size       = 312153.83
Replications     = 150    Design df       = 75          F(3, 73) = 19.88;  Prob > F = 0.0000

  lowach               Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses                 1.451359              .1680586    3.22   0.002    1.152377   1.827911
  hi_perintenj           .5482675              .0638153   -5.16   0.000    .4348033   .6913408
  sesx_hi_perintenj      .7722858              .1153684   -1.73   0.088    .5735029   1.039969
  _cons                  .4704068              .0456756   -7.77   0.000    .3876755   .5707933

. svy jackknife, subpop(if idcntry==410) : logistic lowach lowses hi_perintenj sesx_hi_perintenj
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 45758       Subpop. no. of obs = 6054
Number of PSUs   = 150    Population size = 4647766.1   Subpop. size       = 624753.9
Replications     = 150    Design df       = 75          F(3, 73) = 166.75;  Prob > F = 0.0000

  lowach               Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses                 2.704004              .2056273   13.08   0.000    2.323892    3.14629
  hi_perintenj           .2205723              .0247338  -13.48   0.000    .1764155   .2757816
  sesx_hi_perintenj      1.665323              .3251742    2.61   0.011    1.128667   2.457147
  _cons                  .4748904              .0287099  -12.32   0.000    .4210071   .5356701

. svy jackknife, subpop(if idcntry==702) : logistic lowach lowses hi_perintenj sesx_hi_perintenj
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 45755       Subpop. no. of obs = 5864
Number of PSUs   = 150    Population size = 4653526.2   Subpop. size       = 49676.907
Replications     = 150    Design df       = 75          F(3, 73) = 85.34;  Prob > F = 0.0000

  lowach               Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses                 2.943155              .2641904   12.03   0.000    2.461233    3.51944
  hi_perintenj           .6254282              .0612362   -4.79   0.000     .514599   .7601266
  sesx_hi_perintenj      1.268127              .1844268    1.63   0.107    .9491624   1.694279
  _cons                  .3564971              .0329816  -11.15   0.000    .2964934   .4286441

. svy jackknife, subpop(if idcntry==840) : logistic lowach lowses hi_perintenj sesx_hi_perintenj
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 45334       Subpop. no. of obs = 9898
Number of PSUs   = 150    Population size = 4506212.7   Subpop. size       = 3120332.7
Replications     = 150    Design df       = 75          F(3, 73) = 117.87;  Prob > F = 0.0000

  lowach               Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses                 3.059321              .2269503   15.07   0.000    2.639033   3.546544
  hi_perintenj           .5732192              .0449762   -7.09   0.000    .4902732   .6701982
  sesx_hi_perintenj      1.045677              .1034089    0.45   0.653    .8586979    1.27337
  _cons                  .3623811              .0222402  -16.54   0.000    .3206777   .4095079

. svy jackknife, subpop(if idcntry==152) : logistic lowach lowses low_pernegsci sesx_low_pernegsci
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 45160       Subpop. no. of obs = 5177
Number of PSUs   = 150    Population size = 4624425.9   Subpop. size       = 221551.39
Replications     = 150    Design df       = 75          F(3, 73) = 75.33;  Prob > F = 0.0000

  lowach               Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses                  2.95643              .2846546   11.26   0.000    2.440435   3.581524
  low_pernegsci          .3888091              .0460369   -7.98   0.000    .3071124   .4922385
  sesx_low_perneg~i      1.335582              .2195064    1.76   0.082    .9626747   1.852942
  _cons                  .4538909              .0316754  -11.32   0.000    .3949801   .5215883

. svy jackknife, subpop(if idcntry==246) : logistic lowach lowses low_pernegsci sesx_low_pernegsci
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.
Survey: Logistic regression
Number of strata = 75     Number of obs   = 45463       Subpop. no. of obs = 9279
Number of PSUs   = 150    Population size = 4652061.7   Subpop. size       = 55906.247
Replications     = 150    Design df       = 75          F(3, 73) = 124.33;  Prob > F = 0.0000

  lowach               Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses                 2.637948              .2275114   11.25   0.000     2.22152   3.132438
  low_pernegsci          .2929118              .0332805  -10.81   0.000    .2335812   .3673127
  sesx_low_perneg~i      1.088513              .2079959    0.44   0.658    .7439038    1.59276
  _cons                  .4650204              .0364343   -9.77   0.000    .3978202   .5435722

. svy jackknife, subpop(if idcntry==288) : logistic lowach lowses low_pernegsci sesx_low_pernegsci
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 44062       Subpop. no. of obs = 6170
Number of PSUs   = 150    Population size = 4569925.9   Subpop. size       = 311425.92
Replications     = 150    Design df       = 75          F(3, 73) = 60.78;  Prob > F = 0.0000

  lowach               Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses                 1.109083              .1141985    1.01   0.318    .9034028    1.36159
  low_pernegsci          .1377559              .0224464  -12.17   0.000    .0995723    .190582
  sesx_low_perneg~i       2.14004              .3880275    4.20   0.000    1.491259   3.071077
  _cons                  .6515341              .0584048   -4.78   0.000    .5449826   .7789179

. svy jackknife, subpop(if idcntry==410) : logistic lowach lowses low_pernegsci sesx_low_pernegsci
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 45735       Subpop. no. of obs = 6031
Number of PSUs   = 150    Population size = 4645203.1   Subpop. size       = 622190.86
Replications     = 150    Design df       = 75          F(3, 73) = 158.07;  Prob > F = 0.0000

  lowach               Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses                 2.525949              .1985503   11.79   0.000    2.159829   2.954131
  low_pernegsci          .1507524              .0205857  -13.86   0.000    .1148483    .197881
  sesx_low_perneg~i      2.311861              .3986898    4.86   0.000     1.63969   3.259582
  _cons                  .5323614              .0348943   -9.62   0.000    .4671954   .6066169

. svy jackknife, subpop(if idcntry==702) : logistic lowach lowses low_pernegsci sesx_low_pernegsci
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 45737       Subpop. no. of obs = 5846
Number of PSUs   = 150    Population size = 4653393.7   Subpop. size       = 49544.366
Replications     = 150    Design df       = 75          F(3, 73) = 141.11;  Prob > F = 0.0000

  lowach               Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses                 2.933056              .2773301   11.38   0.000    2.429499   3.540984
  low_pernegsci          .4255055              .0422898   -8.60   0.000    .3490755   .5186698
  sesx_low_perneg~i      1.196025              .1688137    1.27   0.209    .9028731   1.584359
  _cons                  .4036441               .037683   -9.72   0.000    .3351429   .4861465

. svy jackknife, subpop(if idcntry==840) : logistic lowach lowses low_pernegsci sesx_low_pernegsci
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.

Survey: Logistic regression
Number of strata = 75     Number of obs   = 45304       Subpop. no. of obs = 9868
Number of PSUs   = 150    Population size = 4496466.7   Subpop. size       = 3110586.7
Replications     = 150    Design df       = 75          F(3, 73) = 124.65;  Prob > F = 0.0000

  lowach               Odds Ratio   Jackknife Std. Err.       t   P>|t|   [95% Conf. Interval]
  lowses                  2.83957              .2011333   14.73   0.000    2.465877   3.269895
  low_pernegsci          .3475592              .0333367  -11.02   0.000    .2871083   .4207381
  sesx_low_perneg~i      1.341682              .1813216    2.17   0.033    1.025009    1.75619
  _cons                  .4156829              .0229253  -15.92   0.000    .3724328   .4639556

. svy jackknife, subpop(if idcntry==152) : logistic lowach lowses low_ecdisa sesx_low_perecdisa
(running logistic on estimation sample)
variable low_ecdisa not found
an error occurred when svy executed logistic
Note: Some subpopulation observations were dropped during estimation. This is most likely because
      of missing values in the model variables. If there are insufficient observations to compute
      jackknife standard errors, consider changing the subpopulation to exclude these observations.
an error occurred when jackknife executed logistic
r(111);

end of do-file
r(111);

. do "C:\Users\EDUC~1.BRU\AppData\Local\Temp\STD00000000.tmp"

. svy jackknife, subpop(if idcntry==152) : logistic lowach lowses low_perecdisa sesx_low_perecdisa
(running logistic on estimation sample; note on dropped subpopulation observations as above)
Jackknife replications (150): completed.
One command referred to a variable name that does not exist in the data (low_ecdisa), so the do-file stopped with error r(111) and was rerun with the corrected name low_perecdisa:

. svy jackknife, subpop(if idcntry==152) : logistic lowach lowses low_ecdisa sesx_low_perecdisa
variable low_ecdisa not found
an error occurred when svy executed logistic
an error occurred when jackknife executed logistic
r(111);

end of do-file
r(111);

. do "C:\Users\EDUC~1.BRU\AppData\Local\Temp\STD00000000.tmp"

Model: lowach on lowses, low_perecdisa, and sesx_low_perecdisa, run for each country with commands of the form

. svy jackknife, subpop(if idcntry==152) : logistic lowach lowses low_perecdisa sesx_low_perecdisa

    Run                  Obs   Population size   Subpop. obs   Subpop. size        F   Prob > F
    idcntry == 152     44499         4602036.6          4516      199162.06    59.70     0.0000
    idcntry == 246     42139         4408096.3          7807      48442.447    60.67     0.0000
    idcntry == 288     44679         4597024.6          6787      338524.64     5.36     0.0022
    idcntry == 410     45639         4634107.3          5935      611095.04    88.65     0.0000
    idcntry == 702     45615           4652316          5724      48466.699    56.85     0.0000
    idcntry == 840     44553         4263609.5          9117      2877729.4   134.65     0.0000

Note (idcntry == 246): 4 strata were omitted because they contain no subpopulation members, leaving 71 strata, 142 PSUs, 142 replications, design df = 71, and a model test of F(3, 69).

Model: lowach on lowses, hi_perurban, and sesx_hi_perurban, run for each country with commands of the form

. svy jackknife, subpop(if idcntry==152) : logistic lowach lowses hi_perurban sesx_hi_perurban

    Run                  Obs   Population size   Subpop. obs   Subpop. size        F   Prob > F
    idcntry == 152     44742         4607090.9          4759       204216.3   170.30     0.0000
    idcntry == 246     44723         4648805.9          8539      52650.456    61.44     0.0000
    idcntry == 288     44704         4599658.2          6812      341158.24     5.36     0.0022
    idcntry == 410     45670         4637455.3          5966      614443.04   293.69     0.0000
    idcntry == 702     45790           4653827          5899      49977.699   182.16     0.0000
    idcntry == 840     44572         4265635.5          9136      2879755.5   119.05     0.0000

Note (idcntry == 152, 410, and 702): hi_perurban and sesx_hi_perurban were omitted because of collinearity, so only lowses and the constant were estimated and the model test is F(1, 75).
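For idcntry == 152, 410, and 702, Stata dropped hi_perurban and sesx_hi_perurban for collinearity, which is what it reports when an indicator is constant (or perfectly predicted) within the estimation subpopulation. A quick check of that explanation is sketched below; the if condition only approximates the estimation sample, so this is illustrative rather than a reproduction of the logged analysis:

    * Does hi_perurban vary among the cases a given country's model can actually use?
    tabulate hi_perurban if idcntry==152 & !missing(lowach, lowses, hi_perurban)
    * A single-category table here would account for the "omitted because of collinearity" notes above.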
Model: lowach on lowses, hi_perschcli, and sesx_hi_perschcli, run for each country with commands of the form

. svy jackknife, subpop(if idcntry==152) : logistic lowach lowses hi_perschcli sesx_hi_perschcli

    Run                  Obs   Population size   Subpop. obs   Subpop. size        F   Prob > F
    idcntry == 152     44725         4606164.5          4742      203289.99    63.49     0.0000
    idcntry == 246     44650         4648453.5          8466      52298.026    69.59     0.0000
    idcntry == 288     44696         4599631.9          6804      341131.97     8.08     0.0001
    idcntry == 410     44913         4568817.6          5827      601631.56   107.65     0.0000
    idcntry == 702     45541           4651739          5650      47889.699    55.75     0.0000
    idcntry == 840     44462         4237273.1          9026        2851393   113.58     0.0000

Note (idcntry == 410): 1 stratum was omitted because it contains no subpopulation members, leaving 74 strata, 148 PSUs, 148 replications, design df = 74, and a model test of F(3, 72).
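Every model in this appendix pairs lowses with a lowses-by-indicator interaction, so the estimated low-SES odds ratio differs between classrooms or schools with and without the flagged condition. For a generic indicator X, the logit model underlying these runs and the resulting odds ratios can be written (a standard identity, stated here for reference) as:

    \log\frac{\Pr(\text{lowach}=1)}{1-\Pr(\text{lowach}=1)}
        = \beta_0 + \beta_1\,\text{lowses} + \beta_2\,X + \beta_3\,(\text{lowses}\times X)

    \mathrm{OR}_{\text{lowses}\,\mid\,X=0} = e^{\beta_1},
    \qquad
    \mathrm{OR}_{\text{lowses}\,\mid\,X=1} = e^{\beta_1+\beta_3}
        = \mathrm{OR}_{\text{lowses}}\times\mathrm{OR}_{\text{sesx}}

In other words, the sesx_ odds ratio scales the lowses odds ratio up or down inside the flagged classrooms or schools.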
Model: lowach on lowses, hi_perteaexp, and sesx_hi_perteaexp, run for each country with commands of the form

. svy jackknife, subpop(if idcntry==152) : logistic lowach lowses hi_perteaexp sesx_hi_perteaexp

    Run                  Obs   Population size   Subpop. obs   Subpop. size        F   Prob > F
    idcntry == 152     44938         4618977.6          4955      216103.04    67.38     0.0000
    idcntry == 246     44990         4649852.9          8806      53697.415    82.00     0.0000
    idcntry == 288     44179           4578332          6287      319832.04     2.23     0.0915
    idcntry == 410     45504         4622297.6          5800       599285.3    99.41     0.0000
    idcntry == 702     45719         4653261.5          5828      49412.199    59.03     0.0000
    idcntry == 840     42589           3641138          7153        2255258    96.27     0.0000
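The same specification is repeated country by country throughout this log. A compact way to script those repeated calls is sketched below; it is illustrative only, using the six idcntry codes that appear in this appendix and the hi_perteaexp model as the example:

    * Run the hi_perteaexp model for each of the six countries in turn (do-file style).
    foreach c of numlist 152 246 288 410 702 840 {
        display as text _newline "---- idcntry == `c' ----"
        svy jackknife, subpop(if idcntry==`c') : ///
            logistic lowach lowses hi_perteaexp sesx_hi_perteaexp
    }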
Model: lowach on lowses, scimaj, and sesx_scimaj, run for each country with commands of the form

. svy jackknife, subpop(if idcntry==152) : logistic lowach lowses scimaj sesx_scimaj

    Run                  Obs   Population size   Subpop. obs   Subpop. size        F   Prob > F
    idcntry == 152     44862         4614798.1          4879      211923.55    65.52     0.0000
    idcntry == 246     44944         4649388.1          8760      53232.631    72.92     0.0000
    idcntry == 288     44261         4584114.2          6369      325614.25     3.27     0.0261
    idcntry == 410     45489         4620267.4          5785      597255.14    92.53     0.0000
    idcntry == 702     45629         4652464.4          5738      48615.067    60.11     0.0000
    idcntry == 840     42413           3578813          6977      2192932.9    93.36     0.0000

Model: lowach on lowses, low_pertelimi, and sesx_low_pertelimi, run for each country with commands of the form

. svy jackknife, subpop(if idcntry==152) : logistic lowach lowses low_pertelimi sesx_low_pertelimi

    Run                  Obs   Population size   Subpop. obs   Subpop. size        F   Prob > F
    idcntry == 152     44881         4616052.2          4898       213177.6    70.00     0.0000
    idcntry == 246     44536         4647312.9          8352      51157.417    82.25     0.0000
    idcntry == 288     44471         4589217.5          6579      330717.55     3.15     0.0299
    idcntry == 410     45426         4614387.1          5722      591374.82    89.84     0.0000
    idcntry == 702     45614           4652496          5723       48646.73    65.02     0.0000
    idcntry == 840     41283           3214984          5847      1829103.9    78.41     0.0000
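Because of the interaction structure noted earlier, the odds ratio for low-SES students inside classrooms where the indicator equals 1 is the product of the lowses and sesx_ odds ratios, and it can be requested directly after any of these estimations. A minimal sketch using the low_pertelimi model's terms (illustrative, not part of the logged output):

    * Odds ratio for low SES when low_pertelimi = 1 (main effect plus interaction),
    * computed from the most recently estimated model.
    lincom lowses + sesx_low_pertelimi, or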
Model: lowach on lowses, hi_persciped, and sesx_hi_persciped, run for each country with commands of the form

. svy jackknife, subpop(if idcntry==152) : logistic lowach lowses hi_persciped sesx_hi_persciped

    Run                  Obs   Population size   Subpop. obs   Subpop. size        F   Prob > F
    idcntry == 152     44934         4618844.4          4951      215969.89    58.64     0.0000
    idcntry == 246     44580         4647707.7          8396      51552.288    68.78     0.0000
    idcntry == 288     44580         4594828.7          6688      336328.76     5.13     0.0028
    idcntry == 410     45403           4613883          5699      590870.74    90.71     0.0000
    idcntry == 702     45610         4652399.5          5719      48550.199    59.14     0.0000
    idcntry == 840     41336         3233471.9          5900      1847591.8    65.27     0.0000
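Reading the same model across six separate country runs is easier when each fit is stored and tabulated side by side. A sketch of one way to do this for two of the hi_persciped runs (the stored-estimate names m152 and m246 are illustrative, not taken from the log):

    * Store two country fits of the hi_persciped model, then compare them in one table.
    svy jackknife, subpop(if idcntry==152) : logistic lowach lowses hi_persciped sesx_hi_persciped
    estimates store m152
    svy jackknife, subpop(if idcntry==246) : logistic lowach lowses hi_persciped sesx_hi_persciped
    estimates store m246
    * Coefficients shown are log-odds; exponentiate them to recover odds ratios.
    estimates table m152 m246, b(%9.3f) se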
50 .................................................. 100 .................................................. 150 Survey: Logistic regression Number of strata = Number of PSUs 75 = 150 Subpop. no. of obs = = 334036.33 Replications = F( 3, Prob > F 73) = Population size = 150 75 = 3.15 = 0.0301 44518 = 4592536.3 6626 Subpop. size Design df Number of obs Jknife * 583 lowach Odds Ratio Std. Err. lowses 1.175978 .1419252 hi_perexpect .5990133 sesx_hi_perexpect _cons t P>t [95% Conf. Interval] 1.34 0.183 .152066 -2.02 0.047 1.266465 .2666617 .4947245 .052871 .9246684 -6.59 0.000 1.49559 .3612489 1.12 0.265 .3998566 .993268 .8325854 1.926449 .6121003 . svy jackknife, subpop(if idcntry==410) : logistic lowach lowses hi_perexpect sesx_hi_perexpect (running logistic on estimation sample) Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 .................................................. 50 .................................................. 100 .................................................. 150 Survey: Logistic regression Number of strata = Number of PSUs 75 = Subpop. no. of obs = 150 = 594787.1 Replications = F( 3, 73) = = Population size = 45470 = 4617799.4 5766 Subpop. size Design df Number of obs 150 75 101.25 584 Prob > F = 0.0000 Jknife * lowach Odds Ratio Std. Err. lowses 2.952218 hi_perexpect .218014 [95% Conf. Interval] 14.66 0.000 .6240339 .0806637 sesx_hi_perexpect _cons t P>t 2.548347 -3.65 0.000 1.291012 .2247305 .365927 .0226507 -16.24 0.000 3.420095 .4823655 .8073096 1.47 0.146 .9127024 .3234757 .4139494 1.826128 . svy jackknife, subpop(if idcntry==702) : logistic lowach lowses hi_perexpect sesx_hi_perexpect (running logistic on estimation sample) Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 .................................................. 50 .................................................. 100 .................................................. 150 Survey: Logistic regression Number of strata = Number of PSUs 75 = Subpop. no. of obs = 150 Number of obs Population size = 45654 = 4652815 5763 585 Subpop. size = 48965.636 Replications = Design df F( 3, 73) Prob > F = 150 75 = 58.99 = 0.0000 Jknife * lowach Odds Ratio Std. Err. lowses 2.809963 .2627616 hi_perexpect .415212 [95% Conf. Interval] 11.05 0.000 .3593755 .0825802 sesx_hi_perexpect _cons t P>t 2.332378 3.385339 -4.45 0.000 .2273764 1.182045 .2138071 0.92 0.358 .046824 -7.79 0.000 .3316689 .5680044 .8244129 1.694818 .5197986 . svy jackknife, subpop(if idcntry==840) : logistic lowach lowses hi_perexpect sesx_hi_perexpect (running logistic on estimation sample) Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. 
Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 .................................................. 50 .................................................. 100 .................................................. 150 Survey: Logistic regression 586 Number of strata = Number of PSUs 75 = Subpop. no. of obs = 150 = 2274091.6 Replications = F( 3, 73) Prob > F = Population size 42588 = 3659971.6 7152 Subpop. size Design df Number of obs 150 = 75 = 117.79 = 0.0000 Jknife * lowach Odds Ratio Std. Err. lowses 2.750879 .1966072 hi_perexpect [95% Conf. Interval] 14.16 0.000 .4212059 .0645244 sesx_hi_perexpect _cons t P>t 2.385822 3.171795 -5.64 0.000 1.391013 .1963474 .3104281 2.34 0.022 .4087686 .0332076 -11.01 0.000 .3476912 .5715154 1.050051 1.842689 .4805752 end of do-file . log close name: log: C:\Users\educ.brunerju\Google Drive\Dissertation\Outputs\Logistic Regressions\Binary Regressions\Binary Regressions.smcl log type: smcl opened on: 10 Jul 2014, 09:02:21 . do "C:\Users\EDUC~1.BRU\AppData\Local\Temp\STD00000000.tmp" 587 . svy jackknife, subpop(if idcntry==152) : logistic lowach lowses low_perschcli sesx_low_perschcli (running logistic on estimation sample) Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 .................................................. 50 .................................................. 100 .................................................. 150 Survey: Logistic regression Number of strata = Number of PSUs 75 = Subpop. no. of obs = 150 = 203289.99 Replications = F( 3, Prob > F 73) = Population size = 44725 = 4606164.5 4742 Subpop. size Design df Number of obs 150 75 = 55.29 = 0.0000 Jknife * lowach Odds Ratio Std. Err. t P>t [95% Conf. Interval] 588 lowses 2.543157 .2972424 low_perschcli .4210295 .0724031 sesx_low_persch~i _cons 7.99 0.000 -5.03 0.000 1.588444 .3064666 .5420786 .0553124 2.014903 -6.00 0.000 3.209907 .2989056 2.40 0.019 .5930497 1.081564 .4423679 2.332875 .6642645 . svy jackknife, subpop(if idcntry==246) : logistic lowach lowses low_perschcli sesx_low_perschcli (running logistic on estimation sample) Note: Some subpopulation observations were dropped during estimation. This is most likely because of missing values in the model variables. If there are insufficient observations to compute jackknife standard errors, consider changing the subpopulation to exclude these observations. Jackknife replications (150) 1 ---+--- 2 ---+--- 3 ---+--- 4 ---+--- 5 .................................................. 50 .................................................. 100 .................................................. 150 Survey: Logistic regression Number of strata = Number of PSUs 75 = Subpop. no. of obs = 150 = 52298.026 Replications = F( 3, Prob > F 73) = Population size = 150 75 = 64.47 = 0.0000 44650 = 4648453.5 8466 Subpop. size Design df Number of obs 589 Jknife * lowach Odds Ratio Std. Err. lowses 2.990508 .3555304 low_perschcli [95% Conf. Interval] 9.21 0.000 .8529412 .1343413 sesx_low_persch~i _cons t P>t -1.01 0.316 .9904726 .1571593 .3589668 .0443733 2.359877 -8.29 0.000 3.789662 .6232373 -0.06 0.952 1.167306 .7220492 .2806135 1.358683 .4591981 . 
. svy jackknife, subpop(if idcntry==288) : logistic lowach lowses low_perschcli sesx_low_perschcli
(running logistic on estimation sample)

Note: Some subpopulation observations were dropped during estimation. This is most likely because
of missing values in the model variables. If there are insufficient observations to compute
jackknife standard errors, consider changing the subpopulation to exclude these observations.

Jackknife replications (150)

Survey: Logistic regression

  Number of strata =  75     Number of obs   =     44696     Subpop. no. obs =      6804
  Number of PSUs   = 150     Population size = 4599631.9     Subpop. size    = 341131.97
  Replications     = 150     Design df       =        75     F(3, 73) =  10.81     Prob > F = 0.0000

               lowach   Odds Ratio   Jknife Std. Err.       t   P>|t|   [95% Conf. Interval]
               lowses     .9854208           .1273777   -0.11   0.910     .7617094   1.274835
        low_perschcli     .3822943           .0722157   -5.09   0.000     .2624029    .556964
   sesx_low_perschcli     1.626564           .3013768    2.63   0.010      1.12453   2.352727
                _cons     .6847036           .0740442   -3.50   0.001     .5520062   .8493004

. svy jackknife, subpop(if idcntry==410) : logistic lowach lowses low_perschcli sesx_low_perschcli
(running logistic on estimation sample)

Note: Some subpopulation observations were dropped during estimation. This is most likely because
of missing values in the model variables. If there are insufficient observations to compute
jackknife standard errors, consider changing the subpopulation to exclude these observations.

Jackknife replications (150)

Survey: Logistic regression

  Number of strata =  74     Number of obs   =     44913     Subpop. no. obs =      5827
  Number of PSUs   = 148     Population size = 4568817.6     Subpop. size    = 601631.56
  Replications     = 148     Design df       =        74     F(3, 72) = 102.42     Prob > F = 0.0000

               lowach   Odds Ratio   Jknife Std. Err.       t   P>|t|   [95% Conf. Interval]
               lowses     3.322545           .2814234   14.18   0.000     2.806563   3.933391
        low_perschcli     .7950458           .0950073   -1.92   0.059     .6265901    1.00879
   sesx_low_perschcli     .9342511           .1456985   -0.44   0.664     .6847157   1.274726
                _cons     .3558926           .0272727  -13.48   0.000     .3054961   .4146029

Note: 1 stratum omitted because it contains no subpopulation members.

. svy jackknife, subpop(if idcntry==702) : logistic lowach lowses low_perschcli sesx_low_perschcli
(running logistic on estimation sample)

Note: Some subpopulation observations were dropped during estimation. This is most likely because
of missing values in the model variables. If there are insufficient observations to compute
jackknife standard errors, consider changing the subpopulation to exclude these observations.

Jackknife replications (150)

Survey: Logistic regression

  Number of strata =  75     Number of obs   =     45541     Subpop. no. obs =      5650
  Number of PSUs   = 150     Population size =   4651739     Subpop. size    = 47889.699
  Replications     = 150     Design df       =        75     F(3, 73) =  55.75     Prob > F = 0.0000
               lowach   Odds Ratio   Jknife Std. Err.       t   P>|t|   [95% Conf. Interval]
               lowses      2.22047           .7509389    2.36   0.021     1.132029   4.355443
        low_perschcli     .7416254           .2921925   -0.76   0.450     .3383149   1.625729
   sesx_low_perschcli     1.471907           .5148407    1.11   0.273     .7332747   2.954568
                _cons     .4111821           .1529867   -2.39   0.019     .1959469   .8628396

. svy jackknife, subpop(if idcntry==840) : logistic lowach lowses low_perschcli sesx_low_perschcli
(running logistic on estimation sample)

Note: Some subpopulation observations were dropped during estimation. This is most likely because
of missing values in the model variables. If there are insufficient observations to compute
jackknife standard errors, consider changing the subpopulation to exclude these observations.

Jackknife replications (150)

Survey: Logistic regression

  Number of strata =  75     Number of obs   =     44462     Subpop. no. obs =      9026
  Number of PSUs   = 150     Population size = 4237273.1     Subpop. size    =   2851393
  Replications     = 150     Design df       =        75     F(3, 73) = 113.58     Prob > F = 0.0000

               lowach   Odds Ratio   Jknife Std. Err.       t   P>|t|   [95% Conf. Interval]
               lowses     3.477864           .5051873    8.58   0.000     2.604003    4.64498
        low_perschcli     .5717889           .1162299   -2.75   0.007     .3813917   .8572355
   sesx_low_perschcli      .906931           .1360196   -0.65   0.517      .672697   1.222726
                _cons     .4706575           .0841564   -4.21   0.000     .3296168   .6720485

end of do-file

. log close
      name:
       log:  C:\Users\educ.brunerju\Google Drive\Dissertation\Outputs\Logistic Regressions\Binary Regressions\Binary Regressions.smcl
  log type:  smcl
 closed on:  10 Jul 2014, 09:04:44
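The do-files logged above call svy jackknife without showing the survey-design declaration that precedes them. A minimal sketch of a declaration consistent with the design information reported in these headers (75 strata, 2 PSUs per stratum, delete-one-PSU jackknife with 150 replications); the variables jkzone, jkrep, and totwgt are the standard TIMSS 2011 student-file design variables, and their use here is an assumption rather than the author's code:

. * Hedged sketch, assuming the TIMSS student-file design variables; not taken
. * from the author's do-files. Declares strata, PSUs, the sampling weight, and
. * jackknife variance estimation.
. svyset jkrep [pweight = totwgt], strata(jkzone) vce(jackknife) mse

. * Each country-specific model is then run on its subpopulation, as logged above, e.g.:
. svy jackknife, subpop(if idcntry==152) : logistic lowach lowses low_perschcli sesx_low_perschcli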