This is to certify that the dissertation entitled THE IMPACT OF THE WISCONSIN COMPETENCY BASED ADMISSION (CBA) PROGRAM ON COLLEGE ACCESS AND OUTCOMES, presented by Clinton D. Gardner, has been accepted towards fulfillment of the requirements for the Ph.D. degree in Educational Administration.

THE IMPACT OF THE WISCONSIN COMPETENCY BASED ADMISSION (CBA) PROGRAM ON COLLEGE ACCESS AND OUTCOMES

By

Clinton D. Gardner

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

Department of Educational Administration

2002

ABSTRACT

THE IMPACT OF THE WISCONSIN COMPETENCY BASED ADMISSION (CBA) PROGRAM ON COLLEGE ACCESS AND OUTCOMES

By Clinton D. Gardner

The purpose of this study was to evaluate the usefulness of the University of Wisconsin Competency Based Admission (CBA) program as a college admission alternative to the Carnegie Unit admission program. To carry out this study, data were gathered on both admission programs (i.e., Carnegie Unit and CBA) in the areas of final admission decisions (e.g., admit and deny decisions); first semester and first year college grades for matriculated students; and the time required by admission counselors to make final admission decisions. These data were used to test the impact of the CBA model and the Carnegie Unit model on levels of admission access, the decision-making efficiency associated with each program, and the amount of college grade variation explained by each program.

The aggregate results showed that the Carnegie Unit program was more effective in maintaining higher levels of college access. However, the disaggregated results showed that in four of the ten universities studied, the CBA program generated higher college access rates than the Carnegie Unit program. An analysis of first semester college grade point averages indicated that each admission program explained similar levels of variation in grades. However, differences were observed in the explained grade variation in five subject areas. The CBA program explained more of the grade variation in social science and foreign language, while the Carnegie Unit program explained more of the grade variation in English, math and science.
The aggregate mean time required by admission officers to make final application decisions identified the Carnegie Unit program as more efficient than the CBA program by more than two minutes. A disaggregated analysis found only one university with an average CBA decision time lower than its Carnegie Unit decision time. The evaluation of these data led the researcher to conclude that the CBA program, while less effective in maintaining broad levels of access (in the aggregate) and less efficient than the Carnegie Unit program, does hold promise as an alternative admission strategy.

Copyright by Clinton D. Gardner 2002

DEDICATION

This study is dedicated to the memory of my late parents, Nellie and David Gardner, and to my late sister, Rochelle.

ACKNOWLEDGMENT

The support of many people over the course of my life helped to make this dream a reality. The early mentoring provided by Charles Thornton will always be valued and never forgotten. Charles, your unyielding support and friendship have affected me in deeply profound ways. Dr. Marylee Davis has provided immeasurable support in helping me shape this work over many years. I could not have found a more supportive and caring committee chairperson. The enthusiasm you show for your work and your commitment to students is a high standard for all to try to emulate. To the members of my committee, Dr. Richard Brandenburg, Dr. Lee June and Dr. Marilyn Amey, thank you for all of your guidance, support and assistance. To Drs. Harry Reed, Sam Carter, James Jay and Marti Hesse, thank you for being mentors and consistent supporters. To Dr. Bill Turner, your vision has helped me understand the importance of college admissions. To Albert Watson, Murray Edwards, Clayton Wheeler and Lisa Baggett, thank you for your years of friendship and support. To my children, Charles and Camille, you have changed and enriched my life in so many positive and important ways. I hope that this work reminds you of what is possible once you dedicate yourself to a goal. To my wife and life partner Pamela Sedwick, the road has finally come to a successful conclusion. Thank you for being the same high quality, loving person after close to twenty years of our commitment to each other.

TABLE OF CONTENTS

LIST OF TABLES .......... x
LIST OF ABBREVIATIONS .......... xii
CHAPTER I: INTRODUCTION TO THE STUDY .......... 1
    Background .......... 1
    Need for the Study .......... 5
    Purpose of the Study .......... 6
    Research Questions .......... 8
    Summary of Research Issues .......... 8
    The CBA Program .......... 9
        Mathematical Knowledge and Reasoning .......... 13
        Social Science .......... 14
    Definition of Key Terms .......... 16
    Limitations .......... 18
    Delimitations .......... 19
    Overview of the Remainder of the Dissertation .......... 19
CHAPTER II: LITERATURE REVIEW .......... 21
    Introduction .......... 21
    History of Access and Admission to Post-Secondary Institutions .......... 21
    Redefining Access .......... 27
    Admission Requirements and Carnegie Units .......... 31
    Admission Requirements .......... 33
    Standardized Test Scores and Test Bias Issues .......... 34
    Admission Decisions with Test Scores .......... 39
    Current Admission Challenges .......... 41
    Changes in High School Curricula .......... 43
    Home-Schooling Options .......... 45
    Responding to Admission Challenges .......... 46
CHAPTER III: METHODS .......... 48
    Introduction .......... 48
    Research Questions .......... 48
    Research Issues .......... 49
    The Research Study and Population .......... 50
    The CBA Pilot Design .......... 52
    Multi-Attribute Utility Model (MAUT) .......... 54
    Admit Rates and the College Access Hypothesis .......... 63
        Null Hypothesis 1 .......... 64
        Alternative Hypothesis 1 .......... 64
    College Grades Hypothesis .......... 64
        Null Hypothesis 2 .......... 65
        Alternative Hypothesis 2 .......... 66
    Admission Decision Times Hypothesis .......... 66
        Null Hypothesis 3 .......... 67
        Alternative Hypothesis 3 .......... 67
    Summary of Research Hypotheses .......... 67
CHAPTER IV: RESULTS AND ANALYSIS OF DATA .......... 69
    Introduction .......... 69
    Research Results .......... 69
        Hypothesis 1 .......... 69
        Hypothesis 2 .......... 73
        Hypothesis 3 .......... 77
CHAPTER V: SUMMARY, CONCLUSIONS, IMPLICATIONS AND RECOMMENDATIONS FOR FUTURE RESEARCH .......... 86
    Introduction .......... 86
    Hypotheses .......... 86
    Findings and Conclusions .......... 87
        Hypothesis 1 .......... 87
        Hypothesis 2 .......... 92
        Hypothesis 3 .......... 94
    Implications of the Study .......... 98
    Direction for Future Research .......... 101
    Summary of Areas for Future Research .......... 103
    Post-Pilot Summary Status .......... 105
APPENDICES
    Appendix A - Competency Based Admission: The Wisconsin Model (Final Report) .......... 110
    Appendix B - The University of Wisconsin System Competency-Based Admission: Admission Competencies .......... 118
    Appendix C - Competency Based Admission: Standardized Reporting Profile (SRP) .......... 138
Bibliography .......... 141

LIST OF TABLES

Table 1: Admission requirements of colonial colleges in 1800 .......... 26
Table 2: Carnegie Unit admission requirements of selected institutions, 1915 to 1930 .......... 32
Table 3: Carnegie Unit admission requirements of selected institutions, 1930 to 1945 .......... 33
Table 4: Outcomes of high school rank and high school rank plus SAT admissions policies at Dartmouth .......... 40
Table 5: MAUT process, step 7 .......... 60
Table 6: Prior Admit Probabilities (based on a five-year average for each university) .......... 61
Table 7: Estimated/projected Carnegie Unit and CBA admits (using Prior Admit Probabilities and observed admission applications) .......... 61
Table 8: MAUT process, step 8 (Carnegie Unit posterior applications, admits and admit probabilities) .......... 62
Table 9: MAUT process, step 8 (CBA posterior applications, admits and admit probabilities) .......... 62
Table 10: Expectation Matrix .......... 68
Table 11: Expected MAUT college access utilities (based on Prior Admit Probabilities) .......... 70
Table 12: MAUT college access results (based on Posterior Admit Probabilities) .......... 71
Table 13: Differences in expected and calculated MAUT utilities for the Carnegie Unit admission program .......... 72
Table 14: Differences in expected and calculated MAUT utilities for the CBA admission program .......... 72
Table 15: First semester correlation GPA weights by subject area .......... 74
Table 16: Comparison of SRP satisfactory ratings and first year college GPA .......... 75
Table 17: Comparison of SRP satisfactory ratings, GPA and completed course credits .......... 76
Table 18: Time needed to make decisions on Carnegie Unit admission applications .......... 78
Table 19: Time needed to make decisions on CBA admission applications .......... 80
Table 20: Aggregate admission application decision time (Carnegie Unit and CBA) .......... 80
Table 21: Results of ANOVA test of between-subject effects .......... 81
Table 22: Results of Tukey's multiple comparison of mean differences .......... 82
Table 23: Results of ANOVA test of CBA admission application decision time for each institution .......... 83
Table 24: Results of Expectation Matrix .......... 84
Table 25: Alternative admission program policy decision-making matrix .......... 85
Table 26: Admit rates and variations in admit rates for the Carnegie Unit model and the CBA model .......... 88

LIST OF ABBREVIATIONS

ACT      American College Test
ANOVA    Analysis of Variance
CBA      Competency Based Admission
GPA      Grade Point Average
K-12     Kindergarten through 12th Grade
MAUT     Multi-Attribute Utility Theory
SAT      Scholastic Aptitude Test
SRP      Standardized Reporting Profile

CHAPTER I: INTRODUCTION TO THE STUDY

Background

The history of access to American higher education is centered on the structured process called college admission. The college admission process, for first-time freshmen, involves the use of structured criteria, including high school grades, standardized test scores, and in some cases a personal statement and letters of recommendation (Fetter, 1995). These items are the primary criteria used to determine access to postsecondary education (Astin, 1991; Breland et al., 1995). Although admission practices vary across colleges and universities, high school grades and standardized test scores are criteria that are consistently employed at competitive, "non-open-access" institutions. In screening and evaluating each student applicant, universities with competitive admission practices use structured criteria that primarily focus on high school grades and standardized test scores to determine the applicant's level of preparation and opportunity for future academic success (Breland et al., 1995; Hearnden, 1973; Maeroff, 1983). Because students compete for space in the freshman class, these institutions are competitive in their admission practices (Goren, 1962).

The continued reliance on test scores in making college admission decisions has been, and continues to be, controversial. For minority students and women, standardized tests continue to pose concerns related to the possibility of racial and gender biases (Crouse & Trusheim, 1991; Rosser, 1989a).
In recent years, some highly selective colleges and universities expanded their admission criteria beyond a dependency on test scores as a means to ensure the enrollment of a racially diverse student body. The use of affirmative action strategies by some of these institutions, starting in the 1970s, resulted in an enormous increase in college enrollment of minority students (Jones, 1998). The first major increase resulting from affirmative action admission practices was observed when the college enrollment of Black students rose from a total of 417,000 in 1970, including students enrolled in Historically Black Colleges and Universities (HBCUs), to 1,033,000 in 1976 (U.S. Department of Education, 1999). By 1994, the college enrollment of Black students totaled 1,469,000 students (U.S. Department of Education, 1999). This rise in Black student enrollment occurred while the number attending HBCUs was increasing only at a modest level. In 1976, there were 222,000 Black students enrolled in HBCUs, and in 1994, a total of 280,000 Black students were enrolled in these institutions (U.S. Department of Education, 1999). The enrollment of Black students, at HBCUs and at predominantly White colleges and universities, shows the important gains in access to higher education for this ethnic group.

While important admission and enrollment gains were occurring among Blacks, significant increases were also observed among women and Hispanics. The number of Hispanic students enrolled in college grew from 680,000 in 1988 to 1,218,000 in 1997 (Weiger, 2000), an increase of close to 80% over this ten-year period. An increased enrollment of women in higher education has been evolving for close to twenty-five years. The total enrollment of women grew from 4,262,000 in 1975 to 5,014,000 in 1997 (U.S. Department of Education, 1999). While this increase is not as impressive as the gains observed for Blacks and Hispanics, the increase of nearly 18% represented a sizeable enrollment gain for women.

While some colleges and universities deviated from making access decisions based solely on high school grades and test scores, increasing political and legal pressure is resulting in some of these institutions returning to a strict interpretation and use of these criteria (Douglass, 1999; Jones, 1998; Karabel, 1999). In fact, the impact of these pressures is seen in recent freshman admission and enrollment numbers in the state of California. In California, the passage of Proposition 209 (which became a legal statute in November 1996), barring the use of affirmative action criteria in college admission at public universities, has had a significant impact on access. The enrollment of minority students on California's public university campuses had steadily increased through 1996 (Douglass, 1999). With the passage of this proposition, the data show a dramatic reduction in the enrollment of Black, Hispanic and Native American freshman students (Jones, 1998; Karabel, 1999). These results magnify the concern that strict admission criteria heavily reliant on test scores will close the access door on significant numbers of minority students. The maintenance of broad college access for minority students is an important reason to examine alternative admission programs that could minimize the importance of test scores in the college admission process.
Additional pressure on the current college admission criteria is being driven by changes at the K-12 level. A number of curricular changes occurring in secondary schools across the United States are threatening the fundamental relationship that characterized educational access to higher education during the majority of the twentieth century. As secondary schools implement standards that make it more difficult for colleges and universities to evaluate college readiness, new admission strategies need to be developed to meet these changing requirements. The following are some of the K-12 curricular changes that challenge the current college admission criteria:

• Elimination of letter grades in the high school assessment and graduation process.

• Development of block scheduling models in which multiple courses (e.g., math and science) are taught as one course, which will alter the admission process of counting Carnegie Units and make course-by-course grade assessment in each subject more difficult.

In addition, home schooling options, which result in an assessment of students' skills without the benefit of course and subject grades, have an effect on the current college admission criteria. Developing admission criteria to address each of these curricular challenges requires establishing new methods for evaluating student achievement. In essence, the Competency-Based Admission (CBA) program developed by the University of Wisconsin System is an attempt to address these admission challenges. Although CBA was designed to provide educational opportunities to students graduating from high schools with nontraditional curricula, it could also serve to eliminate the use of standardized test scores in the college admission process. The benefits derived from developing this program must be compared with any increased resource trade-offs. If these new demands translate into additional admission-processing demands and increased resource use, then the question of access at what expense becomes a relevant factor.

Need for the Study

Numerous curricular changes are occurring in K-12 education in America and, more specifically, in the state of Wisconsin. A number of K-12 school districts in Wisconsin are developing plans for non-grade assessment processes and block scheduling options. These curricular changes could result in an increasing number of students applying for admission to University of Wisconsin institutions with an unorthodox set of credentials. These students will bring to the college admission process an alternative set of achievement measurements that, if examined in the light of current admission practices, could serve as barriers to postsecondary educational opportunities. To address these current admission challenges, the University of Wisconsin System developed an alternative admission process called Competency-Based Admission (CBA: The Wisconsin Model, 1994).

The literature indicates that no comprehensive examination of the Wisconsin CBA program has been undertaken to answer key admission decision and resource-efficiency questions. Such an investigation may help to further validate the use of the CBA evaluation process as an effective admission decision-making model and a resource-efficient alternative method for determining college access and college readiness. The present study was undertaken to address that need.
The researcher is hopeful that the results of this study will provide some evidence to determine whether the CBA program has a broader application and use that could minimize concerns with the use of test scores in college admission.

Purpose of the Study

As changes in K-12 education continue to blanket the landscape, the establishment of new college admission strategies, similar to the CBA program, will play an important role in defining college access in the current century. The CBA program was designed as an alternative approach to the current college admission criteria and a way of maintaining broad postsecondary educational access (CBA: The Wisconsin Model, 1994). Given this intention, CBA as an admission process could serve to meet societal needs by maintaining educational access during a time of numerous K-12 curricular reforms. In addition, when one examines CBA purely from a policy perspective, questions related to resource allocations must be explored to determine whether the program is effective in achieving these important educational-access outcomes. Therefore, the researcher's purpose in this study was to determine (1) how CBA admission decisions compare with the current admission process, and (2) the level of resources expended by universities in processing and rendering admission decisions on CBA applications. The research compared admission-decision rates and first year college grade outcomes for CBA and the traditional admissions process (Carnegie Unit), as well as the time required to make decisions on CBA and traditional (Carnegie-based) application materials.

In focusing on the aforementioned areas of research, the researcher evaluated CBA as an alternative admission practice. Therefore, CBA admission decisions and resource allocations (time requirements) were compared and evaluated against the admission decisions and resource allocations (time requirements) associated with the use of the traditional admission program (e.g., high school grades and test scores). The instrument used to examine and compare admit decisions for these two admission programs is a Multi-Attribute Utility (MAUT) model (Edwards, Guttentag & Snapper, 1975). The MAUT model is often used to help policy makers formulate program review decisions (Edwards, 1977). The use of the MAUT model in this analysis provides a method for evaluating and determining the impact of the CBA program on access to college, while comparing and contrasting these results with the traditional admission program.

Research Questions

To facilitate and develop this study, the researcher established topics that centered on the following questions:

• Do admission decisions using CBA result in similar access outcomes (admit rates) when compared to the traditional admission (Carnegie Unit) process?

• Do predictive relationships exist between the CBA process and first semester college grades? If such a relationship exists, how does it compare to college grades predicted from the traditional admissions (Carnegie Unit) process?

• When compared to the Carnegie Unit process, does CBA represent a time-efficient process for making college admission decisions?

Summary of Research Issues

• The CBA program was designed by the University of Wisconsin System as an alternative admission program (CBA: The Wisconsin Model, 1996). This implies that CBA should lead to comparable admission decisions and predict similar levels of success in college when compared to the traditional admission process (Carnegie Unit). Admission decision rates and first semester college grades are used to examine this assumption.
• The CBA program has been cited and hailed as an alternative process that will allow admission offices to render decisions in a timely manner, consistent with the time involved in making decisions on traditional admission (Carnegie Unit) application materials (e.g., grades and test scores). The CBA program implies that the use of a Standardized Reporting Profile (SRP) form will allow university admission offices to evaluate application materials and render timely admission decisions. This research was undertaken to examine whether this time-efficiency assumption has merit.

• The researcher conducted this study to determine whether the CBA processing time expended by university admission offices compared favorably with the time associated with processing traditional application (Carnegie Unit) materials. In carrying out this phase of the study, the researcher analyzed the time expended by universities in processing traditional admission applications, as compared to the time expended by universities in making CBA decisions.

The CBA Program

In October of 1992, a task force was convened by the University of Wisconsin President's Office to explore the prospects of developing a supplemental admission process for maintaining college access opportunities for students enrolled in nontraditional curricula and to develop an admission process that would meet the needs of students enrolled in high schools undergoing curricular changes (CBA: The Wisconsin Model, 1994). The task force met for several months to explore the viability of developing a new admission procedure (CBA: The Wisconsin Model, 1994). At the conclusion of an eight-month process, the task force submitted its recommendations to the University of Wisconsin's Board of Regents for the creation of the Competency Based Admission (CBA) program (CBA: The Wisconsin Model, 1994). In June 1993, the Board of Regents endorsed CBA and asked the President's Office to develop a strategy to make CBA a viable admission program in all of the four-year, public universities (CBA: The Wisconsin Model, 1994).

Shortly after the approval from the Board of Regents, the President's Office appointed a steering committee to define the parameters needed to make CBA a reality (CBA: The Wisconsin Model, 1994). Working in partnership with the steering committee were six subcommittees made up of seventy-five faculty members from all of the public universities and the two-year technical colleges (community colleges in Wisconsin), teachers from various K-12 school districts, and administrators from the University of Wisconsin President's Office (CBA: The Wisconsin Model, 1994). These committee members started work on this initiative in November of 1993 (CBA: The Wisconsin Model, 1994). Committee members maintained continuous contact as they focused on the charge assigned to each group (CBA: The Wisconsin Model, 1994). Five of the six subcommittees worked on developing the core competency requirements needed for admission and necessary for students to achieve academic success on any of the University of Wisconsin four-year campuses (CBA: The Wisconsin Model, 1994). These members served on the committees that established CBA admission requirements in the subject areas of English, math, science, social science and foreign language (CBA: The Wisconsin Model, 1994).
The final documents developed by these committees became known as the CBA-Subject Competencies (CBA: The Wisconsin Model, 1994). The sixth subcommittee focused its attention on developing a process for summarizing the competency levels attained in each subject (CBA: The Wisconsin Model, 1994). The work of this committee resulted in the development of the Standardized Reporting Profile (SRP) form (CBA: The Wisconsin Model, 1994). The SRP is the form used by high schools to summarize student competency scores in each academic subject (CBA: The Wisconsin Model, 1994). This form is submitted to the university admission office for a final admission decision on each applicant (CBA: The Wisconsin Model, 1994).

The subcommittees completed their assignments in May 1994 (CBA: The Wisconsin Model, 1994). Their collective work resulted in the development of a five-subject set of competency requirements to be used in making college admission decisions, thereafter called the Competency Based Admission Program/The Wisconsin Model (CBA: The Wisconsin Model, 1994). The CBA program was established as the formal, alternative college admission criteria for students graduating from high school curricula that could not easily be evaluated using the traditional Carnegie Unit admission criteria (CBA: The Wisconsin Model, 1994).

The potential use and impact of the CBA program in the college admission process could be significant. For example, standardized college admission tests have been continuously cited as an important college enrollment barrier for minority students and, in some cases, female students (Medina & Neill, 1990; Rosser, 1989a). This argument points to a concern that college access decisions made solely on the basis of high school grades and test scores could have a negative effect on these two groups. By its design, the CBA program could minimize the importance of, and reliance on, standardized test scores in college admission. As a result, CBA has the potential to expand access to college for students of color and women.

The CBA program asks high school teachers to evaluate each student's competency in each subject area (CBA: The Wisconsin Model, 1994). An understanding of how this process works can be obtained by looking at the subject of math. Math teachers are asked to determine how well a student is able to complete math problems ranging from elementary algebra to more complex levels of trigonometry and analytical geometry (CBA: Admission Competencies, 1995). These math problems were selected as containing the necessary knowledge each student must possess in order to be successful in math courses on a University of Wisconsin campus (CBA: The Wisconsin Model, 1994). The following are some examples of CBA competency statements that high school math teachers are asked to use in providing a formal evaluation of each student's skill level:

Mathematical Knowledge and Reasoning

1. Use of Constraints:
a. Perform arithmetic operations in proper order, represent real numbers in a variety of forms and simplify arithmetic expressions involving radicals. Use arithmetic operations to model problem situations. Use mental arithmetic and estimation.
b. Construct and read charts, tables and graphs that summarize data from real world situations.

2. Use of Variables in Linear Situations:
a. Solve linear systems of equations in two or more variables and interpret solutions both symbolically and graphically.
b. Use matrices to represent and analyze linear situations.

3. Use of Variables in Algebraic Situations:
a. Add, subtract, multiply, divide and exponentiate polynomial, rational, complex fractional and radical expressions and simplify the results.
b. Solve algebraic equations and inequalities in one variable, including those which can be factored into linear and quadratic expressions, or which contain fractional expressions, absolute values, radicals or fractional exponents.

4. Use of Variables in Transcendental Situations:
a. Manipulate and simplify expressions involving exponentials or logarithms. Solve equations and inequalities involving exponential and logarithmic expressions.
b. Use the language, notation and properties of exponential, logarithmic and trigonometric (sine, cosine, tangent) functions and their graphs.

5. Geometry:
a. Visualize and sketch points, lines, planes and simple solids in three-dimensional space and find volumes of boxes and cylinders.
b. Use knowledge of parallelism, perpendicularity and associated angle properties to analyze and construct figures and represent problem situations (with or without coordinates).

In the subject area of social science, the following examples are represented in the competency evaluation process:

Social Science

1. Knowledge:
a. Students should be able to distinguish among the powers assigned to the executive, legislative, and judicial branches of the government in the U.S. Constitution, and between the areas of responsibility assigned to the state and federal governments; identify significant changes that have altered the foregoing through judicial interpretation and other developments.
b. Students should be able to discuss the concepts of class, race, ethnicity, and gender in the analysis of society, and describe how the following have affected the status of women in various cultures of the world, including the United States: 1) increasing numbers of women in the economy; 2) the rebirth of an organized women's movement; 3) traditional definitions of women's roles.
c. Students should be able to recognize the principal eras in the history of western civilization from Greek and Roman times to the present, identifying elements used in conventional periodizations, and show a knowledge of the basic chronologies of world history.

2. Skills and Methods:
a. Students should be able to demonstrate the ability to use geographic tools and resources (e.g., maps, atlases, data bases, and spatial data).
b. Students should be able to show awareness of the variety of sources used as evidence by social scientists and humanists, including print material, statistics, paintings, sculpture, architecture, film, photographs, and other artifacts.

3. Integrative Applications:
a. Students should be able to demonstrate their knowledge and skills through critical analyses in which, for example, they compare and contrast the impact of race, class, ethnicity, and gender on the histories of the U.S. and other cultures.
b. Students should be able to demonstrate their knowledge and skills through critical analyses in which, for example, they apply economic reasoning to help explain historical and current developments and issues, distinguishing between and showing the interaction of the U.S. domestic economy and the global economy.

The CBA process uses a similar approach in asking teachers of English, science, and foreign language to evaluate competency levels for each student applicant.
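The actual SRP form appears in Appendix C. Purely as an illustration of the kind of record an admission reviewer works from, the following Python sketch models a hypothetical competency profile and summarizes the share of competencies a teacher marked satisfactory in each of the five CBA subject areas. The class name, field names, and the satisfactory/unsatisfactory rating scale are assumptions made for this example and are not drawn from the Wisconsin documentation.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# The five CBA subject areas named in the Wisconsin model.
CBA_SUBJECTS = ["English", "Math", "Science", "Social Science", "Foreign Language"]


@dataclass
class CompetencyProfile:
    """Hypothetical SRP-style record: teacher ratings per subject.

    Each rating is True if the teacher judged the competency
    'satisfactory', False otherwise (the rating scale is assumed).
    """
    student_id: str
    ratings: Dict[str, List[bool]] = field(default_factory=dict)

    def satisfactory_share(self, subject: str) -> float:
        """Fraction of competencies rated satisfactory in one subject."""
        marks = self.ratings.get(subject, [])
        return sum(marks) / len(marks) if marks else 0.0

    def summary(self) -> Dict[str, float]:
        """Per-subject satisfactory shares, as a reviewer might scan them."""
        return {subject: round(self.satisfactory_share(subject), 2)
                for subject in CBA_SUBJECTS}


if __name__ == "__main__":
    applicant = CompetencyProfile(
        student_id="A-001",
        ratings={
            "English": [True, True, True, False],
            "Math": [True, True, False, True, True],
            "Science": [True, True, True],
            "Social Science": [True, True, True, True],
            "Foreign Language": [True, False],
        },
    )
    print(applicant.summary())
    # e.g. {'English': 0.75, 'Math': 0.8, 'Science': 1.0,
    #       'Social Science': 1.0, 'Foreign Language': 0.5}
```

The point of the sketch is only that the SRP replaces course-by-course grades and test scores with subject-level competency judgments made by teachers; the actual scale and reporting categories are those defined by the University of Wisconsin System documents in the appendices.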
Definition of Key Terms

The following terms are defined in the context in which they are used in this dissertation:

Carnegie Unit: The name Carnegie Unit is derived from the Carnegie Foundation for the Advancement of Teaching. In 1909, the Carnegie Foundation provided the leadership for the creation and development of a systematic process for quantifying the academic achievement of K-12 students (Tompkins & Gaumnitz, 1954). By quantifying the number of class hours and the grades received in college-preparatory courses, Carnegie Units are the standard criteria used to evaluate learning outcomes (Tompkins & Gaumnitz, 1954). Over the course of their 92-year history, Carnegie Units have continued to be an important part of college admission criteria.

Carnegie Unit Admission: The process of using Carnegie Units and standardized test scores (the American College Test, or ACT, and the Scholastic Aptitude Test, or SAT) as the primary criteria for making college admission decisions (CBA: The Wisconsin Model, 1994; Wechsler, 1977). The Carnegie Unit admission criteria commonly used by most colleges and universities require four years of college-preparatory English; three years each of social studies, math, and science; and two years of foreign language coursework (Tompkins & Gaumnitz, 1954).

Competency Based Admission (CBA): An admission process that uses newly developed University of Wisconsin System competencies as the primary basis for making admission decisions. This process allows high school teachers, using a Standardized Reporting Profile (SRP) form, to evaluate students' competencies in each academic subject (e.g., English, math, science, social science, and foreign language). The CBA process was developed and designed to serve as a direct alternative and substitute for the traditional Carnegie Unit admission process.

Standardized Reporting Profile (SRP) form: The form used to summarize the level of subject-specific competencies for each admission applicant. Teacher evaluations for all academic subjects (English, math, science, social science, and foreign language) are listed on the SRP form.

Traditional admission applications: Applications that require high school transcripts, listing grades and standardized test scores, as the material needed in the admission decision-making process. This term is another way of denoting an admission process that requires Carnegie Units.

Traditional college admission: The admission process that uses criteria that rely on Carnegie Units and standardized test scores as the basis for making final admission decisions (often referred to as Carnegie Unit admission).

Limitations

The examination of admit decisions (access) and the resources expended in processing CBA applications has important implications for higher education in Wisconsin, as well as other states. Although gaining a better understanding of college access decisions and the time expended in processing these admission applications will provide useful insights, this examination is not without limitations. The first limitation is that the data were collected during a pilot-phase study of the impact of CBA. Some of the observed results could vary during a full program implementation. The second limitation is that data on the Carnegie Unit and CBA programs were collected by the University of Wisconsin System. As a result, the analysis in this study utilized secondary data sources.
The third limitation is that Carnegie Unit and CBA admission decisions are made by two separate admission officers at each university. Since all admission officers at each university review and make Carnegie Unit decisions on a regular basis, the lack of a full dissemination process for CBA applications to these same officers could result in a directional bias (potentially positive or negative) in CBA admit rates.

Delimitations

The participant data used to make admission decisions will be limited to the following areas:

• High School Grades
• CBA Standardized Reporting Profile (SRP) Scores
• University Admit Decisions
• University Deny Decisions
• Admit Decision Times
• Deny Decision Times
• First Semester College Grades
• First Year College Grades

Based on the number of admission decisions made during the pilot phase, the study will be limited to analyzing data from ten of the eleven University of Wisconsin institutions (one institution had too little admission data to be evaluated).

Overview of the Remainder of the Dissertation

To begin, research on access to college and the history and evolution of college admission criteria is discussed in Chapter II. The traditional admission applications (Carnegie Unit) and the rationale behind the development of the CBA program are also discussed in Chapter II. In Chapter III, the research procedures and methodology are explained, along with a discussion of how the Multi-Attribute Utility (MAUT) process is used to make program review decisions when comparing which admission program is better at maintaining levels of college access (i.e., Carnegie Unit versus CBA). Chapter III also establishes the hypotheses related to college grades and to the time needed to make college admission decisions based on Carnegie Unit and CBA applications. The results of these analyses are presented in Chapter IV. In Chapter V, the summary, conclusions, implications and recommendations for future research are outlined.

CHAPTER II: LITERATURE REVIEW

Introduction

Chapter II explores the history of access to postsecondary education in America, beginning with the founding of Harvard College and other early colonial colleges. Chapter II details how access to American higher education changed after the passage of the Morrill Acts of 1862 and 1890. Also discussed is the historical development of Carnegie Units and standardized test scores, as well as the evolution of structured college admission criteria. Important challenges and threats facing traditional admission criteria are highlighted in this chapter. In addition, curricular changes in K-12 schools, high school scheduling innovations, and home schooling options are discussed as important issues leading to the development of the CBA program.

History of Access and Admission to Postsecondary Institutions

The history of access to American colleges and universities can be summarized by examining two key periods. These periods can be characterized as the pre-Morrill Act period and the post-Morrill Act period. Many historians have designated these periods in higher education as the pre-Civil War period and the post-Civil War period (Allmendinger, 1971; Axtell, 1971; Potts, 1971). The period before 1862 has been described as a time of limited educational access. In this pre-Morrill Act period, education was available to a limited number of wealthy young men whom faculty members admitted to a number of elite colleges (Wechsler, 1977).
The vast majority of these institutions were small, private liberal arts colleges (Axtell, 1971). Harvard College, founded in 1636, established a standard policy of limited access by educating only a select, elite class of young men. Research conducted by historian Ronald Story (1975a, 1975b) indicated that students from private preparatory schools in Massachusetts and other New England states constituted the vast majority of the Harvard student body during the period from 1801 to 1845. The inclusion of significant numbers of public high school students at Harvard was not seen until after 1846. Story (1975b) summarized Harvard's admission policy during this period by noting that "the proportion of Harvardians from private boarding schools - institutions providing preparatory instruction to boarders for profit - was about the same in the twenty years after 1820 as in the twenty years before" (p. 286).

The establishment of the College of William and Mary in Virginia in 1693 resulted in a continuation of the limited-access trend started by Harvard. Whereas Harvard focused attention on educating leaders across many disciplines, the mission of William and Mary was to prepare young men for careers in the clergy (Rudolph, 1990). The precedent established by William and Mary spread through early colonial America, resulting in the establishment of other secular-based educational institutions during the eighteenth century (Herbst, 1974; Tewksbury, 1965). The establishment of Yale, Princeton, Dartmouth, and Brown perpetuated the tradition of educating future clergymen (Rudolph, 1990). According to traditional historians, these institutions maintained a commitment to the limited-access ideology established by Harvard and William and Mary (Herbst, 1975; Story, 1975a, 1975b). This entrenched philosophy of educating society's financially elite class established a precedent for all institutions founded during the eighteenth century and the first half of the nineteenth century.

Although traditionalist historians have characterized these institutions as catering to wealthy citizens, certain evidence indicates an expansion and broadening of educational access to less prosperous students (Angelo, 1983). Historian David Allmendinger (1971, 1975) noted that this shift, which originated in eastern colleges, occurred long before the establishment of universities. According to revisionist historians, these nineteenth-century colleges exhibited many egalitarian characteristics by providing educational access to a broad segment of society (Burke, 1982; Potts, 1975, 1977). Although the time frame of this transformation has not been fully established, Allmendinger (1971) wrote:

It began almost imperceptibly at the time of the American Revolution and then assumed unprecedented proportions after 1800. Into New England colleges there came a flood of students from poor families. Never before had these families sent sons to college, nor could they afford to do so now; their sons came voluntarily to higher education, for the most part having made their own decisions. All ten New England colleges founded before 1822 experienced influxes of these poor young men, especially the newer, provincial institutions outside New Haven and Cambridge. (pp. 381-382)

The debate between traditionalists and revisionists concerning educational access for poor Americans during the nineteenth century continues today.
Although more research is needed on educational access for this group, to determine when and why these young men enrolled in college and what challenges they faced, Blackburn and Conrad (1986) concluded, "Whether the revisionists' data of student backgrounds proves that the colleges were aristocratic or not is another matter. The less privileged may have attended college for any of a number of reasons, in spite of discrimination they may have suffered" (p. 223).

If the revisionists are correct, the experience of these early New England colleges was more an exception than the general rule in American higher education. Most colleges maintained a policy of restrictive access during this early period. Moreover, the process of limiting access to these institutions constituted the earliest form of college admission criteria (Story, 1975a, 1975b). These criteria allowed institutions to pick the choicest of society's elite. Students seeking admission to these schools often were required to interview with faculty members and pass the institutions' qualifying exams (Wechsler, 1977). In this early period in American higher education, a process of restricted access was perpetuated, with educational opportunities accruing to the wealthiest segment of society.

Whereas at Harvard great emphasis was placed on the social and economic background of prospective students, other important admission criteria also were established. The initial criterion was for students to interview with Harvard's president so that he could assess issues of character and knowledge (Fine, 1946). Once a dialogue was established, the student was required to demonstrate verbal proficiency in both Greek and Latin. As early as 1643, the following admission statement appeared in Harvard's publications:

When any scholar is able to read Tully or such like classical Latin author extempore, and make and speak true Latin in verse and prose at sight, and decline perfectly the paradigms of nouns and verbs in the Greek tongue, then may he be admitted into the college, nor shall any claim admission before such qualifications. (Broome, 1903, p. 18)

The College of William and Mary established a similar admission statement, indicating that before matriculation a student "must first undergo an examination before the president and masters" (Fine, 1946, p. 15). When Columbia was established in 1754, it took Harvard's admission policy and expanded its requirements, stating:

None shall be admitted (unless by a particular act of the Governors) but such as can read the first three of Tully's Select Orations, and the three final Books of Virgil's Aeneid into English, and the ten first chapters of St. John's Gospel into Greek, and such as are well versed in all the rules of Clark's introduction so as to make true Grammatical Latine, and are expert in arithmetic as far as the rule of reduction to be examined by the president. (Fine, 1946, pp. 15-16)

In addition, other early New England colleges followed Harvard's lead by establishing admission criteria with proficiency in Greek and Latin as their core requirement. The admission requirements at Yale, Princeton, and Brown were all developed using Harvard's criteria as a guide. The similarities in admission requirements among these early colleges are noteworthy, as Broome (1903) suggested:

Uniformity was the striking characteristic of college admission requirements during the Colonial Period.
First of all there was a uniform aim; secondly, there was a uniform course of study, with absolutely no flexibility; thirdly, the grammar school had a single purpose - to prepare for the college - and consequently the same condition existed there. Uniformity in admission requirements was, therefore, a natural consequence. (pp. 38-39)

In addition to requiring grammar and composition skills in Latin and Greek, math (arithmetic) proficiency was a consistent admission requirement for most colonial colleges by 1800. Interestingly, in 1800, Harvard College was the notable exception to the math requirement established by many colonial colleges (Broome, 1903). (See Table 1.)

Table 1: Admission requirements of colonial colleges in 1800.

College            Latin Grammar   Latin Composition   Greek Grammar   Greek Composition   Math/Arithmetic
Brown              Yes             Yes                 Yes             Yes                 Yes
Columbia           Yes             Yes                 Yes             Yes                 Yes
Harvard            Yes             Yes                 Yes             Yes                 No
Princeton          Yes             Yes                 Yes             Yes                 Yes
Yale               Yes             Yes                 Yes             Yes                 Yes
William and Mary   Yes             Yes                 Yes             Yes                 Yes

Note: From A Historical and Critical Look at College Admission Requirements, by E. C. Broome, 1903, New York: Macmillan.

The influence of Harvard's admission policy spread far beyond these early colonial colleges. The policy itself, as well as its component parts, had a profound influence on college admission requirements throughout the United States. When the University of Michigan opened in 1841, it adopted the following admission statement:

Applicants for admission must adduce satisfactory evidence of good moral character, and sustain an examination in geography, arithmetic, the elements of algebra, the grammar of English, Latin, and Greek languages, the exercise and reader of Andrews Cornelius Nepos, Vita Washingtonii, Sallust, Cicero's Orations, Jacobs' Greek Reader, and the Evangelist. (Farrand, 1885, p. 47)

Seven years later, these initial criteria were changed to include:

English Grammar, Geography, Arithmetic, Algebra through simple equations, Kreb's Guide for the writing of Latin, Latin Reader, Cornelius Nepos, Cicero's Orations, Virgil's Bucolics and six books of the Aeneid, Greek Reader through, Latin and Greek Grammar, Keightley's (or Pinnock's, Goldsmith's) Grecian History to the time of Alexander the Great, and Roman to the time of the Empire. (Broome, 1903, pp. 44-45)

Regardless of the institution being considered, the admission requirements developed at Harvard were the prototype criteria during the eighteenth century and the first half of the nineteenth century.

Redefining Access

Passage of the Morrill Act of 1862 (Land Grant Act of 1862) served as a critical point of departure by redefining the idea of educational access in American society (Axtell, 1971). This legislation gave public land to existing colleges, while establishing new institutions with the singular mission of providing a much larger segment of society with access to educational opportunities (Roderick & Stephens, 1979). Today, these institutions are profiled as large (student enrollment), public (taxpayer supported), and research-intensive universities (commonly referred to as 1862 land grant universities).

The Land Grant Act of 1862 did not provide an access provision for newly freed slaves following the Civil War. Therefore, in an attempt to reestablish a society of educational inclusion and access for all citizens, the U.S. Congress passed the Morrill Act of 1890 (Land Grant Act of 1890), which established a group of black colleges to educate emancipated slaves.
These 1890 land grant colleges and universities were designed to parallel the 1862 land grant universities by embodying similar principles of educational access. With the establishment of universities, the latter part of the nineteenth century can be viewed as the classical period in American higher education (Axtell, 1971; Herbst, 1975). Highlighting this period were the vast educational opportunities provided to students in every state with little regard to socioeconomic status. During this period, access to higher education became a reality for the vast majority of high school graduates.

In the twentieth century, two major pieces of federal legislation impacted and shaped the direction of college access. The first boost to college access came after the end of World War II, when colleges and universities experienced increased enrollments from World War II veterans. In 1944, the federal government enacted legislation called the Servicemen's Readjustment Act (commonly called the G.I. Bill), which provided veterans the financial benefits needed to pay for a college education (Duffy & Goldberg, 1998). The impact of this legislation on college enrollment was unprecedented in the history of American higher education (Olson, 1974). In 1945, 88,000 veterans were enrolled in college, representing 5.2% of all students (total enrollment was 1,676,851) enrolled during that year (Olson, 1974). By 1946, veterans represented 48.7% (1,013,000 veteran enrollment) of the total college enrollment of 2,078,095 students (Olson, 1974). The enormous impact of veterans enrolling in college was observed on the University of Wisconsin-Madison campus starting in 1946. The total enrollment at the University of Wisconsin-Madison increased to 18,598 in 1946, compared to 9,028 during the 1945 academic year (Olson, 1974). During that period, veteran enrollment increased to 11,076 in 1946 from a total of 1,347 in 1945 (Olson, 1974). After World War II, the vast majority of colleges exercised an open door admission policy to accommodate the enrollment of veterans (Duffy & Goldberg, 1998; Olson, 1974).

The second major impact on college access in the twentieth century came after the legislative enactment of the Civil Rights Act of 1964. By ending legalized segregation, the act helped to pave the way for African American students to enroll in White colleges and universities (Duffy & Goldberg, 1998). In addition, the act helped to increase the total number of African American students enrolled in colleges and universities immediately after the passage of the legislation (Cross & Slater, 1999). The Civil Rights Act of 1964 was followed by additional legislation in the early 1970s that gave rise to the use of Affirmative Action strategies in college admission (Cross & Slater, 1999; Orfield, 1998). The use of Affirmative Action allowed colleges and universities to consider factors such as race, gender and ethnicity in making admission decisions (Bryant, 1996; Duffy & Goldberg, 1998). This expanded the admission criteria beyond the use of grades and test scores and opened the door to the enrollment of a large number of African American, Hispanic and female students (Bryant, 1996; Cross & Slater, 1999; Orfield, 1998). Between 1964 and 1970, enrollment of African American students in higher education increased from 274,000 to 522,000 (U.S. Department of Education, 1999). African American student enrollment increased to 927,000 students by 1975 (U.S. Department of Education, 1999).
After 1975, the enrollment of African American students in higher education continued to increase, reaching a high of 1,415,000 students in 1996 (U.S. Department of Education, 1999). During the period from 1975 to 1996, Hispanic student enrollment in higher education grew from 411,000 to 1,039,000 (U.S. Department of Education, 1999). The enrollment of women during the period from 1965 to 1970 grew from 2,139,000 to 2,962,000 (U.S. Department of Education, 1999). The period from 1970 to 1997 resulted in a 110% increase as the enrollment of women grew to 6,252,000 (U.S. Department of Education, 1999). Since the current legislative and legal challenges to Affirmative Action have focused almost exclusively on race-based admission practices, there has been very little impact on the enrollment of women (Bryant, 1996; Hu-DeHart, 1997; Jones, 1998; Pusser, 2001). In addition, while the aggregate African American and Hispanic enrollment has continued to reach record levels, the elimination of Affirmative Action consideration in college admission has had a negative effect on these populations in states like California and Texas (Bok, 2000; Chapa & Lazaro, 1998; Douglass, 1999; Karabel, 1999).

Admission Requirements and Carnegie Units

The end of the nineteenth century gave rise to numerous challenges for the emerging industrialized American society. During the early part of the twentieth century, many new colleges and universities were established to prepare workers to meet the demands of business and industry for skilled laborers. One problem these institutions faced was the lack of a uniform standard and the absence of an agreed-upon procedure for evaluating students' high school records (Broome, 1903). According to Broome, the lack of an admission standard across American colleges and universities placed a significant burden on preparatory schools' resources:

Schools which send students to several colleges have been compelled to maintain more classes than they otherwise would, because the several colleges failed to agree on any policy of admission requirements, and petty and non-essential differences were insisted on, even in the more common subjects. (p. 127)

To address this admission issue, the Carnegie Foundation for the Advancement of Teaching established a blue-ribbon committee in 1909. This committee, whose members included a number of distinguished college presidents, formulated a set of admission requirements called Carnegie Units. The Carnegie Units are a measurement of high school work in terms of credit based on time spent in the classroom (Tompkins & Gaumnitz, 1954). These units of evaluation provide higher education admission offices with a method for counting and evaluating high school coursework to determine each student's college-readiness skills. The Carnegie Units take the four-year high school as a basis and assume that (a) the length of the school year was 36 to 40 weeks, (b) a class period was 40 to 60 minutes long, and (c) a subject was studied for four or five periods a week (Tompkins & Gaumnitz, 1954). For example, four years of English would be equal to four units, a year of algebra one unit, a year of plane geometry another unit, a year of history one unit, and two years of a foreign language two units (Fine, 1946). Development of the Carnegie Units had an immediate and widespread effect on college admission standards. Colleges and universities expressed and publicized their admission requirements based on the recommendations of the Carnegie Foundation.
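The unit arithmetic just described can be tallied mechanically. The short sketch below is illustrative only, not drawn from any admission office's records; the transcript structure is a hypothetical stand-in, and the coursework mirrors the example cited from Fine (1946), which totals nine units.

```python
# Minimal sketch of Carnegie Unit counting: one unit per year-long course,
# tallied by subject and summed. The transcript structure is hypothetical.
from collections import defaultdict

# (subject, years of study) pairs for an illustrative applicant:
# 4 + 1 + 1 + 1 + 2 = 9 units, matching the example from Fine (1946).
transcript = [
    ("English", 4),
    ("Algebra", 1),
    ("Plane Geometry", 1),
    ("History", 1),
    ("Foreign Language", 2),
]

def carnegie_units(courses):
    """Return units earned per subject and the overall total."""
    units = defaultdict(float)
    for subject, years in courses:
        units[subject] += years
    return dict(units), sum(units.values())

by_subject, total = carnegie_units(transcript)
print(by_subject)  # per-subject units
print(total)       # 9.0 units for this example
```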
For example, in 1911, students seeking admission to the University of Chicago needed a minimum of 12 Carnegie Units of high school coursework (Reeves & Russell, 1933). By 1941, the vast majority of colleges and universities used 15 Carnegie Units as the minimum benchmark and standard admission requirement (see Tables 2 and 3) (Emanuel, 1953).

Table 2: Carnegie Unit admission requirements of selected institutions, 1915 to 1930.

                       Number of Carnegie Units Required
College/University     English   Math   Science   Social Science   Foreign Language   Other
Ohio University        2         2      1         1                -                  -
Ohio Wesleyan          3         3      2         1                4                  -
Wayne State Univ.      3         2      -         1                2                  -
Wooster College        3         2      1         2                2                  -

The work of the Carnegie Foundation allowed colleges and universities to specify their admission requirements directly and unambiguously. The process of counting and totaling units was a key innovation that established modern-day admission criteria. The significance of the Carnegie Units is evidenced by their longevity: close to 90 years of continuous use in college and university admission.

Table 3: Carnegie Unit admission requirements of selected institutions, 1930 to 1945.

                               Number of Carnegie Units Required
College/University             English   Math   Science   Social Science   Foreign Language                Other
Univ. of California Berkeley   3         2      1         1                2                               3a, 3b
Iowa State Univ.               3         2      2         2                Reading-Speaking Proficiency    -
Michigan State Univ.           3         2      3         4                Background                      3c
Univ. of Missouri              3         2      1         1                2                               6d

Note. From A Historical and Critical Look at College Admission Requirements, by E. C. Broome, 1903, New York: Macmillan; A History of the Iowa State College, by E. D. Ross, 1942, Ames: The Iowa State College Press; The University of California 1868-1968, by V. Stadtmann, 1970, New York: McGraw-Hill; A History of the University of Missouri, by F. F. Stephens, 1962, Columbia: University of Missouri Press.
a Coursework in English, math, science, and foreign language. b General electives. c Vocational coursework. d Academic electives (maximum of 3 vocational units).

Admission Requirements

In stark contrast to the nineteenth century, the second half of the twentieth century represented a time of qualified access to higher education (Fetter, 1995; Trow, 1981). During this period, college admission and educational access was defined through the use of high school grades in academic subjects (Carnegie Units with specific numbers of requirements in math, English, science, social science, and foreign language). These subjects are considered important ingredients in predicting future college success. At the same time, the Scholastic Aptitude Test (SAT) and the American College Test (ACT) were commonly used in assessing the level of quantifiable student knowledge. Taken together, grades and test scores constitute the most commonly used four-year college admission criteria and are designed to limit and restrict access to postsecondary education (Bracey, 1990; Heamden, 1973; Powell & Steelman, 1996).

Standardized Test Scores and Test Bias Issues

The SAT gained acceptance as a college admission criterion during the early part of the twentieth century (Grossman, 1935). In contrast, the ACT was developed in 1959 and gained widespread use and acceptance during the 1960s. In the early 1930s, 26% of postsecondary schools used the SAT (Grossman, 1935). By 1992, more than 90% of four-year colleges and universities used the SAT or the ACT as a primary admission requirement (Breland et al., 1995).
Before and even after validation of the SAT and ACT as standardized tests, most colleges and universities continued to administer an institutional examination to evaluate students' knowledge across a range of subject matter prior to making final admission decisions. The oral and written examinations used for admission to early colonial colleges were viewed as psychological tests (Fine, 1946). While these tests were used in the admission process, they were not standardized tests but instead represented exams that were specific to each institution. Much later, many universities described the institution's test as a "comprehensive examination" (Michigan State College, 1941, p. 42). These institutional tests were eliminated as colleges and universities incorporated the ACT and SAT tests as part of their admission criteria.

Although most colleges and universities have defended the use of the SAT and ACT as admission criteria, these standardized tests have come under much scrutiny from multiple segments of society (Crouse & Trusheim, 1991; Heamden, 1973; Nettles & Nettles, 1999). The arguments for retaining standardized tests focus on the reliability of the instruments in evaluating students' current levels of knowledge and learning. Alternatively, critics have argued a persuasive case for discontinuing the use of these tests by citing empirical evidence that casts doubt on their predictive validity (Crouse & Trusheim, 1988; Rosser, 1989a).

Over the past three decades, concerns about the administration and use of standardized test scores in college admission decisions have been the subject of critical examination by numerous researchers and scholars. Much of the concern centers on whether or not these exams are culturally biased and underestimate the abilities of students from minority groups (Medina & Neill, 1990). A similar concern has focused attention on the possibility that these exams have a gender bias (Rosser, 1989a). In exploring the issue of bias, a number of scholars have examined three primary categories that may serve to devalue the skills and abilities of students in these groups. These three areas of interest are social bias, measurement bias, and predictive bias (Jacobs, 1991; Medina & Neill, 1990; Rosser, 1989a).

The social bias argument points to differences in socioeconomic and cultural distinctions between men and women and members of ethnic groups (Taylor & Lee, 1987; Willie, 1985). Walter Jacobs defines social bias as the "bias that extends a notion of unfairness when test score differences by demographic groups are more rooted to differences in society" (Jacobs, 1991, p. 23). The cultural differences argument states that the vast majority of White students live and are raised in a familiar, popular and dominant environment that nurtures their personal development (Stewart, 1993). It has been noted that White students are raised in a singular, monolithic environment that fosters a clear vision of their role and place in society (Helms, 1992). By extension, some argue that standardized tests are constructed based on issues and activities related to American popular culture and are more user-friendly to White students (Scheuneman, 1987). By contrast, scholars argue that women and minority students are raised in dualistic cultural environments (Rosser, 1989a). As evidence of this duality, scholars note that many minority students live and interact in two distinctly separate worlds on a daily basis (Rosser, 1989a).
African American students may encounter the cultural values of the larger society on a daily basis; however, their primary cultural values are often shaped by the ethnic and gender composition of their community. A case in point is the test score gap that exists between African American students who reside in predominantly African American communities and those African American students who live in mostly White communities (Nettles & Nettles, 1999). The test score gap between these two groups is large, with the scores of African American students who reside in predominantly White communities coming close to the scores achieved by Whites on these tests (Lyman, 1998). While issues related to the quality of schools, stability of family structure and a number of other factors may contribute to these test score differences, the social bias factor is viewed as being an important reason for some of the variation in test scores (Scheuneman & Oakland, 1998).

The differences between Hispanic and White test scores are argued to rest on cultural distinctions (Codina, 1991). Scholars have noted that for large numbers of Hispanics, Spanish is the dominant language spoken in the home and in predominantly Hispanic communities (Sandoval & Duran, 1998). For these students, taking tests written in English may still result in many challenges. This fact is referenced in research conducted by Scheuneman and Oakland (1998). Scheuneman and Oakland noted that students' success and their overall "performance on tests requiring the use of language may be affected even for bilingual examinees who are competent in English" (Scheuneman & Oakland, 1998, p. 87). While English is an important language for Hispanic students, it will sometimes take on the role of a second language in these communities (Codina, 1991; Cortada, 1984). It has been estimated that close to half (48%) of the Hispanic population who are fluent in Spanish are much less proficient in English (Sandoval & Duran, 1998). As with African American students, the test score gap is similar between Hispanic students who reside in predominantly Hispanic communities and those who live in predominantly White communities (Pennock-Roman, 1990).

Some scholars and critics argue that social bias in these tests invariably results in a measurement and predictive bias (Crouse & Trusheim, 1991; Rosser, 1989a). For women and minority students, test scores have less validity in predicting levels of success in college (Rosser, 1989a). The consequence of the measurement bias is the underperformance of students from minority populations on these tests when compared to White students. In relation to women and standardized tests, some researchers have pointed out that these exams underpredict the true abilities of girls (Rosser, 1989a). Phyllis Rosser noted that "the biggest sex differences in SAT score averages-much larger than the national averages for the test as a whole-occurred at the lowest GPA level" (Rosser, 1989b, p. 61). The argument of test bias is compelling when you consider that on average females have lower SAT and ACT scores but consistently receive higher grades in both high school and college (Fleming & Garcia, 1998; Rosser, 1989a). These results highlight the challenges that females from various ethnic groups face on SAT and ACT tests. A number of studies comparing the test scores of men and women have observed that women perform less well on multiple-choice tests than on essay-based assessment instruments (Bell & Hay, 1987; Bolger & Kellaghan, 1990).
These gender test score differences have been observed within various ethnic groups (Mazzeo, Schmitt & Bleistein, 1993). The results have prompted some to conclude, "minority female high school students are doubly penalized by both gender and racial biases of the SAT" (Rosser, 1989a, p. 14).

Admission Decisions with Test Scores

The presumption of test bias serves to limit and reduce the college choices available to women and minority students (Navarro, 1984; Powell & Steelman, 1996). Test scores are an important part of the current college admission criteria. Students with lower test scores have limited opportunities to gain admission to a host of highly selective colleges and universities (Cole, 1987). While these tests play a critical role in controlling access to college, their ability to predict success in college has come under much scrutiny (Crouse & Trusheim, 1991). For all student populations, the best predictors of success in college are high school grade point average and class rank (Crouse & Trusheim, 1991). When test scores are added to these two measurement units, only a small, incremental increase is observed in the model that explains the variation in college GPA.

The research conducted by James Crouse and Dale Trusheim (1991) focused attention on college admission decisions at highly selective colleges and universities. They examined the high school grade point average, class rank and SAT scores for the 1985 and 1986 Dartmouth College freshman classes. Based on the data provided by Dartmouth, Crouse and Trusheim constructed a cross-tabulation table of predicted grade point averages. The results of their study found no significant difference in average GPA when SAT scores were added to high school grades and class rank (Crouse & Trusheim, 1991).

Table 4: Outcomes of high school rank and high school rank plus SAT admissions policies at Dartmouth.

Outcome                High school rank   High school rank plus SAT
Average freshman GPA   3.28               3.28

Source: Crouse and Trusheim, 1991.

The research conducted by Crouse and Trusheim, Rosser, and Codina shows that test scores have a limited ability to predict success in college (Crouse & Trusheim, 1991; Rosser, 1989a, 1989b; Codina, 1984). If these studies have merit, then an important question is: Why do colleges and universities use test scores in the admission decision-making process? The research tends to suggest the following two explanations:

1. Test scores help the admission officer confirm the applicant's ability to perform college-level work.
2. Test scores help to regulate access to college.

For many college admission officers, the use of test scores provides an additional set of data from which they are able to analyze and draw comparisons with the information gained from high school grades and class rank. In this respect, admission officers are using these tests to confirm whether or not the applicant's scores are consistent with the level of ability that is indicated from examining the high school record (Bracey, 1990). In their book, The Case Against the SAT, Crouse and Trusheim concluded that in cases where the SAT altered the admission decision, it usually served to reject African American students and students from low-income families (Crouse & Trusheim, 1988).
Given the concerns about bias in these tests, their limited predictive ability, and their role in controlling access to college, calls for changes in the college admission criteria continue to be heard (Douglass, 1999; Powell & Steelman, 1996). In addition, the challenge of thoroughly and effectively evaluating students who are enrolled in schools with block scheduling curricula, or in schools with restricted or limited grade assessment processes (e.g., schools that have moved away from traditional letter grades), is a key factor that requires an admission alternative to the Carnegie Unit program. The development and validation of the Competency Based Admission program could result in an admission procedure that might serve to reduce or eliminate the reliance on test scores in college admission decisions, while effectively assessing the college readiness skills of students.

Current Admission Challenges

Starting in the latter part of the 1990s, numerous curricular changes began to emerge in K-12 education. The demand from employers for high school graduates who possess basic competencies, parental demands for practical and applied learning skills, and societal concerns about eroding educational standards are the catalysts behind these curricular changes (Newman, 1985). In the future, postsecondary institutions will recognize that high school curricular changes necessitate concomitant modifications in college admission practices. Some states such as Wisconsin and Oregon already have discovered the importance of devising a strategy for establishing alternative admission criteria. These states understand the importance of developing a process or method of evaluating a nontraditional high school performance record in order to maintain a plan of educational accessibility.

To address curricular changes in secondary schools, the Higher Education Governing Board in Wisconsin developed an alternative college admission process called competency-based admission (CBA; University of Wisconsin System, 1994). At the same time, Oregon has focused on college admission by developing a proficiency-based admission standards system (Oregon State System of Higher Education, 1996). These two state-based initiatives have been the driving force behind the development of alternative admission criteria that could serve as a catalyst in redefining college admission and access to higher education in the twenty-first century.

Today, most admission offices use Carnegie Units and standardized test scores as the primary criteria to determine students' access to and probability of success in college (Crouse & Trusheim, 1991). The Carnegie Unit criteria, focusing on high school grades and the number of years of coursework completed in predictor courses, have been commonly accepted as important factors in college admission decisions. Although the use of standardized test scores in admission decisions is as widespread as the use of Carnegie Units, the application of such criteria is more controversial. The use of standardized test scores in the college admission process has been vigorously debated during the last 30 years. In addition, the continued use of Carnegie Units in admission decisions is facing challenges as a result of curricular changes in secondary schools. A number of colleges have recognized this dilemma and have adapted their criteria to allow students to submit personal portfolios for admission consideration (Needle, 1991).
The portfolios often requested by colleges require students to complete the admission application; submit a current high school transcript; provide a personal statement of interest; and submit any supplemental material that may build a case for a positive admission decision (Needle, 1991). While the use of portfolios has been gaining popularity in undergraduate admission, portfolios are becoming increasingly popular in graduate school admission as well (Hagedorn & Nora, 1996; Johnson & Gentry, 2000). In future years, the use of portfolios may play an increasing role in both undergraduate and graduate school admission.

Changes in High School Curricula

Recent changes in high school curricula are creating tension in the traditional college admission and assessment process. Two primary changes pose key challenges to the perpetuation of this admission model:

Changes in grading and graduation requirements. Some K-12 school districts are altering the course requirements for graduation from high school. A number of these requirements have focused on developing specific academic competencies. These school districts are implementing changes in the student evaluation process at lower grade levels (Guskey, 1994; Sperling, 1994; Wiggins, 1994). In a number of these districts, alternative assessment practices that exclude historical grading scales (letter grades ranging from A to F) are used to evaluate learning (Guskey, 1994; Kenney & Perry, 1994). The establishment of any system that disrupts the fundamental college admission balance that relies on letter grades will pose a major challenge to colleges and universities. Any continued shift away from the use of grades to evaluate student success will require new college admission strategies to maintain broad levels of institutional accessibility.

Block scheduling. An increasing number of high schools across the United States are implementing block-scheduling programs that are both intradisciplinary and interdisciplinary in academic composition (Eineder & Bishop, 1997; Skrobarcek et al., 1997). Block scheduling is designed to provide students with an extended class period, thus increasing the probability of their learning the subject and course material. The positive effect of block scheduling on learning has been documented in subjects such as math and science (Fitzpatrick & Mowers, 1997; Skrobarcek et al., 1997). Under most block-scheduling plans, students can complete one year of academic coursework in just one semester. Block scheduling alters the historical calculation of high school grades and the number of completed Carnegie Units. This issue is further complicated when two academic disciplines are combined in one distinct class block. Here, college admission questions arise concerning how to evaluate the grade received (Should the same grade be assigned to each course?) and how to assign Carnegie Units (Should one unit or one-half unit be assigned to each course?). As a result of these structural alterations, block-scheduling issues could provide admission challenges well into the twenty-first century.

Home-Schooling Options

More parents across the United States have decided to forsake the traditional practice of sending their children to established K-12 school districts in favor of educating their youngsters at home. By 1994, more than 250,000 students were educated at home, compared with 15,000 students who were home schooled during the 1970s (Gordon, Russo, & Miles, 1994).
This trend toward home schooling is expected to continue well into the first half of the twenty-first century. During the last decade, the concept of home schooling has gained popularity among parents, the public, and politicians. Some states have developed legislative strategies designed to recognize the growing number of students who are being educated in home environments. In Oklahoma, the state legislature enacted a law providing protection for home schools. This law provides a mechanism for monitoring and regulating the educational environment of every home school (Klicka, 1995). In addition, many states have adopted and currently maintain regulations for home schooling. In fact, twelve states allow home schools to register, operate, and become certified as private schools (Klicka, 1995).

The trend toward home schooling represents a return to the classical period of education in America. As Whitehead and Bird (1987) noted, "From the beginning of this country, education was an area of concern left to parents through home education" (p. 23). In fact, the development of a broad-based public educational system evolved from the early period of private, home-schooling concepts (Klicka, 1995; Whitehead & Bird, 1987).

Whereas the aforementioned changes or innovations were designed to provide academic-enrichment opportunities for students, they place an additional burden on existing resources. This is particularly true when considering the effect that these changes will have on college admission. The increased focus on home schooling and the elimination of grades in secondary schools served as a catalyst for the establishment of the Wisconsin CBA program.

Responding to Admission Challenges

The Wisconsin Model was developed in 1994 by a group of 75 faculty members representing each of the University of Wisconsin two-year and four-year institutions (University of Wisconsin System, 1994). College-competency matriculation requirements were developed in five disciplinary areas: English, math, science, social science, and foreign language. The Wisconsin Model provides high school teachers with the means to evaluate the college-readiness skills of student applicants using the CBA criteria. A form called the Standardized Reporting Profile (SRP) is used to summarize these competencies. While CBA was established as an alternative to the use of the Carnegie Unit admission criteria, little attention has been focused on comparing the admission rates, first semester college grades, and admission decision times associated with each admission process. These outcomes are important to the decision-making process for policy makers as they consider a broad implementation strategy for a program like CBA.

CHAPTER III: METHODS

Introduction

The researcher's purpose in this study was to examine and compare admission rates, first year college grades and the level of resources expended (time) by universities in processing and rendering admission decisions on Carnegie Unit and CBA applications. Chapter III details the methodology used to analyze these data. In addition, this chapter discusses the use of the Multi-Attribute Utility (MAUT) model to determine if CBA results in similar college access outcomes (Admit Rates) when compared to the Carnegie Unit admission program. The MAUT development process for admission decisions and its utility ranking procedure are described in this chapter.
The research hypotheses that highlight the relationship between admit decisions and grades in college, and the time required to make admission decisions, are discussed. Finally, the expected outcomes associated with each research hypothesis are summarized in an expectation matrix.

Research Questions

• Do admission decisions using CBA result in similar access outcomes (Admit Rates) when compared to the traditional admission (Carnegie Unit) process?
• Do predictive relationships exist between the CBA process and first semester college grades? If such a relationship exists, how does it compare to college grades predicted from the traditional admission (Carnegie Unit) process?
• When compared to Carnegie Unit, does CBA represent a time-efficient process for making college admission decisions?

Research Issues

To determine if the CBA program is an effective alternative to the Carnegie Unit admission model, specific outcomes must be measured. The primary comparative outcome for CBA and Carnegie Unit focuses on an analysis of admit rates. The secondary comparative outcomes of this study are an analysis of the correlation of first semester college grades and the processing time associated with final admission decisions. These two secondary issues attempt to answer efficiency and resource allocation questions. In total, these comparative outcomes should provide evidence of which program provides the broadest access to college; which program most effectively predicts how well admitted students will perform during their first semester in college; and which program is most efficient in rendering final admission decisions.

The results of this study could provide evidence concerning the impact a program like CBA might have in maintaining high levels of college access for women and minority students. In addition, as political pressure continues to mount against the use of Affirmative Action strategies in maintaining access to public universities for a diverse group of students, these institutions will face legislative mandates requiring them to use more structured college admission criteria (e.g., California's Proposition 209). If the current results in California are any indication of the impact these political pressures will have on access to college for minority students, then higher education will need to continue to explore alternative admission programs as a way of maintaining high levels of diversity. Since the CBA program moves away from the traditional criteria used to determine college readiness skills, it has the potential to move the issue of access out of the political spotlight.

The Research Study and Population

The University of Wisconsin System developed a pilot program to evaluate the effect of CBA on college admission practices. This pilot program was designed to provide evaluative data with which to compare the CBA process to the Carnegie Unit admission process. The study examined 660 application decisions made by ten universities. The applications were divided equally between Carnegie Unit and CBA materials. The pilot study was conducted over a two-year period in 1996 and 1997. The study was funded by a grant from the U.S. Department of Education Fund for the Improvement of Postsecondary Education (FIPSE). The FIPSE grant, which covered the period from September 1994 through June 1997, provided the financial resources necessary for the development of the University of Wisconsin CBA Subject Competencies and the implementation of the two-year pilot study.
The pilot study involved a selected group of seven high schools from Wisconsin and one from Minnesota (CBA: The Wisconsin Model, 1994). These high schools were selected based on the fact that they were involved in a number of curricular changes that could result in their students facing challenges meeting the Carnegie Unit admission criteria (CBA: The Wisconsin Model, 1994). These high schools were primarily located in rural communities, had similar geodemographic characteristics, and had very little racial and ethnic diversity in their student body (CBA: The Wisconsin Model, 1994). As a result, the request for volunteers to participate in this study yielded students from similar racial backgrounds.

At each high school, teachers and administrators were required to attend workshops and seminars that focused on how to use the CBA Competencies and the SRP form to evaluate college readiness skills. In cases where a student, during the course of their high school years, had been taught by more than one teacher in a specific subject area, the CBA process allowed these teachers to collaborate in making a final competency evaluation. Consequently, a number of high schools set aside release time for teachers to meet and discuss each student involved in the pilot study. Once teachers completed the competency evaluation, high school counselors submitted to the appropriate university a CBA application and a Carnegie Unit application for each student.

Each university was required to set up an independent review process that ensured that these two application processes remained separate. Admission officers were assigned to render decisions on either Carnegie Unit or CBA applications. The process also required these admission officers to make application decisions without being able to view the student's second application (i.e., either the Carnegie Unit or the CBA application). The separate and independent review process ensured that student applicants were held harmless by the admission decision. This provision allowed students to be admitted even though they may have been denied admission by one of the two admission programs. Alternatively, a student was denied admission to a university only when they were denied admission on both their Carnegie Unit and CBA applications. In addition, all public universities were required to document the final admission decision (Admit or Deny) and maintain a detailed log on the amount of time needed to make each decision. These data were compiled and centralized by the University of Wisconsin System for review and evaluation. The data on college grades were collected after students matriculated to a University of Wisconsin institution during the 1996/1997 academic year. The researcher was able to secure these data from the University of Wisconsin System to test several hypotheses.

The CBA Pilot Design

As part of the CBA pilot design, the University of Wisconsin System required public universities to participate in the study. These universities were provided with instructions for handling students submitting CBA evaluation profiles (Supplemental Reporting Profile, or SRP form). So as to have a basis for evaluating the effects of CBA, students submitted two sets of admission materials to each university. These two sets of materials were:

1. The CBA Supplemental Reporting Profile (SRP form): This form contained a summary of teacher evaluations by subject.
Each student applicant received a competency evaluation in each academic subject area (i.e., English, math, science, social science, and foreign language). Teachers evaluated the student's subject-based competencies using the following scale: 5 = Excellent performance; 4 = Very good performance; 3 = Satisfactory performance; 2 = Limited performance; 1 = Poor performance; NBE = No basis for evaluation.

2. The traditional application (Carnegie Unit application) material: This material detailed all grades by class and subject area for each semester and each academic year (academic transcript). In addition, the student's class rank and standardized test scores were included with each application.

The applications examined in this study were submitted to ten University of Wisconsin schools. The data collected during the pilot phase included information on final Carnegie Unit and CBA application decisions and the time needed to make these decisions. In addition, after admitted students matriculated to college, data were collected on grades during each of the first three semesters.

Multi-Attribute Utility Model (MAUT)

MAUT models have been used to help inform the decision-making process (Edwards, 1977). They are useful in helping policy makers finalize decisions that involve program alternatives in a number of important areas. "Examples that demonstrate the diversity of their use include the budgeting decisions of public health officials, the job-choice decisions of job applicants, the student-admission decisions of universities, the plant-location decisions of corporate executives, the funding decisions of federal program administrators, the land-use decisions of regional planning groups, and the personnel-assignment decisions of various military services" (Huber, 1980, p. 47). The Multi-Attribute Utility (MAUT) model is an effective method for evaluating programs with multiple and seemingly unrelated variables. The MAUT model can be used to provide a policy maker with an analysis of outcomes associated with two or more programs by examining specific variables in each program (Edwards, Guttentag & Snapper, 1975). In this study, the primary comparison of Carnegie Unit Admission and Competency Based Admission involves an analysis of final admission decisions. The focus of this examination is to compare admit rates for these two admission programs. In doing so, the MAUT process generates a utility ranking of each program. Here the MAUT evaluation process will identify the admission program that meets the key priorities established by program decision makers. As part of the early development of the CBA program, it was decided that the basis for developing this program was to continue to provide access to college for students completing non-traditional educational programs (CBA: The Wisconsin Model, 1994). As a result of the CBA development process, college access based on admit decisions is the number one reason for conducting a MAUT college access utility analysis.

The MAUT model can be and has been used to make program evaluation decisions among two or more programs (Edwards, 1977; Farmer, 1987). The MAUT process allows programs to be evaluated using a comparison matrix consisting of factors that have a consistent set of measurement outcomes.
The MAUT process is designed so that "if one alternative has a higher (or more preferable) attribute value than any of the other alternatives, that alternative is chosen and the decision process ends" (Yoon & Hwang, 1995, p. 23). In addition, the MAUT model used in this study is designed to provide policy makers the data needed to evaluate the effectiveness of both admission programs. More specifically, the MAUT model will provide a final utility ranking to compare the effectiveness of each admission program in making college access decisions. The MAUT model has been used successfully in applied settings ranging from evaluating the effectiveness of specific government programs to examining land use regulations (Edwards, 1977). In the case of the CBA program and the Carnegie Unit program, the MAUT model will provide University of Wisconsin policy makers with data on student access. The program with the highest score would represent the admission model with the highest utility. This result is only one measurement outcome that policy makers could use in making decisions on these programs. Since the Carnegie Unit model is an established and fully operational program, the results of the MAUT process would not prohibit University of Wisconsin policy makers from deciding to implement both programs rather than pursue a substitution strategy. Indeed, the goals outlined for the CBA pilot involved examining CBA to determine if this new alternative could be used to meet the needs of some students who would have challenges gaining access to college using the Carnegie Unit program (CBA: The Wisconsin Model, 1994).

The strategy behind the MAUT model is focused on determining the value of attributes associated with each alternative. All attributes are assigned or have an established numerical value that is maximized through the use of a mathematical procedure. As a part of the mathematical process, each attribute is assigned a probability weight. While the assigned probability for each attribute will vary, the sum of the assigned attribute values will always be equal to one. This sum of attribute values is achieved through a normalization process that allows each attribute to be scaled based on its value in relationship to the summed value of all attributes. The product of the attribute value and the attribute probability results in a final utility value for each attribute. Once the final utility for each attribute is determined, the final utility for each alternative is calculated by aggregating the utility values of its attributes.

In using the MAUT model to evaluate the two admission programs, Carnegie Unit Admission and Competency Based Admission are examined as competing admission programs. The total applications processed are normalized for each university. This process provides a value for the attribute related to the number of admission applications received by each university. Also, the admit rates observed in each admission program are used as the percentages associated with the probability or the likelihood of an admit decision occurring. The MAUT process involves ten steps for assisting policy makers in making programmatic decisions on program alternatives (Edwards, Guttentag, & Snapper, 1977). The following is the ten-step process used to determine the impact on college access observed in examining each admission program (alternative):

Step 1: The organizations whose utilities are to be maximized are the public universities in Wisconsin.
The focus of this process is centered on the admission offices in these universities.

Step 2: The utilities to be maximized are college admission rates. A direct focus on the admit rates in these universities is established.

Step 3: The examination of a group of applications submitted by students using the Carnegie Unit program and the Competency Based Admission program is the basis for this evaluation.

Step 4: The number of university admission offices in Wisconsin was established at ten based on the collection of data.

Step 5: An aggregate admit rate is established by evaluating all application decisions. In addition, final admit rates for each university are developed by examining the application decisions made in each admission office.

Step 6: The raw weights for each university are based on the number of Carnegie Unit and CBA applications received and processed.

Step 7: The number of applications received by each university (raw weight) is normalized by dividing each raw weight by the sum of the ten weights and then multiplying by 100. Detailed in Table 5 is a listing of results derived from this normalizing process.

Step 8: Prior admit rates, based on historical data, are compared against the posterior (observed) admit rates from the evaluation period. The following are the methods used for determining prior and posterior admit rates:
a. Prior admit probabilities are derived by taking the average Carnegie Unit admit rate for each university during the five-year period prior to the development of the pilot program. These rates are listed in Table 6. A projected number of admits for each university can be calculated by multiplying the prior admit rate by the number of observed admission applications for each program. The projected admits for each university, using the prior admit rates, are listed in Table 7.
b. The posterior probabilities are derived for Carnegie Unit and CBA admit decisions by examining the results of all decisions during the pilot evaluation period. The probability of an admit decision occurring on any admission application is the observed probability (percentage of students admitted) based on the decisions made by each university. These posterior admit rates for Carnegie Unit and CBA applications are listed in Tables 8 and 9, respectively.

Step 9: The utilities are calculated for each university by multiplying the normalized weights by the admit rate (probability). One set of utilities for each university is calculated using prior admit probabilities, while a second set is calculated using the posterior (observed) admit probabilities. This process provides a comparative ranking of Carnegie Unit and CBA admit decisions for each university. The summation across the ten universities provides a final, overall ranking for the two alternative admission programs. These results are summarized in Chapter V.

Step 10: This final step provides an examination of the final utility rankings for the purpose of deciding between the two admission alternatives. The utility rankings and results associated with this method are detailed in Chapter V.

In mathematical terms, the ten-step process can be illustrated by maximizing the following MAUT utility model used in this study:

Expected/Projected MAUT Utility Model (Prior Admit Probabilities)

Ui = Σj (Wj × Pij)

Where:
Ui: Represents the total MAUT utility associated with the ith admission model.
In this study, it represents the total expected MAUT utility associated with the Carnegie Unit admission model or the CBA admission model.
Σj: The sum across all University of Wisconsin institutions.
Wj: The normalized application weight associated with the jth University of Wisconsin institution.
Pij: The prior probability that a student applying for admission using the ith admission model (Carnegie Unit or CBA) will be accepted to the jth University of Wisconsin institution.

Calculated MAUT Utility Model (Posterior Admit Probabilities)

The calculated MAUT utility model is based on using the posterior admit probabilities for each university (Pij) established during the application evaluation period. The calculated MAUT utility model is:

Ui = Σj (Wj × Pij)

Where:
Pij: The posterior probability that a student applying for admission using the ith admission model (Carnegie Unit or CBA) will be accepted to the jth University of Wisconsin institution.

Table 5: MAUT process, step 7.

Institution      Carnegie Unit Applications   CBA Applications   Total Raw Applications   Normalized Application Weight
Institution 1    73                           73                 146                      22
Institution 2    24                           24                 48                       7
Institution 3    42                           42                 84                       13
Institution 4    60                           60                 120                      18
Institution 5    20                           20                 40                       6
Institution 6    11                           11                 22                       3
Institution 7    12                           12                 24                       4
Institution 8    25                           25                 50                       8
Institution 9    13                           13                 26                       4
Institution 10   50                           50                 100                      15
Total            330                          330                660                      100

Table 6: Prior admit probabilities (based on a five-year average for each university).

Institution      Prior Admit Rate
Institution 1    .77
Institution 2    .82
Institution 3    .70
Institution 4    .73
Institution 5    .78
Institution 6    .76
Institution 7    .80
Institution 8    .65
Institution 9    .72
Institution 10   .84
Total            .76

Table 7: Estimated/projected Carnegie Unit and CBA admits (using prior admit probabilities and observed admission applications; projected admits = applications times prior admit rate).

Institution      Prior Admit Rate   Projected Carnegie Unit Admits   Projected CBA Admits
Institution 1    .77                56                               56
Institution 2    .82                19                               19
Institution 3    .70                29                               29
Institution 4    .73                44                               44
Institution 5    .78                16                               16
Institution 6    .76                8                                8
Institution 7    .80                10                               10
Institution 8    .65                16                               16
Institution 9    .72                9                                9
Institution 10   .84                42                               42
Total            .755               249                              249

Table 8: MAUT process, step 8 (Carnegie Unit posterior applications, admits and admit probabilities).

Institution      Applications   Admits   Admit Rate
Institution 1    73             61       .83
Institution 2    24             23       .96
Institution 3    42             33       .79
Institution 4    60             41       .68
Institution 5    20             13       .65
Institution 6    11             6        .55
Institution 7    12             8        .67
Institution 8    25             18       .72
Institution 9    13             11       .85
Institution 10   50             41       .82
Total            330            255      .77

Table 9: MAUT process, step 8 (CBA posterior applications, admits and admit probabilities).

Institution      Applications   Admits   Admit Rate
Institution 1    73             54       .74
Institution 2    24             18       .75
Institution 3    42             29       .69
Institution 4    60             36       .60
Institution 5    20             15       .75
Institution 6    11             9        .82
Institution 7    12             11       .92
Institution 8    25             17       .68
Institution 9    13             10       .77
Institution 10   50             42       .84
Total            330            242      .73
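The calculated utility model reduces to a weighted sum across institutions. The following minimal sketch, which is illustrative and not part of the study's own analysis, applies Ui = Σj (Wj × Pij) to the normalized weights in Table 5 and the posterior admit rates in Tables 8 and 9; the function and variable names are assumptions. The printed aggregates match the values reported in Table 12.

```python
# Minimal sketch of the calculated MAUT utility model Ui = sum_j(Wj * Pij),
# using the normalized application weights (Table 5) and the posterior
# admit rates (Tables 8 and 9). Variable names are illustrative.

# Normalized application weights Wj for the ten institutions (Table 5)
weights = [22, 7, 13, 18, 6, 3, 4, 8, 4, 15]

# Posterior admit probabilities Pij (Tables 8 and 9)
carnegie_rates = [0.83, 0.96, 0.79, 0.68, 0.65, 0.55, 0.67, 0.72, 0.85, 0.82]
cba_rates      = [0.74, 0.75, 0.69, 0.60, 0.75, 0.82, 0.92, 0.68, 0.77, 0.84]

def maut_utility(weights, admit_rates):
    """Aggregate utility: sum of normalized weight times admit rate."""
    return sum(w * p for w, p in zip(weights, admit_rates))

print(round(maut_utility(weights, carnegie_rates), 2))  # 77.18 (Carnegie Unit)
print(round(maut_utility(weights, cba_rates), 2))       # 73.06 (CBA)
```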
Admit Rates and the College Access Hypothesis

Previous CBA research revealed that admission officers had prior experience making Carnegie Unit admission decisions (CBA: The Wisconsin Model, 1994). By contrast, these same admission officers faced a very steep learning curve related to the CBA program (CBA: The Wisconsin Model, 1994). Since the CBA program was developed as a pilot initiative, these admission officers possessed less knowledge and no prior history related to the implications associated with CBA admit decisions. As a result, one would hypothesize that these admission officers would tend to be more selective in making CBA admit decisions. On the other hand, the Carnegie Unit admission model represents a familiar program with a detailed history related to the effects associated with admit decisions. In the case of Carnegie Unit admission, we would expect admission officers to be less selective, resulting in higher admit rates and a higher MAUT utility ranking associated with these admit decisions.

Admit rates and the time needed to make decisions on each type of application (Carnegie Unit and CBA) are an important part of determining the efficiency associated with each admission program. In evaluating admit rates for each admission model, an analysis of all decisions is compared to all admit (acceptance) decisions. This analysis will provide a gauge of how selective the CBA program is when compared to the Carnegie Unit program. As a result, analyses exploring the variance that may exist in admit rates between the two admission programs are conducted. Since the Carnegie Unit admission program is the standard for evaluating any new program, the degree to which significant differences exist in admit rates will determine the overall usefulness of the CBA program. The MAUT model is used to estimate and evaluate this hypothesis.

Null Hypothesis 1: There is no significant difference in the Multi-Attribute Utility rankings derived from admit rates between the Carnegie Unit admission program and the CBA admission program.

Alternative Hypothesis 1: There is a significant difference in the Multi-Attribute Utility rankings derived from admit rates between the Carnegie Unit admission program and the CBA admission program.

College Grades Hypothesis

The CBA process serves as a decision-making model for determining access and enrollment in University of Wisconsin four-year, public institutions. Given this issue, it is important to evaluate how well CBA results compare with those obtained using the Carnegie Unit admission process. In thinking about this analysis, the comparison must maintain a focus on the primary reason students are admitted to college. In essence, admission is provided to students who have the potential and ability to be successful and ultimately persist to graduation. Examining the grades received in college is a common method used for measuring student success. The Carnegie Unit Grade Point Average (GPA) model has been validated as a measurement tool for predicting levels of success in college (Willingham, Lewis, Morgan & Ramist, 1990; Beecher & Fischer, 1999). The GPA model associated with Carnegie Unit admission is designed to take the high school GPA, test scores (ACT/SAT) and class rank to estimate how well an applicant will perform in college courses. Alternatively, in order for CBA to be a valid process for predicting success in college, the SRP scores should have a corresponding relationship to college GPA. Using the CBA admission model and the rating scale listed on the SRP form, the satisfactory score would seem to represent the minimum requirement needed to ensure an adequate level of college success. Furthermore, the scores received on the SRP form should correlate to the grades that students received in their first semester in college. A comparison of the SRP scores by subject area and the college grades received in each subject will provide a method for determining how much of the first semester GPA can be explained by SRP scores. If this analysis yields a meaningful correlation, it will provide a method for estimating college success through a CBA predictive model. At any rate, the fact that the Carnegie Unit GPA model has been validated as a successful method for predicting success in college could result in an expectation that this process would explain more of the variation in grades than a similar method associated with the CBA program. Therefore, the second hypothesis concerns whether a significant difference exists in the first semester grade/SRP correlations for CBA and the GPA correlations for the Carnegie Unit admission process. This correlation would allow us to determine what portion of the variance in first semester grades is explained using each model. While this analysis is a measurement process for determining success in college, it also serves as a way of determining if the CBA process will lead to admission outcomes that are similar to or vary from the Carnegie Unit process. A grade variance and correlation analysis is used to evaluate this hypothesis.
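To make the explained-variance comparison concrete, the following is a minimal sketch, assuming numpy is available; it is not the study's analysis code. The SRP ratings, the Carnegie Unit predictors (high school GPA, ACT composite, class-rank percentile), and the first semester GPA values are randomly generated placeholders rather than the pilot data, so the printed R-squared values are only a demonstration of the mechanics.

```python
# Minimal sketch of the explained-variance (R^2) comparison for the
# grades hypothesis. All inputs below are hypothetical placeholders.
import numpy as np

def explained_variance(X, y):
    """R^2 from an ordinary least squares fit of y on X (with an intercept)."""
    X = np.column_stack([np.ones(len(y)), X])      # add intercept column
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS coefficients
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(0)
n = 200
# CBA model predictors: five SRP ratings (1-5 scale) per student
srp = rng.integers(1, 6, size=(n, 5)).astype(float)
# Carnegie Unit model predictors: HS GPA, ACT composite, class-rank percentile
carnegie = np.column_stack([rng.uniform(2.0, 4.0, n),
                            rng.integers(15, 33, n).astype(float),
                            rng.uniform(10, 99, n)])
gpa = rng.uniform(1.5, 4.0, n)   # first semester GPA placeholder

print(explained_variance(srp, gpa))       # R^2 for the CBA model
print(explained_variance(carnegie, gpa))  # R^2 for the Carnegie Unit model
```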
The following null and alternative research hypotheses are evaluated:

Null Hypothesis 2: There is no significant difference in the explained grade variance for CBA admitted students and Carnegie Unit admitted students.

Alternative Hypothesis 2: There is a significant difference in the explained grade variance for CBA admitted students and Carnegie Unit admitted students.

Admission Decision Times Hypothesis

During the CBA pilot study, University of Wisconsin institutions were asked to gather and maintain data on the processing time needed to make CBA and Carnegie Unit admission decisions. An analysis was conducted to determine (a) the level of resources (time) necessary to render a CBA admission decision on student applicants and (b) the level of resources (time) required to process Carnegie Unit admission application materials. To address the issue of time and resource efficiency, several analyses that look at both the aggregate times and the times associated with each university are required. Again, an expectation exists that, due to a lack of familiarity and knowledge of the CBA process, admission officers will not be able to make time-efficient decisions that compare favorably to Carnegie Unit decisions. This expectation of application decision times is focused on (1) aggregate decisions for all universities and (2) decision outcomes associated with each university. A two-way Analysis of Variance (ANOVA) is used to determine whether significant differences existed in application processing time between the Carnegie Unit application process and the CBA application process. This analysis is performed to test whether differences in the means for each program exist.
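As a rough illustration of how such an ANOVA could be set up (the dissertation itself does not include analysis code), the sketch below assumes pandas and statsmodels are available; the column names and the handful of records are hypothetical stand-ins for the universities' decision-time logs.

```python
# Minimal sketch of a two-way ANOVA on admission decision times, with
# admission program (Carnegie Unit vs. CBA) and university as factors.
# The records below are hypothetical, not the pilot study's logs.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

decisions = pd.DataFrame({
    "program":    ["Carnegie", "Carnegie", "CBA", "CBA", "Carnegie", "CBA"],
    "university": ["Inst1",    "Inst2",    "Inst1", "Inst2", "Inst1", "Inst1"],
    "minutes":    [6.5,        5.0,        9.0,    8.5,     7.0,     10.0],
})

# Fit an OLS model with main effects for program and university,
# then produce the ANOVA table of F statistics and p-values.
model = smf.ols("minutes ~ C(program) + C(university)", data=decisions).fit()
print(anova_lm(model, typ=2))
```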
Hypothesis 3 concerns whether a significant difference existed in the processing time needed to make admission decisions on the aggregate number of Carnegie Unit and CBA admission applications.

Null Hypothesis 3: There is no significant difference between the amount of time expended processing Carnegie Unit admission applications and the amount of time expended processing CBA admission applications.

Alternative Hypothesis 3: There is a significant difference between the amount of time expended processing Carnegie Unit admission applications and the amount of time expended processing CBA admission applications.

Summary of Research Hypotheses

As a result of the previous discussion in this chapter, the following hypotheses are derived:

H1: Admit rates and the corresponding MAUT utility will be higher for Carnegie Unit admission when compared to CBA admission.

H2: The Carnegie Unit model explains a higher amount of the variation in first semester grades.

H3: The time needed to make decisions on CBA applications will be higher than the time needed to make decisions on Carnegie Unit applications.

A summary of the expected outcomes stated in each hypothesis is listed in an Expectation Matrix (see Table 10).

Table 10: Expectation Matrix.

Variable                               Highest Score/Ranking   Lowest Score/Ranking
Admit Rates (Selectivity)              Carnegie Unit           CBA
1. Explained Grade Variation           Carnegie Unit           CBA
2. Grade Correlation by Subject Area
   a. Math                             Carnegie Unit           CBA
   b. English                          Carnegie Unit           CBA
   c. Science                          Carnegie Unit           CBA
   d. Social Science                   Carnegie Unit           CBA
   e. Foreign Language                 Carnegie Unit           CBA
Admission Decision Time                CBA                     Carnegie Unit

CHAPTER IV: RESULTS AND ANALYSIS OF DATA

Introduction

Detailed in Chapter IV are the research results. Listed in this chapter are tables and charts that allow for an analysis of each hypothesis. In some cases, the results do not allow for the null hypothesis to be rejected. In other cases, the results are significant and accordingly allow for the rejection of the null hypothesis.

Research Results

Hypothesis 1

Null Hypothesis 1: There is no significant difference in the Multi-Attribute Utility rankings derived from admit rates between the Carnegie Unit admission program and the CBA admission program.

The expected/projected MAUT utilities associated with each university are listed in Table 11. These utilities are the results that would be expected based on the use of prior admit probabilities (five-year average admit rates for each university) and the normalized application weights for each university. The expected results are compared to the results generated from calculating the utility model using posterior admit probabilities for the Carnegie Unit admission program and the CBA admission program.

Table 11: Expected MAUT college access utilities (based on prior admit probabilities).

Institution      Normalized Weights   Prior Admit Rate   Expected MAUT Utility (prior admit rate x normalized weight)
Institution 1    22                   .77                16.94
Institution 2    7                    .82                5.74
Institution 3    13                   .70                9.1
Institution 4    18                   .73                13.14
Institution 5    6                    .78                4.68
Institution 6    3                    .76                2.28
Institution 7    4                    .80                3.2
Institution 8    8                    .65                5.2
Institution 9    4                    .72                2.88
Institution 10   15                   .84                12.60
Total            100                  .76                75.76

The Carnegie Unit admission program has a higher aggregate utility (77.18) when compared to the CBA program (73.06). This result indicates that the Carnegie Unit admission program does a better job of providing broad access to college based on aggregate admission decisions. The results of this analysis are detailed in Table 12. These results show variations in MAUT rankings among several universities. In fact, four universities had CBA utility rankings that were higher than their Carnegie Unit utility rankings (Universities 5, 6, 7 and 10). The most significant difference in utility rankings among these four institutions was observed in University 7.
The CBA utility ranking for University 7 was 3.68, compared to 2.68 for the Carnegie Unit utility ranking.

Table 12: MAUT college access results (based on posterior admit probabilities).

Institution       Normalized    Carnegie Unit    Carnegie Unit Utility Ranking           CBA Admit    CBA Utility Ranking
(University)      Weight        Admit Rate       (Carnegie Unit Admit Rate x Weight)     Rate         (CBA Admit Rate x Weight)
Institution 1     22            .83              18.26                                   .74          16.28
Institution 2     7             .96              6.72                                    .75          5.25
Institution 3     13            .79              10.27                                   .69          8.97
Institution 4     18            .68              12.24                                   .60          10.80
Institution 5     6             .65              3.90                                    .75          4.50
Institution 6     3             .55              1.65                                    .82          2.46
Institution 7     4             .67              2.68                                    .92          3.68
Institution 8     8             .72              5.76                                    .68          5.44
Institution 9     4             .85              3.40                                    .77          3.08
Institution 10    15            .82              12.30                                   .84          12.60
Aggregate Utility               77.18 (Carnegie Unit)                                    73.06 (CBA)
Final Rank                      1 (Carnegie Unit)                                        2 (CBA)

A comparison of the expected and observed MAUT utility results for Carnegie Unit admission and CBA admission is listed in Tables 13 and 14, respectively. The comparison of the expected and observed utilities shows variation associated with each university. In some cases, the expected utility model under-predicted the observed results, while in other cases the expected utility model over-predicted the results derived from the calculated model. However, for University 10, the expected utility predicted the actual CBA MAUT utility exactly. At the same time, the expected utility model over-predicted the Carnegie Unit MAUT utility for University 10 by .30.

Table 13: Differences in expected and calculated MAUT utilities for the Carnegie Unit admission program.

Institution       Expected MAUT Utilities    Calculated Carnegie Unit MAUT          Differences
(University)      (Prior Probabilities)      Utilities (Posterior Probabilities)    (Calculated minus Expected)
Institution 1     16.94                      18.26                                  1.32
Institution 2     5.74                       6.72                                   .98
Institution 3     9.10                       10.27                                  1.17
Institution 4     13.14                      12.24                                  (.90)
Institution 5     4.68                       3.90                                   (.78)
Institution 6     2.28                       1.65                                   (.63)
Institution 7     3.20                       2.68                                   (.52)
Institution 8     5.20                       5.76                                   .56
Institution 9     2.88                       3.40                                   .52
Institution 10    12.60                      12.30                                  (.30)
Total             75.76                      77.18                                  1.42

Table 14: Differences in expected and calculated MAUT utilities for the CBA admission program.

Institution       Expected MAUT Utilities    Calculated CBA MAUT                    Differences
(University)      (Prior Probabilities)      Utilities (Posterior Probabilities)    (Calculated minus Expected)
Institution 1     16.94                      16.28                                  (.66)
Institution 2     5.74                       5.25                                   (.49)
Institution 3     9.10                       8.97                                   (.13)
Institution 4     13.14                      10.80                                  (2.34)
Institution 5     4.68                       4.50                                   (.18)
Institution 6     2.28                       2.46                                   .18
Institution 7     3.20                       3.68                                   .48
Institution 8     5.20                       5.44                                   .24
Institution 9     2.88                       3.08                                   .20
Institution 10    12.60                      12.60                                  ---
Total             75.76                      73.06                                  (2.70)

The aggregate MAUT utilities derived from the expected utility models for both Carnegie Unit and CBA were 75.76. These compare to a calculated aggregate Carnegie Unit utility of 77.18 and a calculated aggregate CBA MAUT utility of 73.06. In the case of Carnegie Unit admission, the expected MAUT utility model under-predicted the actual MAUT results by 1.42, a difference of 1.9%. The expected MAUT utility model over-predicted the actual CBA MAUT utility results by 2.70, a difference of 3.6%. These results show that the CBA program provided less admission access than would normally be expected, while the Carnegie Unit program provided more admission access than would normally be expected.
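The percentage differences quoted above follow from dividing each program's calculated-minus-expected difference by the expected aggregate utility of 75.76. A minimal sketch of that arithmetic, using only the totals from Tables 13 and 14, is shown below (illustrative only).

```python
# Illustrative arithmetic for the expected-versus-calculated comparison
# (aggregate values taken from Tables 13 and 14).

expected_aggregate = 75.76
calculated = {"Carnegie Unit": 77.18, "CBA": 73.06}

for program, observed in calculated.items():
    diff = observed - expected_aggregate
    pct = diff / expected_aggregate * 100
    print(f"{program}: difference = {diff:+.2f} ({pct:+.1f}% of the expected utility)")
# Carnegie Unit: +1.42 (+1.9%); CBA: -2.70 (-3.6%)
```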
As a result of the aggregate utility ranking for all universities, sufficient evidence exists to allow for the rejection of the null hypothesis. These admit utility rankings show that the Carnegie Unit program provided a broader level of college access than the utility results derived from the CBA program.

Hypothesis 2

Null Hypothesis 2: There is no significant difference in the explained grade variance for CBA admitted students and Carnegie Unit admitted students.

The results highlighted in Table 15 show that the CBA model resulted in first semester grade correlations that were comparable to the grade correlations observed from the Carnegie Unit model. The correlations in Table 15 represent the percentage of variation in grades explained by each model. The combination of all subject areas and CBA Standardized Reporting Profile (SRP) scores explained a similar amount of the variation in overall grade point average when compared to the Carnegie Unit model that included high school GPA, ACT composite score and class rank (see Table 15). The CBA model explained 47% of the variation in first semester grade point average, and the Carnegie Unit model explained 48% of the variation in first semester GPA. Based on the explained variation in grades of each admission model, the evidence is not compelling enough to allow for the rejection of the null hypothesis.

Table 15: First semester GPA correlation weights by subject area.

Subject Area                              Carnegie Unit Model    CBA Model
Math                                      .37                    .31
English                                   .49                    .40
Science                                   .49                    .45
Social Science                            .26                    .33
Foreign Language                          .32                    .37
Aggregate Correlation for all Subjects    .48                    .47

In examining the results listed in Table 15, the CBA model explained a larger portion of the variation in first semester grades in the social science and foreign language subjects. At the same time, the Carnegie Unit model explained a larger portion of the variation in English, science and math grades. The higher Carnegie Unit values in English, math and science could indicate that the use of standardized test scores provides an added level of predictive power in estimating first semester grades. The notion that these standardized exams add to the predictive power of the Carnegie Unit model is consistent with the composition of these tests. Both the ACT and the SAT test student knowledge in English, math and science. These two tests devote only a limited amount of material to the social sciences, and there is no testing of foreign language skills. The lack of attention given to the social sciences and foreign language could explain why the CBA model had a higher predictive value in these subjects when compared to the Carnegie Unit model.

Table 16: Comparison of SRP satisfactory ratings and first year college GPA.

Number of Competencies Rated    Number of    First Year GPA
Satisfactory or Higher          Students     (Mean)
0                               25           2.47
1                               26           2.57
2                               36           2.90
3                               25           2.87
4                               35           3.19
Total                           147          2.85

In addition, the statistics listed in Table 16 provide an analysis of the mean GPA at the end of the first year for students who received a rating of satisfactory or higher in subject areas listed on the CBA-SRP form. This table shows a direct relationship between the number of SRP areas with a rating of satisfactory or higher and first year grade point average. The direct relationship between GPA and the number of satisfactory SRP ratings summarized in Table 16 is an important first step in helping to validate the use of the CBA program as a viable alternative in predicting student success in college.
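To make the Table 16 relationship concrete, the brief sketch below computes the correlation between the number of satisfactory competencies and the group mean GPA, along with the enrollment-weighted overall mean, from the published group-level figures. This is only a coarse, group-level illustration; a student-level validation would require the underlying records, which are not reproduced here.

```python
import numpy as np

# Group-level summary from Table 16: number of SRP competencies rated
# satisfactory or higher versus mean first-year GPA and group size.
competencies = np.array([0, 1, 2, 3, 4])
mean_gpa     = np.array([2.47, 2.57, 2.90, 2.87, 3.19])
n_students   = np.array([25, 26, 36, 25, 35])

# Correlation of the group means (a coarse, group-level check only).
r = np.corrcoef(competencies, mean_gpa)[0, 1]
print(f"Correlation between competency count and mean first-year GPA: r = {r:.2f}")

# Enrollment-weighted overall mean, close to the reported total of 2.85
# (the small gap reflects rounding in the published group means).
overall = np.average(mean_gpa, weights=n_students)
print(f"Weighted mean first-year GPA across all 147 students: {overall:.2f}")
```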
Furthermore, Table 17 shows a strong correlation at the end of three semesters between the number of SRP areas rated satisfactory or higher and both grade point average and credits attempted. This initial result indicates that, for the pilot study, the SRP teacher evaluations of students were a fairly accurate representation of their level of preparation for college. Since the SRP form is to the CBA process what the high school transcript is to the Carnegie process, this is an important finding. Ultimately, the validation of the CBA program rests on how well it is able to predict each student's level of preparation for college. As a result, additional research that compares the SRP satisfactory scores and college grades of future CBA applicants would be necessary to validate the CBA program. However, the results of the pilot study do show a positive relationship between the SRP satisfactory ratings and the level of student preparation for college. Table 17 also highlights a strong relationship between the number of SRP areas rated satisfactory or higher and the percentage of students with end-of-third-semester grade point averages of 2.0 or higher.

Table 17: Comparison of SRP satisfactory ratings, GPA and completed course credits.

Number of Competencies Rated    Number of    Third Semester    Percentage with GPA    Mean Credits
Satisfactory or Higher          Students     GPA               of 2.0 or Higher       Attempted
0                               25           2.45              74%                    37.4
1                               26           2.58              84%                    40.9
2                               36           2.85              89%                    42.1
3                               25           2.83              95%                    44.1
4                               35           3.15              100%                   44.5
Total                           147          2.82              90%                    42.2

The statistics show that students receiving scores of satisfactory or higher on the SRP form have a higher level of success in college based on grade point averages. These data also show that when members of the pilot group received a satisfactory or higher rating in four of the five competency areas, these students performed at a much higher level than their peers who received no satisfactory rating (first year: 3.19 mean GPA versus 2.47 mean GPA). In addition, all students in the group with a rating of satisfactory or higher in four of the five competency areas had grade point averages higher than 2.0 (Table 17), compared with 74% of students with no satisfactory rating.

Hypothesis 3

Null Hypothesis 3: There is no significant difference between the amount of time expended processing Carnegie Unit admission applications and the amount of time expended processing CBA admission applications.

The use of a two-way Analysis of Variance (ANOVA) allowed for the evaluation of the main effects highlighted in this hypothesis (Shavelson, 1988; Bryk & Raudenbush). This statistical method provided a thorough examination of the mean processing times associated with each university and each admission model. In addition, the use of the ANOVA provided an analysis of the interaction between universities and admission models in this study. The ANOVA performed to measure the effect of university and application type on the time required to make admission decisions yielded important time-efficiency results. Displayed in Table 18 are the descriptive statistics (means, standard deviations, and cell sizes) for Carnegie Unit admission decisions. Listed in Table 19 are the comparative descriptive statistics for CBA decisions.

Table 18: Time needed to make decisions on Carnegie Unit admission applications.
Institution       Mean     SD      n
Institution 1     4.68     4.02    73
Institution 2     7.71     4.63    24
Institution 3     5.07     3.37    42
Institution 4     4.93     .52     60
Institution 5     7.60     3.14    20
Institution 6     8.64     2.58    11
Institution 7     3.42     1.16    12
Institution 8     3.76     1.56    25
Institution 9     5.69     3.28    13
Institution 10    10.34    5.03    50
Total             6.09     4.05    330

The means in the tables are reported in minutes. For all but one university, the mean time expended to make Carnegie Unit admission decisions was less than the time needed to make CBA decisions. The one university (Institution 10) with a mean CBA time that was less than the mean Carnegie Unit time differed significantly from the other universities in the sample. The mean CBA time for that university was 2.84 minutes with a standard deviation of .87 minutes, whereas the mean Carnegie Unit time was 10.34 minutes with a standard deviation of 5.03 minutes. These results show a high level of efficiency in processing CBA applications. If admission officers at University 10 developed a high degree of comfort in examining the SRP forms, this familiarity may have translated into efficient processing of CBA applications. At the same time, the 10.34 minutes needed to process Carnegie Unit applications is 1.70 minutes greater than the time recorded for the next highest university in the study. It appears that something in the decision-making process for University 10 resulted in the high Carnegie Unit processing time and low CBA processing time. Further on-site research at University 10 might provide important insight concerning both the CBA and Carnegie Unit results.

Two other universities showed important time differences in processing each kind of admission application. University 4 had a mean time of 4.93 minutes with a standard deviation of .52 for processing Carnegie Unit applications, and a mean time of 15.47 minutes with a standard deviation of 10.26 for processing CBA applications. In addition, University 8 had a mean time of 3.76 minutes with a standard deviation of 1.56 for processing Carnegie Unit applications, and a mean time of 9.20 minutes with a standard deviation of 6.89 for processing CBA applications.

Listed in Table 20 are data on all applications (Carnegie Unit and CBA) processed by each university. This table includes the mean times, standard errors, and confidence intervals for all admission decisions.

Table 19: Time needed to make decisions on CBA admission applications.

Institution       Mean     SD       n
Institution 1     6.27     5.16     73
Institution 2     8.46     6.62     24
Institution 3     8.69     3.26     42
Institution 4     15.47    10.26    60
Institution 5     10.00    8.58     20
Institution 6     10.00    4.47     11
Institution 7     5.17     1.75     12
Institution 8     9.20     6.89     25
Institution 9     8.46     2.40     13
Institution 10    2.84     .87      50
Total             8.51     7.30     330

Table 20: Aggregate admission application decision time (Carnegie Unit and CBA).

                                           95% Confidence Interval
Institution       Mean      Std. Error    Lower Bound    Upper Bound
Institution 1     5.479     .470          4.557          6.402
Institution 2     8.083     .820          6.474          9.693
Institution 3     6.881     .620          5.664          8.098
Institution 4     10.200    .518          9.182          11.218
Institution 5     8.800     .898          7.037          10.563
Institution 6     9.318     1.211         6.941          11.695
Institution 7     4.292     1.159         2.016          6.568
Institution 8     6.480     .803          4.903          8.057
Institution 9     7.077     1.114         4.890          9.264
Institution 10    6.590     .568          5.475          7.705

Data from the ANOVA assisted in evaluating Null Hypothesis 3, related to the time required to process CBA and Carnegie Unit applications.
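The ANOVA reported in the next section can be outlined with standard statistical software. The sketch below shows one way to fit the main-effects model with Python's statsmodels package, assuming a long-format table of per-application decision times; the column names (institution, app_type, decision_time) and the toy rows are hypothetical stand-ins for the study's 660 records, not the actual data.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical long-format records: one row per processed application.
# The real data set contained 660 decisions (330 Carnegie Unit, 330 CBA).
records = pd.DataFrame({
    "institution":   ["Inst1", "Inst1", "Inst2", "Inst2", "Inst1", "Inst2"],
    "app_type":      ["Carnegie", "CBA", "Carnegie", "CBA", "CBA", "Carnegie"],
    "decision_time": [4.7, 6.3, 7.7, 8.5, 5.9, 7.2],   # minutes
})

# Main-effects ANOVA corresponding to the APPTYPE and UWSCHOOL terms
# reported in Table 21; an interaction term could be added with '*'.
model = ols("decision_time ~ C(app_type) + C(institution)", data=records).fit()
print(sm.stats.anova_lm(model, typ=2))
```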
In testing the null hypothesis, the ANOVA showed a strong F test result for the time required to make admission decisions across application types (CBA and Carnegie Unit applications). The F test, computed at the .05 alpha level, yielded an F value equal to 9.26 and an eta-squared value of .125 (Table 21). The value of the F test at this alpha level was statistically significant, indicating that a significant difference existed in the mean time required to make CBA and Carnegie Unit admission decisions. As a result, this analysis provided the evidence necessary to reject the null hypothesis of no significant difference between the time expended processing CBA applications and the time expended processing Carnegie Unit applications. An F value of 29.99 was calculated for the main effect of application type on admission decision time (see Table 21).

Table 21: Results of ANOVA test of between-subject effects.

Source             Sum of Squares    df     Mean Square    F          Sig.    Eta Squared    Noncent. Parameter    Observed Power
Corrected model    2986.711          10     298.671        9.262      .000    .125           92.623                1.000
Intercept          23449.596         1      23449.596      727.215    .000    .528           727.215               1.000
APPTYPE            967.274           1      967.274        29.997     .000    .044           29.997                1.000
UWSCHOOL           2019.437          9      224.382        6.958      .000    .088           62.626                1.000
Error              20927.488         649    32.246
Total              59071.000         660
Corrected total    23914.198         659

Notes: Dependent variable: admission decision time. R squared = .125 (adjusted R squared = .111).

Further, an examination of the interaction effect between university and application type for admission decision time resulted in an F value of 15.80. In addition, a post hoc Tukey analysis of mean differences was used to highlight between-university differences in the time required to make admission decisions (see Table 22). Differences in mean admission decision-making time were observed between a number of universities in this study (e.g., Institutions 4 and 10 exhibited differences that cut across multiple universities). This pairwise analysis indicated that the means for University 4 varied significantly (at the .05 alpha level) from the means of all nine other universities analyzed in this study. Similarly, the means for University 10 varied significantly from the means of six other universities.

Table 22: Results of Tukey's multiple comparison of mean differences (only differences significant at the .05 level are shown; dashes indicate non-significant comparisons).

Institution       1        2        3        4         5        6        7        8        9        10
Institution 1     -        -        -        -9.19     -        -        -        -        -        -
Institution 2     -        -        -        -7.01     -        -        -        -        -        5.62
Institution 3     -        -        -        -6.78     -        -        -        -        -        5.85
Institution 4     9.19     7.01     6.78     -         5.47     5.47     10.30    6.27     7.01     12.63
Institution 5     -        -        -        -5.47     -        -        -        -        -        7.16
Institution 6     -        -        -        -5.47     -        -        -        -        -        7.16
Institution 7     -        -        -        -10.30    -        -        -        -        -        -
Institution 8     -        -        -        -6.27     -        -        -        -        -        6.36
Institution 9     -        -        -        -7.01     -        -        -        -        -        -
Institution 10    -        -5.62    -5.85    -12.63    -7.16    -7.16    -        -6.36    -        -
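The pairwise differences in Table 22 line up with the per-institution CBA means reported in Table 23 (for example, 15.47 minus 2.84 equals 12.63 for Institutions 4 and 10), which suggests the post hoc comparison operated on decision times grouped by institution. The sketch below shows how such a Tukey HSD comparison could be produced with statsmodels; the input arrays are hypothetical placeholders, not the study's data.

```python
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical per-application decision times (minutes) and the institution
# that processed each application; the study analyzed 330 CBA decisions.
decision_times = [6.3, 5.1, 7.0, 15.5, 16.2, 14.8, 2.8, 2.9, 3.1]
institutions   = ["Inst1"] * 3 + ["Inst4"] * 3 + ["Inst10"] * 3

# Tukey's HSD tests every pair of institution means at the .05 level,
# mirroring the structure of Table 22.
result = pairwise_tukeyhsd(endog=decision_times, groups=institutions, alpha=0.05)
print(result.summary())
```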
Table 23: Results of ANOVA test for CBA admission application decision time for each institution.

Institution       N      Mean     SD       Std. Error    95% CI Lower Bound    95% CI Upper Bound    Min    Max
Institution 1     73     6.27     5.16     .60           5.07                  7.48                  1      29
Institution 2     24     8.46     6.62     1.35          5.66                  11.25                 1      30
Institution 3     42     8.69     3.26     .50           7.67                  9.71                  2      15
Institution 4     60     15.47    10.26    1.32          12.82                 18.12                 2      50
Institution 5     20     10.00    8.58     1.92          5.98                  14.02                 5      45
Institution 6     11     10.00    4.47     1.35          7.00                  13.00                 5      15
Institution 7     12     5.17     1.75     .51           4.06                  6.28                  2      10
Institution 8     25     9.20     6.89     1.38          6.36                  12.04                 2      25
Institution 9     13     8.46     2.40     .67           7.01                  9.91                  5      10
Institution 10    50     2.84     .87      .12           2.59                  3.09                  2      5
Total             330    8.51     7.30     .40           7.72                  9.30                  1      50

A summary of the results of all three hypotheses is found in Table 24. This table highlights the highest and lowest ranking admission program for each variable. These results are detailed using the Expectation Matrix table found in Chapter III. The ranking of the highest and lowest admission program in explaining variation in first semester grades was indeterminable; both Carnegie Unit and CBA explained a similar percentage of grade variation. In examining the grade correlations by subject area, differences between the expectation matrix and the actual results are found in social science and foreign language. The Carnegie Unit program had a higher expected grade correlation for these two subject areas. These results proved to be the opposite of what was expected from the model.

Table 24: Results of Expectation Matrix.

Variable                                Highest Score/Ranking    Lowest Score/Ranking
Admit Rates                             Carnegie Unit            CBA
1. Explained Grade Variation            Indeterminable           Indeterminable
2. Grade Correlation by Subject Area
   a. Math                              Carnegie Unit            CBA
   b. English                           Carnegie Unit            CBA
   c. Science                           Carnegie Unit            CBA
   d. Social Science                    CBA                      Carnegie Unit
   e. Foreign Language                  CBA                      Carnegie Unit
Admission Decision Time                 CBA                      Carnegie Unit

Listed in Table 25 is a decision-making matrix that a policy maker could use in making final decisions concerning each admission program. The MAUT utility ranking and utility scores for each program are listed along with the explained grade variation, the time needed to make final admit decisions, and the time needed to make decisions on all applications (admit and deny decisions).

Table 25: Alternative admission program policy decision-making matrix.

Admission Model               MAUT Utility Ranking    Explained Variation    Mean Admit        Mean Decision
                              (Admit Utility)         in GPA (Percentage)    Decision Time     Time (Aggregate)
Carnegie Unit Admission       1 (77.18)               48%                    6.08 Minutes      6.09 Minutes
Competency Based Admission    2 (73.06)               47%                    7.41 Minutes      8.51 Minutes

CHAPTER V: SUMMARY, CONCLUSIONS, IMPLICATIONS AND RECOMMENDATIONS FOR FUTURE RESEARCH

Introduction

The discussion of the results is detailed in several sections of this chapter. The chapter starts with a summary of the research hypotheses. A summary and discussion of the findings and conclusions for each hypothesis follows. This is followed by a discussion of the implications of this study. Finally, the direction for future research is discussed and a listing of future research summary points is outlined.

Hypotheses

The hypotheses that were tested are the following:

• Admit rates and the corresponding MAUT utility will be higher for Carnegie Unit admission when compared to CBA admission.
• The Carnegie Unit model explains a higher amount of the variation in first semester grades.
• The time needed to make decisions on CBA applications will be higher than the time needed to make decisions on Carnegie Unit applications.
Findings and Conclusions

Hypothesis 1 states: Admit rates and the corresponding MAUT utility will be higher for Carnegie Unit admission when compared to CBA admission.

The results of the MAUT utility ranking showed that the Carnegie Unit admission program had a higher aggregate utility value than the CBA admission program. Therefore, the hypothesis is accepted. The results were expected based on the hypothesis related to admission access. Furthermore, these results shed light on differences between the two programs in evaluating admit decisions. The CBA program yielded a lower admit utility ranking for six of the ten universities. However, the disaggregated results do indicate that a high level of access is maintained in four universities by the CBA program that is not present in the Carnegie Unit program. In dissecting these results, it could be argued that a student with a lower Carnegie Unit profile (i.e., grades and test scores) would have a better chance of gaining admission to one of these four universities if s/he possesses an average or better-than-average CBA profile (i.e., SRP evaluation scores). On the other hand, students with a higher Carnegie Unit profile would be better served applying for admission to one of the six universities where the Carnegie Unit program had a higher MAUT utility ranking.

The admit rates for all universities are detailed in Table 26. The data show a close admit ratio between the two admission programs for University 10. The Carnegie Unit admit rate for this university was 82% and the CBA admit rate was 84%. These results compare favorably to the MAUT utility rankings calculated for University 10: the Carnegie Unit MAUT utility was 12.30 and the CBA MAUT utility was calculated at 12.60. In addition, the results indicate that not only does the CBA model lead to similar access outcomes when compared to the Carnegie Unit model, it would also result in a small increase in admission access for University 10. The difference in admit rates (variation) for this institution indicates that, over time, 2 more students out of 100 would be granted admission using the CBA model than using the Carnegie Unit model.

Table 26: Admit rates and variations in admit rates for the Carnegie Unit model and the CBA model.

University        Carnegie Unit    CBA Admit    Admit Rate Variation
(Institution)     Admit Rate       Rate         (Carnegie Unit - CBA)
Institution 1     .83              .74          .09
Institution 2     .96              .75          .21
Institution 3     .79              .69          .10
Institution 4     .68              .60          .08
Institution 5     .65              .75          (.10)
Institution 6     .55              .82          (.27)
Institution 7     .67              .92          (.25)
Institution 8     .72              .68          .04
Institution 9     .85              .77          .08
Institution 10    .82              .84          (.02)

The universities with the highest variations in admit rates were Universities 2, 3, 5, 6 and 7. Each of these universities had a difference in admit rates of 10 percentage points or more. In fact, Universities 2, 6 and 7 had admit rate variations greater than 20 percentage points. For University 2, the Carnegie Unit admit rate was 96% while the CBA admit rate was 75%. Over time, this variation would result in 21 more students out of 100 being admitted through the Carnegie Unit program, or 21 fewer students being admitted using the CBA program. The trend is reversed for Universities 6 and 7. The Carnegie Unit and CBA admit rates for University 6 were 55% and 82%, respectively. Over time, these rates would result in 27 more students out of 100 being admitted to University 6 using the CBA program, or 27 fewer students gaining admission access using the Carnegie Unit program.
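The per-100-applicant readings quoted above come directly from the admit-rate differences in Table 26. A small illustrative sketch of that arithmetic, using values copied from Table 26, is shown below.

```python
# Admit-rate variation arithmetic from Table 26: the difference between the
# Carnegie Unit and CBA admit rates, read as students per 100 applicants.
carnegie_rates = {"Institution 2": .96, "Institution 6": .55, "Institution 7": .67}
cba_rates      = {"Institution 2": .75, "Institution 6": .82, "Institution 7": .92}

for inst in carnegie_rates:
    variation = carnegie_rates[inst] - cba_rates[inst]
    per_100 = round(abs(variation) * 100)
    direction = "Carnegie Unit" if variation > 0 else "CBA"
    print(f"{inst}: variation = {variation:+.2f} "
          f"(about {per_100} more admits per 100 applicants under {direction})")
```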
The corresponding Carnegie Unit and CBA admit rates for University 7 were 67% and 92%, respectively. In keeping with the analysis provided for University 6, 25 more students out of 100 would be granted admission to University 7 using the CBA program.

The MAUT and admit rate differences across the ten universities studied have important implications for students planning to attend a University of Wisconsin institution. For students who have strong grades and test scores, submitting this material for Carnegie Unit admission would result in a 96% chance of being admitted to University 2. Alternatively, students with strong Standardized Reporting Profiles (SRP) would have a 92% chance of gaining a positive CBA admission decision when applying to University 7. By comparison, students would have only a 55% chance of gaining a positive Carnegie Unit admission decision when applying to University 6. These same students would have a 60% chance of receiving an affirmative CBA admission decision when applying for admission to University 4.

The results of this study show that for two of the ten universities studied, the CBA model achieves access outcomes similar to the Carnegie Unit model. This statement is best represented by University 8 and University 10. For University 8, there is a 4% admit rate variation in favor of the Carnegie Unit model, and for University 10, there is a 2% admit rate variation in the direction of the CBA model. Based on both the individual university results and the aggregate admission rates, the CBA program does prove itself to be a worthy alternative program for making college admission decisions.

The MAUT process used in this study provides University of Wisconsin policy makers with important information on the Carnegie Unit and CBA programs. In trying to make a final decision on which program maintains the highest level of access, the policy decision could be balanced by looking at both the aggregate utilities and the utilities derived for each university. In some cases, the final decision on programs that have a societal impact, as is the case in this comparison, will come down to whether there are important redeeming outcomes (e.g., providing access to college for groups of students who would otherwise be denied admission) associated with each program. If this were the basis of final decision-making between Carnegie Unit and CBA, then the decision on access would result in both programs being available for use by students applying for admission to a University of Wisconsin institution. If the program review requires a final decision based solely on access rates, then the final program choice would be to continue to use the Carnegie Unit program as the sole admission strategy for all public universities in Wisconsin.

The universities examined in this study vary in size from 5,000 to 40,000 enrolled students. In comparing the level of access and the size of the university, there appear to be no discernible trends. Some universities provided a higher level of access using the Carnegie Unit program while others had higher access rates with the CBA program. In comparing access and university enrollment, the following four results were observed:

1. The smallest university (based on enrollment) had the highest Carnegie Unit access rate.
2. The third smallest university had the lowest Carnegie Unit access rate.
3. The second smallest university had the highest CBA access rate.
4. The largest university had the lowest CBA access rate.

These results are important when considering the institutional missions associated with these universities. For example, the smaller universities are characterized as regional universities; are less selective in their admission practices; and devote fewer financial resources to research in comparison to the larger universities in this study. The observations noted in numbers 3 and 4 point toward a conclusion that lower CBA access rates may occur more often among large, research-intensive universities, while higher CBA access rates may be more commonly associated with smaller, regional universities. These universities also include regional and national institutions as well as institutions with a global enrollment appeal. The geo-demographic characteristics of these universities are diverse. The net effect of all of these characteristics is a group of institutions that is representative of most public universities. The CBA program results therefore appear to be outcomes that could be generalized across many public universities in America.

Hypothesis 2 states: The Carnegie Unit model explains a higher amount of the variation in first semester grades.

Based on an evaluation of the collected data, the evidence does not support this hypothesis. The amount of variation in grades explained by the Carnegie Unit model was 48%, while the CBA model explained 47%. The difference in the explained grade variation between these two admission models is relatively small (1%). However, differences in each subject area highlight greater levels of explained grade variation between the two admission models. For example, the Carnegie Unit model explained 37% of the variation in grades in the subject of math, and 49% of the variation in grades for English and science. These results compare to a CBA math rate of 31%, an English rate of 41%, and a science rate of 45%. Alternatively, the CBA model accounted for a higher amount of the explained grade variation in social science and foreign language: the explained grade variation for CBA was 33% for social science and 37% for foreign language. These results are higher than the Carnegie Unit rates of 26% for social science and 32% for foreign language.

The math, English and science results could point toward the impact that test scores and test methodologies have in improving grade prediction in the Carnegie Unit model. Since the ACT and SAT evaluate levels of student knowledge in these subjects, we might expect that an admission model that utilizes these tests (i.e., the Carnegie Unit model) would explain higher levels of grade variation when compared to a model that excludes these exams (i.e., the CBA model). The ACT and SAT do not evaluate proficiency in foreign language or general knowledge in the area of social science. As a result, the addition of test scores does not enhance the Carnegie Unit model's ability to explain levels of grade variation in foreign language and social science. This point could explain why the CBA model yielded higher grade variation results for these two subjects when compared to the Carnegie Unit model.

Another possible explanation for these results, and specifically the amount of explained grade variation observed in the CBA model for foreign language, may relate to the pedagogical methods used in both high school and college curricula. The instructional methods used to evaluate a student's proficiency in high school and college foreign language courses are similar.
These courses are graded using a combination of written tests and oral examinations. The ability to communicate verbally in an expressive and understandable manner is often weighted more heavily than written expression. Consequently, the CBA-SRP evaluation process provides teachers the opportunity to describe in a detailed manner the student's level of mastery of foreign language material. The Carnegie Unit model has to rely on high school grades as the primary means for predicting success in college foreign language courses. The foreign language results represent a clear example of how a written explanation of a student's skills provides an enhanced college grade prediction method when compared to a strategy that predicts results by examining high school grades alone.

Hypothesis 3 states: The time needed to make decisions on CBA applications will be higher than the time needed to make decisions on Carnegie Unit applications.

The data indicate that time differences did exist between the two processes. The average time needed to make aggregate Carnegie Unit admission decisions was 6.09 minutes, and the average time needed to make aggregate CBA admission decisions was 8.51 minutes. Therefore, the hypothesis is accepted.

The time differences identified in the study could have resulted from different levels of familiarity among admission officers with the two types of applications. Typically, such familiarity results in very little time needed to pause or be indecisive in rendering final application decisions. As a result, most admission offices in this study were able to make time-efficient admission decisions on Carnegie Unit applications. Conversely, because the CBA process was a new pilot program, much of the time admission officers needed to make decisions could be explained by a lack of familiarity with this program (CBA: The Wisconsin Model, 1994). With any new initiative, a steep learning curve may exist at the outset. The CBA program required admission officers to spend significant time understanding the parameters surrounding the decision-making process (CBA: The Wisconsin Model, 1994). Once this knowledge base is better established, processing and decision-making efficiency should begin to improve. Over time, we would expect processing time differences between the two admission programs to become less significant.

In nine of the ten universities studied, the CBA program required more time to make application decisions than the Carnegie Unit program. The aggregate time needed to make Carnegie Unit decisions was 2.42 minutes less than the time needed to make CBA decisions. The average time needed to make Carnegie Unit decisions for University 2 was 45 seconds less than the time expended on CBA decisions. At the same time, Universities 6 and 1 had Carnegie Unit decision times that were less than CBA decision times by 82 seconds and 95 seconds, respectively. These results show that CBA does represent a viable, time-efficient instrument for some universities.

The two most startling results in comparing each admission program came from Universities 4 and 10. University 4 was the school with the highest average time needed to make CBA decisions, while University 10 was the school with the lowest average CBA decision time. In addition, these two schools had the largest time differences when comparing the average Carnegie Unit time to the average CBA time.
The average time needed to make Carnegie Unit decisions for University 4 was 4.93 minutes, compared to an average time of 15.47 minutes needed to make CBA decisions. The time difference between the two admission programs was 10.54 minutes. These results show a huge gap in efficiency that could be the result of a lack of familiarity with the CBA-SRP form. If such a lack of knowledge concerning the SRP form exists, continued use of the CBA program should start to close the time-efficiency gap with the Carnegie Unit program. Another possible explanation for this time difference may relate to the way admission officers viewed each program. For example, the Carnegie Unit program is primarily an evaluation of quantitative data (e.g., the examination of letter grades from the high school transcript and the evaluation of standardized test scores). Alternatively, the CBA application could be viewed by some admission officers as mostly a qualitative evaluation process. Under certain conditions, a qualitative admissions review could require a greater amount of time than a process that is primarily a quantitative review. Time differences between the two programs could also occur if some universities used committees (i.e., two or more admission officers) to make admission decisions. In some cases, the committee process of trying to arrive at a group consensus can be more time consuming than a process in which one officer makes the same admission decisions.

At the opposite end of the efficiency spectrum is University 10. The results observed for this institution represent more of an "outlier" effect when compared to the other nine universities in the study. University 10 was the only school whose results found the CBA program to be more efficient than the Carnegie Unit program. The analyses of these results show a huge average time gap between the two admission programs. The average time expended on CBA application decisions was 2.84 minutes, compared to an average time for Carnegie Unit applications of 10.34 minutes. These totals represent the second largest time gap (7.50 minutes) among institutions in the study. While these results for University 10 are important, they are also surprising and counterintuitive to what would normally be expected. The normal expectation would be for the time differences between the admission programs to run in the same direction as in the other nine universities; a logical hypothesis would be that the average Carnegie Unit decision time for each university is lower than the average CBA decision time. The fact that the results for University 10 are in the opposite direction of this hypothesis raises at least one important question:

• What happened in the admission office of University 10 that generated these results?

While this is a difficult question to answer without conducting on-site research at this university, there are some possible explanations. One explanation could be that an experienced admission officer was assigned to make all CBA decisions, while a less experienced admission officer was assigned to make all Carnegie Unit decisions. Another possible explanation could be that the CBA admission officer quickly grasped the logic behind the SRP form and made time-efficient decisions once his/her learning curve reached its peak.
An additional explanation focuses on the complexity of the admission decision-making process for Carnegie Unit applications. Some American universities employ a complex admission decision algorithm that assigns points to applicants based on a large number of characteristics, including factors such as letters of recommendation and an applicant's personal statement (Fetter, 1995). If University 10 uses a Carnegie Unit application decision-making system that relies on a complex point system similar to the aforementioned process, this would explain the large time differences between Carnegie Unit and CBA. In any case, a follow-up study of University 10 would go far in determining the basis for these time differences.

This is one of the first studies to examine admission decisions using a time-focused evaluation process. The data analysis could serve a useful purpose by enabling colleges and universities to develop strategies and establish time-efficient benchmarks for making decisions on college admission applications.

Implications of the Study

This study provides some useful contributions to the literature by establishing a better understanding of the evaluation of alternative methods for determining access to college. The study also contributes to the body of literature that deals with process-oriented efficiency by examining research data that quantify and provide a benchmark for the average amount of time needed to make final college admission decisions.

From the study, the new admission model, the Competency Based Admission program, proves to be an effective decision-making tool in determining access to college. The evaluations of CBA admit rates show that this new program, when compared to the standard program (Carnegie Unit admission), is successful in maintaining a sufficiently high level of college access. The first semester and first year grades received by students admitted using the CBA program compare favorably to those of the Carnegie Unit program. These results help to validate the CBA program as an effective model in determining future success in college. The strong admit rates and the success in college achieved by students admitted using the CBA program point toward an admission model that could be useful far beyond the borders of Wisconsin.

The other important contribution of this study is in the area of time-based admission decision-making literature. Since Carnegie Unit admission is a widely used process, this study could assist admission offices by helping them establish a benchmark for the amount of time an admission officer should spend making application decisions. To date, a thorough review of the body of literature yields no studies that provide this type of benchmark and efficiency analysis. The Carnegie Unit admission results generated by this study could become the standard used to establish a benchmark of decision-making efficiency for college admission officers. In addition, the study of the time required to make CBA decisions is important in helping to establish a benchmark of efficiency for the evaluation of newly developed, alternative admission programs.

The admit rates and college grade success derived from the CBA program are an important finding in arguing against the prominent use of standardized test scores in college admission decisions. The CBA results show that an effective admission model, which minimizes the reliance on test scores, can be successfully developed and implemented.
The study results may make their most important contribution for policy makers who have felt limited by examining meritorious college access through only one evaluation process. This merit-based access issue has also become a highly charged issue affecting access to college for minority students in key states like California and Texas (San Miguel & Valencia, 1998). In California, Proposition 209 was passed into law in 1996; it requires the evaluation of college admission materials to center on Carnegie Unit criteria and to move away from any Affirmative Action-based consideration (Jones, 1998). In the years following the strict enforcement of Proposition 209, the data show that admission rates for minority students have been dramatically lower when compared to the rates observed in the years before this law (Douglass, 1999; Jones, 1998; Karabel, 1999). In the two years following the Proposition 209 law, enrollment of Black and Hispanic freshmen in California's public universities declined 54% and 30%, respectively (Douglass, 1999). During that same period, White freshman enrollment increased by 10% in these public universities (Douglass, 1999). A CBA-type program could serve an important role in moving the admission debate in states like California toward a focus on qualifications based on demonstrated success in high school, with good indicators of knowledge and competencies. The data analyzed in this study did not focus on the direct use of CBA in states other than Wisconsin; however, the results do hold much promise for this program being able to successfully maintain broad college access while minimizing the dependence on test scores.

Direction for Future Research

The imposition of any new system on an organization could place an additional burden on already constrained resources. This may be particularly true in the case of the CBA program and its effects on universities. During the pilot study, no financial resources were provided to Wisconsin universities to process and compile data on the CBA program. As a result, a study that investigates the impact of CBA on existing financial resources at each university is needed.

A challenge to anyone conducting research on the CBA program is the examination of data that are generated during the pilot phase of the program. In fact, much of the data for this study were gathered at the conclusion of the CBA pilot phase. Processes used in operating a program during its full implementation period might, and often do, differ significantly from the practices established during a pilot phase. In addition, during the full-implementation phase of a program like CBA, some of the college access and resource-expenditure (time) issues might be eliminated. Providing data and evidence that allow admission personnel to devise strategies to refine and improve institutional decision-making, and to utilize admission staff members' time more efficiently, is an important reason to carry out a thorough longitudinal study of the CBA program.

Another critical area for future research should focus on examining the use of teacher evaluations as utilized in preparing the CBA Standardized Reporting Profile (SRP). These teacher SRP assessments of student skills and subject competencies are the primary basis for making CBA admission decisions.
The rationale for a 101 future study in this area is driven by the need to validate the SRP instrument in college admission decisions. Before a program like CBA can experience broad and wide spread acceptance in higher education, its core tool for predicting student success should have a consistent degree of predictive validity. The availability of a predicted validity standard would provide admission counselors with the confidence needed to make admit and deny decisions based on a reliable SRP scoring benchmark. The results in this study show a strong and direct relationship between satisfactory SRP scores and college grades. Future research that continues to examine this relationship is one way of attempting to validate the use of the SRP form in making college admission decisions. This study was inconclusive in evaluating the impact of CBA on college access for women and minority students. The attack on the use of Affirmative Action practices in college admission was an important catalyst behind the need to study the CBA program. A future study that explores the impact of the CBA program on women and minority students could help in shaping the discussion concerning college access for these groups. In addition, future research studies should explore the affects of CBA on college admission offices and the impact of CBA on K-12 education. Specifically, these studies should examine the following aspects of each area: 0 College Admission Offices: This research should explore time comparisons between admit decisions and deny decisions. 0 K-12 Education: An examination and study of the impact of CBA on K- 12 education in Wisconsin is important and should focus on evaluating 102 the explained college grade variation observed in this study in the subjects of social science and foreign language. A study that evaluates and compares the high school and college curriculum for these two subjects may provide valuable insight and knowledge. In addition, a study should be undertaken to determine the time demands placed on teachers in completing the CBA-SRP evaluation form. Future research that explores these areas will provide a better understanding of the overall affects associated with the CBA program. Summary of Areas for Future Research . A study that examines the financial impact of CBA on university admission offices. . A longitudinal study of CBA to evaluate admission decision-making time over an extended period. . A continuous study that attempts to identify a direct relationship between CBA-SRP ratings/scores and first semester college grades. 0 A study that examines the impact of CBA on the enrollment of women and minority students. 103 . A study that examines and compares CBA admit and deny decisions. 0 A further examination that seeks to explore why CBA explained a high level of grade variation in the subjects of social science and foreign language. . A study that examines the time demands placed on high school teachers in completing the CBA-SRP evaluation process. The results derived from this study on admit decisions for the Camegle Unit program and the CBA program does provide concrete ways in which broad levels of admission access can be preserved as new admission strategies are developed. The level of access maintained by CBA in this study does appropriately address this point. The results also provide a better understanding of the time efficiencies associated with Carnegie Unit and CBA admission decisions. 
In addition, this study and its findings provide a starting point for future research on the impact on access of a newly implemented admission program. While it is important that future studies on CBA and other alternative admission programs be undertaken, this study provides an important and promising start in expanding this body of literature.

Post-Pilot Summary Status

After the conclusion of the pilot study, the University of Wisconsin System used the collected data to argue for the widespread use of the CBA program. During the fall of 1997, the findings from the CBA pilot study were provided to the University of Wisconsin Board of Regents. These Board members were asked to consider a recommendation from the System's administration that called for the full implementation of the CBA program. At their November 1997 meeting, the Regents approved the recommendation to allow the CBA program to be used as an alternative admission procedure (CBA: Spring 1998 Final Report, 1998). Today, the CBA program is available for students to use for admission consideration to a University of Wisconsin institution. However, since the full implementation of CBA, the program has received limited use. Only a small number of CBA applications have been submitted for admission review. There are a number of possible explanations for the limited use of the CBA program during this post-pilot period. These include the following possibilities:

• There are a limited number of high schools trained to evaluate student competency levels in accordance with the CBA criteria. The University of Wisconsin System requires all high schools that are considering using the CBA program to attend a formal orientation and training session. This orientation is designed to provide training to high school personnel on key aspects of CBA. If high schools are not able to find the time for training, then they are restricted from using the CBA program. Currently, the only high schools trained to use CBA are the ten schools involved in the early pilot study.

• There may exist among high school personnel a belief that the CBA application process places a higher level of demand on staff time when compared to the Carnegie Unit application process. The time commitments associated with the completion of the five subject evaluations by teachers and the submission of these materials by high school counselors to each university involve numerous school personnel. By comparison, the Carnegie Unit program requires high school counselors to compile and submit the high school transcript and standardized test scores to universities, and usually requires the time of only one staff member to complete the process of submitting these application materials. As a result, the time needed by teachers to complete the SRP evaluations may be dampening the enthusiasm of high school officials for the CBA program.

• In 1998, the governor of Wisconsin issued an executive order mandating the development of various assessment methods to evaluate student learning (Executive Order 326, 1998). This legislation requires mandatory testing of students at the elementary, middle and high school levels, and culminates in a high school graduation examination that must be successfully passed as a condition for receiving the high school diploma. The high school graduation test is being developed for implementation during the 2003 school year.
This test is being developed as part of the strategy in Wisconsin's K-12 education plan for the evaluation of each student's competency in the core academic disciplines of math, science, English and social science. The University of Wisconsin President's Office has decided to establish a number of committees to evaluate the relationship between the CBA competencies and the core competencies needed for graduation from a Wisconsin high school. These committees are organized by disciplinary area and include high school officials, university faculty and administrators. The University of Wisconsin System refers to this process as the Curriculum Articulation Project. The goal of this initiative is to ensure that there is consistency between the knowledge and skills universities expect students to have for admission and the actual learning that is occurring in the K-12 system. As a result, the University of Wisconsin System has focused much of its attention on the Curriculum Articulation Project, and less attention on promoting the CBA program among high schools in Wisconsin.

These aforementioned items will need to be reconciled before the University of Wisconsin System can have a reasonable expectation that the CBA program will become a commonly used tool among high schools.

APPENDICES

APPENDIX A

COMPETENCY-BASED ADMISSION: THE WISCONSIN MODEL
FINAL REPORT

INTRODUCTION

In response to educational reform efforts occurring in K-12 schools in Wisconsin and throughout the country, the University of Wisconsin (UW) System is developing a supplemental admissions process for students graduating from high schools using performance-based curricula. This process is called Competency-Based Admission (CBA). CBA will be used to determine whether students graduating from high schools with restructured curricula are prepared for college-level coursework. As part of the implementation process, the UW System will conduct a CBA pilot project with eight Wisconsin high schools during the 1995-96 school year. In preparation, subcommittees comprised of UW faculty, in consultation with representatives from the Department of Public Instruction (DPI), the Wisconsin Technical College System (WTCS), and the K-12 schools, have developed competencies in English, foreign language, mathematics, science, and social studies. UW institutions will use levels of competency attainment as one factor to determine whether students graduating from the eight pilot high schools will be admitted.

The CBA approach was devised to be an alternative to the Carnegie Unit approach to admission. It was not devised to change the standards of admission to UW System institutions. During the pilot project, no student will be denied admission based solely on the descriptions of competency level attainment contained in the Standardized Reporting Profile (SRP). Any student who is not deemed "admissible" based on the SRP will be reevaluated based on the traditional transcript.

This report explains the background and the process used to develop the CBA approach and the competencies. The report also summarizes the procedures and timeline that will be used to implement the Competency-Based Admission pilot project.

BACKGROUND

In its July 1992 report to the Board of Regents recommending revision of the freshman admission policy, the Working Group on Math and Science Admission Requirements also recommended that a task force be appointed to examine the feasibility of developing a competency-based process for admission to UW institutions.
This recommendation was based on the growing restructuring and reform efforts taking place in K-12 schools in Wisconsin. The working group reasoned that as school districts changed their curricula to become, in many cases, more performance-based, it would be difficult for UW institutions to use the traditional Carnegie Unit approach to admission for all students. The Working Group suggested that the UW System encourage and support K-12 educational reform because such reform may result in better-prepared students entering the university. One way to demonstrate support would be through an admission process that would allow UW institutions to use non-traditional information in making admission decisions. The Working Group recommended that the UW System consider an admission process that focused on attainment of knowledge and skills rather than Carnegie Units.

A Task Force was appointed in October 1992 by the UW System Senior Vice President for Academic Affairs. Task Force members included UW faculty and administration and representatives from the Department of Public Instruction (DPI), the Wisconsin Technical College System (WTCS), and the K-12 schools. After careful deliberation of the issues surrounding competency-based admission and the possible implications that may result from the adoption of such an approach, the Task Force recommended that the UW System adopt a competency-based admission process as a supplement to the current admission policy based on Carnegie Units.

Subsequently, a Steering Committee and six subcommittees were appointed to develop competencies in English, foreign language, mathematics, science, and social studies and to design a standardized reporting profile that would be used by the pilot high schools to report levels of student competency attainment to UW admission offices. Seventy-five faculty, along with consultants from DPI, WTCS, and the K-12 schools, served on the subcommittees. In November 1993, the subcommittee members attended a seminar to begin this major initiative. The Senior Vice President for Academic Affairs and the chair of the Task Force charged each discipline subcommittee to develop competencies that students should attain for admission to UW institutions. The profile subcommittee was asked to design a standardized reporting profile that would allow for competitive admission decisions. Finally, the members were given training and practice in generating and writing the competencies. The subcommittees completed their work in May 1994.

COMPETENCIES

Competency-Based Admission is formulated on the premise that if students attain and demonstrate a certain level of knowledge and skill in the major disciplinary areas, they will be adequately prepared for a successful college career. In this report, competency is defined as the state of being properly prepared or well qualified for college-level success. It does not exclude higher achievement, but rather exists on a continuum from minimum to maximum levels of competency attainment. Competencies define the knowledge and skills that students should possess and the appropriate levels of mastery that students should demonstrate. Competency attainment is a key predictor of student success in college, but it is only one of many predictors. Competency-Based Admission is not intended to make admission to the university more difficult.
This process is comparable to the Carnegie Unit process, but is designed to facilitate admission for students graduating from high schools using restructured, non-Carnegie-based curricula. The competencies will form the basis for the Standardized Reporting Profile that will be used by the schools to report levels of student competency to university admission offices. These competencies will be included in the CBA Training Manual that will be provided to each Wisconsin high school prior to implementation of this initiative.

COMPETENCY ASSESSMENT

The Task Force recommended that the responsibility for presenting the student's case for college readiness should remain with the secondary schools. UW institutions will be responsible for setting admission standards and for determining the general requirements for admission. Through the use of valid and reliable techniques such as traditional assessment, performance-based measures, and portfolio reviews, secondary school personnel will determine what level of competency the student has attained and will report this information to admission offices on the Standardized Reporting Profile.

A good assessment strategy should be comprehensive and multidimensional. It should include written tests, problem-solving simulations, peer review and a range of performance-based measures. Leaving the responsibility for determining levels of competency attainment with the secondary schools will provide a greater likelihood that students will be comprehensively evaluated on several dimensions. Secondary school teachers have the knowledge, skills, and opportunities to provide this kind of effective evaluation. The UW System, due to time and resource constraints, would most likely only be able to administer a single standardized measure to determine competency attainment. This kind of one-dimensional assessment would not accurately test student knowledge and, in all likelihood, would tend to drive the high school curriculum.

Students in Wisconsin already participate in several comprehensive assessment programs. DPI is developing a Student Assessment System that will include evaluation of every student in fourth, eighth and tenth grades using traditional, performance-based and locally developed portfolio assessments. Students in tenth grade take the UW Early Math Placement exam to help determine their level of mathematics ability. This information helps them decide how to proceed with mathematics in high school and gives them an idea of how they will likely perform at UW institutions. In addition, UW institutions require students to take math, foreign language, and English placement exams to help determine their levels and placement upon admission. Finally, each student applying for admission must take an ACT exam. This information helps students decide at which institutions they are likely to be successful and identify their academic strengths and weaknesses. The combination of the information from the evaluation systems already in place and the Standardized Reporting Profile received from the secondary schools should give the UW admission offices and faculty the information necessary to make accurate and informed decisions about admission and course placement.

STANDARDIZED REPORTING PROFILE

Currently, students are admitted to UW institutions according to the number of high school credits, or Carnegie Units, they acquire. Under this approach, students take discrete classes and receive credits and grades based on their performance.
These credits and grades are recorded and transmitted to the university on a traditional student transcript. If students have the required number of credits in the required academic areas and meet class rank and ACT score requirements, they are eligible for admission. While this system works well for a curricular structure based on Carnegie Units and standard grades (i.e., A-F), it does not work as well for an integrated and applied performance-based curriculum. As K-12 schools restructure their curricula, they will need a different way to report the knowledge and skills their students have learned. In turn, UW institutions will need a new method to evaluate these students for admission.

In response to these concerns, a subgroup was formed to develop a common profile that high schools with restructured curricula could use to report levels of student achievement to university admission offices. The Standardized Reporting Profile (SRP) will provide a scale that the high schools can use to rate the students' levels of competency attainment. The SRP also will solicit additional information about a student (e.g., attendance, effort, extracurricular activities, honors) that may be used by institutions in determining eligibility for admission. In certain cases, the SRP may be used in conjunction with a traditional transcript to provide a more complete record. It is anticipated that the SRP would be completed by teachers at the end of the junior year and again after graduation. The competency levels reported on the SRP would be based on the results of the assessment process used by the high school. Each UW institution would determine the competency level that would be required for admission and how other factors would be considered. The purpose of the CBA approach is not to change admission standards, but to provide an alternative to the Carnegie Unit approach.

PILOT PROJECT

Since Competency-Based Admission is a significant departure from the present admission process, and because it is unclear at this time how extensive K-12 school reform will be, the UW System decided to implement CBA on a pilot basis. Up to eight Wisconsin high schools in various stages of curricular reform will be invited to participate in a pilot project that will be conducted during the 1995-96 academic year. Teachers and other representatives from these schools will participate in a training program and will use the CBA process for students seeking admission to a UW institution for Fall 1996. The pilot project will provide important information about the training component, the Standardized Reporting Profile and the overall CBA admission process. The results of the pilot will be used to enhance the process and help determine the feasibility of implementing this approach on a wider basis.

TRAINING AND DISSEMINATION

The UW System will use a Trainer-of-Trainers approach to facilitate the implementation of CBA. A Core Training Group consisting of two UW faculty members, a UW admission director, and a K-12 representative will plan and conduct a training seminar in January 1995. A group of Site Trainers, consisting of a representative from each K-12 pilot school and an admission officer and a faculty member from each UW institution, will attend the seminar and will subsequently be responsible for training colleagues at their respective schools and institutions.
Specifically, the K-12 Site Trainers will train high school faculty, administrators, and other staff at their high schools to use the Standardized Reporting Profile for reporting student levels of competency. The UW admission office Site Trainers will train the other admission personnel at their institutions to evaluate the Standardized Reporting Profiles submitted by the K-12 pilot schools for competitive student admission. The UW faculty Site Trainers will meet with other faculty members to explain the CBA process and its purpose. This information may be used to help the faculty establish institutional admission standards for students applying through this process.

The Core Training Group will write a Training Manual during the 1994-95 academic year. This Training Manual will be used as a resource for the Site Trainers, admission personnel, and secondary school and university faculty as they implement the CBA pilot project. In addition, the Training Manual will be disseminated in written and electronic form to other colleges and universities throughout the nation for use as a model in developing their own alternative admission programs. The Training Manual will detail the process the UW System used to develop the CBA approach and the procedures that were put in place to implement the pilot project. It will include a listing of the competencies and definitions of the competency levels, examples of activities or evidence that may be used to assess student knowledge and skills, instructions for using the Standardized Reporting Profile to report student competency, and an explanation of how the profile will be used in making competitive admission decisions.

NEXT STEPS

This report will be sent to the Vice Chancellors and to other governance groups during the 1994 fall semester. Preparation and training for CBA will occur during the 1995 spring semester, and the pilot study will begin in the 1995 fall semester. After the pilot is completed, the competencies and the SRP will be reviewed and modified as needed. The results of the pilot study will be used to enhance the CBA process and help determine how it might be used on an expanded statewide and national basis.

CONCLUSION

The Task Force on Competency-Based Admission that first examined this issue concluded that a competency-based admission process in the UW System had the potential to support K-12 restructuring, to allow for diversity among school districts and individual students, to provide cost savings in remedial programs, and to promote collaboration between UW institutions and the K-12 schools. At a time when most institutions of higher education are moving slowly in changing their admission policies, CBA puts the UW System in a leading position in educational reform. The development and implementation of the CBA process has been a collaborative effort among the educational agencies that will support educational reform. It is difficult to predict its impact before it is implemented and evaluated, but its potential benefits are far reaching.

APPENDIX B

UNIVERSITY OF WISCONSIN SYSTEM
COMPETENCY-BASED ADMISSION

ADMISSION COMPETENCIES

Competencies, Standardized Reporting Profile and Rating Scale for the Competency-Based Admission Pilot Project
January 1995

The Competency-Based Admission Pilot Project is an initiative of the University of Wisconsin System. Additional support for the training and evaluation components is provided by the Fund for the Improvement of Postsecondary Education, United States Department of Education.
RATING SCALE

Completing the Competency-Based Admission Standardized Reporting Profile for a student requires a numerical rating for each of the major categories under the subject headings English, mathematics, science, social studies and foreign language. A student's level of proficiency is described using a five-point scale:

5 - Excellent Performance
4 - Very Good Performance
3 - Satisfactory Performance
2 - Limited Performance
1 - Poor Performance
NBE - No Basis for Evaluation

In the case of the rating for foreign language, the assumption is that the student has completed a minimum of two years in a high school foreign language program.

Excellent Performance
This rating is used to describe a student who performs the competency-related tasks with outstanding quality, initiative and adaptability and who can lead others. There may be evidence of insight, originality, or other attributes that characterize deep understanding of the concept or process. The few students who achieve this level of performance may well be capable of going into advanced coursework upon entering the university.

Very Good Performance
The student who receives this rating performs the competency-related tasks with more than acceptable quality and can apply past knowledge in solving new problem situations. Work tends to show evidence of originality rather than simple rote execution.

Satisfactory Performance
The student performs the competency-related tasks satisfactorily without additional supervision or assistance from the teacher. When encountering a new concept, he or she builds on previous knowledge but may have some difficulty using new concepts to solve a related problem. Work tends to be relatively free from errors and on time, but not necessarily original or imaginative in its scope.

Limited Performance
This student may work very hard, but lacks the understanding to carry out the competency-related tasks independently and with originality. Work may be on time but probably has errors. This rating also describes the student who may be capable but who is sporadic in providing evidence of proficiency.

Poor Performance
Work is unacceptable by usual performance standards. The student may also have difficulty understanding new concepts.

No Basis for Evaluation
Due to the degree of difficulty or selection of coursework, the student may not have encountered specific concepts or tasks.

ENGLISH

The following points should be understood by anyone using these competencies: They apply not only to English composition and literature, but to reading and writing across the disciplines. The order in which the competencies appear does not correspond to their importance. "Text" means any written discourse in any discipline; "literature" means chiefly imaginative writing in English, or works translated into English.

A. Writing: Process

An effective writer
1. Formulates and explores ideas using strategies such as brainstorming, listing, mapping, journal writing, questioning, clustering, and outlining;
2. Considers purpose and audience in selecting and limiting topics;
3. Gathers and evaluates materials and information pertinent to the topic;
4. Uses primary and secondary research to shape ideas, when appropriate.

Drafting: An effective writer
5. Develops and elaborates ideas, distinguishing between topics and theses;
6. Distinguishes major features from minor points;
7. Develops support that is both sufficient and relevant, including source materials where appropriate;
8. Uses appropriate resources for research, e.g., interviews, bibliographies and databases;
9. Reports and acknowledges the ideas of others;
10. Writes multiple drafts when necessary.

Revising: An effective writer
11. Reads own drafts critically to refine the development of ideas;
12. Anticipates the needs and responses of readers;
13. Incorporates feedback from readers and provides constructive feedback to other writers;
14. Assesses and, as necessary, improves the focus and clarity of the controlling idea(s);
15. Reviews supporting material for relevance and adequacy;
16. Revises for ideas, coherence, and organization, reshaping the text as necessary by adding, deleting, substituting, and rearranging;
17. Demonstrates control of Edited American English.

B. Writing: Product

Effective writing
1. Displays a clear purpose and addresses an appropriate audience;
2. Focuses on a subject, employs unifying ideas, and uses appropriate organizational patterns (e.g., comparison/contrast, cause/effect, description/narration);
3. Has a logical organization, appropriate transitions, and internal coherence and cohesion;
4. Uses a variety of sentence types and lengths appropriate for the reader and genre;
5. Supports generalizations with appropriate details;
6. Expresses ideas with individuality and insight;
7. Employs conventional formats of documentation, e.g., MLA, APA, Chicago.

C. Reading

An effective reader
1. Reads, analyzes, and interprets texts orally and in writing;
2. Understands denotative, connotative, and figurative meanings;
3. Comprehends literal and inferential meanings;
4. Distinguishes main ideas from subordinate details;
5. Summarizes and paraphrases texts orally and in writing;
6. Formulates questions about the implications of a text;
7. Recognizes and evaluates the validity of differing interpretations of a text;
8. Transfers critical reading skills from one discipline or setting to another;
9. Recognizes the influences of historical, social, biographical, cultural, ethnic, and other contexts on a text;
10. Recognizes, understands, and discusses, orally and in writing, conventional literary forms and terms;
11. Discusses, orally and in writing, characterization, setting, point of view, and plot development in imaginative literature;
12. Understands the role of imaginative literature in the development of cultures.

D. Oral Communication

An effective speaker
1. Presents information that is well organized;
2. Communicates in ways appropriate to a variety of audiences and contexts;
3. Supports a position with evidence and effective reasoning;
4. Presents information which reflects effective language skills, e.g., use of transitions and clear and appropriate word choices;
5. Delivers messages using effective vocal articulation, pronunciation, volume, pitch, vocal quality and body movement;
6. Distinguishes among the five functions or purposes of communication: informative, affective, imaginative, ritualistic and persuasive;
7. Employs and responds to nonverbal communication.

An effective listener
8. Distinguishes between hearing and listening;
9. Identifies barriers to effective listening and applies techniques to overcome the barriers;
10. Identifies the different types and levels of listening.
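The five-point scale and the NBE designation defined under RATING SCALE above function as a small controlled vocabulary that pilot schools and admission offices must apply consistently across every subject heading. The sketch below is not part of the UW System report; it is a minimal, hypothetical illustration in Python (the names Rating and SubjectRating are invented for this purpose) of how one reported rating from the Standardized Reporting Profile might be encoded so that NBE stays distinct from a low score.

```python
from dataclasses import dataclass
from enum import Enum


class Rating(Enum):
    """Hypothetical encoding of the SRP five-point scale plus NBE."""
    EXCELLENT = 5      # Excellent Performance
    VERY_GOOD = 4      # Very Good Performance
    SATISFACTORY = 3   # Satisfactory Performance
    LIMITED = 2        # Limited Performance
    POOR = 1           # Poor Performance
    NBE = "NBE"        # No Basis for Evaluation (not a point on the scale)


@dataclass
class SubjectRating:
    """One numerical rating for a major category under a subject heading."""
    subject: str       # e.g., "English"
    category: str      # e.g., "Writing: Process"
    rating: Rating

    def is_evaluative(self) -> bool:
        # NBE means the student has not encountered the relevant concepts or
        # tasks, so the entry carries no evaluative weight either way.
        return self.rating is not Rating.NBE


# Example: a teacher reports Very Good Performance in Writing: Process.
report = SubjectRating("English", "Writing: Process", Rating.VERY_GOOD)
print(report.subject, report.category, report.rating.name, report.rating.value)
```

Keeping NBE as a separate member, rather than folding it into the numeric scale, mirrors the report's definition of the designation as an absence of evidence rather than a judgment of poor work.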
FOREIGN LANGUAGE

Foreign language is not a general admission requirement of UW System institutions, but institutions that do require foreign language require two years at the high school level. These competencies are, therefore, based on expected competency levels for third-semester foreign language instruction at UW System institutions.

A. Culture: A student who demonstrates competency in culture can
1. Demonstrate basic familiarity with the role of important social institutions within the target culture or cultures.
2. Identify and evaluate superficial stereotypes of the target culture or cultures.

B. Writing: A student who demonstrates competency in writing
1. Can communicate information on familiar topics in the target language.
2. Can make correct and appropriate use of present, past and future time.

C. Speaking: A student who demonstrates competency in speaking can
1. Respond appropriately to an inquiry.
2. Participate in discussion about familiar topics.
3. Use the target language to accomplish everyday tasks.
4. Address everyday problems by asking pertinent, logical questions and making suggestions.

D. Listening: A student who demonstrates competency in listening can
1. Understand and respond to native or near-native speech in familiar situations and on familiar topics.

E. Reading: A student who demonstrates competency in reading can
1. Comprehend authentic written materials on familiar topics.
2. Extract particular information from a variety of authentic documents and make reasonable comments about style, tone, purpose, and audience.

F. Transferable Skills (Provisional): Foreign language competency includes the ability to
1. Contrast and compare the structure and usage of the target language with English.
2. Summarize, isolate main and subordinate ideas, infer meaning, and draw conclusions from context.

MATHEMATICAL KNOWLEDGE AND REASONING

GENERAL EXPECTATIONS: Knowing the mathematics listed in the sections which follow includes knowing how to put that mathematics to use in common varieties of problem situations. Thus, knowing about the use of variables in linear situations includes knowing how to set up and analyze such common situations as those involving mixtures or constant rates. Rational situations include variation and proportion. Transcendental situations include exponential growth and decay problems. In all cases, the use includes interpreting, in terms of the problem, results obtained from mathematical analysis. Knowing the mathematics in the detailed listings also means being able to translate or interpret between different representations: between functions, equations, tables and graphs; between pictures and the trigonometric functions sine, cosine and tangent; and between pictures of plane regions and equations or inequalities.

A. Use of Constants
1. Perform arithmetic operations in proper order, represent real numbers in a variety of forms and simplify arithmetic expressions involving radicals. Use arithmetic operations to model problem situations. Use mental arithmetic and estimation.
2. Construct and read charts, tables and graphs that summarize data from real-world situations.

B. Use of Variables

In linear situations:
1. Solve linear algebraic equations and inequalities in one variable, including those with literal coefficients.
2. Solve linear systems of equations in two or more variables and interpret solutions both symbolically and graphically.
3. Use linear functions and their graphs.
4. Use matrices to represent and analyze linear situations.

In algebraic situations:
5. Add, subtract, multiply, divide and exponentiate polynomial, rational, complex fractional and radical expressions and simplify the results.
6. Solve algebraic equations and inequalities in one variable, including those which can be factored into linear and quadratic expressions, or which contain fractional expressions, absolute values, radicals or fractional exponents.
7. Use the language, notation and properties of algebraic functions and their graphs, with particular attention to quadratic functions.

In transcendental situations:
8. Manipulate and simplify expressions involving exponentials or logarithms.
9. Solve equations and inequalities involving exponential and logarithmic expressions.
10. Use the language, notation and properties of exponential, logarithmic and trigonometric (sin, cos, tan) functions and their graphs.

C. Geometry
1. Solve geometric problems (with or without coordinates) by analyzing figures in terms of points, lines, circles, and polygons, including finding perimeters and areas.
2. Visualize and sketch points, lines, planes and simple solids in three-dimensional space and find volumes of boxes and cylinders.
3. Use knowledge of parallelism, perpendicularity and associated angle properties to analyze and construct figures and represent problem situations (with or without coordinates).
4. Apply knowledge of the angle sum and the relationship between sides and angles in a triangle (including isosceles and equilateral triangles).
5. Use knowledge of similarity and congruence to make a reasoned analysis of relations, angles, lengths and areas in a figure or problem situation.
6. Demonstrate knowledge of the relationship between triangle properties, the Pythagorean theorem and distance (with or without coordinates).
7. Use right triangle relationships and the trigonometric ratios sine, cosine and tangent to analyze relationships and formulate and solve problems.

SOCIAL STUDIES

Social studies is an academic area defined for school programs and encompasses a number of fields in the humanities and social sciences.

A. Knowledge: Students should be able to
1. recognize the principal significance and chronological sequence of major events, movements and personalities in the political and diplomatic history of the British North American colonies to 1776 and the United States thereafter;
2. distinguish among the powers assigned to the executive, legislative, and judicial branches of the government in the U.S. Constitution and between the areas of responsibility assigned to the state and federal governments, and identify significant changes that have altered the foregoing through judicial interpretation and other developments;
3. describe the processes for choosing political and governmental leaders in the United States, including formal constitutional and other conventional procedures, and the role of such major elements in American political culture as political parties, interest groups, traditional images and values, and the media;
4. discuss the sources and history of civil rights in the political system of the United States, recognizing distinctions between ideas of natural and civil rights, and identifying issues and competing interests in debates over human and civil rights;
5. discuss the concepts of class, race, ethnicity and gender in the analysis of society, and:
6. characterize the major ethnic and racial groups that compose the population of the United States, identifying their linguistic, religious, and other cultural differences, the chronology of their arrival in North America, and their main regional and national influences within the United States;
7. describe how the following have affected the status of women in various cultures of the world, including the United States: 1) increasing numbers of women in the economy, 2) the rebirth of an organized women's movement, and 3) traditional definitions of women's roles;
8. recognize the principal eras in the history of western civilization from Greek and Roman times to the present, identifying elements used in conventional periodization;
9. show a knowledge of the basic chronologies of world history;
10. demonstrate an ability to compare and contrast the various political theories, including socialism, communism, fascism, totalitarianism, and democracy;
11. discuss patterns of governmental authority in countries other than the United States in relation to their differing historical, geographical, cultural and social circumstances;
12. recognize in chronological order the major wars of the twentieth century and the alliances of nations that preceded and emerged from the wars, and the principal international organizations that have been founded to resolve disputes and promote concord and cooperation among nations;
13. demonstrate knowledge of major world religious systems and philosophical schools and explain how the scarcity of productive resources requires the development of economic systems to make decisions about how goods and services are produced and distributed.

B. Skills and Methods: Students should be able to
1. distinguish between primary and secondary sources and use them appropriately as evidence to support an argument in formal writing, giving full and accurate citations;
2. demonstrate the ability to use geographic tools and resources (e.g., maps, atlases, data bases, and spatial data);
3. demonstrate the ability to analyze and correlate data through the use of conventional historical, comparative, and quantitative research techniques (using, e.g., tables, graphs, and basic statistics); and
4. show awareness of the variety of sources used as evidence by social scientists and humanists, including print material, statistics, paintings, sculpture, architecture, film, photographs, and other artifacts.

C. Integrative Applications: Students should be able to demonstrate their knowledge and skills through critical analyses in which, for example, they
1. compare and contrast the impact of race, class, ethnicity and gender on the histories of the U.S. and other cultures;
2. compare and contrast the definition, role and significance of citizenship in the history of the U.S. and other countries;
3. discuss the significance of geography in the development of cultures, with specific reference both to the U.S. and other areas of the world;
4. apply economic reasoning to help explain historical and current developments and issues, distinguishing between and showing the interaction of the U.S. domestic economy and the global economy;
5. explain how major world religious systems and philosophical schools affect the way people react to crises and dilemmas;
6. describe the interconnections among cultural, political, social, technological, and environmental change accounting for and resulting from the emergence of modern industrial economies in the United States and the world; and
7. use social science methods in such disciplines as anthropology, sociology, and psychology to analyze historical and contemporary issues.

SCIENCE

A. Science Process: given a specific scientific concept, the student should be able to complete this process:
1. State a hypothesis,
2. Design an experiment to test the hypothesis,
3. Collect and organize appropriate data, and
4. Interpret these data to communicate conclusions related to that concept.

B. Science Knowledge Base: the student will be able to demonstrate an understanding of fundamental concepts from at least two of the science disciplinary areas that are typically covered in high school curricula: earth science, environmental science, chemistry, biology and physics. Examples of such concepts are given below. Note: these lists of concepts are not intended to be all-inclusive, but rather to serve as a minimum model for secondary school science curriculum content.

Laws of conservation of energy and matter as they apply to living and non-living systems:
1. Apply the concepts of force as they relate to motion.
2. Exhibit an understanding of the basic principles of wave motion.
3. Analyze the movement of charge in an electrical circuit.
4. Understand the nature of chemical reactions, including rates of reactions and energy changes.
5. Apply the concepts of stoichiometry (i.e., the mole concept) to chemical reactions and in the preparation of solutions.
6. Understand the relationship between respiration and photosynthesis, and the central role of ATP (adenosine triphosphate) in the energy transfer process.
7. Recognize that the total amount of matter and energy in the universe is constant.

The atomic nature of matter:
8. Express measurements from subatomic to astronomical using the appropriate scale and units.
9. Describe the structure and changes of matter from subatomic to macromolecular levels.
10. Use the kinetic molecular theory to describe phases, solutions and changes in states of matter.
11. Identify the relationships among the structure, shape, function and properties of molecules.

The nature of organisms, from cellular to macroscopic:
12. Describe cell structure and function, and their relationships to chemical and physical principles.
13. Conceptualize the classification of living things according to structure and/or function.
14. Integrate structure, function and control mechanisms of a variety of living organisms.
15. Understand the nature of genetic information, and the role of heredity in controlling cell processes and the transmission of genetic information.

Relationships within and between systems, e.g., atmosphere, hydrosphere and geosphere:
16. Describe the evolution through time of the earth's systems through repeated interactions and transitions.
17. Describe the level and consequences of the relationship between living organisms and their environment.
18. Recognize the central role of the sun in the production of energy.
19. Trace the cyclical flow of matter and energy through living and non-living systems.

C. Science Communication: the student will be able to read and discuss scientific information from print and electronic sources, and to
1. Correctly use appropriate scientific terminology.
2. Locate and use appropriate sources to obtain scientific information (e.g., electronic, print, etc.).
3. Cogently communicate, orally and in writing, this knowledge.

D. Science, Technology and Society:
The contextual study of science should magnify the scientific ideals of curiosity, diligence and skepticism for probing and seeking understanding of relationships among science, technology and social issues. The student will be able to demonstrate the following competencies:
1. Recognize the relationships among science, technology and social issues.
2. Identify situations and problems that include a science component, and be able to identify relevant concepts.
3. Distinguish opinion from data and fact in discussions and considerations of personal, social and global issues.

E. Analysis/Problem Solving: the student will be able to
1. Gather, organize and use information to provide qualitative and/or quantitative solutions to problems.
2. Use data to construct graphs that represent the relationships shown in an appropriate mathematical form.
3. Identify the correct units required to describe a given quantity and use dimensional analysis to solve problems.
4. Make informed decisions by examining options and anticipating the consequences of actions.
5. Convert the magnitude of data to and from decimal, scientific and engineering notation, and perform basic mathematical computations using any of these forms.

F. Laboratory Skills: the student will be able to
1. Follow written and/or oral directions to use laboratory equipment to collect experimental data.
2. Assimilate and correlate laboratory data with theory to render a clear, comprehensive, and concise presentation in written or oral reports supported, where appropriate, with various modes of data presentation (e.g., tables, diagrams, graphs, models, etc.).
3. Achieve a desired result by interpreting and executing instructions, plans, models and diagrams, using appropriate equipment and technology.
4. Use graphical extrapolation and interpolation to predict the magnitude of variables not directly measured.

APPENDIX C

STANDARDIZED REPORTING PROFILE
[Appendix C reproduces the Competency-Based Admission Standardized Reporting Profile form, including the subject-area rating categories and the five-point rating scale; the scanned form is not legible in this copy.]
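Appendix C contains the Standardized Reporting Profile form itself. As a purely illustrative companion, not drawn from the report and using invented names such as StudentProfile and meets_threshold, the sketch below shows how an admission office might assemble the reported ratings for the five subject areas and apply an institution-chosen minimum competency level, consistent with the report's statement that "each UW institution would determine the competency level that would be required for admission." The specific threshold and the decision to exclude foreign language are assumptions made only for this example.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

# Hypothetical names; the actual SRP is a paper form, not a data format.
SUBJECTS = ["English", "Mathematics", "Science", "Social Studies", "Foreign Language"]

# SRP scale: 5 (Excellent) down to 1 (Poor); None stands in for NBE here.
RatingValue = Optional[int]


@dataclass
class StudentProfile:
    """A student's reported competency levels plus supplementary SRP information."""
    name: str
    ratings: Dict[str, RatingValue]                               # subject -> 1..5 or None (NBE)
    supplementary: Dict[str, str] = field(default_factory=dict)   # e.g., attendance, honors

    def meets_threshold(self, minimum: int, required_subjects=None) -> bool:
        """Return True if every required subject is rated at or above `minimum`.

        The threshold is an institutional choice; the CBA report does not
        prescribe one. NBE (None) is treated here as not meeting the threshold,
        which would simply trigger review of the traditional transcript instead.
        """
        required = required_subjects or SUBJECTS
        return all((self.ratings.get(subject) or 0) >= minimum for subject in required)


# Example: a hypothetical applicant with no foreign-language rating (NBE).
applicant = StudentProfile(
    name="Example Applicant",
    ratings={"English": 4, "Mathematics": 3, "Science": 4,
             "Social Studies": 5, "Foreign Language": None},
    supplementary={"honors": "National Honor Society"},
)

# Foreign language is not a general UW admission requirement, so an institution
# might evaluate only the other four subjects against its chosen minimum level.
core = [s for s in SUBJECTS if s != "Foreign Language"]
print(applicant.meets_threshold(minimum=3, required_subjects=core))   # True
print(applicant.meets_threshold(minimum=3))                           # False (NBE)
```

Because the report specifies that no pilot student will be denied admission based solely on the SRP, a failed check like the second call above would not itself be an admission decision; it would only route the application back to the traditional Carnegie Unit review.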