PERCEPTIONS OF SECONDARY VOCATIONAL EDUCATORS OF THE APPROPRIATENESS OF A PROCESS FOR THE LOCALLY DIRECTED EVALUATION OF LOCAL VOCATIONAL EDUCATION PROGRAMS

Dissertation for the Degree of Ph.D.
MICHIGAN STATE UNIVERSITY
MARVIN D. DeWITT
1975

This is to certify that the thesis entitled PERCEPTIONS OF SECONDARY VOCATIONAL EDUCATORS OF THE APPROPRIATENESS OF A PROCESS FOR THE LOCALLY DIRECTED EVALUATION OF LOCAL VOCATIONAL EDUCATION PROGRAMS presented by Marvin D. DeWitt has been accepted towards fulfillment of the requirements for the Ph.D. degree in Vocational-Technical Education.

Major professor
Date: November 12, 1975

PERCEPTIONS OF SECONDARY VOCATIONAL EDUCATORS OF THE APPROPRIATENESS OF A PROCESS FOR THE LOCALLY DIRECTED EVALUATION OF LOCAL VOCATIONAL EDUCATION PROGRAMS

BY

Marvin D. DeWitt

The purpose of this study was to determine the extent to which teachers support Byram's evaluation system as a process for locally directed evaluation of local Michigan vocational education programs in comprehensive high schools served by area centers. Recommendations were made regarding locally directed evaluation of local vocational education programs in comprehensive high schools utilizing area centers.

The instrument used in this study was developed from a manual by Harold M. Byram and Marvin Robertson entitled Locally Directed Evaluation of Local Vocational Education Programs, A Manual for Administrators, Teachers, and Citizens. The questionnaire, containing sixty-four statements, dealt with Byram's concept of how to conduct a local evaluation project. Where appropriate, references to the role of the area center were included within each statement to aid in determining the degree of the respondent's support of the evaluation process for his local school situation in relation to the area center.

Vocational education teachers of fourteen high schools and seven area vocational centers in Michigan were surveyed in this study. Ten of the high schools had been involved in one or the other of Byram's two research and development projects in 1963-65 and 1966-67. The survey included five area centers which either served as an extension of the vocational programs of one of these schools at the time of this study or had been involved in Byram's projects before being designated an area vocational center. The remaining four high schools, and the two area vocational centers serving these four schools, were chosen because they were located in districts which had not participated in either of Byram's two research projects noted above.

The population of respondents for this study consisted of all the teachers of vocational subjects in the above twenty-one schools. This included all teachers who taught or coordinated one or more classes eligible for special reimbursement as vocational education in the seven major vocational-technical teaching areas: agriculture, distributive education, health occupations/education, home economics, office occupations, technical education, and trade and industrial occupations.

Based on the data acquired in this study, vocational education teachers and teacher/coordinators as individuals appear to support Byram's evaluation system for locally directed evaluation of vocational education programs, including those in the area center, as indicated by the following.
1. Although the majority of the teachers had no experience in one of Byram's evaluation projects, when they were confronted with the process they viewed it favorably. Although the teachers were not in favor of using a portion of faculty meetings as a way to provide staff time to work on evaluation activities, they were in favor of budgeting funds for inservice education and curriculum planning as alternatives.

2. The majority of the teachers believed that an evaluation project should be initiated only when the board and the school staff, including the area center staff, are committed to program evaluation. The majority also indicated that each vocational area at the area center should be represented on the staff committee, and that all members of the faculty and administration of both the high school and the area center should participate in the evaluation project.

3. There was a high level of agreement among the teachers in their opinions on sixteen selected survey statements in support of Byram's evaluation system as a process for use by a comprehensive high school to evaluate the quality of its vocational education programs, including the instruction available in an area center. The average level of agreement (strongly agreed plus agreed) with all sixteen survey statements by the 278 teachers was 79 percent.

4. There were no significant differences in responses to survey statements by teachers in project schools, non-project schools, and project area centers. With an alpha level of .10, no significant differences were found (P=0.7719) in the responses for the three types of institutions. Teachers' opinions did not appear to differ between institutions which had been involved in one of Byram's projects and those which had not.

5. There were no significant differences in teachers' responses to selected survey statements when compared by type of institution, size of institution, and years of experience, except in a three-way analysis of variance comparing type of institution by size of institution by years of teaching experience. The three-way analysis of variance indicated a significant difference (P=0.0615), with an alpha level of .10, among responses of teachers with eleven or more years of teaching experience. Teachers with eleven or more years of teaching experience in the large area centers and the small schools showed a relatively higher level of support than those with eleven or more years of teaching experience in small area centers and large schools.

6. There was relatively high agreement on what position the local leader should hold in the school system. Nearly half of the teachers (46 percent) said the director of vocational education should be the local leader, while 28 percent said a teacher should be the local leader.

7. A majority of the teachers believed the local leader should be provided with released time. Fourteen percent said full-time, 39 percent said half-time, and 26 percent said one hour per day.

Recommendations were as follows.

1. In the interests of improving instruction in secondary vocational education, procedures should be set up for encouraging locally directed evaluation of local programs which include the area center.

2. Locally directed evaluation should be a responsibility of all teachers and administrators in both home schools and area centers.

3. Funds should be budgeted at the local level for inservice education and curriculum planning to support activities in the local evaluation program.
4. Programs in the training of local personnel in local program evaluation should be made available to teachers and administrators of both home schools and area centers.

5. Local leaders of locally directed evaluation programs should be provided with released time from their daily schedules.

PERCEPTIONS OF SECONDARY VOCATIONAL EDUCATORS OF THE APPROPRIATENESS OF A PROCESS FOR THE LOCALLY DIRECTED EVALUATION OF LOCAL VOCATIONAL EDUCATION PROGRAMS

BY

Marvin D. DeWitt

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

Department of Secondary Education and Curriculum

1975

ACKNOWLEDGMENTS

The writer expresses his appreciation and gratitude to Dr. O. Donald Meaders for his counsel and guidance throughout the development of the study, to Dr. Rex Ray for his interest and assistance as a Guidance Committee member, and to Dr. Dale Alam and Dr. Erling Jorgensen for their time and assistance as members of the Guidance Committee. Sincere appreciation is also expressed to the vocational education teachers and administrators who graciously took of their time to complete the survey instrument.

To my mother, Lois Eleanor DeWitt, who endured more than the usual amount of suffering.

TABLE OF CONTENTS

LIST OF TABLES

Chapter

I. THE PROBLEM
    Introduction
    Statement of the Problem
    Purpose
    Need for the Study
    Sources of Data and Procedures
    Delimitations
    Limitations
    Definition of Terms
    Organization of the Dissertation

II. DESCRIPTION OF THE AREA CENTER WITHIN THE CONCEPT OF THE COMPREHENSIVE HIGH SCHOOL
    Introduction
    A Description of the Comprehensive High School
    The Comprehensive High School--Its Meaning
    The Comprehensive High School--Its Purpose
    A Philosophy of the Area Vocational Education Program
    The Area Program Concept
    The Area Center Concept

III. REVIEW OF LITERATURE AND RESEARCH
    Definitions of Evaluation
    A Methodological Process
    A Basis for Decision Making
    Other Definitions
    Purposes of Evaluation
    The Need for Evaluation
    Public Interest
    Responsibility to Society
    Basis for Program Improvement
    Broad Perspective
    New Evaluation Techniques Needed
    Importance of Evaluation at State and Federal Levels
    Importance of Evaluation in Vocational Education
    Local Involvement in Evaluation
    Research Studies in Local Evaluation of Vocational Education Programs
    Summary

IV. METHODOLOGY
    Population of the Study
    Questionnaire Design and Survey Procedures
    Background of Schools, Area Centers, and Respondents
    Background of Schools and Area Centers
    Background of Respondents
    Data Analysis Procedures

V. ANALYSIS OF DATA
    Analysis of Data
    Statements Relating to Area Centers
    Teacher Responses to Selected Survey Statements
    Teacher Responses by Institution to Survey Statements
    Teacher Responses by Institution Type, Institution Size, and Teaching Experience
    Additional Data
    Appointing a Qualified Leader to the Position of Local Leader
    Providing Time for Evaluation
    Additional Comments About Program Evaluation
    Goals of Evaluation
    Success of Evaluation
    Input to the Evaluation Process
    Participation in the Evaluation Process
    Need for Evaluation
    Summary

VI. SUMMARY, CONCLUSIONS, AND RECOMMENDATIONS
    Summary
    Statement of the Problem
    Research Design and Procedures
    Findings
    Conclusions
    Recommendations
    Discussion

BIBLIOGRAPHY

APPENDICES

Appendix
    A. Questionnaire
    B. Sample Letter of Request
    C. Sample Cover Letter for Questionnaire
    D. Sample Follow-Up Thank You/Reminder Letter
    E. Respondent School Data
    F. Teacher/Administrator Responses to Survey Statements
    G. Administrator Data

LIST OF TABLES

Table
1. Number of Questionnaires Distributed to and Responses Received from Teachers by Type of Institution
2. Age and Sex of Teacher Respondents Classified by Type of Institution
3. Years of Teaching Experience of Teacher Respondents Classified by Type of Institution
4. Participation of Teacher Respondents in Byram's Projects of Locally Directed Evaluation of Local Vocational Education Programs Classified by Type of Institution
5. Participation of Teacher Respondents in Any Locally Directed Evaluation Project Aimed at Evaluation of Local Vocational Education Programs Other Than North Central Evaluations Classified by Type of Institution
6. Highest Level of Schooling Completed by Teacher Respondents Classified by Type of Institution
7. Relationship of the Sixteen Statements to the Essential Elements for Evaluation, Process Activities, All Survey Statements, and Those Statements Making Reference to the Area Centers
8. Percent of Teachers Who Agreed or Disagreed with Sixteen Selected Survey Statements About Evaluation of Local Vocational Education Programs
9. One-Way Analysis of Variance of Teachers' Responses by Type of Institution
10. Three-Way Analysis of Variance
11. Cell Means of Teacher Respondent Groups by Years of Teaching Experience, Type, and Size of Institution
12. Mean Scores of Teachers with Five Years or Less Teaching Experience by Type of Institution
13. Mean Scores of Teachers with Six and No More Than Ten Years' Teaching Experience by Type of Institution
14. Mean Scores of Teachers with Eleven or More Years' Teaching Experience by Type of Institution
15. Number of Responses of 278 Teachers to Their Preference for Position of Person to be Selected Local Leader
16. Number of Responses of 278 Teachers to Amount of Released Time Per Day to be Provided the Local Leader
F. Responses of 278 Teachers and 50 Administrators to Survey Statements Shown in Percent
G-1. Number of Questionnaires Distributed to and Responses Received from Administrators by Type of Institution
G-2. Age and Sex of Administrator Respondents Classified by Type of Institution
G-3. Years of Teaching Experience of Administrator Respondents Classified by Type of Institution
G-4. Participation of Administrator Respondents in Byram's Projects of Locally Directed Evaluation of Local Vocational Education Programs Classified by Type of Institution
G-5. Participation of Administrator Respondents in Any Locally Directed Evaluation Project Aimed at Evaluation of Local Vocational Education Programs Other Than North Central Evaluations Classified by Type of Institution
G-6. Highest Level of Schooling Completed by Administrator Respondents Classified by Type of Institution
G-7. Years of Full-Time Administrative Experience of Administrator Respondents Classified by Type of Institution
G-8. Position of Local Leader as Perceived by Fifty Administrators
G-9. Responses of Fifty Administrators to Amount of Released Time Per Day to be Provided the Local Leader

CHAPTER I

THE PROBLEM

Introduction

The comprehensive high school as we know it today is ". . . responsible for providing good and appropriate education, both academic and vocational, for all young people within a democratic environment . . . ."1 How well the comprehensive high school is meeting its goal is of growing concern to educators, students, and citizens. All persons associated with a school make judgments about its value and worth, and for these judgments to be valid, adequate data must be available on which to base conclusions. To provide data on which to make value judgments and to base decisions for future improvement, a system for evaluation is needed.

Thompson believes the test of an evaluation system is the one Fred Wilhelms has indicated in the question: "Does it deliver the feedback that is needed, when it is needed, to the persons or groups who need it?"2 According to Thompson, the ASCD believes the following to be the criteria for an evaluation system for schools:

An evaluation system for modern education must: (a) facilitate self-evaluation, (b) encompass every objective valued by the school, (c) facilitate learning and teaching, (d) produce records appropriate to the purposes for which records are essential, and (e) must provide continuing feedback to the larger questions of curriculum development and educational policy.3

1James B. Conant, The American High School Today (New York: McGraw-Hill Book Company, 1959), pp. 7-8.

2John F. Thompson, Foundations of Vocational Education (Englewood Cliffs, N.J.: Prentice-Hall, 1973), p. 198.

3Ibid.

Because of public interest and resultant legislation, the number of vocational programs being offered within the concept of the comprehensive high school has increased greatly in recent years. The quality and appropriateness of these occupational programs have become a concern of those persons involved. Byram and McKinney have said, "The basic concern of people affected by programs of occupational education is whether they are getting what they hope to from the programs, and whether this is comparable to what they are putting in."4

In the interests of improving evaluation within vocational education, Harold Byram of Michigan State University has developed a manual entitled Locally Directed Evaluation of Local Vocational Education Programs, A Manual for Administrators, Teachers, and Citizens. The purpose of the manual is to provide the reader with a process for locally directed evaluation of local vocational education programs. The principles and practices contained within the manual are based, to a large extent, on the experience of, and study by, the staffs of thirteen public schools in Michigan that participated in the first two developmental research projects and the twenty school systems in four states participating in the third project.5 The evaluation process described within Byram's manual is the basis for this study.

4Harold Byram and Floyd McKinney, Evaluation of Local Vocational Education Programs (East Lansing, Mich.: Michigan State University, 1968), p. 2.

Evaluation procedures in vocational education as part of the comprehensive high school concept take on a new concern with the advent of the area center.
The area center provides the means whereby a vocational education program can be developed to serve several comprehensive high schools and, at the same time, serve as an extension of those schools. Evaluation procedures in vocational education need to take these new area center concepts into consideration.

Statement of the Problem

A concern exists for evaluation of local programs of vocational education in local schools and in any area centers which may be a part of that local school. There appears to be a tendency for the area centers to be considered as separate entities. Based on this concern, the problem was to determine the extent to which teachers in local schools and teachers in area centers support a concept of local program evaluation as the concept relates to the area center.

Purpose

The purpose of this study was to determine the extent to which teachers support Byram's evaluation system as a process for locally directed evaluation of local Michigan vocational education programs in comprehensive high schools served by area centers. Recommendations were to be made regarding locally directed evaluation of local vocational education programs in comprehensive high schools utilizing area centers.

5Harold Byram and Marvin Robertson, Locally Directed Evaluation of Local Vocational Education Programs, A Manual for Administrators, Teachers, and Citizens, 3rd ed. (Danville, Ill.: The Interstate Printers and Publishers, Inc., 1970), p. iii.

This study attempted to answer the following questions.

1. To what extent did teachers' opinions on selected survey statements support Byram's evaluation system as a process for use by a comprehensive high school to evaluate the quality of its vocational education programs, including the instruction available in an area center?

2. Was there a difference in responses to survey statements by teachers in project schools, non-project schools, project area centers, and non-project area centers?

3. Were there differences in teachers' responses to selected survey statements in any of the following comparisons:
    a. By type of institution?
    b. By size of institution?
    c. By years of teaching experience?
    d. For type of institution by size of institution?
    e. For type of institution by years of teaching experience?
    f. For size of institution by years of teaching experience?
    g. For type of institution by size of institution by years of teaching experience?

Need for the Study

Vocational education in Michigan at the senior high school level is a development within an American phenomenon called the comprehensive high school. Conant has said the American high school is called comprehensive because it offers, under one administration and under one roof, or on one campus, secondary education for almost all the children of high school age of one district. In Michigan the comprehensive high school concepts have been used to develop guidelines for expanding the availability of vocational education. Emphasis has been on cooperative arrangements between two or more school districts, or between high schools within large districts, for the purpose of operating jointly-shared vocational education programs.6

Programs in vocational education may have far-reaching effects on the development and expansion of the economy. Vocational education programs help to shape the future work and lives of our youth and adults.
Byram has indicated that questions come from many quarters asking about the degree to which such vocational education programs are (a) available and (b) as effective as they should be in meeting the needs of those who support them, of those who benefit from them, and of employers.7 Evaluation, at the state level, of Michigan's vocational edu- cation programs for purposes of policy deve10pment and revision, is conducted by the Michigan Department of Education, Vocational and Technical Education Service. Evaluation of vocational programs at each school district level is accomplished locally. Byram's manual describes the process for effecting locally directed evaluation of local programs. 6Michigan Department of Education, A Tentative Plan for the Development of Area Vocational Education Centers in Michigan (Lansing, Michigan, 1970), p. 4. 7Byram and Robertson, Locally Directed Evaluation, p. I:1. Local programs in any given school district, in many areas in Michigan, include the instruction offered to students by area vocational edu~ cation centers. There is a need for determining the feasibility of using the same procedures in evaluating local programs which include the programs of new and proposed area centers. Byram indicated such a need exists when he wrote: ". . . it is believed that the practices set forth in this manual would merit trial in . . . area vocational schools ."8 The need for evaluation of vocational education programs throughout the State of Michigan is imperative in order to effectively revise present programs and design/implement new programs. Sources of Data and Procedures Data were gathered from teachers and administrators in schools and area centers which provide vocational education in Michigan. Data from administrators may be found in the appendix. The data gathered were in the form of opinions obtained through the use of a question- naire. An instrument based on the eleven activities in Byram's process (see Chapter IV) was developed to survey the opinions of teachers to determine the degree of support for Byram's evaluation process for locally directed evaluation of local vocational education programs in relation to area centers. The instrument contained statements within each of the activities. All items in the instrument were based on 81bid., pp. iii—iv. Byram's procedures that had been tried and tested in three projects, and included references to area centers where appropriate for purposes of this study. The opinions gathered were analyzed by comparing the responses of all teachers to each statement on the survey instrument. Teacher responses were also analyzed by use of four variables: participation or non-participation in one of Byram's projects; type of institution; size of institution; and years of teaching experience. Research reports and other literature dealing with evaluation of programs and local evaluation of programs are reviewed in Chapter III for the purposes of establishing: the purposes of evaluation, the need for evaluation, importance of evaluation in vocational education, local involvement in evaluation and a review of studies in local vocational education programs. Delimitations This study was limited to fourteen high schools, ten of which were located in school districts which had participated in one of Byram's two projects on evaluation of local vocational education programs, and seven area centers, five of which were located in school districts that had participated in one of Byram's projects. 
Only teachers of vocational education classes at the secondary level were included in this study. Counselors and other teachers were not included. Data were collected from administrators but were not incorporated into the analysis of this study. No attempt was made to evaluate the present vocational education programs. Limitations This study was limited to fourteen high schools and seven area vocational education centers in Michigan which had students enrolled in vocational education programs at their high school and/or sent enrolled students to their area vocational education centers. The respondent's attitude toward the instructional environment, his or her past educational experiences, and his or her past experiences with and/or attitude toward vocational education--all may have affected his or her responses. Because the majority of the teachers had not participated in one of Byram's projects, the information obtained from the study represents their opinions about his process. The data collected will apply only to the school districts concerned and are not necessarily indicative of situations in other locations. Definition of Terms For the purpose of this study, the following words and phrases are defined. Appropriateness. Rightness for the purpose; suitability; 9 fitting; proper. Area Center. The term area center is used to indicate a secondary area vocational center designed to expand the vocational training Opportunities of participating K-12 districts by providing 9David B. Guralnik, Webster's New World Dictionary (New York: The World Publishing Company, 1972), p. 68. those programs which cannot be, or are not, provided by each individual district for lack of sufficient student demand and/or financial resources. In addition, the area center is designed to serve students in grades 11 and 12 whose needs cannot be as well met by vocational education courses offered at their own schools. Students enrolled usually spend one-half time in the area center and one-half time in their home schools. Building. A home high school or area center, either large or small, which may or may not have been included in one of Byram's evaluation projects. Comprehensive High School. A public high school expected to provide education for all youth living in a town, city, or district.10 Evaluation. The term evaluation refers to the task of making judgments about the worth or value of a total program of vocational or technical education. Evaluation involves primarily the determination of the extent to which previously established goals and objectives are being or have been attained.11 Home School. A comprehensive high school that sends students to an area center, on a half-day basis, for instruction in vocational education programs. Institution. One or more buildings categorized by specific characteristics. 10James B. Conant, The American High School Today (New York: McGraw-Hill Book Company, Inc., 1959), p. 7. 11Byram and Robertson, Locally Directed Evaluation, p. 1:1. 10 a. Type of Institution. An institution categorized as a Project School, Non-Project School, Project Area Center, or Non-Project Area Center. b. Size of Institution. An institution categorized as a large school or a small school. Intermediate District. Intermediate district refers to that administrative unit which functions between the local school districts and the State Department of Education. Large School. A home high school with a student enrollment of 1,500 or more. Local Level. 
In this study, local level refers to education at less than state level and includes any such school and the school system. Non-Project Area Center. An area center serving at least one home school which in the past did not participate in one of Byram's projects. Non-Project School. A home high school not included in one of Byram's projects. Project Area Center. An area center serving at least one home school whose high school personnel in the past may have participated in one of Byram's projects at that home high school. Project School. A home high school whose personnel in the past may have participated in one of Byram's projects at that home high school. 11 Secondary_School. Secondary school refers to a comprehensive high school designated by its school system as a senior high school, and may consist of grades 9 through 12. Small School. A home high school with a student enrollment of less than 1,500. Vocational Education. Vocational education, in this study, refers to education at the secondary level which is eligible for special reimbursement as vocational education in the seven major vocational- technical teaching areas as identified by the U.S. Office of Education: agriculture, distributive education, health occupations/education, home economics, office occupations, technical education, and trade and industrial occupations. Organization of the Dissertation Chapter I contains the statement of the problem, purpose and need for the study, sources of data and procedures, delimitations and limitations of the study, definitions of terms, and organization of the dissertation. Chapter II contains a rationale for the area center within the concept of the comprehensive high school. Chapter III is a review of related literature and research. Chapter IV contains a description of the population; question- naire design and survey procedures; background of schools, area centers, and respondents; and data analysis procedures. Chapter V contains an analysis of the data; and Chapter VI, the final chapter, contains the summary, conclusions, and recommendations. CHAPTER II A DESCRIPTION OF THE AREA CENTER WITHIN THE CONCEPT OF THE COMPREHENSIVE HIGH SCHOOL Introduction The purpose of this chapter is to describe the inclusion of the secondary vocational education center within the concept of the compre- hensive high school. A Description of the Comprehensive High School The Comprehensive High School--Its Meaning The comprehensive high school, according to Conant, is a result of American history. Conant said each community in our nation has been legally expected to provide free education. As the years passed, high schools gradually were expected to provide education to all youth irregardless of their abilities and interest. American students are expected to plan their course of study from a variety of subjects offered by the high school. Thus, the name comprehensive high school.1 This is not to infer all American high schools are comprehensive high schools. The difficulty of defining the term comprehensive high 1James B. Conant, The Comprehensive High School, A Second Report to Interested Citizens (New York: McGraw-Hill Book Co., 1967), pp. 3-4. 12 13 school prevents all high schools being included in such a definition. Adolf Panitz believes the difficulty in defining the term becomes more apparent upon examining schools to which the term is applied. 
As Panitz has said:

To some administrators, it means a broad program of academic studies with, perhaps, some offerings in business education and home economics. To others, it means a general and college preparatory program with some shop courses in industrial arts. Still others have added a small number of vocational courses, thus giving some credence to their claim to comprehensiveness. A different approach makes the composition of the student body the basis of comprehensiveness. In this view, if the student body represents a cross section of the socio-economic structure of the community, the school is comprehensive. Conant defines the comprehensive high school in terms of national values and the quality of the curriculum: "curriculum of common democratic understandings which seek to provide in its elective offerings excellent instruction in academic fields and rewarding first-class vocational education." Havighurst uses the socio-economic origin and the educational aspirations of the students: "in comprehensive high schools with 30 to 70 percent of their students from working class homes, between 25-70 percent of their graduates go to college." This list could be substantially expanded. It is significant that nearly always the definition includes mention of college preparatory or college entrance. Rarely is there concern with the kind of education that would provide the high school graduate with several alternatives.2

Panitz's referral to Conant's definition is in Conant's book, The American High School Today, in which Conant also said:

With few exceptions, for the most part in large eastern cities, the public high school is expected to provide education for all the youth living in a town, city, or district. Such a high school has become known as a "comprehensive" high school in contrast to the "specialized" high schools which provide vocational education or which admit on a selective basis and offer only an academic curriculum.3

2Adolf Panitz, "What Makes a High School Comprehensive?" in Contemporary Concepts in Vocational Education, ed. Gordon F. Law (Washington: American Vocational Assn., 1971), pp. 205-06.

3James B. Conant, The American High School Today (New York: McGraw-Hill Book Co., 1959), pp. 7-8.

The concept of a comprehensive high school is unclear to many who have had no communication with American public high schools. Conant defined the comprehensive high school by quoting John Gardner of the Carnegie Corporation of New York:

The comprehensive high school is a peculiarly American phenomenon. It is called comprehensive because it offers, under one administration and under one roof (or series of roofs), secondary education for almost all the high school age children of one town or neighborhood. It is responsible for educating the boy who will be an atomic scientist and the girl who will marry at eighteen; the prospective captain of a ship and the future captain of industry. It is responsible for educating the bright and the not so bright children with different vocational and professional ambitions and with various motivations.
It is responsible, in sum, for providing good and appropriate education, both academic and vocational, for all young people within a democratic environment which the American people believe serves the principles they cherish.4

Conant, in his Second Report to Interested Citizens, said of the 15,000 high schools surveyed: "One may conclude that the picture which the results present is a fairly accurate and national index of the comprehensiveness of our schools." He went on to say,

It will be noted that over three-quarters of all the responding schools fall in the categories of those from which between 25 percent and 75 percent of the graduating class enroll in institutions of higher education. These schools with a heterogeneous student body I have chosen to call widely comprehensive high schools.5

4Conant, The Comprehensive High School, p. 3.

5Ibid., pp. 6-7.

A comprehensive high school, by virtue of its heterogeneous student body, offers secondary education for almost all the high school age children of one town or neighborhood, as Gardner pointed out. The comprehensive high school is no more general education oriented than vocational education oriented. Rather, it is a combination of both, serving the needs of all the youth of our nation.

The Comprehensive High School--Its Purpose

Our schools are a reflection of our culture. They value what our society values. John Thompson has said that education grows out of social values and norms. Consequently, education has become a highly complex system, influenced by people and institutions. Thompson quotes Havighurst and Neugarten as saying that:

In a changing society there is always some divergence between what society is and what it wants to be, between practices and its ideals. Thus, the educational system, being part of the culture, has two supplementary functions: to be a mirror that reflects the society as it is, and at the same time, to be an agent of social change and a force directed toward implementing the ideals of society.6

6Thompson, Foundations of Vocational Education, p. 7.

American schools have taken on the role of change agent for our society. Increasing efforts have, according to Conant, been made in the last thirty or forty years to insure that American youth, regardless of their background, would come to understand each other--a purpose of the comprehensive high school. In terms of American social and political ideas, Conant said:

The comprehensive high school . . . endeavors to provide a general education for all future citizens on the basis of a common democratic understanding; and it seeks to provide in its elective offerings excellent instruction in academic fields and rewarding first-class vocational education.7

7Conant, The Comprehensive High School, p. 4.

Panitz has indicated there is no all-embracing meaning of the term comprehensive that can be accurately applied to education. "Perhaps, the popular notion of 'all inclusive' is most appropriate for the description of a high school, as it comes close to the long held American ideal of equal educational opportunities for all our youth"--another purpose of the comprehensive high school.8

A distinctive purpose of the comprehensive high school is its inclusiveness, according to Keller--inclusiveness not only in reference to clientele, but also in reference to aims and purposes:

The comprehensive high school aims to serve the needs of all American youth.
That is to say, it accepts without selection all the young people in the area it commands--all races, creeds, nationalities, intelligences, talents, and all levels of wealth and social status. Such a school has as its broadest objective the teaching of all varieties of skills, all kinds of knowledge to all kinds of youth bent upon living socially profitable lives. To each one it seeks to give the course for which he seems best fitted. Its design is to prepare one and all for potentially successful vocations. The comprehensive high school prepares the college-oriented youth for college. It qualifies the non-college bound youth and, as far as is possible, the boy or girl who will drop out before graduation, for an occupation. It is adapted to give everyone a general education for the common things he will do in life and it may and should give some pupils of high capacity preparation for both college and occupation. In this last area it functions also as a double-purpose high school.9

8Panitz, "What Makes a High School Comprehensive?" p. 205.

9Franklin J. Keller, "Vocational and Educational Guidance," in Vocational Education, The Sixty-fourth Yearbook of the National Society for the Study of Education, ed. Melvin J. Barlow (Chicago: The University of Chicago Press, 1965), p. 162.

Should vocational education be included within the comprehensive high school concept? Conant wrote in his book, The Comprehensive High School, that he would like to repeat a conviction he acquired ten years prior and which recent discussions had not altered. In The American High School Today, he had formulated his conviction as follows:

My inclination is strongly in favor of including vocational work in a comprehensive high school instead of providing it in a separate school. My reasons are largely social rather than educational. I believe it is important for the future of American democracy to have as close a relationship as possible in high school between the future professional man, the future craftsman, the future manager of industry, the future labor leader, the future salesman, and the future engineer. As I have often stressed in my writings and earlier in this report, I am convinced that one of the fundamental doctrines of American society is equality of status in all forms of honest labor as well as equality of opportunity. To my mind, it is desirable for as many boys and girls in high school as possible to have an ultimate vocational goal. It may well be that many of them will change their minds before the high school course is over or in later years. But if a student thinks that what he or she is studying in school is likely to have significance in later life, the study in question takes on a new importance. There is less tendency for such "committed" students to waste their time or have a negative attitude toward their school work.10

If the comprehensive high school has the general responsibilities previously stated, including vocational education, what then are the purposes of vocational education within the comprehensive philosophy?

Barlow quotes Conant as saying the purpose of vocational education programs at the high school level:

. . . is to develop skills for useful employment. These programs relate school work to a specific occupational goal and involve more than training for specific jobs.
Vocational education is not offered in lieu of general academic education, but grows out of it, supplementing and enhancing it.11 A vital purpose of vocational education is to provide vocational training to all youth before they enter the labor market. 10Conant, The Comprehensive High School, pp. 62-63. 11Melvin J. Barlow, "The Challenge to Vocational Education," in Vocational Education, The Sixty-fourth Yearbook of the National Society for the Study of Education, ed. Melvin J. Barlow (Chicago: The Uni- versity of Chicago Press, 1965), p. 6. 18 P.L. 90-576, the Vocational Amendments of 1968, has given impetus to this view. By identifying "all," the Act emphasizes the need for vocational education for the disadvantaged, the handicapped, secondary school youth, post secondary enrollees, and adults who have entered 12 the work force. Haskew and Tumlin state at least three objectives, not mutually exclusive, which exist for vocational education: 1. To make the prime objectives of vocational education coterminous with the intellectual training and personal- development objectives of the common school (elementary and secondary) . . 2. To make occupational orientation for all and portal preparation for some the peculiar contribution of vocational education to a total common-school program. 3. To make the objectives of vocational education, primarily those of job-training for the emergent labor market, Open to large numbers of students as an alternative to the academic specialization routes in the . . . schools . . .13 To survive, Shoemaker has said, public education must accept greater responsibility. Services must be expanded and improved. If the individual is to achieve his goals, education must be based on sound principles of learning. Education must be experience-oriented. "Work oriented education should be recognized as an effective means 12John Beaumont, "Philosophical Implications of the Vocational Education Amendments of 1968," in Contemporary Concepts in Vocational Education, ed. Gordon F. Law (Washington: American Vocational Association, 1971), p. 16. 1:I’Laurence D. Haskew and Inez Wallace Tumlin, "Vocational Edu- cation in the Curriculum of the Common School," Vocational Education, The Sixty-fourth Yearbook of the National Society for the Study of Education, ed. Melvin J. Barlow (Chicago: The University of Chicago Press, 1965), p. 76. 19 for selecting and preparing for employment. It must also be seen as a means of individual fulfillment."14 Shoemaker has indicated a need for expanding sound vocational education programs: It seems clear that there can be no relevance in a curriculum unless it is related to student goals, and I submit that the success of vocational education is due to the fact that it is goal-centered education based upon the student's choice of a goal. The present Commissioner of Education, Sidney P. Marland, has made a strong statement supporting the need for expansion of vocational education. He has declared war upon the general education program which allows students to wander through school, graduating with the necessary 16 or 17 credits, but unprepared to go to college or to work. 
It is hoped that this interest and concern is directed toward the expansion Of sound vocational edu- cation programs designed tO prepare youth for employment in a technological society, rather than the mere vocationalizing of general education in such a manner as to make it more interesting but little more productive.15 High schools, to be productive and comprehensive, must provide relevant educational Opportunities to all our youth. TO accomplish this there must be an integration of the vocational curriculum with the general curriculum. A trend toward such an integration has begun. Swanson has said such a trend is progressing rapidly: The gist of this effort is to place greater emphasis at an earlier age on meaningful information about the world of work, on occupational exploration and work experiences, and on closer coordination Of the communication skills, mathematics, physical science and social studies with occupational skill training. New emphasis is being placed on education in interpersonal relation- ships and problem-solving approaches and techniques. The latter are intended to equip youth to make adaptations to the rapidly changing job recruitments in the labor force. Valuable research and experimental efforts in behalf of these newer goals is already 14Byrl R. Shoemaker, "People, Jobs and Society: Towards Relevance In Education, Contemporary Concepts in Vocational Education, ed. Gordon F. Law (Washington: American Vocational Association, 1971), p. 21. lslbid. 20 under way and will, no doubt, progress rapidly as additional funds and resources are applied.16 Conant stated, "Perhaps one might say that if a school is going to be comprehensive it should offer, at the very least a strong program in one subject on the academic side and one subject on the vocational side."17 The foregoing does not suggest integration of the two, but at least equal need for the existence Of both is indicated. Walsh and Selden believe the main purpose of a vocational edu- cation program is the instruction that is given in a curriculum should be designed to meet the employment Objective Of an occupational area. They have indicated: "A balanced program of vocational education is required to provide the range of skills needed in a competitive labor market."18 Barlow said in 1965, "The challenges to vocational education have created a nation-wide interest in it and its total role in society. Vocational education is a means, in the judgment of many persons, Of meeting an important need of American society." Of the need for vocational education to not be just the responsibility of those in vocational education, he said: A platform for vocational education in the future will be constructed upon the strength of renewed commitments to the American ideal of education for all. Vocational education must figure prominently in the attainment of this goal. The end 16J. Chester Swanson, "Criteria For Effective Vocational Edu- cation," Contemporary Concepts In Vocational Education, ed Gordon F. Law (Washington: American Vocational Association, 1971), p. 26. 17Conant, The Comprehensive High School, p. 68. 18John Patrick Walsh and William Selden, "Vocational Education in the Secondary School," Vocational Education, The Sixty-fourth Year- book Of the National Society for the Study of Education, ed. Melvin J. Barlow (Chicago: The University Of Chicago Press, 1956), pp. 88 and 90. 21 product is not solely the responsibility Of vocational educators. 
Successful vocational education programs, to contribute maximally to the social and economic stability of the nation, must evolve from many relevant sources. Alger, in 1967, said: Vocational education is an integral part Of the total process. The basic purpose of vocational education is to develop skills, abilities, understandings, attitudes, work habits, and appre- ciations which are necessary for occupational success. In a democracy the individual must have freedom of choice in determining his occupation. He must have the Opportunity to prepare himself for that career.20 Burkett, in 1971, quoted the financial columnist Sylvia Porter: "Of all the education programs we have, vocational education may hold the most glittering surprises for us."21 '0f the purposes for vocational education, Burkett, in 1974, quoted the American Vocational Association Board of Directors: . . to develop and promote comprehensive programs of vocational education through which individuals are brought to a level of occupational performance commensurate with their innate potential and the needs of society.22 15 the comprehensive high school what we mean it to be? Panitz believes it is not: The comprehensive high school--what a magic term, what powerful appeal to schoolman and layman and, yet, how misleading. In the search for solutions to the problems created by the ever-multiplying 1gBarlow, "The Challenge to Vocational Education," p. 18. 20Leon J. Alger, "A Rationale For the Establishment of Area Vocational Education Programs in Michigan" (Ph.D. dissertation, Michigan State University, 1967), p. 127. 21Lowell A. Burkett, "Access to a Future," Contemporary Con- cepis in Vocational Education, ed. Gordon F. Law (Washington: American Vocational Association, 1971), p. 34. 22Lowell A. Burkett, "Study Panel Reports," American Vocational Journal 49 (January 1974):10. 22 and diverse needs of our youth, the attempted solutions have invariably involved the addition of some shop programs for the academically less able. Most of these additions have failed, simply because they never became a part of the total program. Adding courses does not make a school comprehensive. Such actions failed because they neglected to reach the heart of the problem. Vocational education is neither a series of subjects nor a separate discipline: "Vocational education is much more than knowing how to work. It is education with a purpose, for a purpose. Vocational education is learning how to work in a milieu, among people, with people, for people. It is learning to live with other workers, at home, at play and in the community" (Keller). Schools must recognize that: "A man's occupation in American society is now his single most significant status-conferring role. Whether it be high or low, a job status allows the individual to form some stable conception of himself and his position in the community" (Brookover and Nosow).23 The comprehensive high school in today's society must be truly comprehensive. Panitz has said a man's occupation being the single most status conferring role: . . . clearly demands a strong orientation in our educational system towards work and careers. Without such orientation, edu- cation is irrelevant to individual needs. The many young people who continue to drop out give clear and loud voice to such irrelevance. 
The trend towards comprehensive schools is in- creasing but before we give another name to the same old high school and arouse expectations that cannot be met, it will be necessary for us to initiate a most critical analysis of the American secondary school and then be prepared to implement some fundamental changes.24 The Michigan Department of Education has seen the need for a strong orientation in Michigan's high schools toward work and careers. The Department has gone on record saying: TO assure that all Michigan citizens who need it will have ready access to adequate occupational preparation, specific responsibilities rest with the high schools, secondary area 23Panitz, "What Makes a High School Comprehensive?" pp. 214-15., 24Ibid., p. 215. 23 vocational centers, community colleges, four-year colleges and universities, intermediate school districts, and the State Department of Education. . these basic understanding should be integrated into the total school program by relating the student's basic education to the world of work.25 In summary, the comprehensive American high school needs to be an integration of both general education and vocational education-- not separate programs but an amalgam Of the two--thereby providing appropriate education for all the youth of our society. Vocational education can, within the comprehensive high school concept, provide the relevance for that which many hold to be of such importance that youth must learn of it and master it to the best of their abilities. A Philosophy of the Area Vocational Education Program An explanation of the area vocational education program concept is necessary as the basis for a definitive explanation of the area vocational education center, since the area center is a part of that larger entity--the area program. The Area Program Concept The area program is founded upon the conviction that all persons should have available to them quality vocational education programs. The Michigan Department of Education has indicated such 25Michigan Department of Education, A Position Statement Con- cerningithe Development of Area Vocational and Technical Education Programs in Michigan (Lansing, Mich.: Michigan Department of Education, 1967), p. 54. 24 programs should be directed to ”. . . individual occupational preparation needs, abilities and interests."26 Early impetus for the area program stemmed from Title VIII of the National Defense Education Act of 1958, stated in part below: Sec. 801. The Congress hereby finds that the excellent programs of vocational education, which States have established and are carrying on with the assistance of the Federal Government . . need extension to provide vocational education to residents of areas inadequately served. . . . It is therefore the purpose of this title to provide assistance to the States so that they may improve their vocational education programs through area vocational education programs as providing vocational and technical training and retraining for youths . . . designed to fit them for useful employment as technicians or skilled workers in scientific or technical fields.27 In 1959, the American Vocational Association gave support to the area program: . . area vocational programs operated by the state, by the county, or by cooperation among local school districts, can offer training to youth and adults who do not now have such opportunities in their local secondary schools.28 In 1967, the Michigan Department of Education indicated the area program . . . 
emphasizes cooperative arrangements between two or more school districts, usually adjacent, or between high schools within large districts for the purpose of Operating jointly-shared vocational education programs for people in relatively large geographical areas or areas of high population density.29 26Michigan Department of Education, A Position Statement, p. 53. 27U.S., Congress, National Defense Education Act of 1958, Public Law 85-864, 85th. Congress, 1958, p. 1597. 28Alger, A Rationale, pp. 9 and 14. 29Michigan Department of Education, A Position Statement, p. 53. 25 According to the Michigan Department of Education, an area program has the following advantages. 1. It provides for a broader tax base distributed over a larger population than is usually present in a single school district. 2. It avoids unnecessary duplication of equipment, services and costs which might occur if two or more neighboring districts elected to offer identical or similar training programs. 3. It makes possible a broader range of curriculum offerings and, therefore, a more extensive program of occupational training opportunities. 4. It offers training Opportunities to a larger number of persons than is possible in smaller schools serving single communities. 5. The area program concept is the best means through which single school districts lacking sufficient financial resources and/or students can provide adequate vocational education opportunities to enable all youth and adults to develop and maintain satisfactory occupational competence.30 The Michigan Department of Education, again in 1970, viewed the area program as one which ". . . emphasizes cooperative arrangements between two or more school districts, usually adjacent, or between high schools within large districts . ."31 The area program, usually involving one or more intermediate districts, may offer a number of vocational education plans each of which contains vocational Offerings for a geographic area. A plan may include what each school might do best, and then when looking at the total of all the schools in the geographical area, the plan may provide for what the schools cannot do individually and should be done collectively. An area center is one plan within the area program. SOIbid. 1Michigan Department of Education, A Tentative Plan For the Development of Area Vocational Education Centers in Michigan (Lansing, Mich.: Michigan Department of Education, 1970), p. 4. 26 The Area Center Concept The area center concept was defined, in 1967, in terms of a building or buildings: Area Center. A building or complex of buildings designated by the State Board of Education to be used expressly for providing vocational education programs.32 TO aid in the definition of the concept, the Michigan Department of Education also stated in 1967: "The area (center) program serves as a centralized extension of existing vocational programs in participating high schools."33 In addition, consideration was given in 1970 to including operational characteristics in the area center concept: To participate in vocational programs not provided in their home high school, students would be transported to an area center for occupational education for half of the time. This could be by half days, by one to three days per week, or other block time arrangements fitting local schedules, distances, and other factors.34 For the home, or participating high school, the area (center) program offers "another classroom down the hall," at a central location. 
Students participating in an area program are transported, or provide their own tranSportation, to the facility Offering the program from their home high school. The area center program enables all students to retain their identity with their home high school. As the comprehensive high school needs to provide an integration of vocational education with general education, there needs to be an 32Michigan Department of Education, A Position Statement, p. 51. 33Michigan Department of Education, A Tentative Plan, p. 4. 341bid. 27 integration of the secondary area vocational center program with the vocational education programs of each of the participating districts. The policies of the Michigan Department of Education have been developed to provide for the integration of the instructional program at the area center with the vocational programs of the participating K—12 districts: The Secondary_Area Vocational Center. The secondary area vocational center should serve to expand the vocational training opportunities of K-12 participating districts. Those programs which cannot be provided by each individual district for lack of sufficient student demands and/or financial resources might be successfully provided in the jointly-supported area center.35 The area center has received support from the Vocational Amendments of 1968 by providing funds for exemplary programs. Public Law 90-576 states, in part: Sec. 143.(a) Grants or contracts pursuant to this part may be made, upon terms and conditions consistent with the provisions of this part, to pay all of the cost of-- "(l)planning and developing exemplary programs or projects such as those described in paragraph (2), or "(2)establishing, operating, or evaluating exemplary programs or projects designed to carry out the purposes set forth in section 141 . . .36 Section 141 of Public Law 88-210, indicates the following purposes for the Amendments: Sec. 141. The Congress finds that it is necessary to reduce the continuing seriously high level Of youth unemployment by developing means for giving the same kind of attention as is now given to the college preparation needs of those young persons who go on to college, to the job preparation needs of the two out of three young persons who end their education at or before completion 35Michigan Department of Education, A Position Statement, p. 54. 36U.S., Congress, Vocational Amendments of 1968, Public Law 90-576, 90th. Congress, 1968, p. 1081. 28 of the secondary level, too many of whom face long and bitter months of job hunting or marginal work after leaving school. The purposes of this part, therefore, are to stimulate, through Federal financial support, new ways to create a bridge between school and earning a living for young peOple, who are still in school, who have left school either by graduation, or by dropping out, or who are in post secondary programs of vocational preparation and to promote cooperation between public education and manpower agencies.37 The single phrase referred to previously by the Michigan Department of Education is perhaps the most definitive: the area center ". . . should provide those programs which cannot be provided by each individual district for lack of sufficient student demands and/or financial resources. ."38 The area center has been an established fact in Michigan for several years. A number of area center programs are in existence with more being planned. 
The purpose of the area center has been demonstrated by its role in assisting the comprehensive high school in meeting the educational needs of our youth.

In summary, the area center is a part of the area program--it is not the area program but, instead, one means of achieving vocational education goals within the area program.

37U.S., Congress, Vocational Amendments, p. 1080.

38Michigan Department of Education, A Position Statement, p. 54.

CHAPTER III

REVIEW OF LITERATURE AND RESEARCH

The purpose of this chapter is to identify and examine literature and research pertaining to local evaluation of vocational education programs. In order to provide the reader with an overview of current thinking in education in regard to program evaluation, this chapter is organized in the following manner.

1. Definitions of evaluation
2. Purposes of evaluation
3. The need for evaluation
4. Importance of evaluation in vocational education
5. Local involvement in evaluation
6. Research studies in local evaluation of vocational education programs
7. Summary

Definitions of Evaluation

Byram defined evaluation as the process of making judgments about the worth or value of a total program.1 A number of definitions follow, many of which support Byram's definition of evaluation.

1Harold M. Byram and Marvin Robertson, Locally Directed Evaluation of Local Vocational Education Programs (Danville, Ill.: The Interstate Printers and Publishers, Inc., 1970), p. I:1.

Lewis indicates evaluation is a necessary part of the instructional process.

Evaluation is an essential, an integral, and an ongoing part of the teaching-learning process . . . it enables the students and teachers to know how much progress has been made and what can be done to improve performance.2

Kibler, Barker, and Miles have defined evaluation as involving the use of tests and instruments to measure the acquisition of knowledge, skills, and attitudes.3

Husek has defined evaluation in terms of its being a subjective decision made by one or more people. A teacher giving a grade for student performance is evaluation, just as is the decision of a dean not to grant tenure to a member of his faculty.4

Cohen sets forth a broader definition for evaluation in education. He believes evaluation is a mechanism with which the character of an educational enterprise can be explored and expressed.5

2James Lewis, Jr., Administering the Individualized Instruction Program (West Nyack, N.Y.: Parker Publishing Company, Inc., 1971), p. 117.

3Robert J. Kibler, Larry L. Barker, and David T. Miles, Behavioral Objectives and Instruction (Boston: Allyn and Bacon, Inc., 1970), p. 13.

4T. R. Husek, "Different Kinds of Evaluation and Their Implications for Test Development," UCLA Evaluation Comment 11 (October 1968), p. 8.

5David K. Cohen, "Politics and Research: Evaluation of Social Action Programs in Education," in Review of Educational Research, Educational Evaluation, ed. Gene V. Glass (Washington: American Educational Research Association, April, 1970), p. 245.

A Methodological Process

Banathy, in discussing the fact that the systems approach may offer a framework and procedures useful in evaluating existing curriculums, views evaluation in the following manner.

In evaluation, the nature of strategies used is analysis. We are not building a new system; we are only furnishing data that can be used to correct, adjust, or rebuild the system.6

Erlandson contrasts a pervasive view vs. a useful view of evaluation:

Typically . . .
evaluation is something that someone does to someone else. The final product is seen as a rating of value. This view of evaluation is the pervasive one, and it has a limited utility. A more useful view of evaluation is that it is a system responsible for both assessing the pertinent features in and around a school program and generating alternatives to improve that program.7 Scriven agrees evalaution is a methodological activity: Evaluation is itself a methodological activity which is essentially similar whether we are trying to evaluate coffee machines or teaching machines, plans for a house or plans for a curriculum. The activity consists simply in the gathering and combining of performance data with a weighted set of goal scales to yield either comparative or numerical ratings, and in the justification of (a) the data-gathering instruments, (b) the weightings, and (c) the selection of goals.8 According to House, evaluation consists of a highly specific, precise methodology involving behavioral objectives: 6Bela H. Banathy, Instructional Systems (Belmont, Calif.: Fearon Publishers, 1968), p. 85. 7David A. Erlandson, "Evaluation and an Administrator's Autonomy," in School Evaluation, The Politics and Process, ed. Ernest R. House (Berkeley, Calif.: McCutchan Publishing Corporation, 1973), p. 21. 8Ralph W. Tyler, Robert M. Gagne, and Michael Scriven, Per- spectives of Curriculum Evaluation (Chicago: Rand McNally and Company, 1972), p. 40. 32 Evaluation has consisted of a highly specific, precise methodology. It has consisted of specifying in behavioral Objectives what students are supposed to be doing and measuring those objectives with standardized achievement tests. Lessinger and associates state that: Evaluation is the process of assessment or appraisal of value, the comparison of desired outcomes (objectives) with the actual progress made in actual accomplishments.1 Westbury, in defining evaluation, quotes Harris: In his second paper (1963, p. 191) Harris . . . defines evaluation as the "systematic attempt to gather evidence regarding student behavior that accompanies planned educational experi- ences.11 The following popular definition of evaluation, quoted by Guba, also presents evaluation as a process. . . . that formulation which regards evaluation as a process for determining the congruence of performance and objectives. All school programs should be guided by behavioral Objectives: indeed, it is the essence of program planning to project objectives and the essence of curricular planning to project a series of experiences through which the pupil can achieve the objectives. Similarly it is the essence of evaluation to deter- mine whether the objectives were in fact met.12 9Ernest R. House, "Technology and Evaluation," Educational Technology 13 (November l973):21. 10C. D. Sabine, ed., Accountability: Systems Planning in Education, Leon Lessinger and Associates (Homewood, Ill.: ETC Publi- cations, 1973), p. 236. 11Ian Westbury, "Curriculum Evaluation," in Review of Edu- cational Research, Educational Evaluation, ed. Gene V. Glass (Washington: American Educational Research Association, April, 1970), p. 245. 12Egon G. Guba, Evaluation and Chapge in Education, a report prepared by the National Institute for the Study of Educational Change for the U.S. Office of Education (Bloomington, Indiana: The National Institute for the Study of Educational Change, 1968), p. 8. 33 Guba extends this definition to include the use of information obtained in evaluation for decision making. 
A definition of evaluation which . . . is currently under formulation in a number of centers of activity . . . views evaluation as a process of providipg and using information for making educational decisions.13 A Basis for Decision Making The following describe evaluation as a tool for decision making. . a procedure for collecting and analyzing data to produce pertinent information which can be used to facilitate decision- making by decision-makers.14 Evaluation consists of the collection and use of information concerning changes in student behavior to make decisions about an educational program.15 The current meaning of the term "evaluation" in several recent writings and in federal legislation is that it is the gathering of empirical evidence for decision-making and the justification of decision-making policies and the values upon which they are based.16 Evaluation is a process of examining certain Objectives and events in the light of specified value standards for the purpose of making adaptive decisions.17 Lessinger, who sees evaluation as a continuing process, refers to Stufflebeam's definition of evaluation: 13Ibid., p. 11. 14Leon Jones, "Using Evaluation Data to Improve an Ongoing Program: A Methodology" (paper presented at the New England Educational Research Organization Conference, Boston College, Chestnut Hill, Mass., June 4, 1971), p. 2. 15Westbury, "Curriculum Evaluation," p. 245. 16Ibid., p. 241. 17Ben K. Gold, "Evaluation of Programs" (paper presented at a conference sponsored by the Compensatory Education Project, 34 Stufflebeam defines evaluation as the "process of delineating, obtaining, and providing useful information for judging decision alternatives.18 M. C. Alkin, Director of the Center for the Study of Evaluation at the University of California at Los Angeles, offered the following as a definition of evaluation formulated by the Center: Evaluation has . . . been defined as the process of ascertaining the decision areas of concern, selecting appropriate information, and collecting and analyzing information in order to report summary data useful to decision-makers in selecting among alternatives. Tyler describes evaluation as a task to improve student learning: In place of the older criteria and the dependent procedures we need new concepts Of educational readiness, strengths on which to build, deficiencies to be attacked, and the like. These new concepts must be based on the assumption of dynamic potential in all or almost all human beings. The evaluation task is to describe or measure phases of this potential and the difficulties to be surmounted that can help the individual and the educational institution in improving student learning. 0 Other Definitions Grotelueschen and Gooler propose that evaluation also applies to things not done: Evaluation means different things to different peOple. For example, it is judging the growth of a program (Scriven, 1967). It is describing a program as fully as possible (Stake, 1967). It is documenting how well program objectives are met (Tyler, 1942). It is providing useful infOrmation to decision-makers (Stufflebeam, 1969). It is all of these, and combinations of these, and more things besides. The characteristic common to Coordinating Board, Texas College and University System, Sheraton- Crest Inn, Austin, Texas, April 5-6, 1971), p. 3. 18C. D. Sabine, ed., Accountabiliiy: Systems Plannipg in Education, p. 31. 19Marvin C. Alkin, "Products for Improving Educational Evaluation," UCLA Evaluation Comment, II (September 1970), p. l. 
20Tyler, Gagne, and Scriven, Perspectives of Curriculum Evaluation, p. 16. 35 each of these perceptions of evaluation is a focus on what has been done or what is presently being done. We prOpose that evaluation also be applied to things not yet done. . . . We will argue, as Moynihan (1970, p. 14) did, for "Evaluation in advance." To do so, we feel, requires a more-than- usual reliance upon judgmental data.21 For use in this study, Byram's definition of evaluation will be utilized: The term "evaluation" . . . refers to the task of making judgments about the worth or value of a total program of vocational or technical education. It involves primarily the determination of the extent to which previously established goals and objectives are being or have been attained.22 Purposes of Evaluation A diverse array of opinions exist about what should be the purposes of evaluation. Sabine believes all evaluation should serve two purposes: First, evaluation should provide data concerning the achieve- ment levels attained; and second, evaluation should be an ongoing process providing information about the processes as well as the products. Well written specific objectives will include the techniques for measurement and the specific criteria for assessing success.23 Lindvall and Cox see evaluation as having three basic purposes: 21Arden D. Grotelueschen and Dennis D. Gooler, "Role of Evaluation in Planning Educational Programs" (paper read at "Evaluation and the Planning of Educational Programs," a symposium of the American Educational Research Association Meeting, New York, February, 1971), p. 1. 22Harold M. Byram and Marvin Robertson, Locally Directed Evaluation of Local Vocational Education Programs (Danville, 111.: The Interstate Printers and Publishers, Inc., 1970), p. I:l. 23Creta D. Sabine, "Systems in Education," in Accountability: Systems Planning_in Education, p. 30. 36 Whereas in the past there has been some tendency to think of evaluation as being represented only by the effort to make some ultimate assessment of the worth of an innovation, it is now recognized that evaluation can serve a number of other functions. One of these is the gathering of data while a program is being developed for the purpose of guiding the development process. Scriven (1967) has termed this formative evaluation. Another is the role of evaluation within an instructional program, parti- cularly an individualized system, in monitoring pupil progress (Glaser, 1967; Lindvall and Cox, 1969). Since, in this latter case, evaluative information is used to adjust the curriculum to the needs of the individual, it might be viewed as a type of continuing formative evaluation. Also, a number of persons have made contributions to the clarification of what should be involved in using evaluation procedures to make some type of summary assessment of the overall value of an educational program, what Scriven (1967) has described as summative evaluation.24 Morphet, Johns, and Reller list the following as purposes for evaluation: --To secure the basis for making judgments at the end of a period of Operation. --To ensure continuous, effective, and improved operation. --To diagnose difficulties and avoid destructive upheavals. --To improve staff and citizen ability to develop the educational system. 
--To test new approaches to problems and to conduct pilot studies in the consideration of which advancements can be effected.25 Sullivan saw the role of evaluation to be the improvement of education, "Extensive evaluation in schools cannot be justified unless its ultimate consequence is the improvement of education."26 24C. M. Lindvall and Richard C. Cox, Evaluation as a Tool in Curriculum DeveloPment: The IPI Evaluation Program (Chicago: Rand McNally 8 Company, 1970), p. 1. 25Edgar L. Morphet, Roe L. Johns, and Theodore L. Reller, Edu- cational Organization and Administration (Englewood Cliffs, N.J.: Prentice-Hall, Inc., 1967), pp. 534-35. 26Howard J. Sullivan, "Objectives, Evaluation, and Improved Learner Achievement," in Instructional Objectives, ed. Robert E. Stake (Chicago: Rand McNally 8 Company, 1970), p. 65. 37 Healy has said the purpose of evaluation is ". . . to decide what, if any, revisions of the program or its components are advisable."27 Karns and Wenger cast their vote in favor of evaluation having a useful role in education: Evaluation can be the process by which education can become a meaningful and relevant experience for the students and all edu- cators who are there to serve those students. Evaluation can be the corrective tool with which education can change.28 The ultimate purpose Of an evaluation, according to Alkin, is to provide information upon which present or potential decisions are to be made and Alkin says this is the crucial factor that distinguishes . 29 evaluat1on from research. Rippey believes ". . . evaluation is intended to transform the conflict energy of change into productive activity; to clarify the roles of those persons involved in the program changes, not to produce new knowledge or ascribe causality."30 27John Healy, "Systems Planning in the Classroom," in Accountability: Systems Planning in Education, ed. Creta Sabine (Homewood, 111.: ETC Publications, 1973), p. 98. 28Edward A. Karns and Marilyn J. Wenger, "Developing Corrective Evaluation Within the Program," Educational Leadership 30 (March 1973): S35. 29Marvin D. Alkin, ”Wider Context Goals and Goal-Based Evaluators," UCLA Evaluation Comment, III (December 1972), p. 5. 30Robert M. Rippey, Studies in Transactional Evaluation (Berkeley, Calif.: McCutchan Publishing Corporation, 1973), p. 4. 38 Westbury indicated the goal of evaluation must be to answer questions of selection, adOption, support and worth of educational materials and activities, as originally reported by Class.31 The purpose of evaluation as part of the deve10pment of a training program in the Human Resources Research Office of the U.S. Army is to supply information on the adequacy of training and infor- mation for future research and development efforts.32 Another purpose of evaluation, according to Starr and Dieffenderfer, can be to assist state and local administrative and supervisory personnel in their on-site efforts to improve the delivery . 33 of programs and serv1ces. When considering the purposes of evaluation, Gooler and Grotelueschen have said: Exist for now the mode of evaluation concerned with outcomes, statistics, and compare-this-program-to-another, and enter evaluation Operating in a who's-asking-what, who's-doing-what, how-do-the-groups-with-an-interest-stack-up-in-terms-of—potential- influence mode. In this mode, evaluation raises many questions (few answers), alerts the curriculum developer (sensitizes?), and causes a lot of concern about justifying, explaining, relating. It is not a comfortable mode. 
But neither is it comfortable to be holding a beautifully structured, logically sequenced curriculum that no one can or will let into his schools!34

31Westbury, "Curriculum Evaluation," p. 241.

32Robert Glaser, "Implications of Training Research for Education," in Theories of Learning and Instruction, ed. Ernest R. Hilgard (Chicago: University of Chicago Press, 1964), p. 160.

33Harold Starr and Richard A. Dieffenderfer, A System For Statewide Evaluation Of Vocational Education (Columbus, Ohio: The Ohio State University, 1972), p. 4.

34Dennis D. Gooler and Arden Grotelueschen, "Accountability In Curriculum Development," in Curriculum Theory Network, ed. F. Michael Connelly (Toronto: The Ontario Institute for Studies in Education, 1971), pp. 30-31.

Another manner in which to consider the purposes of evaluation is to consider the criteria evaluation must meet. As Wilhelms has pointed out, no matter where one draws the line, the test of an evaluation system is simply this: "Does it deliver the feedback that is needed, when it is needed, to the persons or groups who need it?"35

Wilhelms believes if any evaluation system is to meet the above test, there are several basic criteria the system must satisfy:

1. Evaluation must facilitate self-evaluation.
2. Evaluation must encompass every objective valued by the school.
3. Evaluation must facilitate learning and teaching.
4. Evaluation must produce records appropriate to the purposes for which records are essential.
5. Evaluation must provide continuing feedback into the larger questions of curriculum development and educational policy.36

Bruner offers several guidelines as a philosophy to determine the purposes of evaluation:

1. Evaluation is best looked at as a form of educational intelligence for the guidance of curriculum construction and pedagogy.
2. Evaluation, to be effective, must at some point be combined with an effort to teach so that the child's response to a particular process of teaching can be evaluated.
3. Evaluation can be of use only when there is a full company on board, a full curriculum-building team consisting of the scholar, the curriculum maker, the teacher, the evaluator, and the students.
4. Evaluation, in its very nature, is likely to create suspicion and concern in the conventional educational setting where it has a history that is inappropriate to present practice of the kind being discussed here.
5. From time to time the evaluator must design instruction as a means of probing and developing general intellectual skills.
6. A curriculum cannot be evaluated without regard to the teacher who is teaching it and the student who is learning it.
7. Curriculum evaluation must, to be effective, contribute to a theory of instruction.37

35Fred T. Wilhelms, "Evaluation as Feedback," in Evaluation as Feedback and Guide, ed. Fred T. Wilhelms (Washington: Association for Supervision and Curriculum Development, 1970), p. 4.

36Ibid., pp. 4-7.

The Need for Evaluation

Silberman once said, "It simply never occurs to more than a handful (of teachers, principals and superintendents) to ask why they are doing what they are doing--to think seriously or deeply about the purposes or consequences of education."38

Many writers have indicated the need for rational policies concerning evaluation in education.

Public Interest

A strong factor in the emphasis on evaluation in education today is the public being served. The American public is becoming more interested in the education of our youth.
As noted in Kibler, Barker, and Miles' book Behavioral Objectives and Instruction, the parent is oftentimes neglected as a participant in the educational process. Parents, however, are becoming interested in the schools at an increasing rate. They are also becoming interested in the quality of education their children are receiving, and consequently, are becoming 37Jerome S. Bruner, Toward a Theory of Instruction (Cambridge, Mass.: Harvard University Press, 1971), pp. 163-66. 38Richard W. Hostrop, Managing Education for Results (Homewood, 111.: ETC Publications, 1973), p. 21. 41 more concerned about their educational growth and their classroom problems.‘39 Holstrop states that: Society rightly expects to find that the school experience will have made a desirable difference in the lives of their children. Children, of course, are shaped not only by the school but by their families, friends, church, mosque, synagogue, or temple--and other environmental forces as radio, television and travel--each in its own distinctive way, yet interdependent in forming the whole person. But society has a right to expect that the school will provide children with attitudes and skills that he can not get any place else. In short, schools are expected to make a difference.4 Sciara and Jantz indicate the public is in favor of evaluation: School personnel are being pushed to explain the efficacy of their programs. A new definition of the adequate school is being forged on the anvil of public Opinion. No longer are the majority of taxpayers satisfied with the old triad of the past--qualified teachers, the latest equipment and methods, and modern school p1ants--as indicators of effective schools.41 Webster and Schuhmacher agree, ". . . there has been an increased emphasis on evaluation in the public schools."42 Merriman has said, "We have observed a growing concern at the local level with educational decision—making . ."43 39Kibler, Behavioral Objectives, p. 109. 40Hostrop, Managing Education, p. 9. 41Frank J. Sciara and Richard K. Jantz, "The Call of Account- ability," in Accountability in American Education, ed. Frank J. Sciara and Richard K. Jantz (Boston: Allyn and Bacon, Inc., 1972), p. 3. 42William J. Webster and Clinton C. Schuhmacher, "A Unified Strategy for System-Wide Research and Evaluation," Educational Tech- nology 13 (May 1973), p. 68. 43Howard 0. Merriman, "A Conception of Curriculum Evaluation Involving Teachers, Parents, and Other Educational Decision-Makers," in Curriculum Theory Network, ed. F. Michael Connelly (Toronto: The Ontario Institute for Studies in Education, 1971), p. 24. 42 As taxpayers are being asked to give more and more financial support to public education, they are holding the educational system to stricter account, according to Dannenberg. The entire educational program in the United States is being viewed under a public microscope 44 more than ever before. Hilton alludes to the need for evaluation, comparing input and cosr with output . the given educational system should be seen to be producing an output compatible, by some agreed Objective standards, with the cost of the input--and with the needs of society.45 According to Ott, Fletcher, and Turner-- . 
evaluation is becoming more and more necessary as the demand for excellence increases and as school systems become too large for administrators to remain informed without assistance.46

A pervasive desire on the part of the American public in recent years, according to Popham, revolves around an indicated need to rigorously evaluate the quality of instructional activities in American education.47

Responsibility to Society

Shear indicates a responsibility exists upon the part of educators to society:

The educator as a professional, by virtue of his specialized knowledge regarding learners and the facilitation of their learning, is a public servant and thus is responsible to society for judgments made and acted on within the scope of his area of professional expertise.48

44Raymond A. Dannenberg, "Michigan Feels the Pull of Accountability," American Vocational Journal 49 (January 1974):54.

45Peter Hilton, "The Survival of Education," Educational Technology 13 (November 1973):12.

46Jack M. Ott, Sheila Fletcher, and Donald G. Turner, "Taxonomy of Administrative Information Needs: An Aid to Educational Planning and Evaluation," Educational Technology 13 (May 1973):31.

47W. James Popham, Evaluating Instruction (Englewood Cliffs, N.J.: Prentice-Hall, Inc., 1973), p. 7.

Grant Venn has said quality in education is the issue around which the schools of tomorrow must be discussed:

The first revolution in American education was a revolution in quantity. Everyone was to be provided the chance for an education of some sort. That revolution is almost won in the schools and is on its way in higher education. The second revolution is equality of opportunity. That revolution is underway. The next turn of the wheel must be a revolution in quality.49

In regard to quality of education, Hayman has gone on record as stating ". . . improvement in education clearly depends on improved management practice, and this in turn depends on more effective information systems."50

Glaser points out the need for the development of techniques for the analysis of what the student needs to achieve. Such a need is an important implication for educational practice.51

Hostrop has said, "It has not been possible in the past to measure the full output and eventual impact of an educational system on society." In terms of outputs as "products," it is important to determine the difference between "finished" and "unfinished" products.52

48Twyla M. Shear, "Accountability Versus Responsibility," American Vocational Journal 48 (March 1973):26.

49Grant Venn, Man, Education, and Manpower (Washington: The American Association of School Administrators, 1970), p. 106.

50John L. Hayman, Jr., "Educational Management Information Systems For the Seventies," Educational Administration Quarterly 10 (Winter 1974):69.

51Glaser, "Implications of Training Research for Education," p. 156.

Evaluation is inevitable, according to Morphet, Johns, and Reller:

Appraisals are inevitable. Citizens, parents, students, teachers, administrators, board of education members, and representatives of the state department of education have views (judgments) regarding the strengths and limitations of given schools or school systems. They are not likely to be without some judgments if they have association with student or staff personnel, or even if they meet other people who have some relationship to the school system or any of its many component elements.
The question that confronts the educational administrator, therefore, is not whether or not there will be appraisals. It is, rather, whether or not the appraisals will be reasonably valid or only judgments made on the basis of inadequate data, or even with merely rumor as the "foundation."53

The National Advisory Council on Education Professions Development sees evaluation of educational programs as an emerging issue:

In its second annual report, submitted earlier this year to the President and Congress, this Council devoted a major section to research and evaluation. It has expanded on its statements in that report at this time because the need for sensible policies concerning evaluation of educational programs is rapidly emerging as one of the major issues in American Education.54

52Hostrop, Managing Education for Results, pp. 15-16.

53Morphet, Educational Organization and Administration, p. 533.

54National Advisory Council on Education Professions Development, Evaluation of Education: In Need of Examination (Washington, D.C.: National Advisory Council on Education Professions Development, 1973), Preface.

Basis for Program Improvement

Byram and McKinney point to evaluation as a prerequisite to improvement of program. They state that evaluation properly starts with the identification or formulation of program goals or objectives, the focal point being the degree to which these goals are being achieved in the particular program.55

The Commission on Instructional Technology, in defining instructional technology, has alluded to the need for evaluation in the instructional process: instructional technology ". . . is a systematic way of designing, carrying out, and evaluating the total process of learning and teaching in terms of specific objectives, based on research in human learning and communication, and employing a combination of human and non-human resources to bring about more effective instruction."56

Westbury emphasizes the need for evaluation by relating evaluation of curriculum with curriculum development: ". . . there can be no curriculum evaluation that is not intertwined with curriculum development, and curriculum evaluation is an immediately important goal."57

Bruner compares the need for evaluation to a need for light:

A . . . result of contemporary exploration in teaching is the conclusion that educational experiment, in the main, has been conducted and is being conducted in the dark--without feedback in usable form. The substitute for light (or usable feedback) is evaluation after the job has been completed.58

55Harold M. Byram and Floyd McKinney, Evaluation of Local Vocational Education Programs (East Lansing, Mich.: Michigan State University, 1968), p.

56Commission on Instructional Technology, "What Is Instructional Technology?" in To Improve Learning, ed. Sidney G. Tickton (New York: R. R. Bowker Company, 1970), p. 7.

57Westbury, "Curriculum Evaluation," p. 257.

58Bruner, Toward a Theory of Instruction, p. 30.

Yelon believes the failure of a student to achieve may be due to causes within the instructional environment; therefore, the system itself should be evaluated on a continuing basis.59

Awareness of evaluation is an ingredient of successful teaching, so states Redfern:
An intelligent, well-prepared, insightful teacher; a clear understanding of what the teaching assignment entails; knowledge of what is expected by administrators and supervisors; familiarity with the criteria by which success or failure will be judged; awareness that teaching performance will be evaluated and assurance that the results of evaluation will be made known and explained--these are some of the ingredients of successful teaching.60

Wilhelms and Diederich have said, ". . . within the school itself, there is tremendous need for feedback so organized that it can provide guidance for curriculum improvement and every sort of educational policy making."61

Grobman states:

Today there is increasing concern with evaluation decisions and procedures, accompanied by a greater recognition that the decisions are evaluative in nature and should be based on the best evidence that can be made available.62

59Stephen L. Yelon, "An Examination of Systematic Development of Instruction for Nonresidential Colleges," Educational Technology 13 (July 1973):38.

60George B. Redfern, How to Appraise Teaching Performance (Columbus, Ohio: School Management Institute, Inc., 1964), p. 6.

61Fred T. Wilhelms and Paul B. Diederich, "The Fruits of Freedom," in Evaluation as Feedback and Guide, ed. Fred T. Wilhelms (Washington: Association for Supervision and Curriculum Development, 1970), pp. 237-38.

62Hulda Grobman, Evaluation Activities of Curriculum Projects: A Starting Point (Chicago: Rand McNally and Company, 1968), p. 1.

Deterline feels the problem of evaluation or validation of a program is a critical issue, whether from the point of view of selecting a program or of the production of a local program.63

Skinner likens the need for explicit designs in education to the need for survival:

Survival is a difficult value. Ideally a system of education should maximize the chances that the culture will not only cope with its problems but steadily increase its capacity to do so. . . . Accidental practices and practices designed for irrelevant reasons have survival value, but the explicit design of a policy with respect to the strength of the culture is more promising.64

Porter has indicated a need for evaluation:

I happen to be one educator who believes that we, as educators, must do a better job of explaining the discrepancies that exist in our present methods of delivering educational services to boys and girls.65

Broad Perspective

Evaluation in education has many facets which are in need of consideration. Link and Diederich indicate a need for people to know who they really are:

The biggest single need in evaluation is not improved methods of testing, but for human beings to know who they really are and what their possibilities are for creative growth.66

63William A. Deterline, "Practical Problems in Program Production," in Programed Instruction, ed. Herman G. Richey and Merle M. Coulson (Chicago: University of Chicago Press, 1967), p. 195.

64B. F. Skinner, The Technology of Teaching (New York: Appleton-Century-Crofts, 1968), pp. 232-33.

65John W. Porter, "Educational Challenge Accepted," The Detroit News, January 4, 1973.

66Frances R. Link and Paul B. Diederich, "Cooperative Evaluation Program," in Evaluation as Feedback and Guide, ed. Fred T. Wilhelms (Washington: Association for Supervision and Curriculum Development, 1970), p. 180.

Grotelueschen and Gooler believe there is a need for evaluation to look towards the future as well as to the past:

We contend . . .
that evaluation has tended to emphasize too heavily a preciseness of measurement of present and past events, and in so doing has forfeited possibilities for more futuristic thought. We believe evaluation can play a wider role in planning.67 According to Stake, evaluation requires judgment. Decision- making requires judgment. Such judgments are made both from within the school and without. To understand what a school is doing requires an awareness of what a school is expected to do.68 The National Study of School Evaluation has attested to the need for evaluation: The National Study of School Evaluation believes that schools which are quite different may be equally good. This belief involves the basic principle that a school shall be evaluated in terms of what it is striving to accomplish.69 Westbury quotes Larkins and Shaver as to another purpose of evaluation, that is to include in the evaluation process a procedure to measure the worth of the data gathered. We are suggesting that evaluation design ought to be broadened to include more than data gathering and analysis design. It is not simply that procedures need to be established for obtaining assessment data that are more adequate, but that adequate evaluation design requires procedures for both obtaining adequate assessment 67Grotelueschen and Gooler, "Role of Evaluation in Planning," 68Robert E. Stake, "Objectives, Priorities, and Other Judgment Data," in Review of Educational Research, Educational Evaluation, ed. Gene V. Glass (Washington: American Educational Research Association, April, 1970), p. 181. 69National Study of School Evaluation, Elementary School Evaluative Criteria (Arlington, Vir.: National Study of School Evaluation, 1972), p. 3. 49 data--assessment desi n--and procedures for weighing the worth of the assessment data.7 Effective evaluation is not a simple undertaking, as indicated by the following. The difficulty of evaluation is recognized. The complexity of the task makes it appear forbidding. However, a good part of the inadequacy of evaluation in schools stems from the failure to recognize the significance of such work. . . . Drift is easier and less disturbing to routine than is critical evaluation of procedures and results. There is reason to believe that the effectiveness of the educational enterprise would be substantially increased if even just 1 or 2 percent of the budget were devoted to this purpose.71 The difficulty of change has been emphasized in the following observation by Silberman in his book Crisis in the Classroom. In 1960 Zacharias was confident that with $100 million a year for curriculum deve10pment, he could work a revolution in the quality of United States education. By 1966 he knew better "It's easier to put man on the moon," he confessed to this writer, "than to reform the public schools."72 New Evaluation Techniques Needed The old procedures for evaluation have not proven to be effective. Stake has expressed a need for the development of new judgment techniques: New techniques of observation and judgment need to be developed. In fact, we need a new technology of educational evaluation. We need new paradigms, new methods, and new findings to help the buyer beware, to help the teacher capitalize on new devices, to help the 70Westbury, "Curriculum Evaluation," p. 246. 71Morphet, Johns, and Reller, Educational Organization and Administration, p. 533. 72Charles E. Silberman, Crisis In the Classroom (New York: Vintage Books, 1970), pp. 170-71. 
developer create new materials, and to help all of us to understand the changing educational enterprise.73

Cawelti also believes educational designs cannot be happenstance:

. . . there is a need for persons with new specialities if school systems are to be able to diagnose existing classroom environments and provide help to teachers in designing better teaching.74

Tyler has said the current climate in this country is to seek innovation, to try to get institutions active in learning how to serve their new clients.75

Gottman and Clasen ". . . feel that it is crucial for teachers and administrators to develop the skills needed to monitor the progress of individual students in attaining educational objectives."76

In a study completed by Verduin, it was found that the usefulness and effectiveness of curriculum changes were clearly indicated by those people who made use of the changes in their instructional environments. In addition, participants became more aware of educational problems and their attention was drawn to inconsistencies in their own curriculum.77

73Robert E. Stake, "Toward a Technology for the Evaluation of Educational Programs," in Perspectives of Curriculum Evaluation, ed. Robert E. Stake (Chicago: Rand McNally and Company, 1972), p. 3.

74Gordon Cawelti, "Must We Systematize Curriculum Building?" Educational Leadership 31 (March 1974):483.

75Tyler, Perspectives of Curriculum Evaluation, p. 15.

76John Mordechai Gottman and Robert Earl Clasen, Evaluation in Education (Itasca, Ill.: F. E. Peacock Publishers, Inc., 1972), p. 6.

77John R. Verduin, Jr., "An Evaluation of a Cooperative Approach to Curriculum Change" (Ph.D. dissertation, Michigan State University, 1962), p. 3.

The Michigan Department of Education has said, "In-service professional development and evaluation of effort in preparing youth for adulthood may require a greater emphasis on willingness to accept change. . . ."78

Conant has written, "It is my belief there will be more radical changes in the future and this in turn means that our old methods of determining educational policy need drastic revision to meet the impact of the educational revolutions."79

Importance of Evaluation at State and Federal Levels

Evaluation is also needed at the state department of education level, according to Stallard:

Some state departments of education have made concentrated efforts to improve their total state programs through periodic evaluations. Evaluation does not play a minor role in those state programs; it is made part of the training.80

Lindvall, Cox, and Bolvin believe the recent focus on evaluation is due to two items: funding and new curricula.

The current emphasis on the evaluation of educational programs and curricula can be ascribed largely to two factors: the investment of increasingly large sums of money in educational innovation and the related development of a variety of new curricula. Local and federal sources that are financing these efforts as well as foundations providing grants for special programs have become increasingly interested in evidence concerning the return on their investments. As a result a built-in evaluation procedure has become a requirement for most proposals. These demands plus the

78Michigan Department of Education, A Position Statement on Educational Accountability (Lansing, Mich.: Michigan Department of Education, 1972), p. 9.

79James Bryant Conant, Shaping Educational Policy (New York: McGraw-Hill Book Company, 1964), p.

80John J.
Stallard, "MDTA's Evaluation Gap," American Vocational Journal 48 (September l973):54. 52 existence of diverse opinions among educators as to the proper role of evaluation have resulted in efforts to redefine evaluation and to specify new tasks and procedures (Cronbach, 1963; Scriven, 1967; Stake, 1967a).81 Kruger found evaluation to be lacking in program proposals: The 1967 Miller Report on the first-year Title III prOposals found evaluation to be the weakest element in the proposals. The 1968 report of the Second National Study of PACE found, in a review of 94 planning projects, that "only 30 . . . were judged as having adequate evaluation procedures in the project design . . . con- trasted to 31 projects which gave no evidence whatsoever of evaluation procedures." Of 43 Operational projects, "a little over 8 percent of the projects had made plans that promised to be adequate for evaluation of their projects; about 70 percent . had done a little, and about 13 percent had not bothered with evaluation at all." From our own work with federal programs, we can only conclude that the typical federally-supported project has an evaluation process that is unplanned, partial, incompetent, uncoordinated, remote, terminal, narrow in perspective, and underfunded.82 Kruger also said, "Deliberate, systematic, and consistent procedures for development, implementation, evaluation, and refinement must be vigorously pursued."83 Importance of Evaluation in Vocational Education Byram has indicated a growing concern exists for evaluation in vocational education ". . . on the part of citizens and school people for evaluation of their school programs of occupational education.84 81Lindvall and Cox, Evaluation as a Tool, p. l. 82W. Stanley Kruger, "Implications of Accountability for Edu- cational Program Evaluation," in Accountability in American Education, ed. Frank J. Sciara and Richard K. Jantz (Boston: Allyn and Bacon, Inc., 1972), pp. 12-13. 831bid., p. 9. 84Harold M. Byram, Evaluation Systems For Local Programs of Vocational-Technical Education (East Lansing, Mich.: Michigan State University, 1968), p. l. 53 This need for evaluation of vocational education programs is recognized by the California Coordinating Unit for Occupational Research and Deve10pment in the following statement. Evaluation is an important aspect of the industrial world. Quality control must be maintained throughout all Operations in the manufacture of a product. Mass production requires interchange- ability of parts. After the product is ready for distribution, the final evaluation is public acceptance. If the product is not purchased, changes must be made in display, advertising, or in the basic design or manufacture of the product. Evaluation is equally important in the field of education, and due to the rapid industrial and technological developments, evaluation of vocational education programs must be a continuous process. The product to be evaluated is an educated youth not only capable of entry into the labor market but one who is capable of persisting and progressing in the occupation.85 The Advisory Council on Vocational Education was aware of the importance of evaluation when it made the following recommendations under the provisions of the Vocational Education Act of 1963. 19. 
IT IS RECOMMENDED, That the act provide that each state conduct a periodic statewide review and evaluation of its vocational education program.86

The Center for Vocational and Technical Education at The Ohio State University has disclosed a need on the part of ". . . state divisions of vocational education for an evaluation system which could be used as a management tool to assist them in program planning responsibilities."87

85California Coordinating Unit for Occupational Research and Development, Evaluation in Vocational Education, a report prepared by the California Coordinating Unit for Occupational Research and Development for the U.S. Office of Education (Sacramento, Calif.: Research Coordinating Unit for Vocational Education, 1967), p. iii.

86General Report of the Advisory Council on Vocational Education, Vocational Education, The Bridge Between Man and His Work (Washington: 1968), p. 203.

Up-to-date planning information is essential, according to Brandon and Evans:

More than most other types of education, vocational education must change its structure and content to adapt to rapidly changing occupational requirements. New groups of trainees may need to be served. Programs in unusual occupations may need a new area-school structure if they are to be implemented. Expansions and contractions of existing programs may be needed.88

Hull has declared that each school system, ". . . state division of vocational and technical education or university department should devise procedures for systematic review of innovative ideas." He continues by stating that:

An evaluation of the appropriateness of the innovation often includes reevaluation of the role and purposes of the existing organization. This type of soul-searching activity requires time and commitment to the need for improvement.89

According to Bruhns, there is little doubt that evaluation of vocational-technical education is a "must."90

87Starr and Dieffenderfer, A System For Statewide Evaluation, p. 3.

88George L. Brandon and Rupert N. Evans, "Research in Vocational Education," in Vocational Education, ed. Melvin L. Barlow (Chicago: The University of Chicago Press, 1965), p. 263.

89William L. Hull, "Process of Planned Change," in Contemporary Concepts in Vocational Education, ed. Gordon F. Law (Washington: American Vocational Association, 1971), p. 163.

90Arthur E. Bruhns, Evaluation Processes Used to Assess the Effectiveness of Vocational-Technical Programs, a report prepared by the University of California at Los Angeles for the U.S. Office of Education (Los Angeles: University of California at Los Angeles, 1968), p. 4.

Evaluation in vocational education must happen before improvement in vocational programs will occur. Norton, Love, and Rolloff have indicated that educators are realizing that sound procedures for decision-making are needed but, because of a lack of adequate preparation, they have not taken the necessary steps to operationalize program evaluation.91

Local Involvement in Evaluation

Those individuals who are involved in the educational programs are affected by any change that is made. Bruner suggests that these persons should be directly involved in the evaluation process.

It is crucial to discover an adequate working relationship between evaluator, curriculum maker, and teacher so that they can benefit from each other's activities. One of the important objectives of any evaluation study is to discover how this can be done.
Our experience suggests that the key may be the day-to-day planning and communication of the curriculum building team that includes the scholar, teacher, evaluator, and students.92

House believes each school is responsible for evaluation.

When I express the naive belief that the school might be sensitive to one small girl, many smile at me. I smile back. For I know that it may be possible for each school to have its own evaluation and each community its own science.93

Nerden also expresses the importance of involvement in evaluation by those who are affected by the results.

91Robert E. Norton, E. Lamar Love, and John A. Rolloff, Institute for Improving Vocational Education Evaluation, a report prepared by the University of Arkansas for the U.S. Office of Education (Fayetteville, Ark.: University of Arkansas, 1970), p. 1.

92Bruner, Toward a Theory of Instruction, p. 165.

93House, "Technology and Evaluation," p. 26.

Obviously anything as detailed, personal and critical as evaluation requires that all who are affected by evaluation should be involved in the determination of the procedures and instruments to be utilized in the process.94

McKinney and Mannebach say there is a key lesson to be learned from all the efforts exerted to improve educational programs. That lesson is: "Unless citizens, students, and educators are personally involved in designing and conducting the effort, it is not likely to result in much success."95

Research Studies in Local Evaluation of Vocational Education Programs

Research studies in local evaluation of vocational education programs are somewhat limited in number. The following are representative of research projects and are listed in chronological order.

Byram, 1963.--One of the early efforts was by Byram in 1963. His project, Byram's first in local evaluation of vocational programs, was conducted to develop and try out a systematic approach to local program evaluation. The project was conducted over a two-year period (1963-1965) in cooperation with the staffs of three Michigan public high schools.

The project director, Harold Byram, three Michigan State University consultants, and one Michigan Department of Public Instruction consultant worked with the local group of each school to develop and try out an evaluation process for that school. The local group consisted of a local director and an assistant, a teacher committee, and a citizens committee. The staffs, students, and interested citizens within the three school districts where each high school was located were also involved in the project.

94Joseph Nerden, "Statewide Evaluation of Vocational Education," in Contemporary Concepts in Vocational Education, ed. Gordon F. Law (Washington, D.C.: American Vocational Association, 1971), p. 404.

95Floyd L. McKinney and Alfred J. Mannebach, "Let's Give Students Their Say," American Vocational Journal 49 (January 1974):27.

The result of the project was a manual published in September, 1965, describing the principles, practices, and forms recommended for use in locally directed evaluation of local programs in vocational education.96

Wyllie, 1963.--Wyllie reported, in 1963, on an evaluation plan for business education programs in high schools. The primary purpose of the plan was to provide study materials as a basis for drawing up and implementing plans for program improvement.
The plan consisted of step-by-step procedures to be followed in evaluating a departmental program.97

Hammond, 1967.--Hammond reported in 1967 on the development by a team of educators of a model for evaluation at the local level. The model was developed to meet the need for a systematic approach to the evaluation of innovations in programs of instruction.

96Harold M. Byram, Evaluation of Local Vocational Education Programs (East Lansing, Mich.: Michigan State University, 1965), pp. 1-81.

97Eugene Donald Wyllie, An Evaluation Plan for Business Education Programs in High Schools, a report prepared with the cooperation of the Indiana Business Education Association for the U.S. Office of Education (Bloomington, Ind.: South-Western Publishing Company, 1963), p. 1.

The evaluation model consisted of five steps: (1) begin with a simple subject area of the curriculum, (2) descriptive variables in the instructional dimensions should be defined, (3) objectives should be stated in behavioral terms, (4) assess the behavior described in the objectives, and (5) analyze the results. In order to utilize the process, funds were made available for the development of a center to train personnel in utilization of the model.98

Merriman, 1967.--Merriman, in a paper entitled Evaluation of Planned Educational Change at the Local Education Agency Level, discussed the components of a model entitled CIPP as developed by the Ohio State University Evaluation Center. The components of the CIPP Model were: context, input, process, and product evaluation. The functions of the components were discussed in terms of their use in project innovations within the local education agency.99

Byram, 1968.--In 1968, Byram reported on a study he had conducted with ten school systems in Michigan. Four of the schools were classed as small, three as medium in size, and three as large. Two staff members were appointed by each school to constitute a leadership team with the responsibility of supervising the evaluation project.

98Robert L. Hammond, Evaluation at the Local Level, a report prepared by Project ERIC for the U.S. Office of Education (Tucson, Arizona: Project ERIC, 1967), pp. 2-8.

99Howard O. Merriman, Evaluation of Planned Educational Change at the Local Education Agency Level, a report prepared by the Evaluation Center, The Ohio State University College of Education for the U.S. Office of Education (Columbus, Ohio: Ohio State University, 1967), p. 4.

The project began in January, 1966, and except for publishing the revised manual, was completed by January 1, 1968. Byram's project placed emphasis on local staff and citizen involvement, curriculum analysis, assessment of outcomes through follow-up, and special activities and projects of particular interest to each school.

Objectives of the study were as follows:

1. To further try out and demonstrate a system for the evaluation of vocational education on the local level which originated in the Michigan Project on Evaluation of Local Programs of Vocational Education.
2. To discover and/or devise new or improved procedures for local program evaluation.
3. To establish a working environment in which learning of evaluation procedures for both local school personnel and potential leaders at state and national levels of vocational education can take place.
4. To identify and describe the role of the consultant in program evaluation.
5. To uncover situations that could be considered as potential research and development centers.100

The following are the findings of Byram's project, their significance, and implications. The objectives of the project were satisfactorily attained as follows:

100Byram, Evaluation Systems for Local Programs, pp. 2-3.

1. The evaluation system was tried out in all ten cooperating schools, and was demonstrated successfully in nearly all of these schools. As a result, the following elements were established as relevant in a local program self-evaluation: (a) administrative endorsement and support; (b) a local leadership team; (c) a program of local leadership pre-service and in-service training; (d) an evaluation project plan; (e) a staff committee for the evaluation; (f) staff time for evaluation; (g) objectives and curricular analysis; (h) advisory committees for assistance in the evaluation; (i) follow-up study of former students; (j) use of consultant help and instructional materials in development and use of evaluation instruments; and (k) adequate reporting of activities.

2. New and/or improved procedures for local program evaluation were developed. These relate to follow-up studies, advisory committees, curricular study, and procedures of involvement. These are incorporated in the revised manual on evaluation.

3. The project provided the working environment for leadership development of twenty local school persons and for four graduate research assistants. The assistants' activities included conducting conferences for local leaders, planning and/or conducting research studies, and consultation regarding local studies.

4. The role of the consultant in program evaluation was studied and partially described. Further work was needed on this.

5. Situations were identified that could be considered as potential research and development centers.101

According to Byram there was substantial evidence that local school personnel assigned to the project made considerable professional growth. System strengths were:

a. emphasis on objectives and evaluation in terms of outcomes
b. involvement of staff and citizens contributing to implementation of recommendations growing out of the studies
c. identification of need for and use of consultant services
d. development of understandings about and support for vocational education in the local community, and
e. the many techniques in and related to local program evaluation resulting from the clinical approach.

In terms of shortcomings, Byram indicated the project leader was not successful in getting local leadership teams to go very far in the identification of student goals in the various vocational programs of the participating schools and in adopting a logical procedure to measure the attainment of the behavioral goals. Another shortcoming was the relatively small emphasis given to evaluation of adult and post-high school vocational education by the local schools.

Byram listed two specific recommendations resulting from the project; first, that steps be taken to provide state level leadership and support to other school systems in the State of Michigan so that the system could be used and improved by many school systems wanting to improve their programs through self-evaluation studies; and second, that the system be tried out and demonstrated in other states, thereby helping to establish its applicability to similar states and school districts, as well as to states having different organizational patterns.

101Ibid., pp. 4-5.
Byram added it would be desirable to determine the applicability of the system to evaluation of area vocational schools and centers.102

Wisconsin State Department of Public Instruction, 1968.--In February, 1968, the Wisconsin State Department of Public Instruction reported on a three-year pilot program in high school vocational education. The pilot program emphasized planning in the first year, planning and implementation in the second year, and continued planning and implementation plus evaluation in the third year. The program had the following goals:

. . . to help determine the extent to which pilot schools individually and collectively were able to identify local program needs and meet them; the extent to which local programs were accepted as a part of the comprehensive high school's program by students, staff and other community members. . . . Data gathered and analyzed will help determine the future course of action by high schools throughout the State. State guidelines for program development will be revised in light of evaluation conclusions and recommendations.103

102Ibid., pp. 4-6.

103Wisconsin State Department of Public Instruction, A Manual to be Used in the Evaluation of Thirty-Four Comprehensive High Schools in Wisconsin Which Participated in a Three-Year Pilot Program of High School Vocational Education (Madison, Wis.: Wisconsin Department of Public Instruction, 1968), p. 2.

The pilot program resulted in the publishing of a manual for use in the evaluation of comprehensive high schools in Wisconsin. The manual focused on: (1) major areas of concern, (2) duties of local evaluation chairmen, (3) procedures for conducting local self-evaluation, (4) composition and role of the evaluation review committee, (5) composition and role of the state evaluation committee, and (6) evaluation criteria for guidance and counseling, local administration, local planning, and instructional program.104

Bruhns, 1968.--In a seminar paper, Bruhns explored the evaluation processes to be used to assess the effectiveness of vocational-technical programs, including self-initiated evaluation consisting of the following elements:

a. Selection of competent leadership
b. Involvement of faculty in the evaluation process
c. A focus on the output of programs
d. Identification of real objectives of the total program
e. Use of appropriate methods of data gathering
f. Study of the essential elements of programs
g. Involvement of citizen groups.105

Christensen, 1969.--In 1969, Christensen reported on a study to determine needed improvements in vocational programs of male students in selected Nevada high schools. Data on needed improvements was gained through the use of a questionnaire to 1,856 boys in grades 9 through 12. Two of several conclusions arrived at from the study were: (1) there is a need to redirect almost half the students who say they plan to go to college and enter a profession into occupations that require less than a college degree and (2) there needs to be further study of ways and means to determine which students can profit most from vocational classes.106

104Ibid., pp. 1-18.

105Bruhns, Evaluation Processes, p. 10.

University of Arkansas, 1969.--The Industrial Research and Extension Center, College of Business Administration, University of Arkansas, published a report on an evaluation of Arkansas vocational training programs in 1969.
Basis for the evaluation was a need for a state plan organized to reflect manpower needs and education and training requirements of secondary and postsecondary levels for each socioeconomic area of the state and the entire state. Included in the project was the development of criteria, standards of performance and guidelines for use by local and area-wide schools for providing high quality vocational-technical education and training programs.107 Byram, 1971.--Byram reported in 1971 on a Multi-State Project in locally directed evaluation of public school programs of vocational- technical education. The evaluation system tried out and demonstrated in the four-state project was the major outcome of two previous Michigan projects involving locally directed evaluation in thirteen 106Howard H. Christensen, A Study to Determine Needed Improve- ments in Vocational Programs in Nine Nevada High Schools, a report prepared by Michigan State University and Nevada State Department of Education for the U.S. Office of Education (East Lansing, Michigan and Reno, Nevada: University of Nevada, 1969), pp. l-l97. 107Samuel M. Burt, Evaluation of Arkansas Vocational Training Programs in Relation to Economic Development, a report prepared by the University of Arkansas and the W. E. Upjohn Institute for Employment Research for the U.S. Office of Education (Little Rock, Arkansas and Washington, D.C.: University of Arkansas and the W. E. Upjohn Institute, 1969), pp. 1, 7. 65 school systems. Projects were conducted in the states of Arkansas, Minnesota, Mississippi, and Nevada. ScOpe of the study was described by Byram as follows: . . . a state project leader was designated who served for a period of two years. Each leader selected five schools to cooperate with him. Eleven of these were local school districts, seven were county districts and two were post-secondary institu- tions. Each of these schools named a local leadership team of two faculty members. Thus, 20 school systems, with 40 local leaders in them were most deeply involved. In addition, however, 291 local staff members, who served on local staff committees and roughly 475 citizens serving on advisory committees were directly involved. Unnumbered other faculty members in these schools and citizens in the communities, counties or areas served also were indirectly involved.108 The Multi-State Project encompassed three major objectives: --To determine the feasibility and generalizability of a state procedure of assisting local school district leadership to use the evaluation system developed in Michigan. --To discover and/or develop new or improved procedures in a state system for local evaluation. --To assist in development of state and local leadlership competencies in evaluation of local programs of vocational and technical education, including creation of understanding of the values of local involvement.109 The project resulted in all participating schools meeting the criteria established by state leaders and their advisory committees. The leadership teams appointed in the participating schools accepted the minimum activities, and thirteen of the schools completed the minimum activities. All twenty schools indicated plans for future local evaluations. It was concluded that the system for local 108Harold M. Byram, A Five-State Try-Out and Demonstration Program to Determine the Generalizability of an Evaluation System for Local Programs of Vocational and Technical Education (East Lansing, Mich.: Michigan State University, 1971), p. l. 109Ibid., p. 2. 
evaluation was replicable in the four states and, with minor exceptions, in the schools involved in these states.110

Brown, 1971.--A manual for local evaluation was compiled by Brown in cooperation with the University of Tennessee and the Tennessee State Department of Education. The manual, based on a compilation of concepts and materials from various sources, outlines a systems approach to the evaluation of local vocational education programs. Local evaluation was viewed as a team effort by those responsible for improvement in the system and to be carried out as a continuous process. The manual reports on a proposal to team evaluate each program with common instruments. The point of view in the manual is that evaluation should be accomplished by those who are responsible for improvement of the programs. The concepts and the materials in the manual are acknowledged by the editor to be derived from the work of Byram, Norton, McKinney, and others.111

Crunkilton, 1971.--Crunkilton reported in 1971 on the testing of a model for evaluation of secondary school programs of vocational education in agriculture. The model was designed by Krebs to measure the degree to which students used the skills and abilities learned in agricultural education after graduation and to evaluate a local program.

110Ibid.

111Donald V. Brown, Manual for Local Evaluation, a report prepared by the University of Tennessee and the Tennessee State Department of Education for the U.S. Office of Education (Knoxville and Nashville, Tenn.: University of Tennessee and the Tennessee State Department of Education, 1971), p. 3.

The model contained some twenty-six forms used as data gathering instruments. A follow-up of the graduates from the James Wood High School Agricultural Department for the years 1970, 1969, 1968, and 1967 provided a partial test of the data collecting instruments to determine their appropriateness in using the evaluation model in Virginia. The long term objective of the project was to develop an evaluation model which could be used throughout the state by local communities for evaluating and improving vocational education in agriculture. Some of the conclusions arrived at were: (1) that the model for evaluation of secondary school programs of vocational education in agriculture is a viable means by which sufficient and reliable data can be collected for critical analysis and used for program direction, (2) that the role of the advisory council is vital if the evaluation is to be successful, and (3) that the support and interest exhibited by the school board, administrators, teachers, advisory group, and local agricultural businesses indicated the value placed upon evaluation as being a part of the total school program.112

National Study for Accreditation, 1971.--In 1971, the National Study for Accreditation of Vocational/Technical Education developed a draft of untested instruments and procedures for evaluating vocational/technical education programs. Purpose of the draft was for discussion and use in field tests. The draft represented an effort and concern

112John R. Crunkilton, Testing of Model for Evaluation of Secondary School Programs of Vocational Education in Agriculture, a report prepared by Virginia Polytechnic Institute and State University for the U.S. Office of Education (Blacksburg, Vir.: Virginia Polytechnic Institute and State University, 1971), pp. 1-61.

on the part of the American Vocational Association for the accreditation of vocational/technical education.
Instruments included in the draft were meant to be utilized in the evaluation of vocational/technical education. A three-step evaluative process was included, with one step placing emphasis on institution self-evaluation. The self-evaluation study was intended to provide staff members with a systematic means of analysis of their institution: The self-evaluation study should provide staff members with a systematic means of analyzing instructional and program operation and should help them gain new perspective and insights into the various institutional operations and resources open to them. It provides an Opportunity for a kind of in-depth study that is other-wise often not possible in the ordinary course of work. The self-evaluation process consists of: (l) a steering committee appointed by the head of the institution, (2) working committees appointed by the steering committee, and (3) self-evaluation by each faculty member through the completion of an individual self- evaluation form.113 Dale, l972.--Dale directed a project entitled VEEP, Vocational Education Evaluation Project, in 1972 for the purpose of developing and implementing a management information system in Virginia. The intent of the project was to provide data needed in meeting accountability requirements at the local, state, and federal levels. Additional intents of the project were to evaluate the effectiveness of secondary 113National Study for Accreditation of Vocational/Technical Education, Instruments and Procedures for the Evaluation of Vocational/ Technical Education, a report prepared by the American Vocational Association for the U.S. Office of Education (Washington, D.C.: American Vocational Association, 1971), pp. 1-71. 69 vocational education programs, services, and activities; and to aid in determining the adjustments needed to meet changing conditions. The management information system contained two subsystems, a macro-subsystem concerned with information at the state level and the micro-subsystem designed to provide information to local decision makers. Information to be provided local personnel included: assessing, planning, and programming individual vocational education programs in local schools.114 Starr and Dieffenderfer, 1972.--In May, 1972, a manual entitled A System for Statewide Evaluation of Vocational Education was authored by Starr and Dieffenderfer. The manual contains a description of an evaluation system designed to assist vocational education agencies in their prOgram planning, accountability, and reporting responsibilites. The evaluation system, developed over a five-year period with field tryouts and revisions, has three purposes-- First, the evaluation system is designed to provide management data for program planning purposes. It relates a vocational system's program outcomes to specific program goals as a logical input in planning and replanning vocational system activities and programs. Second, the evaluation system is designed as a con- tinually operative monitoring mechanism in order that program plans may be adjusted whenever required by changes in the field situation. Third, the evaluation system is designed to be sufficiently flexible so that it can be modified to meet changing evaluative requirements in the Operational situations of its users. 114Oliver J. Dale, Vocational Education Evaluation Project, a report prepared by Virginia Polytechnic Institute, State University and State Department of Education for the U.S. 
Office of Education (Richmond, Vir.: Virginia Polytechnic Institute, State University and Virginia State Department of Education, 1972), pp. 1, 3.

115Starr and Dieffenderfer, A System for Statewide Evaluation,

The evaluation process consists of an evaluation program planning cycle which provides for sequencing many major events when coordinating an evaluation system with a set of program planning procedures.

The main point which should not be overlooked is the systematic manner in which the vocational agency proceeds in this evaluation and program planning cycle; that is, the vocational agency starts with objectives, which are the major intended purposes or ends which it seeks to accomplish; derives goal statements, which are the measurable means by which objectives are intended to be achieved; assigns specific goals, numerical outcomes to be achieved for each goal statement during the period of evaluation; measures the degree of success in attaining goals; analyzes failures in goal attainment; and systematically initiates program planning procedures to redirect vocational agency efforts and resources in achieving new annual and long-range goals.116

The evaluation system is also designed for use by local school districts:

Project staff recognized that there would continue to be a need for a variety of methodologies for the evaluation of vocational education designed for other purposes; for example, locally directed self-evaluations . . .117

116Ibid., p. 10.

117Ibid., p. 4.

Grotsky, 1973.--Another management information system, directed by Grotsky and others, is entitled The Peer Evaluation Program (PEP), with the purpose of allowing intermediate units an opportunity to continuously improve their programs. Provisions for self-evaluation are included through the participation of all staff members at the local level. At the beginning of the school year teachers are asked to examine areas of strength or weakness and to make specific recommendations. The authors indicate feedback becomes part of the self-evaluation process involving supervisory and other local administrative staff. Consequently, all local staff have an opportunity to identify with the programs.118

Summary

In summary, the following are generalizations from the review of research reports and other materials.

1. The review of literature indicates general agreement with Byram's definition of the term evaluation. Byram defined evaluation as the process of making judgments about the worth or value of a total program. Writers agreeing with Byram's definition are: Scriven, House, Lessinger, Westbury, Guba, Jones, Gold, Stufflebeam, Alkin, and Tyler.

2. Byram saw the general purpose of evaluation to be that of improvement of programs. Sullivan, Jones, Starr and Dieffenderfer, Hayman, Yelon, and Wilhelms and Diederich all concur with Byram.

3. The need for evaluation of vocational programs was seen as a growing concern by Byram. The Advisory Council on Vocational Education and the Center for Vocational and Technical Education at Ohio State University both have indicated that evaluation of vocational programs is a must. Brandon and Evans, Bruhns, and Norton, Love and Rolloff believe evaluation in vocational education must happen before improvement can result from

118Jeffrey N. Grotsky, Peer Evaluation Program, a report prepared by the Pennsylvania State Department of Education for the U.S. Office of Education (Harrisburg, Penn.: Pennsylvania State Department of Education, 1973), p. 3.
72 vocational education being able to adapt to rapid changes in occupational requirements. 4. In answer to why there should be local involvement in evaluation, Byram said evaluation ought to be done by those who are in a position to implement recommendations. Bruner, House, and Nerden agree with Byram. McKinney and Mannebach believe, along with Byram, that citizens and students, as well as educators, need to be personally involved in evaluation. 5. A number of studies in local evaluation of vocational education programs have been completed. Rationales for why the various studies were conducted are varied. However, the studies support the need for, and the importance of, local evaluation of vocational education programs. Innovation was the basis for two studies in local evaluation of vocational education programs. Hammond (1967) reported on the develop- ment of a model as a systematic approach to the evaluation of innovations in instruction. Merriman (1967) described a model which contained a product evaluation phase to be used in the evaluation of project innovations in local educational agencies. The development of study materials as the medium for imple- menting an evaluation plan for program improvement in business edu- cation was reported on by Wyllie (1963). The plan contained procedures for evaluation of departmental programs. Studies done to identify specific program needs at the local level were reported by the Wisconsin State Department of Public Instruction (1968), the University of Arkansas (1969), Crunkilton (1971), and Christensen (1969). The Wisconsin and Arkansas projects 73 produced guidelines for local evaluation to be used state-wide. The project reported by Crunkilton developed a model which could be used state-wide in the evaluation of local vocational programs. Christensen indicated a further study was needed to determine which students could profit most from vocational classes. A systems approach to the evaluation of local vocational programs was compiled by Brown (1971). The concepts and the materials were derived from the work of Byram, Norton, McKinney, and others. Self-initiated evaluation was included in evaluation processes explored by Bruhns (1968). Purpose of the project was to assess the effect of vocational programs. A management information system to provide data in meeting accountability requirements at local, state, and federal levels, was directed by Dale (1972). The system was designed to provide local personnel with information pertaining to assessing, planning, and programming local vocational programs. A similar goal was included in an evaluation system reported on by Starr and Dieffenderfer (1972). Purpose of the system was to assist local agencies in program planning, accountability, and reporting responsibilities. A systematic means of self-evaluation, in the form of a set of test instruments, was developed by the National Study for Accreditation (1971). The systematic self-evaluation study was intended to provide local personnel with a systematic means of analysis of the programs in their building. A project to develop and try-out a systematic approach to local program self-evaluation was conducted by Byram (1963). A try-out 74 and demonstration of the 1963 project was also conducted by Byram (1968). Purpose of the 1968 project was to discover new or improved procedures for local evaluation. A Multi-State Project in locally directed evaluation was conducted by Byram (1971). The project was based on the outcomes of the two previous projects. 
The project had a three-fold purpose: (1) to determine the feasibility of a state procedure of assisting local school districts in the use of the evaluation system developed in Michigan, (2) to discover and/or develop new or improved procedures in a state system for evaluation, and (3) to assist in the development of state and local leadership competencies in evaluation of local vocational programs.

CHAPTER IV

METHODOLOGY

The purpose of this study, as shown in Chapter I, was to determine the support or lack of support for Byram's evaluation system as a process for locally directed evaluation of local Michigan vocational education programs in comprehensive high schools which were served by area centers. Byram's process is based on the premise that the primary purpose of the evaluation of a local vocational education program by local personnel is to improve the program.

Population of the Study

Vocational education teachers of fourteen high schools and seven area vocational centers in Michigan were surveyed in this study (see Appendix E). Ten high schools had been involved in one or the other of Byram's two research and development projects in 1963-65 and 1966-67.1 Five of the area centers which were included in the study either served as an extension of the vocational programs of one of the schools at the time of this study or had been involved in Byram's projects before being designated an area vocational center. The remaining four high schools and the two area vocational centers serving these four schools were chosen because they were located in districts which had not participated in one or the other of Byram's two research projects noted above.

1Harold M. Byram, Locally Directed Evaluation of Local Vocational Education Programs (Danville, Ill.: The Interstate Printers and Publishers, Inc., 1970), p. iii.

The ten high schools and five area centers which had been involved in one of Byram's projects were located throughout the State of Michigan, from Benton Harbor in the southwest corner of the State to Sault Ste. Marie in the eastern end of the Upper Peninsula. These schools and area centers included all the school districts in Michigan which were involved in Byram's two projects.

The two area vocational centers located in districts which had not participated in one of Byram's projects were chosen because of the diverse sizes of high schools being served. Also, the geographical areas served by both of these area vocational centers included both rural and industrial communities. One large high school and one small high school sending students to each area center were selected, making a total of four non-project schools.

The population of respondents for this study consisted of all the teachers of vocational subjects in the above twenty-one schools. This included all of those teachers who taught or coordinated one or more classes which were eligible for special reimbursement as vocational education in the seven major vocational-technical teaching areas: agriculture, distributive education, health occupations/education, home economics, office occupations, technical education, and trade and industrial occupations.2

Questionnaire Design and Survey Procedures

The concepts and principles for the questionnaire were taken from a manual compiled by Harold M. Byram and Marvin Robertson entitled, Locally Directed Evaluation of Local Vocational Education Programs, A Manual for Administrators, Teachers, and Citizens.
The process, described in the manual as "A System for Organizing and Conducting an Evaluation," contained the following eleven areas of activities which were reported to be needed in the development and implementation of an evaluation system for locally directed evaluation of local vocational education programs:

Organizing for Local Evaluation
1. Appointing Qualified Leaders
2. Providing Time for Evaluation
3. Providing for Involvement of Faculty and Citizens
4. Developing a Plan for Directing the Evaluation
5. Developing Competency in Program Evaluation

Major Activities in the Systematic Approach
1. Studying the Existing Program
2. Stating Philosophy and Objectives
3. Formulating Criterion Questions
4. Identifying and Obtaining Evidences
5. Analyzing, Interpreting and Reporting Information
6. Formulating and Implementing Recommendations

2U.S. Department of Health, Education, and Welfare, Office of Education, Vocational Education and Occupations (Washington, D.C.: Government Printing Office, 1969), p. x.

The questionnaire contained statements selected from information pertaining to these eleven activities. Where appropriate, references to the role of the area center were included within each statement to aid in the determination of the respondent's degree of support of the evaluation process for his local school situation in relation to the area center. Space was provided on the survey instrument for respondents to include additional practices or procedures which they believed ought to be a part of a total locally directed self-evaluation process.

The questionnaire was read critically by two administrators and one instructor in an area center before it was prepared for distribution to a pilot group.

A pilot group of twelve teachers and six administrators of one high school where several vocational education programs were being offered provided the needed information for revision of the questionnaire prior to its being distributed to the total population. A 100 percent return was received on the pilot questionnaire from both the teachers and the administrators. Each member of the pilot group was asked to read and comment on each statement and note their general reactions to the total questionnaire. The few comments received included references to the length of the survey instrument, with no suggestions on deletion of specific questions. As a result of the pilot study, the introductory statement on the first page was rewritten, and the final questionnaire consisted of three pages printed on one side of 8 1/2 by 14 inch paper (see Appendix A).

On March 11, 1974, a letter (Appendix B) was mailed to each building principal requesting participation in the study on the part of his administrative staff and vocational education teachers and teacher-coordinators. A few days after the building principal received the letter he was contacted by telephone and asked to designate a person as building representative. At that time an appointment was made to deliver and explain the intent of the questionnaire to the building representative. On March 15, 1974, the process began of hand-carrying the questionnaires to each school with a cover letter (Appendix C) attached to the front of each questionnaire for each respondent. Each building representative was supplied with an adequate number of questionnaires, cover letters, and self-addressed large envelopes for first-class mailing to return the completed questionnaires to the researcher.
On April 16, 1974, a thank you/reminder letter (Appendix D) was mailed to each building representative. Several days later, all schools not having returned completed questionnaires were contacted by telephone.

All fourteen high schools and the seven area centers returned completed questionnaires from both teachers and administrators in their respective buildings. Three hundred twenty-eight responses were received out of a total of 474, which included administrators of vocational education programs, teacher-coordinators of vocational education programs, and vocational education teachers. Questionnaires distributed by the building representatives to persons not qualifying under one of the above three positions were not returned. Of the 328 questionnaires received, 278 were responses from teachers (see Table 1), and 50 were from administrators (see Appendix G-1).

Background of Schools, Area Centers and Respondents

Background of Schools and Area Centers

Of the 14 high schools and 7 area centers surveyed (see Appendix E), 5 high schools and 3 area centers each had an enrollment of over 1,500 students. These eight schools and area centers were classified as large institutions. The enrollment in each of the remaining nine high schools and four area centers was under 1,500, all of which were classified as small institutions. Five high schools included grades 10, 11, and 12 while 9 high schools had grades 9, 10, 11, and 12. With the exception of special programs, the seven area centers served students in grades 11 and 12.

The high schools and area centers were categorized into the following groups: project schools, project area centers, non-project schools, and non-project area centers. A project school was one whose past personnel had participated in one of Byram's projects in program self-evaluation at that school (see Appendix E). The project area centers served the project schools as a part of their vocational program or may have been involved in one of Byram's projects before being designated an area vocational center. Non-project schools and non-project area centers had not been involved in Byram's projects.

Background of Respondents

All respondents were asked to provide information about themselves which is tabulated in tables two through six of this chapter.

[Table 1.--Number of Questionnaires Distributed to and Responses Received From Teachers by Type of Institution.]

The respondents were first asked to indicate their primary position at the time of the survey. Of the 328 who responded, 278 were teachers and teacher-coordinators and 50 were administrators of vocational education programs. Of the 278 teachers and teacher-coordinators, 115 were employed at project schools, 20 at non-project schools, 97 at project area centers, and 46 at non-project area centers (Table 1).

Respondents were then asked their age and sex, as shown in Table 2. The age group 26 to 30 years had the largest number of teachers, 58 teachers or 21 percent of the total respondent population.
Fifty-one percent of the teachers surveyed were under thirty-five years of age. There was much similarity in the age distribution for teachers in schools and those in area centers. In the schools 50 percent of the teachers were age thirty-five or younger and 51 percent of the teachers in the area centers were in this age group. Twenty-three percent of teachers in schools were forty-six years or older, with 20 percent of the area center teachers age forty-six or older. Sixty-six percent of the teachers responding were male, with the area centers showing 72 percent male and the schools 59 percent male.

Sixty-seven percent of the teachers surveyed had ten years or less of teaching experience (Table 3), while 39 percent had five years teaching experience or less. Fifty percent of the teachers in area centers had five years or less experience in teaching while 27 percent of the teachers in schools were in this category. Twenty-two percent of the teachers in schools had sixteen or more years teaching experience while 14 percent of teachers in area centers had taught sixteen or more years.

[Table 2.--Age and Sex of Teacher Respondents Classified by Type of Institution.]

[Table 3.--Years of Teaching Experience of Teacher Respondents Classified by Type of Institution.]

Of the 278 teachers surveyed (Table 4), 5 percent (14 teachers) had participated in one of Byram's projects--6 percent of the teachers in project schools and 6 percent of the teachers in project area centers. No teacher in non-project schools and only one teacher in non-project area centers had been involved in a Byram evaluation project. Ninety-five percent of the teachers responded to this question on the survey form.

Seventy-eight percent of the teachers had not participated in any locally directed evaluation project (Table 5). Twenty-five teachers in project schools and project area centers and seven teachers in non-project schools and non-project area centers had participated in some form of a local evaluation project.

Only one teacher (Table 6) had less than a high school education. Five percent had only a high school diploma. Over 80 percent of the teachers had bachelor's or master's degrees. Five percent of all the teachers responding had specialist's degrees or master's degrees plus thirty hours. One teacher had a doctor's degree. Teachers in area centers showed a greater diversity of amount of schooling completed than teachers in schools. Two percent of the teachers in schools had less than a bachelor's while 13 percent of area center teachers were in this category. Ninety-two percent of the teachers in schools and 72 percent of teachers in area centers had bachelor's or master's degrees.
Four percent of teachers in schools and 8 percent of teachers in area centers had a master's degree plus thirty graduate hours or more.

[Table 4.--Participation of Teacher Respondents in Byram's Projects of Locally Directed Evaluation of Local Vocational Education Programs, Classified by Type of Institution.]

[Table 5.--Participation of Teacher Respondents in Any Locally Directed Evaluation Project Aimed at Evaluation of Local Vocational Education Programs Other than North Central Evaluations, Classified by Type of Institution.]

[Table 6.--Highest Level of Schooling Completed by Teacher Respondents, Classified by Type of Institution.]

Data Analysis Procedures

Data gathered by the questionnaire were coded and punched on cards for use by the computer. The "no responses" to survey statements were recorded as "no opinions." The total questionnaire containing sixty-four statements dealt with Byram's concept of how to conduct a local evaluation project. In order to reduce the data to a more manageable size for specific statistical analysis of questions one and three, sixteen statements dealing with the general process of Byram's evaluation system were selected from the sixty-four statements. These sixteen statements were selected from those considered basic to the process, and specifically included some of the questions which made reference to area centers. The relationship of the sixteen statements to the total questionnaire and to those questions making reference to the area centers is shown in Table 7.

Using all teacher responses to the sixteen statements as a base of 100, a percent of the total teacher population responding was determined for each category of response (on a Likert scale of 1 to 5) for each of the sixteen statements. To determine the reliability for using a total response score for each teacher, Hoyt's test of reliability was applied to the means of the total response scores of all 278 teachers. Hoyt has indicated:

The coefficient of reliability of a test gives the percentage of the obtained variance in the distribution of test scores that may be regarded as true variance, that is, as variance not due to the unreliability of the measuring instrument.3
Hoyt, "Test Reliability Estimated by Analysis of Variance," in Principles of Educational and Psychological Measurement, ed. William A. Mehrens and Robert L. Ebel (Chicago: Rand McNally and Co., 1967), p. 108. 90 :.mOO=OOH>m mchHauno and NcmeHucovH: cu OOHHOOON :mO>HuOOmno vca N£dOmOHHcO NcHuwum: :H HCOEOONOm < O :.:oHua:Hm>m on» OcHuoouHO How caHO a OcHOoHO>OO: :H accompaum a cH vovnHucH ma: :coHuesz>m Eduuoum cH NoOOuomaou OchoHo>oazo :.u:OEuHEEou: cu oocouowou cuss :mHOONOH OOHOHHOOO OcHucHommd: NuH>Huum 0:» :H u:0&0umum < n .mucoeouaum OeuooHom :ooume on» :H covaHucH one: mucoamuaum omozu mo ouoa No econ OH OH «O mHaOON mcoHumvcosEooom m oN O OcHucosoHOEH van OcHumHseuOm coHumeuowcH NcHuHOOom H OH N van OcHuounuoucH .mcHNme:< NHHHHnHmH> van mcoHumoHcssEou HoO H N moocovH>m uchHauOo van OcwaHucovH H O H mcoHumoso :oHHOHHuu mcHuaH=EHOm N oH o 338:8 as 2833.3 5:35 N N N eauuoum OcHumem on» OcHxvaum :Omoumm< OHumEoumxm ozu cH moHuH>Huo< mafia: :oHuasHa>m aaumoua mousvoooum :oHumsHa>m OO O e cH NOCOHOOEOO NcHaOHo>OO Ono condone: cH OchHwHN :OHuaaHu>m on» H O N OcHuuouHO How cOHO a OCHOOHO>OO OOH>uom unauHsncou OOHOHHaso each OHcmuoum0H H O N coHOmzHa>m New oaHh NchH>ONO ou OBHN we :oHuaOOHH< N O m mcoNHuHu mo ucoso>Ho>cH oouuHanu NHOmH>v< acoNHuHu H me O Nquoau mo ucoao>Ho>=H oouuHaEOO ucHHOOum mumuw N H O mHOOOOH OOHOHHOOO mcHucHOOO< each OHnmNOOOOH OO H H Hm>OHOO< euoom ucoauHaaou :oHuasHm>w HOOOH New mcHNHcmMuo vouooHow muoucou uou< ouHaccoHumoso :OOOme Ou oocouowom nqu no HOOON coHumaHa>m HmoOH Now mmoooum onuusHa>m HOOOH new mu=030umum No>hzm mo Honssz m.EmHNO :H moHuH>Huo< mucoeon HaHucommm n.5ahNO .muOucou aou< on» on oucouowox Ocmez mu:OEOuauw Omonh vca .mucosoumum No>Hsm HH< .moHuH>Huo< mmoooum .coHuwaHm>m How mucoson HOHHOOOmm on» O» mucoaOuwum coaume on» we nHzm:ONuuHO¢u-.N oHnaN 91 Hoyt's test indicated that the use of a total score for each teacher was a reliable measure of that teacher's response. Total scores were, therefore, employed for further analysis.4 Analysis of variance was to have been used to determine if there was a difference in the support of Byram's process for evaluation by these teachers who had participated in the projects and those who had not. Upon return of the questionnaires, only 14 of the 278 teachers who responded had actually been involved in the projects. Plans for this analysis were, therefore, discarded. The one-way analysis of variance was selected as the best test to determine if there were significant differences by institution in the level of teacher support of Byram's evaluation process. Analysis of variance has been defined by Borg as . . . a statistical technique that makes it possible to divide the difference (variance) that we obtain in our experi- mental data in to parts and assign each part to its correct source. . . . In educational research the most common application of analysis of variance is to determine the significance of differences among group means.5 In discussing analysis of variance as a procedure, Hays has said: . the experimenter's problem is essentially . . . (that of needing) . . . a method for the simultaneous comparison of many means in order to decide if some statistical relation exists . . 4Reliability score based on Hoyt's test: .845. 5Walter R. Borg, Educational Research An Introduction (New York: David McKay Company, Inc., 1963), p. 142. 6William L. Hays, Statistics (New York: Holt, Rinehart and Winston, 1963), p. 336. 
The teacher responses, when clustered by building, formed twenty-one sub-groups which were categorized into four types of institutions for the one-way analysis of variance.

Analysis of variance was also used to determine if there were differences in teachers' responses to Byram's process when these responses were compared by size of institution, type of institution, and years of teaching experience.

CHAPTER V

ANALYSIS OF DATA

The purpose of this chapter is to present an analysis of data from the survey instrument. The main objectives of this study were to determine: (1) to what extent teachers supported Byram's evaluation system as a process, (2) if a difference existed in teacher responses, by type of institution, to all survey statements, and (3) if differences existed in teacher responses by type of institution, size of institution, and years of teaching experience, to selected survey statements.

Analysis of Data

Of the sixty-four statements on the survey instrument (see Appendix F), over 90 percent of the teachers agreed or strongly agreed with two statements. The two statements were:

1. A program description should be prepared with the potential users in mind, such as school counselors, other faculty members, students, parents, and other citizens.

2. The availability of courses should be checked against the needs of employers for their future or present employees, as well as the occupational interests of students.

Eighty to 90 percent of the teachers agreed with 29 statements. Seventy to 80 percent agreed with 16 statements, 60 to 70 percent agreed with 11 statements, and 50 to 60 percent agreed with 4 statements. The single statement receiving the least support (20.2 percent), with 50 percent disagreeing, made reference to the use of faculty meetings as a way for providing staff time to work on evaluation activities. The use of budgeted funds for inservice education and curriculum planning to support staff activities in the evaluation program, as an alternative way of providing staff time for work on evaluation activities, was agreed to by 67 percent of the teachers.

Statements Relating to Area Centers

Teachers responded to thirteen survey statements making references to area centers as follows (see Appendix F). Sixty-seven percent agreed that an evaluation project should be initiated only when the board and the school staff including the area center staff are committed to program evaluation. Eighty-three percent agreed each vocational area in the area center should be represented on the staff committee. Nearly two-thirds of the teachers agreed that all members of the faculty (both vocational education and general education) and administrators of the school including the area center should participate in the evaluation project, insofar as is feasible. Seventy-three percent agreed that one or more citizen advisory committees should be authorized and appointed for the specific purpose of assisting in evaluation of the vocational education program including the area center. Agreement was shown by 62 percent of the teachers in reference to the following statement: "Names of persons to be considered for nomination to the citizen advisory committee are best suggested by members of your staff including the area center staff and the board of education."
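A brief sketch of how agreement percentages of this kind can be tabulated is given below. It is purely illustrative and not drawn from the study; the coding of the five response categories as the integers 1 through 5 and every name in the sketch are assumptions made for the example.

import numpy as np

# Hypothetical stand-in data: 278 respondents by 64 statements,
# coded 1 = strongly disagree ... 5 = strongly agree (assumed coding).
rng = np.random.default_rng(1)
responses = rng.integers(1, 6, size=(278, 64))

for j in range(responses.shape[1]):
    column = responses[:, j]
    agree = 100 * np.mean(column >= 4)       # strongly agree plus agree
    no_opinion = 100 * np.mean(column == 3)
    disagree = 100 * np.mean(column <= 2)    # strongly disagree plus disagree
    print(f"Statement {j + 1:2d}: {agree:5.1f} / {no_opinion:5.1f} / {disagree:5.1f}")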
Opinions were about evenly divided, with 40 percent agreeing and 35 percent disagreeing to the statement that "persons considered for nomination to the citizen advisory committee should not be nominated by organizations outside of the school and area center."

Seventy-four percent of the teachers agreed that all staff members of the school, including the area center, should assist in developing statements of philosophy and general objectives of programs as they pertain to the preparation of youth and adults for entrance to/and advancement in the world of work. Eighty percent of the teachers agreed with the following statement:

The implementation of recommendations of program changes involves different levels of decision-making. All decision makers--citizens, board of education members, administrators, and teachers of the school including the area center--should be involved at appropriate points in the evaluation process--objectives, criterion questions, gathering and interpreting evidence, and making recommendations.

Eighty-one percent agreed that the overall report might well be organized on the basis of the objectives of the total vocational education program of the school including the area centers. The remaining four questions that include references to area centers focused on procedural characteristics of the evaluation process. The range of agreement on the four remaining questions was 75 to 83 percent.

Teacher Responses to Selected Survey Statements

The first question considered was: "To what extent did teachers' opinions on selected survey statements support Byram's evaluation system as a process for use by a comprehensive high school to evaluate the quality of its vocational education programs including the instruction available in an area center?"

Utilizing the sixteen selected statements from the survey instrument, the percent of the 278 teachers responding to each of the five response categories--strongly agree, agree, no opinion, disagree, and strongly disagree--was calculated and reported in Table 8. (For teacher/administrator responses to the complete questionnaire see Appendix F.)

The following two statements were agreed to or strongly agreed to by over 90 percent of the teachers. Both statements made reference to the necessity for insuring that programs and classes meet the needs of the users.

1. The availability of courses should be checked against the needs of employers for their future or present employees, as well as the occupational interests of students (93.5%).

2. A program description should be prepared with the potential users in mind, such as school counselors, other faculty members, students, parents, and other citizens (92.8%).

The range of agreement on the following seven statements was from 80 to 86 percent. The majority of the statements make reference to local administrative support for local evaluation efforts.

1. The local leader should have a staff committee to work with him on the evaluation project (85.2%).

2. Released time should be provided the local leader for supervision of the evaluation process (84.2%).

3. Services of consultants should be available to the evaluation project personnel (83.1%).

4. A local leader needs to be appointed to conduct a program of self-evaluation of local vocational education programs (82.3%).

Table 8.--Percent of Teachers Who Agreed or Disagreed With Sixteen Selected Survey Statements About Evaluation of Local Vocational Education Programs.
(Figures for each statement are the percent of teachers responding: Strongly Agree Plus Agree / No Opinion / Strongly Disagree Plus Disagree.)

A local leader needs to be appointed to conduct a program of self-evaluation of local vocational education programs. 82.3 / 10.8 / 6.9

The local leader should have a staff committee to work with him on the evaluation project. 85.2 / 12.6 / 2.2

Released time should be provided the local leader for supervision of the evaluation process. 84.2 / 11.9 / 3.9

All members of the faculty and administrators of your school, including your area center, should participate in the evaluation project, insofar as is feasible. 65.1 / 16.9 / 18.0

One or more citizen advisory committees should be authorized and appointed for the specific purpose of assisting in evaluation of the vocational education program in your school including the area center. 73.1 / 15.4 / 11.5

There is need for permanent citizen advisory committees because evaluation of vocational education is a continuous process. 75.6 / 12.9 / 11.5

Services of consultants should be available to the evaluation project personnel. 83.1 / 11.1 / 5.8

A program description should be prepared with the potential users in mind, such as school counselors, other faculty members, students, parents, and other citizens. 92.8 / 4.7 / 2.5

The availability of courses should be checked against the needs of employers for their future or present employees, as well as the occupational interests of students. 93.5 / 5.0 / 1.5

The process of program evaluation includes the development of a philosophy, the stating of objectives, and the gathering of evidences to determine the extent of the attainment of the objectives and the fulfillment of the philosophy. 82.0 / 13.0 / 5.0

All staff members of your school, including the area center, should assist in developing statements of philosophy and general objectives of programs as they pertain to the preparation of youth and adults for entrance to/and advancement in the world of work. 73.8 / 11.5 / 14.7

A staff evaluation should decide what evidence it is willing to accept that the criterion question has been answered and the achievement of the objective indicated. 66.8 / 30.0 / 3.2

The semi-final report of the local study should be informally discussed by the staff and administrators of your school including your area center and members of citizen advisory committees, and not released to the public prior to formulating recommendations and releasing the report to the administrators and other decision makers. 74.5 / 20.5 / 5.0

The implementation of recommendations or program changes involves different levels of decision-making. All decision makers--citizens, board of education members, administrators and teachers of your school including your area center--should be involved at appropriate points in the evaluation process--objectives, criterion questions, gathering and interpreting evidence, and making recommendations. 80.2 / 14.4 / 5.4

The general public should be given an appropriate type of report of the evaluation project. 74.1 / 19.1 / 6.8

A culminating step in locally directed evaluation is that of making plans for continuing or periodic evaluation. 80.9 / 16.9 / 2.2
80.9 16.9 Average percent for all sixteen statements 79.2 14.1 5.0 5.4 6.8 2.2 6.7 100 The process of program evaluation includes the development of a philosophy, the stating of objectives, and the gathering of evidences to determine the extent of the attainment of the objectives and the fulfillment of the philosophy (81.9%). A culminating step in locally directed evaluation is that of making plans for continuing or periodic evaluation (80.9%). The implementation of recommendations or program changes involves different levels of decision making. All decision makers-~citizens, board of education members, and administra- tors and teachers of your school including your area center-- should be involved in appropriate points in the evaluation process--objectives, criterion questions, gathering and interpreting evidences, and making recommendations (80.2%). The range of agreement by the teachers was from 73 to 76 percent on the following five statements. Three of the statements refer to citizen involvement in local evaluation efforts. 1. There is need for permanent citizen advisory committees because evaluation of vocational education is a continuous process (75.6%). The semi-final report of the local study should be informally discussed by the staff and administrators of your school, including your area center, and members of citizen advisory committees, and not released to the public prior to formulating recommendations and releasing the report to the administrators and other decision makers (74.5%). The general public should be given an appropriate type of report of the evaluation project (74.1%). All staff members of your school, including the area center, should assist in developing statements of philosophy and general Objectives of programs as they pertain to the preparation of youth and adults for entrance to/and advancement in the world of work (73.8%). One or more citizen advisory committees should be authorized and appointed for the specific purpose of assisting in evaluation of the vocational education program in your school including the area center (73.1%). The lowest level of agreement by the teachers, 65 to 66.8 per- cent, was on the following two statements. 101 l. A staff evaluation should decide what evidence it is willing to accept that the criterion question has been answered and the achievements of the objectives indicated (66.8%). 2. All members of the faculty and administrators of your school, including your area center, should participate in the evaluation project, insofar as is feasible (65.1%). The statement receiving the highest percent (30.0%) of no Opinion responses was: A staff evaluation should decide what evidence it is willing to accept that the criterion question has been answered and the achievements of the objectives indicated. Ten to 30 percent of the teachers responded with a no opinion to thirteen of the statements. Less than 10 percent responded with a no Opinion to two statements (see Table 8), which were the same two statements that received the highest percent of agreement from the teachers. The statement receiving the greatest percent (18.0%) of responses disagreeing or strongly disagreeing with it was the following: All members of the faculty and administrators of your school, including your area center, should participate in the evaluation project, insofar as is feasible. Three statements were disagreed to or strongly disagreed to by 10 to 18 percent of the teachers. 
Less than 10 percent of the teachers disagreed or strongly disagreed with twelve of the sixteen statements (Table 8).

Teacher Responses by Institution to Survey Statements

The second question to be considered in the analysis of data from the survey instrument (Appendix A) was: "Was there a difference in responses to survey statements by teachers in project schools, non-project schools, project area centers, and non-project area centers?" This question was asked to determine whether the opinions of the present teachers had been influenced by the fact that instructional personnel in their building had participated in one of Byram's projects at some time in the past. The possibility of a residual effect was considered even though most of the present teachers in that building may have had no experience with Byram's projects.

In categorizing the responses by institution, one institution type (non-project area centers) had an N of two. Of these two area centers, one center with which the researcher is associated had a very high percentage of return (93%), with forty-six teachers responding to the survey form. Only five teachers (13%) responded from the second area center, which had recently been involved in an unpopular evaluation project (which was not the Byram project). In view of these situations, the decision was made to omit the non-project area centers from the analysis and to use only three institution types--project schools, project area centers, and non-project schools.

In analyzing the responses to the question, a one-way analysis of variance was computed on institution averages of all teacher responses to all sixty-four statements, using buildings as the unit of analysis. On the basis of the one-way analysis of variance, no residual effect from a school's having participated in one of Byram's projects was apparent among the teachers surveyed. With an alpha level of .10, no significant differences were found (P = 0.7719) in the responses from the three types of institutions (see Table 9).

Table 9.--One-Way Analysis of Variance of Teachers' Responses by Type of Institution.

Source: Institution Types -- Hypothesis Mean Square 2.1245 / F .2632 / P .7719*

*The probability value of .7719 exceeded the level of significance of .10 adopted for this analysis.

Teacher Responses by Institution Type, Institution Size, and Teaching Experience

In looking for possible differences in teacher responses, analyses of variance were used to examine the following question (see Table 10): Were there differences in teachers' responses to selected survey statements in any of the following comparisons?

a. By type of institution?
b. By size of institution?
c. By years of teaching experience?
d. For type of institution by size of institution?
e. For type of institution by years of teaching experience?
f. For size of institution by years of teaching experience?
g. For type of institution by size of institution by years of teaching experience?

Responses for 274 teachers (four teachers failed to indicate years of teaching experience) on the sixteen selected survey statements were used in analyzing this question. A total score was found for each respondent, and a mean was found for each experience group by type and size of institution (see Table 11). For responses to the sixteen statements using the Likert scale of 1 to 5, the maximum score possible was 80. Analysis of variance indicated a relatively high level of agreement among the teachers in their responses to the selected survey statements.
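The one-way analysis of variance just described can be illustrated with a brief computational sketch. The sketch below is hypothetical: the building means are invented, and the use of Python with scipy is an assumption made for illustration rather than the procedure used in the study. Each building contributes one value (its teachers' average response, on the 1-to-5 scale, across all sixty-four statements), the buildings are grouped into the three institution types retained for the analysis, and the resulting P value is compared with the .10 alpha level.

```python
# A minimal sketch with invented building-level data; not the study's original analysis.
from scipy import stats

# One value per building: the mean of all teacher responses (Likert 1-5)
# to all sixty-four statements in that building.
building_means = {
    "project schools":      [3.9, 4.1, 3.8, 4.0, 3.7, 4.2, 3.9, 4.0, 3.8, 4.1],
    "non-project schools":  [4.0, 3.8, 4.1, 3.9],
    "project area centers": [3.9, 4.0, 4.2, 3.8, 4.1],
}

# One-way analysis of variance with the building as the unit of analysis.
f_stat, p_value = stats.f_oneway(*building_means.values())
print(f"F = {f_stat:.4f}, P = {p_value:.4f}")

# Decision rule used in the study: call a difference significant only
# when P falls below the alpha level of .10.
alpha = 0.10
print("significant difference" if p_value < alpha else "no significant difference")
```

Read this way, the grouping parallels Table 9, where the obtained P of .7719 exceeded .10 and no residual project effect was inferred.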
The only statistically significant difference which was measured was in the instance where the three variables (type of institution, size of institution, and years of teaching experience) were used.

Table 10.--Three-Way Analysis of Variance. Factors: Institution Size, Institution Type, and Teaching Experience.

Source -- Hypothesis Mean Square / F / P
Type of Institution -- 147.3804 / 2.6387 / .1055
Size of Institution -- 10.3919 / .1861 / .6666
Years of Teaching Experience -- 22.0648 / .3950 / .6741
Type of Institution by Size of Institution -- 159.8515 / 2.8619 / .0919+
Type of Institution by Years of Teaching Experience -- 71.9825 / 1.2888 / .2774
Size of Institution by Years of Teaching Experience -- 1.4615 / .0262 / .9742
Type of Institution by Size of Institution by Years of Teaching Experience -- 157.4793 / 2.8195 / .0615*

*This item showed a statistically significant difference at the .10 level by having the P value of .0615.
+The P value of .0919 is slightly less than the accepted level of significance of .10. This difference was not considered great enough on which to conclude significant differences in the population.

Table 11.--Cell Means of Teacher Respondent Groups by Years of Teaching Experience, Type, and Size of Institution.*

Institution Size / Institution Type / Years of Experience -- Mean of Experience Group by Institution -- Mean by Institution
Large / School: 1-5 years 66.33; 6-10 years 65.00; 11 or more years 61.09 -- 63.81
Large / Area Center: 1-5 years 65.13; 6-10 years 64.88; 11 or more years 68.00 -- 66.00
Small / School: 1-5 years 64.07; 6-10 years 65.46; 11 or more years 64.77 -- 64.77
Small / Area Center: 1-5 years 64.77; 6-10 years 65.73; 11 or more years 61.57 -- 64.02

*Maximum score possible: 80.

For the group with five years or less of teaching experience, there was little difference in responses (see Table 12) between school and area center, whether large or small.

Table 12.--Mean Scores of Teachers With Five Years or Less Teaching Experience by Type of Institution (plotted cell means; maximum score possible: 80).

In the group of teachers with six and no more than ten years of teaching experience, very little difference was found in response scores, as shown in Table 13.

Table 13.--Mean Scores of Teachers With Six and No More than Ten Years' Teaching Experience by Type of Institution (plotted cell means; maximum score possible: 80).

Those teachers with eleven or more years of teaching experience in the large area centers and the small schools show a relatively higher level of support than those teachers with eleven or more years of teaching experience in small area centers and large schools (see Table 14).

Table 14.--Mean Scores of Teachers With Eleven or More Years' Teaching Experience by Type of Institution (plotted cell means; maximum score possible: 80).
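The factorial analysis summarized in Tables 10 through 14 can also be illustrated with a short computational sketch. The sketch below is hypothetical: the respondent records are simulated, and the use of Python with pandas and statsmodels is an assumption for illustration, not the procedure used in this study. It fits a model with the three factors (institution type, institution size, and years of teaching experience), reports an F and P value for each of the comparisons (a) through (g), and tabulates cell means in the layout of Table 11.

```python
# A minimal sketch with simulated respondent records; not the study's original analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
rows = []
for size in ["large", "small"]:
    for itype in ["school", "area_center"]:
        for exp in ["1-5", "6-10", "11+"]:
            # five hypothetical respondents per cell; each record is a total
            # score on the sixteen selected statements (Likert 1-5, maximum 80)
            for score in rng.normal(64, 4, size=5):
                rows.append((size, itype, exp, min(80.0, max(16.0, score))))
data = pd.DataFrame(rows, columns=["size", "itype", "exp", "score"])

# Full factorial model: three main effects, three two-way interactions, and the
# three-way interaction -- one row of output for each comparison (a) through (g).
model = smf.ols("score ~ C(itype) * C(size) * C(exp)", data=data).fit()
print(anova_lm(model, typ=2))  # compare each P value with the .10 alpha level

# Cell means by size, type, and experience group (the layout of Table 11).
print(data.groupby(["size", "itype", "exp"])["score"].mean())
```

Under this layout, the row for the three-way interaction corresponds to comparison (g), where the study's obtained P of .0615 fell below the .10 alpha level.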
Additional Data

Appointing a Qualified Leader to the Position of Local Leader

The 278 teacher respondents were asked what position the local leader of an evaluation program should hold in the school system. The statement was, "The position of local leader should be: ___a teacher, ___the director of vocational education, ___the high school principal, ___the superintendent, ___other."

In response to the question (Table 15 and Appendix A), 46 percent of the teachers said the director of vocational education should be the local leader, 28 percent said a teacher, 11 percent made other recommendations, and 11 percent did not respond to the statement. Several teachers thought the local leader should be a person from the business and industrial community. Several teachers believed the chairman of the advisory committee for a program should act as the local leader. A few teachers thought a representative from either the Career Education Planning District or the Intermediate School District should be the local leader.

Table 15.--Number of Responses of 278 Teachers to Their Preference for the Position of the Person to Be Selected Local Leader, by Institution Size and Type.

Two teachers indicated the local leader should be someone highly specialized in vocational education. A recently retired vocational education or industrial arts teacher was suggested as a person to be a local leader. Someone highly specialized in the particular vocational occupation was also recommended. Some teachers said the local leader should be the person most qualified for the position. One teacher specifically recommended that the local leader be a degree person with a minimum of ten years' industrial experience. In contrast, one teacher expressed preference for a non-education community person as local leader. A new staff person was also suggested as the best person to be the local leader. An educator not directly connected with the system being evaluated was also recommended.

Providing Time for Evaluation

Teachers were asked to what extent the local leader should be provided released time (Table 16 and Appendix A). The statement was: "The amount of released time per day to be provided to the local leader should be: ___one hour, ___one-half time, ___full time."

About one out of every four teachers, 26 percent, said the local leader should have released time in the amount of one hour per day. Thirty-nine percent said the local leader should be provided with released time of one-half of his day. Fourteen percent of the teachers indicated the local leader should be released from other responsibilities to direct the local evaluation program on a full-time basis (Table 16).

Table 16.--Number of Responses of 278 Teachers to Amount of Released Time per Day to Be Provided the Local Leader, by Institution Size and Type.

Five percent of the teachers made other recommendations. Some thought whatever time was necessary should be considered released. A few thought no released time was necessary. One teacher said, "Budget funds for the position--or forget it." Additional ways of financing which were recommended included cash grants, state funding, and federal assistance. One recommendation made was that the local leader position should be a contractual arrangement.
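The kind of tabulation that stands behind Tables 15 and 16 can be sketched briefly in code. The sketch is hypothetical: the pandas library, the column names, and the response data are illustrative assumptions, not the records collected for this study. It counts each response choice by institution and converts the overall tallies to percentages of the kind reported above.

```python
# A minimal sketch with invented survey records; not the study's original tabulation.
import pandas as pd

# One row per teacher: the institution (size and type combined) and the
# position chosen for the local leader on the survey form.
responses = pd.DataFrame({
    "institution": ["large school", "large area center", "small school",
                    "small area center", "large school", "small area center"],
    "choice": ["director", "teacher", "director", "no response",
               "director", "other"],
})

# Counts by institution and choice, with row and column totals
# (the general layout of Tables 15 and 16).
counts = pd.crosstab(responses["institution"], responses["choice"], margins=True)
print(counts)

# Overall percentage distribution of choices, as reported in the text.
percent = (responses["choice"].value_counts(normalize=True) * 100).round(1)
print(percent)
```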
Additional Comments About Program Evaluation All teachers surveyed were given the opportunity on the survey instrument to express their opinions about program evaluation. The following Opinions were selected as being representative responses. Goals of Evaluation "If program evaluation means better meeting the needs of students, then the time is well spent. From past experiences, evaluation meets the needs of the evaluator and that's about all." "Several programs and program goals are much easier to evaluate than specifics within an individual program. The means of achieving these goals is always going to be different and should only be evaluated if the goals are not being met." "Any evaluation should measure growth and/or programs; therefore, pre-evaluation (before and after) using identical criteria becomes a must." "Any vocational program should include the development of habits and attitudes necessary for successful job performances." "As industry and education formulate the need, then the proper teaching materials can be made to cover those needs as well as the general information needed by the student to survive this fast moving technology." "An evaluation is useless unless it results in change." "It seems essential to me that programs be evaluated for future improvement of training for employment." 113 "Too many young people leave school unprepared for any kind of work that will provide a livable wage." "Employability of students should be the main objective of any vocational instructor, and the Opinion and advice of peOple in industry should be of the utmost importance in pursuit of this goal." Success of Evaluation "After all suggestions and evaluations are drawn together, it would be nice to see more plans, about which thousands of words have been written, actually put into use for the benefit of students." "Theory is good, but it takes so very much time to do the job like you feel it should be done." "Many instructors have lost faith in proving programs, addition of facilities, etc., because as the results are filtered through the regulating agencies, the end result of the time and effort spent is zero. It matters little how facility needs are documented if there are no funds available to support the recommendations." Input to the Evaluation Process "Senior students at area centers would be helpful in evaluating programs." "Student evaluation input should be utilized." "Committees move so slowly--objectives could change before first report is ready for adoption." Participation in the Evaluation Process "Time for all participation should be time released from classroom duties. It should not be planned that participants will devote planning period or home time to this project." "Although complete staff involvement is desirable, it may be unreasonable to obtain complete participation." "Teachers must also be allowed time, secretarial help, and financial assistance to do the job." "It would be good research for someone who does not have a full-time teaching job." 114 "Teachers (instructors) must be given ample time to comply with already existing reports and/or requirements." "Peeple who teach the vocational areas, I think, would be best qualified to do the follow-up study." "Excellent, but the whole evaluation process is very time- consuming." "Pay a qualified individual to get a job done." "Evaluation should be made by an outside agency that is agreed upon by faculty and administration." Need for Evaluation "Not enough evaluation presently done." 
"Evaluation is an important, long process.” "Need means of evaluating total student growth and develop- ment. Just because a student can do a given skill doesn't mean he is a better citizen because of it." Summary The purpose of this chapter was to present an analysis of data from the survey instrument. 1. Of all sixty-four survey statements, teachers were most supportive of the statement that indicated the availability of courses should be checked against the needs of employers. Teachers were least supportive of the statement that made reference to the use of a portion of faculty meetings as a way to provide staff time to work on evaluation activities, although two-thirds agreed that budgeted funds for inservice education and curriculum planning were an alternative. 2. Of the thirteen statements that included references to area centers, the statement receiving highest agreement by the teachers was: "Each specialized vocational area in the area 115 center should be represented on the staff committee." The statement receiving least support was: "Persons considered for nomination to the citizen advisory committee should not be nominated by organizations or agencies outside of the school and the area center." In response to the first research question: "To what extent did teachers' Opinions on selected survey statements support Byram's evaluation system as a process for use by a comprehen- sive high school to evaluate the quality of its vocational education programs including the instruction available in an area center?" the 278 teachers supported Byram's evaluation system as a process as follows: a. Over 90 percent of the teachers agreed or strongly agreed with two statements which dealt with the necessity for insuring that programs and classes meet the needs of the users. b. Eighty to 86 percent of the teachers agreed or strongly agreed to seven statements. The majority of the statements made reference to local support for local evaluation efforts. c. The range of agreement by the teachers was from 73 to 76 percent on five statements, three of which referred to citizen involvement in local evaluation efforts. d. The lowest range of agreement was 65 to 66.8 percent on two statements. One statement referred to the need for participation by all faculty and administration in an evaluation project. The other statement referred to what 116 evidence is acceptable in making a decision as to whether a stated objective in the evaluation project has been achieved. In response to the research question: "Was there a difference in responses to survey statements by teachers in project schools, non-project schools, project area centers, and non-project area centers?" there were no significant differences, at the .10 level, found in the responses of three types of institutions. Non-project area centers were not included in the analysis of this question because of insufficient data. Only a very small number of the respondents had actually been involved in the evaluation projects conducted by Byram. 
The third research question asked in the study was: "Were there differences in teachers' responses to selected survey statements in any of the following comparisons: (a) by type of institution, (b) by size of institution, (c) by years of teaching experience, (d) for type of institution by size of institution, (e) for type of institution by years of teaching experience, (f) for size of institution by years of teaching experience, and (g) for type of institution by size of institution by years of teaching experience?" With the exception of part "g" there was a relatively high level of|agreement among the teachers in the way in which they responded to all items. In reference to section "g," at the .10 level a significant difference in teachers' responses occurred at the level of eleven years or more of teaching experience. These teachers with eleven or more years of teaching experience in the large area centers and the small schools showed a relatively higher degree of 117 support than these teachers with eleven or more years of teaching experience in small area centers and large schools. Under additional data, with 89 percent responding, 46 percent of the teachers said the director of vocational education should be the local leader and 28 percent indicated a teacher should be the local leader. With all 278 teachers responding, 89 percent said the local leader should be provided with released time of one half of his day, 26 percent said the local leader should have released time in the amount of one hour per day, and 14 percent said the local leader should be released on a full-time basis. CHAPTER VI SUMMARY, CONCLUSIONS, AND RECOMMENDATIONS Summary The purpose of this study was to determine the extent to which teachers supported Byram's evaluation system as a process for locally directed evaluation of local Michigan vocational education programs in comprehensive high schools served by area centers. Recommendations were to be made for locally directed evaluation of local vocational education programs in comprehensive high schools utilizing area centers. Statement of the Problem The problem was to determine to what extent teachers supported Byram's evaluation system as a process for use by a comprehensive high school to evaluate the quality of its vocational education programs including the instruction available in an area center. This study attempted to answer the following questions. 1. To what extent did teachers' opinions on selected survey statements support Byram's evaluation system as a process for use by a comprehensive high school to evaluate the quality of its vocational education programs including the instruction available in an area center? 118 119 2. Was there a difference in responses to survey statements by teachers in project schools, non-project schools, project area centers, and non-project area centers? 3. Were there differences in teachers' responses to selected survey statements in any of the following comparisons: a. By type of institution? b. By size of institution? c. By years of teaching experience? d. For type of institution by size of institution? e. For type of institution by years of teaching experience? f. For size of institution by years of teaching experience? g. For type of institution by size of institution by years of teaching experience? Research Design and Procedures The population of this study consisted of vocational education teachers in fourteen Michigan high schools and seven area centers. 
Ten of the high schools and five of the area centers were located in districts which had participated in one or both of Byram's two research and deve10pment projects in 1963-1965 and 1966-1967. The remaining four high schools and two area vocational centers were chosen because of their location within districts which had participated in neither of Byram's two research projects. The questionnaire which included sixty-four statements was developed as the data gathering instrument for the study. The questionnaire was hand-carried to the fourteen high schools and seven area centers and distributed to the 416 teachers and teacher/coordinators 120 who were teaching in the occupational areas of agriculture, distributive education, health occupations education, home economics, business/office education, technical education, and trade and industrial occupations. Usable questionnaires were received from 278 or 66.8 percent of the teachers. With the exception of administrative information in the appendix, all data were obtained from teachers or teacher/coordinators who were teaching one or more classes in career orientation, exploration, or job preparation in the above mentioned occupational areas. No information was Obtained from other teachers in the respective schools and area centers. With the exception of a specially funded program that might exist within a specific building, this study was limited to teachers of students in grades 9 through 12. Findings Concerning the extent to which teachers supported Byram's evluation system as a process for locally directed evaluation of local Michigan vocational education programs, the major findings were: 1. With the exception of two statements, the minimum percent of agreement by all teachers to any of the remaining sixty-two statements was 52 percent. Two statements were agreed to by over 90 percent of the teachers. Teachers were least supportive of the statement that made reference to the use of a portion of faculty meetings as a way to provide staff time to work on evaluation activities. Two-thirds of the teachers agreed that budgeted funds for inservice education and curriculum planning was an alternative. 121 Sixty-seven percent of the teachers agreed that an evaluation project should be initiated only when the board and the school staff including the area center staff are committed to program evaluation. Eighty-three percent agreed each vocational area in the area center should be represented on the staff committee. All members of the faculty (both vocational education and general education) and administrators of the school including the area center should participate in the evaluation project, insofar as is feasible. In answer to the question, "To what extent did teachers' opinions on selected survey statements support Byram's evaluation system as a process for use by a comprehensive high school to evaluate the quality of its vocational education programs including the instruction available in an area center?" over 90 percent of the teachers agreed or strongly agreed on two statements, 80 to 86 percent agreed or strongly agreed on seven statements, 73 to 76 percent agreed or strongly agreed on five statements, and 65 to 67 percent agreed or strongly agreed on two statements. Thirty percent (the highest percent of no Opinion) had no Opinion on one statement, 10 to 30 per- cent had no Opinion on thirteen statements and less than 10 percent had no opinion on two statements. 
The highest percent of disagreement, or strong disagreement was 18 percent on one statement. Ten to 18 percent disagreed or strongly disagreed with three statements and less than 10 percent disagreed or strongly disagreed with twelve of the sixteen statements . 122 The second question asked in the study was: ”Was there a difference in responses to survey statements by teachers in project schools, non-project schools, project area centers, and non-project area centers?" An analysis of variance with an alpha level of .10 indicated no significant differences were found in the teacher responses from the three types of institutions: project schools, non-project schools, and project area centers. Data from non-project area centers was not included in the analysis because of insufficient data. Analysis of variance was used to analyze the third question: "Were there differences in teachers' responses to selected survey statements in any of the following comparisons: (a) by type of instituiton, (b) by size of institution, (c) by years of teaching experience, (d) for type of institution by size of institution, (e) for type of institution by years of teaching experience, (f) for size of institution by years of teaching experience, and (g) for type of institution by size of institution by years of teaching experience." No significant differences were found at the alpha level of .10 in questions "a" through "f." The only statistically significant differ- ence which was measured was in question "g" and that differ- ence occurred only in the group of teachers with eleven or more years of teaching experience. All teachers were asked what position should the local leader hold in the school system. Twenty-eight percent said a teacher, 46 percent said the director of vocational education, 2 percent indicated the local leader should be the high school principal, 123 2 percent said the superintendent, 11 percent made other recommendations, and 11 percent did not respond. When all teachers were aksed what amount of released time should be provided the local leader, 26 percent said one hour per day, 39 percent said half-time, 14 percent said full-time, 5 percent offered other recommendations, and 16 percent did not respond to the statement. Conclusions Based on the data acquired in this study, the key conclusions are as follows. Vocational education teachers and teacher/coordinators as indivi- duals appear to support Byram's evaluation system for locally directed evaluation of vocational education programs including those in the area center, as indicated by the following. 1. Although the majority of the teachers had no experience in one of Byram's evaluation projects, when they were confronted with the process they were favorable toward it. Although the teachers were not in favor of using a portion of faculty meetings as a way to provide staff time to work on evaluation activities, they were in favor of budgeting funds for inservice education and curriculum planning as alternatives. The majority of the teachers believe that an evaluation project should be initiated only when the board and the school staff including the area center staff are committed to program evaluation. The majority also indicated each vocational area at the area center should be represented on the staff 124 committee, and all members of the faculty and administration of both the high school and the area center should participate in the evaluation project. 
There was a high level of agreement (an average of 79 percent) among the teachers in their opinions on sixteen selected survey statements in support of Byram's evaluation system as a process for use by a comprehensive high school to evaluate the quality of its vocational education programs including the instruction available in an area center. There were no significant differences in responses to survey statements by teachers in project schools, non-project schools, and project area centers. There were no significant differences in teachers' responses to selected survey statements when compared by type of institution, size of institution and years of experience, except in a three-way analysis of variance comparing type of institution by size of institution by years of teaching experience. Those teachers with eleven or more years of teaching experience in the large area centers and the small schools showed a relatively higher level of support than those teachers with eleven or more years of teaching experience in small area centers and large schools. There was relatively high agreement that the director of vocational education should be the local leader. A majority of the teachers believed the local leader should be provided with released time. 125 Recommendations The recommendations that follow are based on data obtained from the survey and the review of literature. Recognizing the need for local involvement in the evaluation of local vocational programs as set forth in Byram's projects on locally directed evaluation, the following recommendations are intended for consideration in the planning of evaluation of vocational programs throughout the State of Michigan. 1. In the interest of improving instruction in secondary vocational education, procedures should be set up for encouraging locally directed evaluation of local programs which include the area centers. Locally directed evaluation should be a responsibility of all teachers and administrators in both home schools and area centers. Funds should be budgeted at the local level for inservice education and curriculum planning to support activities in the local evaluation program. Programs in the training of local personnel in local program evaluation should be made available to teachers and administra- tors of both home schools and area centers. Local leaders should be provided with released time from their daily schedule to direct an evaluation of their local vocational education program. The above recommendations reflect the findings from this study and are based on the assumption that evaluation of local programs of vocational education should be conducted by those who are involved in, 126 and responsible for, such local programs since they are in the most advantageous position to improve instruction. Discussion The comprehensive high school philosophy provides the basis for the majority of high schools in our nation today. The comprehensive high school, by its definition, incorporates vocational education as an integral part of the curriculum. It is vital to the future of our country to provide both academic and vocational education as a part of the comprehensive school philosophy for our youth. Neither program will succeed if a separate, but equal, philosophy is adopted. Opportunity must exist for the student at the high school level to choose educational pursuits as he prefers. 
The area center, by viture of being a part of vocational education and by being established as another classroom down the hall, requires that it exist within the philosophical framework of the compre- hensive high school. Society depends on education and the progress it is able to achieve. Progress in education, in turn, is dependent upon improvement. Improvement in education is, in turn, dependent upon program evaluation. Local school districts, to succeed, must depend on improvement through program evaluation. Local districts with comprehensive high schools have the responsibility of including local program evaluation and area vocational centers within their operating philosophies. This study has shown a support for the comprehensive philosophy and the incorporation of program evaluation and the area center concept within that phiIOSOphy. BIBLIOGRAPHY BIBLIOGRAPHY Books Banathy, Bela H. Instructional Systems. Belmont, Calif.: Fearon Publishers, 1968. Barlow, Melvin J. "The Challenge to Vocational Education." Vocational Education, the Sixty-fourth Yearbook of the National Society for the Study of Education. Edited by Melvin J. Barlow. Chicago: The University of Chicago Press, 1965. Beaumont, John. "Philosophical Implications of the Vocational Education Amendments of 1968." Contemporary Concepts in Vocational Edu- cation. Edited by Gordon F. Law. Washington: American Vocational Association, 1971. Borg, Walter R. Educational Research an Introduction. New York: David McKay Company, Inc., 1963. Brandon, George L., and Evans, Rupert N. "Research in Vocational Education." Vocational Education. Edited by Melvin L. Barlow. Chicago: The University of Chicago Press, 1965. Bruner, Jerome S. Toward a Theory of Instruction. Cambridge, Mass.: Harvard University Press, 1971. Burkett, Lowell A. "Access to a Future." Contemporary Concepts in Vocational Education. Edited by Gordon F. Law. Washington: American Vocational Association, 1971. Cohen, David K. "Politics and Research: Evaluation of Social Action Programs in Education." Review of Educational Research, Edu- cational Evaluation. Edited by Gene V. Glass. Washington: American Educational Research Association, 1970. Commission on Instructional Technology. "What Is Instructional Technology?" To Improve Learning. Edited by Sidney G. Tickton. New York: R. R. Bowker Company, 1970. Conant, James B. The American High School Today. New York: McGraw- Hill Book Co., 1959. 127 128 Conant, James B. The Comprehensive High School, A Second Report to Interested Citizens. New York: McGraw-Hill Book Co., 1967. ShapingEducational Policy. New York: McGraw-Hill Book Company, 1964. Deterline, William A. "Practical Problems in Program Production." Programed Instruction. Edited by Herman G. Richey and Merle M. Coulson. Chicago: University of Chicago Press, 1967. Erlandson, David A. "Evaluation and an Administrator's Autonomy." School Evaluation, the Politics 8 Process. Edited by Ernest R. House. Berkeley: McCutchan Publishing Corporation, 1973. Glaser, Robert. "Implications of Training Research for Education." Theories of Learning_and Instruction. Edited by Ernest R. Hilgard. Chicago: University of Chicago Press, 1964. Gooler, Dennis D., and Grotelueschen, Arden. "Accountability in Curriculum Development." Curriculum Theory Network. Edited by F. Michael Connelly. Toronto: The Ontario Institute for Studies in Education, 1971. Gottman, John Mordechai, and Clasen, Robert Earl. Evaluation in Edu- cation. Itasca, Ill.: F. E. Peacock Publishers, Inc., 1972. Guralnik, D. B., ed. 
Webster's New World Dictionary. New York: The World Publishing Company, 1972. Haskew, Laurence D., and Tumlin, Inez Wallace. "Vocational Education in the Curriculum of the Common School." Vocational Education, the Sixty-fourth Yearbook of the National Society for the Study of Education. Edited by Melvin J. Barlow. Chicago: The Uni- versity of Chicago Press, 1965. Hays, William L. Statistics. New York: Holt, Rinehart and Winston, 1963. Healy, John. "Systems Planning in the Classroom." Accountability: Systems Planning in Education. Edited by Creta Sabine. Homewood, Ill.: ETC Publications, 1973. Hostrop, Richard W. Managing Education for Results. Homewood, Ill.: ETC Publications, 1973. Hoyt, Cyril J. "Test Reliability Estimated by Analysis of Variance." Principles of Educational and Psychological Measurement. Edited by William A. Mehrens and Robert L. Ebel. Chicago: Rand McNally and Co., 1967. Hull, William L. "Process of Planned Change." Contemporary Concepts in Vocational Education. Edited by Gordon F. Law. Washington: American Vocational Association, 1971. 129 Keller, Franklin J. ”Vocational and Educational Guidance." Vocational Education, the Sixty-fourth Yearbook of the National Society for the Study of Education. Edited by Melvin J. Barlow. Chicago: The University of Chicago Press, 1965. Kibler, Robert J.; Barker, Larry L.; and Miles, David. Behavioral Objectives, and Instruction. Boston: Allyn and Bacon, Inc., 1970. Kruger, W. Stanley. "Implications of Accountability for Educational Program Evaluation." Accountability in American Education. Edited by Frank J. Sciara and Richard K. Jantz. Boston: Allyn and Bacon, Inc., 1972. Lewis, James, Jr. Administering the Individualized Instruction Program. West Nyack, N.Y.: Parker Publishing Company, Inc., 1971. Lindvall, C. M., and Cox, Richard C. Evaluation as a Tool in Curriculum Development: The IPI Evaluation Program. Chicago: Rand McNally 8 Company, 1970. Link, Frances R., and Diederich, Paul B. "Evaluation as Feedback and Guide." Evaluation as Feedback and Guide. Edited by Fred T. Wilhelms. Washington: Association for Supervision and Curriculum Development, 1970. Merriman, Howard 0. "A Conception of Curriculum Evaluation Involving Teachers, Parents, and Other Educational Decision-Makers." Curriculum Theory Network. Edited by F. Michael Connelly. Toronto: The Ontario Institute for Studies in Education, 1971. Evaluation of Planned Educational Change at the Local Education Agency Level. A report prepared by the Evaluation Center, The Ohio State University College of Education for the U.S. Office of Education. Columbus, Ohio: Ohio State University, 1967. Morphet, Edgar L.; Johns, Roe L.; and Reller, Theodore L. Educational Organization and Administration. Englewood Cliffs, N.J.: Prentice-Hall, Inc., 1967. National Advisory Council on Education Professions Development. Evaluation of Education: In Need of Examination. Washington, D.C.: National Advisory Council on Education Professions Development, 1973. Nerden, Joseph. "Statewide Evaluation of Vocational Education." Contemporary Concepts in Vocational Education. Edited by Gordon F. Law. Washington: American Vocational Association, 1971. 130 Panitz, Adolph. "What Makes a High School Comprehensive?" Contemporary Concepts in Vocational Education. Edited by Gordon F. Law. Washington: American Vocational Education Association, 1971. POpham, W. James. Evaluating Instruction. Englewood Cliffs, N.J.: Prentice-Hall, Inc., 1973. Redfern, George. How to Appraise Teaching Performance. 
Columbus, Ohio: School Management Institute, Inc., 1964. Rippey, Robert M. Studies in Transactional Evaluation. Berkeley, Calif.: McCutchan Publishing Corporation, 1973. Sabine, C. D., ed. Accountability: Systems Planning in Education. Homewood, Ill.: ETC Publications, 1973. Sciara, Frank J., and Jantz, Richard K. "The Call to Accountability." Accountability in American Education. Edited by Frank J. Sciara and Richard K. Jantz. Boston: Allyn and Bacon, Inc., 1972. Shoemaker, Byrl R. "People, Jobs and Society: Towards Relevance in Education." Contemporary Concepts in Vocational Education. Edited by Gordon F. Law. Washington: American Vocational Education Association, 1971. Silberman, Charles E. Crisis in the Classroom. New York: Vintage Books, 1970. Skinner, B. F. The Technology of Teaching. New York: Appleton- Century-Crofts, 1968. Stake, Robert E. "Objectives, Priorities, and Other Judgment Data." Review of Educational Research. Edited by Gene V. Glass. Washington: American Educational Research Association, 1970. Stake, Robert E. "Toward a Technology for the Evaluation of Edu- cational Programs. Perspectives of Curriculum Evaluation. Edited by Robert E. Stake. Chicago: Rand McNally 8 Company, 1972. Sullivan, Howard J. "Objectives, Evaluation, and Improved Learner Achievement." Instructional Objectives. Edited by Robert E. Stake. Chicago: Rand McNally 8 Company, 1970. Swanson, J. Chester. "Criteria for Effective Vocational Education." Contemporary Concepts in Vocational Education. Edited by Gordon F. Law. Washington: American Vocational Education Association, 1971. Thompson, John F. Foundations of Vocational Education. Englewood Cliffs: Prentice-Hall, Inc., 1973. 131 Tyler, Ralph W.; Gagne, Robert M.; and Scriven, Michael. Perspectives of Curriculum Evaluation. Chicago: Rand McNally & Company, 1972. Venn, Grant. Man, Education, and Manpower. Washington: The American Association of School Administrators, 1970. Walsh, John Patrick, and Selden, William. "Vocational Education in the Secondary School." Vocational Education, the Sixty-fourth Yearbook of the National Society for the Study of Education. Edited by Melvin J. Barlow. Chicago: The University of Chicago Press, 1965. Westbury. "Curriculum Evaluation." Review of Educational Research. Edited by Gene V. Glass. Washington: American Educational Research Association, 1970. Wilhelms, Fred T. "Evaluation as Feedback." Evaluation as Feedback and Guide. Edited by Fred T. Wilhelms. Washington: Association for Supervision and Curriculum Development, 1970. , and Diederich, Paul B. "The Fruits of Freedom." Evaluation as Feedback and Guide. Edited by Fred T. Wilhelms. Washington: Association for Supervision and Curriculum Development, 1970. Reports and Manuals--Published Brown, Donald V. Manual for Local Evaluation. A report prepared by the University of Tennessee and the Tennessee State Department of Education for the U.S. Office of Education. Knoxville and Nashville, Tenn.: University of Tennessee and the Tennessee State Department of Education, 1971. Bruhns, Arthur E. Evaluation Processes Used to Assess the Effective- ness of Vocational-Technical Programs. A report prepared by the University of California at Los Angeles for the U.S. Office of Education. Los Angeles, Calif.: University of California at Los Angeles, 1968. Burt, Samuel M. Evaluation of Arkansas Vocational TrainingyPrograms in Relation to Economic Development. A report prepared by the University of Arkansas and the W. E. Upjohn Institute for Employment Research for the U.S. 
Office of Education. Little Rock, Arkansas and Washington, D.C.: University of Arkansas and the W. E. Upjohn Institute, 1969. Byram, Harold M. Evaluation of Local Vocational Education Programs. East Lansing, Mich.: Michigan State University, 1965. Evaluation Systems for Local Programs of Vocational- Technical Education. East Lansing, Mich.: Michigan State University, 1968. 132 Bryam, Harold M. A Five-State Try-Out and Demonstration Program to Determine the Generalizability of an Evaluation Systzm for Local Programs of Vocational and Technical Education. East Lansing, Mich.: Michigan State University, 1971. , and McKinney, Floyd. Evaluation of Local Vocational Edu- cation Programs. East Lansing, Mich.: Michigan State Uni- versity, 1968. , and Robertson, Marvin. Locally Directed Evaluation of Local Vocational Education Programs. 3rd ed. Danville, Ill.: The Interstate Printers and Publishers, Inc., 1970. California Coordinating Unit for Occupational Research and Development. Evaluation in Vocational Education. A report prepared by the California Coordinating Unit for Occupational Research and Development for the U.S. Office of Education. Sacramento, Calif.: Research Coordinating Unit for Vocational Education, 1967. Christensen, Howard H. A Study to Determine Needed Improvements in Vocational Programs in Nine Nevada High Schools. A report prepared by Michigan State University‘and Nevada State Depart- ment of Education for the U.S. Office of Education. East Lansing, Mich., and Reno, Nevada: University of Nevada, 1969. Crunkilton, John R. Testing of Model for Evaluation of Secondary School " Programs of Vocational Education in Agriculture. A report prépared by Virginia Polytechnic Institute and State University for the U.S. Office of Education. Blacksburg, Vir.: Virginia Polytechnic Institute and State University, 1971. Dale, Oliver J. Vocational Education Evaluation Project. A report prepared by Virginia Polytechnic Institute, State University and State Department of Education for the U.S. Office of Education. Richmond, Vir.: Virginia Polytechnic Institute, State University and Virginia State Department of Education, 1972. General Report of the Advisory Council on Vocational Education. Vocational Education, the Bridge Between Man and His Work. Washington, 1968. Grotsky, Jeffrey N. Peer Evaluation Program. A report prepared by the National Institute for the Study of Educational Change for the U.S. Office of Education. Bloomington, Ind.: The National Institute for the Study of Educational Change, 1968. Hammond, Robert L. Evaluation at the Local Level. A report prepared by Project ERIC for the U.S. Office of Education. Thscon, Arizonia: Project ERIC, 1967. 133 Michigan Department of Education. A Position Statement Concerning the Deve10pment of Area Vocational and Technical Education Programs in Michigan. Lansing, Mich.: Michigan Department of Education, 1967. A Position Statement on Educational Accountability. Lansing, Mich.: Michigan Department of Education, 1972. A Tentative Plan for the Development of Area Vocational Education Centers in Michigan. Lansing, Mich.: Michigan Department of Education, 1970. National Study for Accreditation of Vocational/Technical Education. Instruments and Procedures for the Evaluation of Vocational/ Technical Education. A report prepared by the American Vocational Association for the U.S. Office of Education. Washington, D.C.: American Vocational Association, 1971. National Study of School Evaluation. Elementary School Evaluative Criteria. 
Arlington, Vir.: National Study of School Evaluation, 1972. Norton, Robert E.; Love, E. Lamar; and Rolloff, John A. Institute for Improving Vocational Education Evaluation. A report prepared by the University of Arkansas for the U.S. Office of Education. Fayetteville, Ark.: University of Arkansas, 1970. Starr, Harold, and Dieffenderfer, Richard A. A System for Statewide Evaluation of Vocational Education. Columbus, Ohio: The Ohio State University, 1972. U.S. Department of Health, Education, and Welfare. Office of Education. Vocational Education and Occupations. Washington, D.C.: U.S. Government Printing Office, 1969. Wisconsin State Department of Public Instruction. A Manual to be Used in the Evaluation of Thirty-Four Comprehensive High Schools in Wisconsin Which Participated in a Three-Year Pilot Program of High School Vocational Education. Madison, Wis.: Wisconsin Department of Public Instruction, 1968. Wyllie, Eugene Donald. An Evaluation Plan for Business Education Programs in High_Schools. A report prepared with the cooperation of the Indiana Business Education Association fer the U.S. Office of Education. Bloomington, Ind.: South-Western Publishing Company, 1963. 134 Articles in Journals or Magazines "Study Panel Reports." American Vocational Journal 49 (January 1974):10. Cawelti, Gordon. "Must We Systematize Curriculum Building?" Edu- cational Leadership 31 (March 1974):484-84. Dannenberg, Raymond A. "Michigan Feels the Pull of Accountability." American Vocational Journal 49 (January 1974):54-55. Hayman, John L. "Educational Management Information Systems for the Seventies." Educational Administration Quarterly 10 (Winter l974):60-71. Hilton, Peter. "The Survival of Education." Educational Technology 13 (November 1973):12-16. House, Ernest R. "Technology and Evaluation." Educational Technology_ 13 (November l973):20-26. Karns, Edward A., and Wenger, Marilyn J. "Developing Corrective Evaluation Within the Program." Educational Leadership 30 (March 1973):533-35. McKinney, Floyd L., and Mannebach, Alfred J. "Let's Give Students Their Say." American Vocational Journal:27-29. Ott, Jack M.; Fletcher, Sheila; and Turner, Donald G. "Taxonomy of Administrative Information Needs: An Aid to Educational Planning and Evaluation." Educational Technology 13 (May 1973): 29-31. Shear, Twyla M. "Accountability Versus Responsibility." American Vocational Journal 48 (March l973):26-27. Stallard, John J. "MDTA's Evaluation Gap." American Vocational Journal 48 (September l973):54. Webster, William J., and Schuhmacher, Clinton C. "A Unified Strategy for System-Wide Research and Evaluation." Educational Tech- nology 13 (May l973):68-7l. Yelon, Stephen L. "An Examination of Systematic Development of Instruction for Nonresidential Colleges." Educational Technology 13 (July 1973):36-43. Unpublished Materials Alger, Leon J. "A Rationale for the Establishment of Area Vocational Education Programs in Michigan." Ph.D. dissertation, Michigan State University, 1967. 135 Gold, Ben K. "Evaluation of Programs." Paper presented at a con- ference sponsored by the Compensatory Education Project, Coordinating Board, Texas College and University System, Sheraton-Crest Inn, Austin, Texas, April 5-6, 1971. Grotelueschen, Arden D., and Gooler, Dennis 0. "Role of Evaluation in Planning Educational Programs.” Paper read at "Evaluation and the Planning of Education Programs," a symposium of the American Educational Research Association Meeting, New York, February, 1971. Jones, Leon. 
"Using Evaluation Data to Improve an Ongoing Program: A Methodology." Paper presented at the New England Educational Research Organization Conference, Boston College, Chestnut Hill, Mass., June 4, 1971. Verduin, John R., Jr. "An Evaluation of a Cooperative Approach to Curriculum Change. Ph.D. dissertation, Michigan State Uni- versity, 1962. Newsletters and Newspaper Articles Alkin, Marvin C. "Products for Improving Educational Evaluation." UCLA Evaluation Comment 11 (September 1970), 1-15. Alkin, Marvin D. "Wider Context Goals and Goal-Based Evaluators." UCLA Evaluation Comment 111 (December 1972), 1-8. Husek, T. R. "Different Kinds of Evaluation and Their Implications for Test Development." UCLA Evaluation Comment 11 (October 1969), 1-10. Porter, John W. "Educational Challenge Accepted." The Detroit News, January 4, 1973. Federal Laws U.S. Congress. National Defense Education Act of 1958. 85th Congress, 1958. Vocational Education Act of 1963. 88th Congress, 1963. Vocational Amendments of 1968. 90th Congress, 1968. APPENDICES APPENDIX A QUESTIONNAIRE APPENDIX A QUESTIONNAIRE SURVEY FORM I. Pleaae provide the following information about youraelf. Sex: ——Male Female What '- your prim-y position preaently? Age: ———2¥: your: or let ——-g6 to 25% _._. —— o —— he: -— I "5% to —Ef:f>p coordinator/teacher ——36 t3 40 ——6 or over Odfi'm‘mm H 4] 10‘5“” .a '"trat' ' -— : our ears 0 u -ime Inna are ex nencr I have mcoympleted? (Include fIIia year.) PC If O'I‘IIEIk return th'l urrvey form unamered to your boil repreamtatrve. —-—5 {eariaoor lea —— o H I II-ti . teachi expene' nee have on —- to I cthpIetedx. ’55.:th Ilhia 'ytar.) n; y — f) to 23 la- —- I or more —— Iowio" What a. the highest level of retooling you have completed ——%to&3 lofthiadate. —- IO _ m —-1 th h' .hool etion a o" d‘ t D H told a ‘ u' LIE; frmgém'lgnr te . ec n -——-— r or ‘ m n to lgjhyxflm‘cgduafig‘n 0%? Izod :ocazional 313360;”, I -—I'I“’JtIItt‘ girqs eugr'eri W propam. ea —— o -——«§Ia~ters tit-gr ‘ '- ‘ " ea; an Iocall ——lboct‘(tll'l‘:‘fie 3“ °' "u" i 30 'Rcze‘alev uaffirimpsrnpg‘f'ame act yevou OWIQ IocaIIn vocational y —-—Uther (spcn y) e cation programs. ot cr than ‘North ntraI evaIuationa? -—- ea —-— 0 THE STATEMENTS BELOW REPRESENT A PROCESS DEVELOPED BY DR. HAROLD BYRAM FOR EVALUATION OF LOCAL VOCATIONAL EDUCATION PROGRAMS BY PERSONS INVOLVED IN THE LOCAL PROGRAMS. PLEASE RESPOND TO ALL STATEMENTS IN TERMS OF WHAT YOU BELIEVE TO BE APPROPRIATE FOR YOUR SCHOOL BY INDICATING TO WHAT DEGREE YOU 'AGREE' OR ‘DISAGREE‘ BY PLKCING A CHECK IN ONE BLOCK. USING THE RATING SCALE AND EXAMPLE BELOW AS A GUIDE. '5' ‘STRONGLY AGREE' ‘4' ‘AGREE' '3' ‘NO OPINION’ '2' ‘DISAGREE‘ '1' ‘STRONGLY DISAGREE' Eaarnple— i“ A major factor in the auooe- of a citizen adviory committee in an understanding of the ED: purpoaea by all concerned ORGANIZING FOR LOCAL EVALUATION \n evduation project should be mama only when the board .nd the achooi mu including the area [:[:[:[::[:] center ataff are committed to program evduation. \m’ntirt‘ fidified Leadera A lord leader M to be appointed to couhct a progam of aeIf-evaluation of feed vocational education m progama. The poaition of local leader should be: —a teacher —'I‘hhee MMtorhogl vocational education — ac ncr —--Thr uperintende'hrt‘ pal —-other The local leader should have a staff committee to work with him on the evaluation project. (A staff commit ia a committee compared of key persona in the mhool with each apecialiaed vocational area represented.) 
W The following ahould be repreaented on the ataff committee: Each epecialiaed vocationd area it your achool Each specialized vocationd area in your area center Coon-cling - Adminitration . . . W of the ataff committee ahonld be appointed as an anociate to work with the leader on a team Pruitt" ' Time for Evduation. The leader of the evaluation Inject moi he provided with clericd help. equipment, applies. and support for m neceoary local travel. mehprwideddaeloedhaderfumpuvifionofthenduadonprocm CECE: Theamountofreleaaedti-eperdaytohe providadtotheloealleaderahouldhe: —one hour —one-haIf time ——fuII-time ”on ' for Involvement of Faculty and Citiaena The following me alternative waya of profiting ataff time for work on evduation activitiea: Uae budgeted funrh for inacniee education and uniwlura planning to upport ataff activitiea in the m evaluation prepare. Uaeaehoolfinancdpahateauldyeouauuherehdepaadentatudyorrueanhfupadudemditi [33:33 avalahle. u. aehooI financed mu credit Miropa [:[IEE] u. - portion ore-eh tan: new CEIIEI 0th (apocify): AB umber! of the faculty (hoth vocational education and general education) and administrator: m of ’OI.‘ aehool. including your area center, bould participate in the evaluation project. insofar I u eanhle' . 136 137 2 Kintb of vocational citiacn adviso committees are rather varied, but may be classified as general, departmtal. and craft committees. The genera committee is usually a permanent one organized to give advice on proyam of vocational education in the school. i L4 I 3 I 2| 1| Partici tion in evaluation could be one of the functions of a neral adv' committee which ' es advice I on tthI..otal vocational education program of the school. ge “0'7 9v LI I I I J One or more citiaen adviso committees should be authorized and a inted for the s Tue purpose of amisting in evaluation of d2 vocational education program in your aghzol including thchrea center. I I I I I ] There is need for permanent citisen advisory committees because evaluah'on of vocationd education is a I I I I I I continuous process. Names of persons to be considered for nomination to the citizen advisory committee are best sugested by I I I I I I members of your school staff including the area center staff and the board of education. Persons considered for nomination to the citizen advisory committee should not be nominated by I I I I I I organisations or agencies outside of your school and your area center. Developing a Plan for Directing the Evaluation. Connrltants are pemns from outside the school staff who have had experience and/ or education that qualify them to give advice information. and/ or suggestions regarding professional decisions. plans, or other matters which should be studied. These would include persons living in the community. or in nearby communities. or who work in state or federal agencies. and/ or in state institutions of higher education. Services of consultants should be available to the evaluation project personnel. I I I I I I A consultant would be of assistance in determining the total estimated time for all evaluation activities needed I I I I I I in the evaluation project. Developing Cometency in Program Evaluation. Most administrators need training in program evaluation. l l I Most teachers need training in program evaluation. I I j I I When a consultant is invited to meet with a citizen advisory committee or a staff committee. the members I I I I ;hou|r|I_be informed in advance that the consultant will be present. 
and to the type of help that may be expected rom rm. Controversid local issues should be made known to the consultant so he can decide whether to avoid them. to I I I I I I apply information to aid in settling them, or to take a position with respect to them. III. MAJOR ACTIVITIES IN THE SYSTEMATIC APPROACH Surfing the Existing Program. The members of the staff committee. should assemble and disseminate information regarding specialized vocational I r L I I I courses being offered; related general courses; and the vocational guidance program. All members of the school facult . area center faculty. and citizens serving on organized committees, should I I I I I I receive information from the stu f committee. The information assembled aml disseminated is to be conddered in the light of the characteristics of the I I I I I I community. the school, and the area center. The recommendations that eventually will be made should relate, to the nature. the objectives. and the instructional I I I I J I components of what already exists and to what is needed. A written description of the programs contributing to occupational preparation should be prepared. I L I I L J A program description should be prepared with the potential users in mind. such a school counselors. other I I I I TJ faculty members, students, parents, and other citizens. The availability of courses should be checked against the needs of employers for their future or present employees. I I I I I I as well as the occupational interests of students Stating Philosophiand Ofiectives. The cm of am evaluation includes the devclo ment of a hiloso h . the stati of o 'ectivcs, and the gatheIIrIiIrIc of evrjziiscres to determine the extent of the apttainmcnt (if the o 'eyctivcs and t c ful lment of the I I I I I I P510009 r- All staff members of your school. including the area center, should assist in devclo ing statements of philoso y I I I I I I and general objectives of (prqgrams as they pertain to the preparation of youth an adults for entrance to/an advancement in the worl 0 work. Most statements of objectives include certain related attitudes, understandings. and appmciations in addition to I I I L I I competencies or abilities. Pri attention in the r0 am evaluation should be placed on com tencies ac ired and learnin animating to job pcrfoiImd‘rice and job satisfaction. I” qu P r [ I I L ] If unbiased means cc to be employed in appraising attainment of objectives, standards of attainment appropriate I I I I I I to the lcvd, or advancement of the students must be spelled out. Statements of com tencics be made more ' stud of em ment irementa. and adjustingtheseto ageorgydelevelofstudcn towhomgyap . ploy W by I I I l JJ Formulatirg Criterion Questions. A staff evaluation should decide what evidence it is willi to acce t that the criterion question has been I I I I I I amwered and the achievement of t]; ofiectrve mama? IA criterion question is one which places the objective in larch a way that an answer is called for that would help to measure the attainment of the o jective. 0c ' y a broad objective would suggest several questions relating to it.) 138 Identify ingand Obtaining Hitting Evidences may be defined as data and information the staff is willing to accept as answers to criterion questions. 
There are several possible approaches to the gathering of evidences related to objectives and criterion questions:
    A curriculum study conducted in cooperation with business and industry wherein specific performance goals would be developed within each course in the light of employer needs.
    A plan for providing vocational guidance based on a pre-determined set of "evaluative criteria."
    A follow-up survey of former students.
    A plan to determine the availability of specialized courses.
    Other:

When the different approaches to program evaluation have been agreed upon, they should be listed and scheduled as activities in the total evaluation project.

A local follow-up should be made in such a manner as to complement and assist in state and national follow-ups.

Area studies which have been made to determine needs for vocational and technical education centers should also be considered by a local evaluation committee evaluating a local program.

Analyzing, Interpreting and Reporting Information. In analyzing, interpreting and reporting data, conclusions should not "go beyond" the data, but an effort should be made to completely interpret what the information actually reveals.

Consultants on research can help the evaluator to check the reliability and validity of data.

Data processing machines and personnel should be used in the evaluation process wherever possible and practical.

In analyzing data, a recommended procedure for analysis includes reference back to the objectives of programs and the criterion questions.

Before findings and implications of findings are reported they should be checked for accuracy and their interpretation concurred in by those most involved in the evaluation effort.

The semi-final report of the local study should be informally discussed by the staff and administrators of your school including your area center and members of citizen advisory committees, and not released to the public prior to formulating recommendations and releasing the report to the administrators and other decision makers.

There is no point where involvement of all who are to implement the findings of an evaluation is more important than at the stage of the project described in the above statement.

Formulating and Implementing Recommendations. The implementation of recommendations or program changes involves different levels of decision-making. All decision makers--citizens, board of education members, administrators and teachers of your school including your area center--should be involved at appropriate points in the evaluation process--objectives, criterion questions, gathering and interpreting evidence, and making recommendations.

When writing or presenting recommendations for implementation, it is very important to present evidence to document needs for program change.

The overall report might well be organized on the basis of the objectives of the total vocational education program of your school including the area center.

In reporting the results of an evaluation, caution should be exercised against over-emphasis of departments or occupational areas.

The general public should be given an appropriate type of report of the evaluation project.

Generating or building interest in the evaluation report on the part of the audience may be accomplished in part by showing the relation of the analysis of information and related recommendations to the goals held by each audience.
Those directing an evaluation and the implementation of recommendations stemming from it should study ways of promoting cooperation and overcoming difficulties.

The psychology of motivation should be recognized by providing recognition and/or reward for the contributions made by different people or groups to the study.

A culminating step in locally directed evaluation is that of making plans for continuing or periodic evaluation.

Please add any additional opinions you have about program evaluation:

If you would like a summary of the findings, please place your name and address below. Thank you.

APPENDIX B

SAMPLE LETTER OF REQUEST

Genesee Area Skill Center
G 5081 Torrey Road
Flint, Michigan 48507
March 11, 1974

Dear

As an administrator involved in vocational education you are aware of the need for consistent planning and implementing of improvements in vocational education programs. As part of my dissertation at Michigan State University, I would like to make recommendations for evaluation by local people of vocational education programs in Michigan high schools which utilize the area center as part of their facilities. These recommendations will be based on a survey of teachers in vocational education and administrators to gain their perceptions of the appropriateness of an evaluation process developed by Dr. Harold Byram of Michigan State University. Dr. Byram's process is based on the premise that evaluation of local vocational education programs should be directed by local personnel.

High schools whose personnel participated in one of Dr. Byram's program evaluation projects and four additional high schools with area centers are being surveyed. Twenty schools in Michigan will be participating and I would appreciate your school being a part of my survey. The opinionaire will require fifteen to twenty minutes to complete. A copy of the opinionaire and the cover letter which would go to each teacher of vocational subjects and all administrators of your school is enclosed. Findings from the survey will be forwarded to the Michigan Department of Vocational Education.

I would like to work through a building representative who would distribute and return the survey forms to me and would appreciate a brief meeting of fifteen minutes or so with you, and/or your designate, in the next few days. I will call for an appointment.

Yours truly,

Marvin DeWitt
Enclosures

APPENDIX C

SAMPLE COVER LETTER FOR QUESTIONNAIRE

508 Chamberlain Street, Apt. F
Flushing, Michigan 48433

Dear Educator:

As a teacher or administrator involved in vocational education, you are aware of the need for consistent planning and evaluating in vocational education programs. As part of my dissertation at Michigan State University, I would like to make recommendations for evaluation by local people of vocational education programs in Michigan high schools WHICH UTILIZE THE AREA CENTER FOR VOCATIONAL EDUCATION as part of their facilities. These recommendations will be based on a survey of teachers and administrators in vocational education to gain their perceptions of the appropriateness of an evaluation process developed by Dr. Harold Byram of Michigan State University. Dr. Byram's process is based on the premise that evaluation of local vocational education programs should be directed by local personnel.
I would greatly appreciate your furnishing the information needed on the enclosed form, as it represents the essential elements of Dr. Byram's evaluation process. If you wish, a summary of the findings will be available from your building representative, when the study is complete.

Sincerely,

Marvin DeWitt

APPENDIX D

SAMPLE FOLLOW-UP THANK YOU/REMINDER LETTER

As of this date the survey forms have not been received from your school. As data is needed from all the schools, I will appreciate your sending the forms as soon as possible. If the opinionaires are already in the mail, please accept my thanks for your cooperation.

Sincerely,

Marvin DeWitt

APPENDIX E

RESPONDENT SCHOOL DATA

Institution -- City -- Number of Grades* -- Enrollment

SCHOOLS

Project
Benton Harbor Senior High School -- Benton Harbor -- 4 -- 2,396
Big Rapids Senior High School -- Big Rapids -- 4 -- 879
Corunna Senior High School -- Corunna -- 4 -- 850
Eastern Senior High School -- Lansing -- 3 -- 1,760
Fitzgerald Senior High School -- Warren -- 4 -- 1,813
Gaylord Senior High School -- Gaylord -- 4 -- 700
Hillsdale Senior High School -- Hillsdale -- 4 -- 860
Marshall Senior High School -- Marshall -- 4 -- 1,130
Niles Senior High School -- Niles -- 3 -- 1,364
Waterford-Kettering Senior High -- Drayton Plains -- 3 -- 1,331

Non-Project
Western Senior High School -- Bay City -- 4 -- 1,781
Southwestern Senior High School -- Flint -- 3 -- 2,236
Pinconning Senior High School -- Pinconning -- 3 -- 715
Lake Fenton Senior High School -- Fenton -- 4 -- 690

AREA CENTERS

Project
Alpena High School Vocational Center -- Alpena -- 2 -- 2,400
Calhoun Area Vocational Center -- Battle Creek -- 2 -- 1,300
Newaygo Vocational Education Center -- Fremont -- 2 -- 538
Northwest Oakland Vocational Education Center -- Clarkston -- 2 -- 400
Sault Ste. Marie Area Center -- Sault Ste. Marie -- 2 -- 1,630

Non-Project
Bay-Arenac Skill Center -- Bay City -- 2 -- 1,000
Genesee Area Skill Center -- Flint -- 2 -- 1,878

*Exclusive of special programs.

APPENDIX F

TEACHER/ADMINISTRATOR RESPONSES TO SURVEY STATEMENTS

Table F.--Responses of 278 Teachers and 50 Administrators to Survey Statements Shown in Percent.*

For each statement, percentages are given in the order: Strongly Agree, Agree, No Opinion, Disagree, Strongly Disagree. T = teacher data; A = administrative data.

*See Tables 15 and 16 and G-8 and G-9 for the information from the items in the survey form which did not fit this pattern.

ORGANIZING FOR LOCAL EVALUATION

An evaluation project should be initiated only when the board and the school staff including the area center staff are committed to program evaluation.
T  25.2  41.3  11.5  13.0   9.0
A  42.0  34.0   6.0  12.0   6.0

Appointing Qualified Leaders

A local leader needs to be appointed to conduct a program of self-evaluation of local vocational education programs.
T  31.6  50.7  10.8   4.7   2.2
A  50.0  34.0  10.0   4.0   2.0

The local leader should have a staff committee to work with him on the evaluation project.
T  43.1  42.1  12.6   1.8    .4
A  60.0  30.0  10.0    .0    .0

The following should be represented on the staff committee:

Each specialized vocational area in your school.
T  47.1  38.8  10.1   2.9   1.0
A  60.0  24.0  16.0    .0    .0

Each specialized vocational area in your area center.
T  43.5  39.2  12.6   3.2   1.5
A  58.0  22.0  16.0   4.0    .0

Counseling.
T  32.0  45.0  15.1   5.7   2.0
A  64.0  26.0   8.0    .0   2.0

Administration.
T  26.3  49.6  15.5   5.4   3.2
A  62.0  28.0  10.0    .0    .0

General education.
T  20.9  42.8  24.1   8.6   3.6
A  48.0  36.0  14.0    .0   2.0

A member of the staff committee should be appointed as an associate to work with the leader on a team basis.
T  21.9  45.0  27.3   4.3   1.5
A  30.0  40.0  30.0    .0    .0

Providing Time for Evaluation

The leader of the evaluation project must be provided with clerical help, equipment, supplies, and support for necessary local travel.
T  52.5  37.0   7.9   1.5   1.1
A  60.0  34.0   4.0   2.0    .0

Released time should be provided the local leader for supervision of the evaluation process.
T  45.7  38.5  11.9   2.5   1.4
A  52.0  34.0   6.0   8.0    .0

Providing for Involvement of Faculty and Citizens

The following are alternative ways of providing staff time for work on evaluation activities.

Use budgeted funds for inservice education and curriculum planning to support staff activities in the evaluation program.
T  22.0  44.6  20.1   9.0   4.3
A  30.0  54.0  10.0   6.0    .0

Use school financed graduate study courses where independent study or research for graduate credit is available.
T  13.3  38.1  29.5  11.9   7.2
A  10.0  38.0  28.0  22.0   2.0

Use school financed graduate credit workshops.
T  15.1  41.4  29.1   9.4   5.0
A  10.0  24.0  40.0  24.0   2.0

Use a portion of each faculty meeting.
T   3.3  16.9  30.2  25.5  24.1
A   2.0  18.0  28.0  42.0  10.0

All members of the faculty (both vocational education and general education) and administrators of your school, including your area center, should participate in the evaluation project, insofar as is feasible.
T  24.1  41.0  16.9  12.6   5.4
A  44.0  38.0   8.0   8.0   2.0

Participation in evaluation could be one of the functions of a general advisory committee which gives advice on the total vocational education program of the school.
T  25.5  54.0  10.8   7.2   2.5
A  16.0  72.0   6.0   4.0   2.0

One or more citizen advisory committees should be authorized and appointed for the specific purpose of assisting in evaluation of the vocational education program in your school including the area center.
T  21.6  51.4  15.5  10.4   1.1
A  22.0  44.0   8.0  20.0   6.0

There is need for permanent citizen advisory committees because evaluation of vocational education is a continuous process.
T  30.5  45.0  13.0   9.7   1.8
A  46.0  28.0  10.0  14.0   2.0

Names of persons to be considered for nomination to the citizen advisory committee are best suggested by members of your school staff including the area center staff and the board of education.
T  15.1  46.8  18.0  16.1   4.0
A  16.0  50.0  16.0  16.0   2.0

Persons considered for nomination to the citizen advisory committee should not be nominated by organizations or agencies outside of your school and your area center.
T  19.0  20.5  25.2  27.7   7.6
A   8.0  22.0  22.0  38.0  10.0

Developing a Plan for Directing the Evaluation

Services of consultants should be available to the evaluation project personnel.
T  30.6  52.5  11.1   4.7   1.1
A  44.0  48.0   6.0   2.0    .0

A consultant would be of assistance in determining the total estimated time for all evaluation activities needed in the evaluation project.
T  14.4  45.3  28.4   9.0   2.9
A  22.0  50.0  20.0   8.0    .0

Developing Competency in Program Evaluation

Most administrators need training in program evaluation.
T  40.6  37.8  17.3   3.6    .7
A  36.0  56.0   4.0   4.0    .0

Most teachers need training in program evaluation.
T  37.4  48.2   7.9   4.7   1.8
A  42.0  52.0   4.0   2.0    .0

When a consultant is invited to meet with a citizen advisory committee or a staff committee, the members should be informed in advance that the consultant will be present, and to the type of help that may be expected from him.
T  41.4  45.3  10.8   1.4   1.1
A  54.0  42.0   4.0    .0    .0

Controversial local issues should be made known to the consultant so he can decide whether to avoid them, to supply information to aid in settling them, or to take a position with respect to them.
T  28.8  50.7  15.1   3.6   1.8
A  44.0  42.0  10.0   2.0   2.0

MAJOR ACTIVITIES IN THE SYSTEMATIC APPROACH

Studying the Existing Program

The members of the staff committee should assemble and disseminate information regarding specialized vocational courses being offered; related general courses; and the vocational guidance program.

All members of the school faculty, area center faculty, and citizens serving on organized committees, should receive information from the staff committee.

The information assembled and disseminated is to be considered in the light of the characteristics of the community, the school, and the area center.

The recommendations that eventually will be made should relate to the nature, the objectives, and the instructional components of what already exists and to what is needed.

A written description of the programs contributing to occupational preparation should be prepared.

A program description should be prepared with the potential users in mind, such as school counselors, other faculty members, students, parents, and other citizens.

The availability of courses should be checked against the needs of employers for their future or present employees, as well as the occupational interests of students.

Stating Philosophy and Objectives

The process of program evaluation includes the development of a philosophy, the stating of objectives, and the gathering of evidences to determine the extent of the attainment of the objectives and the fulfillment of the philosophy.

All staff members of your school, including the area center, should assist in developing statements of philosophy and general objectives of programs as they pertain to the preparation of youth and adults for entrance to/and advancement in the world of work.

Most statements of objectives include certain related attitudes, understandings, and appreciations in addition to competencies or abilities.

Primary attention in the program evaluation should be placed on competencies acquired and learnings contributing to job performance and job satisfaction.

If unbiased means are to be employed in appraising attainment of objectives, standards of attainment appropriate to the level, or advancement of the students must be spelled out.

Statements of competencies may be made more specific through study of employment requirements, and by adjusting these to the age or grade level of students to whom they apply.

[The T and A percentage rows for the statements in the two sections above are illegible.]

Formulating Criterion Questions

A staff evaluation should decide what evidence it is willing to accept that the criterion question has been answered and the achievement of the objective indicated.
T  19.4  47.5  29.9   2.5    .7
30.0 60.0 10.0 .0 .0 Identifying and Obtaining Evidences There are several possible approaches to the gathering of evidences related to objectives and criterion questions: A curriculum study conducted in cooperation with business and industry wherein specific performance goals would be developed within 34.5 51.8 9.0 4.0 .7 each course in the light of employer needs. 50.0 38.0 8.0 4.0 .0 A plan for providing vocational guidance based on a pre-determined set of 18.3 45.0 25.2 9.3 2.2 "evaluative criteria”. . . ‘ 26.0 52.0 18.0 4.0 .0 A follow-up survey of former students 40.3 47.5 8.2 4.0 .0 46.0 42.0 10.0 2.0 .0 A plan to determine the availability 18.0 49.3 26.6 4.3 1.8 of specialized courses . . . . . 22.0 56.0 20.0 2.0 .0 When the different approaches to program evaluation have been agreed upon, they should be listed and scheduled as activities in the 20.5 61.2 15.8 1.4 1.1 total evaluation project. 4.0 46.0 8.0 2.0 .0 A local follow-up should be made in such a manner as to complement and assist in state 22.3 49.3 21.9 5.1 1.4 and national follow-ups. 34.0 52.0 12.0 2.0 .0 Area studies which have been made to determine needs for vocational and technical education centers should also be considered by a local evaluation committee evaluating a local 24.8 54.7 17.6 2.2. .7 program. 38.0 52.0 8.0 2.0 .0 Analyzing, Interpreting and Reporting Information In analyzing, interpreting and reporting data, conclusions should not "go beyond" the data, but an effort should be made to completely interpet 27.0 54.0 14.8 3.9 .3 what the information actually reveals. 40.0 44.0 8.0 8.0 .0 Consultants on research can help the evaluator 19.8 54.3 20.2 4.3 1.4 to check the reliability and validity of data. 26.0 58.0 14.0 2.0 .0 Data processing machines and personnel should be used in the evaluation process wherever 29.5 50.4 16.9 2.2 1.0 possible and practical. 40.0 42.0 14.0 4.0 .0 In analyzing data, a recommended procedure for analysis includes reference back to the objectives of programs and the criterion 26.6 54.3 17.7 .7 .7 questions. ‘ 44.0 48.0 8.0 .0 .0 Before findings and implications of findings are reported they should be checked for accuracy and their interpretation concurred in by those 36.7 48.9 12.3 1.4 .7 most involved in the evaluation effort. 48.0 40.0 10.0 2.0 .0 The semi-final report of the local study should be informally discussed by the staff and administrators of your school including your area center and members of citizen advisory committees, and not released to the public prior to formulating recommendations and releasing the report to the administrators 36.0 .S 20.5 4.3 .7 and other decision makers. 56.0 7’ 0 14.0 8.0 .0 148 Table F.--Continued. Strongly No Strongly Survey Statements Agree Agree Opinion Disagree Disagree There is no point where involvement of all who are to implement the findings of an evaluation is more important than at the stage of the T 21.2 38.5 31.0 6.8 2.5 project described in the above statement. A 34.0 26.0 32.0 6.0 2.0 Formulating and Implementing Recommendations The implementation of recommendations or program changes involves different levels of decision—making. All decision makers—- citizens, board of education members, administrators and teachers of your school including your area center-~shou1d be involved at appropriate points in the evaluation process- objectives, criterion questions, gathering and interpreting evidence, and making T 30.6 49.6 14.4 3.6 1.8 recommendations. 
A  42.0  46.0   8.0   2.0   2.0

When writing or presenting recommendations for implementation, it is very important to present evidence to document needs for program change.
T  38.5  46.0  12.6   2.5    .4
A  54.0  38.0   6.0   2.0    .0

The overall report might well be organized on the basis of the objectives of the total vocational education program of your school including the area center.
T  24.8  56.5  16.6   1.8    .3
A  46.0  44.0  10.0    .0    .0

In reporting the results of an evaluation, caution should be exercised against over-emphasis of departments or occupational areas.
T  20.5  46.8  22.7   9.3    .7
A  38.0  40.0  14.0   4.0   4.0

The general public should be given an appropriate type of report of the evaluation project.
T  26.6  47.5  19.1   6.1    .7
A  42.0  40.0  12.0   6.0    .0

Generating or building interest in the evaluation report on the part of the audience may be accomplished in part by showing the relation of the analysis of information and related recommendations to the goals held by each audience.
T  14.8  47.8  33.8   2.9    .7
A  28.0  54.0  14.0   4.0    .0

Those directing an evaluation and the implementation of recommendations stemming from it should study ways of promoting cooperation and overcoming difficulties.
T  25.2  58.6  15.5    .7    .0
A  36.0  54.0  10.0    .0    .0

The psychology of motivation should be recognized by providing recognition and/or reward for the contributions made by different people or groups to the study.
T  23.4  48.9  24.5   2.5    .7
A  40.0  38.0  18.0   4.0    .0

A culminating step in locally directed evaluation is that of making plans for continuing or periodic evaluation.
T  25.5  55.4  16.9   2.2    .0
A  48.0  42.0  10.0    .0    .0

APPENDIX G

ADMINISTRATOR DATA

[Appendix G tables: administrator response data classified by type of institution; the tables are illegible.]
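The percentage distributions reported in Table F, and the level of agreement (strongly agree plus agree) used in summarizing them, can be recomputed directly from raw response counts. The Python sketch below illustrates that arithmetic only; the response counts in it are hypothetical and are not the study's data.

    from collections import Counter

    # Five response categories, in the order reported in Table F.
    SCALE = ["Strongly Agree", "Agree", "No Opinion", "Disagree", "Strongly Disagree"]

    def percent_distribution(responses):
        """Each category as a percent of all responses, rounded to one decimal place."""
        counts = Counter(responses)
        total = len(responses)
        return {category: round(100.0 * counts[category] / total, 1) for category in SCALE}

    def agreement_level(distribution):
        """Level of agreement = percent strongly agreeing plus percent agreeing."""
        return distribution["Strongly Agree"] + distribution["Agree"]

    # Hypothetical responses of 200 teachers to a single survey statement.
    sample = (["Strongly Agree"] * 60 + ["Agree"] * 90 + ["No Opinion"] * 25
              + ["Disagree"] * 15 + ["Strongly Disagree"] * 10)
    dist = percent_distribution(sample)
    print(dist)                    # {'Strongly Agree': 30.0, 'Agree': 45.0, ...}
    print(agreement_level(dist))   # 75.0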