This is to certify that the thesis entitled AN ANALYSIS OF THE SIMILARITIES AND DIFFERENCES AMONG A SAMPLE OF OPERATIONAL COMPETENCY BASED TEACHER EDUCATION ASSESSMENT AND REVISION SYSTEMS presented by Walter Gordon Ritchie has been accepted towards fulfillment of the requirements for the Ph.D. degree in Secondary Education and Curriculum (Instructional Development and Technology). Castelle G. Gentry, Major professor. Date: January 30, 1976.

ABSTRACT

AN ANALYSIS OF THE SIMILARITIES AND DIFFERENCES AMONG A SAMPLE OF OPERATIONAL COMPETENCY BASED TEACHER EDUCATION ASSESSMENT AND REVISION SYSTEMS

By

Walter Gordon Ritchie

This is an exploratory study developed from a concern that teacher training institutions, in converting to competency based teacher education, were neglecting the assessment and revision components. The purpose of this study was to analyze the similarities and differences in assessment and revision procedures among a sample of institutions using electronic data management systems in their competency based teacher education programs. The research focused on three specific questions:

1. What are the similarities and differences of those systems studied?

2. What are the variables identified that have an effect on the relevance, the effectiveness, and the efficiency of a competency based teacher education assessment and revision system?

3. What were the major problems encountered in the development of the competency based teacher education assessment and revision systems?

The study was intended to provide insight into the procedures considered essential for all assessment and revision systems. It would also identify the support factors which would enable a program to modify and improve itself. By identifying the significant design variables, recommendations could be made to new competency based teacher education programs to aid in their development of a more effective assessment and revision system.

A survey of judges was used to identify the five institutions to be studied. The judges were selected on the basis of acquired recognition at the national level. This procedure was intended to increase the potential of identifying five institutions with fairly sophisticated competency based assessment and revision systems. The principal data-gathering instrument was a questionnaire, which addressed the three research questions. An individual with program development responsibilities was interviewed at each of the five institutions.
Their responses to the questions on the questionnaire were tape recorded. This enabled each respondent to elaborate freely, without the interruptions which hand recording would have necessitated. The data were recorded on the questionnaire after the interview.

The significant design variables identified by this study were used to develop a model assessment and revision system, which was integrated with a data management system. This model is presented as a recommendation to institutions.

AN ANALYSIS OF THE SIMILARITIES AND DIFFERENCES AMONG A SAMPLE OF OPERATIONAL COMPETENCY BASED TEACHER EDUCATION ASSESSMENT AND REVISION SYSTEMS

By

Walter Gordon Ritchie

A DISSERTATION Submitted to Michigan State University in partial fulfillment of the requirements for the degree of DOCTOR OF PHILOSOPHY, College of Education, 1976

ACKNOWLEDGMENTS

To Dr. Castelle Gentry, Chairman of the Advisory Committee, the writer expresses his appreciation for the help and patience given during the preparation of the dissertation and for the quick responses in times of critical need.

To other members of the Committee, Dr. Paul W. F. Witt, Dr. Dale Alam, and Dr. Christopher Sower, the writer is appreciative for their contributions to his doctoral program and this dissertation.

To his wife, Ruth, the writer is especially indebted for her help and encouragement during times of stress.

And finally to Mary Jane Cook, a deep thanks for service under very difficult conditions.

TABLE OF CONTENTS

ACKNOWLEDGMENTS . . . . . . . . . . . . . ii
LIST OF FIGURES . . . . . . . . . . . . .

Chapter
1. THE PROBLEM . . . . . . . . . . . . . 1
   THE NEED FOR THE STUDY . . . . . . . 1
   PURPOSE OF THE STUDY . . . . . . . . 6
   RESEARCH OBJECTIVES . . . . . . . . . 7
   DEFINITIONS . . . . . . . . . . . . . 8
   OVERVIEW . . . . . . . . . . . . . . 12
2. REVIEW OF THE LITERATURE . . . . . . 14
   SUMMARY . . . . . . . . . . . . . . . 38
3. DESIGN OF STUDY . . . . . . . . . . . 40
   INTRODUCTION . . . . . . . . . . . . 40
   RESEARCH QUESTIONS . . . . . . . . . 40
   SELECTION OF PROGRAM VARIABLES . . . 41
   INSTRUMENTATION . . . . . . . . . . . 42
   SUMMARY . . . . . . . . . . . . . . . 49
4. ANALYSIS AND RESULTS . . . . . . . . 51
   SURVEY OF JUDGES . . . . . . . . . . 51
   INTERVIEW OF INSTITUTIONS . . . . . . 53
   SUMMARY . . . . . . . . . . . . . . . 73
5. SUMMARY AND CONCLUSIONS . . . . . . . 77
   RECAPITULATION . . . . . . . . . . . 77
   CONCLUSIONS . . . . . . . . . . . . . 81
   MODEL FOR ASSESSMENT AND REVISION SYSTEM . . . 88
   IMPLICATIONS OF FUTURE RESEARCH . . . 119

BIBLIOGRAPHY . . . . . . . . . . . . . . . 120

APPENDIX A. CRITERIA FOR MAKING JUDGMENT FROM ANALYSIS OF SELECTED DATA MANAGEMENT SYSTEMS . . . 124
APPENDIX B. COVER LETTER FOR SURVEY OF JUDGES . . . 126
APPENDIX C. FORM FOR SURVEY OF JUDGES . . . 127
APPENDIX D. A SURVEY OF DATA MANAGEMENT COMPONENT FOR ASSESSMENT AND REVISION SYSTEMS USED IN OPERATIONAL COMPETENCY BASED TEACHER EDUCATION PROGRAMS . . . 128
APPENDIX E. DATA FROM SURVEY OF JUDGES . . . 142
APPENDIX F. INTERVIEW OF INSTITUTIONS . . . 144
APPENDIX G. CRITERION CHECKLIST AND MARK-SENSE ANSWER SHEET . . . 178
APPENDIX H. EXAMPLES OF DATA RETRIEVED AS PRINTOUTS . . . 180

LIST OF FIGURES

1. … Matrix . . . . . . . . . . . . . . . 48
2. Model For An Assessment and Revision System . . . 90
3. Data Management System . . . . . . . . 96

Chapter 1

THE PROBLEM

The Need for the Study

During the last several years, there has been a movement toward competency based teacher education (CBTE). Recently this movement has accelerated at a rapid rate. The fact that the CBTE movement is progressing so rapidly verifies, to some extent, its ability to eliminate many of the deficiencies of traditional training programs. Education generally has been under severe attack for many years as being ineffectual and frequently irrelevant to the needs of society. Teacher education programs have been the target of much of this criticism. Typical of the charges against teacher education programs is the complaint of Lembo and Olds: "In nearly all teacher training programs, the only semblance of 'training' is the student teaching experience. This is a feeble, one shot, trial and error experiment and screening device. It usually provides no systematic feedback to the trainee or trainer to encourage improvement in the trainee's teaching strategies and fails to prevent many grossly incompetent and emotionally disturbed people from becoming certified to work with classrooms of children."1

1John M. Lembo, ed., Learning and Teaching in Today's Schools (Columbus: Chas. E. Merrill Publishing Co.), p. 14.

In response to the mounting clamor of the 60's, the U.S. Office of Education in 1968 supported the CBTE movement through the Elementary Teacher Education Models Program. Recently the State of Michigan has funded a number of projects for the development of competency based teacher education, and the competency based movement has made significant inroads there. Of the thirty teacher education institutions in Michigan, seventeen have converted at least some portion of their program to competency based teacher education. The other thirteen have indicated an interest in competency based teacher education and may well develop programs of their own.

To determine if a program is competency based, the definition used by Cooper and Weber can be a yardstick: "A competency-based (or performance-based) teacher education program is a program in which the competencies to be acquired by the student and the criteria to be applied in assessing the competency of the student are made explicit and the student is held accountable for meeting these criteria . . . Three types of criteria are used: (1) knowledge criteria which are used to assess the cognitive understandings . . .; (2) performance criteria which are used to assess the teaching behaviors . . .; (3) product criteria which are used to assess the student's ability to teach by examining the achievement of pupils . . ."2

"The most striking feature of competency-based education obviously is competency, which is synonymous with the concept of ability.
At the end of instruction, in competency education, the learner is to have acquired the ability or skill to do something--since doing is the essence of learning."3

Burns further asserts the importance of competency based teacher education by saying, "It is the specification of the behaviors to be acquired that gives leverage to the competency-based movement. It is the extra power--this exact specification of the behaviors to be acquired by the learner--that is making the competency-based education movement more than just another fad in the field of education."4

2James M. Cooper and Wilford A. Weber, "Chapter I, Vol. II, A Competency-Based Systems Approach to Teacher Education." (Typewritten.)

3Richard W. Burns, "Behavioral Objectives for Competency-Based Education," Educational Technology, November 1972, p. 22.

4Ibid., p. 22.

Many institutions across the nation have been deeply involved in this movement. Their experiences have provided some explicit guidelines for continued effort for those involved in competency based teacher education programs. Gentry and Johnson5 and Houston6 have clearly defined the problems and promise in the development and implementation of competency based programs. One of the major problems confronting competency based teacher education today appears to involve assessment techniques. David Krathwohl predicted CBTE ". . . is certain to fail to reach its ultimate objective if it continues on its present course. This failure will be caused by the almost complete lack of attention given to the assessment of teaching competencies . . ."7

5Castelle Gentry and Charles Johnson, A Practical Management System for Performance-Based Teacher Education (American Association of Colleges for Teacher Education, February 1974).

6W. Robert Houston, Strategies and Resources for Developing a Competency-Based Teacher Education Program (A Joint Publication of the New York State Education Department and Multi-State Consortium on Performance-Based Teacher Education, October 1972).

7"Introductory Note" in: Jack C. Merwin, Performance-Based Teacher Education: Some Measurement and Decision-Making Considerations (Washington, D.C.: American Association of Colleges for Teacher Education, June 1973), p. v.

It becomes increasingly evident that assessment and revision procedures are the critical factors in improving educational programs. A competency based program implies that students in the program will move through the instructional packages at their own rates. Prerequisite tests and pretests would place a group of students, all entering the program at the same time, at different entering points. Meeting the criteria of each package will be accomplished in varying amounts of time.

Competency based teacher education is quite a recent innovation; implementation of this procedure for preparing teachers has not yet had extensive research and tryouts. The learning packages may not yet be properly sequenced or inclusive. The strengths and weaknesses are still being discovered.

Changing from traditional methods of training teachers to a competency based program encompasses a great deal of time and money. Added to this, the human reluctance to give up the old and the familiar in order to embrace a new, unknown program means that the program must be strongly justified.
The need for individualized instruction, the lack of detailed and/or in-depth research, the variety of human resources involved, and the financial burden of making costly curriculum changes all suggest the necessity of an elaborate and sophisticated data management system. Because assessment of the competency, and eventually its effect on learning, is the main ingredient, CBTE programs need a management system that can sort out the strengths and weaknesses of an individual, of the learning package, and of the program itself. Feedback to almost everyone involved is a vital part of the management system, so that revisions can be made and the program improved. From this feedback the student, the instructor, the advisor, the developer, and others should benefit by monitoring their areas of concern. Managing vast amounts of data is a highly complex task. Eventually the system should be sophisticated enough to handle it, or valuable information will be lost. Ideally, a sophisticated system should operate from the very beginning, when revisions are most needed.

During the design, implementation, and evolutionary stages of a competency based teacher education program, it is essential that the competencies be accurately and precisely assessed. Considering the numerous variables involved, the assessment and revision system used must be comprehensive and therefore faces the formidable task of generating vast amounts of data. This suggests the need for a management system capable of delivering the appropriate data to the various parties who are relevant to the efficacy of the program.

The above concern is the framework around which this study has evolved. Since programs are developing from somewhat different perspectives, it is desirable to study the variables which influence program structures in diverse ways.

PURPOSE OF THE STUDY

The purpose of this study is to analyze the similarities and differences in assessment and revision procedures among a sample of institutions using electronic data management systems in their competency based teacher education programs. This study will hopefully provide insight into the procedures considered essential for all assessment and revision systems. It should also identify the support factors which will enable a program to modify and improve itself. By identifying the significant design variables, recommendations can be made to new competency based teacher education programs to aid in their development of a more effective assessment and revision system.

RESEARCH OBJECTIVES

The objectives of the study are to determine the significant design variables associated with competency based teacher education assessment and revision systems, based on operational experience. By identifying the significant design variables, recommendations can be made to new competency based teacher education programs in the development of their assessment and revision systems. The following questions will be answered:

1. What are the similarities and the differences of those systems studied?

2. What are the variables identified that have an effect on the relevance, the effectiveness, and the efficiency of a competency based teacher education assessment and revision system?

3. What were the major problems encountered in the development of the competency based teacher education assessment and revision systems?
A model assessment system, synthesizing the elements in the analyzed systems, will be generated as a product of this research.

DEFINITIONS

Since there is some disagreement over the proper title to use, it seems feasible to use the two terms performance based and competency based interchangeably, as suggested by Andrews.8 This study will use these terms interchangeably also. Other terms that need to be defined for this study follow.

8Theodore E. Andrews, Atlanta or Atlantis (A Publication of the Multi-State Consortium on Performance-Based Teacher Education, 1973), p. 1.

The following definitions of Observing, Measuring, Evaluating, and Assessing have been taken from Ways of Teaching by Ronald T. Hyman:

"Observing involves the intentional and methodical viewing of some object or activity. Observing is more than mere seeing; it entails planned, careful, focused, active attention by the observer.

Measuring is the assigning of numbers (quantitative values) to a set of people, objects, or activities, according to some established rules. Measuring is a descriptive activity that expresses quantitatively the degree or type of characteristic possessed by a person, object or activity.

Evaluating is the judging or rating of persons, objects, or activities to be good or bad, right or wrong, worthy or unworthy, desirable or undesirable, etc. Evaluating is not, like measuring, a descriptive activity; it is the 'systematic process of judging the worth, desirability, effectiveness, or adequacy of something according to definite criteria and purposes. The judgment is based upon a careful comparison of observation data with criteria standards.'

Assessing involves comparison between given measurements, to determine to what degree the measurements meet given objectives or criteria. Assessing does not involve value judgments, since the objectives or criteria of comparison are simply accepted as given.

These four terms are obviously related to one another in that observing becomes one aspect of measuring, and measuring, in turn, is often involved in evaluating, though it is often used to obtain data upon which value judgments are based."9

9Ronald T. Hyman, Ways of Teaching, 2nd edition (Philadelphia: J. B. Lippincott Co., 1974), pp. 333-334.

Input Data are the various types and forms of data collected and fed into the data management system.

Output Data are the various types and forms of data generated by the data management system and used for advisement and decision making.

"Criterion-referenced measures ascertain an individual's status with respect to some criterion or performance standard. Because the individual is compared with some established criterion, rather than with other individuals, these measures are described as criterion-referenced."10

"Norm-referenced measures ascertain an individual's performance in relationship to the performance of other individuals on the same measuring device. Because the individual is compared to some normative group, such measures are described as norm-referenced."11

10W. James Popham, Evaluating Instruction (Englewood Cliffs: Prentice-Hall, Inc., 1973), p. 25.

11Ibid., p. 25.

Formative evaluations are judgments of merit made relative to segments of a program.

Summative evaluations are judgments of merit made relative to an entire program.
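The operational difference between the criterion-referenced and norm-referenced measures defined above can be made concrete with a short sketch. The example below is illustrative only and is drawn from none of the programs studied; the 80 percent criterion and the raw scores are assumed:

```python
# Illustrative sketch only: the 0.80 criterion and the scores are assumed.

def criterion_referenced(score, criterion=0.80):
    """Judge a score against an established performance standard."""
    return score >= criterion

def norm_referenced(score, group_scores):
    """Locate a score relative to a normative group: the percent
    of the group scoring below the individual."""
    below = sum(1 for s in group_scores if s < score)
    return 100.0 * below / len(group_scores)

group = [0.55, 0.70, 0.75, 0.85, 0.95]
for score in group:
    status = "meets criterion" if criterion_referenced(score) else "below criterion"
    print(f"score {score:.2f}: {status}, percentile {norm_referenced(score, group):.0f}")
```

The same score of 0.75 falls below the fixed criterion no matter how the rest of the group performs, while its percentile shifts whenever the group does; this is why criterion-referenced measures suit the mastery decisions required by competency based programs.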
Student advisor is an individual who is assigned the responsibility of monitoring a student's progress throughout the program, providing guidance and assistance when necessary.

Program developer is a person directly responsible for program revision.

Cooperating teacher is a teacher employed by a local school system who is responsible for supervising the student in the field setting.

Field experience is any learning activity that occurs off campus and is normally associated with a public school classroom. This would include a variety of activities, particularly student teaching.

Competencies - Descriptions, in performance terms, of the knowledge, skills, and attitudes that will enable a student to meet performance criteria for classroom teaching.

Module - A cluster of related objectives with its own pretest, posttest, and instructional strategies.

Terminal Performance Objectives - Objectives which state what the learner is to be able to do at the end of instruction. They specify the standard levels of performance in behavioral terms.

Competency Based Teacher Education Programs - The following list of characteristics will be used as criteria in the analysis of the several data management systems. The list was produced by the AACTE Committee on Performance-Based Teacher Education. The first five are considered essential by the Committee. The remaining characteristics are considered either implied or desirable. Those that are implied are assumed to exist as a consequence of the first five generic elements, which the Committee feels qualify a program as PBTE or not. The desirable characteristics may or may not be found in a program, but are considered desirable components of a PBTE program.

1. Teaching competencies to be demonstrated are role-derived, specified in behavioral terms, and made public.

2. Assessment criteria are competency-based, specify mastery levels, and are made public.

3. Assessment requires performance as prime evidence, but takes student knowledge into account.

4. The student's progress rate depends on demonstrated competency.

5. The instructional program facilitates development and evaluation of specific competencies.

6. Instruction is individualized.

7. The learning experience of the individual is guided by feedback. Feedback also provides information necessary for program revision. Feedback should be timely, comprehensive, and accurate.

8. The program as a whole is systemic. Most systems are product oriented; how accurately these products reflect the system's purpose is the critical measure by which we judge the system's operation.

9. The emphasis is on exit, not on entrance requirements.

10. Instruction is modularized. A module is a set of learning activities (with objectives, prerequisites, pre-assessment, instructional activities, post-assessment, and remediation) intended to facilitate the student's acquisition and demonstration of a particular competency.

11. The program is field-centered.

12. Because PBTE is systemic and because it depends upon feedback for the correction of error and for the improvement of efficiency, it is likely to have a research component; it is open and regenerative.

The impact of PBTE on teacher training programs should show: (a) much greater flexibility; (b) greater attention to specific skill training; (c) greater congruity between objectives and the evidence admitted for evaluation purposes; (d) better rationalization of faculty decisions and demands affecting students; and (e) development of new facilities and technology required by PBTE.12

OVERVIEW

Much has been written recently on competency based teacher education. In Chapter 2, the pertinent literature on assessment and revision systems will be reviewed.
12Stanley Elam, Performance-Based Teacher Education: What is the State of the Art? (A Publication of the American Association of Colleges for Teacher Education, December 1971), pp. 7-11.

The design of the study will be developed in Chapter 3. The procedure for selection of the sample institutions will be explained. The variables to be examined in a competency based teacher education assessment and revision system will be established. A questionnaire addressing these variables will be presented. A description of the methodology for analysis and treatment of the data collected will be explained. Assumptions and limitations of the study will be identified in this chapter.

Chapter 4 will include the analysis of the data collected and the identification of relevant findings. Matrices for the analysis of the differences, similarities, advantages, and disadvantages among the sample institutions will be used.

An overview of the study will be presented in Chapter 5. This will consist of a summary, the conclusions, recommendations resulting from the study, and implications for future research.

Chapter 2

REVIEW OF THE LITERATURE

This study is devoted to the assessment and revision systems used by operational programs and the data management systems utilized to improve the efficacy of the programs. Particular emphasis will be given to concerns of assessment and revision, since it appears the level of sophistication of the assessment and revision system dictates the level of complexity of the data management system.

Unfortunately, most of the literature on competency based teacher education tends to be quite redundant. This also holds true for those sections discussing assessment and revision procedures. Considering this problem, the review of the literature will concentrate on a limited number of sources, but will explore a few of them in greater depth than would normally be the case. Those selected for in-depth treatment were chosen because of their comprehensive coverage of the subject.

The literature is replete with emphatic concern over the lack of assessment procedures appropriate for competency based teacher education. The concerns also express an apparent disregard for designing an assessment system prior to implementing a competency based teacher education program.

"There is a paradox in education that is hard to explain. It has often been noted that instructional developers may use highly sophisticated techniques in producing instructional programs, only to turn the programs over to management and evaluation systems which are not only outmoded, but in many cases antithetical to the vital processes of new programs. An examination of such a marriage should reduce the wonder regarding the early demise of so many promising innovations. PBTE is a promising innovation that also may fail because of the hostile environment in which it attempts to grow."1

1Castelle Gentry and Charles Johnson, A Practical Management System for Performance-Based Teacher Education (A Publication of the American Association of Colleges for Teacher Education, February 1974), p. 29.

This concern is clarified even further by Elam: "But the overriding problem before which the others pale to insignificance is that of the adequacy of measurement instruments and procedures. PBTE can only be successful if there are adequate means to assess the competency of the student.
The bulk of the effort in establishing PBTE is most likely to go into the development of new instructional materials, into working out arrangements with the bursar and registrar, into devising ways for practicing teachers and administrators to share decision making, into moving the program into the field, and--most important of all--into developing ways to use faculty and librarians most effectively in the operation of unconventional modules in a conventional system. But when all this is done, an institution will still not have moved beyond current conventional grading procedures unless new methods are found for assessing the complex cognitive and affective objectives which are such an essential part of the training of teachers. Yet this is the foundation stone on which the program rests. Knowing that all they must do is pass a given test, students are going to use those instructional materials that most help them do that and will give short shrift to those that don't. The program designer may think that certain content, theory, or experiences are good for the student, but if all the student must do is to pass the test, then the test controls his motivation and his learning activity. If we merely require him to encounter a variety of experiences regardless of the testing, we may have done little more than cut up old courses into new pieces.

Judging from modules that are currently being developed, evaluation appears to have been an afterthought. It is often crudely devised. The developers' energy, effort, and imagination have gone into producing the materials themselves, not into means of assessing mastery of them. Thus, one of the elements of PBTE that seems likely to receive only the attention that is left after other needs are taken care of is the very one that is unique to PBTE and critical to its success--adequate evaluation. Unless there is a change of focus on the part of developers--perhaps a concentration of effort involving division of labor among institutions in some kind of exchange network--and unless the federal government, seeing this as necessary, provides massive new resources and support for the creation of adequate evaluation devices as well, PBTE may well fail to achieve more than a fraction of its potential."2

The lack of adequate assessment techniques is echoed by Andrews: "People believe that objective evaluation of a prospective teacher will reveal whether the person possesses the competency and whether the program is valid, but there is no evidence now available to indicate that assessment techniques are sophisticated enough to validate any programs."3

In fact, there are those who predict that this task is considerably more formidable than many realize, as expressed by Schalock: "As experience with performance based teacher education has accumulated, the interrelated problems of competency definition and competency assessment have come increasingly into focus. On the one hand it is recognized that teaching competence is something more than the mastery of knowledge and simple teaching skills or behaviors, but on the other it is recognized that as soon as the definition of competency extends beyond the knowledge and skill level, the matter of assessment becomes inordinately complex.
In fact, in the eyes of many, it takes on properties that demand more from the technology of measurement and evaluation than that technology has at the moment to give."4

2Stanley Elam, Performance-Based Teacher Education: What is the State of the Art? (A Publication of the American Association of Colleges for Teacher Education, December 1971), pp. 21-22.

3Theodore E. Andrews, Atlanta or Atlantis (A Publication of the Multi-State Consortium on Performance-Based Teacher Education, October 1972), p. 33.

4H. D. Schalock, "From Commitment to Practice in Assessing the Outcomes of Teaching: A Case Study" (Paper presented at the Multi-State Consortium on Performance-Based Teacher Education, New Orleans, 1973), p. 1.

"Few competency based programs employ any but the most rudimentary assessment procedures, and fewer still have anything that could be called an assessment system at all. As yet there simply is no sign of an emerging technology of assessment that meets the demands of the competency based teacher education movement."5

5H. Del Schalock, "Notes on a Model of Assessment That Meets the Requirements of CBTE," Exploring Competency Based Education, W. Robert Houston (ed.) (Berkeley: McCutchan Publishing Corporation, 1974), p. 210.

Schalock goes on to say, "While the general neglect of the assessment function within CBTE is beginning to prove embarrassing, a greater threat to the ultimate success of the movement is the relatively limited concept that most persons in the field have of that function. By and large, the literature of CBTE tends to treat the problem of assessment as if it were equivalent to the problem of measuring teaching performance or 'teaching competency.' Given the emphasis that has been placed on the concepts of demonstrated performance, performance criteria, and the like within the CBTE literature, this is understandable, but it does not serve well the long-term needs of the movement.

To be sure, measurement problems within CBTE are great. This is particularly so when performance criteria are defined in terms of the demonstration of complex teaching skills under ongoing classroom conditions, or the demonstrated ability to bring about short- or long-term learning outcomes in children. But no matter how complex the performance, measurement is not enough. In addition to the process of measurement, judgments have to be made about what has been measured, and decisions have to be made on the basis of those judgments. Obviously, a central question is whether a particular performance meets criterion. But a host of other questions must be attended to as well, e.g., if performance does or does not meet criterion, what should be done next? In either case, what should be done with the information about performance that has been collected? Who should see it? What form must it be in to be usable? How should it be stored? In this writer's judgment, the issue of what to do with information once it is collected is as critical to performance based teacher education programs as the issue of what information is to be collected in the first place."6

6Ibid., pp. 210-211.

Although the predictions are rather dismal, there is a note of optimism in the literature. McDonald suggests, "I assume that even small progress made in assessing teacher competence will be a great improvement over our present evaluations. Because I assume this, I am willing to urge
the use of procedures and systems which at the present time are limited or even defective--since I also assume that as we use what instruments and techniques are now available, we shall learn more about the nature of teaching competence and progressively improve our methods for evaluating it."7

7Frederick J. McDonald, "The State of the Art in Performance Assessment of Teaching Competence" (Paper presented at the Multi-State Consortium at the American Educational Research Association Convention, New Orleans, February 1973), p. 1.

A number of authors offer new directions for developing effective assessment systems. Joyce states:

Assessment is focused on the program elements rather than on the teacher. Beyond the testing of the units themselves, it is unlikely that much useful data will come from isolated studies of the effectiveness of program elements or components. Very few modules and only a few major components are likely to account for a significant proportion of teacher behavior. However, the competency based teacher education format requires and makes relatively easy the testing of units (and their revision or replacement). The most important area to assess, however, is whether the program prepares teachers for specifically defined roles.

There is a fork in the road in the near future for the assessment community as it approaches the problem of the competency orientation. It is possible to build an assessment system around the questions asked that are specific to program elements. Such an assessment system would have immediate use for tracking the progress of students and testing the strength of component elements. However, as both McDonald and Schalock point out, it is far more important to search for trustworthy principles that can be used to guide future programs. Assessment systems can be designed to search for principles just as easily as they can be designed to test only specific elements of a program. For example, variations in feedback, modeling, task complexity, staff size, group size, type of feedback, etc., can be researched if the assessment system is designed to permit them to be studied conveniently.

It is important to establish principles for several reasons. In addition to the obvious ones that are needed to learn what will work in teacher education, there is the need to establish the reliability of elements so that we do not have to test every student's acquisition of every behavior that is specified in the program. If we have reliable principles on which to build components, then we can predict that trainees will achieve certain levels of performance if components are built around those principles. We do not necessarily have to assess its outcome every time it is used or for every trainee within a program.8

8Bruce Joyce, "Assessment in Teacher Education: Notes from the Competency Orientation," Exploring Competency Based Education, W. Robert Houston (ed.) (Berkeley: McCutchan Publishing Corporation, 1974), pp. 205-206.

There are a variety of ideas suggested to assure the development of an effective assessment system which would complement the promise of precision considered vital to the competency based teacher education movement. Michael Scriven states that CBTE wears its credentials on its sleeve.9 He is making obvious reference to the performance objectives, with their precise statements of behavior or products of behavior and the criteria by which they will be judged.

9Michael Scriven, "If the Program is Competency Based, How Come the Evaluation is Costing so Much?" Competency Assessment, Research, and Evaluation (A Report of a National Conference, University of Houston, March 1974), p. 155.

Davies elaborates on this topic by identifying the essential requirements for effective learning:
"If a student or trainee is to realize the learning objectives with which he has been tasked, then it is necessary to ensure that three basic requirements are met: 8Bruce Joyce, "Assessment in Teacher Education: Notes from the Competency Orientation," Exploring Competepgy Based Education, W. Robert Houston (ed.), (Berkeley: McCutchan Publishing Corporation, 1974), pp. 205-206. 9Michael Scriven, "If the Program is Competency Based, How Come the Evaluation is Costing so Much?" Competency Assessment, Research, and Evaluation, (A Report of a National Conference, University of Houston, March 1974), p. 155. 22 §g_mg§2 kpgw exactly ypgp ii expected pf him. Unfortunately, in most teaching situations, students are rarely given any precise and concrete information about what is going to happen to them, and what they are going to learn to do. It is not surprising there- fore, that a student's view of what is necessary, may be very different to his teacher's view of the same situation. Hg must pg given 53 opportunity E9 learn. This is just too obvious. Yet some teachers and instructors behave in such a way as to make learning difficult, if not impossible. An opportunity to learn involves freedom to accept responsibility for one's own learning, freedom to exercise initiative, and freedom to work within the full limits of authority delegated to a student. Hg_mp§E kpg! ghgp progress hp ig making. This means that his progress in realizing learning objectives must be constantly monitored, and the results immediately fed back to him in a form that is meaningful to both short-term and long-term action. The only really effective way of fulfilling this need is by means of regular and detailed counselling and guidance. This means that teachers are committed to discuss a student's successes and failures in realizing his learning objec- tives. There is little point in talking to students about such things as the need for hard work, the importance 23 of trying, and the virtue of being interested; only effectiveness converts resources into results. Hard work is futile when effort is directed at realizing unimportant or unnecessary learning objectives—- particularly when it is done at the expense of objectives which are important. In order to ensure that such a waste of time and effort does not occur, the importance of a teacher's role cannot be overemphasized. Not only must a teacher define learning objectives and ensure that they are indeed being realized, but he must also discuss them with his students in order to secure and maintain their personal involvement and participation in the learning process. It is possible to Eglk about the importance of student motivation. It is functionally more useful to ggip a student's commitment to a set of agreed objectives."lo Davies is indirectly giving direction to the use of assessment data in his requirement number 3. other directions are offered to help guide the development of more appropriate assessment systems. Massanari says, "Because CBTE emphasizes competencies and objectives, individualization of instruction, reconceptuali- zation of faculty roles, effective use of the schools, new kinds of training materials, and assessment, new kinds of management procedures are needed to facilitate effective 10Ivor K. Davis, Competency Based Learning: Tech- nology, Management, and Design, (New York: McGraw-Hill Book Co., 1973), p. 234. 24 operation. A related characteristic is periodic review and modification of the management system based on experience. 
While assessment has always been problematic for educators, it is particularly critical for CBTE. CBTE is heavily process oriented because it emphasizes the demon- stration of competence. Implementation of CBTE requires assessment techniques to determine the appropriateness of given program competencies, the achievement of the selected competencies, the effectiveness of training materials and procedures, and the effectiveness of program management. CBTE pushes educators to obtain or develop assessment techniques which are applicable to all of these program elements. It pushes them both to develop new assessment techniques and to make clear to the profession and to society that new kinds of assessment techniques are needed and will be used. CBTE pushes educators to break through the narrow assessment boundaries imposed by the scarcity of available techniques."11 "Valid assessment instruments are necessary for making that determination, and the measures in that instru— ment must be directly related to the instructional objective. Put another way, a CBTE program that does not have a match between its instructional objectives and primary criteria 11Karl Massanari, "CBTE's Potential for Improving Educational Personnel Development," Journal of Teacher Education, XXIV, Fall 1973, p. 247. 25 for determining how well those objectives are met, does not have a competency-based teacher education program."12 "The competency-based or performance-based concept is currently viewed by many as a potentially useful educa- tional innovation. It is our contention that competency— based teacher education has certain advantages over other forms of teacher education, but that the odds for its survival are not particularly good. Previous educational innovations have failed because of the type of criteria applied to them. "Primary criteria," designed to measure the direct effects of instruction on the behavior of the learner, were seldom used. "Secondary criteria,‘ like teachers attitude, administrative convenience, amount of Federal or foundation funding, or popularity of the instructional approach, were used instead. Ironically (considering its meaning), competency-based educational programs are currently evaluated by secondary criteria."l3 "Whereas nearly all efforts at assessment have in the past been oriented toward the study of the teacher— searching out the most and least effective--the focus in competency based teacher education assessment is the program itself. The program has to be assessed-~in terms of particular elements, components, and the totality--in order 12Cass Gentry and others, "For Want of an Assessment System, CBTE Programs are Lost," (Published by the Multi- State Consortium on Performance-Based Teacher Education, Vol. 3, No. 3, September 1974), p. 1. 13Ibid., p. 1. 26 to build a reliable CBTE system whose output of competencies can be assessed in its turn. 
If there is to be really meaningful within-program assessment, an assessment system has to be built that will yield data about components, units, or modules within components and the program as a whole, all on an ongoing basis which is linked to program renewal and redefinition."l4 "It is anticipated that the advisor report for both selected and constructed response type objectives will be used to perform at least these three primary functions: (a) early identification of student difficulty, (b) serving as a detailed data source for the advisor in working with the student, (c) providing data on a continual basis to indicate student growth through the program."15 "Operationally, an assessment system that serves decision-making must contain specifications on what is to be assessed, standards by which to judge what is assessed, who is to be involved in making judgments about performance as it relates to standards, and rules that spell out what is to occur if performance standards are or are not met. In addition the system must have the data generation, reduction, storage, retrieval, and distribution capability implied by such specifications, including the decision-making structures needed to match data, time, people, decisions, and decision 14Joyce, Op. cit., p. 204. l5Gentry and others, Op. cit., p. 8. 27 schedules. Finally, an assessment system that serves decision-making must contain a well-worked-out design that permits it to interact functionally with the program it supports and with the elements of the program that in turn supports it. Without this close articulation, either the program will function in splendid isolation, or the assessment system will fail to function because of its isolation."16 "Management by teams rather than by complex computer management systems is favored. Perhaps the overriding reason is the team structure provides the opportunity to humanize teacher education. This is so because the team structure: 1. Provides supportive relationships among students and between students and faculty. 2. Provides for curriculum building which can focus on the needs of that particular group. 3. Provides for a close counseling relationship that can more adequately personalize a teacher education program. 4. Allows for participation by students, staff, and public school personnel in decision making at a meaningful level."17 But Schalock cautions, "Just as CBTE programs are to be continuously adaptive, so must the assessment system that supports such programs also be adaptive. In the language of systems theory, an assessment system must be an "open" system."18 l6Schalock, op. cit., "Notes on a Model of Assessment Meet the Requirements of CBTE," p. 218. l7Gilbert F. Shearron and Charles E. Johnson, "A CBTE Program in Action: University of Georgia," Journal of Teacher Education, XXIV, Fall 1973, p. 192. l8Schalock, op. cit., "Notes on a Model of Assessment That Meets the Requirements of CBTE," p. 218. 28 Schalock continues along this line of thought when he adds, "The point is that the student assessment system must be adaptive to such variation in demonstration contexts, and yet at the point of application in these varying contexts be 'closed' to the point where reliable, trustworthy infor- mation can be obtained on each student's performance."19 It becomes increasingly obvious that the data management system must be analyzed in terms of the data it processes. 
It also becomes obvious that the demand for a sophisticated data management system is a direct product of the level of sophistication of the assessment and revision procedures developed. Three data management systems will be cited to show how the institutions differed in their approach to data management.

First, Florida International University developed a computer management system (COMSPEC) with the following objectives:

1. Establishment of a record for each student which includes his planned program of studies (course prescription) and his progress through courses in terms of the modules, tasks, and enablers which comprise each course.

2. Reporting to instructors, on a weekly basis, the performance of students in their courses, the reports indicating which enablers, tasks, and modules were attempted and completed by each student. Weekly reporting on enabler data is optional, being included only at the request of the instructor.

3. Establishment, by instructors, of certain criteria regarding individual student progress on tasks and modules which, if not met, should be reported to them so that they may respond more quickly to difficulties students may be having. A report of this kind is referred to as a Red Flag Report. A Red Flag Report may include, for example, a listing of all students who have unsuccessfully attempted a task two or more times (see the sketch following this comparison).

4. Providing advisors with a list of students who have registered for courses not included in their original course prescriptions.

5. Providing advisors with a list of students who are carrying one or more No Credit grades.

6. Establishment of a data base of student performance for the purposes of long-range planning and evaluation of programs in the School of Education.20

20G. Wesley Sowards, "One Year in Retrospect" (Published by the Multi-State Consortium on Performance-Based Teacher Education, November 1973), p. 11.

Secondly, Oregon College of Education addressed its system as follows: "Assessment was seen in the context of the OCE program, therefore, as a mechanism that supports decision making. Put in other terms, it was a targeted information system. Two major classes of decisions were to be served by the system, instructional decisions and program adaptation or design (management and policy) decisions.

Given this concept of assessment, the system was to include measures of teaching competency; performance standards for competency demonstration at particular levels of certification or precertification experience; specifications as to the decisions to be served by particular measures of competency; specifications as to the structure of mechanisms to be employed in arriving at particular classes of decisions; specifications as to the form which the data were to assume to facilitate each particular class of decisions; and an information reduction, storage and distribution/retrieval system that permits the efficient handling of the data that comes from the system."21

21H. D. Schalock, op. cit., "From Commitment to Practice in Assessing the Outcomes of Teaching: A Case Study," p. 9.

It is interesting to note that the FIU system was almost completely handled by electronic data processing equipment, while OCE handled all the data by hand except summary information, which was stored in a computer for further analysis.
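The Red Flag criterion in objective 3 above is concrete enough to sketch. What follows is a hypothetical illustration of such a weekly pass, not FIU's actual COMSPEC code; the attempt-log format is assumed, and the two-failure threshold comes from the example given in the list:

```python
# Hypothetical sketch of a weekly Red Flag Report pass (not COMSPEC itself):
# list every student who has unsuccessfully attempted a task two or more times.
from collections import Counter

# Assumed attempt-log format: (student, task, passed) tuples.
attempts = [
    ("S-001", "Task 4", False), ("S-001", "Task 4", False),
    ("S-002", "Task 4", False), ("S-002", "Task 4", True),
    ("S-003", "Task 7", False), ("S-003", "Task 7", False),
]

failures = Counter((s, t) for s, t, passed in attempts if not passed)
for (student, task), n in failures.items():
    if n >= 2:
        print(f"RED FLAG: {student} has failed {task} {n} times")
```

An instructor-set criterion of this kind is what lets the weekly report surface difficulties early, before a student's progress through the module sequence stalls.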
Thirdly, the University of Toledo felt that it would take years to implement all the specifications previously developed into an operational system. Earlier studies had been concerned only with developing a competency based elementary teacher education program. The decision not to wait years, and to incorporate the secondary level, required a management system different from the one previously envisioned.

"Several principles and rules were adopted for guiding the management system. The most powerful of these we call the 'Principle of Successive Approximations.' Because we were not able to begin our program at the validity and reliability level necessary for a competency based program, we decided to carefully identify the starting condition of the different elements of our program, and to just as carefully determine the necessary steps those different elements would have to go through in order to meet the requirements of a competency based program. In addition, we decided to set up a time line designating when, by whom, and with what each of the progressive steps would be carried out."22

22Castelle G. Gentry, "Management System For A Competency Based Teacher Education Program," University of Toledo (Typewritten), p. 2.

"The use of the Principle of Successive Approximations requires additional conditions for our management system. For one, it requires a well defined assessment system that can pinpoint parts of the instructional system that are effective and efficient, and those which need to be improved. By the same token, our management system must have a revision component that will act on the assessment data to improve the instructional system. This brings us to a unique characteristic of a management system that is used to develop and maintain a CBTE program through successive approximations. That is, given its imperfect state when first implemented and the intent of the faculty to have it evolve into a truly competency based teacher education program, the CBTE management system must maintain continuous assessment and revision processes. This differs from most educational management systems in that most of their energies go to maintaining a system once it is in operation. Our management system must not only maintain the current approximation, but it must collect data for the next approximation, provide resources for the modification of the current approximation, and install the next approximation. None of these tasks occur by chance, and none of them occur without resources and personnel assigned expressly for the completion of those tasks."23

23Ibid., p. 4.

"The intent of this successive approximation of our PBTE evaluation system is to provide data which will aid the effective advisement of our students, and to identify ineffective, inefficient, and irrelevant portions of our program so that corrections can be made. Such an evaluation program must also indicate necessary changes in the program organization and the management of the program, as well as changes in the modules of the program. We think this continuous process of assessment and revision will bring existing evaluation approximations ever closer to the ideal evaluation system and, more importantly, toward a valid and reliable program for preparing teachers."24

24Castelle Gentry and Charles Johnson, op. cit., p. 35.

Influences in the Development of CBTE

It is reasonable to assume that competency based teacher education is a result of many influences. Three of the more recent influences will be identified and explained in this section.
First, mastery learning as proposed by Bloom was explored to point out the similarities to the competency based movement.25 Mastery learning requires an established level of achievement to be reached by every student. It also recommends that the program be structured to meet the needs of the individual student. Bloom also stressed the need for timely feedback and corrective procedures. It was decided that brief diagnostic-progress (formative) tests were most useful for feedback to both teacher and student. The test results were not used for grading, but were specifically designed for guidance purposes. Bloom asserts, "The success or failure of mastery learning is clearly related to the degree of efficiency of the formative tests in pinpointing the learning needs of the student."26 A finding of major importance was that the elimination of competition between students for grades created a more positive learning environment. The students became cooperative in helping each other attain mastery.

25Benjamin S. Bloom, "An Introduction to Mastery Learning Theory," Schools, Society, and Mastery Learning (New York: Holt, Rinehart and Winston, Inc., 1974), p. 5.

26Ibid., p. 5.

The second strategy explored was Keller's Personalized System of Instruction.27 PSI is an approach to teacher education which is quite similar to mastery learning. Instead of addressing the components and procedures which are common to competency based education, mastery learning, and personalized system of instruction strategies, the major differences between mastery learning and PSI will be identified. This possibly suggests the influences which could account for differences which appear in the design of some CBTE programs.

The first difference lies in their conception of mastery. Bloom believed that a comprehensive test should determine mastery; he did not believe that mastery of the parts of a course was synonymous with mastery of the whole. Keller felt that mastery of the parts of a course was synonymous with mastery of the whole course. Keller also used smaller learning units: the units would consist of one week's work or even less, while Bloom's units usually were about two weeks in length. PSI used individualized instruction, while mastery learning as Bloom conceived it was group paced. The feedback tests in PSI were more descriptive, as Keller used a variety of formats for testing, but his tests tended to sample only student achievement; the tests were a random selection of items intended to measure the unit's objectives, whereas Bloom's tests were comprehensive. The last major difference was that PSI demanded total mastery of each unit, while Bloom established lower levels of mastery, usually falling between eighty and ninety percent accuracy.

27James H. Block, "A Description and Comparison of Bloom's Learning for Mastery," Schools, Society, and Mastery Learning (New York: Holt, Rinehart and Winston, Inc., 1974), pp. 15-24.

There are competency based programs that are using strategies based on either Bloom's or Keller's version of mastery learning. All CBTE programs are using strategies common to both approaches.
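The mastery difference between the two strategies can be stated precisely. The sketch below is illustrative only; the unit scores are invented, and the thresholds follow the descriptions above--total mastery of every unit for Keller's PSI, and a comprehensive-test standard of eighty to ninety percent for Bloom:

```python
# Illustrative contrast of the two conceptions of mastery described above.
# The unit scores and the 0.85 comprehensive standard are assumed.

unit_scores = [1.0, 1.0, 0.9, 1.0]  # per-unit test results
comprehensive_score = 0.87          # end-of-course comprehensive test

# Keller (PSI): total mastery of each unit, unit by unit.
psi_mastery = all(score == 1.0 for score in unit_scores)

# Bloom: mastery judged by a comprehensive test at a lower standard.
bloom_mastery = comprehensive_score >= 0.85

print(f"PSI mastery: {psi_mastery}; Bloom mastery: {bloom_mastery}")
```

A program designed on Keller's lines would hold this learner at the third unit until it was fully mastered, while a program on Bloom's lines would certify the learner on the strength of the comprehensive result--the kind of design difference the text suggests may account for variation among CBTE programs.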
The third influence was not an educational strategy, but it certainly hastened the development of educational programs which could provide empirical evidence of the school's role in student learning. The accountability movement eventually merged into a chorus of discontent. As to the genesis of the movement, it is difficult to identify a single source, and the question may be of no particular interest at this point in time. More important is that educators, government, departments of education, and the public are all very concerned about the school's responsibility for positive growth in student learning, and that the schools be held accountable. As the movement gained momentum, collaboration among the various groups concerned with accountability was inevitable.

"Although systems technologists and behavioral objectivists started their reform movements separately, it was, as Erick Lindman states, 'inevitable that they should discover each other and find they had much in common.' Combined with the pressure of the time, the notion that accountability could and should be more rigorously applied to education has gained currency. Why should persons employed by the public to provide a service (and given considerable latitude in determining how and under what conditions that service will be rendered) be exempt from standing to account for the results of that service? It is not likely that the premise of this argument will be seriously (or at least openly) challenged."28

28 Lesley H. Browder, Jr. and others, Developing an Educationally Accountable Program. Berkeley: McCutchan Publishing Corporation, 1973, pp. 15-16.

Although arguments against accountability do, in fact, occur, it is not the intended spirit of the movement that is resisted, but rather the difficulties in implementing an accountability system. Gronlund states:

"This diagram makes clear that teacher competence is just one of many factors that determine student learning. The mental ability and past achievement of students set limits on their level and rate of learning during instruction, and their attitude toward school and learning determines how wholeheartedly they enter into the learning experience. The quality of the instruction itself is determined not only by the competence of the teacher, but also by the facilities and resources that are made available to the teacher, by the support provided by the administration (e.g., supervision), and by special services (e.g., remedial reading, counseling). One of the major elements in the effectiveness of instruction

[Gronlund's diagram: student entry characteristics (mental ability, past achievement, attitude) feed into instruction (teacher competence, facilities, resources, administration support, special services); instruction shapes the student's response to instruction (community influences, home influences, peer-group influences, self-concept, aspirations, interests), which in turn produces the learning outcomes.]

is the student's response to it. Although the teacher has a certain amount of control over the motivation of the student, this is shaped to a large extent by the student's present and past community and home environments; the attitudes of the peer group; and the student's self-concept, aspirations, and interests. A recent review of research into the factors related to students' school performance has shown that the variables bearing the strongest relationship were of a nonschool nature (Wilbur, 1970). The fact that teacher effects can be submerged by these other factors is supported by studies showing that the amount of student learning brought about by a particular teacher varies considerably from one group of students to another (Rosenshine, 1970)."29

29 Norman E. Gronlund, Determining Accountability for Classroom Instruction. New York: Macmillan Publishing Co., Inc., 1974, pp. 9-10.

In spite of the arguments pro and con, no doubt exists as to the importance of the pressure brought to bear on the educational community.
This pressure is evident in the attempts to develop teacher education programs that can answer the challenge of accountability. The competency based teacher education movement holds much promise in this regard.

SUMMARY

In the review of the literature on assessment, evaluation, and data management systems in competency based teacher education programs, the following concerns were identified:

1. There is a general neglect in developing assessment and revision systems structured specifically for CBTE programs.
2. Assessment techniques are not sophisticated enough to validate any program, particularly beyond the assessment of mastery of knowledge and simple teaching skills or behavior.
3. Assessment should be addressed to teaching competence in a demonstration context, in addition to the assessment of program components.
4. The need exists for assessment systems to search out principles that can be used to guide future programs.
5. The need is for assessment feedback so that the student can monitor his own progress.
6. There must be a precise match between stated objectives and the criteria used to measure how well those objectives are met.
7. The need is for "Primary Criteria" designed to measure the direct effects of instruction on the behavior of the learner.
8. The assessment system should serve to improve the efficacy of decision making.
9. There is a need for the development of a decision-making structure to match data, time, people, decisions, and decision schedules.
10. An assessment system that serves decision making must be so designed that it can interact functionally with the program it supports and with the elements of the program that in turn support it.
11. CBTE programs should be continuously adaptive, including the assessment system.
12. Since a CBTE program can reasonably be expected to generate an enormous amount of data, it will require a data management system capable of data reduction, storage, retrieval, and distribution sophisticated enough to serve the needs of the program.

Chapter 3

DESIGN OF STUDY

INTRODUCTION

This research was designed to determine the current state of the assessment and revision components of competency based teacher education programs, as exemplified by five recognized, operational programs. In addition, this study identified the structures and procedures of the data management systems of these programs, as they were used to store, manipulate, and distribute the data collected by the assessment and revision systems. This information from the five programs was analyzed to determine the significant elements for a model system, which could serve as a guide to developing competency based teacher education programs.

RESEARCH QUESTIONS

In order to make the above determinations, information on the following specific research questions was collected.

1. What are the similarities and the differences among the systems studied?
2. What are the variables having the most influence on the relevance, the effectiveness, and the efficiency of a competency based teacher education assessment and revision system?
3. What were the major problems encountered in the development of competency based teacher education assessment and revision systems?
SELECTION OF PROGRAM VARIABLES

The selection of the program variables relating to the assessment and revision components of the operational programs to be studied was based on the following criteria:

1. The AACTE Committee on Performance-Based Teacher Education's list of characteristics used to determine whether a program is PBTE or not. (See Appendix A)
2. The identification of a number of concerns regarding assessment and revision procedures from reviewing the literature.
3. On the basis of logic, the characteristics that appear appropriate for enhancing the effectiveness and efficiency of any system.

Using the preceding rationale, judgments were made in the selection of variables to be analyzed in the operational competency based teacher education programs. Those selected were:

1. Instruments used for data collection
2. Types of data being generated by the system
3. Sources of input data
4. Destination of output data
5. Various forms of output data
6. Intended use of output data
7. Efficiency of data management system
8. Physical characteristics having impact on system
9. Difficulties encountered making transition to CBTE
10. The internal structure variables having impact on system
11. Proposed revisions or modifications

The above concerns were analyzed in terms of the similarities and the differences that exist among the sample institutions.

INSTRUMENTATION

Five institutions were selected as the sample for the study. Each had a fully developed competency based teacher education program and had incorporated electronic data processing for the management of the assessment and revision data. These five institutions were selected by judges considered to have expertise in the area of competency based teacher education.

Survey of judges

The judges were selected on the basis of acquired recognition at the national level. The minimum criteria were identifiable expertise in the implementation of a competency based teacher education program and association with the development of the assessment and revision component. The majority of judges exceeded these minimum criteria by having served on the AACTE Committee on Performance Based Teacher Education or having been presenters at national conferences on competency based teacher education. All judges have published articles or monographs dealing with assessment/revision and/or data management systems in competency based teacher education. The following sixteen judges were selected:

1. Dr. Hugh Baird, Brigham Young University
2. Dr. Fred Cook, Wayne State University
3. Dr. George E. Dickson, University of Toledo
4. Dr. Norman Dodl, Florida State University
5. Dr. William Drummond, University of Florida
6. Dr. Thomas G. Dunn, University of Toledo
7. Dr. Paul Gallagher, Florida International University
8. Dr. Castelle Gentry, Michigan State University
9. Dr. Robert Houston, University of Houston
10. Dr. Charles Johnson, University of Georgia
11. Dr. Lorrin Kennamer, University of Texas at Austin
12. Dr. Karl Massanari, American Association of Colleges for Teacher Education
13. Dr. Donald Orlosky, University of South Florida
14. Dr. Rita Richey, Wayne State University
15. Dr. Del Schalock, Oregon College of Education
16. Dr. Gilbert Shearron, University of Georgia

A preliminary survey to identify potential institutions was made by reviewing the literature. The purpose of the preliminary survey was to identify several institutions that appeared to have fairly sophisticated competency based teacher education programs.
The names of these institutions would be included in the survey instrument in an attempt to gather information regarding an institution's acquired reputation in the literature and how it was rated by the judges. Of particular interest was the level of agreement, regarding the sophistication of the assessment components of the potential institutions, between the experts associated with each institution and the other experts. This is regarded as an indication of the level of communication between institutions. Each institution identified was represented by a judge.

A cover letter was sent to these sixteen judges, indicating the purpose of the study. The letter solicited their cooperation in identifying appropriate institutions whose data management systems, used to service assessment and revision information and decision making, would be analyzed in this study. (See Appendix B)

The judges were asked to respond to the following statement:

The following institutions have been suggested in a preliminary survey as mature programs that may have a sophisticated system for managing their program data. Would you please react to these suggestions and, more importantly, add to the list below those competency based teacher education programs that you think are using an electronic data processing system for the management of assessment and revision data. (See Appendix C)

The responses from the judges were converted to numerical values. Given the following options, the value of each is indicated on the right:

Lack information (=0)
Beginning stage of development (=1)
Moderately well developed (=2)
Well developed (=3)

The scores for each institution listed were totaled and then placed in rank order. The five top institutions were selected for analysis purposes.
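A compact sketch of this tabulation, rendered in modern Python notation for illustration only: the four rating labels and their point values are the study's, while the function name, the data layout, and the sample marks are hypothetical.

    # Convert judges' ratings to numeric values, total them per institution,
    # and place the institutions in rank order, keeping the top five.
    RATING_VALUES = {
        "lack information": 0,
        "beginning stage of development": 1,
        "moderately well developed": 2,
        "well developed": 3,
    }

    def rank_institutions(responses, top_n=5):
        """responses: (institution, rating label) pairs, one per judge's mark."""
        totals = {}
        for institution, rating in responses:
            totals[institution] = totals.get(institution, 0) + RATING_VALUES[rating]
        ranked = sorted(totals.items(), key=lambda pair: pair[1], reverse=True)
        return ranked[:top_n]

    sample = [  # hypothetical judges' marks
        ("University of Toledo", "well developed"),
        ("University of Georgia", "well developed"),
        ("University of Toledo", "moderately well developed"),
        ("Florida State University", "beginning stage of development"),
    ]
    print(rank_institutions(sample, top_n=2))
    # [('University of Toledo', 5), ('University of Georgia', 3)]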
Interview Instrument

For purposes of analysis, an interview instrument was developed. A visit was made to each selected institution in order to tape record elaborate and in-depth responses. The questionnaire was used to ensure replication in data gathering. (See Appendix D)

The questions to be included covered the following:

A. Questions directly related to the structure of an assessment and revision system.
B. Questions indirectly related, but considered to be influential in the development of an assessment and revision system.
C. Questions directly related to the data management of assessment and revision data.
D. Questions that are neutral in relationship to the assessment/revision system and the data management system, but have impact on the overall structural development of a competency based teacher education program.

It was felt that the questions to be asked would fall into four distinct categories.

Category one is concerned with the environmental conditions that existed during the implementation stage. These conditions are ones that exerted some identified influence, not only on the initial stages of implementation, but also on the total program. For example, lack of financial support and faculty resistance are considered environmental conditions or constraints.

Category two is concerned with program characteristics. These are the structural components making up the entire competency based teacher education program. For example, it could include the method used to handle advisement. Does an institution assign a group of advisees to a single advisor, or are the advisees assigned to a group of advisors, depending on the program? Are the students required to move sequentially through the program?

Category three is directly concerned with the assessment and revision system. For example, what is the time lapse between student assessment and feedback to the student? Another would be, how many times may a student take a posttest?

Questions in category four deal with the data management system. For instance, are all data processed by electronic data processing equipment? And how are data stored?

The validation of the interview instrument was accomplished through four means. First, a list of common assessment and revision concerns was extracted from the literature. Second, a review of the instrument by selected experts was made. These experts were asked to read and critique the survey instrument. They were: Dr. John Doneth, Ferris State College; Dr. William Klingele, University of Arkansas; and Dr. Castelle Gentry, Michigan State University. A third means of validation was through consultation with Dr. John Schweitzer of the Office for Research Consultation at Michigan State University, who reviewed the instrument and made comments. Fourth, any new relevant questions were added to the interview instrument after each visit to the selected institutions. These questions were products of the concerns of the experts at those institutions. Telephone calls were then made to update the data received from previous interviews, so that all questions would be answered by all five institutions.

Treatment of Data

The questions from the survey instrument are clustered according to an identified concern and then placed in a matrix. The response relative to each question is recorded for each institution interviewed. (See the sample matrix on the following page.) A series of matrices are presented, each with its interrelated cluster of questions. The content of the matrices is limited to questions relative to assessment and/or revision, and to the data management system. The concern stated for each cluster will be related to one or more criteria identified as necessary or implied to qualify as a competency based program. Where differences in responses appear, whether structural differences or procedural differences, the rationale supplied by the deviant institution will be provided.

Figure 1. Sample Matrix (the landscape matrix page did not survive scanning)

MODEL FOR ASSESSMENT AND REVISION SYSTEM

Figure 2. A Model For An Assessment and Revision System (the landscape figure page did not survive scanning)

In designing the model, use of the Principle of Successive Approximation may be necessary.1 The Principle of Successive Approximation suggests that an ideal system be designed first, and then a realistic system that a given institution and its personnel can accommodate with their present resources. The next step is to plan a series of intermediate steps between the two, which will ultimately lead to the "ideal" system.

1 Castelle Gentry and Charles Johnson, A Practical Management System for Performance-Based Teacher Education. A Publication of the American Association of Colleges for Teacher Education, February 1974, p. 9.
The "ideal" system will probably never be achieved, because of new information that will need to be incorporated and because of Obsolescence of Old information in the system. Testing procedures will change. Since conventional programs use norm referenced testing, it will be necessary to convert to criterion referenced testing. Competency based instruction is dedicated to each student reaching a stated level of mastery of a skill, knowledge, or attitude, whereas norm-referenced testing measures how well one student compares to another, in terms of their test scores. Criterion referenced testing measures competency achievement. This conversion in testing procedures proves disconcerting or even threatening to some faculty members. In designing the assessment component, it will be necessary to establish criterion measures through the process of objective analysis. Initially, this procedure should provide face validity to all test items, if the 1Castelle Gentry and Charles Johnson. A Practical Management System for Performance-Based Teacher EducaEIon. A Publication of American Association of Colleges for Teacher Education, February 1974, p. 9. 92 data are to be useful in making decisions for advising students and revising program elements. It will be necessary to include public school personnel in the assessment and revision process. Their commitment is crucial to the success of the field experience portion of the program and they represent a rich source of support and feedback for continued program assessment. The availability of specialized facilities will determine the variety of experiences which can be assessed. They also determine the extent to which the program can be designed for individualized instruction. Specialized facili- ties include testing labs, independent study labs, one-way glass windows and TV monitors for observation purposes, and microteaching labs. The tests must be related directly to the stated objectives. It is not enough to develop one set of measures; equivalent measures must be developed to enable instructors to use various forms of the same test. An example, Form A can be used for pretest and Form B for posttest. Another form could be used for a delayed posttest. Use of various forms of the same test prevents students from memorizing specific test items as they retake posttests. Areas of learning that must be assessed can be included under the following three categories. (1) Classroom instruction (may be individualized learning), where type of test would be either a selection type (i.e., multiple choice, etc.) or a construction type (i.e., essay, etc.). 93 If the test is in essay form, then scores have to be converted for introduction into data management system. (2) Simulated experiences, where a graphic rating instrument can be used and the scores recorded on mark-sense sheets and entered into the data system. Since the structure of a simulated teaching experience can be controlled, it provides an opportunity to present students with a variety of exper- iences that otherwise could take years before a student was confronted with a similar situation. (3) Field experiences, where a graphic rating instrument (i.e., criterion checklist) may be used to assess a student teacher's performance. The last two categories of data usually require conversion to scores compatible with the data management system. This would also be true for categories like questionnaires and anecdotal records. Provisions must be made for research studies. 
Longitudinal studies over several years will provide very important information regarding the success pattern of graduates, feedback on the most relevant components of a program, and possibly indications that portions of programs should be modified or even deleted. Feedback from longitudinal studies might well suggest the inclusion of new areas of instruction. The possible gains of longitudinal research are almost unlimited.

Both short term and long term research studies can look at personality characteristics and student background characteristics to determine what variables might emerge as influential predictors of immediate and continued success in the profession. As research is conducted, it must look at the overall effectiveness of graduates of competency based teacher education programs in terms of the three domains of learner outcomes: cognitive, affective, and psychomotor.

Evaluation of the assessment system will be derived through the use of the following data:

1. Class profiles on student achievement
2. Feedback from faculty, students, and public school personnel
3. Research findings that are relevant to the program

The assessment and revision system should be open and regenerative in nature.

Designing the Data Management System

The data management system must deliver the appropriate data to the appropriate consumer in the appropriate form at the appropriate time. This includes data required by, and generated by, the assessment and revision system. If the data management system does not operate efficiently, there will be an overall repressive effect on the whole program, including the assessment and revision system. The need for specialized expertise in data management was stressed by the institutions interviewed. Competency based programs require and generate enormous amounts of data, and the capability of processing these data is vital to the efficacy of the program and to the morale of those individuals who depend on data output. One of the more crucial aspects of competency based teacher education reported is the close contact between advisor and student. Immediate feedback on student performance is necessary to strengthen this relationship, and it allows the advisor to react expeditiously in resolving the problems of students having difficulty. It is assumed that immediate feedback also eliminates apathy on the part of students by quickly reinforcing successes.

The preceding paragraph and the section on designing an assessment and revision system have dealt with some general concerns in developing a model of an assessment and revision system. The following sections describe the components of a data management system and provide specific details of the proposed model. Each of the eight components will be treated in a dichotomized fashion: first, the functions of each component will be identified, and second, how these functions are operationalized will be presented. Figure 3 is a flow chart of the proposed data management system.

    START
      |
    DATA SELECTION
      |
    DATA COLLECTION
      |
    DATA TREATMENT
      |
    DATA STORAGE
      |
    DATA RETRIEVAL
      |
    DATA DISTRIBUTION --> RESEARCH, STUDENTS, INSERVICE TRAINING,
      |                   ADMINISTRATION, REVISOR, FACULTY
    DATA ASSESSED
      |
    (reiterative loop back to DATA SELECTION)

Figure 3
DATA MANAGEMENT SYSTEM
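The flow in Figure 3 can also be expressed as a skeleton program. The sketch below, in modern Python notation, is purely illustrative: the eight stage names come from the figure, while the function signatures, the list-based data bank, and the consumer routing are hypothetical stand-ins for the punch-card and tape procedures the model actually assumes.

    # Skeleton of the eight-stage flow shown in Figure 3.
    def select_data(program):
        """Decide which assessment and revision data this cycle requires."""
        return {"module": program["module"], "objectives": program["objectives"]}

    def collect_data(selection):
        """Gather test scores, ratings, and questionnaires (stubbed here)."""
        return {"module": selection["module"], "scores": [], "ratings": []}

    def treat_data(raw):
        """Produce the consumer-specific summaries (reports, item analyses)."""
        return {"module": raw["module"], "student_reports": [], "item_analysis": {}}

    def store_data(bank, treated):
        bank.append(treated)            # cards, tape, or disc in the original

    def retrieve_data(bank, module):
        return [entry for entry in bank if entry["module"] == module]

    def distribute_data(reports, consumers):
        """Route each readout to the consumers shown in Figure 3."""
        return {consumer: reports for consumer in consumers}

    def assess_data(distribution):
        """Each consumer judges the output; complaints go to the revisor."""
        return {"revisions_requested": []}

    consumers = ["research", "students", "inservice training",
                 "administration", "revisor", "faculty"]
    bank = []
    treated = treat_data(collect_data(select_data(
        {"module": "M3", "objectives": ["3.1", "3.2"]})))
    store_data(bank, treated)
    feedback = assess_data(distribute_data(retrieve_data(bank, "M3"), consumers))
    # The reiterative loop: feedback informs the next cycle's data selection.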
Data Selection

It is necessary to provide information on the following functions for selecting the appropriate data for assessment and revision purposes.

I. Assessment of student achievement to determine:

A. Effectiveness of the program in meeting intended objectives, as judged by student performance and learner outcomes.
B. Efficiency of the program in terms of use of faculty time, student time, and instructional alternatives.
C. Relevance to needs as expressed by the students, faculty, adjunct personnel, and others (e.g., business, industry, parents).

II. Data relevant to administrative decision making:

A. Determining the effectiveness of faculty.
B. Resolving budgetary concerns.
C. Monitoring the entire program.

To operationalize the above functions, the following procedures are recommended.

I. To measure student achievement, it will be necessary to assess the following:

A. Student success in meeting the mastery level of each program objective. The essential information for assessing the effectiveness of the program is the following:

1. Entry behaviors. A pretest will be administered to determine if the student has already achieved mastery level on any of the program objectives, or for placement purposes.

2. Cognitive outcomes regarding achievement level. This will be accomplished by administering a posttest immediately following instruction.

3. Cognitive outcomes regarding retention of knowledge and skills. This may be determined by using a delayed posttest. The application of various forms of the posttest for use as delayed posttests, administered more than once, would provide information for possible research. For example, a dramatic drop in test scores might suggest experiments involving changes in sequence patterns for instruction. It might suggest intermittent exercises to reinforce knowledge and/or skills which tend to dissipate over time. It also might suggest that those particular skills could be more efficiently learned on the job.

4. Cognitive outcomes regarding transfer of learning. The most obvious assessment of transfer will occur during the field experiences. It is at this time that observations will be made to assess a student's ability to transfer knowledge and skills acquired during classroom instruction to actual teaching situations. The ability to synthesize new solutions, consistent with and related to approved practices, will be observed. The ability to transfer knowledge to aid in interpreting new information is an area which must be attended to with much more rigor than in the past. At present, there seems to be a paucity of techniques and instruments which enable an accurate assessment of transfer of learning, although there appears to be a general awareness of the need.

5. Affective outcomes regarding changes in students' attitudes and interests. Although the science of measuring affective outcomes accurately is at the moment rather crude, it is encouraging to note that a greater interest is evident. The use of standardized personality tests is desirable if affective outcomes are to be monitored. These tests will enable both the student and the faculty advisor to inventory positive and negative behaviors of students and to provide data for determining the factors required for developing a program to modify behaviors identified as requiring change.

6. Psychomotor outcomes regarding manipulative skills. The students' ability to manipulate objects, operate equipment, and other relevant physical skills can be assessed by using a criterion checklist. The observer will make judgments as to the degree of proficiency demonstrated by the student.
7. Retention of psychomotor skills. The retention of manipulative skills can be assessed at a later date to determine which skills are retained and which skills will require periodic practice to be maintained at mastery level. A criterion checklist will accommodate this assessment.

8. Transfer of psychomotor skills. Using a criterion checklist to determine a student's ability to perform similar, different, and related complex psychomotor tasks will assess the extent of transfer of psychomotor abilities.

B. To determine the efficiency of the program, the following assessments would be useful:

1. A comparative analysis between student achievement and the cost of various components of the program.
2. A comparative analysis between student achievement and various teaching strategies, which would involve different time allocations for faculty and students. Cost would be considered in relation to overall loss or gain in achievement patterns.
3. An analysis of the number of students who have to recycle through a module one or more times, which will provide evidence of the efficiency of that module.

C. Relevance would continuously be monitored by the following activities:

1. Seeking feedback data on the performance of graduates of the program.
2. Soliciting feedback from all sources that have a vested interest in the program and are sufficiently knowledgeable to provide relatively objective opinions.
3. Measuring the effectiveness of CBTE graduates in terms of pupil achievement and attitudes toward learning, school, and society.

Data Collection

It is necessary to use various instruments to collect the data needed for assessment and revision purposes.

I. Instructional assessment of student achievement.
A. Selection type tests
1. Multiple choice
2. True and false
3. Matching items
B. Constructed response type tests
1. Completion
2. Essay

II. Simulated experience assessment of student performance.
A. Non-quantitative measures of performance
1. Peer group evaluation
2. Self-evaluation
3. Faculty evaluation
B. Quantitative measures of performance
1. Graphic rating instrument (i.e., criterion checklist)
2. Systematic observation

III. Field experience assessment of student performance.
A. Quantitative measures
1. Graphic rating instruments (i.e., criterion checklist)
2. Systematic observation
3. Pupil performance
B. Non-quantitative measures
1. Anecdotal records
2. Questionnaires
3. Unobtrusive measures

IV. Program assessment. Collect value judgments regarding the adequacy of program components from:
A. Faculty
B. Students
C. Adjunct personnel

The information collected must be in a form compatible with the data management system.

I. Assessment of student achievement using selection and constructed response type tests.

A. When using selection type tests, student responses should be on mark-sense answer sheets. This allows the test to be scored immediately by machine or by use of a template, then entered directly into the data bank. It is possible to provide both immediate scoring and entry of the data into the bank in one operation with the appropriate electronic data processing equipment. (A brief sketch of this one-pass operation appears at the end of this section.)

B. When using constructed response type tests, it will be necessary to convert answers to a quantitative measure by using a criterion checklist while grading the test. (See Appendix G) The checklist scores may be provided as immediate feedback to the student and also be entered into the data bank for statistical treatment to provide data for revision.
II. Assessment of simulated experiences may be accomplished in two ways.

A. Assessment by discussion, where peers, the instructor, and the student discuss the performance. This type of interaction can be beneficial to both the student and his classmates, each of whom must go through the same process. This can be considered a practice assessment, providing an immediate feedback procedure; the data will not be used in grading or revision. This is a limited assessment.

B. Assessment by an expert using a validated criterion checklist. The expert must be certain that the data from that checklist will be compatible with the other data from selection type and construction type tests. This will allow the computer to treat data from both simulated and field experiences in the same way. Performances relative to the same objectives may be compared between the simulated experience and the field experience. The cause of any major deviation in performance should be identified; this might suggest a revision in the construction of simulated experiences.

III. Field experiences can be assessed in several ways.

A. Quantified measures made during observation of teaching performance. These measures would be related to stated criteria, provided as immediate feedback, and entered into the data bank. The success of the student in meeting the objectives of instruction in student teaching should be measured as learner outcomes. An item analysis of pupil performance would provide useful information by which to judge the competency of the student teacher and would also provide information indicating areas of weakness.

B. Non-quantitative measures may be made in narrative form, such as anecdotal records and unobtrusive observations. Questionnaires may also be used to acquire useful information. In both cases, the information must be reduced to a numerical count or score so that it may be entered into the computer bank. This information may be used to assess student competency.

IV. Program assessment will depend on objective data collected while assessing student performance and on the responses to questionnaires. The use of questionnaires will be a valuable source of information regarding the opinions of the various parties involved. This type of information will be of particular value to those responsible for program revision. A good questionnaire must ask relevant questions and provide for alternative choices. It should be constructed to provide a maximum amount of information regarding the instructional strategies, assessment procedures, relevance of objectives, faculty roles, adjunct personnel roles, student responsibilities, and available resources to all concerned.
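The one-pass scoring-and-banking operation noted under I.A above can be sketched as follows, in modern Python notation and for illustration only. The answer key, thresholds, and record layout are hypothetical; the MP, AP, and F labels match the performance categories used in the readouts described later.

    # Score a mark-sense answer sheet against a key, produce immediate
    # feedback, and enter the same record into the data bank in one pass.
    ANSWER_KEY = {1: "B", 2: "D", 3: "A", 4: "C"}   # assumed 4-item posttest

    def score_answer_sheet(student_id, responses, mastery=4, acceptable=3):
        missed = {q: ans for q, ans in responses.items() if ans != ANSWER_KEY[q]}
        score = len(ANSWER_KEY) - len(missed)
        return {
            "student": student_id,
            "score": score,
            "missed": missed,                     # item number -> incorrect response
            "status": ("MP" if score >= mastery   # mastery performance
                       else "AP" if score >= acceptable  # acceptable performance
                       else "F"),                 # failure; the student recycles
        }

    data_bank = []
    record = score_answer_sheet("S-1042", {1: "B", 2: "D", 3: "C", 4: "C"})
    print(record["status"], record["missed"])     # immediate feedback: AP {3: 'C'}
    data_bank.append(record)                      # entered into the data bank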
Data Treatment

The data must be treated in a differential manner to accommodate the varied needs of those involved in the professional training of teachers.

I. Data essential for student grading and advisement.
A. Data directed to the student
B. Data directed to the advisor
C. Data directed to the instructor
D. Data directed to adjunct personnel

II. Data necessary for program revision.
A. Data directed to the instructor
B. Data directed to the revisor
C. Data directed to the administrator

III. Data useful for students' placement records.

There are several conditions which must be met if the various data print-outs are to be of maximum value to the individual consumer. The information must be accurate, comprehensive, relevant, timely, and in a form easily understood by the consumer. The data should also be capable of providing information regarding the relevancy and efficiency of program components.

I. Data treatment needed for immediate feedback, grading, and advisement purposes. The specific types of feedback available as a result of data treatment are discussed in the data retrieval section.

A. The students need to know the successes and failures of their efforts to achieve mastery of each objective. To accommodate this need, a print-out for each student, indicating the results for each objective, is required. Criterion referenced testing will provide the assessment data.

B. The advisor needs a print-out on each of his advisees. He must be able to identify any objectives where the student's performance falls short of the predetermined mastery level. This will enable the advisor to prescribe a remediation exercise which will allow the student to reach the desired level of performance.

C. The instructor needs an item analysis print-out for each class. An item analysis will provide indications of course weaknesses. Further investigation would provide evidence as to where revisions are necessary to strengthen weak areas. The instructor should be sensitive to emerging patterns, rather than waiting for excessive evidence to accumulate before taking action. In addition to achievement scores, the instructor needs feedback indicating the attitudes of the students regarding the various aspects of the course of instruction, including an evaluation of the instructor's work.

D. Adjunct personnel (public school personnel) require the student teacher's performance records for grading purposes. They can also use pupil performance records for student-teacher advisement and for additional information in determining a final grade. Student performance can be scored on a graphic rating instrument and provided as immediate feedback. The rating can be transferred to mark-sense sheets, or the initial rating could be made on mark-sense answer sheets. (See Appendix G) This information would enter the data bank. The results of all observations made with a graphic rating instrument by the various observers would be collated by the computer and provided as a print-out. The results of opinion surveys made by student teachers should also be made available in print-out form. This will enable the adjunct personnel to make revisions in their classrooms that appear potentially beneficial. Any apprehensions regarding inappropriate administrative use of student evaluations should be eliminated at the beginning of the program. Otherwise, such differences could cause both college faculty and adjunct personnel to become antagonistic toward the program. The use of college computer services to provide useful and timely information could produce a positive attitude among adjunct personnel and strengthen the team relationship.

II. Data necessary for program revision.

A. The instructor would use both subjective and objective data to identify needed revisions. The instructor's role in revision was discussed in a previous section.

B. The revisor needs data treated across individuals. This will enable him to pinpoint trouble areas in the instruction, to make revision decisions, and to analyze interactive factors between units of instruction. He will also look at data collected over two or more terms (semesters, if appropriate) to determine if problem areas continue to exist. He will work as a consultant with the instructor involved in modifying any unit of instruction.
This not only provides valuable revision input, but also encourages the instructor to be an integral part of the decision-making process, which helps maintain a positive attitude toward the program.

C. The administrators should have access to all data, since their responsibilities encompass the entire program. Data regarding an individual instructor's performance should be used to encourage the instructor to improve the effectiveness of his instruction. He should be offered help in making revisions and be made to feel he is part of a team effort. The administration should take great care not to use the assessment data for promotion, salary, or tenure decisions, because of the danger of instructors seeking out ways of "beating the system" (i.e., teaching the test). The administrators should have available information relative to the cost of the various instructional activities, since they may be called upon to make revisions in the budget. All the data available regarding the effectiveness of the program will help the administration to justify additional costs, as compared to conventional programs.

III. Data for use in student placement records would indicate the mastery level achieved relative to each competency. That is, the level of attainment of each of the objectives would be related to their respective competencies. The print-out would consist of a list of the mastered competencies, and it would spare a prospective employer from having to process an enormous amount of data, which, in fact, would probably be ignored.

Data Storage

One of the unique advantages of competency based teacher education is the great quantity of data generated. These data enable student competency to be assessed with a high degree of accuracy. The extensive data that are generated provide greater insights into the total program operation, thus enabling the system to be monitored with considerable precision. To accommodate the need for accumulated data, the data must be stored so that information is accessible at any time. Data can be stored on punch cards, computer tapes, and computer discs.

Data Retrieval

Since all stored data must be retrievable in a variety of forms, it is necessary to code data prior to entry into the data bank. The code will make the desired data available for computer treatment. Computer programs must be adopted or developed to produce the various forms of information needed to effectively operate and monitor the assessment and revision system. Data that are typically included among the readouts are:

1. The identification of the student or students involved in the assessment.
2. The identification of the instructor(s).
3. The identification of the advisor, if other than the instructor.
4. The identification of the module or the unit of instruction.
5. The identification of each objective covered in the module.
6. The date the assessment took place.
7. The time spent by each student relative to each objective or each module.
8. The number of times a student has attempted each objective or module.
9. The average time spent by all students successfully completing the module.
10. The percentage of students completing the module on the first attempt, on the second attempt, etc.
11. The learning strategy used with each module during presentation.
12. The test items missed by each student on each objective, including the incorrect responses.
13. The collated results of test items missed by all students taking the test.
14. The student evaluation of each objective and the teaching strategy employed.
15. The grade received by each student for each objective.
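As an illustration only, the fifteen readout fields might be coded as a single record along the lines of the modern-notation sketch below. Every field name is hypothetical, and items 9, 10, and 13 are aggregates computed across many records at retrieval time rather than stored on any one record.

    # One possible coding of a single assessment record for the data bank;
    # each field maps to one numbered item in the list above.
    from dataclasses import dataclass, field
    from typing import List, Dict

    @dataclass
    class AssessmentRecord:
        student_id: str            # 1. student identification
        instructor_id: str         # 2. instructor(s)
        advisor_id: str            # 3. advisor, if other than the instructor
        module_id: str             # 4. module or unit of instruction
        objective_ids: List[str]   # 5. objectives covered in the module
        date: str                  # 6. date the assessment took place
        hours_spent: float         # 7. time spent by the student
        attempt: int               # 8. number of attempts so far
        strategy: str              # 11. learning strategy (Lec, Ind, Sel)
        items_missed: Dict[int, str] = field(default_factory=dict)
                                   # 12. item number -> incorrect response
        evaluation: int = 0        # 14. student's value rating of the objective
        grade: str = ""            # 15. grade received for the objective

    # Items 9, 10, and 13 (mean times, completion percentages, collated
    # misses) would be computed over many such records at retrieval time.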
Each of these data is discussed individually below in the elaboration of the five readouts. These readouts are:

1. Student Progress Report
2. Student Value Report
3. Effectiveness Report
4. Course Analysis
5. Efficiency Analysis

Student Progress Report (See Appendix H, Example 5.1)

This example shows a student's achievement relative to each objective in Module 3. This report is intended for the instructor and/or the advisor. The report shows that the student successfully completed objectives 1, 3, and 4. For this particular example, MP indicates the number of correct responses to test items necessary to achieve mastery performance, while AP indicates the number of correct responses necessary to achieve acceptable performance. The "score" indicates the actual achievement of the student. "Items missed" identifies which test questions were answered incorrectly. "Incorrect response" identifies the selection the student made, while "correct response" indicates the correct answer to the question. Knowing which incorrect answer the student selected helps indicate the particular type of error the student made, and therefore aids in helping him overcome the difficulty.

Student Value Report (See Appendix H, Example 5.2)

This value report indicates the student's perception of the value of the objectives and teaching strategies contained in the module. This report is intended for the instructor and/or the advisor. The report also indicates the number of times the student has taken the posttest (cycle 1). In the example, this student has taken the posttest once. The time spent on the module was 22 hours. The time spent on each objective is supplied, along with the mean time spent by previous students. This information is useful in making comparisons between the time spent on the module and its successful completion (e.g., Student A spent 2 hours; others spent 7 hours). Referring to the preceding example, if a student reported a low value for this particular objective and the data indicate the student spent too little time, then the low value might be considered an affective reaction. "Range of hours" indicates both the minimum and maximum hours spent studying for each objective; this represents accumulated data covering large numbers of students. The value judgments will be discussed with the student to gain insights into learning difficulties and to determine a pattern of preferences. This information should be helpful in advising students and in prescribing alternate learning experiences.

Effectiveness Report (See Appendix H, Example 5.3)

This report identifies the course, the module, the type of posttest, and the number of students in the class. This report is intended for the instructor and the revisor. The item analysis consists of two parts: first, the specific responses to each question, and second, a compilation of the incorrect responses for each question. An analysis of the specific responses to each item could generate a concern as to why a large number of students selected the same incorrect response (e.g., on question number 4, 33% of the class selected response number 3). The totals of incorrect responses provide immediate indications of problem areas (e.g., question number 4 and question number 12 should be suspect). The problem indicated with these two questions might be some ambiguity in the test item, inadequate instruction, or a poorly stated objective.
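The two-part item analysis just described reduces to a short computation. The sketch below is in modern Python notation and is illustrative only; the answer key, data layout, and the 30 percent "suspect" threshold are assumptions rather than the study's values.

    # Part 1: tally the specific responses chosen for each question.
    # Part 2: compile incorrect-response totals and flag suspect items.
    from collections import Counter

    ANSWER_KEY = {1: "B", 2: "D", 3: "A", 4: "C"}

    def item_analysis(all_responses, suspect_rate=0.30):
        """all_responses: one {question: chosen response} dict per student."""
        n = len(all_responses)
        choices = {q: Counter() for q in ANSWER_KEY}
        for sheet in all_responses:
            for q, ans in sheet.items():
                choices[q][ans] += 1
        wrong = {q: n - choices[q][ANSWER_KEY[q]] for q in ANSWER_KEY}
        suspects = [q for q, misses in wrong.items() if misses / n >= suspect_rate]
        return choices, wrong, suspects

    sheets = [{1: "B", 2: "D", 3: "A", 4: "A"},
              {1: "B", 2: "C", 3: "A", 4: "C"},
              {1: "B", 2: "D", 3: "A", 4: "A"}]
    choices, wrong, suspects = item_analysis(sheets)
    print(wrong, suspects)   # questions missed by 30% or more are flagged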
The additional information regarding domain level,2 learning type,3 and teaching strategy will prove useful in diagnosing learning difficulties and will provide relevant guidelines for restructuring learning activities for students who must recycle. The teaching strategies identified are: lecture (Lec), in the classroom setting; independent study (Ind), in an individualized learning facility using developed learning packages; and self study (Sel), where the student determines what materials, resources, and activities are relevant to the successful attainment of the objectives.

2 Benjamin S. Bloom, ed., Taxonomy of Educational Objectives, New York: David McKay Company, Inc., pp. 62-200.
3 Robert M. Gagne, The Conditions of Learning, New York: Holt, Rinehart and Winston, Inc., 1970, pp. 35-62.

Efficiency Analysis (See Appendix H, Example 5.4)

This analysis is intended for the revisor and the instructor. The analysis of the module in question provides information regarding the time it takes each student to complete each objective, the number of students who failed to complete each objective, and the number of attempts made by each student to complete each objective. This information is useful in determining time needs when sequencing modules within the time constraints of the institution, and also in placing a reasonable demand on the amount of time a student should spend on each objective. The number of times it is necessary for students to recycle is important if a large number must recycle or if a student must recycle more than once. A large number recycling could suggest needed improvements in the instructional content. A single student failing to complete the objective after two attempts might suggest the use of alternate teaching strategies. The achievement profile on each cycle will provide more detailed information regarding the proficiency levels achieved during each attempt. The failure of a large number of students to reach mastery performance (MP) might suggest further refinements in the revision of learning activities. In this example, "AP" means acceptable performance and "F" means failure.

Course Analysis (See Appendix H, Example 5.5)

The course analysis report provides information relative to the overall effectiveness of the course. This analysis is intended for the revisor and the instructor. It answers such questions as:

1. Did the majority of students complete each module in a reasonable length of time?
2. Did an unreasonably large number of students fail to complete any of the modules?
3. How do the students value the teaching strategies?

Deficiencies revealed by the answers to the above questions would indicate needed revisions, either in the content of modules or in the teaching strategies employed.
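The efficiency and course analyses amount to simple aggregations over the assessment records. The following is a hedged sketch in modern Python notation, with an invented record layout and purely illustrative values; only the MP/AP/F labels and the quantities computed (mean time, recycle counts, mastery rates) come from the readouts described above.

    # Per-module aggregates for the Efficiency Analysis / Course Analysis.
    def module_efficiency(records):
        """records: dicts with 'student', 'hours', 'attempt', and 'status'."""
        finished = [r for r in records if r["status"] in ("MP", "AP")]
        recyclers = {r["student"] for r in records if r["attempt"] > 1}
        mean_hours = (sum(r["hours"] for r in finished) / len(finished)
                      if finished else 0.0)
        mastery_rate = sum(r["status"] == "MP" for r in records) / len(records)
        return {
            "mean_hours_to_complete": round(mean_hours, 1),
            "students_recycling": len(recyclers),
            "mastery_rate": round(mastery_rate, 2),  # low rate -> revise module
        }

    records = [
        {"student": "S-1", "hours": 6.0, "attempt": 1, "status": "MP"},
        {"student": "S-2", "hours": 9.5, "attempt": 2, "status": "AP"},
        {"student": "S-3", "hours": 4.0, "attempt": 1, "status": "F"},
    ]
    print(module_efficiency(records))
    # {'mean_hours_to_complete': 7.8, 'students_recycling': 1, 'mastery_rate': 0.33}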
Data Distribution

The essential function of data distribution is to get the desired information to the appropriate consumer as quickly as needed. One difficulty that may arise is an institutional policy preventing the posting of student achievement print-outs. This will necessitate the inconvenience to the student of having to meet with the instructor to obtain test results. The time lag between taking the test and the first available appointment with the instructor could produce apathy on the part of the student. The use of a template or machine scoring equipment at the test site would, in most cases, overcome the problem by making results immediately available for the selection and performance type tests. The checklists for construction type tests (essays, projects, etc.) could be mailed to the student immediately after evaluation. The appropriate print-outs for the other consumers should be made available as soon as possible so that they can react expeditiously to identified problems. The type of print-out needed by each consumer was identified in the sections on data collection and treatment. The consumers identified were:

1. Student
2. Instructor
3. Advisor
4. Adjunct personnel
5. Revisor
6. Administrators

Data Assessment

Each consumer will evaluate the output data provided and decide if it meets his particular needs. If the output data are unsatisfactory in any way, then the consumer should immediately contact the revisor and clarify his precise needs. The revisor can then proceed to determine the corrections needed and where the revisions should be made within the data management system. Possible problems that could be identified are: incomplete data collection, which would suggest a change in the instrument used; improper treatment of the data, which would indicate a revision in treatment to accommodate consumer needs; or data not coded properly to extract the needed information. It is assumed that the various data are accurate, comprehensive, in appropriate form, and timely.

The data stored from the operational program, with the addition of data collected on graduates of the program, will be used for research purposes. Research results will be used as indicators of the success of the program and also to identify areas of the program where modifications are needed.

Inservice Training

Inservice training should be considered essential. At this time, the various responsibilities should be clarified. Since the professional roles of faculty members will change dramatically, inservice training is necessary to model the values and processes congruent with the demands of a competency based program. The responsibility of every individual directly involved in the assessment, revision, and data management systems should be explicitly defined. Then a training program which clarifies the functional aspects of the program, the need for a unified team effort, and the means by which the whole process will come to fruition should be thoroughly explored. A guide book, presenting a description of the procedures to be followed and the services available, would be provided as a reference for faculty, adjunct personnel, and administrators.

IMPLICATIONS OF FUTURE RESEARCH

The following three suggestions are recommended for further research.

One, in this study, data collection was limited to interviewing one individual per institution, although additional information was obtained by reading articles about each institution's program and through informal visits with other personnel at each institution. It would be useful to replicate this study with a cross-sectional representation from each teacher education program, interviewing administrators, students, faculty members, etc. The purpose would be to seek out the consistency of responses within each institution.

Two, to do a comparative analysis of the facilities, personnel involved, assessment procedures, and time allocation for the field experiences of several major operational competency based teacher education programs. The target of this study would be to examine the assessment data generated within the system, to determine the efficacy of the assessment system relative to the other variables analyzed.
The necessary precaution is to limit the study to one part of a program and also to limit the education level. This eliminates the confusion of overlapping information.

Three, to use some means like the Delphi technique to analyze the rationale for the various statistical treatments of assessment data among several operational competency based teacher education programs. As a result of an initial survey, the common treatments would be recorded and eliminated from further consideration. Any treatment that was identified but was not used by a given institution would be sent to that institution with a query as to why that treatment was not selected. This procedure would continue with all institutions. The purpose would be to gain greater insight into what the several institutions were looking for in their statistical treatments and the rationale for their decisions. A study of this nature could result in assessment data with relatively strong empirical evidence for recommendations.

BIBLIOGRAPHY

Andrews, Theodore E. Atlanta or Atlantis. A Publication of the Multi-State Consortium on Performance-Based Teacher Education, 1973.

Block, James H. "A Description and Comparison of Bloom's Learning for Mastery," Schools, Society, and Mastery Learning. New York: Holt, Rinehart and Winston, Inc., 1974.

Bloom, Benjamin S. "An Introduction to Mastery Learning Theory," Schools, Society, and Mastery Learning. New York: Holt, Rinehart and Winston, Inc., 1974.

Bloom, Benjamin S., ed. Taxonomy of Educational Objectives. New York: David McKay Company, Inc., 1956.

Bonar, John R. and Dick, Walter. "Development of a Computerized Management Information System for Teacher Education Programs," Educational Technology, February 1974.

Browder, Lesley H., Jr. and others. Developing an Educationally Accountable Program. Berkeley: McCutchan Publishing Corporation, 1973.

Burns, Richard W. "Behavioral Objectives for Competency-Based Education," Educational Technology, November 1972.

Cooper, James M. and Weber, Wilford A. "Chapter I, Vol. II, A Competency-Based Systems Approach to Teacher Education." (Typewritten)

Davis, Ivor K. Competency Based Learning: Technology, Management, and Design. New York: McGraw-Hill Book Co., 1973.

Elam, Stanley. Performance-Based Teacher Education: What is the State of the Art? A Publication of the American Association of Colleges for Teacher Education, December 1971.

Gagne, Robert M. The Conditions of Learning, second ed. New York: Holt, Rinehart and Winston, Inc., 1970.

Gentry, Cass and others. For Want of an Assessment System, CBTE Programs Are Lost. Published by the Multi-State Consortium on Performance-Based Teacher Education, Vol. 3, No. 3, September 1974.

Gentry, Castelle and Johnson, Charles. A Practical Management System for Performance-Based Teacher Education. A Publication of the American Association of Colleges for Teacher Education, February 1974.

Gronlund, Norman E. Determining Accountability for Classroom Instruction. New York: Macmillan Publishing Co., 1974.

Houston, W. Robert. Strategies and Resources for Developing a Competency-Based Teacher Education Program. A Joint Publication of the New York State Education Department and the Multi-State Consortium on Performance-Based Teacher Education, October 1972.

Hyman, Ronald T. Ways of Teaching, 2nd edition. Philadelphia: J. B. Lippincott Co., 1974.

Joyce, Bruce. "Assessment in Teacher Education: Notes from the Competency Orientation," Exploring Competency Based Education, W. Robert Houston (ed.). Berkeley: McCutchan Publishing Corporation, 1974.
Lembo, John M., ed. Learning and Teaching in Today's Schools. Columbus: Chas. E. Merrill Publishing Co., 1972.

Massanari, Karl. "CBTE's Potential for Improving Educational Personnel Development," Journal of Teacher Education, XXIV, Fall 1973.

McDonald, Frederick J. "The State of the Art in Performance Assessment of Teaching Competence." (Paper presented at the Multi-State Consortium session of the American Educational Research Association Convention, New Orleans, February 1973.)

Merwin, Jack C. Performance-Based Teacher Education: Some Measurement and Decision-Making Considerations. "Introductory Note." Washington, D.C.: American Association of Colleges for Teacher Education, June 1973.

Popham, W. James. Evaluating Instruction. Englewood Cliffs: Prentice-Hall, Inc., 1973.

Schalock, H. D. "From Commitment to Practice in Assessing the Outcomes of Teaching: A Case Study." Paper presented at the Multi-State Consortium on Performance Based Teacher Education, New Orleans, 1973.

Schalock, H. Del. "Notes on a Model of Assessment That Meets the Requirements of CBTE," Exploring Competency Based Education, W. Robert Houston (ed.). Berkeley: McCutchan Publishing Corporation, 1974.

Scriven, Michael. "If the Program is Competency Based, How Come the Evaluation is Costing So Much?" Competency Assessment, Research, and Evaluation. A Report of a National Conference, University of Houston, March 1974.

Shearron, Gilbert F. and Johnson, Charles E. "A CBTE Program in Action: University of Georgia," Journal of Teacher Education, XXIV, Fall 1973.

Sowards, G. Wesley. "One Year in Retrospect." Published by the Multi-State Consortium on Performance Based Teacher Education, November 1973.

Appendix A

Criteria for Making Judgments From Analysis of Selected Data Management Systems

The following list of characteristics will be used as criteria in the analysis of the several data management systems. The list was produced by the AACTE Committee on Performance-Based Teacher Education.1 The first five are considered essential by the Committee. The remaining characteristics are considered either implied or desirable. Those that are implied are assumed to exist as a consequence of the first five generic elements, which the Committee feels will qualify a program as PBTE or not. The desirable characteristics may or may not be found in a program, but are considered desirable components of a PBTE program. The numbers following the criteria refer to related questions of the survey instrument.

1 Stanley Elam, What is the State of the Art? (Washington: American Association of Colleges for Teacher Education), December 1971, pp. 6-11.

1. Teaching competencies to be demonstrated are role-derived, specified in behavioral terms, and made public. (20, 70)
2. Assessment criteria are competency-based, specify mastery levels, and are made public. (21, 22)
3. Assessment requires performance as prime evidence, but takes student knowledge into account. (42, 43, 45, 46, 47, 51)
4. The student's progress rate depends on demonstrated competency. (12, 23, 49)
5. The instructional program facilitates development and evaluation of specific competencies. (15, 25, 26, 42)
6. Instruction is individualized. (6, 8, 10, 11, 38, 55)
7. The learning experience of the individual is guided by feedback. Feedback also provides information necessary for program revision. Feedback should be timely, comprehensive, and accurate. (33, 34, 35, 50, 63, 64, 65, 66, 67, 68, 69, 79)
8. The program as a whole is systemic. Most systems are product oriented.
How accurately these products reflect the system's purpose is the critical measure by which we judge the system's operation. (57, 58, 59, 62)

9. The emphasis is on exit, not on entrance, requirements. (44, 52, 53)

10. Instruction is modularized. A module is a set of learning activities (with objectives, prerequisites, pre-assessment, instructional activities, post-assessment, and remediation) intended to facilitate the student's acquisition and demonstration of a particular competency. (13, 14, 16, 17, 18, 24)

11. The program is field-centered. (48, 54)

12. Because PBTE is systemic and because it depends upon feedback for the correction of error and for the improvement of efficiency, it is likely to have a research component; it is open and regenerative. (30, 32, 70, 71)

13. The impact of PBTE on teacher training programs should show:
    much greater program flexibility. (8, 10, 11, 17, 18, 38, 39)
    greater attention to specific skill training. (6, 25, 26, 43, 55, 56)
    greater congruity between objectives and the evidence admitted for evaluation purposes. (41, 45, 46, 47)
    better rationalization of faculty decisions and demands affecting students. (27, 28, 57, 58, 59, 68)
    development of new facilities and technology required by PBTE. (19, 61, 72, 73, 74, 75, 76, 77, 78)

¹Stanley Elam, Performance-Based Teacher Education: What Is the State of the Art? (Washington, D.C.: American Association of Colleges for Teacher Education, December 1971), pp. 6-11.

Appendix B
Cover Letter for Survey of Judges

FERRIS STATE COLLEGE
Big Rapids, Michigan 49307
616/796-9971
School of Education

I am planning a study of the data management systems used for assessing competency based teacher education programs. The purpose of this study is to examine the identified programs. I am soliciting your expertise and familiarity with competency based teacher education to help me identify appropriate institutions.

I would appreciate it if you would list the institutions you feel are most advanced in developing an electronic data processing system for management of assessment data for a competency based teacher education program. I am enclosing a response sheet and a stamped pre-addressed envelope. I would be very grateful for your help in identifying these institutions. If your institution has a competency based teacher education program, please include it if you feel it qualifies.

I would appreciate a response as soon as possible. Thank you for your cooperation.

Sincerely yours,

Walter G. Ritchie
School of Education
Ferris State College
Big Rapids, MI 49307

WGR/rer

Appendix C
Form for Survey of the Judges

The following institutions have been suggested in the preliminary survey as mature programs that may have a sophisticated system for managing their program data. Would you please react to these suggestions and, more importantly, add to the list below those competency based teacher education programs that you think are using an electronic data processing system for the management of assessment and revision data. Please check the appropriate boxes.

Name of institution        | Lack of     | Beginning stage | Moderately well | Well
                           | information | of development  | developed       | developed
---------------------------+-------------+-----------------+-----------------+----------
Florida State University   |             |                 |                 |
University of Georgia      |             |                 |                 |
University of Houston      |             |                 |                 |
University of Oregon       |             |                 |                 |
University of Toledo       |             |                 |                 |
Wayne State University     |             |                 |                 |

Appendix D
A Survey of Data Management Components for Assessment and Revision Systems Used in Operational Competency Based Teacher Education Programs

A. Environmental Conditions During Implementation Stage
1. Once you decided to implement an assessment component for your CBTE program, did you experience any of the following difficulties?
   Faculty resistance ___
   Faculty lacking essential skills ___
   Program causing deviation from normal institutional procedures ___
   Student attitude ___
   Necessity for developing own learning packages ___
   Developing effective data management system ___
   Developing an efficient/effective feedback system ___
   Other (specify) ___

2. Did lack of financial support affect the assessment component of your CBTE program? Yes ___ No ___
   If answer is "yes," how?
   Personnel resources ___
   Data management facilities ___
   Equipment ___
   Physical facilities ___
   Acquisition of software ___
   Other (specify) ___

3. Did you convert entire program to CBTE in one stage? Yes ___ No ___
   If no, what segments of program were first? Which segments gave most trouble?

4. Since your program has been operational for some time, what is the current faculty reaction to CBTE?
   Enthusiastic ___%  Middle of road ___%  Negative ___%
   Current student reaction?
   Enthusiastic ___%  Middle of road ___%  Negative ___%

B. Program Characteristics

5. Is your teacher education program totally competency based? Yes ___ No ___
   If no, which parts are not competency based?

6. Are students required to move sequentially through program? Yes ___ No ___
   If no, please explain.

7. Are you running a second teacher training program that is parallel to your CBTE program? Yes ___ No ___

8. Are there multiple entry points for students? Yes ___ No ___

9. What is the approximate number of students in your CBTE program? ___
   Number of faculty involved? ___

10. Are there any time constraints for student completion of a learning module? Yes ___ No ___
    Please explain.

11. Are there any time constraints for completion of an entire program? Yes ___ No ___
    Please explain.

12. How many modules must student be successful in? ___

13. What portion of your program is modular? 100% ___ 75% ___ 50% ___ 25% ___ 0% ___

14. Do you feel entire program should be modular? Yes ___ No ___ Why?

15. Are all units of instruction based on stated performance objectives, whether modular or not? Yes ___ No ___

16. Do all modules and/or learning packages have supporting non-print media? Yes ___ No ___
    If no, what is the approximate percent that do? ___

17. What percent of your modules have alternate delivery systems available?
    Approximately 100% ___ 75% ___ 50% ___ 25% ___ 0% ___

18. Do you plan to develop additional alternative delivery systems? Yes ___ No ___
    Do you feel they are necessary? Yes ___ No ___
    Do you feel that alternative learning experiences can be synthesized at the time of need (based on feedback data)? Yes ___ No ___

19. Does your program have specialized facilities? Yes ___ No ___
    Testing labs? ___ Individualized learning labs? ___ Other? (specify) ___

Given the following definitions:

20. Competencies - A description in performance terms of knowledge, skills, and behaviors that will enable a student to meet performance criteria for classroom teaching.
    How many competencies in your program? ___

21. Terminal Performance Objective - Objectives which state what the learner is to be able to do at the end of instruction. They specify the standard levels of performance in behavioral terms.
    Approximately how many terminal performance objectives in your program? ___

22. Enabling Objective - Objectives which describe those knowledges, skills, and attitudes which a learner must attain at some intermediate point if he is to acquire the terminal objective.
    Approximately how many enabling objectives in your program? ___

23. Module - A cluster of related objectives with its own pretest, posttest, and instructional strategies.
    How many modules are used in your program? ___
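The four definitions above describe a hierarchy: competencies are demonstrated through terminal performance objectives, each supported by enabling objectives, while a module clusters related objectives with its own pretest, posttest, and instructional strategies. The following minimal sketch of that hierarchy is an editorial addition in modern Python, not part of the original instrument; all type and field names are assumptions.

from dataclasses import dataclass, field

# Editorial sketch of the hierarchy implied by definitions 20-23.
# All field names are assumptions, not part of the original instrument.

@dataclass
class EnablingObjective:
    description: str          # intermediate knowledge, skill, or attitude

@dataclass
class TerminalPerformanceObjective:
    description: str          # what the learner can do at the end of instruction
    mastery_level: str        # standard level of performance, in behavioral terms
    enablers: list[EnablingObjective] = field(default_factory=list)

@dataclass
class Module:
    title: str
    objectives: list[TerminalPerformanceObjective]
    pretest: str              # identifier of the pretest instrument
    posttest: str             # identifier of the posttest instrument
    strategies: list[str]     # instructional strategies available

@dataclass
class Competency:
    description: str          # performance terms for classroom teaching
    terminal_objectives: list[TerminalPerformanceObjective] = field(default_factory=list)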
24. What percentage of your modules are: Group paced ___% Self-paced ___%

25. Are competencies ranked in terms of importance? Yes ___ No ___

26. Are objectives ranked in terms of importance? Yes ___ No ___

27. What are total credit hours required in professional teacher preparation program? Semester ___ Quarter ___ Other ___
    Hours in classroom instruction ___  Percent that is CB ___%
    Hours in simulated experience ___  Percent that is CB ___%
    Hours in field experience ___  Percent that is CB ___%

28. Do you have personnel who have been identified as directly responsible for program revision? Yes ___ No ___
    If yes, how are they assigned? What process do they go through? (time and resource constraints)

29. How are cooperating teachers selected?
    By superintendent ___ By principal ___ By volunteer ___ Other (specify) ___

30. Have you been conducting follow-up studies? Yes ___ No ___
    If no, do you plan to conduct follow-up studies? Yes ___ No ___

31. Have any students graduated from your CBTE program? Yes ___ No ___

32. Have you received any feedback from employers? Yes ___ No ___
    If yes, would you rate feedback as being: Good ___ Fair ___ Poor ___
    Do you feel this reaction differs from that of previous graduates? Yes ___ No ___
    If yes, please explain.

C. Assessment and Evaluation System

33. How is student advisement handled?
    Scheduled meetings: daily ___ weekly ___ monthly ___ other (specify) ___
    By request: by advisor ___ by instructor ___ by student ___ other (specify) ___

34. Is each student assigned a specific advisor to help him through the competency based program? Yes ___ No ___
    If no, how is advisement handled?

35. How many advisees are assigned to each advisor in the competency based program? ___

36. Do you test students prior to instruction to identify entry personality characteristics of students? Yes ___ No ___

37. Given that prerequisite skills are skills necessary to enter the CB program, do you test for them? Yes ___ No ___
    If yes: Once for entire program ___ For each learning package ___ For first learning package only ___

38. If you have pretests, which of the following are they used to determine:
    Entry into program ___ Point of entry into program ___ Exemption from program ___ All of above ___

39. Do you admit students who lack prerequisite skills? Yes ___ No ___

40. Is prerequisite test part of the pretest? Yes ___ No ___
    If no, when do you administer prerequisite test? Before pretest ___ After pretest ___

41. What percent of your competency based program's performance objectives are matched with specific strategies and criterion items? ___%

42. Of the total number of behavioral objectives in your program, what percentage of them must be met by the student at the predetermined criterion level? ___%

43. How is decision made for establishing criterion level?
    Teacher judgment ___% of time
    Faculty judgment ___% of time
    Objective analysis ___% of time
    Other (specify) ___% of time

44. Is criterion-referenced testing used with all units of instruction, whether modular or not? Yes ___ No ___

45. What type of tests are utilized during instructional process? (Including prerequisite, pretest, and posttest and excluding simulated and field experiences.)
    True and false ___% Multiple choice ___% Matching items ___% Fill in ___% Essay ___% Other (specify) ___%

46. What type of assessment data are utilized for simulated experiences?
    Observational: Graphic rating instrument ___ Anecdotal record ___ Systematic observations ___
    Pupil performance ___ Questionnaire ___ Other (specify) ___

47. Which of the following types of assessment data are collected during field experience?
    Observational: Graphic rating instrument ___ Anecdotal record ___ Systematic observations ___
    Pupil performance ___ Questionnaire ___ Other (specify) ___
48. Is choice of instrument based on specific behaviors? Yes ___ No ___
    Or is choice based on products of behaviors to be measured? Yes ___ No ___
    What percent of field experiences are assessed by: cooperating teacher ___% university evaluator ___% other (specify) ___%

49. If students fail a posttest, may they retake it? Yes ___ No ___
    If yes, how many times? ___
    If yes, do they retake: same test? ___% or equivalent test? ___%

50. How rapidly is posttest feedback data made available to students? Minutes ___ Hours ___ Days ___ Weeks ___
    Does data cover learning package achievement? Yes ___ No ___
    Competency achievement? Yes ___ No ___
    Objective achievement? Yes ___ No ___
    Credit received? Yes ___ No ___

51. Are there experiences which are not assessed? Yes ___ No ___
    If yes, why are they not assessed?

52. Are any portions of program graded on a Pass/Fail basis? All ___ Part ___ None ___
    If answer is Part, which parts are?

53. Do you utilize grades beyond minimal level of performance, such as: minimum level = C, above average = B, excellent = A? Yes ___ No ___
    Please explain:

54. For objectives designed to be learned in field setting, what percent have assessment instruments? ___%

55. Do alternative learning packages for a specific module have: the same objectives? ___ the same criterion measure? ___

56. How are test validity and reliability determined?

57. What data do you use for formative evaluation purposes?

58. What data do you use for summative evaluation purposes? Please list.

59. Do you collect data on student attitudes toward individual learning package objectives and strategies? Yes ___ No ___

60. Does administration use data for faculty evaluation? Yes ___ No ___
    If yes, can the faculty member determine what data are included in his promotion, tenure, and salary increase files? Yes ___ No ___

D. Data Management System

61. Is electronic data processing equipment used for record keeping? All ___ Part ___
    If part, do you plan to convert all to electronic data processing? If no, why not?

62. Did you use a system from another institution(s) as a model in developing your data management system? Yes ___ No ___
    If yes, whose system did you use?

63. How are records kept for each student? (see the sketch following question 71)
    Total achievement scores for each learning package ___
    For each competency ___
    Achievement on each objective ___
    All of above ___
    Other (specify) ___

64. Do you receive an item analysis printout? Yes ___ No ___

65. Is system designed to identify student having difficulty? Yes ___ No ___
    If yes, is this a separate printout (divorced from other data)? Yes ___ No ___
    What type of difficulties are identified by your system?

66. Do you collect data on individual student's personality characteristics? Yes ___ No ___
    If yes, how do you use the data?
    Does data include number of times student has attempted to posttest out of each:
    individual objective? Yes ___ No ___  learning package? Yes ___ No ___

67. Who receives feedback data?
    Advisor ___ Student ___ Instructor ___ Developer ___ Administration ___ Other (specify) ___
    Do all you have checked receive the same data? Yes ___ No ___
    If no, please explain differences.

68. What percentage of the time does your data management system get the appropriate data to the appropriate person, in the appropriate form, at the appropriate time? ___%
    Perceived weaknesses:
    Perceived strengths:

69. What changes would you make in your data management system if you were starting again?

70. Are you collecting data on the relevancy of the competencies and objectives of your program? Yes ___ No ___
    If yes, are you using: student judgment ___ faculty judgment ___ expert judgment ___ empirical evidence ___

71. What statistical treatment is given your management data, and for what purpose?
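Questions 63 through 66 imply a per-student record carrying achievement at the objective, learning package, and competency levels, together with counts of posttest attempts, from which a separate printout can flag students having difficulty (question 65). The following minimal sketch is an editorial addition in modern Python; the field names and the difficulty rule are assumptions, not drawn from any surveyed system.

# Editorial sketch of the per-student record implied by questions 63-66.
# Field names are assumptions, not drawn from any surveyed system.

student_record = {
    "student_id": "000-00-0000",
    "objectives": {
        # per-objective achievement and number of posttest attempts
        "OBJ-1.1": {"achieved": True, "posttest_attempts": 1},
        "OBJ-1.2": {"achieved": False, "posttest_attempts": 2},
    },
    "learning_packages": {
        # total achievement score and attempts per learning package
        "LP-1": {"score": 14, "posttest_attempts": 1},
    },
    "competencies": {
        # competency-level achievement rolled up from its objectives
        "COMP-1": {"achieved": False},
    },
}

# A separate printout flags students having difficulty (question 65),
# here crudely defined as two or more unsuccessful posttest attempts.
difficulties = [
    obj for obj, rec in student_record["objectives"].items()
    if not rec["achieved"] and rec["posttest_attempts"] >= 2
]
print("Flag for advisor:", difficulties)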
Information handling of data management system

72. Selection - How do you determine what data should be collected?

73. Collection - How are data collected?

74. Organization - How are the data treated to make them appropriate for the consumer?

75. Storage - How are the data stored?

76. Retrieval - How are data retrieved?

77. Distribution - How is the information distributed?

78. Assessment - How is the effect of the information assessed?

79. Does your data management system provide competency level profiles on each graduate for use by prospective employers? Yes ___ No ___
    Reason for action:

80. Could I have sample assessment data collection instruments and electronic data processing readouts for your program?

81. Do all field experiences have performance objectives which must be met? Yes ___ No ___

Appendix E
Data From Survey of Judges

The following institutions have been suggested in the preliminary survey as mature programs that may have a sophisticated system for managing their program data. Would you please react to these suggestions and, more importantly, add to the list below those competency based teacher education programs that you think are using an electronic data processing system for the management of assessment and revision data. Please check the appropriate boxes.

[The judges' check marks were recorded in the four rating columns of the form in Appendix C, weighted 0 (lack of information), 1 (beginning stage of development), 2 (moderately well developed), and 3 (well developed). The placement of individual check marks is only partly legible in this copy; the weighted totals are shown below.]

Name of institution        Weighted total
Florida State Univ.              13
Univ. of Georgia                  9
Univ. of Houston                 17
Univ. of Oregon                   0
Univ. of Toledo                  21
Wayne State Univ.                 9
Florida Int'l. Univ.              5
Univ. of Texas-Danton             1
Oregon College of Educ.          15
Texas A & M                       3
Brigham Young Univ.               1
Weber State                (illegible)
West Georgia College              1
Univ. of Texas-El Paso            1
Michigan State Univ.              2
Univ. of Wisconsin                3
Univ. of Kansas                   2
Univ. of Texas-Austin             2
Western Kentucky                  2
Univ. of Nebraska                 2
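The totals in the table above appear to be weighted sums of the judges' check marks, using the weights printed with the column headings. A minimal sketch of the apparent computation, with hypothetical check counts, is an editorial addition:

# Sketch of the apparent scoring rule for the survey of judges:
# each check mark is weighted by its column (0 = lack of information,
# 1 = beginning stage of development, 2 = moderately well developed,
# 3 = well developed). The check counts below are hypothetical.

weights = {"lack of information": 0, "beginning": 1, "moderate": 2, "well developed": 3}

# e.g., a hypothetical institution rated by seven judges:
checks = {"lack of information": 1, "beginning": 0, "moderate": 2, "well developed": 4}

total = sum(weights[col] * n for col, n in checks.items())
print(total)  # 0*1 + 1*0 + 2*2 + 3*4 = 16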
Appendix F
Interview of Institutions

[The interview matrices are printed in landscape orientation in the original and are not recoverable from this copy. The legible headings show twelve matrices, each comparing the responses of the five institutions interviewed (Florida State University, Oregon College of Education, the University of Houston, the University of Toledo, and Wayne State University) on a single concern:

Matrix 4.1   CONCERN: Qualifying as Competency Based Teacher Education
Matrix 4.2   CONCERN: Advisement
Matrix 4.3   CONCERN: Individualizing Instruction
Matrix 4.4   CONCERN: Criterion Referenced Testing
Matrix 4.5   CONCERN: Instrumentation
Matrix 4.6   CONCERN: Assessing for Mastery
Matrix 4.7   CONCERN: Content and Use of Feedback
Matrix 4.8   CONCERN: Evaluation/Research
Matrix 4.9   CONCERN: Electronic Data Processing System Characteristics
Matrix 4.10  CONCERN: Data Treatment
Matrix 4.11  CONCERN: Relevant Influences
Matrix 4.12  CONCERN: Information Handling of Data Management System]
Appendix G
Criterion Checklist and Mark-Sense Answer Sheet

CRITERION CHECKLIST

STUDENT: John Doe                              DATE: 12/12/75
COMPETENCY: Use of oxy-acetylene torch

                                                   Pass (A)   Fail (B)
1. Selects proper tip for thickness of metal
   to be welded.                                      ✓
2. Adjusts proper pressure on tank gauges.            ✓
3. Uses proper safety devices in preparing
   for welding.                                       ✓
4. Places work securely on bench and removes
   articles not needed.                               ✓
5. Bends end of welding rod to prevent injury
   to others.                                         ✓
6. Makes proper adjustment on torch to produce
   a neutral flame.                                   ✓
7. Tacks pieces of metal together in proper
   position prior to starting weld.                   ✓

(The scores, Pass and Fail, will be transferred to Columns A and B respectively on the mark-sense answer sheet; see next page. Note transfer of scores on items 1 through 7.)

Mark-Sense Answer Sheet

[The facsimile of the mark-sense answer sheet is not reproducible in this copy.]

Appendix H
Examples of Data Retrieved as Print-Outs

[The four computer print-outs reproduced in the original are not legible in this copy. Their legible headings and row labels indicate the following. Example 5.1, STUDENT PROGRESS REPORT: for each objective, correct responses, incorrect responses, items missed, and score, with a flag for failure to achieve acceptable performance. Example 5.2, STUDENT VALUE REPORT: student ratings, on a 1 to 5 scale (5 = high), of the value of each objective and of the teaching strategy, with time spent on each objective against the established mean time for completion. Example 5.3, response-selected data: for each posttest question, the distribution of responses chosen, indexed by question number, domain, level, and learning strategy, together with a frequency distribution of attempts per objective. Example 5.4, COURSE ANALYSIS: total students, mean number of days required for completion per module, percent not completing the module within time limits, and mean value ratings of objectives and teaching strategies.]
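Taken together, Appendices G and H illustrate the flow from criterion checklist to mark-sense sheet to printed report. The following minimal sketch of that flow is an editorial addition, not part of the original study; the record layouts are assumptions, and it is written in modern Python rather than for the batch equipment the surveyed institutions used.

# Editorial sketch: criterion checklist -> mark-sense columns -> report.
# Record layouts are assumptions; the original print-outs in Appendix H
# are not recoverable from this copy.

# Appendix G: Pass/Fail judgments on seven checklist items.
checklist = {1: "Pass", 2: "Pass", 3: "Pass", 4: "Pass",
             5: "Pass", 6: "Pass", 7: "Pass"}

# Pass marks column A, Fail marks column B on the mark-sense sheet.
mark_sense = {item: ("A" if result == "Pass" else "B")
              for item, result in checklist.items()}

# A crude analogue of the Student Progress Report (Example 5.1):
correct = sum(1 for col in mark_sense.values() if col == "A")
incorrect = len(mark_sense) - correct
print("STUDENT PROGRESS REPORT")
print(f"Correct responses:   {correct}")
print(f"Incorrect responses: {incorrect}")
print(f"Score: {correct / len(mark_sense):.0%}")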