THE DESIGN OF A COMPUTER-MANAGED INSTRUCTION SYSTEM FOR USE IN TECHNICAL EDUCATION PROGRAMS AT THE COMMUNITY COLLEGE LEVEL

Michigan State University

Robert Barkalow Tholl

1973

This is to certify that the thesis entitled THE DESIGN OF A COMPUTER-MANAGED INSTRUCTION SYSTEM FOR USE IN TECHNICAL EDUCATION PROGRAMS AT THE COMMUNITY COLLEGE LEVEL presented by Robert B. Tholl has been accepted towards fulfillment of the requirements for the Ph.D. degree in Education.

Major professor

ABSTRACT

THE DESIGN OF A COMPUTER-MANAGED INSTRUCTION SYSTEM FOR USE IN TECHNICAL EDUCATION PROGRAMS AT THE COMMUNITY COLLEGE LEVEL

By

Robert Barkalow Tholl

Statement of the Problem

The problem which was studied was concerned with individualized instruction, the effect of student characteristics, and the use of computers to assist in the management function of individualized instruction. Specifically, it was the purpose of the study to design, develop and test a computer-managed instruction (CMI) system for use in technical education programs at the community college level.

Procedures of the Study

The procedures involved the design of a computer-managed instruction system and the test of the system in a technical education course at the community college level. The system included a course strategy based upon individualized instruction, a management system operated via computer programs written in COBOL, and an instructional staff who conducted the course. The system was tested in the evening section of the course, Principles of Electronics, at Cerritos College, Norwalk, California, during the first seven weeks of the Spring Semester, 1973. Forty-three students participated in the system test in addition to the instructional staff who conducted the course.

The evaluation of the system was based upon student progress during the system test; student reaction to the system as indicated by responses on a student questionnaire; and staff reaction to the system as indicated by responses on an instructor critique, which was also used to provide information for improvement and modifications to the system.

Conclusions

1. Given the opportunity to work at their own rate, students will vary greatly in the time required to attain mastery of the assignment objectives when participating in a technical education electronics course conducted via a computer-managed instruction system.

2. Students who possess the belief that they learn better when not working closely with the teacher will complete more assignments in a technical education electronics course conducted via the computer-managed instruction system than those who believe that they learn better when working closely with the teacher.

3. Students who do not possess the belief that it is a teacher's responsibility to see that students learn the subject matter of a course will find greater satisfaction with a technical education electronics course conducted via the computer-managed instruction system than those who believe that it is a teacher's responsibility to see that students learn the subject matter of a course.
4. Completion or non-completion of pre-requisite electronics or mathematics courses, concurrent mathematics or college English will not affect the number of assignments completed by students in a technical education electronics course conducted via the computer-managed instruction system.

5. School and College Ability Test Verbal, Quantitative and Total scores will not predict success or satisfaction in a technical education electronics course conducted via the computer-managed instruction system.

6. Completion or non-completion of pre-requisite electronics or mathematics courses, concurrent mathematics, or college English will not affect the satisfaction of students participating in a technical education electronics course conducted via the computer-managed instruction system.

7. Conducting a technical education electronics course via the computer-managed instruction system will be acceptable to both students and staff and will not threaten students or staff participating in the course. However, this may be affected by a staff's interest and ability in utilizing an individualized approach in the course in which the system is used.

8. A management system operated via a computer will assist an instructional staff in the management function of an individualized technical education electronics course, and the system's computer-generated reports will not be threatening to either staff or students participating in the course.

Recommendations

1. The research should be replicated because the system test was conducted for only seven weeks in an effort to develop the system. The research should be conducted using the modified system which was developed as a result of the system test.

2. Research needs to be conducted comparing achievement in technical education courses conducted via the computer-managed instruction system to achievement in technical education courses conducted via the conventional approach.

3. The cost of conducting a technical education course via the computer-managed instruction system needs to be compared to the cost of conducting a technical education course via the conventional approach.

4. More extensive research needs to be conducted relative to the completion of course pre-requisites and achievement in technical education courses conducted via the computer-managed instruction system. These findings were based upon a test of only seven weeks and not an entire course.

5. More extensive research needs to be conducted relative to the capability of the School and College Ability Test scores (Verbal, Quantitative and Total) to predict achievement in technical education courses conducted via the computer-managed instruction system, since the findings were based upon a test of only seven weeks and not an entire course.

6. A different set of prediction variables based upon cognitive style should be used in future research to develop a profile of students finding success and difficulty in technical education courses conducted via the computer-managed instruction system, since the variables used in this study did not predict achievement or satisfaction with the system.

THE DESIGN OF A COMPUTER-MANAGED INSTRUCTION SYSTEM FOR USE IN TECHNICAL EDUCATION PROGRAMS AT THE COMMUNITY COLLEGE LEVEL

By

Robert Barkalow Tholl

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

College of Education

1973
ACKNOWLEDGMENTS

It would be difficult to enumerate the names of the many individuals who contributed towards making this research possible. However, I wish to single out and express my sincere appreciation to:

The members of my Doctoral Committee--Dr. Norman Bell, Dr. Peter Haines, Dr. William Herzog, and Dr. O. Donald Meaders for their direction and assistance throughout my program;

Mr. James Burnett of the Computer Science Department at Michigan State University for his assistance during the development and modification of the computer-managed instruction system computer programs;

Mr. Terry Gibson, Mr. Henry Kato, Mr. Don Nakatani and Miss Dal-Ling Yu who conducted the system test at Cerritos College and made recommendations for improvement of the system;

The Division of Vocational Education, California State Department of Education, who nominated me for an EPDA Fellowship in Vocational Education and made it possible for me to attend Michigan State University;

Dr. Peter Haines and Dr. Rex Ray, Co-Directors of the EPDA Program in Vocational Education at Michigan State University for their support and assistance during my program; and

My wife, Valerie, for her tolerance and continued support during this endeavor.

To Valerie, Stephanie and Kevin

TABLE OF CONTENTS

LIST OF TABLES . . . viii
LIST OF FIGURES . . . x

Chapter

1. THE PROBLEM AND DEFINITION OF TERMS . . . 1
   THE PROBLEM . . . 3
      Statement of the Problem . . . 6
      Delimitation of the Study . . . 7
      Assumptions Underlying the Development of the System . . . 7
   DEFINITIONS OF TERMS USED . . . 8
   ORGANIZATION OF THE STUDY . . . 12

2. REVIEW OF THE LITERATURE . . . 14
   INDIVIDUALIZED INSTRUCTION . . . 14
      Characteristics of Individualized Instruction . . . 16
      Individual Differences . . . 17
   USE OF THE COMPUTER IN INSTRUCTION . . . 19
      Computer-Assisted Instruction . . . 20
      Types of CAI Programs . . . 21
      Costs of CAI . . . 23
      Computer-Managed Instruction . . . 24
      Characteristics of Computer-Managed Instructional Systems . . . 24
      Costs of CMI . . . 26
   REVIEW OF RELATED CMI SYSTEMS . . . 27
      Current Status of Computer-Managed Instruction . . . 41
   SUMMARY . . . 42

3. PROCEDURES OF THE STUDY . . . 44
   THE COMPUTER-MANAGED INSTRUCTION SYSTEM WHICH WAS DESIGNED AND TESTED . . . 44
      Course Strategy . . . 45
      Management System . . . 50
      Instructional Manager Report Program . . . 50
      Laboratory Instructor Report Program . . . 52
      Individual Student Report Program . . . 53
      Operation of the Computer Programs . . . 55
      Instructional Staff . . . 57
      Instructional Manager . . . 58
      Laboratory Instructor . . . 59
      Clerk . . . 59
   TEST OF THE SYSTEM . . . 60
      Course Used for the Test of the System . . . 61
      Population Involved in the Test of the System . . . 63
      Instructional Staff Involved in the Test of the System . . . 65
   INSTRUMENTS USED IN THE EVALUATION OF THE SYSTEM . . . 68
      The Student Questionnaire . . . 68
      The Instructor Critique . . . 70
   SUMMARY . . . 71

4. RESULTS OF THE SYSTEM TEST AND MODIFICATION TO THE SYSTEM . . . 73
   FINDINGS CONCERNING THE RESEARCH QUESTIONS . . . 73
      Student Progress During the System Test . . . 73
      Option to bypass assignments . . . 78
      Characteristics of students related to achievement . . . 79
      Summary of student progress during the system test . . . 85
      Student Reaction to the System . . . 86
      System threat to students . . . 87
      System acceptability to students . . . 92
      Characteristics of students related to satisfaction with the system . . . 97
      Summary of student reaction to the system . . . 103
      Staff Reaction to the System . . . 104
      System threat to staff . . . 105
      System acceptability to staff . . . 106
      Summary of staff reaction to the system . . . 108
   RECOMMENDATIONS FOR MODIFICATION OF THE SYSTEM . . . 109
      Recommended Report Modifications . . . 110
      Recommended Operational Modifications . . . 110
   THE MODIFIED MANAGEMENT SYSTEM . . . 111
      Modified Instructional Manager Report Program . . . 111
      Modified Laboratory Instructor Report Program . . . 114
      Modified Individual Student Report Program . . . 117
   SUMMARY . . . 119

5. SUMMARY AND CONCLUSIONS . . . 120
   PROBLEM AND PROCEDURES . . . 121
      The System Which Was Designed . . . 122
      Test of the System . . . 123
      Findings Related to Student Progress During the System Test . . . 124
      Findings Related to Student Reaction to the System . . . 125
      Findings Related to Staff Reaction to the System . . . 127
   SUMMARY OF MAJOR FINDINGS . . . 128
   CONCLUSIONS . . . 130
   RECOMMENDATIONS . . . 131
   IMPLICATIONS OF THE STUDY . . . 133

BIBLIOGRAPHY . . . 136

APPENDICES

Appendix
A. Instructional Manager Report . . . 141
B. Laboratory Instructor Report . . . 146
C. Individual Student Report . . . 148
D. Instructor Critique . . . 149
E. Staff Responses to Question One of the Modified Instructional Manager Report . . . 152
F. Modified Laboratory Instructor Report . . . 157
G. Modified Individual Student Report . . . 161
H. Sample Instructional Assignment . . . 163
I. Student Information Sheet . . . 169
J. CMI System Student Questionnaire . . . 170
K. Instructor Critique . . . 171

LIST OF TABLES

Table
1. Matrix Summary of Computer-Managed Instruction Systems Reviewed . . . 40
2. Summary of Students' Backgrounds as Indicated by Responses on the Student Information Sheet . . . 64
3. Assignments Completed by Students During the System Test . . . 75
4. The Number of Assignments Completed at the End of Each Class Session During the System Test . . . 76
5. Range of Assignments Completed Each Class Session During the System Test . . . 77
6. Number of Assignments Completed Via Pre- and Post-tests During the System Test . . . 79
7. Comparison of Means and Standard Deviations of Characteristics of Students Completing a High, Intermediate and Low Number of Instructional Assignments During the System Test . . . 81
8. One-Way Analysis of Variance for the Characteristic, "I learn best by working closely and directly with the teacher." . . . 83
9. One-Way Analysis of Variance for the Characteristic, "Work experience in the electronics industry related to this course." . . . 84
10. Comparison of Means and Standard Deviations of Feelings Towards the System by Students Completing a High, Intermediate and Low Number of Instructional Assignments During the System Test . . . 88
11. Chi Square Analysis of Student Feelings Toward the System for the Total Population During the System Test . . . 89
12. Chi Square Analysis of Student Feelings Toward the System for the Total Population in Which Students Were and Were Not Identified . . . 91
13. Comparison of the Means and Standard Deviations of Acceptability of the System by Students Completing a High, Intermediate and Low Number of Instructional Assignments During the System Test . . . 93
14. Chi Square Analysis of Student Acceptability of the System for the Total Population During the System Test . . . 94
15. Chi Square Analysis of Student Acceptability of the System for the Total Population in Which Students Were and Were Not Identified . . . 96
16. Comparison of Means and Standard Deviations of Characteristics of Students Expressing a High, Intermediate, and Low Satisfaction with the CMI System During the Test . . . 99
17. One-Way Analysis of Variance for the Characteristic, "It is a teacher's responsibility to see that I learn the subject matter of a course." . . . 101
18. One-Way Analysis of Variance for the Characteristic, "I like to figure out how to do a thing by myself rather than be told." . . . 102

LIST OF FIGURES

Figure
1. Operation of the Computer-Managed Instruction System Which Was Designed for Use in Technical Education Programs at the Community College Level . . . 46
2. Course Strategy for a Technical Education Course at the Community College Level . . . 49
3. Flowchart of the Instructional Manager Report Program . . . 51
4. Flowchart of the Laboratory Instructor Report Program . . . 54
5. Flowchart of the Individual Student Report Program . . . 56
6. Card Deck for Operation of the CMI System Programs . . . 57
7. Flowchart of the Modified Instructional Manager Report Program . . . 113
8. Flowchart of the Modified Laboratory Instructor Report Program . . . 115
9. Flowchart of the Modified Individual Student Report Program . . . 118

Chapter 1

THE PROBLEM AND DEFINITION OF TERMS

The community college is a unique institution of higher learning. It was created initially to provide two years of university-parallel work in the home communities of its students, but since then, its role and function have been expanded to serve a variety of educational, social and community needs. A true community college prepares students for transfer to other institutions of higher learning, for immediate employment in technical and semiprofessional positions, and is active in retraining adults for new jobs created in an age of automation. In addition, it sponsors cultural activities for the community and serves as a coordinating educational agency for the entire community.1

Community college technical education programs, which are concerned with preparing students for immediate employment in technical and semiprofessional positions as well as retraining adults, provide both entry level and upgrading experiences for students. As a result, students enrolled in these programs possess a wide variety of experiences, backgrounds and needs. Staff in these programs feel the impact of the variety of student backgrounds and in many cases, find it difficult to satisfy the individual needs of students.

1B. Lamar Johnson, Islands of Innovation Expanding: Changes in the Community College (Beverly Hills: Glencoe Press, 1969).
But individual courses designed to satisfy the needs of each student would not constitute a realistic program. The cost of such a program would be prohibitive due to the vast number of courses that would be required. However, an approach in these courses whereby students complete instructional materials based upon an assessment of their prior knowledge could provide a solution to this problem.

An individualized approach could provide a valuable time savings for students who bring experience to a class by allowing them to bypass materials previously attained. And, those students without prior experience or those who are deficient in certain pre-requisite fundamentals would not be penalized, since their needs could be reflected in their assignments also.

However, one of the most difficult aspects of an individualized approach such as this is the management function. According to Esbensen, "Operationally, the central problem of individualized instruction is the problem of classroom management."2 As a result, some educators have expressed interest in the use of the computer to assist in the management function that is required in an individualized instruction program.

The characteristic of computers that makes them applicable to the management function of individualized instruction is that they process information. This includes: collecting, storing, retrieving, comparing and formatting data in addition to performing mathematical operations on the data. While they do only that which can be done manually, they are capable of performing these tasks more accurately, reliably and far more quickly than people.3 With these capabilities, the computer could facilitate the management function of an individualized instruction program in technical education and, as such, make it possible for the program to more closely satisfy the variety of needs of students enrolled.

2Thorwald Esbensen, Working with Individualized Instruction (Palo Alto: Fearon Publishers, 1968), p. 3.

3Paul E. Resta, Joel E. Strandberg and Edwin Hirsch, Strategies for Development of Computer-Based Instructional Management Systems (1971), p. 4. ED 46245.

THE PROBLEM

Several research projects have been conducted on the individualization of instruction through computer management at the elementary school and university levels. For example, the Instructional Management System, which was developed by Systems Development Corporation, is typical of the programs developed for use at the elementary school level. The system was designed to help teachers monitor the progress of students and make various decisions relating to the instructional program of students.4 Data was presented to the teacher daily indicating student achievement and progress. Summary reports were presented weekly and at various other times when desired by the teacher.

4Cleone L. Geddes and Beverly Y. Kooi, "An Instructional Management System for Classroom Teachers," The Elementary School Journal, LXIX (April, 1969), 337-45.
Other systems at the elementary school level include the Individually Prescribed Instruction/Management and Information System, which was developed by the University of Pittsburgh Learning Research and Development Center,5 and the Conwell System, which was developed by the American Institute for Research.6 These systems attempted to prescribe instruction based upon learner characteristics and achievement, and they also assisted the teacher in monitoring the progress of students and provided information for making instructional decisions.

Florida State University has conducted research at the university level through the development of a model for training elementary school teachers,7 the development of an undergraduate course in health occupations,8 and the development of a graduate course in education, "Techniques of Programmed Instruction."9 The several systems developed at Florida State University used on-line terminals for diagnostic and prescriptive functions and in that sense were quite different from those previously described, which used paper and pencil tests coupled with staff consultations. However, the systems assisted the teacher in monitoring student progress and provided information for making various instructional decisions.

5Harvey J. Brudner, "Computer Managed Instruction," Science, CLXII (November 29, 1968), 970-76.

6John A. Connolly, A Computer-Based Instructional Management System: The Conwell Approach (1970). ED 49620.

7Edward N. Hobson, "Empirical Development of a Computer-Managed Instruction System for the Florida State University Model for the Preparation of Elementary Teachers" (unpublished Doctoral dissertation, Florida State University, 1970).

8R. Michael Lawler, An Investigation of Selected Instructional Strategies in an Undergraduate Computer-Managed Instruction Course, Technical Report (Florida State University, CAI Center, 1971). ED 54652.

9Nancy K. Hagerty, Development and Implementation of a Computer-Managed Instruction System in Graduate Training, Technical Report No. 11 (Florida State University, CAI Center, 1970). ED 42354.

A Program for Learning in Accordance with Needs, which was developed by the American Institute for Research and the Westinghouse Learning Corporation, included both elementary and secondary school programs.10 It is being tested in grades one through twelve in selected school districts across the United States at this time. In this system, the computer performs the clerical functions of test scoring and reporting in addition to summarizing student status. Like the previous systems, Project PLAN provides information for making instructional decisions.

Far less information is available on research into computer-managed instruction at the community college level. The Computer Assisted Management for Personalized Instruction (CAMPI) system, which was developed at Oakland Community College, is one of the few systems which has been developed at this level.11 Its primary function was to match the cognitive style of students with the style of various instructional approaches and assign students to that approach which most nearly fit their cognitive style. While performing this function, it also assisted the teacher in monitoring student progress and provided information for making various instructional decisions.

10John C. Flanagan, "Functional Evaluation for the Seventies," Phi Delta Kappan, XLIX (September, 1967), 27-30.

11Joseph E. Hill, The Educational Sciences (Bloomfield Hills: Oakland Community College, 1971).
Thus, given the lack of computer-managed instruction systems available at the community college level and the need for an individualized instruction approach in technical education, it appears that there is a need for additional research into computer-managed instruction at this level.

Statement of the Problem

The problem which was studied was concerned with individualized instruction, the effect of student characteristics, and the use of computers to assist in the management function of individualized instruction. Specifically, it was the purpose of this study to design, develop and test a computer-managed instruction (CMI) system for use in technical education programs at the community college level. The questions to be answered by the test of the system were:

1. Given the opportunity to work at their own rate, how much will students vary the time required for achieving mastery of the learning materials?
   a. To what degree will students take advantage of the option to bypass materials previously mastered?
   b. What are the characteristics of students related to early and late attainment of assignment objectives?

2. What will be the student reaction to the system as indicated by the following questions?
   a. To what degree will the system be threatening to the students?
   b. To what degree will the system be acceptable to students as a means of acquiring course information?
   c. What are the characteristics of students expressing satisfaction and dissatisfaction with the system?

3. What will be the instructional staff reaction to the system as indicated by the following questions?
   a. To what degree will the system be threatening to the staff?
   b. To what degree will the system be acceptable to the staff as a means of conducting the course?

Delimitation of the Study

The study was delimited to include only the design, development and initial test of the computer-managed instruction system and excluded a comparison of this method of instruction to other methods of instruction, which should follow upon refinement of the system developed in this study. The exclusion was based upon the belief that accurate results of a comparison can only be achieved when a refined version of the computer-managed instruction system is used.

Assumptions Underlying the Development of the System

The assumptions underlying the development of the computer-managed instruction system were:

1. Individualized instruction, as defined in the Definitions of Terms Used, is a viable learning method and was therefore not a part of the system to be evaluated.

2. Mastery learning strategy, as defined in the Definitions of Terms Used, is a viable learning method and was therefore not a part of the system to be evaluated.

3. The instructional materials presently in use at Cerritos College and which were prescribed by the instructional assignments are viable and therefore were not a part of the system to be evaluated.

DEFINITIONS OF TERMS USED

Conventional Instruction. The instructional approach currently in use at Cerritos College which includes three hours of lecture and six hours of laboratory per week. All students attend both the lecture and laboratory sessions and complete identical assignments at a predescribed time. Students receive five semester credits for the course--three credits for the lecture and two credits for the laboratory.

Individualized Instruction. An instructional approach which allows for the individual background, experiences, and needs of the students to be met.
The course material is developed around learning assignments based upon performance objectives. The objectives are evaluated in terms of mastery as measured by the assignment pre- and post-tests.

Instructional Assignments. Activity guide used by the student which contains the description and directions for a particular lesson. It contains a list of the behavioral objectives related to the lesson and the possible means for accomplishing the objectives. The objectives are evaluated by a criterion referenced pre- and post-test.

Mastery Learning Strategy. An instructional technique whereby the amount of time available for learning is made appropriate to the characteristics and needs of each student, thus establishing a fixed level of achievement as it relates to acquiring the instructional material.12 The level established for this material is 80 percent correct responses on assignment pre-tests or 80 percent correct responses for each objective measured by the assignment post-tests. Eighty percent was established for mastery because the material was used throughout the fundamentals courses, providing the opportunity for students to acquire the remaining information.

Instructional Manager. Instructor responsible for making instructional decisions relating to student progress through the course. Communicates with students and staff regarding student progress and schedules lecture, instructional assignments and remedial materials for students as necessary.

Laboratory Instructor. Instructor responsible for conducting laboratory sessions. Works closely with the Instructional Manager while assisting students working on individual assignments. Assigns scores for the operational phase of assignment pre- and post-tests.

12Benjamin S. Bloom, Learning for Mastery (1968), p. 3. ED 53419.

Clerk. Scores pre- and post-tests and builds data decks from student scores for entry into the management program. Operates the system and distributes appropriate reports to staff and students.

Data Deck. Assortment of computer cards arranged in predescribed order for data entry into the computer system. Cards contain student scores from pre- and post-tests in addition to special messages for students and instructional staff.

Batch Processing Mode of Computer Operation. Programs and data are punched on computer cards. The cards are stacked and fed into the computer in batches. Operation of each program is governed by control cards or programs called up by the computer. Output is generally printed by a high-speed printer at the end of the program.

On-Line-Remote-Terminal Mode of Computer Operation. The computer is operated from a remote point via a special terminal (typewriter or cathode ray tube) connected to the computer by either direct lines or voice-grade telephone lines. The user has the capability of interacting with the computer via the terminal, with output being displayed on the user's terminal.

Technical Education. "Technical education is concerned with that body of knowledge organized in a planned sequence of classroom and laboratory experiences, usually at the postsecondary level, to prepare pupils for a cluster of job opportunities in a specialized field of technology. The program of instruction normally includes the study of the underlying sciences and supporting mathematics inherent in a technology, as well as methods, skills, materials, and processes commonly used and services performed in the technology."13

Post-Hoc Analysis. A statistical technique which is designed to isolate the population means which contribute to the significant difference indicated by a one-way analysis of variance.14

One-Way Analysis of Variance. A statistical technique whereby the differences among three or more population means may be tested in which only one type of treatment is involved.15

Cognitive Style. A student's cognitive style is the way he seeks meaning or knowing. It is determined by the way he takes note of his total surroundings--how he seeks meaning, how he becomes informed.16

13U.S., Department of Health, Education, and Welfare, Office of Education, Vocational Education and Occupations (Washington: Government Printing Office, 1969).

14Linda Glendening, "Posthoc: A Fortran IV Program for Generating Confidence Intervals Using Either Tukey or Scheffe Multiple Comparison Procedures," Occasional Paper No. 20 (East Lansing: Michigan State University, Office of Research Consultation, 1973), p. 1.

15Lincoln L. Chao, Statistics: Methods and Analysis (New York: McGraw-Hill Book Company, Inc., 1969), p. 301.

16Hill, op. cit., p. 1.

Student File. A file stored on the computer high-speed magnetic disk which contains the instructional assignment history of students. Information in the file is called up for processing by the various system programs.

Remedial Materials File. A file stored on the computer high-speed magnetic disk which contains a listing of materials relating to the instructional assignments. Information in the file is called up for processing by the various system programs.

Lecture File. A file stored on the computer high-speed magnetic disk which contains a listing of the lectures related to the instructional assignments. Information in the file is called up for processing by the various system programs.
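Taken together, the mastery criterion and the data deck defined above amount to a small decision procedure: scores punched on a card are compared against the 80 percent level, and the result routes the student past the assignment, on to the next assignment, or into remedial materials. The following minimal sketch states that rule in Python for brevity (the system's actual programs were written in COBOL and ran in the batch mode just described); the card layout, the field widths, and the use of an overall average on pre-tests are illustrative assumptions, not the dissertation's specification.

    # Illustrative only: apply the 80 percent mastery rule to the scores
    # punched on one card of the data deck. Hypothetical card layout:
    # columns 1-9 student number, column 10 test type (P = pre-test,
    # T = post-test), then one three-column percent score per objective.
    def parse_card(card):
        scores = [int(card[i:i + 3]) for i in range(10, len(card.rstrip()), 3)]
        return card[:9].strip(), card[9], scores

    def mastery_decision(test_type, scores):
        if test_type == "P":
            # 80 percent on the pre-test lets the student bypass the
            # assignment (an overall average is assumed for illustration).
            return "bypass assignment" if sum(scores) / len(scores) >= 80 else "begin assignment"
        if all(s >= 80 for s in scores):   # 80 percent on every objective
            return "mastery -- schedule next assignment"
        return "schedule remedial materials"

    student, test_type, scores = parse_card("A12345678T085092078")
    print(student, mastery_decision(test_type, scores))
    # -> A12345678 schedule remedial materials (objective scores 85, 92, 78)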
ORGANIZATION OF THE STUDY

Chapter 2 presents a review of the literature which was pertinent to the problem being studied. The review of the literature includes a discussion of individualized instruction, the computer and instruction, CAI and CMI, and a review of selected computer-managed instruction systems.

Chapter 3 presents a description of the system and the procedures involved in the study. Included is a description of the course strategy, the management system and the instructional staff required to operate the system, a description of the system test, and the instruments used in the evaluation of the system.

Chapter 4 presents the results of the system test. Included is a discussion of student progress during the test, student and staff reaction to the system, and the modifications to the system resulting from the system test.

Chapter 5 presents a summary, conclusions and recommendations for further study.

Chapter 2

REVIEW OF THE LITERATURE

A review of the literature was conducted to provide information to be used as a basis for: (1) the development of a course strategy, and (2) the development of a management system to be used in the management function that is required in an individualized instruction program in technical education at the community college level. This included a review of literature pertaining to individualized instruction, the effect of student characteristics on instruction, the use of computers in instruction, and a review of various computer-managed instruction systems which have been developed.

INDIVIDUALIZED INSTRUCTION

The term individualized education and instruction has been described by many.
For instance, Cooley and Glaser state that individualized education is essentially the adaptation of instructional practices to individual requirements.1 Heathers indicates that it refers to any procedure used to insure that the individual student receives instruction that is specifically appropriate for him.2 Melching defines individualized instruction as a program of study that is fitted to the needs and characteristics of the learner at a given point in time, and in which the learner has a role in selecting what he studies, as well as how fast he proceeds.3 Hagerty characterized it as a method of allowing for differences in students and helping each student perform to the best of his ability.4 Mager stated that an instructional system was individualized when the characteristics of each student played a major role in the selection of objectives, materials, procedures, and time.5 Thus, it appears from several major papers that individualized instruction is an instructional approach which allows for the individual background, experience, and needs of the students to be met, and such a definition was used in this study.

1William W. Cooley and Robert Glaser, "The Computer and Individualized Instruction," Science, CLXVI (October, 1969), 574.

2Glen Heathers, "A Definition of Individualized Education" (paper presented at the 1971 AERA Annual Meeting as part of Symposium C81: Teacher Behavior in Individualized Education, p. 1). ED 50012.

3William H. Melching, Behavioral Objectives and Individualized Instruction (April, 1969).

4Nancy K. Hagerty, Development and Implementation of a Computer Managed Instruction System in Graduate Training, CAI Center Tech. Report No. 11 (Tallahassee: Florida State University, June, 1970), p. 2. ED 42354.

5Robert F. Mager, "Foreword," Working with Individualized Instruction, Thorwald Esbensen (Palo Alto: Fearon Publishers, 1968), p. vii.

Characteristics of Individualized Instruction

Gibbons categorizes individualized instruction by active, responsive and permissive approaches.6 In the active category, the teacher is in control of the classroom, making the majority of the instructional decisions. At the other end of the continuum is the permissive category, in which the teacher allows or encourages students to make most of their own curriculum decisions. In the responsive category, the teacher and student plan the program together.

In active forms of individualization, instruction may be modified for each student but it may include constraints on individual freedom typical of the conventional classroom. In permissive individualized programs, many of the constraints can be removed if allowed by the administrative structure. In the responsive mode, combinations of the active and permissive are found. Although the student may have a choice of materials or a choice of what he does with the materials, the environment and what it contains are usually carefully chosen to stimulate the student in certain ways.7

Bjorkquist believes that one of the characteristics of individualized instruction in vocational education is behavioral objectives which specify what each learner is to learn. According to him, " . . . Each unit of instruction indicates who is to do the learning, the observable behavior expected after instruction, the

6Maurice Gibbons, Individualized Instruction, A Descriptive Analysis (New York: Columbia University, Teachers College Press, 1971), p. 2.

7Ibid.
17 conditions under which it will occur, and the minimum level of acceptable performance."8 Heathers suggests that another characteristic of individualized instruction is mastery learning. According to him, mastery should mean that the student can retain and use what he has studied, that he doesn't have to keep doing work over, that his success increases his motivation to learn, and that he develops an enhanced self-concept based upon his success.9 Bloom believes that mastery will result when the amount of time can be made appropriate to the characteristics and needs of each student. He states that given sufficient time and the appropriate type of help, 95 percent of the students can learn a subject to a high level of mastery.10 Thus it would appear that to have mastery, an individualized instruction approach is necessary. There are several approaches to individualized instruction with the more active approach characterized by behavioral objectives and mastery learning. Such an approach to individualized instruction was used in the development of the course strategy found in the com- puter managed instruction system. Individual Differences Gagne suggests that learning is an individual process. "The design of efficient conditions for learning demands that learning be 8David Bjorkquist, What Vocational Education Teachers Should Know About Individualized Instruction (1971), p. 5. . 9Heathers, 0p. cit., p. 3. 10 . Benjamen S. Bloom, Learn1n for Mastery, 1968, p. 1. ED 53419. _—£ 18 conceived as an individual matter. We must find out what the learner is like, what he needs to know to begin the learning process, and what he needs to do to carry it out."11 Cronbach believes that a person learns more easily from one method than another and that the best method differs from person to person. He recommends that treatments be designed to fit students with particular aptitude patterns.12 Briggs indicates that the most efficient situation would be where a student's cognitive style was matched with the task style demanded and where his study strategies were compatible with both.13 Nunny and Hill have worked with cognitive styles at the com- munity college level and have found that by using cognitive style mapping, they can determine which students will probably learn well from a particular method of presentation and which students will probably have difficulty.14 In a social science course, using cog- nitive style mapping, 93 percent of the students in the course received grades of A, B, or C. Prior to the inclusion of cognitive style mapping, only 64 percent received A, B, or C.15 11Robert M. Gagne, Learning Theory, Educational Media and Individualized Instructiop, , p. 14. ED 39752: 12 Lee J. Cronbach, "The Two Disciplines of Scientific Psycho- logy" (Address of the President at the Sixty-Fifth Annual Convention of the American Psychological Association, September 2, 1957, New York), p. 12. 3 . . 1 J. B. B1ggs, Informat1on and Human Learning_(Glenview: Scott, Foresman and Company,1968), p. 99. 4 1 Derek N. Nunney and Joseph E. Hill, "Personalized Educa- tional Programs," Audiovsiual Instruction, XVII, No. 2 (February, 1972), 10. lslbid., p. 12. 19 Coop and Sigel summarize the concern for individual differ- ences in instruction when they say " . . . there is tremendous variability in the way in which individuals process information and hence in the manner in which they approach individualized instruc- tional programs. 
As such, teachers should be sensitive to the differ- ent individual approaches adOpted by the various students in their classes."16 Thus, it appears that individual differences play a major role in the way people learn. As such, if the cognitive style of students could be matched to the Style of the presentation of the material, an increase in learning would result. USE OF THE COMPUTER IN INSTRUCTION According to Schure, the greatest future impact of the com- puter upon occupationally related education may exist in its capacity to manage total systems. With the advent of large-scale computeriza- tion, the transformations taking place in the field of occupational education include: 1. Computer usage designed to yield effective evaluation, accountability, and management information to consti- tuencies concerned with the education system; 2. Guidance related to occupational education; 3. Use of the computer: a. to "teach" and/or problem solve (Computer-assisted instruction); b. to manage and/or monitor varied subject matter content (computer monitored instruction); 16Richard H. C00p and Irving E. Sigel, "Cognitive Style: Implications for Learning and Instruction," Psychology in the Schools, VIII, No. 2 (1971), 160. 20 4. Occupational training to prepare personnel to operate, to pregram and/or to maintain equipment in the computer field. 7 The following discussion relates to the third item described by Schure, the use of the computer to teach and to manage or monitor instruction. Computer-Assisted Instruction Computer—assisted instruction, CAI, is the most familiar term related to the instructional application of the computer. Cooley and Glaser characterize CAI as a method whereby the computer is used by the student as a means of instruction.18 Instruction is presented to the student via a terminal connected to a computer. According to Kemeny, CAI pragrams are most effective for rote learning and mechan- ical drill.19 However, others have described various techniques and strategies included in CA1 which go far beyond rote learning and mechanical drill. These range from the tutorial mode where the teacher is the director and the computer is the tool of the teacher, to the problem solving mode where the student is the director and the computer becomes the tool of the student. 17Alexander Schure, "An Accountability and Evaluation Design for Occupational Education," Educational Technology! XI, 3 (March, 1971), 26. 18William C. Cooley and Robert Glaser, "An Information and Management System for Individually Prescribed Instruction," Computer- Assisted Instruction, ed. Richard C. Atkinson and H. A. Wilson (Neinoik: AcademiE Press, Inc., 1969), p. 96. 19John G. Kemeny, Man and the Computer (New York: Charles Scribner's Sons, 1972), p. 75. 21 _Iypes of CAI Programs. Suppes,20 Bell,21 Luskin,22 and Roberts and Zirkel23 have classified the types or strategies involved in computer-assisted instruction. Included in these are drill and practice, tutorial, dialog, inquiry, problem-solving and Simulation strategies. 
In drill and practice, brief lessons are administered to the student at a computer terminal as fellow-up exercises to the teacher's presentation.24 The point of the computer system at this level is to provide a simple, straight-ferward, and individualized approach.25 In the tutorial mode, the computer presents a question, the student answers and the computer responds to the student with appro- priate branching.26 Here, the aim is to take over the main reSponsi- bility for developing skill in the use of a given concept.27 20Patrick Suppes, "On Using Computers to Individualize Instruction," The Computer in American Education, ed. D. D. Bushnel and D. W. Allen (New York: John—Wiley and Sons, Inc., 1967), pp. 13-18. 21Norman T. Bell, "Strategies for Computer Applications in Instruction" (East Lansing: Learning Systems Institute, Michigan State University), p. 1. 22Bernard J. Luskin, "Computer Assisted Instruction: A Dream and a Reality," The Improvement of Junior Colle e Instruction, ed. B. Lamar Johnson (March, 1970, pp. 83-845. ED 40757. 23Arthur D. Roberts and Perry Allen Zirkel, "Computer Appli- cations to Instruction," Journal of Secondary Education, XLVI, 3 (March, 1971), 100-102. 24mid. ZSSuppeS , loc. cit. 26Bell, loc. cit. 27Suppes, loc. cit. 22 Dialog systems are designed to allow a dialog between the stu- dent and the computer program. This technique requires a complex program that will recognize and answer freely constructed questions.28 In the inquiry mode, the computer diSplays a problem which is accom- panied by a listing of available assistance. The student solves the problem or seeks assistance. The computer then analyzes the student response and branches appropriately.29 The problem-solving mode involves the use of the computer to compute.30 The student has a problem to be solved, and reduces it to mathematical terms. The computer stores the algorithm. The student then questions the computer, the computer responds and the process continues until the student receives the desired solution.31 In the simulation mode, the computer displays an experiment with options for varying the parameters. The student specifies the parameters. The computer then responds with the results of the simulation.32 Several advantages have been suggested for computer-assisted instruction. For instance, in the tutorial mode, the student may spend as much time as he needs on a particular assignment. The computer will not get "angry" about mistakes and will present prob- lems to the student as long as he desires. In the areas of problem- solving and simulation, students are able to experience and work with problems not possible in other forms in fields such as medicine and 2 30 28Suppes, loc. cit. 9Bell, loc. cit. Luskin, loc. cit. 31 3 Bell, loc. cit. 2Ibid. 23 business administration. However, there is one overwhelming dis- advantage related to computer-assisted instruction--that of cost. Costs of CA1. Costs are a recurring problem in almost all aspects of CA1. Costs per terminal hour are relatively high even with the simplest systems available, and they increase with the addi- tion of sophisticated audio and graphic display components.33 Brightman estimated the Coast Community College District budget for CAI to be equal to thirty-one full-time instructors. 
Therefore, the seventy CAI terminals supported by the district's computer system would have to each offer 247 contact hours of instruction per week to equal the same number of student-teacher contact hours provided by the staff.34 Bell and Moon indicated that a class of thirty students using a computer for twenty minutes per day would cost $800.00 per month, and based upon commercial rates of ten dollars per student hour, would be $2000.00 per month.35 As stated by Becker, "The cost of paper is still cheaper than the use of a terminal and computer for multi-guessing questions."36 33R. G. Atkinson and H. A. Wilson, "Computer-Assisted Instruction," Science, CLXII (1969), 73. 34Richard W. Brightman, Coast's Practicioners Review Computer- Assisted Instruction (May, 1972), pp. 15-16. ED 60547} 35Norman T. Bell and Robert B. Moon, "Teacher Controlled Computer-Assisted Instruction" (East Lansing: Michigan State University), p. 8. 36James W. Becker, "Whatever Happened to the Computer?" Journal of Educational Data Processing! VIII, 1 (1971), 4. 24 Computer-Managed Instruction The main impetus for computer-managed instruction comes from two sources: the provision for an inexpensive alternative to CAI and the capability of CMI systems to integrate with more conventional classroom methods.37 In this type of system, the computer is used to help the teacher administer and guide the instructional process, 38 The com- but relies on separate hardware and learning materials. puter is used as a tool in the management of the information needed by teachers in planning a more effective individualized curriculum. The teacher and computer cooperate to administer and guide the in- structional process.39 According to Johnston, the teacher does the teaching, while the computer helps in prescribing learning materials and activities for the individual Student. The computer monitors, records and reports on the students' educational activities.40 Characteristics of Computer-Managed Instructional Systems. Computer-managed instruction is generally characterized by activities 37John F. Vinsonhaler, Computers in Education and Social Science, Part III Computer Applications-Information Analysis systems for Instruction, Administration and Research (East Lansing: Informa- tion Systems Laboratory, Michigan State University, 1972), p. 93. 38Harvey J. Brudner, "Computer-Managed Instruction," Science, CLXII (1968), 971. 39Kentner V. Fritz and Lynn B. Levy, Introduction to Computer- Managed Instruction and the Automated Instructional Managgment System, Report V01. 5,'No. 8 (University of Wisconsin Counseling enter, June, 1972), pp. 6-7. ED 69757. 40Robert J. Johnston, "Computers in Education: An IBM Viewpoint," Educational Technolpgy, XI, 12 (December, 1971), 17. 25 found in individualized instruction, with the computer assisting in the management function of these activities. According to Baker, the four major functions performed by computers in existing systems include test scoring, diagnosing, prescribing, and reporting.41 At the beginning of each unit of instruction, 3 pre-test is taken by a student to determine his status relative to the instruc- tional objectives. On the basis of the pre-test results, students are assigned specific learning tasks. The assignments can be made by computer prOgrams which implement decision rules relating test scores to learning tasks, or the computer can generate the test results in a printed report. 
The report can then be used as one of several information sources by the teacher to prescribe learning tasks for the student.42 When the student has completed the assigned tasks, he takes a post-test covering the unit of instruction. On the basis of post- test results, he may be subjected to remedial materials or he may be advanced to the next unit of instruction. Again, the decisions may be made by the computer implementing decision rules or by the teacher reviewing a computer-generated report. Following the administration of each test, the teacher generally receives a computer-printed report. The report typically lists each pupil, the unit of instruc- tion he is working on, the objectives of that unit, and the percentiles he achieved for each objective covered by the test. Using the report, 41Frank B. Baker, "Computer Based Instructional Management Systems: A First Look," Review of Educational Research, XLI (February, 1971), S3. 42Ibid. the teacher can study the pattern of accomplishment of each student and identify those who warrant additional attention.43 Becker believes there are several characteristics of computer- managed instruction that make it feasible in today's educational setting. According to him, interactive terminals are not necessary as in computer-assisted instruction. However, if on-line reports need to be generated, a teletypewriter can handle that function. The Strategy is not threatening to the teacher since it is a supporting aid designed to help the teacher individualize instruction. And, he believes that it is cost-feasible to use such a system. He indicates that banks and businesses have been conducting similar activities for some time and the use of computers in education is a logical extension.44 Costs of CMI. As indicated by Becker, computer-managed instruction appears to be cost-feasible. In a test of the Teaching Information Processing System, developed at the University of Wiscon- sin, it was found that the cost required to run the system and generate the various system reports was about one dollar per student per semester.45 In the test of a computer-managed instruction system at Florida State University, which used on-line terminals for testing purposes, it was found that the per student cost of running the system 43Ibid. 44Becker, op. cit., p. 7. 4SAllen C. Kelley, "An Experiment with TIPS: A Computer . Aided Instructional System for Undergraduate Education," The American Economic Review, LVIII, 2 (May, 1968), 455. 27 was $29.90 for the course.46 The great difference between the costs of these systems might be attributed to the fact that the Teaching Information Processing System did not require on-line terminals for testing purposes whereas the Florida State University System did. AS such, the cost attributed to computer time could have been much less for the Wisconsin project than the Florida State University project. Thus, it appears that computer-managed instruction is characterized by the activities found in individualized instruction with the computer assisting in the management function of these activities. Generally, interactive terminals are not necessary and the strategy should not be threatening to teachers. And, it appears that it can be cost-feasible in an educational setting. REVIEW OF RELATED CMI SYSTEMS The following computer-managed instructional systems have been developed or are under development at this time. 
They represent research efforts funded by foundations and grants, and in many cases have been undertaken by research staffs of universities and private organizations. In a few cases, doctoral dissertations have been conducted in the field, but these have been related to existing research projects and subsequently have been a part of the overall research project. These systems will be compared after each has been 46Paul D. Gallagher, An Investigation of Instructional Treatments and Learner CharacteriStics 1n a Computer4ManagedInstruc- tion Cburse, Technical Report No. 12 (Florida Statéiuni¥ersiiy, CAI Center, July, 1970), p. 50. ED 42360. 28 reviewed on the following variables: (1) requirement for interactive terminals, (2) threat to staff, (3) cost, (4) modifications required to adapt the system to classroom use, and (5) adaptability of the system to technical education programs at the community college level. The Instructional Management System (IMS) was developed by Systems Development Corporation to provide a framework for making decisions on classroom management. The system was designed to help teachers monitor the pregress of students and make decisions related to the pace of instruction, the grouping of students, sequence of lessons, and the individualization of instruction.47 IMS has been operated in the reading programs in nine differ- ent first-grade classrooms in three Los Angeles City Schools. Stu- dents in the classes ranged from disadvantaged to highly advantaged in socio-economic status. In all three schools, each first grade reading class was divided into three groups: a fast group, a middle group, and a slow group. The students in one group worked on tests at individual carrels during their regular follow-up portion of the reading period while the teacher worked with the other groups. The tests were collected at the end of each day and taken to the Systems DeveIOpment Corporation by courier where they were computer-processed. A report was printed containing related data and was in the teacher's mailbox when the teacher arrived at school the next morning.48 47Cleone L. Geddes and Beverly Y. Kooi, "An Instructional Management System for Classroom Teachers," The Elementary School Journal, LXIX, 7 (April, 1969), 337. 48Ibid. 29 In evaluating IMS, emphasis was placed on the teacher's class— room behavior and how that behavior was affected by the availability of the IMS data. It was found that teachers apparently paced their instruction and provided remedial exercises more on the basis of group membership than on the basis of the students' scores on Specific tests.49 Teachers regarded as most useful the regular progress reports that gave results on a single test and indicated that group averages were more useful than the individual reports. However, they discovered that they could "snow" complaining parents by Showing them a computer print-out indicating a child's score on a particular objective.so Thus, it appears that one of the problems that must be overcome with a system that alters the present method of instruction is how the teaching staff will use the new system. The Teaching Information Processing System (TIPS) was developed at the University of Wisconsin for use in undergraduate economics courses.51 The system involved periodic collection of information from students regarding either their understanding of course materials or their reaction to various aSpects of course presentation. 
The information was processed and summarized in three separate reports: one for distribution to each student, a second for each section leader, and a third for the professor.52

49 John E. Coulson, "Computer Assisted Instructional Management for Teachers," AV Communication Review, XIX, 2 (Summer, 1971), 162. 50 Ibid., pp. 166-67. 51 Kelley, loc. cit. 52 Ibid., pp. 448-49.

The student report contained a summary of his performance. On the basis of his performance, assignments for the forthcoming period were indicated. The assignments--some of which were required while others were optional--varied considerably from student to student. Additional information was generated on the basis of past as well as current performance. The teaching assistant report contained information to help the assistant appraise the performance of students in his individual section. This included actual responses on the survey, statistical data relative to the responses, and lists of students recommended for appointment or tutorials. The professor's report was similar to that received by the teaching assistant. However, the information available was for all enrolled students rather than a particular section.53

A pilot project using TIPS in a Principles of Economics course was implemented during the Fall Semester 1966. Eighty-six percent of the class felt that TIPS helped them learn the course material either "much better" or "somewhat better," with only 12 percent indicating that TIPS "did not help." Fifty-one percent felt "There were no particular harmful effects of the system," but 24 percent suggested that "TIPS did not accurately reflect knowledge of the material, and thus gave a false sense of confidence."54

53 Ibid. 54 Ibid., pp. 449-51.

Thus it appears that students will accept a computer-generated report system to give them information about their individual progress. However, caution should be maintained to insure that the system accurately reflects knowledge of the material and does not give a false sense of confidence to the students.

The Computer Assisted Management of Personalized Instruction (CAMPI) system was developed by Oakland Community College to assist the teacher in the management of the Personalized Education Program (PEP) conducted at the college.55 In the Personalized Education Program, the cognitive style of each student is mapped by the college to provide a picture of the various ways in which he searches for meaning. His style is determined by the way he takes note of his total surroundings--how he seeks meaning, how he becomes informed. The map provides a complete picture of the diverse ways in which the student acquires meaning. It identifies his strengths and weaknesses and is used as the basis upon which to build an individualized program for him. In addition, various modes of presentation, available in prescription centers at the college, have been mapped. Included are programmed texts, video tape recordings and films, youth-tutor-youth sessions, library books and microfilm packages, seminars to enrich, seminars to rap and independent study.56

CAMPI is used to match the cognitive style of the student with that mode of presentation found to be "best" for the student. After matching, CAMPI administers an entry-level test, scores it, and reports results to the instructor, the prescription centers, and the student.57

55 Joseph E. Hill, The Educational Sciences (Bloomfield Hills: Oakland Community College, 1971), p. 7. 56 Ibid., pp. 1-2. 57 Ibid., p. 7.
The system's primary function is to assist the staff in the operation and management of the Personalized Education Program. However, the Personalized Education Program does not require its use; it is, therefore, an optional facility of the program.

The Individually Prescribed Instruction/Management and Information System (IPI/MIS) was developed by the University of Pittsburgh's Learning Research and Development Center in cooperation with the Baldwin-Whitehall School District in suburban Pittsburgh.58 In this system, the student takes a placement test upon entering the school, which places him in a particular unit. Following placement in a unit, he takes the unit pre-test, which attempts to diagnose his profile within the unit. As the student works through the lesson, he takes the curriculum-embedded tests, which assess whether mastery has been attained on the objective. When all objectives have been mastered, the unit post-test is taken. If 85 percent is attained on this test, the student begins the next unit; if not, he is reassigned to an appropriate objective in the unit until he masters it.59

58 Brudner, op. cit., p. 972. 59 Cooley and Glaser, op. cit., p. 106.

Sass developed a management program which was intended to promote self-direction by assisting the student in the selection of learning activities. The program was tested in the science curriculum of the IPI project at Oakleaf School, Pittsburgh. Grade 1 science was scheduled at Oakleaf School for the prototype test of the system. Students were signed onto the system and presented lesson options by the terminal. These options were read to the student for his selection. The selected option was marked on a list of printed options and handed to the student. The student then presented his choice to the teacher for approval, after which the aide secured the proper materials.60

In evaluating the system, Sass noted that, "Although the classroom personnel were consistently helpful and cooperative through the field test, their enthusiasm for the use of the computer system in the school was clearly lacking after they saw it in operation." The teacher aide, commenting on the use of the management program by the students to get their prescriptions, felt that the teacher could do the job faster. The teacher, when asked if he would use the system the following year if it were available, replied that he ". . . probably would not."61

Contrary to what Sass experienced, Cooley suggests that teachers and teacher aides in an IPI system can and will use computer assistance in recordkeeping and student monitoring functions. The need in the classroom, according to him, is for quiet devices which allow the teacher to interact quickly and easily with the information required, at the time a child is at her side.62

60 Richard E. Sass, "The Development of a Computer-Based Management Program for Use with Adaptive Instructional Systems" (unpublished doctoral dissertation, University of Pittsburgh, 1970), p. 5. 61 Ibid., p. 74. 62 William W. Cooley, "Computer Assistance for Individualized Education," Journal of Educational Data Processing, VII, 1 (February, 1970), 21.
A Program for Learning in Accordance with Needs (PLAN) was developed by the American Institute for Research and the Westinghouse Learning Corporation in cooperation with fourteen school districts throughout the United States.63 The system contains five components which include a comprehensive set of educational objectives, a teaching-learning unit, a set of tests, guidance and individual planning for each student, and evaluation and systems aspects.64

63 John C. Flanagan, "Functional Evaluation for the Seventies," Phi Delta Kappan, XLIX (September, 1967), 30. 64 John C. Flanagan, "Program for Learning in Accordance with Needs," Psychology in the Schools, VI, 2 (April, 1969), 134-35.

The objectives are stated at a level such that they require about two hours of student study to achieve. They are grouped together with approximately five per module, which the typical student can achieve in about two weeks. The teaching-learning units represent a guide to the student as to how he might best proceed to achieve the objectives of the module. Several test items are constructed for each module objective and the results of the related tests are presented to the student to indicate whether or not he has mastered the module objectives. Test results are computer scored and recorded separately with respect to each of the objectives included in the module test. On the basis of the scores on the objectives, the teacher is told that the student: (a) has mastered the module, (b) needs to review the instructional materials related to some of the objectives before proceeding to the next module, (c) should study specific objectives and have his mastery certified by the teacher before going on, or (d) should re-study the module using the same or a different teaching-learning unit and take another module test covering the material.65

Thus, in the PLAN system, the computer functions much as a clerk, scoring and reporting specific test results, summarizing and organizing student status with respect to modules, and handling other housekeeping chores related to teacher actions and the requisitioning of supplies.66

A graduate level course in education, "Techniques of Programmed Instruction," was conducted at Florida State University via CMI. The course was developed and implemented by Hagerty,67 with investigation into instructional treatments by Gallagher.68 Students proceeded through the course in an individualized, self-paced manner and reported to the CAI Center when they were ready to be evaluated on a task which they had completed. The students scheduled a terminal at the CAI Center in order to take the quiz on the objectives for that task.69

65 John C. Flanagan, "The Role of the Computer in PLAN," Journal of Educational Data Processing, VII, 1 (February, 1970), 10-11. 66 Ibid., p. 12. 67 Nancy K. Hagerty, Development and Implementation of a Computer Managed Instructional System in Graduate Training, Technical Report No. 11 (Florida State University, CAI Center, 1970). ED 42354. 68 Paul D. Gallagher, An Investigation of Instructional Treatments and Learner Characteristics in a Computer-Managed Instruction Course, Technical Report No. 12 (Florida State University, CAI Center, 1970). ED 42360. 69 Walter Dick and Paul Gallagher, Systems Concepts and Computer Managed Instruction: An Implementation and Validation Study, Technical Memo (Florida State University, CAI Center, 1971). ED 50543.
Instructional treatments investigated by Gallagher included: (1) Sequence Assigned/Instructor Evaluated Products, (2) Sequence Assigned/Computer Evaluated Products, (3) Self Sequence/Instructor Evaluated Products, and (4) Self Sequence/Computer Evaluated Products. As indicated by the final product score, there was no significant difference between treatments. However, the Self Sequence/Instructor Evaluated group had the highest mean score and the lowest standard deviation, while the Sequence Assigned/Computer Evaluated group had the lowest mean score and highest standard deviation.70

70 Ibid., p. 37.

Continuing the study of treatments related to computer-managed instruction at Florida State University, Lawler conducted a study pertaining to treatments in an undergraduate Health Education course presented by CMI.71 The treatments investigated included: (1) Remedial Prescription/Forced Mastery, (2) Remedial Prescription/Forced Progression, (3) Forced Progression, and (4) Classroom Instruction. The CMI groups were designed to focus on the differential treatments of students failing to meet criterion. The Remedial Prescription/Forced Mastery group did significantly better than the Forced Progression group, but the mean differences between (a) Remedial Prescription/Forced Mastery vs. Remedial Prescription/Forced Progression and (b) Remedial Prescription/Forced Progression vs. Forced Progression were not significant.72 However, Lawler indicated that there was no apparent hierarchical structure among the fourteen modules used in the course and this may have affected the results. An interesting result, as indicated by anecdotal comments from students, was that several students in the Forced Mastery group felt the requirement of repeating failed module post-tests was more punitive than helpful.73

71 R. Michael Lawler, An Investigation of Selected Instructional Strategies in an Undergraduate Computer-Managed Instruction Course, Technical Report (Florida State University, CAI Center, April, 1971). ED 54652. 72 Ibid., p. 27. 73 Ibid., p. 69.

Hobson conducted a feasibility study of the implementation of a computerized management system as a subcomponent of Florida State University's proposed model for training elementary teachers.74 Activities of the field study included: (1) the selection of tasks and resource options, (2) the teaching of concepts to local school children, (3) the taking of quizzes both manually and via a teletype terminal, and (4) the entering of data associated with all these activities at an on-line teletype terminal.75

On the basis of the analysis of the field study, it was proposed that the computerized management system for the elementary model be capable of carrying out six specific functions. These functions included: (1) computer-managed instruction, (2) computer-assisted instruction, (3) counseling and scheduling, (4) weekly reporting, (5) data formatting and long-range planning research, and (6) cost analysis.76

74 Edward N. Hobson, "Empirical Development of a Computer-Managed Instruction System for the Florida State University Model for the Preparation of Elementary Teachers" (unpublished doctoral dissertation, Florida State University, 1970). 75 Ibid., p. ii. 76 Ibid., p. 75.

A CMI system was developed by American Institute for Research personnel at Conwell Middle Magnet School in Philadelphia.
The system consisted of three basic components: (1) a set of instruments and techniques for assessing student needs, (2) a bank of curriculum packets related to assessed needs, and (3) a computer-based system for relating individual needs to available curriculum options.77

The students enrolled in the program were eighth grade students who were scheduled in the Center for Individually Prescribed Learning Activities for a portion of their instructional time. In the Center, the students received instructions concerning their learning packet assignments by means of a remote terminal. All packets were stored in the Center and individual progress was guided and monitored by a Center supervisor and an instructional aide.

In an effort to match individual learning characteristics with learning packets, four student variables were measured. These included: reading level, aptitude level, learning style, and cognitive style. The focal point of the system was the rapid selection of the appropriate packet for a particular student at a given point in time. The packet choice was based on a matching algorithm consisting of six procedural steps. The computer first searched for a perfect match between the student's measured learning characteristics and the coded packet characteristics for an instructional unit. The computer compared the student's reading level, aptitude level, learning style and cognitive style to the coded dimensions of the available packet variations. If no selection (match) was made, the computer program moved to the first alternative and searched for an imperfect match. There was a hierarchy placed on the alternatives provided by an imperfect match and the computer ran through the selection process until the closest match was located.78 Thus, the computer was used for monitoring the learning prescriptions of individual students and for attempting to match individual learning characteristics to learning modules.

77 John A. Connolly, A Computer-Based Instructional Management System: The Conwell Approach (1970), p. 3. ED 49620. 78 Ibid., pp. 6-13.
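The six procedural steps themselves are not enumerated in the source. As an illustration only, a hierarchical match of this general kind can be sketched in COBOL (the language used for the system designed in this study); every data name, code value, and the weighting scheme below are hypothetical, the weights being merely one way to realize a priority ordering among imperfect matches:

    IDENTIFICATION DIVISION.
    PROGRAM-ID. PACKET-MATCH.
    DATA DIVISION.
    WORKING-STORAGE SECTION.
    *> Hypothetical student profile codes on the four measured variables.
    01  STUDENT-PROFILE.
        05  ST-READING     PIC 9 VALUE 3.
        05  ST-APTITUDE    PIC 9 VALUE 2.
        05  ST-LEARN-STYLE PIC 9 VALUE 1.
        05  ST-COG-STYLE   PIC 9 VALUE 4.
    *> Three hypothetical packet variations, coded on the same four
    *> dimensions (reading, aptitude, learning style, cognitive style).
    01  PACKET-DATA.
        05  FILLER PIC X(4) VALUE "3214".
        05  FILLER PIC X(4) VALUE "3211".
        05  FILLER PIC X(4) VALUE "2214".
    01  PACKET-TABLE REDEFINES PACKET-DATA.
        05  PACKET OCCURS 3 TIMES.
            10  PK-READING     PIC 9.
            10  PK-APTITUDE    PIC 9.
            10  PK-LEARN-STYLE PIC 9.
            10  PK-COG-STYLE   PIC 9.
    01  PX          PIC 9  VALUE 0.
    01  SCORE-WS    PIC 99 VALUE 0.
    01  BEST-SCORE  PIC 99 VALUE 0.
    01  BEST-PACKET PIC 9  VALUE 0.
    PROCEDURE DIVISION.
    MAIN-PARA.
        PERFORM SCORE-PACKET VARYING PX FROM 1 BY 1 UNTIL PX > 3
        IF BEST-SCORE = 15
            DISPLAY "PERFECT MATCH: PACKET " BEST-PACKET
        ELSE
            DISPLAY "CLOSEST MATCH: PACKET " BEST-PACKET
        END-IF
        STOP RUN.
    SCORE-PACKET.
    *> The weights make a match on a higher-priority dimension outweigh
    *> any combination of lower-priority matches, so the best score
    *> respects the hierarchy among imperfect matches.
        MOVE 0 TO SCORE-WS
        IF PK-READING (PX) = ST-READING ADD 8 TO SCORE-WS END-IF
        IF PK-APTITUDE (PX) = ST-APTITUDE ADD 4 TO SCORE-WS END-IF
        IF PK-LEARN-STYLE (PX) = ST-LEARN-STYLE
            ADD 2 TO SCORE-WS END-IF
        IF PK-COG-STYLE (PX) = ST-COG-STYLE ADD 1 TO SCORE-WS END-IF
        IF SCORE-WS > BEST-SCORE
            MOVE SCORE-WS TO BEST-SCORE
            MOVE PX TO BEST-PACKET
        END-IF.

A score of 15 can occur only when all four dimensions match, so the first branch corresponds to the perfect match the Conwell program searched for first.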
Table 1 represents a matrix summary of the CMI systems reviewed. The systems are summarized according to the characteristics which make CMI possible according to Becker.79 Also included is an evaluation of the extent to which existing curriculum would have to be modified to be used with the various systems.

[Table 1. Matrix Summary of Computer-Managed Instruction Systems Reviewed. The body of the table could not be recovered from the scan.]

Three observations are apparent when reviewing Table 1. The first observation is that only half of the ten systems have reported test results, with only one reporting teacher reaction to the system. According to Becker, one of the factors supporting CMI is that it is not threatening to teachers.80 The lack of documentation in the area indicates that further research is needed. The second observation is that only two of the systems reported the cost of operating the system. Since a characteristic of CMI is that it is purported to be cost-feasible, this area needs documentation.81 A third observation is that six of the ten systems would require extensive curriculum modification to place them into a format that would allow the systems to be used in an existing classroom. This ranges from writing CAI-type interactive diagnostic tests for use with interactive terminals, to developing learning packets based upon learner characteristics. This indicates that few of the systems reviewed lend themselves to classroom use in their existing format. As a result, possibly four of the systems lend themselves to use in technical education programs at the community college. However, three of these require on-line terminals, further reducing the number of systems to one. Several characteristics of this system, the Teaching Information Processing System, are found in the system which was designed and are expanded upon to facilitate a course in technical education at the community college level.

79 Page 26. 80 Ibid. 81 Ibid.

Current Status of Computer-Managed Instruction

The systems reviewed reveal that there are many variations of computer-managed instruction. These range in operation from batch processing systems at remote locations (such as the Instructional Management System developed by Systems Development Corporation) to on-line systems (such as CAMPI at Oakland Community College and the systems developed at Florida State University). The reporting functions of the systems have been equally varied, ranging from high-speed line printer generated reports to on-line terminal generated reports. All of the systems reviewed have dealt with individualized
instruction in various forms, using the computer to assist in the management function related to instructional activities. The Conwell system and the Personalized Education Program in which CAMPI is used were concerned with the cognitive style of students and used the computer to attempt to prescribe instructional approaches based upon the cognitive style of students. These systems went beyond the normal monitoring and prescriptive functions typical of computer-managed instruction. Therefore, it is difficult to describe a typical computer-managed instruction system which has been developed. They vary greatly in their scope, operation, reporting technique and the information which they provide to the instructional staff. However, a common theme found in all of the systems reviewed is that computer-managed instruction is characterized by activities found in individualized instruction with the computer assisting in the management function of these activities.

SUMMARY

It appears from several major papers that individualized instruction is an instructional approach which allows the individual backgrounds, experiences, and needs of students to be met. There are several approaches to individualized instruction, with the more active approach characterized by behavioral objectives and mastery learning.

Computer-managed instruction is generally characterized by activities found in individualized instruction, with the computer assisting in the management function of these activities. The computer-managed instruction systems which were reviewed represented research efforts funded by foundations and grants and, in many cases, were undertaken by research staffs of universities and private organizations. The review of these systems revealed that there are many variations of computer-managed instruction. These ranged in operation from batch processing systems at remote locations to on-line systems. The reporting functions of the systems ranged from high-speed line printer generated reports to on-line terminal generated reports.

Many of the characteristics found in the Teaching Information Processing System are applicable to a system for use in technical education programs at the community college level. As such, many of the characteristics found in this system are found in the system which was designed in this study.

Chapter 3

PROCEDURES OF THE STUDY

Information from the review of the literature was used as a basis for the design of the computer-managed instruction system. Included in the system was a course strategy, a management system operated via computer programs, and an instructional staff who conducted the course. Following the design of the system, it was tested in the evening section of the course, Principles of Electronics, at Cerritos College, Norwalk, California, during the first seven weeks of the Spring Semester, 1973. Forty-three students participated in the system test in addition to the instructional staff who conducted the course.

The evaluation of the system was based upon student progress during the system test, student reaction to the system as indicated by responses on a student questionnaire, and instructor reaction to the system as indicated by responses on an instructor critique which also solicited information for improvement of the system.

THE COMPUTER-MANAGED INSTRUCTION SYSTEM WHICH WAS DESIGNED AND TESTED

The computer-managed instruction system was designed for use in technical education programs at the community college level.
The flowchart of Figure 1 depicts the operation of the system. The major elements of the system are identified by the letters A through F on Figure 1 and are referred to in the discussion of the system which follows.

Students began by taking the assignment pre- or post-tests (A). The tests were scored by a clerk who was available during the regularly scheduled class hours. Student scores were assembled by the clerk into a data deck for entry into the management system (B). The data were entered into the system along with student names, identification numbers, and completed assignment history from the student file. The data were processed and the Instructional Manager Report (IMR) was generated (C). An analysis of student status and achievement on the assignment pre/post-tests indicated on the IMR was conducted by the Instructional Manager. As a result of the analysis, messages containing lecture information, remedial assignments, and/or special notes were given to the clerk to be entered into the management system for distribution to staff and students (D). The information was processed and the Laboratory Instructor Reports (LIR) were generated and distributed by the clerk to the Laboratory Instructors prior to each class session (E). Weekly, the Individual Student Reports (ISR) were generated and distributed by the clerk to the students enrolled in the course (F).

[Figure 1. Operation of the Computer-Managed Instruction System Which Was Designed for Use in Technical Education Programs at the Community College Level]

Course Strategy

The course strategy followed the format of individualized instruction, which consisted of instructional assignments developed around performance objectives with mastery pre- and post-tests. The assignment format was a modification of that described by Ringis.1 In his format, there are six elements in an instructional package: (1) a single concept focus, (2) behaviorally-stated objectives which tell the learner what performance is expected of him, under what conditions, and at what proficiency, (3) multiple activities and methodologies describing various ways the student can attain the objectives, (4) diversified learning resources to allow for variation in the styles of learning of students, (5) evaluation instruments consisting usually of pre-tests, self-tests, and post-tests, and (6) breadth and/or depth suggestions intended to provide the learner with suggestions for further exploration.2

1 R. Herbert Ringis, "What is 'An Instructional Package'?" Journal of Secondary Education, XLVI, 5 (May, 1971), 201-205. 2 Ibid.

This basic format was modified to facilitate a technical education course at the community college level. The resulting modification included these elements in the assignments: (1) single concept focus, (2) behaviorally-stated objectives, (3) multiple activities, (4) evaluation instruments, and (5) statement of future assignments for those students wishing to study ahead.
Instructions listed on the assignments directed the student to take the pre-test and then, based upon the results of the pre-test, to complete the instructional activities listed on the assignment which he believed necessary to correct the deficiency indicated by the pre-test score. A short annotation on each instructional activity indicated the information which could be expected from that particular instructional activity. A sample instructional assignment is found in Appendix H.

The flowchart depicted in Figure 2 was used for guiding students through the assignments' sequence on an individual basis. It is a modification of the general decision model used for development of the CAI courses at the U.S. Naval Academy in 1967-68.3 Implementation of this process by students was as follows: Students began by taking the assignment pre-test. The test was corrected by a clerk who indicated the correct and incorrect items immediately to the student. If the student achieved mastery on the pre-test, he was presented with the pre-test for the next assignment. If he did not achieve mastery, he then completed instructional activities related to the questions missed on the pre-test. Upon completion of these activities, the post-test was taken by the student and scored by the clerk, who again indicated the correct and incorrect items. If the student achieved mastery on the post-test, he was directed to the pre-test for the next assignment. If he did not achieve mastery, a decision was made by the Instructional Manager to either prescribe remediation or to ignore the deficiency. When remediation was prescribed, the student completed the remedial activities and, when ready, an alternate form of the post-test was taken, followed by the pre-test for the next assignment. If the student was notified to ignore the post-test deficiency, he then continued to the pre-test for the next assignment.

3 U.S., Civil Service Commission, Bureau of Training, Computer Assisted Instruction: A General Discussion and Case Study, Training Systems and Technology Series No. V, Pamphlet T-15 (Washington: Government Printing Office, 1971), p. 11.

[Figure 2. Course Strategy for a Technical Education Course at the Community College Level]

Management System

The management system which was developed consisted of a student file and a remedial materials file stored on high speed magnetic disk, along with the computer programs which generated the Instructional Manager Report, the Laboratory Instructor Report and the Individual Student Reports. The computer programs were written in COBOL (Common Business Oriented Language) because the language lends itself to a report generating system.

The student file contained the instructional assignment history of students and was updated after each class session by the Instructional Manager Report program. The remedial materials file contained a listing of materials relating to the instructional assignments to be used by the Instructional Manager for assignment to students having post-test deficiencies. A discussion of the computer-managed instruction system programs follows.
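The dissertation describes these files but does not reproduce their record layouts. A minimal COBOL sketch of a student-file record consistent with that description is given below; all data names, field sizes, and the figure of 17 history entries (the number of assignments reached during the system test) are assumptions, not the actual layout:

    IDENTIFICATION DIVISION.
    PROGRAM-ID. STUDENT-REC.
    ENVIRONMENT DIVISION.
    INPUT-OUTPUT SECTION.
    FILE-CONTROL.
        SELECT STUDENT-FILE ASSIGN TO "STUFILE.DAT"
            ORGANIZATION IS SEQUENTIAL.
    DATA DIVISION.
    FILE SECTION.
    FD  STUDENT-FILE.
    01  STUDENT-RECORD.
        05  STU-ID    PIC 9(7).
        05  STU-NAME  PIC X(25).
    *>  One history entry per instructional assignment, updated after
    *>  each class session by the Instructional Manager Report program.
        05  STU-HISTORY OCCURS 17 TIMES.
            10  HIST-DATE-BEGUN   PIC 9(6).
            10  HIST-DATE-DONE    PIC 9(6).
            10  HIST-SCORE        PIC 9(3).
            10  HIST-MASTERY-FLAG PIC X.
    PROCEDURE DIVISION.
    MAIN-PARA.
    *>  Layout sketch only; the update logic is discussed below.
        STOP RUN.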
Instructional Manager Report Program. The Instructional Manager Report program performed two functions in the management system: (1) the updating of the student file, and (2) the generation of the Instructional Manager Report. The flowchart depicted in Figure 3 represents the operation of the IMR program. The major elements of the program are identified by the letters A through F on Figure 3 and are referred to in the discussion of the program which follows.

[Figure 3. Flowchart of Instructional Manager Report Program. A, B, C, D, E, and F are major elements of the program.]

Data were entered into the program via computer data cards. The student file was read and the pre-test update performed (A). Pre-test scores were checked against mastery, established at 80 percent for the test. If a student failed to achieve mastery on the pre-test, his name, assignment number, score, criterion and possible number of points were transferred to a pre-test deficiency list.

Following the pre-test update, the post-test update was performed (B). Again, scores were checked against mastery, this time by objective. If a student failed to attain mastery on any of the assignment objectives, his name, assignment number, objective number, score and possible points on the objective were transferred to a post-test deficiency list. At that time, the remedial materials file was searched for materials relating to the post-test deficiency and the materials' title and location were transferred to a remedial materials list (C).

The student file was then sorted by assignments in which students were working and a list of students by assignment was developed (D). The date work was begun on the assignment and whether mastery had been attained were indicated. Data cards were checked for messages from the Laboratory Instructors. If messages were present, they were transferred to a message list. Following the processing of the data and messages, the student file was updated (E) and the Instructional Manager Report was generated (F). A sample Instructional Manager Report is found in Appendix A.
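The program's COBOL source is not reproduced in the dissertation. A minimal sketch of the 80 percent pre-test mastery check described above, with hypothetical data names and sample values, might read:

    IDENTIFICATION DIVISION.
    PROGRAM-ID. PRETEST-UPDATE.
    DATA DIVISION.
    WORKING-STORAGE SECTION.
    *> Sample values; in the actual program these came from the data
    *> cards and the student file.
    01  STU-NAME        PIC X(20) VALUE "DOE, JOHN".
    01  ASSIGNMENT-NO   PIC 99    VALUE 5.
    01  TEST-SCORE      PIC 9(3)  VALUE 36.
    01  POSSIBLE-POINTS PIC 9(3)  VALUE 50.
    01  CRITERION-PCT   PIC 9(3)  VALUE 80.
    01  PCT-SCORE       PIC 9(3)  VALUE 0.
    PROCEDURE DIVISION.
    MAIN-PARA.
        COMPUTE PCT-SCORE = (TEST-SCORE * 100) / POSSIBLE-POINTS
        IF PCT-SCORE < CRITERION-PCT
    *>      Below mastery: the name, assignment number, score,
    *>      criterion, and possible points go to the pre-test
    *>      deficiency list printed on the IMR.
            DISPLAY STU-NAME " ASG " ASSIGNMENT-NO
                " SCORE " TEST-SCORE " OF " POSSIBLE-POINTS
                " BELOW " CRITERION-PCT " PERCENT CRITERION"
        ELSE
            DISPLAY STU-NAME " ASG " ASSIGNMENT-NO
                " MASTERY ATTAINED"
        END-IF
        STOP RUN.

The post-test update (B) applies the same comparison objective by objective rather than to the test as a whole.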
Laboratory Instructor Report Program. The Laboratory Instructor Report program generated the Laboratory Instructor Report, sections 1 and 2, and provided an information and communication channel between the Instructional Manager and the Laboratory Instructors. The LIR provided information for the Laboratory Instructors as to where students were working on course assignments and allowed for communications between the Instructional Manager and the Laboratory Instructors related to student progress. The flowchart depicted in Figure 4 represents the operation of the LIR program. The major elements of the program are identified by the letters A through E on Figure 4 and are referred to in the discussion of the program which follows.

The student file was read and the data were sorted for each section by the assignments in which students were working (A). The date work was begun on the assignment and whether mastery had been attained were indicated. Upon completion of the sort, message cards were checked for messages from the Instructional Manager to be sent to section 1 and 2 Laboratory Instructors. If messages were present, they were transferred to a message list for the appropriate section (B and C). Following the processing of the data from the student file and the message cards, the Laboratory Instructor Report for section 1 was printed (D), followed by the section 2 report (E). A sample Laboratory Instructor Report is found in Appendix B.

[Figure 4. Flowchart of the Laboratory Instructor Report Program. A, B, C, D, and E are major elements of the program.]

Individual Student Report Program. The Individual Student Report program generated the Individual Student Reports and provided an information and communication channel between the Instructional Manager and the students. Although the students received immediate feedback relative to their progress when assignment pre- and post-tests were scored by the clerk, the ISR provided an official listing of completed assignments and scores for the students. It also provided a means of informing students of lectures pertaining to the assignments in which they were working. The flowchart depicted in Figure 5 represents the operation of the ISR program. The major elements of the program are identified by the letters A through C on Figure 5 and are referred to in the discussion of the program which follows.

The student file was read and the list of assignments completed was developed (A). The assignment number, date completed, score, criterion and possible number of points for each assignment were indicated. Message cards were then checked to determine if remedial materials were prescribed by the Instructional Manager. If remedial materials were prescribed, the material title and location were transferred to a remedial materials list. Lecture cards were then read and the recommended lecture information was entered (B). Message cards were checked for special messages from the Instructional Manager. If messages were present, they were transferred to a special message list. Following the special message check, the Individual Student Reports were printed (C). A sample Individual Student Report is found in Appendix C.

[Figure 5. Flowchart of the Individual Student Report Program. A, B, and C are major elements of the program.]
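The report formats themselves appear in the appendices (the ISR in Appendix C). As a sketch of how COBOL's edited picture clauses suit fixed-column report lines of the kind just described, with all field names, widths, and sample values invented rather than taken from the actual ISR layout:

    IDENTIFICATION DIVISION.
    PROGRAM-ID. ISR-LINE.
    DATA DIVISION.
    WORKING-STORAGE SECTION.
    *> One completed-assignment entry; the real program read these
    *> values from the student file on disk.
    01  WS-ENTRY.
        05  WS-ASG    PIC 99   VALUE 3.
        05  WS-DATE   PIC 9(6) VALUE 730220.
        05  WS-SCORE  PIC 9(3) VALUE 46.
        05  WS-CRIT   PIC 9(3) VALUE 40.
        05  WS-POSS   PIC 9(3) VALUE 50.
    01  PRINT-LINE.
        05  FILLER    PIC X(11) VALUE "ASSIGNMENT ".
        05  PL-ASG    PIC Z9.
        05  FILLER    PIC X(6)  VALUE " DATE ".
        05  PL-DATE   PIC 9(6).
        05  FILLER    PIC X(7)  VALUE " SCORE ".
        05  PL-SCORE  PIC ZZ9.
        05  FILLER    PIC X(11) VALUE " CRITERION ".
        05  PL-CRIT   PIC ZZ9.
        05  FILLER    PIC X(10) VALUE " POSSIBLE ".
        05  PL-POSS   PIC ZZ9.
    PROCEDURE DIVISION.
    MAIN-PARA.
        MOVE WS-ASG   TO PL-ASG
        MOVE WS-DATE  TO PL-DATE
        MOVE WS-SCORE TO PL-SCORE
        MOVE WS-CRIT  TO PL-CRIT
        MOVE WS-POSS  TO PL-POSS
        DISPLAY PRINT-LINE
        STOP RUN.

For the sample values this prints one aligned line: ASSIGNMENT  3 DATE 730220 SCORE  46 CRITERION  40 POSSIBLE  50.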
Operation of the Computer Programs. Each program of the management system was operated by a set of control cards which called up the programs stored in the computer. The control cards were placed in front of the data deck which contained student assignment scores and/or messages to be processed. An End-of-File card completed the format. Figure 6 represents the format of the computer card deck used for operating the computer-managed instruction system programs:

    $JOB CMI 300,311/TIME:3:00   |
    $PASSWORD                    |  Control Cards
    .RU CMI100                   |
    $DECK CMI.DAT                |
    [ Data Deck ]
    EOF  (End-of-File card)

Figure 6. Card Deck for Operation of the CMI System Programs

Pre- and post-test scores were previously key-punched on computer data cards in addition to cards with names and assignment titles. As a result, the data deck used for entering assignment pre- and post-test scores into the management system required only the assembly of the data deck and did not require cards to be key-punched after each class session. Thus, individually key-punched cards were only required for special messages and lecture information.

Instructional Staff

The instructional staff required for the operation of the system included an Instructional Manager, two Laboratory Instructors and a clerk.

Instructional Manager. The Instructional Manager had full responsibility for the operation of the course. He conducted and scheduled the lecture/small group discussions relating to instructional assignments and monitored student progress using the Instructional Manager Report. In addition, he made all instructional decisions relating to progress of students. He communicated with students via the Individual Student Reports concerning their progress and advised them of lectures related to assignments in which they were working. Via the Laboratory Instructor Report, he communicated to the Laboratory Instructors any special information related to student progress and needs.

A critical requirement imposed upon the Instructional Manager was that he possess an excellent command of the course subject matter and that he be capable of coping with a wide variety of student problems which might be brought about by students working at their own rates and on different subject matter simultaneously. This requirement is supported by Hensley, who cautions that faculty involved in individualized instruction should be selected on the basis of their interest in an individualized program and their ability to maintain flexibility in classroom organization.4 Since the Instructional Manager had full responsibility for the operation of the course, he needed to be capable of coping with an unstructured situation.

4 Charles Hensley, "Individualized Instruction," School and Community, LVIII, 2 (October, 1971), 33.

Laboratory Instructor. The Laboratory Instructor was responsible for conducting the laboratory sessions. Using the Laboratory Instructor Report, he monitored the progress of students in his laboratory section and provided assistance to the students when necessary. Whereas the Instructional Manager was responsible for the instructional decisions relating to the course, the Laboratory Instructor's function was to assist students in achieving mastery on the instructional assignments during the laboratory sessions. Like the Instructional Manager, he needed to be capable of coping with a wide variety of student problems which could be brought about by students working at their own rates and on different subject matter simultaneously. Since his function was to assist students in achieving mastery of the assignment objectives, he needed to have complete command of the subject matter and to be capable of discussing various aspects of the subject matter with students as needed.
Most of all, he needed to be capable of functioning in an unstructured situation, since it was anticipated that few students in the laboratory sessions would be working on similar assignments; he therefore had to be highly flexible.

Clerk. The clerk was responsible for running the computer programs and maintaining a file of the instructional assignments, along with the assignment pre- and post-tests. In addition, the clerk was required to attend all regularly scheduled laboratory sessions for the purpose of scoring pre- and post-tests. Upon scoring the tests, the correct and incorrect items were indicated. Based upon the results, the appropriate assignment materials were presented to the students. Following the class sessions, the clerk built the data deck from the pre- and post-test results using the pre-punched computer cards. Special messages were key-punched, and the clerk then entered the data deck into the computer system and distributed the appropriate reports.

Minimum typing skills were required of the clerk since it was necessary to key-punch information and special messages on computer cards for entry into the management system. However, the most important qualifications were that the clerk be reliable and friendly, because operating the computer programs and working directly with the students and staff were major responsibilities.

TEST OF THE SYSTEM

The computer-managed instruction system was tested in the evening section of the course, Principles of Electronics, at Cerritos College, Norwalk, California, during the first seven weeks of the Spring Semester, 1973. The computer programs of the management system were operated on the college's Digital Equipment Corporation PDP-10 computer. The first five weeks were conducted with the investigator on site assisting where necessary. The remaining two weeks were conducted with the investigator away from the site and the system operated solely by the Cerritos College staff. The test of the system was concerned with student progress through the course, in addition to student and staff reaction to the system during the test.

A student information sheet (Appendix I) was completed by the students during the first class meeting to provide background information on students. At the end of the third, fifth and seventh weeks, a questionnaire (Appendix J) was completed by the students to provide information concerning student reaction to the system. At the end of the ninth week of the semester, the questionnaire (Appendix J) was completed anonymously by students for the purpose of comparing responses to those in which students were identified. At the completion of the seventh week, an instructor critique (Appendix K) was also completed by the instructional staff concerning staff reaction to the system.

Course Used for the Test of the System

The course, Principles of Electronics, at Cerritos College, Norwalk, California, was selected for testing the system since this course met the criteria for technical education courses outlined in the Vocational Education and Occupations document published by the U.S. Office of Education.5 The course was the first in a four-course sequence intended for the student of Electronic Technology. The pre-requisites for the course included one year of high school radio/electronics, or the course, Basic Electricity, offered at Cerritos College. In addition, one year of high school algebra or one semester of Technical Math, which covered the basic principles of mathematics, was required.

5 Chapter 1, pages 1-10.

A second course in Technical Math, which covered the principles of algebra, geometry and trigonometry needed
A second course in Technical Math, which covered the principles of algebra, geometry and trigonometry needed SChapter 1, pages 1-10. 62 in first year technical courses, and the co-requisite electronics laboratory were taken concurrently. Under certain conditions, the counseling staff at Cerritos College permitted students to enter Principles Of Electronics without completion Of the pre-requisite courses. Entrance under these conditions was based upon previous background or work experience in the field of electronics. Therefore, student backgrounds were quite varied. The course, which Officially met twice a week for 75 minutes of lecture followed by the laboratory which met for 150 minutes, was modified to accommodate the structure Of the CMI system. The 75- minute lecture period was extended 5 minutes to accommodate one 20- minute lecture and two 30-minute lectures. Thus, the Instructional Manager presented three different lectures in a regularly scheduled class meeting in one evening. Students were advised Of the lectures scheduled for assignments in which they were working via the Individual Student Reports. Attendance was optional based upon the student's decision as to his need for the material. Laboratory sessions consisted Of students working individually on assignments at their own pace and receiving individual help from the Laboratory Instructors. The laboratory sessions began immediately following the lecture period and were Open for student work for a period Of 150 minutes. Students were allowed to come and go as they desired. However, all pre- and post-tests were taken during the laboratory sessions thus requiring student attendance during that time. 63 Po ulation Involved in the Test of tfie System Fifty-four students were initially enrolled in the course. Four students dropped the course due to work and class conflicts during the first week. Student information, SCAT scores and ques- tionnaires were incomplete fer seven students, resulting in forty- three (43) students participating in the system test. Examination of Table 2 shows the summary of the student back- ground information as indicated On the student information sheets. The table reveals that Of the forty-three students participating in the test, fourteen had completed the pre-requisite course, Basic Electricity. Eight of the fourteen had completed the cO-requisite technical mathematics course and three had completed a standard mathematics course. Six students had completed a college English course and six indicated prior work experience in the electronics industry. Fourteen students had completed the pro-requisite elec- tronics course elsewhere, ranging from correspondence courses to military electronics schools. Of these students, Six had completed prior mathematics courses including both technical and standard mathematics through calculus. Five had completed a college English course and again, Six indicated prior work experience in the elec- tronics industry. Fifteen students indicated no previous formal training in electronics. Seven of these students indicated prior mathematics experience, four indicated completion Of a college English course, and seven indicated prior work experience in the field of electronics. 
[Table 2. Summary of Students' Backgrounds as Indicated by Responses on the Student Information Sheet. The body of the table could not be recovered from the scan.]

Nine students either were currently working or had worked as electronics technicians, with ten working in related electronics positions. Twenty-four students had no previous work-related experience in the field of electronics.

The School and College Ability Test, Form A, scores were made available through the cooperation of the counseling department and the admissions office of the college. SCAT Verbal, Quantitative and Total scores ranged almost continuously from the third (3rd) to the ninety-ninth (99th) percentile on Cerritos College norms.

As indicated, backgrounds of the student population involved in the test were extremely varied in terms of prior coursework in electronics, work experience in the field of electronics, and ability to perform college work as measured by the School and College Ability Test. As such, the backgrounds of students indicated a need for an individualized instruction approach to the course.

Instructional Staff Involved in the Test of the System

The instructional staff involved in the test of the system included two instructors and a clerk. One instructor was assigned both the lecture and one laboratory section. He assumed the roles of Instructional Manager and one Laboratory Instructor. The second instructor was assigned a second laboratory section and also assumed the role of Laboratory Instructor. A former student became interested in the project and donated his time as a laboratory assistant in both lab sections.

The Instructional Manager was a part-time instructor who held a master's degree in Industrial Arts Education, Electronics Option, and who had been teaching Industrial Arts Electronics at a high school in the Cerritos Junior College District for fourteen years. He had been teaching part-time in the evening program at Cerritos College for five years and was thoroughly familiar with the electronics program at Cerritos College. During his five years at Cerritos College he had taught three of the four fundamentals courses. In addition, he had worked in the electronics industry as an electronics technician. During the test, he developed all instructional assignments and made all instructional decisions for the course. Based upon his educational background and teaching performance in previous courses at Cerritos College, he was asked to teach the course during the test of the system. It was believed that his educational experience and teaching ability would be valuable in evaluating the system for future modifications and further development from a professional educator's point of view.

The other instructor, who was assigned the role of the second Laboratory Instructor, was also a part-time instructor, but his background was from industry rather than education.
He was a graduate of a technical school in the Los Angeles area and had been a supervisor in a prototype development lab in electronics and a project director, and, at the time of the study, was a hardware design supervisor and coordinator at a prominent electronics firm in the Los Angeles area. He had taught in the evening program at Cerritos College for twelve years and was also thoroughly familiar with the electronics program. During the test, he conducted one of the laboratory sections and worked closely with the Instructional Manager. Based upon his electronics industrial background and teaching performance in previous courses at Cerritos College, he was asked to teach the second laboratory section during the system test. It was believed that his industrial experience and teaching ability would be valuable in evaluating the system from a non-educational point of view, representing the large segment of part-time staff found in the technical education programs at the community college.

Both instructors were interested in individualized instruction and were quite capable of functioning in a non-structured environment. As such, they met the criteria necessary in individualized instruction suggested by Hensley, which were that faculty involved in individualized programs needed to be interested in such a program and needed to be able to maintain flexibility in classroom organization.6

6 Page 58.

A former student, who became interested in the project, assisted students during the laboratory sessions. He had completed the electronics program at Cerritos College and had worked in the U.S. Navy in electronics training programs.

The student clerk was hired through the student placement office at Cerritos College to work with the program. She was a foreign student from Taiwan, Formosa, attending Cerritos College. She worked ten hours per week with the system. She maintained the instructional assignment file, corrected student assignment pre- and post-tests, built the data decks from student scores and operated the management programs. She was available during laboratory sessions and worked additional hours during the day operating the management programs.

INSTRUMENTS USED IN THE EVALUATION OF THE SYSTEM

The instruments used in the evaluation of the system included a student questionnaire and an instructor critique. The instruments were used to provide information relative to student and staff reaction to the system and to solicit information for the improvement of the system.

The Student Questionnaire

The student questionnaire (Appendix J) was used to determine student reaction to the system and student attitudes toward learning. The questionnaire followed the semantic differential format and was a modification of that used by Hobson in a study of a CMI system at Florida State University.7 According to Osgood, Suci and Tannenbaum, in a semantic differential the subject is provided with a concept to be differentiated and a bipolar scale, or scales, against which to do it. The subject's task is to indicate, for each item,

7 Edward N. Hobson, "Empirical Development of a Computer-Managed Instruction System for the Florida State University Model for the Preparation of Elementary Teachers" (unpublished doctoral dissertation, Florida State University, 1970).
the direction of his association and its intensity on the scale.8 In research conducted by Osgood and others, it was found that when nine or more steps were used, the frequency of the responses on either side of the center was much lower when compared to scales with fewer steps. However, when five steps were allowed, college students expressed irritation at not being able to indicate more discriminately between steps.9 Thus, seven steps were selected for this scale.

8 Charles E. Osgood, George J. Suci, and Percy H. Tannenbaum, The Measurement of Meaning (Urbana: University of Illinois Press, 1957), p. 20. 9 Ibid., p. 85.

The questionnaire was reviewed by staff at Michigan State University and was pilot-tested for clarity in a beginning electronics course at Lansing Community College, Lansing, Michigan. Some modifications of educational terminology used on the questionnaire were suggested and were incorporated in the instrument.

The questionnaire was completed at the end of the third, fifth, and seventh week of the test. Two weeks after the completion date of the test, an anonymous form of the questionnaire which did not request the student identification was completed to determine if students changed their responses when not identified. Ratings on the questionnaires completed at the end of the third, fifth, and seventh weeks were averaged, and means and standard deviations were analyzed in relation to the number of assignments completed. The data from the three questionnaires completed during the system test were also compared to means and standard deviations for the anonymous form of the questionnaire, and a chi-square analysis was conducted to determine if responses had changed during the test of the system. In addition, a profile of students finding success and difficulty with the system and a profile of students expressing satisfaction and dissatisfaction with the system were developed from the data. An analysis of the data derived from the student questionnaire is found in Chapter 4.

The Instructor Critique

The instructor critique (Appendix K) was used to determine the instructional staff reaction to the system and to solicit information for improvement of the system. Only two instructors were involved in the test of the system; therefore, a subjective critique was selected. It was believed that more information could be obtained for improvement of the system by allowing the instructors to respond in their own manner to questions about the system and how it might be improved. The instructor critique was reviewed by staff at Michigan State University for clarity and content. It was indicated by the staff that the instructor critique was satisfactory in its present form in both clarity and content.

The instructor critique was completed at the termination of the test of the system, the seventh week of the semester. The responses relative to the reaction to the system were summarized and are presented in Chapter 4 along with the suggestions for improvement of the system. The suggestions were analyzed and used as a basis for modifying the Instructional Manager Report program, the Laboratory Instructor Report program, the Individual Student Report program, and their subsequent reports.
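The questionnaire analysis described earlier in this section averaged the ratings and examined means and standard deviations. As a minimal COBOL sketch of those two computations over invented one-to-seven scale values (the chi-square comparison is a separate step and is not shown):

    IDENTIFICATION DIVISION.
    PROGRAM-ID. RATING-STATS.
    DATA DIVISION.
    WORKING-STORAGE SECTION.
    *> Ten hypothetical ratings on one seven-step scale item.
    01  RATING-DATA PIC X(10) VALUE "5647563745".
    01  RATING-TBL REDEFINES RATING-DATA.
        05  RATING PIC 9 OCCURS 10 TIMES.
    01  I        PIC 99    VALUE 0.
    01  N        PIC 99    VALUE 10.
    01  SUM-X    PIC 9(4)  VALUE 0.
    01  SUM-SQ   PIC 9(6)  VALUE 0.
    01  MEAN     PIC 9V99  VALUE 0.
    01  VARNCE   PIC 99V99 VALUE 0.
    01  SD       PIC 9V99  VALUE 0.
    01  MEAN-ED  PIC 9.99.
    01  SD-ED    PIC 9.99.
    PROCEDURE DIVISION.
    MAIN-PARA.
        PERFORM VARYING I FROM 1 BY 1 UNTIL I > N
            ADD RATING (I) TO SUM-X
            COMPUTE SUM-SQ = SUM-SQ + RATING (I) ** 2
        END-PERFORM
        COMPUTE MEAN ROUNDED = SUM-X / N
    *>  Population variance: mean of squares minus square of mean.
        COMPUTE VARNCE ROUNDED = SUM-SQ / N - MEAN ** 2
        COMPUTE SD ROUNDED = VARNCE ** 0.5
        MOVE MEAN TO MEAN-ED
        MOVE SD TO SD-ED
        DISPLAY "MEAN " MEAN-ED "  SD " SD-ED
        STOP RUN.

For the sample ratings this prints MEAN 5.20  SD 1.25.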
The Instructor Critique

The instructor critique (Appendix K) was used to determine the instructional staff reaction to the system and to solicit information for improvement of the system. Only two instructors were involved in the test of the system; therefore, a subjective critique was selected. It was believed that more information could be obtained for improvement of the system by allowing the instructors to respond in their own manner to questions about the system and how it might be improved. The instructor critique was reviewed by staff at Michigan State University for clarity and content. It was indicated by the staff that the instructor critique was satisfactory in its present form in both clarity and content.

The instructor critique was completed at the termination of the test of the system, the seventh week of the semester. The responses relative to the reaction to the system were summarized and are presented in Chapter 4 along with the suggestions for improvement of the system. The suggestions were analyzed and used as a basis for modifying the Instructional Manager Report program, the Laboratory Instructor Report program, the Individual Student Report program, and their subsequent reports.

SUMMARY

The course strategy of the computer-managed instruction system followed the format of individualized instruction using instructional assignments developed around performance objectives with mastery pre- and post-tests. The management system consisted of a student file and a remedial materials file stored in the computer along with a set of computer programs which generated an Instructional Manager Report, Laboratory Instructor Reports and Individual Student Reports. The reports were used by staff and students to assess student progress and as a communications channel between staff and students. The instructional staff included an Instructional Manager who was responsible for instructional decisions pertaining to the course, Laboratory Instructors who assisted students in the laboratory and a clerk who assisted with the clerical functions of the course.

The test of the system was conducted at Cerritos College, Norwalk, California, in the evening section of the course, Principles of Electronics. Forty-three students participated in the test of the system during the first seven weeks of the Spring Semester, 1973. The system was evaluated by students using a student questionnaire administered during the third, fifth, and seventh weeks, with an additional anonymous form of the questionnaire completed during the ninth week. The instructional staff completed an instructor critique at the end of the seventh week and made recommendations for improvement of the system.

Chapter 4

RESULTS OF THE SYSTEM TEST AND MODIFICATION TO THE SYSTEM

This chapter reports the results of the system test. Included is a discussion of student progress during the test, student and staff reaction to the system as they relate to the research questions, and a description of the modified system resulting from the test.

FINDINGS CONCERNING THE RESEARCH QUESTIONS

Data from the Student Information Sheet, CMI System Student Questionnaire, the student file and the Instructor Critique were used to answer the research questions and as a basis for modifying the system.

Student Progress During the System Test

Data from the student file, CMI System Student Questionnaire and the Student Information Sheet are presented in Tables 3 through 9. The data were used to answer research question one and the related sub-questions which read as follows:

1. Given the opportunity to work at their own rate, how much will students vary the time required for achieving mastery of the learning materials?

a. To what degree will students take advantage of the option to bypass materials previously mastered?

b. What are the characteristics of students related to early and late attainment of assignment objectives?

A summary of the assignments completed during the test, as indicated by data from the student file, is shown in Table 3. Of the forty-three students participating in the test, forty-two (97 percent) completed Unit I, assignments 1-7. Six students (13 percent) completed Units I and II, assignments 1-7 and 8-15, and three students (6 percent) continued on into Unit III. Normal progress for the course at the end of seven weeks should have placed students in the area of assignments 10-13. Of the forty-three students, twenty-six were working within that area, nine were ahead and eight had not reached that point.

Table 4 shows the assignments completed during each class session and Table 5 indicates the range of assignments completed during the sessions. As can be seen in Table 4, several students completed multiple assignments during the first four or five class sessions. This was to be expected since much of the early course information could have been attained through pre-requisite coursework or prior experience.
By the second class session, most students had begun to vary the time required to attain mastery of the assignment objectives. And, by the fourteenth class session, students were spread from the sixth to the seventeenth assignment. As indicated in Table 5, the range of assignments completed each class session began to expand almost immediately. For example, the range for the first session was one and, by the thirteenth session, the range had expanded to eleven.

Table 3. Assignments Completed by Students During the System Test

Unit       Assignment   Assignment Title                                  Number of Students
           Number                                                         Completing Assignments
Unit I:     1           Electron Theory                                   43
            2           Conductance and Resistance                        43
            3           Electrical Units and Meter Fundamentals           43
            4           Resistive Devices                                 43
            5           Voltage and Current Measurement in a DC Circuit   43
            6           Prefixes and Powers of Ten                        43
            7           Review Assignments 1-6                            42
Unit II:    8           Use of the Slide Rule                             42
            9           Resistance and Voltage Measurement Using a VTVM   40
           10           Series Circuit Analysis                           35
           11           Parallel Circuit Analysis                         29
           12           Compound Circuit Analysis                         26
           13           Voltage Divider Circuit Analysis                  14
           14           Loading Effects of Meters                          9
           15           Review Assignments 8-14                            6
Unit III:  16           The Superposition Theorem                          3
           17           Thevenin's Theorem                                 3

Table 4. The Number of Assignments Completed at the End of Each Class Session During the System Test

[The body of Table 4, a matrix of assignment completions by class session, is not legible in the scanned copy. Its footnotes read: (a) Mastery attained upon completion of later assignments. (b) One student attained mastery upon completion of later assignments.]

Table 5. Range of Assignments Completed Each Class Session During the System Test

Session   Highest Assignment   Lowest Assignment   Range at End
          Completed            Completed           of Session
 1         2                    1                   1
 2         4                    1                   3
 3         7                    1                   6
 4         7                    2                   5
 5         9                    1                   8
 6        10                    4                   6
 7        11                    4                   7
 8        12                    2                  10
 9        12                    5                   7
10        13                    5                   8
11        14                    6                   8
12        15                    5                  10
13        16                    5                  11

Thus, as indicated by the data from Tables 3, 4, and 5, students will vary greatly the time required to attain mastery of the assignment objectives when given the opportunity to work at their own rate.
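The spread summarized in Table 5 follows directly from the student file: for each class session, the range is the difference between the highest and the lowest assignment completed by any student. The sketch below illustrates the computation; the per-student positions are hypothetical, chosen only so that the session endpoints match those printed in Table 5, and the actual tabulation was produced from the system's student file.

```python
# Illustrative computation of the Table 5 spread: for each session,
# range = highest assignment completed - lowest assignment completed.
# The per-student positions below are hypothetical.

def session_range(positions):
    """positions: the highest assignment each student has completed."""
    return max(positions), min(positions), max(positions) - min(positions)

sessions = {
    1: [1, 1, 2, 1, 2],      # early sessions: students bunched together
    7: [4, 7, 11, 6, 9],     # mid-test: the spread has widened
    13: [5, 16, 12, 10, 8],  # by session 13 the reported range was eleven
}

for number, positions in sorted(sessions.items()):
    high, low, spread = session_range(positions)
    print(f"Session {number}: highest {high}, lowest {low}, range {spread}")
```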
Option to bypass assignments. Data from the student file were used to answer research sub-question 1a, which read "To what degree will students take advantage of the option to bypass materials previously mastered?" A summary of the number of assignments completed via pre- and post-tests is shown in Table 6.

Of the 509 pre-tests completed, 446 (87 percent) were at or above mastery. Two factors may have contributed to this high rate: (1) The course was the first of the four fundamentals courses in the program and required a pre-requisite electronics course. Since the test was conducted during the first seven weeks of the semester, some of the material could have been obtained from previous coursework or prior experience. (2) Student orientation to testing in the conventional classroom setting could have caused some students to read ahead in an effort to achieve mastery on the pre-test rather than take the pre-test as a self-diagnostic test.

Table 6. Number of Assignments Completed via Pre- and Post-tests During the System Test

Assignment Tests                                            n
Pre-tests completed                                       509
Pre-test mastery                                          446
Post-tests completed resulting from pre-test deficiency    61
Post-tests in addition to pre-test mastery                  2
Total assignments completed (pre- and post-tests)         507

On only two of the 507 assignments completed was mastery attained on both the pre- and post-tests. It is not known if students worked through the instructional materials after completing assignments via the pre-test to correct their errors, but it is apparent that they did not elect to take the post-test after achieving mastery on the pre-test. Therefore, as indicated by the data from Table 6, students will elect to bypass materials previously mastered, or at least they will elect not to re-test on those materials.

Characteristics of students related to achievement. Data from the Student Information Sheet, the CMI System Student Questionnaire, and students' School and College Ability Test scores were used to answer research sub-question 1b, which read "What are the characteristics of students related to early and late attainment of assignment objectives?"

Means and standard deviations for students completing a high, intermediate and low number of assignments are shown in Table 7 for thirteen characteristics. Students were grouped based upon the number of assignments completed relative to normal progress for the course at the end of seven weeks. Students who had completed between ten and thirteen assignments were placed in the intermediate group, since that represented normal progress for the course at that time. Students who had completed a greater number were placed in the high group, while those who had completed a fewer number were placed in the low group. Based upon these criteria, the number of students in the low completion group was eight, the intermediate completion group was twenty-six, and the high completion group was nine.
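The grouping rule just described is mechanical: students at normal progress (assignments ten through thirteen) fall in the intermediate group, those ahead fall in the high group, and those behind fall in the low group. A sketch of the rule follows, with hypothetical completion counts; the actual grouping was performed from the student file.

```python
# Grouping students by assignments completed, relative to normal progress
# (assignments 10-13 at the end of week seven). Counts below are hypothetical.

def completion_group(assignments_completed):
    """Classify a student as 'low', 'intermediate', or 'high'."""
    if assignments_completed < 10:
        return "low"
    if assignments_completed <= 13:
        return "intermediate"
    return "high"

completed = [7, 9, 10, 11, 11, 12, 12, 13, 14, 15, 17]  # hypothetical
groups = {"low": 0, "intermediate": 0, "high": 0}
for n in completed:
    groups[completion_group(n)] += 1
print(groups)  # -> {'low': 2, 'intermediate': 6, 'high': 3}
```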
The characteristics of students were based upon their ability to perform college work as indicated by the School and College Ability Test scores (verbal, quantitative, and total), their backgrounds as indicated by responses on the Student Information Sheet (Appendix I), and their attitudes toward learning as indicated by their responses to items six, eight, nine and ten from the CMI System Student Questionnaire (Appendix J). The responses to the items from the CMI System Student Questionnaire were summed over the three questionnaires administered, and the means and standard deviations were determined from that value. It must be noted that items six and nine of the questionnaire were inverted, yielding a high value representing a positive response for all items from the questionnaire.

Table 7. Comparison of Means and Standard Deviations of Thirteen Characteristics for Students Completing a High, Intermediate and Low Number of Instructional Assignments During the System Test

[The body of Table 7 is not legible in the scanned copy.]

As can be seen from Table 7, the mean SCAT scores were greatest for the intermediate group, while previous electronics courses were similar for all three groups. The low group had the highest amount of math pre-requisites, whereas the intermediate group had the highest amount of concurrent math and prior college English experience. The high and low groups were quite similar when considering prior work experience; their means were more than twice the mean of the intermediate group. The low group members preferred to plan their own study schedule more than the other two groups. The intermediate group believed more strongly that it was the teacher's responsibility to see that the students learned the course material. They also indicated that they preferred to figure out how to do something by themselves more than the others; however, they were followed closely by the high group. And, the low group believed that they learned best when working closely with the teacher, whereas the high group did not.

A one-way analysis of variance was conducted for the three groups on the thirteen characteristics. For two and 40 degrees of freedom, an F ratio of 5.18 is required to be significant at the .01 level of confidence. An F ratio of 3.23 is required to be significant at the .05 level, and an F ratio of 2.44 is required at the .10 level of confidence. As the study was exploratory in nature, a critical F level of .10 was selected in which to test the difference between the groups on all variables.

Of the thirteen characteristics measured, "I learn best by working closely with the teacher." was significant at the .05 level and "Work experience in the electronics industry related to this course." was significant at the .10 level. Table 8 shows the results of the one-way analysis of variance conducted for the characteristic, "I learn best by working closely and directly with the teacher." An F ratio of 3.752 for the characteristic was significant at the .05 level of confidence. A post hoc analysis indicated that the high group was significantly different at the .05 level when compared to the low group. Also, the high group was significantly different at the .05 level when compared to the sum of the intermediate and low groups. It can therefore be said that an asset of the high group, as it relates to the number of assignments completed, was that they believed that they learned better when not working closely with the teacher.

Table 8. One-Way Analysis of Variance for the Characteristic, "I learn best by working closely and directly with the teacher."

Source of Variation   Degrees of   Sum of    Mean
                      Freedom      Squares   Squares
Between groups          2           16.400    8.200
Within groups          40           87.409    2.185
Total                  42          103.809

F(2,40) = 8.200/2.185 = 3.752*     Critical F = 2.44

*Significant at the .05 level of confidence.
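The F ratios in Tables 8 and 9 follow directly from the printed sums of squares and degrees of freedom: each mean square is the sum of squares divided by its degrees of freedom, and F is the between-groups mean square over the within-groups mean square. The sketch below reproduces Table 8's statistic from the printed values; it is offered only as a check on the arithmetic, not as the study's original computation.

```python
# F ratio from a one-way analysis of variance summary table:
# mean square = sum of squares / degrees of freedom; F = MS_between / MS_within.

def f_ratio(ss_between, df_between, ss_within, df_within):
    ms_between = ss_between / df_between
    ms_within = ss_within / df_within
    return ms_between, ms_within, ms_between / ms_within

# Values as printed in Table 8 ("I learn best by working closely ... teacher.").
ms_b, ms_w, f = f_ratio(ss_between=16.4, df_between=2,
                        ss_within=87.409, df_within=40)
print(f"MS between = {ms_b:.3f}, MS within = {ms_w:.3f}, F(2,40) = {f:.3f}")
# -> MS between = 8.200, MS within = 2.185, F(2,40) = 3.752, significant at .05
```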
Table 9 shows the results of the one-way analysis of variance conducted for the characteristic, "Work experience in the electronics industry related to this course." An F ratio of 2.536 for the characteristic was significant at the .10 level of confidence. A post hoc analysis indicated that the difference between the high and low groups was not significant at the .10 level. The only difference found to be significant at the .10 level of confidence was when the intermediate group was compared to the sum of the high and low groups. It can therefore be said that the combination of the high and low completion groups had more prior work experience in the electronics industry than did the intermediate completion group.

Table 9. One-Way Analysis of Variance for the Characteristic, "Work experience in the electronics industry related to this course."

Source of Variation   Degrees of   Sum of    Mean
                      Freedom      Squares   Squares
Between groups          2            1.193     .596
Within groups          40            9.405     .235
Total                  42           10.598

F(2,40) = .596/.235 = 2.536*       Critical F = 2.44

*Significant at the .10 level of confidence.

Several important considerations are apparent from Tables 7, 8, and 9. The School and College Ability Test scores were not a factor relative to the number of assignments completed when students were grouped in this manner. This was quite surprising considering that scores ranged from the third to the ninety-ninth percentile on the Cerritos College norms.1 There appeared to be no difference in the number of assignments completed as related to the completion or non-completion of the course pre-requisites--either electronics or mathematics. Also, college English was not a factor relative to the number of assignments completed. And, there was more work experience in the electronics industry in the combination of the high and low groups than in the intermediate group. As such, it appears that the factors affecting the number of assignments completed, when based upon the groups established, are other than those listed in Table 7, with the exception of the desire to work closely with the teacher. It was, therefore, not possible to describe the characteristics of students related to early and late attainment of assignment objectives since the characteristics measured were similar for both groups.

1Page 65.

Summary of student progress during the system test. The data from Tables 3-9 were used to answer research question one and the related sub-questions which read as follows:

1. Given the opportunity to work at their own rate, how much will students vary the time required for achieving mastery of the learning materials?

a. To what degree will students take advantage of the option to bypass materials previously mastered?

b. What are the characteristics of students related to early and late attainment of assignment objectives?

The data indicate that students will vary greatly the time required to attain mastery of the assignment objectives when given the opportunity to work at their own rate. They further indicate that students will take advantage of the option to bypass materials previously mastered, or at least will elect not to re-test on those materials. And, of the thirteen characteristics compared for students completing a high, intermediate and low number of assignments, only one, "I learn best by working closely and directly with the teacher." was significantly different at the .05 level of confidence between the high and low groups. Thus, a profile was not developed for those students with early and late attainment of assignment objectives since the characteristics measured were similar for both groups.

Student Reaction to the System

Data from the CMI System Student Questionnaire, the Student Information Sheet and the student file are presented in Tables 10-18. The data were used, in addition to the responses to question four of the Instructor Critique, to answer research question two and the related sub-questions which read as follows:

2. What will be the student reaction to the system as indicated by the following questions?

a. To what degree will the system be threatening to the students?

b. To what degree will the system be acceptable to the students as a means of acquiring course information?

c. What are the characteristics of students expressing satisfaction and dissatisfaction with the system?
System threat to students. The responses to statements one and two of the CMI System Student Questionnaire, administered during the system test, and the anonymous form of the questionnaire, administered following the test, were used to determine the extent to which the system was threatening to students. Table 10 shows the means and standard deviations of feelings toward the system for students completing a high, intermediate and low number of assignments. The responses to the items were summed over the three questionnaires and the means and standard deviations were determined. It must be noted that items 1b and 2a were inverted, yielding a high value representing a positive response for all items in the table.

As is seen in Table 10, members of the intermediate group were the most relaxed, the most free and the most calm while communicating with the Instructional Manager via the Individual Student Report. The intermediate group was the most relaxed and the most free while using the assignments, whereas the low group was the most calm. A one-way analysis of variance was conducted for the three groups on the two characteristics. Of the six responses measured, none were significant at the critical F level.

Table 10. Comparison of Means and Standard Deviations of Feelings Toward the System for Students Completing a High, Intermediate and Low Number of Instructional Assignments During the System Test

[The body of Table 10 is not legible in the scanned copy.]

A chi-square analysis was conducted to determine if the responses to statements one and two of the questionnaire changed over the three questionnaires completed. Table 11 shows the results of the chi-square analysis of student feelings toward the system for the three questionnaires administered during the system test. The responses to statements one and two were summed and the means determined. A "4" response on the questionnaire represented a neutral feeling, with responses above that indicating a positive feeling and responses below that a negative feeling. There were seven responses between three and four on Questionnaires 1 and 2, and four responses between three and four on Questionnaire 3. These scores were included in the cell whose mean was less than five.

Table 11. Chi Square Analysis of Student Feelings Toward the System for the Total Population During the System Test

                 Mean Responses to Statements 1 and 2
Questionnaire    x̄ < 5    x̄ 5-6    x̄ > 6    Total
1                 20a      14        9        43
2                 18a      12       13        43
3                 15b      12       16        43
Total             53       38       38       129

x² = 4.700     Critical x² = 7.779

aIncludes seven responses between three and four.
bIncludes four responses between three and four.

With four degrees of freedom, a chi-square value of 13.276 is required to be significant at the .01 level of confidence. A chi-square value of 9.487 is required to be significant at the .05 level, and a chi-square value of 7.779 is required at the .10 level of confidence. A critical x² value of .10 was selected in which to test the difference between the cells. Of the three questionnaires completed, there was not a significant difference at the critical x² value.
A chi-square analysis was conducted to determine if there was a difference in the responses to statements one and two between the three questionnaires in which the students identified themselves and the anonymous form of the questionnaire administered two weeks after the test. Table 12 shows the results of the chi-square analysis of student feelings toward the system when students were and were not identified. The responses to statements one and two were summed over the three questionnaires in which students identified themselves and the means were determined. There were five responses between three and four on the sum of Questionnaires 1, 2 and 3, and one response between three and four on the anonymous form of the questionnaire. These scores were included in the cell whose mean was less than five.

With two degrees of freedom, a chi-square value of 9.210 is required to be significant at the .01 level of confidence. A chi-square value of 5.991 is required to be significant at the .05 level, and a chi-square value of 4.605 is required to be significant at the .10 level of confidence. A critical x² value of .10 was selected in which to test the difference between the cells. When comparing the sum of the three questionnaires in which students were identified to the anonymous form of the questionnaire, there was not a significant difference at the critical x² value.

Table 12. Chi Square Analysis of Student Feelings Toward the System for the Total Population in Which Students Were and Were Not Identified

                 Mean Responses to Statements 1 and 2
Questionnaire    x̄ < 5    x̄ 5-6    x̄ > 6    Total
Q1, Q2, Q3        18a      13       12        43
Anonymous          8b      15        8        31
Total             26       28       20        74

x² = 2.882     Critical x² = 4.605

aIncludes five responses between three and four.
bIncludes one response between three and four.

Therefore, as indicated by the data from Tables 10, 11 and 12, the responses to statements one and two of the CMI System Student Questionnaire, which were intended to indicate threat, were primarily positive, indicating that students were not threatened by the system. There was no difference in threat between students who had completed a high, intermediate, or low number of assignments, and there was no difference between responses in which they identified themselves and their responses on an anonymous form of the questionnaire completed after the system test had been terminated.

System acceptability to students. Data from the CMI System Student Questionnaire were used to answer research sub-question 2b, which read "To what degree will the system be acceptable to students as a means of acquiring course information?" The responses to statements three, four, five and seven of the CMI System Student Questionnaire were used to determine the extent to which the system was acceptable to students as a means of acquiring course information. Table 13 shows the means and standard deviations of acceptability of the system for students completing a high, intermediate and low number of assignments. The responses to the items were summed over the three questionnaires and the means and standard deviations were determined.

Table 13. Comparison of the Means and Standard Deviations of Acceptability of the System for Students Completing a High, Intermediate and Low Number of Instructional Assignments During the System Test

[The body of Table 13 is not legible in the scanned copy.]

As seen in Table 13, members of the intermediate group had the highest mean score for the items measured. They believed most strongly that the pre- and post-test results provided adequate information, and that the Student Report provided adequate information.
They were followed closely by the low group in recommending the system to others, and were also followed closely by the low group in not desiring to opt out of the experience if it were possible. A one-way analysis of variance was conducted for the three groups on all variables. Of the four variables measured, none were significant at the critical F level.

A chi-square analysis was conducted to determine if the responses to statements three, four, five and seven of the CMI System Student Questionnaire changed over the three questionnaires completed. Table 14 shows the results of the chi-square analysis of student acceptability of the system for the three questionnaires administered during the system test. The responses to the statements were summed and the means determined. There was one response between two and three and one response between three and four on Questionnaire 1. There was one response between two and three and one response between three and four on Questionnaire 2. And, there were three responses between three and four on Questionnaire 3. These scores were included in the cell whose mean was less than five.

Table 14. Chi Square Analysis of Student Acceptability of the System for the Total Population During the System Test

                 Mean Responses to Statements 3, 4, 5 and 7
Questionnaire    x̄ < 5    x̄ 5-6    x̄ > 6    Total
1                  6a      22       15        43
2                  9b      10       24        43
3                  7c      11       25        43
Total             22       43       64       129

x² = 9.659d     Critical x² = 7.779

aIncludes one response between two and three and one response between three and four.
bIncludes one response between two and three and one response between three and four.
cIncludes three responses between three and four.
dSignificant at the .05 level of confidence.

A chi-square value of 9.659 was significant at the .05 but not the .01 level of confidence. Thus, as indicated by the data from Table 14, the responses to statements three, four, five and seven of the CMI System Student Questionnaire became more positive, indicating that the system became more acceptable as the system test progressed.
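The chi-square statistics reported in this chapter were computed from contingency tables of response counts in the usual way: the expected count for a cell is its row total times its column total over the grand total, and the statistic sums (observed - expected)²/expected over the cells. The sketch below, applied to the counts printed in Table 14, reproduces the reported value to within rounding; it is offered as a check on the method, not as the study's original computation.

```python
# Chi-square statistic for a contingency table of questionnaire response counts.
# Observed counts are those printed in Table 14 (rows: questionnaires 1-3;
# columns: mean response < 5, 5-6, > 6).

def chi_square(observed):
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(observed):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (obs - expected) ** 2 / expected
    df = (len(observed) - 1) * (len(observed[0]) - 1)
    return stat, df

table_14 = [[6, 22, 15],
            [9, 10, 24],
            [7, 11, 25]]

stat, df = chi_square(table_14)
print(f"chi-square = {stat:.3f} with {df} degrees of freedom")
# -> about 9.67 with 4 df, agreeing with the printed 9.659 to rounding,
#    against a critical value of 7.779 at the .10 level
```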
A chi-square analysis was conducted to determine if there was a difference in the responses to statements three, four, five and seven between the three questionnaires in which students had identified themselves and the anonymous form of the questionnaire administered two weeks following the termination of the system test. Table 15 shows the results of the chi-square analysis for the three questionnaires compared to the anonymous form of the questionnaire. The responses to statements three, four, five and seven were summed over the three questionnaires and the means were determined. There was one response between two and three and one response between three and four on the sum of Questionnaires 1, 2, and 3. There were two responses between three and four on the anonymous form of the questionnaire. These scores were included in the cell whose mean was less than five. When comparing the sum of the three questionnaires in which students were identified to the anonymous form of the questionnaire, there was not a significant difference at the critical x² value.

Table 15. Chi Square Analysis of Student Acceptability of the System for the Total Population in Which Students Were and Were Not Identified

                 Mean Responses to Statements 3, 4, 5 and 7
Questionnaire    x̄ < 5    x̄ 5-6    x̄ > 6    Total
Q1, Q2, Q3         7a      15       21        43
Anonymous          7b      10       14        31
Total             14       25       35        74

x² = .463     Critical x² = 4.605

aIncludes one response between two and three and one response between three and four.
bIncludes two responses between three and four.

Question four of the Instructor Critique asked the instructors how they felt the students reacted to the system and the method for conducting the course during the test. The instructor who assumed the role of Instructional Manager and one of the Laboratory Instructors responded that:

Eighty-five per cent of the students indicated that this system and the method for conducting the course was a good way to learn and they liked it better than the conventional lecture/lab approach. One student remarked to me that "This system gives you a fighting chance to master the material."

The other instructor, who was assigned the role of the second Laboratory Instructor, responded that:

In general and considering the fact that all new systems and methods have their bugs, I believe most students reacted very favorably to this new approach. I felt that the students were enjoying the course a little bit more and were feeling a more complete sense of accomplishment.

Therefore, as indicated by the data from Tables 13, 14 and 15 and question four of the Instructor Critique, the system was acceptable to students as a means of acquiring course information. Several students indicated that they liked it better than the conventional lecture/lab approach. There was no difference in acceptability between students who had completed a high, intermediate or low number of assignments. The system became more acceptable to students as the test progressed, with the increase in acceptability significant at the .05 level of confidence. And, there was no difference between responses in which students identified themselves and responses on the anonymous form of the questionnaire completed following the system test.

Characteristics of students related to satisfaction with the system. Data from the Student Information Sheet, the CMI System Student Questionnaire and students' School and College Ability Test scores were used to answer research sub-question 2c, which read "What are the characteristics of students expressing satisfaction and dissatisfaction with the system?"

Satisfaction was determined by combining the statements relating to threat and the statements relating to acceptability from the CMI System Student Questionnaire. The responses to the statements representing satisfaction for each student were summed over the three questionnaires and the means determined. Students with mean responses between five and six were placed in the intermediate group. Students with mean responses above six were placed in the high group, while those students with mean responses below five were placed in the low group. In the low group, there were only two students whose mean responses indicating satisfaction were less than four. Based upon the criteria, the number of students in the low satisfaction group was fourteen, the intermediate satisfaction group was twenty, and the high satisfaction group was nine.
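A sketch of the satisfaction grouping just described is given below: the threat and acceptability statements are averaged for each questionnaire, the per-questionnaire means are averaged over the three administrations, and the result is binned at five and six. The response records are hypothetical illustrations; the actual grouping was done from the completed questionnaires.

```python
# Satisfaction grouping: mean response to the threat and acceptability
# statements, averaged over the three questionnaires, binned at 5 and 6.
# The response records below are hypothetical.

def satisfaction_group(mean_response):
    if mean_response < 5:
        return "low"
    if mean_response <= 6:
        return "intermediate"
    return "high"

def mean_satisfaction(questionnaires):
    """questionnaires: one list of statement ratings per administration."""
    per_admin = [sum(q) / len(q) for q in questionnaires]
    return sum(per_admin) / len(per_admin)

student = [[5, 6, 6, 5, 7, 6],   # week three
           [6, 6, 7, 5, 6, 6],   # week five
           [6, 7, 7, 6, 6, 7]]   # week seven

m = mean_satisfaction(student)
print(f"mean satisfaction {m:.2f} -> {satisfaction_group(m)} group")
```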
Table 16 shows the means and standard deviations for the three groups established as high, intermediate and low satisfaction groups. The characteristics of students were based upon their ability to perform college work as indicated by the School and College Ability Test scores (verbal, quantitative, and total), their backgrounds as indicated by responses on the Student Information Sheet, and their attitudes toward learning as indicated by responses to items six, eight, nine and ten from the CMI System Student Questionnaire. The responses from the CMI System Student Questionnaire were summed over the three questionnaires administered, and the means and standard deviations were determined from that value. It must be noted that items six and nine of the questionnaire were inverted, yielding a high value representing a positive response for all items from the questionnaire. An additional item, number of assignments completed, was added, bringing the total number of characteristics to fourteen.

Table 16. Comparison of Means and Standard Deviations of Fourteen Characteristics for Students Expressing High, Intermediate and Low Satisfaction with the System

[The body of Table 16 is not legible in the scanned copy.]

As can be seen from Table 16, the mean SCAT scores were greatest for the high group, with members of the low satisfaction group completing the most assignments. The high satisfaction group had the greatest amount of previous electronics coursework and mathematics pre-requisites, whereas the intermediate and low groups were tied for the greatest amount of concurrent mathematics. The high satisfaction group had the greatest amount of prior college English background. The intermediate satisfaction group had the greatest amount of previous work experience, they preferred to plan their own study schedule more than the others, and they believed more strongly that it was the teacher's responsibility to see to it that students learned the course material. The high satisfaction group indicated that they preferred to figure out how to do something by themselves, and they indicated that they learned better than the other groups when working closely with the teacher.

A one-way analysis of variance was conducted for the three groups on the fourteen characteristics. Of the fourteen characteristics measured, "It is a teacher's responsibility to see that I learn the subject matter of a course." and "I like to figure out how to do a thing by myself rather than be told." were significant at the .10 level of confidence.
Table 17 shows the results of the one-way analysis of variance conducted for the characteristic, "It is a teacher's responsibility to see that I learn the subject matter of a course." An F ratio of 2.754 for the characteristic was significant at the .10 level of confidence. A post hoc analysis indicated that the difference between the high and low groups was not significant at the .10 level. However, the high group was significantly different at the .10 level when compared to the sum of the intermediate and low groups. It can, therefore, be said that a characteristic of the high satisfaction group was that it did not feel that it was as much a teacher's responsibility to see that students learned the subject matter of a course as did the combination of the intermediate and low satisfaction groups.

Table 17. One-Way Analysis of Variance for the Characteristic, "It is a teacher's responsibility to see that I learn the subject matter of a course."

Source of Variation   Degrees of   Sum of    Mean
                      Freedom      Squares   Squares
Between groups          2           12.770    6.385
Within groups          40           92.721    2.318
Total                  42          105.491

F(2,40) = 6.385/2.318 = 2.754*     Critical F = 2.44

*Significant at the .10 level of confidence.

Table 18 shows the results of the one-way analysis of variance conducted for the characteristic, "I like to figure out how to do a thing by myself rather than be told." An F ratio of 3.087 was significant at the .10 level of confidence. A post hoc analysis indicated that the difference between the high and low satisfaction groups was not significant at the .10 level of confidence. The difference between the high group and the sum of the intermediate and low groups, and between the low group and the sum of the intermediate and high groups, was not significant at the .10 level when using normal weights. Therefore, the combination of weights which produced the significant difference at the .10 level of confidence was too complex to distinguish the characteristic of the groups on this variable.

Table 18. One-Way Analysis of Variance for the Characteristic, "I like to figure out how to do a thing by myself rather than be told."

Source of Variation   Degrees of   Sum of    Mean
                      Freedom      Squares   Squares
Between groups          2           11.900    5.950
Within groups          40           77.103    1.927
Total                  42           89.003

F(2,40) = 5.950/1.927 = 3.087*     Critical F = 2.44

*Significant at the .10 level of confidence.

Several interesting findings are apparent from Tables 16, 17, and 18. As was the case when compared to the number of assignments completed, the School and College Ability Test scores were not a factor relative to satisfaction toward the system when students were grouped in this manner. There was no difference in the number of assignments completed as related to satisfaction with the system. There appeared to be no difference in satisfaction with the system as related to the completion or non-completion of the course pre-requisites--either electronics or mathematics. Completion of college English was not a factor relative to satisfaction, nor was previous work experience in the electronics industry. The only factors affecting satisfaction were the belief that it was a teacher's responsibility to see to it that students learned the subject matter of a course and the desire to figure out something by themselves rather than be told. In neither case was the difference between the high and low groups significant at the .10 level of confidence.
As such, it appears that the factors affecting satisfaction with the system, when based upon the groups established, are other than those listed in Table 16. It was, therefore, not possible to describe the characteristics of students related to satisfaction and dissatisfaction with the system because most of the students were satisfied with the system, as indicated by the characteristics measured.

Summary of student reaction to the system. The data from Tables 10-17 and the responses to question four of the Instructor Critique were used to answer research question two and the related sub-questions which read as follows:

2. What will be the student reaction to the system as indicated by the following questions?

a. To what degree will the system be threatening to the students?

b. To what degree will the system be acceptable to students as a means of acquiring course information?

c. What are the characteristics of students expressing satisfaction and dissatisfaction with the system?

In referring to the data, it appears that the students were not threatened by the system. The system appeared to be acceptable to the students as a means of acquiring course information. In fact, several students indicated that they liked it better than the conventional lecture/lab approach. Of the fourteen characteristics compared against students expressing high and low satisfaction, none were significantly different at the .10 level of confidence. In addition, it was not possible to develop a profile for those students expressing satisfaction and dissatisfaction with the system since most of the students expressed satisfaction with the system, as indicated by the characteristics measured.

Staff Reaction to the System

Data from the Instructor Critique were used to answer research question three and the related sub-questions which read as follows:

3. What will be the instructional staff reaction to the system as indicated by the following questions?

a. To what degree will the system be threatening to the staff?

b. To what degree will the system be acceptable to staff as a means of conducting the course?

Since the Instructor Critique followed the format of open-ended answers to specific questions about the system, the questions are listed below, followed by the staff responses. Instructor one was assigned the role of Instructional Manager and of a Laboratory Instructor in one of the laboratory sections; his responses reflected both roles. Instructor two was assigned the role of a Laboratory Instructor in a second laboratory section, and his responses reflected his role as Laboratory Instructor in his section. The clerk did not complete an Instructor Critique since her function was to score pre- and post-tests and operate the computer programs under the direction of the Instructional Manager. As such, her function was not instructional in nature.

System threat to staff. The responses to question three of the Instructor Critique were used to determine the extent to which the system was threatening to staff. Question three of the Instructor Critique read "Did you feel that the atmosphere established by the system was threatening to you or to the students involved in the system test?"
Instructor one response:

The atmosphere established by the system was one I have long envisioned where each student spends as much time in an area as he feels necessary, allowing time for independent research if desired, where each student progresses at his own rate, where an absence due to illness, job interference or personal business won't hinder a student's progress and at the same time an atmosphere that doesn't result in a mountain of paper work waiting at the end of a class session for the instructor. I feel that less than 10% of the students felt threatened by the system. I would attribute the cause of the 10% that reacted adversely to the system to be a lack of self-discipline on their part.

Instructor two response:

As the instructor, I was perfectly comfortable and relaxed in the new environment. I do not feel that the atmosphere established by the system was threatening to either the instructor or students. I say that it was not threatening to the instructor because no matter how well organized and programmed a computer managed system is, it still takes a good instruction team (lecture and lab instructors) to implement the system. A machine cannot feel the pulse of the class so that appropriate and timely instructions and changes can be fed back to the students.

From the responses to question three of the Instructor Critique, the atmosphere established by the system was not threatening to the staff or to the majority of the students. The staff, according to their responses, appeared to be quite relaxed and comfortable with the system and were quite supportive of the atmosphere established by the system. It was indicated that " . . . less than 10% of the students felt threatened by the system."

System acceptability to staff. The responses to questions two and five of the Instructor Critique were used to determine the extent to which the system was acceptable to the staff as a means of conducting the course. Question 2a of the Instructor Critique read "Did the reports provide adequate information to facilitate communication required to operate the system?"

Instructor one response:

The information was adequate but the method set up for communicating with the lab instructor and additional notes to students was inefficient. Communications with the lab instructor is better not done through the computer and the additional notes to the students should not require a separate card for each student when the same message is to be sent to a number of students. One card should be all that is necessary with all the student names behind the card getting the same message.

Instructor two response: "Yes."

Question 2b of the Instructor Critique read "What was your reaction to using a computer printed report for the purpose of communicating with students and staff?"

Instructor one response:

This is a tremendous idea if used to its fullest potential. The communication with the student is of primary importance, indicated by 50% of the students saying that the student report served a major purpose.

Instructor two response:

I thought it was a marvelous method of communication. It provides the student with almost immediate feedback on his progress as well as instructions and guidelines for progressing to the next assignment. Another wonderful by-product is that it provides for a media to express words of encouragement to students who are having a rough time with the course. You might say an opportunity for a personal touch without personal contact.
It also provides for better coordination and communication between lecture instructor and lab instructor.

Question 5a of the Instructor Critique read "How do you feel about the computer-managed instructional system?"

Instructor one response:

I think this system has tremendous potential and would make the following observations at this point in its development:
1. It is a little more expensive than the conventional approach but could have more holding power on the students to offset part of the additional expense.
2. If the system ever catches on it will be the best thing that has happened in instructing electronics in the Community College.

Instructor two response:

I feel that the system can use some minor refinements and perhaps some type of audio/visual teaching aids to supplement the lecture in order to achieve its maximum effectivity. However, the system as it stands now, is not only workable and very effective, but has lots of possibilities. The system makes it possible to greatly increase the size of the class without increasing operational cost by using a combination of instructor/instructor aide. If properly organized and administered, this concept would definitely be beneficial financially and educationally to the community college. This concept also allows the student to learn and advance at his own rate. This new born freedom now opens the possibility for fast and smart students to complete more than one semester's work in one semester. Another possibility is that it allows the student to enter the class several weeks after classes start and still catch up without loss of learning material. It also allows the students to be out several nights due to illness or overtime work without loss of learning material.

Question 5b of the Instructor Critique read "Would you recommend that it (the computer-managed instruction system) be used as a model for this course and others in the electronics program? Other technical courses?"

Instructor one response:

I would not only recommend that it be used as a model for this course but I have requested that it be used again in the Fall of 1973 in an attempt to improve on it and I am looking forward to implementing it in the second electronics class in the program.

Instructor two response:

Yes, for the reasons I have given (in 5a), I think the computer managed instructional system would fit very nicely into the electronics program. The concept can be expanded into other technical fields as well. Examples would be areas of medicine, auto shop, machine shop and cosmetology.

Therefore, it appears that the instructional staff involved in the test of the system were quite enthusiastic about the system. With modifications and refinements, they see it as a viable method for conducting the course in which it was tested. They expressed enthusiasm for the use of a computer-printed report as a method of communication and indicated that the reports provided a good communication link between instructor and student. Both instructors recommended that the system be used as a model for the other electronics courses, and it was suggested that it could be expanded into other technical fields as well.

Summary of staff reaction to the system. The responses to questions two, three and five of the Instructor Critique were used to answer research question three and the related sub-questions which read as follows:

3. What will be the instructional staff reaction to the system as indicated by the following questions?

a. To what degree will the system be threatening to the staff?
b. To what degree will the system be acceptable to the staff as a means of conducting the course?

In referring to the data, it appears that the system was not threatening to the staff and was most acceptable to them as a means of conducting the course. Both instructors thought that a computer-printed report was a good communication device, and both were quite satisfied with the atmosphere established by the system. Neither instructor appeared to be threatened, as indicated by their responses to the Instructor Critique. As an example, one instructor stated, "I say that it was not threatening to the instructor because no matter how well organized and programmed a computer managed system is, it still takes a good instruction team (lecture and lab instructors) to implement the system." As such, he did not see it as a threat but instead saw it as an aid to help him perform his function as the instructor for the course. The system appeared most acceptable to the staff as a means of conducting the course, to the extent that both recommended the system as a model for other courses in the electronics program at Cerritos College and both requested to use the system again in the Fall of 1973 to teach the course in which the system was tested.

RECOMMENDATIONS FOR MODIFICATION OF THE SYSTEM

The responses to question one of the Instructor Critique (Appendix D) were used as a basis for modifying the system. Question one of the Instructor Critique read "Did the management system provide adequate information for you to monitor the progress of students? How might the system be improved to perform this function?"

The recommendations which resulted pertained to the overall operation of the system, in addition to the information provided by the Instructional Manager Report, Laboratory Instructor Report and the Student Report. The recommendations were analyzed and used as a basis for modifying the system.

Recommended Report Modifications

It was suggested that lecture information be programmed into the system and generated as "Recommended Lectures" by the system, and that the reports indicate titles along with the assignment and objective numbers. The staff believed that there was also a need for a listing on the reports of assignments skipped by students. It was suggested that information included on the Laboratory Instructor Report parallel information on the Instructional Manager Report for each section. And, it was suggested that "Remedial Materials" be automatically listed on the Student Report along with information pertaining to assignments skipped and student progress.

Recommended Operational Modifications

The recommendations were also related to the operational aspects of the system, such as the administration of review assignments and the distribution of Student Reports. However, these recommendations did not require major changes in the system and could be handled by the staff involved in the operation of the system. As such, the modifications to the system which follow pertain only to the computer programs which generated the various reports.

THE MODIFIED MANAGEMENT SYSTEM

The modified management system consisted of a student file, lecture file and remedial materials file stored on high speed magnetic disk, along with the programs which generate the Modified Instructional Manager Report (MIMR), the Modified Laboratory Instructor Report (MLIR), and the Modified Individual Student Report (MISR).
These programs were operated via control cards in a batch process mode of operation, as were the original Instructional Manager Report, Laboratory Instructor Report, and Individual Student Report programs. The student file contained the instructional assignment history of students and was updated after each class session by the Modified Instructional Manager Report program. The lecture file contained a listing of the lectures to be recommended by the Modified Instructional Manager Report program based upon the assignments in which students were working. The remedial materials file contained a listing of materials relating to the instructional assignments, to be generated by the Modified Individual Student Report program based upon student post-test deficiencies. A discussion of the modified computer-managed instruction system programs follows.

Modified Instructional Manager Report Program

The Modified Instructional Manager Report program performed the same function in the modified system as the original Instructional Manager Report program: (1) the updating of the student file and (2) the generating of the Modified Instructional Manager Report (MIMR). The flowchart depicted in Figure 7 represents the operation of the MIMR program. The major elements of the program are identified by the letters A through F on Figure 7 and are referred to in the discussion of the program which follows.

[Figure 7. Flowchart of the Modified Instructional Manager Report Program. The flowchart is not reproducible from the scanned copy; A, B, C, D, E and F mark the major elements of the program.]

Data were entered into the program via computer data cards. The student file was read and the pre-test update performed (A). Pre-test scores were checked against mastery, established at 80 percent. If a student failed to achieve mastery on the pre-test, his name, laboratory section number, assignment number and title, score, criterion and possible number of points were transferred to a pre-test deficiency list. Following the pre-test update, the post-test was updated (B). Again, scores were checked for mastery, this time by objective. If a student failed to attain mastery on any of the assignment objectives, his name, laboratory section number, assignment number and title, objective number and title, score and possible points were transferred to a post-test deficiency list.

The student file was then sorted by the assignments in which students were working (C). The date the work was begun and whether the assignment was completed by a pre- or post-test, or whether a pre- or post-test deficiency resulted, was indicated. During the sort, the student file was searched for post-test deficiencies outstanding and assignments skipped by students. If a post-test deficiency was found to be outstanding, the student's name, laboratory section, assignment number and title, and the objective number and title were transferred to a post-test deficiency outstanding list. Likewise, if assignments were found to have been skipped by students, the student's name, laboratory section, and assignment number and title were transferred to an assignments skipped list.

The lecture file was read and matched against the assignments in which students were working. This resulted in the recommended lecture list (D). Students were listed by assignment number and title. The list was checked to insure that all students were covered by a lecture. Those students not covered by a lecture were listed along with the assignment in which they were working. The student file was then updated with the student's recommended lecture indicated (E), followed by the printing of the Modified Instructional Manager Report (F). A sample Modified Instructional Manager Report is found in Appendix E.
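The pre- and post-test updates (elements A and B) reduce to a comparison of each score against the 80 percent mastery criterion and the accumulation of the deficiency lists. The following is a minimal sketch of that logic in modern Python notation. The original programs were written in COBOL and ran in batch from card input, so the record layout, field names and sample data here are illustrative assumptions only, not the original implementation.

```python
# Illustrative sketch of the MIMR pre-test and post-test updates (elements A and B).
# Record layouts are assumed; the original program was a COBOL batch program.

MASTERY = 0.80  # mastery criterion: 80 percent of the possible points

def pretest_update(records):
    """Check pre-test scores against mastery and build the deficiency list."""
    deficiencies = []
    for r in records:
        if r["pre_score"] < MASTERY * r["possible"]:
            deficiencies.append({
                "name": r["name"], "section": r["section"],
                "assignment": r["assignment"], "title": r["title"],
                "score": r["pre_score"],
                "criterion": MASTERY * r["possible"],
                "possible": r["possible"],
            })
    return deficiencies

def posttest_update(records):
    """Check post-test scores by objective and build the deficiency list."""
    deficiencies = []
    for r in records:
        for obj in r.get("objectives", []):
            if obj["score"] < MASTERY * obj["possible"]:
                deficiencies.append({
                    "name": r["name"], "section": r["section"],
                    "assignment": r["assignment"], "title": r["title"],
                    "objective": obj["number"], "objective_title": obj["title"],
                    "score": obj["score"], "possible": obj["possible"],
                })
    return deficiencies

cards = [  # hypothetical test-score input; the objective title is invented
    {"name": "STUDENT A", "section": 1, "assignment": 10,
     "title": "Series Circuit Analysis", "pre_score": 14, "possible": 20,
     "objectives": [{"number": 2, "title": "Ohm's law in series circuits",
                     "score": 3, "possible": 5}]},
]

print(pretest_update(cards))   # 14 is below the 16-point criterion -> deficiency
print(posttest_update(cards))  # objective 2 is below mastery -> deficiency
```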
Modified Laboratory Instructor Report Program

The Modified Laboratory Instructor Report program performed the same function in the modified system as the original Laboratory Instructor Report program by generating the Modified Laboratory Instructor Report (MLIR). The flowchart depicted in Figure 8 represents the operation of the MLIR program. The major elements of the program are identified by the letters A through E on Figure 8 and are referred to in the discussion of the program which follows.

[Figure 8. Flowchart of the Modified Laboratory Instructor Report Program. The flowchart is not reproducible from the scanned copy; A, B, C, D and E mark the major elements of the program.]

The student file was read and the list of assignments completed by students during the previous class session was developed for sections one and two (A). Individual pre- and post-test scores were checked against mastery. If a student had not achieved mastery on the pre-test, his name, assignment number and title, score, criterion and possible points for the assignment were transferred to a pre-test deficiency list for each section. If a student failed to achieve mastery on post-test objectives, his name, assignment number and title, objective number and title, score and possible points for the objective were transferred to a post-test deficiency list for each section.
The Instructional Manager's lecture input, indicating acceptance or modification of the recommended lectures, was read, resulting in the development of the assigned lecture list by laboratory section (C). Students not covered by lectures were indicated, and the Modified Laboratory Instructor Report for section one was printed (D), followed by the report for section two (E). A sample Modified Laboratory Instructor Report is found in Appendix F.

Modified Individual Student Report Program

The Modified Individual Student Report program performed the same function in the modified system as the original Individual Student Report program by generating the Modified Individual Student Report (MISR). The flowchart depicted in Figure 9 represents the operation of the MISR program. The major elements of the program are identified by the letters A through D on Figure 9 and are referred to in the discussion of the program which follows.

[Figure 9. Flowchart of the Modified Individual Student Report Program. A, B, C, and D are major elements of the program.]

The student file was read and the list of assignments completed was developed (A). The assignment in which the student was working was checked for pre- and post-test completion. If the post-test had been taken and mastery not attained, the remedial materials file was searched for materials relating to the assignment, and the list of materials was transferred to a remedial materials list (B). The student's file was checked for post-tests outstanding and assignments skipped. If a post-test deficiency was found to be outstanding, the assignment number and title, and the objective number and title, were transferred to a post-test deficiency list. If assignments had been skipped, the assignment number and title were transferred to an assignment skipped list.

The Instructional Manager's lecture input was read, resulting in the development of the lecture list and the student's recommended lecture assignment (C). A check for messages from the Instructional Manager was then made. If messages were available, they were transferred to a message list, and the Modified Individual Student Report was printed (D). A sample Modified Individual Student Report is found in Appendix G.

SUMMARY

Chapter 4 has dealt with the results of the system test. Data from the student file, CMI System Student Questionnaire, Student Information Sheet and the Instructor Critique were used to answer the three research questions pertaining to student progress and student and staff reaction to the system. An attempt was made to determine the characteristics of students related to the number of assignments completed and the characteristics of students related to satisfaction and dissatisfaction with the system. However, profiles were not developed due to a lack of difference between groups. Instructional staff reaction to the system, along with recommendations for improvement of the system, was also presented. The primary modifications to the system included the addition of the "recommended lecture" file, the generation of assignments skipped and post-test outstanding lists on the Instructional Manager Report, and the expansion of the Laboratory Instructor Report.
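One further detail of the modified programs is worth illustrating before turning to the summary chapter. The remedial-materials search in element (B) of the Modified Individual Student Report program is, in effect, a sequential lookup keyed on assignment number. The sketch below is a hypothetical rendering; the file name, record layout and target assignment are assumed, since the original source is not reproduced in the text.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. REMEDIAL-SEARCH.
      * Illustrative sketch of element (B): list the remedial
      * materials keyed to one assignment number.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
           SELECT REMEDIAL-FILE ASSIGN TO "REMEDIAL.DAT"
               ORGANIZATION IS LINE SEQUENTIAL.
       DATA DIVISION.
       FILE SECTION.
       FD  REMEDIAL-FILE.
       01  REMEDIAL-RECORD.
           05  RM-ASSIGNMENT     PIC 99.
           05  RM-MATERIAL       PIC X(40).
       WORKING-STORAGE SECTION.
       77  WS-TARGET             PIC 99 VALUE 11.
       77  WS-EOF                PIC X  VALUE "N".
       PROCEDURE DIVISION.
           OPEN INPUT REMEDIAL-FILE
           PERFORM UNTIL WS-EOF = "Y"
               READ REMEDIAL-FILE
                   AT END MOVE "Y" TO WS-EOF
                   NOT AT END
                       IF RM-ASSIGNMENT = WS-TARGET
      * In the full program the material would be added to the
      * remedial materials list on the student's report.
                           DISPLAY RM-MATERIAL
                       END-IF
               END-READ
           END-PERFORM
           CLOSE REMEDIAL-FILE
           STOP RUN.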
Chapter 5

SUMMARY AND CONCLUSIONS

Students enrolled in technical education programs at the community college level possess a wide variety of experiences, backgrounds and needs. As such, staff in these programs feel the impact of the variety of student backgrounds and, in many cases, find it difficult to satisfy the individual needs of students. A solution to this problem could be an instructional approach which allows students to complete, individually, instructional materials based upon an assessment of their prior knowledge. However, one of the most difficult aspects of an individualized instructional approach is the management function. As a result, some educators have expressed interest in the use of the computer to assist in the management function that is required in an individualized instruction program.

However, little information is available in the literature on research into the use of the computer to manage individualized instruction in technical education programs at the community college level. Thus, given the lack of information available pertaining to systems at this level and the need for an individualized approach, there was a need for research into computer-managed instruction in technical education programs at the community college level.

PROBLEM AND PROCEDURES

The problem which was studied was concerned with individualized instruction, the effect of student characteristics, and the use of computers to assist in the management function of individualized instruction. Specifically, it was the purpose of this study to design, develop and test a computer-managed instruction (CMI) system for use in technical education programs at the community college level. The study posed three major questions:

1. Given the opportunity to work at their own rate, how much will students vary the time required for achieving mastery of the learning material?

a. To what degree will students take advantage of the option to bypass materials previously mastered?

b. What are the characteristics of students related to early and late attainment of assignment objectives?

2. What will be the student reaction to the system as indicated by the following questions?

a. To what degree will the system be threatening to the students?

b. To what degree will the system be acceptable to students as a means of acquiring course information?

c. What are the characteristics of students expressing satisfaction and dissatisfaction with the system?

3. What will be the instructional staff reaction to the system as indicated by the following questions?

a. To what degree will the system be threatening to the staff?

b. To what degree will the system be acceptable to the staff as a means of conducting the course?

The study was delimited to include only the design, development and initial test of the computer-managed instruction system. Excluded from the study was a comparison of this method of instruction to other methods of instruction, which should follow upon refinement of the system developed in this study.

The assumptions underlying the development of the system were:

1. Individualized instruction, as defined in the Definition of Terms Used, is a viable learning method and was therefore not a part of the system to be evaluated.

2. Mastery learning strategy, as defined in the Definition of Terms Used, is a viable learning method and was therefore not a part of the system to be evaluated.
3. The instructional materials presently in use at Cerritos College and which were prescribed by the instructional assignments are viable and therefore were not a part of the system to be evaluated.

The System Which Was Designed

The system was designed for use in technical education programs at the community college level. Students began by taking the assignment pre- or post-tests. The tests were scored by a clerk who was available during the regularly scheduled class hours. Student scores were assembled by the clerk into a data deck for entry into the management system (a possible card layout is sketched below). The data were entered into the system along with student assignment history stored on the system. The data were processed and the Instructional Manager Report was generated. An analysis of student progress and achievement indicated on the Instructional Manager Report was conducted by the Instructional Manager. As a result of the analysis, messages containing lecture information and/or special notes were given to the clerk to be entered into the management system for distribution to staff and students. The information was processed and the Laboratory Instructor Reports, which contained student progress information, were generated and distributed by the clerk to the Laboratory Instructors prior to each class session. Weekly, the Individual Student Reports, which contained lecture information, assignment information and special messages, were generated and distributed by the clerk to the students enrolled in the course.

Test of the System

The system was tested in the evening section of the course, Principles of Electronics, at Cerritos College, Norwalk, California, during the first seven weeks of the Spring Semester, 1973. The course was the first of the four fundamentals courses in the electronics technician program and required a pre-requisite electronics course. Forty-three students, with varied backgrounds related to prior electronics coursework and work experience in the electronics industry, participated in the system test. The staff involved in the system test included an Instructional Manager, who also assumed the role of one Laboratory Instructor; a second Laboratory Instructor; a Clerk; and a former student who assisted in both laboratory sections. The system was evaluated by students using a CMI System Student Questionnaire administered during the third, fifth, and seventh weeks and an additional anonymous form of the questionnaire completed during the ninth week of the semester. The instructional staff completed an Instructor Critique at the end of the seventh week and made recommendations for improvement of the system.

Findings Related to Student Progress During the System Test

Of the forty-three students participating in the system test, forty-two students (97 percent) completed Unit I, assignments 1-7. Six students (13 percent) completed Units I and II, assignments 1-7 and 8-15, and three students (6 percent) continued on into Unit III. During the first few class sessions, several students completed multiple assignments each session. By the second class session, some students had begun to vary the time required to attain mastery of the assignment objectives. And, by the fourteenth class session, students were spread from the sixth to the seventeenth assignment. Of the 507 assignments in which mastery was attained, only two were found in which mastery had been attained on both the pre- and post-tests.
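As noted above under The System Which Was Designed, student scores entered the system as a deck of computer data cards, but the card format itself is not given in the text. Purely as a hypothetical illustration, an 80-column card image for a single test score might have been laid out as follows; every field shown is an assumption.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. CARD-LAYOUT.
      * Hypothetical 80-column card image for one test score:
      * student number, section, assignment, test type (P = pre,
      * T = post), score and possible points, then filler.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  SCORE-CARD.
           05  SC-STUDENT-ID     PIC 9(6).
           05  SC-SECTION        PIC 9.
           05  SC-ASSIGNMENT     PIC 99.
           05  SC-TEST-TYPE      PIC X.
           05  SC-SCORE          PIC 999.
           05  SC-POSSIBLE       PIC 999.
           05  FILLER            PIC X(64).
       PROCEDURE DIVISION.
      * Move a sample card image into the record and echo it.
           MOVE "123045111P009013" TO SCORE-CARD
           DISPLAY "STUDENT "     SC-STUDENT-ID
                   " SECTION "    SC-SECTION
                   " ASSIGNMENT " SC-ASSIGNMENT
                   " TEST "       SC-TEST-TYPE
                   " SCORE "      SC-SCORE "/" SC-POSSIBLE
           STOP RUN.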
An attempt was made to determine the characteristics of students relating to early and late attainment of assignment objectives. Means and standard deviations were calculated for the characteristics of students completing a high, intermediate and low number of assignments. A one-way analysis of variance was conducted for the three groups on all variables. Of the thirteen characteristics measured, "I learn best by working closely and directly with the teacher" was significant at the .05 level of confidence, and "Work experience in the electronics industry related to this course" was significant at the .10 level of confidence.

A post hoc analysis conducted for the characteristic, "I learn best by working closely and directly with the teacher," indicated that the high group was significantly different at the .05 level when compared to the low group. Also, the high group was significantly different at the .05 level when compared to the sum of the intermediate and low groups. A post hoc analysis conducted for the characteristic, "Work experience in the electronics industry related to this course," indicated that the intermediate group was significantly different at the .10 level when compared to the sum of the high and low groups.

Findings Related to Student Reaction to the System

Means and standard deviations for students' feelings toward the system related to threat were calculated for the students completing a high, intermediate and low number of assignments. A one-way analysis of variance was conducted for the three groups on all variables. Of the six variables related to threat, none were significant at the .10 level of confidence. A chi square analysis was conducted to determine if the feelings toward the system related to threat changed over the three questionnaires completed. There was no significant difference at the .10 level of confidence. A chi square analysis was conducted to determine if the feelings toward the system related to threat were different for the questionnaires completed in which students identified themselves and the anonymous form of the questionnaire administered two weeks after the test. Again, there was no significant difference at the .10 level of confidence.

Means and standard deviations for student feelings toward the system related to acceptability were calculated for the students completing a high, intermediate and low number of assignments. A one-way analysis of variance was conducted for the three groups on all variables. Of the four variables related to acceptability of the system, none were significant at the .10 level of confidence. A chi square analysis was conducted to determine if the feelings toward the system related to acceptability changed over the three questionnaires completed. A chi square value of 9.659 was significant at the .05 level of confidence. A chi square analysis was conducted to determine if the feelings toward the system related to acceptability were different for the questionnaires completed in which students identified themselves and the anonymous form of the questionnaire completed two weeks following the system test. There was no significant difference at the .10 level of confidence.
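These repeated analyses report only outcomes. For reference, the statistics presumably computed (the dissertation does not print the formulas, so these are the standard textbook forms) are the one-way analysis of variance $F$ ratio and the chi-square statistic:

$$F = \frac{\mathrm{MS}_{\mathrm{between}}}{\mathrm{MS}_{\mathrm{within}}} = \frac{\sum_{j=1}^{k} n_j\,(\bar{x}_j - \bar{x})^2 / (k-1)}{\sum_{j=1}^{k}\sum_{i=1}^{n_j} (x_{ij} - \bar{x}_j)^2 / (N-k)}, \qquad \chi^2 = \sum \frac{(O-E)^2}{E}$$

where $k = 3$ is the number of groups (high, intermediate, low), $n_j$ and $\bar{x}_j$ are the size and mean of group $j$, $N$ is the total number of students analyzed, and $O$ and $E$ are the observed and expected questionnaire response frequencies.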
An attempt was made to determine the characteristics of students expressing satisfaction and dissatisfaction with the system. Means and standard deviations were calculated for the characteristics of students indicating a high, intermediate and low satisfaction with the system. A one-way analysis of variance was conducted for the three groups on all variables. Of the fourteen characteristics measured, "It is a teacher's responsibility to see that I learn the subject matter of a course" and "I like to figure out how to do a thing by myself rather than be told" were significant at the .10 level of confidence. A post hoc analysis conducted for the characteristic, "It is a teacher's responsibility to see that I learn the subject matter of a course," indicated that the high satisfaction group was significantly different when compared to the sum of the intermediate and low satisfaction groups. A post hoc analysis conducted for the characteristic, "I like to figure out how to do a thing by myself rather than be told," indicated that the combination of weights which produced the significant difference at the .10 level of confidence was too complex to distinguish the characteristic of the groups on this variable.

Findings Related to Staff Reaction to the System

The staff appeared to be quite relaxed and comfortable with the system, as indicated by the statements on the Instructor Critique, "The atmosphere established by the system was one I have long envisioned . . ." and ". . . I was perfectly comfortable and relaxed." The staff expressed enthusiasm for use of a computer-printed report as a method of communication but indicated a need for some modifications of the information provided by the reports. These included the suggestion that lecture information be programmed into the system and generated as "Recommended Lectures" by the system. The staff believed that there was a need for a listing of assignments skipped by students, and it was also suggested that information included on the Laboratory Instructor Report parallel information on the Instructional Manager Report for each section. The instructors suggested that the system has great potential for classroom use. This was indicated by the statement ". . . the system as it stands now is not only workable and very effective, but has lots of possibilities." Both instructors recommended that the system be used as a model for the other electronics courses and suggested that the system be expanded into other technical fields as well.

SUMMARY OF MAJOR FINDINGS

The following major findings are indicated by the data collected during the system test:

1. Given the opportunity to work at their own rate, students varied greatly the time required to attain mastery of the assignment objectives when participating in the course conducted via the computer-managed instruction system. They elected to bypass materials previously mastered, or at least they elected not to re-test on those materials.

2. Students who believed that they learned better when not working closely with the teacher completed more assignments than those who believed that they learned better when working closely with the teacher.

3. Completion or non-completion of course pre-requisites, either electronics or mathematics, did not affect the number of assignments completed by students during the system test. The same was true for completion or non-completion of college English courses, concurrent mathematics courses, and SCAT Verbal, Quantitative and Total scores.

4. Students did not feel threatened by the computer-managed instruction system while participating in the system test.

5. The computer-managed instruction system was acceptable to students as a means of acquiring course information while participating in the system test.
6. The computer-managed instruction system became more acceptable to students as the system test progressed.

7. Those students who believed less that it is a teacher's responsibility to see that students learn the subject matter of a course expressed greater satisfaction with the course conducted via the computer-managed instruction system than those students who believed more that it is a teacher's responsibility to see to it that students learned the subject matter of a course.

8. Completion or non-completion of course pre-requisites, either electronics or mathematics, concurrent mathematics, college English, work experience in the electronics industry or the number of assignments completed did not affect satisfaction of students participating in the course conducted via the computer-managed instruction system during the system test.

9. Conducting the course via the computer-managed instruction system was not threatening to staff during the test.

10. The computer-managed instruction system was acceptable to the staff as a means of conducting the course during the system test.

CONCLUSIONS

The following conclusions are indicated by the data collected during the system test:

1. Given the opportunity to work at their own rate, students will vary greatly the time required to attain mastery of the assignment objectives when participating in a technical education electronics course conducted via the computer-managed instruction system.

2. Students who possess the belief that they learn better when not working closely with the teacher will complete more assignments in a technical education electronics course conducted via the computer-managed instruction system than those who believe that they learn better when working closely with the teacher.

3. Students who do not possess the belief that it is a teacher's responsibility to see that students learn the subject matter of a course will find greater satisfaction with a technical education electronics course conducted via the computer-managed instruction system than those who believe that it is a teacher's responsibility to see that students learn the subject matter of a course.

4. Completion or non-completion of pre-requisite electronics or mathematics courses, concurrent mathematics or college English will not affect the number of assignments completed or the satisfaction of students participating in a technical education electronics course conducted via the computer-managed instruction system.

5. School and College Ability Test Verbal, Quantitative and Total scores will not predict success or satisfaction in a technical education electronics course conducted via the computer-managed instruction system.

6. Conducting a technical education electronics course via the computer-managed instruction system will be acceptable to both students and staff and will not threaten students or staff participating in the course. (However, the success of the system may be affected by the interest and capability of the staff in utilizing an individualized approach for the course in which the system is used.)

7. A management system operated via a computer will assist an instructional staff in the management function of an individualized technical education electronics course, and the system's computer-generated reports will not be threatening to either staff or students participating in the course.

RECOMMENDATIONS

The following recommendations are indicated as a result of the study:

1. The research should be replicated because the system test was conducted for only seven weeks in an effort to develop the system. The research should be conducted using the modified system which was developed as a result of the system test.
2. Research needs to be conducted comparing achievement in technical education courses conducted via the computer-managed instruction system to achievement in technical education courses conducted via the conventional approach.

3. The conflicting responses by the staff involved in the test of the system relative to the projected cost of operating the system suggest a need for research related to the cost of operating the system. Upon refinement of the modified system, the cost of conducting a technical education course via the computer-managed instruction system needs to be compared to the cost of conducting a technical education course via the conventional approach.

4. More extensive research needs to be conducted relative to the completion of course pre-requisites and achievement in technical education courses conducted via the computer-managed instruction system, since these findings were based upon a test of only seven weeks and not an entire course.

5. More extensive research needs to be conducted relative to the capability of the School and College Ability Test scores (Verbal, Quantitative and Total) to predict achievement in technical education courses conducted via the computer-managed instruction system, since these findings were based upon a test of only seven weeks and not an entire course.

6. A different set of prediction variables based upon cognitive style should be used in future research to develop a profile of students finding success and difficulty in technical education courses conducted via the computer-managed instruction system, since the variables used in this study did not predict achievement or satisfaction with the system.

IMPLICATIONS OF THE STUDY

The computer-managed instruction system which was developed and tested appears to be a viable instructional approach to conducting technical education courses at the community college level. The management system is usable by staff, providing assistance with the management function required in an individualized instruction program. As such, the computer-managed instruction system could facilitate an individualized technical education program at the community college level.

However, restraint must be exercised while reviewing the findings of the study, as several factors may have affected the success of the system test. A major contributing factor could have been the personal relationship between the investigator and the instructional staff participating in the system test. The investigator had taught for several years with the two instructors in the department in which the system was tested, and as such, this could have influenced their cooperation in the system test. Another contributing factor could have been the interest of the staff in utilizing an individualized approach for the course in which the system was tested, since they believed that the varied backgrounds and needs of students suggested such an approach. Staff interest in such an approach is indicated by the fact that the instructor who assumed the role of the Instructional Manager developed all of the instructional assignments for the courses on his own time. This required a dedication and interest that went far beyond normal course preparation. As such, the staff which participated in the system test was actively involved in the development of the system.
Their enthusiasm, which was demonstrated by their willingness to put forth extra effort, could have helped establish an atmosphere which could have affected student satisfaction with the system. Therefore, replication in a different environment is necessary to substantiate or disprove student and staff satisfaction with the system.

The following implications are suggested by the system test for future implementation of the computer-managed instruction system:

1. There must be a definite need for an individualized program as indicated by the backgrounds and needs of students. The number of students involved and the breadth of the assignments must be great enough to warrant computer assistance in the management function of such a program.

2. Staff who have an interest in an individualized instructional approach and who are capable of functioning in an unstructured environment should be selected to conduct the course.

3. Staff who have a firm command of the instructional materials to be presented should be selected, since few students will be working on similar assignments.

4. Time and resources must be made available to the staff involved in such a program for the development of instructional materials.

5. An in-service training program should be conducted for the purpose of acquainting staff with the operation of an individualized instruction program and providing assistance in the development of instructional materials.

6. An in-service training program should be conducted for the purpose of acquainting staff with the operation of the computer management system and providing for staff input on the computer-generated report data.
"The Computer and Individualized Instruction," Science, CLXVI (October, 1969), 574-82. C00p, Richard H., and Irving E. Sigel. "Cognitive Style: Implications for Learning and Instruction," Psychology in the Schools, VIII, 2 (1971), 152-61. Coulson, John E. "Computer Assisted Instructional Management for Teachers," AV Communication Review, XIX, 2 (Summer, 1971), 161-68. Flanagan, John C. "Functional Evaluation for the Seventies," Phi Delta Kappan, XLIX (September, 1967), 27-32. "Program for Learning in Accordance with Needs," __hi>szc ology in the Schools, v1, 2 (April, 1969), 133-36. . "The Role of the Computer in PLAN," Journal of Educational Data Processing, VII, 1 (February, 1970), 7-12. Geddes, Cleone L., and Beverly Y. Kooi. "AN Instructional Management System for Classroom Teachers," The Elementary School Journal, LXIX, 7 (April, 1969), 337-45. Hensley, Charles. "Individualized Instruction," School and Community, LVIII, 2 (October, 1971), 32-33+. Johnston, Robert J. "Computers in Education: An IBM Viewpoint," Educational Technology, XI, 12 (December, 1971), 16-18. 138 Kelley, Allen C. "An Experiment With TIPS: A Computer Aided Instructional System for Undergraduate Education," The American Economic Review, LVIII, 2 (May, 1968), 446-57. Nunney, Derek N., and Joseph E. Hill. "Personalized Educational Programs," Audiovisual Instruction, XVII, 2 (February, 1972), 10-15. Ringis, Herbert R. "What is 'An Instructional Package'?" Journal of Secondary Education, XLVI, S (May, 1971), 201-205. Roberts, Arthur D., and Perry Allen Zirkel. "Computer Applications to Instruction,” Journal of Secondary Education, XLVI, 3 (March, 1971), 99-105. Schure, Alexander. "An Accountability and Evaluation Design for Occupational Education," Educational Technology, XI, 3 (March, 1971), 26-37. C. GOVERNMENT DOCUMENTS U.S. Civil Service Commission. Bureau of Training. Co uter Assisted Instruction: A General Discussion and Case tu g. Training Systems and Technology Series: No. , amp et -15. Washington: Government Printing Office, 1971. U.S. Department of Health, Education and Welfare. Office of Education. Vocational Education and Occupations. Washington: Government Printing Office, 1969. D. REPORTS AND PAPERS Bell, Norman T. "Strategies for Computer Applications in Instruc- tion." East Lansing: Learning Systems Institute, Michigan State University. (Mimeographed.) , and Robert B. Moon. "Teacher Controlled Computer-Assisted Instruction." East Lansing: Michigan State University. (Mimeographed.) Bjorkquist, David. What Vocational_Education Teachers Should Know About IndividualizediInstruction. iCoiumbus: Ohio State Universitinenter fer Vocational and Technical Education, Information Series No. 49 (1971). ED 057184. Bloom, Benjamin S. Learning for Mastery, Instruction and Curriculum. Regional Education a oratory for theiCarolinas and’Virgifiia, Topical Papers and Reprints, No. l (1968). ED 053419. 139 Brightman, Richard W. Coast's Practicioners Review Computer— Assisted Instruction. Costa MESa: iCoast—Community College District, 1972. ED 060847. Connolly, John A. A Computer-Based Instructional Mana ement System: The Conwell Agproach. Silver Spring: Amefican Institutes for esear , . 049620. Cronbach, Lee J. "The Two Disciplines of Scientific Psychology." Address of the President at the Sixty-Fifth Annual Convention of the American Psychological Association, September 2, 1957, New York. Dick, Walter, and Paul Gallagher. Systems Concepts and C uter- Mgngged Instruction: An Implementation and Validation Study. Tech. 
APPENDICES

APPENDIX A

INSTRUCTIONAL MANAGER REPORT

[Sample computer printout of the Instructional Manager Report]

APPENDIX E

MODIFIED INSTRUCTIONAL MANAGER REPORT

[Sample computer printout of the Modified Instructional Manager Report]

APPENDIX F

MODIFIED LABORATORY INSTRUCTOR REPORT

[Sample computer printout of the Modified Laboratory Instructor Report]

APPENDIX G

MODIFIED INDIVIDUAL STUDENT REPORT

[Sample computer printout of the Modified Individual Student Report]

APPENDIX H

SAMPLE INSTRUCTIONAL ASSIGNMENT

ASSIGNMENT 11
Parallel Circuit Analysis

Instructions: Listed below are the objectives for Assignment 11, Parallel Circuit Analysis. After reading the objectives, complete the pre-test and have it scored by the clerk. Based upon the results of the pre-test, complete the instructional modes necessary for you to acquire the information in which you are deficient. Obtain the assignment post-test from the clerk. Complete the test and have it scored by the clerk. Based upon the results of the post-test, you will be presented the next assignment.

Objectives: Upon completion of this assignment, you will be able to:

1. Calculate the voltage, current, conductance, resistance and power values associated with a parallel circuit.

2. Calculate the branch current in a parallel circuit using the ratio method.

3. Measure the voltage, current, and resistance in a parallel circuit within 20% of the calculated values.

Instructional Modes: The following instructional modes will provide the information necessary for you to achieve the objectives listed above. The amount of time you spend is not as important as the fact that you acquire the information necessary to achieve the objectives. You may want to use a portion of one, two, or three modes to acquire the information.

Mode 1: Text material: DeFrance, Electrical Fundamentals, Ch. 8, pp. 90-97.

The text material will provide you with information related to voltage, current, resistance (including the special cases) and Kirchhoff's current law in a parallel circuit.

Mode 2: Lecture/small group discussion--announced at appropriate time.

The lecture/small group discussion will cover voltage, current, resistance (including the special cases), power and Kirchhoff's current law as it applies to a parallel circuit.

Mode 3: Laboratory experiment: Experiment 5, Parallel Circuit Analysis, pp. 5-1 to 5-4, Sections B, C, D, pp. 5-6 to 5-7.
7x 33K ohms ""-‘ ohms ohms :« 25VDC + E m R I I E b) Construct the circuit Shown above using the assignment #5 resistor kit, a Simpson 260 VOM, and a low voltage power supply. Experimentally determine the values asked for below. NOTE: 1) The 500ma range is the highest range you will need for this circuit. 2) Be extremely careful when making the meter connections for It and 12. c) Your experimental values in prOblem 13b should fall within 20% of your calculated values in problem 13a. If this is true, have your instructor verify your results. If this is not true, determine where your error is, correct it, and then have your instructor verify your results. Be prepared to repeat any measurements your instructor may request. APPENDIX I STUDENT INFORMATION SHEET 3mm INFORMATIQI SHEET Instructions: Please provide the intonation requested below. It is requested 1. 3. for use in the evolution or the CHI systen. bank you for your cOOperatiom. Ilene Identities tion Ember Age Marital stems Previous electronics courses taken at Cerritos College: (Please indicate grade received where applicable.) IA 10 BL 1.2 EL 1.1 EL 1.26 31. 1.16 Other electronics courses not taken at Cerritos College: 1. ' 2. 3. h.Wone Previous math courses taken at Cerritos College: (Please indicate grade received where amlicable. If you are taking ssth this semester, plesse indicate the course.) RT 1.1 with 21 ar 1.2 Hath 23 M 3.1 Hath 5.1 RT 3.2 with 5.2 Previous English courses taken at Cerritos College: (Please indicate grade received where applicable.) an 1 an 50.1 310 5 E!!! 50.2 Others Work experience in the electronics industry related to this course: Title of poeitiai Washer of hours present]: sorting per week (Include average overtime.) umber of years 169 APPENDIX J CMI SYSTEM STUDENT QUESTIONNAIRE ID Huber CHI srsuw 810W? 0133110.“!!! Instructions: Please circle the odor on the scale following the statuent that best des- eHEes your feelings related to the stat-mt. A response of "l' or "7" indicates that your feelings strongly agree with the tors at the end of the scale, while a “1:" romance repre- “seats a neutral feeling between the terse. the mining embers represent degrees toward the us. Btstaente 1 and 2 contain three scales requiring a response for each scale. State-ants 3-10 contain one scale each requiring one response per statement. here are no "correct' or “incor- rest" responses. any an honest unease is requested. Your response will in no way affect your grade in the course, nor will this information be used to evaluate the instructional staff. It is requested EH to evaluate are CHI systu being tested. bank you for your cooperation. 1. “mile co-unicating with the Instructional W via the Studnt leper-t, I felt: a. lanes 1 2 3 h 5 6 7 Related b. Free 1 2 3 h S 6 7 Constrained c. Anxious 1 2 3 h 5 6 ' 7 Gal- 2. While using the assiu-ents, I felt: a. Island 1 2 3 I: 5 6 7 Moe b. Constrained 1 I 2 3 h 5 6 7 Free e. Unions 1 2 3 I: 5 6 7 Gals 3. The assigumt pre- and post-test results provided adequate information to allow you to proceed without delay: lot at all 1 2 3 I: S 6 7 Very such so h. “The Student Report provided adequate intonation to assist you is progressing through the course without additional instructions from the staff: Not at all i 2 3 I: 5 6 7 run much so 5. I would roccmend this type of learning experience to my friends: Host probably not 1 2 3 I: 5 6 7 Very probably 6. 
6. I prefer to plan my own study schedule rather than have someone else lay it all out for me: Very probably 1 2 3 4 5 6 7 Most probably not

7. Given the opportunity, I would opt out of this experience if I could: Very probably 1 2 3 4 5 6 7 Most probably not

8. It is a teacher's responsibility to see that I learn the subject matter of a course: Most probably not 1 2 3 4 5 6 7 Very probably

9. I like to figure out how to do a thing by myself rather than be told: Very probably 1 2 3 4 5 6 7 Most probably not

10. I learn best by working closely and directly with the teacher: Most probably not 1 2 3 4 5 6 7 Very probably

APPENDIX K

INSTRUCTOR CRITIQUE

Instructions: Please respond to the questions below. This information is required as part of the critique of the CMI system. Please be candid. Use specific illustrations where possible. If you need additional space, please attach additional pages.

Did the management system provide adequate information for you to monitor the progress of students? How might the system be improved to perform this function?

Did the reports provide adequate information to facilitate communication required to operate the system?

What was your reaction to using a computer-printed report for the purpose of communicating with students and staff?

Did you feel that the atmosphere established by the system was threatening to you or to the students involved in the system test? What factors do you believe influenced this?

From your observation, how do you think the students reacted to the system and the method for conducting the course during the test?

How do you feel about the computer-managed instructional system? Would you recommend that it be used as a model for this course and others in the electronics program? Other technical areas?

ADDITIONAL COMMENTS: