USING HIGHLY STRUCTURED, TEACHER-DIRECTED COBOL MATERIALS IN A MICROCOMPUTER-BASED ENVIRONMENT: A DESCRIPTIVE STUDY

By

Lynda Joy Allen

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

Department of Counseling, Educational Psychology and Special Education

1985

ABSTRACT

USING HIGHLY STRUCTURED, TEACHER-DIRECTED COBOL MATERIALS IN A MICROCOMPUTER-BASED ENVIRONMENT: A DESCRIPTIVE STUDY

By

Lynda Joy Allen

Even though schools are attempting to meet the increased demand for computer experiences for students, many K-12 systems have not expanded offerings beyond computer literacy and the study of the programming language, BASIC. A main problem is the inadequacy of teacher preparation for involvement in the computer area, particularly involvement with teaching a programming language. This study examined the belief that highly structured lessons and activities may offer opportunities for teachers to become involved comfortably with a computer language and for students to experience academic success. The focus was on a teacher-directed approach to the instruction of COBOL, the business-oriented programming language.

A descriptive research approach was used in three classroom situations to discover how high school and community college level teachers and their students would react to highly structured, teacher-directed materials designed to teach microcomputer-based COBOL. The teachers had limited COBOL experience and the students had varying computer and mathematics experiences. The major purposes of the study included describing teacher and student reactions to such materials and suggesting a relationship between student characteristics and academic success in COBOL.
Reaction forms and a COBOL achievement test were the instruments used for data collection.

Findings in the area of teacher response suggested that instructional materials which clearly define the role of the teacher as a director of student participation were received positively by teachers with limited COBOL experience. Main academic teaching areas and varied computer experiences did not seem to influence how teachers felt about using such materials.

Findings in the area of student response suggested that overall cognitive response to learning aspects of COBOL with highly structured, teacher-directed materials was very positive and overall affective response was positive.

In the area of predicting academic success in COBOL, findings suggested that computer experience, mathematics experience, and grade point averages did not account for a significant portion of the variance in student COBOL test scores. However, when these three variables were used as the basis for predicting success, only mathematics experience was found to affect the prediction.

ACKNOWLEDGEMENTS

Appreciation is extended to Drs. James L. Page, Perrin E. Parkhurst, and George F. Sargent for their guidance in my doctoral program and to Dr. Norman T. Bell for directing this dissertation and supporting me in my instructional computing endeavors.

Gratitude is also expressed to Dr. Raywin Huang for his assistance with the statistical component of this study, to Jean Eddy for the preparation of the COBOL materials, and to the study's participants, the COBOL teachers and students.

For their unique understanding and acceptance of my ways, I would also like to acknowledge and thank my mother, my husband, Bob, my special friend and word processor, Alma, and my poodle, Nicole.

TABLE OF CONTENTS

LIST OF TABLES

CHAPTER

I.
INTRODUCTION
   Identification of the Problem
   Purpose of the Study
   Research Questions
   Importance
   Limitations and Scope of the Study
   Definition of Terms
   Summary

II. REVIEW OF THE LITERATURE
   Overview
   SECTION ONE
      Opportunities for Teacher Involvement in Teaching Programming Languages
      Teacher Preparation
      Choice of Language to be Taught
      Approaches to Teaching Programming Languages
      Learning Styles of Students
      Predicting Academic Achievement
      Measuring Programming Proficiency
      Summary of Section One
      Relationship of Section One to Study
   SECTION TWO
      Instructional Objectives and Overviews as Preinstructional Strategies
      Sequencing
      Practice and Feedback
      Degree of Structure
      Student Involvement in Learning
      Summary of Section Two
      Relationship of Section Two to Study

III. DESIGN OF THE STUDY
   Introduction
   Nature of the COBOL Environments
   Nature of the Participants
   Overview of Design Description for Research Questions
   FIRST RESEARCH QUESTION
      Instrumentation
      Data Collection Procedures
      Data Analysis
   SECOND RESEARCH QUESTION
      Instrumentation
      Data Collection Procedures
      Data Analysis
   THIRD RESEARCH QUESTION
      Instrumentation
      Data Collection Procedures
      Data Analysis
   Summary

IV. PRESENTATION AND INTERPRETATION OF DATA
   Introduction
   PRESENTATION AND INTERPRETATION OF DATA FOR FIRST RESEARCH QUESTION
      Quantitative Presentation of Findings for Cognitive Student Response
      Interpretation of Findings for Cognitive Student Response
      Quantitative Presentation of Findings for Affective Student Response
      Interpretation of Findings for Affective Student Response
      Quantitative Presentation of Findings for Relationship Between Cognitive and Affective COBOL Measures
      Interpretation of Findings for Relationship Between Cognitive and Affective COBOL Measures
      Rationale for Describing Students Who Scored the Highest and the Lowest on the COBOL Test
      Qualitative Presentation of Findings for Highest Scoring Students
      Quantitative Presentation of Findings for Highest Scoring Students
      Interpretation of Findings for Highest Scoring Students
      Qualitative Presentation of Findings for Lowest Scoring Students
      Quantitative Presentation of Findings for Lowest Scoring Students
      Interpretation of Findings for Lowest Scoring Students
      Conditions Affecting Data Interpretation for First Research Question
   PRESENTATION AND INTERPRETATION OF DATA FOR SECOND RESEARCH QUESTION
      Qualitative Presentation of Findings for Teacher Response
      Quantitative Presentation of Findings for Teacher Response
      Interpretation of Findings for Teacher Response
      Conditions Affecting Data Interpretation for Second Research Question
   PRESENTATION AND INTERPRETATION OF DATA FOR THIRD RESEARCH QUESTION
      Presentation of Findings for Predicting COBOL Success
      Interpretation of Findings for Predicting COBOL Success
      Conditions Affecting Data Interpretation for Third Research Question
   Summary

V. SUMMARY, CONCLUSIONS, AND IMPLICATIONS
   Summary
   Perspective on Conclusions
   Conclusions Associated with First Research Question
   Conclusions Associated with Second Research Question
   Conclusions Associated with Third Research Question
   Recommendations for Decision Makers
   Recommendations for Further Study

APPENDIX

A.
SAMPLE STUDENT COBOL MATERIALS
B. SAMPLE TEACHER COBOL MATERIALS
C. COBOL TEST
   ANALYSIS OF COBOL TEST CONTENT
D. STUDENT REACTION FORM
E. TEACHER REACTION FORM

BIBLIOGRAPHY

LIST OF TABLES

1. Distribution of Points on COBOL Tests and Overall Test Scores for Group 1 (Business-oriented High School Students)
2. Distribution of Points on COBOL Tests and Overall Test Scores for Group 2 (Community College Students)
3. Distribution of Points on COBOL Tests and Overall Test Scores for Group 3 (High School Students with BASIC Experience)
4. Central Tendency and Dispersion Measures for COBOL Achievement
5. Mean Attitudinal Measures for COBOL Students
6. Cognitive and Affective Measures for Group 1 (Business-oriented High School Students)
7. Cognitive and Affective Measures for Group 2 (Community College Students)
8. Cognitive and Affective Measures for Group 3 (High School Students with BASIC Experience)
9. Student Achievement Versus Student Attitude
10. Summarization of Mean Attitudinal Responses From Top Scoring COBOL Students
11. Summarization of Mean Attitudinal Responses From Lowest Scoring COBOL Students
12. Summarization of Specific Teacher Attitudinal Responses
13. Mean Attitudinal Measures for the Three COBOL Teachers
14. Personal Student Data for Group 1 (Business-oriented High School Students) and Respective COBOL Test Scores
15. Personal Student Data for Group 3 (High School Students with BASIC Experience) and Respective COBOL Test Scores
16. Statistical Significance of Variables Deleted from Full Model Regression Equation

CHAPTER I

INTRODUCTION

Opportunities to teach computer-related courses, particularly at pre-college levels, are growing
alarmingly faster than levels of faculty expertise. Few computer scientists are joining pre-college institutions, where instructors and existing staffs are attempting, and sometimes struggling, to become knowledgeable about computers (Calkins, 1981). Of particular interest at these pre-college levels is the teaching of programming languages with microcomputers. Expanding microcomputer course offerings beyond the teaching of BASIC, the most common programming language, is becoming inevitable at a pre-college level. This expansion has already begun with the selection of Pascal as the computer language to be studied in high school Advanced Placement courses.

To assist administrators and teachers in undertaking further expansion of course offerings, this study focused on the teaching of one of the computer languages, COBOL, a business-oriented language, using microcomputers and highly structured, teacher-directed materials. Using instructional materials which clearly defined the role of the teacher as a guide and director of student participation, this study examined how helpful such structure was to COBOL teachers and students.

Identification of the Problem

With a decrease in size and cost of computers comes increased proliferation in their use and, consequently, there is a general acceptance that computer experiences must become an integral part of an individual's education (Vensel, 1984). Whereas it was previously felt that computer literacy or computer awareness was an appropriate topic for colleges and some high schools, elementary schools are now attempting to use computers in problem solving, and it is common to see very young children comfortably interacting with microcomputers. Computer literacy, often described as an understanding of how computers aid problem solving, seems incomplete without some knowledge of the most commonly used language for microcomputers, BASIC.
Elementary school children are being introduced to BASIC concepts, and instructional staffs at middle schools are beginning to assume responsibility for the refinement and expansion of these concepts. High schools must become more responsive to pressure from parents and students to provide a continuation of computer-related courses. Logical considerations include the study of other high-level programming languages such as COBOL, FORTRAN, or Pascal.

School districts must contend with the fact that computer educators will often need to be developed within existing staffs. Educators skilled in teaching programming languages are exiting the teaching profession and joining others who once were potential computer educators in better paying positions (Luehrmann, 1983). In addition, tenured teachers, forced to explore options as a result of curricular changes, appear willing to help fill the personnel void in educational computing. Thus, as these teachers prepare to become computer educators, the element of instructional support appears most critical.

Few educators deny the potential importance of computers in education, but an overwhelming majority of teachers feel inadequately prepared to utilize this technology effectively (Norris, 1984). Even if these teachers are reimbursed for studying a particular computer language or given the opportunity for a sabbatical leave, the transfer of knowledge acquisition to classroom application continues to be a most difficult one. Instructional materials surface as a main concern even for those teachers who view the teaching of a programming language as stimulating and challenging and welcome the involvement with a new subject area. Highly structured lessons and activities may allow teachers a focus point in the classroom while they further explore the area of computer science for an overall understanding of its potential. The security one experiences in a particular academic teaching area is often gained slowly.
Giving structure to classroom experiences through the use of such instructional materials may provide an initial framework for student and teacher success as some teachers strive to eliminate temporary deficiencies in their computer knowledge.

Purpose of the Study

This study had two general purposes and three specific purposes. In general, this study was directed to promote thinking about the inclusion of COBOL and/or other programming languages in high school curricula and to offer detailed information about a particular approach to COBOL instruction to school systems desiring to expand microcomputer offerings to include COBOL. Specifically, the three major purposes of this study were to: 1) describe student reactions to highly structured materials designed to teach microcomputer-based COBOL, 2) describe teacher reactions to such materials, and 3) suggest a relationship between student characteristics and academic success in COBOL. Two high school classes and one community college-level class were selected for analysis of student achievement, student/teacher attitudes toward the materials, and a relationship between student characteristics and achievement.

Research Questions

The following three research questions were formulated on the basis of the major purposes of the study:

1. How do students respond, both cognitively and affectively, to learning COBOL with highly structured materials and microcomputers?

2. How do teachers with varying computer and teaching experiences react to the use of such materials in a microcomputer-based instructional environment?

3. What is the relative importance of computer experience, mathematics experience, and grade point average when they are used to predict student success in a highly structured microcomputer-based high school COBOL class?
Importance

This study is important because it acknowledged the relevancy of microcomputer-based COBOL and attempted to establish a relationship between a highly structured approach to teaching COBOL and instructional effectiveness. Even though computer languages have undergone significant changes and improvements, there remains a strong adherence to the belief that COBOL is the most appropriate computer language for business applications (Shelley, 1977). The increasingly strong demand for COBOL programmers emphasizes the continued widespread use of COBOL. Applications such as payroll, billing, and inventory control continue to be developed in COBOL. Previously, a programmer was only expected to write a set of instructions to direct a computer to process data in a way that would solve a given problem. However, since the mid-1970s, a COBOL programmer's task has expanded to include developing programs which are easier to maintain and modify. The instructional materials used in this study focused upon the solutions of business-related problems with added emphasis on the "style" of programming. One important element of this study, then, is its relevancy.

Until recently, COBOL had been associated with large, mainframe computers, and teaching COBOL necessarily implied the utilization of a large computer system. Now, COBOL compilers are available for microcomputers and, as memory decreases in price, a logical extension of course offerings includes the teaching of COBOL using microcomputers and related instructional materials. Expanding or obtaining microcomputer systems for this purpose could eliminate ongoing costs to access a large computer system.

Studies of the relationship between course design and instructional effectiveness in the area of computer programming are scarce. However, it is important for school systems to consider this relationship when planning to teach programming languages such as COBOL. Insufficient data exist to assist with this planning.
This study provides a framework to work from as it begins to delineate the problems with one particular course design and its effectiveness. The researcher chose a course design which was microcomputer-based and had appeal for teachers who are still developing their own COBOL skills. Classroom activities are highly structured and teacher-directed and may offer support and security to those whose knowledge of COBOL is limited. Besides determining the effectiveness of the chosen materials as a tool to use in microcomputer-based COBOL instruction, the study offers a guideline for exploring other course designs, suggests alternatives for exploration, and employs an evaluation means for analyzing effectiveness.

Limitations and Scope of the Study

There are two basic limitations to this study. It is felt that neither of these is so severe as to invalidate the study, but it is important to note these limitations carefully.

First, the descriptive nature of this study limited generalizations of findings. The limiting factors of considering only three COBOL classes and one mode of instruction provided tentative relationships, at best. Since most high schools are not currently teaching COBOL, two of the classes for consideration in this study were selected from progressive school districts which valued the purposes of this study. The responsibility for applying findings to other educational situations is placed upon the reader. Detailed descriptions and direct quotations from the three COBOL classes will assist the reader with this responsibility.

A second limitation arose from the heavy reliance on the researcher's judgement in a study of this type. The researcher actively participated in the study as the guide of two of the three teachers with limited COBOL knowledge and as the primary designer of the student COBOL achievement test and the teacher and student reaction forms.
This descriptive study, then, necessarily displays a large element of subjectivity and intuition in the measurement of various variables. Although the study seems considerably limited in scope, it is expected to provide an impact which will justify its existence. As schools are infusing computer literacy objectives into existing curricula, this study will initiate concern for the preparation of expansion and redefinition of computer-related programs, particularly the teaching of computer languages with highly structured materials. Since the specific hypotheses generated in this study may be submitted to experimental testing, further assistance to schools in their expansion and redefinition tasks may result.

Definition of Terms

Cognitive learning style - problem-solving methodology employed by an individual in a decision situation

Compiler - a computer program for translation of instructions expressed in a particular user language (e.g., COBOL) into a target machine language (i.e., binary numbers that the machine actually uses)

Computer literacy - an understanding of how computers aid problem-solving in any discipline

Formative evaluation - evaluation used to improve a curriculum during its development. Deficiencies and strengths of intermediate versions of the curriculum are identified and appropriate adjustments are made.

Hardware - the physical components that make up a computer system; the machinery as opposed to the programs which are run on the machinery

High-level language - a problem-oriented, English-based computer language that consists of a set of vocabulary words, rules of usage and syntax, and rules of grammar.
BASIC - (Beginner's All-purpose Symbolic Instruction Code) a widely used high-level language for teaching with or about the computer in educational environments

COBOL - (COmmon Business-Oriented Language) a high-level language used primarily for business applications

FORTRAN - (FORmula TRANslator) the first high-level language, used primarily in scientific applications where algebraic operations are emphasized

Pascal - a high-level language with an established overall form that allows a program to be written in an efficient and straightforward manner. Pascal is not an acronym; its name is a tribute to the seventeenth-century mathematician, Blaise Pascal.

Mainframe computer - a large, expensive computer which can simultaneously be accessed by many users

Microcomputer - a computer that utilizes a microprocessor; often referred to as a home or personal computer

Processor - the primary component of a computer system which is responsible for manipulating information

Program - a sequence of instructions that permits a computer to perform a process

Structured materials - instructional materials organized so that teachers guide and direct student participation in meeting prescribed objectives. The materials, in the form of student manuals, offer suggestions for activities to reinforce and apply learning and feedback in the form of quizzes for each lesson.

Structured programming - a style of programming which emphasizes clarity and consists only of sequential straight-line segments and no GO TO branching statements

Summary

This study responded both to the demand that pre-college computer experiences be expanded and to the problem that teachers feel unprepared for computer involvement. Specifically, this study examined the effect that highly structured, teacher-directed materials had on teachers and high school and community college level students as they interacted with microcomputers and the programming language, COBOL.
The main purposes of the study included describing reactions to these materials and suggesting conditions for their effective use. Even though generalizations were limited because of the descriptive nature of the study, the relevancy of microcomputer-based COBOL was acknowledged and an attempt was made to establish a relationship between a highly structured approach to teaching COBOL and its impact on teachers and students.

CHAPTER II

REVIEW OF THE LITERATURE

Overview

The literature review focuses on 1) issues in teaching a computer programming language and 2) elements of effective lesson design. Both areas are particularly meaningful as an attempt is made to understand the implications of using highly structured materials and microcomputers to teach a programming language. The review reflects the somewhat specific nature of the issues associated with teaching a computer language while including a more general discussion of the elements of effective lesson design.

SECTION ONE

In the first section of the review, several issues associated with teaching a programming language are considered, including opportunities for teacher involvement, teacher preparation, the choice of language to teach, approaches to teaching programming languages, learning styles of students, the prediction of academic success in programming, and the measurement of programming proficiency.

Opportunities for Teacher Involvement in Teaching Programming Languages

When considering issues in teaching a programming language, opportunities for teacher involvement are a main concern. Opportunities for teachers to become involved at and beyond the level of computer awareness seem plentiful. Daniel H. Watt (1981) emphasizes that incorporating computer literacy objectives into existing curricula is only the first step to effective utilization of computers in educational settings.
He suggests that it is critical for curriculum designers to analyze their curricular offerings and not allow themselves to be content with merely exposing students to computer capabilities. The belief that programming is an advanced topic to be studied in high school or college is one which Watt feels will contribute to taking the control of the computer out of the educational realm.

Again emphasizing the opportunities for teacher involvement, Calkins (1981) points out that there is a shortage of computer-literate teachers and teachers who are capable of teaching programming languages. Teachers who have become knowledgeable about computers and have learned to program are finding positions in industry where their knowledge and ability to explain concepts are valued. The lure of industry involvement, private consulting, or software development leaves the educational computer area heavily lacking in qualified personnel.

Despite these apparent opportunities for teacher involvement in teaching a programming language, teachers appear somewhat insecure about their abilities to use computers in their classrooms. The results of a survey issued by North Texas State University and cited by Cathleen M. Norris (1984) showed that educators' attitudes toward educational computing were, in general, highly positive. The majority of educators surveyed agreed that computers are valuable tools that can be used to improve the quality of education and that teachers should know how to use computers in their classrooms. However, when these educators were asked to indicate whether they would like to have computers in their own classrooms, the proportion of educators expressing agreement dropped significantly.

In general, then, while it seems that there are opportunities for teachers to become involved in teaching computer programming languages, teachers do not appear confident enough to interact with students on computer-related topics at an appropriate level of expertise.
Teacher Preparation

A second important issue related to the teaching of programming languages is the preparation of teachers for computer involvement. When teachers begin to interact with computers, Hedges (1981) contends that the main problem is not one of how to teach computing but rather one of how to teach adults. His suggestions for working effectively with adult learners include guaranteeing experiences with success, avoiding emphasis on speed, utilizing peer instruction, providing periodic sharing, and emphasizing the purposes of the computer-related activities.

In addition to addressing the problem of how to teach adults, Dennis (1980) realistically points out that the training levels of school personnel cannot rapidly change and that we cannot assume that teacher preparation for computer involvement will be unique. Even though he is concerned about the consequences of under-trained computer instructors, he realizes that teachers are responding as well as they can to pressures that they do something to provide young people with computer experiences. Of particular significance is his observation that school personnel are criticized for entering a field for which they are inadequately prepared, but would be more severely criticized if they were to ignore involvement in the computer area.
Even though some of these issues are currently and actively undergoing examination and are in the process of being resolved, the lack of effective computer-related curricular materials seems to further inhibit the effectiveness of pre-inservice and inservice activities. The quality and availability of curricular materials required to support uses of computers in learning was of utmost concern to Hunter (1975) and seems to continue to be a concern, especially in pro- gramming instruction. Hunter stressed that the lack of adequate support materials was the most critical problem in educational computing. In summary, then, it appears that the lack of teacher preparation in the computer area will not be unique and that teachers will be thrust into computer situations without adequate training and instructional support. l4 Choice of Language to be Taught Another important issue relates to the selection of the computer language to be taught. Concern with the limitations of the computer language, BASIC, is prompting thorough examination and analyzation of other programming languages. Two of the limitations of BASIC include the existence of many dialects of BASIC and the ease with which one can write an error-free program which is difficult to follow or modify. Other languages, besides COBOL, seem to offer much promise for use in education and Pascal, LOGO and FORTH are mentioned as considerations (Prentice, 1981). Pascal requires that rules of good programming be followed and, consequently, it is difficult to write a poorly organized program. Because of this feature, Pascal is often referred to as the most structured language and is steadily increasing in popularity. LOGO, developed at MIT, was designed to allow young- sters to easily interact with computers. The language was deveIOped for ease in Operation by utilizing rules and commands which are said to be intuitively understood by children. 
Seymour Papert, who studied with Piaget, expresses the importance of being able to write interesting programs with minimal instruction and reacts positively to teaching LOGO in our schools. FORTH, developed eleven years ago by Charles Moore, stresses giving the user complete access to the machine and encourages the user to experiment. The modular nature of FORTH lends itself nicely to the production of error-free application programs.

Bradley (1981), however, feels that COBOL has the potential for becoming the primary language among microcomputer users. The fact that COBOL already is the standardized business language for large computer systems and allows easily understood and quickly modified programs to be written may support Bradley's belief. Ideally, he recommends that an industry committee be formed to standardize a language for microcomputers. More than twenty years ago, COBOL was defined and developed at the Conference on Data Systems Languages (CODASYL), and the English-based language is still used on almost every major large computer in the country which performs business tasks. The success of establishing this language would seem to suggest much merit to Bradley's recommendation. He seems to be suggesting that a refined version of COBOL would have the best chance of being accepted as the standard language for microcomputers.

Bradley feels there is a natural connection between COBOL and microcomputers. Tandy Corporation, for example, has a COBOL compiler which is software-based and successfully runs on their microcomputers with no hardware modifications. Although some of the commands in standard COBOL are not found in the Radio Shack compiler, the TRS-80 version of COBOL gives mainframe computing capabilities to microcomputers. The COBOL documentation, according to Bradley, is well written and provides an excellent resource for learning the rules of the language.
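The English-like quality claimed for COBOL can be illustrated with a short fragment. The following is a hypothetical sketch only; the program name, data names, and values are invented for illustration and are not drawn from the study's materials:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. GROSS-PAY-SAMPLE.
      * Hypothetical fragment illustrating COBOL's English-like,
      * self-documenting style. All names and values are invented.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  HOURS-WORKED   PIC 9(3)    VALUE 40.
       01  HOURLY-RATE    PIC 9(3)V99 VALUE 10.50.
       01  GROSS-PAY      PIC 9(5)V99.
       PROCEDURE DIVISION.
       COMPUTE-PAY.
           MULTIPLY HOURS-WORKED BY HOURLY-RATE GIVING GROSS-PAY.
           DISPLAY "GROSS PAY IS " GROSS-PAY.
           STOP RUN.
```

Even a reader with no programming background can follow the intent of each statement, which is the self-documenting quality at issue.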
In addition, the structure of the COBOL language has desirable features for personal computing. Even though COBOL has been criticized for being an excessively wordy language which consumes large amounts of a programmer's time, Bradley argues that the wordy aspect of COBOL is actually a strength because the language possesses the potential for being self-documenting. With respect to the time element, the initial time invested in writing COBOL programs seems worth the effort since COBOL programs can easily be read and quickly understood by others. Because COBOL is the most English-like of all the computer languages, some feel it is easier to learn than other languages (McClure, 1980); the English-language syntax and rigid format of COBOL seem to assist the learning process for COBOL students. Data storage and retrieval capabilities and extensive formatting features further support the idea that COBOL is an excellent choice for the business which owns a microcomputer. McClure also points out some of the potential drawbacks of microcomputer-based COBOL. The requirements of large memory and mass storage space are expensive realities. Of significance, too, is the way in which small computer systems must process data. Small amounts of data are processed and mass storage devices such as cards or magnetic tapes are not used. Data items are usually processed as they are entered and a report is either generated immediately or the information is stored for later use. The data on small systems are often obtained directly from the operator at a computer and require more human involvement and time as the data are transferred to and from a console. In summary, then, it seems that several computer languages, besides BASIC, have desirable features which ought to be learned by students. COBOL, in particular, appears to have more advantages than disadvantages when viewed as a potential primary language among microcomputer users.
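The self-documenting quality attributed to COBOL can be illustrated with a short fragment. The program name, data names, and paragraph name below are hypothetical, chosen only to show how the English-like syntax reads; they are not drawn from the study's instructional materials.

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. GROSS-PAY-SAMPLE.
      * A hypothetical fragment illustrating the English-like,
      * self-documenting style of COBOL.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  HOURS-WORKED   PIC 9(3)    VALUE 40.
       01  HOURLY-RATE    PIC 9(3)V99 VALUE 8.50.
       01  GROSS-PAY      PIC 9(5)V99.
       PROCEDURE DIVISION.
       COMPUTE-GROSS-PAY.
           MULTIPLY HOURS-WORKED BY HOURLY-RATE
               GIVING GROSS-PAY.
           DISPLAY "GROSS PAY IS " GROSS-PAY.
           STOP RUN.
```

Even a reader unfamiliar with the language can follow the arithmetic being performed, which is the quality Bradley and McClure point to when describing COBOL programs as easily read and quickly understood.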
Approaches to Teaching Programming Languages

A fourth issue associated with teaching programming languages deals with approaches to instruction. Because computer students vary so much in background, motivation, expectations, and problem-solving skills, the instructional process associated with teaching a programming language is complex. Designing a programming course is difficult and, often, course offerings do not seem to meet the needs of the majority of students. Besides the variations in mathematics backgrounds and computer experiences, the composition of one programming class may range from students with no intention of programming in the future to those who plan careers directly related to computers. Thus, Singhania (1980) stresses that any strategy which can reduce the variance in student population is of utmost importance in making a programming course meaningful and enjoyable for the teacher and the students. It is agreed that more research needs to be conducted to establish the effectiveness of various approaches to programming instruction. Empirical and theoretical studies of the relationship between course design and instructional effectiveness are extremely scarce and much of the literature focuses on better defining important concerns. Most writers seem to agree that structured programming is desirable to teach students of all types. It is strongly felt that structured programming produces better understood programs which are easily modified. Beyond this consensus, only suggestions for approaching programming instruction are offered and it is emphasized that their effectiveness needs to be tested. Group techniques are offered as one approach to teaching programming languages.
Kimura (1979) suggests that students need to read a large number of sample programs before they are asked to write their own, Weinberg (1971) recommends that students read and check each other's programs and that the classroom environment encourage such activities, and Yourdon (1975) and Cherniak (1976) advocate a team approach to evaluating programs. Such group discussions, it is felt, are interesting and motivating for students and prepare them well for computer-related tasks. Lemos (1977) showed in an empirical study that team interaction is an effective technique for introducing COBOL to students. He also noted that some students do not react well to forced group activity and suggests further study of group techniques as programming teaching tools. Self-instruction, another approach to teaching programming languages, is mentioned in the literature as an area which needs to be further investigated. Singhania favors breaking a programming course into segments with students assuming responsibility for finishing the segments within a given time. The role of the teacher would be to monitor each student's achievement and accomplishments. With this type of instruction, he feels that fast learners will welcome the opportunity to work ahead and that only those who pace themselves too slowly will not fully benefit from self-paced instruction. In conclusion, then, it appears that there is, as yet, no one best approach to teaching programming languages and that there is a need for continuing research in this area.

Learning Styles of Students

A fifth issue concerning the teaching of a programming language examines learning styles of students. A study by Cheney (1980) suggests a relationship between cognitive style and student programming ability. The study focused on analytic problem solvers, those utilizing a structured approach to decision making, and heuristic problem solvers, who approach decision making intuitively.
The heuristic approach is characterized by trial-and-error methods with feedback causing adjustments in the approach. Results of Cheney's investigation show that analytic decision makers tend to perform better on programming exams than heuristic decision makers. Conclusions support the idea that different teaching techniques should be used with students varying in cognitive styles as they learn a programming language. At minimum, Cheney suggests that analytic thinkers be allowed to progress at their own rate as they learn a language and heuristic thinkers be offered a more structured and formal learning environment where one program at a time is brought to the students' attention. Mock, Estrin and Vasarhelyi (1971) also found that analytics out-perform decision makers who are heuristic. Both studies seem to indicate that mathematics ability, the traditional predictor of programming success, should be viewed as only one of the characteristics relating to programming ability. So then, with respect to learning styles of students, it seems that there is a relationship between cognitive style and programming ability and that teaching techniques should be designed to serve both analytic and heuristic problem solvers.

Predicting Academic Achievement

The prediction of academic achievement is another important issue associated with teaching programming languages. The problems encountered when trying to identify computer programming aptitude and to predict programming success are becoming more important as the demand for programming classes continues to grow. It is common, particularly in business and industry, to predict success in programming using the IBM Programmer Aptitude Test. A recent study (Mazlack, 1980) found the test to be an unreliable predictor of success. Petersen and Howe (1979) found the General Aptitude Test Battery (GATB) to indicate that general intelligence and numerical and spatial reasoning were the best predictors of success in programming.
Gray (1974) agreed with the findings of Petersen and Howe but felt that none of the predictors should be used to determine acceptance or rejection in a program. A conclusion from the available studies in this prediction area is that it is very likely that students who are successful overall in high school and perform particularly well in mathematics and science will excel academically in computer programming. In summary, it seems that overall intelligence and mathematics and science abilities are the most likely indicators of programming success.

Measuring Programming Proficiency

A final issue is concerned with measuring programming proficiency. Lemos (1980) has most significantly and impressively dealt with programming issues and offers much information in the proficiency area. Three dimensions of proficiency are focused upon in his studies and include knowledge of language rules (grammar), ability to read programs (reading), and the ability to write logically and grammatically correct programs (writing). Two of his studies on learning a programming language show a direct relationship between the ability to read programs and the ability to write programs. The result has special significance for language teachers. Measuring student ability to read programs is far less time-consuming than evaluating student ability to write programs. The importance of measuring proficiency is considered by Lemos and reasons such as helping instructors determine grades, identifying learning obstacles, and providing feedback to students are given. The ideal approach to measurement, according to Lemos, is to accurately measure proficiency with utmost efficiency time-wise on the instructor's part. To test proficiency in the grammar dimension, he suggests offering students true-false questions, multiple choice questions, or fill-ins.
For the reading dimension, he recommends that students be asked to show their comprehension of written programs by describing and determining a program's output. Writing a program which is correct grammatically and logically is considered essential in measuring proficiency but, as Lemos indicates, it represents a very time-consuming activity and consistent grading is difficult. Lemos concludes that the ways of developing proficiency can vary greatly instructionally, but that the measurement should ultimately include the three previously mentioned dimensions. Relationships among these dimensions are discussed with the strong recommendation that further formal research be conducted in this area. Studies show that evaluation problems are greatest when measuring writing proficiency and least for measuring grammar proficiency. Other conclusions are that measuring writing is most effective and measuring grammar knowledge is least effective, that there is no relationship between knowledge of grammar and ability to write error-free programs, and that a moderate relationship exists between reading and writing ability. Thus, regarding the measurement of programming proficiency, the literature suggests that effective evaluation should include elements of grammar, reading, and writing.

Summary of Section One

The first section of the literature review addressed issues associated with teaching a computer programming language. Some of the main findings suggest that there is a shortage of teachers who are capable of teaching programming languages, that inadequately prepared teachers will be required to teach programming languages, and that COBOL is viewed as a potential primary language among microcomputer users.
Other findings indicate that there is no one favored approach to teaching programming languages, there is a relationship between cognitive learning style and student programming ability, the best predictors of programming success seem to be overall intelligence and mathematics and science abilities, and that evaluation of programming success should measure grammar, reading, and writing abilities.

Relationship of Section One to Study

The literature review prompted the formulation of this study and seems to suggest that there is merit to the study's intent. Effective programming instruction is a main concern as the interaction among teachers, students, and instructional materials is considered. This study responded to the findings of the literature review by securing teacher and student response to highly structured COBOL materials to determine their place in the educational environment. It was felt that use of highly structured materials can have an effect on the feelings of insecurity teachers experience as they consider becoming a programming language teacher. But, even if such materials help a teacher's confidence level when initially teaching a programming language, the effect on varying types of students, both academically and attitudinally, is important. The literature recommends more research in various approaches to programming instruction and this study examined students' mathematical experiences, computer experiences, overall grade point averages, cognitive learning styles (preferred ways of obtaining and processing information), and attitudes as they relate to programming success with highly structured materials.

SECTION TWO

In the second section of this review, consideration is given to those elements of effective instructional materials most pertinent to this study. These areas include instructional overviews and objectives as preinstructional strategies, sequencing, practice and feedback, degree of structure, and degree of student participation.
Instructional Objectives and Overviews as Preinstructional Strategies

When considering elements of effective lesson design, preinstructional strategies are of prime importance. Yet, the results of studies on preinstructional strategies have not as yet yielded reliable generalizable effects (Bovy, 1981). Intuitively, it would seem that the inclusion of instructional objectives in instructional materials is beneficial and, yet, research does not unequivocally support this position. Duchastel (1973) points out that research does exist which suggests that objectives sometimes are helpful in facilitating learning and that more favorable attitudes toward learning are present when students are exposed to them. Because studies indicate that instructional objectives are almost never harmful to students, Duchastel recommends that they be made available in a course of study. Specifically, he stresses that objectives may help highlight relevant content, provide a general structure to content, and assist students to better organize their time. Other research studies indicate that instructional objectives do appear useful, that they do affect learning, and that they are most suitable when used to preface long periods of instruction. Research findings support the beliefs that general objectives are as effective as highly specific ones, that middle ability students benefit most from instructional objectives because of anxiety reduction, that objectives are useful in connection with posttest situations stressing knowledge and comprehension, and that they are most advantageous in traditional teaching situations (Hartley, 1976). Because instructional cues focus attention on critical parts of instruction, Levin (1981) generalizes that students who are given information about instructional objectives prior to their learning remember the learning material better than students who are told nothing about objectives. Numerous studies are cited which lend support to this statement.
Levin is in agreement that objectives help students organize their thinking and direct attention to relevant material and processes. Overviews, too, seem to be an element of effective lesson design. The research on overviews is scanty but the value of summarizing what is to be accomplished prior to instruction seems intuitively obvious. Studies show that overviews used prior to instruction have positive effects upon learning and retention and are most beneficial when factual information is presented to low ability students or when concepts and principles are presented to high ability students. Familiarizing students with their task and preparing them for the general structure of the material to be learned would seem likely to facilitate learning of meaningful material. Overviews, like instructional objectives, serve as a reference point for what is to be learned and what is already learned and increase the clarity of the new material to aid in retention (Hartley, 1976). Thus, it appears that instructional objectives and overviews, as summarizations of what is to be learned prior to instruction, are valuable elements of effective lesson design. Because they provide some structure to content, overviews and objectives seem to influence learning positively.

Sequencing

The importance of arranging instructional activities sequentially is noted frequently in the literature on effective lesson design. Bloom, Bruner, Piaget, and others emphasize cumulative levels of thinking and the ineffectiveness of asking students to think in a processing and/or output sense without sufficient experiences to accumulate data (Costa, 1981). Gagné refers to curriculum as a sequence of units arranged in such a way that the learning of each unit may be accomplished in a single entity. His assumption is that the capabilities described by prior units have been developed.
Gagné's studies on experimenter-determined sequencing of instruction in accordance with hierarchies of competence seem to offer the most useful information. Results from his studies indicate that arranging units of instruction in accordance with the order in which competencies need to be acquired has more of an impact on criterion scores than do other characteristics of the instruction such as number and type of examples. The consistency of his findings suggests that when a task can be analyzed into a hierarchical structure, sequencing the instruction accordingly results in more effective learning and transferability (Briggs, 1968). Even though common-sense logic seems to underlie the task of sequencing events within a lesson, Briggs and Gagné (1974) are specific about the role of sequencing in effective lesson design. When designing instructional materials for a lesson, Briggs and Gagné recommend that events of instruction occur in an approximate order. The ordered events include gaining attention, informing the learner of the lesson's objectives, stimulating recall of prerequisite learnings, presenting stimulus material, providing learning guidance, eliciting performance, providing feedback about performance correctness, assessing performance, and enhancing retention and transfer. In summary, then, sequencing, a critical component of instructional design, appears to influence the effectiveness of instructional materials and to be an aspect of instruction which is fairly easy to control.

Practice and Feedback

Two other elements of effective lesson design are practice and feedback. Davis (1974) offers practical suggestions for optimally utilizing these two elements of design. Effective instructional materials to be used for teaching concepts must provide students with advanced practice only after they can discriminate between examples and nonexamples. Initially, a student should be presented with simple and obvious examples in practice situations.
When a student responds in practice, feedback should immediately be given as to his correctness. Praise and relevant information should be included as feedback when a student responds correctly. When a student is incorrect, he should be given the opportunity to try again after his error is pointed out and explained. Practice and feedback are also important elements of instructional materials designed to teach principles. After a principle has been formulated, students should be given practice opportunities to apply the principle, particularly as was indicated in an objective. This practice, with appropriate feedback, is essential for helping students to see the relevance of their learning and increasing their motivation. The importance of continuous feedback for students, as it relates to effective lesson design, is stressed by Davis for learning all types of skills. Feedback, to be most effective, must inform a student of his progress toward established objectives and should be available during learning. Feedback during practice allows errors to be identified and corrected before student achievement of objectives is measured in the form of a test. Feedback for knowledge of facts, concepts, and principles could be in the form of printed answers to practice quizzes reinforced by teacher explanation, while complex problem-solving feedback may be in the form of a demonstration by an expert or debriefings. A special study by the International Association for the Evaluation of Educational Achievement revealed that feedback was one of the three most important factors in the classroom which affected student achievement, interests, and attitudes (Levin, 1981). Thus, with respect to effective lesson design, it seems that practice, accompanied by prompt, appropriate, and continuous feedback, needs to be available to students during learning.
Feedback, in particular, seems to positively affect student achievement and attitude as it offers a way to monitor progress toward meeting established objectives.

Degree of Structure

A fourth element to consider is the degree of structure in lesson design and how it affects a learning environment. This part of the literature review focuses on the effectiveness of using instructional materials which demand active student involvement and are referred to as highly structured because student involvement is teacher-directed and because teacher behavior is highly structured. McKenzie (1980) examines the role of structure in lesson design. Effective instructional design, as referred to by McKenzie, assumes that students must learn through personal mental activity but stresses that teachers have the responsibility for providing opportunities to ensure that most students are likely to learn. His concept of design is content-oriented and teacher-directed and includes benefits for teachers and students. Teachers are equipped to present information clearly and thoroughly and students are exposed to events which catch their attention and assist them in thinking appropriately. The premise of instructional design, as discussed by McKenzie, is that a careful task analysis of a lesson can generate a set of inputs which would enable a student to learn an objective. In connection with this premise, he makes the assumption that most content can be classified into a few categories of objectives. For content-oriented learning and teaching to be effective, students must be actively involved in learning and teachers must be responsible for deciding what input is needed to elicit specific student responses. Extensive research is cited that demonstrates that questioning students throughout a presentation and requiring that every student respond to every question provoke study behaviors which tend to increase achievement.
The teacher's role is to provide experiences which alert students to important points, help students think carefully about each point, and give adequate time for review and practice. Test-like events seem to provide an effective and efficient way of guiding student thinking. In summary, McKenzie presents effective instructional design as a highly systematic, content-oriented, teacher-controlled theory of instruction which helps to ensure that sufficient information is presented to enable students to learn. Student characteristics are considered after a basic lesson is planned and decisions about test events are made. Minimally, teachers have available to them an effective means to teach content efficiently. Biggs (1971) is also concerned with the benefits of structuring a learning environment but presents a view which lessens teacher direction as students progress in their learning. When students are initially asked to learn new behaviors, it is important to structure the environment with some extrinsic direction. After a behavior becomes well-learned, Biggs recommends that teachers stand back and provide opportunities for a child to continue learning on his own. Teachers can best assist in facilitating broad coding (process learning) by providing appropriately structured input. It is the responsibility of the teacher to provide a continual flow of input that can be coded in a variety of ways. The realization that no more than seven independent units can be considered simultaneously needs to be dealt with accordingly. Of extreme interest and importance is the relationship between student achievement and high teacher control over transmission of information and conditions of learning in the classroom.
Parent (1975) found that even though some students preferred a low degree of teacher control and a high degree of freedom to select materials and experience learning, they maintained their achievement levels when interacting with a highly teacher-controlled situation. Before drawing further conclusions from studies of this type, Parent suggests that attention be given to a simple dynamic aspect of the problem of teacher-directed structuring. Students beginning a course of study generally know little about the subject and tend to benefit most from a fairly high level of external discipline. However, as students gain in skills, Parent warns that optimal student performance will continue only if teacher control lessens significantly. Davis (1974) suggests that, in general, students are more likely to learn if the media used in class are structured so that an instructor's messages are open to student inspection. However, Davis points out and emphasizes active teacher involvement by stressing that teachers need to state objectives to students, point out relationships, give students cues and prompts, stimulate sensory channels by utilizing visual and auditory media, and actively question students. Even though open communication is recommended, the role of the teacher in learning is clearly defined as active and less teacher involvement is not mentioned as a consideration. In conclusion, then, it appears that a highly structured, teacher-directed learning environment provides an effective way to teach content to students, particularly when the content is unfamiliar to them.

Student Involvement in Learning

The degree of student participation in the learning process is examined as a fifth element in lesson design. Studies undertaken by the International Association for the Evaluation of Educational Achievement showed that active learning time, as well as instructional cues and feedback, have the greatest impact on learning.
Increased student involvement results in increased student achievement and other positive learning outcomes. Student involvement is largely affected by the appropriateness of instructional procedures, the appropriateness of how the student activities relate to the goals of learning, and the clarity of the presentation. To improve class involvement, it was found beneficial to increase active student participation by eliciting active responses to instruction and having the students directly interacting with the learning materials. The instructional materials should provide opportunities for students to participate in relating their existing knowledge to new learning tasks. If adequate and clear instructions are available to students, it is suggested that involvement with activities be required by all students (Levin, 1981). Thus, active student participation in learning seems to often result in increased student achievement as students are required to directly interact with instructional materials.

Summary of Section Two

The second part of the literature review addressed important elements of effective lesson design. Findings, overall, support the view that student learning is positively affected by including instructional objectives and overviews as lesson components, by sequencing instructional activities appropriately, by providing practice and prompt, continuous feedback, by providing a learning environment which is highly structured and teacher-directed, and by encouraging active student participation.

Relationship of Section Two to Study

The highly structured COBOL materials used in this study seemed to support internal processes of learning, utilize variables and processes which promote effective learning, and fit well in Briggs and Gagné's ordered events of instruction (Briggs and Gagné, 1974). A typical COBOL lesson gained student attention by displaying a transparency with the overview written out in paragraph form.
Students found the overview was also part of their materials and began reading it. The next transparency, corresponding to student material, was the list of objectives for the lesson followed by transparencies reviewing previous learning. Students were asked to respond to guided review questions and to record their responses in their student materials. As the stimuli were presented for the new COBOL material, guided learning was presumably leading to a desired combining of concepts and rules. As the instructor explained the content of the transparencies and wrote down appropriate responses elicited from students, the learners entered these responses into their structured notes. Performance was required in the form of a quiz at the end of a lesson with feedback given in the form of printed answers. After a discussion of the quiz, students were advised of their progress. Through summaries and reference sheets to be completed by the students, the learners were asked to organize and reorganize the information in the lesson in attempts to enhance retention and transfer. Activities at the end of each lesson also enhanced retention and transfer as students were asked to apply the concepts of the lesson to new, realistic situations. Gagné (1974) and Glaser (1976) also offer thoughts on the process of instructional design which appear to relate well to this study. Gagné presents four basic assumptions about instructional design. He proposes that instructional design must be for the individual, that it has phases that are both immediate and long-range, that systematically designed instruction can greatly affect individual human development, and that designing instruction must be based on how human beings learn. Using preinstructional strategies within highly structured materials requiring active student involvement, an attempt was made in this study to offer an evaluation of one possible alternative to teaching COBOL.
The instructional materials used in the study appear to be based upon Gagné's assumptions. The design process, as indicated by Glaser, essentially involves generating alternatives and testing these alternatives against practical requirements and constraints. From the review of the literature, it seems that the study was a promising one to pursue in that it offered examination and analysis of instructional materials with useful and effective components.

CHAPTER III
DESIGN OF THE STUDY

Introduction

A descriptive research approach was used in three classroom situations to collect data related to teacher and student reactions to highly structured materials designed to teach microcomputer-based COBOL and to suggest a relationship between student characteristics and academic success with these materials. This research approach seemed especially appropriate when considering the purposes of this study. Because descriptive research systematically describes an area of interest factually and accurately (Isaac, 1981), its main strength appears to be in its usefulness for providing background information to decision makers, particularly those concerned with instructional environments. Through rich descriptions, much information about a particular approach to programming instruction can be identified. In describing the design of the study, the following topics are addressed:

1. Nature of the COBOL environments
2. Nature of the participants
3. Instrumentation, data collection procedures, and data analysis for each research question

Nature of the COBOL Environments

Radio Shack Model I, Model II, and Model III microcomputers, each equipped with two disk drives and a COBOL compiler, were available to each of the three COBOL classes for approximately six-week periods. Most often, two or three students shared a computer as the highly structured lessons were presented to them within 25-30 hours of instruction.
Each lesson provided an overview, objectives, review of previous learning, teacher-guided group activities, interaction with the microcomputers, a summary of the lesson, a multiple choice quiz, and activities in which students were asked to apply the concepts of the lesson to specific situations and initiate their own solutions. As instructors used transparencies to guide students through a lesson, students wrote appropriate responses in their manuals. Conveniently designed teacher manuals provided the transparencies which corresponded to the materials in the student manuals, reminded teachers of important concepts in a lesson by prompting them with supplemental information, and offered suggested responses and solutions to problems in each lesson. Sample student materials appear in Appendix A and sample teacher materials appear in Appendix B.

Nature of the Participants

Subjects for this study were selected from school systems which support computer-related research and involvement for their students and teachers. Three teachers with limited COBOL experience were purposely chosen to increase the usefulness of information provided to decision makers as a result of this study.

The first learning situation consisted of twelve business-oriented high school students who were encouraged by their business teachers to participate in this study. Most did not have any computer experience. Some left a business course to become involved with COBOL and others found it necessary to change their schedule of classes in order to participate. The instructor was a business educator whose COBOL experience was limited to one formal course in mainframe-based COBOL. Before using any of the highly structured lessons to teach microcomputer-based COBOL to students, the instructor was asked to assume the role of a student as the researcher taught each lesson's content.
The second classroom situation involved sixteen adult students from a community college data processing class and an instructor who also had taken one formal mainframe-based COBOL course. Students had varying computer experiences and knew at registration time that they would initially learn microcomputer-based COBOL with highly structured lessons. The instructor was offered the opportunity to have the researcher teach the content of each structured lesson to him before he presented the lessons to students. However, he preferred informal discussions regarding the content of some of the lessons.

The third learning situation consisted of ten high school students who had all studied the programming language BASIC before enrolling in COBOL. In this classroom, the students and their teacher interacted with ten revised COBOL lessons, a result of field testing the original instructional materials in other learning situations. The instructor was an industrial arts teacher who had completed one formal course in mainframe-based COBOL and also had informal learning experiences with the six original COBOL lessons used in this study. As a member of a small group of educators, the instructor was exposed to the potential of microcomputer-based COBOL at a local Radio Shack computer center. The six lessons were presented to the group of educators and they were encouraged, but not required, to write COBOL programs. Because of the informal nature of the class, no test was given to measure their COBOL knowledge.

Overview of Design Description for Research Questions

For each of the three research questions addressed in this study, the question will be listed, the instrumentation explained, and data collection and analysis procedures described.

FIRST RESEARCH QUESTION

How will students respond, both cognitively and affectively, to learning COBOL with highly structured materials and microcomputers?
Instrumentation

Two instruments were used to secure student response to the COBOL materials. First, a COBOL test was designed to measure programming proficiency. The dimensions of programming proficiency focused upon knowledge of language rules, ability to read COBOL programs, and ability to write logically and grammatically correct program segments. A sample of the COBOL test and an analysis of its content can be found in Appendix C.

Second, student reaction forms were designed to measure attitudes after presentation of each COBOL lesson and to highlight some of the strengths and weaknesses of the instructional materials. The first part of each form, the Likert-type part of the attitudinal measure, contained eleven statements regarding the format of the COBOL materials, the mode of lesson presentation, and the effectiveness of the instructional materials. Possible student responses to these statements ranged from "strongly agree" to "strongly disagree". In addition, student reaction forms included opportunities for students to rate their success in meeting the objectives of each COBOL lesson and to comment on the most and least preferred components of each lesson. Only the first reaction form asked for personal student data. A sample of the student attitudinal measure can be found in Appendix D.

Data Collection Procedures

In the cognitive area, student response to the learning of COBOL was secured by administering the COBOL test on programming proficiency after the lessons were presented. Students were allowed and encouraged to refer to their notes during testing. Student achievement was measured by the scores on these tests of programming proficiency. The scores represented the cognitive element of student response to the COBOL materials.

To secure affective data for this study and, incidentally, to provide feedback to the designer of the COBOL materials, a formative evaluation process was employed to collect student information.
To obtain students' perceptions of their programming experiences, each student responded to the attitudinal measure after each of the lessons was taught. The first part of the reaction form asked students to respond, with varying degrees of intensity, to a set of general statements about the lesson. This part of the form was not altered from lesson to lesson. Students expressed their feelings on each lesson by rating each of the eleven statements, and a numerical value ranging from 0 to 4 was assigned to each response. The average of these numerical values represented the affective element of student response to the COBOL materials.

To characterize students within particular levels of COBOL achievement, additional data were collected on the student reaction forms. On the first reaction form, students were asked to indicate their computer experiences, mathematics experiences, and overall grade point averages. To further assist in this characterization, as well as for lesson revision purposes, two other parts of the reaction form requested specific information in the area of effectiveness. Students were asked to check responses ranging from "very well" to "poorly" which best described how well they thought they accomplished the objectives of each lesson. They were also given opportunities to comment on the best part of the lesson and the part they felt should be eliminated.

Data Analysis

For data interpretation involving group comparisons, both student achievement data and student attitudinal data were analyzed by examining each of the three COBOL classes separately. By class, students' point distributions among the areas of COBOL grammar, reading, and writing were reported, and overall test scores were the basis for statistical treatment of the cognitive data. Summary descriptive statistics including test means, medians, ranges, and standard deviations were used to analyze, by class, the cognitive student responses to the COBOL materials.
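The summary statistics named here are standard, and a minimal sketch of their computation may clarify what is being reported. The sketch below uses the Group 3 test scores reported later in Table 3; Python is purely illustrative, since the study's own analyses were not run this way.

```python
from statistics import mean, median, stdev

# Overall COBOL test scores for Group 3 (high school students with
# BASIC experience), taken from Table 3.
scores = [98, 88, 87, 86, 85, 84, 84, 81, 81, 79]

print(round(mean(scores), 2))     # 85.3
print(median(scores))             # 84.5
print(max(scores) - min(scores))  # 19 (range)
print(round(stdev(scores), 3))    # 5.293 (sample standard deviation)
```

These values match the Group 3 row of Table 4, which suggests the reported standard deviations are sample (n − 1) standard deviations rather than population ones.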
In analyzing student attitude toward the COBOL experience, each position response on the eleven-item attitudinal measure was assigned a numerical value ranging from 0 to 4. For each student, these numerical values were averaged to yield an individual's attitude score. The individual scores for each of the three classes were the basis for statistical treatment of the affective data. Mean attitudinal scores for each COBOL class were used to analyze, by class, the affective student responses to the COBOL materials.

To allow for further data analysis, students were grouped, within each class, by levels of achievement on the final COBOL test. The two students who attained the highest test scores and the two who attained the lowest scores in each of the three classes were described in terms of computer experiences, mathematics experiences, overall grade point averages, and specific responses to the attitudinal measures in the area of lesson effectiveness. The reporting of these findings was descriptive, rather than predictive, and provided composites of students in the COBOL classes who performed best and worst on the final test. Within the two achievement levels, the characteristics of the six students being described were examined for commonality.

In an attempt to analyze the relationship between student attitudes and achievement, COBOL test scores and mean attitudinal measures were used to compute a single correlation for each class. The calculation of the Pearson correlation coefficients (Pearson r's) allowed discussion of the relationship between the quantitative cognitive and quantified affective measures within each class.

SECOND RESEARCH QUESTION

How will teachers with varying computer and teaching experiences react to the use of such materials in a microcomputer-based instructional environment?

Instrumentation

A single instrument was used to secure teacher response to the COBOL materials.
Teacher reaction forms were designed to quantify attitudes after presentation of each COBOL lesson, gather perceptions of how well teachers felt students achieved each lesson's objectives, and identify problems associated with teaching each lesson. The first part of all of the teacher reaction forms asked for responses to ten general statements relating to the format of the lessons and their perceived effectiveness. This Likert-type part of the attitudinal measure was similar in design to the student attitudinal measure used in this study. Possible teacher responses ranged from "strongly agree" to "strongly disagree". In addition, teachers were asked to evaluate student accomplishments and to identify, in an open-ended manner, instructional problems encountered with the lesson. Only the first teacher reaction form requested personal data. A sample of the teacher attitudinal measure can be found in Appendix E.

Data Collection Procedures

Teacher reaction to the COBOL materials was secured by requesting that the three teachers fill out an attitudinal measure after each COBOL lesson was completed. The first part of the reaction forms, Likert-type in format and the same for all lessons, asked teachers to respond to a set of general statements about the lessons. Teachers rated, with varying intensity, each of the ten statements, and a numerical value ranging from 0 to 4 was assigned to each response. The average of these numerical values represented the quantitative measure of teacher reaction to the COBOL materials.

Additional data were collected on teacher reaction forms to assist with describing how teachers felt about the COBOL materials. The teacher reaction forms for the first COBOL lesson requested information regarding previous computer experience, particularly with respect to COBOL and microcomputers, and main academic teaching areas.
Besides the quantitative measure mentioned in the previous paragraph, two other parts of the teacher reaction forms provided valuable information to the researcher. The objectives for each lesson were listed and teachers were asked to check possible responses of "very well", "well", "fairly well", or "poorly" as they evaluated students' attainment of each lesson's objectives. The last part of the reaction form was open-ended and requested that teachers identify major deficiencies or problems with the lesson and, if desired, suggest ways to eliminate these problems.

Data Analysis

The analysis of teacher reaction data was similar to that of the affective student reaction data. The reactions of each of the three COBOL teachers were described quantitatively and qualitatively. Composites of each teacher included a numerical attitudinal score, computer background, experience in teaching a programming language, main teaching area, and comments regarding the effectiveness of the COBOL lessons. The numerical attitude scores were obtained by assigning each position response on the ten-item attitudinal measure a value from 0 to 4, summing these values, and dividing by 10. These mean attitudinal scores were mainly used to allow comparisons among the three COBOL teachers' reactions to the COBOL materials.

THIRD RESEARCH QUESTION

What is the relative importance of computer experience, mathematics experience, and grade point average when they are used to predict student success in a highly structured microcomputer-based high school COBOL class?

Instrumentation

The student reaction forms for the first COBOL lesson and the COBOL proficiency tests were the instruments used to collect the data to address the issue of predicting student COBOL success at a high school level.
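Both the student and teacher attitude scores follow the same recipe: each Likert response is mapped to a value from 0 to 4, and the values are averaged over the form's items. A minimal sketch follows; the scale labels come from the study's "strongly agree" to "strongly disagree" range, but the example ratings are invented, not taken from the study's data.

```python
# Map Likert response labels to the 0-4 values used in the study.
SCALE = {
    "strongly disagree": 0,
    "disagree": 1,
    "uncertain": 2,
    "agree": 3,
    "strongly agree": 4,
}

def attitude_score(responses):
    """Average the numerical values of one reaction form's responses."""
    values = [SCALE[r] for r in responses]
    return sum(values) / len(values)

# Hypothetical ten-item teacher reaction form.
form = ["agree", "strongly agree", "agree", "uncertain", "agree",
        "agree", "disagree", "agree", "strongly agree", "agree"]
print(attitude_score(form))  # 2.9
```

The same function covers the eleven-item student forms, since it divides by however many items are present.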
Data Collection Procedures

In preparation for discussing the relative importance of computer experience, mathematics experience, and grade point average as they are combined with the COBOL achievement measures to predict student success, personal data from the high school students and their COBOL test scores were gathered and quantified. The student reaction forms for the first COBOL lesson provided the researcher with information regarding each high school student's mathematics and computer experiences prior to his/her COBOL experience and his/her overall grade point average. Only high school students were considered because it was possible to consistently quantify their mathematics experiences.

Whereas prior computer experience was categorized as either "yes" or "no", mathematics experience was further categorized as low, medium, or high. For eleventh and twelfth graders, mathematics experiences were categorized as medium if the first year of algebra and geometry were completed and the second year of algebra was either completed or being taken along with the COBOL course. Mathematics experiences for tenth graders were categorized as medium when the first year of algebra and geometry were completed and the second year of algebra was being taken concurrently with COBOL. Both of the ninth graders in the study were assigned medium mathematics experiences since one had completed the first year of algebra in the eighth grade and was taking geometry with COBOL and the other was taking first year algebra and COBOL concurrently. In assessing mathematics experiences, then, for the high school students, difficulty of the completed mathematics courses with respect to current grade level was weighed more heavily than the number of mathematics courses completed.
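The grade-relative criteria above can be expressed as a small rule function. This is only an illustrative reading of the stated rules: the course-name representation is invented, and because the study spells out only the "medium" boundaries explicitly, cases outside them are returned as "unspecified" rather than guessed at.

```python
def math_experience(grade, completed, concurrent):
    """Classify mathematics experience using the study's stated rules.

    `completed` and `concurrent` are sets of course names (hypothetical
    labels).  Only the "medium" criteria are given explicitly in the
    text, so anything else comes back as "unspecified".
    """
    alg1_geom_done = {"algebra 1", "geometry"} <= completed
    if grade in (11, 12):
        # Medium: algebra 1 and geometry done; algebra 2 done or concurrent.
        if alg1_geom_done and ("algebra 2" in completed or "algebra 2" in concurrent):
            return "medium"
    elif grade == 10:
        # Medium: algebra 1 and geometry done; algebra 2 taken concurrently.
        if alg1_geom_done and "algebra 2" in concurrent:
            return "medium"
    return "unspecified"

# An eleventh grader who has finished algebra 1, geometry, and algebra 2:
print(math_experience(11, {"algebra 1", "geometry", "algebra 2"}, set()))  # medium
```

The two ninth graders were assigned "medium" individually by the researcher, so they are deliberately not encoded here.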
Data Analysis

R-squared delete, a statistical technique within multiple regression, was used to analyze the contributions of prior computer experiences, mathematics experiences, and overall grade point averages to the variation of the student scores on the COBOL tests. Before discussing the steps that were taken to apply the R-squared delete procedure to the data analysis of this study, some discussion of multiple regression seems appropriate.

Multiple regression appears to be an excellent method for explaining natural phenomena by studying the relations among those variables identified through research findings as ones which most likely impact a situation (Kerlinger and Pedhazur, 1973). Independent variables are those which are believed to affect the measurements obtained on the dependent variable, the variable measured to determine the effects of an experiment (Isaac, 1981). The main purpose of multiple regression is to analyze contributions of two or more independent variables to the variation of a dependent variable. Because multiple regression analysis is appropriate for non-experimental research and can handle both continuous and categorical variables, it became the main area of analysis for the predictive component of this study (Kerlinger and Pedhazur, 1973).

Specifically, the R-squared delete statistical technique was used to assess the relative importance of the independent variables in this study (computer experience, mathematics experience, and overall grade point average) as they relate to prediction of COBOL success. A full model regression equation, a linear combination of the three independent variables which maximized the prediction of the dependent variable, scores on the COBOL test, was generated using the computer program SPSS. Along with this equation, R², the proportion of variation in the COBOL test scores accounted for by the independent variables, was noted in the SPSS summary output.
Then, again using SPSS, three reduced model regression equations and their respective R²'s were generated as a result of deleting each of the independent variables, one at a time, from the full model equation. The R² of each reduced model was then compared with the R² of the full model regression equation. The analysis allowed the researcher to examine the statistical significance of variables deleted from the full model regression equation, one at a time, and address their individual effects on prediction of COBOL success as measured by scores on the COBOL proficiency tests.

The following steps summarize the procedure used for the data analysis:

1. The categorical independent variables, computer experience and mathematics experience, were quantified by assigning vectors to identify group membership. In the case of computer experience, 0 was assigned to represent no experience and 1 to represent prior computer experience. In the case of mathematics experience, 00 was assigned to represent low experience, 01 represented medium experience, and 10 represented high experience. These vectors, then, represented the measures of the categorical independent variables in the SPSS generation of regression equations.

2. Along with the measures of the categorical independent variables, corresponding grade point averages and COBOL test scores, measures of the continuous variables expressing gradation, were entered into the computer to form the full model regression equation.

3. The resulting full model equation was in the form of

   Y = B0 + B1X1 + [B12 + B22]X2 + B3X3

   where Y was the dependent variable, X1, X2, and X3 were the independent variables, and the B's were the coefficients which maximized prediction. Specifically, Y represented the COBOL test scores, X1 represented quantified programming experiences, X2 represented quantified mathematics experiences, and X3 represented grade point averages.
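As an aside, the mechanics of this procedure, dummy coding the categorical predictors, fitting full and reduced models, and comparing their R² values with an F ratio as the remaining steps describe, can be sketched in code. Everything below is illustrative: the student records are invented, and the study itself ran these computations in SPSS.

```python
def fit_r2(X, y):
    """R-squared of an ordinary least squares fit of y on X
    (an intercept column of 1s is prepended)."""
    rows = [[1.0] + list(x) for x in X]
    k = len(rows[0])
    # Normal equations: (X'X) b = X'y.
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    c = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    # Solve by Gaussian elimination with partial pivoting.
    for i in range(k):
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        c[i], c[p] = c[p], c[i]
        for r in range(i + 1, k):
            factor = A[r][i] / A[i][i]
            for j in range(i, k):
                A[r][j] -= factor * A[i][j]
            c[r] -= factor * c[i]
    b = [0.0] * k
    for i in reversed(range(k)):
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    y_hat = [sum(bj * xj for bj, xj in zip(b, r)) for r in rows]
    y_bar = sum(y) / len(y)
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
    ss_tot = sum((yi - y_bar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

def f_delete(r2_full, r2_reduced, n, k1, k2):
    """F ratio for deleting k1 - k2 predictors from the full model."""
    return ((r2_full - r2_reduced) / (k1 - k2)) / ((1.0 - r2_full) / (n - k1 - 1))

# Invented records: (computer 0/1, math vector m1 m2, GPA, COBOL score).
# The math coding follows the study: low = 00, medium = 01, high = 10.
data = [
    (1, 0, 1, 3.5, 95), (0, 0, 1, 3.0, 88), (1, 1, 0, 3.8, 97),
    (0, 0, 0, 2.5, 70), (1, 0, 0, 3.0, 85), (0, 1, 0, 3.2, 90),
    (1, 0, 1, 2.8, 84), (0, 0, 0, 2.2, 65),
]
y = [row[4] for row in data]
X_full = [row[0:4] for row in data]      # computer, m1, m2, GPA
X_no_comp = [row[1:4] for row in data]   # computer experience deleted

r2_full = fit_r2(X_full, y)
r2_reduced = fit_r2(X_no_comp, y)
f_ratio = f_delete(r2_full, r2_reduced, n=len(y), k1=4, k2=3)
print(round(r2_full, 3), round(r2_reduced, 3), round(f_ratio, 3))
```

Because the reduced model is nested in the full model, its R² can never exceed the full model's; the F ratio asks whether the drop in R² is larger than chance would allow.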
B12 and B22 resulted since two vectors were used to represent the three categories of mathematics experience.

4. In addition to the full model regression equation, the corresponding R² was noted from the SPSS summary report. This R² represented the proportion of the variation in the dependent variable explained by the three independent variables.

5. Three other regression equations, each reduced by deleting one independent variable at a time from the full model equation, were generated using SPSS in the forms of

   Y = B0 + [B12 + B22]X2 + B3X3 with computer experience deleted,

   Y = B0 + B1X1 + B3X3 with mathematics experience deleted, and

   Y = B0 + B1X1 + [B12 + B22]X2 with grade point average deleted.

6. R²'s for each reduced regression equation were noted from the SPSS summary report. To determine if the deletion of any of the independent variables affected the full model regression equation or prediction significantly, an F ratio was computed for each of the three reduced regression equations. The formula for the F ratios was in the form of

   F = [(R²full − R²reduced) / (k1 − k2)] / [(1 − R²full) / (N − k1 − 1)]

   where N = total number of cases (22), k1 = number of independent variables of the larger R², k2 = number of independent variables of the smaller R², and the degrees of freedom were 1 by N − k1 − 1.

7. If the F value was found to be significant, the deleted independent variable was judged to enhance the prediction of COBOL success; if not significant, the variable was judged not to enhance the prediction.

Summary

Chapter III described the design of the study by addressing the nature of the COBOL environments, the nature of the participants, and, for each research question, the associated instrumentation, data collection procedures, and data analysis. Three classroom situations were examined to discover how teachers and students react to highly structured materials designed to teach microcomputer-based COBOL.
A descriptive research approach was employed to gather information about this particular approach to COBOL instruction.

CHAPTER IV

PRESENTATION AND INTERPRETATION OF DATA

Introduction

For each of the components of the three research questions addressed in this study, the presentation and interpretation of the data are separated into two sections, one for reporting the findings and one for discussing the findings. Conditions affecting data interpretation are also detailed for each research question. When appropriate, the presentations of data have been summarized in tables. An overall summary of findings is given at the end of the chapter.

PRESENTATION AND INTERPRETATION OF DATA FOR FIRST RESEARCH QUESTION

Research Question #1: How will students respond, both cognitively and affectively, to learning COBOL with highly structured materials and microcomputers?

Quantitative Presentation of Findings for Cognitive Student Response

Tables 1, 2, and 3 present, by class, student achievement on the COBOL proficiency tests with distributions of scores in the areas of grammar (applying the rules of the programming language), reading, and writing. The maximum number of points a student could attain in each area is indicated in parentheses.
TABLE 1

Distribution of Points on COBOL Tests and Overall Test Scores for Group 1 (Business-oriented High School Students)

Student    Grammar    Reading    Writing    Total Points
Number     (34)       (51)       (15)       (100)
  1          34         51         15         100
  2          34         51         15         100
  3          28         51         14          93
  4          28         51         14          93
  5          28         51         14          93
  6          25         51         15          91
  7          25         51         11          87
  8          19         51         15          85
  9          22         51          9          82
 10          22         31          9          62
 11          25         30          6          61
 12          10         22          9          41

TABLE 2

Distribution of Points on COBOL Tests and Overall Test Scores for Group 2 (Community College Students)

Student    Grammar    Reading    Writing    Total Points
Number     (34)       (51)       (15)       (100)
  1          34         51         15         100
  2          34         51         14          99
  3          31         51         15          97
  4          34         51         10          95
  5          31         51         12          94
  6          31         51         11          93
  7          28         51         14          93
  8          25         51         15          91
  9          25         51         15          91
 10          30         44         15          89
 11          25         51         13          89
 12          28         48          9          85
 13          25         46         13          84
 14          31         44          7          82
 15          24         44         12          80
 16          31         36          9          76

TABLE 3

Distribution of Points on COBOL Tests and Overall Test Scores for Group 3 (High School Students With BASIC Experience)

Student    Grammar    Reading    Writing    Total Points
Number     (32)       (51)       (15)       (100)
  1          32         51         15          98
  2          31         45         12          88
  3          29         43         15          87
  4          20         51         15          86
  5          26         44         15          85
  6          26         43         15          84
  7          32         37         15          84
  8          28         46          7          81
  9          26         43         12          81
 10          26         39         14          79

Table 4 presents the summary descriptive statistics for each of the three COBOL classes. N represents the number of subjects in each group.

TABLE 4

Central Tendency and Dispersion Measures for COBOL Achievement

                                            N     Mean    Median    Range    Standard Deviation
Group 1 (Business-oriented High
School Students)                           12    82.33    89.00     59.00    18.208
Group 2 (Community College Students)       16    89.88    91.00     24.00     6.898
Group 3 (High School Students With
BASIC Experience)                          10    85.30    84.50     19.00     5.293

Interpretation of Findings for Cognitive Student Response

The measures of central tendency (means and medians) and the measures of dispersion (ranges and standard deviations) are indicators of how students responded cognitively to their COBOL experiences.
The relatively high means and medians associated with the achievement test suggest that all three groups generally responded well to learning COBOL with highly structured materials. Even though the measures of central tendency do not vary greatly among the three COBOL classes, the measures of dispersion suggest there was much more variability among students in Group 1 than among students in Groups 2 and 3.

To provide insight into what may have caused results of this type in the area of cognitive student response, it is important to consider a further analysis of the group means and standard deviations. Specifically, because the test covered introductory COBOL concepts and the majority of the students participating in the study were academically inclined, the means on the COBOL achievement test were expected to be fairly high. The differences in the means, then, seem to require explanation as an attempt is made to place them in proper perspective. Only three students in this study scored below 75 on the COBOL test, and these three students were all part of the business-oriented high school group. Thus, the mean of Group 1 should not be viewed as an indicator that this group reacted less well. More precisely, Group 1 performed somewhat better than expected. As indicated in Table 14, only one student in this group had computer experience and, additionally, mathematics experiences and grade point averages were lower than in the other high school group of students, Group 3. Similarly, to say that Group 2 responded better than the other two groups, cognitively, to their COBOL experiences would require explanation and possible consideration of their past experiences with computer languages. Many of the community college students had taken a course which introduced them to several programming languages, where the commonality among language components may have become apparent.
The highest mean of 89.88 may reflect this awareness and not necessarily suggest that community college students will benefit most, academically, from learning COBOL with highly structured materials.

The differences in standard deviations, as measures of variability within the three COBOL groups, were expected and are readily explained. Students in Group 1 were selected to participate in this study and were much more diverse than those in the other two groups in terms of age and academic backgrounds. The only apparent common elements among the students in Group 1 included no programming experience and an interest in the academic area of business. Within the other two groups, who elected to study COBOL, much more homogeneity was present, particularly in terms of programming experiences and overall grade point averages. The small standard deviations associated with the COBOL achievement of Groups 2 and 3 would seem to reflect the more homogeneous nature of the groups.

Quantitative Presentation of Findings for Affective Student Response

Table 5 presents, by class, the mean attitudinal scores which were obtained by quantifying student responses to the eleven-item part of the attitudinal measures. Responses ranged from "strongly agree" to "strongly disagree" and were appropriately assigned numerical values from 4 to 0.

TABLE 5

Mean Attitudinal Measures for COBOL Students

                                                        Mean Attitudinal Measure
Group 1 (Business-oriented High School Students)                 2.92
Group 2 (Community College Students)                             2.97
Group 3 (High School Students With BASIC Experience)             2.45

Interpretation of Findings for Affective Student Response

The mean attitudinal measures for the three COBOL classes suggest that Groups 1 and 2 responded more favorably to the learning of COBOL with highly structured materials than did Group 3. Group 3, however, with an average attitudinal measure of 2.45, seems to fall within the range of "agree" and "uncertain" as an overall reaction to the learning situation.
None of the groups appears to have responded negatively to their COBOL experiences.

It was speculated by the researcher that student attitude toward the highly structured, teacher-directed approach to learning COBOL would be positive. Students, especially those encountering new material, would seem to relate well to structured activities and enjoy being actively involved in the learning process. Interactions with the three COBOL teachers prior to their involvement in the study led the researcher to believe that their positive attitude toward teaching COBOL in a highly structured way would most definitely have a positive effect on their students.

Suggestions based upon observations and conversations with students may help explain why the mean attitudinal measures of Groups 1 and 2 were higher than the measure of Group 3. Group 1 students appeared to be particularly pleased to be given an opportunity to interact with microcomputers and seemed to feel somewhat special about being selected to participate in the study. Because all but one student had no computer experience, this group may have been more responsive, in general, to learning COBOL and less critical of the approach employed to teach the language. Group 2, the community college level students, may have found this approach to learning a computer language a refreshing change from the traditional lecture situation. After talking with some of the students in Group 3, it became apparent that their mean attitudinal measure may well represent their feelings about completing the reaction forms rather than their feelings about their COBOL experiences. They were somewhat resentful about having to fill out the reaction forms and felt that their time could have been better spent writing COBOL programs. As a group, they were less apt to hand in all of their reaction forms and respond to the open-ended statements regarding the best parts of the lessons and the parts they felt should be eliminated.
Perhaps some of their resentment was a result of interacting with the revised lessons and being asked to complete ten, rather than six, student reaction forms.

Quantitative Presentation of Findings for Relationship Between Cognitive and Affective COBOL Measures

Tables 6, 7, and 8 present, by class, the average attitudinal score for each student accompanied by his/her score on the COBOL achievement test. Maximum numerical assessment is indicated in parentheses.

TABLE 6

Cognitive and Affective Measures for Group 1 (Business-oriented High School Students)

Test Score    Average Reaction Form Measure
(100)         (4.00)
100           2.52
100           2.24
 93           3.29
 93           3.20
 93           3.12
 91           2.98
 87           2.89
 85           2.15
 82           3.00
 62           3.29
 61           3.56
 41           2.77

TABLE 7

Cognitive and Affective Measures for Group 2 (Community College Students)

Test Score    Average Reaction Form Measure
(100)         (4.00)
100           3.23
 99           2.74
 97           2.73
 95           3.40
 94           3.11
 93           3.89
 93           2.82
 91           2.89
 91           2.52
 89           3.12
 89           2.52
 85           3.40
 84           2.86
 82           2.77
 80           2.37
 76           3.15

TABLE 8

Cognitive and Affective Measures for Group 3 (High School Students With BASIC Experience)

Test Score    Average Reaction Form Measure
(100)         (4.00)
 98           2.55
 88           2.71
 87           1.53
 86           2.64
 85           2.55
 84           2.95
 84           2.56
 81           2.45
 81           1.91
 79           2.60

Table 9 presents the Pearson correlation coefficients for each COBOL class as indicators of the relationships between student achievement and student attitude.

TABLE 9

Student Achievement Versus Student Attitude

                                                        Pearson Correlation Coefficient (r)
Group 1 (Business-oriented High School Students)                 .17
Group 2 (Community College Students)                             .26
Group 3 (High School Students With BASIC Experience)             .44

Interpretation of Findings for Relationship Between Cognitive and Affective COBOL Measures

Examination of the Pearson correlation coefficients suggests that the relationship between student achievement and student feelings toward the COBOL learning situation was low in Groups 1 and 2 and moderate in Group 3.
An increase in test score accompanied by an increase in positive feelings about the COBOL experience seemed to be more characteristic of Group 3 than of the other two groups of students. Results of this study, as might be expected, do not support the argument that a highly structured learning situation is viewed negatively by high-achieving students and welcomed by low-achieving students. Because COBOL test scores were high in all three groups and attitudes were, overall, positive toward the COBOL experience, it was not surprising to find low correlations between student achievement and student attitude. Even if a student preferred to learn COBOL independently in a less structured environment, the student's overall reaction to the structured situation was not negative. Student achievement in introductory COBOL seems quite independent of student attitude toward a highly structured learning situation. Even though a moderate relationship between achievement and attitude was found in Group 3, it is difficult to meaningfully interpret this relationship. COBOL test scores among the high school students in Group 3 were clustered at a high level and the reaction form measures were not markedly different.

Rationale for Describing Students Who Scored the Highest and the Lowest on the COBOL Test

To better understand student reactions to their COBOL experiences, the two students with the highest scores and the two with the lowest scores in each of the three COBOL classes are described in detail. The composites of the students include their grade levels or college status, previous computer and mathematics experiences, grade point averages, average attitudinal measures, and specific responses to the attitudinal measures in the area of lesson effectiveness.
Students from Group 1 were at a high school level and business-oriented, students from Group 2 were at a community college level and had varied academic backgrounds, and students from Group 3 were high school students who had all studied at least one semester of the computer programming language, BASIC. Personal student data were requested as part of the first attitudinal measure students were asked to complete.

Qualitative Presentation of Findings for Highest Scoring Students

Group 1. Two business-oriented high school students from Group 1 scored 100 points on the COBOL achievement test. One of these students was a twelfth grader who had previous programming experience and had completed two years of algebra, one year of geometry, and one year of pre-calculus. This student had an overall grade point average of 2.8 while enrolled in the COBOL course and had an average reaction form measure of 2.24. Specifically, this student felt that either "very well" or "well" described how the objectives of the lessons were met. Some of the comments regarding the best part of a particular lesson included:

1. "working on your own on the activity"
2. "writing and running our own COBOL programs".

Some of the statements regarding the recommendation that one part of the lesson be eliminated follow:

1. "the time spent reviewing the previous lesson and use it for computer time"
2. "all the review sheets"
3. "the fill in the blanks...I would rather read and take notes".

The second high school student from Group 1 scoring 100 on the COBOL test was a twelfth grader who had no computer experience and had completed two years of algebra, one year of geometry, and one year of pre-calculus. His/her overall grade point average was 3.11 and his/her mean attitudinal measure was 2.52. Generally, this student felt that "very well" described how well he/she met the objectives of the COBOL lessons. Some of the comments regarding the best part of a particular lesson included:

1.
"using the computer"
2. "doing the program changes".

Some of the student's statements regarding the recommendation that one part of the lesson be eliminated follow:

1. "time--it takes too long explaining and not enough time doing"
2. "I think that a manual or something would be great rather than having the teacher explain every little thing."
3. "nothing".

Group 2. In the community college level COBOL class, the two top scores on the COBOL test were 100 and 99. The student who scored 100 was a freshman who had taken one college course where both BASIC and FORTRAN were introduced. In addition to completing two years of high school algebra, he/she had taken two data processing college courses which stressed mathematics logic. His/her overall college grade point average was 3.85 and his/her average reaction form measure was 3.23. In general, he/she felt that "very well" or "well" best described how he/she met the objectives of the lessons. Few comments were made about the best parts of the lessons. Reference was made to running the demonstration programs, especially those dealing with editing features. No suggestions were given as to what might be eliminated from the lessons.

The other top scoring community college level student in Group 2 scored 99 on the COBOL test and had already attained a B.A. degree. He/she had taken several courses in a college computer science curriculum and his/her college mathematics experiences included one year of calculus, linear and abstract algebra, and probability and statistics. This student's overall college grade point average was 3.0 and his/her average reaction form measure was 2.74. In general, this student felt that he/she had met the objectives of the lessons "very well". Some of the comments about the best part of the lessons follow:

1. "working with the computer"
2. "practicing on the computer".

All of the statements regarding the elimination of one part of the lessons made reference to excessive repetition.
Group 3. The two high school students with BASIC experience scoring the highest on the COBOL test received scores of 98 and 88. The student scoring 98 was a senior and had completed two years of algebra, one year of geometry, and was taking senior mathematics concurrently with COBOL. The student's grade point average was 3.0 and his/her average attitudinal measure was 2.55. Generally, he/she responded with "very well" or "well" when asked how well the objectives of the lessons were achieved. This student chose not to comment on most lessons about the best parts of the lessons and the parts he/she would recommend eliminating. The best parts of the lessons were given as "powering up and down" and "the translation process" while "description of general purpose computer" was the only part of any lesson which was suggested for elimination.

The second high school student from Group 3 scored 88 on the COBOL test and was a senior who had studied BASIC and had completed only one year of algebra. His/her overall grade point average was 2.13 and his/her mean attitudinal measure for this COBOL study was 2.71. Responses to how well the student felt the lessons' objectives were achieved ranged from "very well" to "poorly", even within one lesson. Some of the comments about the best parts of the lessons included "finishing" and "running the program". Responses of "describing the parts of the computer" and "nothing" were given when asked to recommend the parts of the lessons that ought to be eliminated.

Quantitative Presentation of Findings for Highest Scoring Students

Mean responses to the eleven Likert-type attitudinal measures are summarized in Table 10 for the highest scoring students.

[Table 10: Summarization of Mean Attitudinal Responses From Top Scoring COBOL Students, by group, for each of the eleven reaction form statements; statements reworded from the original statements to reflect a positive feeling about the COBOL materials for summary purposes]

[Table: Proportion of Variance in COBOL Test Scores Accounted For by Independent Variables (computer experience, mathematics experience, grade point average)]

[Table: Statistical Significance of Variables Deleted From Full Model Regression Equation (F ratio for difference between full model R-squared and reduced model R-squared; significant at .05)]

APPENDIX A
SAMPLE STUDENT COBOL MATERIALS

Divisions, sections, paragraphs, and sentences all must end with a ________.

IDENTIFICATION DIVISION                                                2-5

Contents:

Example:
    IDENTIFICATION DIVISION.
    PROGRAM-ID. FUEL-COSTS-L2.

Explanation: The computer will know this program by the name ________.

Components: Paragraphs - ________ and ________

ENVIRONMENT DIVISION - CONFIGURATION SECTION

Contents:

Example:
    ENVIRONMENT DIVISION.
    CONFIGURATION SECTION.
    SOURCE-COMPUTER. MODELIII.
    OBJECT-COMPUTER. MODELIII.

Explanation: The program and programmer will know this to be run on a ________ computer.

Components: Section - ________    Paragraphs - ________ and ________

DATA DIVISION - WORKING-STORAGE SECTION

Contents:

Example:
    77 MILES-OLD PICTURE 99999.
    77 MILES-NEW PICTURE 99999.
    77 MILES-TOTAL PICTURE 999.
Explanation: The three variables defined here are: ________, ________ and ________. miles-old and miles-new are defined as being ________ digits long, miles-total is defined as being ________.

Components: Section - ________    Entries - ________ and ________

PROCEDURE DIVISION                                                     2-6

Contents:

Example:
    SUBTRACT MILES-OLD FROM MILES-NEW GIVING MILES-TOTAL.
    DIVIDE GAS-L BY 3.78 GIVING GAS-GAL.
    DIVIDE MILES-TOTAL BY GAS-GAL GIVING MPG.

Explanation:

Components: Paragraph - ________    Sentences - ________ and ________

SUMMARY

Review the complete program and make a brief outline of the divisions and their parts.

    ________ Division.
    ________ Division.
        ________ Section.
    ________ Division.
        ________ Section.
    ________ Division.
        (paragraph)
        (paragraph)
        (paragraph)

For each of the four divisions identify the contents.

DIVISION            CONTENTS
Identification
Environment
Data
Procedure

QUICK QUIZ 2                                                           2-7

DIRECTIONS: Answer the following questions and then check your answers with the correct ones given on page 2- . For any that you may have missed, review your worksheets.

1. Which of the following are important features of COBOL?
   A. highly readable
   B. needs little additional explanation
   C. business oriented
   D. all of the above

2. The COBOL program written by the user is called the ________ code.
   A. Source
   B. Compiler
   C. Object
   D. Interpreter

3. The result of the compiler's activity is a machine language called the ________ code.
   A. Source
   B. Compiler
   C. Object
   D. Interpreted

4. Look at the following line:
       DIVIDE GAS-L BY 3.78 GIVING GAS-GAL.
   This line is an example of the ________ of COBOL.
   A. readability
   B. self-explanatory nature
   C. communicating capability
   D. all of the above

5. The first division of every COBOL program is the ________ division.
   A. data
   B. environment
   C. identification
   D. procedure

6. The environment division has as its major purpose:
   A. to identify the program to the computer
   B. to identify and describe all data used in the program
   C. to tell the computer how to process the data
   D. none of the above

                                                                       2-8

8. The configuration section is part of the ________ division.
   A. data
   B. environment
   C. identification
   D. procedure

9.
Definition of all variables used in the program is done in the ________ division.
   A. data
   B. environment
   C. identification
   D. procedure

10. The following line would be found in ________ division of the program.
        DIVIDE COST-TOTAL BY MILES-TOTAL GIVING CPM.
   A. data
   B. environment
   C. identification
   D. procedure

ACTIVITY 2

Part 1: Fill in lesson 2 reference sheet on page ____. Use page references to answer any questions.

Part 2:

A. Steps to enter a program
   1. Power-up computer (use lesson 1 reference sheet as a guide)
   2. Type and enter CEDIT (COBOL's editor)
   3. Type and enter I (to insert new program)
   4. Type and enter FUEL-COSTS-L2 program found on page ____. If you discover an error in the line you typed, use the left arrow key to back up to the error and correct it.
   5. Press ____ when all lines have been entered.
   6. Enter P100:200 to display the first part of the program.
   7. Check these lines for errors. If you discover an error:
      Enter R lnum (lnum = line number containing error)
      Then enter corrected line
      When it asks you to enter another line, press ____.
   8. Enter P200:300 and check and correct in same manner.
   9. Enter P300:400 and check and correct in same manner.
   10. Enter P400:500 and check and correct in same manner.
   11. When all errors corrected:
       enter W FUELC2 (write program to file named FUELC2)
       enter Q (quit COBOL editor)

B. Steps to compile a program
   1. Enter: RSCOBOL FUELC2 T
   2. Take notes on any lines which contain errors.
   3. If there are any compiler errors:
      Enter CEDIT (COBOL editor)
      Enter L FUELC2 (load program file)
      Then make changes as before
      Enter W FUELC2 (write to program file)
      Enter Q (quit COBOL editor)
      Start over on step B.1.
   4. Continue if no errors.

C. Steps to run a program
   Enter RUNCOBOL FUELC2

   Beginning odometer reading    81200
   Ending odometer reading       81452
   Liters of Gas used            38.3
   Cost                          12.18
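The arithmetic the lesson's fuel-cost program carries out can be sketched as follows; the function and variable names are illustrative, and the 3.78 liters-per-gallon conversion factor is the one shown in the lesson's DIVIDE statement.

```python
# Sketch (not the study's actual COBOL program) of the fuel-cost
# computation the lesson activity runs: trip distance, gallons of
# gas (converted from liters), MPG, and cost per mile.
def fuel_report(begin_odom, end_odom, gas_liters, cost):
    miles_total = end_odom - begin_odom   # trip distance in miles
    gas_gal = gas_liters / 3.78           # liters converted to gallons
    mpg = miles_total / gas_gal           # miles per gallon for the trip
    cpm = cost / miles_total              # cost per mile for the trip
    return miles_total, gas_gal, mpg, cpm
```

With the sample run's odometer readings (81200 to 81452), the trip distance works out to 252 miles; the remaining figures follow from the liters and cost entered.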
   Trip distance = ________
   Gallons of Gas used = ________
   MPG for trip = ________
   Cost per mile for trip = ________

LESSON 2 REFERENCE SHEET

COMPILATION PROCESS:

   ____________  -->  ____________  -->  ____________

DIVISIONS OF A COBOL PROGRAM:

   Division           Contents
   Identification
   Environment
   Data
   Procedure

STEPS TO ENTER A PROGRAM

   1. Power-up computer (see Lesson 1 Reference Sheet)
   2. To start up COBOL Editor:
   3. To insert new program:
   4. To correct error in current line use:
   5. To stop entering lines:
   6. To display lines 200 through 300:
   7. To correct error in line 150:
   8. To write program to file, SAMPLE:
   9. To exit COBOL Editor:
   10. To compile program, SAMPLE:
   11. To load file, SAMPLE, after entering COBOL Editor:
   12. To run program, SAMPLE:

APPENDIX B

SAMPLE TEACHER COBOL MATERIALS

[Lesson 2: Introduction to COBOL -- teacher pages covering the lesson Overview, Objectives, Review, the compilation process, the divisions of a COBOL program, the Function of a Computer Language, and the Definition of Important Terms]
APPENDIX C

COBOL TEST
ANALYSIS OF COBOL TEST CONTENT

COBOL TEST

STUDENT # ________

I. Fill in the blanks:

1. PROGRAM-ID. and AUTHOR. are referred to as ________ within the IDENTIFICATION DIVISION.

2. The ________ DIVISION of a COBOL program supplies information about the type of computer used.

3. A COBOL paragraph is made up of ________.

4. A section within the DATA DIVISION is the ________ section.

5. The COBOL instructions to be executed in a program are contained in the ________ DIVISION.

6. Explain the purpose of the following COBOL instructions:
       010-CALCS.
           MULTIPLY HOURS BY RATE GIVING GROSS-PAY.
           MULTIPLY GROSS-PAY BY SOC-SEC-RATE GIVING TAXES.
           SUBTRACT TAXES FROM GROSS-PAY.
   Purpose:

7. In the COBOL statement 77 PAY-RATE PIC 99V9, PAY-RATE is defined as being ________ digits long.

8. After each COBOL statement is executed, give the resulting values of A, B, C, and D:

                                     Values of
   COBOL Statements               A     B     C     D
   (initial values)               2     4     6
   ADD A, B, C GIVING D.
   SUBTRACT A FROM D.
   MULTIPLY B BY A.
   DIVIDE A BY B GIVING A.

9. All paragraph names must be followed by a ________.

10. In COBOL, a comma is most often followed by a ________.

11. Besides letters and numbers, ________ are often used in a variable name or paragraph name for ease in readability.

12. A. Problem Statement: A salesman's monthly commission is computed according to the following table:

       Sales               Commission
       Less than $5,000    $250 + 5% of sales
       $5,000 or more      $250 + 10% of sales

    B. Diagram Solution:

       Sales less than $5,000?
         Yes: Multiply sales by .05 to get partial commission
         No:  Multiply sales by .10 to get partial commission
       Add partial commission to 250
       Display commission

    C.
Program Solution:

       IF ____________________________
           MULTIPLY SALES BY .05 GIVING PARTIAL-COMMISSION
       ELSE
           MULTIPLY SALES BY ____ GIVING ____________________.
       ADD PARTIAL-COMMISSION TO ____ GIVING TOTAL-COMMISSION.
       DISPLAY "TOTAL COMMISSION IS", ____________________.

II. Read the following program carefully and answer the questions which follow it.

    000100 IDENTIFICATION DIVISION.
    000110 PROGRAM-ID. FINAL-EXAM.
    000120 AUTHOR. LYN ALLEN.
    000130
    000140 ENVIRONMENT DIVISION.
    000150 CONFIGURATION SECTION.
    000160 SOURCE-COMPUTER. MODELIII.
    000170 OBJECT-COMPUTER. MODELIII.
    000180
    000190 DATA DIVISION.
    000200 WORKING-STORAGE SECTION.
    000210 77 RATE          PIC 99V99.
    000220 77 HOURS         PIC 99.
    000230 77 PAY           PIC 999V99.
    000240 77 EXTRA-HOURS   PIC 99.
    000250 77 DOUBLE-RATE   PIC 99V99.
    000260 77 EXTRA-MONEY   PIC 999V99.
    000270 77 PART-OF-PAY   PIC 999V99.
    000280 77 PAY-FINAL     PIC $$,999.99.
    000290
    000300 PROCEDURE DIVISION.
    000310
    000320     DISPLAY "ENTER HOURS WORKED".
    000330     ACCEPT HOURS.
    000340     DISPLAY "ENTER PAY RATE".
    000350     ACCEPT RATE.
    000360 020-CALCULATIONS.
    000370     IF HOURS IS GREATER THAN 40
    000380         MULTIPLY 40 BY RATE GIVING PART-OF-PAY
    000390         SUBTRACT 40 FROM HOURS GIVING EXTRA-HOURS
    000400         MULTIPLY RATE BY 2 GIVING DOUBLE-RATE
    000410         MULTIPLY EXTRA-HOURS BY DOUBLE-RATE GIVING EXTRA-MONEY
    000420         ADD PART-OF-PAY, EXTRA-MONEY GIVING PAY
    000430     ELSE
    000440         MULTIPLY RATE BY HOURS GIVING PAY.
    000450
    000460 030-DISPLAY-PAY.
    000470     MOVE PAY TO PAY-FINAL.
    000480     DISPLAY "TOTAL PAY IS", PAY-FINAL.

Questions:

1. What is the purpose of the program?

2. If someone were to run this COBOL program, several lines would be displayed on the computer screen. Rearrange the following lines by numbering them to show the order in which they would appear on the screen.

       ENTER PAY RATE
       30
       TOTAL PAY IS $150.00
       ENTER HOURS WORKED
       0500

3. Why is RATE multiplied by 2 in line 400?

4. If HOURS is greater than 40, why can't we just multiply HOURS by RATE to get a person's pay?

5. Suppose that a person worked 40 hours and his pay was $4.50 an hour. How much pay would he receive?
6. Suppose that a person worked 50 hours and her rate was $5.00 per hour. According to the COBOL program given, how much pay would she receive?

APPENDIX C
ANALYSIS OF COBOL TEST CONTENT

            Part I                                 Part II
Question   Category   Point Value      Question   Category   Point Value
    1      Grammar        3                1      Reading        8
    2      Grammar        3                2      Reading        8
    3      Grammar        3                3      Reading        8
    4      Grammar        3                4      Reading        8
    5      Grammar        3                5      Reading        8
    6      Reading        3                6      Reading        8
    7      Grammar        3
    8      Grammar        6
    9      Grammar        3
   10      Grammar        3
   11      Grammar        3
   12      Writing       15

APPENDIX D

STUDENT REACTION FORM

Lesson # ________    Student # ________

In this class you are using Radio Shack microcomputers and some new teaching materials to learn COBOL. The person who developed these materials and COBOL teachers are interested in getting your reactions to these materials. Your reaction will help these people as they evaluate the usefulness of the materials. Do not write your name on this form.

PART I

Please fill in the requested information.

Date ________
Grade ________
Have you taken other computer courses? ________
What was the last math course you took or what math course are you taking now? ________
Overall grade point average ________

PART II

Please check (x) the response which best describes how you feel about each of the statements below:

KEY: SA means you strongly agree
     A means you agree
     U means you are uncertain
     D means you disagree
     SD means you strongly disagree

1. The goals of this lesson were clear.                                       SA A U D SD
2. The lesson was very interesting.                                           SA A U D SD
3. There was too much information in this lesson.                             SA A U D SD
4. There was too much repetition of ideas.                                    SA A U D SD
5. The examples used to illustrate main points were excellent.                SA A U D SD
6. It was helpful to take notes while the lesson was being presented.         SA A U D SD
7. I liked it when my COBOL instructor used an overhead to teach.             SA A U D SD
8. I did well on the quiz at the end of the lesson.                           SA A U D SD
9. I knew enough about COBOL to do the activities at the end of the lesson.   SA A U D SD
10.
I would have preferred to learn the material in this lesson on my own, without help from my teacher.    SA A U D SD
11. I would encourage COBOL teachers and others to develop more lessons similar to this one.    SA A U D SD

PART III

Please check (x) the response which best describes how well you think you did on each of the lesson's objectives.

Objective 1: Define COBOL and list several of its important features.
    ____ very well    ____ well    ____ fairly well    ____ poorly

Objective 2: List the components involved in the Disk Operating System.
    ____ very well    ____ well    ____ fairly well    ____ poorly

Objective 3: Explain the purpose of the diskette.
    ____ very well    ____ well    ____ fairly well    ____ poorly

Objective 4: Power up and power down the disk system.
    ____ very well    ____ well    ____ fairly well    ____ poorly

PART IV

Please complete the following:

1. The best part of the lesson was

2. If one part of the lesson had to be eliminated, I would recommend eliminating

APPENDIX E

TEACHER REACTION FORM

Lesson # ________    Teacher # ________

PART I

Please fill in the requested information.

Date ________
How many computer courses have you taken? ________
Have you taught a programming class before? ________
How would you rate your knowledge of COBOL? ________
How would you rate your knowledge of microcomputers? ________
What is your main teaching area? ________

PART II

Please check (x) the response which best describes how you feel about each of the statements below:

KEY: SA means you strongly agree
     A means you agree
     U means you are uncertain
     D means you disagree
     SD means you strongly disagree

1. The goals of this lesson were clear.                                       SA A U D SD
2. There was too much information in this lesson.                             SA A U D SD
3. There was too much repetition in this lesson.                              SA A U D SD
4. The examples used to illustrate main points were excellent.                SA A U D SD
5. Structured notetaking seemed to be valuable for most students.             SA A U D SD
6. The students seemed to do well on the quiz at the end of the lesson.       SA A U D SD
7. Overall, the teaching materials helped me effectively teach this lesson.   SA A U D SD
8.
It was enjoyable to teach this lesson.                                        SA A U D SD
9. I had sufficient COBOL knowledge to teach this lesson.                     SA A U D SD
10. I would encourage COBOL teachers and others to develop more lessons similar to this one.    SA A U D SD

PART III

Please check (x) the response which best describes how well you think the teaching materials helped students to achieve each of the lesson's objectives.

Objective 1: Define COBOL and list several of its important features.
    ____ very well    ____ well    ____ fairly well    ____ poorly

Objective 2: List the components involved in the Disk Operating System.
    ____ very well    ____ well    ____ fairly well    ____ poorly

Objective 3: Explain the purpose of the diskette.
    ____ very well    ____ well    ____ fairly well    ____ poorly

Objective 4: Power up and power down the disk system.
    ____ very well    ____ well    ____ fairly well    ____ poorly

PART IV

Please identify (briefly) the major deficiencies or instructional problems you encountered in teaching this lesson. Any suggestions for eliminating these problems would also be appreciated.

BIBLIOGRAPHY

Biggs, J. B. Information and Human Learning. Glenview: Scott, Foresman and Company, 1971.

Bovy, Ruth C. "Successful Instructional Methods: A Cognitive Information Processing Approach." Educational Communication and Technology - A Journal of Theory, Research, and Development 4 (Winter 1981): 203-17.

Bradley, Robert L. "COBOL - Ready and Waiting." 80 Microcomputing, July 1981, pp. 116-20.

Briggs, Leslie J., and Gagne', Robert M. Principles of Instructional Design. New York: Holt, Rinehart and Winston, Inc., 1974.

Briggs, Leslie J. Sequencing of Instruction in Relation to Hierarchies of Competence. Pittsburg: American Institute for Research, 1968.

Calkins, Andrew. "From Classroom to Commerce: How Computer-Literate Teachers are Making Their Skills Pay Off." Electronic Learning, September/October 1981, pp. 42-45.

Cheney, Paul. "Cognitive Style and Student Programming Ability: An Investigation." AEDS Journal 13 (Summer 1980): 285-91.

Cherniak, B.
"Introductory Programming Reconsidered: A User-oriented Approach." SIGCSE Bulletin 8 (1976): 65-68.

Costa, Arthur L. "Teaching for Intelligent Behavior." Educational Leadership 39 (October 1981): 29-32.

Davis, Robert H., and others. Learning System Design. New York: McGraw-Hill, 1974.

Dennis, J. Richard. "Between What Was and What Might Be - Computers in Schools Today." AEDS Monitor 19 (October/November/December 1980): 10-11.

Duchastel, Philippe C., and Merrill, Paul F. "The Effects of Behavioral Objectives on Learning: A Review of Empirical Studies." Review of Educational Research 43 (Winter 1973): 53-69.

Glaser, Robert. "Components of a Psychology of Instruction: Toward a Science of Design." Review of Educational Research 46 (Winter 1976): 1-24.

Gray, J. D. "Predictability of Success and Achievement Level of Data Processing Technology Students at the Two-year Post-secondary Level." Ph.D. dissertation, Georgia State University, 1974.

Hartley, James, and Davies, Ivor K. "Preinstructional Strategies: The Role of Pretests, Behavioral Objectives, Overviews, and Advance Organizers." Review of Educational Research 46 (Spring 1976): 239-65.

Hedges, William. "Teachers, Computers Meet." AEDS Bulletin 6 (July 1981): 12.

Hunter, Beverly, and others. Learning Alternatives in U.S. Education: Where Student and Computer Meet. Englewood Cliffs: Educational Technology Publications, Inc., 1974.

Isaac, Stephen, and Michael, William B. Handbook in Research and Evaluation. San Diego: EDITS Publishers, 1981.

Kerlinger, Fred N., and Pedhazur, Elazar. Multiple Regression in Behavioral Research. New York: Holt, Rinehart, and Winston, Inc., 1973.

Kimura, T. "Reading Before Composition." SIGCSE Bulletin 11 (1979): 162-66.

Lemos, Ronald S. "A Cooperative Study of the Effectiveness of Team Interaction in COBOL Programming Language Learning." Ph.D. dissertation, UCLA, 1977.

Lemos, Ronald S. "Measuring Programming Language Proficiency." AEDS Journal 13 (Summer 1980): 261-73.
Levin, Tamar, and Long, Ruth. Effective Instruction. Alexandria: Association for Supervision and Curriculum Development, 1981.

Luehrmann, Arthur. "A Nation at Risk - Implications for Computer Science Education." AEDS Monitor 22 (November/December 1983): 22-26.

Mazlack, L. J. "Identifying Potential to Acquire Programming Skill." Communications of the ACM 23 (1980): 14-17.

McClure, James. "Microsoft vs. Micro Focus COBOL." Creative Computing, March 1980, pp. 20-29.

McKenzie, Gary R. "Improving Instruction Through Instructional Design." Educational Leadership 37 (May 1980): 664-68.

Milner, Stuart D. "An Analysis of Computer Education Needs for K-12 Teachers." Proceedings of the National Educational Computing Conference. Iowa City, Iowa: n.p., 1979, pp. 27-30.

Mock, J., Estrin, L., and Vasarhelyi, A. Learning Patterns, Decision Time, Decision Approach, Information Structure, and Value of Information in the Information Structure Experiments: AIS Working Paper No. 70-8. Los Angeles: Accounting and Information Systems Research Program, Graduate School of Business Administration, University of California at Los Angeles, [1970-71].

Norris, Cathleen M., and Lumsden, Barry. "Functional Distance and the Attitudes of Educators Toward Computers." T.H.E. Journal 11 (January 1984): 129-32.

Parent, Joseph, and others. "Interactive Effects of Teaching Strategy and Personal Locus of Control on Student Performance and Satisfaction." Journal of Educational Psychology 67 (1975): 764-69.

Patton, Michael Quinn. Qualitative Evaluation Methods. Beverly Hills: Sage Publications, 1980.

Petersen, Chas. C., and Howe, Trevor G. "Predicting Academic Success in Introduction to Computers." AEDS Journal 12 (Summer 1979): 182-91.

Prentice, Lloyd R. "The Lowdown on Languages." Classroom Computer News, July/August 1981, pp. 40-41.

Shelley, Gary B., and Cashman, Thomas J. Introduction to Computer Programming: Structured COBOL. Fullerton: Anaheim Publishing Company, 1977.

Singhania, Ram P.
"Issues in Teaching the Introductory Course in Computers in Business Curriculum." AEDS Journal 14 (Fall 1980): 45-51.

Vensel, Cyndi. "Schools Begin to Feel Impact of Home Computing." School Microcomputing Bulletin 3 (June 1984): 223-24.

Watt, Daniel H. "Computer Literacy: What Should Schools Do About It?" Instructor and Teacher, October 1981, pp. 85-87.

Weinberg, G. The Psychology of Computer Programming. New York: Van Nostrand Reinhold, 1971.

Yourdon, E. Techniques of Program Structuring and Design. Englewood Cliffs: Prentice Hall, 1975.

General References

Ausburn, Lynna J., and Ausburn, Floyd B. "Cognitive Styles: Some Information and Implications for Instructional Design." Educational Communication and Technology - A Journal of Theory, Research, and Development 26 (Winter 1978): 337-54.

Hassell, Johnette. "A Paradigm for Developing a Computer Science Program." AEDS-81 Convention Proceedings. Minneapolis, Minnesota: n.p., 1981, pp. 125-28.

Hirschbuhl, Katherine. "The Need for Computer Literacy and Computer Applications in the Nation's Classrooms." J. Educational Technology Systems 9 (1980-81): 183-91.

McLamb, Ken. "Personal Computers in Business - The View From the Data Processing Shop." Personal Computing, September 1981, pp. 30-40.

National Educational Computing Conference. Proceedings of the 1979 Meeting. Iowa City, Iowa: n.p., 1979.

National Educational Computing Conference. Proceedings of NECC/2. Norfolk, Virginia: n.p., 1980.

National Educational Computing Conference. Proceedings of NECC 1981. Denton, Texas: n.p., 1981.

Norusis, Marija J. SPSS Introductory Guide: Basic Statistics and Operations. New York: McGraw-Hill, 1982.

Richardson, Gary L., and others. A Primer on Structured Program Design. New York: Petrocelli Books, Inc., 1980.

Rose, Janet S., and Medway, Frederic J. "Measurement of Teachers' Beliefs in Their Control Over Student Outcome." The Journal of Educational Research 74 (January/February 1981): 185-90.

Schloss, Lisa, and Ball, Leslie D.
"Computerized Education: Should We or Shouldn't We?" AEDS Monitor 20 (July/August/September 1981): 18-21.

Sherman, Thomas M., and Giles, Mary B. "The Development and Structure of Personal Control in Teachers." The Journal of Educational Research 74 (January/February 1981): 139-42.

Smith, Lyle R., and Sanders, Kay. "The Effects on Student Achievement and Student Perception of Varying Structure in Social Studies Content." The Journal of Educational Research 74 (January/February 1981): 333-36.

Snelbecker, Glenn E. Learning Theory, Instructional Theory, and Psychoeducational Design. New York: McGraw-Hill, 1974.

Stephens, Larry J., and others. "Group Differences in Computer Science Aptitude." AEDS Journal 14 (Winter 1981): 84-95.

Stevens, Dorothy Jo. "How Educators Perceive Computers in the Classroom." AEDS Journal 13 (Spring 1980): 221-32.

Tennyson, Robert D., and Park, Ok-Choon. "The Teaching of Concepts: A Review of Instructional Design Research Literature." Review of Educational Research 50 (Spring 1980): 55-70.

Trump, J. Lloyd. "School Uses of Computers in the 80's." AEDS Monitor 18 (April/May/June 1980): 5-6.

Van Tassel, Dennis. Program Style, Design, Efficiency, Debugging, and Testing. Englewood Cliffs: Prentice Hall, 1978.

Wise, John E. Methods of Research in Education. Boston: D. C. Heath and Co., 1967.

Witkin, H. A., and others. "Field-Dependent and Field-Independent Cognitive Styles and Their Educational Implications." Review of Educational Research 17 (Winter 1977): 1-64.

Witt, Paul W. F. Technology and the Curriculum. Columbia University: Teachers College Press, 1968.