This is to certify that the dissertation entitled

TESTING THE VALIDITY AND RELIABILITY OF THE AUTOMATED CROSS-REFERENCING OCCUPATIONAL SYSTEM: A TASK-BASED COMPUTER-ASSISTED PROGRAM PLANNING AND DEVELOPMENT RESOURCE

presented by Carol Elaine Culpepper has been accepted towards fulfillment of the requirements for the Ph.D. degree in Educational Administration.

Major professor

Date: October 17, 1990

MSU is an Affirmative Action/Equal Opportunity Institution
TESTING THE VALIDITY AND RELIABILITY OF THE AUTOMATED CROSS-REFERENCING OCCUPATIONAL SYSTEM: A TASK-BASED COMPUTER-ASSISTED PROGRAM PLANNING AND DEVELOPMENT RESOURCE

By

Carol Elaine Culpepper

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

Department of Educational Administration

1990

ABSTRACT

TESTING THE VALIDITY AND RELIABILITY OF THE AUTOMATED CROSS-REFERENCING OCCUPATIONAL SYSTEM: A TASK-BASED COMPUTER-ASSISTED PROGRAM PLANNING AND DEVELOPMENT RESOURCE

By

Carol Elaine Culpepper

This research involved a study of the validity and reliability of job tasks contained in the Automated Cross-Referencing Occupational System (ACROS). ACROS is a national computer-generated system that provides educators with access to job-related information relative to the design of employment and training programs.

The task-inventory study encompassed two of the ACROS task lists, Bank Teller and Electronics Mechanic. A random sample of incumbent workers and supervisor-managers was selected from among the industries in which the jobs covered by the task lists were concentrated. The banking industries were drawn from establishments located in Michigan and Georgia; the electronics industries were selected from businesses located in Michigan and New Jersey. Five questions, six hypotheses, and eight job-related variables were posed to guide the collection and analysis of data.

Validity and reliability were evidenced for the ACROS Bank Teller tasks, but only a measure of reliability was obtained for the ACROS Electronics Mechanic tasks. The data also indicated that validity of tasks may vary as a function of job variables: geographic location, length of employment, type of business/industry, and size of business.
The data further indicated that differences in responses made by incumbent workers and supervisor-managers, relative to their selection of tasks to match employees' job assignments and their ratings of the importance of the selected tasks to employees' jobs, are also important factors to consider in task-validation studies.

Copyright by
CAROL ELAINE CULPEPPER
1990

This dissertation is dedicated to my late parents, Charles Theodore White and Anna Virginia Anderson White. Their belief in family and in education provided me with the foundation necessary to engage in educational inquiry.

ACKNOWLEDGMENTS

My sincere gratitude is extended to Dr. Cas Heilman, my dissertation chairman and adviser. His encouragement and knowledge of educational training programs provided the expertise necessary for completion of this study.

I also wish to express my sincere appreciation to the members of my doctoral committee: Dr. Charles A. Blackman, whose support throughout my graduate studies served to remind me of my goal; Dr. Richard E. Gardner, whose counsel helped me understand program planning for adults; and Dr. Castelle G. Gentry, whose insightful comments helped me focus on the importance of the study.

I especially want to thank Robert C. Sherer, Director, Michigan Occupational Information Coordinating Committee, Michigan Department of Labor, a member of my committee and mentor, who helped me understand the importance of labor market information in program planning.

I further wish to express my sincere gratitude to the Michigan Department of Education, Office of Minority Equity, for a King-Chavez-Parks Fellowship. The fellowship made it possible for me to complete this degree.

Special gratitude is also extended to those friends who contributed to the successful completion of this dissertation: Dr.
Chris Olson, an educational mentor who introduced me to task-based curriculum development; Drs. Joseph P. Hourihan and Jake Wamhoff, who encouraged me to pursue a doctorate; Dr. Gloria Kielbaso, for her constant support; and Gertrude Bonaparte, whose friendship, caring, and understanding helped me persevere.

Finally, I want to thank Elsie Kettunen for her support and assistance in preparing and generating the computerized data-analysis reports; Susan Cooley for her advice, editing, and conscientious preparation of this manuscript; and my son, William, for his patience, understanding, and support of my personal goal.

TABLE OF CONTENTS

LIST OF TABLES ....................... xi

LIST OF FIGURES ...................... xv

Chapter

I. INTRODUCTION TO THE STUDY ..............
   Statement of the Problem .............
   Purpose of the Study ...............
   Importance of the Study ..............
   Research Questions ................
   Hypotheses ....................
   Research Variables ................
   Assumptions and Limitations ............
   Definition of Terms ................
   Organization of the Dissertation .........

II. REVIEW OF RELATED LITERATURE AND RESEARCH ......
   Implications of Social, Economic, and Political Changes .....
   Proceeding From a Conceptual Framework .......
      Inter-Systems Models ..............
      Developmental Models ..............
      Systems Models .................
      Application of Systems Model--ACROS .......
   Structured Data-Collection Methods .........
      Overview of Job Analysis ............
      Job Analysis: Problem Identification ......
      Job Analysis in Occupational Education Environments .....
   Estimating Task Validity, Reliability, and Relevance .....
      Validity ....................
      Reliability ...................
      Relevance ....................
   Summary ......................

III. RESEARCH METHODOLOGY ................ 47
   Research Questions ................ 47
   Hypotheses .................... 49
   Procedure for Data Collection ........... 50
      Selection of the Task Lists ........... 50
      Determination of the Population ......... 51
      Development of the Instrument .......... 59
      Collection of the Data ............. 66
   Data-Analysis Procedures ............. 67
      Independent Variables .............. 68
      Coordination of Data .............. 69
      Statistical Analysis of Data .......... 69
   Summary ...................... 71

IV. RESULTS OF THE DATA ANALYSIS ............ 76
   Bank Teller (Paying and Receiving) Task List Inventory Results .... 77
      Background Information ............. 77
      Research Question 1 ............... 80
      Research Question 2 ............... 82
      Research Question 3 ............... 83
      Research Question 4 ............... 87
      Research Question 5 ............... 87
   Electronics Mechanic Task List Inventory Results .... 91
      Background Information ............. 91
      Research Question 1 ............... 94
      Research Question 2 ............... 96
      Research Question 3 ............... 97
      Research Question 4 ............... 101
      Research Question 5 ............... 102
   Summary ...................... 105

V. SUMMARY, CONCLUSIONS, RECOMMENDATIONS, AND IMPLICATIONS ... 107
   Summary ...................... 107
      Background ................... 107
      Purpose ..................... 108
      Procedure .................... 109
      Findings .................... 110
      Synthesis .................... 113
   Conclusions .................... 114
   Recommendations .................. 115
   Implications ................... 117
      Relevance of ACROS Task Information for Development of Programs .... 117
      Task-Validation Procedures ...........
      Continued Development of ACROS .........

APPENDICES

A. BANK TELLER TASK INVENTORY ............ 121
B. KUDER-RICHARDSON COMPUTATIONS: BANK TELLER .... 135
C. TABLE OF TASKS BY SIGNIFICANCE: BANK TELLER .... 138
D. TABLE OF TASKS BY EMPLOYEE TYPE, TASK INVENTORY DATA SUMMARY--SIGNIFICANCE VALUES, AND TABLE OF TASKS BY LENGTH OF EMPLOYMENT: BANK TELLER .... 143
E. TABLE OF TASKS BY STATE: BANK TELLER ....... 146
F. TABLE OF TASKS BY INDUSTRY AND TABLE OF TASKS BY INDUSTRY SIZE: BANK TELLER .......... 147
G. ELECTRONICS MECHANIC TASK INVENTORY ........ 149
H. KUDER-RICHARDSON COMPUTATIONS: ELECTRONICS MECHANIC .... 169
I. TABLE OF TASKS BY SIGNIFICANCE: ELECTRONICS MECHANIC .... 173
J. TABLE OF TASKS BY EMPLOYEE TYPE, TASK INVENTORY DATA SUMMARY--SIGNIFICANCE VALUES, AND TABLE OF TASKS BY LENGTH OF EMPLOYMENT: ELECTRONICS MECHANIC .... 184
K. TABLE OF TASKS BY STATE: ELECTRONICS MECHANIC .... 189
L. TABLE OF TASKS BY INDUSTRY AND TABLE OF TASKS BY INDUSTRY SIZE: ELECTRONICS MECHANIC ....... 190

BIBLIOGRAPHY ....................... 192

LIST OF TABLES

Table                                                    Page

3.1 V-TECS and This Study's Task-Validation Procedures .... 48
3.2 Bank Teller--Cross-Code Index ............ 55
3.3 Electronics Mechanic--Cross-Code Index ........ 56
3.4 Four-Digit SIC Codes Used to Select a Stratified Sample of Industries for Participation in the Bank Teller Task Inventory .... 57
3.5 Four-Digit SIC Codes Used to Select a Stratified Sample of Industries for Participation in the Electronics Mechanic Task List Inventory .... 58
3.6 Cross-Index Summary Table of the Study's Questions, Hypotheses, Variables, and Statistical Measures .... 73
3.7 Definition of Tests of Significance Employed in the Study .... 75
4.1 Bank Teller: Distribution of Respondents by Length of Employment in the Occupation .... 78
4.2 Bank Teller: Distribution of Respondents Relative to Their Places of Employment .... 78
4.3 Bank Teller: Distribution of Respondents' Answers Relative to Clarity of the Inventory's Task Items .... 79
4.4 Bank Teller: Distribution of Respondents' Answers Relative to Clarity of Instructions for Completing the Inventory .... 79
4.5 Bank Teller: Distribution of Respondents' Answers Relative to Representation of the Inventory's Tasks to Paying and Receiving Employees' Jobs .... 80
4.6 Bank Teller: Summary Data for the Range of Respondents' Task Matches .... 81
4.7 Bank Teller: Estimates of Reliability of Respondents' Task-Selection Responses .... 82
4.8 Bank Teller: Distribution of Respondents' Task-Significance Ratings .... 83
4.9 Hypothesis 1--Bank Teller: Test of Significance Relative to the Two Respondent Groups' Selected Task Matches .... 84
4.10 Hypothesis 2--Bank Teller: Test of Significance Relative to the Significance (Importance) Values Respondent Groups Assigned to Their Selected Task Matches .... 85
4.11 Hypothesis 3--Bank Teller: Analysis of Variance Summary Table for Respondents' Selected Task Matches Relative to Length of Employment .... 86
4.12 Hypothesis 3--Bank Teller: Scheffe's Test for Comparing Respondents' Task Matches .... 86
4.13 Hypothesis 4--Bank Teller: Test of Significance Summary Table for Respondents' Task Matches by Geographic Location .... 87
4.14 Hypothesis 5--Bank Teller: Analysis of Variance Summary Table for Respondents' Selected Task Matches by Type of Industry .... 88
4.15 Hypothesis 5--Bank Teller: Scheffe's Test for Comparing Industry Subgroups' Selected Task Matches .... 89
4.16 Hypothesis 6--Bank Teller: Test of Significance Summary Table for Respondents' Selected Task Matches by Size of Industry .... 90
4.17 Electronics: Distribution of Respondents by Length of Employment in the Occupation .... 92
4.18 Electronics: Distribution of Respondents Relative to Their Places of Employment .... 92
4.19 Electronics: Distribution of Respondents' Answers Relative to Clarity of the Inventory's Task Items .... 93
4.32 Hypothesis 6--Electronics: Test of Significance Summary Table for Respondents' Selected Task Matches by Size of Industry .... 105
4.33 Summary Table of Hypotheses for Bank Teller and Electronics Mechanic .... 106

LIST OF FIGURES

Figure                                                   Page

1.1 Input-Output Process for ACROS's Task Information .....
7

5.1 Conceptual Model for Developing Computer-Assisted Occupational Program Planning and Development Resources ..... 116

CHAPTER I

INTRODUCTION TO THE STUDY

Occupational programs are an important part of the educational system. Through these programs, thousands of youths and adults acquire knowledge, skills, and attitudes that enable them to become productive members of society. These programs also serve the educational needs of business and industry. They provide a means by which an industry's employees are retrained to function in an ever-changing technological work environment.

Job-related information and data are essential in the design of occupational programs (Frugoli, 1983; Gael, 1983). Information that delineates the tasks workers perform within an occupation, for example, helps curriculum developers and instructional staff to establish program objectives. Counselors, on the other hand, use occupational-characteristic information such as aptitudes, temperaments, physical demands, and working conditions to help individuals make career and program choices. Finally, occupational information that reflects employers' job descriptions, job-applicant requirements, and criteria for the evaluation of employees' job performance helps placement personnel match program graduates with employment openings more effectively.

Educators frequently acquire job-related information from occupational advisory committees and publications (Caplan, 1983; Clark, 1983). The use of these methods to acquire job-related information is effective when the labor market is stable, but these methods are less effective when rapid changes, such as those currently evidenced, occur in the way work is performed and in the kinds of occupations that are in demand. Concerned about keeping occupational programs relevant to the demands of employers, several program developers searched for alternative methods to use in acquiring occupational information.
One alternative that became prominent in educational-program-development environments was the use of computer-assisted occupational information systems. Some program developers, for example, used computerized occupational information systems administered by their State Occupational Information Coordinating Committee (SOICC) agency. Established by Congress in 1976, the SOICC agencies and the National Occupational Information Coordinating Committee (NOICC) are charged with the responsibility of compiling occupational information from various federal and state sources. These agencies are also responsible for coordinating interagency delivery systems for the dissemination of an occupational information program. The major focus of these programs is to provide individuals with employment projections for state and local labor markets (Frugoli, 1983).

As an example, the Michigan Occupational Information Coordinating Committee's (MOICC) program, Occupational Projections and Training Information for Michigan (OPTIM), contains "employment projections for over 500 occupations, supply and demand information for approximately 110 clusters of occupations and training programs and occupational staffing patterns for some 400 industries" (Michigan Occupational Information Coordinating Committee, in press, p. 1). Developed by the Michigan Employment Security Commission's Bureau of Research and Statistics, OPTIM is frequently used by Michigan's educators during the planning stage of an occupational program. For instance, OPTIM's information on state and regional labor market employment outlooks, job openings, supply-demand conditions, and major employers helps program developers determine whether a need exists for the development of a program.
Even though the occupational information systems that are administered by the State and National Occupational Information Coordinating Committees provide program planners with information relevant to program decisions, their systems do not include information that details the tasks that are associated with occupations. Such information is accessible through another computerized program, the Automated Cross-Referencing Occupational System (ACROS). Formerly known as the Michigan Occupational Data Analysis System (1983 to 1988), the ACROS was developed as a result of concerns raised by educators. These concerns centered on the need for educators to be able to access educational-occupational information quickly and cost effectively (McCage & Olson, 1986; Olson, 1987).

Funded by the Michigan State Board of Education, the ACROS was developed by staff members of Michigan State University's Curriculum Projects and Career Resource Center in 1983. In addition to Michigan State University staff, representatives from several organizations and agencies were involved in planning the program. These included secondary and postsecondary vocational educators; labor-market analysts; personnel from the Michigan Department of Labor's State Occupational Information Coordinating Committee, Bureau of Research and Statistics, and Occupational Field Analysis Center; representatives from the Vocational-Technical Education Consortium of States (V-TECS); and staff from the Michigan Department of Education's Adult Education and Vocational-Technical Education Service.

Education and labor market information and data are cross-referenced in the ACROS. For example, approximately 270 to 280 Classification of Instructional Programs (CIP) areas are represented in the ACROS's data base in the form of task-performance objective lists. In addition, there are tool and equipment lists for almost 80% of the system's CIPs.
Most of the task lists that are in the ACROS were coordinated from materials prepared by the V-TECS. The task lists are indexed to research-based occupational data that were compiled from resources published by the National Occupational Information Coordinating Committee and the U.S. Department of Labor. The occupational data include Dictionary of Occupational Titles (DOT) job titles and codes and their related information on occupational characteristics, aptitudes, temperaments, physical demands, and working conditions; General Educational Development (GED) computational-language levels; Specific Vocational Preparation (SVP) training-time indicators; and General Aptitude Test Battery (GATB) norm scores. Labor market classification codes--Workfields; Standard Occupational Classification (SOC); Materials, Product, Subject Matter and Services (MPSMS); and Occupational Employment Statistics (OES)--are also cross-referenced with ACROS task lists. The SOC and OES codes provide a means by which ACROS information can be correlated with occupational employment supply and demand data that are in the State Occupational Information Coordinating Committee's computer-generated systems.

A unique feature of the ACROS is its program menu. The menu contains a selection option that permits one to perform cross-occupational searches. As an example, if one's goal is to acquire a list of tasks related to a particular aspect of work, the list may be obtained by performing a search using key words or codes. Information obtained from a key word or code search includes task performance objectives that are compiled from one or several of the ACROS's CIP task list areas. The key word/code menu function is also frequently used by program planners as a means to explore and identify commonalities that may exist among the various occupational programs.

The ACROS is well received by many educators (Culpepper, 1987; Wolff, Jurdo, Kinnison, & Spooner, 1985).
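The cross-occupational key-word search described above can be illustrated with a minimal sketch. The following Python fragment is not the actual ACROS software; the data layout, task wording, and function name are hypothetical inventions used only to show the kind of search the menu option performs.

```python
# Hypothetical sketch of a cross-occupational key-word search, in the
# spirit of the ACROS menu option described above.  The task lists and
# the function are illustrative, not the actual ACROS data or code.

# Task lists indexed by CIP program area (toy data).
task_lists = {
    "Bank Teller": [
        "Cash checks for customers",
        "Balance cash drawer at end of shift",
        "Process savings deposits",
    ],
    "Electronics Mechanic": [
        "Solder circuit board connections",
        "Test components with a multimeter",
        "Balance amplifier output stages",
    ],
}

def cross_occupational_search(keyword):
    """Return (CIP area, task) pairs whose task text contains the keyword."""
    keyword = keyword.lower()
    return [(area, task)
            for area, tasks in task_lists.items()
            for task in tasks
            if keyword in task.lower()]

# Searching on "balance" surfaces tasks from both program areas,
# exposing a commonality between otherwise unrelated occupations.
for area, task in cross_occupational_search("balance"):
    print(area, "->", task)
```

A search such as this is what lets a program planner identify tasks shared across CIP areas without reading each task list in full.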
According to the results of a survey conducted in 1987 among 125 ACROS subscribers in 29 states, the District of Columbia, and three branches of the armed forces, the task lists (which represent work performed within an occupation) are the most frequently used components of the ACROS. The survey results also showed that program planners use the task lists primarily to develop and review curricula, design industry training and courses, and plan programs and special projects (Bibnotes, 1988; Culpepper, 1987). In addition, ACROS task lists are used by the Vocational-Technical Education Consortium of States (V-TECS) as a foundation for the development of criterion-referenced tests (McCage & Olson, 1986). Consequently, it is imperative that the ACROS's task information be valid and reliable.

Statement of the Problem

There is evidence that job requirements are rapidly changing. These changes have resulted in a need for occupational program developers to revise and/or develop occupational programs. This demands that program developers have access to job-related information and data. Many program developers, in addition to acquiring job information from advisory committees and publications, are obtaining information from computerized systems such as the ACROS. This system, which was developed in the latter part of 1983, contains task lists for an estimated 270 to 280 occupational program areas. The input and output processes through which task information is collected and disseminated are displayed in Figure 1.1.

V-TECS task-validation procedures:
1. States develop/validate task lists using V-TECS's procedures.
2. ACROS task lists are computerized.
3. Educators access ACROS to obtain tasks to plan programs.

Figure 1.1: Input-output process for ACROS's task information.

The procedures that the V-TECS employs in the collection and verification of task information are similar to those used by several job-analysis authorities.
These procedures, however, may not be broad enough to allow for the collection of macro-level data that are critical to the design of national computer-assisted occupational program resources. In particular, the V-TECS procedures that are used to promote validation of tasks, selection of the target population, and analyses of collected data may not yield information that meets job and program requirements nationwide.

Under the V-TECS procedures, for example, a task-validation study is usually conducted in only one state. The V-TECS procedures further indicate that the target population from which the sample is selected for task-validation studies is identified as follows:

To assure an adequate sampling return of at least 60%, no less than 50 persons are surveyed in any DOT. If less than 50 incumbent workers are identified in a specific job title, the sample for that DOT is not drawn until additional incumbent workers within businesses are identified or until the factors creating the relatively small number of incumbent workers are documented. (McCage, 1987, p. 39)

If the state performing the task-validation study is unable to meet the above procedural guidelines, it may "expand the data base to include incumbent workers from other states on the population sample selection list" (McCage, 1987, p. 40).

It is feasible that the validation of tasks in one state might result in valid and reliable job information. However, states are diverse in terms of their socioeconomic conditions, types of industries, and job locations. Therefore, it is possible that information collected through task-validation studies would be significantly different if the studies were conducted in more than one state. Ammerman (1977a) stated:

For most curriculum content purposes the objective is to survey the range of employment settings in which trainees over time might find themselves seeking employment.
With the high mobility of the population, this objective directs that a variety of geographic or regional locations, community sizes, employer sizes and employing industries be the focus of most occupational performance surveys. (p. 20)

A second area of concern, as it relates to V-TECS task-validation procedures, is the target population that V-TECS includes in their studies. The population, under V-TECS procedures, comprises incumbent workers. Yet researchers such as Ammerman (1977a) have indicated it is critical that supervisors as well as incumbent workers be included in task-validation studies. According to Ammerman, a measure of the representation of tasks to employees' work assignments is obtained when a comparison is made between incumbent workers' and their supervisors' task-to-job matches. Ammerman also indicated that tasks appropriate for inclusion in educational programs are identified by comparing incumbent workers' and their supervisors' responses in regard to the significance or importance of selected tasks to employees' job assignments.

A third area of concern, as it relates to V-TECS task-validation procedures, is the lack of attention given to analyzing and compiling job information at different levels of specificity. For example, under the V-TECS procedures, information that relates to incumbent workers' length of employment within their present jobs and the occupation is collected. This information, however, is not used in the data-analysis process.

Some researchers have indicated that subsample information, such as the correlation of job tasks performed by workers with the length of time they have been on their job, is critical to the design of various types of programs. For example, this kind of information helps educators structure entry-level, skill-upgrading, and career-change programs.
It also promotes decision making as it relates to program content, and ultimately helps program developers "set meaningful limits on the range of experience desired" (Ammerman, 1977a, p. 19).

The failure to include research methods that promote collection of job information relevant to employment requirements in labor markets nationwide gives rise to educators' concerns that ACROS task lists may not be valid or reliable. To address these concerns, it is imperative that a study of the validity and reliability of ACROS tasks be undertaken.

Purpose of the Study

The purpose of this study was to test the validity and reliability of selected tasks drawn from the ACROS. Related issues, representation of tasks relative to a given occupation and appropriateness of selected tasks for inclusion in formal education programs, were also investigated. A major objective of this study was to establish a conceptual framework for the continued development of computer-assisted occupational program planning and development resources.

Importance of the Study

The ACROS is the first computer-assisted technology resource designed to provide occupational program developers with job-content task information. Task information is essential in the development of comprehensive programs that are structured to help individuals obtain and/or maintain employment. Containing both labor market and educational materials, the ACROS has the potential to affect occupational program planning and development nationwide.

This research represented an effort to provide knowledge regarding the validity and reliability of selected task descriptions that are in the ACROS. The ACROS contains tasks for approximately 270 to 280 CIP areas, but only two of the ACROS task lists, Bank Teller and Electronics Mechanic, were included in this study.
The researcher concentrated on obtaining information about the match between selected tasks from the ACROS and the tasks of the occupation, and about the importance of the selected task-occupation matches to employees' work assignments. This information was collected from a group of representative employees, incumbent workers and their supervisor-managers, in two geographical locations for each task list.

As a result of this study, information that addresses concerns expressed by educators about the validity and reliability of ACROS task information was obtained. Further, information acquired from this study permits recommendations to be made about research procedures that promote collection of job information for inclusion in national occupational program planning and development computer-assisted technology resources. Occupational program planners, curriculum and instructional staff, counselors, placement personnel, employers, program participants, and educational technology resource developers will benefit from the information acquired through this study.

Research Questions

Using selected task lists from the ACROS, the following research questions were posed to fulfill the overall objective of this study:

1. Do selected task lists in the ACROS match the task assignments of employees?
2. How important, on a scale from 1 to 5, are the task-assignment matches to occupational employees' work assignments?
3. Do incumbent workers and their supervisor-managers respond to selected tasks in the same way?
4. How stable are the task-assignment matches across geographic boundaries?
5. How do task assignments match up across different segments of multi-industries, given the same occupation?

Hypotheses

The research hypotheses for this study, relative to the five questions and eight independent variables that are presented below, were:

Hypothesis 1. There will be no difference in the responses by incumbent workers and their supervisor-managers relative to selected task matches.

Hypothesis 2. There will be no difference in the responses by incumbent workers and their supervisor-managers relative to the significance (importance) values they assigned to their selected task matches.

Hypothesis 3. There will be no difference in the responses by respondents relative to selected task matches on the basis of length of employment.

Hypothesis 4. There will be no difference in the responses by respondents relative to selected task matches on the basis of geographical location of places of employment.

Hypothesis 5. There will be no difference in the responses by respondents relative to selected task matches on the basis of type of business establishment.

Hypothesis 6. There will be no difference in the responses by respondents relative to selected task matches on the basis of size of the business.

Research Variables

The following independent variables were used in analyzing the collected data:

1. Number of tasks selected by incumbent workers and their supervisor-managers as matching the task assignments of workers employed in the occupation.
2. The average significance (importance) values incumbent workers and their supervisor-managers assigned to the tasks they selected to match employees' job assignments.
3. A comparison between incumbent workers' and their supervisor-managers' selected task matches.
4. A comparison between incumbent workers' and their supervisor-managers' assignment of significance values to their selected task-to-job matches.
5. Comparison of respondents' responses on the basis of the geographical location of their places of employment.
6. Comparison of respondents' responses on the basis of the length of their employment in the job.
7. Comparison of respondents' responses based on the type of business establishment/industry in which they were employed.
8.
Comparison of respondents’ responses based on the size of the work force in their place of employment.

Assumptions and Limitations

The researcher assumed the following:

1. The task lists that are in the ACROS represent tasks performed by workers in a given occupational area.

2. The businesses selected for inclusion in the study would willingly participate in the occupational task inventory study.

3. The incumbent workers who participated in the study were typical of employees who work in the occupational domains that were represented in the study.

4. The supervisor-managers who participated in the study were typical of personnel who supervise the work of employees who work in the occupational domains included in the study.

The study was limited in the following ways:

1. Of the 270 to 280 task lists that are in the ACROS, two, Bank Teller and Electronics Mechanic, were included in the study.

2. Three states, Michigan, New Jersey, and Georgia, external to the states that originated the task lists, were included in the study.

3. The target population comprised businesses that were located in the industries in which the jobs represented in the task lists were concentrated.

4. The population sample comprised 200 businesses for each task list included in the study.

Definition of Terms

The following terms are defined with regard to the research:

Aptitude. Indicators of the ability level(s) needed to perform or learn the tasks of a specific job. Linked to General Aptitude Test Battery (GATB) norm scores and job titles, in the ACROS the level(s) appear as alphabetic codes: G, V, N, S, P, Q, K, F, M, E, C.

Automated Cross-Referencing Occupational System (ACROS). A computer-generated system that contains cross-referenced labor market, occupational, and educational information.

Classification of Instructional Program (CIP) codes. U.S. Department of Education codes that are used to identify and classify funded programs according to subject matter.

Content validity.
The degree to which selected tasks represent the occupational domain about which inferences are to be made.

Cross-occupational search. An ACROS menu option that permits retrieval of task and labor market information from one or more of the task lists that are indexed to CIP areas.

Dictionary of Occupational Titles (DOT). A book published by the U.S. Department of Labor, Employment and Training Administration, that contains descriptions of jobs.

Duty. A discrete unit of work that is composed of a cluster of related job tasks.

Entrant employee. An employee who has worked in the same job for three years or less.

Entrant supervisor. A person who has worked in a supervisory capacity for three years or less.

Experienced employee. An employee who has worked in the same job for three or more years.

Experienced supervisor. A person who has worked in a supervisory capacity for three or more years.

General Aptitude Test Battery (GATB). A standardized test. The numerical scale (1 to 5) of the test reflects the amount of aptitude associated with an occupation.

General Educational Development (GED). The computational and language ability levels that workers in a given occupation are expected to possess.

Incumbent worker. A person who is currently employed in an occupation.

Industry. Classification of businesses on the basis of their economic activity, e.g., wholesale, finance, transportation, manufacturing, service, retail, agriculture, mining.

Job analysis. A systematic way of reviewing a job to acquire information that describes the occupation.

Job performance. A measure of accepted, expected, or desired work activity.

Key words. Words that describe the products workers produce and the knowledge, processes, tests, machines, equipment, and work aids they use in performing a job.

Key word code. An alphabetic indicator that is assigned to a key word.

Large business/establishment. A firm with a total of 100 or more employees.

Materials, Products,
Subject Matter and Services (MPSMS) codes. Alphabetic designators that are employed to classify work on the basis of common factors.

Menu. A computer-screen display that specifies what choices or selections are available.

Occupational domain. A job or a group of similar jobs.

Occupational inventory (task inventory). A survey instrument that contains a list of tasks for a specific occupational domain.

Physical demand code. A code that is used to represent the physical conditions associated with a given job.

Program developer. An individual who develops employment-based educational programs.

Reliability. The consistency of incumbent workers’ and their supervisor-managers’ responses to selected tasks obtained from the ACROS.

Small business/establishment. A firm with a total of 100 or fewer employees.

Specific Vocational Preparation (SVP). Indicators that represent estimated time factors associated with the development of average job performance.

Standard Industrial Classification (SIC) code. A two-, three-, or four-digit numerical code that classifies business establishments according to their economic activity.

Standard Occupational Classification (SOC) code. A numerical coding system that cross-indexes jobs on the basis of labor market employment and income data. The codes reflect jobs that have been structured into homogeneous groupings.

Task. A discrete component of work performed by an employee as a part of the total job.

Task inventory. Same as occupational inventory.

Task list. A compilation of the skills, knowledge, and attitudes that represent the tasks of a given occupational domain.

Task validity. The extent to which respondents’ answers indicate a given task is a part of the work assignment of an employee in the occupation.

Temperament. An indicator of a personality trait that is required of an individual in specific job/worker situations.

Validation.
The process of verifying that tasks identified for an occupational domain represent the job assignments of employees.

Validity. The extent to which incumbent workers and their supervisor-managers indicate that selected tasks from the ACROS represent the job assignments of employees in a given occupation.

Vocational-Technical Education Consortium of States (V-TECS). A national organization of educators, including branches of the armed forces, that produces curriculum guides, catalogs, and criterion-referenced test items that are based on information obtained through job-task analysis research.

Workfield codes. A classification system that groups occupations on the basis of similar objectives into fields arranged from specific to general work functions.

Working condition code. A code that indicates the atmospheric and environmental conditions under which a job is performed by a worker.

Organization of the Dissertation

Five chapters, references, and appendices comprise this dissertation. Chapter I established a point of reference for this study. The content areas of the remaining four chapters are addressed in the following paragraphs.

Chapter II, Review of Related Literature and Research, focuses on issues and factors that affect the collection of task information for inclusion in national occupational computerized resource systems. The chapter is divided into four sections. Section one stresses the implications social, economic, and political conditions have for the development of occupational programs. In section two, communication and linkage models that promote the establishment of collaborative efforts among educators, individuals, and various organizations and agencies are previewed. Techniques and approaches applicable to the collection of task information are presented in section three. In the fourth section, methods to estimate the validity, reliability, and relevance of task information are explored.
Chapter III contains a review of the research methodology used in this study. Included are the procedures employed to identify and select the population, develop the instruments, and collect and analyze the data.

The data and the findings related to the respondent groups are presented in Chapter IV.

Chapter V consists of the summary, conclusions, recommendations, and implications.

CHAPTER II

REVIEW OF RELATED LITERATURE AND RESEARCH

The writer’s purpose in this chapter is to review the related literature and research pertaining to the validity and reliability of data and information dissemination through a task-based computer-assisted program planning and development resource, ACROS.

Before 1980, computers in educational environments were limited primarily to functions such as record keeping, scheduling, counseling, self-paced instruction, and test generation. As the cost of computers and software became more affordable, educators increased their use of computers. The increased use of computers was most noticeable in the areas of administration and management (Adams & Fuchs, 1986; Bluhm, 1987; Hannafin, Dalton, & Hooper, 1987; Oborne, 1985; Protheroe, Carroll, & Zoetis, 1982).

In educational settings where occupational, employment-based programs were developed, affordable technology resulted in two major changes. First, the use of existing computer-generated information systems increased. Second, computer systems were developed that promoted decision making related to the design and delivery of occupational programs. Some of the existing systems included programs administered by the State Occupational Information Coordinating Committee agencies, and data bases such as Educational Resources Information Center (ERIC), Resources in Vocational Education (RIVE), and Vocational Education Curriculum Materials (VECM) (Budke & Charles, 1986; Frugoli, 1983).
Two of the "newly" developed systems were the Criterion-Referenced Test Item Bank and the Automated Cross-Referencing Occupational System (McCage, 1986; State of Michigan, 1988).

Technology-based information systems, like any other products, must go through a research and development process. As a part of the process, the systems must be evaluated to (a) determine the extent to which they serve their intended purpose and (b) determine whether the system’s content is valid and reliable. Unfortunately, product research has often been neglected by educators. According to Tyler (1976), there has been limited product research because of uncertainties on the part of some educators about what research entails and what can be expected from research.

To provide a conceptual framework and perspective for this study, which focuses on testing the validity and reliability of selected task lists from the ACROS, the remainder of this chapter is divided into four sections:

1. Implications of social, economic, and political changes for the collection of job-related information.

2. Conceptual models that facilitate communication and linkages between educational institutions and the constituents served by education.

3. Structured data-collection methods that promote verification and validation of tasks for an occupational domain.

4. Estimating task validity, reliability, and relevance.

Implications of Social, Economic, and Political Changes

The nation’s social and economic structures have undergone tremendous changes in the last decade. The demographic data indicate that there are observable changes in the age, sex, mobility patterns, and educational levels of the population (U.S. Department of Commerce, 1989).
Changes in the nation’s economic structures include competition from world markets, decline in manufacturing, demand for new products, "new" types of managerial practices, and advanced technologies (Beckendorf, 1988; Cross, 1984; Lester & Frugoli, 1989; Sum, Amico, & Harrington, 1986; U.S. Department of Labor, 1986; "Where the Jobs Are," 1988). Some social-systems theorists have suggested that social and economic changes, such as those currently evidenced, have consequences not only for the system in which the change is taking place but also for other systems (Counts, 1961; Gouldner, 1961). Nowhere is this more evident than in those segments of the educational community in which programs that prepare people for work are developed. "The workplace will change constantly, and so must vocational education’s curriculum" (Goetsch, 1985, p. 198).

Two major legislative acts, the Job Training Partnership Act (JTPA) and the Carl Perkins Vocational Education Act of 1984, respond to the changes that are occurring in the nation’s social and economic structures. These acts provide guidelines for the development of occupational programs (Lewis, 1989). The JTPA, for example, emphasizes collaboration between business and local governmental units (Guttman, 1983) and the development of employment competency measures (U.S. Congress, 1982). The 1984 Carl Perkins Vocational Education Act, on the other hand, focuses on program evaluation. Within the context of program evaluation, emphasis is placed on preparing individuals for work in present and future environments through the expansion, improvement, and modification of programs (U.S. Congress, 1984).

The legislative acts provide guidelines for the development of occupational programs that are in line with employment changes.
However, some authorities have indicated that many educators continue to apply educational practices that were applicable during the industrial era to design programs that prepare people for work in an information age (Naisbitt & Aburdene, 1985). "New technology, the complexities of human resources, and economic influences place differing demands on the training mechanism" (Doran, 1983, p. 36). Consequently, it is necessary to design programs that address a variety of needs (Morton & Cross, 1980; Venn & Skutack, 1980). This includes, but is not limited to, the development and delivery of programs that center on entry-level training, skill upgrading, career changes, and training that prepares individuals to work in more than one occupational area (Calhoun & Finch, 1982; Coates, 1984; Naisbitt, 1982; Tyler, 1980). The focal point of these programs must be on helping people adapt to ever-changing work environments that demand new types of skills, attitudes, and aptitudes (Barton, 1982; Pratzner & Ashley, 1985; Reece & Brandt, 1984; Sum et al., 1986; Venn & Skutack, 1980).

Proceeding From a Conceptual Framework

As an initial step in the program-development process, some authorities have suggested it would behoove educators to focus on improving communication and linkages between their institution and the business community they serve (Alfthan, 1985). Through research, several techniques that promote "systems" linkages have been identified. These research-based techniques, however, have seldom been implemented into the educational practices of those who develop occupational programs. For example, many developers of occupational programs have continued to rely on advisory committees as the main source for obtaining job-related information.

Several writers have been critical of occupational program developers’ exclusive use of advisory committees as a means to linkages for the purpose of securing job-related information.
As early as 1967, Burt stated:

In many instances occupational advisory committees are called upon by educators to make recommendations concerning the need for specific job training programs about which the committee has little knowledge, even though the field is related to the industry served by the advisory committee. (p. 94)

More recently, Clark (1983) suggested that advisory committees were frequently ineffective because often the committee membership did not include people from the business sector who had the power to make decisions.

Boyle (1986) and Boone (1985) indicated that, in order for educators to solidify linkages that result in the acquisition of information, they must proceed from a conceptual framework. According to Boone (1985), the conceptual framework that program planners select must be broad enough to take into account the mission and tenets of the institution and the publics served by the institution. Some researchers have suggested that inter-systems, developmental, and systems models promote linkages. These models and their implications for the development of occupational programs are discussed in the following paragraphs.

Inter-Systems Models

The central focus of an inter-systems model is collaboration and interaction between two open systems, but emphasis is placed on the present (Chin, 1976). Under certain circumstances, such as the need to establish short-term relationships, occupational planners might select an inter-systems model as their conceptual framework. These models, however, may not help occupational planners establish linkages that fulfill the charges put forth in legislation. The legislation, for example, stresses the development of programs that include present and future job requirements. Consequently, program planners might wish to select a conceptual framework that is directed toward facilitating the establishment of long-term relationships.
Developmental Models

There is evidence that developmental models are effective in facilitating educators’ practices. Chin (1976) indicated that these models proceed from the premise that application of the models’ approaches leads to the establishment of progressive relationships over a period of time. According to Chin, those who adopt developmental models as their conceptual framework function as change agents. Chin indicated that, as change agents, educators solve problems and clarify expectations about the client system. The major intention of these models is to provide practitioners with a description of a problem. Occupational program planners, however, must go beyond problem identification. To meet legislative mandates and provide relevant services to their publics, occupational program planners must collaborate with the industrial sector to design programs that stem from deliberate, planned change.

Systems Models

A systems model that was developed by Boone (1985) at North Carolina State University synthesizes psychological, sociological, and educational constructs that are pertinent to promoting linkages between two or more agencies. As described by Boone, linkages occur when an institution engages in a study, analysis, and mapping (sometimes defined as identification and prioritizing of needs) of the publics it serves, and through the interfacing of the institution’s educators with community leaders, individual learners, and learner groups.

Application of Systems Model--ACROS

A systems approach, which reflected the conceptual framework specified in Boone’s model, was used in the development of the Automated Cross-Referencing Occupational System (ACROS). Linkages, for example, occur as a result of the compilation and indexing of educational and labor market data that are acquired from several federal and state agencies.
These agencies, through study, analysis, mapping of socioeconomic trends, and collaboration, are linked to other organizations that comprise the society’s systems. Theoretically, when occupational program planners obtain information from the ACROS they expand the linkages that exist between their institution and other agencies. These linkages are in keeping with programming constructs emphasized by Boone (1985) "to include those features of participating systems that affect the nature of programming and the change process" (p. 3).

Linkages that are brought about as a result of technology-based systems, however, may be weakened if the systems’ content fails to emphasize information that addresses concerns that are held in common by two or more entities. For example, although some authorities had predicted that work in the 1980s would not be complex (Hackman & Oldham, 1980), many educators and employers share a common concern about the growing "gap" that exists between the skill level of individuals and the kinds of skills that are required for the jobs that are in the economy. Some writers have indicated that the "gap" is a result of educators’ failure to keep pace with change (Bridgewood, 1987; Feuer, 1987; Hamilton, 1986; "Help Wanted," 1987). Others have suggested that the "mismatch" is the result of an economy that does not have jobs that fit the population’s high educational levels (Rumberger, 1984), imbalances between outputs of educational programs and employment demands (Sum et al., 1986), and negligence on the part of employers to include people and job needs in their organizational plans (Drucker, 1986; Wilms, 1984). Even though congruence in regard to the causal factors that precipitated the "skills gap" is not evidenced, there are indications that this is an area of concern that is shared by educators and the public at large.
Another area of concern that is of mutual interest to several people and is closely related to "skill deficiencies" is the discrepancies that exist between employers’ and employees’ perceptions of job performance. Research conducted by industrial psychologists in the 1960s and 1970s indicated that individual differences, needs, values, and experiences often result in employees’ redefining their job tasks (Arnold, 1985; Dodd, Wollowick, & McNamara, 1970; Hackman & Lawler, 1971). Studies have also shown that experienced employees frequently view their job tasks differently than do entrant employees. Discrepancies occur as a result of employees redefining their jobs versus employers’ job requirements. This is manifested through employers’ perceptions that workers do not perform at a high level (Aldag & Brief, 1979). According to some educational researchers, employee-employer job-discrepancy information should be used as one of the primary determinants of program content (Ammerman, 1977a).

The effectiveness of linkages fostered through technology-based systems is affected by the quality of the systems’ information and data. Information that encompasses skill deficiencies, tasks performed by employees that are correlated with their experience levels, and the tasks of the jobs that are defined by employers should be a part of technology systems that are designed for the purpose of promoting planning and development of occupational programs. To be useful, technology-based systems’ job information must contain accurate job-related information, and "the data must be easily available to those who need it, when they need it, and the data must be updated as changes occur" (Bemis, Belenky, & Soder, 1983, p. 152). This demands that technology-assisted programming resources such as ACROS, which provide job information to program developers across the United States, must contain job information that is germane to a national rather than a state or local labor market.
Some cross-geographic task-validation studies have been conducted. The results of a study conducted by Mann (1978), for example, indicated that job information compiled in one location is relevant across geographic boundaries. However, Mann’s study did not include a method to validate tasks as related to their representation in jobs that are located in multi-employment settings. Mann’s study also did not focus on the validation of job tasks as they related to the size of the businesses in which the respondents’ jobs were located. Yet, there is evidence that the tasks of a job may vary as a result of these factors (Ammerman, 1977a; Anastasi, 1976; Sherer, 1989; Sum et al., 1986).

Variations in tasks selected by incumbent workers to represent their work assignments were observed by the researcher. In several studies that were conducted for state curriculum projects from 1982 through 1986, the researcher found that there was a significant difference in the tasks that were selected to match the job assignment by incumbent workers employed in small as opposed to large businesses. Further, in some of the studies there was a difference in tasks selected, as representing the work assignments of the job, by incumbent workers employed in businesses that were located in urban versus rural environments.

In addition, Steven Clark (1988), of the Michigan State University Agricultural and Extension Education’s Occupational Research and Assessment Center, indicated that criterion-referenced tests developed on the basis of ACROS task list information were often rejected by educators in some geographic locations. According to Clark, even though the test items corresponded with ACROS’s job-task information, the test items frequently did not match the job content that was reflected in some educators’ programs.
Dewey (1935) stated:

It is the business of an intelligent theory of education to ascertain the causes for the conflicts that exist and then, instead of taking one side or the other, to indicate a plan of operations proceeding from a level deeper and more inclusive than is represented by the practices and ideas of the contending parties. (p. 5)

The literature indicated that the plan of operation for the development of computer-generated systems, such as ACROS, should include a means to collect information that represents job requirements nationwide. Further, the literature suggested that computer-generated information systems should be structured to promote retrieval of job data that facilitate development of programs to meet the diverse needs of the publics, individuals, and institutions that educators serve.

Structured Data-Collection Methods

Overview of Job Analysis

According to reports, job analysis is one of the most viable techniques employed to collect job information. "If it is to yield usable information, job analysis requires systematic data-gathering procedures" (Anastasi, 1976, p. 437).

Building on the functional job-analysis model developed in the 1930s by the U.S. Employment Service, Department of Labor, job-analysis techniques have evolved into elaborate systems for collecting and analyzing job data. Many authorities have attributed the evolution of job-analysis techniques to research conducted by industrial psychologists, guidelines established by the Equal Employment Opportunity (EEO) regulatory agencies and courts, and studies conducted by the armed forces.

Research conducted by industrial psychologists has focused primarily on obtaining information about employee motivation and job satisfaction.
Conversely, the EEO and the courts have stressed the application of job analysis in determining the validity of job content and the use of information obtained from a job-analysis study in the construction of tests for job applicants (Ghorpade, 1988; Kanin-Lovers, 1986; Lee & Mendoza, 1981; Maurer & Fay, 1986; Rosinger et al., 1982; Thompson & Thompson, 1982; Veres, Lahey, & Buckly, 1987). Studies conducted by the Air Force have emphasized the validation of task inventory instruments as a means to collect job data that promote development of training programs for Air Force personnel (Morsh & Archer, 1967).

Job Analysis: Problem Identification

Even though job analysis has been used by numerous researchers for several years, there is still a great deal of confusion about "job analysis in general, about definition of key terms, and about specific application of methods and techniques" (Gael, 1983, p. x). For a number of years, a major problem associated with job analysis was the lack of a mechanism for compiling and analyzing large volumes of job information and data. Another job-analysis problem was the lack of a survey instrument that promoted the collection of relevant job information.

Recent technological advances have helped researchers develop systems that address the problem of compiling and analyzing vast amounts of job information (Ghorpade, 1988; Heneman, Schwab, Fossum, & Dyer, 1980). The Position Analysis Questionnaire (PAQ), the Work Performance Survey System (WPSS), and the Comprehensive Occupational Data Analysis Program (CODAP) are examples of systems that are capable of analyzing and managing large amounts of data. These systems share a common element with the ACROS in that they contain task information, but their purpose is different.

The PAQ, which was developed by an industrial psychologist, involves a comparison of the ratings of jobs on 187 work-related elements.
"The elements are of a worker-oriented nature that tend to characterize, or to imply, the human behaviors that are involved in jobs" (McCormick, 1979, p. 144). The WPSS program, which was also developed by an industrial psychologist, promotes analysis and classification of job data. The job data are relevant to work performed by American Telephone and Telegraph employees (Gael, 1983).

The CODAP is an interactive system for compiling and analyzing task inventory data. It was developed by U.S. Air Force personnel. Like the ACROS, this system emphasizes the compiling of job information that is relevant to the development of training programs. Unlike the ACROS, which is purported to contain task information that represents the requirements of jobs that are located in different types of businesses, the CODAP contains information that reflects the tasks of jobs that are found in military environments. The CODAP also includes a programming feature that supports the analysis of collected job information on the basis of several different job-related variables (Morsh & Archer, 1967).

Researchers have also conducted studies that have addressed the problem associated with the design of job-analysis instruments. For example, studies conducted by the Air Force from 1956 through 1965, which included 100 occupations and approximately 100,000 Air Force personnel, have contributed significantly to the development of written data-collection instruments (Gael, 1983; Ghorpade, 1988; McCormick, 1979). In addition, research conducted by The Ohio State University’s Center for Vocational Education during the 1970s has provided insight into the application of various data-collection and analysis methods.
Their research, which was based on the Air Force studies, resulted in a series of procedural guides that specify systematic methods for collecting and analyzing data related to the development of training programs (Ammerman, 1977a, 1977b; Ammerman & Essex, 1977; Ammerman & Pratzner, 1977; Mead, Essex, & Ammerman, 1977).

According to some authorities, written instruments have several advantages over other data-collection techniques. For instance, advocates of written instruments have indicated that a number of people can be reached in a short period of time, there is uniformity of responses, it is possible to obtain quantitative results, respondents’ identities can be preserved (Ghorpade, 1988), and more than one objective can be satisfied (Gael, 1983).

Some authorities have also indicated that there are disadvantages to using a written instrument in collecting job information. Ghorpade (1988) indicated that two of the disadvantages of written instruments are (a) the quality of responses may be affected by the variables used in the collection of data, and (b) costs. Another disadvantage that is worth noting is that frequently the number of people who respond to written instruments is small in comparison to the population. This was evidenced in some of the task inventory studies that were conducted by the Michigan State University Curriculum Projects and by the V-TECS. Only 25 incumbent workers, for example, responded to the V-TECS Bank Teller Task List that is included in this study. "Accurate and complete information can be obtained about a job with a job inventory only if the survey questionnaire is designed correctly and a respondent sample is selected appropriately" (Gael, 1983, p. 7).

Job Analysis in Occupational Education Environments

Job analysis, also termed task analysis and occupational analysis, may be performed using one of several approaches.
Even though some authorities have indicated one approach is not superior to another (Levine, Ash, Hall, & Bennett, 1980; Levine, Ash, Hall, & Sistrunk, 1983), in occupational education environments DACUM, the function approach, and the task inventory are the most frequently used techniques.

DACUM, the function approach, and the task inventory are similar in some ways. For example, all of these techniques employ one or more methods--observation, group interviews, technical conferences, diary, work participation, and critical incidents--to collect job information (Dippo, 1988; Finch & Crunkilton, 1979; Kanin-Lovers, 1986; Mann, 1978; McCormick, 1979; Robertson & Smith, 1985; Shears, 1985). Each approach, however, has its own unique features. An overview of the salient features of these approaches is presented in the following paragraphs.

The DACUM approach. The focus of the Developing a Curriculum (DACUM) approach is the development of an occupational skills profile. The skills are identified by a committee of 10 to 12 workers or supervisors. Critics of the DACUM have indicated that content validity is not addressed through the procedures that are outlined for the implementation of this approach (Kosidlak, 1987). Some educators have indicated, however, that the approach has been successfully employed in the development of skills profiles at the local and regional levels (Shears, 1985). Content validity is a primary concern in the collection of job information that is relevant to programs planned by educators in varied geographic locations. Therefore, use of the DACUM approach to promote collection of job information that represents a national labor market is questionable.

The function approach. Unlike other job-analysis approaches, the function approach is directed toward analyzing industries rather than specific jobs.
By implementing the procedures that are outlined for this approach, a list of the contributions employees make "to other functions in the industry" (Finch & Crunkilton, 1979, p. 120) is obtained. Information about the functions of regional or local industries is obtained through surveys, panel members, and instructors (Finch & Crunkilton, 1979; Meaders, 1989). The function approach appears to be a viable method for compiling work-related data. In particular, the approach is applicable for acquiring information in environments where (a) managerial practices are centered on job redesign that results in declassification of job titles; (b) rapid changes are evidenced in the way in which work is performed; and (c) the mobility patterns of individuals, in regard to their changing careers and job locations, are high. This kind of information is critical in the structuring of technology-generated information systems. According to the reports, however, the function approach has not been employed by vast numbers of occupational educators because it involves a significant investment in resources and time (Finch & Crunkilton, 1979).

The task-analysis inventory approach. Occupational educators have used task analysis since the early 1900s. In The Instructor, the Man, and the Job, Allen (1919) advocated a trade-analysis method that included inventories of job tasks. Further, in Analysis Techniques for Instructors, Fryklund (1965) stressed the matrixing of task inventory lists under a hierarchical structure for the purpose of promoting the design of instruction.

Advocates of instructional systems design, such as Campbell (1987) and Butler (1972), have indicated that task analysis is an essential component in instructional development. According to Butler, task analysis provides the substance for training content and serves as performance criteria for the evaluation of students and programs.
Campbell and Butler’s views have been endorsed by numerous occupational educators. Many of these educators, for example, have indicated that task analysis is critical for the establishment of goals and objectives, and for decisions made about curricula and programs (Davies, 1981; Dippo, 1988; Steely, 1981; Tyler, 1980). Even though outcomes are derived from task analysis, "it is necessary to decide which of the skills demanded by the occupation may be better taught on the job or in the course" (McNeil, 1977, p. 87). Advanced technology has made it possible for educators to acquire, through task-validation studies, data that promote identification of course content. To date, however, such information has not been collected on a widespread basis.

Task analysis is the method that is employed by V-TECS in the collection of job information. The information that is acquired through V-TECS task-analysis procedures is used in the development of curriculum products. The V-TECS procedures encompass three major phases: (a) research and decision making, (b) development of products, and (c) dissemination of products. Several process steps are included under each of the three phases. The following is a summary of the procedural phases and processes that are listed in the 1987 V-TECS Product Development Technical Reference Handbook.

Phase I. Research and Decision Making
Survey member states to prioritize a product-development list.
Obtain bids from member states to develop/revise a product.
Establish agreement between V-TECS Central Office and member states to develop/revise a product.

Phase II. Develop Products
Conduct state-of-the-art research.
Identify population.
Develop preliminary occupational/task inventory.
Coordinate writing team.
Finalize occupational/task inventory.
Conduct survey and process data.
Assemble draft catalog.
Write a product domain report.
Review and edit draft catalog.

Phase III. Disseminate Products
Compile final product.
Disseminate product.
Conduct evaluation.

Estimating Task Validity, Reliability, and Relevance

Validity

"The concept of validity as applied to task inventory information refers to the underlying truth or correctness of the data" (McCormick, 1979, p. 133). According to McCormick, this entails the extent to which information given by job incumbents about their involvement with different tasks actually reflects their real involvement. Job analysts often describe content validity in terms that are somewhat different from those used by some testing and measurement authorities--for example, task validity instead of test validity. Their position in regard to what constitutes content validity, however, is the same as the position endorsed by many testing and measurement experts. Several renowned members of the testing community, for example, have agreed that content validity involves determining whether what is being measured is a representative sample of the behavioral domain (Anastasi, 1976; Borg & Gall, 1983; Mehrens & Lehmann, 1978; Sax, 1989). Further, Cronbach (1972) stated, "If the content fits the universe definition, the test is content-valid for persons of all kinds" (p. 94). Both McCormick (1979) and Gael (1983) agreed that reliability is often assumed if validity is evidenced. Gael also indicated, however, that reliability is not necessarily sufficient to obtain acceptable validity.

According to a report issued by the Education Research Group, "content validity is usually established by asking experts whether the items are a representative sample of the skills and traits you want to measure" (Ratzlaff, 1987, p. 43). Some authorities have indicated that incumbent workers, supervisors, or a combination of the two, depending on the study, are experts. Gael (1983), for example, indicated that a comparison of incumbent workers’ and supervisors’ responses is the method that is most often used in determining the validity of job-inventory data.
Some educational researchers have supported Gael’s position. Ammerman (1977a), for instance, indicated that supervisors and incumbent workers should be included in studies that are conducted for the purpose of identifying relevant curriculum content.

Reliability

"There are several types of reliability and, as applied to task inventory information, several bases for deriving estimates of reliability" (McCormick, 1979, p. 131). In reference to reliability, Gael (1983) stated:

The terms reliability, stability and consistency often are used interchangeably; evidence that a job inventory possesses sufficient reliability--that is, provides trustworthy information--usually is obtained by studying the degree of agreement between at least two different views of the same inventory content. (p. 23)

According to McCormick (1979), one way to measure task reliability is to compare the answers of respondents on the same instrument administered a week or two apart. Other job analysts have indicated that methods such as comparing the consolidated responses for an entire sample with responses obtained from another sample are valid ways to measure task reliability. In addition, some job analysts have supported the single administration of a questionnaire that contains units of tasks (Gael, 1983). The confidence placed in reliability results, according to Gael, is contingent on the amount of agreement that is observed between different raters or for the same raters at different times.

Homogeneity, according to reports issued by the Education Research Group, is of primary concern when a single concept is measured by an instrument. The Education Research Group also indicated that homogeneity is of concern when the concepts contained in an instrument are divided into parts to facilitate measurement of a separate concept or skill (Ratzlaff, 1987).
Some testing and measurement authorities have indicated there are two primary ways in which homogeneity can be measured: the split-half method and the Kuder-Richardson Formula-20. These methods, according to the authorities, are applicable when there is a single administration of an instrument. With the split-half method, for example, the content of the instrument is divided into parts and the scores for each half are correlated. Conversely, the Kuder-Richardson Formula-20 is usually employed when items are scored dichotomously, right or wrong. With this formula, a measurement can be obtained for each item that is represented in an instrument (Anastasi, 1976; Kuder, 1972; Mehrens & Lehmann, 1978; Mussio & Smith, 1973; Ratzlaff, 1987; Sax, 1989). "Formula 20 is considered by many specialists in educational and psychological measurement to be the most satisfactory method of determining reliability" (Borg & Gall, 1983, p. 285).

Relevance

The results of research conducted by the U.S. Air Force in the 1960s and 1970s indicated that several factors should be considered in the collection of job data. For example, the study results indicated that time spent on tasks, task importance, and task difficulty are important task-reliability determinants (Christal, 1971; Gael, 1983; Morsh & Archer, 1967). These findings were further substantiated through studies conducted by researchers at The Ohio State University’s Center for Vocational Education (Ammerman, Essex, & Pratzner, 1974). In addition, the Ohio State University researchers concluded that information about the importance of tasks to employees’ work assignments is relevant to the identification of content for training programs. Gael (1983) concurred with the findings of these researchers. Gael also indicated that, for the purpose of developing national training programs, it is important to collect content data on variables such as job location and employees’ personal factors, e.g., job tenure.
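As an illustrative sketch of the reliability estimates discussed in this section--simple agreement between two administrations of the same inventory, and the Kuder-Richardson Formula-20 for a single administration--the computations can be expressed in a few lines. The function names and the 0/1 coding of responses are assumptions made for illustration; they are not part of the study's procedures.

```python
def percent_agreement(first, second):
    """Proportion of identical task responses across two administrations.

    `first` and `second` are equal-length lists of task-level answers
    (e.g., 1 = "task matches my job", 0 = "does not match") given by the
    same raters a week or two apart, or by two different rater groups.
    """
    if len(first) != len(second):
        raise ValueError("response vectors must cover the same tasks")
    return sum(a == b for a, b in zip(first, second)) / len(first)


def kr20(responses):
    """Kuder-Richardson Formula-20 for dichotomously scored items.

    `responses` is a list of respondent score vectors, each an
    equal-length list of 0/1 item scores from a single administration.
    """
    n = len(responses)       # number of respondents
    k = len(responses[0])    # number of items
    # Sum of p*q over items, where p is the proportion scoring 1 on an item.
    sum_pq = 0.0
    for i in range(k):
        p = sum(r[i] for r in responses) / n
        sum_pq += p * (1 - p)
    # Population variance of the respondents' total scores.
    totals = [sum(r) for r in responses]
    mean = sum(totals) / n
    variance = sum((t - mean) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - sum_pq / variance)
```

For example, two administrations agreeing on three of four tasks yield an agreement of 0.75, and KR-20 rises toward 1.0 as the dichotomous items hang together more consistently.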
"Measurement is an intrinsic part of job analysis" (Ghorpade, 1988, p. 36). Several methods of measuring job—inventory data have been recognized by job-analysis researchers. However, many authori- ties have indicated the most efficient way to obtain ratings from respondents with regard to the significance/importance or difficulty of tasks is to employ a Likert scale (Ghorpade, 1988). Gael (1983) 45 indicated that statistics, means, proportions, and standard deviations, formatted in multirow-multicolumn and cross-tabulation tables, should be used to measure and summarize the collected job- task data. Researchers at The Ohio State University’s Center for Vocational Education also have endorsed the use of these kinds of statistics to measure and summarize job information (Mead et al., 1977). Summary A common theme that is interwoven throughout the literature is that the nation’s changing social and economic structures have affected the nature of work, the way in which work is performed, and the kinds of jobs that are accessible to the population. These changes have been manifested through legislation. The legislation emphasizes guidelines that redirect the way in which occupational programs are developed. Responding to social, economic, and political changes demands that those individuals who are responsible for planning and delivering occupational programs reconceptualize the philosophical postures that serve as the basic foundation for the development of their programs. This entails the selection of a conceptual framework. and the application of 'techniques and approaches that promote improved communication and linkages between educational institutions and the publics served through their programs. Some linkages have been advanced through the use of computer- generated programs. Some of the computerized systems contain job 46 information that is compiled and cross-referenced with information published by several different agencies. 
These computer-generated occupational information systems have the potential to provide occupational educators, nationwide, with viable information that is relevant to the development of various types of occupational programs. The effectiveness of the computer-generated systems, however, depends on how well they serve their intended purpose and the extent to which the information and data that are in these systems are valid and reliable. If it is determined that the task-list information that is in the ACROS is valid and reliable, it should follow that a precedent has been established for the continued development of computer-generated occupational program planning and development resources.

CHAPTER III

RESEARCH METHODOLOGY

The purpose of this chapter is to describe the methodology used in conducting this study. The research design drew on the V-TECS task-validation procedures, but it went beyond the processes used by this organization. For example, some of the major research procedures that were used in this study, but were not part of V-TECS procedures, include:

1. The study was conducted in more than one state.
2. The target population was identified on the basis of where the jobs, around which the tasks included in the study were developed, are concentrated.
3. Measures of reliability were obtained.
4. Supervisor-managers were included in the population sample.
5. The data were collected and analyzed on the basis of several job-related variables.

Table 3.1 is helpful in visualizing the task-validation procedures and processes that are used by V-TECS and the ones that were used in this study.

Table 3.1.--V-TECS and this study’s task-validation procedures.

1. Identify study area
   V-TECS: 1.1. Survey member states. 1.2. Develop preliminary task list.
   ACROS: 1.1. Obtain employment projections. 1.2. Select task study area.

2. Select population
   V-TECS: 2.1. Obtain estimates of incumbent workers for the occupational domain. 2.2. Compile a list of businesses. 2.3. Randomly select, for each task-validation area, population from one state.
   ACROS: 2.1. Identify places of employment (industries). 2.2. Randomly select, for each task-validation area, businesses from two states.

3. Develop task inventory
   V-TECS: 3.1. List tasks from preliminary inventory. 3.2. Job check list. 3.3. Scale for testing task validity.
   ACROS: 3.1. List tasks from ACROS. 3.2. Job check list. 3.3. Scale for testing task validity and reliability.

4. Conduct survey
   V-TECS: 4.1. Minimum of 50 incumbent workers per job represented in the study area.
   ACROS: 4.1. Minimum of 200 incumbent workers and 200 managers per task list study area.

5. Compile results
   V-TECS: 5.1. Level of acceptable survey return 30 per job title. 5.2. Used SPSS. 5.3. Data analyzed on the basis of number of tasks selected and frequency of task performance.
   ACROS: 5.1. Level of acceptable survey return 30 per task list study area. 5.2. Used SAS. 5.3. Data analyzed by number of matched tasks, importance values assigned to task items, and job-related factors.

6. Statistical measure
   V-TECS: 6.1. Descriptive, percentage and frequency, with emphasis on measuring validity.
   ACROS: 6.1. Descriptive and inferential, with emphasis on measuring validity and reliability.

Research Questions

The methodology was designed to test the validity and reliability of two of the ACROS task lists, Bank Teller and Electronics Mechanic, as they relate to five research questions. These questions are as follows:

1. Do selected task lists in the ACROS match the task assignments of employees?
2. How important, on a scale from 1 to 5, are the task-assignment matches to occupational employees’ work assignments?
3. Do incumbent workers and their supervisor-managers respond to selected tasks in the same way?
4. How stable are the task-assignment matches across geographic boundaries?
5. How do task assignments match up across different segments of multi-industries, given the same occupation?

Hypotheses

The research hypotheses for this study were:

Hypothesis 1.
There will be no difference in the responses by incumbent workers and their supervisor-managers relative to selected task matches.

Hypothesis 2. There will be no difference in the responses by incumbent workers and their supervisor-managers relative to the significance (importance) values they assigned to their selected task matches.

Hypothesis 3. There will be no difference in the responses by respondents relative to selected task matches on the basis of length of employment.

Hypothesis 4. There will be no difference in the responses by respondents relative to selected task matches on the basis of geographical location of places of employment.

Hypothesis 5. There will be no difference in the responses by respondents relative to selected task matches on the basis of type of business establishment.

Hypothesis 6. There will be no difference in the responses by respondents relative to selected task matches on the basis of size of the business.

Procedure for Data Collection

The data-collection procedure was divided into four parts. These included selection of the task lists, determination of the population, development of the instruments, and collection of the data.

Selection of the Task Lists

Only two of the ACROS task lists, Bank Teller and Electronics Mechanic, were included in this study. The Bank Teller task list was developed and validated by educators in Mississippi in 1988. The Electronics Mechanic task list was produced by educators in Georgia in 1980. Both task lists are representative of an occupational area that demonstrates a favorable employment outlook. In addition, these lists are representative of some of the diverse kinds of task lists that are found in the ACROS. Some of the major differences, as they relate to the task lists selected for this study, are outlined below:

Bank Teller
1. Developed around one job.
2. Job represented in this task list is primarily in businesses that are classified within the finance industry.
3. List was validated less than two years ago.

Electronics Mechanic
1. Developed around three jobs.
2. Jobs represented in this task list are found in many businesses that are classified within multi-industries.
3. List was validated approximately ten years ago.

Determination of the Population

The population comprised incumbent workers and their supervisor-managers who were drawn from 400 business establishments. Two hundred of the businesses, 100 from Michigan and 100 from Georgia, were included in a study of the ACROS Bank Teller tasks. The remaining 200 business establishments, 100 in Michigan and 100 in New Jersey, were involved in a study of the ACROS Electronics Mechanic tasks.

The states in which the task inventory studies were conducted were selected because of the changes that have occurred in their economic structures within the past few years. Michigan, for example, has experienced a decline in manufacturing, which has resulted in the loss of jobs. Georgia and New Jersey, on the other hand, have evidenced economic growth. This growth, according to some labor market reports, is a result of the relocation of industrial establishments and the emergence of "high technology" in these states.

To identify the population in Georgia, Michigan, and New Jersey, a four-step process was implemented. An overview of the component parts of the four-step process is presented in the following paragraphs.

Step I. Three reference documents were consulted: Occupational Projections and Training Information for Michigan: Users Manual (Michigan Occupational Information Coordinating Committee, 1989), Vocational Preparation Occupations: Educational Occupational Code Crosswalk (National Occupational Information Coordinating Committee, 1982), and the OES Survey Dictionary (National Occupational Information Coordinating Committee, 1987).
These documents were referenced to identify the Occupational Employment Statistics (OES) codes that correlated with the occupational domain, Dictionary of Occupational Titles (DOT) jobs and codes, represented in the task lists.

Step II. The OES codes were used to acquire employment information for the states that were included in the study as it related to the task lists’ occupational domains. This involved obtaining information that specified where the places of employment were for those jobs that were represented in the task lists. This was accomplished through telephone calls to the Michigan, Georgia, and New Jersey State Occupational Information Coordinating Committee (SOICC) directors.

The SOICC agency in Georgia performed a data-base search using five digits of the OES code, 53102, to obtain information that projected the types of industrial (business) establishments in which teller, paying and receiving jobs were located in Georgia. The SOICC agency in New Jersey used OES codes 22505, 85705, and 85717 to identify where the places of employment were concentrated for the jobs electronic technician, field [electronic] engineer, and electronic mechanic, data processing equipment repairer, which were represented in the ACROS Electronics Mechanic task list.

All of the OES codes were used to identify places of employment in Michigan where the jobs represented in the two task list study areas were concentrated. This information was extracted from two resource manuals that were provided by the Michigan SOICC director. These manuals were Jobs by Industry: An Occupational Guide for Self-Directed Job Search Programs and Standard Industrial Classification--Yellow Pages Crosswalk. Both of these manuals were published by the Michigan Occupational Information Coordinating Committee in 1989.

The job field [electronic] engineer did not correlate with the descriptive information that was contained in the SOICC agencies’ places of employment (industry) files.
Consequently, it was necessary to consult an additional reference to explore whether this job was homogeneous with the occupational domain (jobs) that comprised the ACROS Electronics Mechanic task list. The U.S. Department of Commerce’s Standard Occupational Classification Manual (1980) classifies the job field [electronic] engineer with jobs that involve the repair of electrical and electronic commercial and industrial equipment. This indicated that the job was homogeneous with the other jobs around which the ACROS Electronics Mechanic task list was developed. Therefore, it was determined that information should be collected from individuals who work in, and supervise the work of, people who performed tasks associated with this job.

Step III. The places-of-employment (job location) list was cross-referenced with titles and three- and four-digit codes listed in the Standard Industrial Classification Manual (1987). Published by the Office of Management and Budget, Executive Office of the President, this book structures industry information under a hierarchical classification system. Industry information is classified in the SIC manual as follows:

Major Group 50 - Wholesale Trade-Durable Goods
  506 - Electrical Goods
    5064 - Electrical Appliances, Television & Radio Sets
    5065 - Electronic Parts & Equipment

By cross-coding and matrixing the SOICC agencies’ job-location data with industry titles-codes listed in the SIC manual, the researcher was able to identify the target population for the study. A summary of the cross-coded data that were compiled is presented in Tables 3.2 through 3.5.

Step IV. The four-digit SIC codes that were obtained from the SIC manual were used to initiate the selection of a stratified sample of businesses. The sample was acquired by accessing Dun and Bradstreet’s DNB Financial Records Plus Data Base, one of the firm’s many data-base information systems.
This involved compiling a list of the names and addresses of 400 industries that were places of employment in which the jobs represented in the task list study areas primarily were concentrated.

[Tables 3.2 through 3.5, which summarized the cross-coded SIC data used to select the stratified samples for the Bank Teller and Electronics Mechanic study areas, are illegible in this copy of the document.]

Development of the Instrument

Instruments were developed for two of the ACROS task lists, Bank Teller and Electronics Mechanic. The basic design of this study’s instruments was adapted from instruments developed by the V-TECS, the U.S. Air Force Research Division, and the Center for Vocational Education, The Ohio State University.
Instruments developed by 'these1 organizations have been modified for use in studies conducted by the Texas State Comptroller of Public Accounts AT&T’5 Human Resources Laboratory, the vocational Technical Education Consortium of States, the Michigan Department of Labor’s Occupational Field Analysis Center, and Michigan State University’s Department of Agricultural and Extension Education Curriculum Projects. The study’s instruments contained four sections: (a) directions for completing the survey, (b) background infOrmation, (c) a list of duties and tasks that were acquired from the ACROS, and (d) a page for comments. (See Appendices A and G, pp. 121 and 149.) The following is a list of the tasks that were included in this study’s instruments. TELLER, PAYING AND RECEIVING A. Planning and Organizing Arrange coins in tray for day’s transactions Prepare strapped currency drawer for day’s transactions Prepare rolled coin drawer for day’s transactions Prepare working currency drawer for day’s transactions Prepare working currency and coins for day’s transactions Stock forms, supplies, and equipment for daily transactions Open teller terminal Close teller terminal Balance cash drawer and close teller window OCDVOSUT-th—i 60 Supervising and Implementing 10. 11. 12. 13. 14. 
Follow procedures for conduct during a robbery Follow procedures for conduct after a robbery Greet customers Issue the safekeeping of money at teller’s window Dismiss customers Inspecting and Evaluating Determine if check is negotiable Examine counter checks for acceptability Examine deposit slips for acceptability Remove excess currency from teller’s window Examine currency for counterfeit bills Inspect for mutilated and badly worn coins Inspect for mutilated or badly worn currency Inspect customer’s identification Processing Money Process counterfeit currency Process excess currency Process mutilated and badly worn currency Roll coins Sort and stack coins Sort and stack currency Strap currency Count coins Count currency Performing Customer Service Activities Verify customer checking/savings account Accept Christmas Club payments Accept installment loan payments Admit customers to safe deposit boxes Advance cash on bank credit cards Answer customer inquiries Cash checks Cash series E or EE bonds Enter amount of interest in savings passbooks Fill change requests Fill payroll requests Pay savings withdrawals Issue cashier’s checks Issue certificates of deposits Process bond coupons Process deposits Complete a collection receipt Issue money order 61 50. Process deposits 51. Issue series EE bonds 52. Issue traveler’s checks 53. Cash traveler’s checks Performing Clerical Activities 54. Take telephone calls 55. Operate checkwriter 56. Order personalized checks 57. Prepare deposit slips for the customer 58. Accept and process tax deposits 59. Complete safe deposit rental statement 60. Accept safe deposit rental fees 61. Place hold on customer accounts 62. Make title changes on customer accounts 63. Open new customer accounts 64. Place stop payment on checks 65. Operate electronic audit machine 66. Prepare cash-in and cash-out tickets Accounting 67. Buy cash from other tellers 68. Prepare a teller’s daily balance sheet 69. 
Sell cash to other tellers ELECTRONICS MECHANIC Adjusting/Aligning/Calibrating Electronic Circuitry Adjust AC generator output Adjust AC output resistance Adjust amplifier gain Adjust armature or field connection voltage Adjust audio intensities Adjust automatic gain control Adjust bias network Adjust capacitance Adjust core for slug tune circuits 10. Adjust DC generator output 11. Adjust drive gear 12. Adjust focus control 13. Adjust horizontal linearity l4. Adjust impedance 15. Adjust integrated circuit output 16. Adjust modulation percentage 17. Adjust noise invertor control 18. Adjust oscillator DmNO‘U‘l-ISWNd 19. Adjust output of high frequency amplifiers (grounded grid; cascade) 20. Adjust power converter output 21. Adjust probe calibrator signal 22. Adjust resistance 23. Adjust resonant frequency 24. Adjust spindle speeds 25. Adjust synchronization in IC, S 26. Adjust tape amplifier 27. Adjust tape reader 28. Adjust tension arm 29. Adjust tuned circuit valves 30. Adjust vertical linearity 31. Adjust voltage 32. Align TRF 33. Calibrate DC levels in logics racks 34. Calibrate inductance 35. Calibrate logic integration 36. Calibrate multi-vibrator circuit (stable, monostable, bistable, flip flop) 37. Calibrate power supply voltage 38. Calibrate P-P voltage 39. Calibrate timing/clock pulse 40. Calibrate vertical amplitude 41. Calibrate delay, long time delay and G. F. drop out time 62 42. Adjust transducers 43. Adjust transformers Administering Personnel 44. Administer diagnostic tests to prospective employees 45. Conduct instruction by demonstration performance 46. Evaluate employee performance 47. Evaluate training program 48. Evaluate personnel safety violations 49. Interview prospective employees 50. Maintain work records of employees 51. Monitor programmed instructions 52. Orient personnel on procedures 53. Plan work schedules 54. Report equipment related safety violations 55. Schedule work assignments Designing Equipment and Circuitry 56. 
Conduct physical inventory 57. Construct external interface adapters 60. Construct a MB/TF graph 61. Construct tables displaying electronic data (variables, parameters) 62. Design electrical terminations for new equipment 63 Design interfaces between sub-assemblies (electrical, mechanical) Design physical support hardware for new electronic equipment Draft preliminary specifications for an electronic device Draw schematic of circuitry Modify schematics Modify original circuitry to accommodate IC, S Plan quality assessment checks (physical electrical) Prepare Prepare Prepare Prepare Prepare Prepare Prepare an assembly guide cost factors report an estimate of production time a parts list for prototype equipment preliminary sketches for prototype equipment a production feasibility report a survey of production schedules Translate engineering specifications to a functional description Translate graphic information into written specification Verify interface connections Write Write operational procedures summary report of operational tests Design circuits from engineering specifications Performing Environmental Tests Perform atmospherical test Perform corrosive test Perform humidity test Perform maximum power test Perform pressure test Perform shock (impact) test Perform temperature test Maintaining Electronics Devices Assemble structural members according to assembly drawing Clean Clean Clean Clean Clean Clean Clean Clean Clean Clean Clean Clean air filters chassis circulation fans (exhaust and intake) contact points drive mechanism reflective mirror speaker grill spindles device tape head tape reader rectifier tuner volume control Construct a PC board (layout, etch, drill) Convert digital decimals into binary coded decimals 105. 106. 107. 108. 109. 110. 64 Locate component malfunctions using written reports Mount system in/on physical support Record meter readings Splice wires Solder/unsolder components Perform quality control checks Replacing Components 111. 112. 
111. Replace amplifier
112. Replace bi-polar logic gates
113. Replace cathode ray tube
114. Replace capacitor
115. Replace counters (decode, preset, ring)
116. Replace counters board
117. Replace digital lights
118. Replace deflection yoke
119. Replace dynamotor
120. Replace energy storage cells
121. Replace FET
122. Replace filter
123. Replace frequency converter
124. Replace fuse
125. Replace IC chips
126. Replace IC network
127. Replace indicators
128. Replace indicator lamps
129. Replace klystron
130. Replace logic control board
131. Replace magnetron
132. Replace microphone
133. Replace oscillator
134. Replace PC boards
135. Replace photo electric relays
136. Replace photo transistor
137. Replace power supplies
138. Replace pulley bolt
139. Replace rectifier
140. Replace registers
141. Replace relays
142. Replace reader
143. Replace roller
144. Replace SCR-triac
145. Replace servo mechanism
146. Replace solid state diodes
147. Replace switches (leaf, contact, mercurial)
148. Replace tape head
149. Replace thermal breakers
150. Replace thyratrons
151. Replace transducer
152. Replace transformer
153. Replace transistors
154. Replace translator circuit (encoder, decoder, code converter, etc.)
155. Replace tubes

Note: The total number of tasks is 153. The discrepancy (155 numbers as opposed to 153 tasks) is the result of a numbering error in the instrument.

Two check-off columns were included beside the list of duties and tasks. The first column was designated as a place for respondents to identify the tasks that matched the work assignments of the jobs. This entailed incumbent workers matching tasks on the basis of the work they performed on their present jobs. For supervisor-managers, it involved matching tasks on the basis of the tasks they expected employees under their supervision to perform.
The second column was designated as a place for respondents to rate, by placing a check mark in one of five response spaces, the significance of the tasks they had selected as matching the work of the occupation. The range of the rating values was 1 through 5. The following is a description of the scale values:

1 = Minor part of job
2 = Somewhat important
3 = Important
4 = Very important
5 = Extremely important

The researcher did not field test the instruments that were used in the study. The decision was made because the basic design of the instruments had been used for a number of years by several educational and job analysis researchers as a means to collect job data. The tasks that were obtained from the ACROS were also accepted on the basis of their face validity because of their prior validation by V-TECS.

To facilitate coordination of data from the respondents by task list area and state, the instruments were printed on colored paper.

Collection of the Data

The task inventories were sent to 400 businesses. Two hundred of the businesses, 100 in Michigan and 100 in Georgia, were mailed the Bank Teller Task Inventory. Inasmuch as the Bank Teller Task Inventory involved employees who worked in financial establishments other than banks, the inventory was titled Teller, Paying and Receiving. The Electronics Mechanic Task Inventory was mailed to 100 businesses in Michigan and 100 in New Jersey.

Two inventories were mailed to each business. The introductory letter that accompanied the inventories indicated that one inventory was to be completed by an incumbent worker and one by a supervisor-manager in the occupation. Directions for completing the task inventories were included in the instruments.
The directions asked respondents to (a) place a check mark beside tasks that were a part of the job, and (b) rate, on a scale that ranged from 1 to 5, the significance/importance of the tasks they selected as they related to the work assignments of employees in the occupation. The respondents were also asked to respond to job-related questions such as the title of their present job, type of industry in which they were employed, size of their place of employment, and length of time in the job. Finally, the respondents were asked to respond to statements (by circling strongly agree, agree, uncertain, disagree, or strongly disagree) regarding representation of tasks to employees’ work assignments, and clarity of the task statements and the directions for completing the survey.

The instruments were numbered before they were mailed. The number that was assigned to an instrument was recorded on a list beside the name and address of the business to which the instrument was mailed. Respondents were requested to return their completed inventory, in the self-addressed, postage-paid envelope that accompanied the instruments, within two weeks.

Upon receipt of an inventory, the researcher referenced it against the instrument numbers recorded on the list of businesses included in the study. At the end of two weeks, the list was consulted to identify those businesses that had not responded to the survey. A follow-up letter was sent to the businesses that had not returned their surveys.

Data-Analysis Procedures

This section is concerned with three main topics: the independent variables, coordination of data, and statistical analysis of data.

Independent Variables

The following independent variables (factors and background characteristics) were used in analyzing data relative to the overall objective, the five research questions, and the hypotheses.

1.
Number of tasks selected by incumbent workers and their supervisor-managers as matching the task assignments of workers employed in the occupation.

2. The average significance (importance) values incumbent workers and their supervisor-managers assigned to the tasks they selected to match employees’ job assignments.

3. A comparison between incumbent workers’ and their supervisor-managers’ selected task matches.

4. A comparison between incumbent workers’ and their supervisor-managers’ assignment of significance values to their selected task-to-job matches.

5. Comparison of respondents’ answers on the basis of the geographical location of their places of employment.

6. Comparison of respondents’ answers on the basis of the length of their employment in the job.

7. Comparison of respondents’ answers based on the type of business establishment/industry in which they are employed.

8. Comparison of respondents’ answers based on the size of the work force in their place of employment.

Coordination of Data

According to some testing and measurement authorities, in correlational studies there should be a minimum of 30 cases (Borg & Gall, 1983). Therefore, for this study, a minimum of 30 usable returns for each task list was established as the acceptable sampling standard.

Returned inventories were scrutinized to eliminate from further processing those that did not contain responses to the major sections, background information and tasks. The remaining inventories were then reviewed. This second review was conducted to identify inventories that had been completed by respondents who worked in an occupational domain other than the domain included in the study. Inventories completed by workers and/or managers who worked in an occupation that was not included in the study were eliminated from further processing.
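The two-pass screening described above can be sketched in code. This is a hypothetical illustration only; the field names, record structure, and sample records below are invented, not taken from the study's instruments.

```python
# Hypothetical sketch of the two-pass inventory screening: first drop
# returns missing either major section, then drop respondents whose
# occupation falls outside the study's domains.

STUDY_DOMAINS = {"bank teller", "electronics mechanic"}

def screen(inventories):
    """Keep inventories that answered both major sections and that
    came from a respondent working in an occupation in the study."""
    complete = [inv for inv in inventories
                if inv.get("background") and inv.get("tasks")]
    return [inv for inv in complete
            if inv.get("occupation", "").lower() in STUDY_DOMAINS]

returns = [
    {"background": {"state": "MI"}, "tasks": [1, 5], "occupation": "Bank Teller"},
    {"background": None, "tasks": [2], "occupation": "Bank Teller"},          # incomplete
    {"background": {"state": "GA"}, "tasks": [3], "occupation": "Loan Officer"},  # out of domain
]
usable = screen(returns)  # only the first record survives both passes
```

The two passes mirror the order described in the text: completeness is checked before occupational domain.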
The remaining inventories, 79 Bank Teller and 34 Electronics Mechanic, contained responses to the sections on background information and tasks and were completed by respondents employed in the occupational domain(s) that comprised the study. These inventories were accepted for statistical testing.

Statistical Analysis of Data

The Statistical Analysis System (SAS) program was used to compile the results of the data that were acquired through the study. A measure of the validity of the study’s tasks was obtained by calculating descriptive and inferential statistics. Further, an estimate of reliability was secured by computing a reliability coefficient, the Kuder-Richardson (K-20).

Relative to Questions 1 and 2, which involved obtaining a measure of the validity and reliability of the study’s tasks relative to the population, frequency counts, percentages, means, and standard deviations were computed. This also involved the computation of confidence limits at the .05 level of significance. In addition, relative to Question 1, the Kuder-Richardson (K-20) reliability coefficient was calculated. This coefficient has a value that ranges from 0.0 to 1.00; a value of .70 or higher was required for acceptance of the consistency of respondents’ selected task matches.

Questions 3, 4, and 5 were posed to obtain information about the relevance of the study’s task information. This entailed examining data on the basis of respondents’ subgroup affiliation and job-related variables.

For Question 3, a two-tailed t-test at the .05 significance level was performed to obtain an estimate of the possible variances between incumbent workers’ and supervisor-managers’ selected task matches. A t-test was also performed to determine whether the significance (importance) values incumbent workers and supervisor-managers assigned to their selected task matches were equivalent. Then, analysis of variance at the .05 level of significance was computed.
This test was performed to determine whether differences existed in task matches made by respondents on the basis of their years of experience working in the occupation: one year or less, one to three years, or three years or more. When sample differences resulted in a significant F-ratio, Scheffe’s multiple comparison method was employed. This procedure enabled the researcher to examine within which pairs differences existed.

Relative to Question 4, a t-test was computed. The test, which was performed at the .05 level of significance, was conducted to obtain an estimate of selected task-match variance across geographical boundaries.

Tests were also conducted to examine, at the .05 level of significance, task matches made by respondents on the basis of the size and type of industry in which they were employed. This entailed the calculation of t-tests, which were used to compare respondents’ mean responses relative to size of industry, and the calculation of analysis of variance and Scheffe’s multiple comparison procedure to obtain a measure of respondents’ task matches relative to type of industry. Data from these tests were pertinent to Question 5.

Cross-tabulated tables were used to display the statistical data that were compiled relative to the study’s research questions, hypotheses, and job variables. These tables served as a means to explore differences and to summarize the results from which inferences about the population were drawn. The data-analysis results are presented in Chapter IV.

Summary

The research design and methodologies used in conducting the study were presented in this chapter. Included were the procedures for the collection of data, which encompassed selection of the task lists, determination of the population and selection of the sample, development of the instruments, and collection of the data. Procedures applied in the analysis of data were also discussed.
Included was a list of the independent variables that were used in the collection of data. A cross-index of the questions, hypotheses, variables, and statistical measures is summarized in Tables 3.6 and 3.7. Chapter IV contains the results of the analysis of the data that were collected in this study.

[Table 3.6, the cross-index summary table of the study’s questions, hypotheses, variables, and statistical measures, is not legible in this copy.]
[Table 3.7, summarizing the tests of significance employed in the study, is not legible in this copy.]

CHAPTER IV

RESULTS OF THE DATA ANALYSIS

The results of a study of the Automated Cross-Referencing Occupational System (ACROS), a computer-assisted task-based program planning and development resource, are presented in this chapter. The researcher’s major objective in this study was to test the validity and reliability of selected tasks from the ACROS using two task lists, Bank Teller and Electronics Mechanic.
Relative to this objective and the research methodology, five questions and six hypotheses were posed. The questions and the hypotheses are as follows:

1. Do selected task lists in the ACROS match the task assignments of employees?

2. How important, on a scale from 1 to 5, are the task-assignment matches to occupational employees’ work assignments?

3. Do incumbent workers and their supervisor-managers respond to selected tasks in the same way?

Hypothesis 1. There will be no difference in the responses by incumbent workers and their supervisor-managers relative to selected task matches.

Hypothesis 2. There will be no difference in the responses by incumbent workers and their supervisor-managers relative to the significance (importance) values they assigned to their selected task matches.

Hypothesis 3. There will be no difference in the responses by respondents relative to selected task matches on the basis of length of employment.

4. How stable are the task-assignment matches across geographic boundaries?

Hypothesis 4. There will be no difference in the responses by respondents relative to selected task matches on the basis of geographical location of places of employment.

5. How do task assignments match up across different segments of multi-industries, given the same occupation?

Hypothesis 5. There will be no difference in the responses by respondents relative to selected task matches on the basis of type of business establishment.

Hypothesis 6. There will be no difference in the responses by respondents relative to selected task matches on the basis of size of the business.

The data sources for the study of the ACROS tasks for the Bank Teller and Electronics Mechanic occupations were incumbent workers and their supervisor-managers. In the following pages, the results of the study are presented by occupation relative to the research questions and the hypotheses.
Bank Teller (Paying and Receiving) Task List Inventory Results

Background Information

Sixty-eight percent of the respondents (n = 79) were from Michigan and 32% were from Georgia. Of the total respondents, 54% were supervisor-managers (n = 43), 43% were incumbent workers (n = 34) who worked in teller, paying and receiving jobs, and 3% were incumbent workers (n = 2) who were employed in teller/loan assistant positions. Table 4.1 shows the distribution of respondents relative to their length of employment in the occupation.

Table 4.1.--Bank Teller: Distribution of respondents by length of employment in the occupation.

                                   Length of Employment
Respondent Group         1 Year or Less    1-3 Years    3 Years or More
Incumbent workers               7              16              13
Supervisor-managers             6              13              24
Total                          13              29              37
% of responses                 16              37              47

Fifty-four of the respondents worked in small business establishments; 25 worked in firms that employed a total of 100 or more people. Table 4.2 illustrates the types of businesses where the respondents worked.

Table 4.2.--Bank Teller: Distribution of respondents relative to their places of employment.

Business                                                Number    Percent
Commercial bank                                           46        58.2
Savings & loan                                            17        21.5
Other (designated by respondents): federal savings
  bank, farm lending institution, credit union             8        10.1
Federal Reserve                                            6         7.6
Credit agency                                              2         2.5

The data obtained from the comment section of the inventory indicated that there was strong agreement between incumbent workers and supervisor-managers about the clarity of the inventory’s task statements and the instructions for completing the inventory. A summary of the number and percentage of respondents’ answers, which ranged from strongly agree to strongly disagree, is presented in Tables 4.3 and 4.4.

Table 4.3.--Bank Teller: Distribution of respondents’ answers relative to clarity of the inventory’s task items.
                      Supervisor-Managers      Incumbent Workers
Response               Number    Percent       Number    Percent
Strongly agree            8         19            6         17
Agree                    30         70           27         75
Uncertain                 3          7            1          2
Disagree                  -          -            2          6
Strongly disagree         2          4            -          -

Table 4.4.--Bank Teller: Distribution of respondents’ answers relative to clarity of instructions for completing the inventory.

                      Supervisor-Managers      Incumbent Workers
Response               Number    Percent       Number    Percent
Strongly agree           14         33            8         22
Agree                    27         63           26         72
Uncertain                 -          -            -          -
Disagree                  1          2            2          6
Strongly disagree         1          2            -          -

There was also evidence, based on information obtained from the respondents’ comments, that there was substantial agreement between incumbent workers and supervisor-managers that the inventory contained most of the tasks that are performed by employees in paying and receiving jobs. The number and percentage of respondents’ answers, which range from strongly agree to strongly disagree, are presented in Table 4.5.

Table 4.5.--Bank Teller: Distribution of respondents’ answers relative to representation of the inventory’s tasks to paying and receiving employees’ jobs.

                      Supervisor-Managers      Incumbent Workers
Response               Number    Percent       Number    Percent
Strongly agree           15        34.9          11        30.5
Agree                    25        58.1          23        63.8
Uncertain                 -          -            -          -
Disagree                  1         2.3           1         2.8
Strongly disagree         2         4.7           1         2.8

Research Question 1

Do selected task lists in the ACROS match the task assignments of employees?

The researcher wanted to make inferences about the population on the basis of the data that were acquired from the respondents. Confidence intervals for the sample mean were calculated at the .05 level of significance. Respondents’ (n = 79) mean response (M = 62.07) relative to the selection of tasks (n = 69) and the standard error (1.69) were used to compute a confidence interval, which ranged from 58.76 to 65.38 (see Table 4.6). This range specifies the parameters within which there is a 95% chance that an estimate of the mean response (relative to selection of tasks) of the population would be found.
Table 4.6.--Bank Teller: Summary data for the range of respondents’ task matches (n = 79).

Source        Total Responses      Mean      Standard Deviation
69 tasks           4,283          62.07            14.075

Standard Error      Confidence Intervals
    1.694           ± 3.31 = 58.76 to 65.38

A reliability coefficient, Kuder-Richardson K-20, was computed for the purpose of obtaining an estimate of the consistency of respondents’ answers to task items. The range of this coefficient is from 0 to 1. The coefficient that was obtained was high, 0.9655. This was well within the range of the .70 or higher value that was established by the researcher as the coefficient value that would be accepted as an estimate of the reliability of respondents’ task-item responses. Table 4.7 shows a summary of the results of the reliability coefficient. Appendix B, p. 135, contains the raw data that were used in calculating the coefficient.

Table 4.7.--Bank Teller: Estimates of reliability of respondents’ task-selection responses.

Number of Items      Mean      P x Q      Variance       rxx
      69            62.07      9.46        195.26       0.9655

Research Question 2

How important, on a scale from 1 to 5, are the task-assignment matches to occupational employees’ work assignments?

On a scale that ranged from 1 = minor part of job to 5 = extremely important, the respondents rated the significance/importance of the tasks that were selected as matching the work assignments of employees’ jobs. Confidence limits, at the .05 level of significance, were established for the purpose of estimating the parameters within which the population item response mean might be found. These limits, +/- .16 of the sample mean (M = 3.51), ranged from 3.35 to 3.67. As indicated in Table 4.8, 82% of the respondents’ task matches were rated as extremely important, very important, or important. A distribution of the respondents’ ratings of their selected tasks, on an item-by-item basis, is contained in Appendix C, p. 138.

Table 4.8.--Bank Teller: Distribution of respondents’ task-significance ratings.
83 Table 4.8.--Bank Teller: Distribution of respondents’ task- significance ratings. Significance Level Number Percent l = Minor part of job 430 10 2 = Somewhat important 359 8 3 = Important 1,222 29 4 = Very important 954 22 5 = Extremely important 1,302 31 Total 4,267 100 Research Question 3 Do incumbent workers and their supervisor-managers respond to selected tasks in the same way? Hypothesis 1. It was hypothesized that there would be no difference, at the .05 level of significance, between incumbent workers and supervisor-managers in the selection of tasks. The hypothesis of no difference was appaptad. The data presented in Table 4.9 (see Appendix D, p. 143, for the raw data) indicate that a possibility existed (t = 3.8466 at the .05 level of significance) that the two groups were different in terms of their selection of tasks. To test for differences between the two respondent groups, a two-tailed F—test at the .05 alpha level was performed. The results of the F-test (1.36) led the researcher to conclude that there was a strong probability that both samples were obtained from a population with the same variance. The 84 data also showed that there was less than a 1% chance that a larger F-ratio would be observed in the population, p > F = .2058. Table 4.9.--Hypothesis 1--Bank Teller: Test of significance relative to the two respondent groups’ selected task matches. Groupa df Mean Std. Dev. t Prob. F Prob. 1 132.9 33.37 7.675 3.8466 .002 1.36 .2058 2 136 28.69 6.578 aGroup 1 = incumbent workers; Group 2 = supervisor-managers. Hypothesis 2. It was hypothesized that no difference would exist relative to the significance (importance) values incumbent workers and supervisor-managers assigned to selected tasks. The hypothesis of no difference was accepted. 
The overall frequency' of ‘the respondent groups’ ratings of tasks (minor part of the job, somewhat important, important, very important, extremely important) is summarized as follows: Number of Responses Importance of Task Matches Mgrs. Workers Minor part of job 247 183 Somewhat important 160 199 Important 657 565 Very important 505 449 Extremely important 725 577 85 Appendix D, p. 143, contains cross-tabulation statistics, number of respondents rating a task item, mean response rate (based on significance values 1 through 5) relative to a task item, and standard deviation computed for each task item. The descriptive data, means and standard deviation, were used as a basis to perform a significance test. A summary of the results of a two-tailed t-test that was performed at the .05 level of significance is presented in Table 4.10. It was concluded (t == .3876) that there was less than a 1% chance that a larger result (p > .6989) would be found in the population from which the sample was drawn. Table 4.10.--Hypothesis 2--Bank Teller: Test of significance relative to the significance (importance) values respondent groups assigned to their selected task matches. Group df Mean Std. Dev. t Prob. Incumbent workers 133.9 3.53 0.7269 .3876 .6989 Supervisor-managers 136 3.49 0.6406 Hypothesis 3. The hypothesis that no difference would exist between respondent groups relative to the selection of tasks by respondents with one year or less, one to three years, or three years or more of experience was rejected. Data pertaining to respondents’ overall answers, on the basis of years of experience in the occupation, are presented in Appendix 86 D, p. 143. Tables 4.11 and 4.12 illustrate the results of the significance tests. As indicated, the analysis of variance F-test results (F .. 145.65) suggest that there were variances between (incumbent workers and supervisor-managers) and within the subgroups (length of employment). 
A subsequent test, Scheffe, confirmed that while the differences within groups were of minimum significance (F = 3.01773), they did not evidence equivalence. Table 4.11.--Hypothesis 3--Bank Teller: Analysis of variance summary table for respondents’ selected task matches relative to length of employment. Sum of Mean Source df Squares Square F Prob. Between groups 3 5896.60 1965.53 145.65 .0001 Within groups 410 5533.00 13.495 Total 413 11429.60 Table 4.12.--Hypothesis 3-—Bank Teller: Scheffe’s test for compar- ing respondents’ task matches. Source Mean Mean Square (Experience) df Diff. Equivalence F One year or less 5.3116 One to three years 14.0652 Three years or more 11.6594 410 13.4951 3.01773 87 Research Question 4 How stable are the task-assignment matches across geographic boundaries? Hypothesis 4. It was hypothesized that there would be no difference in respondents’ selected task matches relative to the geographical location of ‘their places of employment. The test results indicated that a difference existed between respondents in the two states that were included in the study regarding the selection of tasks. Thus, the hypothesis of no difference was rejected. Data pertaining to respondents’ selected task matches relative to the state in which their jobs were located are found in Appendix E, p. 146. The data obtained from the test of significance, at the .05 level of acceptance, are presented in Table 4.13. Table 4.13.-~Hypothesis 4--Bank Teller: Test of significance summary table for respondents’ task matches by geographic location. State df Mean Std. Dev. t Prob. F Prob. Georgia 117 16.579 5.8244 -22.5609 .0001 2.34 .0006 Michigan 135 45.492 8.9107 Research Question 5 How do task assignments match up across different segments of multi-industries, given the same occupation? Hypothesis 5. The hypothesis that there would be no difference in respondents’ answers relative to tasks matched on the basis of 88 places of employment was rejegtad. 
Inasmuch as paying and receiving jobs are concentrated in a single industry, financial, the hypothesis was tested using data obtained relative to the type of financial institution--commercial bank, savings and loan, Federal Reserve, credit agency, other (related banking institutions)--in which respondents were employed.

An analysis of variance test resulted in an F-test value that was equal to 739.61. The high F-ratio suggests that there were differences between groups (incumbent workers and supervisor-managers) and within groups (places of employment) relative to their selected task matches. Further, Scheffe’s test for multiple comparisons indicated that 14 out of 20 of the industry task-match selection comparisons were significant at the .05 level. A summary of the cross-tabulated industry-task item responses is presented in Appendix F, p. 147. A summary of the test results is presented in Tables 4.14 and 4.15.

Table 4.14.--Hypothesis 5--Bank Teller: Analysis of variance summary table for respondents’ selected task matches by type of industry.

Source               df      Sum of Squares      Mean Square        F        Prob.
Between groups        5         26136.65           5227.33        739.61     .0000
Within groups       553          3908.44              7.067
Total               558         30045.09

Table 4.15.--Hypothesis 5--Bank Teller: Scheffe’s test for comparing industry subgroups’ selected task matches.

                         Difference Between Means
Industry         4            5            6            7            8
   4            --        12.4355*     16.4557*     16.5344*     18.1594*
   5        -12.4355*        --         4.0202*      4.0989*      5.7239*
   6        -16.4557*    -4.0202*         --         0.0787       1.7037
   7        -16.5344*    -4.0989*     -0.0787          --         1.6250
   8        -18.1594*    -5.7239*     -1.7037      -1.6250          --

df = 553      F = 2.388

*Significant at the .05 level.

Industry key: 4 = Commercial banks; 5 = Savings and loans; 6 = Other; 7 = Federal Reserve banks; 8 = Credit agencies

Hypothesis 6. The hypothesis that there would be no difference in the answers of respondents relative to task matches on the basis of the size of business establishments was rejected.
On the basis of the results of the t-test (t = 16.1448), it was determined that the possibility of differences in the selection of tasks relative to the size of the business existed. A subsequent test of significance resulted in an F-ratio of 5.43. This result implies that equivalence of respondents’ task matches, relative to the size of their places of employment (small and large establishments), did not exist. The results of the tests are shown in Table 4.16. A summary of the cross-tabulated task item responses relative to size of the industry is presented in Appendix F, p. 148.

Table 4.16.--Hypothesis 6--Bank Teller: Test of significance summary table for respondents’ selected task matches by size of industry.

Size(a)      df       Mean       Std. Dev.        t         Prob.      F      Prob.
   9          92     41.768      10.1476
  10         136     20.304       4.3565      16.1448      .0001      5.43     .001

(a) Size key: 9 = 100 or fewer employees; 10 = 100 or more employees

Electronics Mechanic Task List Inventory Results

Background Information

Fifty-six percent of the total respondents (n = 34) were incumbent workers (n = 19); 44% were supervisor-managers (n = 15). Twelve of the respondents in the incumbent worker group were from New Jersey, and seven were from Michigan. Of the respondents who comprised the supervisor-manager group, eight were from Michigan and seven were from New Jersey.

The probability of sampling error, according to some testing and measurement authorities, increases when the sample size is less than 30 (Borg & Gall, 1983). Even though the number of respondents included in the data analysis for this study was 34, this is still a relatively small sample size. A measurement for small sample sizes, the t-test, was used to examine the probability of mean differences between groups. It is possible, however, that the data-analysis results would have been significantly different if the size of the respondent sample that was analyzed had been larger.
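The fractional degrees of freedom reported in the tables above (e.g., 132.9 and 133.9) are consistent with the unequal-variance (Welch) form of the two-sample t-test. A minimal sketch of that form; the group statistics used here are illustrative values, not the study's raw data:

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's two-sample t statistic and its approximate degrees of
    freedom, computed from group means, standard deviations, and sizes
    (the form consistent with the fractional df in the tables above)."""
    v1, v2 = s1 ** 2 / n1, s2 ** 2 / n2          # per-group variance of the mean
    t = (m1 - m2) / math.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

# Illustrative values only: with equal n and s, the Welch df reduces
# to the pooled n1 + n2 - 2 = 30.
t, df = welch_t(10.0, 2.0, 16, 8.0, 2.0, 16)
```

When group variances differ, the Welch df falls below the pooled value, which is why non-integer df can appear in reported results.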
Table 4.17 shows the distribution of respondents relative to their length of employment in the occupation.

Table 4.17.--Electronics: Distribution of respondents by length of employment in the occupation.

                        Length of Employment
Respondent Group        1 Year or Less   1-3 Years   3 Years or More
Incumbent workers              6             4              9
Supervisor-managers            2             5              8
Total                          8             9             17
% of responses                23.5          26.5           50.0

Most of the respondents (68%) worked in business establishments that employed 100 or fewer people. Table 4.18 shows the types of industries in which the respondents were employed.

Table 4.18.--Electronics: Distribution of respondents relative to their places of employment.

Business                                          Number   Percent
Computer repair                                     10       29.4
Computer/office                                      5       14.7
Communication equipment                              3        8.8
Electronic components                                2        5.9
Electronic/electrical                                6       17.6
Other (designated by respondents): aerospace,        8       23.5
  industrial electronic controls, consumer
  electronic services

Data obtained from the inventory's comment section indicated that, on a scale that ranged from strongly agree to strongly disagree, there was a high level of agreement between incumbent workers and supervisor-managers about the clarity of the inventory's task statements and the instructions for completing the inventory. A summary of the number and percentage of responses is presented in Tables 4.19 and 4.20.

Table 4.19.--Electronics: Distribution of respondents' answers relative to clarity of the inventory's task items.

                     Supervisor-Managers     Incumbent Workers
Response             Number   Percent        Number   Percent
Strongly agree          -        -              3       15.8
Agree                  13       86.7           13       68.4
Uncertain               -        -              3       15.8
Disagree                2       13.3            -        -
Strongly disagree       -        -              -        -

Table 4.20.--Electronics: Distribution of respondents' answers relative to clarity of instructions for completing the inventory.
                     Supervisor-Managers     Incumbent Workers
Response             Number   Percent        Number   Percent
Strongly agree          3       20.0            9       47.4
Agree                  11       73.3           10       52.6
Uncertain               -        -              -        -
Disagree                1        6.7            -        -
Strongly disagree       -        -              -        -

There was also evidence, based on information obtained from the comments section, that a discrepancy existed between incumbent workers and supervisor-managers with regard to the representation of the inventory's tasks relative to the work of employees in the occupational domain. Fifty-three percent of the supervisor-managers indicated agreement/strong agreement that the tasks represented employees' job assignments, as compared to 89% of the incumbent workers who indicated they agreed/strongly agreed that the tasks represented the occupational domain. The number and percentage of respondents' answers, which range from strongly agree to strongly disagree, are presented in Table 4.21.

Table 4.21.--Electronics: Distribution of respondents' answers relative to representation of the inventory's tasks to electronics mechanic employees' jobs.

                     Supervisor-Managers     Incumbent Workers
Response             Number   Percent        Number   Percent
Strongly agree          -        -              -        -
Agree                   8       53.3           16       84.2
Uncertain               -        -              1        5.3
Disagree                6       40.0            -        -
Strongly disagree       1        6.7            2       10.5

Research Question 1

Do selected task lists in the ACROS match the task assignments of employees?

On the basis of the data that were acquired from respondents, the researcher wanted to make inferences about the population. Thus, confidence intervals for the sample mean, at the .05 level of significance, were established. Table 4.22 contains the data that were used in calculating the confidence intervals. The average (mean) response (M = 11.58) made by respondents relative to selecting tasks that were representative of employees' jobs and the standard error (.54) were used as a basis to compute the confidence intervals.
The range of the confidence intervals, 10.52 to 12.64, established an estimate of the parameters within which there was a 95% chance that the mean response (relative to the sample's task selection data) of the population would be found.

Table 4.22.--Electronics: Summary data for the range of respondents' task matches (n = 34).

Source       Total Responses   Mean    Standard Deviation
153 tasks        1,771         11.58         6.719

Standard Error   Confidence Intervals
     .54         11.58 +/- 1.06 = 10.52 to 12.64

Using the Kuder-Richardson Formula K-20, a reliability coefficient, at the .70 or above level, was computed. The coefficient was computed to obtain an estimate of the consistency of respondents' selected task matches. A coefficient of 0.8649 was obtained. Table 4.23 shows a summary of the reliability coefficient results. Appendix H, p. 169, contains the raw data that were used in calculating the coefficient.

Table 4.23.--Electronics: Estimates of reliability of respondents' task-selection responses.

Number of Items   Mean    Sum of P x Q   Variance   rxx
153               11.58      28.42        201.86    0.8649

Research Question 2

How important, on a scale from 1 to 5, are the task-assignment matches to occupational employees' work assignments?

On a scale that ranged from 1 = minor part of job to 5 = extremely important, the respondents rated the significance/importance of the tasks that were selected as matching the work assignments of employees' jobs. Confidence limits, at the .05 level of significance, were established for the purpose of estimating the parameters within which the population item response mean might be found. These limits, +/- 0.098 of the sample mean (M = 2.98), ranged from 2.88 to 3.08.

As indicated in Table 4.24, 63% of the respondents' task item ratings were rated as extremely important, very important, or important. A distribution of the respondents' ratings of their selected tasks, on an item-by-item basis, is contained in Appendix I, p. 173.
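Both reported statistics can be reproduced from the summary values alone: the 95% confidence interval is the sample mean plus or minus 1.96 standard errors, and the Kuder-Richardson Formula 20 coefficient is (k/(k-1))(1 - sum(pq)/variance). A brief check (not part of the original analysis) against the figures in Tables 4.22 and 4.23:

```python
# 95% confidence interval for the mean task-match response (Table 4.22).
mean, se = 11.58, 0.54
lower = mean - 1.96 * se  # 10.52
upper = mean + 1.96 * se  # 12.64

# Kuder-Richardson Formula 20 reliability (Table 4.23): k items,
# sum of item variances p*q, and total-score variance.
k, sum_pq, variance = 153, 28.42, 201.86
kr20 = (k / (k - 1)) * (1 - sum_pq / variance)  # 0.8649
```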
Table 4.24.--Electronics: Distribution of respondents' task-significance ratings.

Significance Level         Number   Percent
1 = Minor part of job        279       16
2 = Somewhat important       363       21
3 = Important                427       24
4 = Very important           323       18
5 = Extremely important      378       21
Total                      1,770      100

Research Question 3

Do incumbent workers and their supervisor-managers respond to selected tasks in the same way?

Hypothesis 1. It was hypothesized that there would be no difference, at the .05 level of significance, between incumbent workers and supervisor-managers in the selection of tasks. The hypothesis was accepted. The data, presented in Table 4.25 (see Appendix J, p. 184, for the raw data), indicated that the two groups' task selections did not differ significantly (t = 0.0198, p = .9842) in the population from which the two groups were drawn.

Table 4.25.--Hypothesis 1--Electronics: Test of significance relative to the two respondent groups' selected task matches.

Group                 df    Mean    Std. Dev.   t       Prob.
Incumbent workers     153   5.986    3.418      .0198   .9842
Supervisor-managers   143   5.979    3.436

Hypothesis 2. It was hypothesized that no difference would exist relative to the significance (importance) values incumbent workers and supervisor-managers assigned to their selected tasks. The hypothesis of no difference was rejected. The overall frequency of the respondent groups' ratings of tasks (minor part of the job, somewhat important, important, very important, extremely important) is summarized as follows:

                              Number of Responses
Importance of Task Matches    Workers   Mgrs.
Minor part of job               191       88
Somewhat important              220      143
Important                       257      170
Very important                  125      198
Extremely important             122      256

Appendix J, p. 184, contains cross-tabulation statistics: the number of respondents rating a task item, the mean response rate (based on significance values 1 through 5) for each task item, and the standard deviation computed for each task item.
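The near-zero t in Table 4.25 can be approximated from its summary rows with the pooled two-sample t statistic. In the sketch below the group sizes are inferred from the reported degrees of freedom (an assumption, so the result only approximates the reported t = 0.0198):

```python
import math

# Summary rows of Table 4.25; ns inferred from df = n - 1 (assumption).
m1, s1, n1 = 5.986, 3.418, 154  # incumbent workers
m2, s2, n2 = 5.979, 3.436, 144  # supervisor-managers

# Pooled-variance two-sample t statistic.
pooled_var = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
t = (m1 - m2) / math.sqrt(pooled_var * (1 / n1 + 1 / n2))
# t is close to zero, consistent with accepting the null hypothesis.
```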
Descriptive data were used as a basis to perform a significance test. A summary of the results of a two-tailed t-test that was performed at the .05 level of significance is presented in Table 4.26. It was concluded (t = -10.37) that incumbent workers and supervisor-managers responded differently in terms of the importance they attached to the tasks they had selected. The data indicated that there was less than a 1% chance (p < .001) that a t-value this extreme would be found by chance in the population from which the sample was drawn. To determine whether the observed difference was due to chance, an F-test was performed. The F-ratio, F = 1.49, indicated that the observed variance difference between the ratings of incumbent workers' and supervisor-managers' selected task matches was significant.

Table 4.26.--Hypothesis 2--Electronics: Test of significance relative to the significance (importance) values respondent groups assigned to their selected task matches.

Group(a)   df    Mean    Std. Dev.   t        Prob.   F      Prob.
1          275   2.617    0.603      -10.37   .001    1.49   .0167
2          294   3.431    0.735

(a) Group 1 = incumbent workers; Group 2 = supervisor-managers.

Hypothesis 3. The hypothesis that no difference would exist between respondent groups relative to the selection of tasks by respondents with one year or less, one to three years, or three years or more of experience was rejected. Data pertaining to respondents' overall responses, on the basis of years of experience in the occupation, are presented in Appendix J, p. 184. The results of the analysis of variance and F-test (F = 58.94) suggested that there were variances between (incumbent workers and supervisor-managers) and within the subgroups on the basis of length of employment. A subsequent test, Scheffe's, confirmed that, relative to equivalence of the subgroups, there were significant differences in respondents' answers based on their length of employment in the occupation.
Significance at the .05 level was observed in five of the six subgroup comparisons. The subgroups that evidenced equivalence were those with three years or more experience and those with one year or less experience in the occupation. The results of the significance tests are presented in Tables 4.27 and 4.28.

Table 4.27.--Hypothesis 3--Electronics: Analysis of variance summary table for respondents' selected task matches relative to length of employment.

Source           df    Sum of Squares   Mean Square   F       Prob.
Between groups     3       406.124       135.37       58.94   .0001
Within groups    669      1536.487         2.296
Total            672      1942.611

Table 4.28.--Hypothesis 3--Electronics: Scheffe's test for comparing respondents' task matches.

Source (Length of Employment)   Mean Diff.
17-18                            1.4549*
17-16                            1.6775*
18-17                           -1.4549*
18-16                            0.2227*
16-17                           -1.6775*
16-18                           -0.2227

df = 669   Mean Square = 2.296   F = 3.009
*Significant at the .05 level.

Length of employment key: 16 = 1 year or less; 17 = 1 to 3 years; 18 = 3 years or more

Research Question 4

How stable are the task-assignment matches across geographic boundaries?

Hypothesis 4. It was hypothesized that there would be no difference in respondents' selected task matches relative to the geographical location of their places of employment. The test results indicate that the responses made relative to the selection of tasks in the two states were equivalent. Thus, the hypothesis of no difference was accepted. The data pertaining to tasks selected by respondents relative to the state in which their jobs were located are found in Appendix K, p. 189. The data obtained from the test of significance, at the .05 level of significance, are presented in Table 4.29.

Table 4.29.--Hypothesis 4--Electronics: Test of significance summary table for respondents' task matches by geographic location.

State        df      Mean    Std. Dev.   t       Prob.    F(a)   Prob.
Michigan     247.5   7.639    3.792      4.926   .0001    1.21   .2675
New Jersey   273     5.483    3.449

(a) Degrees of freedom = 121 and 152.
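Scheffe's procedure, used in Tables 4.15, 4.28, and 4.31, flags a pairwise mean difference as significant when its absolute value exceeds a critical difference built from the ANOVA: sqrt((k-1) * F_crit * MS_within * (1/n_i + 1/n_j)). A sketch of the criterion using the values reported with Tables 4.27 and 4.28; the subgroup sizes below are hypothetical equal sizes (an assumption, since the actual counts appear only in the appendices):

```python
import math

# Values reported with Tables 4.27 and 4.28.
k = 3               # experience subgroups (codes 16, 17, 18)
ms_within = 2.296   # within-groups mean square, Table 4.27
f_crit = 3.009      # critical F reported with Table 4.28
n_i = n_j = 224     # HYPOTHETICAL equal subgroup sizes (assumption)

crit_diff = math.sqrt((k - 1) * f_crit * ms_within * (1 / n_i + 1 / n_j))

def significant(diff):
    """True when |diff| exceeds the Scheffe critical difference."""
    return abs(diff) > crit_diff

# Under these assumed sizes, the large differences in Table 4.28
# (1.4549 and 1.6775) clear the criterion.
```

Because the criterion depends on the actual subgroup sizes, the borderline comparisons in the tables cannot be re-derived from the summary figures alone.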
Research Question 5

How do task assignments match up across different segments of multi-industries, given the same occupation?

Hypothesis 5. It was hypothesized that there would be no difference in respondents' answers to selected task matches on the basis of type of business establishment. This hypothesis of equivalence of selection of tasks by respondents in different segments of multi-industries was rejected. The analysis was performed on the basis of respondents' task selection responses relative to the type of industrial/business establishment in which they were employed. The analysis of variance test resulted in a large F-ratio (F = 55.46). The high F-ratio strongly suggests that equivalence did not exist between the subgroups. A summary of the data is presented in Table 4.30.

Table 4.30.--Hypothesis 5--Electronics: Analysis of variance summary table for respondents' selected task matches by type of industry.

Source           df    Sum of Squares   Mean Square   F       Prob.
Between groups     6       282.31        47.05        55.46   .0000
Within groups    903       766.04         0.848
Total            909      1048.35

Scheffe's multiple comparison procedure was employed to ascertain within which groups differences existed. The data, which are summarized in Table 4.31, indicate that significant differences, at the .05 level, were observed in 24 of the 30 multiple comparisons. A table that displays a cross-tabulation of the frequency of tasks selected by respondents in the multi-industry/business establishments is contained in Appendix L, p. 190.

Hypothesis 6. The hypothesis that there would be no difference in respondents' answers relative to task matches on the basis of the size of the business establishment was rejected. The results of the t-test (t = 12.101) indicated that the possibility of differences in the selection of tasks as it related to the size of a business existed.
A subsequent test of significance resulted in an F-ratio (F = 4.93) that implied equivalence did not exist for responses obtained from respondents on the basis of the size of the business establishment in which they were employed. The results of the tests are shown in Table 4.32. A summary of the cross-tabulated task item responses relative to size of industry is presented in Appendix L, p. 191.

Table 4.31.--Hypothesis 5--Electronics: Scheffe's test for comparing industry subgroups' selected task matches.

Industry   Difference Between Means
6-13                 0.7511*
6-11                 0.8922*
6-7                  1.3789*
6-9                  1.5640*
6-10                 1.8000*
13-6                -0.7511*
13-11                0.1411
13-7                 0.6278*
13-9                 0.8129*
13-10                1.0488*
11-6                -0.8922*
11-13               -0.1411
11-7                 0.4867*
11-9                 0.6718*
11-10                0.9077*
7-6                 -1.3789*
7-13                -0.6278*
7-11                -0.4867*
7-9                  0.1851
7-10                 0.4210
9-6                 -1.5640*
9-13                -0.8129*
9-11                -0.6718*
9-7                 -0.1851
9-10                 0.2359
10-6                -1.8000*
10-13               -1.0488*
10-11               -0.9077*
10-7                -0.4210
10-9                -0.2359

df = 903   F = 2.224
*Significant at the .05 level.

Industry key: 6 = computer repair; 7 = computer/office; 9 = communication equipment; 10 = electronic components; 11 = electronic/electrical; 13 = other

Table 4.32.--Hypothesis 6--Electronics: Test of significance summary table for respondents' selected task matches by size of industry.

Size   df    Mean    Std. Dev.   t        Prob.   F      Prob.
14     189   8.914    5.082      12.101   .001    4.93   .0001
15     291   3.418    2.290

Size key: 14 = 100 or fewer employees; 15 = more than 100 employees

Summary

The data-analysis results provided answers to the overall objective regarding the validity and reliability of selected tasks from the ACROS. The results, which were compiled from data obtained from incumbent workers and supervisor-managers, encompassed two of the ACROS task lists, Bank Teller and Electronics Mechanic. A summary of the results is presented in Table 4.33.

Table 4.33.--Summary table of hypotheses for Bank Teller and Electronics Mechanic.
1. There will be no difference in the responses by incumbent workers and their supervisor-managers relative to selected task matches. (Bank Teller: Accept null; Electronics Mechanic: Accept null)

2. There will be no difference in the responses by incumbent workers and their supervisor-managers relative to the significance (importance) values they assigned to their selected task matches. (Bank Teller: Accept null; Electronics Mechanic: Reject null)

3. There will be no difference in the responses by respondents relative to selected task matches on the basis of length of employment. (Bank Teller: Reject null; Electronics Mechanic: Reject null)

4. There will be no difference in the responses by respondents relative to selected task matches on the basis of geographical location of places of employment. (Bank Teller: Reject null; Electronics Mechanic: Accept null)

5. There will be no difference in the responses by respondents relative to selected task matches on the basis of type of business establishment. (Bank Teller: Reject null; Electronics Mechanic: Reject null)

6. There will be no difference in the responses by respondents relative to selected task matches on the basis of size of the business. (Bank Teller: Reject null; Electronics Mechanic: Reject null)

CHAPTER V

SUMMARY, CONCLUSIONS, RECOMMENDATIONS, AND IMPLICATIONS

Summary

Background

Social and economic conditions, such as competition from world markets and technology, have escalated changes in the requirements of jobs. The changing job requirements, in addition to guidelines for the development of programs that are outlined in federal and state legislation, place demands on occupational program planners to develop curricula for educational and training programs that prepare adults and youths for present and future employment. This entails using innovative techniques to acquire job information that projects the skills, knowledge, and work practices associated with the various occupational specialties. One information-gathering technique that occupational program planners have strongly endorsed in the last few years is the acquisition of job information from computerized systems.
In particular, many educators have obtained information from the Automated Cross-Referencing Occupational System (ACROS). However, educators who subscribe to the ACROS have expressed concern about the validity and reliability of the data base's job-task information. It was these concerns that prompted the researcher to conduct this study.

Purpose

ACROS task information is used by many educators, nationally, to develop curricula for employment-based education and training programs. It is also used by some educators as a basis for the development of criterion-referenced tests. It is therefore critical that ACROS task information be valid and reliable. The writer's main purpose in this study was to obtain an estimate of the validity and reliability of the ACROS task information. Relative to this, the researcher also wanted to investigate the representativeness of ACROS tasks for a given occupation and the appropriateness of selected tasks for inclusion in formal education programs.

Finally, the researcher wanted to establish a conceptual framework to promote the continued development of computer-assisted occupational program planning and development resources. This entailed investigating procedures that permitted collection and analysis of task information with regard to job-related factors that are critical in planning and developing effective employment-training programs. Relative to these objectives, the ACROS Bank Teller and Electronics Mechanic tasks were selected for this study. The following research questions and hypotheses were investigated:

Research Question 1. Do selected task lists in the ACROS match the task assignments of employees?

Research Question 2. How important, on a scale from 1 to 5, are the task-assignment matches to occupational employees' work assignments?

Research Question 3. Do incumbent workers and their supervisor-managers respond to selected tasks in the same way?

Hypothesis 1.
There will be no difference in the responses by incumbent workers and their supervisor-managers relative to selected task matches.

Hypothesis 2. There will be no difference in the responses by incumbent workers and their supervisor-managers relative to the significance (importance) values they assigned to their selected task matches.

Hypothesis 3. There will be no difference in the responses by respondents relative to selected task matches on the basis of length of employment.

Research Question 4. How stable are the task-assignment matches across geographic boundaries?

Hypothesis 4. There will be no difference in the responses by respondents relative to selected task matches on the basis of geographical location of places of employment.

Research Question 5. How do task assignments match up across different segments of multi-industries, given the same occupation?

Hypothesis 5. There will be no difference in the responses by respondents relative to selected task matches on the basis of type of business establishment.

Hypothesis 6. There will be no difference in the responses by respondents relative to selected task matches on the basis of size of the business.

Procedure

The population was composed of 400 businesses in which incumbent workers and their supervisor-managers were employed in bank teller and electronics mechanic jobs. Two hundred of the businesses were financial establishments, 100 from Michigan and 100 from Georgia. The remaining businesses (100 in Michigan and 100 in New Jersey) were places where electronics mechanic jobs were concentrated. The data analyzed in this study were drawn from information collected from 113 respondents. Of these, 79 respondents were engaged in paying and receiving work, and 34 were employed in electronics mechanic positions. Respondents for each study area identified, from an inventory, tasks that represented their occupational specialty.
Statistical tests were then performed on the collected data to obtain an estimate of the validity of the tasks that were listed in the inventory. In addition, a reliability test was performed to acquire an estimate of the consistency of respondents' answers regarding the listed task items. After selecting the tasks, respondents assigned a significance/importance value, ranging from 1 to 5, to their matched task items. These data were used to examine the importance of selected tasks to the job assignments of employees in the occupation.

Findings

Bank Teller Task Inventory. Incumbent workers (94%) and their supervisor-managers (93%) agreed that the ACROS tasks represented work performed by employees in paying and receiving positions. The results of the calculation of confidence intervals, at the .05 level of significance, indicated there was less than a 5% chance that the population the respondents represented would select or rate the ACROS tasks differently. A comparison of the data acquired from the two groups, incumbent workers and supervisor-managers, revealed that they did not differ significantly in their selection of tasks. The data further indicated that no variance existed between the two groups' ratings, on a scale from 1 to 5, of the significance (importance) of the tasks they selected to match employees' job assignments.

The result of the Kuder-Richardson Formula K-20 (0.9655) was much higher than the .70 minimum acceptance level that the researcher had established. This suggested that the probability existed that respondents' answers to the 69 task items reflected the way the population they represented would respond. Significant differences were observed, however, when responses to tasks were analyzed on the basis of job-related factors. These included differences associated with selection of tasks relative to the geographic location of paying and receiving jobs, length of employment, size of business, and type of financial institution.
Electronics Mechanic Task Inventory. The size of the respondent sample was small, 34. Consequently, it is possible that the results of the data analysis would have been different if a larger number of respondents' inventories had been included in the analysis process. The data indicated that incumbent workers and supervisor-managers did not agree about the extent to which the ACROS tasks represented employees' job assignments. Only 43% of the supervisor-managers, as compared to 89% of the incumbent workers, indicated the listed tasks were inclusive of tasks that employees perform on their jobs.

Even though agreement between the two respondent groups was not evidenced relative to the extent to which the inventory's tasks represented employees' work assignments, no difference was observed between the tasks that incumbent workers and supervisor-managers selected from the inventory as matching employees' jobs. This suggested that the ACROS Electronics Mechanic task listing does not include a representative sample of the tasks of the occupational domain. The data also revealed that even though incumbent workers and supervisor-managers did not demonstrate a marked difference in their selection of tasks, they evidenced significant differences in their ratings of the importance of their selected tasks. This raises questions about the relevance of the tasks for development of employment-based programs.

The Kuder-Richardson Formula K-20 was computed to obtain a measure of the reliability of responses. A high reliability coefficient (0.8649) was obtained. This suggested that the probability existed that respondents' task-item matches were representative of the way in which the population would respond to the ACROS tasks. No significant difference was observed relative to the selection of tasks by respondents from the two states.
Significant differences were observed, however, when responses were analyzed with respect to length of employment, size of business, and type of business establishment.

Synthesis

Findings for the Bank Teller and Electronics Mechanic Task Inventories were presented separately. However, estimates of reliability and validity that evidenced similar results for the two occupations should be noted. Similar results were demonstrated as follows:

1. For both occupations, the two respondent groups were in agreement with respect to selection of tasks.

2. High reliability coefficient estimates were evidenced for responses to both occupations.

3. Responses analyzed relative to length of employment, size of business, and type of business demonstrated that these factors affected selection of tasks for both occupations.

It should also be noted that the following differences were observed between the two occupations, relative to reliability and validity measures:

1. The respondent groups for one of the occupations, Bank Teller, indicated agreement that the ACROS tasks represented the occupation; however, agreement did not exist between the respondent groups for the other occupation, Electronics Mechanic.

2. For one of the occupations, Bank Teller, differences were observed in the selection of tasks relative to geographic location; however, no difference was observed on the same variable when responses were analyzed for the Electronics Mechanic occupation.

3. There was no difference in the way in which respondent groups rated their selected tasks for one occupation, Bank Teller. Conversely, marked differences were evidenced between the importance values assigned by incumbent workers and supervisor-managers for the Electronics Mechanic occupation.

Conclusions

The following conclusions were reached as a result of information obtained from this study:

1. Superficially, the ACROS tasks appear to be valid.
On closer examination, however, validity was not evidenced when tasks were analyzed with respect to job factors: length of employment, size of business, and type of business establishment.

2. Incumbent workers and supervisor-managers selected the same tasks to match employees' work assignments from a preestablished list. However, the two groups did not always agree that the listed tasks were a representative sample of the work performed by employees in the occupation.

3. Incumbent workers and supervisor-managers were not always in agreement with respect to the extent to which their selected task matches were important components of the work performed by employees in the occupation.

4. ACROS tasks that were developed around jobs located in multiple types of businesses/industries did not represent work performed by employees in the occupation with respect to the type of business/industry in which their jobs were located.

Recommendations

The following recommendations are made as a result of information acquired during the course of the study:

1. Modify the procedures employed in task-validation studies to include (a) selection of the population sample on the basis of places of employment where jobs relative to the study area are concentrated, (b) validation of tasks with incumbent workers and their supervisor-managers, and (c) collection and analysis of task information with respect to job-related factors.

2. Employ a systematic approach in the continued development of ACROS (see Figure 5.1).

3. Revise the programming of ACROS to accommodate, through menu options, retrieval of task information on the basis of job-related variables. Examples are entry/advanced-level job tasks; regional job tasks; tasks associated with jobs performed in small versus large businesses; and tasks of a job that are pertinent to work performed in a specific type of business: service, wholesale, transportation, or manufacturing.

4.
Revise the programming of ACROS to provide access to task information on the basis of business/industry codes that may be correlated with places of employment where jobs are concentrated within a program planner's educational service area.

Figure 5.1.--Conceptual model for the continued development of the computer-assisted occupational program planning and development resource. [Figure text not recoverable from the scanned page.]

Implications

During the past several months, the writer embarked on a journey that involved the study of the validity and reliability of selected tasks from the Automated Cross-Referencing Occupational System. At times the journey was frustrating; more questions were raised than were answered.
However, these frustrating moments were now and then offset by a glimmer of insight. Some of the insights gained cannot be documented with the data that were acquired. Nevertheless, it seems important that these insights be elaborated on because they might be of benefit to others who engage in similar research.

Relevance of ACROS Task Information for Development of Programs

Many schools do not have the resources, time, money, or staff to conduct large-scale job-task analysis validation studies. It is critical, however, that this kind of information be readily available to educators who are responsible for developing programs that prepare adults and youths for employment. ACROS is a cost-efficient computerized job-task information system that is accessible to occupational program planners. Before using ACROS information, however, educators should be aware of some of the limitations of the system's job-task information.

Many of the ACROS task descriptions were developed several years ago. Consequently, it is possible that some of the task descriptions do not represent current job requirements and are therefore not appropriate to use in the development of occupational programs. Program developers should also keep in mind that most of the ACROS task descriptions were validated in one labor market. This labor market may have been vastly different, because of economic conditions or business practices such as de-emphasis on job specialization, from the labor market in which a program planner is developing programs. Thus, it is highly possible that the job tasks an occupational program developer obtains from ACROS may not reflect the knowledge and skill requirements of jobs in the local labor market. It is also possible that the level of ACROS tasks may not be appropriate for use in the development of certain types of programs.
Some tasks, for example, may be too elementary to include as objectives for skill-upgrading, career-change, or "high"-technology programs. Finally, educators who obtain criterion-referenced tests that are based on ACROS information should make sure that the tasks upon which the test items were developed are valid job tasks.

Task-Validation Procedures

Some of the limitations of the task-validation procedures that were employed by V-TECS and other organizations that collected information included in the ACROS have been identified. For example, the procedures do not include methods for collecting and analyzing data with regard to some of the critical job factors that affect programs that are developed at the local or national level.

As with any undertaking, there were also some limitations that were evidenced with this study. A major procedural problem, for instance, was encountered in cross-indexing the Classification of Instructional Programs (CIP), Dictionary of Occupational Titles, and Occupational Employment Statistics (OES) classification systems. Yet this was an important procedural step because it provided a way to identify the population (location of jobs) from which a sample for the study was obtained. Unfortunately, because of inconsistencies that existed among the classification systems, several of the sample listings were inappropriate for inclusion in the study. The researcher believes this was a contributing factor in the low response rate that was evidenced in one of the study areas, Electronics Mechanic.

In retrospect, the researcher realizes that some of the problems that occurred as a result of using the classification systems might have been eliminated if the research had been designed differently. For example, a letter mailed to the population sample before the mailing of inventories could have served to identify businesses that were not appropriate for inclusion in the study.
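The reliability coefficients in this study were obtained with the Kuder-Richardson Formula 20 (Appendix B), r = [k/(k-1)] * [1 - sum(p*q)/variance], where k is the number of tasks, p and q are the proportions of respondents checking and not checking each task, and the variance is that of respondents' total scores. As a minimal sketch of that arithmetic (the Python function is illustrative and not part of the original study), using the Bank Teller statistics reported in Appendix B:

```python
def kr20(num_items, sum_pq, score_variance):
    """Kuder-Richardson Formula 20: r = (k/(k-1)) * (1 - sum(p*q)/variance)."""
    return (num_items / (num_items - 1)) * (1 - sum_pq / score_variance)

# Statistics reported in Appendix B for the Bank Teller inventory:
# k = 69 tasks, sum(p*q) = 9.46, variance of total scores = 195.26
r = kr20(69, 9.46, 195.26)
print(round(r, 4))  # 0.9655
```

With the reported statistics this reproduces the coefficient of 0.9655 shown in Appendix B; the same computation applies to the Electronics Mechanic inventory given its item statistics.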
Continued Development of ACROS

ACROS has demonstrated that under certain conditions it is a resource that has the potential to provide educators with some job information that is pertinent to development of occupational programs. The task descriptions may be used by educators, for example, as a starting point for job-task validation studies that are conducted locally or nationally.

Educators should keep in mind, however, that ACROS was developed approximately seven years ago. No major modifications, with regard to reprogramming of the menu options, have been made since ACROS's inception. This is partly because information about the needs of the target population (educators) and an analysis of the environment in which programs operate have not been compiled. This study has provided insight into some system changes, both data-input and technical, that may be of benefit to educators. These involve modifying ACROS to accommodate the diverse needs of the educational population it serves. The modifications include improvements to the system that result in increased cross-indexing of tasks and job-related factors, integration of data that emphasize various job-education situations, and retrieval of information relative to job-education variables that are deemed pertinent by individual program planners. It is believed that these kinds of changes will help ensure that ACROS becomes a viable resource for programs planned and developed by educators nationally.

APPENDICES

APPENDIX A

BANK TELLER TASK INVENTORY

121

MICHIGAN STATE UNIVERSITY
COLLEGE OF EDUCATION
EAST LANSING . MICHIGAN . 48824-1034
DEPARTMENT OF EDUCATIONAL ADMINISTRATION
ERICKSON HALL

November 14, 1989

Dear Employer:

Educators across the United States are taking steps to improve the quality of their pre-employment programs. To accomplish this, they need to know the knowledge, skills, and attitudes needed of employees who work in paying and receiving teller jobs.
This information is used by educators to develop programs which respond to needs identified by employers such as yourself.

As a part of this program improvement effort, we are conducting research on a computer-generated occupational information system. The system's information is used by educators nationwide to plan and develop pre-employment programs. An important phase of the research involves validating tasks performed by paying and receiving tellers. We are asking for your participation and assistance.

Enclosed are two occupational task inventories. One of the inventories is to be completed by an employee who supervises your paying and receiving tellers. The second inventory is to be completed by an employee who performs the work of a paying and receiving teller. The inventory can be completed in approximately 15 minutes. The directions are brief and easy to follow, and all responses are confidential and will be viewed by the researchers only. Two stamped, self-addressed envelopes are enclosed for the return of individual responses.

Although participation by you and your employees is strictly voluntary, it is in this phase that your firm can make a major contribution. The return of the surveys by December 4 will ensure your firm's participation in this important educational endeavor. Meanwhile, if you have questions about the research, please do not hesitate to contact us. Thank you.

Sincerely,

Cas F. Heilman, Professor
Adult and Continuing Education
Phone: 517/484-7745

Carol E. Culpepper
Ph.D. Researcher

122

TASK INVENTORY FOR PAYING AND RECEIVING TELLER

THIS INVENTORY IS TO BE COMPLETED BY A PAYING AND RECEIVING TELLER

123

To Our Respondents:

Thank you for participating in this study. Your answers to the survey questions will be used to determine the accuracy of job-related information that is contained in an educational computer-generated system.
The computerized job information system is used by educators across the United States to develop pre-employment programs. Any information you give is confidential and will be viewed by researchers only. Your participation is strictly voluntary.

DIRECTIONS FOR COMPLETING THIS SURVEY

1. Fill in the Background Information Sheet.
2. Starting with page 1, place a CHECK MARK in the CHECK column for each task you perform on your present job.
3. Then, rate the importance of each task you checked. Place a CHECK in the COLUMN that best describes the extent to which the task is a significant part of your job.
4. Complete the Comment Page.
5. Mail your survey in the self-addressed envelope by December 4, 1989.

THANK YOU

124

TASK INVENTORY FOR PAYING AND RECEIVING TELLER

THIS INVENTORY IS TO BE COMPLETED BY A MANAGER/SUPERVISOR OF PAYING AND RECEIVING TELLER EMPLOYEES

125

To Our Respondents:

Thank you for participating in this study. Your answers to the survey questions will be used to determine the accuracy of job-related information that is contained in an educational computer-generated system. The computerized job information system is used by educators across the United States to develop pre-employment programs. Any information you give is confidential and will be viewed by researchers only. Your participation is strictly voluntary.

DIRECTIONS FOR COMPLETING THIS SURVEY

1. Fill in the Background Information Sheet.
2. Starting with page 1, place a CHECK MARK in the CHECK column for each task that should be performed by the paying and receiving tellers who are under your supervision.
3. Then, rate the importance of each task you checked. Place a CHECK in the COLUMN that best describes the extent to which the task is a significant part of the job performed by the paying and receiving tellers you supervise.
4. Complete the Comment Page.
5. Mail your survey in the self-addressed envelope by December 4, 1989.
THANK YOU

126

BACKGROUND INFORMATION

INSTRUCTIONS: Please check the box to the left of the one item in each section that most appropriately describes your experiences.

1. WHAT IS YOUR PRESENT JOB POSITION?
   1 [___] TELLER, PAYING AND RECEIVING
   2 [___] MANAGER/SUPERVISOR
   3 [___] OTHER (please write in)

2. IN WHAT TYPE OF ORGANIZATION OR INDUSTRY ARE YOU EMPLOYED?
   4 [___] COMMERCIAL BANK
   5 [___] SAVINGS AND LOAN
   6 [___] OTHER (please write in)
   7 [___] FEDERAL RESERVE BANK
   8 [___] CREDIT AGENCY

3. APPROXIMATELY HOW MANY PEOPLE ARE EMPLOYED IN YOUR PLACE OF EMPLOYMENT?
   9 [___] 100 OR LESS
   10 [___] 100 OR MORE

4. HOW LONG HAVE YOU BEEN EMPLOYED IN YOUR PRESENT JOB?
   11 [___] LESS THAN ONE YEAR
   13 [___] ONE TO THREE YEARS
   12 [___] THREE YEARS OR MORE

127

TELLER TASK INVENTORY

For each task, check (/) the CHECK column if the task is part of the job; then rate the significance of the task to the job: (1) Minor Part of Job, (2) Somewhat Important, (3) Important, (4) Very Important, (5) Extremely Important.

A. PLANNING AND ORGANIZING
1. Arrange coins in tray for day's transactions
2. Prepare strapped currency drawer for day's transactions
3. Prepare rolled coin drawer for day's transactions
4. Prepare working currency drawer for day's transactions
5. Prepare working currency and coins for day's transactions
6. Stock forms, supplies, and equipment for daily transactions
7. Open teller terminal
8. Close teller terminal
9. Balance cash drawer and close teller window

B. SUPERVISING AND IMPLEMENTING
10. Follow procedures for conduct during a robbery
11. Follow procedures for conduct after a robbery
12. Greet customers
13. Insure the safekeeping of money at teller's window
14. Dismiss customers

C. INSPECTING AND EVALUATING
15. Determine if check is negotiable
16. Examine counter checks for acceptability
17. Examine deposit slips for acceptability
18. Remove excess currency from teller's window
19. Examine currency for counterfeit bills
20. Inspect for mutilated and badly worn coins
21. Inspect for mutilated or badly worn currency
22. Inspect customer's identification

D. PROCESSING MONEY
23. Process counterfeit currency
24. Process excess currency
25. Process mutilated and badly worn currency
26. Roll coins
27. Sort and stack coins
28. Sort and stack currency
29. Strap currency
30. Count coins
31. Count currency

E. PERFORMING CUSTOMER SERVICE ACTIVITIES
32. Verify customer checking/savings account
33. Accept Christmas Club payments
34. Accept installment loan payments
35. Admit customers to safe deposit boxes
36. Advance cash on bank credit cards
37. Answer customer inquiries
38. Cash checks
39. Cash series E or EE bonds
40. Enter amount of interest in savings passbooks
41. Fill change requests
42. Fill payroll requests
43. Pay savings withdrawals
44. Issue cashier's checks
45. Issue certificates of deposit
46. Process bond coupons
47. Process deposits
48. Complete a collection receipt
49. Issue money order
50. Process deposits
51. Issue series EE bonds
52. Issue traveler's checks
53. Cash traveler's checks

F. PERFORMING CLERICAL ACTIVITIES
54. Take telephone calls
55. Operate checkwriter
56. Order personalized checks
57. Prepare deposit slips for the customer
58. Accept and process tax deposits
59. Complete safe deposit rental statement
60. Accept safe deposit rental fees
61. Place hold on customer accounts
62. Make title changes on customer accounts
63. Open new customer accounts
64. Place stop payment on checks
65. Operate electronic audit machine
66. Prepare cash-in and cash-out tickets

G. ACCOUNTING
67. Buy cash from other tellers
68. Prepare a teller's daily balance sheet
69. Sell cash to other tellers

133

COMMENTS

Thank you for your cooperation. This page provides an opportunity for you to make comments and suggestions.

INSTRUCTIONS: Respond to each of the statements below. CIRCLE the ONE symbol which best describes your feelings about each statement. (SA = Strongly Agree, A = Agree, U = Uncertain, D = Disagree, SD = Strongly Disagree)

1. The tasks in this survey included most of the tasks performed by employees who are working in the job area. SA A U D SD
2. The statements of the job tasks were reasonably clear. SA A U D SD
3. The instructions for answering the questions in this survey were reasonably clear. SA A U D SD
4. PLEASE INDICATE ANY ADDITIONAL COMMENTS YOU WOULD LIKE TO MAKE IN THIS SPACE.

Please return your survey in the self-addressed envelope. THANK YOU

134

MICHIGAN STATE UNIVERSITY
COLLEGE OF EDUCATION
EAST LANSING . MICHIGAN . 48824-1034
DEPARTMENT OF EDUCATIONAL ADMINISTRATION
ERICKSON HALL

November 30, 1989

Dear Employer:

Recently, you received a letter and survey forms regarding the knowledge, skills, and attitudes needed of employees who work in paying and receiving teller jobs. The success of this venture is dependent upon the receipt of this information from your employees. Your firm can make a valuable contribution to education by responding to the task inventory surveys that were mailed to you on November 14.
If you have not already responded, may we ask you to encourage the employees you selected to participate in the study to complete and return the inventories to us by no later than December 20. If you have any questions, please call the number below. Thank you.

Sincerely,

Carol E. Culpepper
Ph.D. Researcher

Cas F. Heilman, Professor
Adult and Continuing Education
517/484-7745

APPENDIX B

KUDER-RICHARDSON COMPUTATIONS: BANK TELLER

135

KUDER-RICHARDSON COMPUTATIONS: PAYING AND RECEIVING (page 1)

(The # YES's, # NO's, p, and q columns for tasks 1-52 are illegible in the scan; the deviation columns read as follows.)

TASK # | (X-M) | (X-M)^2
1 | 1.93 | 3.72
2 | 2.93 | 8.57
3 | -0.07 | 0.01
4 | 6.93 | 47.99
5 | 8.93 | 79.70
6 | 10.93 | 119.41
7 | 8.93 | 79.70
8 | 8.93 | 79.70
9 | 12.93 | 167.12
10 | 12.93 | 167.12
11 | 11.93 | 142.27
12 | 16.93 | 286.54
13 | 0.93 | 0.86
14 | 1.93 | 3.72
15 | 11.93 | 142.27
16 | 6.93 | 47.99
17 | 10.93 | 119.41
18 | 10.93 | 119.41
19 | 11.93 | 142.27
20 | 5.93 | 35.14
21 | 9.93 | 98.56
22 | 12.93 | 167.12
23 | -18.07 | 326.61
24 | 1.93 | 3.72
25 | -1.07 | 1.15
26 | -0.07 | 0.01
27 | -8.07 | 65.16
28 | -5.07 | 25.73
29 | 6.93 | 47.99
30 | 7.93 | 62.85
31 | 9.93 | 98.56
32 | 8.93 | 79.70
33 | -24.07 | 579.48
34 | 12.93 | 167.12
35 | -21.07 | 444.05
36 | 4.93 | 24.28
37 | 15.93 | 253.69
38 | 11.93 | 142.27
39 | 4.93 | 24.28
40 | -9.07 | 82.31
41 | 8.93 | 79.70
42 | -29.07 | 845.21
43 | 10.93 | 119.41
44 | 10.93 | 119.41
45 | -25.07 | 628.63
46 | -15.07 | 227.18
47 | 9.93 | 98.56
48 | -29.07 | 845.21
49 | 11.93 | 142.27
50 | 11.93 | 142.27
51 | -17.07 | 291.47
52 | 2.93 | 8.57

PAYING AND RECEIVING (page 2)

TASK # | # YES's | # NO's | (X-M) | (X-M)^2
53 | 73 | 6 | 10.93 | 119.41
54 | 77 | 2 | 14.93 | 222.83
55 | 51 | 28 | -11.07 | 122.60
56 | 51 | 28 | -11.07 | 122.60
57 | 64 | 15 | 1.93 | 3.72
58 | 63 | 16 | 0.93 | 0.86
59 | 26 | 53 | -36.07 | 1301.22
60 | 50 | 29 | -12.07 | 145.74
61 | 59 | 20 | -3.07 | 9.44
62 | 37 | 42 | -25.07 | 628.63
63 | 38 | 41 | -24.07 | 579.48
64 | 49 | 30 | -13.07 | 170.89
65 | 18 | 61 | -44.07 | 1942.38
66 | 63 | 16 | 0.93 | 0.86
67 | 64 | 15 | 1.93 | 3.72
68 | 70 | 9 | 7.93 | 62.85
69 | 62 | 17 | -0.07 | 0.01

TOTALS: YES = 4283, NO = 1168; MEAN = 62.07; sum of (X-M)^2 = 13472.64

137

PAYING AND RECEIVING TELLER: KUDER-RICHARDSON COMPUTATIONS

# ITEMS CHECKED YES | FREQ | X | X^2
5 | 1 | 5 | 25
6 | 1 | 6 | 36
11 | 1 | 11 | 121
13 | 1 | 13 | 169
31 | 1 | 31 | 961
34 | 1 | 34 | 1156
39 | 1 | 39 | 1521
43 | 1 | 43 | 1849
44 | 2 | 88 | 7744
46 | 1 | 46 | 2116
47 | 1 | 47 | 2209
48 | 1 | 48 | 2304
49 | 3 | 147 | 21609
50 | 1 | 50 | 2500
51 | 1 | 51 | 2601
52 | 2 | 104 | 10816
53 | 3 | 159 | 25281
54 | 6 | 324 | 104976
55 | 1 | 55 | 3025
56 | 3 | 168 | 28224
57 | 5 | 285 | 81225
58 | 6 | 348 | 121104
59 | 7 | 413 | 170569
60 | 5 | 300 | 90000
61 | 5 | 305 | 93025
62 | 2 | 124 | 15376
63 | 4 | 252 | 63504
64 | 3 | 192 | 36864
65 | 5 | 325 | 105625
66 | 1 | 66 | 4356
67 | 1 | 67 | 4489
68 | 1 | 68 | 4624
69 | 1 | 69 | 4761

# RESPONSES = 79; sum X = 4283; sum X^2 = 1014765

KUDER-RICHARDSON:

    r = [k / (k - 1)] x [1 - sum(p*q) / variance]
    r = (69/68) x (1 - (9.46/195.26))
    r = 0.9655

VARIANCE = 195.26; SUM(P*Q) = 9.46

APPENDIX C

TABLE OF TASKS BY SIGNIFICANCE: BANK TELLER

138

Tasks by Significance: Bank Teller

Task Values (N=69) Total 1 2 3 4 5
1 Freq. 34 13 9 6 2 64
Row % 53.13 20.31 14.06 9.38 3.12
2 Freq. 13 14 26 6 6 65
Row % 20.00 21.54 40.00 9.23 9.23
3 Freq. 22 17 13 8 2 62
Row % 35.48 27.42 20.97 12.90 3.23
4 Freq. 11 11 27 11 9 69
Row % 15.94 15.94 39.13 15.94 13.04
5 Freq. 12 12 26 12 8 70
Row % 17.14 17.14 37.14 17.14 11.43
6 Freq. 22 21 19 10 1 73
Row % 30.14 28.77 26.03 13.70 1.37
7 Freq. 3 8 17 17 26 71
Row % 4.23 11.27 23.94 23.94 36.62
8 Freq. 3 7 18 17 26 71
Row % 4.23 9.86 25.35 23.94 36.62
9 Freq. 1 2 4 16 52 75
Row % 1.33 2.67 5.33 21.33 69.33
10 Freq. 3 0 2 10 59 74
Row % 4.05 0.00 2.70 13.51 79.73
11 Freq. 3 0 2 12 56 73
Row % 4.11 0.00 2.74 16.44 76.71
12 Freq. 1 0 9 22 47 79
Row % 1.27 0.00 11.39 27.85 59.49
13 Freq. 1 1 5 12 44 63
Row % 1.59 1.59 7.94 19.05 69.84
14 Freq. 2 5 15 26 14 62
Row % 3.23 8.06 24.19 41.94 22.58
15 Freq. 0 1 3 14 56 74
Row % 0.00 1.35 4.05 18.92 75.68
16 Freq.
0 0 7 17 44 68 Row % 0.00 0.00 10.29 25.00 64.71 17 Freq. 2 1 15 26 28 72 Row % 2.78 1.39 20.83 36.11 38.89 18 Freq. O 2 8 16 47 73 Row % 0.00 2.74 10.96 21.92 64.38 19 Freq. 4 4 13 24 29 74 Row % 5.41‘ 5.41 17.57 32.43 39.19 20 Freq. 23 16 20 5 4 68 Row % 33.82 23.53 29.41 7.35 5.88 21 Freq. 21 17 23 6 5 72 Row % 29.17 23.61 31.94 8.33 6.94 22 Freq. O O 7 19 48 74 Row % 0.00 0.00 9.46 25.68 64.86 23 Freq. 10 3 12 10 9 44 Row % 22.73 6.82 27.27 22.73 20.45 24 Freq. 4 3 18 25 14 64 Row % 6.25 4.69 28.12 39.06 21.87 25 Freq. 19 15 12 8 6 60 Row % 31.67 25.00 20.00 13.33 10.00 26 Freq. 23 13 15 6 5 62 Row % 37.10 20.97 24.19 9.68 8.06 27 Freq. 26 9 9 5 5 54 Row % 48.15 16.67 16.67 9.26 9.26 28 Freq. 12 10 16 11 8 57 Row % 21.05 17.54 28.07 19.30 14.04 29 Freq. 6 9 29 13 12 69 Row % 8.70 13.04 42.03 18.84 17.39 140 Task Values (N=69) Total 1 2 3 4 5 3O Freq. 16 6 22 11 15 70 Row % 22.86 8.57 31.43 15.71 21.43 31 Freq. 7 6 22 17 20 72 Row % 9.72 8.33 30.56 23.61 27.78 32 Freq. 1 3 17 23 27 71 Row % 1.41 4.23 23.94 32.39 38.03 33 Freq. 2 3 18 11 4 38 Row % 5.26 7.89 47.37 28.95 10.53 34 Freq. 5 l 35 19 15 75 Row % 6.67 1.33 46.67 25.33 20.00 35 Freq. 2 2 20 9 8 41 Row % 4.88 4.88 48.78 21.95 19.51 36 Freq. 2 4 32 11 18 67 Row % 2.99 5.97 47.76 16.42 26.87 37 Freq. 1 2 20 29 26 78 Row % 1.28 2.56 25.64 37.18 33.33 38 Freq. O 1 18 24 31 74 Row % 0.00 1.35 24.32 32.43 41.89 39 Freq. l 2 24 21 19 67 Row % 1.49 2.99 35.82 31.34 28.36 40 Freq. 11 5 21 9 7 53 Row % 20.75 9.43 39.62 16.98 13.21 41 Freq. 7 9 34 10 11 71 Row % 9.86 12.68 47.89 14.08 15.49 42 Freq. 6 3 12 6 6 33 Row % 18.18 9.09 36.36 18.18 18.18 43 Freq. 0 4 24 20 25 73 Row % 0.00 5.48 32.88 27.40 34.25 44 Freq. 1 5 28 20 19 73 Row % 1.37 6.85 38.36 27.40 26.03 141 Task Values (N=69) Total 1 2 3 4 5 45 Freq. 0 2 12 13 9 36 Row % 0.00 5.56 33.33 36.11 25.00 46 Freq. 3 6 18 9 11 47 Row % 6.38 12.77 38.30 19.15 23.40 47 Freq. O 2 20 25 25 72 Row % 0.00 2.78 27.78 34.72 34.72 48 Freq. 
6 2 17 4 4 33 Row % 18.18 6.06 51.52 12.12 12.12 49 Freq. 3 6 32 16 17 74 Row % 4.05 8.11 43.24 21.62 22.97 50 Freq. O 3 23 22 26 74 Row % 0.00 4.05 31.08 29.73 35.14 51 Freq. 1 2 21 13 7 44 Row % 2.27 4.55 47.73 29.55 15.91 52 Freq. O 4 27 19 14 64 Row % 0.00 6.25 42.19 29.69 21.87 53 Freq. 3 3 29 21 17 73 Row % 4.11 4.11 39.73 28.77 23.29 54 Freq. 9 4 25 21 18 77 Row % 11.69 5.19 32.47 27.27 23.38 55 Freq. 6 4 23 5 12 50 Row % 12.00 8.00 46.00 10.00 24.00 56 Freq. 8 7 20 9 7 51 Row % 15.69 13.73 39.22 17.65 13.73 57 Freq. 10 8 24 8 13 63 Row % 15.87 12.70 38.10 12.70 20.63 58 Freq. 5 5 22 9 22 63 Row % 7.94 7.94 34.92 14.29 34.92 59 Freq. 4 3 8 5 5 25 Row % 16.00 12.00 32.00 20.00 20.00 142 Task Values (N=69) Total 1 2 3 4 5 60 Freq. 5 10 19 7 8 49 Row % 10.20 20.41 38.78 14.29 16.33 61 Freq. 3 2 12 16 26 59 Row % 5.08 3.39 20.34 27.12 44.07 62 Freq. 1 2 18 7 9 37 Row % 2.70 5.41 48.65 18.92 24.32 63 Freq. 1 3 10 9 15 38 Row % 2.63 7.89 26.32 23.68 39.47 64 Freq. 2 2 17 10 18 49 Row % 4.08 4.08 34.69 20.41 36.73 65 Freq. O l 3 7 7 18 Row % 0.00 5.56 16.67 38.89 38.89 66 Freq. 2 l 14 21 25 63 Row % 3.17 1.59 22.22 33.33 39.68 67 Freq. 7 2 24 15 16 64 Row % 10.94 3.12 37.50 23.44 25.00 68 Freq. 1 2 15 20 32 70 Row % 1.43 2.86 21.43 28.57 45.71 69 Freq. 
7 3 23 13 16 62
Row % 11.29 4.84 37.10 20.97 25.81
Total 430 359 1,222 954 1,302 4,267
Frequency missing = 16

Values key: 1 = Minor part of job; 2 = Somewhat important; 3 = Important; 4 = Very important; 5 = Extremely important

APPENDIX D

TABLE OF TASKS BY EMPLOYEE TYPE, TASK INVENTORY DATA SUMMARY (SIGNIFICANCE VALUES), AND TABLE OF TASKS BY LENGTH OF EMPLOYMENT: BANK TELLER

143

TASK INVENTORY DATA SUMMARY (Page 1)
PAYING AND RECEIVING TELLER

TASK | ACTUAL WORKER (N=36): NUMBER RESPONSES, MEAN, STANDARD DEVIATION || SUPERVISOR (N=43): NUMBER RESPONSES, MEAN, STANDARD DEVIATION
1 | 29 2.10 1.29 || 35 1.71 1.02
2 | 31 2.83 1.21 || 34 2.50 1.13
3 | 29 2.24 1.18 || 33 2.18 1.16
4 | 33 2.88 1.22 || 36 3.00 1.24
5 | 33 2.79 1.22 || 37 2.97 1.24
6 | 34 2.20 1.12 || 39 2.33 1.06
7 | 33 3.73 1.35 || 38 3.81 1.04
8 | 33 3.76 1.30 || 38 3.81 1.06
9 | 35 4.46 0.92 || 40 4.62 0.74
10 | 33 4.88 0.42 || 41 4.46 1.10
11 | 36 4.85 0.44 || 40 4.42 1.11
12 | 36 4.36 0.93 || 43 4.51 0.67
13 | 30 4.53 0.78 || 33 4.54 0.90
14 | 31 3.61 0.95 || 31 3.83 1.07
15 | 33 4.64 0.74 || 41 4.73 0.50
16 | 33 4.42 0.75 || 35 4.66 0.59
17 | 33 4.12 1.02 || 39 4.02 0.90
18 | 34 4.41 0.89 || 39 4.53 0.72
19 | 34 4.14 0.99 || 40 3.77 1.23
20 | 30 2.50 1.17 || 38 2.10 1.18
21 | 33 2.63 1.29 || 39 2.20 1.08
22 | 34 4.47 0.71 || 40 4.62 0.63
23 | 17 3.35 1.32 || 27 2.96 1.51
24 | 28 3.61 0.99 || 36 3.69 1.14
25 | 26 2.50 1.27 || 34 2.41 1.40
26 | 29 2.41 1.38 || 33 2.21 1.22
27 | 24 2.34 1.49 || 30 2.00 1.26
28 | 26 2.92 1.35 || 31 2.83 1.34
29 | 33 3.27 1.18 || 36 3.19 1.14
30 | 33 3.00 1.41 || 37 3.08 1.46
31 | 33 3.42 1.23 || 39 3.58 1.29
32 | 32 4.03 1.00 || 39 4.00 0.95
33 | 19 3.21 0.92 || 19 3.42 1.02
34 | 35 3.40 0.98 || 40 3.60 1.10
35 | 20 3.50 1.10 || 21 3.42 0.98
36 | 31 3.48 1.15 || 36 3.67 0.96
37
| 36 3.89 0.95 || 42 4.07 0.87
38 | 34 4.00 0.85 || 40 4.28 0.82
39 | 31 3.68 1.01 || 36 3.94 0.86
40 | 25 2.92 1.26 || 28 2.93 1.33
(continued next page)

144

TASK INVENTORY DATA SUMMARY (Page 2)
PAYING AND RECEIVING TELLER

TASK | ACTUAL WORKER: NUMBER RESPONSES, MEAN, STANDARD DEVIATION || SUPERVISOR: NUMBER RESPONSES, MEAN, STANDARD DEVIATION
41 | 33 3.06 1.09 || 38 3.18 1.18
42 | 14 2.86 1.41 || 19 3.26 1.28
43 | 33 3.73 0.94 || 40 4.05 0.93
44 | 33 3.48 1.03 || 40 3.88 0.91
45 | 17 3.65 0.93 || 19 3.95 0.85
46 | 19 3.16 1.12 || 28 3.57 1.20
47 | 33 3.88 0.86 || 39 4.12 0.86
48 | 13 3.31 1.18 || 20 2.70 1.17
49 | 34 3.38 1.13 || 40 3.62 1.00
50 | 34 3.79 0.91 || 40 4.10 0.90
51 | 21 3.52 0.87 || 23 3.52 0.95
52 | 29 3.66 0.90 || 35 3.68 0.90
53 | 34 3.50 0.93 || 39 3.74 1.09
54 | 36 3.39 1.20 || 41 3.51 1.29
55 | 21 3.14 1.31 || 29 3.34 1.23
56 | 24 3.04 1.20 || 27 2.96 1.29
57 | 31 3.16 1.29 || 32 3.03 1.36
58 | 29 3.66 1.23 || 34 3.56 1.31
59 | 11 3.27 1.27 || 14 3.07 1.44
60 | 24 2.96 1.12 || 25 3.16 1.28
61 | 30 3.83 1.18 || 29 4.21 1.05
62 | 18 3.56 1.15 || 19 3.58 0.90
63 | 20 3.75 1.25 || 18 4.06 0.94
64 | 24 3.50 1.29 || 25 4.12 0.83
65 | 8 4.00 1.20 || 10 4.20 0.63
66 | 29 4.00 1.07 || 34 4.08 0.93
67 | 29 3.59 1.21 || 35 3.40 1.24
68 | 34 4.09 1.06 || 36 4.19 0.86
69 | 29 3.48 1.30 || 33 3.42 1.23

145

Tasks Selected on the Basis of Length of Time Employed in Present Job

Total Tasks | 1 Year or Less | 1 to 3 Years | 3 Years or More | Total Responses
69 | 733 | 1,609 | 1,941 | 4,283

APPENDIX E

TABLE OF TASKS BY STATE: BANK TELLER

146

Task Selection Responses by State

Total Tasks | Michigan | Georgia | Total Responses
69 | 3,139 | 1,145 | 4,283

APPENDIX F

TABLE OF TASKS BY TYPE OF BUSINESS/INDUSTRY AND TABLE OF TASKS BY SIZE OF BUSINESS: BANK TELLER

147

Task Selection Responses by Type of Business/Industry

Total Tasks | 4 | 5 | 6 | 7 | 8 | Total Responses
69 | 2,644 | 901 | 365 | 357 | 16 | 4,283

Key to Type of Business/Industry: 4 = Commercial bank; 5 = Savings and loan; 6 = Other; 7 = Federal Reserve bank; 8 = Credit agency

148

Task Selection Responses by Size of Business

Total Tasks | 100 or Fewer Employees | 100 or More Employees | Total Responses
69 | 2,882 | 1,401 | 4,283

APPENDIX G

ELECTRONICS MECHANIC TASK INVENTORY

149

MICHIGAN STATE UNIVERSITY
COLLEGE OF EDUCATION
EAST LANSING . MICHIGAN . 48824-1034
DEPARTMENT OF EDUCATIONAL ADMINISTRATION
ERICKSON HALL

November 14, 1989

Dear Employer:

Educators across the United States are taking steps to improve the quality of their pre-employment programs. To accomplish this, they need to know the knowledge, skills, and attitudes needed of employees who work in electronics jobs. This information is used by educators to develop programs which respond to needs identified by employers such as yourself.

As a part of this program improvement effort, we are conducting research on a computer-generated occupational information system. The system's information is used by educators nationwide to plan and develop pre-employment programs. An important phase of the research involves validating tasks performed by electronics employees. We are asking for your participation and assistance.

Enclosed are two occupational task inventories. One of the inventories is to be completed by an employee who supervises your electronic employees. The second inventory is to be completed by an employee who performs the work of one of the following job classifications: electronic technician, engineer, repairer. The inventory can be completed in approximately 15 minutes. The directions are brief and easy to follow, and all responses are confidential and will be viewed by the researchers only. Two stamped, self-addressed envelopes are enclosed for the return of individual responses.
Although participation by you and your employees is strictly voluntary, it is in this phase that your firm can make a major contribution. The return of the surveys by December 4 will ensure your firm's participation in this important educational endeavor. Meanwhile, if you have questions about the research, please do not hesitate to contact us. Thank you.

Sincerely,

Cas F. Heilman, Professor
Adult and Continuing Education
Phone: 517/484-7745

Carol E. Culpepper
Ph.D. Researcher

MSU is an Affirmative Action/Equal Opportunity Institution

150

TASK INVENTORY FOR ELECTRONICS

THIS INVENTORY IS TO BE COMPLETED BY AN ELECTRONIC EMPLOYEE

151

To Our Respondents:

Thank you for participating in this study. Your answers to the survey questions will be used to determine the accuracy of job-related information that is contained in an educational computer-generated system. The computerized job information system is used by educators across the United States to develop pre-employment programs. Any information you give is confidential and will be viewed by researchers only. Your participation is strictly voluntary.

DIRECTIONS FOR COMPLETING THIS SURVEY

1. Fill in the Background Information Sheet.
2. Starting with page 1, place a CHECK MARK in the CHECK column for each task you perform on your present job.
3. Then, rate the importance of each task you checked. Place a CHECK in the COLUMN that best describes the extent to which the task is a significant part of your job.
4. Complete the Comment Page.
5. Mail your survey in the self-addressed envelope by December 4, 1989.

THANK YOU

152

TASK INVENTORY FOR ELECTRONICS

THIS INVENTORY IS TO BE COMPLETED BY A MANAGER/SUPERVISOR OF ELECTRONIC EMPLOYEES

153

To Our Respondents:

Thank you for participating in this study.
Your answers to the survey questions will be used to determine the accuracy of job-related information that is contained in an educational computer-generated system. The computerized job information system is used by educators across the United States to develop pre-employment programs. Any information you give is confidential and will be viewed by researchers only. Your participation is strictly voluntary.

DIRECTIONS FOR COMPLETING THIS SURVEY

1. Fill in the Background Information Sheet.
2. Starting with page 1, place a CHECK MARK in the CHECK column for each task that should be performed by the electronic technicians, engineers, or repairers who are under your supervision.
3. Then, rate the importance of each task you checked. Place a CHECK in the COLUMN that best describes the extent to which the task is a significant part of the job performed by the electronics personnel you supervise.
4. Complete the Comment Page.
5. Mail your survey in the self-addressed envelope by December 4, 1989.

THANK YOU

154

BACKGROUND INFORMATION

INSTRUCTIONS: Please check the box to the left of the one item in each section that most appropriately describes your experiences.

1. WHAT IS YOUR PRESENT JOB POSITION?
   1 [___] ELECTRONIC TECHNICIAN
   2 [___] (ELECTRONIC) FIELD ENGINEER
   3 [___] ELECTRONIC MECHANIC/EQUIPMENT REPAIRER
   4 [___] MANAGER/SUPERVISOR
   5 [___] OTHER (please write in)

2. IN WHAT TYPE OF ORGANIZATION OR INDUSTRY ARE YOU EMPLOYED?
   6 [___] COMPUTER REPAIR
   7 [___] COMPUTER/OFFICE EQUIP., MFG
   8 [___] MACHINERY/EQUIPMENT, WHOLESALE
   9 [___] COMMUNICATION EQUIP., MFG
   10 [___] ELECTRONIC COMPONENTS, MFG
   11 [___] ELECTRONIC/ELECTRICAL REPAIR
   12 [___] INDUSTRIAL MACHINERY/EQUIP.
   13 [___] OTHER (please write in)

3. APPROXIMATELY HOW MANY PEOPLE ARE EMPLOYED IN YOUR PLACE OF EMPLOYMENT?
   14 [___] 100 OR LESS
   15 [___] 100 OR MORE

4. HOW LONG HAVE YOU BEEN EMPLOYED IN YOUR PRESENT JOB?
Is [ZZZ] LESS THAN ONE YEAR 17 [ZZZ] ONE TO THREE YEARS 18 [ZZZ] THREE YEARS OR MORE 155 TASK INVENTORY TASKS Check (./) It Part of Job late The Significance of the Task to The Joh c./) - llinor Part Soaeuhat Very Extreeely of Job leportant leportant leportant leportant (l) (2) (3) (4) (5) . ADJUSTING/AlIGNINBICALTBRATINS ELECTRONIC CIRCUITRY fl . Adjust AC generator output 2. Adjust Ac output resistance 3. Adjust aaplifier gain 4. Adjust araatare or field connection voltage 5. Adjust audio intensities 6. Adjust autoaatic gain control 7. Adjust bias netuork 8. Adjust capacitance 9. Adjust core for slug tune circuits Adjust DC generator output ll. Adjust drive gear 12. Adjust focus control 156 TASKS late the Significance of the last to the Joht/l llinor Part Souevhat of Job leportant leportant (1) (2) (3) Very leportant (4) Extreuely leportant (5) l3. Adjust horizontal linearity 1‘. Adjust iupedance 15. Adjust integraed circuit output I16. Adjust uodulation percentage 17. Adjust noise inverter control Adjust oscillator 19. Adjust output of high frequency aaplifiers (grounded grid; cascade) 20. Adjust pover converter output 21. Adjust probe calibrator signal 22. Adjust resistance 23. Adjust resonant frequency 24. Adjust spindle speeds Adjust synchonization in IC, 5 Adjust tape aaplifier TES7' TASKS Check Rate the Significance cv’3 of the Task to the If Job cv”i Part Of linor Job Part Soeevhat Very Extreuely of Job leportant leportant leportant leportant (l) (2) (3) (4) (5) 27. Adjust tape reader 28. Adjust tension are 29. Adjust tuned circuit valves 30. Adjust vertical linearity 3!. Adjust voltage 32. Align TRF 33. Calibrate DC levels in logics racks 34. Calibrate inductance 35. Calibrate logic integration 36. Calibrate uulti-vibrator circuit (stable, uonostable, bistable, flip flop) 37. Calibrate pover supply voltage 38. Calibrate P-P voltage 33. Calibrate tiling/clock pulse 40. 
Calibrate vertical amplitude

158

41. Calibrate delay, long time delay and O.F. drop out time
42. Adjust transducers
43. Adjust transformers

B. ADMINISTERING PERSONNEL

44. Administer diagnostic tests to prospective employees
45. Conduct instruction by demonstration performance
46. Evaluate employee performance
47. Evaluate training program
48. Evaluate personnel safety violations
49. Interview prospective employees
50. Maintain work records of employees
51. Monitor programmed instructions
52. Orient personnel on procedures
53. Plan work schedules

159

54. Report equipment related safety violations
55. Schedule work assignments

C. DESIGNING EQUIPMENT AND CIRCUITRY

56. Conduct physical inventory
57. Construct external interface adapters
60. Construct a IDITF graph
61. Construct tables displaying electronic data (variables, parameters)
62. Design electrical terminations for new equipment
63. Design interfaces between sub-assemblies (electrical, mechanical)
64. Design physical support hardware for new electronic equipment
65. Draft preliminary specifications for an electronic device

160

66. Draw schematic of circuitry
67. Modify schematics
68. Modify original circuitry to accommodate IC's
69. Plan quality assessment checks (physical, electrical)
70. Prepare an assembly guide
71. Prepare cost factors report
72. Prepare an estimate of production time
73. Prepare a parts list for prototype equipment
74. Prepare preliminary sketches for prototype equipment
75.
Prepare a production feasibility report
76. Prepare a survey of production schedules
77. Translate engineering specifications to a functional description

161

78. Translate graphic information into written specifications
79. Verify interface connections
80. Write operational procedures
81. Write summary report of operational tests
82. Design circuits from engineering specifications

D. PERFORMING ENVIRONMENTAL TESTS

83. Perform atmospherical test
84. Perform corrosive test
85. Perform humidity test
86. Perform maximum power test
87. Perform pressure test
88. Perform shock (impact) test
89. Perform temperature test

162

E. MAINTAINING ELECTRONICS DEVICES

90. Assemble structural members according to assembly drawing
91. Clean air filters
92. Clean chassis
93. Clean circulation fans (exhaust and intake)
94. Clean contact points
95. Clean drive mechanism
96. Clean reflective mirror
97. Clean speaker grill
98. Clean spindles device
99. Clean tape head
100. Clean tape reader rectifier
101. Clean tuner
102. Clean volume control

163

103. Construct a PC board (layout, etch, drill)
104. Convert digital decimals into binary coded decimals
105. Locate component malfunctions using written reports
106. Mount system in/on physical support
107. Record meter readings
108. Splice wires
109. Solder/unsolder components
110. Perform quality control checks

F. REPLACING COMPONENTS

111. Replace amplifier
112. Replace bi-polar logic gates
113.
Replace cathode ray tube
114. Replace capacitor
115. Replace counters (decode, preset, ring)

164

116. Replace counters board
117. Replace digital lights
118. Replace deflection yoke
119. Replace dynamotor
120. Replace energy storage cells
121. Replace FET
122. Replace filter
123. Replace frequency converter
124. Replace fuse
125. Replace IC chips
126. Replace IC network
127. Replace indicators
128. Replace indicator lamps
129. Replace klystron
130. Replace logic control board

165

131. Replace magnetron
132. Replace microphone
133. Replace oscillator
134. Replace PC boards
135. Replace photo electric relays
136. Replace photo transistor
137. Replace power supplies
138. Replace pulley bolt
139. Replace rectifier
140. Replace registers
141. Replace relays
142. Replace reader
143. Replace roller
144. Replace SCR-triac
145. Replace servo mechanism

166

146. Replace solid state diodes
147. Replace switches (leaf, contact, mercurial)
148. Replace tape head
149. Replace thermal breakers
150. Replace thyratrons
151. Replace transducer
152. Replace transformer
153. Replace transistors
154. Replace translator circuit (encoder, decoder, code converter, etc.)
155. Replace tubes

167

COMMENTS

Thank you for your cooperation. This page provides an opportunity for you to make comments and suggestions.

INSTRUCTIONS: Respond to each of the statements below. CIRCLE the ONE symbol which best describes your feelings about each statement.
SA = Strongly Agree   A = Agree   U = Uncertain   D = Disagree   SD = Strongly Disagree

1. The tasks in this survey included most of the tasks performed by employees who are working in the job area.  SA A U D SD

2. The statements of the job tasks were reasonably clear.  SA A U D SD

3. The instructions for answering the questions in this survey were reasonably clear.  SA A U D SD

4. PLEASE INDICATE ANY ADDITIONAL COMMENTS YOU WOULD LIKE TO MAKE IN THIS SPACE.

Please return your survey in the self-addressed envelope. THANK YOU

168

November 30, 1989

Dear Employer:

Recently, you received a letter and survey forms regarding the knowledge, skills, and attitudes needed of employees who work in electronics jobs or employees who may perform electronics tasks as a part of their overall job assignment. The success of this venture is dependent upon the receipt of this information from your employees. Your firm can make a valuable contribution to education by responding to the task inventory surveys that were mailed to you November 14. If you have not already responded, may we ask you to encourage the employees you selected to participate in the study to complete and return the inventories to us by no later than December 20. If you have any questions, please call the number below. Thank you.

Sincerely,

Cas F. Heilman, Professor
Adult and Continuing Education

Carol E. Culpepper
Ph.D.
Researcher
517/484-7745

CF:zp

APPENDIX H
KUDER-RICHARDSON COMPUTATIONS: ELECTRONICS MECHANIC

169

ELECTRONICS KUDER-RICHARDSON COMPUTATIONS (Page 1)

Task | # YES'S | # NO'S | P | Q | P*Q | X-Mean | (X-Mean)²

1 16 18 0.47 0.53 0.25 4.43 19.62
2 16 18 0.47 0.53 0.25 4.43 19.62
3 16 18 0.47 0.53 0.25 4.43 19.62
4 15 19 0.44 0.56 0.25 3.43 11.76
5 11 23 0.32 0.68 0.22 -0.57 0.32
6 8 26 0.24 0.76 0.18 -3.57 12.74
7 12 22 0.35 0.65 0.23 0.43 0.18
8 14 20 0.41 0.59 0.24 2.43 5.90
9 8 26 0.24 0.76 0.18 -3.57 12.74
10 12 22 0.35 0.65 0.23 0.43 0.18
11 12 22 0.35 0.65 0.23 0.43 0.18
12 17 17 0.50 0.50 0.25 5.43 29.48
13 18 16 0.53 0.47 0.25 6.43 41.34
14 14 20 0.41 0.59 0.24 2.43 5.90
15 10 24 0.29 0.71 0.21 -1.57 2.46
16 4 30 0.12 0.88 0.10 -7.57 57.30
17 3 31 0.09 0.91 0.08 -8.57 73.44
18 11 23 0.32 0.68 0.22 -0.57 0.32
19 5 29 0.15 0.85 0.13 -6.57 43.16
20 6 28 0.18 0.82 0.15 -5.57 31.02
21 6 28 0.18 0.82 0.15 -5.57 31.02
22 18 16 0.53 0.47 0.25 6.43 41.34
23 11 23 0.32 0.68 0.22 -0.57 0.32
24 7 27 0.21 0.79 0.16 -4.57 20.88
25 6 28 0.18 0.82 0.15 -5.57 31.02
26 8 26 0.24 0.76 0.18 -3.57 12.74
27 6 28 0.18 0.82 0.15 -5.57 31.02
28 10 24 0.29 0.71 0.21 -1.57 2.46
29 8 26 0.24 0.76 0.18 -3.57 12.74
30 16 18 0.47 0.53 0.25 4.43 19.62
31 22 12 0.65 0.35 0.23 10.43 108.78
32 5 29 0.15 0.85 0.13 -6.57 43.16
33 6 28 0.18 0.82 0.15 -5.57 31.02
34 1 33 0.03 0.97 0.03 -10.57 111.72
35 1 33 0.03 0.97 0.03 -10.57 111.72
36 3 31 0.09 0.91 0.08 -8.57 73.44
37 18 16 0.53 0.47 0.25 6.43 41.34
38 11 23 0.32 0.68 0.22 -0.57 0.32
39 12 22 0.35 0.65 0.23 0.43 0.18
40 7 27 0.21 0.79 0.16 -4.57 20.88
41 7 27 0.21 0.79 0.16 -4.57 20.88
42 6 28 0.18 0.82 0.15 -5.57 31.02
43 6 28 0.18 0.82 0.15 -5.57 31.02
44 13 21 0.38 0.62 0.24 1.43 2.04
45 13 21 0.38 0.62 0.24 1.43 2.04
46 15 19 0.44 0.56 0.25 3.43 11.76
47 12 22 0.35 0.65 0.23 0.43 0.18
48 15 19 0.44 0.56 0.25 3.43 11.76
49 16 18 0.47 0.53 0.25 4.43 19.62
50 13 21 0.38 0.62 0.24 1.43 2.04
51 8 26 0.24 0.76 0.18 -3.57 12.74
52 17 17 0.50 0.50 0.25 5.43 29.48

ELECTRONICS
(Page 2) KUDER-RICHARDSON COMPUTATIONS

170

[The rows for tasks 53-104 on page 2 are illegible in the source; the column structure matches page 1.]

171

ELECTRONICS (Page 3) KUDER-RICHARDSON COMPUTATIONS

Task | # YES'S | # NO'S | P | Q | P*Q | X-Mean | (X-Mean)²

105 11 23 0.32 0.68 0.22 -0.57 0.32
106 15 19 0.44 0.56 0.25 3.43 11.76
107 8 26 0.24 0.76 0.18 -3.57 12.74
108 19 15 0.56 0.44 0.25 7.43 55.20
109 31 3 0.91 0.09 0.08 19.43 377.52
110 21 13 0.62 0.38 0.24 9.43 88.92
111 18 16 0.53 0.47 0.25 6.43 41.34
112 12 22 0.35 0.65 0.23 0.43 0.18
113 20 14 0.59 0.41 0.24 8.43 71.06
114 29 5 0.85 0.15 0.13 17.43 303.80
115 11 23 0.32 0.68 0.22 -0.57 0.32
116 7 27 0.21 0.79 0.16 -4.57 20.88
117 17 17 0.50 0.50 0.25 5.43 29.48
118 14 20 0.41 0.59 0.24 2.43 5.90
119 5 29 0.15 0.85 0.13 -6.57 43.16
120 12 22 0.35 0.65 0.23 0.43 0.18
121 16 18 0.47 0.53 0.25 4.43 19.62
122 21 13 0.62 0.38 0.24 9.43 88.92
123 9 25 0.26 0.74 0.19 -2.57 6.60
124 22 12 0.65 0.35 0.23 10.43 108.78
125 26 8 0.76 0.24 0.18 14.43 208.22
126 17 17 0.50 0.50 0.25 5.43 29.48
127 18 16 0.53 0.47 0.25 6.43 41.34
128 19 15 0.56 0.44 0.25 7.43 55.20
129 1 33 0.03 0.97 0.03 -10.57 111.72
130 18 16 0.53 0.47 0.25 6.43 41.34
131 7 27 0.21 0.79 0.16 -4.57
20.88
132 7 27 0.21 0.79 0.16 -4.57 20.88
133 13 21 0.38 0.62 0.24 1.43 2.04
134 23 11 0.68 0.32 0.22 11.43 130.64
135 13 21 0.38 0.62 0.24 1.43 2.04
136 18 16 0.53 0.47 0.25 6.43 41.34
137 29 5 0.85 0.15 0.13 17.43 303.80
138 13 21 0.38 0.62 0.24 1.43 2.04
139 17 17 0.50 0.50 0.25 5.43 29.48
140 12 22 0.35 0.65 0.23 0.43 0.18
141 16 18 0.47 0.53 0.25 4.43 19.62
142 9 25 0.26 0.74 0.19 -2.57 6.60
143 13 21 0.38 0.62 0.24 1.43 2.04
144 14 20 0.41 0.59 0.24 2.43 5.90
145 14 20 0.41 0.59 0.24 2.43 5.90
146 21 13 0.62 0.38 0.24 9.43 88.92
147 26 8 0.76 0.24 0.18 14.43 208.22
148 19 15 0.56 0.44 0.25 7.43 55.20
149 10 24 0.29 0.71 0.21 -1.57 2.46
150 3 31 0.09 0.91 0.08 -8.57 73.44
151 10 24 0.29 0.71 0.21 -1.57 2.46
152 21 13 0.62 0.38 0.24 9.43 88.92
153 23 11 0.68 0.32 0.22 11.43 130.64
154 15 19 0.44 0.56 0.25 3.43 11.76
155 13 21 0.38 0.62 0.24 1.43 2.04

TOTALS: YES'S = 1,771; SUM(P*Q) = 28.42; SUM(X-Mean)² = 6,853.39
Mean = 11.58

ELECTRONICS KUDER-RICHARDSON COMPUTATIONS (continued)

# ITEMS CHECKED YES | FREQ | X | X²
Values of # items checked yes: 4, 7, 9, 22, 23, 25, 29, 31, 34, 36, 39, 41, 47, 49, 52, 54, 55, 58, 66, 67, 68, 70, 73, 79, 81, 87, 90, 94, 152
(Sum of the listed values = 1,542; the FREQ column is illegible in the source; SUM(X) = 1,771; SUM(X²) = 120,875.)

# RESPONSES = 34
MEAN = 45.35
VARIANCE = 201.86
SUM(P*Q) = 28.42

172

KUDER-RICHARDSON:

r = [k / (k - 1)] * [1 - SUM(p*q) / variance]
  = (153/152) * (1 - (28.42/201.86))
  = 0.8649

where k = 153 tasks.

APPENDIX I
TABLE OF TASKS BY SIGNIFICANCE: ELECTRONICS MECHANIC

173

Tasks by Significance: Electronics Mechanic

Task | Values (N=153): (1) (2) (3) (4) (5) | Total

1 Freq. 5 6 4 1 0 16
Row % 31.25 37.50 25.00 6.25 0.00
2 Freq. 5 5 5 1 0 16
Row % 31.25 31.25 31.25 6.25 0.00
3 Freq. 4 4 5 2 1 16
Row % 25.00 25.00 31.25 12.50 6.25
4 Freq. 3 6 4 2 0 15
Row % 20.00 40.00 26.67 13.33 0.00
5 Freq.
2 2 5 2 0 11 Row % 18.18 18.18 45.45 18.18 0.00 6 Freq. 2 2 4 0 0 8 Row % 25.00 25.00 50.00 0.00 0.00 7 Freq. 1 3 4 2 2 12 Row % 8.33 25.00 33.33 16.67 16.67 8 Freq. 5 2 5 1 1 14 Row % 35.71 14.29 35.71 7.14 7.14 9 Freq. l 2 4 l 0 8 Row % 12.50 25.00 50.00 12.50 0.00 10 Freq. 3 4 4 1 0 12 Row % 25.00 33.33 33.33 8.33 0.00 11 Freq. 3 2 5 0 2 12 Row % 25.00 16.67 41.67 0.00 16.67 12 Freq. 2 6 5 2 2 17 Row % 11.76 35.29 29.41 11.76 11.76 13 Freq. 6 6 2 l 3 18 Row % 33.33 33.33 11.11 5.56 16.67 14 Freq. 4 1 2 6 1 14 Row % 28.57 7.14 14.29 42.86 7.14 174 Task Values (N=153) Total 1 2 3 4 5 15 Freq. 2 2 4 2 0 10 Row % 20.00 20.00 40.00 20.00 0.00 16 Freq. 2 2 0 0 0 4 Row % 50.00 50.00 0.00 0.00 0.00 17 Freq. 2 0 1 0 0 2 Row % 66.67 0.00 33.33 0.00 0.00 18 Freq. 4 4 1 0 2 11 Row % 36.36 36.36 9.09 0.00 18.18 19 Freq. 2 7 0 2 o 5 Row % 40.00 20.00 0.00 40.00 0.00 20 Freq. 2 0 1 3 0 6 Row % 33.33 0.00 16.67 50.00 0.00 21 Freq. l 1 4 0 0 6 Row % 16.67 16.67 66.67 0.00 0.00 22 Freq. 0 6 3 3 6 18 Row % 0.00 33.33 16.67 16.67 33.33 23 Freq. 1 1 2 3 4 11 Row % 9.09 9.09 18.18 27.27 36.36 24 Freq. 1 0 4 1 l 7 Row % 14.29 0.00 57.14 14.29 14.29 25 Freq. 0 4 0 0 2 6 Row % 0.00 66.67 0.00 0.00 33.33 26 Freq. 0 3 3 l 1 8 Row % 0.00 37.50 37.50 12.50 12.50 27 Freq. 0 0 2 1 3 6 Row % 0.00 0.00 33.33 16.67 50.00 28 Freq. 0 2 4 0 4 10 Row % 0.00 20.00 40.00 0.00 40.00 29 Freq. 0 0 2 4 2 8 Row % 0.00 0.00 25.00 50.00 25.00 175 Task Values (N=153) Total 1 30 Freq. 3 4 4 4 1 16 Row % 18.75 25.00 25.00 25.00 6.25 31 Freq. 2 6 8 2 4 22 Row % 9.09 27.27 36.36 9.09 18.18 32 Freq. 0 1 2 l 1 5 Row % 0.00 20.00 40.00 20.00 20.00 33 Freq. O 2 1 2 1 6 Row % 0.00 33.33 16.67 33.33 16.67 34 Freq. 0 l 0 0 0 1 Row % 0.00 100.00 0.00 0.00 0.00 35 Freq.- 0 l 0 0 0 1 Row % 0.00 100.00 0.00 0.00 0.00 36 Freq. 0 2 1 0 0 3 Row % 0.00 66.67 33.33 0.00 0.00 37 Freq. 2 4 7 1 4 18 Row % 11.11 22.22 38.89 5.56 22.22 38 Freq. 0 3 5 1 2 11 Row % 0.00 27.27 45.45 9.09 18.18 39 Freq. 
l 2 4 3 2 12 Row % 8.33 16.67 33.33 25.00 16.67 40 Freq. 0 1 4 1 l 6 Row % 0.00 14.29 57.14 14.29 14.29 41 Freq. 0 1 4 2 0 7 Row % 0.00 14.29 57.14 28.57 0.00 42 Freq. 0 0 3 2 l 6 Row % 0.00 0.00 50.00 33.33 16.67 43 Freq. 0 2 4 0 0 6 Row % 0.00 33.33 66.67 0.00 0.00 44 Freq. l 5 2 3 2 13 Row % 7.69 38.46 15.38 23.08 15.38 176 Task Values (N=153) Total 1 2 3 4 5 45 Freq. 0 4 5 2 2 13 Row % 0.00 30.77 38.46 15.38 15.38 46 Freq. l 3 2 4 5 15 Row % 6.67 20.00 13.33 26.67 33.33 47 Freq. 0 3 2 4 3 12 Row % 0.00 25.00 16.67 33.33 25.00 48 Freq. 3 2 2 2 6 15 Row % 20.00 13.33 13.33 13.33 40.00 49 Freq. 2 2 3 4 5 16 Row % 12.50 12.50 18.75 25.00 31.25 50 Freq. l 2 3 3 4 13 Row % 7.69 15.38 23.08 23.08 30.77 51 Freq. 0 1 4 l 2 8 Row % 0.00 12.50 50.00 12.50 25.00 52 Freq. 0 3 5 4 5 17 Row % 0.00 17.65 29.41 23.53 29.41 53 Freq. 2 l 4 7 7 21 Row % 9.52 4.76 19.05 33.33 33.33 54 Freq. 1 2 3 l 11 18 Row % 5.56 11.11 16.67 5.56 61.11 55 Freq. 3 1 2 8 6 20 Row % 15.00 5.00 10.00 40.00 30.00 56 Freq. 0 l 3 2 2 8 Row % 0.00 12.50 37.50 25.00 25.00 57 Freq. 0 1 2 0 2 5 Row % 0.00 20.00 40.00 0.00 40.00 60 Freq. 0 0 1 0 0 1 Row % 0.00 0.00 100.00 0.00 0.00 61 Freq. 0 0 3 0 0 3 Row % 0.00 0.00 100.00 0.00 0.00 177 Task Values (N=153) Total 1 2 3 4 5 62 Freq. 0 1 5 0 0 6 Row % 0.00 16.67 83.33 0.00 0.00 63 Freq. l 0 2 0 0 3 Row % 33.33 0.00 66.67 0.00 0.00 64 Freq. 2 0 l 3 2 8 Row % 25.00 0.00 12.50 37.50 25.00 65 Freq. l 0 1 2 l 5 Row % 20.00 0.00 20.00 40.00 20.00 66 Freq. 2 3 1 1 0 7 Row % 28.57 42.86 14.29 14.29 0.00 67 Freq. 2 2 4 0 0 8 Row % 25.00 25.00 50.00 0.00 0.00 68 Freq. l 0 2 0 0 3 Row % 33.33 0.00 66.67 0.00 0.00 69 Freq. 1 0 1 3 3 8 Row % 12.50 0.00 12.50 37.50 37.50 70 Freq. 0 1 l 0 l 3 Row % 0.00 33.33 33.33 0.00 33.33 71 Freq. 0 2 l l 0 4 Row % 0.00 50.00 25.00 25.00 0.00 72 Freq. 1 l 1 4 0 7 Row % 14.29 14.29 14.29 57.14 0.00 73 Freq. 0 0 1 0 0 1 Row % 0.00 0.00 100.00 0.00 0.00 74 Freq. 0 0 l 0 0 1 Row % 0.00 0.00 100.00 0.00 0.00 75 Freq. 
0 2 4 0 1 7 Row % 0.00 28.57 57.14 0.00 14.29 76 Freq. 0 1 4 l l 7 Row % 0.00 14.29 57.14 14.29 14.29 178 Task Values (N=153) Total 3 77 Freq. 2 0 1 l 4 8 Row % 25.00 0.00 12.50 12.50 50.00 78 Freq. l 0 2 4 0 7 Row % 14.29 0.00 28.57 57.14 0.00 79 Freq. l 1 0 3 7 12 Row % 8.33 8.33 0.00 25.00 58.33 80 Freq. 0 2 l 2 3 8 Row % 0.00 25.00 12.50 25.00 37.50 81 Freq. 1 2 3 0 0 5 Row % 16.67 33.33 50.00 0.00 0.00 82 Freq. l 0 0 0 0 1 Row % 100.00 0.00 0.00 0.00 0.00 83 Freq. 0 1 1 1 0 3 Row % 0.00 33.33 33.33 33.33 0.00 84 Freq. 0 1 1 1 0 3 Row % 0.00 33.33 33.33 33.33 0.00 85 Freq. 0 l l 1 0 3 Row % 0.00 33.33 33.33 33.33 0.00 86 Freq. 0 l 1 l 0 3 Row % 0.00 33.33 33.33 33.33 0.00 87 Freq. 0 l 1 1 0 3 Row % 0.00 33.33 33.33 33.33 0.00 88 Freq. 0 1 0 0 0 1 Row % 0.00 100.00 0.00 0.00 0.00 89 Freq. l l l 1 0 4 Row % 25.00 25.00 25.00 25.00 0.00 90 Freq. 2 l 0 1 7 11 Row % 18.18 9.09 0.00 9.09 63.64 91 Freq. 4 l 6 5 3 19 Row % 21.05 5.26 31.58 26.32 15.79 179 Task Values (N=153) Total 1 2 3 4 5 92 Freq. 6 3 6 6 4 25 Row % 24.00 12.00 24.00 24.00 16.00 93 Freq. 4 2 5 3 3 17 Row % 23.53 11.76 29.41 17.65 17.65 94 Freq. 5 4 6 4 l 20 Row % 25.00 20.00 30.00 20.00 5.00 95 Freq. 5 3 5 1 4 18 Row % 27.78 16.67 27.78 5.56 22.22 96 Freq. 3 0 4 3 2 12 Row % 25.00 0.00 33.33 25.00 16.67 97 Freq. 6 1 3 0 0 10 Row % 60.00 10.00 30.00 0.00 0.00 98 Freq. 4 0 6 3 l 14 Row % 28.57 0.00 42.86 21.43 7.14 99 Freq. 4 2 5 1 7 19 Row % 21.05 10.53 26.32 5.26 36.84 100 Freq. 3 1 1 4 1 10 Row % 30.00 10.00 10.00 40.00 10.00 101 Freq. 2 3 4 3 2 14 Row % 14.29 21.43 28.57 21.43 14.29 102 Freq. 2 4 4 3 2 15 Row % 13.33 26.67 26.67 20.00 13.33 103 Freq. 0 1 0 0 0 1 Row % 0.00 100.00 0.0 0.00 0.00 104 Freq. 0 1 0 l 0 2 Row % 0.00 50.00 0.00 50.00 0.00 105 Freq. 2 2 2 4 1 11 Row% 18.18 18.18 18.18 36.36 9.09 106 Freq. l 3 8 3 0 15 Row % 6.67 20.00 53.33 20.00 0.00 180 Task Values (N=153) Total 1 2 3 4 5 107 Freq. l 4 l 2 0 8 Row % 12.50 50.00 12.50 25.00 0.00 108 Freq. 
3 1 4 7 4 19 Row % 15.79 5.26 21.05 36.84 21.05 109 Freq. 3 6 4 7 11 31 Row % 9.68 19.35 12.90 22.58 35.48 110 Freq. 4 l 0 4 12 21 Row % 19.05 4.76 0.00 19.05 57.14 m Freq. 5 5 2 2 4 18 Row % 27.78 27.78 11.11 11.11 22.22 112 Freq. 2 l 4 4 l 12 Row % 16.67 8.33 33.33 33.33 8.33 113 Freq. 4 2 5 5 4 20 Row % 20.00 10.00 25.00 25.00 20.00 114 Freq. 4 4 7 9 5 29 Row % 13.79 13.79 24.14 31.03 17.24 115 Freq. 4 l 2 4 0 11 Row % 36.36 9.09 18.18 36.36 0.00 116 Freq. 1 l 2 1 2 7 Row % 14.29 14.29 28.57 14.29 28.57 117 Freq. 5 3 3 6 0 17 Row % 29.41 17.65 17.65 35.29 0.00 118 Freq. 3 4 2 5 0 14 Row % 21.43 28.57 14.29 35.71 0.00 119 Freq. 0 1 2 2 0 5 Row % 0.00 20.00 40.00 40.00 0.00 120 Freq. l 4 0 4 3 12 Row % 8.33 33.33 0.00 33.33 25.00 121 Freq. 6 3 1 3 3 16 Row % 37.50 18.75 6.25 18.75 18.75 181 Task Values (N=153) Total 1 2 3 4 5 122 Freq. 4 3 3 5 6 21 Row % 19.05 14.29 14.29 23.81 28.57 123 Freq. 3 4 0 0 2 9 Row % 33.33 ,44.44 0.00 0.00 22.22 124 Freq. 1 4 6 6 5 22 Row % 4.55 18.18 27.27 27.27 22.73 125 Freq. 2 6 6 7 5 26 Row % 7.69 23.08 23.08 26.92 19.23 126 Freq. 3 5 5 1 3 17 Row % 17.65 29.41 29.41 5.88 17.65 127 Freq. 3 5 6 3 1 18 Row % 16.67 27.78 33.33 16.67 5.56 128 Freq. 3 6 3 3 4 19 Row % 15.79 31.58 15.79 15.79 21.05 129 Freq. 1 0 0 0 0 1 Row % 100.00 0.00 0.00 0.00 0.00 130 Freq. 2 4 2 2 8 18 Row % 11.11 22.22 11.11 11.11 44.44 131 Freq. 3 0 1 1 2 7 Row % 42.86 0.00 14.29 14.29 28.57 132 Freq. 2 2 1 -2 0 7 Row % 28.57 28.57 14.29 28.57 0.00 133 Freq. 6 2 2 1 2 13 Row % 46.15 15.38 15.38 7.69 15.38 134 Freq. 1 6 1 4 11 23 Row % 4.35 26.09 4.35 17.39 47.83 135 Freq. 3 3 0 0 7 13 Row % 23.08 23.08 0.00 0.00 53.85 136 Freq. 4 4 3 0 7 18 Row % 22.22 22.22 16.67 0.00 38.89 182 Task Values (N=153) Total 1 2 3 4 5 137 Freq. 2 5 8 3 ll 29 Row % 6.90 17.24 27.59 10.34 37.93 138 Freq. 2 2 4 0 5 13 Row % 15.38 15.38 30.77 0.00 38.46 139 Freq. 2 6 5 1 3 17 Row % 11.76 35.29 29.41 5.88 17.65 140 Freq. 3 4 2 1 2 12 Row % 25.00 33.33 16.67 8.33 16.67 141 Freq. 
3 6 2 0 5 16
Row % 18.75 37.50 12.50 0.00 31.25
142 Freq. 2 2 0 0 5 9
Row % 22.22 22.22 0.00 0.00 55.56
143 Freq. 2 4 0 2 5 13
Row % 15.38 30.77 0.00 15.38 38.46
144 Freq. 2 4 2 2 4 14
Row % 14.29 28.57 14.29 14.29 28.57
145 Freq. 0 3 5 0 6 14
Row % 0.00 21.43 35.71 0.00 42.86
146 Freq. 1 5 5 5 5 21
Row % 4.76 23.81 23.81 23.81 23.81
147 Freq. 1 6 7 8 4 26
Row % 3.85 23.08 26.92 30.77 15.38
148 Freq. 2 5 4 1 7 19
Row % 10.53 26.32 21.05 5.26 36.84
149 Freq. 2 1 1 5 1 10
Row % 20.00 10.00 10.00 50.00 10.00
150 Freq. 0 1 0 1 1 3
Row % 0.00 33.33 0.00 33.33 33.33
151 Freq. 2 1 3 2 2 10
Row % 20.00 10.00 30.00 20.00 20.00

183

152 Freq. 2 7 3 2 7 21
Row % 9.52 33.33 14.29 9.52 33.33
153 Freq. 0 5 6 3 9 23
Row % 0.00 21.74 26.09 13.04 39.13
154 Freq. 4 2 1 5 3 15
Row % 26.67 13.33 6.67 33.33 20.00
155 Freq. 4 4 1 0 4 13
Row % 30.77 30.77 7.69 0.00 30.77

Total 279 363 427 323 378 1,770
Frequency missing = 1

Values key: 1 = Minor part of job; 2 = Somewhat important; 3 = Important; 4 = Very important; 5 = Extremely important

APPENDIX J
TABLE OF TASKS BY EMPLOYEE TYPE, TASK INVENTORY DATA SUMMARY--SIGNIFICANCE VALUES, AND TABLE OF TASKS BY LENGTH OF EMPLOYMENT: ELECTRONICS MECHANIC

184

TASK INVENTORY DATA SUMMARY (Page 1) -- ELECTRONICS
Columns: Task | Worker (number of responses, mean, standard deviation) | Supervisor (n = 19) (number of responses, mean, standard deviation)
[The data rows on page 1 are illegible in the source.]
[Pages 2 and 3 of the TASK INVENTORY DATA SUMMARY are illegible in the source.]
TASK INVENTORY DATA SUMMARY (Page 4) -- ELECTRONICS

Task | Worker: # Responses, Mean, SD | Supervisor: # Responses, Mean, SD
136 | 9  3.67  1.66 | 9  2.56  1.59
137 | 16  3.81  1.22 | 13  3.23  1.48
138 | 6  3.83  1.33 | 7  2.86  1.68
139 | 8  3.25  1.28 | 9  2.44  1.24
140 | 5  3.60  1.52 | 7  1.86  0.90
141 | 9  3.22  1.39 | 7  2.43  1.81
142 | 4  4.25  1.50 | 5  2.80  2.05
143 | 6  4.17  1.17 | 7  2.57  1.72
144 | 7  3.86  1.46 | 7  2.43  1.27
145 | 5  5.00  0.00 | 9  2.89  0.93
146 | 9  3.67  1.32 | 12  3.17  1.19
147 | 13  3.46  1.05 | 13  3.15  1.21
148 | 8  4.25  1.16 | 11  2.64  1.36
149 | 5  3.80  0.45 | 5  2.60  1.82
150 | 2  4.50  0.71 | 1  2.00  --
151 | 3  4.33  0.58 | 7  2.57  1.40
152 | 11  3.64  1.43 | 10  2.80  1.48
153 | 11  4.09  1.38 | 12  3.33  0.98
154 | 7  4.00  1.41 | 8  2.25  1.28
155 | 6  3.17  2.04 | 7  2.28  1.38
Total | 916 | 815

188

Tasks Selected on the Basis of Length of Time Employed in Present Job

Total Task | 1 Year or Less | 3 Years or More | 1 to 3 Years | Total Responses
153 | 325 | 1,026 | 420 | 1,771

APPENDIX K
TABLE OF TASKS BY STATE: ELECTRONICS MECHANIC

189

Task Selection Responses by State

Total Task | Michigan | Georgia | Total Responses
153 | 932 | 839 | 1,771

APPENDIX L
TABLE OF TASKS BY TYPE OF BUSINESS/INDUSTRY AND TABLE OF TASKS BY SIZE OF BUSINESS: ELECTRONICS MECHANIC

190

Task Selection Responses by Type of Business/Industry

Total Task | 6 | 7 | 9 | 10 | 11 | 13 | Total Responses
153 | 532 | 108 | 110 | 83 | 393 | 545 | 1,771

Key to Type of Business/Industry:
6 = Computer repair
7 = Computer/office equipment, manufacturing
9 = Communication equipment, manufacturing
10 = Electronic components, manufacturing
11 = Electronic/electrical repair
13 = Other

191

Task Selection Responses by Size of Business

Total Task | 100 or Fewer Employees | 100 or More Employees | Total Responses
153 | 1,248 | 523 | 1,771

BIBLIOGRAPHY

Adams, D., & Fuchs, M. (1986).
Educational computing: Issues, trends and a practical guide. Springfield, IL: Charles C. Thomas.

Aldag, R. J., & Brief, A. P. (1979). Task design and employee motivation. Glenview, IL: Scott, Foresman.

Alfthan, T. (1985). Developing skills for technological change: Some policy issues. International Labor Review, 124(5), 517-529.

Allen, C. R. (1919). The instructor, the man and the job. Philadelphia: Lippincott.

Ammerman, H. L. (1977a). Identifying relevant job performance content (Vol. 3). Columbus: The Ohio State University, Center for Vocational Education.

Ammerman, H. L. (1977b). Stating the tasks of the job (Vol. 2). Columbus: The Ohio State University, Center for Vocational Education.

Ammerman, H. L., & Essex, D. W. (1977). Deriving performance requirements for training (Vol. 4). Columbus: The Ohio State University, Center for Vocational Education.

Ammerman, H. L., Essex, D. W., & Pratzner, F. C. (1974). Rating the job significance of technical concepts. Columbus: The Ohio State University, Center for Vocational Education.

Ammerman, H. L., & Pratzner, F. C. (1977). Performance content for job training (Vol. 1). Columbus: The Ohio State University, Center for Vocational Education.

Anastasi, A. (1976). Psychological testing (4th ed.). New York: Macmillan.

Arnold, H. J. (1985). Task performance, perceived competence, and attributed causes of performance as determinants of intrinsic motivation. Academy of Management Journal, 28, 876-888.

192
193

Barton, P. (1982). Worklife transitions. In The ordeal of change (pp. 73-91) (The National Institute for Work and Learning). New York: McGraw-Hill.

Beckendorf, L. (1988, February). Representative from the Michigan Department of Labor. Interview.

Bemis, S. E., Belensky, A. H., & Soder, D. A. (1983). Job analysis: An effective management tool. Washington, DC: Bureau of National Affairs.

Bibnotes. (1988, Spring). Application of the Michigan Occupational Data Analysis System (MODAS): Results of a national study.
Lansing: Michigan State University, Michigan Career and Vocational Education Resource Center.

Bluhm, H. P. (1987). Administrative uses of computers in the schools. Englewood Cliffs, NJ: Prentice-Hall.

Boone, E. J. (1985). Developing programs in adult education. Englewood Cliffs, NJ: Prentice-Hall.

Borg, W. R., & Gall, M. D. (1983). Educational research: An introduction (4th ed.). New York: Longman.

Boyle, P. G. (1981). Planning better programs: Adult Education Association professional development series. New York: McGraw-Hill.

Bridgewood, A. (1987). Technical and vocational education: A comparison between institutes in Britain and abroad. Educational Research, 22(3), 63.

Budke, H., & Charles, S. (1986, August). Resources for updating a curriculum. Vocational Education Journal, 31-32.

Burt, S. M. (1967). Industry and vocational education. New York: McGraw-Hill.

Butler, F. C. (1972). Instructional systems development for vocational and technical training. Englewood Cliffs, NJ: Educational Technology Publications.

Calhoun, C. C., & Finch, A. V. (1982). Vocational education: Concepts and operations (2nd ed.). Belmont, CA: Wadsworth.

Campbell, C. P. (1987). Instructional systems development: A methodology for vocational-technical training. Journal of European Industrial Training (United Kingdom), 11(5), 1-42.

194

Caplan, L. (1983). Knowledge and skills for American workers. Paper prepared for a Northwest-Midwest House-Senate Coalition and the Northeast-Midwest Institute Conference.

Chin, R. (1976). The utility of system models and developmental models for practitioners. In W. G. Bennis, K. D. Benne, R. Chin, & K. Corey (Eds.), The planning of change (3rd ed., pp. 90-102). New York: Holt, Rinehart & Winston.

Christal, R. E. (1971). Stability of consolidated job descriptions based on task inventory survey information. San Antonio, TX: Lackland Air Force Base, Air Force Human Resources Laboratory.

Clark, D. M. (1983). Displaced workers: A challenge for vocational education.
Columbus: The Ohio State University, National Center for Research in Vocational Education.

Clark, S. (1988, May). Faculty representative of the Michigan Occupational Criterion-Referenced Testing Project, Department of Agricultural and Extension Education, Michigan State University. Interview.

Coates, J. F. (1984). The changing nature of work and workers. In 1984 yearbook of the American Vocational Association (pp. 3-13). Arlington, VA: American Vocational Association.

Counts, G. S. (1961). The impact of technological change. In W. G. Bennis, K. D. Benne, & R. Chin (Eds.), The planning of change (pp. 20-28). New York: Holt, Rinehart & Winston.

Cronbach, L. J. (1972). Validation of educational measures. In G. H. Bracht, K. D. Hopkins, & J. C. Stanley (Eds.), Perspectives in education and psychological measurement (pp. 88-102). Englewood Cliffs, NJ: Prentice-Hall.

Cross, K. P. (1984). Adult learning: State policies and institutional practices. Washington, DC: Association for the Study of Higher Education.

Culpepper, C. E. (1987). MODAS program improvement study. Report prepared through the Occupational Research and Assessment Projects, Department of Agricultural and Extension Education, Michigan State University. Lansing: Michigan Department of Education, Vocational-Technical Education Division.

Davies, I. (1981, March). Task analysis for reliable human performance. Performance and Instruction, 29, 8-10.

Dewey, J. (1938). Experience and education. London: Collier-Macmillan.

195

Dippo, D. (1988). Occupational analysis and work education. Journal of Educational Thought, 22(2A), 187-197.

Dodd, W. E., Wollowick, H. B., & McNamara, W. J. (1970). Task difficulty as a moderator of long-range prediction. Journal of Applied Psychology, 54(3), 265-270.

Doran, G. R. (1983). Job analysis as a data base for selection and training of custodial/maintenance employees. Unpublished doctoral dissertation, Michigan State University.

Drucker, P. F. (1986). The frontiers of management. New York: Truman Talley Books, E. P. Dutton.

Feuer, D. (1987, December). The skill gap: America's crisis of competence. Training, 27-35.

Finch, C. R., & Crunkilton, J. R. (1979). Curriculum development in vocational and technical education. Boston: Allyn & Bacon.

Frugoli, P. L. (1983, May). An introduction to using an occupational information system: A reference for program planning. Washington, DC: National Occupational Coordinating Committee.

Fryklund, V. C. (1965). Analysis technique for instructors. Milwaukee, WI: Bruce Publishing.

Gael, S. (1983). Job analysis: A guide to assessing work activities. San Francisco: Jossey-Bass.

Ghorpade, J. (1988). Job analysis: A handbook for the human resource director. Englewood Cliffs, NJ: Prentice-Hall.

Goetsch, D. (1985). Impact of technology on curriculum and delivery strategies in vocational education. In Adults and the changing workplace: 1985 yearbook of the American Vocational Association (pp. 191-200). Arlington, VA: American Vocational Association.

Gouldner, A. W. (1961). Theoretical requirements of the applied social sciences. In W. G. Bennis, K. D. Benne, & R. Chin (Eds.), The planning of change. New York: Holt, Rinehart & Winston.

Guttman, R. (1983). Job Training Partnership Act: New help for the unemployed. Monthly Labor Review, 106(3), 3-10.

Hackman, J. R., & Lawler, E. E. III. (1971). Employee reactions to job characteristics. Journal of Applied Psychology, 55(3), 259-285.

196

Hackman, J. R., & Oldham, G. R. (1980). Work redesign. Reading, MA: Addison-Wesley.

Hamilton, S. F. (1986). Excellence and the transition from school to work. Phi Delta Kappan, 55(4), 239-242.

Hannafin, M., Dalton, D. W., & Hooper, S. R. (1987). Computers in education: Barriers and solutions. In E. E. Miler & M. L. Mosley (Eds.), Educational media and technology yearbook (pp. 15-20). Littleton, CO: Libraries Unlimited.

Help wanted. (1987, August 10). Business Week, pp. 48-53.

Heneman, H. G. III, Schwab, D. P., Fossum, J. A., & Dyer, L.
New York: Truman Talley Books, E. P. Dutton.
Feuer, D. (1987, December). The skill gap: America's crisis of competence. Training, 27-35.
Finch, C. R., & Crunkilton, J. R. (1979). Curriculum development in vocational and technical education. Boston: Allyn & Bacon.
Frugoli, P. L. (1983, May). An introduction to using an occupational information system: A reference for program planning. Washington, DC: National Occupational Coordinating Committee.
Fryklund, V. C. (1965). Analysis technique for instructors. Milwaukee, WI: Bruce Publishing.
Gael, S. (1983). Job analysis: A guide to assessing work activities. San Francisco: Jossey-Bass.
Ghorpade, J. (1988). Job analysis: A handbook for the human resource director. Englewood Cliffs, NJ: Prentice-Hall.
Goetsch, D. (1985). Impact of technology on curriculum and delivery strategies in vocational education. In Adults and the changing workplace: 1985 yearbook of the American Vocational Association (pp. 191-200). Arlington, VA: American Vocational Association.
Gouldner, A. W. (1961). Theoretical requirements of the applied social sciences. In W. G. Bennis, K. D. Benne, & R. Chin (Eds.), The planning of change. New York: Holt, Rinehart & Winston.
Guttman, R. (1983). Job Training Partnership Act: New help for the unemployed. Monthly Labor Review, 106(3), 3-10.
Hackman, J. R., & Lawler, E. E. III. (1971). Employee reactions to job characteristics. Journal of Applied Psychology, 55(3), 259-286.
Hackman, J. R., & Oldham, G. R. (1980). Work redesign. Reading, MA: Addison-Wesley.
Hamilton, S. F. (1986). Excellence and the transition from school to work. Phi Delta Kappan, 55(4), 239-242.
Hannafin, M., Dalton, D. W., & Hooper, S. R. (1987). Computers in education: Barriers and solutions. In E. E. Miller & M. L. Mosley (Eds.), Educational media and technology yearbook (pp. 15-20). Littleton, CO: Libraries Unlimited.
Help wanted. (1987, August 10). Business Week, pp. 48-53.
Heneman, H. G. III, Schwab, D. P., Fossum, J. A., & Dyer, L.
D. (1980). Personnel/human resource management. Homewood, IL: Richard D. Irwin.
Kanin-Lovers, J. (1986). Choosing the best method for job analysis. Journal of Compensation and Benefits, 1(4), 231-235.
Kosidlak, J. C. (1987, March). DACUM: An alternative job analysis tool. Personnel, 64, 14-21.
Kuder, F. (1972). Some principles of interest measurement. In G. H. Bracht, K. D. Hopkins, & J. C. Stanley (Eds.), Perspectives in education and psychological measurement (pp. 265-283). Englewood Cliffs, NJ: Prentice-Hall.
Lee, J., & Mendoza, J. L. (1981). A comparison of techniques which test for job differences. Personnel Psychology, 34, 731-748.
Lester, J. N., & Frugoli, P. (1989). Career and occupational information: Current needs, future directions. In D. Brown & C. W. Minor (Eds.), Working in America: A status report on planning and problems (pp. 60-81). Washington, DC: National Career Development Association.
Levine, E. L., Ash, R., Hall, A., & Bennett, N. (1980). Exploratory comparative study of four job analysis methods. Journal of Applied Psychology, 65(5), 524-535.
Levine, E. L., Ash, R. A., Hall, H., & Sistrunk, F. (1983). Evaluation of job analysis methods by experienced job analysts. Academy of Management Journal, 26(2), 339-348.
Lewis, M. V. (1989). Coordination of public vocational education with the Job Training Partnership Act. Economics of Education Review, 8(1), 75-81.
Mann, P. M. (1978). Adaptability of locally based task analysis to curricula for regions. Unpublished doctoral dissertation, Michigan State University.
Maurer, S. D., & Fay, C. H. (1986). Legally fair hiring practices for small business. Journal of Small Business Management, 21(1), 47-54.
McCage, R. (1986, September). Using the V-TECS data bases. Vocational Education Journal, 39-41.
McCage, R. (Ed.). (1987). V-TECS product development technical reference handbook (4th ed.). Georgia: Commission on Occupational Education Institutions, Southern Association of Colleges and Schools.
McCage, R., & Olson, C. (1986). Using the computer to develop curricula more effectively. A position paper. Georgia: Vocational-Technical Education Consortium of States.
McCormick, E. J. (1979). Job analysis: Methods and applications. New York: American Management Association.
McNeil, J. D. (1977). Curriculum: A comprehensive introduction. Boston: Little, Brown.
Mead, M. A., Essex, D. W., & Ammerman, H. L. (1977). Processing survey data: Technical appendices (Vol. 5). Columbus: The Ohio State University, Center for Vocational Education.
Meaders, D. O. (1989, August). Member of the Agriculture Education and Extension faculty at Michigan State University. Interview.
Mehrens, W. A., & Lehmann, I. J. (1978). Measurement and evaluation in education and psychology (2nd ed.). New York: Holt, Rinehart & Winston.
Michigan Occupational Information Coordinating Committee. (1989a). Jobs by industry: An occupation guide for self-directed job search programs. Lansing: Department of Labor, Michigan Occupational Information Coordinating Committee.
Michigan Occupational Information Coordinating Committee. (1989b). Standard industrial classification--Yellow Pages crosswalk. Lansing: Department of Labor, Michigan Occupational Information Coordinating Committee.
Michigan Occupational Information Coordinating Committee. (1989c). Occupational projections and training information for Michigan: Users manual. Lansing: Department of Labor, Michigan Occupational Information Coordinating Committee.
Michigan Occupational Information Coordinating Committee. (In press). Decision making for occupational training needs: A guide to the data applications of the OPTIM system. Lansing: Department of Labor, Michigan Occupational Information Coordinating Committee.
Morton, M. R., & Cross, R. E. (1985, May). Using the law to improve vocational education. Journal of the American Vocational Association, 29-30.
Morsh, J. E., & Archer, W. B. (1967).
Procedural guide for conducting occupational surveys in the United States Air Force. San Antonio: Lackland Air Force Base, Personnel Research Laboratory, Aerospace Medical Division--Air Force Systems Command.
Mussio, S. J., & Smith, M. K. (1973). Content validity: A procedural manual. Chicago: International Personnel Management Association.
Naisbitt, J. (1982). Megatrends: Ten new directions transforming our lives. New York: Warner Books.
Naisbitt, J., & Aburdene, P. (1985). Re-inventing the corporation. New York: Warner Books.
National Occupational Information Coordinating Committee. (1982). Vocational preparation and occupations: Educational and occupational code crosswalk (Vol. 1). Washington, DC: U.S. Government Printing Office.
National Occupational Information Coordinating Committee. (1987). OES survey dictionary. Des Moines: Iowa State Occupational Information Coordinating Committee, National Crosswalk Service Center.
Oborne, D. J. (1985). Computers at work: A behavioral approach. New York: John Wiley & Sons.
Office of Management and Budget, Executive Office of the President. (1987). Standard industrial classification manual. Springfield, VA: National Technical Information Service.
Olson, C. (1987). Michigan occupational data analysis system. A position paper. East Lansing: Michigan State University, Department of Agricultural and Extension Education, Occupational Research and Assessment Projects.
Pratzner, F. C., & Ashley, W. L. (1985). Occupational adaptability and transferable skills: Preparing today's adults for tomorrow's careers. In Adults and the changing workplace: 1985 yearbook of the American Vocational Association (pp. 13-22). Arlington, VA: American Vocational Association.
Protheroe, N., Carroll, O., & Zoetis, T. (1982). Report: School district use of computer technology. Arlington, VA: Educational Research Service.
Ratzlaff, L. A. (Ed.). (1987). The education evaluator's workbook: How to assess education programs (Vol. 1).
Alexandria, VA: Capitol Publications, Education Research Group.
Reece, B. L., & Brandt, R. (1984). Effective human relations in organizations. Boston: Houghton Mifflin.
Robertson, I. T., & Smith, M. (1985). Motivation and job design: Theory, research and practice. London: Institute of Personnel Management.
Rosinger, G., Myers, L. B., Loar, M., Mohrman, S. A., Stock, J. R., & Levy, G. (1982). Development of a behaviorally based performance appraisal system. Personnel Psychology, 35(1), 75-88.
Rumberger, R. W. (1984). The growing imbalance between education and work. Phi Delta Kappan, 65(4), 342-346.
Sax, G. (1989). Principles of educational and psychological measurement and evaluation (3rd ed.). Belmont, CA: Wadsworth.
Shears, A. E. (1985). Occupational analysis and training. Journal of European Industrial Training, 9(1), 23-27.
Sherer, R. C. (1989, January 25). Letter from the Director of the Michigan Occupational Information Coordinating Committee.
State of Michigan. (1988). Automated cross-referencing occupational system (ACROS). Lansing: State Administrative Board.
Steely, R. D. (1981). The program development process handbook. Prepared for Community College Services, Higher Education Management Services (Project No. 33CO-9152-43). Lansing: Michigan Department of Education.
Sum, A., Amico, L., & Harrington, P. (1986). Cracking the labor market for human resource planning. Washington, DC: National Governors' Association, Center for Policy Research and Analysis.
Thompson, D. E., & Thompson, T. A. (1982). Court standards for job analysis in test validation. Personnel Psychology, 35, 865-874.
Tyler, R. W. (Ed.). (1976). Prospects for research and development in education. Berkeley, CA: McCutchan.
Tyler, R. W. (1980). Curriculum planning in vocational education. In A. A. Cross (Ed.), Vocational instruction (pp. 57-66). Arlington, VA: American Vocational Association.
U.S. Congress. (1982). Job Training Partnership Act. Public Law 97-300.
Washington, DC: Government Printing Office.
U.S. Congress. (1984). The Carl Perkins Vocational Education Act. Public Law 98-524. Washington, DC: Government Printing Office.
U.S. Department of Commerce, Bureau of the Census. (1980). Standard occupational classification manual. Washington, DC: Government Printing Office.
U.S. Department of Commerce, Bureau of the Census. (1989). Current population profile of the United States (Special Studies Series P-23, No. 159). Washington, DC: Government Printing Office.
U.S. Department of Labor. (1986). Occupational projections and training data (Bureau of Labor Statistics Bulletin 2251). Washington, DC: Government Printing Office.
Venn, G., & Skutack, D. E. (1980). Expectations and standards for quality instruction. In A. A. Cross (Ed.), Vocational instruction (pp. 81-86). Arlington, VA: American Vocational Association.
Veres, J. G. III, Lahey, M. A., & Buckley, R. (1987). A practical rationale for using multi-method job analyses. Public Personnel Management, 16(2), 153-157.
Where the jobs are is where the skills aren't. (1988, September 17). Business Week, pp. 104-113.
Wilms, W. W. (1984). Vocational education and job success: The employer's view. Phi Delta Kappan, 65(5), 347-355.
Wolff, W. W., Jurado, E., Kinnison, J. F., & Sponner, K. (1985). Feasibility study of the Michigan Occupational Data Analysis System (MODAS). Report presented to the Michigan Department of Education. Wisconsin: Western Occupational Research Corporation.