This is to certify that the dissertation entitled

A Study to Identify Major Field Techniques and Utilization Levels by Canadian Instructional Developers

presented by

Thomas Lawrence Bennett

has been accepted towards fulfillment of the requirements for the Ph.D. degree in Educational Systems Development.

Major professor

Date: April 26, 1983

MSU is an Affirmative Action/Equal Opportunity Institution
A STUDY TO IDENTIFY MAJOR FIELD TECHNIQUES AND UTILIZATION LEVELS BY CANADIAN INSTRUCTIONAL DEVELOPERS

By

Thomas Lawrence Bennett

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

Educational Systems Development

1983

ABSTRACT

A STUDY TO IDENTIFY MAJOR FIELD TECHNIQUES AND UTILIZATION LEVELS BY CANADIAN INSTRUCTIONAL DEVELOPERS

by Thomas Lawrence Bennett

The study attempts to identify which field techniques are currently being used by Canadian instructional developers and to what extent they are being employed. It commences with a statement of the problem at hand and provides a discussion of the need for and originality of the work.

In the second chapter, the researcher illustrates the relevance of techniques to the field of Instructional Development by presenting an historical perspective of how educational change may be brought about by technology. Instructional Development is discussed as one means toward a technological, systematic approach to change, and the value of field techniques to the operationalizing of ID systems models is also recognized. Further, the usage of techniques from other disciplines, or the hybrid principle, is also discussed, while the chapter concludes with a citing of Gentry's 1980-81 study and how it serves as a prototype for the present work.

The third chapter presents the ten questions addressed by the study along with a discussion of the research population, the survey sample, the sampling procedures, the composition of the survey instrument, and an investigation of the data collection procedures. The chapter culminates in a presentation of the analysis and interpretation procedures for the raw data.

The fourth chapter offers an analysis of the data for each of the ten questions, and provides supporting tables.
In the fifth chapter, the researcher presents a summary of the findings and a set of conclusions drawn from the findings concerning the state of the art of Instructional Development in Canada. From these conclusions, the study provides seven (7) recommendations for future research and study in the field of ID, with special emphasis on the Canadian educational arena. Supported by the findings of the study, as well as a number of relevant caveats within the literature, the work concludes with a presentation of five (5) implications for the future of Instructional Development in Canada.

DEDICATION

To my wife Trish and to my children Drew and Kate, without whose love, trust, and continuing support .....

ACKNOWLEDGEMENTS

The English poet John Donne wrote, "No man is an Iland, intire of it selfe." The same may be said of research: no undertaking, no matter how ambitious or modest, is completed in isolation. The following were immeasurably supportive of the present work, and the researcher wishes to acknowledge them at this time:

Mr. & Mrs. T.M. Bennett, my parents, for the guiding light that they kindled so many years ago.

Mr. James Rintoul, Mr. Frank Ferguson, and Mr. Peter Cremasco, for the teaching role models that they provided.

The West Parry Sound Board of Education, the West Parry Sound Teachers Federation, and the Ontario Public School Teachers Federation, for their professional support.

Mr. G.A. Snider, Mr. H.A. Woodhouse, and Mrs. Linda Sherwood, for their personal support.

Dr. Richard Lewis, Ms. Sally Landerkin, and Mr. David MacDougall, of AMTEC, for the early support they provided in the organization and presentation of the survey instrument.

The Supportive Developers and Field Experts (Appendices B & D), who monitored and validated the study.

The members of the Association of Media and Technology in Education in Canada, the survey population, without whose cooperation this study would not have been possible.

Dr. Norman Geoff, Mrs. Maria Wong, and Mrs.
Nancy Snider, of McMaster University, for their invaluable assistance in the processing and interpretation of the raw data for this study.

Mrs. Donna Maddeford, my typist, for her personal encouragement and professional care throughout the study.

Dr. Norman Bell, Dr. Cass Gentry, Dr. M. Ali Issari, and Dr. James Page, my Dissertation Committee, for their professional expertise and selfless concern.

And special thanks to Dr. Castelle G. Gentry, Dissertation Director and Committee Chairman, who is always a reliable friend, as well as an inspiring teacher.

TABLE OF CONTENTS

TABLE OF CONTENTS
LIST OF TABLES
LIST OF FIGURES

Chapter

I. INTRODUCTION
   Statement of the Problem
   Need for the Study
   Basic Assumptions
   Limitations of the Study
   Definition of Terms
   Summary and Organization of the Study

II. REVIEW OF RELATED LITERATURE AND RESEARCH
   Change
   Educational Technology
   Instructional Development
   Relation of Technique to Instructional Development
   Existing Techniques

III. DESIGN OF THE STUDY
   Introduction
   Research Questions
   Research Population and Sample
   Instrumentation
   Data Collection Procedures
   Analysis of Data

IV. PRESENTATION AND ANALYSIS OF DATA

V. SUMMARY, CONCLUSIONS, RECOMMENDATIONS, AND IMPLICATIONS
   Summary of Findings
   Conclusions
   Recommendations for Further Research
   Implications

REFERENCES

APPENDICES
A. Original List of Techniques
B. Supportive Canadian Instructional Developers (Phone Survey: August, 1979)
C. Gentry's Management Framework Model
D. Panel of Field Experts (Addresses as per original mailings: October - December, 1979)
E. Request Letter to Field Experts
F. Field Experts' Validation Package
G. Techniques According to Ranking by Field Experts
H. Final Survey Instrument
I. Initial List of Techniques with ERIC Descriptors
J. Professional Journals & Cited Techniques
K. Field Expert Response Form (Matching Techniques with Gentry's Management Framework Model)

LIST OF TABLES

2.1 Gentry's Listing of Techniques by Management Framework Model Components
4.1 Level of Use
4.2 Competency Level
4.3 Value to Instructional Development
4.4 Degree to Which Institution Teaches
4.5 Pearson Correlation Between Use of Technique and User Teaching Experience
4.6 Anova Between Level of Technique Use and Present Job Responsibilities (Original Job Categories)
4.7 Anova Between Level of Technique Use and Present Job Responsibilities (Collapsed Job Categories)
4.8 Anova Between Level of Highest Education and Level of Use
4.9 Anova Between Level of Use and Level of Highest Education
4.10 Anova Between Level of Highest Education and Competency Level
4.11 Anova Between Level of Highest Education and Competency Level Mean Scores (ordered)
4.12 Anova Between Level of Highest Education and Degree to Which Institution Teaches Techniques
4.13 Anova Between Level of Highest Education and Degree to Which Institution Teaches Techniques Mean Scores (ordered)
4.14 Pearson Correlation Coefficients for Appraisal Interview
4.15 Listing of Management Framework Model Components by Techniques
4.16 Listing of Techniques by Management Framework Model Components
LIST OF FIGURES

3.1 Distribution Levels of Survey Population Responses
4.1 Collapsed Job Categories of Survey Population
4.2 Pearson Correlation Values
4.3 Field Expert Technique Matching Example

CHAPTER 1

INTRODUCTION

Chapter 1 states the problem, as well as an explanation of the need for and originality of the study. These are followed by a listing of basic assumptions and limitations of the study, and a set of definitions for major terms used in the work. Also included is a summary and a brief outline of the organization of the remaining chapters.

STATEMENT OF THE PROBLEM

Instructional development curricula in Canadian institutions have not been systematically designed, and are inconsistent in the ID techniques that they teach students. This research is designed to partly solve this problem by determining, from the field, what techniques (recognized by national field experts) are known and/or used by instructional developers. Based on these findings, recommendations can be made for the revision of ID programs, and the inclusion of relevant field techniques as suitable tactics for curricula implementation.

The research attempted to identify which techniques are currently employed in the field, and to what extent they were being used by Canadian instructional developers in the period 1980 to 1982. The survey population was made up of members of the Association of Media and Technology in Education in Canada (AMTEC), and although the actual make-up of this organization will be dealt with in greater detail in Chapter III, it is noteworthy that the organization is the Canadian equivalent of the Association for Educational Communications and Technology (AECT), and supported an enrollment of 558 members when the survey was conducted.
Specifically, a major question of this study asked, "How many of a selected set of recognized, valuable techniques were being taught in Canadian training programs for educational technologists or instructional developers?" Although a number of the techniques in the set selected for this study (Appendix A) are rooted in education in general and Instructional Development in particular, a significant number have been developed in other fields, such as Psychology, Communications, and business and industry. The question arises, "How many of these diverse techniques are being employed by developers in the present Canadian educational arena?" A second question asks, "Which techniques are considered to be of outstanding value by these developers?"

The responses to the above two questions should assist graduate instructional technology programs in determining which techniques are of greatest perceived importance in the field of Instructional Development in Canada, and concomitantly, which techniques should be taught at the graduate and undergraduate levels to preservice and inservice teachers.

NEED FOR THE STUDY

A major concern of the work was its uniqueness and relevance to the Canadian educational scene. How original was the topic of study? How useful was it to Canadian instructional technologists? In order to answer these questions, a search of the Educational Resources Information Center (ERIC) and Dissertation Abstracts was performed. ERIC descriptors were also correlated with State-of-the-Art Reviews and several field techniques that would be used in the study (e.g., Content Analysis, Critical Path Method, Force Field Analysis, Formative Evaluation, Summative Evaluation, Management by Objectives, and Critical Incidents). Of the eighteen entries resulting from the search, none dealt specifically with the questions of this study.
Only two citations surfaced as a result of dropping the Teacher Education and Curriculum Education descriptors from the search: "The Documentation of Instructional Development" (Educational Technology, June 1975, pp. 43-46), and "A Critical Review of the Instructional Technology Mechanism of Task Analysis" (Improving Human Performance, Summer 1974, pp. 64-70). Neither article relates directly to the need for the present study, although they are relevant to a survey of the literature. A second ERIC run with the addition of the descriptor "Canada" caused the number of citations to drop to zero. None of the abstracts addressed themselves to the specific needs of the present research work.

An informal telephone survey of five (5) Canadian leaders in the field of Instructional Development was conducted in order to acquire authoritative opinion about the need for this study. The developers (see Appendix B) reacted most positively and gave assurances that such research is unique to the Canadian scene, as far as they were aware, and would benefit Canadian instructional developers. Further, these leaders agreed to serve as expert validators of the initial survey of techniques (Appendix A) and to provide additional names of other key developers in Canada who might subsequently act as field experts for the validation of the survey instruments.

Therefore, after the above noted conferencing with Canadian field experts, and the Dissertation Abstracts search, it was concluded that little is known about the levels of knowledge and application of instructional development techniques among Canadian instructional technologists. Hence, this study is needed in order to provide educational planners and decision makers who are involved in training instructional technologists with hard data for determining which techniques are appropriate for inclusion in their academic curricula.
Based on this need, the study is designed to provide conclusions and recommendations in response to the following questions:

1. Which of the major techniques are currently taught in graduate schools and teacher training institutions?
2. Which of the major techniques should be taught to Canadian developers of instructional programs?
3. Which of the major techniques should be taught to students in teacher training institutions, according to the perceived value or relevancy of each technique, as viewed by the developers?

Another important consideration of the research was to match the major techniques, as determined by the study, with appropriate functions of an Instructional Development model. Gentry (1980) contends that instructional development system models are descriptive, while techniques used by instructional developers are prescriptive: "...... it is appropriate for ID system models to tell us what must be done". He further asserts that "many useful techniques have been designed to operationalize ID system models, but the techniques are scattered and difficult to find or assess". Hence, as a secondary goal, this study attempts to determine the validity of Gentry's contentions by determining the range of techniques known, or deemed valuable to ID, by the survey population.

BASIC ASSUMPTIONS

Underlying the study were the following basic assumptions:

1. The instrument will yield the data that the researcher is seeking.
2. All respondents will respond accurately.
3. The researcher will be able to match the major techniques as determined by this study with appropriate functions of an Instructional Development model, viz. Gentry's Management Framework Model (see Appendix C).
4. This study may inspire further research relating to instructional techniques employed in graduate studies and teacher training institutions in Canada.
LIMITATIONS OF THE STUDY

There are three specific limitations of this study:

1. The data accepted for analysis is limited to the responses of the surveyed population.
2. The study is limited to only those major techniques agreed upon by the panel of field experts (see Appendix D).
3. The results of the study are generalizable only to Instructional Developers in Canada.

DEFINITION OF TERMS

Educational Technology is "a complex, integrated process involving people, procedures, ideas, devices and organizations, for analyzing problems, and devising, implementing, evaluating and managing solutions to those problems, involved in all aspects of human learning. In educational technology, the solutions to problems take the form of all the Learning Resources that are designed and/or selected and/or utilized to bring about learning; they are identified as Messages, People, Materials, Devices, Techniques, and Settings." (A.E.C.T., 1977:164-5)

Instructional Development is a systematic approach to the Design, Production, Evaluation, and Utilization of complete systems of instruction, including all appropriate components and a Management Pattern for using them. Instructional Development functions are those which have as their purpose "analyzing problems and devising, implementing, and evaluating the Learning Resources/Instructional System Component solutions to these problems." (A.E.C.T., 1977:166)

Instructional Developer is a professional practitioner in the field of Instructional Development. In the present study, the term instructional developer is used interchangeably with the membership of the Association for Media and Technology in Education in Canada, who are the members of the survey population. However, it must be noted that not all respondents are trained instructional developers, although they may be involved in some phase of the I.D. process; some are administrators, librarians, technicians, etc.
Hence the researcher employed "Instructional Developer" as a generic term for the survey population, fully realizing that the actual identification of such would not be possible until the analysis of the data in Chapter IV.

Technique is a routine procedure or pre-cast mold for using Materials, Devices, Settings, and People to transmit Messages (A.E.C.T., 1977:169). It is specific, with well defined characteristics and processes which are learnable and hence transferable. It is appropriate at the Strategic or Policy Decision Level of the Instructional Development process, as well as at the Tactical or Operation Level. Finally, it must not exceed the constraints of the system in which it is operational: (a) time available, (b) skills available, (c) resources available, (d) client attitudes, and (e) physical limitations.

SUMMARY AND ORGANIZATION OF THE STUDY

This chapter has provided an outline of the problem under consideration, a declaration of the need for and originality of the work, the basic assumptions and limitations of the study, and a set of definitions of the major terms used in the study. With regard to subsequent chapters, a review of the literature pertinent to the study is presented in Chapter II, while Chapter III outlines the specific procedures involved in conducting the study. Chapter IV presents the findings of the study and the actual analysis of the data, and Chapter V concludes the presentation with a summary of major findings, conclusions, recommendations, and implications for future research.

CHAPTER II

REVIEW OF LITERATURE AND RELATED RESEARCH

It has been observed that the main end of education in any single period of history is that end which best reflects the needs of society at the time (Cole, 1960:618). Perhaps the dominant feature of global society today, as well as of our own North American culture, is that of planned change.
Zaltman and Duncan (1977:4) have focused on this issue in the field of communications, while major proponents of Educational Technology, including James Finn (1964:89), have pointed up the effects of technology on educational processes for at least two decades.

It is becoming increasingly evident that education is being affected by the unstable social and economic climates of society, coupled with financial restraints persistently troubling the field of Instructional Development (Selby, 1980:13). These conditions logically make important the use of techniques which are not only effective, but efficient in terms of time and resources. Thus, the present study is concerned with techniques being used in the marketplace which are viewed by the survey population as effective and efficient for Instructional Developers to use in order to bring about the desired changes in the learning systems of society.

In this chapter, several areas of the literature will be investigated. The change literature will be examined as it relates to our field and to educational organizations' traditional resistance to change. A second area of the literature to be reported on is technology in education and audiovisual education, including educational systems and instructional development. The latter will focus on tactics and strategies as they relate to field techniques.

CHANGE

The negative aspects of change are exemplified by John Steinbeck:

Don't look behind. Something might be gaining on you. (Steinbeck, 1961:164)

Change in our society is often characterized by two considerations. First, people only too often use the past as a measuring stick for the present and future; second, change is occurring so rapidly that traditional means for dealing with it are no longer adequate. As an example of the former, McLuhan observed that we are often guilty of trying to do today's job with the aid of yesterday's tools and concepts.
This key theme is repeated throughout his work The Medium is the Massage. He warns:

Our most impressive words and thoughts betray us - they refer us only to the past, not to the present. (McLuhan, 1967:63)

Once again, this theme is evident as he asserts the following:

We look at the present through a rear-view mirror. We march backward into the future. (McLuhan, 1967:75)

Although a colourful quote, McLuhan was not the first to express this sentiment with the "backward marching" metaphor. Muller espoused the identical philosophy fifteen years previously:

. . . we have the curious spectacle of civilized man forever marching with his face turned backward - as no doubt the cave-man looked back to the good old days when men were free to roam instead of being stuck in a damn hole in the ground. (Muller, 1952:65)

In the second consideration of the Steinbeck quote, change may be viewed as occurring rapidly. In many instances, it may be upon society before realization occurs. It is in this regard that Ellul (1963:19) warns of becoming enamoured of signposts mired in the past, and asserts that in the modern world, nostalgia has no survival value. Berlo (1975:3) concurs and states that old ground rules are obsolete in so many of the processes of our society. There can be little doubt that change in our society is occurring so rapidly that referents of the past are often useless. In a time of rapid change, the world will belong to "those who can grasp the nature of that change and fashion their life and culture to make the most of it" (Finn, 1964:5).

The educational arena is no exception. Change there too has been rapid, and pedagogues are faced with the dilemma of training students in the last quarter of the twentieth century with congruent tools and methodologies. Wittich and Schuller (1973:5-6) observed that great social as well as technological changes confront and affect teachers and their relationships with learners, and pointed out that
Wittich and Schuller (197315—6) observed that great social as well as technological changes confront and effect teachers and their relationships with learners, and pointed out that 13 as a result teachers are faced with three primary needs: 1) the need to keep up to date with current information and field practices, 2) the need to deal with individual student differences, and 3) the need to acquire and pract- ice the best available teaching techniques. In short, their contention is that teachers must prepare themselves to operate in their market place with tools best suited to this present era in education. However, educational literature illustrates that it is not always easy to induce educators in general and even instructional developers in particular to adopt new tools or techniques. The rate of change in the education- al arena is often affected by numerous barriers to change. Zaltman and Duncan (1977:66-88) provide an overview of literature on change barriers, concluding that there are three basic ones: cultural/social, psychological, and organizational barriers. Further investigations support the above noted findings as in the work of Foster (1962:75-76), who asserts that one of the major barriers to change stems from cultural values and beliefs; this is further supported by Lippitt, et.al. (1958:181) and Frye (1969:1-12). The rate of change is further affected by psycho- logical barriers. Caffrey (l965:l“) notes that most people are "heel-dragging resistors to change, suspicious of the new, and not very much interested in creating new things”. Watson and Glaser (1965:36) point out that innovations that are introduced to a system from an outside source are often 1“ received with half-hearted support, if not overt opposition. As such, psychological barriers go much "deeper" than resistance to change, which as Judson (1966:69) observes is only a symptom of more basic problems; often the under—» lying causes are found within the system itself. 
Hence, an awareness of organizational barriers is also necessary for any agent who is concerned with accelerating the rate of change. Zaltman and Duncan suggest that one of the most important sources of resistance is that change may be per- ceived as threatening to the power structure of the organ- ization, and argue that for change and innovation to succeed in an organization, it is important that the structure of the organization in terms of authority patterns, channels of communications, divisions of labor, rules and procedures, etc., be compatible or supportive of the change. (Zaltman & Duncan, 1977:76) The above opinion is further supported by Judson (1966:80), Woods (1967257), and Broom & Selznick (1968:3““) to name a few. Thus, the rate of change is significantly affected by cultural, psychological, and organizational barriers and educational growth can be effected only if relevant and innovative techniques are permitted to be implemented. In the words of novelist John Irving: . you only grow by coming to the end of something and by beginning something else. (Irving, 1978: 159) The acceptance of an innovation is a major concern of any creator, whether he be scientist or messiah, 15 inventor or teacher. It is within the nature of man to change his environment: "Man's re-ordering of the face of the globe will cease only when man himself ceases" (Wein- berg, 197Sz2). Yet, as intimated above, the innovator is not always popular. Guskin (1969:10) notes that he is ".... an annoying minority, a gadfly, an irritant who nevertheless likes to think he will stimulate a pearl with- in the establishment's hard shell." Hence, careful consider- ation must be given to methods of how an innovation is diffused: The lack of a diffusion system will lead to abortive change. 
(Orlosky & Smith, 1972:“1“) Orlosky and Smith argue that change will not become wide- spread or permanently entrenched without a plan for diffus- ion, while Havelock (19732119) advises that the diffusion of an innovation begins with the acceptance of the idea by a few key members of the system. Statistically, Rogers and Shoemaker speak in terms of the diffusion effect and relate it directly to thresholds: . . as the rate of awareness - knowledge of the innovation increases up to 20 - 30%, there was almost no adoption. Then once this threshold was passed, each additional percentage of awareness - knowledge in the system was associated with several percentage increases in the rate of adoption. (Rogers & Shoemaker, 1971:163) As a testimony to the significance of the diffusion process, many citations may be found within the literature, including Beal (1962), Czepiel (1972), Eicholz (1963), 16 Grinstaff (1969), Gross (1971), Havelock (1971), Lin and Burt (1975), Rogers (1962), Smith (1968), Turnball, et.al. (197“), Zaltman (1971), and Zaltman and Stiff (1973). Much of the research provides the instructional developer with numerous techniques relevant to the change processes; there are many collections of techniques which may be used in order to bring about planned change. Of note are Havelock's The Change Agent's Guide to Innovation in Education, Roger's Diffusion of Innovations, Bennis, et.a1.'s The Planning of Change, and Roger & Svenning's Managing Change. Other citations include Beckhard (1969), Bennis, et.al. (1965), Havelock (1973), Johnson (1969), - Rothman (197“), Mehrabian (1970), and Zaltman and Duncan (1977), to name a few. In order to be effective in the educational arena, instructional developers must avail themselves of diffusion techniques, not only to implement new programs, but to develop a more favourable, basic attitude of the clients toward new ideas in particular and change in general (Rogers, 1976:281). 
Hence, it is important that change research is collected and incorporated into the curricula of Instructional Development programs. In such a manner, future developers and change agents may be properly equipped to effectively and efficiently diffuse educational innovations as well as accelerate the rate of educational change. In this manner, teachers may very well prepare themselves to operate in their market place with tools such as the above noted techniques found within change literature. Further, it is no accident that many of these tools come from Educational Technology. It is the opinion of Brown, Norberg, and Srygley (1972:1-2) that technology can make education more productive and individual, instruction more scientific and powerful, learning more immediate, and access to education more equal.

EDUCATIONAL TECHNOLOGY

This section addresses the relevant literature on Educational Technology, specifically in terms of its subset Instructional Development and the relationship of techniques to both. Since the dawn of recorded history, mankind has sought to live in closer harmony and with greater ease within his environment. He has explored nature, attempted to conquer it and finally to understand it (Mumford, 1962:31). In so doing, he has sought the use of tools, technologies that have eased his burden and enabled him to exist harmoniously with his surroundings (Ibid, 321). As a result, Mumford contends that by the sixteenth and seventeenth centuries, the new religious Messiah was the machine (Ibid, 45). This belief is shared by John Wilkinson in his introduction to Ellul's The Technological Society:

Since the religious object is that which is uncritically worshipped, technology tends more and more to become the new god.
(Ellul, 1967:xi)

Further, it may be asserted that the world is technological in nature, and as Finn (1962:70) pointed out, men are seeking to solve some of their problems by technological means; however, Finn is quick to point out that technology is not merely a collection of gadgets and hardware. In agreement with this philosophical stand is Saettler (1968:5-6), who suggests that the word technology "does not necessarily imply the use of machines . . . but refers to any practical art using scientific knowledge."

If technology, which comes from the Latin texere, meaning to weave and construct, is not just men and machines, what then is it, and how does it apply to our present investigation of change in education? Dealing with the first half of the above question, technology is a process and a way of thinking (Finn, 1960:142). It is a complex, integrated organization of men and machines, of ideas, of procedures, and of management (Hoban, 1965:194). Technology is a complex, integrated process for analyzing problems, and of devising, implementing, managing and controlling and evaluating solutions to those problems (Association for Educational Communications and Technology, 1977:169). Galbraith (1967:12) points out that the main characteristic of technology is the breaking down of tasks into detailed subdivisions so that organized knowledge may be put to work, and Finn (1965a:193) wrote that technology is a force which encompasses invention, techniques, machinery, men, money, and methods.

It has already been asserted that societal change necessitated the creation of new methodologies. Often, the old solutions to new problems are not feasible because we are not equipped with sufficient technologies (Hussain, 1973:208). However, as technology continues to develop, new forms of organizations are necessary (Broom and Selznick, 1968:78). Such is evident in the educational arena.
In order to solve existing educational problems and keep abreast of the rapidly changing times, Educational Technology evolved and developed; practitioners of the pedagogical arts adopted procedures and philosophies of the general market place and developed the process of Educational Technology. Definitionally, it is very similar to and incorporates many of the above definitions of Technology. According to the Task Force on Definition and Terminology of the Association for Educational Communications and Technology, Educational Technology is:

A complex, integrated process involving people, procedures, ideas, devices and organization, for analyzing problems, and devising, implementing, evaluating and managing solutions to those problems, involved in all aspects of human learning. In Educational Technology, the solutions to problems take the form of all the Learning Resources that are designed and/or selected and/or utilized to bring about learning; they are identified as Messages, People, Materials, Devices, Techniques, and Settings. (A.E.C.T., 1977:164-5)

In a critical examination of the subject, Hlynka (1981) discusses a dual view of Educational Technology. He suggests that a Physical Science approach sees educational technology as being primarily concerned with audiovisual hardware and software, while a Behavioral Science view concerns itself with the "practical application to education of the laws, rules and heuristics of educational psychology and educational communication, and general systems theory to education" (Hlynka, 1981:3). It is this duality of definition which leads us to the next portion of this chapter's considerations. The historical roots of the field of Instructional Development are grounded in these two views of Educational Technology. In the former case, one can witness the significant contribution of audiovisual aids to education.
Such a view is typified in Arnheim's enthusiastic statement about one set of techniques:

the contribution of photography in all its forms has revolutionized teaching and learning in most areas of study. (Arnheim, 10(5):18)

By the late 1940s and early 1950s, an increasing emphasis was being placed upon newer and expanded media in education (Brown, Norberg, & Srygley, 1972:343-4). These media were fast becoming recognized as important aids to good instruction, rather than as a prop for poor teaching (Davies, 1981:192). Even earlier, Finn (1964b:96) declared that "the educational future will belong to those who can grasp the significance of instructional technology", and Scuorzo (1967:vii) went so far as to suggest that every school in the United States should have an audiovisual coordinator to help teachers develop the AV presentations needed for optimum instruction. Further, Moller claimed that

. . . we are only beginning to discover how much all these media, each in its own way and often in concert, can contribute to achieving the major aims of modern education. (Moller, 1970:11)

With the widespread use of such media, it was only a matter of time before they had an important effect on education. Glaser (1965:804) noted the modification of instructional procedures and significant changes in materials and techniques used by teachers, while Hussain (1973:2-3) identified the increasing demand for services provided by information systems which included numerous management techniques. Hence, educational technology was adopting, as well as adapting, techniques from the out-of-school society (Wittich & Schuller, 1973:xix), and field theorists were predicting the "development of team techniques involving the cooperative efforts of different types of expert communicators who are familiar with and able to apply learning theory and know how to use information gathering, manipulation, and interpretation techniques" (Brown, Norberg, & Srygley, 1972:11-2).
As a result, the management of information systems was recognized as one of the central competencies needed in modern education (Berlo, 1975:10).

In order to achieve maximum results with the utilization of audiovisual hardware/software, there had to be a modus, a means of incorporating the physical devices within a scientifically valid delivery system that was employed to analyze, develop and evaluate practical solutions to teaching and learning problems. Wittich and Schuller (1973:631) noted that there was a definite need for a new kind and level of planning to help assure that the efforts of educational technologists would be successful and effective; they concluded that Instructional Development is predicated on such a systematic approach.

INSTRUCTIONAL DEVELOPMENT

Any study of Instructional Development must include the concept of systems and system approaches. A systems approach is a systematic attempt to achieve specific objectives or accomplish particular goals through the identification, development and evaluation of a set of materials and strategies (Erickson & Curl, 1972:66; Twelker, Urbach & Buck, 1972:1). As noted earlier, Instructional Development may be viewed as an area that applies systems approaches to solving instructional problems. Definitionally, it has different meanings for different individuals. In his work The Intricacies of Instructional Development, Duncan (1978:22) used the term instructional development interchangeably with systems approach, instructional technology, and educational technology; however, this was only done in order to alleviate semantical confusion in his study. Instructional Development has been defined in terms ranging from the simple to the complex. In the former instance, Buhl (1975:2) has termed it "a set of activities aimed at improving the condition of learning for students".
Gustafson's (1971:1) terse definition identifies Instructional Development as a process for improving the quality of instruction, and Low (1981:17) points out that its purpose is the synthesis of useful educational products. Instructional Development seeks to design instruction, rather than supplement it (Faris, 1968:971-3). More specifically, Abedor & Sachs (1978:4) proposed a definition which focused on the design, development, implementation and evaluation of instructional materials, lessons, courses, or curricula while attempting to improve teaching and learning. However, two facets of Instructional Development that are not explicit in the above definitions are those of problem identification and use of feedback loops, as expressed by Schauer's definition:

. . . common-sense planning of cooperation to identify and define learning problems and to attempt resolutions of those problems with a plan for action, evaluation, tryout, feedback and results. (Schauer, 1971:44)

Finally, the definition offered by the Association for Educational Communications and Technology encompasses the above, but further points out that instructional development

is larger than instructional product development, which is concerned with only isolated products, and is larger than instructional design, which is only one phase of instructional development. (A.E.C.T., 1977:172)

In an era when educational change is so rapid, a systematic approach to Instructional Development may be seen as valuable to educators, who are encouraged to understand all the component parts (and their interactive factors) of a problem; the adoption of such an approach has gained significant currency in North America and the rest of the world (Bass & Hand, 1978:99). This spreading awareness of I.D. in the educational community is demonstrated by increasing references to its benefits in popular and prestigious magazines (Heinich, 1970:15).
A common method for illustrating the process of Instructional Development is by means of a model. A number of such models exist which illustrate the relationships between and among the various components of the I.D. process. Examples are found within the works of DeCecco (1968), Hamreus (1970), the Instructional Development Institute (1971), Gerlach & Ely (1971), Gustafson (1971a), Gentry (1980-81), Stamas (1972), and Gentry & Trimby (1983), to name a few. For a relatively recent investigation of existing I.D. models, one may consult A Comparative Analysis of Models of Instructional Design by Andrews & Goodson (1980:2-16).

Thus, following the processes illustrated by these models, it is assumed that the instructional developer may work more effectively and efficiently to bring about instructional change. Not only will application of systematic processes assist the instructional developer in the design of instruction, but these processes assist him or her in the role of catalyst to bring about change (Diamond, 1974:6-8). In agreement is Briley (1971:39-42), who states that Instructional Development agencies are

. . . catalysts for the improvement of instruction providing specialists in the techniques and resources needed to improve instruction. (Briley, 1971:39-42)

Generally the literature presents Instructional Development as a potent force for the execution of educational change, where practitioners are skilled in the use of relevant resources and I.D. techniques. It is the consideration of these instructional development techniques which is the major focus of this study. The literature reports a wide range of techniques that may be used to accomplish the various functions of an I.D. process. An assumption of this study is that to be effective and efficient, the instructional developer must be aware of the existence, and skilled in the use, of certain relevant techniques.
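The cited models differ in detail, but most share an ordered sequence of functions joined by an evaluation feedback loop. Purely as an illustration in modern programming notation, and with generic phase names that are not drawn from any one cited model, such a cycle can be sketched as:

```python
# Generic sketch of an I.D. systems model: ordered phases with an
# evaluation feedback loop. Phase names are illustrative only; they
# are not taken from any specific model cited in the text.

PHASES = ["needs analysis", "design", "development",
          "implementation", "evaluation"]

def run_cycle(passes_evaluation, max_iterations=3):
    """Walk the phases in order; if evaluation fails, loop back to
    design (the feedback loop), up to max_iterations passes."""
    history = []
    for iteration in range(max_iterations):
        start = 0 if iteration == 0 else 1  # revisions skip needs analysis
        history.extend(PHASES[start:])
        if passes_evaluation(iteration):
            return history
    return history

# One revision cycle: the first evaluation fails, the second succeeds.
trace = run_cycle(lambda i: i >= 1)
print(trace)
```

The sketch captures only the structural claim of the paragraph above: the developer proceeds systematically through defined functions, and evaluation results feed back into earlier phases rather than ending the process.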
As previously stated, this study is concerned with the knowledge about the utilization levels of field techniques as employed by Canadian developers.

RELATION OF TECHNIQUE TO INSTRUCTIONAL DEVELOPMENT

In order to achieve planned educational change, the instructional developer must be equipped with a number of specific tactics that s/he can bring to bear upon the problem. Not to be confused with strategies (which simply consist of an over-all design likely to accomplish broad objectives), tactics are much more detailed. According to Davies (1981:124-6), strategies are concerned with the why and what of instruction whereas tactics are concerned with the how of instruction. As a subset of strategy, tactics are the cutting edge of strategy; they are the activities that involve everything from how to write objectives, to the behavior of instructors and learners, to the use of audiovisual aids. The activities element of tactics leads to the study of field techniques. In considering tactics, the focus is on techniques of classroom teaching and/or mediated teaching, employed to bring about the desired strategic objectives of instruction (Heinich, 1970:141).

What, then, is technique, and how might a seemingly random set of techniques be culled from a myriad of disciplines and beneficially applied in the field of Instructional Development? The French sociologist Jacques Ellul, in his classic work The Technological Society, asserts that technique has become almost completely independent of the machine (1967:4). Rather, technique is the means and the ensemble of means to attain a predetermined end (1967:19), an application of the formulas of a practical product to practical life (1967:7), and thus through the use of technique,

man is able to utilize to his profit powers that are alien or hostile. He is able to manipulate his surroundings so that they are no longer merely his surroundings but become a factor of equilibrium and of profit to him.
(Ellul, 1967:25)

In the foreword to Ellul's work, Merton provides a more precise definition by stating:

Technique refers to any complex of standardized means for attaining a predetermined result. (Merton, 1967:vi)

Relevant to this study, the Association for Educational Communications and Technology defines technique as

. . . a Learning Resource/Instructional System Component. Routine procedures or pre-cast molds for using Materials, Devices, Settings and People to transmit Messages. (A.E.C.T., 1977:169)

In addition to the definition that was presented on page 8, techniques may be defined as those tactics employed by instructional developers that accomplish the requirements of any component or function within an I.D. systems model. Techniques will be examined in terms of their active ability to accomplish the objectives of educational change. To paraphrase Oscar Wilde (1964:62), "Good techniques exist simply in what they make, and consequently are perfectly uninteresting in what they are". Hence, this study is concerned with those techniques which can best effect educational change as they relate to the field of Instructional Development. Later in this chapter, we will discuss how some of them relate to various functions of an I.D. model, but for the moment it is necessary to investigate the source of these techniques. Many reports of techniques are found within educational literature; however, it is extremely important to note that they are often difficult to discover. Gentry aptly supports this point:

. . . many useful techniques have been designed to operationalize I.D. system models, but the techniques are scattered and difficult to find or assess. (Gentry, 1980-81:33)

A significant number of these techniques are not even found within the literature of Instructional Development, but rather are rooted within other disciplines such as Psychology, Communications, Economics, Business Management, Medicine, and others.
Until recently, these other fields had to be researched in order to cull out relevant techniques that might be applied to our discipline. This adoption/adaption process is not new to education in general (Broom & Selznick, 1968:344) nor to the antecedent field of audiovisual education in particular (Finn, 1953:175; Finn, 1965:66; Wittich & Schuller, 1973:631). McLuhan suggested that the methodology of our era is to use multiple models for exploration (1967:69), and it is to him that we credit a label for this process.

While discussing operation research programs, McLuhan (1964:62-3) compares them to the cultural mix occurring during wars and migrations. He calls this adoption of techniques from many areas the hybrid principle, in itself a technique of creative discovery. This hybrid principle would appear to be functioning within instructional development. In order to effect educational change, instructional developers have often adopted and/or adapted tactics from many other disciplines. In tracing the source of common I.D. techniques, it is in such a manner that our own lexicon of field techniques has expanded. The hybrid principle is seen as important for the development and continuance of our field by our scientists as well as those in interfacing disciplines (Hussain, 1973:2-3; Hanneman, 1975:317; Zaltman & Duncan, 1977:91; Langdon, 1981:26).

If the above conclusions are true, then instructional developers should be practising the hybrid principle, by using tactics of other disciplines when applicable, and assembling collections of such techniques which will be more accessible to subsequent developers when they are attempting to effect educational change in the future. Such collection procedures have already begun, as we will note later in this chapter. This present dissertation is only one attempt to bridge the gap between what presently exists and what needs to be accomplished.
EXISTING TECHNIQUES

As was noted in Chapter I of this study, the results of the ERIC and Dissertation Abstract search, coupled with a subsequent search at the Ontario Institute for Studies in Education (OISE) at the University of Toronto, revealed few documented works dealing with collections of field techniques. This provided evidence that the study under consideration was unique and original, yet it supplied little information concerning collections of field techniques relevant to our discipline. However, numerous entries concerned with individual techniques were gleaned from the pages of professional journals.

One landmark study was provided by J. Christopher Jones in his book entitled Design Methods: seeds of human futures (1970). Generally, Jones' work is divided into two major sections. Part 1 acts as a review of past and present design methods that have been utilized in assisting planners and designers to find methods that can be applied to particular design situations, including traditional methods of design-by-drawing, design as an art or form of mathematics, and the consideration of surmounting interpersonal obstacles to solve modern design problems. Part 1 also includes a review of such relatively new methods as Black Box and Glass Box theory, as well as the concept of designers as self-organizing systems. Part 2 is more relevant to the present study. Here, an outline is provided, with examples, of thirty-five methods of design that fall within the scope of this study; herein is found an investigation of numerous techniques ranging from Literature Search to Questionnaire, from Brainstorming to Morphological Charts, from Analysis of Interconnected Decision Areas (AIDA) to Checklists. Each technique citation dealt with an outline, aim, example(s), application, and comments on cost and time requirements. Almost as important, each technique description included a set of references, which allowed investigation of primary and secondary source materials.
Another valuable work found to be relevant to the present study was that of Delp, Thesen, et al. (1977). Entitled Systems Tools for Project Planning, the book is designed to present a number of techniques which may be used by a practitioner to address problems of design such as generating ideas, objectives, and methods of evaluation. It is a toolbook which can be used either as a text or as a reference, and it is designed so that it may be used by developers in many different fields. Many of the examples are drawn from agriculture, health, family planning, and employment, as well as from the field of education. It is the belief of the authors that planning tools, such as the techniques found within the text, have universal utility. Each technique entry describes what the developer needs to know in order to select a tool, utilize the tool, and finally understand its implication and underlying theory. Each technique begins with a brief statement of the purpose and uses of the tool, supplemented by a list of key definitions relevant to the technique. Next is a presentation of the advantages and limitations of the technique, which is followed by a study of required resources, tool descriptions and method of use. The technique presentation concludes with a description of examples, a statement of fundamental background theory and a bibliography.

A recent publication dealing with a variety of field techniques is Ivor K. Davies' Instructional Technique, published in 1981. As the title implies, it is a text about instruction wherein the emphasis is on the techniques of teaching. It is divided into three major sections, the first one dealing with the strategies of instruction, which includes such chapter titles as Efficiency and Effectiveness, Instructional Methods, Structure of a Lesson, and Lesson Planning.
The second section deals with the tactics of instruction and contains such chapter titles as Question Technique, Assessment Techniques, Audiovisual Aids, and Needs, Objectives, and Commitment. The final section is entitled Instructional Concerns, and it deals with such topics as the personality of the instructor, individual differences, and discussion techniques. A number of important field techniques are presented in Instructional Technique, but a major strength of the work is its relevancy to this present study in that it is a definitional organizer for strategies and tactics, as well as the dual topics of efficiency and effectiveness.

In addition to the three above mentioned volumes, a total of forty (40) professional journals were found that provided papers relevant to individual instructional development techniques. As was noted earlier in this chapter, a number of disciplines were researched in order to provide a thorough search of the literature. Therefore, citations surfaced in such fields as Psychology, Communications, Future Studies, Business Management, Economics, Medicine, and History, as well as that of Education itself. Appendix J provides an alphabetized list of those journals, as well as a corresponding set of techniques found therein. Although the journal references noted in Appendix J are not assumed to be exhaustive, they represent a large contribution of references to be found in ID and related journals for the period 1947 to the present.

However, an exception to other articles in the literature is Gentry's work entitled A Management Framework for Program Development Techniques (1980). In his article, Gentry introduces an ID approach model which he terms a Management Framework, which he recommends be used for organizing ID field techniques. He categorizes a sampling of such techniques according to the thirteen (13) functions of his model (see Appendix C), while at the same time providing definitions and references for each technique.
His work parallels the present study in that he assembled a collection of techniques viewed as relevant to ID practitioners. This study will use his model as an organizer of ID techniques, and will match a list of the techniques that are researched with the thirteen functions of Gentry's model. Such a matching is based upon his prototype, which may be found in Table 2.1 which follows.

TABLE 2.1
GENTRY'S LISTING OF TECHNIQUES BY MANAGEMENT FRAMEWORK MODEL COMPONENTS

Needs Analysis: Brainstorming, Criteria for Rejecting Clients, Delphi Technique, Fault Tree Analysis, Force-Field Analysis, Futures Wheel, Needs Assessment, Nominal Group Process, Scenario Writing.

Adoption: Brainstorming, Delphi Technique, Force-Field Analysis, Goal-Rating Procedure, Network Analysis.

Design: Contract Plan, Cost Benefit Analysis, Critical Path Method, Discovery Method, Ethnography, Function Analysis, Futures Wheel, Goal-Rating Procedure, Interaction Matrix Method of Sequencing Objectives, Interactive Television, Microteaching, Nominal Group Process, Peer Tutoring, Scenario Writing, Sequencing & Clustering Large Numbers of Objectives, Sequencing Content, Simulation, Storyboarding, Task Analysis, Team Teaching, Telelecture, Trigger Film.

Packaging: Field Testing, Formative Evaluation, Summative Evaluation, Time Study.

Installation: Critical Path Method, Fault Tree Analysis.

Operation: Information Mapping, Planning, Programming, Budgeting Systems (PPBS), Personal Inverted Filing System, Summative Evaluation, Time Study.

Evaluation: Field Testing, Force-Field Analysis, Formative Evaluation, Latent Image, Learners Verification and Revision (LVR), Stake Model, Summative Evaluation.

Communication Network: Content Analysis, Critical Path Method, Delphi Method, Information Mapping, Network Analysis.

Information Handling: Decision Tables, Information Mapping, Personal Inverted Filing System.

Resource Acquisition and Allocation: Cost Benefit Analysis, Management by Objectives, PPBS (Planning, Programming, Budgeting Systems).

Personnel: Broken Squares, Managerial Grid, Merit Rating Chart.

Facilities: Interactive Television, Telelecture.

Leadership: Brainstorming, Broken Squares, Content Analysis, Critical Path Method, Decision Tables, Delphi Method, Fault Tree Analysis, Flowcharting, Information Mapping, Management by Objectives, Managerial Grid, Merit Rating Chart, Nominal Group Process, Personal Inverted Filing System (PIFS), Program Evaluation and Review Technique (PERT), Relevance Trees, Task Analysis, Task Description, Team Teaching, Time Study.

In the summary of his article, Gentry asserts that such a framework for systematically organizing techniques:

. . . alerts developers to the need for additional techniques, and to the need for objective research on the effectiveness, relevancy, and efficiency of existing techniques. (Gentry, 1980-81:36)

This study is designed to identify which techniques are being used by Canadian developers in the period of 1980 to 1982. In addition, the study attempts to determine the levels of use in terms of relevancy, efficiency, and effectiveness of each technique as perceived by the survey population of developers in Canada.

The foregoing chapter illustrated that educational change may be brought about by technology. Specifically, one such technological field is that of Instructional Development, which effects change through the operationalized processes of various functions of ID systems models. In order to execute such processes, we discussed the use of numerous tactics or field techniques reported in the literature, and it was further pointed out that techniques from other fields were also relevant to the needs of instructional developers.
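The component-to-technique mapping in Table 2.1 above is essentially a lookup table: a technique can be retrieved by model component, and the components served by a given technique can be found by reverse lookup. As a hedged sketch in modern programming notation (using only a small subset of the table's entries; the code structure itself is illustrative and not part of Gentry's article), it could be represented as:

```python
# Sketch only: a subset of Gentry's Table 2.1 as a lookup structure.
# The entries are copied from the table; the representation is ours.

FRAMEWORK = {
    "Needs Analysis": ["Brainstorming", "Delphi Technique",
                       "Fault Tree Analysis", "Needs Assessment"],
    "Adoption": ["Brainstorming", "Delphi Technique", "Network Analysis"],
    "Installation": ["Critical Path Method", "Fault Tree Analysis"],
    "Facilities": ["Interactive Television", "Telelecture"],
}

def components_using(technique):
    """Reverse lookup: which model components list a given technique?"""
    return [c for c, techs in FRAMEWORK.items() if technique in techs]

print(components_using("Fault Tree Analysis"))
# ['Needs Analysis', 'Installation']
print(components_using("Delphi Technique"))
# ['Needs Analysis', 'Adoption']
```

Such a structure makes concrete the organizing role Gentry claims for the framework: once techniques are indexed by function, gaps (functions with few techniques) and overlaps (techniques serving many functions) become easy to see.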
This "borrowing" or hybrid principle was acknowledged to be of significant importance to ID, and the chapter continued with a discussion on the research of techniques employed in this study. In conclusion, the chapter pointed out the relevance of an article by Gentry and how it would serve as a prototype for this investigation.

CHAPTER III

DESIGN OF THE STUDY

This chapter is a review of the fundamental design of the study. Each of the ten questions that the study addresses is stated, followed by a discussion of the research population, the survey sample and the sampling procedures. The survey instrument and the data collection procedures are dealt with next, followed by a plan for the analysis and interpretation of the raw data.

INTRODUCTION

As outlined in Chapter One, the study was designed to investigate knowledge and application levels of major instructional development techniques that are in current use by Canadian practitioners. Specifically, the researcher wanted to discover which techniques were being employed to a significant degree by the survey population, which were unfamiliar to the population, which techniques were perceived as being valuable to the field of Instructional Development, and which ones were actually being taught in Canadian institutions of learning. Further, the study ascertains whether there were correlations between the level of technique use and employment areas of the surveyed developers, between the level of technique use and the educational training of the surveyed developers, and between the level of technique use and the number of years of teaching experience of the surveyed developers. Finally, the study culminates in a matching of the resulting major techniques (as perceived by the surveyed developers) with various functions of a recognized instructional development model.

RESEARCH QUESTIONS

In brief, the study is guided by the following questions:

1.
What are the major techniques being employed by Canadian Instructional Developers in the field?

2. What is the developer's perceived level of competency with each technique?

3. What is the perceived relevancy of each technique as viewed by the developers?

4. How many of these techniques are unfamiliar to the developers?

5. Which of these techniques are currently being taught by Instructional Development programs and teacher education programs in Canadian graduate and undergraduate institutions of learning?

6. Is the number of years of teaching experience related to the use of instructional development techniques? Which techniques?

7. Are the respective employment areas of the surveyed developers related to the level of technique use?

8. Is the graduate and/or post-graduate education of the developers related to the level of technique use?

9. Are the four Major Categories of Competency Level, Level of Use, Value to Instructional Development, and Degree to Which Institution Teaches interrelated? (i.e., is there a relation between Level of Competency and Use, Competency and Value, etc.?)

10. Given a typical instructional development model, how well do the perceived major techniques match with the required functions in the model?

In conclusion, and based upon the responses to the above research questions, the research is designed to determine which instructional development techniques should be taught to students in graduate and/or teacher training institutions in Canada, based on expert preference.

RESEARCH POPULATION AND SAMPLE

A major task of the study was to identify those educators in Canada who were involved in Instructional Development activities. Although 47 universities in Canada offer courses in Educational Technology (Barre, 1978), the writer was not cognizant of any institution that provided degree programs in Instructional Development.
The first attempt to identify educators who had a background in Instructional Development or ID-related programs led to several established professionals in the Educational Technology field in Canada, who were asked to suggest a list of names of teachers involved in Instructional Development in Canada and to act as validators of the study's survey instrument. In all, nineteen (19) Canadian developers were approached for these two purposes. Discovering that each developer was a member of the Association of Media and Technology in Education in Canada (AMTEC), the researcher subsequently chose members of this organization as the survey population for the study. AMTEC is the Canadian equivalent of the AECT (Association for Educational Communications and Technology) in the United States of America.

Dr. Richard Lewis, Editor of Media Message, the journal and official publication of the Association of Media and Technology in Education in Canada, volunteered the support of the Media Message journal and its organizational machinery for surveying the membership. The survey was limited to those members who were connected with an educational institution, and excluded members of agencies or companies who were not. For example, all AMTEC members connected with non-educational organizations such as the National Film Board, the Secretary of State's office, Bell and Howell of Canada, etc., were deleted from the sample; further, all educational organizations that did not cite an individual's name on the AMTEC mailing list's address label were also ignored. Hence those members who were named and/or connected with an educational institution or educational-interfacing organization (such as TV Ontario and the Alberta Educational Communications Corporation) were chosen to make up the survey population. In all, 300 individuals made up the sample, chosen exclusively for their membership in AMTEC and their association with education-related employment.
It was reasoned that they would have the requisite training and employment positions that would most likely require the practice of ID techniques.

INSTRUMENTATION

For purposes of developing a questionnaire, an initial search was made for existing field techniques currently used by instructional developers. The list was generated from professional journals (Appendix J) and numerous texts, many of which are found within the Bibliography of this study. The search resulted in a list of 108 techniques, which are listed in alphabetical order in Appendix A. The list is not intended to be exhaustive, but rather documents the more popular techniques used by instructional developers in educational and training institutions.

A panel of Field Experts was then identified to serve three purposes. First, they would adjudicate and validate the survey instruments. Second, they would recommend additional techniques that were not found in the initial search. Third, they would rate each of the techniques in terms of its perceived importance to the discipline of Instructional Development. This ranking would be used to delete the techniques perceived as least important from the study's main questionnaire.

Thirty Instructional Developers were identified from Canada and the United States to serve on the panel of field experts. All were employed in post-secondary school institutions and each was a recognized professional in his/her field. In November, 1979, all thirty Field Experts were sent a letter (Appendix E) requesting their participation. They included fourteen (14) from the United States and sixteen (16) from Canada. In early June, 1980, a package of instruments and documents was sent to each of the panel members. The package (Appendix F) included one each of the following:

1. Questionnaire Survey: Technique Response Form
2. List of Techniques Definitions and References
3. Technique Rating Instrument
4. Form for Additional Techniques
5. Form for Suggestions and Comments
6. Covering Letter and Field Expert Instructions

Items numbered 1 and 2 above would be included in the final survey instrument sent to the total survey population. Information from item number 5 would provide the researcher with suggestions for strengthening the quality of the final survey instrument, as well as provide the Field Experts with an opportunity to recommend improvements to the study and research instruments. Number 4 allowed the Field Expert to suggest any additional technique(s) not included in the original list, and number 3 was designed to allow the expert to rate each technique as to its importance in conducting instructional development activities.

Reminder notices were sent to late returnees, and by December 1980 twenty-five (25) Field Experts had responded, for a return of 83.33%. Specifically, 13 of the original 14 American Experts had responded, for a total of 92.86%. Of these, eight (8) approved the study either in total or with slight modifications, representing 61.54% approval; five (5) respondents, representing 38.46%, expressed reservations that could have been interpreted as disapproval of the instrumentation and/or study. In terms of the Canadian Field Experts, 12 of the original 16 panel members responded in the same time frame, which represented a total of 75.00% return. Of these, 100% approved the study either in total or with slight modification.

Based on the data received from the panel of Field Experts, the initial questionnaire was revised in preparation for sending it to the selected survey population of Canadian Instructional Developers.
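The return rates above follow directly from the response counts reported in the text; a short script (illustrative only, not part of the original SPSS analysis) reproduces the arithmetic:

```python
# Reproduces the Field Expert return-rate percentages reported above.
# Counts are taken from the text; two-decimal rounding matches the
# printed figures (83.33%, 92.86%, 61.54%, 38.46%, 75.00%).
def pct(part, whole):
    """Percentage of `part` in `whole`, rounded to two decimal places."""
    return round(100.0 * part / whole, 2)

overall_return = pct(25, 30)    # 25 of the 30 Field Experts responded
american_return = pct(13, 14)   # 13 of the 14 American experts
canadian_return = pct(12, 16)   # 12 of the 16 Canadian experts
approval = pct(8, 13)           # 8 of the 13 American respondents approved
reservations = pct(5, 13)       # 5 of the 13 expressed reservations
```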
With regard to the Technique Rating Instrument (as per number 3 of the Field Expert package above), the Field Experts had rated each of the 108 techniques on a 0 to 4 scale, with 0 representing "no value or not a recognized technique", 1 representing "low value" to ID activities, 2 indicating "valuable", 3 equalling "high value", and 4 standing for "extremely high value" to Instructional Development activities. In this manner, each technique in the rating instruments returned by the panel members was totaled as to its aggregate score, which was then divided by the number of experts (25) in order to establish an average rating. The techniques were then ordered by their respective ranks, and a cut-off score was established at 1.50 (see Appendix G). In this manner, sixty (60) techniques remained. However, as GAMING and SIMULATION are often considered together in the literature, it was decided to collapse the data on the two into one technique, that of SIMULATION, which resulted in fifty-nine (59) techniques. Further, as IMMEDIATE FEEDBACK was felt to be a sub-set of FEEDBACK, it was decided to collapse the two into one technique, that of FEEDBACK, which resulted in fifty-eight (58) techniques. Finally, INSTRUCTIONAL ANALYSIS KIT and PROGRAMMED INSTRUCTION, two more techniques suggested by the field experts, were added to the list. Thus, the final survey instrument (Appendix H) consisted of sixty (60) field techniques.

The survey questionnaire was published in the Media Message journal, accompanied by self-addressed, stamped envelopes, to facilitate the return of the completed questionnaire. The instrument itself was divided into three major sections (Appendix H), under the title of A Study to Identify Major Field Techniques and Utilization Levels of Canadian Instructional Developers (Media Message, 10:3, pp. 16-23). The first section introduced the survey and explained its purpose and benefits to the AMTEC organization.
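The expert-rating cut-off described earlier in this section can be sketched as follows. The sample ratings are illustrative, not the panel's actual scores, and retention at exactly the 1.50 cut-off is an assumption (the text does not say whether the boundary itself survived):

```python
# Sketch of the expert-rating cut-off: each technique's 0-4 ratings from the
# 25 responding Field Experts are averaged, and techniques whose average
# falls below the 1.50 cut-off are dropped from the main questionnaire.
def retained_techniques(ratings, cutoff=1.50):
    """ratings: {technique name: list of 0-4 scores, one per expert}."""
    averages = {t: sum(scores) / len(scores) for t, scores in ratings.items()}
    kept = [t for t, avg in averages.items() if avg >= cutoff]
    # Order the surviving techniques from highest to lowest average rating.
    return sorted(kept, key=lambda t: averages[t], reverse=True)

sample = {
    "Feedback":  [3] * 20 + [4] * 5,   # average 3.2 -- retained
    "Card Sort": [1] * 20 + [2] * 5,   # average 1.2 -- dropped
}
```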
The second portion listed the sixty (60) techniques in alphabetical order, and provided appropriate definitions and references for each. The final section was composed of the survey questionnaire, complete with response directions, a declaration of anonymity and confidentiality, and the respondent's background and experiential profile; the remainder of this section was composed of a series of response cells for each survey technique, designed to determine the respondent's levels of competency and usage, as well as the perceived value to ID and the degree to which his/her institution teaches the technique in question.

In order to acquire a significantly high level of response from the membership of AMTEC, the researcher acquired a mailing list, from which he was able to select his survey sample (as noted earlier in this chapter), and to whom a target letter was sent soliciting a response to the questionnaire. To reiterate, the final sample consisted of individuals who 1) were members of AMTEC, 2) were members of educational or educationally related institutions, and 3) whose names appeared on AMTEC's mailing list. In all, 300 individuals were selected.

DATA COLLECTION PROCEDURES

The survey was originally published in Media Message and mailed to the AMTEC membership in April, 1981, and in the same month target letters were sent to the 300 sample members. From that time until the end of August, thirty (30) responses were received, for a total of 10% of the survey population. On September 1, 1981, a second mailing, which included a copy of the originally published survey instrument, was sent to members of the sample who had not yet responded, and by the end of November, 1981, forty-three (43) responses, or an additional 14.33%, had been received. A third mailing of the survey was made on December 1, 1981; by early February, 1982, another thirty-seven (37) returns, or an additional 12.33%, had been received. Finally, on February 17, 1982, a reminder card was sent out to each of the remaining members of the survey population. In answer, an additional two (2) responses (.66%) were received. Hence, after a total of four mailings, the researcher had received 112 responses to his questionnaire, which represented a total of 37.33% return.

ANALYSIS OF DATA

A code number was assigned to each possible response, and IBM Fortran Coding Forms (#GX09-0011-6 U/M050) were utilized which contained the coded responses to the 248 items on the questionnaire. From the coded sheets, computer cards were punched and programs were run for analysis at the McMaster University Computer Centre in Hamilton, Ontario. Computer analysis of the data was accomplished by the Statistical Package for the Social Sciences, or S.P.S.S. (Nie, 1975), which measured frequencies and cross tabulations; the frequency counts added up and sorted the data, as well as provided column and row percentages, while the cross tabulations showed the relationship between two or more variables in a table. Significance, where applicable, was tested by the use of the F-ratio, and the hypotheses were tested at the .05 level.

Specifically, the questions considered by the study were addressed as follows: For Question 1 (What are the major techniques being employed by Canadian Instructional Developers in the field?), an SPSS Batch System computer run was made of the survey population's responses to the "B" cells on the questionnaire (LEVEL OF USE). The results were then listed in numerical order by mean, with the highest rated techniques being regarded as the major ones employed in the field and the lowest rated techniques being regarded as the least major ones. The actual separation of major techniques from minor techniques was based upon the placement of their means as presented in Figure 3.1, which provides an equal distribution of the four response levels.
FIGURE 3.1

     HI             MED            LO             NONE
3.00 - 2.26    2.25 - 1.51    1.50 - 0.76    0.75 - 0.00

If a technique fell within the "HI" (3.00 - 2.26) or "MEDIUM" (2.25 - 1.51) ranges above, then it was considered to be a major technique. If, however, the technique fell within the "LO" (1.50 - 0.76) or "NONE" (0.75 - 0.00) ranges, then it was classified as a minor technique.

Similarly, for Question 2 (What is the developer's perceived level of competency with each technique?), the highest rated techniques by mean with regard to the responses in questionnaire cell "A" (COMPETENCY LEVEL) would reveal the highest regarded techniques in terms of the respondents' perceived competency levels. Conversely, the lowest rated techniques by mean in the level of competency would provide the answer to Question 4 (How many of these techniques are unfamiliar to the developers?). The survey instrument required respondents to indicate their level of competency as "NIL", "HI", "MED" or "LO". Quantitatively, "HI" would be indicated by a score of 3.00 to 2.26, "MED" by a score of 2.25 to 1.51, and "LO" by a score of 1.50 to 0.76, according to Figure 3.1 above. "NIL" would be indicated by a score of 0.75 to 0.00; therefore the answer to Question 4 would be determined by those techniques falling within the "NIL" range.

In a like manner, for Question 3 (What is the perceived relevancy of each technique as viewed by the developers?), the highest rated techniques by mean with regard to the responses in questionnaire cell "C" (VALUE TO INSTRUCTIONAL DEVELOPMENT) would reveal the highest regarded techniques in terms of the respondents' perceived relevancy level. Again, Figure 3.1 is consulted in order to determine which techniques are to be construed as most valuable to the field of Instructional Development: those techniques with means in the "HI" and "MEDIUM" ranges (3.00 to 1.51) would be designated as the most relevant or valuable to ID.
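The Figure 3.1 scoring rule amounts to a simple range lookup on a technique's mean. A minimal sketch follows; the function names are illustrative, and the boundary handling assumes each printed range is inclusive of its endpoints:

```python
# Maps a technique's mean score on the 0.00-3.00 scale to the four equal
# response ranges of Figure 3.1.
def response_level(mean):
    if mean >= 2.26:
        return "HI"
    if mean >= 1.51:
        return "MED"
    if mean >= 0.76:
        return "LO"
    return "NONE"

def is_major(mean):
    # A technique is major when its Level of Use mean lands in "HI" or "MED".
    return response_level(mean) in ("HI", "MED")
```

Applied to the Level of Use results reported later in Chapter IV, for example, a mean of 1.86 (FEEDBACK) classifies as "MED" and hence major, while a mean of 0.25 classifies as "NONE" and hence minor.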
Question 5 (Which of these techniques are currently being taught by Instructional Development programs and teacher education programs in Canadian graduate and undergraduate institutions of learning?) would be answered in the following manner: the techniques whose means in the SPSS Batch System computer run on questionnaire cell "D" fell within the "HI" (3.00 - 2.26), "MEDIUM" (2.25 - 1.51), or "LO" (1.50 - 0.76) ranges according to Figure 3.1 above would be considered to be those techniques currently being taught by ID programs in Canada. Those techniques with a mean score of 0.75 and below would be considered not significant enough for inclusion in the list of techniques currently being taught, as reported by the respondents of the sample population.

With regard to Question 6 (Is the number of years of teaching experience related to the use of techniques? Which techniques?), the number of years of teaching or educational work experience, as per question #2 on the survey questionnaire, was correlated with the Level of Use scores from the data in Question 1. A PEARSON CORRELATION was employed, and those techniques with a coefficient of .1900 or greater would be recognized as being related to the number of years of teaching experience of the respondents.

In terms of Question 7 (Are the respective employment areas of the surveyed developers related to the level of technique use?), a one-way analysis of variance (ANOVA) was computed between the Level of Use score for each individual (as per questionnaire cell "B") and their respective job responsibility (as per questionnaire question no. 6). Significance would be tested at the .05 level.

For Question 8 (Is the graduate and/or post-graduate education of the developers related to the level of technique use?), another one-way ANOVA was computed between the Level of Use score and the Level of Highest Education (as per questionnaire question no. 3), as well as between the Level of Highest Education and Competency Level, Value to Instructional Development, and Degree to Which Institution Teaches (cells A, C and D of the questionnaire). Once again, significance was established at the .05 level.

To address Question 9 (Are the four major categories of Competency Level, Level of Use, Value to Instructional Development, and Degree to Which Institution Teaches interrelated?), the researcher ran another PEARSON CORRELATION. Significance was established at the .05 level, and the correlation was tested for each of the sixty techniques researched in the study.

In order to answer Question 10 (Given a typical instructional development model, how well do the techniques being used by Canadian Instructional Developers match the required functions in the model?) and subsequently conclude the study, a list of field techniques, their respective definitions and relevant bibliographic information was sent back to each of the original Field Experts (Appendix D) who validated the survey instrument. The list was composed of those techniques that were used by the Canadian instructional developers in the survey population (i.e., those techniques which scored a mean within the "HI" (3.00 - 2.26), "MEDIUM" (2.25 - 1.51), or "LO" (1.50 - 0.76) ranges of cell "B" of the survey instrument; those techniques with a mean within the "NONE" range of 0.75 - 0.00 were not considered in this portion of the study). Along with this list of techniques, copies of Gentry's Management Framework Model (Appendix C) were sent, with the request that the field experts match the techniques to appropriate functions or components of the model. A sample page of the Field Expert Response Form may be found in Appendix K. Thus, with the information generated by the results from the above, the study could recommend which techniques (as determined by this research) could be matched with the functions of Gentry's Management Framework model.
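For Questions 7 and 8, the one-way ANOVA reduces to an F-ratio of between-group to within-group variance. A pure-Python sketch follows; the study itself ran SPSS, and the grouping shown here is illustrative:

```python
# One-way ANOVA F-ratio: Level of Use scores grouped by a background
# variable such as employment area (Question 7) or highest education
# (Question 8).
def one_way_anova_f(groups):
    """groups: one list of scores per category; returns the F-ratio."""
    scores = [x for g in groups for x in g]
    grand_mean = sum(scores) / len(scores)
    group_means = [sum(g) / len(g) for g in groups]
    # Between-group sum of squares (weighted by group size) and its df.
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, group_means))
    df_between = len(groups) - 1
    # Within-group sum of squares and its df.
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, group_means) for x in g)
    df_within = len(scores) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)
```

The resulting F would then be referred to an F distribution with (df_between, df_within) degrees of freedom and judged at the .05 level.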
The preceding chapter has dealt with the design of the study. It began with a statement of each of the ten questions addressed by the study, followed by a discussion of the research population, the survey sample and the sampling procedures used in the study. Further, the chapter outlined the composition of the survey instrument as well as the data collection procedures. In conclusion, the chapter dealt with the procedures for the analysis and interpretation of the raw data.

CHAPTER IV

PRESENTATION AND ANALYSIS OF DATA

The presentation and analysis of the data are contained in this chapter. Each of the ten major questions addressed in the study will be considered in order, and an analysis of the data generated in the study will be presented, accompanied by supporting tables of information.

Question One: What are the Major Techniques Being Employed by Canadian Instructional Developers in the Field?

In order to address this question, an S.P.S.S. Batch System computer run was made of the survey population's responses to the "B" cells on the questionnaire (LEVEL OF USE), in Appendix H. Each technique on the 112 returned questionnaires was totaled according to the scale of "3" representing "HI", "2" representing "MEDIUM", "1" representing "LO", and "0" representing "NONE". A mean was then determined for each of the techniques, and the techniques were then ordered from 1 to 60, with the first technique representing the one with the highest mean, while the sixtieth technique received the lowest mean score. In this manner, it was discovered that FEEDBACK, with a mean of 1.86, was the most used technique; BRAINSTORMING, with a mean of 1.70, was the next most used; while GANTT CHART, with a mean of 0.25, was the least used. The quantitative results for this question are found in Table 4.1, which illustrates the ranking of the sixty (60) techniques by MEANS with their respective STANDARD DEVIATIONS.
It was previously determined in Chapter III that a technique would be classified as major if it fell within the "HI" (3.00 - 2.26) or "MEDIUM" (2.25 - 1.51) ranges of Figure 3.1. Accordingly, this study illustrated (see Table 4.1) that only nine (9) techniques may be deemed to be major: Feedback, Brainstorming, Field Test, Needs Assessment, Long-Range Planning, Multi-Image/Multi-Media Presentation, Questionnaire, Literature Search, and Flowcharting are the field techniques that were determined to be the major ones being used by Canadian Instructional Developers. The remaining fifty-one techniques fell within the "LO" or "NONE" categories, and according to the parameters of this study they cannot be classified as major techniques. However, this is not to be interpreted to mean that the other techniques are not valuable, merely that they are not often employed or not known by the respondents.

TABLE 4.1

LEVEL OF USE

RANK   TECHNIQUE                                  MEAN    S.D.
  1 *  Feedback                                   1.86    1.15
  2 *  Brainstorming                              1.70    1.02
  3 *  Field Test                                 1.68    1.19
  4 *  Needs Assessment                           1.67    1.10
  5 *  Long-Range Planning                        1.66    1.05
  6 *  Multi-Image/Multi-Media Pres.              1.62    1.01
  7 *  Questionnaire                              1.60    1.09
  8 *  Literature Search                          1.54    1.15
  9 *  Flowcharting                               1.51    1.02
 10    Story Boarding                             1.50    1.13
 11    Sequencing of Objectives                   1.49    1.13
 12    Checklists                                 1.41    1.03
 13    Management by Objectives                   1.41    1.12
 14    Formative Evaluation                       1.38    1.18
 15    Task Analysis (Task Desc.)                 1.37    1.20
 16    Summative Evaluation                       1.35    1.18
 17    Bloom's Taxonomy                           1.29    1.07
 18    Content Analysis                           1.29    1.14
 19    Case Studies                               1.25    1.06
 20    Interviewing Users                         1.24    1.21
 21    Computer Search                            1.18    1.05
 22    Appraisal Interview                        1.15    1.09
 23    Discovery Technique                        1.11    1.08
 24    Criterion Referenced Meas.                 1.10    1.09
 25    Simulation (Gaming)                        1.06    1.06
 26    Authoritative Opinion                      1.01    1.05
 27    Cost-Benefit Analysis                      1.00    1.10
 28    Role Playing                                .99     .98
 29    Computer Assisted Instruct.                 .96     .96
 30    Programmed Instruction                      .96     .89
 31    Behaviour Modelling                         .92    1.02
       Standardized Tests                          .92     .98
 33    Learner Verification & Revision             .88    1.12
 34    Micro Teaching                              .86     .99
 35    Likert Scale                                .80    1.07
 36    Technical Conference                        .79    1.02
 37    Contract Plan                               .79     .97
 38    Program Plan. Budget. System                .78    1.04
 39    Gagne's Taxonomy                            .78     .95
 40    Program Eval. Review Tech.                  .77     .93
 41    Linear Programming                          .66     .90
 42    Critical Path Method (CPM)                  .53     .85
 43    Krathwohl's Taxonomy                        .47     .86
 44    Function Analysis                           .47     .33
 45    Observation Interview
       (e.g. Time-Motion Studies)                  .46     .72
 46    Instructional Analysis Kit                  .42     .85
 47    Cognitive Mapping                           .41     .77
 48    Discrepancy Evaluation                      .41     .70
 49    Information Mapping                         .38     .76
 50    Critical Incidents Technique                .37     .76
       Nominal Group Process                       .37     .76
 52    Stake Model (Evaluation)                    .37     .75
 53    In-Basket Technique                         .34     .68
 54    Decision Tables                             .34     .67
 55    Delphi Technique                            .33     .68
 56    Card Sort                                   .33     .64
 57    Shaping                                     .30     .70
 58    Mathetics                                   .2      .71
 59    Force-Field Analysis                        .27     .60
 60    Gantt Chart                                 .25     .68

(* Major Field Techniques)

Question Two: What is the Developer's Perceived Level of Competency with each Technique?

As with Question One, an S.P.S.S. Batch System computer analysis was made of the population's responses to cell "A" (COMPETENCY LEVEL). Once again, the 112 respondents' responses for each technique were totalled, and as illustrated in Table 4.2 the techniques were ranked according to mean. Thus it was discovered that the technique with which the sample population appeared most competent was MULTI-IMAGE/MULTI-MEDIA PRESENTATION, which received a mean of 2.20, while the least competently appraised technique was MATHETICS, with a mean of 0.39. As in the case of Question 1, it was determined in Chapter III that a technique with which the sample population appeared competent would fall within the "HI" (3.00 - 2.26) or "MEDIUM" (2.25 - 1.51) ranges of Figure 3.1.
Accordingly, this study illustrated (see Table 4.2) that the top ranked twenty-eight (28) techniques would be classified as those with which the sample population felt competent (i.e., MULTI-IMAGE/MULTI-MEDIA PRESENTATION in first place to CRITERION REFERENCED MEASUREMENT in twenty-eighth place). The remaining 32 techniques would not therefore fall within this classification.

TABLE 4.2

COMPETENCY LEVEL

RANK   TECHNIQUE                                  MEAN    S.D.
  1 *  Multi-Image/Multi-Media Pres.              2.20     .99
  2 *  Feedback                                   2.08    1.12
    *  Needs Assessment                           2.08    1.12
  4 *  Brainstorming                              2.07     .96
  5 *  Story Boarding                             2.06    1.21
  6 *  Questionnaire                              2.03    1.10
  7 *  Long-Range Planning                        1.98    1.10
  8 *  Field Test                                 1.96    1.14
  9 *  Flowcharting                               1.90    1.10
 10 *  Management by Objectives                   1.88    1.17
 11 *  Bloom's Taxonomy                           1.84    1.14
 12 *  Checklists                                 1.84    1.15
 13 *  Literature Search                          1.84    1.23
 14 *  Programmed Instruction                     1.82    1.11
 15 *  Formative Evaluation                       1.76    1.27
 16 *  Role Playing                               1.71    1.06
 17 *  Sequencing of Objectives                   1.71    1.23
 18 *  Summative Evaluation                       1.71    1.27
 19 *  Standardized Tests                         1.65    1.10
 20 *  Case Studies                               1.65    1.17
 21 *  Computer Search                            1.60    1.17
 22 *  Micro Teaching                             1.60    1.18
 23 *  Task Analysis (Task Desc.)                 1.59    1.24
 24 *  Content Analysis                           1.58    1.17
 25 *  Interviewing Users                         1.57    1.31
 26 *  Discovery Technique                        1.53    1.15
 27 *  Appraisal Interview                        1.52    1.19
 28 *  Criterion Referenced Meas.                 1.52    1.20
 29    Simulation (Gaming)                        1.49    1.19
 30    Computer Assisted Instruct.                1.46    1.08
 31    Cost-Benefit Analysis                      1.34    1.17
 32    Behaviour Modelling                        1.29    1.10
 33    Authoritative Opinion                      1.26    1.11
 34    Program Eval. Review Tech.                 1.23    1.20
 35    Contract Plan                              1.21    1.14
 36    Gagne's Taxonomy                           1.20    1.21
 37    Prog. Plan. Budget. System                 1.18    1.22
 38    Linear Programming                         1.13    1.19
 39    Learner Verification & Revis.              1.13    1.21
 40    Likert Scale                               1.06    1.22
 41    Technical Conference                       1.00    1.15
 42    Critical Path Method (CPM)                  .88    1.14
 43    Observation Interview
       (e.g. Time-Motion Studies)                  .87     .99
 44 +  In-Basket Technique                         .79    1.07
 45 +  Cognitive Mapping                           .78    1.01
 46 +  Krathwohl's Taxonomy                        .77    1.09
 47 +  Delphi Technique                            .71    1.03
 48 +  Shaping                                     .71    1.04
 49 +  Card Sort                                   .71    1.06
 50 +  Function Analysis                           .64    1.00
 51 +  Information Mapping                         .63    1.01
 52 +  Discrepancy Evaluation                      .63    1.02
 53 +  Instructional Analysis Kit                  .57    1.03
 54 +  Decision Tables                             .56     .94
 55 +  Critical Incidents Technique                .54     .96
 56 +  Nominal Group Process                       .53     .93
 57 +  Stake Model (Evaluation)                    .49     .88
 58 +  Force-Field Analysis                        .47     .39
 59 +  Gantt Chart                                 .45     .93
 60 +  Mathetics                                   .39     .34

(* Significant competency among surveyed developers)
(+ Unfamiliar to surveyed developers)

Question Three: What is the Perceived Relevancy of Each Technique as Viewed by the Developers?

Once again an S.P.S.S. analysis was utilized, with the data being acquired from the responses to questionnaire cell "C" (VALUE TO INSTRUCTIONAL DEVELOPMENT). As in Questions One and Two, the results of the Batch System run were ordered according to the mean of the techniques, and it was revealed that the technique felt to be most valuable by the sample population was FEEDBACK, with a mean of 2.11, while the least valuable was CARD SORT, with a mean of 0.36. Complete results for this question may be found in Table 4.3.

Again, it was determined in Chapter III that a technique would be classified as valuable to Instructional Development if it fell within the "HI" (3.00 - 2.26) or "MEDIUM" (2.25 - 1.51) ranges of Figure 3.1. Accordingly, this study illustrated (see Table 4.3) that the top ranked twenty (20) techniques would be classified as being valuable to Instructional Development (i.e., FEEDBACK in first place to SUMMATIVE EVALUATION in twentieth place). The remaining 40 techniques would not therefore fall within the "valuable" classification in terms of this study.

TABLE 4.3

VALUE TO INSTRUCTIONAL DEVELOPMENT

RANK   TECHNIQUE                                  MEAN    S.D.
  1 *  Feedback                                   2.11    1.20
  2 *  Long-Range Planning                        1.98    1.14
  3 *  Needs Assessment                           1.97    1.18
  4 *  Field Test                                 1.96    1.23
  5 *  Brainstorming                              1.92    1.01
  6 *  Multi-Image/Multi-Media Pres.              1.90     .99
  7 *  Story Boarding                             1.78    1.19
  8 *  Computer Assisted Instruct.                1.77    1.16
  9 *  Flowcharting                               1.75    1.10
 10 *  Literature Search                          1.71    1.20
 11 *  Sequencing of Objectives                   1.71    1.26
 12 *  Formative Evaluation                       1.69    1.29
 13 *  Questionnaire                              1.63    1.12
 14 *  Bloom's Taxonomy                           1.60    1.20
 15 *  Content Analysis                           1.60    1.24
 16 *  Management by Objectives                   1.59    1.12
 17 *  Computer Search                            1.58    1.18
 18 *  Criterion Referenced Meas.                 1.56    1.23
 19 *  Task Analysis (Task Desc.)                 1.55    1.28
 20 *  Summative Evaluation                       1.51    1.24
 21    Interviewing Users                         1.43    1.29
 22    Case Studies                               1.42    1.18
 23    Appraisal Interview                        1.41    1.24
 24    Programmed Instruction                     1.39    1.00
 25    Micro Teaching                             1.36    1.16
 26    Checklists                                 1.35    1.05
 27    Discovery Technique                        1.33    1.13
 28    Simulation (Gaming)                        1.31    1.17
 29    Standardized Tests                         1.30    1.06
 30    Role Playing                               1.27    1.08
 31    Cost-Benefit Analysis                      1.23    1.15
 32    Learner Verification & Revis.              1.19    1.28
 33    Behaviour Modelling                        1.16    1.14
 34    Authoritative Opinion                      1.12    1.14
 35    Gagne's Taxonomy                           1.06    1.16
 36    Contract Plan                              1.05    1.07
 37    Prog. Eval. Review Technique               1.05    1.29
 38    Program Plan. Budget. System               1.00    1.12
 39    Likert Scale                                .95    1.15
 40    Technical Conference                        .94    1.08
 41    Linear Programming                          .85     .99
 42    Cognitive Mapping                           .84    1.13
 43    Critical Path Method (CPM)                  .79    1.04
 44    Observation Interview
       (e.g. Time-Motion Studies)                  .70     .90
 45    Krathwohl's Taxonomy                        .65    1.03
 46    Discrepancy Evaluation                      .63    1.01
 47    Function Analysis                           .62     .97
 48    Delphi Technique                            .62     .92
 49    Critical Incidents Technique                .54     .95
 50    Information Mapping                         .53     .93
 51    Shaping                                     .53     .93
 52    Decision Tables                             .53     .86
 53    Instructional Analysis Kit                  .52     .97
 54    In-Basket Technique                         .49     .78
 55    Nominal Group Process                       .48     .90
 56    Stake Model (Evaluation)                    .43     .85
 57    Gantt Chart                                 .38     .82
 58    Mathetics                                   .38     .81
 59    Force-Field Analysis                        .37     .7
 60    Card Sort                                   .36     .71

(* Valuable to Instructional Development)

Question Four: How Many of the Techniques are Unfamiliar to the Developers?

In order to respond to this question, the researcher consulted Table 4.2 and noted those techniques that received the lowest mean scores for COMPETENCY LEVEL. Specifically, those techniques that received mean scores within the "NIL" (0.75 - 0.00) range of Figure 3.1 would be classified as unfamiliar to the developers of the survey population. The rationale for this is found within the instructions of question 8 of the survey questionnaire, which states: "Please Note: If you are NOT FAMILIAR with a technique, please check the Nil box in category A and go to the next technique". Accordingly, this study determined (see Table 4.2) that the techniques with mean scores of 0.75 or below were those ranked from forty-seventh place (DELPHI) to sixtieth place (MATHETICS). Therefore the bottom ranked fourteen techniques (see Table 4.2) would be classified as unfamiliar to the developers surveyed in this study.

Question Five: Which of These Techniques are Currently Being Taught by Instructional Development Programs and Teacher Education Programs in Canadian Graduate and Undergraduate Institutions of Learning?

To address this question, another S.P.S.S. batch run was performed in order to analyse the results of questionnaire cell "D" (DEGREE TO WHICH INSTITUTION TEACHES).
The results may be found in Table 4.4 and reveal that the technique which appears to be taught to the greatest degree at the sample developers' institutions was MULTI-IMAGE/MULTI-MEDIA PRESENTATION, with a mean score of 0.63, while FORCE-FIELD ANALYSIS, with a mean of 0.04, appears to be taught least. However, as was outlined in Chapter III with regard to this question, 0.75 was used as the minimum mean score of significance (which would still average a rating of "NOT APPLICABLE" in cell "D" according to Figure 3.1). Thus it is immediately apparent that all of the techniques fall below the significance level. Hence, it may be reasoned that most techniques are not presently being taught in a formal manner at the institutions employing the members of the survey population. This consideration will be dwelt upon to a greater extent in the summary and findings section of Chapter V.

TABLE 4.4

DEGREE TO WHICH INSTITUTION TEACHES

RANK   TECHNIQUE                                  MEAN    S.D.
  1    Multi-Image/Multi-Media Pres.               .63    1.06
  2    Formative Evaluation                        .63    1.13
  3    Feedback                                    .60    1.11
  4    Summative Evaluation                        .56    1.05
  5    Literature Search                           .54    1.04
  6    Bloom's Taxonomy                            .53     .96
  7    Standardized Tests                          .52     .96
  8    Computer Assisted Instruct.                 .51     .95
  9    Criterion Refer. Measurement                .50     .97
 10    Story Boarding                              .49     .96
 11    Task Analysis (Task Desc.)                  .49     .93
 12    Needs Assessment                            .48     .88
 13    Questionnaire                               .46     .92
       Sequencing of Objectives                    .46     .92
 15    Long-Range Planning                         .46     .89
 16    Field Test                                  .45     .94
 17    Micro Teaching                              .43     .89
 18    Programmed Instruction                      .43     .85
       Simulation (Gaming)                         .43     .85
 20    Brainstorming                               .41     .79
 21    Management by Objectives                    .38     .76
 22    Discovery Technique                         .37     .81
 23    Flowcharting                                .37     .77
       Content Analysis                            .37     .77
 25    Role Playing                                .37     .75
 26    Interviewing Users                          .36     .81
 27    Case Studies                                .36     .7
 28    Computer Search                             .35     .78
 29    Learner Verification & Revision             .34     .83
 30    Gagne's Taxonomy                            .33     .74
 31    Behaviour Modelling                         .32     .71
 32    Likert Scale                                .31     .77
 33    Checklists                                  .29     .68
 34    Linear Programming                          .28     .71
 35    Appraisal Interview                         .28     .65
 36    Krathwohl's Taxonomy                        .26     .73
 37    Authoritative Opinion                       .26     .69
 38    Program Eval. Review Technique              .26     .68
 39    Contract Plan                               .26     .61
 40    Cost-Benefit Analysis                       .21     .57
 41    Cognitive Mapping                           .18     .56
 42    Critical Path Method (CPM)                  .17     .50
 43    Delphi Technique                            .16     .48
 44    Instructional Analysis Kit                  .15     .57
 45    Information Mapping                         .14     .50
 46    Program Plan. Budget. System                .13     .46
 47    Stake Model (Evaluation)                    .13     .45
       Discrepancy Evaluation                      .13     .45
 49    Technical Conference                        .13     .43
 50    Critical Incidents Technique                .13     .41
 51    Decision Tables                             .12     .46
 52    Shaping                                     .12     .44
 53    In-Basket Technique                         .11     .39
 54    Function Analysis                           .10     .4
 55    Observation Interview
       (e.g. Time-Motion Studies)                  .09     .32
 56    Mathetics                                   .08     .38
 57    Nominal Group Process                       .08     .36
 58    Gantt Chart                                 .07     .29
 59    Card Sort                                   .06     .24
 60    Force-Field Analysis                        .04     .1

Question Six: Is the Number of Years of Teaching Experience Related to the Use of Techniques? Which Techniques?

In order to respond to this question, a PEARSON CORRELATION was performed using the data generated in Table 4.1 (LEVEL OF USE) and the number of years of teaching or educational work experience, as per question 2 of the survey instrument questionnaire.
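The Question 6 computation pairs each respondent's years of experience with his or her Level of Use score for a given technique. A minimal Pearson's r sketch follows (the data shown are illustrative; the study's actual run used SPSS):

```python
from math import sqrt

# Pearson's r between respondents' years of teaching experience and their
# Level of Use scores for one technique; coefficients of .1900 or greater
# were taken to indicate a relationship (Question 6).
def pearson_r(xs, ys):
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

def related_to_experience(years, use_by_technique, threshold=0.19):
    """use_by_technique: {technique: [Level of Use score per respondent]}."""
    return [t for t, use in use_by_technique.items()
            if pearson_r(years, use) >= threshold]
```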
The correlation data provided the researcher with a coefficient and significance level for each of the 60 techniques under consideration, illustrating whether there was a relationship between the number of years of professional work experience of the surveyed developers and the use of each of the techniques. Upon analysing the results of the computer run of the Pearson Correlation, it was revealed that very few of the field techniques exhibited a relationship between use and user work experience. Employing .1900 as the level of coefficient significance, it was determined that only five (5) techniques exhibited the above-mentioned relationship. According to Table 4.5, the only techniques revealing a relationship were DELPHI TECHNIQUE (.2270), INSTRUCTIONAL ANALYSIS KIT (.1934), MICRO TEACHING (.2107), PROGRAM PLANNING BUDGETING SYSTEM (.2101), and ROLE PLAYING (.2181). The remaining fifty-five techniques did not, according to the statistical analysis of this study, exhibit a significant level of correlation between the number of years of teaching experience of the surveyed developers and the level of use of technique.

TABLE 4.5
PEARSON CORRELATION BETWEEN USE OF TECHNIQUE AND USER TEACHING EXPERIENCE

TECHNIQUE                               COEFFICIENT   SIGNIFICANCE
Appraisal Interview                        .0566          .277
Authoritative Opinion                      .1443          .065
Behaviour Modelling                        .0584          .270
Bloom's Taxonomy                           .1027          .141
Brainstorming                              .0124          .448
Card Sort                                  .2300          .007
Case Studies                               .0532          .289
Checklists                                 .0876          .179
Cognitive Mapping                          .0468          .312
Computer Assisted Instruct.                .1481          .060
Computer Search                            .0144          .440
Content Analysis                           .0736          .220
Contract Plan                              .1183          .107
Cost-Benefit Analysis                      .1011          .144
Criterion Referenced Measurement           .0289          .381
Critical Incidents Technique               .1083          .128
Critical Path Method (CPM)                 .1287          .088
Decision Tables                            .064?          .249
Delphi Technique                           .2270 *        .008
Discovery Technique                        .0823          .194
Discrepancy Evaluation                     .1436          .065
Feedback                                   .0142          .441
Field Test                                 .0623          .257
Flowcharting                               .0222          .408
Force-Field Analysis                       .1631          .043
Formative Evaluation                       .0297          .378
Function Analysis                          .0858          .184
Gagne's Taxonomy                           .0163          .432
Gannt Chart                                .1191          .105
In-Basket Technique                        .1432          .066
Information Mapping                        .0372          .348
Instructional Analysis Kit                 .1934 *        .021
Interviewing Users                         .0696          .233
Krathwohl's Taxonomy                       .0022          .491
Learner Verification & Revision            .0210          .413
Likert Scale                               .0118          .451
Linear Programming                         .1499          .057
Literature Search                          .0043          .482
Long-Range Planning                        .1258          .093
Management by Objectives                   .0212          .412
Mathetics                                  .0294          .379
Micro Teaching                             .2107 *        .013
Multi-Image/Multi-Media Present.           .0237          .402
Needs Assessment                           .1139          .116
Nominal Group Process                      .1673          .039
Observation Interview
  (eg. Time-Motion Studies)                .0405          .336
Programmed Instruction                     .1239          .097
Program Evaluation Review Tech.            .1344          .079
Program Plan. Budget. System               .2101 *        .013
Questionnaire                              .1217          .410
Role Playing                               .2181 *        .010
Sequencing Objectives                      .0641          .251
Shaping                                    .0343          .360
Simulation (Gaming)                        .0285          .383
Stake Model (Evaluation)                   .0951          .159
Standardized Tests                         .0934          .164
Story Boarding                             .0446          .320
Summative Evaluation                       .0117          .451
Task Analysis (Task Desc.)                 .0870          .181
Technical Conference                       .0359          .353

(* Significant correlation between Years Teaching Experience and Technique Use. A "?" marks a digit illegible in the scan.)

Question Seven: Are the Respective Employment Areas of the Surveyed Developers Related to the Level of Technique Use?
In order to ascertain whether or not the respective job responsibilities or Employment Areas of the surveyed developers are related to the level of technique use, a one-way analysis of variance (ANOVA) was performed between the Level of Use (cell "B" on the questionnaire) and the Title or Present Job Responsibility (as per questionnaire question no. 6). The results of the computer analysis may be found in Table 4.6(A), which presents data on the respective means, standard deviations, and standard errors. It was determined that there is no statistical significance at the .05 level (F(9, 99) = .932, p = .5009).

However, it was reasoned that perhaps there were too many Job Responsibility categories to provide a significant relationship. Therefore, in order to test the difference among the means, the Job Responsibility responses were collapsed into 5 categories or groups, as shown in Figure 4.1 which follows.

FIGURE 4.1

New Group   Title                          Old Group(s)
    1       Administrators                 1
    2       University & College Instr.    2, 10
    3       Teachers / Consultants         3, 4, 5, 8, 9
    4       Support Staff                  6, 7
    5       Others                         11

As illustrated in Table 4.6(B), there is a definite trend in the relationship between the Level of Use and the Present Job Responsibilities, which follows the above ordered categories; i.e. the Administrators category has a larger mean than the University and College Instructors category, which in turn has a larger mean than the Teachers and Consultants category, which in turn has a larger mean than the Support Staff category of audiovisual technicians and librarians, etc. However, in spite of this trend, no two groups are significantly different at the .05 level (F(4, 104) = 1.350, p = .2567). There would be a 26% chance of error in suggesting that a significant difference existed.
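The one-way ANOVA applied here reduces to a ratio of the between-group mean square to the within-group mean square. A minimal sketch (the group scores below are hypothetical, standing in for Level-of-Use scores in the collapsed job-responsibility categories):

```python
def one_way_anova_f(groups):
    """F ratio for a one-way ANOVA over a list of score groups."""
    k = len(groups)                        # number of groups
    n = sum(len(g) for g in groups)        # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, means) for x in g)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical Level-of-Use scores for two collapsed categories
print(one_way_anova_f([[1, 2, 3], [2, 3, 4]]))  # 1.5
```

Collapsing categories, as in Figure 4.1, simply merges the inner lists before the F ratio is computed.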
Hence, in summary, it must be asserted that although a trend does exist, there is no statistically significant evidence to suggest that the respective employment areas of the surveyed developers are related to the level of technique use, with regard to this study.

[TABLE 4.6(A) and TABLE 4.6(B): one-way ANOVA between Level of Use and Present Job Responsibility, with group counts, means, and standard deviations - tabular data not legible in scan.]

Question Eight: Is the Graduate and/or Post-Graduate Education of the Developers Related to the Level of Technique Use?

To determine whether or not a relationship existed between the level of technique use and the user's graduate and/or post-graduate education, an analysis of variance (one-way ANOVA) was performed between the Level of Use score (cell "B" on the questionnaire) and the Level of Highest Education (as per questionnaire question no. 3). It was concluded that no two groups are significantly different at the .05 level, as the F probability was p = .4144 (see Table 4.7). However, an interesting pattern did appear, as illustrated by Table 4.8, wherein one may see an increasing trend whereby respondents with a higher level of education have a greater mean score. Hence, it may be suggested that those members of the survey population with a Ph.D. make greater use of the field techniques than those with a masters degree, who in turn make more use of the techniques than those with a bachelors degree, and so on. However, it must be stressed that this is only a trend and that no statistical significance may be attributed to these results (F(5, 106) = 1.012, p = .4144), as there is a 41% chance that any decision based upon the statistics would be incorrect.
However, when comparing the Level of Highest Education with the users' reported Competency Level with regard to the field techniques, a definite relationship appeared. As illustrated in Table 4.9 and Table 4.10, it was statistically significant (F(5, 106) = 2.709, p = .0241) that those respondents with a higher education had a larger mean score in relationship to Competency Level. Further, there was a statistically significant relationship between Level of Highest Education and Degree to Which Institution Teaches the techniques. As illustrated in Table 4.11 and Table 4.12, it was statistically significant (F(5, 106) = 5.378, p = .0002) that the survey population members with a doctorate degree had a larger mean score than those with a bachelors or masters degree in relationship to the Degree to Which Institution Teaches; the relationship in the other three categories was not statistically significant. Finally, it must be noted as well that there was no statistically significant relationship between the Level of Highest Education and the users' perceived Value (of the techniques) to Instructional Development, as there was a 12% chance that any decision based upon the statistics would be incorrect (F(5, 106) = 1.797, p = .1196).

Hence, in conclusion to question 8, it must be decided that there was no significant relationship between the Graduate and/or Post-Graduate Education (Level of Highest Education) and the Level of Technique Use (although a trend did appear), nor between the Level of Highest Education and the Value to Instructional Development. There was, however, a significant relationship between the Level of Highest Education and the respondents' Competency Level, as well as between the Level of Highest Education and the Degree to Which the Institution Teaches the Techniques.
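The F probabilities quoted in this chapter are upper-tail areas of the F distribution; they can be reproduced from each F statistic and its degrees of freedom via the regularized incomplete beta function. A standard-library sketch (the continued-fraction evaluation follows the usual Numerical Recipes formulation):

```python
import math

def _betacf(a, b, x):
    # Continued fraction for the regularized incomplete beta function.
    MAXIT, EPS, FPMIN = 200, 3e-12, 1e-300
    qab, qap, qam = a + b, a + 1.0, a - 1.0
    c = 1.0
    d = 1.0 - qab * x / qap
    if abs(d) < FPMIN:
        d = FPMIN
    d = 1.0 / d
    h = d
    for m in range(1, MAXIT + 1):
        m2 = 2 * m
        aa = m * (b - m) * x / ((qam + m2) * (a + m2))
        d = 1.0 + aa * d
        if abs(d) < FPMIN: d = FPMIN
        c = 1.0 + aa / c
        if abs(c) < FPMIN: c = FPMIN
        d = 1.0 / d
        h *= d * c
        aa = -(a + m) * (qab + m) * x / ((a + m2) * (qap + m2))
        d = 1.0 + aa * d
        if abs(d) < FPMIN: d = FPMIN
        c = 1.0 + aa / c
        if abs(c) < FPMIN: c = FPMIN
        d = 1.0 / d
        delta = d * c
        h *= delta
        if abs(delta - 1.0) < EPS:
            break
    return h

def betainc(a, b, x):
    # Regularized incomplete beta function I_x(a, b).
    if x in (0.0, 1.0):
        return x
    front = math.exp(math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
                     + a * math.log(x) + b * math.log(1.0 - x))
    if x < (a + 1.0) / (a + b + 2.0):
        return front * _betacf(a, b, x) / a
    return 1.0 - front * _betacf(b, a, 1.0 - x) / b

def f_sf(f, d1, d2):
    # P(F(d1, d2) > f): upper-tail probability of the F distribution.
    return betainc(d2 / 2.0, d1 / 2.0, d2 / (d2 + d1 * f))

print(f_sf(1.012, 5, 106))  # approx .41
```

Evaluated at the statistics quoted above, this should reproduce the reported probabilities to rounding, e.g. F(5, 106) = 2.709 giving p near .024.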
[TABLES 4.7 through 4.12: one-way ANOVA summaries and group mean scores comparing the Level of Highest Education categories (Ph.D./Ed.D.; M.A./M.Ed./M.Sc.; B.A./B.Ed./B.Sc.; Specialist; College/Certificate; Others) against Level of Use, Competency Level, and Degree to Which Institution Teaches - tabular data not legible in scan.]

Field Experts' Validation Package

[TECHNIQUE RESPONSE forms: for each technique in the study, the field expert recorded a Competency Level, a rating of Value to I.D., a Level of Use, and whether the technique is taught at the respondent's institution - form content not legible in scan.]
ALPHABETICAL LIST OF TECHNIQUES AND THEIR DEFINITIONS

Alexander's Method for Determining Components
- technique to find the right physical components of a physical structure such that each component can be altered independently to suit future changes in the environment.
(Jones, J.C., Design Methods, John Wiley and Sons Ltd., London, 1970.)

Analysis of Inter Connected Decision Areas (AIDA)
- technique that identifies and evaluates all the compatible sets of sub-solutions to a design problem by listing all the sets of options in each decision area that can be combined together without incompatibility and choosing the set that best satisfies a single quantifiable criterion of choice (eg. cost).
(Jones, J.C., Design Methods, John Wiley and Sons Ltd., London, 1970.)

Appraisal Interview
- a verbal communication between employee and management concerning the results of an employee appraisal, in order to encourage present behaviour, or to provide a warning for a behavioural change, or to simply provide information.
(Kay, E., French, J.R., and Meyer, H.J., "A Study of the Performance Appraisal Interview", New York, Management Development and Employee Relations Services, General Electric Co., March, 1962.
Maier, Norman R.F., "Three Types of Appraisal Interview", Personnel, March-April, 1958.)

Authoritative Opinion
- descriptive writing based upon the observations of experienced practitioners, or persons who have had direct contact with the environment they seek to describe or explain.
(Davis, R.C., The Fundamentals of Top Management, Harper and Row, 1951.
Fayol, H., Industrial and General Administration, International Management Institute, 1930.)

Behavior Modelling
- technique to enable managers to improve their managerial abilities by imitating "models" who have mastered the requisite skills.
(Bandura, A., Principles of Behavior Modification, New York, Holt, Rinehart and Winston, 1969.
Goldstein, A.P. and Sorcher, M., "Changing Managerial Behavior by Applied Learning Techniques", Training and Development Journal, 1973, 36-39.)

Behaviorally Anchored Rating Scale
- rating scale devised to keep a record of good or undesirable incidents occurring in an employee's work, and to minimize the amount of subjectivity required when rating individuals.
(Schwab, Heneman, and DeCotiis, "Behaviorally Anchored Rating Scales", pp. 549-551.)

Bloom's Taxonomy
- psychological model that describes the major categories within the Cognitive Domain, viz. Knowledge, Comprehension, Application, Analysis, Synthesis, and Evaluation. The taxonomy proceeds on the assumption that knowledge is ordered hierarchically, and it is assumed that the six main classes are sequential, moving from knowledge to evaluation.
(Gronlund, Norman E., Stating Behavioral Objectives for Classroom Instruction, MacMillan Co., 1970.
Hunt, D.E., and Sullivan, E.V., Between Psychology and Education, Dryden Press, Hinsdale, Ill., 1974.)
Brainstorming
- technique that enables a group of people to quickly produce many ideas without fear of criticism. Ideas are recorded first and evaluated afterwards.
(Babach, W.J., and Barkelew, A.H., The Babach-Barkelew Brainstorming Book, Synergy Group, Inc., Utica, Mich., 1976.
Havelock, R.G., The Change Agent's Guide to Innovation in Education, Ed. Tech. Publications, Englewood Cliffs, New Jersey, April, 1978.)

Broken Squares
- technique that enables a group to construct equally sized squares (one for each member) from packages of cut pieces, in order to analyze certain aspects of co-operation in solving a group problem as well as to sensitize group members to some of their own behaviors which contribute to or obstruct the solving of a problem.
(Pfeiffer, J. William and Jones, John E., A Handbook of Structured Experiences for Human Relations Training, Vol. 1, University Press, Iowa, 1969.)

Card Sort
- a pack of cards, containing goal statements on each card, is sorted into stacks that have been assigned value points, in order to determine a ranking of goals based upon importance and implementation priorities.
(Witkin, Belle Ruth, Educational Technology, November, 1977.)

Case Studies
- a technique involving a comprehensive study of an individual, institution, or situation; used to provide detailed information for purposes of appraisal and recommendations.
(Schatzman, L. and Strauss, A., Field Research, Prentice-Hall Inc., Englewood Cliffs, N.J., 1973.)

Checklists
- technique to enable designers to use knowledge of requirements that have been found to be relevant in similar situations, by first preparing a list of questions that were determined to be important in similar situations and next asking some or all of these questions about the design that is to be evaluated.
(Jones, J.C., Design Methods, John Wiley & Sons Ltd., London, 1970.)
Cognitive Mapping
- a systematic procedure for visually indicating how a person approaches "new" knowledge (cognitive information) in terms of perception, memory, thinking, and problem solving, based on previous knowledge or rules for acquiring "new" knowledge based on rules derived in learning "old" knowledge.
(Schulman, Lee S., Research in Education, Vol. 4, F.E. Peacock, 1976.
Anderson, Scarvia B., and others, Encyclopedia of Educational Evaluation, Jossey-Bass Inc., 1975.
Thorndike, Robert L. and Hagan, Elizabeth, Measurement and Evaluation, John Wiley and Sons, 1977.)

Compressed Speech
- methods developed for accelerating the recorded speed of speech so as to reduce the time spent listening to the spoken word without significant loss of comprehension.
(Barabasz, A.Y., "A Study of Recall and Retention of Accelerated Lecture Presentation", Journal of Communication, 1968, 18, 283-287.
Reid, R.R., "Grammatical Complexity and Comprehension of Compressed Speech", Journal of Communication, 1968, 18, 236-242.)

Computer Assisted Instruction
- instructional technique in which the computer contains a stored instructional program designed to inform, guide, control, and test the student until a prescribed level of proficiency is reached.
(Coulson, J.E., Programmed Learning and Computer-Based Instruction, New York, John Wiley & Sons Inc., 1962.
Poirot, J.L. and Groves, D.N., Computer Science For the Teacher, Sterling Swift Pub. Co., Manchaca, Texas, 1976.)

Computer Search (eg. ERIC)
- computerized technique that enables the researcher to search thousands of articles in a short period of time, by the use of key words or descriptors (eg. Literature Search).
(Thesaurus of ERIC Descriptors, MacMillan Information, New York, N.Y.)

Content Analysis
- a method of analysing communication for the purpose of measuring variables, by studying the communications that people have produced and asking questions of the communications.
(Kerlinger, F.N., Foundations of Behavioral Research, 1964.
Harder, M.D. (Ed.), Content Analysis As A Research Tool for Higher Education.)

Contextual Mapping (Time-Independent Contextual Mapping)
- a future forecasting technique based upon a graphic (flowchart) depiction of past trends and their inter-relationships.
(Forrester, Jay W., "Counterintuitive Behavior of Social Systems", Technology Review, January 1971.
Hencley, Stephen P. and Yates, Futurism in Education: Methodologies, McCutchan Pub. Corp., Berkeley, California, 1974.)

Contingency Management (see Least-Preferred Co-worker)

Contract Plan
- a written agreement between the student and teacher which lists a set of goals, skills, and assignments to be completed by the student within a reasonable time.
(Haddock, T., "Individual Instruction Through Student Contracts", Arizona Teacher, May 1967.)

Cost-Benefit Analysis
- a generic term for such techniques as ZERO-BASED BUDGETING, COST EFFECTIVENESS, COST EVALUATION, etc., which assist the decision-maker in making a comparison of alternative courses of action in terms of their costs and effectiveness in attaining some specific objectives.
(Forbes, R.R., "Cost-effectiveness Analysis: Primer and Guidelines", Educational Technology, 1972.
Prest, A.R. and Turvey, R., "Cost-benefit Analysis: A Survey", The Economic Journal, 1965, pp. 683-735.
Wilkinson, G.L., "Cost Evaluation of Instructional Strategies", Communication Review, 1973.)

Criterion Referenced Measurement
- tests constructed to yield measurements that are directly interpretable in terms of specified performance standards.
(Hambleton, R.K., and Gorth, W.P., "Criterion-Referenced Testing: Issues and Applications", University of Massachusetts Amherst, School of Education, Sept. 1971, ED 60025.
Jones, J.W., "A Study of the Congruency of Competencies and Criterion-Referenced Measures", May 1977, ED 0142575.)
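The contrast drawn in the definition above, interpreting each score against a fixed performance standard rather than against other examinees, can be sketched in a few lines (the examinee names, scores, and cutoff below are hypothetical):

```python
def mastery_decisions(raw_scores, standard):
    """Criterion-referenced interpretation: each examinee is judged
    against the fixed performance standard, not against the group."""
    return {examinee: score >= standard
            for examinee, score in raw_scores.items()}

scores = {"Examinee A": 14, "Examinee B": 9, "Examinee C": 12}
print(mastery_decisions(scores, standard=12))
# {'Examinee A': True, 'Examinee B': False, 'Examinee C': True}
```

A norm-referenced report would instead rank the three examinees against one another; here each decision depends only on the criterion.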
Critical Incidents Technique
- technique to acquire information on specific behavior patterns of a subject by interviewing the subject's work supervisor in order to ascertain behavior patterns relating to the skills being studied.
(Borg, W.R. and Gall, Meredith D., Educational Research: An Introduction, David McKay Co. Inc., New York, 1976, pp. 249-251.
Flanagan, J.C., "The Critical Incident Technique", Psychological Bulletin, 51: 327-358, 1954.)

Critical Path Method (CPM)
- technique to aid researchers with the planning, scheduling, expediting, and progress-monitoring tasks involved in a specific project, by diagrammatically plotting work activities and events in sequence and determining the longest time path (the critical path) needed to complete the project.
(Collins, F.T., Network Planning and Critical Path Scheduling, KnowHow Publications, 1965.)

Dale's Cone
- pyramid-shaped model wherein learning is depicted as moving from the concrete (base of the pyramid) to the abstract (apex of the pyramid).

Decision Tables
- alternative to a flowchart for presenting the logic of a problem, wherein the table is a set of decision rules in which each rule identifies a set of conditions with its set of actions; it is divided vertically by condition statements and action statements and divided horizontally by stubs and entries.
(Hussain, K.M., Development of Information Systems for Education, Englewood Cliffs, N.J., Prentice Hall Inc., 1973.)

Delphi Technique
- a futurist research method which utilizes the systematic solicitation and combination of informed judgements from a group of experts on questions or issues relevant to the future.
(Helmer, O., Analysis of the Future: The Delphi Method, Santa Monica, Ca., The Rand Corp., 1967.
Helmer, O. and Dalkey, N.C., "An Experimental Application of the Delphi Method to the Use of Experts", Management Science, IX (April 1963), pp. 458-467.
Weaver, W.T., "The Delphi Forecasting Method", Phi Delta Kappan, January 1971.)

Discovery Technique (Discovery Learning Model)
- learning model by which the student problem-solves through discovering a new method rather than relying upon prior knowledge and procedures.
(Travers, R.M.W., Second Handbook of Research on Teaching, American Educational Research Assoc., Chicago, 1973.
Bruner, J.S., Toward a Theory of Instruction, Belknap Press, 1966.)

Discrepancy Evaluation
- method to identify differences between two or more elements of an educational or training program in order to determine how they might be corrected or reconciled; differences are examined on the basis of logical rationale or statistical criteria.
(Stake, R.E., "Objectives, Priorities, and Other Judgment Data", Review of Educational Research, Vol. 40, 1970.)

Distance Teaching and Learning
- teaching by correspondence through the use of the printed medium, radio, television, computer, recordings, tapes, or a combination of the above.
(Good, H.M., and Trotter, B., "Frontiers in Course Development: Systems and Collaboration", University Teaching, Council of Ontario Universities, Jacksons Point.)

Dynamic Programming
- technique concerned with multi-stage decision processes and problems which can be interpreted as such, based upon Bellman's Principle of Optimality.
(White, D.J., Dynamic Programming, Oliver and Boyd, Edinburgh, 1969.
Norman, J.M., Heuristic Procedures in Dynamic Programming.)

Ethnography
- type of research that attempts to present a picture of the way of life of some group of people in terms of both process and product, in an analytical and multi-layered fashion.
(Pelto, Pertti J. and Pelto, Gretel H., Anthropological Research, 2nd Edition, Cambridge University Press, Cambridge, 1978.
Schatzman, L. and Strauss, A.L., Field Research, Prentice-Hall Inc., Englewood Cliffs, N.J., 1973.)
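Several techniques in this list are algorithmic enough to sketch directly; the Critical Path Method defined above, for example, amounts to finding the longest path through an activity network. A minimal sketch (the activity names and durations below are hypothetical):

```python
def critical_path(activities):
    """activities: {name: (duration, [predecessors])} for an acyclic
    network. Returns the project duration and one critical path."""
    finish = {}     # earliest finish time per activity
    best_pred = {}  # predecessor on the longest path into each activity

    def earliest_finish(a):
        if a in finish:
            return finish[a]
        duration, preds = activities[a]
        start, pick = 0, None
        for p in preds:
            ef = earliest_finish(p)
            if ef > start:
                start, pick = ef, p
        finish[a] = start + duration
        best_pred[a] = pick
        return finish[a]

    for a in activities:
        earliest_finish(a)
    end = max(finish, key=finish.get)
    path, node = [], end
    while node is not None:
        path.append(node)
        node = best_pred[node]
    return finish[end], list(reversed(path))

# Hypothetical activity network (durations in days)
net = {
    "design":  (3, []),
    "scripts": (2, ["design"]),
    "visuals": (5, ["design"]),
    "tryout":  (4, ["scripts", "visuals"]),
}
print(critical_path(net))  # (12, ['design', 'visuals', 'tryout'])
```

Any delay along the returned path delays the whole project, which is what makes it the critical path.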
Fault Tree Analysis
- operations research method for predicting the most probable ways by which a system or part of it might fail, through the use of a logic network of events combined with a systematic method for qualitative and quantitative analysis.
(Ericson, C., System Safety Technology - Fault Tree Analysis, The Boeing Co., Seattle, Wash., Report #02-113-072-2, 1970.
Witkin, B.R., and Stephens, K.G., A Fault Tree Analysis of Organizational Communication Systems, Western Speech Communication Association, Honolulu, Hawaii, November 1972.
Witkin, B.R. and Stephens, K.G., Fault Tree Analysis: A Research Tool for Educational Planning, Technical Report #1, Alameda County Pace Center, Hayward, California, October 1968.)

Feedback
- generic term that encompasses a number of techniques (including programmed texts, pull-tab response cards, Latent Image, etc.) which give the learner an immediate response as to the correctness of his answers. It may also refer to data collected by researchers for purposes of evaluation.
(Glaser, R. and Cooley, W.W., "Instrumentation for Teaching and Instructional Management", Second Handbook of Research on Teaching, R. Travers (Ed.), Rand McNally College Pub. Co., Chicago, 1973.)

Field Test
- the assessment of a near-final model in an appropriate situation, according to specified criteria, for the purpose of determining what modifications of structure and performance are necessary (AECT).
(Klausmeier, H., "Research & Development toward the Improvement of Education", Journal of Experimental Education, 37 (1968).
"The Public Interest vis a vis Educational Research & Development", Journal of Research & Development in Education, 2 (1969).)
FIRO-B (Fundamental Interpersonal Relations Orientation-Behavior)
- a fifty-four statement scale which produces 6 scores (nine items in each of the 6 scores), measuring the expressed orientation of the subject: the extent to which one joins and includes others, controls and leads others, and is friendly and personal with others.
(Pfeiffer, J.W., Heslin, R., Jones, J.E., Instrumentation in Human Relationships Training, La Jolla, California, University Press Assoc., 1976.
Schutz, W.C., FIRO: A Three-Dimensional Theory of Interpersonal Behavior, Rinehart & Winston, New York, 1958.)

Flowcharting
- graphic representation for the definition, analysis, or solution of a problem, in which symbols are used to represent operations, data, flow, equipment, etc.
(Chapin, N., "Flowcharting with ANSI Standards: A Tutorial", Computing Surveys, II, June 1970.
Enrick, N.L., Effective Graphic Communication, New York, Auerbach Publishers, 1972.
Schriber, T.J., Fundamentals of Flowcharting, New York, J. Wiley and Sons, 1969.)

Force-Field Analysis
- graphic method of analysing the forces providing thrust towards or facilitating change, and the forces hindering change in a particular situation.
(Lewin, K., "Frontiers in Group Dynamics: Concept, Method and Reality in Social Science", Human Relations, Vol. 1, No. 1, June, 1947.
Giammato, N.C., "Suggested Activities for Learning About Role Behaviors, Problem Solving and Force Field Techniques", Northwest Regional Education Laboratory, 1969, ED030160.)

Formative Evaluation
- attempts to collect appropriate evidence during the construction and trying-out of a new curriculum, etc., in such a way that revision of it can be based on this evidence; evaluation of instructional programs while they are still in some stage of development.
(Anderson, S.B., Ball, S., and Murphy, R.T., Encyclopedia of Educational Evaluation, San Francisco, Jossey-Bass Pub., 1975.
Bloom, B.S., Hastings, J.T.,
and Madaus, G.F., Handbook on Formative and Summative Evaluation of Student Learning, New York, McGraw-Hill, 1971.)

Function Analysis
- in the Roger Kaufman Model for Educational Systems Planning, the Function Analysis stage is the process for determining requirements and subfunctions for accomplishing all of the elements stated in the objectives and problem identification stage; concerned with identifying the "whats" that have to be accomplished rather than the "hows".
(Kaufman, R.A., Educational System Planning, Prentice-Hall Inc., Englewood Cliffs, N.J., 1972.)

Futures Wheel
- futurist technique that graphically helps the user to visualize the impact of a single forecast in one or more areas through the build-up of interconnecting circles, each containing a related forecast.
(Gunn, Jerry & Guy, "Easy Ways to Help Children Think About the Future", The Futurist, August 1974.)

Gagne's Taxonomy
- cognitive learning theory described as a hierarchy of learning processes that become increasingly complex, and which places more emphasis upon learning and less on the developmental aspect.
(Hunt, D.E., and Sullivan, E.V., Between Psychology and Education, Dryden Press, Hinsdale, Illinois, 1974.)

Galileo System
- series of procedures for making a mental map of an audience by identifying the main concepts it uses to understand and define a topic; Galileo measures the beliefs and concepts that an audience holds concerning the topic under study.
(Woelfel, J., Galileo IV: A Program of Metric Multidimensional Scaling, Honolulu, Hawaii, East-West Communication Institute, 1977.
Gilham, J. and Woelfel, J., "The Galileo System: Preliminary Evidence for Precision, Stability and Equivalence to Traditional Measures", Human Communication Research, Fall, 1976.)
Gaming (Simulation)
- technique that provides a context for the acquisition of abstract conceptual tools which allow a participant to view new and emerging situations; elements common to most simulation techniques are role playing, a problem-defining scenario, operating procedures, and an accounting system.
(Coombs, D.H., "Is There a Future for Simulation and Gaming Research?", Educational Communication and Technology Journal, Vol. 26, No. 2, Summer, 1978.
Spannaus, T.W., "What is Simulation?", Audio Visual Instruction, May, 1978.)

Gannt Chart
- a means of graphically illustrating a production schedule; the horizontal axis is used to depict time, with activities, items, or personnel listed vertically in the left-hand column.
(Dessler, Gary, Management Fundamentals: A Framework, Reston, Va., Reston, 1977.
Longenecker, J.G., Essentials of Management: A Behavioral Approach, Columbus, Ohio, Charles Merrill Pub. Co., 1977.)

Immediate Feedback (See Feedback)

In-Basket Technique
- technique to analyze a participant's decision-making abilities, managerial, and problem-solving skills, whereby she/he receives a "situation" set up in a memo, to which a considered response is compared with answers suggested by field experts.
(French, W., The Personnel Management Process, 4th ed., Boston, Houghton Mifflin, 1978.
Ward, L.B., "The Use of Business Problems", Management Record, 22: 30-33, June 1960.)

Information Mapping
- system of graphically presenting information on a series of pages in the form of COBOL: each page is broken with horizontal lines dividing "chunks" of information into Definitions, Examples, Rules, etc.
(Glaser, R., Teaching Research and Education, New York, Wiley, 1965.
Horn, R.E., "Information Mapping: New Tool to Overcome the Paper Mountain", Educational Technology, Vol. 15, No. 5, May 1974, p.
5-8.)

Interaction Matrix - technique to permit a systematic search for connections between elements within a problem, whereby a matrix is set up in which every element of the design problem can be compared with every other on a three-point scale (0-2) of connections. (Jones, J.C., Design Methods, John Wiley and Sons Ltd., London, 1970.)

Interaction Net - a graphic display of points linked by lines of connection which illustrates the patterns of connections between elements within a design problem, as discovered in the use of an Interaction Matrix. (Jones, J.C., Design Methods, John Wiley and Sons Ltd., London, 1970.)

Interactive Television - system to communicate over a distance on a face-to-face basis by means of two-way audio and video signals. (Hayes, J., "Interactive Communication is Goal of C.C.T.V. Network", Biomedical Communication, 1974. Wittson, L.L. and Benschoter, R., "Two-way Television: Helping the Medical Center Reach Out", American Journal of Psychiatry, 129:5, pp. 624-77.)

Interface Analysis - a method of analyzing a system by graphically depicting, and hence analyzing, each of the sub-systems that interface or adjoin one another. (Kindred, A.R., Data Systems and Management, Prentice-Hall Inc., Englewood Cliffs, N.J., 1973.)

Interpersonal Recall (IPR) - technique to help developers expand their capacities of interpersonal communication and awareness of their own interpersonal styles and behaviors. (Kagan, N. and Burke, F., "Influencing Human Interaction", Student Manual, MSU, ED 484, 1976.)

Involvement Matrix - technique that serves as a prelude to effective systems design, with provision for actions to solve procedural problems; developers and/or organizations are represented on one axis while the general tasks are represented on the other, and the resulting cells contain specific decisions regarding the level of responsibility and delineate specific task assignments. (Springer, H.C.
and Giles, F.T., "The Involvement Matrix: A Prelude to Effective Systems Design", Educational Technology, Vol. 12, No. 8, August 1972, pp. 49-51.)

Interviewing Users - technique to elicit information that is known only to users of a product or system in question. (Jones, J.C., Design Methods, J. Wiley & Sons Ltd., London, 1970.)

Johnson-Neyman Technique - a technique that can be used to identify the subgroups for which differences will be significant, by determining the permissible values on extraneous variables leading to significant differences on the criterion variable. (Johnson, P.O. and Fay, L., "The Johnson-Neyman Technique, Its Theory and Applications", Psychometrika, 1950, Vol. 15. Abelson, R.F., "A Note on the Neyman-Johnson Technique", Psychometrika, 1953, Vol. 18, pp. 213-217.)

Krathwohl's Taxonomy - psychological model that describes the major categories within the Affective Domain, viz. Receiving, Responding, Valuing, Organization, and Characterization by a Value or Value Complex. (Gronlund, N.E., Stating Behavioral Objectives for Classroom Instruction, MacMillan Co., 1970.)

Latent Image - technique that uses chemically treated response sheets to provide immediate feedback to subjects upon answering test questions. (Nil.)

Learner Verification and Revision (LVR) - involves the concepts of evaluation, revision, and decision to implement, developed by Kenneth Komoski and intended for use as an index of quality for educational materials; involves the tryout of a prototype educational product on the target audience to determine its weaknesses prior to revision. (Kandaswamy, S. et al., "Learner Verification and Revision: An Experimental Comparison of Two Methods", A.V. Communication Review, Fall 1976. Stolovitch, H.D., "The Intermediate Technology of Learner Verification and Revision", Educational Technology, February 1978, p.
13-17.)

Least-Preferred Coworker - technique to determine leadership style through the use of a set of sixteen adjective pairs on bipolar scales on a questionnaire (semantic differential); a favourable description (high LPC) of the least preferred coworker is assumed to indicate a relationship-oriented leadership style, whereas an unfavourable description (low LPC) is assumed to indicate a task-oriented leadership style. (Fiedler, F.E., "Personality and Situational Determinants of Leader Behaviour", Department of Psychology, University of Washington, Technical Report 71-18, June, 1971. French, W., The Personnel Management Process, 4th ed., Boston, Houghton Mifflin, 1978.)

Likert Scale - technique to obtain summated ratings of information pertinent to affective variables, by responding to statements which are both favourable and unfavourable to the phenomenon under study; responses range on a scale of five (from "strongly agree" to "strongly disagree") and are then analyzed to determine which items discriminate best between the high-scoring individuals and the low-scoring individuals. (Phillips, Social Research, 1966. Stanley & Hopkins, Educational and Psychological Measurement and Evaluation.)

Linear Programming - program in which the sequence of information presented to the students is fixed, so that all students are given the same stimuli in exactly the same sequence, followed by testing, followed by new information; based upon the stimulus-response works of Pressey and Skinner. (Brown, J.W., Lewis, R.B., Harcleroad, F.F., AV Instruction: Media and Methods, New York, McGraw-Hill, 1969. Hartley, J., "Programmed Instruction 1954-1974: A Review", Programmed Learning and Educational Technology, July 1975.)

Literature Search - technique to find published information that can favourably influence the designers' output and that can be obtained without unacceptable cost and delay.
(Jones, J.C., Design Methods, John Wiley & Sons Ltd., London, 1970.)

Log Diary - technique to determine the activities and functions of a professional, whereby such are mapped on a form containing activities listed vertically on the left side and half-hour (or hour) intervals listed horizontally across the top. (Anderson, S.B., Ball, S., Murphy, R.T., and Assoc., Encyclopedia of Educational Evaluation, Jossey-Bass Pub., San Francisco, California, 1975.)

Long-Range Planning - methodology to develop an adaptive planning program consisting of "alternative future" general plans and derivative plans for the major components of the agency in question; methods range from establishing goals, through developing plans for each alternative future, to selecting one alternative future plan and developing monitoring and shifting procedures. (Chase, R.B. and Clark, D.C., "Long Range Planning in School Districts", Educational Technology, Vol. 4, 197 . Salmon, R.D., "Developing a Long Range Planning System for Higher Education", School and Community, May 1971.)

Management by Objectives (MBO) - process whereby the superior and subordinate managers of an organization jointly identify its common goals, define each individual's major areas of responsibility in terms of the results expected, and use these measures as guides for operating the unit and assessing the contributions of each of its members. (Hollman, R.W., "Applying MBO Research to Practice", Human Resources Management, Winter, 1976. Stein, D.I., "Objective Management Systems: Two to Five Years After Implementation", Personnel Journal, 54:525-84, October, 1975.)

Managerial Grid - technique devised by Blake and Mouton to describe managerial style and to predict interpersonal effectiveness and leadership skills based on a two-dimensional grid, where one dimension is concern for people and the other is concern for production or task orientation. (Bernardin, H.J.
and Alvares, K., "The Managerial Grid as a Predictor of Conflict Resolution and Managerial Effectiveness", Bowling Green State University. Blake, R.R. and Mouton, J.S., The Managerial Grid, Houston, Gulf Pub. Co., 1964.)

Mathetics - a training system providing a means to determine what to teach, a basis for determining strategy decisions, and a detailed procedure for constructing a lesson; those goals are attained through a series of ten steps which include occupational analysis, task selection, task analysis, population analysis, etc. (Gilbert, T.F., "Mathetics: II. The Description of Teaching Exercises", Journal of Mathetics, Vol. 1, April 1962. Gilbert, T.F., "Mathetics: The Technology of Education", Journal of Mathetics, Vol. 1, January, 1962.)

Matrix Sampling - a general statistical procedure of random sampling that increases efficiency by reducing the number of students involved in testing, wherein "K" test items are subdivided randomly into "t" subtests containing "k" items each, with a subtest administered to "n" examinees selected randomly from the population of "N" examinees. (Sirotnik, K., "An Introduction to Matrix Sampling for the Practitioner", Evaluation in Education: Current Applications, Berkeley, California, McCutchan Pub. Corp., 1974.)

Merit Rating Chart - method for determining employee progress and value to an organization, whereby the rater places a check mark on a form next to the word or phrase describing the degree of merit for each of several different traits, such as "quality of work", "quantity of work", "co-operation", and so forth; degrees of merit run from "inadequate" to "superior". (Miller, Richard V., "Merit Rating in Industry: A Survey of Current Practices and Problems", I.L.R. Research, 5:14, Fall, 1959. Tiffin, J., "6 Merit Rating Systems", Personnel Journal, 37:288, January 1959.)
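The Matrix Sampling plan above (K items split into t subtests of k items, each subtest given to n of N examinees) can be sketched as follows; the counts are invented for illustration, not taken from the study.

```python
import random

def matrix_sample(K, t, N, n, seed=1):
    """Split K test items into t random subtests of k = K // t items each,
    and assign each subtest to n examinees drawn from a population of N."""
    rng = random.Random(seed)
    items = list(range(K))
    rng.shuffle(items)
    k = K // t                                   # items per subtest
    subtests = [items[i * k:(i + 1) * k] for i in range(t)]
    examinees = rng.sample(range(N), t * n)      # no examinee tested twice
    assignments = {i: examinees[i * n:(i + 1) * n] for i in range(t)}
    return subtests, assignments

# Hypothetical figures: a 60-item test, 6 subtests, 500 students, 25 per subtest.
subtests, assignments = matrix_sample(K=60, t=6, N=500, n=25)
```

Each student thus answers only 10 of the 60 items, yet every item is administered to a random sample of the population.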
Micro Teaching - practice which allows pre-service or in-service teachers to develop or improve skills in applying a particular teaching technique, whereby a lesson is planned which concerns a single, unique topic to be presented to a small group of students in a small time frame. (Sadker, Myra and David, "Microteaching for Affective Skills", The Elementary School Journal, 1976.)

Monte Carlo Method of Analysis - futurist or prediction technique of using a computer and random numbers to simulate a real-world situation, such as studies of population growth or the evaluation of complicated integrals. (Cooley, W.W. and Lohnes, P.R., Introduction to Statistical Procedures with Computer Exercises, New York, Wiley, 1968. Hammersley, J.M. and Handscomb, D.C., Monte Carlo Methods, London, Methuen and Co. Ltd., 1965.)

Morphological Charts - technique to widen the area of search for solutions to a design problem by defining the functions that the design must be able to perform, listing on a chart a wide range of sub-solutions or alternative means of performing each function, and then selecting an acceptable set of sub-solutions, one for each function. (Jones, J.C., Design Methods, John Wiley & Sons Ltd., London, 1970.)

Multidimensional Scaling - aim is to develop procedures which will assign sets of numbers to various quantities of attributes so that the numbers directly reflect variations in the quantities of the attributes; produces a range of scores that have meaning with respect to each other's values or to some arbitrary or absolute value set accepted by the scale. (Shepard, R.N., Multidimensional Scaling: Theory and Applications in the Behavioral Sciences, New York, Seminar Press, 1972. Torgerson, W.S., Theory and Methods of Scaling, New York, Wiley and Sons, 1958.)

Multi-Image/Multi-Media Presentation - the integration of more than one medium in a complementary manner in a presentation or module of instruction.
(Wittich, W.A. and Schuller, C.F., Instructional Technology: Its Nature and Use, Harper & Row, New York, 1973.)

Needs Assessment (eg. PDK Model) - the process in which "real-world" data are collected from individuals and groups involved in a particular educational situation to determine the nature of the problem; to determine how the groups involved (learners, implementers, community) value what exists (the status quo), what should be (the ideal situation), and the discrepancy between what is and what should be; and to prioritize the problems and discrepancies. (Anderson, S.B., Ball, S., and Murphy, R.T., Encyclopedia of Educational Evaluation, San Francisco, Jossey-Bass Pub., 1975. Witkin, B.R., Educational Technology, November, 1977.)

Network Analysis - a specific process by which an existing communication network within an organization may be analyzed in terms of the flow of its essential and/or social information; under investigation are the network's groups, liaison personnel, isolates, bridge links, etc. (Havelock, R.G., The Change Agent's Guide to Innovation in Education, Educational Technology Pub., Englewood Cliffs, N.J., 1975.)

Nominal Group Process - method to generate and prioritize ideas regarding problem-solving, job performance improvement, etc., whereby each member of a study group generates ideas that are listed before the group, ranked, and valued (1-5), and finally prioritized. (Albanese, R., Managing: Toward Accountability for Performance, Homewood, Ill., Irwin, 1978. Delbecq, A.L., VandeVen, A.H., "Nominal Group Techniques for Involving Clients and Resource Experts in Program Planning", Academy of Management Proceedings, 1970.)

Observation Interview (e.g.
Time-Motion Studies) - method to define a task, analyze a job, or perform needs assessment or evaluation, whereby the investigator observes and questions an interviewee at the work site while the practitioner performs the activities under investigation. (Anderson, S.B., Ball, S., Murphy, R.T. and Associates, Encyclopedia of Educational Evaluation, Jossey-Bass Pub., San Francisco, California, 1975. Bergman, A.B., Dassel, S.W., and Wedgwood, H.J., "Time-motion Study of Practicing Pediatricians", Pediatrics, 38:254-263, 1966.)

Organization Chart - chart that graphically depicts the various departments, relationships, and lines of authority within an enterprise, including the major functions, channels of supervision, and relative authority and responsibility of each employee in a position of authority. (Dessler, G., Management Fundamentals: A Framework, Reston, Va., Reston, 1977. Lott, R.W., Basic Systems Analysis, San Francisco, Canfield, 1971.)

Pair-Associate Learning - may be used for instructional and measurement purposes to determine the learner's ability to associate pairs of sounds, words, pictures, or word/picture combinations, in order to investigate the meaningfulness, familiarization, or similarity of stimulus members and response members. (Battig, W., "Analysis of Paired-Associate Learning", in John Cook, Studies in Guided Learning. Gross, A., Paired-Associates Learning: The Role of Meaningfulness, Similarity, and Familiarization.)

Participative Management - leadership style which modifies patterns of supervision by encouraging workers to make decisions for themselves and to participate more in planning and policy-making functions; e.g. McGregor's Theory Y of leadership. (McFarland, D.T., Management: Principles and Practices, Second Edition, Macmillan Co., New York, 1964. Wortman, M.S., and Luthans, F., Emerging Concepts in Management, The Macmillan Co., London, 1969.)
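The tally at the heart of the Nominal Group Process described above (each member values the listed ideas 1-5, and the totals set the priorities) can be sketched as follows; the ideas and ratings are invented for illustration.

```python
# Hypothetical ratings: each inner list holds the 1-5 valuations that the
# four group members gave the corresponding idea.
ratings = {
    "revise the workbook":  [5, 4, 5, 3],
    "add a field test":     [4, 5, 5, 5],
    "shorten the lectures": [2, 3, 1, 2],
}

# Sum each idea's ratings, then rank ideas from highest to lowest total.
totals = {idea: sum(scores) for idea, scores in ratings.items()}
priorities = sorted(totals, key=totals.get, reverse=True)

for rank, idea in enumerate(priorities, start=1):
    print(rank, idea, totals[idea])
# → 1 add a field test 19
#   2 revise the workbook 17
#   3 shorten the lectures 8
```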
Path Analysis - technique for investigating the interrelationships of a set of variables within the context of a causal model, in which every included variable in a qualitative diagram (measured or hypothetical) is represented by arrows either as completely determined by certain others or as an ultimate factor. (Anderson, J.C., and Evans, F.E., "Causal Models in Educational Research: Recursive Models", American Educational Research Journal, 1974, Vol. 11, pp. 29-39. Wright, S., "Path Coefficients and Path Regressions: Alternative or Complementary Concepts?", Biometrics, 1960, Vol. 16, pp. 189-202.)

Personal Inverted Filing System (PIFS) - filing and retrieval system whereby documents are filed by accession numbers which are in turn entered on alphabetized Scan-Match Sheets containing appropriate descriptors. (Holmes, T.F., and Gentry, C.G., "A Foolproof Personal Filing System", Audiovisual Instruction, May 1979.)

Program Evaluation Review Technique (PERT) - a systematic timetabling and programming technique developed to measure, monitor, and control the development and progress of a project or program, wherein a network of events and work activities is identified, including the critical path, the sequence of activities which takes the longest time to complete. (Cook, D.L., Program Evaluation and Review Technique: Applications in Education, U.S.H.E.W., Office of Education, No. 17, 1966. Kohn, M., Dynamic Managing: Principles, Process, Practice, Menlo Park, California, Cummings, 1977, pp. 121-128. Lott, D.R., Basic Systems Analysis, New York, Canfield Press, 1971.)

Program Planning Budgeting System (PPBS) - a planning budgeting system in which resources are allocated according to specified project or program needs; it directly relates substantive planning to fiscal planning, requiring a detailed operational plan to which costs are then assigned on a programmatic, rather than on a line-item, basis.
(Kindred, A.R., Data Systems and Management, Prentice-Hall, Englewood Cliffs, N.J., 1973.)

Questionnaire - instrument for recording data ranging from sociological opinions and attitudes to psychological variables which include opinions, attitudes, and behavior; technique to obtain responses and reactions from a large number of individuals who could not be interviewed personally within a short period of time without considerable expense. (Bloom, B.S., Hastings, J.T., and Madaus, G.F., Handbook on Formative and Summative Evaluation of Student Learning, New York, McGraw-Hill, 1971. Kerlinger, F.N., Foundations of Behavioral Research, 2nd Edition, New York, Holt, Rinehart & Winston, 1973.)

Q-Sort - attitude measurement technique with scores based on self-reports; a personality inventory in which the subject sorts a considerable number of cards containing attitudinal statements into categories that represent the degrees to which the statements apply to him/her. (Caggiano, R., "The Relationship Between the Values and Attitudes of Elementary School Teachers and Pupils Regarding Pupil Behaviors", Graduate Research in Education and Related Disciplines, Vol. 6, No. 1, 1970. Kerlinger, F.N., "The Attitude Structure of the Individual: A Q-Study of the Educational Attitudes of Professors and Laymen", Genetic Psychological Monographs, No. 53, 1956.)

Relational Control Analysis - technique that combines the assumption that messages contain both report (content) and command (relational) aspects; it involves a coding technique that defines message sequences, indexes relational control, and maps transactional patterns as they unfold over time, to study the control dimension of a relationship. (Ericson, R.M. and Rogers, L.E., "New Procedures for Analyzing Relational Communication", Family Process, 12:245-267, 1973. Rogers, L.E.
and Farace, R.V., "Analysis of Relational Communication in Dyads: New Measurement Procedures", Human Communication Research, 222-239, 1975.)

Relevance Trees - normative forecasting by graphically illustrating the steps required to meet a predetermined goal; future needs and goals are determined, and then a hierarchy of events which must occur for the attainment of the goals is traced backwards to the present. (Esch, N.E., "Honeywell's PATTERN: Planning Assistance Through Technological Evaluation of Relevance Numbers", in A Guide to Practical Technological Forecasting, Englewood Cliffs, N.J., Prentice-Hall, 1973. Yates, J.R., & Hencley, S.P., Futurism in Education, McMillan Publishing, 1974.)

Role Playing - instructional technique involving a spontaneous portrayal (acting out) of a situation, condition, or circumstance by selected members of a learning group, who assume, either overtly or in imagination, the part or function of another or others. (Cooper, J., American Psychologist, August 1976. Keller, C.W., "Role Playing and Simulation in History Classes", The History Teacher, August 1975, Vol. VIII, No. 4.)

Semantic Differential - technique to determine the underlying meaning, as well as the value, of a given concept, which are measured in dimensions of Evaluation, Potency, and Activity; subjects are asked to rate a given concept by locating it between two polar adjectives which are divided by seven (7) units. (Phillips, Social Research, 1966. Stanley & Hopkins, Educational and Psychological Measurement and Evaluation.)

Sensitivity Training - technique involving a situational T-group experience that is designed to increase sensitivity toward the needs and attitudes of others and to increase one's individual self-awareness. (Delbecq, A.L., "Sensitivity Training", in Contemporary Readings in Organizational Behavior, ed. by F. Luthans, McGraw-Hill, 1972.
House, H.J., "T-group Education and Leadership Effectiveness: A Review of the Empiric Literature and a Critical Evaluation", Personnel Psychology, 20, 1967.)

Sequencing of Objectives - objectives are sequenced according to a number of different methods in order to facilitate learning. (Popham, W.J., and Baker, E.L., Systematic Instruction, Englewood Cliffs, N.J., Prentice-Hall, 1970. Posner, G.J. and Strike, K.A., "A Categorization Scheme for Principles of Sequencing Content", Review of Educational Research, Fall 1976, 46 (4), 665-689.)

Shaping - a method of successive approximation to teach humans and animals a new skill; it reinforces behaviors that approximate the final performance one wants the subject to perform, shaping the learner's behavior by rewarding him whenever he is successful in approximating the skill being taught. (Davis, Alexander, and Yelon, S., Learning System Design, Michigan State University, East Lansing, Michigan.)

Simulation (see Gaming)

Stake Model (Evaluation) - technique intended for the evaluation of educational programs by providing data for decision-making; it provides measurements on a matrix of the match between what an educator intends to do and what she/he actually accomplishes. (Anderson, S.B., Ball, S., and Murphy, R.T., Encyclopedia of Educational Evaluation, Jossey-Bass Pub., San Francisco, 1975.)

Systemic Testing - technique to identify actions that are capable of bringing about desired changes in situations that are too complicated to understand, through the selection of the most promising and least harmful tests of system constraints, as an avenue for planning and achieving the desired changes. (Jones, J.C., Design Methods, John Wiley and Sons, London, 1970.)
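The intents-versus-accomplishments matrix in the Stake Model entry above reduces to a row-by-row congruence check; the categories follow Stake's antecedents/transactions/outcomes framework, and the entries are invented for illustration.

```python
# What the educator intends, versus what is actually observed.
intents = {
    "antecedents":  "students have completed the prerequisite unit",
    "transactions": "teacher leads three discussion sessions",
    "outcomes":     "80% of students pass the criterion test",
}
observations = {
    "antecedents":  "students have completed the prerequisite unit",
    "transactions": "teacher led only one discussion session",
    "outcomes":     "80% of students pass the criterion test",
}

# Congruence: does the observation match the intent in each row of the matrix?
congruence = {row: intents[row] == observations[row] for row in intents}

for row, matched in congruence.items():
    print(row, "congruent" if matched else "discrepant")
```

Discrepant rows are the data the model hands to the decision-maker.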
System Transformation - method of transforming an unsatisfactory system so as to remove its inherent faults, by finding a sequence of changes (a transformation route or evolutionary pathway) that would allow the existing components to evolve into the new ones. (Jones, J.C., Design Methods, John Wiley & Sons, London, 1970.)

Task Analysis (Task Description) - the analysis and synthesis of a real-world behavior and/or situation, including knowledge, skills, and attitudes, comprising (a) a listing of the activities performed, (b) an indication of the sequence and relationships among the knowledge, skills, and attitudes, (c) the conditions under which the knowledge, skills, and attitudes occur, and (d) the acceptable criteria for knowledge, skills, and attitudes performance. (Davies, I.K., "Task Analysis: Some Process and Content Concerns", AVCR, Spring 1973, pp. 73-83. Duncan, K., "Strategies for Analysis of the Task", in Hartley, J. (ed.), Strategies for Programmed Instruction: An Educational Technology, London, 1972. Gagne, R.M., Task Analysis - Its Relation to Content Analysis, a paper presented at the annual meeting of the American Educational Research Assoc., Chicago, April 1974.)

Technical Conference - a group of high-level technical or subject-matter experts are brought together to collectively determine the responsibilities and procedures of a set position. (Goldstein, I.L., Training Program Development and Evaluation, Wadsworth Pub. Co. Inc., Belmont, California, 1974. Segall, Asher, Vanderschmidt, Hannelore, Burglass, Ruanne, and Frostman, Thomas, Systematic Course Design for the Health Fields, John Wiley and Sons Inc., New York, 1975.)
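The four components (a)-(d) of the Task Analysis entry above map naturally onto a small record structure; the projector task, field names, and criteria below are invented for illustration only.

```python
from dataclasses import dataclass, field

@dataclass
class TaskEntry:
    """One row of a task analysis: (a) activity, (b) sequence/relations,
    (c) conditions, (d) acceptable criterion."""
    activity: str
    follows: list = field(default_factory=list)  # (b) prerequisite activities
    conditions: str = ""                         # (c) conditions of performance
    criterion: str = ""                          # (d) acceptable criterion

# Invented fragment: part of a task analysis for operating a film projector.
analysis = [
    TaskEntry("position projector",
              conditions="darkened room",
              criterion="screen fully covered by image area"),
    TaskEntry("thread film",
              follows=["position projector"],
              conditions="power off",
              criterion="film advances without slippage"),
]

# (a) the listing of activities, in sequence order:
listing = [entry.activity for entry in analysis]
```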
Standardized Tests - an instrument constructed in accord with detailed specifications, in which the items have been selected after tryout for appropriateness in difficulty and discriminating power; one which is accompanied by a manual giving definite directions for uniform administration and scoring, and one which provides relevant and dependable norms for score interpretation. (Borg, W.R. and Gall, M.D., Educational Research (2nd ed.), David McKay Co., New York, 1971. Buros, O.K., The Mental Measurements Yearbook, Gryphon, Highland Park, N.J.)

Story Boarding - the activity of preparing a series of sketches or pictures, and any accompanying text, used to visualize each topic or item in an audiovisual material (or presentation) to be produced; usually used for planning. (Kemp, J.E., Planning and Producing Audiovisual Materials, Chandler Publishing Co., 1968. Brown, L., A.V. Instruction: Technology, Media, and Methods, McGraw-Hill Book Co., 1973.)

Summative Evaluation - evaluation intended to provide data for product validation, oriented to consumer-administration-teacher criteria and standards, and used to assess the overall effectiveness of some program or material. (Anderson, S.B., Ball, S., Murphy, R.T., and Associates, Encyclopedia of Educational Evaluation, San Francisco, Jossey-Bass Inc., 1973. Bloom, B.S., Hastings, J.T., Madaus, G.F., Handbook on Formative and Summative Evaluation of Student Learning, New York, McGraw-Hill Book Co., 1971.)

Synectics - problem-solving and design technique that provides for the spontaneous activity of the brain and the nervous system towards the exploration and transformation of design problems, through the use of analogies; use is made of the black-box view of designing. (Jones, J.C., Design Methods, John Wiley and Sons Ltd., London, 1970. Weinberg, G.M., An Introduction to General Systems Thinking, John Wiley and Sons Ltd., New York, 1975.)
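The item tryout for "difficulty and discriminating power" mentioned under Standardized Tests above is conventionally computed as below: difficulty is the proportion answering an item correctly, and discrimination contrasts high scorers with low scorers. The response matrix is invented for illustration (1 = correct).

```python
# Rows are examinees, columns are test items (1 = correct, 0 = incorrect).
responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [1, 0, 0, 0],
]

n = len(responses)
totals = [sum(row) for row in responses]
order = sorted(range(n), key=lambda i: totals[i], reverse=True)
upper, lower = order[:n // 2], order[n // 2:]   # high and low scorers

stats = []
for item in range(len(responses[0])):
    difficulty = sum(row[item] for row in responses) / n
    p_upper = sum(responses[i][item] for i in upper) / len(upper)
    p_lower = sum(responses[i][item] for i in lower) / len(lower)
    stats.append((item, difficulty, p_upper - p_lower))
```

Item 1 here is moderately difficult (0.5) and discriminates perfectly (1.0), while items 0 and 3, answered alike by everyone, discriminate not at all and would be screened out.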
Telelecture - an arrangement which brings a teacher or any lecturer to the classroom audience via regular telephone lines, enabling the speaker to participate with several classes simultaneously at different locations; the installation may provide two-way communication between speaker and audience. (Chu, G.C., and Schramm, W., Learning From Television: What the Research Says, The National Society of Professionals in Telecommunications, Stanford, November, 1967. Schramm, W. (ed.), Quality in Instructional Television, University Press of Hawaii, Honolulu, 1972.)

Trialogue - an application of multimedia used in school systems to plan and disseminate projects tailored around the use of two or more media, wherein planning and project construction issues (such as the function of the program, effects of learning aids, the roles of teachers, etc.) are discussed before producing the project.

Visual Inconsistencies (Search for) - through the examination of examples and/or photographs of an existing design, the developer attempts to identify design conflicts and compromises that may have been necessary in the past but may be avoidable in the future. (Jones, J.C., Design Methods, John Wiley and Sons, London, 1970.)

TECHNIQUE RATING INSTRUMENT

Rate each of the following techniques as to its value to instructional development. CIRCLE one number only. If you know the technique by another term, write such in the blank provided.
4 = extremely high value
3 = high value
2 = valuable
1 = low value
0 = no value OR not a technique

TECHNIQUE                                  ALTERNATE TERM    RATING

Alexander's Method for Determining Components  4 3 2 1 0
Analysis of Inter Connected Decision Areas (AIDA)  4 3 2 1 0
Authoritative Opinion  4 3 2 1 0
Appraisal Interview  4 3 2 1 0
Behavior Modelling  4 3 2 1 0
Behaviorally Anchored Rating Scale  4 3 2 1 0
Bloom's Taxonomy (Classifying Objectives)  4 3 2 1 0
Brainstorming  4 3 2 1 0
Broken Squares  4 3 2 1 0
Card Sort  4 3 2 1 0
Case Studies  4 3 2 1 0
Checklists  4 3 2 1 0
Cognitive Mapping  4 3 2 1 0
Compressed Speech  4 3 2 1 0
Computer Assisted Instruction  4 3 2 1 0
Computer Search (eg. ERIC)  4 3 2 1 0
Content Analysis  4 3 2 1 0
Contextual Mapping  4 3 2 1 0
Contingency Management  4 3 2 1 0
Contract Plan  4 3 2 1 0
Cost-Benefit Analysis  4 3 2 1 0
Criterion Referenced Measurement  4 3 2 1 0
Critical Incidents Technique  4 3 2 1 0
Critical Path Method (CPM)  4 3 2 1 0
Dale's Cone  4 3 2 1 0
Decision Tables  4 3 2 1 0
Delphi Technique  4 3 2 1 0
Discovery Technique  4 3 2 1 0
Discrepancy Evaluation  4 3 2 1 0
Distance Teaching & Learning  4 3 2 1 0
Dynamic Programming  4 3 2 1 0
Ethnography  4 3 2 1 0
Fault Tree Analysis  4 3 2 1 0
Feedback  4 3 2 1 0
Field Test  4 3 2 1 0
FIRO-B  4 3 2 1 0
Flowcharting  4 3 2 1 0
Force-Field Analysis  4 3 2 1 0
Formative Evaluation  4 3 2 1 0
Function Analysis  4 3 2 1 0
Futures Wheel  4 3 2 1 0
Gagne's Taxonomy  4 3 2 1 0
Galileo System  4 3 2 1 0
Gaming  4 3 2 1 0
Gantt Chart  4 3 2 1 0
Immediate Feedback  4 3 2 1 0
In-Basket Technique  4 3 2 1 0
Information Mapping  4 3 2 1 0
Interaction Matrix  4 3 2 1 0
Interaction Net  4 3 2 1 0
Interactive Television  4 3 2 1 0
Interface Analysis  4 3 2 1 0
Interpersonal Recall (IPR)  4 3 2 1 0
Involvement Matrix  4 3 2 1 0
Interviewing Users  4 3 2 1 0
Johnson-Neyman Technique  4 3 2 1 0
Krathwohl's Taxonomy  4 3 2 1 0
Latent Image  4 3 2 1 0
Learner Verification and Revision (LVR)  4 3 2 1 0
Least-Preferred Coworker  4 3 2 1 0
Likert Scale  4 3 2 1 0
Linear Programming  4 3 2 1 0
Literature Search  4 3 2 1 0
Log Diary  4 3 2 1 0
Long-Range Planning  4 3 2 1 0
Management by Objectives  4 3 2 1 0
Managerial Grid  4 3 2 1 0
Mathetics  4 3 2 1 0
Matrix Sampling  4 3 2 1 0
Merit Rating Chart  4 3 2 1 0
Micro Teaching  4 3 2 1 0
Monte Carlo Method of Analysis  4 3 2 1 0
Morphological Charts  4 3 2 1 0
Multidimensional Scaling  4 3 2 1 0
Multi-Image/Multi-Media Presentation  4 3 2 1 0
Needs Assessment  4 3 2 1 0
Network Analysis  4 3 2 1 0
Nominal Group Process  4 3 2 1 0
Observation Interview  4 3 2 1 0
Organization Chart  4 3 2 1 0
Pair-Associate Learning  4 3 2 1 0
Participative Management  4 3 2 1 0
Path Analysis  4 3 2 1 0
Personal Inverted Filing System (PIFS)  4 3 2 1 0
Program Evaluation Review Technique (PERT)  4 3 2 1 0
Program Planning Budgeting System (PPBS)  4 3 2 1 0
Questionnaire  4 3 2 1 0
Q-Sort  4 3 2 1 0
Relational Control Analysis  4 3 2 1 0
Relevance Trees  4 3 2 1 0
Role Playing  4 3 2 1 0
Semantic Differential  4 3 2 1 0
Sensitivity Training  4 3 2 1 0
Sequencing of Objectives  4 3 2 1 0
Shaping  4 3 2 1 0
Simulation  4 3 2 1 0
Stake Model  4 3 2 1 0
Standardized Tests  4 3 2 1 0
Story Boarding  4 3 2 1 0
Summative Evaluation  4 3 2 1 0
Synectics  4 3 2 1 0
Systemic Testing  4 3 2 1 0
System Transformation  4 3 2 1 0
Task Analysis  4 3 2 1 0
Technical Conference  4 3 2 1 0
Telelecture  4 3 2 1 0
Trialogue  4 3 2 1 0
Visual Inconsistencies  4 3 2 1 0

ADDITIONAL TECHNIQUES FORM

The researcher recognizes that his list of 108 terms is by no means exhaustive. Hence, you are encouraged to add any technique(s) that you are aware of which satisfies the criteria for such and which you feel merits a score of 2, 3, or 4 on our five-point scale. Please supply a suitable definition wherever possible. Thank you.

Technique                    Definition

FIELD EXPERT FORM FOR SUGGESTIONS & COMMENTS

The validity of the researcher's survey instrument is dependent upon the sanction and approval of the panel of Field Experts.
Therefore, you are encouraged to add any comments or suggestions for improvement that you deem necessary. If you have none, please check the appropriate box. Thank you.

I approve of the survey instrument proposed by the researcher without reservations.

I approve of the survey instrument, but would like to see the following modifications made: (Use reverse side if necessary)

(signature)

APPENDIX G

Techniques According to Ranking by Field Experts

Technique  Ranking
Field Test  3.26
Brainstorming  3.17
Formative Evaluation  3.13
Needs Assessment  3.00
Feedback  2.96
Summative Evaluation  2.83
Task Analysis  2.78
Learner Verification & Revision (LVR)  2.65
Criterion Referenced Measurement  2.61
Story Boarding  2.57
Flowcharting  2.57
Interviewing Users  2.48
Bloom's Taxonomy  2.48
Observation Interview  2.43
Questionnaire  2.43
Critical Path Method (CPM)  2.43
Checklists  2.39
Literature Search  2.35
Sequencing of Objectives  2.26
Content Analysis  2.26
Simulation (see Gaming)  2.22
Cost-Benefit Analysis  2.17
Immediate Feedback (see Feedback)  2.17
Likert Scale  2.17
Contract Plan  2.13
Management by Objectives (MBO)  2.13
Computer Search  2.09
Gaming  2.07
Program Eval. Review Tech.
(PERT)  2.04
Technical Conference  2.04
Gantt Chart  2.04
Card Sort  2.04
Discovery Technique  2.00
Force-Field Analysis  2.00
Function Analysis  2.00
Shaping  2.00
Micro Teaching  2.00
Case Studies  1.96
Gagne's Taxonomy  1.96
Discrepancy Evaluation  1.87
Long-Range Planning  1.87
Cognitive Mapping  1.83
Nominal Group Process  1.78
Mathetics  1.78
Stake Model  1.78
Role Playing  1.78
Standardized Tests  1.74
Computer Assisted Instruction  1.74
Decision Tables  1.74

178

Techniques According to Ranking by Field Experts (continued)

Critical Incidents Technique  1.74
Krathwohl's Taxonomy  1.70
Behavior Modelling  1.61
Delphi Technique  1.61
In-Basket Technique  1.61
Authoritative Opinion  1.61
Program Planning Budgeting System (PPBS)  1.57
Appraisal Interview  1.57
Linear Programming  1.57
Multi-Image/Multi-Media Presentation  1.57
Information Mapping  1.52
Analysis of Inter Connected Decision Areas  1.48
Morphological Charts  1.48
Interaction Matrix  1.48
Network Analysis  1.43
Interface Analysis  1.43
Involvement Matrix  1.39
Semantic Differential  1.39
Organization Chart  1.35
Fault-Tree Analysis  1.35
Q-Sort  1.35
Participative Management  1.30
Interpersonal Recall (IPR)  1.30
Broken Squares  1.26
Interactive Television  1.26

179

Techniques According to Ranking by Field Experts (continued)

Managerial Grid  1.26
Behaviorally Anchored Rating Scale  1.26
Matrix Sampling  1.22
Interaction Net  1.17
Log Diary  1.17
Pair-Associate Learning  1.17
Contingency Management  1.17
Galileo System  1.13
Path Analysis  1.13
Systemic Testing  1.09
Sensitivity Training  1.09
Contextual Mapping  1.09
Synectics  1.04
Multidimensional Scaling  1.04
Dale's Cone  1.00
Johnson-Neyman Technique  1.00
Ethnography  1.00
Distance Teaching and Learning  0.96
Compressed Speech  0.96
Latent Image  0.96
Merit Rating Chart  0.91
Visual Inconsistencies  0.91
Least Preferred Coworker  0.83
System Transformation  0.83
Monte Carlo Method of Analysis  0.78

180

Techniques According to Ranking by Field Experts (continued)

Trialogue  0.74
Telelecture  0.74
Alexander's Method
for Determining Components  0.74
Futures Wheel  0.70
Relevance Trees  0.70
Dynamic Programming  0.65
Relational Control Analysis  0.61
FIRO-B  0.48
Personal Inverted Filing System (PIFS)  0.43

APPENDIX H

181

Final Survey Instrument

A Study to Identify Major Field Techniques and Utilization Levels by Canadian Instructional Developers

Thomas L. Bennett

As members of the Association for Media and Technology in Education in Canada, many of us are professionally involved with instructional development. Central to this field is the utilization of diverse techniques which have become rooted in education. Of these techniques, many were spawned by instructional developers, while others have been adopted by us from psychology, communications, business and industry, etc. It is the purpose of the following study to investigate knowledge and application levels of a number of these major instructional development techniques, as they apply to the AMTEC membership. For this purpose, the researcher assembled a list of 108 such techniques, and subsequently surveyed a group of 30 field experts in Canada and abroad. The survey instrument revealed that of the original list, 60 techniques were deemed to be of sufficient value to be included in the final study. It is acknowledged that this list is by no means definitive: a large number of techniques were culled from the original list, which in itself was not exhaustive. However, with this limitation noted, let us proceed; let us make a beginning. The first rung must be mounted before the ladder is ascended.

The researcher would beg your indulgence to consider the present survey. You are respectfully asked to devote a half hour or so of your time and complete the following document that has been printed in the centre portion of this issue of Media Message. It may be detached easily and returned to the researcher in the enclosed, addressed, stamped envelope.
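The selection procedure just described (each field expert rated every candidate technique on a 0-4 scale, and techniques were ranked by their standing before the list was culled) can be sketched in Python. The ratings and the retention cutoff below are hypothetical illustrations, not the study's actual data or its exact decision rule.

```python
# Sketch of the field-expert validation step: each expert rates every
# candidate technique 0-4; techniques are ranked by mean rating and
# those below a cutoff are culled. Ratings and cutoff are hypothetical.

def rank_techniques(ratings, cutoff=1.5):
    """ratings: {technique: [per-expert scores, 0-4]} -> retained list,
    sorted by mean rating (descending), as in Appendix G."""
    means = {t: sum(s) / len(s) for t, s in ratings.items()}
    retained = [(t, round(m, 2)) for t, m in means.items() if m >= cutoff]
    return sorted(retained, key=lambda tm: tm[1], reverse=True)

example = {
    "Field Test":    [4, 3, 3, 4, 3],
    "Brainstorming": [3, 3, 4, 3, 3],
    "Trialogue":     [1, 0, 1, 1, 0],
}
print(rank_techniques(example))  # → [('Field Test', 3.4), ('Brainstorming', 3.2)]
```

The mean-rating-plus-cutoff rule is one plausible reading of how 108 candidates became the 60 surveyed techniques; the dissertation's exact criterion may differ.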
Further, the following list of alphabetized techniques and their definitions has been included, which may remain with the journal. It is hoped that the accompanying references may be of service to you in your future endeavours.

In conclusion, an analysis of the data and a complete report of the survey will be published in a future issue of Media Message; however, strict observance of individual anonymity will be maintained. It is felt that the results will be of significant value to the membership of AMTEC, and for this reason the researcher would like to thank you for your consideration and kind assistance.

Techniques and Definitions

Appraisal Interview
A verbal communication between employee and management concerning the results of an employee appraisal, in order to encourage present behaviour, or to provide a warning for a behavioral change, or to simply provide information.
Kay, E. A study of the performance appraisal interview. New York management development and employee relations services. New York: General Electric Company, March 1962.
Norman, R.F. Three types of appraisal interview. Personnel, March 1958.

Authoritative Opinion
Descriptive writing based upon the observations of experienced practitioners, or persons who have had direct contact with the environment they seek to describe or explain.
Davis, R.C. The fundamentals of top management. New York: Harper & Row, 1951.
Fayol, H. Industrial and general administration. International Management Institute, 1930.

16 MEDIA MESSAGE

182

Final Survey Instrument (continued)

Behavior Modelling
Technique to enable managers to improve their managerial abilities by imitating "models" who have mastered the requisite skills.
Bandura, A. Principles of behavior modification. New York: Holt, Rinehart & Winston.
Goldstein, A.P.,
& Sorcher, M. Changing managerial behavior by applied learning techniques. Training and Development Journal, 1973.

Bloom's Taxonomy
Psychological model that describes the major categories within the cognitive domain: knowledge, comprehension, application, analysis, synthesis, and evaluation. The taxonomy proceeds on the assumption that knowledge is ordered hierarchically, and it is assumed that the six main classes are sequential, moving from knowledge to evaluation.
Gronlund, Norman E. Stating behavioral objectives for classroom instruction. New York: Macmillan.
Hunt, D.E., & Sullivan, E.V. Between psychology and education. Hinsdale, Ill.: Dryden Press, 1974.

Brainstorming
Technique that enables a group of people to quickly produce many ideas without fear of criticism. Ideas are recorded first and evaluated afterwards.
Babach, W.J., & Barkelew, A.H. The Babach-Barkelew brainstorming book. Utica, Michigan: Synergy Group Inc.
Havelock, R.G. The change agent's guide to innovation in education. Englewood Cliffs, N.J.: Educational Technology Publications, 1973.

Card Sort
Pack of cards, containing goal statements on each card, is sorted into stacks that have been assigned value points in order to determine a ranking of goals based upon importance and implementation priorities.
Witkin, B.R. Needs assessment kits, models and tools. Educational Technology, 1977, 17, 5-18.

Case Studies
A technique involving a comprehensive study of an individual, institution, or situation; used to provide detailed information for purposes of appraisal and recommendations.
Schatzman, L., & Strauss, A. Field research. Englewood Cliffs,
N.J.: Prentice-Hall Inc., 1973.

Checklists
Technique to enable designers to use knowledge of requirements that have been found to be relevant in similar situations by first preparing a list of questions that were determined to be important in similar situations and next asking some or all of these questions about the design that is to be evaluated.
Jones, J.C. Design methods. London: John Wiley & Sons, 1970.

Cognitive Mapping
A systematic procedure for visually indicating how a person approaches new knowledge (cognitive information) in terms of perception, memory, thinking, and problem solving, based on previous knowledge or rules for acquiring new knowledge based on rules derived in learning old knowledge.
Thorndike, R.L., & Hagen, E. Measurement and evaluation. New York: John Wiley & Sons, 1977.

Computer Assisted Instruction
Instructional technique in which the computer contains a stored instructional program designed to inform, guide, control, and test the student until a prescribed level of proficiency is reached.
Coulson, J.E. Programmed learning and computer-based instruction. New York: John Wiley & Sons, 1962.
Poirot, J.L., & Groves, D.N. Computer science for the teacher. Manchaca, Texas: Sterling Swift Publishing Co.

Computer Search
Computerized technique that enables the researcher to search thousands of articles in a short period of time by the use of key words or descriptors; e.g., literature search.
Thesaurus of ERIC descriptors. New York: Macmillan Information, 1980.

Content Analysis
A procedure for identifying intellectual tasks including: the concepts involved in a competency, the relationships among the concepts, and the behaviors performed using the concepts and relationships. (AECT definition)
Kerlinger, F.N. Foundations of behavioral research. New York: Holt, Rinehart & Winston, 1973.

Contract Plan
A written agreement between the student and teacher which lists a set of goals,
skills, and assignments to be completed by the student within a reasonable time.
Haddock, T. Individual instruction through student contracts. Arizona Teacher, May 1967.

Cost-Benefit Analysis
A generic term for such techniques as zero-based budgeting, cost effectiveness, cost evaluation, etc., which assist the decision-maker in making a comparison of alternative courses of action in terms of their costs and effectiveness in attaining some specific objectives.
Prest, A.R., & Turvey, R. Cost-benefit analysis: a survey. The Economic Journal, 1965, 75, 683-735.
Wilkinson, G.L. Cost evaluation of instructional strategies. AV Communication Review, 1973.

Criterion Referenced Measurement
Tests constructed to yield measurements that are directly interpretable in terms of specified performance standards.
Hambleton, R.K., & Gorth, W.P. Criterion-referenced testing: issues and applications. Amherst, Mass.: Amherst School of Education, Sept. 1971. (ERIC Document Reproduction Service No. ED 60025)

VOLUME 10, NUMBER 3  17

183

Final Survey Instrument (continued)

Jones, J.W. A study of the congruency of competencies and criterion-referenced measures. Master's thesis from Mississippi State University, 1977. (ERIC Document Reproduction Service No. 142575)

Critical Incidents Technique
Technique to acquire information on specific behavior patterns of a subject by interviewing the subject's work supervisor in order to ascertain behavior patterns relating to the skills being studied.
Borg, W.R., & Gall, M.D. Educational research: an introduction. New York: David McKay Co.
Flanagan, J.C. The critical incident technique. Psychological Bulletin, 1954, 51, 327-358.

Critical Path Method
Technique to aid researchers with the planning, scheduling, expediting and progress monitoring tasks involved in a specific project by diagrammatically plotting work activities and events in sequence and determining the longest time needed to complete the project.
Collins, F.T.
Network planning and critical path scheduling. New York: Know How Publications, 1965.

Decision Tables
Alternative to a flowchart for presenting the logic of a problem, wherein the table is a set of decision rules in which each rule identifies a set of conditions with its set of actions; it is divided vertically by condition statements and action statements and divided horizontally by stubs and entries.
Hussain, K.M. Development of information systems for education. Englewood Cliffs, N.J.: Prentice-Hall Inc., 1973.

Delphi Technique
A futurist research method which utilizes the systematic solicitation and combination of informed judgments from a group of experts on questions or issues relevant to the future.
Helmer, O. Analysis of the future: the Delphi method. Santa Monica, Ca.: The Rand Corporation, 1967.
Helmer, O., & Dalkey, N. An experimental application of the Delphi method to the use of experts. Management Science, 1963, 9.
Weaver, W.T. The Delphi forecasting method. Phi Delta Kappan, January 1971.

Discovery Technique
Learning model by which the student problem-solves through discovering a new method rather than relying upon prior knowledge and procedures.
Taba, H. Learning by discovery. Elementary School Journal, 1963, 63(6), 308-316.
Travers, R.M.W. (Ed.). Second handbook of research on teaching. Chicago: Rand McNally, 1973.

Discrepancy Evaluation
Method of identifying the causes of the difference between stated objectives and actual performance. (AECT definition)
Stake, R.E. Objectives, priorities, and other judgment data. Review of Educational Research, 1970, 181-212.

Feedback
Generic term that encompasses a number of techniques (including programmed texts, pull-tab response cards, latent image, etc.), which give the learner an immediate response as to the correctness of his answers. It may also refer to data collected by researchers for purposes of evaluation.
Glaser, R., & Cooley, W.W.
Instrumentation for teaching and instructional management. In R. Travers (Ed.), Second handbook of research on teaching. Chicago: Rand McNally, 1973.

Field Test
The assessment of a near-final model in an appropriate situation, according to specified criteria, for the purpose of determining what modifications of structure and performance are necessary. (AECT definition)
Klausmeier, H. Research and development toward the improvement of education. Journal of Experimental Education, 1968, 37, 146.

Flowcharting
Graphic representation for the definition, analysis, or solution of a problem in which symbols are used to represent operations, data flow, equipment, etc.
Chapin, N. Flowcharting with ANSI standards: a tutorial. Computing Surveys, June 1970.
Enrick, N.L. Effective graphic communication. New York: Auerbach Publishers, 1972.
Schriber, T.J. Fundamentals of flowcharting. New York: J. Wiley & Sons.

Force-Field Analysis
Graphic method of analyzing the forces providing thrust towards or facilitating change and the forces hindering change in a particular situation.
Lewin, K. Frontiers in group dynamics: concept, method and reality in social science. Human Relations, 1947, 1.
Giammatteo, M.C. Suggested activities for learning about role behaviors, problem solving and force field techniques. Northwest Regional Education Laboratory.

Formative Evaluation
An attempt to collect appropriate evidence during the construction and trying out of a new curriculum, etc., in such a way that revision of it can be based on this evidence; evaluation of instructional programs while they are still in some stage of development.
Anderson, S.B., Ball, S., & Murphy, R.T. Encyclopedia of educational evaluation. San Francisco: Jossey-Bass, 1975.
Bloom, B.S., Hastings, J.T., & Madaus, G.F. Handbook on formative and summative evaluation of student learning. New York: McGraw-Hill, 1971.

Function Analysis
In the Roger Kaufman Model for Educational Systems Planning, the Function Analysis stage is the process for determining requirements and subfunctions for accomplishing all of the elements stated in the objectives and problem identification stage. It is concerned with identifying the whats that have to be accomplished

184

Final Survey Instrument (continued)
the Function Analysis stage is the process for determining re- quirements and subfunctions for accomplishing all of the elements stated in the objectives and problem identification stage. it is con- cerned with identifying the whats that have to be accomplished lR MEDIA MESSAGE lBH Final Survey Instrument (continued) rather than the hows. Kaufman. R.A. Educational system planning. Englewood Cliffs. N.J.: Prentice-Hall. I972. Gagne's Taxonomy Cognitive learning theory described as a hierarchy of learning pro- cesses that become increasingly complex and which places more em- phasis upon learning and less on the development aspect. Hunt, D.E. & Sullivan. E.V. Between psychology and education. Hinsdale, Ill.: Dryden Press. 1974. Gannt Chart A means of graphically illustrating a production schedule; the horizontal axis is used to depict time. with activities. items. or per- sonnel listed vertically in the left-hand column. Dessler. G. Management fundamentals: a framework. Reston, Va.: Reston, I977. Longenecker, J.G. Essentials of Management: a behavioral ap- proach. Columbus. Ohio: Charles Merrill. I977. ln-Basket Technique Technique to analyze a participant’s decision-making abilities. managerial and problem-solving skills. whereby s/he receives a “situation" set up on a memo to which a considered response is com- pared to answers suggested by field experts. French. W. The personnel management process (4th ed.). Boston: Houghton Mifflin, 1978. Ward. L.B. The use of business problems. Management Record, 1960. 22, 3033. Information Mapping System of graphically presenting information on a series of pages in the form of COBOL: each page is broken with horizontal lines dividing chunks of information into definitions. examples. rules. etc. Glaser. R. Teaching research and education. New York: John Wiley. I965. Horn. R.E. information mapping: new tool to overcome the paper mountain. Educational Technology. I974. 15(5). 5-8. 
Instructional Analysis Kit
Self-evaluation of instructional procedures as a vital step towards course improvement.
Donald, Janet G., & Penney, M. Instructional analysis kit. Montreal, Quebec: McGill Centre for Learning & Development, 1977.

Interviewing Users
Technique to elicit information that is known only to users of a product or system in question.
Jones, J.C. Design methods. London: John Wiley, 1970.

Krathwohl's Taxonomy
Psychological model that describes the major categories within the affective domain: receiving, responding, valuing, organizing, and characterizing by a value or value complex.
Gronlund, N.E. Stating behavioral objectives for classroom instruction. New York: Macmillan, 1970.

Learner Verification and Revision
Involves the concepts of evaluation, revision and decision to implement developed by Kenneth Komoski, and intended for use as an index of quality for educational materials; involves the tryout of a prototype educational product on the target audience to determine its weaknesses prior to revision.
Kandaswamy, S. Learner verification and revision: an experimental comparison of two methods. AV Communication Review, 1976, 24, 316-328.
Stolovitch, H.D. The intermediate technology of learner verification and revision. Educational Technology, 1978, 18, 13-17.

Likert Scale
To obtain summated ratings of information pertinent to affective variables, by responding to statements which are both favourable and unfavourable to the phenomenon under study; responses range on a scale of five (from "strongly agree" to "strongly disagree") and are thus analyzed to determine which items discriminate best between the high-scoring individuals and the low-scoring individuals.
Phillips. Social research. 1966.
Stanley & Hopkins. Educational and psychological measurement and evaluation.
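The summated-rating logic in the Likert Scale entry above can be sketched as follows. The item names, responses, and the particular 1-5 coding are illustrative assumptions, not part of the original instrument.

```python
# Sketch of Likert-scale scoring as defined above: five response options
# ("strongly agree" .. "strongly disagree") are coded 5..1 and summated
# per respondent, with unfavourable items reverse-scored. Hypothetical data.

SCALE = {"strongly agree": 5, "agree": 4, "undecided": 3,
         "disagree": 2, "strongly disagree": 1}

def summated_score(responses, unfavourable=()):
    """responses: {item: answer string}; unfavourable items are
    reverse-scored so a high total always means a favourable attitude."""
    total = 0
    for item, answer in responses.items():
        value = SCALE[answer]
        if item in unfavourable:
            value = 6 - value   # reverse: 1<->5, 2<->4, 3 unchanged
        total += value
    return total

r = {"Q1": "agree", "Q2": "strongly disagree", "Q3": "undecided"}
print(summated_score(r, unfavourable={"Q2"}))  # Q2 reversed: 4 + 5 + 3 = 12
```

Item discrimination, as the definition notes, would then be checked by comparing how high-total and low-total respondents answered each individual item.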
Linear Programming
Program in which the sequence of information presented to the students is fixed so that all students are given the same stimuli in exactly the same sequence followed by testing, followed by new information; based upon the stimulus-response work of Pressey and Skinner.
Brown, J.W., Lewis, R.B., & Harcleroad, F.F. AV instruction: media and methods. New York: McGraw-Hill, 1969.
Hartley, J. Programmed instruction 1954-1974: a review. Programmed Learning and Educational Technology, 1974, 11, 278-291.

Literature Search
To find published information that can favourably influence the designers' output and that can be obtained without unacceptable cost and delay.
Jones, J.C. Design methods. London: John Wiley, 1970.

Long-Range Planning
Methodology to develop an adaptive planning program consisting of "alternative future" general plans and derivative plans for the major components of the agency in question; methods range from establishing goals, through developing plans for each alternative future, through selecting one alternative future plan and developing monitoring and shifting procedures.
Chase, R.B., & Clark, D.C. Long range planning in school districts. Educational Technology, 1974, 4, 32-36.
Salmon, R.D. Developing a long range planning system for higher education. School and Community, May 1971.

Management by Objectives
Process whereby the superior and subordinate managers of an

185

Final Survey Instrument (continued)
organization jointly identify its common goals, define each individual's major area of responsibility in terms of the results expected, and use these measures as guides for operating the unit and assessing the contributions of each of its members.
Hollman, R.W. Applying MBO research to practice. Human Resources Management, Winter 1976.
Stein, D.I. Objective management systems: two to five years after implementation. Personnel Journal, 1975, 54, 525-583.

Mathetics
Training system to determine what to teach, a basis for determining strategy decisions, and a detailed procedure for constructing a lesson; these goals are attained through a series of ten steps which include occupational analysis, task selection, task analysis, population analysis, etc.
Gilbert, T.F. Mathetics II: the description of teaching exercises. Journal of Mathetics, April 1962, 1.
Gilbert, T.F. Mathetics: the technology of education. Journal of Mathetics, January 1962, 1.

Micro Teaching
Practice which allows pre-service or in-service teachers to develop or improve skills in applying a particular teaching technique, whereby a lesson is planned which concerns a single, unique topic to be presented to a small group of students, in a small time frame.
Allen, D.W., & Ryan, K.A. Microteaching. Reading, Mass.: Addison-Wesley, 1967.
Sadker, M., & Sadker, D. Microteaching for affective skills. The Elementary School Journal, 1976, 76, 90-99.

Multi-Image/Multi-Media Presentation
The integration of more than one medium in a complementary manner in a presentation or module of instruction.
Wittich, W.A., & Schuller, C.F. Instructional technology, its nature and use. New York: Harper & Row, 1973.

Needs Assessment
The process in which "real-world" data is collected from individuals and groups involved in a particular educational situation to determine the nature of the problem, to determine how the group involved (learners, implementers, community) value what exists (status quo), what should be (the ideal situation) and the discrepancy between what is and what should be, and to prioritize the problems and discrepancies.
Anderson, S.B., Ball, S., & Murphy, R.T. Encyclopedia of educational evaluation. San Francisco: Jossey-Bass, 1975.
Witkin, B.R. Needs assessment kits, models and tools. Educational Technology, 1977, 17, 5-18.

Nominal Group Process
Method to generate and prioritize ideas regarding problem-solving,
job performance improvement, etc., whereby each member of a study group generates ideas that are listed before the group, ranked, and valued (1-5), and finally prioritized.
Albanese, R. Managing: toward accountability for performance. Homewood, Ill.: Richard D. Irwin, 1978.
Delbecq, A.L., & Van de Ven, A.H. Nominal group techniques for involving clients and resource experts in program planning. Academy of Management Proceedings, 1970.

Observation Interview
Method to define a task, analyze a job, or perform needs assessment or evaluation, whereby the investigator observes and questions an interviewee at the work site while the practitioner performs the activities under investigation.
Anderson, S.B., Ball, S., & Murphy, R.T. Encyclopedia of educational evaluation. San Francisco: Jossey-Bass, 1975.
Bergman, A.B., Dassel, S.W., & Wedgwood, R.J. Time-motion study of practicing pediatricians. Pediatrics, 1966, 38, 254-263.

Programmed Instruction
A generic term referring to a technique of, and materials for, instruction; the process of constructing sequences of instructional material in a way that maximizes the rate of acquisition and retention, and enhances the motivation of the student; instruction utilizing a workbook, textbook, or a mechanical and/or electronic device programmed to help pupils attain a specified level of performance. (AECT definition)
Briggs, L.J. Sequencing of instruction in relation to hierarchies of competence. Pittsburgh: American Institutes for Research, 1968.
Briggs, L.J. Handbook of procedures for the design of instruction. Pittsburgh: American Institutes for Research, 1970.

Program Evaluation Review Technique
A systematic timetabling and programming technique developed to measure, monitor, and control the development and progress of a project or program, wherein a network of events and work activities is identified, including the critical path, the one which takes the longest time to complete.
Cook, D.L.
Program evaluation and review technique: applications in education. Washington: U.S. H.E.W., Office of Education, 1966.
Kohn, M. Dynamic managing: principles, process, practice. Menlo Park, Calif.: Cummings, 1977.
Lott, D.R. Basic systems analysis. New York: Canfield Press, 1971.

Program Planning Budgeting System
A planning budgeting system in which resources are allocated according to specified project or program needs; it directly relates substantive planning to fiscal planning, requiring a detailed operational plan to which costs are then assigned on a programmatic, rather than on a line item, basis.
Kindred, A.R. Data systems and management. Englewood Cliffs, N.J.: Prentice-Hall, 1973.
Magaro, J.D. P.P.B.S.: a means towards accountability. Audiovisual Instruction, 1975, 20(10), 10-12.

Questionnaire
Instrument for recording data ranging from sociological opinions and attitudes to psychological variables which include opinions, attitudes and behavior; technique to obtain responses and reactions from a large number of individuals who could not be interviewed personally within a short period of time without considerable expense.
Bloom, B.S., & Hastings, J.T. Handbook on formative and summative evaluation of student learning. New York: McGraw-Hill, 1971.

186

Final Survey Instrument (continued)

Kerlinger, F.N. Foundations of behavioral research (2nd ed.). New York: Holt, Rinehart & Winston, 1973.

Role Playing
Instructional technique involving a spontaneous portrayal or acting out of a situation, condition, or circumstance by selected members of a learning group who assume, either overtly or in imagination, the part or function of another.
Cooper, J. Deception and role playing: "On telling the good guys from the bad guys." American Psychologist, August 1976, 31, 605-610.
Keller, C.W. Role playing and simulation in history classes. The History Teacher, 1975, 8(4).
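The Critical Path Method and Program Evaluation Review Technique entries above both hinge on the same computation: the critical path is the chain of dependent activities with the greatest total duration. A minimal sketch, using hypothetical activities rather than anything from the survey:

```python
# Sketch of the critical-path idea common to CPM and PERT: given
# activities with durations and predecessors (an acyclic network),
# the critical path is the longest chain through the network.

def critical_path(activities):
    """activities: {name: (duration, [predecessor names])}.
    Returns (project length, critical path as a list of activity names)."""
    finish = {}  # earliest finish time per activity, memoized

    def ef(name):
        if name not in finish:
            dur, preds = activities[name]
            finish[name] = dur + max((ef(p) for p in preds), default=0)
        return finish[name]

    for name in activities:
        ef(name)
    # walk back from the latest-finishing activity along the longest chain
    path = []
    node = max(finish, key=finish.get)
    while node is not None:
        path.append(node)
        preds = activities[node][1]
        node = max(preds, key=lambda p: finish[p]) if preds else None
    return max(finish.values()), list(reversed(path))

acts = {"design": (3, []), "build": (5, ["design"]),
        "write-docs": (2, ["design"]), "test": (2, ["build", "write-docs"])}
print(critical_path(acts))  # → (10, ['design', 'build', 'test'])
```

Any delay to an activity on the returned path delays the whole project, which is why both techniques monitor those activities most closely; activities off the path ("write-docs" here) have slack.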
Sequencing of Objectives
Objectives are sequenced according to a number of different methods in order to facilitate learning.
Popham, W.J., & Baker, E.L. Systematic instruction. Englewood Cliffs, N.J.: Prentice-Hall, 1970.
Posner, G.J., & Strike, K.A. A categorization scheme for principles of sequencing content. Review of Educational Research, 1976, 46(4), 665-689.

Shaping
A method of successive approximation to teach humans and animals a new skill; it reinforces behaviors that approximate the final performance one wants the subject to perform, shaping the learner's behavior by rewarding him whenever he is successful in approximating the skill being taught.
Davis, A., & Yelon, S. Learning systems design. East Lansing: Michigan State University, 1976.

Simulation
A learning process which involves pupils as participants in role presentations and/or games simulating real-life situations or environments; a learning activity which makes the practice and materials as near as possible to the situation in which the learning will be applied.
Greenblat, C.S., & Duke, R. Gaming-simulation: rationale, design and application. New York: Halsted Press, 1975.
Spannaus, T.W. What is simulation? Audiovisual Instruction, 1978, 23(7), 16-17.

Stake Model
Technique intended for the evaluation of educational programs by providing data for decision-making; it provides measurements on a matrix of the match between what an educator intends to do and what s/he actually accomplishes.
Anderson, S.B., Ball, S., & Murphy, R.T. Encyclopedia of educational evaluation. San Francisco: Jossey-Bass, 1975.
Stake, R.E. Evaluating the arts in education: a responsive approach. Columbus: Charles Merrill, 1975.

Standardized Tests
An instrument constructed in accord with detailed specifications, in which the items have been selected after trying out for appropriateness in difficulty and discriminating power,
one which is accompanied by a manual giving definite directions for uniform administration and scoring, and one which provides relevant and dependable norms for score interpretations.
Borg, W.R., & Gall, M.D. Educational research (2nd ed.). New York: David McKay, 1971.
Buros, O.K. The mental measurements yearbook. Highland Park, N.J.: Gryphon, 1977.

Story Boarding
The activity of preparing a series of sketches or pictures and any accompanying text used to visualize each topic or item in an audiovisual material (or presentation) to be produced; usually used for planning.
Kemp, J.E. Planning and producing audiovisual materials. New York: Chandler Publishing, 1968.
Brown, J.W. AV instruction: technology, media, and methods. New York: McGraw-Hill, 1973.

Summative Evaluation
Evaluation intended to provide data for product validation and oriented to consumer-administration-teacher criteria and standards; used to assess the overall effectiveness of some program of material.
Anderson, S.B., Ball, S., & Murphy, R.T. Encyclopedia of educational evaluation. San Francisco: Jossey-Bass, 1973.
Bloom, B.S., Hastings, J.T., & Madaus, G.F. Handbook on formative and summative evaluation of student learning. New York: McGraw-Hill, 1971.

Task Analysis
The analysis and synthesis of a real-world behavior and/or situation, including knowledge, skills and attitudes, including the following: a listing of the activities performed; an indication of the sequence and relationships among the knowledge, skills, and attitudes; the conditions under which the knowledge, skills and attitudes occur; and the acceptable criteria for knowledge, skills and attitudes performance.
Davies, I.K. Task analysis: some process and content concerns. AVCR, Spring 1973, 73-83.
Gagne, R.M. Task analysis: its relation to content analysis. A paper presented at the annual meeting of the American Educational Research Association. Chicago, April 1974.
Technical Conference
A group of high-level technical or subject matter experts are brought together to collectively determine the responsibilities and procedures of a set position.
Goldstein, I.L. Training program development and evaluation. Belmont, California: Wadsworth, 1974.
Segall, A., et al. Systematic course design for the health fields. New York: John Wiley, 1975.

187

Final Survey Instrument (continued)

Survey of Field Techniques Utilized by Canadian Instructional Developers

Mr. Thomas Bennett

Anonymity and Confidentiality
All information you furnish will be held in strict confidence and reported in statistical aggregates only. No data which will link an individual to specific or general responses will be reported.

Filling out the questionnaire will require approximately 30 minutes of your time. Most questions can be answered simply by checking the appropriate box. In a few instances you will be requested to make brief written responses. Please answer all questions whenever possible.

When you have finished, please detach survey forms and return in the enclosed, stamped envelope.

1. Respondent's name (for clarification purposes only)

2. Years of teaching or educational work experience __

3. Level of highest education (check one)
__ B.A./B.Ed./B.Sc.
__ M.A./M.Ed./M.Sc.
__ Ph.D./Ed.D.
__ Specialist or equivalent
__ College Diploma
__ Certificate
__ Other (specify diploma, institution, and program description)

4. Name of University or College of your highest degree

5. Name of the program of your highest degree if you graduated with one of the degrees in No. 3 above (check one).
__ Adult Education
__ Educational Administration
__ Instructional Development and Planning
__ Applied Psychology and/or Technology (Educational Media, etc.)
   __ Computer Applications in Education   __ Higher Education   __ Sociology
   __ Curriculum   __ History and/or Philosophy of Education   __ Special Education
   __ Measurement and Evaluation   __ Other (specify)
6. State title of present job responsibility (check one)
   __ Administrator   __ Remedial Teacher   __ Elementary School Teacher
   __ College Instructor   __ Special Education   __ Secondary School Teacher
   __ Curriculum Coordinator or Consultant   __ Audiovisual Technician   __ University Professor/Instructor
   __ Librarian   __ Other (specify)
7. Institution/School Board in which you are currently employed.
   (a) Approximate Enrollment
   (b) Faculty Size
   (c) Name of Institution/School Board (response is optional)
8. In order to identify your level of competency, level of usage, and how valuable you feel each technique is to Instructional Development, please check the appropriate box for each of categories A, B, and C. Further, if you are presently employed in a University, College, or Teacher Training Institution that provides Instructional Development programs and/or Teacher Training programs, please indicate whether such programs teach the TECHNIQUE and to what DEGREE, by checking the appropriate box in category D.

Please Note: If you are NOT FAMILIAR with a technique, please check the Nil box in category A and go to the next technique; do not proceed with categories B through D.

Thank you for your help in completing this survey. Your efforts are greatly appreciated.

For each Technique Name, the response grid provided four categories:
A. Competency Level (Nil / Hi / Med / Lo)
B. Level of Use (Hi / Med / Lo / None)
C. Value to Instructional Development (Hi / Med / Lo / None)
D. Degree to Which Institution Teaches (Hi / Med / Lo / N/A)

Appraisal Interview
Authoritative Opinion
Behaviour Modelling
Bloom's Taxonomy
Brainstorming
Card Sort
Case Studies
Checklists
Cognitive Mapping
Computer Assisted Instruction
Computer Search
Content Analysis
Contract Plan
Cost-Benefit Analysis
Criterion Referenced Measurement
Critical Incidents Technique
Critical Path Method (CPM)
Decision Tables
Delphi Technique
Discovery Technique
Discrepancy Evaluation
Feedback
Field Test
Flowcharting
Force-Field Analysis
Formative Evaluation
Function Analysis
Gagne's Taxonomy
Gantt Chart
In-Basket Technique
Information Mapping
Instructional Analysis Kit
Interviewing Users
Krathwohl's Taxonomy
Learner Verification and Revision
Likert Scale
Linear Programming
Literature Search
Long-Range Planning
Management by Objectives
Mathetics
Micro Teaching
Multi-Image/Multi-Media Presentation
Needs Assessment
Nominal Group Process
Observation Interview (e.g., Time-Motion Studies)
Programmed Instruction
Program Evaluation Review Technique
Program Planning Budgeting System
Questionnaire
Role Playing
Sequencing of Objectives
Simulation (Gaming)
Stake Model (Evaluation)
Standardized Tests
Story Boarding
Summative Evaluation
Task Analysis (Task Desc.)
Technical Conference

APPENDIX I

Initial List of Techniques with ERIC Descriptors
+ = primary descriptors
* = techniques utilized in final survey, after adjudication by panel of field experts.

ALEXANDER'S METHOD FOR DETERMINING COMPONENTS: Group Behavior / Dynamics; Intercommunication; Social Relations
ANALYSIS OF INTERCONNECTED DECISION AREAS (AIDA): Administrative Evaluation; Course Evaluation; Curriculum Evaluation; Decision Making
APPRAISAL INTERVIEW: Individual Tests; Performance Criteria; Performance Factors
AUTHORITATIVE OPINION: Power Structure
BEHAVIOR MODELLING: Behavior Patterns; Contingency Management; Process Education; Role Models
BEHAVIORALLY ANCHORED RATING SCALE: Affective Tests; Personality Assessment; Rating Scales
BLOOM'S TAXONOMY: Educational Objectives; Management by Objectives; Measurement Goals; Needs Assessment
BRAINSTORMING: Decision Making; Educational Innovation; Management Games; Management Systems; Problem Solving +; Teaching Techniques
BROKEN SQUARES: Game Theory; Management Games; Problem Solving; Simulation
CARD SORT: Data Processing; Factor Analysis; Personality Assessment; Q-Sort +; Questionnaires
CASE STUDIES: Case Records; Case Studies; Facility Case Studies; Longitudinal Studies
CHECK LISTS: Administrative Evaluation; Case Records; Case Studies; Course Evaluation; Curriculum Evaluation
COGNITIVE MAPPING: Cognitive Measurement; Cognitive Tests; Learning Plateaus; Psychometrics
COMPRESSED SPEECH: Information Theory; Language Research; Speech Compression +
COMPUTER ASSISTED INSTRUCTION: Computer Assisted Instruction +; Computer Oriented Programs; Man Machine Systems; Teaching Machines
COMPUTER SEARCH: Data Processing; Information Processing; Programing
CONTENT ANALYSIS: Content Analysis +; Course Content; Evaluation Methods; Item Analysis
CONTEXTUAL MAPPING: Decision Making; Multiple-Regression Analysis; Planning; Prediction +; Trend Analysis
CONTINGENCY MANAGEMENT: Behavior Chaining; Behavior Change; Contingency Management +; Individualized Instruction; Teaching Methods
CONTRACT PLAN: Individualized Instruction; Open Education; Performance Based Education; Tutorial Programs
COST-BENEFIT ANALYSIS: Cost Effectiveness +; Program Budgeting; Program Effectiveness; Program Evaluation; Systems Analysis
CRITERION REFERENCED MEASUREMENT: Criterion Referenced Tests +; Measurement Techniques; Norm Referenced Tests; Mastery Learning +
CRITICAL INCIDENTS TECHNIQUE: Critical Incidents Method +; Job Analysis; Measurement Techniques; Task Analysis
CRITICAL PATH METHOD: Critical Path Method +; Cost Effectiveness; Management Systems; Program Evaluation; Scheduling
DALE'S CONE: Audiovisual Instruction; Concept Formation; Models; Multi-Media Instruction +
* DECISION TABLES: Data Processing +; Decision Making Skills; Management Systems; Problem Solving
* DELPHI TECHNIQUE: Decision Making; Futures; Prediction +; Social Change; Trend Analysis
* DISCOVERY TECHNIQUE: Discovery Processes; Observational Learning; Open Education
* DISCREPANCY EVALUATION: Evaluation Criteria; Evaluation Methods +; Needs Assessment +
DISTANCE TEACHING & LEARNING: Correspondence Study +; Home Study; Independent Study; Part-time Students
DYNAMIC PROGRAMMING: Educational Research; Program Design; Program Development +; Program Planning +
ETHNOGRAPHY: Field Studies +; Learning Theories; Research
FAULT TREE ANALYSIS: Flow Charts +; Systems Analysis +
* FEEDBACK: Feedback +; Reinforcement; Programmed Instruction; Information Processing
* FIELD TEST: Field Studies +; Formative Evaluation +; Program Evaluation; Research Methodology
FIRO-B: Controlled Environment; Interpersonal Relationship; Self Actualization; Self Evaluation +
FLOWCHARTING: Computer Programs; Flow Charts +; Graphs; Planning
FORCE-FIELD ANALYSIS: Force-Field Analysis +; Interdisciplinary Approach; Research Methodology
FORMATIVE EVALUATION: Curriculum Evaluation; Educational Improvement; Formative Evaluation +; Program Improvement; Summative Evaluation +
FUNCTION ANALYSIS: Models; Needs Assessment +; Systems Analysis +
FUTURES WHEEL: Decision Making; Futures; Prediction +; Trend Analysis
GAGNE'S TAXONOMY: Cognitive Processes; Information Processing; Learning +; Learning Characteristics; Sequential Learning +
GALILEO SYSTEM: Cluster Analysis +; Correlation +; Discriminant Analysis +; Factor Analysis; Intervals +; Linear Programming
GAMING: Game Theory; Role Playing +; Simulation +; Socio-Drama
GANTT CHART: PERT +; Program Planning; Scheduling +
IMMEDIATE FEEDBACK: Feedback +; Information Processing; Programmed Instruction
IN-BASKET TECHNIQUE: Leadership Training; Management Education +; Role Playing +; Simulation +; Supervisory Training
INFORMATION MAPPING: Computer-Assisted Instruction; Instructional Design
INTERACTION MATRIX: Decision Making; Instructional Design; Matrices; Relevance; Systems Analysis +
INTERACTION NET: Decision Making; Instructional Design; Matrices; Relevance; Systems Analysis
INTERACTIVE TELEVISION: Educational Television; Man Machine Systems; Teaching Machines; Television +
INTERFACE ANALYSIS: Information Networks; Input Output; Intercommunication; Management Systems; Systems Analysis +
INTERPERSONAL RECALL: Recall Ratio; Relevance (Information Retrieval); Relevance Ratio; Search Strategies; Systems Analysis +
INVOLVEMENT MATRIX: Decision Making +; Decision Making Skills; Management Games; Problem Solving
* INTERVIEWING USERS: Accountability; Case Records; Case Studies; Evaluation Methods; Feedback +; Item Sampling +
JOHNSON-NEYMAN TECHNIQUE: Group Behavior; Group Dynamics +; Group Structure
* KRATHWOHL'S TAXONOMY: Information Processing; Learning +; Learning Characteristics; Psychometrics
LATENT IMAGE: Feedback +; Information Processing; Knowledge of Results; Programmed Instruction; Reinforcement
* LEARNER VERIFICATION & REVISION: Pre-Testing +; Pretests; Test Construction; Testing
LEAST-PREFERRED COWORKER: Contingency Management +; Leadership Training +; Management Development; Management Education; Supervisory Methods & Training
* LIKERT SCALE: Course Evaluation; Curriculum Evaluation; Student Evaluation; Summative Evaluation +
* LINEAR PROGRAMMING: Branching; Computers; Matrices; Operations Research
LITERATURE SEARCH: Data Processing; Information Processing; Retrieval
LOG DIARY: Evaluation Methods; Needs Assessment +; Objectives; Planning; Systems Analysis
LONG-RANGE PLANNING: Futures; Planning; Program Design; Systems Analysis +
MANAGEMENT BY OBJECTIVES: Accountability; Educational Accountability; Management by Objectives +; Management by Systems
MANAGERIAL GRID: Conflict Resolution +; Decision Making; Group Relations; Interpersonal Competence; Problem Solving
MATHETICS: Curriculum Design; Curriculum Development +; Industrial Education +; Industrial Training; Planning
MATRIX SAMPLING: Item Banks; Item Sampling +; Measurement Techniques
MERIT RATING CHART: Achievement Rating; Evaluation; Job Skills; Merit Rating Programs
MICRO TEACHING: Micro Counseling; Micro Teaching +
MONTE CARLO METHOD OF ANALYSIS: Futures; Game Theory; Predictive Measurement; Probability Theory +; Trend Analysis +
MORPHOLOGICAL CHARTS: Decision Making; Instructional Design; Models; Needs Assessment +; Systems Analysis +
MULTIDIMENSIONAL SCALING: Cluster Analysis; Discriminant Analysis; Internal Scaling; Multidimensional Scaling +
* MULTI-IMAGE/MULTI-MEDIA PRESENTATION: Audiovisual Instruction; Instructional Media; Instructional Technology; Multi Media Instruction +
* NEEDS ASSESSMENT: Evaluation Methods; Needs Assessment +; Objectives; Planning; Policy Formation; Systems Analysis
NETWORK ANALYSIS: Human Relations +; Interagency Co-ordination; Intercommunication; Interpersonal Relationships; Networks
* NOMINAL GROUP PROCESS: Decision Making +; Educational Innovation; Management Games; Management Systems; Problem Solving; Teaching Techniques
* OBSERVATION INTERVIEW: Affective Tests; Performance Appraisal; Personality Assessment; Rating Scales
ORGANIZATION CHART: Interagency Co-ordination; Intercommunication; Networks
PAIR-ASSOCIATE LEARNING: Associative Learning +; Pair Associate Learning; Transfer of Training +
PARTICIPATIVE MANAGEMENT: Decision Making +; Decision Making Skills; Management Games; Problem Solving
PATH ANALYSIS: Critical Path Method +; Sequential Approach; Systems Analysis
PERSONAL INVERTED FILING SYSTEM: Classification; Data Processing; Indexing; Information Processing; Information Storage
PROGRAM EVALUATION REVIEW TECHNIQUE: Critical Path Method +; Fast Track Scheduling; Sequential Approach; Scheduling
PROGRAM PLANNING BUDGETING SYSTEM: Cost Effectiveness; Program Money Management; Program Budgeting; Program Designs; Program Planning +; Program Costs
QUESTIONNAIRE: Biographical Inventory; Data Sheets; Q-Sort; Questionnaires +
Q-SORT: Attitude; Data Processing; Measurement Techniques; Personality Assessment; Q-Sort +; Questionnaires
RELATIONAL CONTROL ANALYSIS: Human Relations +; Interaction; Interagency Co-ordination; Intercommunication
RELEVANCE TREES: Decision Making; Futures; Prediction +; Social Change; Trend Analysis
ROLE PLAYING: Game Theory; Role Playing +; Simulation +; Sociodrama; Simulators
SEMANTIC DIFFERENTIAL: Measurement Techniques; Personality Tests; Rating Scales; Semantic Differential +
SENSITIVITY TRAINING: Group Therapy; Humanistic Education; Interaction Process; Sensitivity Training +; Training Group Discussion
SEQUENCING OF OBJECTIVES: Critical Path Method +; Program Designs; Program Planning +; Scheduling; Sequential Approach
SHAPING: Learning Theories; Positive Reinforcement; Reinforcement +
SIMULATION/GAMING: Dramatic Play; Role Playing +; Simulated Environment; Simulation +; Socio-Drama +
STAKE MODEL: Course Evaluation; Curriculum Evaluation; Educational Assessment; Evaluation +; Formative Evaluation; Summative Evaluation; Synthesis
* STANDARDIZED TESTS: Criterion Referenced Tests; National Competency Tests; Objective Tests; Norm Referenced Tests; Standardized Tests +
* STORY BOARDING: Dramatics; Playwriting +; Scripts; Sequential Approach; Teaching Techniques
* SUMMATIVE EVALUATION: Administrative Evaluation; Course Evaluation; Curriculum Evaluation; Program Evaluation; Program Validation; Summative Evaluation +
SYNECTICS: Behavior Chaining; Behavior Patterns; Information Theory; Problem Solving; Thought Processes; Transfer of Training +
SYSTEMIC TESTING: Behavior Change; Change Strategies; Social Change; Systems Analysis
SYSTEM TRANSFORMATION: Change Strategies; Program Design; Program Planning +; Sequential Approach; Systems Analysis
* TASK ANALYSIS (TASK DESCRIPTION): Job Analysis; Skill Analysis; Task Analysis +; Task Performance
* TECHNICAL CONFERENCE: Evaluation Methods; Job Analysis +; Skill Analysis; Task Analysis +; Task Performance; Thought Processes
TELELECTURE: Exceptional Child Education; Instructional Media; Telecourses +; Telephone Communication Systems; Telephone Instruction +
TRIALOGUE: Diffusion +; Educational Innovation; Innovation; Instructional Innovation; Multi-Media Instruction +
VISUAL INCONSISTENCIES: Administrative Evaluation; Concept Formation; Decision Making; Models; Simulators; Systems Analysis +

APPENDIX J

Professional Journals & Cited Techniques

ACADEMY OF MANAGEMENT JOURNAL: Management by Objectives
ADMINISTRATIVE QUARTERLY: Brainstorming
AMERICAN PSYCHOLOGIST: Role Playing
ARIZONA TEACHER: Contract Plan
AUDIOVISUAL COMMUNICATIONS REVIEW: Cost Benefit Analysis; Learner Verification & Revision; Task Analysis/Task Description
AUDIOVISUAL INSTRUCTION: Learner Verification & Revision; Multi-Image/Multi-Media Presentation; Program Planning Budgeting System; Simulation/Gaming
ECONOMIC JOURNAL, THE: Cost Benefit Analysis
EDUCATIONAL COMMUNICATION & TECHNOLOGY JOURNAL: Simulation/Gaming
EDUCATIONAL PSYCHOLOGY: Task Analysis/Task Description
EDUCATIONAL RESEARCHER: Discrepancy Evaluation
EDUCATIONAL TECHNOLOGY: Card Sort; Cost Benefit Analysis; Formative Evaluation; Learner Verification & Revision; Long-Range Planning; Needs Assessment; Simulation/Gaming
ELEMENTARY SCHOOL JOURNAL, THE: Discovery Technique; Micro Teaching
FUTURES: THE JOURNAL OF FORECASTING & PLANNING: Delphi Technique
HISTORY TEACHER, THE: Role Playing
HUMAN RELATIONS: Force-Field Analysis
HUMAN RESOURCES MANAGEMENT: Management by Objectives
IMPROVING HUMAN PERFORMANCE QUARTERLY: Simulation/Gaming
JOURNAL OF AESTHETIC EDUCATION: Stake Model (Evaluation)
JOURNAL OF APPLIED PSYCHOLOGY: Brainstorming
JOURNAL OF CREATIVE BEHAVIOR: Delphi Technique
JOURNAL OF EXPERIMENTAL EDUCATION: Field Test
JOURNAL OF MATHETICS: Mathetics
JOURNAL OF RESEARCH AND DEVELOPMENT IN EDUCATION: Field Test
JOURNAL OF TEACHER EDUCATION: Critical-Incidents Technique
MANAGEMENT RECORD: In-Basket Technique
MANAGEMENT SCIENCE: Delphi Technique
NURSING RESEARCH: Observation Interview
OCCUPATIONAL PSYCHOLOGY: Task Analysis/Task Description
PEDIATRICS: Observation Interview
PERSONNEL ADMINISTRATION: Behaviour Modelling
PERSONNEL JOURNAL: Appraisal Interview; Management by Objectives
PERSONNEL PSYCHOLOGY: Behaviour Modelling
PROGRAMMED LEARNING AND EDUCATIONAL TECHNOLOGY: Simulation/Gaming
PSYCHOLOGICAL BULLETIN: Critical-Incidents Technique
RESEARCH METHODS: Case Studies
REVIEW OF EDUCATIONAL RESEARCH: Discrepancy Evaluation; Sequencing (of Objectives)
REVIEW OF RESEARCH IN EDUCATION: Cognitive Mapping
TEACHERS COLLEGE REPORT: Stake Model (Evaluation)
TRAINING AND DEVELOPMENT JOURNAL: Behaviour Modelling
VIEWPOINTS: Formative Evaluation

APPENDIX K

FIELD EXPERT RESPONSE FORM
(Matching Techniques with Gentry's Management Framework Model)

For each TECHNIQUE NAME below, respondents entered a FUNCTION LETTER and FUNCTION NAME:

Appraisal Interview
Authoritative Opinion
Behavior Modelling
Bloom's Taxonomy
Brainstorming
Case Studies
Checklists
Computer Assisted Instruction
Computer Search
Content Analysis
Contract Plan
Cost-Benefit Analysis
Criterion Referenced Measurement
Discovery Technique
Feedback
Field Test
Flowcharting
Formative Evaluation
Gagne's Taxonomy
Interviewing Users
Learner Verification & Revision
Likert Scale
Literature Search
Long Range Planning
Management by Objectives
Micro Teaching
Multi-Image/Multi-Media Presentation
Needs Assessment
Program Evaluation Review Technique
Programmed Instruction
Program Planning Budgeting System
Questionnaire
Role Playing
Sequencing of Objectives
Simulation
Standardized Tests
Story Boarding
Summative Evaluation
Task Analysis
Technical Conference