A DESCRIPTION OF THE VERBAL BEHAVIOR OF SELECTED INSTRUCTIONAL DEVELOPERS IN THEIR INITIAL CONFERENCE WITH NEW CLIENTS: AN EXPLORATORY STUDY

Dissertation for the Degree of Ph.D.
MICHIGAN STATE UNIVERSITY
ROBERT D. PRICE
1976

This is to certify that the thesis entitled "A Description of the Verbal Behavior of Selected Instructional Developers in Their Initial Conference with New Clients: An Exploratory Study," presented by Robert Dean Price, has been accepted towards fulfillment of the requirements for the Ph.D. degree in Secondary Education and Curriculum.

ABSTRACT

A DESCRIPTION OF THE VERBAL BEHAVIOR OF SELECTED INSTRUCTIONAL DEVELOPERS IN THEIR INITIAL CONFERENCE WITH NEW CLIENTS: AN EXPLORATORY STUDY

By Robert Dean Price

Even though many instructional development models are provided by the literature, little information is available as to the details of the interaction of people engaged in instructional development. This study responds, in part, to that need by describing the verbal behavior of six instructional developers as they meet with new clients for the first time.

Six subjects participated in the study. All subjects were Ph.D.s at Michigan State University with major professional training in instructional development, practicing full-time at the time the study was conducted. The six subjects received their professional graduate training at four different institutions and represented a wide variety of national and international experience. The subjects were all males with a mean age of 46 years.

Michigan State University faculty members who were known to be interested or involved in instructional development were identified and contacted to determine their interest in participating in the study. Eighteen individuals were identified and briefed on their role in the research. Of the eighteen, fifteen people contacted a developer for an appointment and eleven were audio recorded by the developers. All conferences took place in the office of the instructional developer and lasted approximately one hour. In all, thirteen tapes were recorded (eleven from clients the researcher involved, and two from clients who came in naturally). Five developers completed two tapes and one completed three, for a total of thirteen.

Each tape was transcribed immediately upon receipt, and a content analysis method was adopted for analyzing the transcriptions. A classification system was established with 22 categories of analysis for the purpose of classifying all verbal utterances of instructional developers. A verbal utterance is all the words, phrases and sentences which communicated a message to the client of the instructional developer. Coders were selected and trained to code the data, and a method for determining intercoder reliability was established.

Three major category groups were identified which operated concurrently on any one verbal utterance. In the classification scheme, four temporal phases were identified through which all conferences seemed to pass. The next group consisted of five categories which included the substantive and instructional design verbal behaviors that the subject seemed to exhibit. The last twelve categories were described as the process behavior functions which the subject seemed to perform. The results of the reliability calculation showed the agreement to be 0.82 across all developers and all clients.
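The reported figures are of this general kind: each utterance is assigned to one of the 22 categories, the percent of utterances falling in each category is tallied, and agreement between coders is expressed as a proportion. The sketch below illustrates such a computation; the category labels are hypothetical and simple proportion agreement is assumed, since the exact reliability formula is not specified in this section.

```python
from collections import Counter

def category_percentages(codes):
    """Percent of utterances assigned to each category (hypothetical labels)."""
    counts = Counter(codes)
    total = len(codes)
    return {cat: 100.0 * n / total for cat, n in counts.items()}

def proportion_agreement(coder_a, coder_b):
    """Proportion of utterances two coders assigned to the same category.
    Assumed formula; the study reports an overall agreement of 0.82."""
    assert len(coder_a) == len(coder_b)
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

if __name__ == "__main__":
    # Hypothetical codings of ten utterances by two trained coders.
    coder_a = ["explaining", "opining", "soliciting", "explaining", "reinforcing",
               "explaining", "opining", "soliciting", "explaining", "opining"]
    coder_b = ["explaining", "opining", "soliciting", "reinforcing", "reinforcing",
               "explaining", "opining", "soliciting", "explaining", "explaining"]
    print(category_percentages(coder_a))
    print(f"agreement = {proportion_agreement(coder_a, coder_b):.2f}")
```

Run on the hypothetical codings shown, the sketch prints per-category percentages for the first coder and an agreement of 0.80 between the two coders.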
It was found that all the developers had the largest percent of verbal activity (74.57%) in discussing solutions to the client's problem. In the substantive and instructional design categories, the methods category received the greatest percent of verbal attention. An average of 51.87% of the utterances across all developers was classified as occurring in the substantive and/or instructional design categories. In the process behavior functions, there were four categories which stood out as receiving the largest percent of verbal attention by all the developers: explaining, opining, reinforcing and soliciting information. Only 0.51% of all utterances in all groups could not be coded by the coders because of typing errors or incomprehensible utterances.

In comparing the results of this study to the developer/professional-client interaction models of Davies, Silber and Havelock, several relationships can be seen. The strongest relationship seemed to exist in the categories of problem identification and solution discussion, with some consistency with the introductory category, the assumption clarification category and client reinforcement.

In considering the instructional developer's environment and the client's orientation to the research project, it was concluded that in the initial conference: (1) the verbal interactions of this study were inconsistent with the theoretic models of the developer/professional-client interaction, (2) more than one-half of the dyadic interchange was spoken by the developer, (3) subjects in the initial meeting spent the greatest percent of their time identifying the client's problem and discussing solutions to the problem, (4) about one-half of the verbal activity was concerned with content-related utterances, and (5) from a process perspective, the developers put most of their effort into explaining concepts and principles and giving opinions on the client's problem.

Areas for additional research are identified as: (1) the treatment that different clients receive, (2) the impact of client behavior on the educational quality of the product, (3) the effect of non-verbal communication on the developer-client relationship, and (4) research projects which examine the long-term relationship of developer-client interaction.

A DESCRIPTION OF THE VERBAL BEHAVIOR OF SELECTED INSTRUCTIONAL DEVELOPERS IN THEIR INITIAL CONFERENCE WITH NEW CLIENTS: AN EXPLORATORY STUDY

By Robert D. Price

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

College of Education

1976

© Copyright by ROBERT DEAN PRICE 1976

DEDICATION

This thesis is dedicated to my Mother and Father

ACKNOWLEDGMENTS

This writer wishes to express his appreciation to the many persons who have contributed to the design, development and execution of this study.

Appreciation is expressed to Dr. Allan J. Abedor as chairman of the guidance committee for his counsel and advice, and to Drs. Kent L. Gustafson, Bruce Miles, Norman T. Bell and George Sargent for their assistance and insightful suggestions.

Sincere gratitude goes to Dr. Paul W.F. Witt for his advice in the formulation of this study and his continued support over the last several years.

A special thank-you to Ms. Jane R. Jensen for valuable assistance in the preparation of this study. Her help made the tasks easier and the product better.
Gratitude is also expressed to the instructional developers and faculty who participated in this study.

My deepest appreciation goes to my wife Loretta and my daughters Chris and Beth for their patience and support while this thesis was in progress. Their encouragement, love and support were an invaluable asset.

TABLE OF CONTENTS

Chapter I. INTRODUCTION
   Statement of the Problem
   Objectives of the Study
   Assumptions
   Definitions of Terms
   Limitations of the Study
   General Organization of the Thesis
   Summary

Chapter II. REVIEW OF LITERATURE
   Introduction
   Organization of the Review
   Theoretical Frameworks
      Davies' Approach -- An Overview
         Product-Oriented Assumptions
         Prescription-Oriented Assumptions
         Product-Process-Oriented Assumptions
         Stages in the Development of a Developer-Client Relationship
      Silber's Approach
      Havelock's Change Agent Model
      Schein's Process Consultation Model
      Implications from Medicine
      The Counseling Literature
   Components of the Initial Stages in the Previously Reviewed Models
      Davies' Approach
      Silber's Model
      Schein's Process Consultation Model
      Havelock's Change Agent Model
      Medical Problem Solving Processes
      A Counseling Process
   Summary

Chapter III. DESIGN OF THE STUDY
   The Subjects
   The Clients
   Data Gathering Process
   Transcriptions
   Data Analysis Method
   Method for the Establishment of Analysis Categories
   Validity
   The Data Coding Process
   Determining Intercoder Reliability
   Data Analysis
   Summary

Chapter IV. PRESENTATION AND INTERPRETATION OF THE DATA
   Intercoder Reliability
   Developer-Client Conferences
   The Findings for Each Developer
      Instructional Developer One
      Instructional Developer Two
      Instructional Developer Three
      Instructional Developer Four
      Instructional Developer Five
      Instructional Developer Six
      Comparison of All Instructional Developers
   Discussion of Findings as Related to Selected Theoretic Models
      Davies' Approach
      Silber's Approach
      Havelock's Change Agent Literature
   Summary

Chapter V. SUMMARY AND CONCLUSIONS
   Summary
      The Subjects
      The Clients
      Data Collection
      Content Analysis Method
      Intercoder Reliability
      Findings
      Comparison to Reviewed Theoretical Models
   Preamble to Conclusions
   Conclusions
   Implications from the Conclusions
   Implications for Further Research
   Concluding Remarks

APPENDICES:
   A  Description of Subjects
   B  Subject Orientation and Directions
   C  Conference Information Form
   D  Transcript and Coding Process
   E  Directions for Coders and Coding Form
   F  Coder Cases
   G  System of Content Analysis

BIBLIOGRAPHY

LIST OF TABLES

   1  Four Coder Composite Reliability
   2  Conference Utterances of Clients and Developers
   3  Developer One, Clients A and B, Percent of Developer Utterances by Categories
   4  Developer Two, Clients A and B, Percent of Developer Utterances by Categories
   5  Developer Three, Clients A and B, Percent of Developer Utterances by Categories
   6  Developer Four, Clients A and B, Percent of Developer Utterances by Categories
   7  Developer Five, Clients A and B, Percent of Developer Utterances by Categories
   8  Developer Six, Clients A, B and C, Percent of Developer Utterances by Categories
   9  Percent of Developers' Utterances in Each Phase
   10  Each Developer and All Clients, Percent of Developer Utterances by Content Categories
   11  Each Developer and All Clients, Percent of Developer Utterances by Process Functions
   12  Agreement Between Theoretical Models and Findings by Phases
   13  Agreement Between Theoretical Models and Findings by Content Categories
   14  Agreement Between Theoretical Models and Findings by Process Categories

CHAPTER I

INTRODUCTION

The primary purpose of this study was to explore the verbal behavior of selected instructional developers in their initial conferences with new clients. This descriptive study utilized practicing instructional development consultants and university faculty in dyadic interchange centered around specific instructional problems. The initial conference between each instructional developer and faculty client was audiotaped and later analyzed for instructional developer verbal behavior. The findings of this study report the results of that analysis.

The research presented in this report was conducted at Michigan State University during the Fall, Winter and Spring quarters of the 1975-76 academic year. It is the object of this research to broaden the body of knowledge in instructional development so that improved application by practitioners and more relevant preparation of students can contribute to greater effectiveness, efficiency and joy in learning.
If the results of this study in some way help to achieve that end, the effort will be considered worthy. Statement of the Problem Instructional deve10pment (ID) is in its infancy. It has evolved out of a broader concept of educational technology and is still in the process of rationalizing its existence, demonstrating its validity and conceptualizing its domain. Hoban (1973), in summarizing the views of Heinich, Gustafson, Merrill, Briggs and Hamreus from presentations given at the 1971 Association for Educational Communications and Tech- nology Convention, has stated: These five views offered at the 1971 AECT Convention serve to remind the field that it is still essentially new. Such concepts as learning theory, instructional theory, systems, strategies, empiricism, techniques, and others were used by the various leaders in attempting to provide a framework for instructional development. Some of the concepts and definitions are contradictory; others are complementary. Such thinking, however, is characteristic of a new field as it goes through a cyclic process of ex- panding and contracting its conceptual framework. This process will continue until educators feel that an accu- rate and satisfying definition has been provided. As instructional developers or instructional development conSul- tants, as they are sometimes called, work with clients to improve the quality of instruction through the utilization of these concepts, they apply processes or models developed from their education and/or experience. Some of the processes or models which have been formulated which give guidance regarding general procedures essential to the in- structional development process are: Banathy, 1968; Griggs, 1970; Davis, Alexander and Yelon, 1974; Gerlach and Ely, 1972; Hamreus, 1968; Kaufman, 1970; Popham and Baker, 1970. These models for the most part are systematic in nature but do not provide great specificity for their application. Holsclaw (1974) supports this view when he states that little information is available in the literature giving explicit de- tails as to how these general design procedures can be applied suc- cessfully in higher education. Even though Davies (1973) and Silber (1973) provide a theoretical orientation in some dimensions, little research data seem to be available. Diamond (1975) calls for more re- search data when he states that we need to know more about ID ap- proaches and techniques that work, and more about what has been tried and what happened when it was tried. He labels all these kinds of concerns as research questions. Hoban (1973) has called for more descriptive data on the inter- action of people engaged in instructional development, and Schein (1969), in discussing process consultation, indicates that little specific information has been written about what a consultant does when he is with a client. Thiagarajan (1973) supports these points of view by commenting on the need for more information to prepare in- structional developers for the complexities of client interaction. The problem seems to be clear that there is a need for a greater knowledge base as to what, exactly, instructional development consul- tants do as they apply and interpret any one or more of a wide variety of general models to working with clients in an instructional develop- ment environment. This study was designed to respond to part of this need by describing what instructional developers do in the initial conference with a new client. 
It was the goal of this study to analyze the verbal behavior of instructional developers and describe it in a systematic and quantified fashion. Such data should contribute significantly to the descriptive literature and, hopefully, have some direct impact on the behavior of practitioners as well as professional program curricula. A direct benefit of this study should be the addition of data to the descriptive information base from which more experimental research can take place. Through such activities we can continue to build a scientific research base for this emerging, contracting and expanding discipline.

Objectives of the Study

The five objectives of the study are:

1. Establish and define categories of verbal behavior of selected instructional developers in initial interviews with new clients.
2. Describe the verbal behavior for each instructional developer in each new client conference.
3. Describe the commonalities and differences in verbal behavior from initial conferences, for each developer, across all clients.
4. Describe the commonalities and differences in verbal behavior across all developers and all clients.
5. Compare the verbal behavior of instructional developers to selected theoretic models in the literature in terms of commonalities and differences.

Assumptions

The four major assumptions of this study are as follows:

1. The faculty clients who visited the instructional development consultants were typical of the clients the developers see over time.
2. The verbal behavior of the instructional development consultants was their typical behavior.
3. The instructional development consultants selected for the study were typical of instructional developers in higher education.
4. The faculty members at Michigan State University with real instructional development problems, who were used as the clients, would conduct themselves in a "normal" manner.

Definition of Terms

Client: A University faculty or staff member responsible for the implementation of an instructional system.

Initial Conference: The first face-to-face meeting between the instructional developer and client for the purpose of beginning discussion of the client's instructional problem.

Instructional Developer or Instructional Development Consultant: Any individual who provides an instructional development service to clients. For this study, instructional developers were Ph.D.s with a minimum of three years' experience working full-time in a university setting.

Instructional Development: A systematic procedure for devising and managing a set of experiences for a specific population with the intent of facilitating their attainment of a specific set of learning outcomes.

Verbal Behavior: Statements that are spoken, questions that are asked and other utterances that are intended to communicate a verbal message.

Limitations of the Study

The four limitations of this study are:

1. By studying a relatively small number (6) of instructional developers who work at Michigan State University, the generalizability of the study may be limited.
2. Because the study describes client and developer behavior, little can be said about the effects of important variables in terms of cause and effect.
3. Due to a limitation of new clients during the period of the study, faculty with instructional problems were recruited for the study and asked to talk to an instructional developer. The fact that they knew they were part of the study may have affected their behavior.
4. The instructional development consultants knew they were part of a research project. No attempt was made to determine the effect of this on their behavior.

General Organization of the Thesis

This thesis has been organized in a conventional manner and consists of five chapters. The remainder of Chapter I presents a chapter summary.

Chapter II presents the review of related literature examining various models of client consultation.

Chapter III presents the methodology of the study, which includes a description of the subjects, the clients, the client problems, and the data gathering, coding and data analysis process. Chapter III also includes the coder training process and intercoder reliability methods.

Chapter IV presents and discusses the data organized around the objectives of the study.

Chapter V summarizes the data, draws conclusions and makes recommendations for further study.

Summary

Much has been written in the field of instructional development with regard to general approaches or models of conducting instructional development. The activities of the instructional development consultant require more insight into working relationships than is provided by such models. The need for more explicit knowledge is supported by many leaders in the field, which provides the rationale for this study. Since the need seems to be great for more specific and explicit descriptive data on what instructional developers do, there seems to be justification for focusing on part of that need -- the initial conference.

The study is directed toward the description of the verbal behavior of instructional development consultants in the initial conference with new clients. Five objectives for the study have been specified, and these will be used as an organizational structure for data analysis, data presentation and discussion.
CHAPTER II

REVIEW OF LITERATURE

Introduction

The purpose of this study was to identify and describe the strategies used by instructional developers in the initial interaction with the client in a clinical-type setting. The strategy employed in this review was to examine selected areas of education for relevant studies related to the issue. Little was found which could be considered relevant for inclusion in this study. The strategy then was to widen the search to include medicine, industrial consultation, organization development and consultation, sociology, counseling, diffusion of innovation and change agent literature. While many reviewed studies and theoretical articles could have been related to the study in a vague and general way, few addressed the main theme of this study. Consequently, what follows is a review of selected authors who have contributed in a major way to the central purpose of this study. These primary authors represent various substantive areas, but all have addressed key issues which are relevant to this research. In addition, selected concepts and principles of secondary authors will be included.

Organization of the Review

Each of the authors selected has developed a conceptual framework for professional interaction with a client, part of which is the initial conference. Since this study was concerned with only that activity, the emphasis of the review will be placed on the stages which seem to be included in the initial conference. The total conceptual framework for client-professional interaction will first be presented so the reader will be able to view the initial stages as they relate to a total approach. The following questions will be used to examine the writing of each author:

1. What is the overall theoretical framework in each author's approach?
2. What are the specific components of the initial phases of each model which are included in the initial conference, and what are the strategies and variables which affect the initial interaction between the instructional developer and the client?

Theoretical Frameworks

Davies' Approach -- An Overview

Davies (1973) describes three dimensions of an instructional developer-client relationship. One dimension includes the preliminary functions which are performed with the client, which are problem definition, interpretation of the data and the generation of alternative solutions. The second dimension of the working relationship is concerned with the kinds of assumptions which underlie the relationship.
Davies categorizes these assumptions into product-oriented assumptions, prescription-oriented assumptions and product-process-oriented assump- tions. The third and last dimension of the Davies model lists the successive stages in the development of the instructional developer- client relationship. These three stages are: entering the relationship, maintaining the relationship and terminating the relationship. Parallel 10 to the last dimension are a sequence of task-oriented steps which include the general areas of diagnosis, planning action, implementing action and evaluation. Problem definition is defined as helping the client describe the problem with which he is faced. The developer must address such issues as: (1) How well or how much has the client defined the problem already? (2) Is the problem really a problem or just a symptom of a greater prob- lem? and (3) Should the developer attempt to change or reformulate the problem? Interpretation of data is described as a complex process affected by point of view, assumption and a subject matter discipline. If not handled in an apprOpriate manner, such discussion could hamper or even terminate a working relationship. In generating alternative solutions, the developer presents the client with a range of possible solutions to the problem. The developer is obligated to provide advantages and disadvantages for each of the proposed solutions and increase the client's awareness of alternatives._ In describing instructional developer-client relationships, Davies analyzes the different assumptions and expectations that may underlie the relationship. His categorized assumptions (product-oriented assump- tions, prescription-oriented assumptions and product-process-oriented assumptions) represent a mental set with which an instructional develop- er and client will enter into a relationship. These assumptions and expectations affect the direction of activities, satisfaction of effort and output of the relationship. Seven phases through which a develOper- client relationship may pass are also described with an explanation of 11 the concurrence of some of the phases. Nierenberg and Calero (1975) support Davies in their contention that the moment the professional and the client meet, there is communication be- tween them and attitudes and assumptions on both sides come into play. These assumptions have a deep-seated foundation in the values, phi1050phy, culture and education of both parties in a developer-client relationship. They will affect the perspective of a particular situation, affect decision- making, can be cumulative in nature, and most certainly will result in differences in perception of the developer and the client because of the discrepancies between the two sets of assumptions. Thiagarajan (1973) agrees with Davies in his belief that these discrepancies can create po- tential confrontation and should be aired in order to reduce defensive- ness and promote more open communication. Product-Oriented Assumptions. According to Davies (1973), in this particular orientation the client recognizes the fact that he has an instructional problem and seeks out or is offered the services of a developer with specific professional skills. The product that normally results from such a relationship is a package of instructional materials such as slides, film or videotape with whatever machinery is needed for utilization of the materials. 
Davies perceives the relationship as one of a customer (client) and a supplier (instructional deve10per) and states, "A need is per- ceived, a supplier is chosen, an explanation is made as to what is needed, a final selection is made from a range of alternatives, and agreement is reached over cost and delivery." The need of the client is to be served, and the developer takes the total responsibility for supplying the product. While this may seem to be a very mechanical 12 and technical process, educational principles must be employed to pro- duce a quality educational product that learners can use in an effec- tive and efficient way. Gustafson (1971) indicates the emphasis of this approach in much of the literature and activities in instructional development. Prescription-Oriented Assumptions. The assumptions that support this kind of relationship are based on the premise that the instructional deve10per is a professional who can diagnose the client's instructional problem and prescribe the necessary course of action to remedy the prob- lem. Davies likens this approach to a doctor-patient relationship where the physician takes the dominant role in diagnosing the illness and prescribing a treatment. Like the patient, the client will visit the professional when he feels ill and is in need of a diagnosis and treatment. This kind of relationship assumes the instructional developer will take charge and tell the client what the nature of the problem is and what action to take. As with the product-oriented relationship, it is necessary that the initial level of communication clear any dis- crepancies that might exist if the client is not operating on the same set of assumptions that form the operational base for the instructional developer. Some developers may function on the basis of the prescrip- tion-oriented assumptions and some clients may or may not enter a rela- ‘tionship with these kinds of expectations in mind. Either way, unless theeassumptions are aired, problems with the relationship could exist from the very beginning. 13 Some instructional developers may prefer this model and actually perceive their clients on this basis, while a client could prefer the product-oriented model and act accordingly. In such a situation, major differences in the assumptions and expectations of both parties can exist. Product-Process-Oriented Assumptions. Gustafson supports Davies' category of assumptions by indicating while it is important to develop quality products, a balance must be struck between product development and people development. This can be one of the advantages or outcomes of the product-process orientation set of assumptions. Davies (1973) describes this set of assumptions as concerned with "the view that the most effective, efficient relationship comes from considering it as a process directed towards the achievement of some mutually agreed and valued instructional result in accord with the organization's mission." This set of assumptions views the relationship as an objectives- oriented relationship employing the basic functions of management to clarify tasks that need to be completed and the necessary role to per- form such tasks during all phases of the project. The deve10per may enter the relationship with such a set of assumptions but cannot assume the client has the same set of assump- tions, so part of the role of the developer will be to make clear his assumptions and list the advantages of the product-process orientation for development activities. 
Stages in the Development of a Developer-Client Relationship. Davies describes two concurrent processes as the developer and the client interact in their developing relationship. One process consists of the actual tasks of the project, which are described as: diagnosis, planning action, implementing action and evaluation. These tasks can be further analyzed into a variety of specific design and development activities, such as objective specification, strategies, media production and formative evaluation. Operating concurrently with the task-oriented activities and within the preliminary functions (problem identification, etc.) are the seven phases of relationship development which the developer-client relationship possesses. Davies groups these seven phases into three categories:

A. Entering into a relationship
   1. Initial contact with the client system
   2. Negotiation of a formal relationship
B. Maintaining an ongoing relationship
   1. An analysis-oriented relationship
   2. A decision-making-oriented relationship
   3. An action-oriented relationship
C. Terminating the relationship
   1. Evaluating the fruits of the relationship
   2. Reducing involvement and/or terminating the relationship

It is important to emphasize that Davies suggests, as the developer-client relationship is initiated and develops, that project activities are being carried out concurrently with phases of relationship development. All this is happening within the context of the set of assumptions that each member brings to the relationship.

It is also important to realize that as an instructional development project is initiated and evolves, as a system it is always changing, and as a result people's assumptions and expectations may change as new situations are created. This may mean that the developer continuously faces a certain level of incongruency in assumptions that must be openly discussed between parties as the indicators are perceived.

Davies' entering into a relationship (category A above) will be discussed at a later point in the chapter. Two components of this category will be examined in detail after the general framework of each author is presented.

Silber's Approach

Silber (1973) proposes a five-stage process which he describes as the "People Function/Skills Involved in Working Content Specialty to Develop Instructional Systems". He describes the five stages as: (1) Establishing the relationship; (2) Gathering data regarding the problem; (3) Working toward a solution; (4) Challenging the client; and (5) Managing the development. As with the other approaches examined in this chapter, a brief look will be given later in the chapter to the entire model, with a close examination of the initial stages.

Stage one, establishing a relationship, is mostly concerned with how to break the ice with the client, the establishment of an open, lighthearted, non-threatening tone, and equal, open, non-verbal communication systems. In addition, the identification of expectations and assumptions in the relationship, as well as being a listener, is considered important. Establishing a "contract" with the client is also an element of the relationship establishment stage.

Gathering data regarding the problem. This second stage is concerned with assessing the state of the current system, client probing techniques, strategies for getting the client to use examples of problem symptoms, and identifying the type of client with whom the developer is interacting.
Several attitudinal and personality categories are included. Working toward a solution. In stage three attention is given to the informing and training of the client in approaches for solving the problem. In addition, brainstorming is listed as a technique when stuck on a particular problem and a strategy of suggesting answers to problems is discussed. Periodic redirecting of general strategy is suggested as a method of getting out of a stalemate condition. The synthesizing ele- ment of stage three requires the tying together of all the information gained from other functions up to this point. The next element of this stage is labeled decision-making and involves the monitoring of the effects of all decisions that are made and forcing decision-making in method and content. Stage four, challengipg the client, is concerned with challenging the client intellectually by selectively disagreeing in an academic manner and being able to withdraw a challenge if it questions too deeply the belief of the client. If the relationship is strained or the pro- ject is threatened by such a "challenging" technique, the instructional developer must know how to lose challenges, as a strategy, for the pur- pose of enhancing the project or relationship. 17 Persuading/cajoling is a strategy for affecting decision-making, through such techniques as presenting both sides, looking at consequen- ‘ ces, demonstrating better and easier ways, compromising and demanding, if possible. The next component of stage four emphasizes methods of confronting persons/clients regarding the violation of commitment or psychological contracts made by the client. Perhaps one of the most use- ful strategies suggested by Silber which follows the previously mentioned stages and techniques is emphasizing/consoling. This technique is sug; gested after confronting, challenging, training or relationship estab- lishment and is defined as essentially putting oneself in the position of the client and siding with the client against others, "the system" or unreasonable rules or expectations. Using personal stories which relate to the client's situation is suggested in this stage as a useful tech- nique for instructional developers. Silber then describes the last stage -- managing the development. The management process is analyzed into two categories: assigning work and criticizing materials together. The work assignment category is concerned with the establishment of a clear organizational procedure for getting the work done by identifying the tasks, assigning work, develop- ing time lines and getting a psychological commitment from the client to do his share. In criticizing materials together, the instructional developer should reinforce the client for his efforts, ask for rationale, use questions instead of criticism, be constructive, offer to work with the client on correcting efforts, and return to retraining stage if necessary. 18 Havelock's Change Agent Model The instructional developer may be perceived as an innovator or change agent in education. As the instructional developer acts in the real world, he must function in the capacity of a product developer and a change agent in a people world (Nord, 1973). As the developer inter- acts with people in education, he may be attempting to get them to adopt new ideas or innovations both in product and process. Bennis and others (1962) define the change agent as the person who is attempting to effect change. 
Rogers and Shoemaker (1971) define the change agent as: a professional who influences innovation decisions in a direction deemed desirable by a change agency. In most cases, he seeks to secure the adoption of new ideas... (p. 227). Havelock (1973) provides the change agent with a guide to working with clients in an education setting. He lists six stages that a change agent moves through as he is involved in a particular innovation: 1. Relationship 2. Diagnosis 3. Acquisition 4. Choosing 5. Acceptance 6. Self-renewal ’ Stage 1, the relationship stage, describes the first step in how the change agent works. The change agent needs to establish a viable relationship with the client system. It is important to start the re- lationship in a well delineated, helping role. This role can provide the foundations from which the relationship can develop. 19 Stage 2 is concerned with diagnosis of the client problem. The change agent must determine if the client is able to clarify his prob- lem by articulating his needs as problem statements. Stage 3, acquiring relevant resources, is the next stage after the problem has been defined. The purpose of this stage is to identify and gather resources which are relevant to possible problem solutions. Stage 4 is choosing the solution. After the problem has been iden- tified, resources identified and gathered, a number of alternatives are generated from which a potential solution is selected. Stage 5 is concerned with gaining of acceptance for the solution. The purpose of this stage is to get the solution accepted and adopted by the largest number possible in the client system. The change agent helps the client to adopt the innovation. In stage 6, stabilization and self-renewal, the change agent develops the client so that he can function without the change agent. The client must be able to continue using the innovation and also be- gin to work on other problems in a similar way. These stages of how the change agent works are based on the assump- tion that the change agent is a "process helper" as opposed to a catalyst, a solution giver or a resource linker. Havelock defines these as our primary ways in which a person can act as a change agent. The catalyst pushes and pressures the system to be less complacent and to start work- ing on serious problems. The instructional developer may first act as a catalyst agent if he is to succeed (Hartsell, 1971). Havelock (1971) states the solution giver must have a solution to a problem and know when and how to offer the solution. In addition, he 20 must know how to help the client adapt his needs and problems to the solution. The resource linker is the change agent who brings people together and helps the client find and make the most effective and efficient use of the resources inside and outside his own system. The procedure that Havelock emphasizes is the change agent as a process helper. Within the six stages, he lists some specific areas in which the process helper role can provide valuable assistance: 1. Showing the client how to recognize and define needs 2. 
Showing the client how to diagnose problems and set objectives
3. Showing the client how to acquire relevant resources
4. Showing the client how to select or create solutions
5. Showing the client how to adopt and install solutions
6. Showing the client how to evaluate solutions to determine if they are satisfying his needs

This procedure reflects an attitude toward working with clients and, of the three assumption-based approaches described by Davies, it may most resemble the process-product approach. It would be of greatest importance to clarify assumptions related to role definition in a relationship following the process helper procedure, since some clients might be operating from the belief that the change agent/instructional developer is serving as a resource linker, solution giver or catalyst. It is interesting to note here that what Davies (1973) identified as the prescription-oriented approach could be closely related to the solution giver and, as indicated earlier, the process helper approach most resembles some elements of Davies' product-process-oriented model.

Later in this chapter, a close look will be given to Havelock's stage 1, building a relationship. The specific steps, strategies and techniques will be examined.

Schein's Process Consultation Model

Schein (1969) describes several models of consultation which are common in a variety of professions and which are applicable to this study. Perhaps the most commonly used model of consultation is the Purchase Model. The buyer, or client, is the purchaser of some expert information or expert service. The client perceives a need of some kind and seeks to purchase a service from the consultant. The client knows the service he needs, and his ability to communicate that need to the consultant affects the success of the relationship.

The Doctor-Patient Model takes a slightly different approach to consultation. The consultant in this approach would "examine" the client to determine the problem and make recommendations for a treatment of some kind. This approach has two primary problems. The first is the reluctance of the patient (client) to reveal the kinds of information necessary to make an accurate diagnosis. The second is the unwillingness, at times, of the client (patient) to believe the diagnosis or accept the prescription or treatment of the consultant.

The physician holds a unique position in our society and his services are in great demand. He is seen to embody great wisdom and social status to the point of almost being super-human. When a non-physician attempts to use the Doctor-Patient Model in other contexts, he may not enjoy the same social status and image, so he may not be as successful in the approach. This phenomenon may be the reason why consultants who use this approach cannot be as "successful" as the physician. The consultant could overcome this problem to some degree by building a common diagnostic frame of reference with the client so that the client is continuously involved and informed, as in the prescription specification.

Schein offers process consultation as the approach that should be used with clients. Process consultation focuses on joint diagnosis and passing those diagnostic skills on to the client. A fundamental assumption of process consultation is that the client must learn to see the problem, share in the diagnosis and help to determine a solution. It is the role of the process consultant to be an expert in how to diagnose problems and how to develop a helping relationship.
Schein lists seven stages in the process consultation approach:

1. Initial contact with the client organization
2. Defining the relationship, formal contract and psychological contract
3. Selecting a setting and method of work
4. Data gathering and diagnosis
5. Intervention
6. Reducing involvement
7. Termination

Schein points out that these stages interact and overlap with each other. Also, they are not easily defined in periods of time, and some of them go on simultaneously.

Stage 1, the initial contact, may take place through a letter or telephone call and can result in some idea of the problem. Most likely the problem will not be clarified until the exploratory meeting, which will be the first direct face-to-face client contact. Schein states, "One of the most important criteria for predicting the likelihood that a useful consultation relationship will result is the initial relationship formed between contact client and consultant."

Stage 1 and stage 2 will usually take place separately, but little depth is achieved in stage 1 unless a letter or telephone call explains the problem in some detail. It is at the exploratory meeting that stage 2 generally takes place. The initial relationship is established, written or verbal contracts are agreed upon, and psychological issues such as expectations and assumptions are aired.

Schein suggests that the final phase of the exploratory meeting usually results in stage 3 being completed. This would involve the specification of a timetable and methods of work, and the development of preliminary goals and statements. Even though the model calls for stage 3 to be completed during the exploratory meeting, this should vary with the relationship.

Stage 4, the data gathering and diagnosis stage, is really done simultaneously with stage 5, the intervention stage. As data are being gathered from the client, a certain amount of intervention is taking place. The purpose of this stage is to gather as much data about the problem as possible in order to make a diagnosis of the problem so an appropriate method of intervention can be generated, which is stage 5.

Stages 6 and 7, reducing involvement with the client and termination of the relationship, usually take place after a mutual decision has been agreed upon by the client and the consultant. Some involvement may continue at a low level for some time and may even be established based on client needs.

The Schein seven-stage model was developed for use with industrial organizations, and some of the terminology reflects that orientation. However, much of the approach, especially in the early stages, has much to offer the instructional developer. It is the early stages with which this study is concerned and which will be explored in further detail later in the chapter.

Implications from Medicine

Another way of thinking about the consultant or professional-client relationship is in terms of the relationship between a patient and his physician. This was described briefly by Schein (1973) in the previous model. The emphasis in the physician-patient relationship is the diagnosis of the problem. Harvey (1972) describes the process by which a physician arrives at a diagnosis as the following sequence of steps:

1. Collecting the facts
   a. Clinical history
   b. Physical examination
   c. Ancillary examination
   d. Observation of the course of illness
2. Analyzing the facts
   a. Critically evaluate the collected data
   b. List reliable findings in order of apparent importance
   c. Select one or preferably two or three central features
   d.
List diseases in which these central features are encountered e. Reach final diagnosis by selecting from the listed diseases either: (1) The single disease which best explains all the facts, or if this is not possible, (2) The several diseases, each of which best explains some of the facts f. Review all the evidence -- both positive and negative -- with the final diagnosis in mind The emphasis in this model is the collection of as much data as possible and then the analysis or review of the data until a final diagnosis is reached. Wintrobe (1970) suggests a similar sequence, and suggests that hypotheses may emerge at various points as the physician is involved in data analysis simultaneously with data collection. This view of the diagnostic process is based on a conceptual, scien- tific inquiry that is inductive and the belief that conclusions evolve out of an objective analysis of the data and that these data constitute the ultimate arbiter of scientific claim of knowledge. One challenge to this classical empiricist conception is the research within the do- main of cognitive or information-processing theories of psychology. Neisser (1967) suggests that a person's expectations have a significant impact on the simpler forms of cognitive activity and the more complex 26 process of problem solving. Another view has been developed in the research by Renaldi (1968), Elstein (1972), Barrons and Bennett (1972). Their research may be sum- marized as follows. Physicians tend to generate multiple diagnostic hypotheses within the first five minutes of the "workup". They also con- stantly test and revise these hypotheses through the analysis of addi- tional data. In addition, their mode of processing the data is struc- tured by the hypothesis they are entertaining at the time. As these medical approaches are examined and compared to previous models in this chapter, it is apparent that much research in this area is applicable to the dynamics of the instructional developer-client re- lationship, even though medical approaches do not deal in total with all the same variables. The doctor-patient analogy can be misleading and is not the same as the instructional developer-client relationship. First of all, the way the "patient" is identified is different. It is not as clear who is saying, if at all, "something is wrong with me". The establishment of the client or patient system is not the same. For the instructional developer the question has to be asked, "How many pe0p1e is the client?" Secondly, the responsibility for diagnosis is different with the in- structional developer-client. In the developer-client relationship, the diagnosis of the problem may be more of a mutual activity entered in- to by peers. Presently, that peer relationship does not exist in the physician-patient interaction. Thirdly, even though health services de- livery is changing, the degree of participation by the "patient" is dif- ferent than the instructional developer-client relationship. More will 27 be discussed on these points later in the chapter. The Counseling Literature The counseling literature presents several theoretical approaches to counseling. Arbuckle (1970) lists five of them: (1) the Psycho- analytic Theory; (2) the Existentialism Theory; (3) the Rational Psycho- therapy Approach; (4) Client-Centered Counseling; and (5) Behavioral Counseling. What follows is a brief examination of the salient points about these five theories. 
The Psychoanalytic Theory is a therapeutic approach which views man as a creature governed by aggressive instincts. The goal of the professional or therapist is to help the client maintain a balance be- tween forces not of his making. Stress is placed on uncovering and in- terpreting the client's life and living with a psychosexual orientation. The approach with the clients tends to be rational and cognitive. Existentialist Counseling presents man as a free being, alone and the determiner of his life and living. It also presents man as a grow- ing, evolving being always in a state of change. Existentialism sees counseling as a human encounter with a primary purpose of helping the client to achieve an acceptance of responsibility for self which results in freedom. The approach is for the counselor who has achieved indi- vidual freedom, to experience with the client and help the client to achieve the same level of self-acceptance. Rational Psychotherapy presents a more scientific approach. It views the ability to think our way out of negative and harmful states of mind. The major function of the counselor is to teach the client to 28 change irrational views through persuasion, challenge and prodding. The client is taught to scientifically view his irrationality. The client- counselor relationship is not considered to be important in this ap- proach. The Client-Centered Counseling approach is characterized by a warm, acceptant, empathic and congruent relationship between the client and the counselor. The role of the counselor is to create a threat-free relationship with the client in order to promote self-growth, self- development and self-actualization. The counselor is not directive and there is little advice or information given to the client. Reflection is more important than interpretation since the latter may only be the counselor's perception of reality. Behavioral Counseling is essentially stimulus-response psychology, and the individual is viewed as a set of behaviors more than a gestalt human being. The role of the counselor is to modify the behavior of the client through manipulation. The relationship becomes a series of conditioning or counter-conditioning experiences. Even though exposure to such thinking may influence instructional development practitioners, it is unlikely that the specifics of such techniques can be used with any success due to an incompatibility of client-developer relationship outcomes. Typically, the counselor will focus on the client and the feelings of the client toward himself or the system in which the problem is arising (McGeheaty, 1968). The emphasis is placed more on the personal feelings of the client while the instructional development consultant is more concerned that he and the client come together as peers to examine 29 an instructional problem and work for solutions, not looking at the client as the problem. However, some of the skills and techniques of the counselor such as empathy and listening skills may be essential if the developer is to become a more effective consultant. Instead of estab- lishing a peer relationship, sharing knowledge and taking objective views of the problem under examination, most counseling theories put the client into more of a clinical atmosphere, similar to the physician-patient model. This is not to say the instructional developer shall not be aware of a behavior change in the client -- some situations may require client behavior change. 
For this reason, the Behavioral Counseling model may have more direct application than the other theories. More will be presented later in the chapter regarding the application of counseling theory to the initial client conference.

Components of the Initial Stages in the Previously Reviewed Models

An overview of several professional-client models has been presented. The purpose was to provide the reader with a view of each approach before a more complete examination of the initial stages of each approach takes place. It is the initial conference between the instructional developer and client with which this study was concerned, and the remainder of this chapter will provide a closer examination of the early stages of each approach with some interpretation and comparisons.

When an instructional developer meets with a client the first time, the outcomes of those meetings will most likely vary because of individual personalities, different client problems, skills of the developer, individual commitments, needs, personal security and a host of other variables. Some developer-client relationships may move quickly in a first meeting to selecting a solution to the problem from among alternatives, while others may continue over several meetings before the problem can be defined. Since it was the first meeting with which this study was concerned, the initial stages of each reviewed model will now be examined more closely.

Davies' Approach

Davies (1973) describes the first stage with a client as "entering into a relationship". He has further subcategorized this phase into: (1) initial contacts with the client system, and (2) negotiation of a formal relationship. Davies emphasizes that when entering into a relationship, there is a desire to get on with the job instead of establishing the relationship and defining the roles of all involved in terms of what the client expects to accomplish. Davies feels that it is important to lay this strong foundation from which the project can be developed.

The initial contact with the client system may come from a letter or telephone call to set up a meeting. In the telephone conversation or letter the instructional developer may be given some information as to the problem or reason for the meeting, or the initial contact may just establish a meeting time. Davies suggests trying to find out why the developer was approached, as it may provide data as to the client's philosophical and professional orientation.

The first face-to-face contact may come at the exploratory meeting. This is the first time that the client and developer may come together to discuss in precise terms the nature and scope of the problem situation. In addition, the meeting should serve to establish the degree of assistance that can be given and to design a plan for any further action. Davies also suggests that the developer will want to determine the contact client as well as determine the organization of the client system. The exploratory meeting should also include other relevant members of the client system and other developers, if necessary. A high-ranking member of the client system should be there to commit faculty members' time and some financial support to the project. An open but confronting atmosphere should be maintained. The developer should attempt to determine client willingness and commitment, the practical outcomes of the purpose for the meeting, and potential resource allocations.
The developer also has to examine the potential payoff for himself and the possibility of an institutionalized project. Once it has been agreed that the project is worth developing, Davies has suggested the negotia- tion of a formal relationship has begun. Davies next raises the quedf tion of the most appropriate method of entering the client system and how it may not be through the person who was initially identified as the client. This is an important issue that Davies has raised because of the efficiency of the developer's time. It is important that the developer deal with a person, as a client, who will at least have a positive effect on the client system, if not a major impact. Davies suggests that there are actually two contracts to be nego- tiated with the client. One is a formal contract and the other is a psychological contract. The formal contract addresses such issues as tasks, time, method, termination, variables and benefits for all con- cerned. The psychological contract is concerned with the expectations 32 that each has of the other. It involves a willingness and commitment to the project in terms of time and degree of effort. Davies also indicates that the contracts should specify role definition and allocation of re- sources. When considering the Davies approach, it is important to integrate all he has to say regarding the clarification of assumptions and expecta- tions and the very important role that this insight should play in the initial exploratory meeting and the continued relationship with the client. Silber's Model Silber (1973) includes in his approach much Of what Davies explores, but arranges the process somewhat differently and is somewhat more ex- plicit about his strategies. He is also more specific in some categories than Davies. Silber analyzes the initial two phases of the developer-client rela- tionship as: (A) establishing a relationship, and (B) gathering data re- garding the problems. Under establishing the relationship, he has listed: (1) breaking the ice, (2) establishing the tone, (3) communicating non- verbally, (4) identifying expectations, (5) identifying expectations and assumptions, (6) listening, and (7) contracting. Breaking the ice is a strategy that is designed to establish a pleasant and cordial relationship with the client. Warmness and openness are also characteristics of this strategy. Attempting to let the client know "where you are coming from" or something of your value system may help draw out the client so you can begin to learn "who he is" or some- thing of his values. Davies' (1973) expectation of the exploratory meeting does not include the detail that Silber presents, but he does use 33 such language as "openness" in describing the tone of the exploratory meeting. In Silber's subcategory under establishing a relationship, he describes the procedure for establishing the tone with the client. He describes the importance of being task-oriented and establishing an air of equality. Openness is again listed as an important element of set- ting an appropriate tone of praise, constructive criticism and a non- threatening, helping relationship which contributes to a positive atmos- phere. The importance of non-verbal communication is next emphasized with attention given to the value of a pleasant, calm and equal voice tone. Body language should be read as equal and open and non-verbal reinforc- ers should reward and be conducive to an open and equal interchange. 
Identifying expectations is described as the next strategy for establishing a relationship. Davies (1973) is very thorough in his des- cription of client expectations and assumptions. Silber does not list assumption clarification specifically, but in his description of what he calls identifying expectations, he does allude to assumption testing. He mentions the necessary components of this strategy as determining why the client is seeing the developer and what brought the client at this particular time. What the client and others expect out of the project is listed as necessary to determine, as well as to clarify understanding of what the client thinks the developer can do for him. The client should also be clear as to what he expects of himself and how much time he is willing to commit to the project. Another essential area of clarification is the amount of time available for the completion of the project. 34 Listening is another skill that must be practiced carefully and accurately. The instructional developer must "listen" to both the ver- bal and non-verbal stimuli. Careful listening is essential if the developer is to be able to synthesize and interpret what the client has said verbally and non-verbally. Another key point to remember is not to interrupt the client. Davies (1973) considers the development of a formal and psychological contract as necessary for the satisfactory im- plementation and completion of a project. Silber, too, has contracting as a category for special attention. He states that this needs to be done carefully and in a timely way and should result in a specification of what is to be done, when it is to be completed and by whom. Modes of accountability should be determined for all parties and a clear commitment should be given by all parties to fulfill their reSpective parts of the contract. The second phase that could be considered within the domain of the first meeting between client and developer is the gathering of data re- garding the problem. Silber has divided this activity into four major categories: (1) identifying the current state of the art; (2) electing/ probing the client; (3) generating examples; and (4) identifying the client. Davies (1973) establishes problem definitions as necessary stages that the developer goes through, but his process is not as linear as Silber's. It seems that the stages of relationship development that Davies suggests are optional when the problem is being defined with the client as well as his interpretation of the data. Silber first suggests that the present system be assessed in terms of content, technique, instructor role and use of media. The identification of course success 35 in terms of student attitude, instructor reaction, as well as problems, provide key data at this point. It is strongly suggested that solutions should be avoided at this stage because of the lack of data. The next step suggests the developer solicit and probe the client for information regarding the project or course. This is done by asking general questions related to goals and from there to more specific ques- tions. The generation of examples, the next subcategory, can be useful in helping the client to get to specifics and clarify the answers to prev- ious questions. Silber also suggests that it is necessary to identify the type of client with which the developer will be interacting. He lists this as the last step in gaining data regarding the problem. 
Some of the information necessary for such a decision is to determine whether the client will be cooperative or tend to block efforts being made in the project. It must also be determined if the attitude of the client is a superior one or one of equality. Determining where he is philo- sophically and pragmatically will help to determine his attitude towards instructional development and the developer. Identifying the client's personality as to openness vs. defensiveness, whether he is a traditional- ist or an empiricist, a worker or slow producer and his general ability can all be data which may be helpful in decision making. Schein's Process Consultation Model Within the scheme by Schein (1969) previously described in this chapter, stage one (initial contact with the client's organization) and stage two (defining the relationship, formal contract and psychological 36 contract) indicate an obvious similarity between Schein and Davies (1973) exists because much of what Davies proposes was adopted from Schein's writings. However, there is enough difference and specificity in Schein's work to warrant a close examination of the first two stages of his approach. The initial contact with the client system can be made in a variety of ways. These contacts will also vary in the degree of information avail- able from the client. The client may attempt to describe the problem or may simply make an appointment with the developer or the developer's sec- retary. Regardless, the contact is usually not face-to-face but by tele- phone or letter and the relationship does not actually start until the first exploratory meeting. Schein, Davies and Silber emphasize that it is important to find why the client has come to the developer. At that time, the instructional developer may also learn something of the client's values, expectations or assumptions by finding out why the developer was contacted. Schein emphasizes that the quality of the long-term relation- ship can be determined by the initial relationship formed between the client and the consultant. The client can also be evaluated for degree of openness, spirit of inquiry and authenticity of communication by assess- ing responses to questions related to his willingness to explore the di- mensions of the problem he professes to perceive. Schein points out that if the client wants reassurance for some form of action or a quick solution to a temporary superficial problem, it may be a predictor of an uncommitted client. If all the client wants is a meeting, it may not be possible to explore these directions and they become part of the next phase, the 37 exploratory meeting. Regardless of where the information is gathered, it is all essential to the very early stages of developing the relationship and defining the problem. The exploratory meeting can easily become the first diagnostic step toward the establishment of a relationship. The exploratory meeting involves face-to-face contact with the client, perhaps some of his associates and the consultant or instructional developer. Others may be included in the meeting if they are decision- makers in the organization. It might be wise to invite other pe0ple to a later meeting, once a relationship has been established with the client and the problem has been identified. 
Schein then outlines the purposes of the exploratory meeting as: (1) determine the problem, (2) determine the involvement of the consultant/developer, (3) determine the interest of the project to the consultant, and (4) formulate the next action steps. Schein (1973) spent a great period of time establishing the relation- ship by listing methods of breaking the ice, setting the tone, identify- ing expectations and assumptions. Davies, too, devotes much effort to examining the kinds of assumptions that need to be clarified early in the relationship. In most cases, the exploratory meeting will be the environment for such a clarification procedure. Schein's basic premise is that it is the role of the consultant to help others help themselves and it is this principle which affects his decision-making as he inter- acts with clients. He explains that the process starts with the first contact, whether that is an initial telephone call or the exploratory meeting. I The exploratory meeting is usually carried on with an open-ended discussion with the client which is designed to sharpen and highlight 38 the aspects of the problem and test how open and frank the client is willing to be. Schein also mentions the formal and psychological con- tracts. Much of what Schein mentions Davies has already summarized in his interpretation of the formal and psychological contracts. Silber (1973) also considers contracting with the client as to what will be done, when and by whom. He also considers the clarification of expecta- tions as part of this contract as well as both parties' understanding their obligations to fulfill their part of the bargain. Schein makes it clear that crucial elements of the relationship are definition of role, willingness of client to be open and taking on responsibility, and stating openly expectations and assumptions. The final phase of the exploratory meeting, which is relevant for the instructional deve10per, establishes a method of work. This pro- cedure involves the setting of goals, strategies or methods necessary to achieve those goals, and a time schedule for the completion of goals. How often the client and developer should meet is also an issue in need of clarification. It is important that agreement is achieved on the goals of work, as well as the methods to be used and the establishment of the realistic time schedule. Havelock's Change Agent Model As mentioned early in this chapter, Havelock (1973) provided an ex- planation of the stages the change agent goes through. He presents the change agent in education as a process-oriented person. Since the in- structional developer is considered by many to be a change agent in educa- tion, this particular approach seems to be a relevant one to examine. Stage 1, building a relationship, will be examined along with stage 2, 39 diagnosing the problem, since these two stages would most likely be stages the change agent/instructional developer moves through as he initially works with the client. This change will be further examined in some detail as they relate to this study. Havelock, in his explanation of creating a relationship with a new client, defines the relationship as a complex and delicate bridge. He places importance on the change agent being acutely aware of norms, values, leadership and influence patterns and how they affect the develop- ing relationship. He believes that the relationship is beset by a host of uncertainties and unknown factors. 
He also emphasizes that the change agent is continuously making decisions from data gathered from a variety of resources. These decisions may be based on bits and pieces of data that may or may not be reliable and can affect the relationship and direc- tion of the project. An interesting concept proposed by Havelock sug- tests that the client usually enters a relationship with an open-minded attitude of which the change agent should take complete advantage. The client may believe a new face will bring new possibilities. This posi- tive approach has not been proposed by other authors and may offer an opportunity for deve10pers that has not been exploited to its limits. The instructional developer tends to work from outside the client's system and is faced with a variety of advantages and disadvantages as an outsider. One advantage is that the enters the client's sytem without any negative stereotypes. As an outsider, the change agent can also gain a perspective of the problem which helps "get an objective look". He may be able to see problems and identify needs and opportunities which the 40 client may not perceive. Being outside the client's power structure and having a different expertise than the client are also advantages. Being a stranger to the system can be a disadvantage because he may be perceived as a threat because he has the potential of disturbing the natural order of the system. The change agent may also lack know- ledge of the system in which the client functions which may cause him to make wrong decisions. The change agent also may not care enough about the needs of the client since the system is not his. Havelock emphasized the fact that the relationship between the client and change agent builds on the first encounter. How the client sees the change agent and his initial feeling will affect how success- ful a problem solving stage will be. Havelock indicates there are four major considerations for managing the initial encounter with the client, they are: (l) friendliness, (2) familiarity, (3) rewardingness, and (4) responsiveness. Friendliness is usually judged by a smile, a warm greeting, looking directly into the eye and calling a person by their first name whenever possible. Havelock describes these gestures as important elements of the first encounter since it is not clear to the client whether the change agent will be a threat to his environment. Familiarity is also considered by this approach to be an essential element in the initial interaction. Within this strategy the change agent should attempt to establish some common point of communication outside the project, such as sports, politics, entertainment or common acquaintances. Humor can also serve to "break the ice", as Silber (1973) describes it in his approach. 41 Havelock also describes "rewardingness" as a technique of strategy in gaining the client's confidence. It is referred to as a token award system which is operationalized by doing something for the client as soon as possible. This may be the dispensing of printed material, iden- tifying a resource person that might be useful to the client, or suggest- ing a technique which the client may not have within his experience. Responding to the client's explanation and talking at the appropriate time are also suggested by this approach. Listening carefully to the client and responding verbally and non-verbally to his talking will show "alertiveness". 
Asking for clarification of points or responding back in different words will also show the client that the change agent under- stands and is responding to his needs, concerns and ideas. These strategies (friendliness, familiarity, "rewardingness" and responsiveness) are considered preliminary niceties and they begin t0 build the necessary trust and openness which provide the foundation for attacking the issues of the problem. Silber (1973), in his explanation of "establishing the relationship", also describes such strategies as being pleasant, cordial, open, praising the client, using non-verbal com- munication reinfbrcers as well as listening skills which relate very closely to the strategies Havelock explains. Havelock goes on to Specify the components of the ideal relation- ship. Reciprocity is described as the first characteristic of such a relationship and emphasizes the importance of give-and-take as a method of increasing mutual appreciation of the problem and making diagnosis more accurate. 42 Openness is second and is essential for both the client and the change agent. Openness is analyzed into: (1) receiving new ideas, (2) seeking out new ideas, (3) desire for self-renewal, (4) willingness to share new ideas with others, (5) openness to listen to problems of others, and (6) openness to give authentic feedback to each other. Setting realistic expectations for any kind of project is another characteristic of the ideal relationship. Importance is placed on the client's understanding of expectations for the project, so that he will not expect too much or too little. Identifying expectations has been emphasized throughout this chapter by Davies, Silber, Schein and Havelock as a necessary element to initial stages of developing a working relation- ship. Havelock does mention, however, that the change agent should give the client a reason to believe or expect a change and that the innovation will, if successful, make things better. One strategy which may be used to encourage the client to accept success from the change is piloting the project. This strategy can be used to show the client that change can take place without committing major resources. Piloting is also mentioned by Rogers (1971) as a way of providing tangible evidence that the change agent can be a helpful person and conveying to the client that the relationship can be a re- warding one. Structure is another characteristic of the ideal relationship. Structure should define the rules, working procedures and expected out— comes. The change agent needs to be careful that this is not done in a rigid fashion, but that goals, procedures and a distribution of labor and reward be clarified. 43 Davies, Schein and Silber also emphasize along with Havelock the importance of establishing a contract if the commitment of the client is in question. Such a contract would: state the commitments, be re- ciprocal, be clear on a time line and clarify the conditions of termina- tion by mutual consent. Equal power is also described as a characteristic of the ideal re- lationship in terms of the importance of the effect on the relationship if one party has much more power than the other party. When the power of the two parties is equivalent, it cannot be the factor that brings about change. Where power is not equivalent, it may appear to bring about change by forcing the weaker partner to comply and this does not bring about lasting effectiveness. 
Since most people, according to Havelock, look upon any change in their environment as initially a dis- turbance and not a benefit, it is important that the change create, as much as possible, a non-threatening atmosphere in the relationship. Thus, creating this minimum threat situation through equal power is another characteristic of the ideal relationship. Silber describes confronting as an important strategy for the developer in Opening up those critical matters that delay the project. (Havelock also considers confronting the difference that exists as an issue or characteristic of the relationship that must be addressed.) The openness to be frank on crucial matters and have an honest confron- tation of differences may create a stronger relationship and get both parties through hard times. As a last characteristic, Havelock describes the importance of keep- ing all relevant parties informed as to the state of development of 44 activities. These relevant parties that the change agent will affect, especially policy makers or decision makers, must be involved to some degree. They must know "where you are," "why you are there" and must accept to some degree the change agent being there. In addition to the characteristics of the ideal relationship, Havelock also describes some six indicators or danger signals of a poten- tially bad relationship which may be doomed to failure. A long history of unresponsiveness to change is one of the danger signals. The change agent should study the past history of the prospective client in dealing with change. If the client or his system has responded in the past with indifference or rejection, it may be inefficient for the change agent to get involved. Being used as a pawn in a political situation or to help the client prove a point with someone is another danger signal and the situation should be avoided. Responding to a client who is likely to have very little effect on his total system is another danger signal, and may not be an efficient use of the change agent's time. Even though the client is innovative and receptive to change, if he is powerless in the system where the leader(s) is resistant to change, the change agent may be wasting his time if he wants to affect the system through the client. Diamond (1975) describes the cooperation of the leader(s) of the system as essential before any commitment should be made by an instructional developer. The change agent should also learn to recognize the signs of client pathology as major incapacities or danger signals to establishing a pro- ductive relationship. Excessive rigidity or obsessive concerns with par- ticular issues or seeing issues in only black and white terms are 45 indicators of problems. The inability of the system to gather resources or get key people to meetings may also be indicators of system incapa- city. If such signs do appear in the early stages, the change agent should be aware of them. The last danger signal which Havelock describes is when the client makes negative responses to a well-managed initial encoun- ter. If the change agent has done all the "right things" and receives negative responses, a dismal, non-productive relationship may be forecast. In addition to the characteristics of an ideal initial relationship and the six danger signals, Havelock offers five points on how to size up the relationship. The first deals with the importance of building the inside-outside team. 
This is concerned with building a team of people that will be involved in the project, a team that represents those people who are inside the client system as well as those who are outside the client system. The second point is concerned with the importance of using the techniques of friendliness, familiarity, rewards and responsiveness in the initial encounter. The third emphasis is on the importance of being able to identify in each relationship the features of the ideal relationship. The fourth point emphasizes the importance of being able to spot the danger signals, and the fifth is concerned with maintenance of the relationship.

Havelock also mentions that these features of his approach are not just emphasized in the initial stages of the relationship, but are important to build on as the relationship progresses. Much is dependent on the personality of the change agent and the skills that he has in dealing with people at the interpersonal level.

Medical Problem Solving Processes

Early in this chapter a series of steps was listed which outlined the conventional medical problem solving model. Two major steps, (1) collecting the facts, and (2) analyzing the facts, were detailed into ten steps which take the physician through the process of collecting a variety of data, generating alternatives and selecting a final diagnosis which leads to treatment.

It was emphasized by Wintrobe (1970) that the physician may generate a hypothesis at any point in the process. The approach is based on a conception of scientific inquiry which assumes that the scientist objectively goes about gathering data and then objectively and systematically analyzes the data and inductively generates conclusions. Kessel (1970) asserts that the expectations and assumptions of the scientist affect his judgement of the data and that new data continue to change his expectations, which in turn affect his cognitive information processing. Neisser (1967) suggests that a person's expectations (presuppositions, assumptions, ideas or hypotheses) have a significant impact on cognitive activity such as visual and auditory perception and visual and auditory memory. Newell and Simon (1972) indicate that such mental sets (assumptions, expectations, etc.) initiate, organize and direct a person's subsequent information processing activities.

The results of such research, integrated into the dynamics of the initial instructional developer-client interaction, place greater emphasis on what Davies (1973) describes as clarifying expectations and assumptions on both sides of a struggling relationship.

In general, the scientific inquiry approach has little to offer the initial stages of the developer-client relationship until the relationship develops to the problem identification stage. At that point, aspects of the other approaches examined in this chapter are similar, at least in logic, to the conventional medical approach. The research that has been done in recent years in medical problem solving seems to have the most to offer the dynamics of the instructional developer-client relationship at the point where the problem is foremost.

Early in the chapter the analogy of the patient-physician relationship to the consultant-client relationship was examined for its relevance to the instructional developer-client relationship.
Three points were made which suggested the physician-patient relationship and the consultant-client relationship were misleading analogies: (l) the way the "patient" is identified; (2) the responsibilities for the diagnoses; and (3) the de- gree of participation of the client in formulating and carrying out the "treatment". Some specifics can now be explored since they are directly related to approaches that can be taken in initial interaction between instructional developer and clients. The consultant does not function similar to the doctor since few "patients" (clients) say "something is wrong with me" as he might say to the physician. The consultant is likely to hear statements that would relate to problems of someone else, something else or the system in general. The patient approaches the system ready to admit that it is he, himself, who needs help. 48. The physican-patient relationship can also be questioned as a model for consultants and/or developers because it does not address the key issues of the client-consultant or developer-client relationship. These issues are: (1) what is the client system, (2) the development of I strategies for establishing an initial relationship, and (3) examining expectations and assumptions and other elements that Davies (1973), Silber (1973), Schein (1969) and Havelock (1973) describe. Another rea- son for rejecting the doctor-patient model is that the instructional developer or consultant involves the client in diagnosis and then helps the client to help himself. It is more of a peer interaction, as con- trasted to medicine where the patient, in most cases, is not actively involved in data gathering, decision-making or treatment implementation. However, it must be mentioned that medical schools are attempting to prepare physicians to involve patients more in the process of their own medical care. The patient, in most cases, follows the direction of the physician without a lot of knowledge about what they physician is doing or why he is doing it. A CounselingProcess Arbuckle (1970) describes several variables that he considers essential prior to and during the initial conference with a new client. He first describes the physical surroundings as an important variable which may affect the client's attitude. Privacy is mentioned as a neces- sity if the client is to talk freely and openly. A clean, comfortable and pleasant office may give the client some confidence in the counselor. 49 Preparation for the counseling session is considered by Arbuckle to be necessary. The degree to which the counselor prepares may depend on how he perceives his role and the constraints of time. The establish- ment of rapport is next mentioned by Arbuckle as a crucial variable in the initial client conference. Rapport can promote an easy, comfortable and free relationship where both the counselor and client can be honest and open. According to Arbuckle, such rapport may be influenced by: (l) the image of the counselor or the counselor role that the client may have prior to the first meeting; (2) why the client comes to the counselor (Is the client there-of his own free will or because he has been sent or asked to come?),(3) the immediate or first impression that the client re- ceives when he enters the counselor's office or office area; (4) the ability of the counselor to accept the client's behavior or view. 
A client may sometimes make challenging statements or behave in a challenging manner in order to test the counselor's acceptance of him; and (5) the degree to which the counselor is attuned to the feelings and needs of the client. Arbuckle indicates that the ability of the counselor to establish rapport by overcoming crucial situations depends on his life experiences and personal security.

Buchheimer and Balogh (1961) describe the beginning session between the client and counselor as going through three primary phases: (1) the statement of the problem, (2) exploration, and (3) closing and planning for the future. (They indicate that in phase one a goal of the counselor is to communicate complete and genuine acceptance.) In phase two, the exploratory phase, he needs to be adept at reflection of feeling, clarification of feeling, restatement of content, abstraction of care, and recognition of the essence of themes and concerns. Phase three, the summary or closing and planning phase, is described as the most active phase. The counselor summarizes to effect closure in order to give the client the sense of a complete experience. The intent in planning is to give the client an incentive to return for further counseling.

Buchheimer and Balogh are quick to point out that this pattern of behavior is merely a theoretical framework and that counseling interviews never proceed in such exact sequences. The pattern should only serve as a guide and the counselor should not attempt to force the interview into any pattern that the client does not go along with.

Borg and Gall (1971) describe the literature review as an aid to the researcher in developing an understanding of the previous knowledge in the area undertaken in the research. The review indicated that little information is available giving explicit details as to how theoretical approaches have been applied, which confirmed the need for the study. In addition, the literature review provided a theoretical framework from which an analysis system could be generated. The review also provided theoretical models of instructional developer behavior with which findings of this study could be compared.

Summary

In summary, six views of how professional-client relationships are established, managed, maintained and terminated have been reviewed.

Davies (1973) offers a theoretical model which presents the developer-client relationship going through several stages of complexity. He suggests that the relationship may pass through seven phases within three major stages. Davies also emphasizes the importance of clarifying the assumptions and expectations on which the relationship will be founded. He provides three types of assumptions that tend to influence the direction and success of the relationship.

Silber (1973) proposes a five-stage process in which the client and developer are involved throughout their relationship. He lists eleven strategies that the developer should consider while moving the client through the first two phases of the relationship. Havelock (1973) presents a six-stage process that the change agent-client relationship moves through. He analyzes the first two stages into a series of strategies that the change agent can follow while interacting with the client. Havelock describes the components of an ideal client relationship and presents six danger signals of a potentially bad relationship, as well as five points on sizing up the relationship.
Schein (1973), in his view on process consultation, lists seven stages that a process consultant should consider when interfacing with the client. The first two stages he breaks into several steps for the consultant. Schein's basic premise is that it is the role of the con- sultant to help others help themselves and it is this principle which affects all decision making. Schein's theoretical process resembles much of the work of Davies. 52 Medicine provides the diagnostic model which lists ten steps grouped into two major phases. The emphasis is to collect as much data as possi- ble and then analyze or review it until a final diagnosis is reached. The effect of a person's expectations on cognitive processing was reviewed, along with the research that indicates that physicians generate multiple hypotheses and during diagnosis are in a constant state of testing and revising these hypotheses based on data input. Some of the differences between the physician-patient and developer-client are also examined. Five theoretical approaches to counseling are reviewed. For the most part, these theories represent philosophical schools of thought as they have been applied to counselor preparation and practice. Several variables of the counseling process are also reviewed, along with one view of the stages of the counseling-client relationship. CHAPTER III DESIGN OF THE STUDY The purpose of this study was to explore and describe the verbal behavior of instructional developers in their initial conferences with new clients. The study concentrated on the gathering of verbal data from client-developer conferences and analysis of that data to deter- mine categories of behavior, the frequency with which developers ex- hibited such behaviors, and the consistency and inconsistency of behav- ior patterns. Chapter III presents all the components in the design of the study. The Subjects The six subjects who participated in this study were all practicing instructional developers at Michigan State University with major profes- sional training and experience in instructional development (ID). Each were practicing instructional development on a full-time basis at Michigan State University during the period of this study. See Appendix A for a description of each subject. The subjects received their graduate education at four different institutions and have a wide variety of na- tional and international experience. The subjects were all male, ranged in age from 37 years to 55 years with a mean age of 46 years. The subjects in this study were not drawn randomly but were selec- ted for the study based on certain criteria and their availability. The 53 .54 criteria consisted of: (l) a Ph.D. in ID or strongly related area with at least two years of experience as a practicing instructional developer, and (2) practicing full-time at the time the study was conducted. The six subjects who participated in the study represented approximately 50 percent of the instructional developers on the Michigan State University campus who met the qualifications. It was assumed by the researcher that the subjects were typical of instructional developers at Michigan State University. However, it should be noted that no claim of represen- tativeness of all instructional developers is made and no attempt at generalization beyond the sample of subjects will be attempted. Each instructional developer was contacted and asked to participate in the study. 
The objectives of the study were outlined and the role of the instructional developer was specified. Each developer was asked to audio record the initial conference with the new clients. A letter fol- lowed the personal contact outlining the objectives of the study and what would be expected of each participant. See Appendix B for an ex- ample of the letter. Over a period of several months, reminder memos were sent to all the participants and the secretaries of the instruc- tional developers were also contacted and asked to remind the developer to record new client conferences. Each developer was offered a cassette recorder and blank tape if he did not have easy access to such equip- ment. The developer was asked to send completed tapes to the researcher immediately after the taping. The Clients The generation of new clients for instructional developers went through two phases. It was the intention of the researcher to use. as 55 much as possible, new clients of instructional developers as they were contacted through the normal process of developer activities. It be- came apparent, after several months of following this method, that new clients were not coming to the developers fast enough for this study to be completed within the time constraints imposed on the researcher. In order to overcome this problem, faculty members at Michigan State Univer- sity who were known by the researcher to be involved in instructional development activities, were contacted to determine their interest in getting development assistance. Eighteen such people were identified from a variety of colleges and departments and all agreed to contact an instructional developer for an appointment. It was explained to each faculty member that they were participating in a research study and they were to approach the developer as if the initiative had come from them. Each faculty member was oriented to his responsibility and was also given the name of the instructional developer with which the appointment was to be made. It was stressed several times to each fac- ulty member that the developer was to be approached in a normal manner as if the faculty member had taken the initiative from the beginning. Each new faculty member (client) outlined the instructional development problem to the researcher, assured the researcher that the problem was genuine, and that they looked forward to getting assistance. It should be noted here that the six subjects did pgt_know that such new clients. were generated by the researcher. The influx of new clients was spread out over a three-month period so as to minimize suspicion by the developer that this study in which he had agreed to participate was gen- erating new clients. Of the eighteen faculty who said they would 56 cooperate, fifteen actually made appointments and eleven were recorded by the developers. Two clients from the College of Education were recorded who were not generated by the researcher, to bring the total clients actually included in the study to thirteen. After the study-generated clients made appointments, they were to return a form indicating the client, time, place, developer and a brief explanation of the problem to be discussed. See Appendix C for a sample of the form. The study-generated faculty were drawn from a variety of depart- ments and colleges across the Michigan State University campus. 
Areas that were represented were: (1) the College of Education, Department of Curriculum and Secondary Education, and the Continuing Education Service; (2) College of Social Science, Department of Sociology; (3) College of Human Medicine, Office of Health Services Education and Research, and Department of Surgery; (4) College of Natural Science; (5) College of Urban Development, Department of Urban and Metropolitan Studies; (6) College of Natural Science, Department of Physics; and (7) College of Agriculture, Department of Agricultural Economics.

Data Gathering Process

Each subject was asked to audio record all new client conferences. At first contact, the number of client conferences being requested from the subjects was left open ended so as to assure that the subjects would record each new client and not just select those he considered to be the best. Also, with many variables that could affect the technical quality of the tapes, it was felt this strategy would result in a better selection of tapes for the researcher. The subjects were notified in writing during the data collection stage of the need for as many tapes as could be provided. It was the desire of the researcher to get three complete and technically correct tapes from each subject so as to provide as wide a data base as possible within the constraints of the subjects' ability to cooperate and the time frame of the researcher. A second strategy of sending the clients in with recorders was developed to assure getting the data on any particular session that the subjects did not volunteer to tape. This strategy was not implemented until toward the end of the data gathering period, when it became crucial to record the last several conferences in order to have the minimum number of interactions necessary to complete the study. Those faculty that were given recorders were told to request the recording of the conference only if the instructional development consultant did not ask permission to record. The faculty member was to tell the developer that the tape was for his own personal use and that, for him, it was better than taking notes. Only two faculty actually recorded their own sessions.

Transcriptions

Upon receiving the tapes, transcriptions were typed. A person then read the transcription and listened to the tape simultaneously to identify any errors in the transcription. The tape was next typed in a final format for use by the data coders. This process was repeated for each tape. See Appendix D for an example of a final transcription.

Data Analysis Method

Based on the research techniques suggested by Gottschalk and Gleser (1969) and Bellack (1966), a content analysis process was designed to analyze the data from this study. Gottschalk and Gleser applied content analysis to the verbal behavior of psychological states and Bellack used content analysis to examine the verbal language of the classroom. Berelson (1952) defines content analysis as:

    A research technique for the objective, systematic, and quantitative description of the manifest content of communication.

He goes on to explain that content analysis can be applied to private communication like conversation or psychoanalytic interviews. Schutz (1950) suggests that content analysis techniques can also include the description of human behavior, particularly linguistics.
Holsti (1969), in discussing the application of content analysis to social science research, affirms that content analysis is a multipurpose research method developed specifically for investigating any problem in which the content of communication serves as the basis of inference. Borg and Gall (1971) describe the typical process of content analysis as: (1) the researcher tape records the verbal behavior under study; (2) makes a typed transcript from the audiotape; and then (3) analyzes the content of the transcription in order to measure the variables formulated by the researcher. Borg and Gall (1971) summarize the key considerations in planning for content analysis as:

1. Establish specific objectives
2. Selection of unbiased content for analysis
3. Establishment of a classification system for analyzing the content
4. Selection and training of raters or coders
5. Determination of interrater reliability

The method used in this study followed these five procedures.

Method for the Establishment of Analysis Categories

For this study, an utterance was selected as the basic unit of analysis. An utterance was defined as one or more words or verbal expressions which are intended to communicate to another individual. For the most part, the utterance was a sentence, but it included all verbal responses, such as phrases. After all the tapes were transcribed, a classification system was developed in order to identify the categories of utterances. As suggested by Budd (1967), the categories were developed within the framework of three primary requirements: (1) that they fit the needs of the study, (2) be exhaustive, and (3) be mutually exclusive.

Each transcript was first read by the researcher while listening to the tape. The transcripts were then re-read without the tape. During each of these readings, the transcripts were reviewed in the following areas:

1. Classes of behavior that seemed to be exhibited by each developer.
2. Frequency of this behavior for each developer.
3. Frequency of this behavior for all developers.

The preliminary reading disclosed to the researcher three tentative major groups of behavior which appeared to be operating concurrently. These groups were: (1) the phases of verbal behavior which extended throughout the conference; (2) the content of the verbal behavior; and (3) the verbal behavior which seemed to deal with process. Each utterance could fall into each one of these groups of behavior. During a second and third reading of the transcripts, each group of behaviors was analyzed to determine its component parts (see Appendix G for the first draft of the category system prior to testing with data coders).

In the final version of the category system, the first major group of utterances was related to phases and seemed to break the conference into four temporal parts. The three groups of verbal behavior are concurrent with regard to any particular utterance, which means any utterance could be included in more than one group. The four temporal parts are: (1) the introductory phase, (2) the problem identification phase, (3) the solution(s) discussion phase, and (4) the termination or closing phase.

The introductory phase was defined as that period of time early in the discussion when the developer was getting personal background information on the client, making initial small talk, breaking the ice, and in general putting the client at ease.
The problem identification phase was defined as when the developer tried to identify and clarify the problem with which the client was concerned. The solution(s) discussion phase seemed to occur after a particular solution or solutions had been suggested, or when the advantages and disadvantages of a particular solution were discussed. The last phase, termination or closing, included those utterances which began to draw the conference to a close.

The second major group of utterances, which was concerned with content, included instructional design and/or substantive verbal behavior. Instructional design verbal behavior seemed to consist primarily of problem-related objectives, methods, media and evaluation, so categories were established in these areas. The objectives category included all those utterances which were primarily concerned with objectives of the instruction under discussion. The methods category included any utterances related to instructional strategy or method. The media category was related to any utterance concerned with instructional materials. The last design category in this group was evaluation, and all utterances related to student or course evaluation were included here. The substantive content utterances included all those that were related to the body of knowledge of the client.

The last major group of utterances was somewhat more specific with regard to process functions and included all verbal communication behaviors. Twelve were initially identified for the data coders as: reinforcing, soliciting information, soliciting agreement, prompting, defining, explaining, opining, assumption and expectation clarification, informalizing, structuring, summarizing and declarative.

The three groups of verbal behaviors are concurrent behaviors with regard to any particular utterance. For example, a statement must be included in one of the phases, may be included as a design component or be a substantive statement concerned with clarifying a particular concept within the body of knowledge of the client. At the same time, every utterance should be included in one of the last twelve communication categories. In review, every utterance falls within one of the phases. Additionally, an utterance might fall in one of the design or substantive categories. If the utterance does not deal with a content area, no code is given in this area. Any particular statement would fall into at least two categories and perhaps three.

Validity

Holsti (1969) describes content or face validity as being adequate in a descriptive study where content analysis is being used. He goes on to say that content or face validity is usually established through the informal judgment of the investigator. Budd (1967) also indicates that face validity is assumed by the content analyst if the categories seem to measure the desired information. After the categories and coding systems had been established, the researcher asked a member of the Michigan State University Learning and Evaluation Services to respond to the face validity of the system. This individual concurred with the face validity of the categories.

The Data Coding Process

Coding for this study meant the assignment of each verbal utterance by each instructional developer to a minimum of two and a maximum of three categories. A coding form was developed, four coders were identified, and a training system was designed to enable coders to reliably code the data.

The training system consisted of a multi-stage process.
Prior to the implementation of training, coding directions were developed along with the coding form and an example transcript taken from a partial tape that was received but not used as part of the study. All coders met with the researcher and reviewed the coding directions. The group read through the directions and explanations were given to clarify any lack of understanding. It was then explained to the coders that they were to take the example transcript and the coding form and, using the coding directions, independently code the example transcript. The coding form was then explained, several examples were given, and the coders practiced using the form. The coders then coded the example transcript. After the example was completed, the group discussed it with the researcher, along with problems or concerns in coding the data. Each utterance that was coded was discussed as to why it was coded in a particular way. Their suggestions for changes in the coding directions and coding form, along with the researcher's, were penciled in on all working copies, and they were asked to take the materials home and code the example transcript again using the suggested changes. The second meeting followed the same format as the first. Several additional changes were made in the coding process (see Appendix F for copies of the final version of the directions and coding form).

After the training was completed, several changes were made in the categories of analysis (see Appendix G for the first draft of the category system). In the behavior phases, those verbal behaviors concerned with solutions to the client's problem were collapsed into one phase or category, due to the fact that coders were unable to perform fine discriminations between solution generation and solution implementation. This merger resulted in four phases or categories in the final analysis system rather than the five in the initial form.

The content group of categories, made up of instructional design categories and a substantive category, had two changes made. The substantive category was placed at the beginning because coders agreed they felt more comfortable responding to this category first (see coding form in Appendix E). Also, another category was added to the substantive and instructional design area. This last category, number ten, was to be selected if any particular utterance could not be categorized as either substantive or an instructional design component. This would give the coders one response in all groups of behavior, since each utterance had to be categorized in both of the other groups of categories.

In the last group of categories, the process functions, which include verbal communication behaviors, several changes were made in the final analysis system. First, the soliciting agreement category was dropped and included in soliciting information. Defining was also dropped from the first system and included in the explaining category in the final version. Dispensing was added to the final version, along with the nonsense category. The nonsense category covered all groups, not just the process group.

After training, the final coding categories and descriptions for analysis were:

Phases of Behavior. Every utterance should fall into one of these phases:

1. Introductory Phase: This is where the developer is getting background information on the client, making initial small talk (i.e., weather, baseball, etc.).

2. Problem Identification Phase: This is when the developer is trying to pinpoint the problem the client is faced with.
The client may not even know the real problem. 3. Solution Discussion Phase: This is when the developer discusses with the client possible solutions to the problem. Included also would be any utterances related to the sugges- ted solutions. 4. Termination or Closing Phase: This is when the developer draws the meeting to a close in a formal way by making statements which communicate to the client that the meeting is nearing the end or ending. Content Categories. The second general category in the content analysis system deals with Substantive and Instructional Design components. Not all utterances can be codéd one of these. If it has not been coded substantive or instructional design, code it "None". The categories are: 5. Substantive: All those utterances which are concerned with the content or body of knowledge relating to the client's problem (i.e., botany, sociology. etc.). Not all utterances can be coded as substantive. 6. Objectives: This would be any utterance in reference to the objectives of instruction. 7. Methods: This would be any statement which would be con- cerned with method (a strategy) or how to teach. It could be very general or very specific. 8. Media: This would be concerned with any discussion of instructional materials (film, TV, slides, overhead trans- parencies, etc.). 9. Evaluation: Any statement about how to evaluate students, the course, etc., would fall into this category (e.g., tests). 10. None: Any statement that cannot be coded Substantive or Instructional Design. Process Categories. The third general category in the content analysis system is concerned with the process of verbal communi- cation behavior functions of the developers. This is classifying in a specific way what the developer is saying with each statement to the client. The specific categories are: ll. Reinforcing: Those utterances which verbally reward or praise the client for something he has said or done (e.g., right, very good, good, good point, etc.). 12. 13. 14. 15. 16. 17. 18. 19. 20. 21. 22. 66 Soliciting Information: Utterances which seek information of any kind from the client. Prom tin : A statement which cues the client toward some intended behavior. Careful attention should be paid to this category (i.e., what may appear to be a question is really attempting to prompt the client to discover a point, i.e., "Why do you think students behave that way?"). Explaining: Statements which explain to the client issues, methods or applications of concepts or principles. These are usually theory or data based utterances. Opining: The developer giving a personal point of view or opinion. Careful attention should be given as to discrimi- nate from explaining (example: I think that in hi her education today we don't structure learning enoughg. These are usually not theory or data based utterances. Assumption and Expectation Clarification: Statements which clarify roles, responsibilities andiexpected results of the relationship between the developer and the client. Informalizipg; Statements which promote an Open, informal talking environment (i.e., "break the ice" statements; "Isn't old Joe Blow in your department?" "Want a cup of coffee?"). Structuring: Statements which establish or re-establish the process of the working relationship (Let's first take a look at your problem and then come up with possible solutions). This usually includes utterances which refer to what will happen. 
19. Summarizing: A review of any particular steps in the structuring; these are statements which review progress in the discussion. Be careful that the developer is not restructuring the process: summarizing reviews one part of what has already happened and looks backward.

20. Declarative: Statements in which the developer declares intent to do something for the client (e.g., "I'll get a copy of that book and send it to you").

21. Dispensing: The giving of information about human and non-human resources which may help the client with his problem. This category, even though it is numbered with the communication behaviors, is really separate.

22. Nonsense: Any utterance which can't be read or understood by you, the coder. If it just doesn't make sense, code it here.

Numbering system for content analysis categories:

Phases:
1. Introductory
2. Problem Identification
3. Solution Discussion
4. Termination/Closing

Content (Substantive and Instructional Design):
5. Substantive
6. Objectives
7. Methods
8. Media
9. Evaluation
10. None

Process (Verbal Communication Behavior Functions):
11. Reinforcing
12. Soliciting Information
13. Prompting
14. Explaining
15. Opining
16. Assumption and Expectation Clarification
17. Informalizing
18. Structuring
19. Summarizing
20. Declarative
21. Dispensing
22. Nonsense

After the coding form and directions were redesigned, all transcripts were placed in random order in a file drawer along with ample coding forms. The coders were required to select the transcripts randomly and work independently at their own pace through each transcript. A record was kept of when transcripts were checked out and when they were returned. Each transcript was marked with the number of the instructional developer (1-6) and the letter of the client (A, B, C). All coding forms were marked for the developer, client and coder. Five developers had two clients and one developer had three, for a total of thirteen clients or transcripts.

The four coders were employed as research aides by the unit in which the researcher worked and were assigned to the coding task as part of their job responsibility. Three were college seniors majoring in journalism and one was a first-year graduate student in psychology. All were extremely cooperative and enthusiastic about the task.

During the actual coding of the transcripts, the researcher monitored for potential problem areas with the coding and for the attitude of the coders. The only problem detected was related to the nature of the data itself. Even though specific coding categories had been developed and were understood by the coders, a certain amount of judgment was still required. The fact that they were making judgments and not just simple discriminations created some anxiety. However, the general attitude of the coders was positive throughout the coding process.

Determining Intercoder Reliability

Borg and Gall (1971) and Holsti (1969) agree that interrater or intercoder agreement on the use of the analysis categories with the data must be sufficiently high if the study is to satisfy the requirement of objectivity. Holsti, Scott (1955), Budd (1967) and Block (1961) suggest procedures which can be used to determine intercoder reliability within content analysis methods. The simplest procedure is suggested by Holsti and also recommended by Budd. This procedure is:

R = 2M / (N1 + N2)

In this formula, M is the number of coding decisions on which two judges are in agreement, and N1 and N2 refer to the number of coding decisions made by judges 1 and 2, respectively.
This formula has been criticized because it does not take into account the extent of intercoder agreement which may result from chance alone. Scott (1955) developed a method which adjusts for the chance factor and which builds on Holsti's reliability (R). Scott's method is:

R' = (% observed agreement - % expected agreement) / (100 - % expected agreement)

or, equivalently,

R' = (R - E) / (100 - E)

where "% observed agreement" is the R from Holsti's formula above. Expected agreement for each combination of two coders was determined by finding the proportion of response frequencies falling into each category and summing the squares of the proportions.

Block (1961) suggests a procedure for calculating intercoder reliability when more than two coders are used in data coding, which has the effect of maximizing intercoder reliability. Holsti (1964) concurs with this method and suggests its use. The method is used to determine a composite reliability coefficient. It may be computed by the following formula, where the average inter-judge agreement is R', the output of Scott's formula, and N denotes the number of judges or coders:

CR = N x (average inter-judge agreement) / [1 + (N - 1) x (average inter-judge agreement)]

or, equivalently,

CR = (N x R') / [1 + (N - 1) x R']

These methods are accepted by content analysis researchers for determining intercoder reliability and were used in this study as a series of steps. The output of these steps will be reported in Chapter IV.

Data Analysis

The objectives of this study served to provide the direction for the analysis of the data. As indicated in Chapter I, the objectives for the study were as follows for initial conferences:

1. Establish and define categories of verbal behavior of selected instructional developers in initial interviews with new clients.

2. Describe the verbal behavior for each instructional developer in each new client conference.

3. Describe the commonalities and differences in verbal behavior from initial conferences, for each developer, across all clients.

4. Describe the commonalities and differences in verbal behavior across all developers and all clients.

5. Compare the verbal behavior of instructional developers to selected theoretic models in the literature in terms of commonalities and differences.

Consistent with the objectives of the study, the following procedures were used to analyze the data:

1. Frequency of utterances by each developer and each client.

2. Percentage of verbal behavior by each developer and each client.

3. Percentage of verbal behavior by all developers and all clients.

4. Frequencies in each category for each developer and each client.

5. Percentage of responses in each category for each developer and each client.

6. Percentage of responses in each category for each developer and all clients.

7. Percentage of responses in each category for all developers with all clients.

No inferential statistics were used to analyze the data. The study was not based on any particular set of hypotheses. The descriptive procedure, in terms of frequencies and percentages, was considered adequate for such an exploratory approach.

Summary

The research is an exploratory and descriptive study of the verbal behavior of instructional developers in their initial conference with new clients. The chapter explained the process by which the six subjects were selected and identified their education and experience relevant to this study. A description was also given of how the subjects were oriented for participation in the research.
An explanation was also provided on how the 13 clients were gen- erated for the subjects, the area of expertise they represented and how study-generated clients were oriented to their responsibility. The data gathering process was also described with the procedures developed to assure technically sound audiotapes and an adequate number of tapes. The process for tape transcription was briefly described, as well as the content analysis process selected for data analysis. The pro- cedure for establishing categories of analysis was explained, as well as a listing and description of all the analysis categories. The con- currency of the categories was explained with an example of how a particular utterance might be categorized. 72 The coding process, forms and coder training methods were explained as well as a description of the coders. The method for determining intercoder reliability was described, including the statistical formula used for calculating the results. Objectives of the study were repeated from Chapter I and a ten- step data analysis process was outlined. The ten-step procedure will result in the output described in Chapter IV. CHAPTER IV PRESENTATION AND INTERPRETATION OF THE DATA The purpose of this study was to describe the verbal behavior of instructional developers in their initial conferences with new clients. The initial client conferences of six instructional developers was audio recorded and analyzed. Two client conferences were recorded from five developers and three from one developer for a total of thirteen initial conferences.. The recordings were transcribed and four data coders were trained to code the data according to the categories de- veloped by the researcher. The purpose of this chapter is to present the analysis of the data with interpretation. The results of intercoder agreement on the con- sistent use of the coding system will be presented first. The data will be reported by first presenting the findings from each deve10per, followed by the findings from all developers. Intercoder Reliability Berlson (1952), Budd (1967), Holsti (1969) and Borg and Gall (1971) all suggest that the consistency with which coders can use content analy- sis categories indicates the degree of objectivity of a study. Four people were trained as coders to code the thirteen transcribed audio tapes from six instructional developers. Each coder applied the coding system to each of the thirteen transcriptions. A composite 73 74 reliability method was used (see Chapter III, pages 65-66), which has been suggested by Block (1961) and confirmed by Budd (1967) and again by Holsti (1969). This method was used to calculate the extent of agreement to which four coders could apply their coding system. Table 1 shows that the highest percentage agreement (0.87) occurred with developer two and client 8. The highest agreement (0.855) for all developers occurred with developer two. The agreement across all developers was 0.832. Table 1 Four Coder Composite Reliability * Instructional Developer Client 1 2 3 4 5 6 All Dev. A 0.82 0.84 0.79 0.74 0.81 0.83 B 0.85 0.87 0.80 0.86 0.82 0.84 c - - - - - 0.84 A11 Clients .835 .855 .795 0.80 .815 .836 .832 *Each developer had a different client A, B or C. Borke (1969) reports that an agreement in the 0.60-0.70 range is the average reported by other investigators where complex judgments are being made. 
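For readers who want to see how coefficients such as those in Table 1 could be produced, the sketch below walks through the three steps described in Chapter III: Holsti's percent agreement, Scott's chance-corrected agreement, and Block's composite coefficient for more than two coders. It is a minimal sketch only, stated in proportions rather than percentages; the function names and the toy data are illustrative assumptions, not the study's actual records or computational routine.

```python
from itertools import combinations

def holsti_agreement(codes_a, codes_b):
    """Holsti: R = 2M / (N1 + N2), where M counts matching coding decisions."""
    m = sum(1 for a, b in zip(codes_a, codes_b) if a == b)
    return 2 * m / (len(codes_a) + len(codes_b))

def scott_pi(codes_a, codes_b):
    """Scott: R' = (R - E) / (1 - E), where E is the expected agreement --
    the sum of the squared proportions of responses in each category."""
    observed = holsti_agreement(codes_a, codes_b)
    pooled = list(codes_a) + list(codes_b)
    expected = sum((pooled.count(c) / len(pooled)) ** 2 for c in set(pooled))
    return (observed - expected) / (1 - expected)

def composite_reliability(all_codes):
    """Block: CR = N*R' / (1 + (N - 1)*R'), where R' is the average pairwise
    inter-judge agreement and N is the number of coders."""
    pairs = list(combinations(all_codes, 2))
    avg = sum(scott_pi(a, b) for a, b in pairs) / len(pairs)
    n = len(all_codes)
    return n * avg / (1 + (n - 1) * avg)

# Toy data: four coders assigning process categories (11-22) to six utterances.
coders = [
    [14, 15, 12, 14, 11, 14],
    [14, 15, 12, 14, 11, 15],
    [14, 15, 12, 13, 11, 14],
    [14, 15, 12, 14, 11, 14],
]
print(round(composite_reliability(coders), 3))
```

With the actual transcripts, each list would hold one coder's category assignments for every utterance in a given developer-client conference, and the composite coefficient would be computed once per conference, as in Table 1.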
The reliability for this study seems to be more than ade- quate to demonstrate the ability of the four data coders to apply the analysis categories to the transcripts. Developer-Client Conferences Of all the utterances spoken by the developers and the clients, the developers dominated 3,353 utterances out of 5,319 for 63%. In all 75 cases, each developer talked more than 50% of the time. The dominance of the instructional developers ranged from 52% for developer two, client A, to 75% for deve10per four, client A (see Table 2). Table 2 Conference Utterances of Clients and Developers Frequency (F) of Utterance = Developer -- 3,353 Clients -- 1,966 Total -- 5,319 Dev. 1 Dev. 2 Dev. 3 Dev. 4 Clients 1A 18 2A 28 3A 38 4A 48 F % F % F %i F % F % F % F % F % Dev. 303 63 176 57 232 52 129 64 314 61 201 66 324 75 248 54 C1. 150 37 133 43 210 48 74 36 202 39 102 34 106 25 213 46 Total F 483 309 442 203 516 303 430 461 Dev. 5 Dev. 6 Clients 5A 58 6A 68 6C F % F % F % F % F % Dev. 185 57 191 53 220 58 413 72 417 74 Cl. 142 43 167 47 161 42 161 28 145 26 Total F 327 358 381 574 562 The Findings for Each Developer Instructional Developer One A The instructional problem that client A brought to instructional developer one was generally concerned with the design for a curriculum for 4-H youth. The client described a need for a three-part program that could be used independently but be tied together conceptually. One part of the three-part program had already been partially designed prior 76 to the meeting. The general problem that client 8 outlined to developer one dealt with the design of a four and one-half hour workshop for adult women on the subject of menopause. The conference was to be delivered in one and one-half hour blocks of time over a two-day period. In examining the interaction of client A and developer one, it can be observed that 58.25% of the developer's verbal effort was spent in discussing solutions (3)* to the client's problem (see Table 3.) A simi- lar observation can be made for client B, only the percentage of time spent in solution discussion increased to 68.32%. A similar effort in the problem identification phase (2) was given to both clients A and B, 24.06% and 27.98%, respectively. Comparable attention, in terms of in- troductory remarks (1) was given to both clients in the beginning of the conferences, since they show an effort of 1.82% and 1.85%. It should be noted here that the sequence of all the coded data show that the instruc- tional deve10per one was involved in the introductory phase first, fol- lowed by the problem identification phase, then the solution discussion phase and, lastly, the termination/closing phase. Fifty percent of all the utterances spoken by the developer in the conference with client A was concerned with content, either in the sub- stantive (5) knowledge base of the client, or an instructional design component (6, 7, 8 and 9). This increased slightly to 66.90% with client 8. In the instructional design categories, most of the *Refers to the category number. Refer to pages 65-68 in Chapter III for complete names and definitions for all categories. Developer One, Clients A and B Percent of Developer Utterances 77 Table 3 by Categories Percent = 100% for Phases, Content and Process Frequency = For A, N=303 Frequency = For B, N=l76 Content Phases Substantive and Instructional Design Cat. Prob Sol Term/ . .. Name Intro Id Dis Clos Sub Obj Meth Media Eval None. Cat. 3 l 2 3 4 5 6 7 8 9 10 Client . 
A-% 1.82 24.26 582515.6812.8714.11 9.49 4.04 9.49 50.0 A-Freq 5.5 73.5 176.5 47.5 39.0 42.75 28.75 12.25 28.75 151.5 B-% 1.85 27.98 68.32 1.14 17.47 9.23 19.74 10.09 10.37 33.10 B-Freq 3.25 49.25 120.25 2.0 30.75 16.25 34.75 17.75 18.25 58.25 . Process Behavior Functions . gig; Reinf ffigg Prom Expln Opin 233p Infor Struc Sum Declar Cat. # ll 12 73 14 15 16 17 18 19 20 Client ‘ A-% 11.22 18.48 4.95 24.67 21.78 2.31 2.97 .74 2.06 .99 A-Freq 34.0 56.0 15.0 74.75 66.0 7.0 ’9.0 2.25 6.25 3.0 B-% 10.80 15.06 2.56 37.07 27.84 .71 .14 1.85 1.28 .43 B-Freq 19 26.5 4.5 65.25 49.0 1.25 .25 3.25 2.25 .75 Process Behavior Functions (continued) Cat. , Name D‘SP * Cat. iii 21 Client A-% 9.16 A-Freq 27.75 B-% ’2.13 B-Freq 3.75 Refer to pages 65-68 in Chapter III for names and definitions oi’cate- Frequency is an average of four coders. gories. 78 components received equal attention except the instructional methods cate- gory was given a 19.74% emphasis. In the process functions, the verbal behavior emphasis was placed primarily in three areas for both client A and B. Explaining (14) was concentrated on 24.67% of the time with client A and 37.07% of the time with client 8. Opining (15) also received a high degree of effort with clients, 27.28% with A and 27.04% with B. The last major emphasis in the process function categories comes in the soliciting information cate- gory where client A received 18.48% and client B 15.06%. Developer one also reinforced (11) or rewarded both clients' past or present behavior, related to their problems, by expending 11.22% (A) and 10.80% (B) of all utterances in this area. Category 10 represents all those utterances that could not be coded either substantive or instructional design. The total of categories 5, 6, 7, 8, 9 and 10 will equal 100%. Instructional Developer Two Client A, for deve10per two, proposed a general problem concerned with the improvement of a course that had been taught for several years. The client felt the course was not having any impact on the long-term behavior of the upper-level doctoral students enrolled and wanted help from the instructional deve10per. Client B, for developer two, also wanted to improve a course that she had been teaching for the past year. This course had several hundred undergraduate students enrolled and she felt the answer was more effic- ient use of media, primarily overhead transparencies and motion picture ‘ film. 79 An examination of conference data showed that the phases of the con- ference (or the first four data categories) are similar in results to developer one. The major emphasis was on the solution discussion phase, with both client A and B with 70.58% and 71.51%, respectively (see Table 4). As with developer one, problem identification received the next amount of attention with a 24.78% for client A and 20.93% for client 8. Some introductory remarks were made by the developer with both clients. Client A received 4.31% of the verbal utterances in all the phases, while client 8 received 1.94%. In the content or substantive and instructional design categories of verbal behavior, the greatest attention was given to instructional methods. Client A received 34.16% of the content categories and client 8 20.93%. Since client B's initial concern was with instructional media, it is not surprising that 20.54% of the content categories dealt with this issue. With client A, 58.51% of the verbal utterances did not deal with the content categories at all, and with client 8 it was 44.77% (see category 10, Table 4). 
A non-content category statement could be any utterance that was not content category related, but might be intended to primarily reinforce the client's behavior, structure the process of the meeting, opining or a particular issue, or any one of several other process functions. In the process functions, developer two's primary effort was ex- plaining problem-related concepts and principles to the client. This verbal effort included 53.66% of all process functions for client A and 48.45% for client 8. Developer two also gave his opinion 12.39% of the time with client A and 15.70% with client 8. Throughout this Developer Two, Clients A and 8 Percent of Developer Utterances 80 Table 4 by Categories Percent = 100% for Phases, Content and Process Frequency = For A, N=232 = For B, N=129 Frequency Content Phases Substantive and Instructional Design Cat. Prob Sol Term/ . .. . Name Intro Id Dis Clos Sub Obj Meth Media Eval None _Qgt. fig 1 2 3 4 5 6 7 8 9 10 Client A-% 4.31 24.78 70.58 .32 2.59 .22 34.16 4.53 O 58.51 A-Freq 10 57.50 163.75 .75 6 .50 79.25 10.50 0 135.75 B-% 1.94 20.93 71.51 5.43 3.10 5.23 20.93 20.54 5.43 44.77 B-Freq 2.5 27 92.25 7 4 6.75 26.50 26 7 57.75 Process Behavior Functions 32;; Reinf EH22 Prom xpln Opin gigp InforiStruc Sum Declar Cat. # 11 12 *i13 114 15 16 17 18 19 20 Client A-% 5.50 11.21 6.25 53.66 12.39 .86 3.34 1.19 .11 4.09 A-Freq 12.75 26 l4.50124.5028.75 2 7.75 2.75 .25 9.50 B-% 7.3611.82 3.29 484515.70 .78 4.65 .58 1.94 .39 B-Freq 9.501525 4.25 62.502025 1 6 .75 2.50 .5 PFocess Behavior Functions (continued) Cat. , Name D‘SP Cat. # 21 Client A-% .86 A-Freq 2 B-% ’ 3.88 B-Freq 5 Refer to pages 65-68 in Chapter III for names and definitions aficate- Frequency is an average of four coders. gories. 81 conference, 11.21% of the time for client A, and 11.82% for client 8, was devoted to soliciting information from the client. Instructional Deve10per Three With developer three, client A, three faculty showed up to discuss the general problem with the developer. So, in fact, the client became three people. The problem that they brought to the instructional developer dealt with the issue of implementing quality educational pro- grams in community settings. Specifically, the problem was how to imple- ment and maintain quality in medical education communication settings where the instruction is done by local professionals. Client B brought in an evaluation problem. Specifically, the prob— lem was how to go about the evaluation of an exportable-type educational package in the first field use of a sample of the intended audience._ As with developers one and two, the emphasis with both of developer three's clients in the phases of the conference were placed with the problem identification phase and solution discussion. With client A, 80.97% of the conference was spent in discussing possible solutions to the problem (see Table 5). With client 8, the percentage of the utter- ances falling into the solution discussion phase was 57.96%. Problem identification received 17.36% of the deve10per's verbal effort for client A, and 29.10% for client 8. The developer made no introductory comments that were recorded with client A, but with client 8 8.96% of the time was spent with preliminary remarks. With both clients, a small percent (1.67% and 2.11%, respectively) was spent closing out the meeting. 
82 Table 5 Developer Three, Clients A and B Percent of Developer Utterances by Categories Percent = 100% for Phases, Content and Process Frequency = For A, N=3l4 Frequency = For B, N=201 Content Phases Substantive and Instructional Design Cat. Prob Sol Term/ . . . Ngme Intro Id Dis Clos Sub Obj Meth MedTa Eval None Cat. # l 2 3 4 5 6 7 8 9 10 Client A-% O 17.36 80.971.67 9.47 .56 40.29 1.35 1.04 47.05 A-Freq O 54.5 254.25 5.25 29.75 1.75126.5 4.25 3.25 147.75 B-% 8.96 29.10 57.96 2.11 9.58 9.58 6.97 1.24 21.52 51.12 B-Freq 18 .58.5116.5 4.2519.2519.2514 ' 2.5 43.25 102.75 Process Behavior Functions £23; Reinf EH20 Prom Expln Opin Sigp Infor Struc Sum Declar Cat. # ll 12 13 1T41 15 16 17 18 19 20 Client - A-% 2.71 6.05 2.15 59.63 19.82 1.59 .72 1.11 .64 1.11 A-Freq 8.5 19 6.75187.2516.25 5 2.25 3.5 2 3.5 B-% 4.48 8.58 8.08 46.36 23.51 .62 3.61 1.24 .12 .37 B-Freq 9 17.25 16.25 93.25 47.25 1.25 7.25 2.5 .25 .75 Process Behavior Functions (continued) Cat. , Name D‘SP Cat. # 21 Client A-% 4.46 A-Freq 14 B-% ’ 2.74 B-Freq 5.5 Refer to pages 65-68 in Chapter III for names and definitions of cate- Frequency is an average of four coders. gories. '83 In the content categories for client A, 40.29% of the total verbal utterances were concerned with instructional methods as opposed to only 6.97% for client 8. Substantive concerns were emphasized by the developer to the degree of 9.67% with client A for the next highest level of ef- fort. With client A also, 48.05% of the total utterances were not con- cerned with content categories. Since client B was mostly concerned with an evaluation question, it is not surprising that most of the time in the content categories was spent in this area. Related to that is a 9.58% emphasis on instructional objectives. Both client A and B received about the same level of client verbal activity in the substantive area, 9.67% and 9.58%, respectively. As with previous developers, the emphasis in the process categories for all verbal behavior showed deve10per three spent the largest percen- tage of his time in explaining some dimensions of the problem to the client. With client A he spent 59.63% of his time and with client B 46.39%. A comparable amount of time was also spent in giving opinions to the clients. With client A, 19.82% of all verbal utterances dealt with developer Opinion, and with client 8 the developer opined 23.51%. The developer solicited information from the clients at a similar level, 6.65% for client A and 8.58% for client 8. The developer also expended some effort in reinforcing the behaviors of both clients. The percent of all utterances categorized as client reinforcing was 2.71% for client A and 4.48% for client 8. The developer also spent some time in dispensing re- source information to the clients related to the problems under discussion. With client A the percentage of utterances considered to be dispensing in nature were 2.71% and for client 8, 4.48%. Verbal behavior was exhibited 84 in all of the other process categories but at very low percentage levels. Instructional Developer Four The problem that client A brought to this developer was preparing women to be thorough, unobstrusive court observers. The client was charged with the task of preparing about one hundred women to get case backgrounds, observe court action and then talk to the judge as to the reason for the sentence received from the court. Client B was concerned with the improvement of a short course that was offered by the Continuing Education Service at the community level. 
Specifically, the client was concerned about the improvement of one com- ponent of the course, which was the quality of teaching by guest lecturers. The course was tied organizationally to a state agency, which required the use of guest lecturers or experts on the subject area. These experts or guest lecturers were receiving poor ratings by the course participants each time the short course was offered and the client, who was responsi- ble for the design and delivery, needed help in improving the performance of the guest lecture component. As with previous instructional developers discussed in this study, developer four also spent the largest percent of his effort in the phase of the conference in discussing possible solutions to the problem. With client A, 63.81% of all utterances made by the developer were related to the discussion of possible solutions to the problem, and with client B it was even higher at 82.72% (see Table 6). This instructional developer spent 32.06% of his time in identifying the problem of client A, and 15.82% with client B. Little effort was put into any kind of introduc- tory statement. The percent of total verbal behavior spent in closing Percent = 100% for Phases, Content and Process Table 6 85 Developer Four, Clients A and 8 Percent of Deve10per Utterances by Categories Frequency = For A, N=324 Frequency = For B, N=248 Content Phases Substantive and Instructional Design Cat. Prob Sol Term/ . . . "9.11.18 Intro Id Dis Clos Sub Obj Meth MedTa Eval None Cat. # 1 2 ‘ 3 4 5 6 7 8 9 10 Client A-% .15 15.82 82.72 1.31 10.65 .31 32.87 8.95 2.85 44.29 1;-qu .5 51.25 268 4.25 34.5 1.0106.5 29 9.25 143.5 3.1 O 32.06 6381 4.03 1.81 O 32.76 5.95 .10 59.07 B-Freq 0 79.5 158.2510 4.5 0 81.2514.75 .25 146.5 ~ Process Behavior Functions Cat. - Sol . Asmp Name R91nf Info Prom Expln Opin Exp Infor Struc Sum Declar Cat. # ll 12 l3 14 15 16 17 18 19 20 Client A-% 3.40 7.41 3.63 59.03 20.14 1.39 .15 1.00 .69 1.23 A-Freq 11 24 11.7519l.25 65.25 4.5 I .5 3.25 2.25 4 3.7; 13.81 17.64 6.3544.1510.8 .20 0 1.11 .20 4.84 3-9.9,, 34.25 43.75 15.751096 27 .5 .0 2.75 .5 12 PFocess Behavior Functions (gontinued) Cat. , Name 0‘59 Cat. if 21 Client A-% 1.77 A-Freq 5.75 B-% 0 B-Freq O Refer to pages 65-68 in Chapter III for names and definitions of cate- Frequency is an average of four coders. gories. 86 out the conference varied with these two clients. With client A the developer spent 4.13% in closing, and even a smaller amount of time with client 8, 1.31%. Deve10per 'flnn~ required 10.65% of his time to get the substantive background necessary to deal with client A's problem, but client B re- quired less at 1.81%. As with other developers in this study, the larg- est number of utterances in the content categories came in the instruc- tional methods area. With client A, 32.87% dealt with methods and with client B, 32.76%. The instructional objectives components received almost no attention while the instructional media category was the second largest with 8.95% for client A and 5.95% for client B. Approximately 50% of everything mentioned in the conference by the developer dealt with a substantive or instructional design concern. This developer, as did others, expended the greatest amount of verbal effort in process, explaining to the client concepts and principles related to the problem. With client A, developer four spent 59.03% of his verbal effort explaining, and with client 8 he spent 44.15%. With client A,the developer spent the next highest percent of his time opining. 
He only devoted 10.89% of his effort to client 8 in this area. With client 8, the developer apparently needed to gather more in- formation from the client, since 17.64% of his communication functions dealt with soliciting information. Only 7.41% of his time was spent so- liciting information from client A. It should be noted that the developer spent 13.81% of his time reinforcing client B's behavior re- lated to the problem. The reinforcement given to client A equalled only 3.40%. 87 Instructional Developer Five The general problem that client A brought to developer five was the improvement of a graduate level course in education. Specifically, the client was concerned with the ratings he had received from the students and wanted some assistance in redesigning the course. The client, based on student feedback, felt the problem may be that he had been trying to cover too much information in a superficial way and not providing enough depth in anything, but he didn't know for sure. Client B also taught a graduate course in education and wanted to improve her course through a more systematic design. She felt the need to develop course objectives and better teaching methods, but needed help on how to proceed. Of the four phases in the conference, developer five spent the larg- est percent of time in discussing the possible solutions to the problem. Of all the total utterances for the four phases for client A, 62.54% dealt with solution discussion (see Table 7). For client B, 77.75% dealt with solution discussion. Problem identification was dealt with for A and B at 13.11% and 19.24%, respectively, which for both clients is the highest percent of verbal activity on the part of the developer. It is interesting to note, however, that f0r client A the second highest percentage level of verbal activity was in closing or terminating the conference. Developer five spent 20.68% of his time involved in this kind of activity. A 3.38% and 2.88%, respectively, emphasis in the introductory phase with client A and 8 showed developer five's effort to get necessary Percent = 100% for Phases, Content and Process 88 Table 7 Developer Five, Clients A and 8 Percent of Developer Utterances by Categories Frequency = For A, N=125 Frequency = For B, N=19l Content Phases Substantive and Instructional Design N21111:; Intro ngb 3?; 332/ Sub Obj Meth Media Eval None I Cat. # 1 2 3 4 5 6 7 8 9 10 Client - A-% 3.38 13.11 62.84 20.68 3.38 5.54 21.35 7.43 9.19 52.97 A-Freq 6.25 24.25 116.25 38.25 6.25 10.25 39.5 13.75 17 98 B-% 2.88 19.24 77.75 .13 4.32 8.25 15.45 9.03 9.16 53.80 B-Freq 5.5 36.75 148.5 25 8.25 15.75 29.5' 17.25 17.5 102.75 Process Behavior Functions , N258 R910f ffiflg Prom Expln Opin gigp Infor Struc Sum Declar Cat.?Ir 11 12 T3 14 05 16 17 18 19 20 Client . ' A-% 9.32 6.22 5.5fi 43.24 20.27 2.30 1.08 2.43 .95 2.97 A-Freq 17.25 11.5 10.25 80 37.5 4.25 '2 4.5 1.75 5.5 B-% 11.78 8.77 4.71 34.69 22.12 3.27 .92 1.18 .79 2.49 B-Freq 22.5 16.75 9 66.25 42.25 6.25 .1.75 2.25 1.5 4.75 Piacess Behavior Functions(§0ntinued) Cat. , Name 0‘59 Cat. # 21 Client A-% 5.68 A-Freq 10.5 B-% 7.98 B-Freq 15.25 Refer to pages 65-68 in Chapter III for names and definitions of cate- Frequency is an average of four coders. gories. 89 preliminary client background information and perhaps permeate a more relaxed environment. In the content categories, developer five was consistent with other deve10pers and spent the largest percent of his time in the instructional methodology area. 
With client A, 21.35% of all the utterances was spent on methods, and with client 8 it was 15.45%. The verbal effort put into objectives, media and evaluation for both clients was spread between 5.54% and 9.19%. Less of the developer's time was put into finding out more about the substantive orientation of the problem. Again, approximately 50% of all the developer's verbal behavior dealt with substantive issues of one of the instructional design components. Being consistent with the other developers previously examined in this chapter in the process categories, developer five spent the largest percent of his time with client A and B explaining. The next highest level of verbal activity was spent in opining, and reinforcing client be- havior was third. With the explaining category, 43.74% of the developer's verbalism was in explaining, and for client B 34.69%. Opining ran about 20% for both clients with 20.77% for A and 22.12% for B. As indicated, deve10per'five spent considerable time in reinforcing the client's behavior. Client A received a 9.32% effort, and client 8 11.28%. Developer five was consistent in many categories across both clients, and was verbally active in soliciting information, prompting the client's behavior, testing the expectation and assumption of the interaction, sum- marizing the state of the conference and providing the clients with re- sources related to the problem. 90 Instructional Developer Six Developer six was the only deve10per that was able to provide the research with the three clients. The general concern that client A brought to the developer was the improvement of his overall teaching. He was interested in microteaching as a possible technique for self-improve- ment, but was also interested in other procedures that might be used not only by himself but his peers as well. Client 8 teaches a graduate course in education with a large number of students. The general concern of the client was her inability to or- ganize her lectures in order to maximize class participation within the constraints of varying levels of student competency. Client C's general concern was how and if a course, that he had taught for several years, could be redesigned following a mastery model approach. Throughout the four phases of the conference, the greatest emphasis, with all of developer six's clients, was with the discussion of solu- tions to the problem. The percent of verbal activity in this area for client A was 80%, and for clients 8 and C 79.66% and 91.31%, respectively (see Table 8). Except for client A, developer six spent practically no verbal effort in preliminary introductory behaviors. No termination be— havior of any consequence was exhibited either, and in the content cate- gories across all the utterances in the entire conference, developer six spent more time talking about instructional methods. The developer talked to client A 26.25% of his time about methods and client 8 46.25%. Client C received a 26.08% effort in this area. Client C, who was in- terested in the mastery model, dealt with deve10per six in the evaluation Table 8 91 Developer Six, Clients A, B and C Percent of Developer Utterances by Categories Percent = 100% for Phases, Content and Process Frequency = For A, N=220 Frequency = For B, N=413 Frequency = For C, N=4l7 Content Phases Substantive and Instructional Design Cat. Prob Sol Term/ . . Name Intro Id Dis Clos Sub Obj Meth Media Eval None Cat. # 1 2 3 4 5 6 7 8 9 10 Client . . 
A-% 2.16 17.84 80.00 0 2.16 3.07 26.25 6.36 8.52 53.64 A-Freq 4.75 39.25 176 O 4.75 6.75 57.75 14 ' 18.75 '118 B-%’ .36 19.31 79.66 .73 2.00 .06 46.27 1.94 1.82 47.88 B-Fre 1.5 79. 75 329 3 8.25 .25191 23 ' ‘7.5 ‘197.75 C—% .66 7. 97 91.31 .06 4.86 4.62 26.08 5.22 22.42 36.81 C-Freq 2. 75 33.25 380.75 .25 20.25 19.25 108.7 21.75 93.5 153.5 . Process Behavior Functions 7 52;; ReTnf EH20 Prom Expln Opin 233p InforIStruc Sum. Declar Cat. if 11 12 13 14 15 16 17 18 19 20 Client A-% 12.73 3.41 3.18 57.16 13.64 1.36 .45 2.05 .80 1.36 A-Freg 28 - 7.5 7 125.75 30 3 1 4.5 1.75 3 B-% *8.‘35”3.03 3.15 60.11 16.22 T73 .30 2.851 1.15 1.82 B—Freq 34. 5 12.5 13 248.25 67 3 1.25 11.75 4. 75 7.5 C-%* 13.07 1.20 .90 57.19 16.97 .60 .24 1.86 .84 1.02 C-Frgg 54.5 5 3.75 238.5 70.75 2.5 l 7.75 3.35 4.25 , Process Behavior thctions (continued) Cat. , Name DISP pg Cat. ii 21 Client A-% 3.75 A-Freq 8.25 B-% 1.03 B-Freq, 4.25 C-% 5.76 C- Fre 24 Refer to pages 65- 68 in Chapter III for names and definitions of cate- Frequency is an average of four coders. ' gories. 92 categories more than the other two clients (22.42%). In all of the con- ferences, verbal activity on the part of the developer over 50% was put into the content categories of behavior with clients 8 and C receiving 52.2% and 63.19%, respectively. With client A, developer six put out a 46.36% effort into content categories. Of all the conference utterances in the process categories, developer six spent the most time explaining concepts and principles to the clients which related to the problems under discussion. With client A, 57.16% was spent explaining, and with client 8 it was 60.11%, and with client C, 57.19%. The developer spent considerable effort reinforcing client be- havior. It ranged from 8.35% for client 8 to 13.07% for client C, with client A getting a 12.73% effort. Developer six also spent a relatively high amount of effort in giving his opinion to the clients. A 13.64% effort was directed toward client A, with client 8 getting 16.22% and client C the largest with 16.97%. The rest of the developer's verbal activity in the process categories was below 4% with all clients. Comparison of A11 Instructional Developers It became apparent after looking at the behavior of each developer and each client, that there were several commonalities of behavior in the initial conference regardless of the client or the problem the client brought to the developer. The most obvious behavior was that, in the four phases of behavior, developers spent the largest percent of their time discussing possible solutions to the clients' problems. This cate- gory ranged from 57.96% for developer three, client 8, to 91.31% for developer six, client C (see Tables 5 and 8, respectively). The per- centage for all developers was 74.57% (see Table 9). 93 Another obvious trend was the percent of time spent on problem iden- tification. Without exception, the second highest time-consuming effort by all the deve10pers went to the identification and clarification of the client's problem. The range for this verbal activity by developers was from 13.11% for developer five, client A (see Table 7), to 32.06% for developer four, client A (see Table 6). Across all the developers, the percentage for this category was 19.81% (see Table 9). The percent of time the developers spent in the introductory phase in such preliminary activities as getting background on the clients or putting the client at ease through small talk or, in general, "breaking the ice", varied considerably. 
With some clients, developers spent 0.00% performing this behavior, such as developer three, client A (see Table 5), or developer four, client 8 (see Table 6). On the other hand, developer two spent 8.96% of the time in preliminary activities with client 8 (see Table 5). The percent for all deve10pers and clients was 1.80% (see Table 9). The trend across all developers placed the introductory phase as receiving the least amount of emphasis in the initial client interactions. The fourth phase, termination and closing, also shows a great var- iety of behavior across all clients. It ranges from 0.00% with developer six, client A (see Table 8), to a 20.68% from developer five, client A (see Table 7). For all developers, termination and closing was 3.66% (see Table 9), which gave it the third level of emphasis in the four phases of verbal behavior. In the content categories, the only obvious common pattern of ver- bal behavior across all clients and all developers exists in the methods 94 Table 9 Percent of Developers' Utterances In Each Phase Total Utterance Frequency == 3353 Phases Prob. Sol. Term. Not Dev. Intro. Id. Discus. Closing Coded % Total 1 1.83 25.63 61.95 10.33 0.26 100 2 3.46 23.41 70.91 2.15 0.07 100 3 3.60 21.94 71.99 1.84 0.63 100 4 0.09 22.87 74.52 2.49 0.03 100 5 3.13 16.22 70.41 10.24 0.00 100 5 0.86 14.50 84.36 .31 0,00 100 Wtd. Avg. 1.80 19.81 74.57 3.66 0.16 100 category. Most developers spent 20%-40% of their time talking about instructional methods with the client, regardless of the client's prob- lem. The two exceptions to this were deve10per one, client A, who spent 9.44% on methods (see Table 3) and developer three, client 8, who spent only 6.97% in this area. It is interesting to note that in the developer one, client A, conference the issue was very subject specific and required a 12.87% effort in the substantive category and 14.11% in the objective category (see Table 3). With developer three and client 8, the problem under discussion was one of evaluation which required 21.52% (see Table 5) of the developer's time in the evaluation area. It seems wherever the developer had a more general problem to deal with, in- structional methods received the largest percent of attention of all the content categories. 95 Over the rest of the content categories, the verbal behavior varies greatly. In the substantive category, for example, the percent for all six developers ranged from 2.77% for developer two, to 14.56% for developer one (see Tables 4 and 3). The percent for all developers in this category was 6.43%. The second highest percent in the content categories across all deve10pers was the evaluation area. Even though the range of developers' verbal behavior was from 0.00% (see Table 4) to 22.42% (see Table 8), there was a tendency by most instructional developers who were not specifically dealing with evaluation, such as developer three, client 8 (21.52%, see Table 5),and developer six, client C (22.42%, see Table 8), to treat evaluation equally with the other content categories. It can be said that, because of the relatively high percentage, evaluation received the second highest degree of verbal attention over all the instructional developers. The objective category also ranged widely. Developer four, with both clients, gave this category practically no emphasis, and developer one gave 14.11% and 9.23% with clients A and B, respectively. The objective category, over all developers, was 4.19% (see Table 10). 
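The "Wtd. Avg." rows in Tables 9 through 11 are consistent with weighting each developer's percentage by his share of the 3,353 developer utterances reported in Table 2. The short check below illustrates that reading for the solution discussion column of Table 9; the weighting rule is an inference from the tables rather than a stated procedure, and the utterance totals are summed from Table 2.

```python
# Illustrative check of the "Wtd. Avg." row in Table 9 (solution discussion).
# Developer utterance totals are summed from Table 2; percentages are from Table 9.
utterances = {1: 479, 2: 361, 3: 515, 4: 572, 5: 376, 6: 1050}   # totals 3,353
solution_discussion = {1: 61.95, 2: 70.91, 3: 71.99, 4: 74.52, 5: 70.41, 6: 84.36}

weighted = (sum(utterances[d] * solution_discussion[d] for d in utterances)
            / sum(utterances.values()))
print(round(weighted, 2))   # 74.57, matching the reported weighted average
```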
The last category to be discussed will be media, which is cate- gory 8. Over all the developers, the media category was more consis- tently emphasized than evaluation. The range of verbal behavior of the developers was 1.24% by developer three, client 8 (see Table 5), who was primarily dealing with an evaluation problem, to 20.54% for developer two, client 8 (see Table 4), who was working with a client 96 Table 10 Each Developer and All Clients Percent of Developer Utterances by Content Categories Total Utterance Frequency = 3353 Categories . . Ngt Dev. ’Sub Obj Meth MedTa Eval None Co ed 1 14.56 12.32 13.26 6.26 9.81 43.79 0.00 2 2.77 2.01 29.43 10.25 1.94 53.60 0.00 3 9.51 4.08 27.28 1.31 9.03 48.64 0.00 4 6.82 .17 32.82 7.65 1.66 50.70 0.18 5 3.86 6.91 18.35 8.24 9.18 53.39 0.07 6 3.17 2.50 34.05 4.17 11.40 44.69 0.02 Wtd. Avg. 6.43 4.19 27.57 5.73 7.88 48.13 0.07 who wanted to improve her teaching through more effective use of over- head transparencies and motion picture film. The percent across all developers was 5.75%. Nine of the thirteen clients placed an empha- sis on the media category from 4% to 10%. Category 10 in Table 10 shows that a percent of 48.13% of all the utterances across all developers could not be included as fitting into one of the content categories. Obviously, this means that an average of 51.87% of all the utterances could be included as substantive and instructional design. The "not coded" percent included all nonsense and error. In the process behavior functions there were four areas of empha- sis that stand out as receiving most of the developer's attention in 97 the initial conference. The obvious area is the explaining category (14). The average across all developers in this category was 49.22% (see Table 11). They ranged from 29.67% for developer one, client A (see Table 3), to 60.11% for developer six, client C (see Table 8). This, of course, . means that 49.22% of all verbal behaviors was spent in explaining con- cepts and principles related to the client's problem. Regardless of the client or client's problem, the developers spent, by far, the largest percent of their time explaining or, in a manner of speaking, teaching the client key ideas related to his or her instructional problem. Related to the explaining category is the opining category, be- cause in both cases the developers were telling the clients something. This category received the second highest emphasis for all instructional developers with 18.29% (see Table 11). This means, of course, according to the data, for all developers, that 18.29% of the developers' verbal behaviors was spent in giving the clients their personal opinions on problem-related issues. Combined with the 49.72% in the explaining cate- gory, the deve10pers spent 68.01% of all their verbal behavior telling the client something. The opining category ranged from 10.89% with developer four, client 8 (see Table 6) to a 77.84% with developer one, client B (see Table 3). The percent for all developers in the reinforcing category was 8.79% (see Table 11). This received the third highest emphasis for all developers. So 8.79% of the developers' verbal activity was spent in reinforcing or praising the client on his behavior related to the prob- lem being discussed. 98’ It is interesting to note that for all the developers, the fourth highest amount of verbal activity occurred in the area of reinforcing client behavior. This category received a 8.38% (see Table 11) across all developers. 
This, according to the data, shows that out of all the verbal behaviors, the developers are putting an 8.38% effort into tell- ing the client how good he has done or is doing with the problem at hand which, of course, can affect positively the client's continued in- terest and guide desired behavior change. _ The data also shows that a 3.93% effort was spent in prompting the client to learn or discuss a desired behavior in the initial conferences. In such an instance, a developer may not have explained to the client a particular concept or principle but, through a series of statements, helped the client to discover it himself. Another category of behavior which received a moderately high em- phasis by all developers was the dispensing of information related to resources, human and otherwise, that might help the client with the problem. Dispensing received a 3.70% emphasis overall (see Table 11) and two developers gave it an emphasis of over 7% (see Tables 3 and 7). All of the rest of the verbal behavior in the process categories were below 2%, with most being clost to 1% (see Table 11). The least emphasis for all developers was placed with the declarative category (19), which indicates, according to the data, that little verbal behavior overall was devoted to giving a commitment to do something for this client after the conference. This might include commitment to a next meeting in order to continue helping the client with the problem. 99 Table 11 Each Developer and All Clients Percent of Developer Utterances Total Utterance Frequency = 3353 by Process Functions Categories Dev. Reinf $01 Info Prompt Explain Opining As/Exp l 11.06 17.22 4.07 29.23 24.01 1.72 2 6.16 11.43 5.19 51.80 13.57 .83 3 3.40 7.04 4.47 54.57 21.26 1.21 4 7.91 11.84 4.81 52.58 16.13 .87 5 10.57 7.51 5.12 38.90 21.21 2.79 6 I 11.14 2.38 2.26 58.33 15.98 .81 Wtd. Avg. 8.79 8.38 3.93 49.72 18.29 1.24 Dev. Infor Struct Summar Declar Dispen ngzd 1 1.93 1.15 1.77 .78 6.58 0.48 2 3.81 .97 .76 2.77 1.94 0.77 3 1.84 1.17 .44 .83 3.79 0.00 4 .09 1.05 .48 2.80 1.01 0.43 5 1.00 1.80 .86 2.73 6.85 0.66 6 .31 2.29 .95 1.40 3.48 0.67 Wtd. Avg. 1.19 1.54 .88 1.76 3.76 0.07 The percentage for all developers over all categories in each group of behavior can be ranked as follows: 100 Phases Ranking Category Number Category_Name Average % 1 3 Solution Discussion 74.57% 2 2 Problem Identification 19.81% 3 4 Termination/Closing 3.66% 4 1 Introduction 1.80% Content (Substantive and Instructional Design) 1 7 Methods 27.57% 2 9 Evaluation 7.88% 3 5 Substantive 6.43% 4 8 Media 5.73% 5 6 Objectives 4.19% Process Behavior Functions 1 14 Explaining 49.72% 2 15 Opining 18.29% 3 ll Reinforcing 8.79% 4 12 Soliciting Information 8.38% 5 13 Prompting 3.93% 6 21 Dispensing 3.76% 7 20 Summarizing 1.76% 8 18 Structuring 1.54% 9 16 Assumption/Expectation 1.24% 10 17 Informalizing ' 1.19% 11 19 Declarative .58% Discussion of Findings as Related to Selected Theoretic Models Davies Approach Davies (1973) described three dimensions of an instructional developer-client relationship. One dimension includes the preliminary functions, which are performed with the client, which are problem defi- nition, interpretation of the data and the generation of alternative solutions. The second dimension of the working relationship, according to Davies, is concerned with identifying the type of assumption which underlies the relationship. The third dimension is listing the succes- sive stages in the development of the relationship. 
Parallel to the last 101 dimension 'Hs a sequence of task-oriented steps which include the general areas of diagnosis, planning action, implementing action and evaluation. The categories developed in this study to analyze the verbal behav- iors of instructional developers in initial developer-client conferences indicate that there are some similarities to what Davies proposes in his model. According to the data from this study, there was in each conference $33: a problem definition activity and a significant emphasis on solution dis- 5 cussion which seems to agree with what Davies proposes as generation of alternative solutions (see Table 12). The difference may be in how much of the solution discussion was centered around the generation of several L_, alternatives or the discussion of the solution(s). Problem identification, or problem definition as Davies labels it, was found in this study to be a significant part of the developer's verbal activity. Davies, too, places important emphasis on problem definition which indicates a point of agreement between the study and Davies' model. Davies also places great emphasis on clarifying the kinds of assump- tions which will underlie the relationship. In terms of the verbal be- havior of the developer, this received one of the least amounts of empha- sis. It should be noted here that one of the assumptions which seem to underlie Davies' thinking is that the developer-client relationship may be more longitudinal in nature, than just one meeting. In no instance did any developer indicate a strong interest in extending the relation- ship beyond the initial conference in an organized manner. This may account for the small emphasis, since the developers may not have expected any lasting professional relationship. 102 Davies' three stages of the instructional developer-client relation- ship (entering into the relationship, maintaining the relationship and terminating the relationship) is difficult to relate to the data from the initial conferences. Davies' stages seem to indicate a longer term relationship than was apparent in this study. However, in examining the first stage, entering into a relationship, and its two subcomponents (1) initial contact with the client system, and (2) negotiation of a formal contract, it is obvious that there was client contact. However, analysis of the verbal behavior in establishing the categories of analysis did not reveal any utterances by any developers to negotiate a formal rela- tionship with the clients.‘ Table 12 Agreement Between Theoretical Models and Findings by Phases Phases Model 1 2_- _3 4 Davies . Some ‘ 162' ies Some Silber Some Yes Yes No Havelock Some Yes Yes Some Silber'sgApproach Silber (1973) proposes a five-stage process which he describes as the "People Function/Skills Involved in Working Content Specialty to Develop Instructional Systems." He describes the five stages as: (l) establishing the relationship, (2) gathering data regarding the problem, 103 (3) working toward a solution, (4) challenging the client, and (5) managing the development. Stage one in Silber's model is related closely to the introductory phase of the researcher's analysis categories. Silber describes the components of his stage as breaking the ice, establishing an open light- hearted, non-threatening tone. The data show that all deve10pers placed 915; some emphasis in this area. Silber also sees the identification of ex- pectations and assumptions as part of this stage. 
The data do show that all developers gave some emphasis in this area, but it is not clear where, within the stages or phases of the conference, this did take place (see Table 12). Silber identifies the next stage as the process of gathering data regarding the problem, which is comparable to the problem identification phase in this study. The third stage Silber presents is aimed at work- ing toward a solution, which is consistent with phase three of this study, solution discussion. This study showed that the solution dis- cussion phase received more emphasis in terms of developer verbal be- havior than the other three phases. Silber or Davies do not mention relative emphasis in terms of their stages, but Silber does present strategies for implementing all his principles. Silber's stage four, challenging the client, is described as chal- lenging the client intellectually by selectively disagreeing in an aca- demic manner. This stage did not manifest itself in the verbal behavior of the instructional deve10pers in this study. Silber describes his last stage as managing the development. As in Davies, the implication here is that an agreement has been made for the 104 developer and client to work beyond the initial conference in an organized manner, which was not true with any of the developers in this study. This stage of Silber's includes two subcategories. One is assigning work and the other is criticizing materials together. Assigning work calls for task identification and assignment, time lines and getting a psycho- logical contract from the client to do his share. In criticizing work to- 5‘05 gether, the emphasis is placed on being constructively critical and offer- ing to help with counseling efforts. The developers in this study gave no indication, by their verbal behavior, that the behavior represented in this stage of Silber's model was being performed. gyp, Table 13 Agreement Between Theoretical Models and Findings by Content Categories Content (Substantive and Instructional Design) Model 5 6 7 8 9 10 Davies Some No No No Some ---- Silber Some No No No No ---- Haveloc Some No No No No ---- Havelock's Change Agent Literature Havelock (1973) presents a model for working with clients by the change agent which provides a six-stage process that may have consider- able application to the instructional developer. Stage one, the relation- ship stage, suggests that a viable helping role must be established with the client early if the relationship is to have a strong foundation for 105 further development. He describes stage two as a period of problem diag- nosis, and stage three a step to identify and acquire relevant resources prior to choosing a solution, which is stage four. During the solution choosing stage, Havelock suggests that a number of alternatives are gen- erated from which a potential is selected. Stage five is concerned with gaining acceptance of the solution and in stage six, stabilization and self-renewal, the change agent attempts to develop the client so that he can function independently. In this study, the introduction category of verbal behavior is con- {In-v.91.» .' . " sistent with what Havelock calls the relationship stage and is also simi- lar to the early stages suggested by Davies and Silber. Problem diagno- sis, as Havelock labels it, was found to be comparable to phase two of this study. 
The third stage in Havelock's model (acquiring relevant resources) was not explicit in the verbal behavior of the instructional developers in the initial conference with new clients. In the developer-client conference, the developer served as the primary resource for problem solution. In each case, since the developer went on to discussing the solution, he apparently felt he had enough data to proceed in such a manner. It is obvious, since all developers went through the solution-choosing stage to closing out the conference, that none felt the need to stop after the problem had been identified and seek more resource data prior to selecting a solution in a next meeting. Havelock's stage four (choosing a solution) is included in the solution discussion phase of this study. Havelock places no particular emphasis on this stage, but seems to be in agreement on the importance of the activity (see Table 12).

Stage five (acceptance of the solution) would also be part of the definition of solution discussion. In the initial coder training process, an attempt was made to train coders to pick this dimension out of the general solution discussion category, but after several training attempts the coders felt the judgments were too sophisticated for the kind of data that was being used.

Stage six, stabilization and self-renewal, seems to relate to a long-term change strategy and not the kind of verbal behavior found in the thirteen client conferences. In this study, the developer-client relationship seemed to come to closure after the initial conference and not to be structured or organized to move into a longer relationship.

The data in this study seem to confirm at a general level certain components of the Davies, Silber and Havelock models. The three models seem to be consistent with each other and with this study in terms of a first step with the client, which was labeled the introductory, relationship-establishing or entering-into-the-relationship stage. The models and the results of the study seem to agree on the importance of a problem identification and description emphasis and also of a solution generating, solution discussion phase. The models went beyond the results and intentions of this study to propose how each phase or stage might be performed by the instructional developer or professional.

The results of this study clearly go beyond the models by describing what goes on within the first stage in terms of types of verbal behavior related to the content or substantive domain of the client. In addition, the results of this study go on to describe the kind and frequency of verbal communication behaviors which function concurrently with substantive and instructional design components as part of a major phase of behavior.

Table 14
Agreement Between Theoretical Models and Findings by Process Categories

Models        11      12      13      14      15      16
Davies       No      Some    No      No      No      No
Silber       Some    Some    No      No      No      No
Havelock     Some    Some    Some    No      No      No

Models        17      18      19      20      21      22
Davies       No      No      No      No      No      --
Silber       Some    No      No      No      No      --
Havelock     Yes     No      No      No      Some    --

Chapter IV has included a presentation, interpretation and discussion of the findings of this study. A rationale for determining intercoder reliability was provided, along with a review of the methods used and their support. The intercoder agreement for all coders across all developers was 0.82.
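As a purely illustrative aside, the agreement figure just reported can be mimicked with a small computation. The sketch below is not the dissertation's reliability procedure; it simply assumes that agreement is expressed as the proportion of utterances on which two coders assign the same category code, and the coder labels and category codes in it are invented for the example.

```python
# Minimal sketch of intercoder agreement as simple proportion agreement.
# The coder names and the category codes below are invented for illustration;
# the study's own reliability method may have differed.

def proportion_agreement(codes_a, codes_b):
    """Share of utterances on which two coders assigned the same category."""
    if len(codes_a) != len(codes_b):
        raise ValueError("Both coders must code the same set of utterances.")
    matches = sum(1 for a, b in zip(codes_a, codes_b) if a == b)
    return matches / len(codes_a)

# Hypothetical phase codes (1-4) assigned to ten developer utterances.
coder_1 = [1, 1, 2, 2, 3, 3, 3, 3, 4, 4]
coder_2 = [1, 1, 2, 3, 3, 3, 3, 2, 4, 4]

print(round(proportion_agreement(coder_1, coder_2), 2))  # 0.8
```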
Each developer's verbal behavior was presented, interpreted and discussed by the phases of behavior categories, the substantive and instructional design categories and the communication behavior functions. The results of the verbal behavior of all developers were then presented, interpreted and discussed. Each group of categories was examined for the behavior that the developers seemed to emphasize the most. A ranking of results was provided which shows the relative emphasis of the six developers on the categories of behavior.

Lastly, the relationship between selected client-professional interaction models was compared to the findings of the study. Each model was examined for its similarities and differences to the findings.

APPENDIX D

TRANSCRIPT AND CODING PROCESS

Developer: 1. What's up?

Client: O.K. I had a first exposure to your group this fall when you asked for faculty interested in seeing how they perform the video tape.

Developer: 2. Like the micro teaching situation.

Client: Right. And I couldn't take advantage of it, I came to the first class. I liked it but my schedule was such that I couldn't take advantage of it, so I looked into it a little bit more and was interested to have others' opinions. Time being what it is. It is so rare you have anybody that has time unless they are paid for it to do the thing right.

Developer: 3. Righto.

Client: To get anything of benefit back to yourself, feedback. So I am a very strong proponent of general education, trained in science, but to me the most important population to get science to is not the science population itself. You don't have to convince them. It is the non-science group, and here in the university college the nat. sci. department is one of the finest approaches to get to the non-scientists who really have some power to vote. The handle on what we can or can't do, can or cannot do with our finances. I firmly believe with Fernowski, for instance, that there is a correlation with freedom, science, understanding of at least democracy. These are all tied and it is frightening when you see the mass of the population not really knowing how the hell babies are produced. You know they are all going to be doing it if they haven't already. I am kind of scared, and so I have a population of non-science majors. Generally I have had very favorable response. There is a personality thing they like or they don't like. But there might be a better way to do some of the things I am now doing and get it across to them. And I don't know if this is what you deal with, but I am exploring to see what this university has. I was very surprised at how many things are here for faculty that are not used.

Developer: 4. Damn right, absolutely.

Client: If they take the fact to find out. It is frightening the size of what is available here for services. There is stuff even the faculty don't know about.

Developer: 5. You're exactly right.

Client: I'm fishing. I like to fish.

Developer: 6. Good. 7. Let me then, give you a pool.

Client: Let me ask something. Could I leave around noon or so, because I have to be somewhere?

Developer: 8. Oh yeah. 9. This is what I will do. 10. I will lay out in a few minutes the kind of resources we have got and the kind of things that people do with us, you know, and we do for them. 11. What we do for individuals or for groups of people. 12. You know, and give you a kind of idea. 13. And let you know. 14.
Look I will pull a list of some of the things I have been doing lately and the clients that I have got. 15. You know. ' 16. I got a list of some of the things I have been doing lately with some clients. I have another motive to this and it has to do with review of teaching by peers. In other words faculty evaluation is what it amounts to. From the teaching aspect. I am kind of tired of all my colleagues, not all of them.but most of my colleagues washing their hands of something that should not be done because it is a threat. Youjknow the idea of how do you evaluate good patient care in the medical arena. Do you incorporate the patient into this. The doctor does say no, he doesn't know a damn thing about it. ‘Well I see they are grasping a little with that. I have been with the medical arena a little bit. They are trying to put a handle on this nebulous thing called good patient care, from a peer review type of thing. I see the same thing here with peer review of what is good teaching. And the patient of course being the student. I am tired of my colleagues saying, fill these damn student evaluation things out. Let's do it and not raise a fuss. ‘Well someone wants us to do it. Maybe there is something there that we can look at. No one has done enough of a study to say it is good or bad. Sure if the student is happy he is going to give you a better mark than if he is failing. You know these arelittle exceptions to the rule to tell me things are good or bad and I would like to get heavily involved in designing some instrumentation. So when.we have evaluation for faculty, by god there is something that is standardized, and uniform. Because it has the root bearing on salary Developer : Client: Developer : Client: Developer : Client: Developer: Client: Develop er : 136 increases, and tired of secretary rumor of determining what the person gets for salary and not whether you are on two committees or three committees. That's crap. 17. I know. 18. Are you aware of the SIRS? 19. The various levels of the SIRS form that they are developing now. Not to any large degree. Only that I look at the one and two versus the three and four on that choosing do you agree or do you strongly disagree. I don't agree with all the current criteria the student has. 20. What we are developing right now is different forms of that SIRS rating. 21. You are not aware of it. 22. O.K. 23. Well one form: I don't know which is which - I get confused at times. 'We have a long and short form that we use in our department. 24. These are for different purposes you see. 25. One of them is for general comparison among colleagues. 26. And it is for purposes of promotion, raises and so on. 27. And that is designated as such, and that is what that is. The SIRS Form, what does it stand for? 28. Student Instructor Rating System. Something like that. Oh. It is from the aspect of the student. The student evaluating and using it for other things than we we think. 29. Right. 30. So they are going to develop one form which has to do with that kind of thing you can compare people on to each other. Client: Developer : Client: Developer : Client: Developer : Client: Developer: 137 31. And then there is another they are going to develop for the departments. 32. For each department that wants to develop their own. 33. And SIRS will help them develop it for the particular needs of that department. 34. 
And I think there is even a third form.that will be used for instructors but the main purpose is not for looking at the instructor for purposes of promotion, tenure and so on. 35. But for purposes of improvement. 36. Self improvement and so Is there confidentiality built into this. Because you know that's a very strong motive. 37. O.K. _ 38. Some of the levels of SIRS they divide them up this way and I don't have all the details on my tongue but some are definitely secure and some are definitely not secure. 39. Some are to be public information and some are not. And departmental use for salaries, you have to have the thing identified. 40. That's right. But they are doing something on this. 41. Absolutely, and if you want to talk to somebody in detail on that, Lee Olson, one of our men is in charge of that. ' was he into this microbe teaching? I think he was intro- duced. ‘Wash 't he. 42. Yeah. Client: Developer: Client: Developer : Client: Developer: Client: Developer: Client: 138 O.K. The name is familiar to me. Lee Olson. 43. Yeah. O.K. That's very good, because just this afternoon we are having a committee meeting about salary and raise guide- lines. And we are hasseling with what options do we have. Not a hell of a lot right now. Objective ones. 44. O.K. 45. At least this one. 46. This part of the SIRS will help you. This comes from students, information from them that is used for these categories. 47. Right. 48. Now, let me give you two things. 49. One is the description of what we can do to provide data to faculty members. 50. But we will only provide data for purposes of improve- ment. 51. Our office will not get caught up in providing data for purposes of tenure or promotion. Not talking about my being here today or the bigger picture. 52. No the bigger picture. 53. Because we are not going to be put in the position of commissar. 54. O.K. 55. We are not the deans, and the chairpersons. 56. You know policemen. 57. Checking up on faculty members. 58. Hell we would never get a client in here - you know. If you did this on an individual basis though for 35 faculty in one department then someone else could take those data and do what they wish with it, could they not. WOuld You be adamant to. Developer: 59. 60. 61. 62. 63. 64. 65. 66. 67. 68. 69. 70. 71. 139 Only if it were completely agreed upon by all and I mean all cause I could feel what it would be like to be in one guy's place. But if there were complete consensus upon it and we were doing it for the purposes of improvement. But all the guys said O.K. yes but you can also use this information for other purposes. When we collect data for somebody as purposes of improvement, first of all the data is completely secure. For example, let me give you some examples of the kinds of ways you could do it. D we can, if the class is small enough they could bring it up here and they could teach that class in front of videotape. ‘We won't release the video tape to the chairman unless the guy says, yeah I would like my chairman to see that. Fine, great, terrific. Sometimes, now we have got new equipment. Larry says he got it in his office today which we could go to that class now that we have portable equipment. And we could video tape in his class and give a critique on that. We won't just however go and give the guy, make a tape, let's say here's your tape, if we do it we are going to provide some guidance. That doesn't mean tell the guy what we are going to do 72. 73. 74. 75. 76. 77. 78. 79. 80. 81. 82. 83. 84. 85. 86. 87. 88. 89. 
140 necessarily, but we may, if he says give him advice we will give him advice. O.K. So that is another data of sort. We can construct particular questionnaires for an in- dividual. What we often do, for example, with a person is help him collect data informally. Now a professor came in here from pathology. And he didn't know, he knew he was doing some stuff that could be improved in terms of his lectures. He didn't know whatthough. What was wrong. He knew there was something wrong. It could be better. And so all we did was make up a very simple thing that could be answered in just a few minutes had someone go and explain to the students what he was trying to do. We want to do it. Although some students didn't even interpret that correctly. we were looking at the guy, you know. But it was explained that he was doing it. He wanted this information. We were helping him collect it. In fact we explained it but he should have done it and we should not have even been there helping him with the form. Client: Developer: 141 90. And the form said O.K. what do you like best about this particular lecture type, what did you like least, what improvements would you like seen made. 91. What questions did you have. 92. What did you achieve. 93. And we took the data form.and summarized it for himm 94. Some guys we can just show it to them.and they can figure it out and summarize it and bring it in and then we brainstorm about it, figure out what we can do and the person sets a little goal for himself. 95. Yeah I notice it is my questioning technique or you know I didn't have enough lucid examples. 96. Something and so you are playing with the guy okay why don't you really hit back. ' 97. Set a little goal for yourself and bingo. 98. Do that. 99. And the guy does it, gets some data again and we compare the data, take a look and so the guy says look we are really making some strides, some headway. 100. So that's another way we do it on that sort of data. 101. That's confidential too. That's nice. I am now only talking for my individual self. Maybe there are. I am.aure there are. I just don't know what they might be. I try illustrations, I do transparen- cies, I like slides, I like informal, I like didactic. 102. Yeah. 103. By the way what usually happens we get pretty good teachers in here. 104. What we do is we make the good better. Client: Developer: 142 105. You know the people that come in to us are really, if they recognize that they want to improve, you know they are usually pretty good to begin with. 106. That is what we have found anyway. . 107. Another thing that we can do and I have done this with a number of people is send in, I sent in a couple of observers. 108. We have staff members or either advanced graduate students in here who after identifying you know what the person wants to have looked at we get people to go in. 109. It is less obtrusive than having you know a camera in there and so on. 110. Although that happens very quickly.‘ ‘Well see what's nice about my lecture hell is there is a closed in area where you.have a projector room that none use either. No one can see if there is someone up there. 111. That's it exactly. 112. And so that gives you an idea. 113. We do observation. 114. You know data collection, you know by little instruments. 115. By video tape and so on. 116. we do review procedure on how the person improves but the trick is that to figure out what is it the person ‘wants to look for. 117. Sometimes the person does not even know what it is the person wants to look for. 
118. Sometimes the person who does not knOW‘Wlll take the sort of shotgun approach at first. Client: Developer: 119. 120. That I just described to you. Uh huh. 121. Then we figure out some goal and then we think about it. 122. Now a general approach that some departments have used in other places has been something like this. 123. That each instructor, that could go in any area, but I am just talking about teaching makes a little con- tract could be quarterly or yearly with the chairman. 124. So he sits down and says well in my teaching what I am going to do is I am going to improve my discussion technique. 125. You see and here is how I am going to do it. 126. I may use this technique or I am going to try this and collect some data on it. 127. So that is what I am going to do. 128. I am also going to try this little thing about ques- tionning technique which I am going to improve. 129. I am going to go to this workshop over here and then I am going to try it in my classes and bring evidence that I have done this. 130. This is the type of evidence that would be appropriate. 131. And then maybe the chairman will take a look and say 143 We will use some of the selective informal post class questionnaire. that is not enough you know and people have been doing that and its a nifty approach because its personal, its individual, it makes sense to the chairman and its Client: Developer: Client: Developer: Client: Developer: Client: Developer: 144 spikes concensus between the chairman and the indivi- dual. 132. That's nice. 133. Yeah its a nice set up. 134. Yeah its an individual contract. 135. And everybody knows whats up. 136. It could even be written out as a memo, you know its on paper and everybody knows whats going on. 137. You know that might be good. YOu know I kind of like that. It puts a little hammer over you to do it for your own good, it sort of forces you a little bit, because you could let it dally forever if you don't do something like this. 138. That's right and look at individuals have different needs. That's interesting. 139. And individuals have different course loads and at different quarters and sometimes you may do more and sometimes you may do less. 140. Sometimes you may do nothing. 141. I mean in response to that. 142. And that seems to be a nice equitable way of doing it. Thus keep the powers in being in your department knowing what you are doing. 143. Exactly. 144. And you are trying to improve and you are improving this much or that much. 145. That makes a lot of sense to me. Yes, it is quite interesting. 146. So the thing about you personally, you know about what Client: Developer: (Ilient: 147. 148. 149. Its 150. 151. 152. 153. 154. 155. 156. 145 you want to do, at this point if you wanted to you could say alright - its up to you. You could say right now okay let me think about all these things. You know and I will get back to you or you could say geez you know look I have this particular area I would like to check on or I am not sure, I wonder if I could try this, you know sort of shotgun instrument - this very infonmal instrument and see that and try that and see what kind of data we get and then we could see where we might go. Or if your class is small enough which I know is prob- ably under 40. Yeah.we could do that. I am.not sure of our new equipment yet so I will not promise anything with that yet. Well you might want observers to come in and take a look or you might want to try a combination. Where observers go in and also you use the informal questionnaire. 
Now here is what is happening. You get a general rundown. I have done this with a person in humanities. I was thinking because my lectures have been usually well received. They think its in order whatever order may be. They think I can fare whatever hell fare means. But I am looking for something that goes beyond them.getting a grade and feedint back to me the things I have given to them and so on. I always make the coment that if they leave the course with nothing more than the interest and ability to Developer: Client: Develop er : (ILient: Developer: Client: Developer: 146 read Time magazines science section I will be very happy because this student just generally does not read it un- less it is on the front cover of the recent Newsweek - Cancer. And a few of them do it. But I am looking beyond for a wider spectrum of the non-scientist who may be in- stead of always going to sports first will include in their browsing and reading materials because just by plain read- ing you get educated. Whether it is in Time or not, then it is just so much that they can opinionate on when you talk about genetic engineering, you know or mind control, hormonal control of reproduction and so forth. And to use it beyond the class as well as to use the principles that in science to them most of them.have a button in their head and they have told me this through their verbal presentations or through their written evaluations I look at. They hear the word and they push the button auto- matically and they sit there like zombies. Because its too hard. I don't like it. And they really don't know what they like and don't have any idea what it is. It is something that is secret. All the characters of what science and scientists are are there. Middle-age old men looking at test tubes and doing weird things that no one can understand doing strange formulas. 157. Right. And the whole process of how we critically review things carries over. Especially in reading a newspaper, in sign- ing a contract, in going to the store. You know there is a lot of principles that scientists really carry to a farther degree. There is certain testing, refinement nonetheless its the everyday man of which we understand 20% of our population in this country is functionally illiterate. 158. That's right. The everyday man has benefits to derive from studying things that are based upon such as science. This type of logical review of things demand for specific objectives as much as possible, criteria of evidence. What is true, what is fact. You know these are all qualified. And to carry that over they have to take another science course. You see if they don't they still know that genes do this. 159. So what you are saying to me is (interrupted) Behavioral objective beyond the course. This is really the goal that I want. Whether I give 4 points or two points I really couldn't care. a 160. Are they realizing that now. 161. Are they getting that? 162. Is that what you are saying? Client: Developer : (Ilient: Developer : Client: Developer : Client: Developer : 147 163. Okay I understand what you are after. 164. You are after some principles that they need to learn and some processes of thinking that they need to learn and I think they are not achieving that objective. I don'tknow. I have stated it and I have told them. I ' have tried to give examples. 165. Have you tried to create a test which might simulate that? No. You see maybe its something like this initial ques- tionnaire. 166. Or do you do activities in class which might stimulate that. 
Well I always have a weekly bulletin board and take out of the State News and State Journal things that are mentioned that we are studying. Like there were several things mentioned on cancer and mutations. 167. But can you guarantee do you set up that the students do something with that. No I haven't. Maybe I should program it within my normal structure. You see I just returned to teaching after five years away. So I am.just now keeping ahead of them the basic ~didactic laboratory and its a year away before I feel free enough, confident enough, okay know how do I start modifying out of this structure I mm just doing be- cause it is a familiar one. You see and it is never too early to start but I am looking beyond a nat. sci. course for the kids. 168. Okay for example. 169. Just on the basis of what you said, I could and I am just doing this to illustrate. 170. Just on the basis of what you said you want to do is to teach these kids how to use the various principles and concepts of thinking that take place in science in their everyday lives. okav. Client: Deve10per : 148 171. That's what you want to do and you want to use it. 172. To continue to have the use of it in their everyday lives and to continue to learn about science and other things. 173. Okay. 174. Now see if I were looking at that the first thing I might consider looking at in your course and we couldn't do that by just observing, you lecture. 175. But we got it from interview. 176. And that is you might have, you.might benefit from tests not only final exams but quizzes which simulate the kinds of goals that you really are after. 177. Of for example, having a recent statement from.Time magazine about a particular topic and a question based on it. 178. You know where there are maybe choices of interpreta- tions if here is the article and there are various interpretations of you know With common flaws of thinking involved in each of the interpretations. I did that this term with so many papers written by scientists about democracy and it seems for some individuals be so out of the realm on their everyday thinking. It was lost at first. Some reacted to the style. That turned them off although did not with the content. Others dis- agreed with the content. So there is a beginning towards that, if what you are talking about. 179. And the idea then that you know to zero that in so that it fits and makes sense to the student. 180. They understand its relevant. 181. Your making it relevant, they have got to understand what it is relevant for. 182. 183. 184. 185. 186. 187. 188. 189. 190. 191. 192. 149 We would take a look at this in terms again. What has that got to do with the students past experiences. Their aspirations, or their present interest or their values and so on. And what is important to them. So we can use those as sort of criteria to judge these items. we make quizzes or an exam-like that and in class, if that is what we are going to test them on in class as part of the lecture, what we might do is set up something like, or present some ideas about the looking at certain ideas and critiquing certain ideas and figuring out what is a reasonable scienti- fic inference from the data and so on and then in- stead of lecturing the whole time we would stop after say ten minutes of lecture and then present lets say on the overhead here's a chunk from Time magazine and here is what this guy says now I want you each to write down what you think, what do you think the in- ference is. And do you think this inference is valid and why? 
Now check your inference with just one other person in the class. Just pair off with one other person. Okay. Now this takes three minutes and maybe another three 'minutes, six minutes and you probably learn more in Client: Developer: Client: Developer: 150 the ten minutes you talk about it. 193. You know maybe they synthesize what you talked about and say okay now. 194. Here anybody want to volunteer. 195. No. 196. Any non-chickens in the crowd, and you comment on this. 197. And you say well here will be my thing. 198. I don't think its valid. 199. For this reason. 200. You guys agree with me. 201. And how does that compare with yours. 202. And you go to your next point and so on. 203. Zingo, Well? My last lecture in the term on genetics. I come in and I write you know after they have a foundation and understand- ing of chemicals that give rise to, the genes that give rise to chemicals called enzymes, that give rise to things that we see. I always come in to the last lecture and put on the board that I firmly believe that there are mental differences between Blacks and whites. Am.I a bigot is a question I use. And then I just let the lecture go on and so on. And after 40 minutes of violent accusa- tions and defense on my part, it goes back and forth. Some- one will ask the question.well what do you mean by differ- ences. You see. And little by little, and is this in the vein that you are trying to say. 204. Yes, definitely. Because I put it up there and they say you don't really be- lieve that. They look and automatically think that black is below the white and its no good and their stupider. The fact is they are so programmed ot that. I never forget the one time I had this one black girl come up to me and scream at the top you're a damn bigot. I had her all term and she was very militant. And I simply accepted it and listened to it and then someone enlightened her well that he isn't for this. But after 45 minutes the black girl said what do you mean by difference? You know it just blossomed up to a flower. 205. Great. 9n‘ Croat- Client: Developer: Client: Developer: Client: 151 But damn you know I have trouble where do I stop talking about information so that they haveenough knowledge to intelligently talk in the way that you are talking about to discuss the pros and ons of an issue. You know there is so blasted much to teach these kids. That the luxury isn't there of time to okay let's talk about the issues of society because you have to teach them what a cell is. Some of them don't know where their kidneys are. You know all these things. You know I wonder what the hell they're doing in high school sometimes. So that's a problem. 207. 208. 209. 210. That 211. 212. 213. 214. Okay Okay. Very good because you see what you are saying to me then is okay you want to teach these principles you want to teach them to think and to use this stuff in the real world but there is a certain amount of con- tent. I hope I am getting this right. You have got to figure out where the hell do you draw the line on how much information. they have coming to them. Oh that's very good because out of all of that content you choose, then you have to make a decision how is that to be delivered. See do you deliver it written, in lecture do you have to repeat, do you have to provide activities on these things. Okay. What you are saying to me it doesn't sound to me from what you said earlier that you need immediate improve- ment or anything like that in your presentation mode in your lecture mode or anything like that. 
Client: I am exploring.

Developer: 215. But it sounds to me like where you feel, this is by reading what you are saying. 216. You feel like, at least this is what I am reading. 217. What you need is some technique, some procedures whereby you can figure out content and appropriate amount of content, how much content, and what activities would lead to these high level objectives that you really have in mind. 218. Now given these high level objectives, what activities do I do here and how does that jibe with the amount of information in this. 219. Now that's a design. 220. That is not a delivery.

Client: See, cause one of the things I do is I show them, you know, Augenstein who was killed here, the biophysicist. He had the motion picture and he would talk about mongolian idiots and so he would show the chromosomal reasons why and have some understanding why you get genetic. Then I talk about abortion. And I talk about, or should these individuals not be allowed to be born. If you were a mother. You know he had a book called, say, "Let's Play God" and there were some questions in there that I usually like to give these kids and they had answered before. But after they learned of this they looked back and re-answered the thing.

APPENDIX E

DIRECTIONS FOR CODERS AND CODING FORM

DIRECTIONS FOR CODERS

General Description of Task

The task will be to read 13 transcriptions of audio tapes recorded from interactions between instructional developers and clients, who are faculty members at a university. The general purpose of the study you are assisting with is to describe what the instructional developer does when he/she interacts with a new client the very first time. When you read these transcriptions, you will be concentrating mainly on what the developer is saying. However, what the client is saying can't be ignored, because the developer is responding to what the client is saying. Only the statements of the developer will be coded.

Specific Tasks

Your specific task will be to assign a code to each and every utterance that the developer states. The code will be explained to you, and you will have practice using it and will have ample opportunity to ask questions so that you are comfortable with it before you start. You will also get definitions of all the coding words to use when you code the transcriptions. You will be able to use these directions for assistance.

The Setting for Each Transcription

An instructional developer is meeting face to face with a new client (a faculty member) for the first time. The general task of the developer is to help the client with some kind of instructional problem or situation the client has.

The Coding Categories

You are being asked to read the transcriptions and "label" each utterance the developer states with the code of two or three of the following categories:

A. Phases of Behavior. After looking at the transcriptions, it was decided that most developers went, basically, through 5 phases or stages as they worked with the client. Every utterance should fall into one of these phases:

1. Introductory Phase: This is where the developer is getting background information on the client, making initial small talk (i.e., weather, baseball, etc.).

2. Problem Identification Phase: This is when the developer is trying to pinpoint the problem the client is faced with. The client may not even know the real problem.

3. Solution Discussion Phase: This is when the developer discusses with the client possible solutions to the problem.
Included also would be any utterances related to the suggested solutions.

4. Termination or Closing Phase: This is when the developer draws the meeting to a close in a formal way by making statements which communicate to the client that the meeting is nearing the end or ending.

The second general category in the content analysis system deals with Substantive and Instructional Design Components. If an utterance has been coded substantive, it can be only one of these. Not all utterances can be coded one of these. If an utterance has not been coded Substantive or Instructional Development, code it "None". The categories are:

5. Substantive: All those utterances which are concerned with the content or body of knowledge the client's problem may be centered around (i.e., botany, sociology, etc.). Not all utterances can be coded as substantive.

6. Objectives: This would be any utterance in reference to the objectives of instruction.

7. Methods: This would be any statement which would be concerned with method (a strategy) or how to teach. It could be very general or very specific.

8. Media: This would be concerned with any discussion of instructional materials (film, TV, slides, overhead transparencies, etc.).

9. Evaluation: Any statement about how to evaluate students, the course, etc., would fall into this category (i.e., tests).

10. None: Any statement that cannot be coded Substantive or Instructional Design.

The third general category in the content analysis system is concerned with the Communication Behavior functions of the developers. This is classifying in a specific way what the developer is saying with each statement to the client. The specific categories are:

BEHAVIOR FUNCTIONS DEFINED

11. Reinforcing: Those utterances which verbally reward or praise the client for something he has said or done (i.e., right, very good, good, good point, etc.).

12. Soliciting Information: Utterances which seek information of any kind from the client.

13. Prompting: A statement which cues the client toward some intended behavior. Careful attention should be paid to this category (i.e., what may appear to be a question is really attempting to prompt the client to discover a point, i.e., "Why do you think students behave that way?").
Reviewing one part of what has happened; looking backward. 20. Declarative: Statements in which the developer declares intent to as something for the client. ("I'll.get a copy of that book and send it to you".) 21. Dispensing: The giving of information about human and non- human resources which may help the client with his problem. This last category,.even though it is numbered with the communications, is really separate. 22. Nonsense: Any utterance which can't be read or understood 5y you the coder. If it just doesn't make sense, code it here. CATEGORIES FOR CONTENT ANALYSIS Category Phases Category_ Substantive and Instructional Design 1 Objectives 5 Substantive 2 Methods 6 Introductory phase 156 Category Phases Category 3 Media 7 4 Evaluation 8 Category ll 12 13 14 15 16 17 18 19 20 21 22 Substantive and Instructional Design Problem identification phase Solution discussion phase Termination or closing phase None or Omit Communication Behavior Functions Reinforcing Soliciting information Prompting Explaining Opining Assumption and expectation clarification Informalizing Structuring Summarizing Declarative Dispensing Nonsense How do you use these codes to code the sentences? Each sentence will be labeled in one of the phases. In addition, each sentence mgy_ be an instrugtional design component or a substantive statement, Agl_statements will also be labeled in one of the functional cate- gories. So all sentences will have at least two labels or codes and may have three. 157 CODING FORM 1 PAGE: 8 n O .1. t C n U V. r. O i m e B Design Developer: moau «wmsu mow Client: mCmu “NHCmEE: o>HuoumHuwa «usuuanum swamBDONCH cows m< as“ as u um uaoa om no om on~m>m «new: access: “you no o>aucmumpam umcfiauoe mac couuaflom ewes seasons u so cH UTTERANCES 12345678 9 0 1 l 1 2 1 3 1 I4 1 5 1 (D 1 7 l 8 1 9 1 0 2 1 2 2 2 3 2 I“ 2 5 2 PLACE TOTALS HERE: APPENDIX F CODER CASES \OCDVQU'l-th—J Developer Chaimmmmmmmmmmmmmmmmono-1.5.5.94:-aha-bummedwwwwNNNNNNNN—a—a—a-a-a—a—a 158 CODED CONFERENCES BY CODERS Client Gonna-Joncoco>>3>>oomoouo>>>>wooww>>>>mmoom>>>>woooow>>>>cooomm>>>> Coder 45LON-"43.00N-‘th-‘hWN—‘wa—‘th—‘th-‘hwN—‘th—‘th-J-bWN—‘wa-J-DWN Sentences 303 303 303 303 176 l76 176 I76 232 232 232 232 l29 129 129 129 3l4 314 314 314 201 ZOl 201 201 324 324 324 324 248 248 248 248 185 185 l85 185 191 19l l9l 191 220 220 220 220 413 413 413 413 4l7 417 417 4l7 13,412 = 3,355 # of Client Sentences 159 Phases ‘ Introductory That time early in the discussion when the Phase -- developer was getting personal background in- formation on the client, making small talk, breaking the ice and, in general, putting the client at ease. Problem Identifica- The identification and clarification of the tion Phase -- client's problem. Alternative Solu- Discussing the advantages and disadvantages tion(s) Discussion of any particular solution. Phase -- Solution Implemen- Discussing potential strategies for implement- tation Phase -- ing suggested solutions. Termination and Included all those utterances which began to Closing Phase -- draw the conference to a close. Content Categories Objectives -- All utterances which were primarily concerned with objectives of the instruction under dis- cussion. Methods -- All utterances related to instructional stra- tegy or method. Media -- Any utterance concerned with instructional materials. Substantive -- All those utterances related to the body of knowledge of the client. 
Process Categories

Reinforcing -- Those statements which verbally reward the client for something he has said or done (i.e., right, very good, good, good point, etc.).

Soliciting Information -- Statements which seek information from the client; any statement which seeks any information from the client.

Soliciting Agreement -- Statements which seek agreement or endorsement from the client on a particular issue or point (i.e., Do you agree?, Right?, etc.).

Prompting -- A statement which cues the client toward some intended behavior. Careful attention should be paid to this category (i.e., what may appear to be a question is really attempting to prompt the client to discover a point; Why do you think students behave that way?).

Defining -- Statements which define a specific concept, principle or term.

Explaining -- Statements which explain to the client issues, methods or applications of concepts or principles.

Opining -- The developer giving a personal point of view or opinion. Careful attention should be given so as to discriminate this from explaining (example: I think that in higher education today we don't structure learning enough.).

Assumption and Expectation Clarification -- Statements which clarify roles, responsibilities and expected results of the relationship between the developer and the client.

Informalizing -- Statements which promote an open, informal talking environment (i.e., "break the ice" statements: Isn't old Joe Blow in your department?, Want a cup of coffee?).

Structuring -- Statements which establish or re-establish the process of the working relationship (Let's first take a look at your problem and then come up with possible solutions.).

Summarizing -- A review of any particular steps in the structuring; statements which review progress in the discussion. Be careful that the developer is not structuring the process.

Declarative -- Statements in which the developer declares intent to do something. May come toward the end in the termination phase, but not necessarily (i.e., I'll get a copy of that book and send it to you.).

BIBLIOGRAPHY

1. Arbuckle, Dugald S. Counseling: Philosophy, Theory and Practice. Boston, Mass.: Allyn and Bacon, Inc. 1970.

2. Banathy, B.H. Instructional Systems. Palo Alto, Calif.: Fearon Publishers. 1968.

3. Barrows, H.S. and Bennett, K. "The Diagnostic Problem Solving Skill of Neurologists." Archives of Neurology. 1972.

4. Bellack, Arno A. and others. The Language of the Classroom. New York, New York: David McKay Company, Inc.

5. Berelson, Bernard. Content Analysis in Communication Research. Glencoe, Ill.: The Free Press Publishers. 1952.

6. Block, Jack. The Q-Sort Method in Personality Assessment and Psychiatric Research. Springfield, Ill.: Charles C. Thomas, Publisher. 1961.

7. Borg, Walter R. and Gall, Meredith D. Educational Research: An Introduction. New York, New York: David McKay Company, Inc.

8. Buchheimer, Arnold and Balogh, Sara C. The Counseling Relationship. Chicago, Ill.: Science Research Associates, Inc. 1961.

9. Budd, Richard W. and others. Content Analyses of Communications. New York, New York: The Macmillan Company. 1967.

10. Davis, Robert H., Alexander, Lawrence T., and Yelon, Stephen L. Learning Systems Design. New York, New York: McGraw-Hill Book Company. 1974.

11. Davies, Ivon R. "The Management of an Instructional Development Client, Evaluation-Client Relationship: Some Aspects of a Theory of Advice." Instructional Science. Fall, 1974.

12. Diamond, Robert M.
Instructional Development for Individualized Learning in Higher Education. Educational Technology Publications. 1975.

13. Elstein, A.S., Kagan, N., Shulman, L.S., Jason, H., and Loupe, M.J. "Methods and Theory in the Study of Medical Inquiry." Journal of Medical Education. 1972.

14. Gerlach, V.S. and Ely, D.P. Teaching and Media: A Systematic Approach. Englewood Cliffs, N.J.: Prentice-Hall. 1971.

15. Gottschalk, Louis A. and Gleser, Goldine C. The Measurement of Psychological States Through the Content Analysis of Verbal Behavior. Berkeley and Los Angeles, Calif.: University of California Press. 1969.

16. Gustafson, Kent. "Toward a Definition of Instructional Development: A Systems Approach." Paper presented to the Instructional Development Division at the 1971 AECT Convention, Philadelphia, Pennsylvania.

17. Hamerus, D.G. "The Systems Approach to Instructional Development." The Contributions of Behavioral Science to Instructional Technology. Monmouth: Teaching Research Division of Oregon State System of Higher Education. 1968.

18. Harvey, A.M. and Bordley, J. Differential Diagnosis. Philadelphia: Saunders. 1970.

19. Havelock, Ronald G. The Change Agent Guide to Innovation in Education. Educational Technology Publications. 1973.

20. Hoban, John Dennis. A Study to Determine the Characteristics of Instructional Developers. Unpublished Doctoral Dissertation. 1973.

21. Holsclaw, J.E. The Development of Procedural Guidelines for the Systematic Design of Instruction Within Higher Education.

22. Holsti, Ole R. Content Analysis for the Social Sciences and Humanities. Reading, Mass.: Addison-Wesley Publishing Company. 1969.

23. Kaufman, R.A. Educational Systems Planning. Englewood Cliffs, N.J.: Prentice-Hall. 1972.

24. Kessel, F.A. "The Philosophy of Science as Proclaimed and Science as Practiced: Identity or Dualism." American Psychologist. 1964.

25. Kleinmuntz, B. "The Processing of Clinical Information by Man and Machine." In B. Kleinmuntz (Ed.), Formal Representation of Human Judgment. New York, New York: Wiley. 1968.

26. McGehearty, Joyce. "A Case for Consultation." ERIC. ED 023 120. No date.

27. National Special Media Institutes. Participant Manual, Prototype Specification Exercise -- Instructional Development Institute. U.S. Office of Education. 1972.

28. Neisser, U. Cognitive Psychology. New York: Appleton-Century-Crofts. 1967.

29. Newell, A. and Simon, H.A. Human Problem Solving. Englewood Cliffs, N.J.: Prentice-Hall. 1972.

30. Nierenberg, Gerard I. and Calero, Henry H. Meta-Talk. Pocket Books. 1975.

31. Nord, J. "Instructional Development: A Second Look." Unpublished paper, Michigan State University, East Lansing, Michigan. 1973.

32. Popham, W.J. and Baker, E.L. Systematic Instruction. Englewood Cliffs, N.J.: Prentice-Hall. 1970.

33. Rinoldi, H.J. Evaluation and Training of Clinical Diagnostic Skills. Psychometric Laboratory, Loyola University, Publication No. 41. 1963.

34. Rogers, E. and Shoemaker, F. Communication of Innovations. New York, New York: The Free Press. 1971.

35. Schein, Edgar. Process Consultation: Its Role in Organization Development. Addison-Wesley. 1969.

36. Schultz, S.D. A Differentiation of Several Forms of Hostility by Scales Empirically Constructed from Significant Items on the Minnesota Multiphasic Personality Inventory. Unpublished Doctoral Dissertation, Pennsylvania State College. 1954.

37. Shulman, L.S. and Keislar, E.R. (Eds.) Learning by Discovery: A Critical Appraisal. Chicago, Ill.: Rand McNally. 1966.

38. Scott, W.A.
"Reliability of Content Analysis: The Case of Nominal Scale Coding." Public Opinion Quarterly. 1955. Silber, Kenneth H. "Organizational and Personnel Management Struc- tures Needed for the Successful Implementation of Instructional Development in Educational Institutions." An unpublished paper. 1973. Thiagarajan, Sivasailam. "SME + ID = (Choose One) Frustration, Confusion/Collaboration." NSPI Newsletter. March, 1973. Wintrobe, M.D., et al (Eds.). "Harrison's Principles of Internal Medicine." (6th Edition) New York, New York: McGraw-Hill. 1970. "I114111111111111'11S