INFORMATION AS A CONSTRUCT IN HUMAN COMMUNICATION SYSTEMS

Thesis for the Degree of Ph.D.
MICHIGAN STATE UNIVERSITY
CLYDE D. J. MORRIS
1968

This is to certify that the thesis entitled "Information as a Construct in Human Communication Systems," presented by Clyde D. J. Morris, has been accepted towards fulfillment of the requirements for the Ph.D. degree in Communication.

Major professor

Date: July 18

ABSTRACT

INFORMATION AS A CONSTRUCT IN HUMAN COMMUNICATION SYSTEMS

by Clyde D. J. Morris

In the past two decades, a new discipline has appeared that is devoted to the scientific study of human communication. As with any science, one of the discipline's main goals is theory construction. And one of the first tasks of theory construction is construct explication. One construct frequently used in communication research is "information." This paper describes ways the construct has been explicated during the last 20 years and suggests some areas for further explication of the construct.

Explication of a construct, for purposes of this paper, is taken to be establishing two linkages: a linkage to physical events (operational definition) and a linkage to other terms (constitutive definition). These linkages are examined in various areas of communication research such as applications of information theory, dissonance theory, mass communication, and diffusion of innovations.

After an analysis of how the term has been used in communication research, a matrix of usages for the term "information" is presented that shows the locus of the usage, the function, the relationship to uncertainty, and the advantages and disadvantages to the researcher of the particular usage. Usage number four is related to states of the communication system and is part of an approach to the study of human communication based on General Systems Theory.

Accepted by the faculty of the Department of Communication, College of Communication Arts, Michigan State University, in partial fulfillment of the requirements for the Doctor of Philosophy degree.

Director of Thesis

Guidance Committee: Chairman

INFORMATION AS A CONSTRUCT IN HUMAN COMMUNICATION SYSTEMS

By

Clyde D. J. Morris

A THESIS

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

Department of Communication

1968

ACKNOWLEDGMENTS

My warmest thanks go to Dr. Randall P. Harrison, my thesis advisor; and to Dr. Hideya Kumata, Dr. R. Vincent Farace, and Dr. John Vinsonhaler, the members of my guidance committee. My thanks also to two men whom I greatly admire: Dr. Miles Martin and Dr. Everett Rogers.

A special kind of thanks to Dr. David K. Berlo - a man who understands how to create an intellectual climate in which researchers and scholars feel free to work on problems they're interested in with people they like to work with.

So many colleagues have influenced my thinking that they cannot all be listed. But Jeffrey Katzer, David Beatty and Linda Davis have been particularly helpful to me in the formulation of this thesis.

And thanks to Gail Morris. Graduate programs are much harder on wives of graduate students than they are on graduate students. Gail was patient and understanding when she should have been, and not patient and understanding when she should not have been.
Two men who have greatly influenced my thinking should be thanked here: Mr. J. L. Thomas and Dr. John Paul.

My sincerest thanks to Mrs. Shirley Sherman, Mrs. Joyce Flea and Mrs. Donna Kokx for their help in preparing the several drafts of this thesis.

And a long overdue thanks to John Morris and Agnes Morris. I can think of several highly articulate and eloquently written books that have said things that my parents told me, in simpler language, when I was a boy.

TABLE OF CONTENTS

CHAPTER
I. INTRODUCTION: THE PROBLEM
II. SHANNON'S USE OF INFORMATION (I₁)
III. INFORMATION₂ AND INFORMATION₃ (ABSOLUTE AND DISTRIBUTED)
IV. INFORMATION₄ - GENERAL SYSTEMS THEORY APPROACH TO COMMUNICATION
V. CONCLUSIONS AND IMPLICATIONS FOR RESEARCH
BIBLIOGRAPHY

LIST OF TABLES

Table 1. Four Uses of "Information"

LIST OF FIGURES

Figure 1. Kinds of Information and Uncertainty

CHAPTER I

INTRODUCTION: THE PROBLEM

In the past two decades, a new discipline has appeared that is devoted to the scientific study of human communication. As with any science, one of the discipline's main goals is theory construction. And one of the first tasks of theory construction is construct explication. Berlo notes that "the creation, care and feeding of constructs is a fundamental task, and one which must precede most other scientific work" (Berlo, p. 2).

One construct frequently used in communication research is "information." This paper will describe ways the construct "information" has been explicated during the last 20 years and will suggest some areas for further explication of the construct.

The Problem of Construct Explication

For purposes of this paper, construct is used in Conant's sense of the word - a concept that has been created or appropriated for scientific purposes (Conant, p. 31). A construct is a word (a symbol) that the scientist can manipulate and for which he can provide meaning. To be useful to the scientist, the construct must be reliable; that is, when it is applied it must consistently include the same phenomena and exclude other phenomena. This suggests two things: 1. tightness or rigor at a logical, formal level, and 2. tightness in operationalization or measurement. In Berlo's terms, there are two linkages for the construct: the linkage of the term to physical events (the operational definition) and the linkage of the term to other terms (the constitutive definition) (Berlo, p. 2).

These problems of definition are of practical importance when we deal with the construct information because the term is used by both the scientist and the layman. To the layman, "fish" is practically anything that swims, including whales, eels, etc. But to an ichthyologist, it has a rigorous, specific meaning. And to the layman, "electricity" is a useful construct to describe whatever it is that makes the lights go on, but the electrical engineer needs a good deal more precision. To him, electricity is almost a meaningless construct unless it is specified in terms of the volts, amps, resistance, etc. in the circuit.

These few sentences are, of course, only a clarification of what is involved in construct explication for purposes of this paper. Rudner deals at some length with the problem of construct explication in the social sciences (Rudner, p. 19).

Information - Boundaries and Measures

As has been noted, to the layman "information" means a variety of vague things.
For example, he could mean information about new cars, or the information explosion, or the service the phone company provides. Sometimes the communication researcher uses the word like the layman; sometimes he uses it more precisely. This paper suggests that the researcher could be using the word more precisely still.

As it has been used in research, the word seems to shift focus and change the boundaries of what is in and what is out. Sometimes the boundary is drawn so that the "information" is in the external world (that is, in the stimuli). Sometimes the boundary is drawn so the information is in the person's head (what is informative is what will change what is already "known"). This problem of changing focus and boundary will be dealt with as a problem of locus.

On the measurement dimension, the operationalization of information differs. It is often at the nominal level of measurement (this is information and this isn't); sometimes it is at the ordinal level of measurement (200 words is defined to be more information than 100 words, but not twice as much). Sometimes it is difficult to measure information at all (the amount and kind of responses we get when we present a question to an information retrieval system). In this last case, the question of what is a relevant answer and what is not becomes important, as does the problem of how much of the relevant information has been retrieved and how much has not.

A Matrix for Examining the Multiple Usages of Information

Section V of this paper presents a table that illustrates four uses for the word "information": the locus of the usage, the function of the usage, how the usage relates to uncertainty, and some advantages and disadvantages of the particular usage of the construct when studying the process of human communication. The four usages are labeled I₁, I₂, I₃, and I₄.

I₁ is statistical information. The word is used here in Shannon's sense (Shannon, p. 32).

I₂ may be looked at as "pieces" of information. ("Pieces" is used to avoid confusion with Shannon's "bits.") The pieces are known collectively as the world's knowledge. Another way of conceptualizing this usage is to borrow Teilhard de Chardin's word noosphere - a word he used to describe the "thinking layer," or the total sphere of intellectual activity which he predicted would eventually cover the earth, much as the atmosphere does (Teilhard, p. 180). Still another way of conceptualizing I₂ is in Brillouin's term "absolute information" as opposed to "distributed information" (Brillouin, 1950, p. 595). Absolute information exists as soon as at least one person has it. There is more absolute information as a result of a new scientific discovery, a musician composing a symphony, an author writing a book - anything creative or imaginative which adds to the noosphere.

When the absolute information spreads to more than one person, we have distributed information, which is labeled I₃. This usage is close to the more normal dictionary sense of the word: an informing or being informed, a telling or being told of something, news, acquisition of facts, an answer to a question, etc.

The fourth usage, I₄, is proposed as concerning the state of the communication system.
Much of I₄ at the interpersonal level is received via nonverbal channels (smile, frown, gestures, etc.), which makes it difficult to articulate its meaning. Its functions are largely to: 1) define the relationship of the living systems to one another, 2) define the communication system's relationship to the environment, and 3) define the relationship of the actions of the system to the goals of the system.

The next four sections of this paper will deal with the four usages of information. Section V presents some conclusions and implications for further research.

CHAPTER II

SHANNON'S USE OF INFORMATION (I₁)

Shannon's concern is with the transmission of symbols within a specified system. As the symbols are sent from a source to a receiver, a certain amount of the receiver's uncertainty is reduced as the symbols are received. Shannon was looking for a measure of how much information is produced as the receiver receives the symbols. His measure is a measure of the freedom of choice in selection, which is related to how much uncertainty there is to begin with as the message is received. The measure he derived is

    H = -\sum_i p_i \log p_i

which states that the freedom of choice or uncertainty is a function of the number of alternatives and the probability of occurrence of these alternatives (Shannon, p. 50). Shannon was not concerned with meaning in the sense of what human value might be attached to the symbols being received. The symbols are not necessarily letters, as the next few examples attempt to illustrate.

Information as a Choice between "On" and "Off"

A stimulus from the environment can be a source of information if we have a priori knowledge that the stimulus can either be on or off. If we have a doorbell, it can be in either of two states (ringing or not ringing). The meaning of the ring is arbitrary - something which we assign ourselves. In Shannon's sense of information, there are two symbols in the repertoire that we as receivers have a priori knowledge of: ringing and no ringing. In general, we assign the meaning "someone is at the door" to the state of ringing and the meaning "no one is at the door" to the state of not ringing.

It could easily be arranged the other way. The bell could ring continuously when no one was there and, when someone pushed the button, the bell would stop ringing, which would tell us that someone was there. The latter case is the way we rig electronic eyes. The continuous beam tells us no one has crossed it. The cessation of the beam tells us someone is there. In each of the instances described, we have signs which are capable of being in more than one state (on or off).

Suppose that we get tired of answering the doorbell ourselves and we hire a butler. We'll call him Harry. If someone knocks (another on-off system) on the door or rings the bell, he responds by answering the door. But on Harry's day off his brother stands in for him. Alfred won't open the door until the caller rings the bell and knocks on the door. Our friends will ask us, "What's with those two?" We'll reply with a shrug and say, "I don't know, they're just different."
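To make "different" concrete before unpacking it, here is a minimal sketch (a hypothetical illustration added for this edition, not part of the original thesis) that treats each stimulus as a binary state and models Harry as responding to either input and Alfred as requiring both:

    # Hypothetical sketch: two binary stimuli, two ways of processing them.
    def harry_opens(bell_rings: bool, knock_sounds: bool) -> bool:
        # Harry opens the door if EITHER stimulus is "on" (logical OR).
        return bell_rings or knock_sounds

    def alfred_opens(bell_rings: bool, knock_sounds: bool) -> bool:
        # Alfred opens the door only when BOTH stimuli are "on" (logical AND).
        return bell_rings and knock_sounds

    # Same four input states, different responses:
    for bell in (False, True):
        for knock in (False, True):
            print(bell, knock, harry_opens(bell, knock), alfred_opens(bell, knock))

The same two on-off channels arrive at both butlers; only the rule that combines them differs.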
By different, what we're saying is that Harry and Alfred seem to process stimuli from the environment differently. All "objective" observers perceive two stimuli - a knock on the door and a ring of the bell. Most observers agree that the expected response to either stimulus is an opening of the door. Yet Alfred doesn't open the door until he hears both. Or maybe he doesn't "hear" both. Maybe his hearing response curve drops sharply at the frequency range of the ring of the bell. Or maybe he requires a ring and a knock as conditions for opening the door.

This discussion underscores that we know what we know by what we perceive through our senses. But we may not all have the same meaning for what we perceive. And even if we perceive the same stimuli as someone else, maybe we process incoming stimuli differently and respond to them differently. A stimulus isn't a stimulus unless there is more than one alternative on the incoming channel. It must be bell-no bell or high bell-low bell, light-no light or green light-red light, touch-no touch or hot touch-cold touch, smell of roses or smell of beer. If our receptors (our senses) perceive some change in the environment and we take these changes into account, then these changes become information to us.

Information as Freedom of Choice (Degrees of Freedom)

Information in this sense is a measure of the freedom of choice (in the statistical sense) when selecting a "message" from an available set. Many more symbols, to which more meanings could be assigned, could be packed into our doorbell code. A bachelor with five girl friends could agree in advance with each one on a unique ring. Mary has one, Betty has two, etc. Now when the bell rings, he knows which of the "on" states he hears, which tells him: 1. someone is there, and 2. who the someone is. The patterned input has a referent; the pattern now signals a unique young lady.

But a problem arises. Any stranger is likely to use one or two or so rings. And our friend may wish to know whether a friend or a stranger is at the door. So he puts in a conveniently located but secret second bell which has a different sound from the first. Now when he hears a bell he knows: 1. someone is there, 2. friend or stranger, and 3. which of the five girls it is.

If our friend has a desire for even more certainty as to the "meaning" of the bell when he hears it, he could assign every friend a unique ring. He may even want to discriminate between close friends and acquaintances. He could use more bells and more codes. But then he may find himself in the position, if he is a popular young man with many friends, of waiting for several minutes as his friends ring out their assigned codes. He could reduce his waiting time by putting in more bells with fewer codes per bell, but bell installation can become costly. He may ask himself if it wouldn't be cheaper and faster to install a closed-circuit TV system. He has figured intuitively that he can get more information per unit of time per unit of money on a TV channel than on many separate wire channels.
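The bachelor's intuition can be put in Shannon's terms: a channel that can assume N distinguishable states carries log₂ N bits per signal, so one high-capacity channel can substitute for many binary ones. A rough sketch (the numbers are illustrative assumptions, not figures from the thesis):

    import math

    def bits_per_signal(n_states: int) -> float:
        # A channel with n distinguishable states carries log2(n) bits per signal.
        return math.log2(n_states)

    print(bits_per_signal(2))   # one bell, ring / no ring: 1.0 bit
    print(bits_per_signal(6))   # six distinct ring codes: ~2.58 bits
    print(bits_per_signal(64))  # a channel with 64 states: 6.0 bits
    # Identifying one of 64 callers takes six separate binary bells,
    # or a single signal on a 64-state channel.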
This comparison of the relative information-carrying capacity (number of alternatives) of different channels of information transmission is the one faced for many years by engineers at Bell Telephone Laboratories. The man who devised a statistic to compare different channels was Claude Shannon. He didn't care what the "meaning" of the information being transmitted was. That would be left to the discretion of the person using the channel.

Shannon's Mathematical Theory of Communication (Information Theory)

Shannon, in his classic paper, points out that information, as he uses the term, is significant in that the actual message is one selected from a set of possible messages. His problem, as a communication engineer, is to design a system to operate for each possible selection, not just for the one which will actually be chosen, since this is unknown at the time of designing the system.

Information theory is not a theory as much as it is a measurement. It is a useful statistical technique for quantifying amount of information. Information in Shannon's sense, as we have noted, is a function of: 1. the number of alternatives, and 2. the probability of occurrence of these alternatives.

To clarify this point, let us consider an example. Imagine a dime on one square of a checkerboard. A checkerboard has 64 squares, counting red and black squares. Pretend you have your back to the board and a friend is standing there who will answer yes or no to any question you ask about the location of the dime. With one question, you could know for certain which half (left or right) of the board the dime is on. If you ask "Is it on the left half?" and it is on the left half, he will answer "yes." If it is not on the left half, he will answer "no." In either case, you know which half for sure with just one question. With another question, you could find out whether it is in the top or the bottom of that half. If you try this yourself with a dime and a board, you will see that with six questions you can locate the dime precisely. In other words, each yes or no answer reduces your uncertainty (remember you are the receiver) by one half. You start with 64 possibilities and, with one appropriately asked question, reduce these alternatives to 32. The next question reduces the alternatives to 16, and so on down to the last question, which reduces the alternatives to one - the one with the dime. This example is intended to illustrate information as a function of the number of alternatives; the more alternatives to begin with, the more information, i.e., the bigger the statistic.

Now to the probabilities. Imagine a school with 500 boys and 500 girls. You are to guess whether or not the first person you see coming through the door is a boy or a girl. You have a 50-50 chance, or a probability of 0.5. Now imagine a school with 999 boys and one girl. There are still two alternatives (boy or girl), but the probability of seeing a boy is 0.999 and the probability of seeing a girl is 0.001. While the number of alternatives remains the same (boy or girl), the appearance of a boy in the first case is more "informative" than a boy in the second case, since it is less probable (.50 vs. .999). In general, then, the more alternatives we have and the more equally probable these alternatives are, the more "information."
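Shannon's H makes the school example exact. A short computation (a sketch assuming base-2 logarithms, so that H is measured in bits):

    import math

    def entropy(probs):
        # Shannon's H = -sum(p_i * log2(p_i)); zero-probability terms are skipped.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))      # 500 boys, 500 girls: 1.0 bit
    print(entropy([0.999, 0.001]))  # 999 boys, 1 girl: ~0.011 bits
    # Two alternatives in both schools, but the lopsided split carries
    # almost no information; equal probabilities maximize H.

The checkerboard example fits the same formula: 64 equally probable squares give H = log₂ 64 = 6 bits, which is exactly the six yes-or-no questions needed.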
Each binary decision (yes or no) reduces the receiver's uncertainty by one half, given equal probabilities. It is the different probability of the alternatives that makes the game 20 Questions possible. Each question has a binary (yes or no) response. The total information available is 2^20 alternatives, or 1,048,576 if we express it in the decimal system. If we restricted the game to just people, this means that we can eliminate as possible answers just a little over a million people with 20 questions. But there are over 3 billion people on earth at present, and more than that if we count those who have lived and those who will live. The only reason we can, with 20 guesses, reduce the choices to Liz Taylor or Julius Caesar is that they are more probable selections than the third name from the top on page 33 of the New York telephone directory, or the centurion who was in charge of the palace guard on the night Caesar was killed. Shannon's theory, then, gives us a measurement for "information" specifically based on the number of alternatives and the probabilities of the alternatives.

Information and Uncertainty - Some Distinctions

This paper attempts to illustrate different kinds of information. These different kinds of information each reduce some uncertainty. But just as there seem to be different kinds of information, there seem to be different kinds of uncertainty, which it might be useful to distinguish. For some kinds of uncertainty, there are methods for having a priori knowledge of the range of uncertainty. If we roll a die, for example, we have some knowledge that the outcome will be one of six possibilities. In reading a message written in English, we have a priori knowledge of the letters that will be used and the words that will be used. But we have no way of calculating the probability that the nth person reading this paper will drink a glass of orange juice while he or she reads it. There appear to be limits for the different kinds of uncertainty.

Brillouin (1962a) presents a discussion of uncertainty in terms of scientific experiments which illustrates some of the different kinds of uncertainty. If we plot two variables against one another, for example x and y, and let a be the full range for x and b be the full range for y, then after a few measurements we conclude that x and y always fall within a certain shaded stripe (see Figure 1). This accounts for a certain empirical law and certain limits of error. Here P₀ = ab is the full range of variation of x and y, while P₁ is the area of the shaded region; the range of error runs from e₁ to e₂.

[Figure 1. Kinds of Information and Uncertainty: y plotted against x, with the experimental law drawn as a line inside a shaded stripe of area P₁ bounded by e₁ and e₂, all within the full rectangle P₀ = ab.]

He notes that a scientific law: 1. has a limited field of application, and 2. is correct "within possible errors." Without specifying both 1 and 2, the statement of the law is incomplete and valueless.

In summary, then, there are several levels of uncertainty. There is the uncertainty of measurement error, but this falls within a range that we can specify. There is the uncertainty of where the relationship will lie when measured but, before measuring, we can predict that it will fall in the range ab. Finally, there is the uncertainty of all other variables not covered by the area ab.
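Brillouin's own way of quantifying what the empirical law buys us (sketched here from his general formulation; the base-2 constant is a convenience added for this edition, not his notation) is to measure information by how far the law shrinks the space of possibilities:

    I = K \ln \frac{P_0}{P_1}, \qquad K = \frac{1}{\ln 2} \;\text{ if } I \text{ is to be expressed in bits,}

so the narrower the shaded stripe P₁ relative to the full range P₀ = ab, the more information the law carries; the error bounds e₁ and e₂ keep P₁ from ever shrinking to zero.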
If the system we are examining is a complex one, we may not know whether a and b are orthogonal, or what the relationship of a and b is to the rest of the variables we could use if we considered them separately.

Information as Uncertainty Reduction

We may now examine the point of view in the previous examples. It is the receiver's uncertainty which is being reduced, and the uncertainty is within specified bounds. The sender encodes a message using a set of symbols which is agreed upon as a code by the sender and the receiver. The receiver must have prior knowledge of the code in order to decode the symbols. The meaning of the symbols is arbitrary. Even in the game 20 Questions, the sender already knows the answer. It is the receiver who is reducing the uncertainty that he, the receiver, has.

A Note on Shannon and Weaver

As this chapter progresses, the examples shift from what has been labeled I₁ to what we have labeled I₃. This is, in part, due to a shift in emphasis from the writing of Shannon to the writing of Weaver. Shannon deals primarily with what has been labeled I₁. Weaver deals with information, but suggests an extension of the work into the semantic level and the behavioral level. These levels treat meaning - something Shannon does not do in his paper. Weaver is careful to make the distinction in his paper but, in the application of Shannon's work by other researchers, information (in Shannon's sense) and meaning are sometimes confused. This confusion will be discussed later in this section.

Uses of "Information" Similar to Shannon's

MacKay distinguishes two kinds of psychological information - the logon and the metron. He defines the logon as "a priori information" due to the logical structure of a set of alternatives or of an experiment; this is closely related to the concept of degrees of freedom. He describes the metron as information which is obtained from measurements we take. It is related to the concept of precision in the statistical sense of the word.

Garner uses MacKay's notions in an example of reaction time as a function of the modality of the stimulus. Let's say we have three stimuli: visual, auditory, and tactual. This limits the amount of information we can obtain. Three stimuli have more potential information to a receiver than one stimulus. This aspect of the problem provides logon content. We then measure reaction time for each of these stimuli. This measurement also provides information, but its amount is limited by the precision of the measurement, and we ordinarily determine several such reaction times in order to obtain some estimate of the precision.
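A loose numerical illustration of the two quantities (assumed for this edition; the three-modality structure comes from Garner's example above, but the reaction-time data are invented):

    import math
    import statistics

    # Logon content: the a priori structure of the experiment. Three stimulus
    # modalities (visual, auditory, tactual) fix the freedom of choice before
    # any measurement is taken.
    logon_bits = math.log2(3)

    # Metron content: information from measurement, limited by its precision.
    # Repeated reaction times (seconds, hypothetical) estimate the mean and
    # how precisely we know it.
    reaction_times = [0.210, 0.195, 0.205, 0.220, 0.200]
    mean_rt = statistics.mean(reaction_times)
    std_err = statistics.stdev(reaction_times) / math.sqrt(len(reaction_times))

    print(f"logon content: {logon_bits:.2f} bits")
    print(f"mean reaction time: {mean_rt:.3f} s +/- {std_err:.4f} s")

More repetitions shrink the standard error and so raise the metron content; adding a fourth modality would raise the logon content instead.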
[Table 1, "Four Uses of 'Information'," was printed sideways in the original and is not legible here. Per the abstract, it tabulates, for each of I₁-I₄, the locus of the usage, its function, its relation to uncertainty, and its advantages and disadvantages to the researcher.]

BIBLIOGRAPHY

Ackoff, R. L., "Toward a Behavioral Theory of Communication," Management Science, 1958, 4, 218-234.

Attneave, F., Applications of Information Theory to Psychology, New York: Holt, Rinehart and Winston, 1959.

Bar-Hillel, Y., "An Examination of Information Theory," Philosophy of Science, 1955, 22, 86-105.

Bateson, G., "Exchange of Information about Patterns of Human Behavior," in W. S. Fields and W. Abbott (Eds.), Information Storage and Neural Control, Springfield, Ill.: C. C. Thomas, 1963, pp. 173-186.

Beer, S., Cybernetics and Management, New York: Wiley, 1959.

Berlo, D. K., "The Process of Social Research," unpublished manuscript, Department of Communication, Michigan State University, 1967.

Berlyne, D. E., Conflict, Arousal and Curiosity, New York: McGraw-Hill, 1960.

Boulding, K. E., "General Systems Theory - The Skeleton of Science," Management Science, 1956, 2, 197-208.

Bradford, L. P., Gibb, J. R., and Benne, K. D., T-Group Theory and Laboratory Method, New York: Wiley, 1964.

Brehm, J. W., and Cohen, A. R., Explorations in Cognitive Dissonance, New York: Wiley, 1962.

Brillouin, L., "Empirical Laws and Physical Theories: The Respective Roles of Information and Imagination," in M. C. Yovits, G. T. Jacobi and G. D. Goldstein (Eds.), Self-Organizing Systems 1962, Washington, D.C.: Spartan Books, 1962a, pp. 231-242.

Brillouin, L., Science and Information Theory, New York: Academic Press, 1962b.

Brillouin, L., "Thermodynamics and Information Theory," American Scientist, 1950, 38, 594-599.

Broadbent, D. E., "Information Theory and Older Approaches in Psychology," Proceedings of the 15th International Congress of Psychology, Brussels, 1957, 111-115.

Bruner, J. S., The Process of Education, New York: Random House, 1960.

Cherry, C., On Human Communication, New York: Wiley, 1957.

Conant, J., Science and Common Sense, New Haven: Yale University Press, 1951.

Dervin, B., "A Critical Review of Research Relating Attitudes and Attitude Change to Prior Information, Information Gain, and Retention," unpublished paper, Department of Communication, Michigan State University, 1967.

Garner, W. R., Uncertainty and Structure as Psychological Concepts, New York: Wiley, 1962.

Hovland, C. I., Janis, I. L., and Kelley, H. H., Communication and Persuasion, New Haven: Yale University Press, 1953.

Katz, D., and Kahn, R. L., The Social Psychology of Organizations, New York: Wiley, 1966.

Kelly, G. A., The Psychology of Personal Constructs, New York: Norton, 1955.

MacKay, D. M., "Quantal Aspects of Scientific Information," Philosophical Magazine, 1950, 41, 289-311.

Mednick, S., Learning, Englewood Cliffs, N.J.: Prentice-Hall, 1964.

Miller, G. A., "The Magical Number Seven, Plus or Minus Two," Psychological Review, 1956, 63, 81-97.
Miller, G. A., Galanter, E., and Pribram, K. H., Plans and the Structure of Behavior, New York: Holt, Rinehart and Winston, 1960.

Miller, J. G., "Living Systems: Basic Concepts," Behavioral Science, 1965a, 10, 193-237.

Miller, J. G., "Living Systems: Cross-Level Hypotheses," Behavioral Science, 1965b, 10, 380-411.

Miller, J. G., "Living Systems: Structure and Process," Behavioral Science, 1965c, 10, 337-379.

Rogers, E. M., Diffusion of Innovations, New York: The Free Press, 1962.

Rudner, R. S., Philosophy of Social Science, Englewood Cliffs, N.J.: Prentice-Hall, 1966.

Shannon, C. E., and Weaver, W., The Mathematical Theory of Communication, Urbana: The University of Illinois Press, 1949.

Shibutani, T., Improvised News: A Sociological Study of Rumor, New York: Bobbs-Merrill, 1966.

Smith, M. B., Bruner, J. S., and White, R. W., Opinions and Personality, New York: Wiley, 1956.

Tannenbaum, P. H., and Greenberg, B. S., "Mass Communication," Annual Review of Psychology, 1968, 19, 351-386.

Teilhard de Chardin, P., The Phenomenon of Man, New York: Harper, 1961.

Thayer, L., Communication and Communication Systems, Homewood, Ill.: Irwin, 1968.

Watzlawick, P., Beavin, J. H., and Jackson, D. D., Pragmatics of Human Communication, New York: Norton, 1967.

Weaver, W. (See Shannon and Weaver.)

Wiener, N., Cybernetics, Cambridge, Mass.: The M.I.T. Press, 1948.