THE ADOPTION OF COMPUTERS AS AN INSTRUCTIONAL TOOL BY MICHIGAN HEAD START TEACHERS

BY

Cynthia Jeanne Bewick

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

Department of Family and Child Ecology

2000

ABSTRACT

THE ADOPTION OF COMPUTERS AS AN INSTRUCTIONAL TOOL BY MICHIGAN HEAD START TEACHERS

BY

Cynthia Jeanne Bewick

In order to succeed in an increasingly technological age, Head Start children must learn about and with computers. Researchers suggest that poor children do not have computer access at home. Head Start teachers can bridge the gap between home and public school success by using computers as instructional tools. However, Head Start policy makers and administrators make decisions regarding fiscal, physical, and human resources without data regarding the availability of computers and the ways teachers use them. Previous studies have not used Head Start teachers as their focus.

The investigator described computer resources in Head Start classrooms, how and to what extent teachers used computers for instructional purposes, and how teachers learned about computers. Contextual and personal variables related to computer use by teachers were also examined. The researcher collected data from 323 randomly selected Michigan Head Start classroom teachers, using a mailed survey instrument. Data analysis consisted of frequency counts, descriptive statistics, and one-way ANOVA.

Head Start teachers generally have computers available in their classrooms and use them. Many teachers use computers for instruction and instructional support, although some use them in a limited fashion. One out of three teachers does not integrate the computer center with other classroom activities and materials. Head Start teachers primarily learn about computers by "messing around" and through interaction with other people.
Teachers reported that five contextual variables (talking with other Head Start teachers, type of software, curriculum guidelines, training on computer operation, and program philosophy) and four personal variables (knowledge about computers, previous experiences with computers, comfort level with computers, and household income) were related to making their computer use with children significantly easier.

Copyright by
Cynthia Jeanne Bewick
2000

To my friends and colleagues at Tri-County Head Start. Your support made my accomplishment possible.

ACKNOWLEDGMENTS

I would like to express my sincere appreciation for everyone who has supported my work on this dissertation. My committee chair, Dr. Marjorie Kostelnik, provided patient guidance throughout the entire process. This continued even after she left Michigan State University. (I still wonder if her fax machine ever burst into flames.) Dr. Anne Soderman challenged me to think beyond my presumptions. Dr. June Youatt supplied insightful direction. Dr. Steve Yelon consistently shared his gift of clarity. I appreciate each of their contributions to my scholarship.

I would also like to thank Julie Scott at Western Michigan University, Dr. Mark Reckase, Dr. Ken Frank, and Dr. Tom Luster at Michigan State University for their statistical support and expertise, and Warren Buckleitner, editor of The Children's Software Revue, for software title recommendations. I will always treasure my camaraderie and support from other doctoral students in Family and Child Ecology at Michigan State, in particular Kara Gregory and Valerie Bellas. Warmest regards and thanks as well to Norma Bobbitt, Sally Hruska, and Suzanne Thouvenelle for words of wisdom throughout this journey.

I gratefully acknowledge the directors, education coordinators, and classroom teachers in Michigan Head Start programs. This study could not have happened without them.

I cannot begin to thank my friends and colleagues at Tri-County Head Start. They endured endless hours of rambling, edited my work, and gave me chocolate when things seemed bleak. Orion Flowers, Cathy Hosner, Connie Harrison, Mary Salman, and Vicki Sitar deserve special recognition as well as medals for tolerance and compassion. Bob Charters, Marcy Borchers, and Kim Hoffmann assured that things ran smoothly so children and families received quality services. Frances Rose understood how one balances work and school, Maryann Malberg entered data, and Jeannie Melville helped me retrieve more voice mail than any human being should handle. I am grateful to everyone in our program that supported my efforts. I promise never to do this again.

TABLE OF CONTENTS

LIST OF TABLES .............................. xi
LIST OF FIGURES ............................. xvi

CHAPTER ONE
INTRODUCTION
Statement of the problem ..................... 3
Need for the study ........................... 4
Purpose of the study ......................... 5
Research questions ........................... 5
Theoretical foundation ....................... 6
    Human Ecology theory ..................... 6
    Adoption of innovations .................. 7
Conceptual and operational definitions ....... 10
Summary ...................................... 12

CHAPTER TWO
LITERATURE REVIEW
Head Start ................................... 13
Computers .................................... 16
    Access for low income families ........... 16
    Early childhood teachers and computers ... 20
    Computers in Head Start classrooms ........ 28
Innovation adoption and levels of use frameworks .. 32
    General innovation adoption frameworks .... 34
    Computer and technology adoption or levels of use frameworks .. 43
Summary ....................................... 51

CHAPTER THREE
METHODS
Research design ............................... 54
Sampling ...................................... 55
    Population ................................ 55
    Sampling frame ............................ 55
    Sampling method ........................... 56
Instrumentation ............................... 57
    Instrument development .................... 57
    Pilot testing ............................. 59
    The relationship of the variables, research questions and instrument .. 60
    Computer use dependent variable ........... 67
    Validity .................................. 68
Data collection ............................... 69
    Follow-up strategies ...................... 71
Data analysis ................................. 72
Strengths ..................................... 76
Limitations ................................... 76
Summary ....................................... 78

CHAPTER FOUR
RESULTS
The sample .................................... 80
Question 1 .................................... 88
Question 2 .................................... 93
Question 3 .................................... 100
Question 4 .................................... 103
Question 5 .................................... 109
Summary ....................................... 115

CHAPTER FIVE
DISCUSSION
Question 1 .................................... 116
    Hardware components ....................... 117
    Operating status .......................... 118
    Software programs ......................... 119
    Software selection ........................ 120
Question 2 .................................... 120
    Computer use with preschoolers ............ 121
    Instructional support ..................... 121
    Instruction ............................... 122
    Curriculum integration .................... 125
Question 3 .................................... 126
    Reasons Head Start teachers learned about computers .. 127
    How Head Start teachers learned about computers ..... 128
Question 4 .................................... 129
Question 5 .................................... 132
Conclusions ................................... 134
Limitations ................................... 135
Implications for Head Start policy makers ..... 136
Implications for Head Start administrators .... 138
Implications for future research .............. 142
Personal observations ......................... 143
Ecological implications ....................... 146
Summary ....................................... 147

APPENDICES
APPENDIX A  Coin toss directions ............... 149
APPENDIX B  The Head Start Teacher Computer Use (HSTCU) Profile .. 151
APPENDIX C  University Committee on Research Involving Human Subjects approval .. 162
APPENDIX D  Head Start director letter ......... 164
APPENDIX E  Education coordinator letter ....... 165
APPENDIX F  Classroom teacher letter ........... 166
APPENDIX G  Postcard reminders ................. 168
APPENDIX H  Location names ..................... 173

REFERENCES .................................... 177
LIST OF TABLES

Table 1.  Computer use scale items ............................ 7
Table 2.  Age of the subjects ................................. 81
Table 3.  Education level of the subjects (recoded) ........... 82
Table 4.  Household income of the subjects .................... 82
Table 5.  Head Start parent status of the subjects ............ 83
Table 6.  Adopter category of the subjects .................... 83
Table 7.  Number of years subjects reported teaching preschool (recoded) .. 84
Table 8.  Number of years subjects reported working in Head Start (recoded) .. 84
Table 9.  Number of years subjects reported working as a Head Start classroom teacher (recoded) .. 85
Table 10. Age of Head Start children in classes taught by the subjects .. 85
Table 11. Number of Head Start children per session ........... 86
Table 12. Length of the Head Start classroom day .............. 86
Table 13. Number of days per week that Head Start classes meet .. 87
Table 14. Availability of computers subjects reported using with Head Start children .. 88
Table 15. Location of computers subjects can use with children .. 88
Table 16. Number of computers by location ..................... 89
Table 17. Hardware components as reported by subjects ......... 90
Table 18. Number of computers with all hardware components .... 91
Table 19. Frequency that computers work properly .............. 91
Table 20. Software programs teachers most like to use with children .. 92
Table 21. Who chooses children's software ..................... 93
Table 22. Computer use of the subjects ........................ 93
Table 23. The extent of computer use with preschoolers by the subjects .. 94
Table 24. Frequency of computer use to make instructional materials .. 95
Table 25. Frequency of computer use to find resources ......... 95
Table 26. Frequency of computer use to keep records ........... 95
Table 27. Frequency of computer use to email parents and professionals .. 96
Table 28. Frequency of computer use for teaching fine motor skills .. 97
Table 29. Frequency of computer use for teaching socio-emotional skills .. 97
Table 30. Frequency of computer use for teaching numeracy skills .. 97
Table 31. Frequency of computer use for teaching literacy skills .. 98
Table 32. Frequency that teachers set time limits for children's computer use .. 98
Table 33. Frequency that teachers choose software programs for children's use .. 98
Table 34. Frequency of computer use for teaching English or other languages .. 99
Table 35. Frequency that teachers write lesson plans for the computer center .. 99
Table 36. Frequency that classroom activities reflect concepts in computer programs .. 100
Table 37. Frequency that classroom materials reflect concepts in computer programs .. 100
Table 38. Subjects' interest in learning about computers ...... 101
Table 39. Subjects' reasons for learning about computers ...... 101
Table 40. How Head Start teachers learn about computers ....... 102
Table 41. Contextual variable - the effect of talking with other Head Start teachers on computer use .. 105
Table 42. Contextual variable - the effect of the type of software on computer use .. 105
Table 43. Contextual variable - the effect of curriculum guidelines on computer use .. 105
Table 44. Contextual variable - the effect of training on how to operate computers on computer use .. 105
Table 45. Contextual variable - the effect of program philosophy on computer use .. 106
Table 46. Contextual variable - the effect of the number of computers on computer use .. 106
Table 47. Contextual variable - the effect of the number of software programs on computer use .. 106
Table 48. Contextual variable - the effect of the amount of time in the daily schedule on computer use .. 106
Table 49. Contextual variable - the effect of the amount of classroom space on computer use .. 107
Table 50. Contextual variable - the effect of the number of electrical outlets in the classroom on computer use .. 107
Table 51. Contextual variable - the effect of training on using computers with young children on computer use .. 107
Table 52. Contextual variable - the effect of a computer technician on computer use .. 107
Table 53. Contextual variable - the effect of meeting Head Start requirements on computer use .. 108
Table 54. Contextual variable - the effect of administrators on computer use .. 108
Table 55. Mean difference scores of contextual variables identified as significant with one-way ANOVA .. 109
Table 56. Personal variable of Head Start teachers - the effect of teacher knowledge about computers on computer use .. 110
Table 57. Personal variable of Head Start teachers - the effect of teacher previous experience with computers on computer use .. 111
Table 58. Personal variable of Head Start teachers - the effect of teacher comfort level with computers on computer use .. 111
Table 59. Personal variable of Head Start teachers - the effect of household income on computer use .. 111
Table 60. Personal variable of Head Start teachers - the effect of teacher age on computer use .. 111
Table 61. Personal variable of Head Start teachers - the effect of education level on computer use .. 112
Table 62. Personal variable of Head Start teachers - the effect of preschool teaching experience on computer use .. 112
Table 63. Personal variable of Head Start teachers - the effect of Head Start classroom teacher experience on computer use .. 112
Table 64. Personal variable of Head Start teachers - the effect of adopter category on computer use .. 112
Table 65. Income variable: Mean difference scores ............. 114
Table 66. Personal variables of Head Start teachers: Mean difference scores .. 115

LIST OF FIGURES

Figure 1. Theoretical framework (Rogers, 1995, Lerner, 1984, 1986) .. 9
Figure 2. Hall and Loucks Levels of Use of the Innovation (1977) .. 40
Figure 3. Moersch's Levels of Technology Implementation (1995) .. 47
Figure 4. The relationship of the research questions, variables, and instrument .. 61
Figure 5. Operational map ..................................... 75
CHAPTER ONE
INTRODUCTION

Children must learn about and use computers as technology increases dramatically throughout society (NTIA, 1999, Dutton, Rogers & Jun, 1987, ISTE, 1998). Forty-two percent of American households have computers; four out of five households with an annual income of $75,000 or more have one. A growing segment of the United States population is experiencing a "digital divide," or lack of computer access, that is related to income, education, and race (NTIA, 1999). Poor children have a one in five chance of having a computer in their home (NTIA, 1999). A low-income Black child is three times less likely to have a computer at home than a comparable White child; White children are four times as likely to have computers as Hispanic children (NTIA, 1998). Poor children may also encounter lack of computer access at their local public school or early childhood program (National Center for Educational Statistics, 1999, Day & Yarbrough, 1998).

Head Start can be a societal equalizer by connecting poor children with computers in preparation for a technology-based future (Day & Yarbrough, 1998, Thouvenelle, Borunda, & McDowell, 1994, Taylor, 2000). Recent research indicates that most half-day early childhood programs serving middle and upper class families have a computer (Clements & Swaminathan, 1995). Little evidence is available, however, about early childhood programs serving low-income populations and the availability or use of computers with low-income children.

Head Start administrators and policy makers must allocate resources and influence teachers to assure children can learn with and about computers for their present and future academic success (DHHS-HDS, 1990, MOBIUS, 1990, Head Start Act, 1998, Barnett, 1995). Ten years after an IBM demonstration project produced several recommendations for computer use in Head Start classrooms, funding for computers in Head Start classrooms continues with federal and private monies (MOBIUS, 1990, DHHS-HDS, 1990, NHSA, 1999).

Teachers are the critical medium between administrators' initiatives and children. They assure appropriate computer implementation and preschool curriculum integration and promote "equitable access to technology for all children" (NAEYC, 1996, p.13). Despite this professional and ethical responsibility, only two teachers out of every ten are serious users of computers in classrooms, and nearly half never use their classroom computers (Cuban, 1999). Administrators play an important role by supporting teachers and making planning and allocation decisions regarding computer resources (MOBIUS, 1990, DHHS-HDS, 1990). However, their actions may be inefficient and possibly ineffective without empirical information. There is limited evidence regarding early childhood teachers and their integration of computer activities, computer education, and attitudes towards computers (Education TURNKEY Systems Inc., 1998, Pierce, 1994, Hohmann, 1994, Ainsa, 1992). Fewer than ten empirical studies have investigated teachers and their instructional use of computers in early childhood classrooms (Wood, Willoughby, & Specht, 1998, Bilton, 1996, Edyburn & Lartz, 1986, Landerholm, 1995, Fite, 1993). Virtually none of the studies address early childhood programs serving mostly low-income families.

Statement of the Problem

Head Start is moving into the technological age.
The Associate Commissioner of the Head Start Bureau declares, "Head Start programs must take advantage of available technologies and pass those advantages on to parents and children" (Taylor, 2000, p.1). A national Head Start newsletter asserts that Head Start programs must help teachers use technology as a tool through training and support as technological literacy becomes a national standard (Thouvenelle, 2000). Despite these calls to action, there is almost no information available regarding Head Start teachers and their instructional use of computers in Head Start classrooms. The researcher did not find instruments to measure variables regarding computer use by the Head Start teacher population.

Need for the Study

Head Start administrators must have information regarding the availability and type of computer resources, how and to what extent Head Start teachers use computers as an instructional tool, how Head Start teachers learn about computers, and why computer use varies, so they can make effective planning and development decisions and influence teacher use. It is critical that administrators and policy makers base their decisions upon data or their efforts may have little impact on Head Start teachers and programs. Managers are making choices today without evidence. This study's findings will provide information that Head Start administrators, researchers, and policy makers can use to make decisions about future allocation of physical, fiscal, and staff development resources within local Head Start agencies.

Purpose of the Study

The purpose of this study is to describe existing computer resources in Head Start classrooms, how and to what extent teachers use computers, how teachers learn about computers, and what contextual and personal variables of Head Start teachers affect computer use.

Research Questions

The limited availability of current research prompted the five questions presented below. The researcher will formulate hypotheses for Questions 4 and 5 regarding the effects of contextual and personal variables on computer use. They will be stated in the null and alternate forms, such as: Ho: There are no contextual variables that affect computer use; Ha: There is at least one contextual variable that affects computer use.

Question 1. What computer hardware components and software are in Head Start classrooms?
Question 2. How and to what extent do Head Start teachers use computers as an instructional tool?
Question 3. How do Head Start teachers learn about computers?
Question 4. What contextual variables affect computer use by Head Start teachers?
Question 5. What personal variables of Head Start teachers affect computer use?

Theoretical Foundation

Two organismic theories provided a framework for this study. Human ecology theory suggested the dynamic and reciprocal interaction between teachers and computers within the classroom microsystem. Adoption of innovations theory proposed characteristics of the person, innovation, and context that could be used as possible variables.

Human Ecology Theory

Human ecological theorists (Bronfenbrenner, 1979, Lerner, 1984, 1986) examine the reciprocal interaction among organisms and their environments. Besides the family, preschool classrooms are one of the influential microsystems in children's lives with the potential to affect their social and cognitive development (Schweinhart, Barnes, & Weikart, 1993, Barnett, 1995, Bronfenbrenner, 1979).
The majority of Head Start children participate in classroom programs, a significant element of this study (Devaney, Ellwood, & Love, 1997). Lerner (1991) suggests the concept of developmental contextualism: all living things change as the result of "dynamic interaction" within their context. Contextual environments contain other living things, social institutions, and "features of physical ecology" (Lerner, 1984, 1986). This study looks at the immediate Head Start classroom network, within the broader Head Start program context. Teachers and computers (a feature of the physical ecology) are members of the classroom context. One can view them as dynamic elements engaged in reciprocal interactions. Teachers are influenced by their environment. These environmental factors can range from the amount of classroom space or time in the daily schedule to administrators and national Head Start requirements.

Adoption of Innovations

An innovation is something perceived as new, whether an idea or object (Rogers, 1995). Several theorists describe how people adopt and use various innovations, either individually or in groups (Hall & Loucks, 1977, Hord, Rutherford, Huling-Austin, & Hall, 1987, Rogers, 1995). Three other theorists conceptualize approaches about the adoption of computers and other forms of technology (Moersch, 1995, Cory, 1983, Coughlin & Lemke, 1999). None directly address early childhood teachers and their use of computers for instruction.

Rogers' Model (1995) outlines the relationship between the person, the innovation, and the surrounding environment as well as a normative sequence of innovation adoption. Rogers indicated that individual or system characteristics may be associated with innovation use. Interpersonal and mass media communication channels carry information about the innovation throughout the adoption process. Rogers also categorizes individuals by when they adopt an innovation.

Although some Head Start teachers may not perceive computers as an innovation or new practice, Rogers' Model (1995) seemed particularly relevant to the study. Rogers presented a way of looking at possible variables associated with Head Start teachers, computers as instructional tools, and the classroom/program context. Personal and contextual variables that may affect computer use were based upon those suggested by Rogers' prior conditions (the teacher, classroom, or local Head Start program), perceived characteristics of the innovation (the computer), or characteristics of the decision-making unit (the teacher). Figure 1 displays the theoretical framework for the study.

Figure 1. Theoretical framework (Rogers, 1995, Lerner, 1984, 1986). [Diagram relating contextual variables of the Head Start classroom/program and of computers to personal variables of Head Start teachers and of computer education.]

Conceptual and Operational Definitions

Readers need a clear understanding of the concepts in this study. Listed below are conceptual and operational definitions to help the reader.

Adoption of Computers as an Instructional Tool

Conceptual definition: Adoption is commonly known as the use or implementation of a new idea, activity or practice. The extent of adoption may vary among people.

Operational definition: The adoption of computers by Head Start classroom teachers was measured by a researcher-developed survey instrument, The Head Start Teacher Computer Use Profile.
The researcher described four types of adoption: instructional support, instruction, curriculum integration, and computer use with preschoolers. Figure 4 contains additional operational information regarding these constructs.

Computer

Conceptual definition: A computer is an electronic machine which, by means of stored instructions and information, performs rapid, often complex calculations or compiles, correlates, and selects data (Webster's, 1999).

Operational definition: A computer was defined as two separate yet related categories: hardware components (the actual computing device, monitor, keyboard, mouse and other input mechanisms, speakers, printer, and modem) and software (the internal instructions and information, commonly known as programs, which allow actions to be electronically executed). A researcher-developed survey instrument, The Head Start Teacher Computer Use Profile (HSTCU Profile), measured computers.

Classroom

Conceptual definition: A room in a school in which groups of students are taught (Webster's, 1999).

Operational definition: A classroom was defined as the physical location where the majority of Head Start teacher/child interaction and activity occur. (Computer labs therefore generally did not meet this definition.) The physical location may be within a variety of buildings, e.g., schools, churches, former office buildings, or store fronts.

Head Start

Conceptual definition: A federally funded, child-focused program with the overall goal of increasing the social competence and school readiness of young children in low-income families. Social competence addresses "the child's everyday effectiveness in dealing with both his or her present day environment and later responsibilities in school and life" (DHHS, 1996, p.1).

Operational definition: Head Start grantee and delegate (a local public or private non-profit agency to which a grantee has delegated all or part of its responsibility for operating a Head Start program) programs (DHHS, 1996) were selected from the Michigan Head Start Program Directory 2000, compiled by the Michigan Head Start Association. Each Michigan grantee and delegate agency was contacted for inclusion in the study sample, with the exception of the researcher's own agency. Early Head Start (services to children 0-3) and Home Based Head Start (provided primarily in Head Start family homes) were not part of the study.

Teacher

Conceptual definition: A teacher is an individual who shares knowledge or skills with a learner or group of students (Webster's, 1999). Teachers are adult individuals who have the primary responsibility for implementing curriculum within Head Start programs. Some Head Start programs label all adults in the classroom as teachers regardless of their role as a lead teacher, paraprofessional, or volunteer.

Operational definition: The local Education Coordinator from Michigan Head Start grantee or delegate agencies randomly selected lead or head classroom teachers who teach preschool children. The Head Start teacher was the study's unit of analysis.

Preschool children

Conceptual definition: Preschool children are commonly known as children who are not age eligible to attend first grade. They may range from birth through six years old.

Operational definition: Preschool children were defined as those who are three to five years old and not attending first grade.
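The four adoption constructs named above (instructional support, instruction, curriculum integration, and computer use with preschoolers) can be pictured as subscale scores built from groups of survey items. The sketch below is illustrative only; it is not the HSTCU Profile, and the item names and the 0-4 frequency scale are assumptions introduced here to show one plausible scoring scheme.

```python
# Hypothetical illustration only -- not the HSTCU Profile itself.
# Assumes each survey item is rated on a 0-4 frequency scale
# (0 = never ... 4 = very often); item names are invented.
from statistics import mean

CONSTRUCTS = {
    "instructional_support": ["make_materials", "find_resources", "keep_records", "email_contacts"],
    "instruction": ["fine_motor", "socio_emotional", "numeracy", "literacy"],
    "curriculum_integration": ["activities_reflect_software", "materials_reflect_software", "lesson_plans"],
    "use_with_preschoolers": ["children_use_computer"],
}

def score_profile(responses):
    """Average the item ratings within each adoption construct."""
    return {name: mean(responses[item] for item in items)
            for name, items in CONSTRUCTS.items()}

# A teacher who answers "sometimes" (2) to every item scores 2 on each construct.
example = {item: 2 for items in CONSTRUCTS.values() for item in items}
print(score_profile(example))
```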
Summary

This chapter contained the statement of the problem, need for the study, the purpose of the study, the theoretical foundation using human ecological theory and Rogers' Model of Stages in the Innovation Decision Process (1995), and the conceptual and operational definitions. Chapter Two includes a review of the literature related to this study in three sections: 1) Head Start overview and purpose, 2) computer issues (access for low-income families, early childhood teachers and computers, computers in Head Start classrooms), and 3) frameworks for innovation adoption and levels of use. Chapter Three will describe methodology, including research design, sampling, instrumentation, data collection, data analysis, strengths, and limitations. Chapter Four contains the results for each research question and the hypotheses. Finally, Chapter Five discusses the results, implications for Head Start policy makers and administrators, and suggestions for future research.

CHAPTER TWO
LITERATURE REVIEW

The purpose of the literature review is to clarify the relationship of this study with previous research efforts. This chapter has three sections. The first part provides an overview of the Head Start program. The second section addresses three areas related to computers: 1) access for low-income families, 2) early childhood teachers and computers, and 3) computers in Head Start classrooms. The third portion briefly describes various frameworks of innovation adoption and levels of use. The dynamic interaction between computers and teachers within the immediate Head Start classroom network and the broader Head Start program context provided the foundation for this study. The three sections of the literature review integrate computers with contextual factors and personal characteristics of Head Start teachers.

Head Start

Head Start, a federally funded preschool program, began in 1965 as part of President Johnson's War on Poverty initiative. The program design provided low-income preschool children with a broad range of comprehensive services (education, medical, dental, nutrition, mental health, and social services) with social competence as the primary outcome. Parents became integral partners who make planning, budgeting, and staffing decisions in the local program (Zigler and Muenchow, 1992).

Congress establishes annual Federal Poverty Guidelines that determine eligibility for Head Start participants. The maximum income for a family of four is $17,050; the rate decreases or increases according to the size of the family unit (DHHS, 1996, Federal Register, 2000). During Fiscal Year 1998, Head Start nationally served 822,316 children at a cost of $4.23 billion (Head Start Bureau, 1999). Michigan Head Start programs provided services to 33,316 children, mostly three to five year olds in classroom programs, for $162.3 million during the same time period (Head Start Bureau, 1999). Seventy-two percent of families had annual incomes less than $12,000 during the 1997-98 operating period (Head Start Bureau, 1999). Only fifty percent of children eligible for Head Start actually receive the program (Children's Defense Fund, 2000).

Increased public and legislative pressures for accountability and the publication of the Advisory Committee's Report on Head Start Quality and Expansion produced recent changes in Head Start (GAO, 1998, DHHS, 1993).
Most significantly, the revised Head Start Act (1998) mandates the implementation of twenty-four performance measures and changes the program's primary mission from social competence to school readiness (DHHS, 2000). Head Start adopted a "whole child" perspective of school readiness from the National Educational Goals panel recommendation that addresses five developmental domains: 1) physical and motor development, 2) social and emotional development, 3) approaches to learning, 4) language use and emerging literacy, and 5) cognition and general knowledge (Research, Demonstration and Evaluation Branch & Head Start Bureau, 1998). Head Start grantee and delegate programs must implement educational activities based upon results-based performance measures and assure program effectiveness (DHHS, 2000). Federal review teams monitor educational progress by evaluating local outcome systems and supporting documentation (Head Start Act, 1998). Grantee and delegate programs prepare children for school success via locally designed plans that address readiness domains (DHHS, 2000). A national Head Start curriculum does not exist. Programs choose their own curricular model, based upon developmentally appropriate principles, that consists of written goals, objectives, activities, and materials (DHHS, 1996).

Competent teachers are an important factor in implementation of quality curriculum. A 1998 Head Start initiative mandates that fifty percent of Head Start classroom teachers must have an Associate or Bachelor's degree with a major in Early Childhood Education by September 2003 (Head Start Act, 1998). This legislation exceeds the current minimum qualification of a Child Development Associate credential (DHHS, 1996). The Head Start Bureau gives local agencies funding for the new initiative based upon the number of classroom teachers who do not have qualifying degrees (DHHS, 1999). Payment of courses applicable for qualifying degrees, increases in teacher compensation, and additional training are allowable uses for these funds (DHHS, 1999).

Low-income children remain the focus of Head Start with a revised mission of school readiness, complemented by local agency curriculum choice and increased teacher qualifications and training. Fiscal resources support these efforts. Computers are considered an important aspect of school readiness as well as social competence (ISTE, 1998). The nature of computer access for low-income families, early childhood teachers and computers, and computers in Head Start will be discussed next.

Computers

Access for low-income families

The National Telecommunications and Information Administration (NTIA), an agency of the U.S. Department of Commerce, began collecting data in 1994 regarding American household accessibility to telephones, computers, and the Internet (NTIA, 1998). Their most recent report contends that:

    The "digital divide" - the divide between those with access to new technologies and those without - is now one of America's leading economic and civil rights issues. ...Overall, we have found that the numbers of Americans connected to the nation's information infrastructure is soaring. Nevertheless, this year's report finds that a digital divide still exists, and in many cases, is actually widening over time. Minorities, low-income persons, the less educated, and children of single parent households, particularly when they reside in rural areas or central cities, are among the groups that lack access to information resources. (NTIA, 1999, p. xiii)
Furthermore, "the gap for computer access has generally grown larger by categories of education, income, and race" within the last five years (NTIA, 1999, p.2).

Throughout the United States, NTIA (1999) reports that forty-two percent of all households have a computer. Someone has a Bachelor's degree in nearly sixty-eight percent of computer households, compared to thirty-one percent with a diploma or sixteen percent with some high school education. Computers are in eighty percent of homes with annual incomes of $75,000 or more; only sixteen percent of homes with annual incomes less than $20,000 have computers. Married couples with children under eighteen are nearly twice as likely (61.8% vs. 31.7%) to have a home computer as female single parents with children under eighteen (NTIA, 1999). A low-income White child is three times more likely to have a computer at home than a comparable Black child and four times as likely as a Hispanic child (NTIA, 1998).

There is limited criticism of NTIA's conclusions. Parish (1997), citing Quantum Electronic Database Services, notes that data were not correlated with household income and other socioeconomic factors. She suggests that when this is done, both Black and White households of similar income groups have similar computer ownership patterns. Parish does not examine other family characteristics.

More frequently, NTIA findings are used to adopt new policy and program directions. Four hundred private companies and non-profit organizations recently signed a "National Call to Action" as President Clinton outlined two national goals: 1) to provide 21st-century learning tools for every child in every school and 2) to create digital opportunity for every American family and community (The White House, April 4, 2000). Previous efforts also have focused on computers and the United States low-income population. The Apple Classrooms of Tomorrow program has a long-standing tradition of placing computers in schools that have limited access due to their economic environment (Fisher, Dwyer, & Yocam, 1996). The Gates Library Foundation donated $200 million for U.S. public libraries to provide public access to computers with an emphasis on low-income communities (Chapman & Rhodes, 1997). Microsoft and Toshiba began their Learning with Laptops initiative in 1995, which subsidizes laptop purchases for students in low-income schools (Romano, 1998). Recently, Nike pledged $2.6 million for computers in Head Start classrooms for collaboration between Head Start teachers and parents (NHSA News, 1999).

Some suggest that early childhood programs become the catalysts to provide computer access to low-income children. Day & Yarbrough (1998) report the social implications of quality preschool programs, especially for those in poverty, as children prepare for entrance into technology-based work environments. Thouvenelle, Borunda, and McDowell (1994) describe the increasingly important role of technology in early childhood education as a societal equalizer for children from differing races, cultures, and income. Taylor cautions that if Head Start
Business and government have developed several computer initiatives to close this gap. Early childhood education programs can serve as a connection between home and public school. Head Start is one place where low-income children can learn about and with computers. Early childhood teachers and computers The effect of children’s computer use dominates the literature (Clements & Nastasi, 1993, Wright & Shade, 1994), yet, less than ten empirical studies investigate early childhood teachers and their use of classroom computers (Bilton, 1996, Wood, Willoughby & Specht, 1998, Landerholm, 1995, Fite, 1993, Edyburn & Lartz, 1986, Haugland, 1997, Education TURNKEY Systems, 1998). None use Head Start teachers as the unit of analysis. Only the TURNKEY survey (Education TURNKEY Systems, 1998) indicates that Head Start teachers are study participants. The literature to date suggests several 20 primary themes and possible variables. They include computer availability, hardware components, software, teacher computer training, skills/concepts taught with computers, and cost. The TURNKEY survey (Education TURNKEY Systems, 1998) provides the broadest evidence of teacher computer use in early childhood programs. The survey provided marketing information for prospective sales clients about technology product selection, purchase and use in current and future early childhood education programs. Participants were early childhood directors in school districts with enrollments of more than eight thousand children, Head Start programs serving more than one thousand students, and religiously based child care centers serving more than one hundred fifty children, geographically distributed across the United States. One hundred forty responses represented sixteen Head Start programs (two from Michigan). Critical TURNKEY findings indicate that the majority of early childhood teachers use computers to teach literacy (identifying shapes, alphabet recognition) and numeracy (identifying numbers). Head Start teachers reported that teaching children to use computers and promoting creativity were the most effective uses of technology. This position differs from other preschool and kindergarten teachers who 21 emphasized higher-order thinking skills and the promotion of collaboration and sharing. Twenty-one percent of the respondents indicated that new Head Start guidelines requiring literacy and numeracy skill development would increase technology use. A substantial majority stated additional staff development on classroom technology integration as the most significant reason for increased use. The TURNKEY findings provide possible variables of interest for the study. Teachers make professional judgements about the appropriateness of computers in the early childhood classroom (NAEYC, 1996). A professional position statement outlines guiding principles for technology use with young children (NAEYC, 1996). However, teachers may find themselves confronted with computers in their classroom and a lack of curriculum guidelines or training regarding their use (Landerholm, 1995). Therefore, some may bear the burden of inappropriate hardware components or software or not know how to integrate computers into their curriculum. Two studies conducted in Europe and Canada affirm this perspective. Bilton (1996) distributed a survey regarding the level of computer use in one English county's seventy-six nursery schools and classes. The majority of the programs reported 22 using computers with young children. 
Computer equipment was generally described as “out of date”. The programs cited cost as the primary reason for non-use. Teachers gained knowledge about computers through planned inservice programs, on the job training, specific computer courses, self-teaching and a combination of these methods. They also shared computer strategies with other teachers. Teachers indicated that these experiences did not provide them adequate time to experiment with effective classroom computer strategies. Bilton does not describe specific computer strategies that teachers use for instruction. A Canadian study (Wood, Willoughby & Specht, 1998) surveyed seventy-five preschool and day care directors regarding computer use with a twenty-eight question instrument. It operationally defines computer use as hardware components, software, and teacher education about computers rather than how teachers use the computer for instruction. Less than half of the respondents report having computers despite favorable attitudes towards their inclusion within the curriculum. These findings differ from others which assert that most preschool programs have computers (Clements & Swaminathan, 1995, Fite, 1993). All respondents indicate that their staff did not have “sufficient expertise or experience with computers to use 23 them effectively” (Wood, Willoughby, & Specht, 1998, p. 241) yet seventy percent of the programs did not provide staff computer training. Many programs did not have adequate hardware accessories to run most educational software for young children. One of the most commonly used software programs, Reader Rabbit, received the lowest rating by early childhood computer experts (Haugland & Shade, 1990). Landerholm (1995) reflects similar findings with a random sample of one hundred ten, public and private, preschool and kindergarten teachers in Chicago, who completed a thirty-four item written survey. She first analyzed the data with Cory's Levels of Implementation (1983) framework, described more fully later in this chapter, integrating several computer constructs related to instruction. Twenty nine percent of the teachers were at Stage 0, Not on the Bandwagon, because they did not have a computer in their classroom or computer lab for children’s use. Fifty one percent reported using computers with children, had computer access either in the classroom (31%) or in a computer lab (20%), received basic computer training and had basic software. They were placed at Stage 1, Getting On The Bandwagon. Fourteen percent of the teachers had computers at home, in the classroom and a 24 computer lab, and had more training and software. They composed Stage 2, Confusion. Only six percent of the teachers represented Stage 3, Pulling It All Together. They had a wider variety of software, computer guidelines for instruction, choice of software, and more training as well as the items characteristic of Stage 2 (Confusion) teachers. No teachers were listed at Stage 4, Full Implementation, that included paid personnel for computer staff development. The majority of Landerholm’s (1995) teachers learned how to use computers independently due to lack of training and organized planning. Sixty seven percent reported having some knowledge, training, or experience with computers. Landerholm uses an integrated framework to examine computer use for instruction. She does not address personal characteristics that may reflect differences in individual computer use. 
Over ten years ago, another Illinois study (Edyburn & Lartz, 1986) used a non-probability sample of kindergarten and special education teachers to identify their computer experiences and attitudes with a twenty-eight item telephone survey. Nearly half of the eighty-four respondents used computers with their children yet only twenty six percent said that computer curriculum guidelines were available. Teachers most frequently taught visual 25 concepts (shapes and colors), reading, and math, with the computer. Forty percent of teachers did not know the titles of the software programs they used. Although most had specific computer training, teachers reported that more training is needed for software selection and techniques for integrating the computer in the classroom. They also identified problems in scheduling computer use with students, coordinating computers with other classroom activities, and the need for more computers, space, and time. Edyburn and Lartz offer possible variables and skills taught by early childhood teachers with computers for inclusion in the study. Fite (1993) abstracted her findings, on the availability and use of computers with three to six year old children, from her larger study on children’s emerging literacy supported by technology. She conducted a literature review, distributed and analyzed a written questionnaire (283 Texas and Florida respondents at publication) and made site visits to Texas, New Mexico, .Arkansas, Colorado and Mexico schools (the number of visits :vas not available to the researcher). She concludes that ccnnputers for children under six are available in computer leflos or the classroom and have literacy benefits for crnildren. Fite (1993) notes that one third of teachers 26 actually using computers for child instruction, have spent most of their training time learning how to use the equipment rather than how to integrate computers in the classroom due to time or budget restraints. She lists several research based (sources are not cited) guidelines about the actual use of computers in school, placement and introduction of classroom computers, benefits of children working in groups at the computers, keyboarding and program effectiveness. Computer availability, instructional use of the computer for literacy, and teacher computer education emerge as potential factors for additional study. Haugland’s (1997) three-year comparative study used slightly more than one hundred participants, mostly teachers, who attended computer based sessions at the annual National Association for the Education of Young (fluildren conference. Each completed a seven item survey. Participants reported using computers with young children arui needing software that is more child centered, focused (n1 child interests and easier for children to operate jJudependently. Haugland asserts most preschools have conquters but does not supply supporting evidence. Computers may be unused by early childhood teachers dufa'to their physical location (Fite, 1993). Placing conquters in computer labs reduces their potential learning 27 effectiveness and amount of curriculum integration by limiting teachers and children, direct classroom access (Haugland & Wright, 1997). Additionally, appropriate computer training and time to apply learned strategies have broad implications for developmentally appropriate use and curriculum integration (Hohmann, 1994). As noted earlier, the evidence regarding early childhood teachers and their instructional use of computers is limited. 
The majority of the studies examine findings based upon an N of 140 or less. No study combines variables related to computer resources, teacher and program characteristics, and skills taught by preschool teachers with computers. Computers in Head Start classrooms In 1990, the federal Head Start Bureau repealed its 1984 moratorium regarding the purchase of computers for classroom use (DHHS-HDS, 1990). Almost simultaneously, a demonstration project between IBM and Head Start produced several .recommendations regarding computers in Head Start <31assrooms (MOBIUS, 1990). Forty-four classrooms in eight Phead Start grantees across the country were participants. Eactlcflassroom.was equipped with a special computer lruarning center containing two computers with sound carxacity, color monitors, a printer, and various input 28 devices (mouse, keyboard, and touch pad). Teachers received initial and on-going training as well as technical support on computer skills and strategies for using computers with children and integrating them into the classroom. Program administrators supported the introduction of computers into classrooms and project staff regularly appraised them of progress. Final recommendation categories included benefits to children, staff readiness, software, Computer placement, learning center organization, initial experiences with children, managing child access, linking with other learning centers, helping train teachers, practice, sharing ideas among staff members, and parent involvement. The project’s findings identified primary characteristics for successful integration of classroom computers. Key factors included administrative support, staff communication, enthusiastic teachers, hands- on computer training for teachers, two computers in each computer learning center, gradual introduction of appropriate software, and cooperative learning techniques with children. The final project report and new budget flexibility paved the road for classroom computers in Head Start classrooms. Few publications focus on Head Start teachers and their instructional use of computers. Most contain little, 29 if any, empirical evidence. Some provide principles for integrating computers into the classroom (Wolverton, Plutro & Bewick, 1992, Fitch & Sims, 1992, Hutinger, Robinson, & Johanson, 1990) while others concentrate on physical requirements, logistics, or software selection (Colker, 1997, Tsantis, Wright & Thouvenelle, 1989). Kersh (1999), a software company marketing vice president, describes teacher benefits (communication and improved employability skills) for using computers with children as well as suggestions for organizing the computer center published in the National Head Start Association journal as a feature article. She concludes that appropriate teacher training is critical without providing supporting evidence. As in the literature on early childhood teachers and computers, teacher computer training emerges as an important factor. The Head Start Performance Standards, mandatory regulations for all Head Start programs (DHHS, 1996), do not address computers within the classroom context. The Guidance for the Performance Standards illustrates how Head Start regulations can be implemented but is not a .legislated mandate (DHHS, 1996). According to the (miidance, computers in Head Start classrooms can help Cfliildren discover numerical concepts, develop reasoning and 30 problem solving skills and provide opportunities to work with others in a classroom computer station (DHHS, 1996). 
Despite extensive program monitoring, no one knows precisely how many computers are in Head Start classrooms, what types of computer activities occur, how teachers use (or don't use) computers for instruction or even if they are available. (W. Sullivan, Program Manager, ACF-DHHS, personal communication, March 4, 1999). Data from the Region IV Head Start Training and Technical Assistance Services contractor’s mailed survey indicate eighty nine percent of 175 respondent Head Start classrooms have computers. They do not provide information how teachers use computers for instruction or if they are operational (Bickel, 1996). A recent technology edition of the Head Start Bulletin presented articles on basic purchasing and selection, the Internet, distance learning, and helping families become computer literate (Head Start Bureau, 2000). The Associate Commissioner states, “ Head Start programs must take advantage of available technologies and pass those advantages on to parents and children” (Taylor, 2000, p.1). Two photographs show children using computers in unspecified locations. One article states that Head Start programs must help teachers use technology as a tool by 31 offering them training and support as technological literacy becomes a national standard (Thouvenelle, 2000). Specific evidence is not cited. Private corporations promote computers as a technological innovation in Head Start classrooms. The Nike Corporation has made a three—year, $2.6 million commitment, to implement Start Line, an educational outreach program designed to provide computers, software, and staff training to approximately 500 selected Head Start centers in California, Washington, and Oregon (NHSA News, 1999). Additional information is unavailable about the specifics of this program. Innovation Adoption and Levels of Use Frameworks Rogers defines an innovation as “an idea, practice, or object that is perceived as new by an individual or other unit of adoption” (p.11, 1995). Head Start teachers View computers in the classroom as a new practice, do not feel knowledgeable or secure about their use and need information on how to successfully introduce computers to three and four year old children (E.D. Wolverton, Chief, Education Services Branch, Head Start Bureau, personal communication, September 14, 1999). One could therefore consider computers as instructional innovations in Head Start classrooms. This study chose however to look at 32 computers as instructional tools rather than innovations because some teachers may not perceive them as new practice. The Head Start Bureau and National Head Start Association promote computer use as an tool through several recent publications and two national technology conferences (Head Start Bureau, 1997, Head Start Bureau, 2000, National Head Start Association, 1999). The adoption of innovations literature provides possible variables and ways of looking at computers as instructional tools in Head Start classrooms. Several frameworks or models illustrate innovation adoption or levels of use. Each describes how people, either individually or within groups, receive information, make decisions, implement, and modify an innovation through differing numbers of stages or levels. Four (Rogers, 1995, Hall & Loucks, 1977, Hord, Rutherford, Huling-Austin & Hall, 1987, Rothman, Erlich, & Teresa, 1981) outline the adoption of innovations, without regard to the type of innovation. 
Three others, (Cory, 1983, Moersch, 1995, Coughlin & Lemke, 1999) suggest innovation adoption levels for computers or other forms of technology. All frameworks contain issues related to the innovation, the innovation user (or adopter), and a broader contextual system. Each perspective is briefly discussed below. 33 General innovation adoption frameworks Rogers' Model of Stages in the Innovation Decision Process (1995) proposes that individuals pass through a series of five stages from the time they first hear about an innovation until they use or do not use it. Prior conditions related to the individual and surrounding context, personal characteristics, and characteristics of the innovation, comprise factors that affect these stages. Rogers suggests that interpersonal and mass media communication channels connect them. Each element is a significant part of the innovation decision process. Although the process occurs through time, there is no time limit for the entire process or individual stage. The five categories are described below. 1. Knowledge An individual learns about an innovation’s existence and can answer the question, “What is it?” It is unknown whether awareness of the innovation or a need for the innovation, comes first. Individuals with higher education and socioeconomic status, and increased interpersonal channels of communication are more likely to have knowledge of innovations. 2. Persuasion Individuals form either a favorable or an unfavorable attitude towards the 34 innovation. They often consult peers for information and social reinforcement despite the availability of more scientific resources. They want to know adoption advantages, disadvantages and implications for their unique situation. Some choose not to adopt the innovation despite concluding with a favorable attitude. Rogers also contends that the following innovation characteristics must also be considered: Relative advantage How well the innovation is perceived as being better than the idea or practice it replaces. Compatibility The degree of innovation consistency with the needs of potential adopters, existing values, and past experiences. Complexity The degree that the innovation is perceived as complex or difficult to understand. Trialability The degree that the innovation can be tried on an experimental basis Observability The degree that the results of an innovation are visible to others 3. Decision Individuals engage in activities that lead to either adoption or rejection of the innovation. Rogers recommends that individuals have 35 an opportunity to use the innovation on a trial bias or see others demonstrating its advantages 4. Implementation The individual puts the innovation into practice and needs answers to operational questions such as “ How do I use the innovation? What problems will I encounter? How do I solve them?” Implementation concludes when the innovation becomes habitual or conventional. Users frequently modify the innovation as it becomes routine in a process called re-invention. This encourages participants to add personal meaning to the innovation which increases the likelihood of adoption. 5. Confirmation Different types of confirmation exist. Continued users recognize the benefits of using the innovation and promote it to others. Some reject the idea after earlier implementation and either replace it with one perceived as better (replacement discontinuance) or discontinue it completely as a result of dissatisfaction with its performance (disenchantment discontinuance). 
Others maintain their initial rejection of the innovation. Four prior conditions influence Rogers’ model: previous practice, sensed need or problem, innovativeness 36 (how likely an individual or group adopts a new idea before others), and the social system norms. None is more significant than another. Combined conditions are more likely to have greater impact upon adoption. Rogers (1995) also outlines five adopter categories that follow the bell-shaped normal distribution. He bases this classification upon how soon someone adopts or uses an innovation compared to other members of the system (innovativeness). The categories are conceived as “ideal types” and are based upon actual observations and related personal characteristics. According to Rogers, the five adopter categories, population distribution, and their characteristics are: l. Innovators (2.5%) Innovators literally are on the fringe and are the first to adopt an innovation. Their “venturesomeness is almost an obsession” (Rogers, 1995, p. 263) and leads them into broad social connections. They are able to cope with a high degree of uncertainty about the innovation and frequently have substantial financial resources. 2. Early Adopters (13.5%) Early adopters have peer respect and often act as role models for others who consider adopting an innovation. They “are the embodiment of successful, discrete use of new ideas” 37 (Rogers, 1995, p. 264). Early adopters are a part of the local social system vs. the broader range of the innovator. 3. Early.Majority (34%) The early majority adopt an innovation just before the average person, interact frequently with their peers but seldom hold the role of opinion leader. They are deliberate in their adoption, being not the first or the last to use an innovation. 4. Late Majority (34%) The late majority adopts just after the average person, perhaps due to economic necessity or pressure from members of the system. They View innovations with skepticism and need peer pressure for motivation. 5. Laggards (16%) Laggards are the last to adopt an innovation and may be isolated in their social system. They tend to be traditional and suspicious of innovations and change. Rogers’ Model of the Stages in the Innovation Decision Process (1995) presents an inter—related contextual approach to looking at individuals or other decision-making tulits, an innovation, and the broader system. Characteristics of the person, the innovation and the 38 surrounding context are important in understanding how and why an innovation is used. As another general framework of innovation adoption, Hall and Loucks (1977) state that change models do not effectively describe classroom use or non-use of innovations. Their perspective therefore places greater emphasis on the context of the classroom as it interconnects with the innovation and the individual. They used focused interviews with individuals as the unit of analysis. Unlike Rogers (1995), individuals are not placed into adopter categories and personal characteristics of individuals that support or inhibit innovation use are not identified. Hall and Loucks report that a person's behaviors, rather than their attitudes, determine their innovation level of use. They developed the Levels of Use of the Innovation framework with decision points between each level (Figure 2), using ethnographic methodology for validity. Hall and Loucks contend that their levels can be modified for any type of innovation. 
They stress the importance of first—hand documentation to assure that the innovation is actually used. Clients (students) and colleagues also appear as members of the framework system. 39 Level Definition of Use Nonuse User has little or no knowledge, no involvement, & does nothing regarding the innovation. Decision Point A Takes action to learn more detailed information about the innovation. Orientation User is exploring or has received recent information about the innovation. Decision Point B Makes a decision to use the innovation by establishing a time to begin. II. 1 Preparation User prepares for first use of the innovation. Decision Point C Changes, if any, and use are dominated by user needs. 111. Mechanical Use User focuses on short-term innovation use with little reflection as they attempt to master the steps for usigg the innovation. Decision Point D-l A routine pattern of use is established. IVA. Routine The innovation is used on a regular basis. The user gives little thought to improving use. Decision Point D-2 Changes use of the innovation based on formal or informal evaluation of client outcomes. IVB. Refinement The user varies innovation use to increase impact on those in immediate environment or makes variations based upon short or long term consequences. Decision Point E Initiates change in use of innovation based on input of and in coordination with what colleagues are doing. Integration The user combines their ideas about innovation use with others to initiate change. Decision Point F Begins exploring alternatives to or major modifications of the innovation presently in use. Renewal The user re-evaluates the quality of innovation use, seeks major modifications, explores new developments and sets new goals for self. Figure 2. Hall and Loucks Levels of Use of the innovation (1977). 40 Others (Hord, Rutherford, Huling—Austin & Hall, 1987) apply a nearly identical model and research method as Hall & Loucks (1977) when studying how teachers apply innovations or other improvements within schools. They examine characteristics of the innovation itself as well as the individual teacher within the context of the classroom. Hord, Rutherford, Huling-Austin & Hall (1987) note that people frequently modified the innovation from its original presentation. Consequently, they developed checklists for “identifying specific components or parts of an innovation and the variations that might be expected as the innovation is put in operation” (Hord, Rutherford, Huling-Austin & Hall, 1987, p. 15). Checklists can be developed for any type of innovation. Hord and her colleagues (1987) also propose that the most critical element of any change process is the individual making the change. They outline seven developmental stages of concern (awareness, informational, personal, management, consequences, collaboration and refocusing) and suggest interventions for each. Individuals implement innovations in the classroom according to the Levels of Use chart, evaluated by the focused interview. Hord (1987) notes that training and 41 certification are required before using the chart and interview for research tools. Supported by the National Institute of Mental Health, Rothman, Erlich, & Teresa (1981) used twenty-two field researchers in a one year qualitative study of human service agencies in southeast Michigan to develop four innovation adoption principles or action guidelines. They are: 1. Demonstrate the innovation with a small group of the target population before expanding to the larger group. 2. 
Introduce new groups who support new goals, or increase the influence of current group members who support new goals. 3. Offer benefits associated with participation. 4. Clarify role performance of individuals and obtain support from supervisors and other influential people.

These four principles provide a framework for the integration of the innovation by individuals within and outside a particular organization. Rothman, Erlich, and Teresa (1981) contend that potential factors or characteristics of individuals, agencies, and communities may facilitate or limit the adoption of the innovation. These items include relationships with other people, personal knowledge of the clients or community, lack of time, support, shifting goals, funding, and physical facilities. The researchers suggest that goal statements and the operationalization of guideline elements can form an effective plan to implement new practices or ideas.

The broad innovation adoption frameworks examine the interaction of the individual and the innovation within the context of an environment or system. They differ in how the innovation or individual is characterized as well as in the dynamics of the context. Other frameworks look specifically at the implementation of computers and technology.

Computer and technology adoption or levels of use frameworks

Three frameworks (Cory, 1983; Moersch, 1995; Lemke & Coughlin, 1999) specifically examine the use of computers and/or technology, primarily within the school environment context. Various combinations occur as individuals or organizational systems interact with computers as technological innovations for instruction.

Cory's 4-Stage Model of Development for Full Implementation of Computers for Instruction in a School System (1983) offers a conceptual framework that is not supported by verifiable evidence. Cory proposes that the introduction of computers into school systems is different from other educational changes. She bases her proposition on three assumptions: 1) teachers do not learn how to use computers during their professional preparation, 2) there are inadequate funds to purchase computers for full implementation, and 3) there is a lack of "best plan" prototypes. She also contends that teachers must be able to program computers and write their own software before they can use them effectively with students. Cory suggests four sequential stages that develop over time within a school system. They are: Stage I, Getting on the Bandwagon; Stage II, Stage of Confusion; Stage III, Stage of Pulling It All Together; and Stage IV, Stage of Full Implementation. Characteristics of six factors, 1) hardware, 2) software, 3) staff development, 4) computer-assisted learning, 5) computer literacy, and 6) attitude, define each stage. They reflect the assumptions stated earlier. For example, a Stage III characteristic of attitude is that teachers recognize they do not need to be expert programmers. Cory states that not all characteristics may be present at each stage due to varying resources among school districts. Cory views the school system as an integrated unit. The staff development, computer literacy, and attitude categories mention teachers, with little attention to personal characteristics of change. She states that some school systems may be pre-Stage I, Not on the Bandwagon, as they consider whether to use or not use computers.
Cory presents a developmental contextualist perspective when she says: It is impossible for a school system to know what to do with computers until its own faculty and staff know what computers can do, and it is not possible for them to know what the potential really is until they've purchase enough hardware, used enough software, and spent enough time learning to really understand what the possibilities are. (Cory, 1983, p.11). The reciprocal interactions between the teacher, computer hardware and software, and the school system, produce change. Cory does not address personal characteristics of teachers that may cause differences in computer implementation. As another perspective, Moersch developed his Levels of Technology Implementation (LoTi) Framework (1995, Figure 3) in response to ineffective staff development programs and school district concerns regarding curriculum integration of technology. His framework outlines seven technology implementation levels teachers demonstrate, in a variation adapted from Hall and Loucks (1977). Decision points, used by others (Hall & Loucks, 1977, Hord, Rutherford, Huling-Austin & Hall, 1987) between levels, are omitted. Each level produces changes in the instructional curriculum as the teacher moves from one to the next. The 45 LoTi instrument, which measures these levels, continues to be field tested in later elementary and secondary schools (C. Moersch, President, National Business Educational Alliance, personal communication, March 5, 2000). Moersch makes slight mention about higher uses of innovations by individuals with high levels of self-efficacy yet other information on the framework does not describe how this personal characteristic is evaluated and used. 46 Level Category Description Nonuse A perceived lack of access to technology based tools or a lack of time to pursue electronic technology implementation. Existing technology is predominately text-based (e. g. ditto sheets, chalkboard, overhead projector) Awareness The use of computers is generally one step removed from the classroom teacher (e. g. integrated learning system labs, special computer-based pullout program, computer literacy classes, central word processing labs.) computer based applications have little or no relevance to the individual teacher’s instructional program. Exploration Technology based tools serve as a supplement to existing instructional program (e. g. tutorials, educational games, simulations). The electronic technology is employed either as extension activities or as enrichment exercises to the instructional program. Infiision Technology based tools, including databases, spreadsheets, graphing packages, probes, calculators, multimedia applications, desktop publishing applications, and telecommunications applications, augment isolated instructional events (e. g. a science it experiment using spreadsheets/ graphs to analyze results or a telecommunications activity involving data-sharing among schools). Integration Technology based tools are integrated in a manner that provides a rich context for student’ understanding of the pertinent concepts, themes and processes. Technology (e.g. multimedia, telecommunications, databases, spreadsheets, word processors) is perceived as a tool to identify and solve authentic problems relating to an overall theme/concept. Figure 3. Moersch’s Levels of Technology Implementation (1995). 47 Expansion Technology access is extended beyond the classroom. 
Classroom teachers actively elicit technology applications and networking from business enterprises, governmental agencies (e. g. contacting NASA to establish a link to an orbiting space shuttle via the Internet), research institutions, and universities to expand student experiences directed at problem solving, issues resolution, and student activism surrounding a major theme/concept. Refinement Technology is perceived as a process, product (e. g. invention, patent, new software design), and tool to help students solve authentic problems related to an identified real-world problem or issue. Technology, in this context, provides a seamless medium for information queries, problem solving, and/or product development. Students have ready access to and a complete understanding of a vast array of technology based tools. Figure 3. Moersch’s Levels of Technology Implementation (1995). 48 The Milken Exchange on Educational Technology introduced the Professional Competency Continuum (PCC) (Coughlin & Lemke, 1999) as a self-assessment process for educators who use technology. The PCC is part of a larger framework, Technology in American Schools: 7 Dimensions of Progress (Lemke & Coughlin, 1998). Professional competency is one of seven educational technology dimensions (the others are learners, learning environment, system capacity, community connections, technology capacity and accountability) necessary to prepare students for success in a “digital communication age” (Lemke & Coughlin, 1998). The broad framework integrates learners, teachers, and computers within the school and community context. Three PCC stages, Entry, Adaptation, and Transformation, measure progress in the areas of professional practice, classroom and instructional management, core technology skills, and administrative competencies. They are based upon the “stages of instructional evolution” suggested by research conducted from the Apple Classrooms of Tomorrow program (Coughlin & Lemke, 1999, p.11). Again, learners, teachers, and computers are integrated in the school/community context. Coughlin and Lemke describe each stage as follows: 49 Stage 1 Entry .At this stage educators, students and the community are aware of the possibilities that technology holds for improving learning but learning, teaching, and the system remain relatively unchanged. Educators at this level lack access to technology and the requisite skills to implement and sustain significant changes in practice. Stage II Adaptation Technology is thoroughly integrated into the classroom in support of existing practice. Educators at this stage have developed skill related to the use of technology but have primarily applied these skills to automate, accelerate and enhance the teaching and learning strategies already in place. Stage III Transformation Technology is a catalyst for significant changes in learning practice. Students and teachers adopt new roles and relationships. New learning opportunities are possible through the creative application of technology to the entire school community (Coughlin & Lemke, 1999, p. 11). Teachers may complete the PCC on line and receive immediate feedback regarding their stage of technology use (Available: http://www.milkenexchange.org). The Milken 50 Foundation is compiling data based upon these stages but no findings have been published as of this date. Each computer implementation framework specifically illustrates varying levels of computer use and related factors for each stage. 
Although potentially useful for elementary and secondary personnel, many of the indicators are not related to skills or concepts normally taught by early childhood teachers to preschool children.

Summary

Head Start, the federally funded program for low-income families and their children, has evolved since its inception in 1965. Pressured by new legislation and accountability demands, the program's focus has changed from social competence to school readiness. Low-income families as well as single-parent households have substantially more limited access to computers than those with greater education and income. Head Start programs can receive funding for classroom computers and teacher training. Computers in Head Start are promoted as a tool for instruction. Limited evidence on early childhood teachers and classroom computers suggests a lack of teacher training and curriculum integration as well as varying degrees of computer availability and use. Head Start teachers have not been the unit of analysis in any study. Computers are a recommended tool in Head Start classrooms and may be considered new practice by some teachers.

Various frameworks illustrate levels of innovation adoption and use. Some are technology specific but were developed for elementary and secondary teachers rather than preschool teachers. Each moves from a beginning stage of not knowing about the innovation to a stage of actual use. All examine the interaction between individuals or organizations and the innovation within a contextual environment.

The literature suggested several explanatory variables. They were associated with characteristics of the person, the innovation, or the contextual environment. These variables include the availability of computers, training, administrators, education level, income, and innovator adopter category. Other variables of interest are computer location, operational status, software, social system norms, and computer education.

Based upon the literature review, there are significant gaps. Little evidence exists regarding the status of Head Start classrooms with computers. There is virtually no evidence about Head Start teachers who use classroom computers as an instructional tool. The current literature lacks any study that describes computer resources in Head Start classrooms, how and to what extent Head Start teachers use computers for instruction, and what contextual and personal variables of Head Start teachers may affect computer use. The researcher designed this study to fill these gaps.

Chapter Three describes the study's methodology. It includes descriptions of the research design, sampling, instrumentation, data collection, data analysis, strengths, and limitations.

CHAPTER THREE

METHODS

This chapter describes the methodology for the study. Included here are descriptions of the research design, sampling, instrumentation, data collection, data analysis, strengths, and limitations. The study and research methods were guided by the five research questions and related hypotheses presented below.

Question 1. What computer hardware components and software are in Head Start classrooms?

Question 2. How and to what extent do Head Start teachers use computers as an instructional tool?

Question 3. How do Head Start teachers learn about computers?

Question 4. What contextual variables affect computer use by Head Start teachers?

Ho: There are no contextual variables that affect computer use by Head Start teachers.
Ha: There is at least one contextual variable that affects computer use by Head Start teachers.

Question 5. What personal variables of Head Start teachers affect computer use?

Ho: There are no personal variables of Head Start teachers that affect computer use.

Ha: Personal variables of Head Start teachers affect computer use.

Research Design

The study reported here is a one-shot case study, pre-experimental design (Campbell & Stanley, 1966) with randomly selected participants. Individual Michigan Head Start teachers were the unit of analysis. The one-shot case study design was selected in order to obtain information about large numbers of individuals at one point in time and to establish a base for future research (Babbie, 1998).

Sampling

Population

As a state, Michigan has a diverse geographic distribution of people ranging across rural, suburban, and urban areas as well as low and high density population centers. Michigan Head Start programs approximate the breadth of Head Start nationally by including large urban agencies, smaller local grantees, and migrant and Native American populations (MHSA, 2000). Approximately 1000 Michigan Head Start classroom teachers work with preschool children in 36 grantee and 36 delegate agencies (MHSA, 2000; W. Sullivan, Program Manager, ACF-DHHS, personal communication, September 17, 1999).

Sampling frame

Every Michigan grantee and delegate Head Start agency listed in the 2000 Michigan Head Start Program Directory (MHSA, 2000) was contacted to be included in the study. All funded agencies agreed to participate. Tri-County Head Start, a Head Start grantee where the researcher is employed, and Niles Public Schools, their delegate agency, were excluded from the study to eliminate potential bias. The researcher selected only lead classroom teachers in Head Start programs. Early Head Start teachers were not eligible since they work with children from birth to three years old. A total of 323 randomly selected subjects participated in the study.

Sampling method

A master list of Michigan Head Start classroom teachers does not exist. Head Start education coordinators, who act as agency leaders for curriculum implementation and often directly supervise classroom personnel, have lists of teacher names. The researcher asked education coordinators to randomly select lead or head classroom teachers from their teacher list using a coin toss. HEADS meant the teacher was in the study; TAILS indicated that the teacher was not a part of the study. Appendix A contains the coin toss directions. When survey designs are used, experts recommend doubling the size of the initial sample in order to achieve a final response rate of at least 50% (Babbie, 1990; Rea & Parker, 1997; Fink, 1995). Populations between 1000 and 1500 require at least 278 subjects for a 95% level of confidence, plus or minus five percent (Rea & Parker, 1997). The sampling method was likely to produce an adequate sample size of 300 Head Start lead classroom teachers, allowing a Type I error rate of .05 and statistical power of .80, considered the minimal standard by researchers (Kirk, 1995).

Instrumentation

Researchers frequently use surveys to measure attitudes, behaviors, and demographic information as well as to answer foundational questions on a chosen topic (Babbie, 1990; Cresswell, 1993; Alreck & Settle, 1995). Surveys are also used when individuals are the unit of analysis so that generalizations from samples can be made regarding the larger population (Fink & Kosecoff, 1998).
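The sample-size figure cited in the sampling discussion above (at least 278 respondents for a population of roughly 1000 teachers, at a 95% level of confidence and plus or minus five percent) can be reproduced with the standard finite-population correction formula for estimating a proportion. The short sketch below is illustrative only and is not part of the original study; the function and variable names are the writer's own.

```python
import math

def required_sample_size(population, confidence_z=1.96, margin=0.05, p=0.5):
    """Sample size for estimating a proportion, with finite-population correction.

    population   -- size of the population being sampled (e.g., ~1000 teachers)
    confidence_z -- z value for the confidence level (1.96 for 95%)
    margin       -- desired margin of error (0.05 = plus or minus five percent)
    p            -- assumed proportion (0.5 yields the most conservative n)
    """
    n0 = (confidence_z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)                   # finite-population correction
    return math.ceil(n)

print(required_sample_size(1000))   # 278, matching the figure cited from Rea & Parker (1997)
print(required_sample_size(1500))   # about 306 for the upper end of the population range
```

Doubling the mailing (647 survey kits distributed) against an expected response rate of roughly 50% is what made the final return of 323 usable questionnaires plausible.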
A preliminary step in this quantitative descriptive study was the development and pre-testing of a survey instrument, the Head Start Teacher Computer USe Profile (HSTCU Profile). Instrument Development No standardized instrument was available that measured instructional computer use by early childhood teachers. The researcher considered combining sections of typical technology and education surveys created for other audiences. Such measures, however, omitted important computer issues related to preschool teachers or contained items atypical for preschool classrooms. For example, it is unlikely that most Head Start teachers will plan spreadsheet activities for children. 57 The researcher adapted or borrowed items appropriate for Head Start teachers from the Teacher Technology Survey (American Institutes for Research, 1998), The 1997 TURNKEY Survey of Technology USe in Early Childhood Education Programs (Education TURNKEY Systems, 1998), LoTi Questionnaire (Moersch, 1995) and The Professional Competency Continuum (Coughlin & Lemke, 1999). These instruments asked questions about computer availability, hardware and software, use of computers for instruction, and personal attitudes. Participants responded either to nominal or ordinal scale items. The HSTCU Profile uses similarly scaled items. The HSTCU Profile has 57 items divided into four sections: A) background information consisting of demographic items regarding the teacher, classroom and program, B) items focused on using and learning about computers, C) items about computer hardware and software, and D) items about using computers for instruction with preschool children. Respondents could complete the questionnaire in approximately twenty minutes. The findings from the instrument pilot test and expert reviewers assisted the researcher in determining instrument content validity. The researcher amended the draft instrument based upon these 58 sources and used the revised instrument in the study. Appendix B contains the instrument, The HSTCU Profile. Pilot testing The researcher conducted a pilot test of the draft instrument during the end of May and beginning of June 2000 with fifteen Head Start lead classroom teachers. Teachers in the pilot phase came from Tri-County Head Start, the researcher's employer. They were not part of the sample for the study itself. Tri-County Head Start's grantee and delegate programs have twenty-eight lead classroom teachers. The researcher placed the names of each lead teacher in a container and randomly selected fifteen for inclusion in the instrument pilot test. She telephoned selected teachers, described the purpose of the pilot test, and requested their voluntary participation. The researcher scheduled personal interviews and observations with the fifteen teachers who agreed to volunteer. Teachers provided written consent through their completion of the draft instrument. This process determined clarity of instrument items and personal and contextual variables that were associated with varying teacher use of computers as instructional tools. After the pilot test, the researcher revised Question 1 (location of Head Start center) and rewrote or added items in Sections B. (using and learning about computers) and C. (computer 59 hardware and software). Upon completion of the pilot test and expert review of the instrument, permission for the study was requested from the University Committee on Research Involving Human Subjects (UCRIHS). 
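The pilot-test draw described above (fifteen of twenty-eight lead teachers selected by pulling names from a container) is equivalent to a simple random sample without replacement. A minimal sketch, using a hypothetical roster rather than the actual teacher list:

```python
import random

# Hypothetical roster standing in for the grantee's 28 lead teachers;
# the actual names were drawn by hand from a container.
roster = [f"Teacher {i}" for i in range(1, 29)]

# Simple random sample without replacement: 15 of the 28 lead teachers.
pilot_group = random.sample(roster, k=15)
print(sorted(pilot_group))
```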
Relationship of the research questions, variables, and instrument

The researcher developed the draft of the HSTCU Profile based upon variables prompted by the research questions and literature review. These variables were operationalized and items constructed for each variable. Subjects generally indicated one response per item except for those that requested "mark all that apply." Some variables were addressed by more than one item. For example, social system norms refers to Head Start requirements, program philosophy, and curriculum guidelines as measured by HSTCU Profile items 31 - 33. In the same fashion, computer background refers to teacher comfort level, knowledge about, and previous experiences with computers as measured by HSTCU Profile items 19 - 21. Figure 4 displays this information.

Figure 4. The relationship of the research questions, variables, and instrument. [Figure 4 is a multi-page table mapping each research question to its variable names, operational definitions, and the corresponding HSTCU Profile item numbers.]
Computer Use Dependent Variable

In anticipation of possible missing values or inconsistency in a subject's responses regarding computer use, the researcher combined 11 HSTCU Profile items to form the continuous dependent variable, computer use. Table 1 contains the scale items.

Table 1
Computer Use Scale Items

HSTCU Profile Item #   Variable name
16   Uses a computer
18   Uses computers with preschoolers
36   Uses computer to make instructional materials
37   Uses computer to keep records
38   Uses computer to email parents and professionals
39   Uses computer to find resources and ideas
51   Uses computer to teach socio-emotional skills
52   Uses computer to teach literacy
53   Uses computer to teach numeracy
54   Uses computer to teach English or other languages
55   Uses computer to teach fine motor skills

Following data collection, the researcher conducted a reliability analysis for this scale. Only subjects who answered all the scale items were included in the analysis. The reliability analysis produced an alpha of .79 for 302 subjects. The resulting descriptive statistics for the scale were a mean of 19.28, variance of 82.73, and SD of 9.10. A histogram of computer use values showed a relatively normal distribution with a slight negative skew (-.41). The HSTCU Profile instructed some subjects not to answer particular items. These subjects reported they did not use computers with preschoolers or have an available computer. Zeros (frequency = not at all) were entered into items these subjects were directed not to answer (M. Reckase, Professor, Michigan State University, personal communication, November 3, 2000). This action produced a greater number of relatively low scores, hence the negative skew.

Validity

Face or content validity was established by four national authorities in the fields of computers and early childhood education.

• Suzanne Bredekamp, Ph.D., National Association for the Education of Young Children Technology Panel member, The Council for Early Childhood Professional Recognition, Washington, D.C.
• Charles Hohmann, Ph.D., Educational Psychologist, High Scope Research Foundation, Ypsilanti, MI
• Suzanne Thouvenelle, Ph.D., Vice President of Research & Coordinator of the 1987 Head Start/IBM Demonstration Project, MOBIUS Corporation, Alexandria, VA
• Katie Roberson, classroom teacher, South Bay Union School District, Imperial Beach, CA

The researcher mailed the draft instrument and research questions to the reviewers.
Each reviewer was asked to examine the instrument for clarity, face/content validity, and item suggestions. They were also invited to provide additional written comments as relevant. The 68 researcher revised the draft instrument based upon the resulting recommendations as well as the findings from instrument pre-testing. Data Collection The chain of command in Head Start programs is very clear. The director is the primary administrative and operational leader. The education coordinator, responsible for curriculum implementation, reports to the director. The education coordinator generally supervises and approves written communications with classroom teachers. It is eXpected practice in Head Start programs to follow the organizational chain of command before engaging in research or new programming. The study therefore followed the data collection process described below. Following approval of the study by the University Committee on Research Involving Human Subjects (Appendix C), data collection began with a mailing to Head Start directors, requesting permission to conduct the study in their agencies and communicate with the education coordinator. The purpose of the study, selection procedure, amount of participant time, and specific contact information were part of the written request. The researcher telephoned directors one week later for study 69 consent. Appendix D contains a copy of the letter to the Head Start director. After the director authorized the researcher to conduct the study and communicate with the education coordinator, the researcher mailed packets to education coordinators. Mailings included general information about the study, an introduction letter, purpose of the study, selection procedure, amount of participant time, specific contact information, and written coin toss directions. Appendix E contains a copy of the education coordinator’s letter. Individual survey kits were also included. The number of lead classroom teachers in the agency determined the number of individual survey kits included in the mailings. The researcher asked education coordinators how many lead classroom teachers worked in their program. She mailed a quantity of survey kits based upon approximately 50% of that number; 647 total kits were sent. Approximately, one week after mailing, the researcher called education coordinators to confirm receipt of study materials, answer any questions, and determine if additional survey kits were needed. After following the coin toss directions, Head Start education coordinators gave selected lead classroom teachers one survey kit. Each survey kit contained one 70 questionnaire, one self-addressed stamped envelope, one color marker as a small incentive, and a cover letter that explained the study purpose, selection procedure, amount of participant time, assurances of confidentiality and voluntary participation, researcher follow-up promise, contact information, and written instructions. A voluntary consent statement was also included on the questionnaire. Appendix F contains a copy of the teacher's cover letter. Teachers completed the survey and returned it to the researcher using the self-addressed stamped envelope. Follow—up Strategies At two week intervals, the researcher mailed four reminder post cards and telephoned education coordinators in Head Start programs with limited survey return. Appendix G contains copies of the postcard reminders. 
At the researcher's request, the Michigan Head Start Association sent two reminders via email to Head Start directors and classroom teachers, approximately five and seven weeks after the first mailing, encouraging survey return. The email reminders had the same content as Reminder Postcards #1 and #2. The researcher telephoned respondents who provided contact information and completed any missing items upon 71 survey return. Missing values for respondents without contact information were left empty. Data Analysis The investigator aggregated data. No individual participants or programs were identified in the analysis. Upon return, the researcher numbered each questionnaire, stamped the return date, and separated any identifying information. Completed questionnaires were securely stored. After responses were coded and entered, frequencies and descriptive statistics were run for all variables to determine basic characteristics of the sample, missing data, data entry errors, and potential outliers. The education level, years teaching preschool, years in Head Start, and years as Head Start classroom teacher items each had a small number of respondents per category. For this reason, they were recoded into 5 groups per variable rather than the original 7 — 10 groups per variable. The researcher used frequency counts and descriptive statistics to answer Questions 1 through 3. Data analyses for Questions 4 and 5 used one way ANOVA and post hoc procedures to measure the effect of contextual and personal variables on computer use (the dependent variable) by comparing the means of two or more groups. These measures 72 were appropriate for the categorical nature of the HSTCU Profile items and the continuous dependent variable (Shavelson, 1996, Norusis, 1997). Subjects assigned themselves to categorical groups for both contextual and personal variables by responding to HSTCU Profile items. Group divisions depended on the particular variable. For example, the basis for one set of groupings was whether the subject rated a variable (e.g. personal comfort level with computers) as making their use of computers with preschoolers easier, harder, or if it had a neutral effect. Subjects responded to other variables that served to divide them into groups based upon an ordered range of values. For example, subjects chose from 5 different groups for the age variable. They indicated whether they were members of the less than 21 years, 22— 30 years, 31 — 40 years, 41 — 50 years, or more than 51 years group. One-way ANOVA was run for each contextual and personal variable. The resulting F value from the ANOVA indicated ‘whether the observed differences represent a chance occurrence or systematic effect” (Shavelson, 1996, p. 371). Significant F values prompted further analyses. The Levene's test of equality of error variances determined the selection of the post hoc procedure. The 73 Scheffe method measured variables with assumed equal variances across groups and the Dunnett T3 method measured variables with non equal variances. These procedures produced significant mean difference scores (p<.05) that indicated where differences within personal or contextual variable groups occurred. Figure 5 illustrates the operational map for the study. 
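Before the operational map, the analysis sequence just described, a one-way ANOVA for each variable, Levene's test of equality of error variances, and then a post hoc comparison, can be sketched in outline form. The example below is purely illustrative and does not reproduce the study's data or the statistical package the researcher used; the file and column names are hypothetical, and SciPy's general-purpose routines stand in for the procedures actually run, with the Scheffe or Dunnett T3 choice noted only in the comments.

```python
import pandas as pd
from scipy import stats

# Hypothetical data file: one row per teacher, with the 11-item computer use
# scale score and a categorical grouping variable (e.g., comfort level rated
# as making computer use with preschoolers easier, neutral, or harder).
data = pd.read_csv("hstcu_profile.csv")            # hypothetical file name
groups = [g["computer_use"].dropna().values
          for _, g in data.groupby("comfort_level")]

# One-way ANOVA: does mean computer use differ across the groups?
f_value, p_value = stats.f_oneway(*groups)

if p_value < .05:
    # Levene's test decides which post hoc family is appropriate.
    levene_stat, levene_p = stats.levene(*groups)
    if levene_p >= .05:
        post_hoc = "Scheffe"       # equal error variances assumed, as in the study
    else:
        post_hoc = "Dunnett T3"    # unequal variances, as in the study
    print(f"F = {f_value:.2f}, p = {p_value:.3f}; follow up with {post_hoc} comparisons")
else:
    print("No significant group differences; no post hoc tests needed")
```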
Figure 5. Operational map. [Diagram linking computer use (the dependent variable) to: computers (contextual variables: availability, software), the Head Start classroom/program (contextual variables: time, classroom environment, training, computer technician, social system norms, interaction with colleagues, administrators), Head Start teachers (personal variables: age, education level, income, teaching experience, innovation adopter category), and computer education (personal variable: computer background).]

Strengths

There were two primary strengths of this study. First, a relatively large N facilitated generalization from the sample to the population and increased the credibility of the study. A total of 323 subjects is larger than the minimum of 278 needed for a 95% level of confidence, plus or minus 5%, for a population of 1000 (Rea & Parker, 1997). This was especially important since this was the first study that looked at Head Start teachers and their instructional use of computers. Second, education coordinators chose subjects using simple random selection. This method assured that every lead Head Start teacher in Michigan classrooms had an equal chance of being selected for participation in the study. Strong external validity can occur only if the sample is representative of the general population (Fink, 1995). Using this process assured that this sample was likely to represent the population of classroom teachers in Michigan Head Start programs.

Limitations

The study had two primary threats to internal validity: instrumentation and non-response bias (Campbell & Stanley, 1966; Fink, 1995). A self-designed instrument was used because a standardized measurement tool for the population or content was unavailable. Potential problems with any survey could be a lack of content and face validity as well as items that were unclear or confusing to respondents. The researcher minimized these limitations as described earlier in this section. The reliability of the instrument has yet to be substantiated. To reduce non-response bias, the instrument cover letter contained the study purpose, an assurance of confidentiality, and researcher contact information. It is possible, however, that some respondents chose not to return the questionnaire or to answer only selected portions, resulting in other non-response bias issues. The instrument pilot test was used to determine whether respondents were likely to leave particular questions blank. Those items were revised. The researcher also telephoned respondents who provided contact information in order to complete missing items. Sampling error could possibly affect the study's findings. The researcher could not monitor or provide onsite support to Head Start education coordinators when they followed the sampling directions. They may have chosen only potential respondents who have computers in their classrooms or teachers they perceived would "favorably" complete the questionnaires. The researcher wrote clear and concise sampling directions to reduce this potential limitation. During telephone calls with education coordinators, she emphasized the importance of following the sampling procedure. She also removed any returned questionnaires where respondents identified themselves as members of a different population, e.g., administrators or home-based teachers. Another potential limitation is the researcher's assumption that self-reported data are reflective of actual practice. In reality, the two may differ.
Summary A minimum response rate of 300 Michigan Head Start lead classroom teachers was desired for the study. All Michigan grantee and delegate Head Start agencies (except the grantee where the researcher is employed, and their delegate agency) were contacted for inclusion in the study. Every agency agreed to participate. Lead classroom teachers, randomly selected by the local Head Start education coordinator, completed a self-administered mailed questionnaire. Five research questions guided the study of relationships among Head Start teachers, computers, and the classroom/program environment. These questions addressed computer hardware components and software in Head Start 78 classrooms, use of computers as instructional tools, how teachers learn about computers, and various contextual and personal variables associated with computer use by Head Start teachers. It was hypothesized that contextual and personal variables could affect computer use by Head Start teachers in their classrooms. The Head Start Teacher Computer USe Profile measured these questions and tested the hypotheses. The researcher collected data through mailed questionnaires. She conducted data analyses using descriptive statistics, one way ANOVA, and post-hoc procedures. Descriptive statistics indicated frequencies and distributions of teacher demographic information, computer hardware components and software in Head Start classrooms, use of computers as instructional tools, and computer education of Head Start teachers. The descriptive findings provided an overview of the data. One way ANOVA was used to compare the means of different groups within categorical variables in order to determine the effect of personal and contextual variables on computer use. A scale of 11 HSTCU Profile items comprised the computer use dependent variable. Chapter Four contains the results for each research question and the hypotheses. 79 CHAPTER FOUR RESULTS In this chapter, the results of the data analysis are reported. First, the demographic characteristics of the sample will be described. Next, the statistical findings for each research question will be presented in the order in which the questions were asked. Frequency counts and descriptive statistics were used to answer Questions 1 through 3. The researcher addressed Questions 4 and 5 with one way ANOVAs, comparing the means of different groups within contextual and personal variables with the dependent variable (computer use). Additional post hoc tests used Scheffe and Dunnett T3 methods to determine the categorical groups within each variable that differed. The analysis included only subjects who answered related items on the HSTCU Profile. Missing data therefore accounts for the variation in N that is reported. The Sample The analysis included 323 lead Head Start classroom teachers distributed throughout the state of Michigan. These teachers represented 134 communities where Head Start classrooms are located. Most subjects (85%) worked in Lower Peninsula locations; the remainder (15%) were from the Upper Peninsula. Teachers reported Detroit as the most 80 frequent location, representing 18.1% of the total return. Lansing was the second most frequent location cited(4.7%), followed by Flint (2.8%), Battle Creek and Jackson, (each with 2.5%), and Inkster (2.2%). All other subjects worked in communities that individually represented 2% or less of the total sample. The names of all locations are listed in Appendix H. 
The age range of the subjects in the sample was 22 years to more than 51 years. The mode (39%) was 41 — 50 years old, followed by 31 — 40 years old (27%), 22 — 30 years old (20%), and persons older than 51 years (14%). These results are reported in Table 2. Table 2 Age of the Subjects n % 21 or younger O 0 22 - 30 62 20 31 — 4O 84 27 41 — 50 121 39 Older than 51 42 14 Nonresponses=14 N=309 The education level of the subjects ranged from having a high school education to completion of a graduate degree. The mode was a Bachelor degree (37%). An Associate(s) degree (26%) was the next largest group, followed by the Child Development Associate credential (C.D.A., 25%). Five 81 percent of the subjects had a high school education as their highest degree. Table 3 details these results. Table 3 Education Level of the Subjects (recoded) n % High school 15 5 C.D.A. 81 26 Associate degree 80 25 Bachelor degree 116 37 Graduate degree 25 8 Nonresponses=6 N=317 Household income categories ranged from less than $20,000 to more than $75,000. The mode was $21,000 - $35,000 (37%), followed by $50,000 - $74,000 (24%), $36,000 - $49,000 (17%), more than $75,000 (13%) and less than $20,000 with (9%). These results are reported in Table 4. Table 4 H0usehold Income of the Subjects n % Less than $20,000 27 9 $21,000 - $35,000 108 37 $36,000 - $49,000 51 17 $50,000 - $74,000 70 24 More than $75,000 37 13 Nonresponses=30 N=293 One third (33%) said they had been parents of Head Start children. Two thirds (67%) of the teachers said they had not been Head Start parents. Table 5 reports these results. 82 Table 5 Head Start Parent Status of the Subjects n % Yes 107 33 No 216 67 N=323 The HSTCU Profile asked participants to identify to what extent they were likely to adopt a new idea, object, or activity. The categories ranged from innovator to laggard. The modal group identified themselves as early majority (44.5%). Subjects who identified themselves as early adopter made up the second largest group (42.2%), followed by innovator (10.4%), late majority (2.3%), and laggard (.6%). These results are reported in Table 6 Table 6 Adopter Category of the Subjects n % Innovator 32 10.4 Early adopter 130 42.2 Early majority 137 44.5 Late majority 7 2.3 Laggards 2 .6 Nonresponses=15 N=308 The number of years subjects taught preschool-aged children ranged from 3 years or less to more than 16 years of experience. The modal group reported teaching 6 - 10 years (25%), followed by 11 - 15 years and more than 16 83 years (22% each), 3 years or less (16%) and 4 — 5 years (15%). These results are reported in Table 7. Table 7 NUmber of Years Subjects Reported Teaching Preschool (recoded) n 96 3 or less 53 16 4 - 5 49 15 6 — 10 79 25 11 - 15 70 22 More than 16 71 22 Nonresponses=l N=322 The number of years subjects reported working in Head Start ranged from 3 years or less to more than 16 years. The modal group had 3 years or less of experience (28%), followed by 6 — 10 years (27%). The two remaining categories each included 15% of the respondents each. These results are reported in Table 8. Table 8 NUmber of Years Subjects Reported Working in Head Start (recoded) n % 3 or less 89 28 4 - 5 50 15 6 — 10 87 27 ll — 15 48 15 More than 16 49 15 N=323 The number of years subjects reported working as actual teachers in Head Start classrooms ranged from 3 84 years or less to more than 16 years. The modal group had 3 years or less of experience (37%), followed by 6 — 10 years (27%). 
The 4 - 5 years group (17%) was next, followed by the 11 - 15 years group (10%). The group that worked more than 16 years was smallest with 9%. See Table 9. Table 9 NUmber of Years Subjects Reported Working as a Head Start Classroom Teacher (recoded) n % 3 or less 121 37 4 — 5 55 17 6 - 10 86 27 11 - 15 33 10 More than 16 28 9 N=323 A majority (69.7%) of the respondents reported that they teach children in mixed age groups; 25.9% worked mainly with 4 year olds; and, 4.1% teach children 3 years of age or younger. These results are reported in Table 10. Table 10 Age of Head Start Children in Classes Taught by the Subjects n % 3 or younger 13 4.1 4 83 25.9 5 l .3 Mixed age 223 69.7 Nonresponses=3 N=320 Subjects reported that the number of children in each classroom session ranged from less than 13 to more than 21. 85 A majority (81%) of the respondents reported that there were 16-20 children in each session. Sixteen percent said they had 13-15 children in their sessions. Only 2% of the teachers reported more than 21 children, while 1% reported they had fewer than 13 children per session. See Table 11. Table 11 NUmber of Head Start Children Per Session n % Less than 13 3 1 13 — 15 52 16 16 — 20 259 81 More than 21 7 2 Nonresponses=2 N=321 Subjects indicated that the length of the classroom day ranged from less than 3.5 hours to 6 hours or more. Slightly more than half (52.6%) of the respondents reported that their class is 3.5 — 5 hours long each day. Twenty eight percent reported the day was less than 3.5 hours. Class sessions of 6 or more hours were reported by 19.3% of the respondents. These results are tabulated in Table 12. Table 12 Length of the Head Start Classroom Day n % Less than 3.5 hours 90 28 3.5 - 5 hours 169 52.6 6 or more hours 62 19.3 Nonresponses=2 N=321 86 Subjects indicated that their Head Start classes met from 3 days or less to 6 days or more each week. The majority of respondents (83.3%) reported meeting with children 4 days a week, followed by 5 days (11.6%), 3 days or less (4.7%), and 6 days or more (.3%). These results are reported in Table 13. Table 13 NUmber of Days Per Week that Head Start Classes.Meet n % 3 days or less 15 4.7 4 days 265 83.3 5 days 37 11.6 6 or more 1 .3 Nonresponses=5 N=318 The majority of the subjects in the sample were 31 years of age or older (80%) and had an Associate or higher degree (70%). Respondents reported a household income greater than $35,000 (54%) as well as more than 5 years experience as a Head Start classroom teacher (46%). Subjects indicated that they taught 16 - 20 preschool children (81%) in mixed age groups (69.7%); Head Start classes met 3.5 — 5 hours daily (52.6%), 4 days each week (83.3%). 87 Question 1. What Computer Hardware Components and Software Are in Head Start Classrooms? A majority (87.6%) of subjects reported that they have computers to use with Head Start children. Computers were not available to 12.4% of the respondents. These results are reported in Table 14. Table 14 Availability of Computers Subjects Reported USing with Head Start Children n % Yes 282 87.6 No 40 12.4 Nonresponses=1 N=322 Head Start teachers reported that most of the computers (97%) available to children were located in the classroom. Other locations in which computers were available to children included the library/media center (3%), computer lab (2%), and Head Start office (1%). These results are reported in Table 15. 
Table 15
Location of Computers Subjects Can Use with Children
                         n      % of Cases
Classroom               267     97
Library/media center      9      3
Computer lab              6      2
Office                    4      1
Nonresponses=48; N=275

According to reports from the subjects, it is most common for teachers to have access to one computer in the classroom (67.9%). Some classrooms had two computers (25.6%) and an even smaller number (1.6%) had three. The number of computers in computer labs varied from 2 (.8%) and 3 (.7%) to 6 (.4%) and 8 (.4%). A small number of computers were reported to be in the library/media center (1.6%). Two subjects indicated that one computer was available to children in the Head Start office (.7%). These results are reported in Table 16.

Table 16
Number of Computers By Location
Location               Number of Computers     n      %
Classroom                       1             178    67.9
                                2              67    25.6
                                3               4     1.6
Computer lab                    2               2      .8
                                3               2      .7
                                6               1      .4
                                8               1      .4
Library/media center            1               4     1.6
                                2               1      .4
Office                          1               2      .7
Nonresponses=61; N=262

Respondents were asked to identify what hardware components were available on computers that they used with children. Subjects reported that at least one computer per classroom had a mouse (99.6%), keyboard (98%), and monitor (95%). Other available hardware components included a printer (83%), speakers (83%), and a CD-ROM drive (80%). The hardware components least likely to be available were Internet access (19%) and a touch screen (19%). Table 17 lists the potential hardware components and the percentage of respondents who had access to them.

Table 17
Hardware Components as Reported by Subjects
Item                n      % of Cases
Mouse              275    99.6
Keyboard           270    98
Monitor            263    95
Printer            230    83
Speakers           228    83
CD-ROM             220    80
Internet access     52    19
Touch screen        51    19
Nonresponses=47; N=276

Teachers were asked to report how many computers had all of the hardware components identified in Table 17. Nearly 57% reported that at least one computer had all the hardware components identified above; 19% said all computers were fully equipped, 16.6% said two computers had these components, 4.5% said some of the computers had these components, and 3% said most of the computers had the identified hardware components. Table 18 contains these results.

Table 18
Number of Computers with All Hardware Components
Item              n      %
One              144    56.9
Two               42    16.6
Some of them      12     4.5
Most of them       7     3.0
All               48    19.0
Computers not available=36; Nonresponses=34; N=253

Teachers reported that their computers worked properly most of the time (53.1%) or always (24.9%). Only .7% of the subjects said their computers never worked. These results are tabulated in Table 19.

Table 19
Frequency that Computers Work Properly
Item                 n      %
Never                2      .7
Hardly ever          9     3.3
Sometimes           49    18.0
Most of the time   145    53.1
Always              68    24.9
Computers not available=32; Nonresponses=18; N=273

Respondents marked their three favorite software programs to use with children. The software programs most frequently mentioned by the 250 subjects who answered this item were Jump Start Preschool (41%), Millie's Math House (36%), Sammy's Science House (35%), and Bailey's Book House (34%). Smaller numbers were reported for Kid Pix (19%), the Reader Rabbit series (18%), Just Grandma and Me (14%), and Arthur's Preschool (13%). Ten percent of the respondents indicated that they did not remember the names of the software programs they liked to use with children. Respondents listed 18 additional software programs in the comments section. Table 20 has a complete listing of software programs and the percentage of respondents who reported using them.
Table 20
Software Programs Teachers Most Like to Use with Children
Title                          n      %
Jump Start Preschool          102    41
Millie's Math House            91    36
Sammy's Science House          87    35
Bailey's Book House            85    34
Kid Pix                        47    19
Reader Rabbit series           44    18
Just Grandma and Me            35    14
Arthur's Preschool             33    13
Don't remember the names       24    10
Kidware                        20     8
Art Center                     18     7.2
Crayola Art Studio             17     6.8
Little Monster at School       16     6.4
Playroom                       15     6.0
Winnie the Pooh Preschool      14     5.6
Freddi Fish                     5     2.0
Nick Jr. Play Math              4     1.6
Kid Works Deluxe                4     1.6
Stickybear's Reading Room       4     1.6
Trudy's Time                    3     1.2
Hello Kitty Big Fun             2      .8
Foo Castle                      1      .4
Nonresponses=73; N=250

Respondents (n=244) identified the individuals or groups of people who chose children's software in their Head Start program. The education coordinator was most frequently cited (64%), followed by the classroom teacher (43%), Head Start director (29%), and computer specialist (13%). See Table 21 for a complete listing of individuals who choose children's software programs in Michigan Head Start programs.

Table 21
Who Chooses Children's Software
Individual/Group          n      % of Cases
Education coordinator    155    64
Classroom teacher        106    43
Head Start director       71    29
Computer specialist       31    13
Executive director        13     5
Parent Policy Council     11     4
Technology committee       9     3.7
Principal                  6     2.5
Librarian                  1      .4
Nonresponses=79; N=244

Question 2. How and to What Extent Do Teachers Use Computers as an Instructional Tool?

A strong majority of respondents (90.5%) reported that they use a computer; 9.5% do not. These results are reported in Table 22.

Table 22
Computer Use of the Subjects
        n      %
Yes    287    90.5
No      30     9.5
Nonresponses=6; N=317

Respondents were asked to describe to what extent they used computers with preschoolers. Ten percent said they used computers extensively, 17% said they did not use computers at all, and most reported that they used computers with preschoolers to some degree (73%). Table 23 summarizes the results of these categories of computer use by Head Start teachers.

Table 23
The Extent of Computer Use with Preschoolers by the Subjects
                   n      %
Use               233    72.6
Don't use          56    17.4
Use extensively    32    10
Nonresponses=2; N=321

Instructional support

A scale that ranged from "Not at all" (0) to "Daily" (5) was used to measure the frequency with which teachers used computers to support instruction. Teachers reported that they were most likely to use computers to make instructional materials (μ 1.98), to find resources for lesson plans and ideas for best teaching practices (μ 1.50), and to keep records about the children or classroom activities (μ 1.41). They were least likely to use computers to email parents or professionals (μ=.60). See Tables 24 - 27 for results regarding computer use for instructional support. Only those respondents who answered these items were included in the analyses.
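The scale means reported in Tables 24 - 27 appear to be simple weighted averages of the 0 - 5 response codes. The following short sketch is not part of the original analysis; it merely illustrates, under that assumption, how the Table 26 (record keeping) mean of 1.41 can be reproduced from its reported frequencies.

```python
# Hedged illustration (not the author's procedure): the scale means in
# Tables 24-27 appear to be weighted averages of the 0-5 response codes.
# Recomputing the Table 26 (record keeping) mean from its frequencies:
codes = [0, 1, 2, 3, 4, 5]           # "Not at all" ... "Daily"
counts = [172, 12, 40, 45, 24, 25]   # n per category, N = 318 (Table 26)

mean = sum(code * n for code, n in zip(codes, counts)) / sum(counts)
print(round(mean, 2))                # 1.41, matching the reported mean
```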
Table 24
Frequency of Computer Use to Make Instructional Materials
                     n      %
Not at all          85    26.8
Once a year         22     6.9
Once a month       100    31.5
Once a week         51    16.1
2-3 times a week    43    13.6
Daily               16     5.0
μ=1.98; Nonresponses=6; N=317

Table 25
Frequency of Computer Use to Find Resources
                     n      %
Not at all         140    44.2
Once a year         15     4.7
Once a month        72    22.7
Once a week         50    15.8
2-3 times a week    34    10.7
Daily                6     1.9
μ=1.50; Nonresponses=6; N=317

Table 26
Frequency of Computer Use to Keep Records
                     n      %
Not at all         172    54.1
Once a year         12     3.8
Once a month        40    12.6
Once a week         45    14.2
2-3 times a week    24     7.5
Daily               25     7.9
μ=1.41; Nonresponses=5; N=318

Table 27
Frequency of Computer Use to Email Parents and Professionals
                     n      %
Not at all         254    81
Once a year          6     2
Once a month        16     5
Once a week         19     6
2-3 times a week     7     2
Daily               13     4
μ=.60; Nonresponses=8; N=315

Instruction

A scale that ranged from "Not at all" (0) to "Daily" (4) was used to measure the frequency with which teachers used computers to select activities and teach children skills and concepts. Subjects reported that they most often used computers to teach children fine motor skills such as hand-eye coordination and keyboarding (μ 2.95), socio-emotional skills (μ 2.75), numeracy concepts and skills (μ 2.67), and literacy concepts and skills (μ 2.64). Teachers were least likely to choose software programs for children (μ 1.58), to teach English or other languages (μ 1.11), or to write lesson plans for the computer center (μ .92). See Tables 28 - 35 for results regarding computer use and instruction.

Table 28
Frequency of Computer Use for Teaching Fine Motor Skills
                     n      %
Not at all          62    19.3
Once a month         3      .9
Once a week         21     6.5
2-3 times a week    38    11.8
Daily              197    61.4
μ=2.95; Nonresponses=2; N=321

Table 29
Frequency of Computer Use for Teaching Socio-Emotional Skills
                     n      %
Not at all          74    23.1
Once a month        11     3.4
Once a week         19     5.9
2-3 times a week    33    10.3
Daily              184    57.3
μ=2.75; Nonresponses=2; N=321

Table 30
Frequency of Computer Use for Teaching Numeracy Skills
                     n      %
Not at all          71    22.1
Once a month        10     3.1
Once a week         30     9.3
2-3 times a week    54    16.8
Daily              156    48.6
μ=2.67; Nonresponses=2; N=321

Table 31
Frequency of Computer Use for Teaching Literacy Skills
                     n      %
Not at all          72    22.4
Once a month        11     3.4
Once a week         31     9.7
2-3 times a week    53    16.5
Daily              154    48.0
μ=2.64; Nonresponses=2; N=321

Table 32
Frequency That Teachers Set Time Limits for Children's Computer Use
                     n      %
Not at all          95    30.0
Once a month         2      .6
Once a week         16     5.0
2-3 times a week    28     8.8
Daily              176    55.5
μ=2.59; Nonresponses=6; N=317

Table 33
Frequency That Teachers Choose Software Programs for Children's Use
                     n      %
Not at all         161    50.6
Once a month        12     3.8
Once a week         35    11.0
2-3 times a week    19     6.0
Daily               91    28.6
μ=1.58; Nonresponses=5; N=318

Table 34
Frequency of Computer Use for Teaching English or Other Languages
                     n      %
Not at all         205    64.7
Once a month        15     4.7
Once a week         13     4.1
2-3 times a week    24     7.6
Daily               60    18.9
μ=1.11; Nonresponses=6; N=317

Table 35
Frequency that Teachers Write Lesson Plans for the Computer Center
                     n      %
Not at all         205    64.5
Once a month        20     6.3
Once a week         45    14.2
2-3 times a week     8     2.5
Daily               40    12.6
μ=.92; Nonresponses=5; N=318

Curriculum Integration

A scale that ranged from "Not at all" (0) to "Daily" (4) was used to measure the frequency with which classroom activities and materials reflected concepts covered in the computer programs subjects used with children. Subjects reported that classroom activities (μ 2.14) and materials (μ 2.05) matched the content of selected computer programs, on average, about once a week.
See Tables 36 and 37 for results regarding curriculum integration and computer use.

Table 36
Frequency that Classroom Activities Reflect Concepts in Computer Programs
                     n      %
Not at all         100    31.9
Once a month        24     7.7
Once a week         35    11.2
2-3 times a week    39    12.5
Daily              115    36.7
μ=2.14; Nonresponses=10; N=313

Table 37
Frequency that Classroom Materials Reflect Concepts in Computer Programs
                     n      %
Not at all         115    36.4
Once a month        21     6.6
Once a week         31     9.8
2-3 times a week    31     9.8
Daily              118    37.3
μ=2.05; Nonresponses=7; N=316

Question 3. How Do Head Start Teachers Learn About Computers?

The majority (93%) of the respondents reported that they were interested in learning about computers. A small number of the subjects (7%) indicated that they were not interested. See Table 38 for these results.

Table 38
Subjects' Interest in Learning about Computers
Variable    n      % of Cases
Yes        297    93
No          23     7
Nonresponses=3; N=320

Subjects indicated their reasons for learning about computers by checking all that applied. The most common reasons reported were to improve their skills (86%), to teach children to use computers (73.6%), to make their job easier (63.9%), and to communicate with others (55.1%). Being told by other people to use computers (3.7%) was the least common reason for their interest in learning about computers. Results regarding teacher reasons for learning about computers are reported in Table 39.

Table 39
Subjects' Reasons for Learning about Computers
                            n      % of Cases
Improve skills             255    86.1
Teach children to use      218    73.6
Makes job easier           189    63.9
Communicate with others    163    55.1
Program requires           102    34.5
Everyone uses them          81    27.4
Something new               61    20.6
Others said I should        11     3.7
Nonresponses=27; N=296

Respondents were asked to indicate how they had learned about computers by marking all items that applied. The most common response among the 288 subjects who answered was "messing around by myself" (75.3%), followed by learning from a friend or family member (42.7%), in high school or college (35.8%), and by watching others (31.9%). Teachers were least likely to learn about computers by reading the NAEYC position statement on technology, the Head Start Bulletin, or Computers in Head Start Classrooms (2.1% each). See Table 40 for a complete listing of how teachers learn about computers and the percentages.

Table 40
How Head Start Teachers Learn About Computers
Variable                            n      % of Cases
Messing around by myself           217    75.3
From friend or family member       123    42.7
In high school or college          103    35.8
Watching others                     92    31.9
My own children taught me           82    28.5
Another teacher                     80    27.8
At a Head Start workshop            61    21.2
At a workshop or seminar            39    13.5
Took another class                  30    10.4
The HS children taught me           30    10.4
Reading other books or journals     16     5.6
Reading Young Children              10     3.5
At a professional conference         9     3.1
Reading NAEYC position stmnt.        6     2.1
Reading Head Start Bulletin          6     2.1
Reading Computers in HS Clsms.       6     2.1
Nonresponses=35; N=288

Question 4. What Contextual Variables Affect Computer Use by Head Start Teachers?

Ho: There are no contextual variables that affect computer use by Head Start teachers.
Ha: There is at least one contextual variable that affects computer use by Head Start teachers.

In order to answer this question, subjects chose one of three responses that represented the effect of each contextual variable on their computer use. These groups were based upon whether the subjects rated a variable as making their use of computers with preschoolers harder or easier, or as having a neutral effect.
The researcher analyzed the resulting three groups with one-way ANOVA to test for differences between the group means on the dependent variable, computer use. Subjects responded to 14 contextual variables: the number of computers, type of software, number of software programs, time in the schedule, classroom space, number of classroom electrical outlets, training on computer operation, training on using computers with preschoolers, computer technician, Head Start requirements, program philosophy, curriculum guidelines, interaction with other Head Start teachers, and administrators. Participant responses indicated significant relationships for 5 of the 14 contextual variables. Listed in descending order, these variables are talking with other Head Start teachers, type of software, curriculum guidelines, training on computer operation, and program philosophy. Significant F values (p<.05) indicate that there is a difference between the means of the harder, easier, and neutral groups for these variables that cannot be explained by chance. This suggests that the five contextual variables indicated above have a significant relationship with computer use by Head Start teachers. The null hypothesis is therefore rejected. The other contextual factors (the number of computers, software programs, or classroom electrical outlets, the amount of time in the daily schedule, classroom space, training on using computers with young children, computer technician, administrators, and Head Start requirements) had no significant relationship with computer use in this sample. The analyses include only the subjects who answered the related item as well as all of the items composing the computer use scale; the "n" therefore differs for each variable. See Tables 41 - 54 for results regarding the effect of contextual variables on computer use.
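The analyses appear to have been run with a commercial statistics package (Norusis, 1997, an SPSS guide, is cited in Chapter 5). As a minimal sketch of the one-way ANOVA just described, the same test can be illustrated as follows; the group labels come from the text, but the data values and the use of Python/SciPy are illustrative assumptions, not the author's procedure.

```python
# Minimal sketch of the one-way ANOVA described above, using hypothetical
# computer use scale scores for teachers who rated a contextual variable
# (e.g., "talking with other Head Start teachers") as harder, neutral, or
# easier.  Data values are invented for illustration only.
from scipy import stats

harder = [9, 4, 15, 2, 11, 7, 14]
neutral = [20, 25, 12, 18, 30, 22, 17, 19]
easier = [21, 28, 15, 24, 31, 19, 26, 18]

f_stat, p_value = stats.f_oneway(harder, neutral, easier)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
# A p value below .05 would indicate that at least one group mean differs
# from the others, which is the decision rule applied in Tables 41-54.
```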
Table 41
Contextual Variable - The Effect of Talking with Other Head Start Teachers on Computer Use
           n      μ       SD       MS        F
Harder     7     9.00    6.63    486.98    5.57**
Neutral  105    20.32   10.12
Easier   177    21.01    8.94
N=289; *p<.05, **p<.01

Table 42
Contextual Variable - The Effect of the Type of Software on Computer Use
           n      μ       SD       MS        F
Harder    44    16.34   10.21    480.29    5.56**
Neutral   56    20.18   10.58
Easier   187    21.52    8.64
N=287; *p<.05, **p<.01

Table 43
Contextual Variable - The Effect of Curriculum Guidelines on Computer Use
           n      μ       SD       MS        F
Harder    22    14.59    9.64    444.04    5.07**
Neutral  158    20.77    9.54
Easier   105    21.52    9.01
N=285; *p<.05, **p<.01

Table 44
Contextual Variable - The Effect of Training on How to Operate Computers on Computer Use
           n      μ       SD       MS        F
Harder    29    16.41   10.95    332.82    3.73*
Neutral   66    19.47    9.70
Easier   194    21.27    9.11
N=289; *p<.05, **p<.01

Table 45
Contextual Variable - The Effect of Program Philosophy on Computer Use
           n      μ       SD       MS        F
Harder    17    16.59    9.95    282.13    3.17*
Neutral  171    20.05    9.70
Easier    96    22.18    8.85
N=284; *p<.05, **p<.01

Table 46
Contextual Variable - The Effect of the Number of Computers on Computer Use
           n      μ       SD       MS        F
Harder    78    18.33   10.32    245.41    2.78
Neutral   98    21.17    8.95
Easier   111    21.35    9.09
N=287; *p<.05, **p<.01

Table 47
Contextual Variable - The Effect of the Number of Software Programs on Computer Use
           n      μ       SD       MS        F
Harder    56    18.64   10.20    233.76    2.66
Neutral   93    19.90    9.86
Easier   137    21.83    8.65
N=286; *p<.05, **p<.01

Table 48
Contextual Variable - The Effect of the Amount of Time in the Daily Schedule on Computer Use
           n      μ       SD       MS        F
Harder    73    18.45    9.31    243.60    2.77
Neutral   96    20.53    9.47
Easier   117    21.74    9.33
N=286; *p<.05, **p<.01

Table 49
Contextual Variable - The Effect of the Amount of Classroom Space on Computer Use
           n      μ       SD       MS        F
Harder    74    18.50   10.74    237.07    2.67
Neutral  106    20.46    8.51
Easier   109    21.78    9.33
N=289; *p<.05, **p<.01

Table 50
Contextual Variable - The Effect of the Number of Electrical Outlets in the Classroom on Computer Use
           n      μ       SD       MS        F
Harder    98    19.02    9.82    259.60    2.90
Neutral   80    20.20    8.78
Easier   107    22.17    9.62
N=285; *p<.05, **p<.01

Table 51
Contextual Variable - The Effect of Training on Using Computers with Young Children
           n      μ       SD       MS        F
Harder    21    16.00   11.88    241.16    2.70
Neutral   61    20.03    9.42
Easier   204    20.98    9.19
N=286; *p<.05, **p<.01

Table 52
Contextual Variable - The Effect of a Computer Technician on Computer Use
           n      μ       SD       MS        F
Harder    34    17.97   10.17    155.58    1.76
Neutral  103    20.44   10.28
Easier   148    21.30    8.52
N=285; *p<.05, **p<.01

Table 53
Contextual Variable - The Effect of Meeting Head Start Requirements on Computer Use
           n      μ       SD       MS        F
Harder    26    18.04    9.61     84.15     .94
Neutral  174    20.76    9.26
Easier    85    20.53    9.82
N=285; *p<.05, **p<.01

Table 54
Contextual Variable - The Effect of Administrators on Computer Use
           n      μ       SD       MS        F
Harder    27    17.85   10.45    141.92    1.60
Neutral  154    21.28    9.16
Easier   104    20.31    9.53
N=285; *p<.05, **p<.01

Post hoc pairwise procedures were conducted with the Scheffé method for variables with equal variances assumed across groups and the Dunnett T3 method for unequal variances, as determined by Levene's test of equality of error variances. These procedures produced significant mean difference scores (p<.05) between the "harder" and "easier" groups for talking with other Head Start teachers, type of software, curriculum guidelines, training on how to operate a computer, and program philosophy.
Significant mean differences also resulted between the "harder" and "neutral" groups for curriculum guidelines and talking with other Head Start teachers. The analyses detected no differences between subjects in the "easier" and "neutral" groups for any factor. Table 55 presents the results of these post hoc analyses.

Table 55
Mean Difference Scores of Contextual Variables Identified as Significant with One-way ANOVA
Variable                          Harder|Neutral   Harder|Easier   Easier|Neutral
Type of software                      -3.84            -5.18*           1.34
Training: operate computer            -3.06            -4.86*           1.80
Program philosophy                    -3.46            -5.59            2.13
Curriculum guidelines                 -6.17*           -6.93*            .76
Talking with other HS teachers       -11.32*          -12.01*            .68
*p<.05

Question 5. What Personal Variables of Head Start Teachers Affect Computer Use?

Ho: There are no personal variables of Head Start teachers that affect computer use.
Ha: Personal variables of Head Start teachers affect computer use.

The researcher identified nine personal variables that might affect computer use, as suggested by the literature. Respondents answered categorical items on the HSTCU Profile for six of these variables: their age, education level, household income, preschool teaching experience, Head Start classroom teaching experience, and adopter type. Subjects also indicated whether three other variables (comfort level, knowledge, and previous experience with computers) made their use of computers with preschoolers harder or easier, or had a neutral effect. The researcher analyzed the resulting groups of subjects with one-way ANOVA. The analysis included only the subjects who answered the particular item of interest as well as all of the computer use scale items; the "n" therefore differs for each variable. The analyses indicated that income, as well as comfort level with, knowledge about, and previous experience with computers, were significant factors related to Head Start teachers' use of computers (p<.05). The null hypothesis is rejected. Factors that had no significant effect in this sample were age, education level, teaching experience, and adopter type. See Tables 56 - 64 for results regarding the effect of personal variables of Head Start teachers on computer use.
Table 56
Personal Variable of Head Start Teachers - The Effect of Teacher Knowledge about Computers on Computer Use
           n      μ       SD        MS         F
Harder    66    14.97    9.27    1407.17    17.18**
Neutral   48    19.12    8.76
Easier   184    22.48    9.05
N=298; *p<.05, **p<.01

Table 57
Personal Variable of Head Start Teachers - The Effect of Teacher Previous Experience with Computers on Computer Use
           n      μ       SD        MS         F
Harder    48    14.60    9.60    1149.77    13.84**
Neutral   52    19.17    8.69
Easier   193    22.18    9.10
N=293; *p<.05, **p<.01

Table 58
Personal Variable of Head Start Teachers - The Effect of Teacher Comfort Level with Computers on Computer Use
           n      μ       SD        MS         F
Harder    46    15.15    9.20    1078.56    12.75**
Neutral   69    18.41    9.53
Easier   181    22.25    9.07
N=296; *p<.05, **p<.01

Table 59
Personal Variable of Head Start Teachers - The Effect of Household Income on Computer Use
                      n      μ       SD       MS        F
Less than $20,000    26    23.23   10.20    543.92    6.26**
$21,000 - $35,000   101    17.50    9.82
$36,000 - $49,000    45    24.40    7.99
$50,000 - $74,000    65    18.62    9.37
More than $75,000    35    22.83    8.60
N=272; *p<.05, **p<.01

Table 60
Personal Variable of Head Start Teachers - The Effect of Teacher Age on Computer Use
                       n      μ       SD       MS        F
22 - 30 years         60    19.73   10.22     95.85    1.03
31 - 40               76    18.78   10.30
41 - 50              113    20.49    9.64
Older than 51 years   38    21.95    6.86
N=287; *p<.05, **p<.01

Table 61
Personal Variable of Head Start Teachers - The Effect of Education Level on Computer Use
                      n      μ       SD       MS        F
High school          15    20.53    8.98      7.81     .08
C.D.A.               76    20.36   10.34
Associate degree     73    20.37   10.68
Bachelor degree     106    19.75    8.65
Graduate degree      25    20.60    8.91
N=295; *p<.05, **p<.01

Table 62
Personal Variable of Head Start Teachers - The Effect of Preschool Teaching Experience on Computer Use
                      n      μ       SD       MS        F
3 years or less      50    19.86    9.69     56.20     .61
4 - 5 years          47    21.19    9.18
6 - 10 years         74    19.39   10.29
11 - 15 years        65    21.31    8.81
More than 16 years   64    19.36    9.61
N=300; *p<.05, **p<.01

Table 63
Personal Variable of Head Start Teachers - The Effect of Head Start Classroom Teacher Experience on Computer Use
                      n      μ       SD       MS        F
3 years or less     114    20.21   10.05     63.09     .68
4 - 5 years          52    19.63    9.08
6 - 10 years         78    20.72    9.73
11 - 15 years        32    21.69    7.72
More than 16 years   25    17.80   10.68
N=301; *p<.05, **p<.01

Table 64
Personal Variable of Head Start Teachers - The Effect of Adopter Category on Computer Use
                  n      μ       SD       MS        F
Innovator        28    21.18    9.50    130.96    1.44
Early adopter   123    21.29    9.30
Early majority  131    18.92    9.62
Late majority     7    18.43   12.80
Laggards          2    12.50    3.54
N=291; *p<.05, **p<.01

Significant F values indicate that there is a difference between subject groups that cannot be explained by chance. The previous analyses, however, did not show which subject groups were different. Levene's test of equality of error variances determined that the groups within each of these variables had equal variances, so post hoc pairwise procedures were conducted with the Scheffé method. This procedure identified significant differences between subjects in the $21,000 - $35,000 and $36,000 - $49,000 categories (p<.01), as well as between the $36,000 - $49,000 and $50,000 - $74,000 categories (p<.05), for the income variable. There were no other differences among income groups. These results are tabulated in Table 65.
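As with the earlier ANOVA sketch, the following is an illustrative reconstruction of the post hoc sequence just described: Levene's test for equality of variances, followed by Scheffé pairwise comparisons when equal variances can be assumed. The data values are hypothetical, and the hand-coded Scheffé criterion in Python stands in for whatever package routine the author actually used.

```python
# Hypothetical sketch of the post hoc sequence described above: Levene's test,
# then Scheffe pairwise comparisons.  A small Levene p value would instead point
# to the Dunnett T3 procedure for unequal variances.
from itertools import combinations
import numpy as np
from scipy import stats

groups = {
    "harder": np.array([9.0, 4, 15, 2, 11, 7, 14]),
    "neutral": np.array([20.0, 25, 12, 18, 30, 22, 17, 19]),
    "easier": np.array([21.0, 28, 15, 24, 31, 19, 26, 18]),
}

w_stat, p_levene = stats.levene(*groups.values())
print(f"Levene W = {w_stat:.2f}, p = {p_levene:.3f}")

# Pooled within-group (error) mean square and the Scheffe critical value.
k = len(groups)
n_total = sum(len(g) for g in groups.values())
mse = sum(((g - g.mean()) ** 2).sum() for g in groups.values()) / (n_total - k)
f_crit = stats.f.ppf(0.95, k - 1, n_total - k)

for (name_i, g_i), (name_j, g_j) in combinations(groups.items(), 2):
    diff = g_i.mean() - g_j.mean()
    margin = np.sqrt((k - 1) * f_crit * mse * (1 / len(g_i) + 1 / len(g_j)))
    verdict = "significant" if abs(diff) > margin else "not significant"
    print(f"{name_i} vs {name_j}: mean difference = {diff:.2f} ({verdict} at p < .05)")
```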
Table 65
Income Variable: Mean Difference Scores
(I) Household income    (J) Household income    Mean Difference (I-J)
Less than $20,000       $21,000 - $35,000              5.74
                        $36,000 - $49,000             -1.17
                        $50,000 - $74,000              4.62
                        More than $75,000               .40
$21,000 - $35,000       Less than $20,000             -5.74
                        $36,000 - $49,000             -6.90**
                        $50,000 - $74,000             -1.12
                        More than $75,000             -5.33
$36,000 - $49,000       Less than $20,000              1.17
                        $21,000 - $35,000              6.90**
                        $50,000 - $74,000              5.78*
                        More than $75,000              1.57
$50,000 - $74,000       Less than $20,000             -4.62
                        $21,000 - $35,000              1.12
                        $36,000 - $49,000             -5.78*
                        More than $75,000             -4.21
More than $75,000       Less than $20,000              -.40
                        $21,000 - $35,000              5.33
                        $36,000 - $49,000             -1.57
                        $50,000 - $74,000              4.21
*p<.05, **p<.01

Post hoc procedures also indicated that significant differences (p<.01) occurred between subjects in the harder and easier groups for teacher comfort level with computers, teacher knowledge of computers, and previous experience with computers. Subjects in the easier and neutral groups were significantly different for teacher comfort level and teacher knowledge of computers (p<.05). These results are presented in Table 66.

Table 66
Personal Variables of Head Start Teachers Identified as Significant: Mean Difference Scores
Variable                               Harder|Neutral   Harder|Easier   Easier|Neutral
My comfort level with computers            -3.25            -7.10**          3.95*
My knowledge of computers                  -4.16            -7.51**          3.35
My previous experience with computers      -4.57*           -7.57**          3.00
*p<.05, **p<.01

Summary

This chapter first described the demographic characteristics of the sample. The results for the five research questions were presented next. Both hypotheses regarding the effects of contextual or personal variables of Head Start teachers upon computer use were supported. Chapter 5 discusses the results, possible implications for Head Start policy makers and administrators, and suggestions for future research.

CHAPTER FIVE
DISCUSSION

This study investigated computer use by Head Start teachers in support of their teaching responsibilities. As a preliminary step, the researcher developed an instrument, the HSTCU Profile. Five research questions and two corresponding hypotheses guided the study. Results are summarized and discussed for each research question and hypothesis. Implications for Head Start policy makers and administrators are explored. Implications for future research and personal observations are also presented.

Question 1. What computer hardware components and software are in Head Start classrooms?

According to the data, computers for instructional use are generally available to Head Start teachers as they work with young children. Such computers are most frequently located in the classroom. This finding complements Bickel's (1996) study reporting that preschool programs generally have computers (Landerholm, 1995, Clements & Swaminathan, 1995, Fite, 1993). This means Head Start programs are similar to other preschool settings in terms of technology. However, subjects who answered the survey may be people who support computer use with children. They could be more likely to have a classroom computer than nonrespondents, who may not have a computer or may not view computers as instructional tools.

Hardware components

Eight of ten subjects with computers reported that their computers had these components: mouse, keyboard, monitor, printer, speakers, and CD-ROM drive. Such components represent common hardware configurations of many computer systems. Subjects were less likely to have Internet access or a touch screen.
One may speculate that teachers who did not have particular hardware components might have older computer systems or a brand of computer that is incompatible with certain components, e.g. Internet access or touch screen (Bilton, 1996, Wood, Willoughby, & Specht, 1998). Additional contextual factors outside of the computer system itself might also influence which hardware components are present. For example, inadequate wiring in the classroom or lack of funds to support Internet services could determine whether the Head Start program provides Internet access. This study did not examine these particular issues. Only 1 out of 5 respondents reported that most or all of their computers had the hardware components identified above. Yet, more than half of them (56.9%) said that at 117 least one computer to which they had access had these parts. This appears to be a contradictory finding with the number of computers that teachers reported. The researcher believes that this finding could be the result of a poorly worded item on the HSTCU Profile. If a subject had one computer, his or her correct response to the question, “How many computers that you use with children have all of the parts checked in Question 43?,” could be ‘one” or ‘all”. In the same fashion, subjects who had two computers could possibly (and correctly) answer either “two” or ‘all”. These circumstances make the findings questionable and unclear. Operating status According to reports from the subjects, the computers that Head Start teachers use for instructional purposes usually are operating correctly. Three out of four subjects reported that their computers worked properly most of the time or always. Data indicates that both teachers and children in Head Start programs have access to computers that work properly. Readers are reminded, however, that the respondents might be individuals who are computer advocates. For this reason, teachers could overlook or understate operational difficulties with their computer. This finding could also be related to the age of the subjects’ computers or the 118 quality of computer maintenance they experience (Bilton, 1996, Cory, 1983). Newer computers are less likely to have operational difficulties. Also, subjects who have their computer repaired quickly may View this factor more favorably than other subjects who must wait a long period of time. The same factors could also influence the 25% of the subjects who reported that their computer worked sometimes, hardly ever, or never. If interested, future researchers could obtain data regarding what hardware components were not working properly or the amount of time they had not worked correctly. Software programs Respondents reported using a variety of software programs. The majority did not choose one particular title. Subjects listed 18 additional software titles beyond the 21 choices listed in the HSTCU Profile. Ten percent of the subjects reported not remembering the program names they used. The variety of responses indicates that Head Start teachers have access to a wide array of computer programs. These outcomes give some sense of the popularity of certain children's software programs but provides us no information related to program quality (Edyburn & Lartz, 1986). Just because a software program is used frequently, does not mean that it is appropriate for young children (Clements, 119 1994, Haugland & Shade, 1990). The findings of this study do not address the issue of quality. 
It might also be the case that the programs teachers listed are the only ones available to them. Future investigators could examine why Head Start teachers used or preferred particular software programs. Software selection Subjects reported that the education coordinator and classroom teacher most commonly made software choices. This finding is congruent with the common Head Start practice of education coordinators functioning in partnership with teachers as the leader of the classroom. The Head Start director also chose software programs, according to 30% of the subjects. Directors generally make primary fiscal decisions within Head Start program. Computers and software would fall under this category. Question 2. How and to what extent do Head Start teachers use computers as an instructional tool? An overwhelming majority of subjects (90.5%) reported that they use a computer. When the researcher talked with each education coordinator, most confirmed that all classroom teachers had computers and used them with children. A small number of respondents (9.5%) reported 120 that they did not use a computer. This means that most Head Start teachers use computers in some fashion. Computer use with preschoolers When asked to categorize their computer use with preschoolers in particular, 10% of the subjects used computers extensively. A majority of participants in the study (73%) indicated that they used computers with children to some extent, 17% did not use them at all. Some teachers reported that they did not use computers with preschoolers. However, among the same subjects, many marked several HSTCU Profile items about teaching children various concepts with the computer. One could consider that despite instructions to the contrary, perhaps these subjects actually indicated how other adults in the classroom (rather than they themselves) used computers with children. Instructional support Participants had the opportunity to answer 4 items about computer use for instructional support regardless of whether they used computers with preschoolers or not. Subjects reported using computers to make instructional materials (65%), find resources (50%) or keep records (40%) at least once a month. However, most subjects (80%) said they did not email parents or other professionals. Thus, teachers seem 121 to be using computers for practical day to day classroom support but not to communicate with others. One reason subjects might not use email may be because they do not have Internet access or hardware capability to support this use. A small number of subjects' computers (19%) had Internet access. Instruction Many subjects reported using computers to teach children fine motor skills (73%), socio-emotional skills (68%), numeracy (64%) and literacy skills and concepts (64%) nearly 2 - 3 times each week. Teachers were less likely to direct children to use particular software programs (35%), teach English or other languages (26%) or write lesson plans for the computer center (15%) during the same time period. One way to explain why fine motor skills were cited most frequently by teachers is that in order to make something happen on the computer screen, children must use input devices such as a mouse or keyboard. These tools require them to coordinate movements of their hands and fingers, a fine motor activity. 
In addition, one must consider that during their general interaction with young children, Head Start teachers frequently teach socio-emotional skills such as cooperation and turn-taking, regardless of the curricular content area. The finding about socio-emotional skills could be the result of this primary interchange between Head Start teachers and children as teachers support children's learning with the computer. Another way to interpret these findings relates to the content of the software programs. Some software may require that the Head Start teacher or a more competent peer help children who cannot read or easily follow the directions of the program, leading to increased social interaction.

Head Start teachers are mandated to teach literacy and numeracy skills. This requirement might explain why subjects frequently teach literacy and numeracy skills via computer. Children's software programs often address literacy and numeracy skills and concepts, another interpretation for this finding. Future research could examine this issue more completely.

Two thirds of Head Start teachers reported setting time limits for children's computer use; one third did not. One way to explain this finding is that teachers are promoting turn-taking at the computer center. Another perspective is that Head Start teachers limit children's computer time so those children may participate in other classroom activities. The Head Start teachers who did not set time limits could view the computer center as just one of several classroom activities from which children may choose whenever they wish. This perspective complements the Head Start pilot study about classroom computers that recommended integration of the computer center with other classroom learning centers (MOBIUS, 1990). Further investigation could clarify this finding.

According to the data, Head Start teachers are divided regarding how often they select which software programs children use during the classroom day. Half of them never choose programs; the other teachers choose from once a month to daily. One way to explain this finding is that teachers who never select programs allow children to decide which programs they prefer. Another perspective is that this HSTCU Profile item was not clear to respondents. They might have interpreted it as software choices made during the classroom day or as selection of software for purchase.

Nearly two thirds of Head Start teachers reported that they never use the computer to teach English or other languages. However, nearly 20% of subjects indicated that they did teach English or other languages with their computer on a daily basis. Teachers who do not use computers for this purpose might not have software programs that teach these skills. Another possibility could be that many Head Start teachers do not view the computer as an appropriate tool for teaching any language. One might also assume that Head Start teachers who teach English or other languages not only have the necessary software but also children who need instruction in this area.

Approximately 65% of Head Start teachers reported that they never write lesson plans for the computer center. About one out of every four teachers, however, does write computer lesson plans once a week or more frequently. This means that many teachers did not view the computer center as requiring lesson planning, but some saw this practice as necessary. Perhaps those who do not write plans view the computer center as different from other classroom learning centers.
As an independent activity, complete with ready- made programming, they might believe that lesson planning would be redundant. In contrast, teachers who write lesson plans for the computer center might be more likely to plan complementary ‘hands-on” activities for children in order to integrate it with other learning centers (MOBIUS, 1990). Curriculum integration As reported in Chapter 4, many Head Start teachers indicated that their classroom activities (60%) and materials (57%) reflected concepts covered in the computer programs they used with children, once a week or more frequently. The data also indicated that nearly 1 out of 3 Head Start teachers said their 125 classroom activities and materials never reflected computer program concepts. Such findings indicate that Head Start teachers may have different perspectives about the role of the computer center in their classroom. Teachers who reported that their classroom materials and activities reflected computer programs might view these programs as an additional classroom teaching tool. Or these teachers may have integrated computer programs with classroom themes (MOBIUS, 1990). Head Start teachers who reported not having classroom materials and activities related to the computer may view computers as independent of other classroom activities, have insufficient classroom resources, or lack strategies for incorporating the computer into other classroom areas. Question 3. How do Head Start teachers learn about computers? According to the data, almost all (93%) Head Start teachers were interested in learning about computers. Although the researcher investigated the reasons behind that interest, little is known about those subjects who indicated they were not interested. One could speculate that teachers who were not interested tend to be cautious and skeptical when trying something they perceive as new or different or that they did not consider computers as 126 personally advantageous (Rogers, 1995). Furthermore, they might not have opportunities for experimenting with computers so that they are ‘not on the bandwagon” (Cory, 1983). Future research efforts could examine more thoroughly why some teachers are not interested in learning about computers. Reasons Head Start teachers learned about computers In Chapter 4, it was reported that the primary reasons Head Start teachers learned about computers were to improve their skills, teach children how to use computers and to make teaching easier. It could be suggested that these Head Start teachers perceived both personal and professional benefits from learning about computers. On the other hand, perhaps subjects generally had positive attitudes towards computers and are exploring and adapting their use as an instructional tool as a result of their favorable opinions (Cory, 1983, Moersch, 1995, Coughlin & Lemke, 1999). Subjects reported that they were least likely to learn about computers ‘because others said I should”. Telling Head Start teachers they must learn about computers does not appear to motivate them to learn computer skills. The finding could imply that administrators who mandate computer use by Head Start teachers are less likely to be 127 successful and more likely to have unenthusiastic participants. How Head Start teachers learned about computers Three out of four teachers said they learned about computers by ‘messing around by myself”. Thus, a majority of Head Start teachers learned computer skills through independent experimentation. 
This finding supports Landerholm’s (1995) finding that most preschool and kindergarten teachers learn to use computers independently. Although this is a significant finding in the current study, specific information regarding the ‘messing around” period is not available. The next most frequently reported strategies for learning about computers were those involving interactions with other people. Subjects reported learning about computers from friends (42.7%), by watching others (31.9%), from interacting with their own children (28.5%), or from interacting with another teacher (27.8%). This would indicate that human interaction is an important element in learning computer skills (Rogers, 1995, Hall & Loucks, 1977, Rothman, Erlich, and Teresa, 1981). Many subjects (35.8%) reported learning about computers in high school or college. Some participants learned about computers from a Head Start workshop (21.2%), 128 a workshop or seminar not sponsored by Head Start (13.5), took another class (10.4%), or attended a professional conference (3.1%). Clearly, Head Start teachers learn about computers through different types of education and training. Although this is an important finding, the reader should note that it does not reflect the availability, content, quality, or length of these formal or informal sessions. Less than 5% of teachers reported reading about computers as a way to learn about them, despite free or low cost publications on the subject (NAEYC, 1996, MOBIUS, 1990, Head Start Bureau, 2000). Possibly these teachers did not have these publications available or did not find them useful. Another interpretation is that computers could be an activity teachers must learn about through action rather than by reading. Question 4. What contextual variables affect computer use by Head Start teachers? Ha: There is at least one contextual variable that affects computer use by Head Start teachers. There was statistical support for this hypothesis. Five contextual variables were related to computer use by the Head Start teachers in this study. In descending order, these variables were talking with other Head Start 129 teachers, type of software, curriculum guidelines, training about computer operation, and program philosophy. As reported in Chapter 4, the results from post hoc procedures further indicated if particular subject groups (easier, harder, neutral) within each of the variables noted above were likely to have similar computer use scores. This means that Head Start teachers who rated talking with other Head Start teachers, type of software, curriculum guidelines, and training on computer operation as ‘easier” are more likely to have higher computer use scores than those subjects who rated these items as ‘harder”. Similarly, Head Start teachers who found that curriculum guidelines and talking with their teaching colleagues made computer use ‘harder”, were more likely to have lower scores than those subjects who rated these variables as ‘neutral.” Other researchers report that similar variables support computer use. Sharing computer strategies helps preschool teachers learn about computers (Bilton, 1996). Rogers (1995) suggests that talking with colleagues is a communication behavior that fosters the use of something new. Curriculum guidelines and the type of software can influence how teachers use computers with young children (Landerholm, 1995, Haugland, 1997). 
Teachers must spend 130 time learning how to operate computers before they can use them with other teaching activities (Fite, 1993). Another interpretation of these findings could be that they were the result of the ambiguity of particular HSTCU Profile items. For example, when subjects decided how ‘training: how to operate computers” affected their computer use, they could not indicate whether they were addressing the availability, content, or quality of such training. Unequal numbers of subjects in several categories could also skew the statistical process. The post hoc methods presume that each subject group is no more than three times larger than another. When this occurs, the statistical computer package creates estimates as if the groups were similar sizes (Norusis, 1997). Although these findings have statistical significance, they may reflect a moderate effect of the variables. Subjects reported that the nine other contextual factors were not significantly related to their computer use. Although the literature identified time, classroom space, the number of computers or software programs, training on using computers with young children, computer technicians, administrators, and program requirements as factors influencing computer use, the data did not confirm 131 the influence of these variables. One might speculate that the subjects in this study did not perceive these factors as important for their use of computers with young children. The data could also reflect the ambiguity of particular HSTCU Profile items. For example, a subject was unclear whether the ‘computer technician” item indicated the availability, quality, or timeliness of a computer technician. Question 5. What personal variables of Head Start teachers affect computer use? Ha: There is at least one personal variable of Head Start teachers that affects computer use. There was statistical support for this hypothesis. Four personal variables of Head Start teachers related to the computer use of the subjects in this study. These included income, as well as teacher comfort level with, knowledge about, and previous experiences with computers. These findings mirror previous research. Wood, Willoughby, & Spect (1998) as well as Edyburn & Lartz (1986) reported that computer use by early childhood teachers is influenced by their comfort level, knowledge, and previous experiences with computers. This assertion also seems valid for Michigan Head Start teachers. 132 Teachers who rated their comfort level, knowledge of, and previous experiences with computers as ‘easier” are significantly more likely than those who rated them as ‘harder” to have a greater degree of computer use. One could also speculate that if a person feels comfortable, and has knowledge and previous experience with an activity, he or she is likely to be a more engaged participant than someone who does not have these characteristics. The literature (NTIA, 1999, NTIA 1998, Rogers, 1995) suggests that individuals are more likely to use computers as their income increases. The data in this study, however, did not support this perspective. Those subjects in the lowest income group have higher average computer use scores than other income groups. Unequal numbers of subjects within different income groups could have influenced these differences. One could also speculate that participants in the lower income groups may have learned about computers in high school or college. 
The incomes reported by subjects in the higher groups could also primarily reflect the earnings of another household member. To better understand income variations, researchers should examine this issue more closely in the future.

Thus, personal variables of Head Start teachers relate to computer use. Three variables (comfort level with, knowledge about, and previous experience with computers) compose the Head Start teacher's background with computers. The income variable is a socio-economic characteristic of the individual teacher. Teachers reported that other personal variables (age, preschool and Head Start teaching experience, adopter category) did not have a significant relationship with their computer use.

The data indicate that the Head Start teachers in this sample have higher education and household income levels than others commonly perceive. Rogers (1995) suggests that individuals with these characteristics are more likely to adopt an innovation. Yet the data do not support this theory. The majority (87%) of subjects identified themselves as either early adopters or early majority, people generally apt to implement new practices before many others. Future researchers could examine in more depth why these differences occur.

Conclusions

Based upon the data, Head Start teachers have computers for instructional use with children and use these computers for various instructional purposes. Teachers reported learning about computers primarily through experimentation and interaction with other people. A majority of Head Start teachers use computers in some fashion with preschoolers. Nearly all reported an interest in learning about computers. One might consider that individual Head Start teachers are moving through the process of adopting the computer as an instructional innovation. Although theoretical perspectives vary, one could speculate that most Michigan Head Start teachers are in the stages reflecting implementation or adaptation. Head Start teachers also reported that particular contextual and personal variables significantly relate to making their computer use easier. Increasing the likelihood or occurrence of these factors might therefore support the computer use of Head Start teachers.

Limitations

Potential limitations were previously described in Chapter Three. Instrumentation became a threat to internal validity due to ambiguous items that arose despite pilot testing and review by experts. The researcher minimized the additional limitation of non-response bias by obtaining missing data from several subjects who provided contact information. Other subjects did not provide this information; their data remained incomplete. It is unknown whether education coordinators accurately followed the sampling directions. If they did not, sampling error could influence these findings. Finally, it is unknown whether the practices reported by Head Start teachers are those that actually occur in their classrooms.

Implications for Head Start Policy Makers

This section lists suggestions for national Head Start policy makers about the use of computers by Head Start teachers. These recommendations are based upon the findings of this study. They are listed in order of descending importance.

1. Develop self-instructional training materials that support independent "messing around" with computers for individual classroom teachers.
A majority of subjects reported learning about computers by "messing around by myself." Enhance this experiential approach through the provision of self-study materials that promote exploration and experimentation. For example, one lesson could teach teachers how to create parent newsletters with the computer. Another lesson might help teachers record child outcomes data on a spreadsheet.

2. Provide fiscal resources for operational support of classroom computers, replacement of outdated hardware components, additional hardware components, and Internet installation and access. The findings indicated that some computers do not consistently work properly or that they have inadequate hardware components. Eighty percent of teachers reported that their computer did not have Internet access. One could speculate that some Head Start teachers might not use the Internet and email because computers or classrooms are not equipped for this function.

3. Offer recommendations regarding criteria for the selection of children's software. Subjects reported using a variety of software programs; however, the quality of these programs is unknown. Although teachers reported teaching literacy and numeracy skills and concepts with computers, the programs they chose may or may not be congruent with developmentally appropriate practices.

4. Provide recommendations about how computers can and should be integrated into classroom activities and materials as another learning center. One out of three Head Start teachers reported never integrating classroom materials and activities with computer program content. This may indicate a lack of knowledge or adequate resources.

Implications for Head Start Administrators

This section includes recommendations for Head Start administrators regarding the use of computers as instructional tools. These suggestions are based upon the results and discussion of the data.

1. Budget fiscal resources for operational support of classroom computers, replacement of outdated hardware components, and additional hardware components. One out of four teachers reported that their computers worked properly only sometimes or hardly ever.

2. Add Internet components and access to classroom computers if local Head Start teachers request email and computer access to on-line resources. Four out of five teachers reported that their computer did not have Internet capability.

3. Carefully select and monitor the types of software programs that classroom teachers use with children. The findings from the study suggested two different software issues. First, a significant number of teachers reported that the type of software made it easier to use computers with children. (The number of software programs was not a significant factor.) Second, although teachers used a variety of software programs, the quality of these programs is unknown.

4. Offer opportunities for teachers to learn how to operate computers. Subjects reported that training on computer operation facilitated their use of computers with children.

5. Help classroom teachers increase their comfort level, knowledge, and experiences with computers through activities that encourage "human support" and experimentation during the learning process. Head Start teachers' ratings indicated that their comfort level, knowledge, and previous experiences with computers made their computer use with children much easier.
Most commonly, teachers reported learning about computers through activities that involved other people (a friend, another teacher, or children) or individual experimentation. For example, in a location that has working computers, each teacher could list his or her most common problems with computers. Interaction strategies would include shared problem solving, demonstration, and practice among the Head Start teachers.

6. Tell teachers that if they learn about computers, they can improve their skills, do their job more easily, and teach children how to use them. Head Start teachers most often learned about computers for these three reasons.

7. Allow ample "messing around" time for teachers when new computers and/or software programs are placed in the classroom. Three out of four Head Start teachers reported that they learned about computers by "messing around by myself."

8. Create opportunities for Head Start teachers to share strategies for using computers as instructional support. Teachers reported that talking with other Head Start teachers was a significant factor in making computer use with children easier.

9. Share curriculum guidelines and program philosophy that support computers used as another classroom learning center. Teachers reported that both curriculum guidelines and program philosophy were highly significant in making their computer use with children easier.

10. Observe and monitor classroom materials and activities so that these curricular components reflect the content of computer programs (if the programs are appropriate for young children). One out of three classroom teachers said that their materials and activities never reflected the content of the computer programs.

11. Encourage teachers to write lesson plans for the computer center so that it is integrated into daily classroom programming. Nearly two thirds (65%) of the subjects reported never writing lesson plans for the computer center.

12. Increase teachers' knowledge by distributing free and low cost publications about using computers with young children. Head Start teachers reported that they rarely read about computers and young children. Some possibilities include Computers in Head Start Classrooms (MOBIUS, 1990) and the NAEYC Position Statement on Technology and Young Children (1996).

Implications for Future Research

This section lists suggestions for future research about Head Start teachers and their use of computers. These recommendations are based upon the findings of this study.

1. This study represents classroom teachers in Michigan Head Start programs. Future research efforts could focus on a national study of Head Start classroom teachers and their use of computers.

2. Revise the HSTCU Profile and begin reliability testing. This study used a new instrument. Despite pilot testing, some items were unclear to respondents. The length of the instrument may also have contributed to missing data problems. The reliability of the instrument has yet to be proven.

3. A large number of subjects indicated their willingness to talk with someone about computer use or to have an observer conduct a research study in their classroom. Future research could employ interview and observation techniques, yielding richer data than what can be gathered through a written survey.

4. Findings from the study regarding training were somewhat unclear. Investigators implementing new research efforts could obtain data regarding the availability, quality, content, methodology, and effectiveness of computer training for Head Start teachers.
5. An examination of the software programs that teachers prefer to use in Head Start classrooms is needed. Little is known about the quality of this software or the criteria used for software program selection.
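To make the reliability-testing suggestion in Item 2 more concrete, the following is a minimal, hypothetical sketch of an internal-consistency check (Cronbach's alpha) written in Python. The ratings matrix, the number of respondents, and the items themselves are invented for illustration; they are not data from the HSTCU Profile.

import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items matrix of ratings."""
    k = items.shape[1]                          # number of survey items
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings (1 = harder, 2 = neutral, 3 = easier) from six
# respondents on four related items, used only to illustrate the calculation.
ratings = np.array([
    [3, 3, 2, 3],
    [2, 2, 2, 1],
    [3, 3, 3, 3],
    [1, 2, 1, 1],
    [2, 3, 2, 2],
    [3, 2, 3, 3],
])

print(f"Cronbach's alpha = {cronbach_alpha(ratings):.2f}")

Values near 1.0 suggest that the items measure a common construct consistently; low values would point to items that should be revised or dropped in the next version of the instrument.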
Personal Observations

This study focused primarily on how teachers use computers in the Head Start classroom. Whether computers are appropriate for young children was beyond the scope of this research effort. However, one must acknowledge recent controversy in the field. Several educators and physicians charge that young children's use of technology is akin to educational malpractice and costly in terms of health risks and intellectual growth (Thomas, 2000; Kelly, 2000; Alliance for Childhood, 2000). As noted in Chapter 2, others consider this debate settled. Future researchers may wish to examine the suggested moratorium on computers in the early childhood classroom and the potential effect upon low-income children.

Of the 323 respondents, 111 indicated that they were interested in talking with someone about using computers with preschoolers or having an observer visit their classroom for a research study. Several subjects impressed the researcher with their interest and enthusiasm during telephone calls made to obtain missing information. Many subjects wanted to discuss what they had done with computers and children since the beginning of the program year. Others asked for recommendations regarding hardware, software, and activities that could be used to integrate the computer center into regular classroom activities. Some requested information regarding best practices for computers with young children. Nearly everyone asked about training on using computers with children, although the findings did not reflect this issue.

Sixty-six subjects chose to write comments at the conclusion of the survey. Although not reported in the data, these statements reflected a wide range of issues and concerns. Subjects most frequently wrote about the appropriateness of computers for young children. One teacher noted, "The generation we're teaching is growing up in a technical world and should have the knowledge," yet another noted that "Kids have a steady diet of movies and video games at home. Here they need peaceful interaction with people and three dimensional materials." Other comments reflected subjects' desires for additional computers and software, frustrations with old or donated equipment, or personal limitations. These comments might imply that some Head Start teachers do not have adequate computer equipment or feel uncomfortable regarding computer use.

Little is known about individuals who chose not to participate in the study. The Director of one program told the researcher, "We don't believe in using computers with preschoolers." He elaborated and said his teachers had "more important issues" to address rather than to "teach children to use computers." Although 5 surveys were returned from his agency, they represented a small portion of the 40 possible subjects from that setting.

The timing of the study was another issue that could have affected the response rate. During the beginning of the program year, Head Start teachers were conducting intake home visits, participating in preservice training, and teaching new groups of children. Participation in a research study would not be a top priority. One education coordinator told the researcher that if it had been in her power, she would not have authorized her agency's cooperation in the study "because it is the wrong time of year." She also said that she waited to distribute survey packets until three weeks after their receipt. Other education coordinators noted that the beginning of the program year was a "frazzled time," but they were more than willing to help the researcher.

The researcher spent many hours addressing missing data problems. Several subjects did not completely answer all questionnaire items. After obtaining missing data during follow-up telephone calls, the researcher asked if there had been a particular reason for leaving items blank. The responses from participants ranged from "I don't have a computer so they didn't relate to me" to "It was one of those days. I just didn't finish." Some subjects who only left the income item blank said they felt uncomfortable disclosing this information. They were also concerned that other colleagues would somehow receive the information. A shorter survey combined with better timing for the study could possibly avoid future missing data issues. Future researchers might also consider telephone interviews rather than mailed questionnaires as a better way to gather computer data.

Ecological Implications

A theoretical framework that combined human ecological approaches and adoption of innovations theory was the foundation for this study. It was theorized that the dynamic interaction between contextual variables associated with computers and the Head Start classroom/program and personal variables of Head Start teachers was significantly related to computer use. The findings from the study suggest that this interaction between individuals and their environmental context indeed occurs. Five contextual variables (one associated with computers, four associated with the Head Start program) were significant in making computer use easier for Head Start teachers. Additionally, four personal variables of Head Start teachers also had a significant relationship with computer use.
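To illustrate the kind of test behind these statements, the sketch below applies a one-way ANOVA (the analysis reported in this study) to invented data: teachers are grouped by how they rated one contextual variable, and the groups are compared on a hypothetical computer-use score. The variable name, group labels, and scores are assumptions made for illustration only; they are not values from the HSTCU Profile.

from scipy import stats

# Hypothetical computer-use scores for teachers grouped by how they rated
# "talking with other Head Start teachers" (harder / neutral / easier).
# All values are invented for illustration.
harder  = [1, 2, 1, 2, 1]
neutral = [2, 3, 2, 3, 2]
easier  = [4, 5, 4, 3, 5]

f_stat, p_value = stats.f_oneway(harder, neutral, easier)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

A small p-value (for example, below .05) would indicate that mean computer-use scores differ across the rating groups, that is, that the contextual variable is related to computer use.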
Summary

This study represents a first step in examining the use of computers by Head Start teachers. As noted earlier, Head Start children must learn with and about computers to succeed in this increasingly technological age. Their Head Start teachers can bridge the gap between home and public school success by using computers as instructional tools. The data provided from this research effort could help Head Start policy makers and administrators plan and allocate physical, fiscal, and human resources more effectively.

Head Start teachers learn about computers by "messing around," and most have computers available in their classrooms. They use computers primarily for instruction and instructional support. Head Start teachers rated talking with other Head Start teachers, the type of software, curriculum guidelines, training on computer operation, and program philosophy as making computer use much easier. Teachers also indicated that their comfort level, knowledge about, and previous experiences with computers made computer use easier.

The randomly selected sample of 323 Michigan Head Start teachers was more than twice as large as the TURNKEY study of teacher computer use in early childhood programs (Education TURNKEY Systems, 1998) and three times as large as others that investigated early childhood teachers and computers (Bilton, 1996; Wood, Willoughby & Specht, 1998; Landerholm, 1995; Edyburn & Lartz, 1986). No previous studies used Head Start teachers as the unit of analysis. The current research effort, therefore, provides a broad foundation for future research investigating Head Start teachers and their use of computers.

APPENDIX A. Coin Toss Directions

COIN TOSS DIRECTIONS

You need:
• One coin
• A list of classroom teachers

Be certain your list has ONLY
• Each classroom's lead or head teacher
• Teachers who teach 3-5 year olds

1. Begin at the top of the list.
2. Toss the coin in the air.
3. If the coin lands HEADS up, the teacher is IN THE STUDY. Mark the teacher's name with a check. Example: ✓ Hilda Head Start
4. If the coin lands TAILS up, the teacher is NOT in the study.
5. Repeat Steps 2-4 until you have completed the list.
6. Give each teacher whose name is included in the study one packet that contains one questionnaire, one stamped return envelope, and one color marker.
7. THANKS FOR HELPING!!!
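For readers who prefer to see the selection rule written out, below is a minimal Python sketch of the same coin-toss procedure. The roster names are made up; in the actual study each education coordinator tossed a physical coin while working down the agency's list of lead teachers.

import random

def coin_toss_selection(teachers):
    """Walk the list once; a HEADS toss puts the teacher in the study."""
    selected = []
    for teacher in teachers:
        if random.choice(["HEADS", "TAILS"]) == "HEADS":
            selected.append(teacher)
    return selected

# Hypothetical roster of lead teachers of 3-5 year olds.
roster = ["Hilda H.", "Rosa M.", "Dawn K.", "Pat S."]
print(coin_toss_selection(roster))

Because each teacher has an independent 50% chance of selection, roughly half of each agency's roster is expected to enter the study.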
APPENDIX B. The Head Start Teacher Computer Use Profile

The HEAD START TEACHER COMPUTER USE PROFILE

Thank you for completing this survey. Please remember that your answers are confidential and will be kept secure. The researcher intends to report findings as a group; no individual person or program will be associated with any information. At any time, you may choose NOT to answer any or all of the questions. You indicate your voluntary participation by completing and returning this questionnaire. Your return also implies your permission to use the results for future educational publications.

DIRECTIONS: Write the answer to each question or mark an X. Choose only ONE answer unless indicated.

1. Where is your Head Start center located? Name of community: ________

2. How old are you?
( ) 21 or younger  ( ) 22-30  ( ) 31-40  ( ) 41-50  ( ) Older than 51

3. What is your highest education level?
( ) Public schooling without diploma or GED
( ) High school diploma or GED
( ) Child Development Associate credential (C.D.A.)
[ECE = Early Childhood Education]
( ) Associate degree with ECE or Child Development major
( ) Other Associate degree
( ) Bachelor degree with ECE or Child Development major
( ) Other Bachelor degree
( ) Graduate degree with ECE or Child Development major
( ) Other Graduate degree
( ) OTHER: ________

4. What is your yearly household income?
( ) Less than $20,000  ( ) $21,000-$35,000  ( ) $36,000-$49,000  ( ) $50,000-$74,000  ( ) More than $75,000

5. Were you at any time a Head Start parent?
( ) YES  ( ) NO

6. How long have you taught preschool children?
( ) Less than one year  ( ) 1-3 years  ( ) 4-5 years  ( ) 6-10 years  ( ) 11-15 years  ( ) 16-20 years  ( ) More than 20 years

7. How many total years have you worked for Head Start, regardless of your job?
( ) Less than one year  ( ) 1-3 years  ( ) 4-5 years  ( ) 6-10 years  ( ) 11-15 years  ( ) 16-20 years  ( ) More than 20 years

8. How long have you been a Head Start Classroom Teacher?
( ) Less than one year  ( ) 1-3 years  ( ) 4-5 years  ( ) 6-10 years  ( ) 11-15 years  ( ) 16-20 years  ( ) More than 20 years

9. How would you describe the age group you teach?
( ) 3 years old & younger  ( ) 4 years old  ( ) 5 years old  ( ) Mixed age group

10. How many children are in each classroom session?
( ) Less than 13  ( ) 13-15  ( ) 16-20  ( ) 21 or more

11. How long is your Head Start class per session?
( ) Less than 3.5 hours  ( ) 3.5-5 hours  ( ) 6 or more hours

12. How many days a week does your class meet?
( ) 3 days or less  ( ) 4 days  ( ) 5 days  ( ) 6 days or more

13. What type of "star" are you when trying or using a new idea, object, or activity? Use the enclosed marker and CIRCLE the section that best describes you.
[Circle diagram with sections for the respondent to mark; the section labels are only partially legible in the source (fragments include "...or buy it right now!", "...most everyone else", and "It's new, I try it, usually be...").]

14. Are you interested in learning about computers?
( ) YES  If YES, go to Question 15.
( ) NO   If NO, go to Question 16.

15. Why are you interested in learning about computers? (Mark ALL that apply.)
( ) Computers are something new.
( ) To improve my skills.
( ) Everyone else is using them.
( ) To communicate with other people.
( ) Computers can make my job easier.
( ) Other people told me I should.
( ) The program requires that teachers use computers with children.
( ) I want to teach children how to use them.
OTHER: ________

16. Do you use a computer?
( ) YES  If YES, go to Question 17.
( ) NO   If NO, go to Question 18.

17. How did you learn about computers? (Mark ALL that apply.)
( ) "Messing around" by myself
( ) From a friend or family member
( ) My own children taught me.
( ) The Head Start children taught me.
( ) From another teacher
( ) Watching other people
( ) When I was in high school or college
( ) After I graduated, I took another class.
( ) At a professional conference
( ) At a Head Start workshop or inservice
( ) At a workshop or seminar sponsored by a group other than Head Start
( ) Reading the NAEYC Position Statement on Technology
( ) Reading Computers in Head Start Classrooms (Head Start/IBM report)
( ) Reading the Head Start Bulletin
( ) Reading Young Children
( ) Reading other books or journals
OTHER: ________

18. Write an X on the box that BEST describes your current use of computers with preschoolers.
( ) I don't use computers with preschoolers.
( ) I use computers with preschoolers.
( ) I use computers with preschoolers extensively.

Write one X for each item that tells how you believe each item makes it easier or harder to use computers with preschoolers, whether you use them with children or not. (Columns: This makes it HARDER / It doesn't matter (neutral) / This makes it EASIER)

19. My "comfort level" with computers
20. My knowledge about computers
21. My previous experience with computers
22. Number of computers in the classroom
23. Type of software programs
24. Number of software programs
25. Amount of time in the daily schedule
26. Amount of classroom space
27. Number of electrical outlets in classroom
28. Training: How to operate computers
29. Training: Using computers with preschoolers
30. Computer technician
31. Meeting Head Start requirements
32. Program philosophy
33. Curriculum guidelines
34. Talking with other Head Start teachers
35. Administrators

Write one X for each item that tells HOW OFTEN you use a computer for this purpose. (Columns: NOT AT ALL / Once a YEAR / Once a MONTH / Once a WEEK / 2-3 times each WEEK / DAILY)

36. I use a computer to make instructional materials (labels, games, pictures, stories, etc.).
37. I use a computer to keep records about the children or classroom activities.
38. I use a computer to email parents and professionals.
39. I use a computer to find resources for lesson planning and ideas for best teaching practices.

COMPUTER HARDWARE & SOFTWARE

40. Do you have computers to use with your Head Start children?
( ) YES
( ) NO  ✱ If this is your answer, go to Page 7. Begin at the ✱.

41. Where are the computers you can use with children located? (Mark all that apply)
( ) My classroom  ( ) Computer lab  ( ) Library/media center  ( ) Office  OTHER: ________

42. How many computers are in EACH location? Only write the # you can use with children.
( ) My classroom  ( ) Computer lab  ( ) Library/media center  ( ) Office  OTHER: ________
43. Any or all of the computers that you use with children have the following parts: (Mark all that apply)
( ) Mouse  ( ) Keyboard  ( ) Touch screen  ( ) Speakers  ( ) Printer  ( ) Monitor  ( ) CD-ROM drive  ( ) Internet access
OTHER: ________

44. How many computers that you use with children have ALL of the parts checked in Question 43?
( ) One  ( ) Two  ( ) Some of them  ( ) Most of them  ( ) ALL of them

45. How often do the computers that you use with children work properly?
( ) Never  ( ) Hardly ever  ( ) Sometimes  ( ) Most of the time  ( ) Always

46. What three software programs do you like to use most with children? (Mark only THREE)
( ) Art Center  ( ) Jump Start Preschool  ( ) Nick Jr. Play Math
( ) Arthur's Preschool  ( ) Just Grandma & Me  ( ) Playroom
( ) Bailey's Book House  ( ) Kid Pix  ( ) Reader Rabbit series
( ) Crayola Art Studio  ( ) Kidware  ( ) Sammy's Science House
( ) Foo Castle  ( ) Kid Works Deluxe  ( ) Stickybear's Reading Room
( ) Freddi Fish  ( ) Little Monster @ School  ( ) Trudy's Time
( ) Hello Kitty Big Fun  ( ) Millie's Math House  ( ) Winnie the Pooh Preschool
( ) I don't remember the program names.
OTHER: ________

47. Who chooses children's software? (Mark all that apply)
( ) Classroom teacher  ( ) Education Coordinator  ( ) Computer specialist
( ) Technology committee  ( ) Head Start Director  ( ) Principal
( ) Executive Director  ( ) Parent Policy Council  ( ) Librarian
OTHER: ________

[Section heading not legible in source]

Write one X for each item that tells HOW OFTEN this happens. (Columns: NOT AT ALL / ONCE A MONTH / ONCE A WEEK / 2-3 TIMES EACH WEEK / DAILY)

48. I set time limits for children's computer use.
49. I choose which computer programs children use in the classroom.
50. I write lesson plans for the computer center.
51. I use the computer to teach children socio-emotional skills (cooperation, turn-taking, etc.).
52. I use the computer to teach children literacy concepts and skills (letters, sounds, etc.).
53. I use the computer to teach children numeracy concepts and skills (sets, counting, numerals, etc.).
54. I use the computer to teach English or other languages to children.
55. I use the computer to teach children fine motor skills (hand-eye coordination, keyboarding, etc.).
56. My classroom materials (books, toys, etc.) reflect the concepts in the computer programs.
57. My classroom activities reflect the concepts in the computer programs.

✱

Thank you for your honest answers and the time you spent answering these questions. Please write additional comments if you wish.

In the future, I am interested in having someone talk with me about using computers with preschoolers and/or observe in my classroom for a research study. Please check one box. NOTE: All information, including this questionnaire, is confidential.

( ) No thanks, I'm not interested. (Leave information below BLANK.)
( ) YES, I am interested. You can reach me at: (COMPLETE information below.)

My name: ________
Street Address: ________
City: ________  State: ________  Zip: ________
Phone number: (___) ________  ( ) Work  ( ) Home

APPENDIX C.
University Committee on Research Involving Human Subjects Approval

OFFICE OF RESEARCH AND GRADUATE STUDIES
University Committee on Research Involving Human Subjects
Michigan State University
Administration Building
East Lansing, Michigan 48824-1046
517/355-2180  FAX: 517/353-2976
www.msu.edu/user/ucrihs  E-Mail: ucrihs@msu.edu

MICHIGAN STATE UNIVERSITY

August 10, 2000

TO: Marjorie KOSTELNIK
107 Human Ecology Building

RE: IRB# 00-430  CATEGORY: 1-C
APPROVAL DATE: August 9, 2000
TITLE: THE ADOPTION OF COMPUTERS AS AN INSTRUCTIONAL TOOL BY MICHIGAN HEAD START TEACHERS

The University Committee on Research Involving Human Subjects' (UCRIHS) review of this project is complete and I am pleased to advise that the rights and welfare of the human subjects appear to be adequately protected and methods to obtain informed consent are appropriate. Therefore, the UCRIHS approved this project.

RENEWALS: UCRIHS approval is valid for one calendar year, beginning with the approval date shown above. Projects continuing beyond one year must be renewed with the green renewal form. A maximum of four such expedited renewals is possible. Investigators wishing to continue a project beyond that time need to submit it again for a complete review.

REVISIONS: UCRIHS must review any changes in procedures involving human subjects prior to initiation of the change. If this is done at the time of renewal, please use the green renewal form. To revise an approved protocol at any other time during the year, send your written request to the UCRIHS Chair, requesting revised approval and referencing the project's IRB# and title. Include in your request a description of the change and any revised instruments, consent forms or advertisements that are applicable.

PROBLEMS/CHANGES: Should either of the following arise during the course of the work, notify UCRIHS promptly: 1) problems (unexpected side effects, complaints, etc.) involving human subjects, or 2) changes in the research environment or new information indicating greater risk to the human subjects than existed when the protocol was previously reviewed and approved.

If we can be of further assistance, please contact us at 517 355-2180 or via email: UCRIHS@msu.edu. Please note that all UCRIHS forms are located on the web: http://www.msu.edu/user/ucrihs

Sincerely,

Kenneth Marvin
Vice Chair, UCRIHS

APPENDIX D. Head Start Director Letter

Dear Head Start Director,

Research shows that often children from low-income families do not have computers at home. Consequently, many Head Start programs installed computers in their classrooms. You might even have asked yourself, How many Head Start classrooms have computers? or How do Head Start teachers use computers with children? No one has answered these basic questions despite funding and program guidelines.

I am a doctoral candidate at Michigan State University. I am also the Education Services Manager at Tri-County Head Start in Paw Paw, Michigan and have more than twenty-five years of Head Start experience. My research study is called The adoption of computers as an instructional tool by Michigan Head Start teachers. The findings will help administrators and policy makers make informed decisions about staff development, technical support and budget plans.

I would like to have your agency's voluntary participation in the study. I understand this is a hectic time of year, but your staff's participation should take just a few minutes.
After your permission is granted, Education Coordinators (or the individual who supervises classroom teachers) will select teachers with a coin toss. I expect that it will take no more than half an hour to choose teachers and a maximum of twenty minutes to complete a survey. I will keep all responses confidential. There is no cost to participants and no risk of any physical or other injury.

I promise to call you next week to answer your questions, as well as request your permission to contact your Education Coordinator and begin the study in your agency. If you prefer, you may also phone me during the day at 1-800-792-0366, extension 116, at home, 616-624-5107, or send email to bewickcy@pilot.msu.edu. I also promise to send you a copy of the study results at your request.

Thank you for your consideration,
Cindy Bewick

APPENDIX E. Education Coordinator Letter

Dear Head Start Education Coordinator,

Research shows that often children from low-income families do not have computers at home. Consequently, many Head Start programs installed computers in their classrooms. You might even have asked yourself, How many Head Start classrooms have computers? or How do Head Start teachers use computers with children? No one has answered these basic questions despite funding and program guidelines.

I am a doctoral candidate at Michigan State University. I am also the Education Services Manager at Tri-County Head Start in Paw Paw, Michigan and have more than twenty-five years of Head Start experience. My research study is called The adoption of computers as an instructional tool by Michigan Head Start teachers. The findings will help administrators and policy makers make informed decisions about staff development, technical support and budget plans.

Last week, your Head Start Director gave permission for your agency's voluntary participation in this study. I understand this is a hectic time of year, but your staff's participation should take just a few minutes. You will select teachers with a coin toss, using the attached directions. I expect that it will take no more than half an hour to choose teachers and a maximum of twenty minutes for them to complete the survey. I have enclosed individual packets with the survey, a stamped return envelope so they can easily return it to me, and one color marker as a thank you. I will keep all responses confidential. There is no cost to participants and no risk of any physical or other injury.

I promise to call you next week to answer any questions you may have. If you prefer, you may phone me during the day at 1-800-792-0366, extension 116, at home, 616-624-5107, or send email to bewickcy@pilot.msu.edu. I also promise to send you a copy of the study results at your request.

Thank you for your time,
Cindy Bewick

APPENDIX F. Classroom Teacher Letter

Dear Head Start Classroom Teacher,

Research shows that often children from low-income families do not have computers at home. Consequently, many Head Start programs installed computers in their classrooms. You might even have asked yourself, How many Head Start classrooms have computers? or How do Head Start teachers use computers with children? No one has answered these basic questions despite funding and program guidelines.

I am a doctoral candidate at Michigan State University. I am also the Education Services Manager at Tri-County Head Start in Paw Paw, Michigan and have more than twenty-five years of Head Start experience.
My research study is called The adoption of computers as an instructional tool by Michigan Head Start teachers. The findings will help administrators and policy makers make informed decisions about staff development, technical support and budget plans.

Recently, your Head Start Director gave permission for your agency's voluntary participation in this study. You were randomly selected as a study participant. Your participation is completely voluntary. You may choose NOT to complete the survey OR only answer selected questions. I will keep all responses confidential and secure. I intend to report findings as a group; no individual person or program will be associated with any information. There is no cost to you and no risk of any physical or other injury. If you choose to participate, I expect you will need no more than twenty minutes to complete the survey.

The Head Start Teacher Computer Use Profile has four sections. First, you'll answer basic questions about yourself and your classroom. You will decide what category best describes you when trying out new things or activities. The next section asks about how you use and have learned about computers. You'll also choose how various items make it easier or harder to use computers with preschoolers. The third section asks about computer hardware and software. Finally, you write IF or HOW OFTEN you do certain computer activities with children. Even if you do not have a classroom computer, there are questions for you to answer. A color marker is enclosed for your use and as a thank you for your time.

If you have any questions, you may phone me during the day at 1-800-792-0366, extension 116, at home, 616-624-5107, or send email to bewickcy@pilot.msu.edu. Please send your completed survey in the enclosed stamped return envelope. If you would like a copy of the study results, please send a separate post card to 27364 Manstrom Drive, Lawton, MI 49065 with your name and mailing address. You may contact Dr. David Wright at 517-355-2180 or UCRIHS@msu.edu if you have any questions or concerns regarding human subjects issues.

Your work is so important for children and families. I applaud your efforts and consideration of my request.

Sincerely,
Cindy Bewick

APPENDIX G. Postcard Reminders

Reminder Postcard #1
BEWICK, 27364 Manstrom Dr., Lawton, MI 49065

REMEMBER those surveys about Head Start teachers and computers?
Even though you might be up to your neck in alligators, have you:
• Used the coin toss directions to choose lead classroom teachers?
• Given survey packets to the "winners" of the coin toss?
• Called Cindy Bewick IF you have any questions, need more surveys, or are just plain confused?
Thanks again for your help!
Cindy Bewick, 800-792-0366, x116 (office), bewickcy@pilot.msu.edu

Reminder Postcard #2
BEWICK, 27364 Manstrom Dr., Lawton, MI 49065

TIME IS RUNNING OUT!
Your teachers have returned, children are coming to school, and you might be able to see a tiny corner of your desk. You can make more space & finish another project IF you:
• Give survey packets to the "winners" of the coin toss
• Ask selected teachers to mail their surveys by September 25!
• Call Cindy Bewick IF you have questions
Thanks for helping a Head Start colleague!
Cindy Bewick, 800-792-0366, x116, bewickcy@pilot.msu.edu

Reminder Postcard #3
BEWICK, 27364 Manstrom Dr., Lawton, MI 49065

You hold the key....
UNLOCK THE SECRET to no more post cards or phone calls AND contribute to scientific research!
Remember those surveys about Head Start teachers & computers? Help your teachers (and me) by sending them in NOW! Every survey is important. Finish one more thing on your "to do" list and:
• Ask selected teachers to mail their surveys NOW!
• Call IF you have questions or need more packets
Thanks for helping a Head Start colleague!
Cindy Bewick, 800-792-0366, x116, bewickcy@pilot.msu.edu

Reminder Thank You
BEWICK, 27364 Manstrom Dr., Lawton, MI 49065

WOW, YOU DESERVE A GOLD STAR!
Remember those surveys about Head Start teachers and computers?
You quickly leaped on this task, did the coin toss, and gave the "winners" their packets. You probably can even see the corner of your desk! I've received several surveys from your area. You can help one last time if you:
• Remind selected teachers (IF they're interested) to mail their surveys by September 25!
• Call Cindy Bewick IF you have questions
Thanks for helping a Head Start colleague!
Cindy Bewick, 800-792-0366, x116, bewickcy@pilot.msu.edu

APPENDIX H. Location Names

Location  |  Frequency  |  Percent
Adrian  |  5  |  1.5
Allegan  |  2  |  .6
Alpena  |  1  |  .3
Ann Arbor  |  2  |  .6
Ashley  |  1  |  .3
Baraga  |  2  |  .6
Battle Creek  |  8  |  2.5
Bay City  |  2  |  .6
Bay County  |  1  |  .3
Beaverton  |  1  |  .3
Belding  |  1  |  .3
Big Rapids  |  1  |  .3
Blanchard  |  1  |  .3
Boyne City  |  1  |  .3
Brighton  |  1  |  .3
Brimley  |  2  |  .6
Burton  |  1  |  .3
Cadillac  |  1  |  .3
Carleton  |  1  |  .3
Caro  |  1  |  .3
Center Line  |  1  |  .3
Central Lake  |  1  |  .3
Charlotte  |  1  |  .3
Chassell  |  1  |  .3
Coleman  |  1  |  .3
Crestwood  |  1  |  .3
Crystal  |  1  |  .3
Dearborn Heights  |  1  |  .3
Deerfield  |  1  |  .3
Delton  |  1  |  .3
Detroit  |  58  |  18.0
Dickinson-Iron County  |  1  |  .3
Dowagiac  |  1  |  .3
E. Lansing  |  1  |  .3
Escanaba  |  2  |  .6
Farmington  |  1  |  .3

[The remaining rows list the following locations; their frequency and percent values are not legible in the source: Farmington Hills, Fennville, Ferrysburg, Flint, Flint Township, Fraser, Garden City, Gaylord, Genesee County, Gladwin, Grand Blanc, Grand Haven, Grand Rapids, Greenville, Hamtramck, Hannahville, Harrison, Hart, Hastings, Hazel Park, Hermansville, Highland Park, Hillsdale, Holland, Horton, Houghton, Houghton Lake, Howell, Huron County, Indian River, Inkster, Iron Mountain, Ironwood, Jackson, Kalamazoo, Kingsford, Lake City, Lansing, Lapeer, Lincoln, Mackinac, Mackinaw City, Macomb, Marquette, Marysville, Mesick, Midland, Millington, Monroe, Morley, Mt. Clemens, Mt. Pleasant, New Haven, Newberry, No name, Norway, Oak Park, Okemos, Onaway, Osceola, Oshtemo, Owosso, Peck, Perry, Peshawbestown, Petosky, Pinckney, Pinconning, Pontiac, Port Huron, Portage, Rapid River, Redford, Reed City, Remus, Richmond, Roseville, S. Range, Sandusky, Sault Ste. Marie, Scotts, Sebewaing, Sidney, Southfield, St. Ignace, St. Louis, Standish, Sterling, Sterling Heights, Stockbridge, Sturgis, Sumner, Traverse City, Utica]