PRESERVICE TEACHERS’ SELF-EFFICACY, INTENT TO USE, AND TECHNOLOGY INTEGRATION DESCRIPTIONS: A STUDY OF TECHNOLOGY LEARNING EXPERIENCES AND THEIR EFFECTS By Tracy Ellen Russo Amalfitano A DISSERTATION Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Educational Psychology and Educational Technology–Doctor of Philosophy 2017 ABSTRACT PRESERVICE TEACHERS’ SELF-EFFICACY, INTENT TO USE, AND TECHNOLOGY INTEGRATION DESCRIPTIONS: A STUDY OF TECHNOLOGY LEARNING EXPERIENCES AND THEIR EFFECTS By Tracy Ellen Russo Amalfitano Today’s teachers are charged with transforming education through technology, providing all learners with access and meaningful experiences across content areas in ways of teaching and learning they may never have experienced themselves. This dissertation examines the effect that increasing preservice teachers’ prior personal learning experiences using digital instructional tools has on their self-efficacy, intent to use, levels of technology integration, and the instructional features described in the context of their future teaching using similar digital tools. Set within an educational technology teacher preparation course, the aim of this quasi-experimental study was to describe the effects resulting from intentional pedagogical changes using existing course content and assessments for teacher technology preparation improvement. Interventions to increase personal learning experiences using three focal tools (Interactive PowerPoint, Webquests, and Digital Storytelling) were administered in three of six course sections before participants completed course activities using these tools to create teaching materials. Surveys were administered in all sections to investigate changes in participants’ overall digital self-efficacy and their stated intent to use the focal tools. Written reflections about possible future focal tool uses in teaching, and technology integrated lesson plans written at the end of the course, were analyzed to examine differences in technology integration levels using the Replacement, Amplification, and Transformation framework (Hughes, Thomas, & Scharber, 2006). Additionally, key instructional features in the written descriptions (student learning, instructional methods, curriculum/content, and structural factors) were investigated. Students in both the control and treatment sections showed significant growth in computer self-efficacy during the course. The intervention changes to increase prior personal learning experiences did not result in a significant difference between groups in self-efficacy growth or stated intent to use. A significant difference between treatment and control groups was found in the described integration levels for Webquests as well as in the number of instructional features related to structural factors described for Interactive PowerPoint. No significant differences between groups were found in the descriptions of future use for Digital Storytelling or in the final Technology Integrated Lesson Plan. Copyright by TRACY ELLEN RUSSO AMALFITANO 2017 ACKNOWLEDGEMENTS Without the help and support of many people, this work would not have been possible. Thank you to the Michigan State University Department of Counseling, Educational Psychology, and Special Education for the supportive Fellowships. Thank you to Dr. John Bell and Dr.
William Cain of the CEPSE Design Studio for all of the energy and mentoring put into SLATE so all of us remotely located students could participate in ongoing MSU research projects. Most of all, thank you to Dr. Ralph Putnam, who tirelessly walked side by side with me as my mentor and committee chairman. I could not have finished without your steadfast belief in my ideas. I am also indebted to Dr. Andy Topper at Grand Valley State University (GVSU) and Dr. Jason Siko at Madonna University for their support and active work to provide time and opportunity to complete this study during my GVSU teaching. Many thanks to Mark DeLonge, MSU alumnus extraordinaire and NMC tech and design guru, for support along the way. I am also grateful to the many people I had the opportunity to work with and learn from along the way. Each of my Dissertation Committee members, EPET faculty, and others in our pilot cohort shared time and expertise as I made this intellectual journey into the unknown. Angie Johnson walked a mile of IRR data coding with me at the end, and we did it! My grandparents, Sam and Ellen Dzembo, no longer with us, provided me a lifetime of inspiration as well as my first piece of technology—a typewriter with memory! They also passed on a love of music, the outdoors, and faith that has kept me grounded. Most importantly, to my family. John, my husband, you’ve done so much along the way to make this dream come true for me. Jesse, Ellen, and Gabrielle, thank you for never doubting that I’d be able to do this, even when the going was rough! TABLE OF CONTENTS LIST OF TABLES ....................................................................................................................... viii LIST OF FIGURES ....................................................................................................................... ix CHAPTER 1 ................................................................................................................................... 1 INTRODUCTION .......................................................................................................................... 1 Statement of Purpose ................................................................................................................... 5 Research Questions ..................................................................................................................... 5 Overview of the Dissertation....................................................................................................... 6 CHAPTER 2 ................................................................................................................................... 7 LITERATURE REVIEW ............................................................................................................... 7 Technology’s Potential to Transform Learning .......................................................................... 7 Effects of Technology on Desired Learning Outcomes .......................................................... 7 Technology’s Effects on the Learning Process ....................................................................... 8 Technology’s Potential to Transform Teaching ........................................................................ 11 Effects of Technology on Teacher Preparation Standards .................................................... 11 Measuring Levels of Technology Integration........................................................................
12 Technology’s Effects on the Teaching Process ..................................................................... 15 Supporting Teacher Change in Using Technology ................................................................... 17 First-Order Changes in Support of Technology Integration .................................................. 17 Second-Order Changes in Support of Technology Integration ............................................. 18 Approaches to Address Gaps between Having and Using Technology .................................... 19 Supporting Teachers’ Self-Efficacy ...................................................................................... 20 Supporting Teachers’ Knowledge Development ................................................................... 22 Experience Matters .................................................................................................................... 23 CHAPTER 3 ................................................................................................................................. 25 METHOD ..................................................................................................................................... 25 Participants ............................................................................................................................ 25 The Course ............................................................................................................................. 26 The Intervention .................................................................................................................... 28 Intervention Details ............................................................................................................... 30 Treatment and Control Groups .............................................................................................. 32 Data Sources and Descriptions of the Measures ................................................................... 33 Data Analysis ......................................................................................................................... 36 CHAPTER 4 ................................................................................................................................. 41 RESULTS ..................................................................................................................................... 41 Preservice Teachers’ Digital Self-Efficacy ............................................................................... 41 Intent to Use the Digital Tools .................................................................................................. 43 Descriptions of Future Teaching with Technology ................................................................... 45 Levels of Integration .............................................................................................................. 46 Instructional Features ............................................................................................................ 48 Technology Integrated Lesson Plan Common Course Assessment ...................................... 50 Results Summary....................................................................................................................... 51 CHAPTER 5 .................................................................................................................................
53 DISCUSSION ............................................................................................................................... 53 Changes to Self-Efficacy ........................................................................................................... 53 Intentions to Use Focal Tools .................................................................................................... 54 Describing Future Teaching and Technology ........................................................................... 57 Levels of Integration .............................................................................................................. 57 Limitations ................................................................................................................................ 63 Research Implications ............................................................................................................... 66 Effects on Preservice Teachers of Pedagogical Changes ...................................................... 67 Organizational Influences on Preservice Teacher Technology Preparation .......................... 68 Conclusion................................................................................................................................. 70 APPENDICES .............................................................................................................................. 73 APPENDIX A — Consent for Participation .............................................................................. 74 APPENDIX B — Course Calendar ........................................................................................... 76 APPENDIX C — Computer Technology Integration Survey .................................................. 77 APPENDIX D — Reflection Prompts ...................................................................................... 78 APPENDIX E — Technology Integrated Lesson Plan Template and Scoring Rubric ............. 79 APPENDIX F — Training Instructions Provided to Second Rater .......................................... 83 REFERENCES ............................................................................................................................. 84 LIST OF TABLES Table 1. Participants Giving Voluntary Consent and Section Information ......................... 26 Table 2. Comparison of Control and Treatment Group Unit Experiences .......................... 32 Table 3. Data Sources and Unit of Collection ......................................................................... 33 Table 4. Coding Frame of Reflective Paragraphs and Technology Integrated Lesson Plan – Layer 1 ......................................................................................................................................... 38 Table 5. Coding Frame for Instructional Features Described in Reflective Paragraphs – Layer 2 ......................................................................................................................................... 39 Table 6. Mean (SD) CTIS Survey Pre- and Post-Survey Scores ........................................... 41 Table 7. Paired Samples Test of Change between Pre- and Post-CTIS Mean Score .......... 43 Table 8. Descriptive Statistics for Intent to Use Specific Tools ............................................. 44 Table 9. Proportion of Integration Levels Described for Each Focal Tool .......................... 46 Table 10.
Mean Number of Descriptors for Each Instructional Feature Category ............. 48 Table 11. Chi-square Tests for Independence between Condition and Number of Data Segments within Instructional Feature Categories .................................................................. 49 Table 12. Chi-square Tests for Independence between Rubric Criteria and Condition, N = 114................................................................................................................................................. 51 LIST OF FIGURES Figure 1. Course structure and timing of intervention and focal units ................................ 29 Figure 2. Estimated marginal means of pre- and post-Computer Technology Integration Survey........................................................................................................................................... 43 Figure 3. Line graph of intent to use focal tools by condition ................................................ 45 Figure 4. Integration level proportions for control and treatment groups ........................... 47 Figure 5. Percentage of time spent by participants in different semester activities ............. 64 CHAPTER 1 INTRODUCTION Technological innovations continue to change rapidly, increasing access to information, communication, and creative arts. The prospect of transforming learning through technology and meeting student and workforce needs in ways never before possible is exciting for many educators. No longer must teachers and students depend for current knowledge on replies to handwritten pen pal letters sent to foreign countries viewed on a pull-down map. Today, learning about our world can be transformed by technology’s potential for instantaneous connections between people and information almost anywhere in the world, from devices smaller than a teacher’s hand. Pea (1985) argued that computer technologies do more than amplify, or enhance, human thinking and performance—they change how we think and act, thus transforming learning. Salomon and Perkins (2005) expressed similar ideas about the cognitive effects when a student learns with, of, and through technology. • Effects With are enhanced performance enabled by the technology/tool, such as using a spell-check feature while writing or a computational system to solve equations. • Effects Of are lasting changes in the person that result from interacting with the technology, such as a generalized ability to recognize part-to-whole relationships after guided experiences using a zoom feature with digital imagery. • Effects Through are new kinds of thinking/performance made possible by the technology—much like Pea’s (1985) transformation. For example, combining geographic information system (GIS) and historical data allows us to model potential environmental impacts of new buildings. It is these effects through technology, or transformation, that we want happening in our schools. Technology can do more than help learners practice basic skills and work more efficiently; it can help prepare today’s learners for a future that hasn’t been invented yet (Dede, 2000, 2011). These ideas have also been applied to uses of technology for instruction; Hughes et al.
(2006) have argued that teachers can use technology to • simply replace other technologies without changing the nature of instruction, learning, or content • enhance existing instructional approaches by amplifying them—like effects with technology, or Pea’s amplification • transform learning by creating new learning experiences not possible without the technology Connecting research on technology and cognition with teacher technology practices, Hughes et al. (2006) expanded descriptions of these uses into a framework for technology integration assessment. The Replacement, Amplification, and Transformation (RAT) framework posits that the tool used and the implementation method can affect student learning, instructional practices, and content/curriculum. Moving to such transformative uses of technology is not easy for teachers, given the complexity of redesigning their teaching practices to encompass technology’s potential. Much effort has gone into helping teachers make these transformations of practice. Researchers have proposed two different types of support needed to help address teacher preparedness to leverage technology in transformative ways for learning. Structural, or first-order, changes to support technology integration, such as funding, documented standards, organizational structures, professional development, and school access to the internet and technology, are increasingly established (Buckenmeyer, 2010; Ertmer, 1999; Ertmer, Ottenbreit-Leftwich, Sadik, Sendurur, & Sendurur, 2012). For example, the 2015 Every Student Succeeds Act (ESSA) includes educational technology use funding. The 2016 Council for Accreditation of Educator Preparation (CAEP) describes technology as a cross-cutting theme in teacher preparation recommendations. Since 1994, access to technology has increased from 3% to almost 100% for teachers and students (Gray, Thomas, & Lewis, 2010), making the vision to transform education seem within reach. Many changes to teaching practice resulting from this increased expectation for technology use are related to second-order changes, or intrinsic factors. Second-order changes known to be important indicators of teachers’ effective technology teaching practices include teacher attitudes, confidence, and self-efficacy (Dennis, 2013; Eifler, Greene, & Carroll, 2001; Ertmer & Ottenbreit-Leftwich, 2010; Ertmer et al., 2012; Kay, 2007; Moser, 2007). However, progress in this area falls short of the vision where all teachers are prepared to enact the full transformation of K-12 learning consistent with the United States’ 2010 and 2017 National Education Technology Plans. Despite emphasis on the infusion of digital technologies into the schools by teachers, half of U.S. public school teachers still report feeling unprepared to implement innovative technology uses in their classrooms (Buckenmeyer, 2010; Gray et al., 2010; Kay, 2007; Moser, 2007). Preparing teachers to transform student learning experiences substantially via technological innovations is not a new challenge, with guidelines in use for over two decades as technology availability increased (Gillingham & Topper, 1999). Although teachers have consistently reported needing additional training for implementing technologies in the classroom (Buckenmeyer, 2010; Fantilli & McDougall, 2009; Kay, 2006a; Moser, 2007), this may not be the actual problem, as courses and professional development offerings already abound.
As I sought to address the complex challenge of preparing teachers for future technology integration within my own teaching context at a Midwestern regional university, these intrinsic factors stood out as targets for improvement efforts. Student feedback has indicated less than positive attitudes and beliefs about using technology in their future teaching—a poor indicator of future success (Tschannen-Moran & Hoy, 2001). Using curriculum mapping (Branch, 2009; Carr-Chellman, 2014; Wiggins & McTighe, 1998) to critically reflect on the pedagogical experiences of our students within the course, we found we were teaching about transformative technologies using fairly traditional means (Buckenmeyer, 2010; Cuban, 2001, 2008; Pearcy, 2013), rather than using the technologies transformatively ourselves. In other words, we expected students to enact changes in teacher practice without providing them time to observe us enacting these changes ourselves or providing them personal experiences learning through these practices. Research focused on how pre- and in-service teachers actually experience learning technology related to teaching and learning practices is limited in scope and quality (Kay, 2007). Studies of adult learning emphasize the importance of prior experience in establishing confidence, beliefs, and skills with new tools and/or strategies (Adamy & Boulmetis, 2005; Bruner, 1956; Dewey, 1916; Knowles, 1977; Kolb, 1984; McDade, 1988; Tuzzio, 2007; Vygotsky, 1977). Studies of technology integration in general, such as those of PT3 grantees, as well as other studies on specific technology integration efforts, also indicate a positive relationship between increases in preservice teachers’ authentic, hands-on technology experiences and their beliefs and intentions to use said technologies in the future, calling for more research in this area (Alexander, Knezek, Christensen, Tyler-Wood, & Bull, 2014; Davies, 2011; Ertmer, 2005; Jeffs & Morrison, 2005; Polly, Mims, Shepherd, & Inan, 2010). Accordingly, we proposed focusing on developing preservice teachers’ first-hand experiences with technology as a means to develop second-order supports for their own development as future designers and implementers of educational technology. Results of a small pilot study of an intervention designed to increase hands-on experiences with a Personal Learning Network in two sections of our course were encouraging but not conclusive, emphasizing the need for this larger, more detailed study (Russo & Siko, in progress). Statement of Purpose This study examines the effects of providing preservice teachers with first-hand learning experiences using technology—before they learn to use that technology for instruction. Student experiences learning through technology within a stand-alone technology teacher preparation course are described in relation to their beliefs and intentions to use similar technologies in future teaching, as well as the instructional features for technology integration to which they attend. A quasi-experimental design assigning sections of the course to a control or treatment condition, with an intervention increasing personal learning experiences with focal technologies, was used. This in-situ study uses authentic course artifacts to investigate student technology integration experiences within a context similar to other institutions’ teacher preparation programs.
Research Questions The following research questions were used to frame this study: To what extent does having opportunities to learn using digital instructional tools affect preservice teachers’: 1. Overall self-efficacy towards using digital instructional tools? 2. Intent to use focal digital instructional tools in their future teaching? 3a. Integration levels to which teachers attend when describing educational uses of technology? 3b. Instructional features to which teachers attend when describing educational uses of technology? Overview of the Dissertation Chapter 2 is a literature review with an introduction to factors such as self-efficacy and personal experiences that are considered essential for changing teacher practice to increase classroom technology integration. Research questions that guide this study are provided at the end of this chapter. Chapter 3 describes the research method of the study in detail. Chapter 4 provides a quantitative analysis of student responses on the pre- and post-self-efficacy surveys, focused on the amount of change resulting from course experiences, and on the intent-to-use surveys for each of the three focal tools. A qualitative analysis is also shared of the integration levels characterized in individual student reflections, the diversity and number of instructional features attended to in focal tool reflections, and the quality of students’ final projects. Chapter 5 is a general discussion of the study’s findings and limitations, with a focus on implications for future research in teacher preparation programs as well as for the teacher candidates themselves. CHAPTER 2 LITERATURE REVIEW To situate the present study, this review of research is organized around three critical areas of research related to increasing teachers’ technology integration: (a) research describing hoped-for changes in student learning related to technology’s possibilities; (b) research describing technology’s potential to transform teaching; and (c) research about efforts to support teachers in changing their practices to incorporate technology’s possibilities. Technology’s Potential to Transform Learning Technology innovations continue to change rapidly, increasing access to information, communication, and creative arts. It is an exciting time in education, with learning being transformed by harnessing the power of digital technology to meet student and workforce needs never before possible. The most important challenge the U.S. education system faces is fostering 21st-century skills and knowledge in learners so they can participate in our global, knowledge-based civilization (Dede, 2000, 2003a, 2003b; Salpeter, 2003). Effects of Technology on Desired Learning Outcomes To take advantage of technology’s potential to connect learners globally to people, places, and resources, the National Education Technology Plan (Thomas, 2016) describes a model of learning centered on students, providing flexibility on multiple dimensions to empower them to take control of their own learning. No longer is the student the recipient of a one-way transmission from the teacher. In this model, the student is connected to a network of teachers, mentors and coaches, peers, parents, information resources, expertise and authoritative sources, tools to build knowledge and manage information, and other online supports (Thomas, 2016).
The idea of reforming our educational system to maximize the knowledge, learning, and skilled intelligence needed for competitiveness in our new knowledge economy was voiced as early as 1983, with A Nation at Risk (National Commission on Excellence in Education, 1983). In the three decades since, much research has been done to make explicit what qualitative learning changes are sought. One comprehensive resource describing student learning competencies for technology use to meet these wide-sweeping educational changes is provided by the International Society for Technology in Education’s (ISTE, 2016) standards for students. There are six distinct areas included: (a) creativity and innovation; (b) communication and collaboration; (c) research and information fluency; (d) critical thinking, problem solving, and decision making; (e) digital citizenship; and (f) technology operations and concepts. These standards are not tied to a particular grade, content area, or technology, but rather describe types of thinking skills and processes needed in order to take advantage of technological advances in society. The standards also take into account the dynamic and evolving nature of technology, building in a refresh process, inclusive of educators and community members, to incorporate new developments every five to ten years (ISTE, 2016). Formerly called the National Educational Technology Standards, or NETS, the standards were renamed the ISTE Standards in 2013 as a reflection of their global and inclusive nature. Technology’s Effects on the Learning Process As changes integrating technology are implemented in our educational system and student learning standards, the study of changes to the individual learning process itself also becomes of interest. Learning in general is loosely defined for this study as the strengthening of an individual’s practices and participatory abilities considered to be of value to the larger community as a whole, inclusive of both specific and nonspecific transfer of these practices and abilities to new contexts (Bruner, 1960, 1966; Greeno, Collins, & Resnick, 1996; Pea, 1985). Pea (1985) conceptualized transformation of learning related to technology in terms of the contributions to changes in students’ cognitive thinking and the processes of education. “How might information technology redefine the very possibilities of education?...a primary role for computers is changing the tasks we do by reorganizing our mental functioning, not only by amplifying it” (Pea, 1985, pp. 167-168). According to Pea, the amplifier metaphor originally used to describe the effects of using computers for learning falls short of describing the fundamental changes that can occur. A reorganization metaphor more accurately describes the possibilities of how the mental work of a task can be restructured by using a tool such as electronic spreadsheets, according to Pea. Although considering ways in which technology can amplify the process of teaching and learning can increase efficiency, the ways technology can serve as a tool to improve cognition remain an area in need of study to guide future technology education. Building on this research and seeking to understand whether technologies make us smarter, Salomon and Perkins (2005) described three types of cognitive changes related to technology: effects with, of, and through technology. Effects with and of technology.
Effects with technology on learning do not change the nature of the task itself but result in amplification of performance and/or increased productivity, as the technology shares part of the cognitive load during the task (Salomon & Perkins, 2005). Examples include young students writing with the help of a computer before they are able to read, or older students using an online calculator rather than solving problems by hand. Effects of technology occur when the act of using the technology changes the way in which we do other tasks after the specific technology is no longer being used. For example, in studies of video games, game play changed individuals’ ability to find a target in a messy scene: those who first developed this skill while playing video games out-performed those who did not play video games on other tasks requiring the same skill of finding the target (Salomon & Perkins, 2005, p. 79). Effects through technology. Effects through technology are the area of interest for this study. These types of effects were not included in Salomon and Perkins’s original conceptualization (Salomon, Perkins, & Globerson, 1991), but were added after observing the types of learning changes resulting from cognitive technologies. Effects through technology “do not simply enhance performance, but reorganize it” (Salomon & Perkins, 2005, p. 80). In these cases, the task itself could not be performed in a comparable manner without the technology, such as using mobile technology platforms incorporating geographic information system (GIS) data and real-time global communication. For example, students can explore invasive species in their local setting for a science class, collect images and data, and communicate their findings in real time with researchers elsewhere in the world mapping the spread of these species through tools such as ArcGIS. Later in the classroom, they can immediately use these GIS mapping images to see how their locale compares to other locales based on their actual experiences rather than as abstract concepts from a textbook. Transformation and learning through technology. Technology as transformation is another label used in the literature that is directly associated with Pea’s (1985) transformation and Salomon and Perkins’s (2005) effects through technology. Transformation is used in framework descriptions such as Hughes et al.’s (2006) Replacement, Amplification, and Transformation and Puentedura’s (2006) Substitution, Augmentation, Modification, and Redefinition frameworks. Although the phrase transformative learning is also understood to reference a uniquely adult form of metacognitive reasoning related to personal underlying assumptions and personal perspective (Dirkx, 1998; Mezirow, 1991), in the scope of this study it is interpreted as referring to dramatic changes in general in types of learning resulting from technology use. Technology’s Potential to Transform Teaching Whether related to our educational system as a whole or to the individual learning process, the call to transform learning with technology has direct effects on teachers and their practice as they prepare future citizen workers for competing in a global economy (Hughes et al., 2006; Nut, 2010; Richardson et al., 2013).
Effects of Technology on Teacher Preparation Standards Although educational technology is its own discipline, in the larger context of school the practice of integrating technology spans all disciplines (AECT, 2009; Council for the Accreditation of Educator Preparation, 2016). In addition to the new learning model for students, the NETP (2008, 2017) describes transformation expectations for individual educator accountability. “Educators will be supported by technology that connects them to people, data, content, resources, expertise, and learning experiences that can empower and inspire them to provide more effective teaching for all learners” (NETP 2017, p. 28). The connected learning model in the NETP describes how teachers can access almost limitless ideas for designing authentic learning tasks, collaborate with others worldwide regardless of their geographic location, and use data and evaluation resources for continuous learning improvement. At the same time, the NETP acknowledges gaps between what is known about how teachers can leverage technology in specific settings versus how teachers can effectively implement these ideas in all settings. More specific description of educator expectations related to transforming learning with technology can be found in the technology standards included in teacher licensure. In addition to the ISTE Standards for Students (ISTE*S) described above, there are ISTE Standards for Teachers (ISTE*T), which provide a “road map for educators worldwide as they navigate decisions about curriculum, instruction, professional learning, and how to transform pedagogy with technology” (ISTE, 2016). These standards describe observable teacher behaviors for promoting desired student learning. For example, ISTE*T 1 is described as teachers using “their knowledge of subject matter, teaching and learning, and technology to facilitate experiences that advance student learning, creativity, and innovation in both face-to-face and virtual environments” (ISTE*T, p. 1, emphasis added). The National Council for Accreditation of Teacher Education adopted the original version of these standards as a required component in 2000 (Burke, 2000). Technology is also described as one of two cross-cutting themes in the recommendations from the 2016 Council for Accreditation of Educator Preparation (CAEP). In the context used for this study, a final course assessment uses ISTE*T 2, Design and develop digital age learning experiences and assessments, to document teacher candidate preparation as part of the institutional CAEP accreditation documentation. Measuring Levels of Technology Integration One accepted practice for determining whether teachers have met the different standards for technology integration is rating an observed lesson or lesson plan against a framework describing different levels of technology integration. The Technology Integration Matrix (TIM; Harmes, Welsh, & Winkelman, 2016), the Substitution, Augmentation, Modification, and Redefinition model (SAMR; Puentedura, 2006), and the Replacement, Amplification, and Transformation model (RAT; Hughes et al., 2006) are widely used models for measuring levels of technology integration. All three models describe a variation of transformation or learning through technology as the highest level of integration. The TIM framework (Harmes et al., 2016) is the most complex of the three models, originally designed to support evaluation of technology integration levels by schools and districts for technology planning and program accountability.
In the TIM framework, there are five levels of technology integration, each with five characteristics of meaningful learning environments, creating a 25-cell matrix. Transformation is the highest level of technology integration: “The teacher encourages the innovative use of technology tools. Technology tools are used to facilitate higher order learning activities that may not have been possible without the use of technology” (Florida Center for Instructional Technology, 2017). The SAMR model (Puentedura, 2006) uses pedagogical and content references as the basis for scaffolding teachers from lower-level technology uses up to the highest of four different integration levels. SAMR was developed as a part of Puentedura’s work with the Maine Technology Learning Initiative (Romrell, Kidder, & Wood, 2014). The first two levels, Substitution and Augmentation, are considered Enhancement uses of technology, where technology substitutes directly for another tool, either with no functional change to learning or with augmentation of learning through functional improvement. The Modification and Redefinition levels are considered Transformation integration levels, where existing learning is either modified through technology for significant task redesign or redefined, where “computer technology allows for new tasks that could not have been done without the use of the technology” (Romrell et al., 2014, p. 4). The RAT model (Hughes, 2000; Hughes et al., 2006) also builds on existing pedagogical and content knowledge to assess technology integration. The RAT model was originally based on Pea’s (1985) notion of technologies as amplifiers or transformers of cognitive activity, serving as a self-assessment for pre- and in-service teachers related to their critical technological decision-making abilities. Replacement is defined as the technology serving as a different means to the same instructional practices. Amplification uses of technology increase efficiency, effectiveness, and productivity, but the basic instructional practices remain unchanged. Transformation uses of technology invent new instruction, learning, or curricula. Because the RAT model is used in the educational technology course in this study and aligns with Salomon and Perkins’s (2005) work on learning with, of, and through technology, it will be used to measure the level of technology integration in educational activities in this study. Technology integration and key instructional features. Beyond these broad levels of technology integration, educators have proposed a number of more specific aspects or features of instruction to which teachers should attend as they integrate new technologies into their instructional practice. In the RAT framework, Hughes et al. (2006) described three distinct dimensions that can be changed through technology use. The instructional methods dimension considers instructional areas specific to the teacher’s role in the instruction, such as interaction with students, lesson preparation, assessment of learning, and other administrative tasks such as grading. For example, teachers can use tools such as Google Forms to formatively assess students, using the spreadsheet produced with the responses to analyze, record, and share the data electronically with others as needed. The student learning processes dimension considers technology’s effects on the students’ thinking process, knowledge transfer, type of grouping, and motivation.
For students, creating a digital story to demonstrate concept knowledge requires different thinking processes than a written response. The curriculum goals dimension considers technology’s effects on curricular experiences, processes, or procedures. For example, the ability to take a virtual field trip of a distant archaeological site would be a distinct change to a curricular experience due to technology, as compared to viewing pictures in a textbook. According to Hughes (2006), a particular technology use may change the level of integration in any one of the three dimensions, and all three must work together well for effective learning. In this study, these three dimensions were used to characterize the aspects or features of instruction to which teachers attended when describing future uses of educational technology. Technology’s Effects on the Teaching Process For students to learn through technology in the ways called for today (Dede, 2011; Dede, Honan, & Peters, 2005; Gray et al., 2010; Hughes et al., 2006; Thomas, 2016), teachers must be able to teach through technology. Thus, teachers themselves must also acquire the set of practices needed when learning through technology in order to connect, communicate, and collaborate with others around new technologies (Chen, 2010; Mishra & Koehler, 2006, 2009). I propose that these practices parallel the learning with, of, and through technology processes described by Salomon and Perkins (2005)—the processes of teaching with, of, and through technology. These practices can also be measured using the RAT framework described above. Teaching with and of technology. For the purposes of this study, teaching with and of technology consists of teaching processes whereby a teacher uses available technologies to perform the same or similar tasks as they did prior to incorporating technology, without fundamentally changing their teaching activities. For example, as a new teacher before technology, I copied a daily math problem and sentence from a provided list onto the chalkboard for students to solve prior to a whole-class mini-lesson. This replacement integration level did not change the learning task from what it was prior to technology. Later, when my classroom was equipped with a computer and projection system, I taught with technology by using a word processing program to prepare these problems ahead of time and save myself after-school chalkboard cleanup (amplification integration level). Although these experiences are often considered a precursor to effective and transformative technology integration (Davies, 2011), they are not considered sufficient to meet the technology standards now in place for K-12 students and teachers, since little daily activity was transformed (Buckenmeyer, 2010; Cuban, Kirkpatrick, & Peck, 2001; Cuban, 2001; Wells & Lewis, 2006). This practice of teaching of and with technology is often the basis of criticisms of educational technology use in the classroom, which cite decades of computer availability and technology integration professional development without much observed change in how teachers teach or in student achievement (Buckenmeyer, 2010; Cuban, 2001, 2010). Teaching through technology.
Extending Salomon and Perkins’s (2005) conceptualization of learning through technology as a type of learning and reorganization of cognitive activity not possible without the technology, I define teaching through technology as encompassing a reorganization of cognitive activities related to teaching that is not possible without the technology. This may take place as a part of teacher lesson preparation, as a part of the enacted student lesson, or both. Using the aforementioned example to clarify, the teacher could teach through technology as part of teacher planning. For example, a teacher could virtually connect with community members and collect authentic mathematical problems used in people’s jobs, then later assist students in solving them using classroom concepts. Or, making use of technology’s ability to support academic and/or interest-based personalized learning, the teacher could design an online learning module with multiple paths for students to move through learning objectives at their own pace to acquire the different skills. Supporting Teacher Change in Using Technology In my own learning and early teaching career, I was unaware that my personal experiences paralleled early integration of educational technology. Until beginning my research in the field, I attributed the different ways in which I used technology to my interests and needs rather than to educator efforts to increase technology use and availability. For example, in high school we progressed to electric typewriters after mastering typing on a manual, and I took a programming class as a senior year elective. In college, we used computer labs to type longer papers. A few years later as a beginning teacher, our school used grant funding to provide three classroom computers for use with project-based learning strategies to increase student performance and engage our surrounding community. A decade later, I taught a large university lecture course and discovered benefits of learning management systems for connecting with students. When my children were old enough for me to pursue a doctorate, hybrid programs were available. First-Order Changes in Support of Technology Integration According to Ertmer’s (1999) early work on strategies for technology integration, the inclusion of computers in schools followed a trajectory similar to my own. By 2009, 97% of K-12 public classroom teachers had computers, with an overall student-to-computer ratio of 5.3:1, and internet was available 93% of the time (Gray et al., 2010). One large-scale quantitative study by the Pew Research Center found that 92% of teens aged 13-18 go online daily and 88% have access to cell phones at home (Lenhart, 2015). Ertmer categorized lack of computer and internet access as a first-order barrier, pulling from other research in organizational behavior and school reform (Ellis, 1994; Goodman, 1995). First-order changes are typically extrinsic and visible in nature, although these changes do not necessarily create the desired qualitative changes to student learning. In the context of general school reform, these include factors such as reducing class size, site-based councils, and ninety-minute teaching blocks (Fouts, 2003). First-order changes more directly related to technology integration include increasing the number of computers available in schools and increased funding and availability for technology professional development (Brickner, 1995; Ertmer, 1999).
The inclusion of funding allocations for technology in the re-authorized Every Student Succeeds Act (2015), as well as 441 Preparing Tomorrow’s Teachers to Use Technology (PT3) grants totaling $275M from the U.S. Department of Education (Lei, 2009), are positive first-order changes. Also, the majority of teacher preparation programs have added a mandatory educational technology course, typically focused on the technical skills and positive attitudes (Anderson, Groulx, & Maninger, 2011; Tanguma, Martin, & Crawford, 2002; Zhao, Pugh, Sheldon, & Byers, 2002) needed in support of desired transformations in K-12 education as described in the NETP (2017) and ISTE student standards (ISTE*S; ISTE, 2016). Second-Order Changes in Support of Technology Integration To transform schools through technology beyond a surface level, however, teachers must have support with what are categorized as second-order changes (Brickner, 1995; Ertmer, 1999; Fouts, 2003; Hew & Brush, 2007). Second-order changes are typically intrinsic, addressing why a change or reform is needed. Through changing the ideas behind school organization and classroom practices, a qualitatively different educational experience can occur. For example, in Fouts’ (2003) compilation of research about Washington’s school reform efforts, teachers who continued their previous beliefs and practice of lecturing despite reduced class sizes did not significantly impact student learning, whereas teachers exhibiting second-order changes, such as including more collaborative and constructivist pedagogical practices within these smaller classes, did significantly affect student learning. More closely related to technology integration, Ertmer (1999) identified self-efficacy, experience, and teacher pedagogical beliefs based in constructivist philosophy as second-order changes. In a review of empirical studies on technology integration, Hew and Brush (2007) identified similar factors leading to teachers’ use of technology. Three factors (technology skills, technology beliefs, and perceived technology barriers) encompassed approximately 76% of all identified factors that impact technology integration (Brush, Glazewski, & Hew, 2008; Hew & Brush, 2007). In spite of successful efforts greatly increasing training opportunities, these second-order changes needed for qualitative changes in student learning have not occurred for everyone (Adamy & Boulmetis, 2005; Beyerbach, Walsh, & Vanatta, 2001; Polly et al., 2010). My personal experiences using technology to learn and teach differently were shared by only 5% of teachers in K-12 settings in 1999, and almost two decades later, only about half of K-12 teachers report feeling equipped and using technology transformatively (Buckenmeyer, 2010; Cuban, 2001, 2010; Ertmer & Ottenbreit-Leftwich, 2010, 2013; Ertmer et al., 2012; Gray et al., 2010; Hew & Brush, 2007; Lei, 2009). Approaches to Address Gaps between Having and Using Technology As technology availability in classrooms became more prevalent in response to first-order changes, more research began to focus on changing teacher practice. Three separate but related approaches emerge in the literature seeking to identify effective methods to support teachers’ effective integration of available technologies. Supporting Teachers’ Self-Efficacy Self-efficacy is consistently linked with increased technology integration and intentions to use technology in future teaching (Ertmer, 1999; Hsu, 2012, 2013; Lee & Tsai, 2008).
Bandura’s (1977) self-efficacy theory explains personal efficacy expectations as derived from four principal sources of information: performance accomplishments, vicarious experience, verbal persuasion, and physiological states. Stated simply, Bandura’s theory connected what happens in one’s mind with what one actually does. Bandura later expanded on the importance of mastery or performance experiences, finding that “…mastery experiences and comparative appraisals are more reliable diagnostic indicants of capability than affective arousal, which bears no uniform relationship to performance accomplishments” (Bandura, 1986, p. 365). As a psychological construct, his theory was conceived to be inclusive of all people and contexts, and it is therefore a foundational construct in this study. To understand influences on teaching self-efficacy beliefs, Tschannen-Moran and Woolfolk-Hoy (2006) studied 255 novice and career teachers, building on Bandura’s (1986, 1997) earlier work using two of the four sources. Verbal persuasion was operationalized as the interpersonal characteristics between colleagues, parents, and community. Mastery experiences were operationalized as a sense of satisfaction with one’s past teaching successes. Tschannen-Moran and Woolfolk-Hoy echoed Bandura’s (1986, 1997) findings that mastery experiences were the strongest predictor of self-efficacy for both career and novice teachers, with 19% of the variance related to self-efficacy explained by contextual and mastery experiences for the career teachers, and 49% for novice teachers. These findings and recommendations for future study have implications for computer self-efficacy development, as the lack of prior experience as a learner using digital technologies is mentioned as a barrier to technology use (Lei, 2009; Olofson, Swallow, & Neumann, 2016; Park & Ertmer, 2008). In studies of technology integration, teachers also report the ability to experience using tools in context as a key component in their successful future use of these tools (Archambault & Larson, 2015; Brenner & Brill, 2016; Park & Ertmer, 2008; Storandt, Dossin, Lacher, & Piacentini, 2012). Within the domain of computer self-efficacy, Wang, Ertmer, and Newby (2004) studied 337 participants taking an Introduction to Educational Technology course at a large Midwestern university to examine the effects of vicarious experiences and goal setting on preservice teachers’ judgments of self-efficacy for technology integration. Using a mixed-factorial design that created four experimental conditions from the two independent variables, the authors designed and administered pre- and post-assessments using the Computer Technology Integration Survey, following a one-time treatment using an electronic instructional tool (VisionQuest CD-ROM) to provide vicarious teaching experiences. They found significant positive gains in computer self-efficacy for participants who were exposed to vicarious experiences (with and without goal setting). Although it confirms the importance of in-class experiences for preservice teacher technology integration, this study is still limited by its one-time event design, similar to that of Tschannen-Moran and Woolfolk-Hoy (2006).
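To make the logic of such pre/post self-efficacy designs concrete, the following minimal sketch (in Python, using scipy) shows how a paired-samples test of pre- to post-survey change and a between-groups comparison of gain scores are typically computed. All scores below are hypothetical placeholders, not data from Wang et al. (2004) or from the present study.

import numpy as np
from scipy import stats

# Hypothetical pre/post self-efficacy scores (CTIS-style Likert means)
# for the same participants; entries at the same index are matched pairs.
pre = np.array([3.1, 3.4, 2.8, 3.9, 3.2, 3.6, 3.0, 3.5])
post = np.array([3.8, 3.9, 3.3, 4.2, 3.7, 4.1, 3.4, 3.9])

# Paired-samples t-test: did scores change from pre to post?
t_paired, p_paired = stats.ttest_rel(post, pre)
print(f"pre/post change: t = {t_paired:.2f}, p = {p_paired:.4f}")

# Between-groups question: did one condition grow more than the other?
# Compare gain scores (post minus pre) across hypothetical groups.
gain_treatment = np.array([0.7, 0.5, 0.5, 0.3])
gain_control = np.array([0.5, 0.5, 0.4, 0.4])
t_ind, p_ind = stats.ttest_ind(gain_treatment, gain_control)
print(f"group difference in gains: t = {t_ind:.2f}, p = {p_ind:.4f}")

A significant paired test combined with a nonsignificant group difference in gains corresponds to the pattern summarized in this dissertation’s abstract: self-efficacy growth in both conditions without a detectable difference attributable to the intervention.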
Repeated studies find that second-order changes such as self-efficacy and beliefs about technology are often as important as, or more important than, having all of the right resources in place for effective technology integration in classrooms (Anderson et al., 2011; Buckenmeyer, 2010; Chen, 2010; Ertmer, 1999; Ertmer & Ottenbreit-Leftwich, 2010; Hughes, Ko, Lim, & Liu, 2015; Zhao & Frank, 2003). In a study of 25 exemplary technology-using teachers, effective technology integration was observed even when resources and time were limited. The teachers, selected from five different Midwestern universities based on winning technology integration awards, cited the intrinsic factors of confidence and commitment as most important. Comparing groups with high confidence and espoused beliefs about the importance of technology integration in the classroom, teachers with more years of experience demonstrated higher enacted beliefs about technology integration (Ertmer, Ottenbreit-Leftwich, Sadik, Sendurur, & Sendurur, 2012). Many of today’s preservice teachers have grown up with technology and use it personally. Lei (2007) surveyed digital natives (as defined by Prensky, 2001) about their beliefs, attitudes, confidence, and interest in technology to understand what additional preparation, if any, was needed for technology integration, given the research on digital natives describing them as enthusiastic users of technology personally and in school (Vandewater et al., 2005). Findings suggested that although the digital natives reported strong beliefs and confidence in their technology skillsets, these did not transfer from their social-communication activities and learning activities as students to intended classroom use. “As preservice teachers, they lack the knowledge, skills, and experiences to integrate technology into classrooms to help them teach and to help their students learn, even though they fully recognize the importance of doing so” (Lei, 2009, p. 92). Supporting Teachers’ Knowledge Development Teachers are charged with using technology to teach children to solve problems in deeply ambiguous and confusing situations (Jacobsen, Clifford, & Friesen, 2002). Yet CAEP and the NETP acknowledge a lack of deep, experience-based understanding of how teaching and learning with technology change all facets of teaching practice. The TPACK framework (Mishra & Koehler, 2006) is one way researchers have conceptualized the new intersections of knowledge and abilities that occur when integrating technology, consisting of three originally discrete knowledge areas: technological, pedagogical, and content knowledge. This representation of knowledge extends Shulman’s (1986) original conceptualization of teacher pedagogical content knowledge as “a particular form of content knowledge that embodies the aspects of content and of teaching ability” (p. 9). Other researchers have built upon Shulman’s work to examine specific areas such as science teacher knowledge (Abell, 2008; Magnusson, Krajcik, & Borko, 1999) and assessment knowledge (Avargil, Herscovitz, & Dori, 2012). The TPACK framework is unique in that it adds technological knowledge directly into Shulman’s PCK framework to create three equally intertwined knowledge areas.
One of the original intentions of the TPACK framework was to capture the dynamic interplay of the flexible thinking needed for teachers to go beyond isolated discipline knowledge, pedagogical strategies, and technologies, and to assess and analyze teachers’ acquisition of TPACK (Koehler et al., 2011; Koehler, Mishra, & Yahya, 2007; Mishra & Koehler, 2006; Mouza, Karchmer-Klein, Nandakumar, Yilmaz Ozden, & Hu, 2014; Rosenberg & Koehler, 2015).

Experience Matters

For teacher preparation in general, research finds structured experiences while being mentored by experienced teachers critical for classroom teaching success. Efforts to create a guide for preservice and early career mentors on effective teaching emphasized the benefits of personal experience with learning, as well as using different pedagogical strategies, as formative to preservice teacher development (Darling-Hammond, 1999; Darling-Hammond & Sykes, 1999; Darling-Hammond, Wei, Andree, Richardson, & Orphanos, 2009). Brown and Duguid (2011) also describe the need to be a practitioner of teaching, rather than simply learning about the practice of teaching, for teacher preparation. The CAEP (2016) rationale also recommends candidates first experience for themselves technologies they will later teach students, for example, “…opportunities to develop the skills and dispositions for accessing online research databases, digital media, and tools and to identify research-based practices that can improve their students’ learning, engagement, and outcomes” (p. 32). Teachers also need experience discerning effectiveness for different innovations in their teaching context and changes needed in their pedagogical practice (Bolick, Berson, Friedman, & Porfeli, 2007; Pearcy, 2013). “Learners need time and guidance to achieve the effects that many contemporary cognitive technologies afford…it takes time for innovators to see the possibilities…” (Salomon & Perkins, 2005, p. 81).

In one study of six schools in three states chosen for their attempts to implement technology (Frank, Zhao, & Borman, 2004), longitudinal and network data were analyzed to explore the processes of implementing new teaching practices. Using multiple quantitative measures, seven characteristics from diffusion of innovation and educational research were measured for their relationship to the outcome of computer implementation. Teachers’ own expertise with computers was statistically significant and the most important predictor of implementation, with a standardized coefficient of .32, double the size of perceived social pressure to use computers. Access to expertise through help and talk (social capital) was also significant, with the second highest standardized coefficient of .21. The need for personal experiences to develop technology integration expertise is echoed in other research. Overall, teachers who experienced learning in a transformative environment had a better understanding of the concepts needed to design transformative learning environments (Ertmer, 2003; Fantilli & McDougall, 2009; Martin et al., 2010; Strudler & Hearrington, 2008). “Teachers, like most adults, learn from experience” (Luckman, 1996, as quoted in Burden & Hunt, 2010, p. 148).
CHAPTER 3

METHOD

To study the effects that prior experiences learning through technology have on preservice teachers’ self-efficacy, intentions to use, and ability to describe teaching through technology, a quasi-experimental design was used within six sections of an undergraduate, teacher preparation educational technology course at a comprehensive, Midwestern regional university. This upper-division teacher preparation program offers a rich balance of the broad knowledge, thinking, and exploration indicative of the liberal arts, combined with extensive, situated, supported practice in P-12 schools as well as opportunities for diverse field experiences. Students enrolled in sections of a stand-alone educational technology course as part of normal school administration and were assigned to either a secondary or an elementary section. Half of the intact sections participated in an experimental treatment incorporating three specific face-to-face experiences learning through technology prior to a unit in which they learned about using that technology for teaching. Half of the sections participated in the same unit in which they learned about using the technology for teaching but did not participate in the experimental treatment in a prior unit.

Participants

Participants were typical traditional college students in their last year of the teacher preparation program taking a required Technology in Education course. Most took this course simultaneously with their teacher assisting field experience and other methods courses for a full-time or greater student course load, with many working part time. Although students may have experienced technology integration in content area courses, this experience is not consistent across content areas and faculty members. Additionally, a wide range of technology skills was represented, ranging from novice users with primary experiences limited to word processing to extensive users of social media. A few students had additional specialty technology experience such as multimedia creation and graphic design, music composition, and mathematics programs. The students were divided fairly equally between elementary and secondary education majors, with 5-10% majoring in Art, Music, or Physical Education. Approximately 85% of the students were Caucasian, similar in cultural diversity to the region. Of the eight course sections (each capped at 20 students) offered in the Fall 2016 semester, six taught by regular faculty were included in the study. A voluntary consent form (Appendix A) was distributed to each section in the first face-to-face class.

Table 1. Participants Giving Voluntary Consent and Section Information

  First set of Fridays   Second set of Fridays   Instructor   Meeting Time
  Elementary, n = 20     Elementary, n = 19      1            3:00 – 5:50 PM
  Elementary, n = 17a    Elementary, n = 18b     2            3:00 – 5:50 PM
  Secondary, n = 19      Secondary, n = 17       1            6:00 – 8:50 PM

  a Out of 18 total possible section participants. b Out of 19 total possible section participants.

The Course

The 3-credit Technology in Education course is required for students after they are accepted to the College of Education, but prior to their student teaching semester. Students must achieve a B- or higher in the course to move on to student teaching. The course is offered during the year in a hybrid format consisting of four face-to-face sessions, each three hours long, with the remainder of the course conducted online.
Course overview. The course is designed to provide students with a foundation for using constantly changing hardware, software, and web applications in a purposeful manner incorporating the latest in learning and media theory. A major emphasis within the course is ISTE Standards - Teachers (ISTE-T) 2: “Design and develop digital age learning experiences and assessments” (ISTE, 2016). Other ISTE-T standards, related to issues of the digital divide, cultural awareness, and educational technology policy, are also addressed to a lesser extent during the course. There is no field observation component, nor is there consistency in the technology present or the technology efficacy of supervising teachers and field coordinators in existing field components. A common course assessment and a common course syllabus are required for use by all instructors. These are publicly available on the GVSU website, and any changes require GVSU Curriculum Committee approval. During the semester of this study, all instructors teaching the course communicated frequently to ensure alignment across all sections.

Course structure. Within the 14-week course, content is organized into seven units stemming from the ISTE-S standards used for K-12 teaching and learning of educational technology. Each two-week-long unit includes a technology-based project incorporating classroom applications along with educational technology-specific content. These units afford students a holistic view of how educational technology is used not only in the pedagogical aspects of teaching, but for many other areas within teacher practice essential for ongoing personal development and professionalism (Siko, 2016). For example, during the first unit students create their own Personal Learning Network (PLN) using a blog as a base. Throughout other units, they return to their PLN to add additional resources and blog posts on other aspects of teaching as well as to share their information with others. In each unit, students also use a different digital instructional tool as a means to share their learning as well as develop technical skills in a modified Learning by Design format (Kolodner et al., 2003).

These seven units also build and contribute different pieces to the major end-of-course assessment, a technology integrated lesson plan (TILP) worth 40% of the course grade. Students develop a multi-step lesson designed to meet both a content and a technology standard, inclusive of their rationale and explanation for technology used in the teaching and assessment strategies. Each part of the TILP is structured to incorporate scaffolding and feedback by the instructor to support students, including a required conference. The TILP is scored using a common course assessment level rubric designed to ensure students are able to demonstrate ISTE-T Standard 2. Each course in GVSU’s teacher preparation program has a required common course assessment aligned to the ISTE-T standards, and students are familiar with the concept and process from prior courses. All common course assessments are vetted by the College of Education Curriculum Committee.

The Intervention

The intervention consisted of changing three of the seven units to provide students (preservice teachers) with opportunities to use a given digital environment to learn through the technology before expecting them to teach through the technology. The changed lessons took place during the first three face-to-face sessions.
The selected instructional tools for the intervention are as follows, all addressing ISTE-T 2a, 2b, and 2c:

1. Interactive PowerPoint (non-linear presentation). Using a non-linear presentation to support students’ individual curiosities and diverse learning abilities (Unit 2).
2. Webquest. Using a webquest to support collaborative, project-based learning (Unit 4).
3. Digital Storytelling. Using the process of creating a digital story to communicate essential information using a multi-modal approach (Unit 6).

Figure 1. Course structure and timing of intervention and focal units. (*Face-to-face session. **Computer Technology Integration Survey.)

For each selected unit, students in the treatment group used a specific instructional tool for a learning activity in face-to-face sessions prior to the unit in which they learned about and used the tool for teaching others. For example, Unit 4 focused on webquests as a digital learning environment. In the face-to-face session for Unit 3, treatment group students learned course content by engaging in an instructor-designed webquest, thus experiencing learning through this instructional technology. Students in the control group learned about the same content in their Unit 3 face-to-face setting, but with different activities that did not include engaging in a webquest; their learning activity did not involve learning through technology. In Unit 4, students in both the treatment and control groups learned about designing and using webquests for instruction. Both groups had access to the same reading materials and other resources for learning about using webquests. At the end of Unit 4, students in both groups submitted a webquest they designed to teach others for their unit assessment and took a one-question survey about their intent to use that tool in the future.

Intervention Details

Each of the three interventions is described here, including the operational definition of each instructional tool, a short rationale, and a comparison of the control and treatment groups. The interventions for all three instructional tools took place in face-to-face sessions, the classroom setting most similar to the context in which students would be using that type of teaching through technology in the future. For both groups, the artifact and assessment of their ability to teach through the instructional tool was the same. Access to all online materials was the same. See Appendix B for a detailed course calendar listing each of the units and scheduled interventions.

1. Interactive PowerPoint. An instructional tool applicable in both online and offline settings that promotes individualized learning paths through a specific area of content; also referred to as a non-linear presentation. These can be created by students, teachers, or both, and can be packaged in non-internet-dependent formats for greater access. Because it can be used without internet access, it can be implemented in settings where internet access is either not available or not recommended due to safety policies.

2. Webquest. An instructional tool applicable in online settings that promotes collaborative, inquiry-based learning using web resources and is focused on the learner creating solutions rather than following a traditional sequence of learning tasks (Dodge, 1998; Wood, 2001). Although the “quest” and the resources are presented in an online environment, the end product created can be for either on- or off-line use.
Constructivist pedagogy is employed: the webquest is created by the teacher to encompass specific content, and the learner then transforms this content into a new solution/product by completing the webquest.

3. Digital Storytelling. An instructional tool where the learning depends on using a combination of audio and video to communicate a desired message to a specified audience digitally. Through creating and sharing a digital story, students can enhance their research, professional communication, and collaboration skills (Stewart & Gachago, 2016). A particular advantage of digital storytelling is its versatility. Programs and apps exist that are simple enough for kindergarteners to use with minimal help, as well as programs complex enough to be aligned to any level of skill. For students with disabilities and/or impairments in reading or writing, images and spoken words can be used as an accommodation strategy. The products created also afford students means to engage with others digitally. These can be teacher- or student-created, and packaged to share using online and offline tools.

Table 2 below shows the intervention structure comparing control and treatment group experiences.

Table 2. Comparison of Control and Treatment Group Unit Experiences

Learning experiences through the technology (Intervention Units: 1, 3, 5)
  Control group: Learning activities use traditional formats of presentation, discussion, and other teacher-directed activities. No teacher-directed experiences within course learning through the three instructional tools chosen for intervention within face-to-face sessions.
  Treatment group: Some learning activities use traditional formats of presentation, discussion, and other teacher-directed activities. Students engage in a teacher-created learning activity learning through the selected instructional tool during a face-to-face session in order to complete unit assignments.

Teaching experiences through the technology (Focal Units: 2, 4, 6)
  Control group: Students use teacher-selected course readings to discuss instructional tool benefits for their unit assignments and submission of a self-created artifact using the specific tool for teaching content to others.
  Treatment group: Students use teacher-selected course readings to discuss instructional tool benefits for unit assignments and submission of a self-created artifact using the specific tool for teaching content to others, as well as their prior teacher-directed experience learning through the instructional tool during the face-to-face session.
  Both groups: Have online access to identical examples, how-to materials, and resources about instructional tool pedagogies, as well as required participation in online and face-to-face learning activities. Identical artifact submission and assessment.

Treatment and Control Groups

The study used six of the eight Fall 2016 course sections. Students self-registered for these sections based on the meeting times and whether they were secondary or elementary; at the time of registration, instructors had not yet been assigned to the course sections. Each elementary section had four face-to-face sessions on a Friday from 3:00 pm – 6:00 pm. Secondary sections met Fridays from 6:00 pm – 9:00 pm, in identical classrooms equipped with room projection systems. The intervention used pairs of sections based on teaching level, with three pairs in all (two elementary pairs and one secondary pair).
I taught one elementary and one secondary pair; a university faculty member taught the other elementary pair. Both instructors had taught this course multiple times within the past year using a format similar to the control version.

Data Sources and Descriptions of the Measures

Four data sources were collected during specified course units, as listed in Table 3 and described below. All data sources were part of the regular graded coursework, and some students chose not to complete all items. Any differences in the number of participants for each data source are described where applicable.

Table 3. Data Sources and Unit of Collection

  Overall self-efficacy (RQ1): pre- and post-assessment CTIS surveys (quantitative), collected in Units 1 and 7
  Intent to use in future teaching (RQ2): intent-to-use survey (quantitative), collected in Units 2, 4, and 6
  Descriptions of integration levels and instructional features (RQ3): post-unit reflection paragraphs within the unit artifacts (qualitative), collected in Units 2, 4, and 6
  TILP institutional assessment quality score (RQ3): Technology Integrated Lesson Plan (quantitative), collected in Unit 7

Self-efficacy survey. The Computer Technology Integration Survey (CTIS) was created by Wang et al. (2004) to measure self-efficacy in a similar study with preservice teachers in an introductory educational technology course. This 21-question survey, administered electronically in the first and last unit of the course, used a 5-point Likert scale (5 = Strongly Agree, 4 = Agree, 3 = Neutral, 2 = Disagree, 1 = Strongly Disagree). Wang et al. (2004) established content validity for self-efficacy by a panel of experts prior to survey administration. A factor analysis on pre- and post-CTIS data showed 16 of the 21 items related to the self-efficacy construct. Five items representing a second factor (external influences of computer technology uses) were not used in data analysis. Cronbach’s alpha coefficients of 0.94 and 0.96 were reported for the pre- and post-survey, respectively. As the current study’s sample size (N = 113) was not large enough to be considered reliable for a factor analysis given the number of variables in the CTIS survey, the Wang et al. (2004) factor analysis was used (MacCallum, Widaman, Zhang, & Hong, 1999). The CTIS question wording and format were not modified in any way when administered in this study. All 21 questions were administered in the pre- and post-assessment as part of the regular course content; only the 16 questions specific to the self-efficacy construct were included in the data analyzed.

Participants took the CTIS during the first and last face-to-face class sessions on their personal devices while the instructor stepped out of the room, consistent with the method the institution uses for end-of-course surveys. During the final class session, four (of 38) elementary and two (of 19) secondary participants in the control sections did not respond to the CTIS due to a school-sponsored travel abroad trip, as well as one elementary and one secondary participant in the treatment sections. Participant numbers for the treatment groups were additionally affected by a severe snowstorm hitting the area the evening of the final session. Seven (of 36) elementary participants and seven (of 17) secondary participants were absent due to travel or weather. Although the CTIS was also made available electronically, this option was not used by students, as it was also the last day of the semester.
Thus complete pre- and post-CTIS data were available for only 76 participants.

Survey of intent to use. For the instructional tool in each of the focal units (Interactive PowerPoint, Webquest, and Digital Storytelling), participants indicated their level of intent to use that tool in their future teaching on a one-question survey using the same 5-point Likert scale as the CTIS.

Reflective paragraphs. As a part of each unit assessment, students wrote a 1-2 paragraph reflection describing how they might or might not use the instructional tool presented in their own future classroom teaching. For example, in the Pedagogy and Learner Analysis Unit, the students used Interactive PowerPoint to present their findings to their peers. As a part of their presentation, they responded to the following prompt:

    Include a section where you reflect on different ways you could use Interactive PowerPoints in your future classroom. Be as specific as possible, including examples and why you would use this tool rather than other options. Include both affordances and constraints where applicable. This should be about one or two paragraphs in length. (Unit 2 Assessment Artifact)

The purpose of this reflection data was two-fold: first, to identify the effects of the intervention on the PSTs’ ability to describe different aspects of teaching practice related to their future use of a particular instructional tool, including whether it represents a replacement, amplification, or transformative level of technology integration (Hughes et al., 2006); second, to identify whether there were additional factors related to participants’ ability to describe in narrative form their future use of an instructional tool.

These reflections were submitted by participants as a part of each focal unit. In some cases, students did not complete the artifact itself or did not complete the reflection portion. Per standard course procedure, any student missing part or all of an assignment was given a reminder and opportunity to complete the assignment, or given a reduced grade if the work was not completed. As a result, the Ns for analyses of reflective paragraphs vary across tools (Interactive PowerPoint = 107; Webquest = 88; Digital Storytelling = 97).

The Technology Integrated Lesson Plan (TILP). The final major course assessment was a technology-based lesson plan following the ASSURE model (Lowther, Smaldino, & Russell, 2008). The TILP and the assessment criteria were introduced at the beginning of the semester as the summative assessment for the class; see Appendix D for the TILP Template and TILP Rubric. Each participant’s TILP describes in detail one multi-part lesson in which a minimum of one content area standard and one technology standard is chosen as the basis. Each participant submitted one TILP that was treated as an individual data segment. The purpose of collecting and analyzing these data was two-fold. First, the levels of demonstrated integration and the ability to describe different instructional features directly related to ISTE-T Standard 2 are of interest. Second, the relationships between the levels and descriptions used in the summative assessment and the instructional tools used in the course are of interest to ensure course content alignment with student outcomes and stated future instructional needs.

Data Analysis

CTIS survey. Likert items were treated as continuous-level data, as they were by the CTIS authors, Wang et al. (2004). A Q-Q plot was used to check normality and a scatterplot to check homogeneity assumptions. Levene’s test for homogeneity of variances was used and corrective measures were applied. A mixed ANOVA was used with a within-subjects factor of time (pre- and post-CTIS) and a between-subjects factor of condition (control and treatment).
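To make the structure of this analysis concrete, a minimal sketch in Python is shown below, assuming the scipy and pingouin libraries and a long-format data frame. The file and column names (ctis_long.csv, participant, time, condition, ctis) are hypothetical placeholders, and the instructor factor is omitted for simplicity; this illustrates the form of the analysis rather than reproducing the study's actual analysis script.

```python
# Sketch of the CTIS analysis: Levene's test followed by a mixed ANOVA.
# Assumes a long-format data frame with one row per participant per time point;
# all file and column names here are hypothetical.
import pandas as pd
import pingouin as pg
from scipy import stats

df = pd.read_csv("ctis_long.csv")  # columns: participant, time, condition, ctis

# Levene's test for homogeneity of variances on the pre-survey scores
pre = df[df["time"] == "pre"]
w, p = stats.levene(pre.loc[pre["condition"] == "control", "ctis"],
                    pre.loc[pre["condition"] == "treatment", "ctis"])
print(f"Levene's W = {w:.2f}, p = {p:.3f}")

# Mixed ANOVA: within-subjects factor time, between-subjects factor condition
aov = pg.mixed_anova(data=df, dv="ctis", within="time",
                     subject="participant", between="condition")
print(aov[["Source", "DF1", "DF2", "F", "p-unc", "np2"]])  # np2 = partial eta squared
```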
Intent-to-use survey. The intent-to-use survey used the same 5-point Likert scale as the CTIS items and was treated as continuous data. The single survey question was administered at three separate and independent time points during each focal unit. A mixed ANOVA was performed with a within-subjects factor of the three digital tools (Interactive PowerPoint, Webquest, and Digital Storytelling) and a between-subjects factor of condition (control, treatment).

Reflective paragraph analysis. Text from each of these paragraphs was coded using a constant comparative method (Corbin & Strauss, 2008; Kolb, 1984; Rourke & Anderson, 2004; Taylor & Bogdan, 1998). The researcher and another full-time faculty member teaching the course began developing the coding framework with a data set from the summer pilot course. First, the data sets were read individually and initial codes developed. Then both researchers compared the different sets of codes and began a series of readings and conversations about the emerging patterns. After examining different data sets from additional previous student coursework with similar content, the diversity of possible responses, combined with the open-ended quality of the reflection prompt observed by both researchers, led to a decision to use existing frameworks as a basis for the data analysis.

Two layers of coding emerged, with Layer 1 being the overall level of integration of the technology use described, as one of the three categories from the RAT framework (Hughes et al., 2006): Replacement, Amplification, or Transformation (see Table 4). If a reflective paragraph contained multiple uses of the technology, only the highest level of integration was coded.

Table 4. Coding Frame of Reflective Paragraphs and Technology Integrated Lesson Plan (Layer 1)

  Replacement (R): Factors indicating the technology use results in no fundamental changes to cognitive learning processes occurring during the lesson.
  Amplification (A): Factors indicating the technology use results in amplified learning possibilities (e.g., increased productivity by sharing a Google template with students), but cognitive activities themselves are not inherently changed.
  Transformation (T): Factors indicating the technology use results in cognitive changes for the teacher or student that could not occur without the use of the technology (e.g., conducting a pro/con debate between classrooms in two different countries using Skype).

Coding Layer 2 addressed aspects of teaching practice, or instructional features, to which participants attended in their reflective paragraphs. The three dimensions of instructional practice from Hughes et al.’s (2006) RAT framework (instructional methods, student learning processes, and curricular goals) served as the initial coding framework. As coding proceeded, the instructional methods category was broadened to include the teacher’s role, and the curricular goals category was broadened to include focus on content. Additionally, a number of student statements appeared to fit well with Ertmer’s (1999) work on barriers describing structural factors impacting technology integration; therefore, structural factors was added as a fourth category. The final codes are shown in Table 5 below.
Each participant reflection (one for each focal technology) was coded using these codes. As the participant reflections were not structured through the use of a template or word count, the writing varied in format. Most students wrote in paragraph form, but some chose to use bullet points or a table format. Each sentence or sentence equivalent was considered an individual data point to be coded for instructional features. In some cases, I divided sentences into two or more sentence equivalents for coding when multiple instructional features were described, such as within a compound sentence or when bullet points were used.

Table 5. Coding Frame for Instructional Features Described in Reflective Paragraphs (Layer 2)

  Student Learning Processes (SL): Learning activity/task; thinking process (mental process, knowledge transfer); task milieu (individual, small group, whole class, other); student motivation; student attitude.
  Teacher/Instructional Methods (IM): Description of instructional method/activity; teacher’s role in instruction; interaction with students; assessment of students; instructional preparation; administrative tasks related to instruction (e.g., grading); communication with others.
  Curricular/Content (CC): Curricular knowledge or concepts, curricular experiences, curricular processes or procedures.
  Structural Factors (SF): Items such as needing access to computers, time to plan, money to purchase things, or the computer itself not working (as that is typically related to the school IT department, whereas a program not working is user error).
  Not Directly Related (NR): Statements that are too general or vague, do not address the prompt, or do not add meaning to the aspects of teaching practice attended to in the description.

Reliability of coding. To establish reliability of coding, a second coder, a doctoral student in my cohort, coded a random subset of the reflective paragraphs. Although the second coder was not familiar with the specific course content, her doctoral program and work as a teacher/media specialist gave her a high familiarity with the field of educational technology and teaching with technology in the classroom. A data set of approximately 5% of the reflective paragraphs was randomly selected for coder training. Both coders coded these data and discussed disagreements to refine code definitions. The training instructions are provided in Appendix F.

After coder training, 22% of the reflective paragraphs were randomly selected and coded by me and the second coder. Cohen’s Kappa was used to calculate interrater agreement; it is appropriate for interrater agreement of categorical scores with two coders, such as the reflection paragraphs and TILPs. Reliabilities between 0.81 and 1.00 are considered almost perfect (Landis & Koch, 1977). Cohen’s Kappa for the level of integration (Layer 1) codes was 0.92 and for the instructional features (Layer 2) codes was 0.79.
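As an illustration, Cohen's Kappa can be computed in a few lines of Python; the sketch below assumes scikit-learn, and the two code lists are hypothetical stand-ins for the coders' Layer 1 labels, not the study's data.

```python
# Sketch of the interrater agreement calculation for categorical codes.
# kappa = (p_o - p_e) / (1 - p_e): observed agreement corrected for chance.
from sklearn.metrics import cohen_kappa_score

coder1 = ["R", "A", "A", "T", "R", "A", "R"]  # hypothetical Layer 1 codes, coder 1
coder2 = ["R", "A", "R", "T", "R", "A", "R"]  # hypothetical Layer 1 codes, coder 2

kappa = cohen_kappa_score(coder1, coder2)
print(f"Cohen's kappa = {kappa:.2f}")
```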
Technology Integrated Lesson Plan. An existing rubric, developed within the Educational Technology Unit and approved by the institution’s College of Education as part of the accreditation process, was used to score the TILPs for levels of quality across six different criterion areas aligned with ISTE-T Standard 2: “Design and develop digital age learning experiences and assessments” (ISTE, 2016). The rubric specifies three levels (Proficient, Developing, and Unsatisfactory) for each of the six criterion areas. The rubric’s six criterion areas are included in Appendix E; five of the criteria are related to the research questions of this study and included in analysis: (a) standard and objective alignment; (b) presentation strategies; (c) generative strategies; (d) instructional media; and (e) needs of diverse learners.

CHAPTER 4

RESULTS

Results are presented in three sections. The first reports on the pre- and post-Computer Technology Integration Survey (CTIS) measuring preservice teachers’ digital self-efficacy. The second presents results of three separate surveys asking participants to report their intent to use each of the focal tools (Interactive PowerPoint, Webquest, and Digital Storytelling) in future teaching. The third section focuses on participants’ descriptions of technology integration in their reflections on use of the focal tools in teaching and in their final technology integrated lesson plans.

Preservice Teachers’ Digital Self-Efficacy

Research Question 1 addressed how participants’ digital self-efficacy is affected by their experiences using technology as learners themselves. Participants’ digital self-efficacy was measured by the CTIS pre- and post-surveys. On the five-point Likert scale used in the CTIS, a score of 5 represents a response of strongly agree, so a higher score indicates higher digital self-efficacy. Mean scores on the pre- and post-survey, presented by treatment condition and instructor in Table 6, demonstrate a fairly high level of digital self-efficacy. The overall mean score for the CTIS post-survey taken during the last unit of the class was 4.29 (N = 76, range: 1 to 5, SD = 0.45).

Table 6. Mean (SD) CTIS Survey Pre- and Post-Survey Scores

               Pre-CTIS                                     Post-CTIS
  Condition    Instructor 1   Instructor 2   All            Instructor 1   Instructor 2   All
  Control      3.62 (.49)     3.72 (.28)     3.64 (.45)     4.40 (.48)     4.25 (.37)     4.37 (.45)
  Treatment    3.56 (.58)     3.70 (.26)     3.62 (.48)     4.15 (.41)     4.24 (.49)     4.17 (.42)

A mixed ANOVA was performed to investigate effects on the dependent variable (CTIS survey score) of the within-subjects factor time (pre- and post-CTIS) and the between-subjects factors condition (control, treatment) and instructor (Instructor 1, Instructor 2). Inspection of Q-Q plots from the initial CTIS pre-survey data showed digital self-efficacy as normally distributed for both conditions and instructors. Levene’s Test for Equality of Variances was not significant, suggesting no violation of the homogeneity of variance assumption for condition or instructor. The ANOVA resulted in a significant main effect for time (pretest, posttest), F(1, 72) = 75.23, p < .001, ηp² = .51, indicating that students’ self-efficacy increased across the semester. The proportion of variance in subjects’ CTIS scores associated with time was 51% (Bakeman, 2005). There was no main effect for condition (control, treatment), F(1, 72) = 0.38, p = .54, ηp² = .005, or for instructor, F(1, 72) = 1.12, p = .29, ηp² = .015, nor was there a significant interaction between condition and instructor, F(1, 72) = 0.48, p = .49, ηp² = .007. Figure 2 shows the estimated marginal means of the pre- and post-CTIS scores by condition.

Figure 2. Estimated marginal means of pre- and post-Computer Technology Integration Survey.

Within the control and treatment groups, both instructors showed a significant difference between the pre- and post-CTIS, as shown in Table 7 below.
No significant difference was found between the two instructors for condition (treatment, control).

Table 7. Paired Samples Test of Change between Pre- and Post-Mean Score on CTIS

  Condition   Instructor     t      df   Sig. (2-tailed)
  Control     Instructor 1   8.97   33   .000
  Control     Instructor 2   8.30   10   .000
  Treatment   Instructor 1   4.69   23   .000
  Treatment   Instructor 2   3.51    6   .013

Intent to Use the Digital Tools

To examine effects of the treatment on participants’ intent to use particular tools (Research Question 2), intent to use each digital tool (Interactive PowerPoint, Webquest, Digital Storytelling) was measured at the completion of the respective focal units. On the five-point Likert scale used in the intent-to-use surveys, a score of 3 represents a neutral response and a 4 represents a response of agree. Means and standard deviations of these measures are presented below in Table 8.

Table 8. Descriptive Statistics for Intent to Use Specific Tools

  Tool*         Condition   Mean   Std. Deviation   N
  IPPT Intent   Control     3.96   .469             46
                Treatment   3.83   .816             47
  WQ Intent     Control     3.52   .863             46
                Treatment   3.32   .935             47
  DS Intent     Control     3.87   .749             46
                Treatment   3.60   .876             47

  *Tool names are abbreviated in tables and figures as: Interactive PowerPoint (IPPT), Webquest (WQ), and Digital Storytelling (DS).

To test whether these differences were significant, data were analyzed using a mixed-design ANOVA with a within-subjects factor of the three digital tools (Interactive PowerPoint, Webquest, Digital Storytelling) and a between-subjects factor of condition (control, treatment). Mauchly’s test indicated that the assumption of sphericity had been violated, χ²(2) = 7.2, p = .027; therefore, degrees of freedom were corrected using Greenhouse-Geisser estimates of sphericity (ε = .93). Levene’s Test of Equality of Error Variance showed that the error variance on the intent-to-use surveys was not equal across groups, violating the assumption of homogeneity of variances. However, because the ratio of treatment group size to control group size in this study was 47/46 = 1.02 (less than 1.5), the F statistic was considered robust (Green, Salkind, & Akey, 2000). No significant main effect for condition (control, treatment), F(1, 91) = .84, p = .362, ηp² = .009, was detected for participants’ stated intent to use each focal tool.

Figure 3. Line graph of intent to use focal tools by condition.

There was a significant main effect for tool, F(1.86, 169.26) = 9.95, p < .001, ηp² = .10, showing participants’ stated intent to use varied across the three tools, with Webquest being lower (M = 3.42, SD = .90) than both Interactive PowerPoint (M = 3.89, SD = .67) and Digital Storytelling (M = 3.73, SD = .82).
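For reference, the Greenhouse-Geisser epsilon used for this correction can be computed directly from the covariance matrix of the repeated measures. The sketch below uses randomly generated ratings as a hypothetical stand-in for the participants-by-tools intent-to-use matrix; it illustrates the formula rather than the study's data.

```python
# Sketch of the Greenhouse-Geisser epsilon computation; `scores` is a
# hypothetical participants x tools (IPPT, WQ, DS) matrix, not the study data.
import numpy as np

rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(93, 3)).astype(float)  # 93 participants, 3 tools

k = scores.shape[1]                   # number of repeated measures
S = np.cov(scores, rowvar=False)      # covariance of the k measures
J = np.eye(k) - np.ones((k, k)) / k   # double-centering matrix
C = J @ S @ J

# epsilon = tr(C)^2 / ((k - 1) * tr(C @ C)); it ranges from 1/(k-1) to 1 and
# multiplies both ANOVA degrees of freedom when sphericity is violated
eps = np.trace(C) ** 2 / ((k - 1) * np.trace(C @ C))
print(f"Greenhouse-Geisser epsilon = {eps:.2f}")
```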
Descriptions of Future Teaching with Technology

Research Question 3 focused on how experiences learning through technology affected participants’ descriptions of technology integration in their future teaching. For each focal technology (Interactive PowerPoint, Webquest, Digital Storytelling), students wrote a brief reflection describing how they might use the instructional tool in their own future classroom teaching. These descriptions were coded for (a) level of integration (Replacement, Amplification, and Transformation) and (b) instructional features attended to in participants’ descriptions.

Levels of Integration

The first layer of descriptions analyzed was the overall level of integration described in each of the three different reflective paragraphs submitted in the digital tool focal units. The level of integration codes identified which integration level (Replacement, Amplification, and Transformation) was the highest one described in the written submissions. Table 9 below shows the proportion of integration levels within each focal tool, listed separately for control and treatment groups.

Table 9. Proportion of Integration Levels Described for Each Focal Tool

               Control                                        Treatment
  Focal Tool   Replacement  Amplification  Transformation     Replacement  Amplification  Transformation
  IPPT         0.41         0.57           0.02                0.41         0.46           0.04
  WQ           0.73         0.27           0.00                0.55         0.45           0.00
  DS           0.44         0.48           0.08                0.42         0.56           0.02

Figure 4 below shows the proportion of each integration level to the total number of reflections for that group (control, treatment), organized by focal tool. The bars are paired for each integration level, with the control group on the left and the treatment group on the right. The majority of all integration levels attended to in the reflections by participants are either Replacement or Amplification.

Figure 4. Integration level proportions for control and treatment groups.

A chi-square test of independence was performed on the integration levels for each focal tool to determine whether significant differences existed between control and treatment groups in the distribution of integration levels of the activities described by participants. There was a significant difference between control and treatment groups for the Webquest reflection, χ²(1, N = 88) = 3.14, p = .038. As the center panel in Figure 4 shows, treatment participants were more likely than control participants to describe an amplification technology use and less likely to describe a replacement use. Neither group described transformative activities using a Webquest. There were no significant differences between the two conditions for the Interactive PowerPoint, χ²(1, N = 107) = .72, p = .35, or the Digital Storytelling, χ²(1, N = 97) = 1.69, p = .22. For these two tools, the groups showed similar distribution patterns across the different integration levels, with slightly more amplification levels than replacement levels and only a small proportion of transformation levels described in their reflections.
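A minimal sketch of one of these tests is shown below, assuming scipy; the contingency counts are illustrative placeholders rather than the study's observed frequencies.

```python
# Sketch of a chi-square test of independence on described integration levels.
# Rows: condition (control, treatment); columns: Replacement, Amplification.
# Transformation is omitted here because neither group described it for Webquest.
import numpy as np
from scipy.stats import chi2_contingency

observed = np.array([[32, 12],    # hypothetical control counts
                     [24, 20]])   # hypothetical treatment counts

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2({dof}, N = {observed.sum()}) = {chi2:.2f}, p = {p:.3f}")
```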
Instructional Features

The second layer of descriptions analyzed consisted of categorizing each sentence (or sentence equivalent) within each participant’s focal tool reflections. Four categories of instructional features (student learning, instructional methods, curriculum/content, and structural factors) were derived from the literature describing technology effects on teacher practice (Ertmer, 1999; Hughes et al., 2006). One additional code was included for not-related (NR). This layer was investigated for differences between groups in the number of data segments within each of the four instructional feature categories.

Number of instructional features described by individual category. The number of data segments within each instructional feature category participants attended to was used as a measure indicating technology integration understanding in their reflection for each of the three focal tools. In this study’s context, a larger number of data segments in a given category would be considered an indicator of technology integration understanding. Both groups averaged more descriptors in the student learning and instructional methods categories than in the curriculum/content and structural factors categories, as shown below in Table 10.

Table 10. Mean Number of Descriptors for Each Instructional Feature Category

  Tool                     Condition   Student Learning   Instructional Method   Curriculum/Content   Structural Factors
  Interactive PowerPoint   Control     3.18               3.48                   0.59                 1.80
                           Treatment   3.04               3.76                   0.88                 0.69
  Webquest                 Control     2.91               3.55                   0.95                 1.34
                           Treatment   3.57               3.66                   1.18                 1.32
  Digital Storytelling     Control     3.31               2.52                   0.98                 1.40
                           Treatment   3.00               3.00                   0.91                 1.02

To understand differences that might occur between the control and treatment groups in the number of descriptors for each of the four instructional feature categories, chi-square tests for independence were performed, shown below in Table 11. A significant difference was found between control (N = 56, M = 1.18, SD = 1.65) and treatment (N = 51, M = .69, SD = 1.01) for Interactive PowerPoint in the number of structural factors participants described, t(105) = 1.84, p = .069, d = 0.45. No other significant differences were found between the two groups for the average number of descriptors used within each category for each focal tool.

Table 11. Chi-square Tests for Independence between Condition and Number of Data Segments within Instructional Feature Categories

  Focal Tool                        Instructional Feature Category   χ²      df   Sig. (2-tailed)   Cramer's V
  Interactive PowerPoint, N = 107   Student learning                 18.40   11   .073              .42
                                    Instructional Method             14.64   11   .20               .37
                                    Curriculum/Content                6.41    5   .27               .25
                                    Structural Factors               17.75    6   .007              .41
  Webquest, N = 88                  Student learning                 13.32   10   .207              .39
                                    Instructional Method             12.58   11   .322              .38
                                    Curriculum/Content                7.97    5   .158              .30
                                    Structural Factors               10.02    6   .124              .34
  Digital Storytelling, N = 98      Student learning                 16.41   10   .088              .41
                                    Instructional Method             14.02    8   .081              .38
                                    Curriculum/Content                3.14    5   .679              .18
                                    Structural Factors                3.33    5   .650              .18

Although there was a significant difference for Interactive PowerPoint structural factors, for the remaining categories and tools it appears the interventions to increase the personal experiences participants had learning through the focal tools did not have an impact on the number of descriptors used within each category as compared to the participants who did not experience the intervention.

Technology Integrated Lesson Plan Common Course Assessment

The Technology Integrated Lesson Plan (TILP), submitted as the final assessment in the course, was used as a second data source to investigate differences between the two conditions in their descriptions of technology integration in the context of future teaching that might have occurred as a result of the personal learning experiences preservice teachers had using the focal digital tools. The TILPs are included as part of the institutional CAEP accreditation process and are scored by each instructor using an institutionally provided rubric (Appendix E) aligned with the Interstate Teacher Assessment and Support Consortium (InTASC) Standards 3, 7, and 8 as well as ISTE-T Standard 2.

Quality of Technology Integrated Final Lesson Plan (TILP). The TILP assessment rubric has six criterion rows, each worth up to three points, for a total possible score of 18. For all participants, TILP scores ranged from 12 to 18, M = 16.57, SD = 1.56. To compare possible differences in the overall TILP scores for the treatment and control groups, an independent-samples t-test was conducted. There was no significant difference in the scores of the control (M = 16.68, SD = 1.42) and treatment (M = 16.45, SD = 1.69) conditions, t(111) = .81, p = .42, d = 0.15.
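A minimal sketch of this comparison, assuming scipy and hypothetical score arrays, is shown below; Cohen's d is computed from the pooled standard deviation.

```python
# Sketch of the independent-samples t-test on overall TILP scores (0-18 scale).
# The score arrays are hypothetical placeholders for the two groups' data.
import numpy as np
from scipy import stats

control_tilp = np.array([17, 18, 16, 15, 18, 17, 16])    # hypothetical scores
treatment_tilp = np.array([16, 18, 15, 17, 16, 14, 17])  # hypothetical scores

t, p = stats.ttest_ind(control_tilp, treatment_tilp)

# Cohen's d: mean difference divided by the pooled standard deviation
n1, n2 = len(control_tilp), len(treatment_tilp)
pooled_var = ((n1 - 1) * control_tilp.var(ddof=1) +
              (n2 - 1) * treatment_tilp.var(ddof=1)) / (n1 + n2 - 2)
d = (control_tilp.mean() - treatment_tilp.mean()) / np.sqrt(pooled_var)
print(f"t({n1 + n2 - 2}) = {t:.2f}, p = {p:.2f}, d = {d:.2f}")
```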
To see whether there were other differences between the two groups for the five individual criterion rows assessed related to this study, chi-square tests for independence were also performed, as shown in Table 12 below.

Table 12. Chi-square Tests for Independence between Rubric Criteria and Condition, N = 114

  Rubric Criteria                    χ²     df   Sig. (2-tailed)   Cramer's V
  All rubric criteria                6.45   6    .38               .13
  Objective and Standard Alignment   1.30   2    .52               .11
  Teaching Strategy Integration       .37   1    .54               .06
  Learning Strategy Integration       .21   1    .65               .04
  Instructional Media Integration    1.26   2    .53               .11
  Meeting Diverse Learner Needs       .59   2    .74               .07

In this comparison of the likelihood that either group would have a higher reported quality on the TILP as a whole, as well as on the five individual criterion rows assessed related to this study, no significant differences were found, as shown above in Table 12.

Results Summary

Research questions for this quasi-experimental study focused on whether learning activities increasing prior personal learning experiences using digital tools affected preservice teachers’ self-efficacy for using these tools in instruction, their expressed intent to use these tools in their teaching, and the quality of educational uses of technology they described. Across all course sections, participants’ self-efficacy for educational technology use increased from the beginning to the end of the course as measured by a pre- and post-Computer Technology Integration Survey (CTIS), but no significant difference was found between treatment and control groups or between the two instructors, nor between the two groups in participants’ stated intent to use the tools. There was a significant difference between treatment and control groups for integration levels described in Webquest reflections, with a larger proportion of amplification uses described by the treatment group. There was also a significant difference between groups in the Interactive PowerPoint reflections for the number of structural factors described. No other significant differences were found between groups related to their descriptions of technology integration or their achievement on the technology integrated lesson plan course assessment.

CHAPTER 5

DISCUSSION

This study investigated whether preservice teachers could be positively affected by participating in course experiences that entailed learning content themselves through digital tools before working to create teaching activities and materials with those tools.

Changes to Self-Efficacy

There was a significant and positive change in self-efficacy for all participants across the semester. This positive finding is consistent with earlier research on technology integration in teacher preparation programs (Gillingham & Topper, 1999; Kennedy & Archambault, 2012; Williams, Foulger, & Wetzel, 2009) in supporting the claim that time spent in stand-alone educational technology courses significantly increases self-efficacy for educational technology use in future teaching. In contrast to expectations, there was no significant difference in self-efficacy change between the treatment and control groups. As the CTIS survey encompassed all course units, not only the focal tool units, it may be that the amount of course time spent in the interventions was not enough to create a difference between the two groups on this measure. Both groups participated in numerous learning experiences during 12 hours of face-to-face time, including the three hours used for the interventions.
Also, participants were assessed on over ten different teaching artifacts in the fourteen weeks of the class, including the three focal unit artifacts. In Tschannen-Moran and Woolfolk-Hoy’s (2007) study of novice and career teachers’ self-efficacy, mastery experiences were the strongest predictor of self-efficacy. Most students received A’s, indicating that their course learning experiences were positive and likely to impact overall self-efficacy, and all passed with a grade higher than a B-.

Also, in Wang et al.’s (2004) study, where the CTIS was originally created, it was implemented immediately before and after an intervention to increase self-efficacy. In this study, the three interventions were spaced out across the course, with the post-CTIS collected three weeks after participants experienced the last intervention. In a future study, reorganizing the timing of the CTIS administration to be more tightly aligned with intervention experiences, along with increasing the amount of intervention time overall, is recommended. This would help determine whether increases to preservice teachers’ self-efficacy, intent to use, and descriptions of technology use are a result of the increased experiences learning through technology or a result of time spent learning to teach with technology in general.

Inclement weather negatively impacted attendance and response rates on the post-CTIS administered in the final class session for the treatment groups of both instructors. A major snowstorm reduced face-to-face attendance by a third during the treatment groups’ final session. Prior to that session, attendance had been consistent between the control and treatment groups. It is unknown whether absent participants made their decision based solely on the weather or whether they represented a more highly confident group that felt comfortable missing class.

Intentions to Use Focal Tools

There were no significant differences between the control and treatment groups in participants’ self-reported intent to use the focal tools in future teaching. Many focal tool reflections focused on teacher-centered activities, often presentation-related and experienced prior to this course. Participant reflections for Interactive PowerPoints, in particular, frequently mentioned how they had experienced this tool in other settings or how their cooperating teacher used it. This familiarity is one of the reasons Interactive PowerPoints were included as a focal tool, using the TCK→TPACK approach (Koehler, Mishra, & Cain, 2013) to deepen the level of technology integration. However, this approach may have contributed to the lack of significant difference between treatment and control groups. As previous research describes, presentation technologies such as PowerPoint and short videos/digital stories are among the most common types of technology uses by inservice teachers and higher education faculty (Wetzel, Foulger, & Williams, 2008). As well, other research indicates that these types of uses are not the most effective for transforming student learning experiences through technology (Ertmer & Ottenbreit-Leftwich, 2013; Ertmer et al., 2012). The length of time treatment participants spent in the interventions may simply not have been enough to change preconceived beliefs about the focal tools. Further explorations using technologies selected for use in learner-centered approaches, and that are not typically used in teacher-centered approaches, would help control for experiences participants had outside of the course.
Of the three focal tools used, the Webquest is the only one originally conceived to support learner-centered, collaborative, inquiry-based learning (Yang, Tzuo, & Komara, 2011). Previous experiences using Webquests for these purposes were rarely mentioned in participants’ reflections. Although there were no significant differences between the groups in their level of intent to use the focal tools, participants in both treatment and control groups reported significantly lower intent to use Webquest compared to the other focal technologies (Interactive PowerPoint and Digital Storytelling). Based on a cursory analysis of the Webquest reflections, the following may have contributed to the low intent to use. There were 86 mentions of time, an external constraint consistently reported as a barrier in studies on technology use of inservice teachers (Brush et al., 2008; Ertmer, 1999, 2005; Ertmer & Ottenbreit-Leftwich, 2010; Fantilli & McDougall, 2009; Hew & Brush, 2007; Wang et al., 2004). Participants described time both in relation to the teacher not having enough time to create Webquests and to students not having enough computer time available.

    It took me about 4 hours to create the webquest for this class and I am not sure I would have the time to do that when I’m a real teacher. In my current placement, we only have access to enough technology once a week for an hour. (Sandra, control group)

A limited understanding of the pedagogical affordances of Webquests was often expressed in relation to using them for research projects. “We use it (Webquest) to provide students with websites to gather information, and then give them the means to organize their new findings,” described Don in the control group. At that reduced level of use, the time investment does seem high. Few indicated understanding the advantages, such as Alyssa in the treatment group, who described, “Webquests are a great way for students to work as a team and allow the teacher free time to circulate around the room and help students.” At least two students mentioned the Webquest being an old strategy as a reason they did not envision using it. Given the alignment of the original Webquest design concept with many of the transformations called for using technology (Council for the Accreditation of Educator Preparation, 2016; Dede, 2011; Thomas, 2016), additional research using these Webquest reflections may be warranted to explore connections between lower intent to use Webquests and factors such as computer availability and pedagogical knowledge.

Overall, the hypothesis addressing RQ2, that an intervention designed to increase participants’ experiences learning through a specific technology would have a more positive effect on participants’ intention to use a similar technological tool in future teaching than a traditionally structured experience learning about a technology, was not supported in this study.

Describing Future Teaching and Technology

Two different written data sources were used to investigate RQ3. First, the reflection paragraphs submitted by each participant at the end of each tool’s focal unit were analyzed on two levels: (a) the overall level of integration used in each paragraph, and (b) the instructional features used within each paragraph. Second, the Technology Integrated Lesson Plan (TILP) completed by each participant during the final course unit was scored for quality with a common course rubric.
Levels of Integration

Given research describing current educational technology uses as what could be described as Replacement, or a basic level (Buckenmeyer, 2010; Cuban, 2001, 2008), and the tendency of teachers to teach as they have been taught (Martin et al., 2010; Strudler & Hearrington, 2008), it was hypothesized that the intervention to increase personal experiences learning through technology would positively change participants’ described level of integration. In particular, by experiencing the interventions, the treatment group would be more likely than the control group to describe Amplification and Transformation uses of technology in future teaching contexts. Figure 4 in Chapter 4 presents distribution data for this analysis.

Webquest and integration levels. For the Webquest reflection paragraphs, there was a significant difference between the integration levels of activities described by the treatment and control groups, with treatment participants describing more amplification uses and fewer replacement uses than control participants. Prior research on preservice teachers’ use of technology beyond productivity and presentation found a limited comfort and experience level (Chen, 2010; Gray et al., 2010; Kumar & Vigil, 2011), informing this study’s intervention design. This difference supports the hypothesis that increasing experiences learning through a Webquest would positively affect participants’ ability to describe teaching through a Webquest.

Both control and treatment groups frequently described using a Webquest as an amplification for research projects in order to create structured freedom. “Instead of opening students up to the internet and saying, ‘Hey, good luck,’ Webquests help them hone in on what I need them to learn,” Abby wrote. Some Webquest reflections referencing prior experiences with Webquests outside of this course revealed misconceptions of use. “Although I have seen them used in Spanish classrooms, I remember from my own experiences that webquests do not really engage students the way I would want,” Mark reflected. Both groups also described using Webquests as ways to increase group work, with 70 overall uses of the word group (and variations of it). The treatment group also included details tightly related to the way grouping was used in the intervention.

    When I first heard the word webquest, I instantly thought of a boring task that requires me to fill in the blanks. However, after learning more about them, I am glad that my idea of them is wrong…I like the idea of assigning students different roles and making sure these roles are equal. (Julia, Webquest reflection in treatment group)

Research projects supported by Webquests were common, with 76 uses of the word research. The treatment group also had responses that mentioned specific teacher behaviors during the time students were working on the Webquest, such as the intervention modeled. “...students break off into groups to expand on their learning and I function as a facilitator who clarifies, guides and mentors the students as they complete the Webquest,” reflected one treatment group participant (italics added).

Further analysis of these reflections to explore preservice teachers’ descriptions of how technology can support collaboration and group work may be helpful in preparing teachers and students to meet ISTE-Teacher Standard 4, Collaborator, and ISTE-Student Standard 7, Global Collaborator (ISTE, 2016).
Other technology integration research also mentions a need for the ability to facilitate group work (Avargil et al., 2012; Ertmer & Ottenbreit-Leftwich, 2010; Fantilli & McDougall, 2009; Williams et al., 2009). The lack of any transformative Webquest use descriptions may also be related to limited pedagogical skills when using web tools.

Interactive PowerPoint and Digital Storytelling integration levels. For the Interactive PowerPoint and Digital Storytelling reflections, there were no significant differences between the treatment and control groups in the levels of integration described. Many participants described using PowerPoint in the past and wanting to use the interactive features to amplify their presentation ability to meet student learning needs. "The home slide could state unit learning targets so each time I come back to the home slide, students are reminded of them. I could easily go back to a previous lesson to review or look ahead …" wrote Josh. Many other Interactive PowerPoint reflection paragraphs mentioned prior experiences using them for Jeopardy-style test prep games, with 25% of participants listing this as a future use.

The lack of significant results for the Interactive PowerPoint and Digital Storytelling reflections related to integration levels suggests that the one hour the treatment group spent in the intervention demonstrating learning through these tools was not sufficient to make a significant difference for teaching through these tools. Instead, it appears that participants relied on their more extensive experiences with these tools in other contexts during their own K-12 learning, a consistent finding in teacher education studies (Brown & Warschauer, 2006; Brown & Duguid, 1991; Burden & Hunt, 2010; Darling-Hammond, 1999; Darling-Hammond & Bransford, 2005; Martin et al., 2010; Pearcy, 2013). The amount of intervention time in the study will be discussed further in the limitations.

Instructional features attended to in reflections. The dimensions Hughes et al. (2006) created for observational use with inservice teachers in the RAT integration model (student learning, instructional methods, and curriculum/content) were modified for use with the written reflection paragraphs in the second level of analysis (Appendix D), along with a fourth category, structural factors (Ertmer, 1999). The diversity of instructional feature categories attended to, as well as the total number of instructional features across all categories, was treated as an indicator of participants' ability to describe aspects of teaching practice related to technology integration (Tables 10, 11).

There was no significant difference between the two conditions in the diversity of instructional feature categories addressed (out of four possible). Likewise, no significant difference was found between the treatment and control groups in the number of instructional features within each individual category. The structure of the reflection prompt may have substantially contributed to the lack of significant differences between groups. Participants were encouraged to write one or two paragraphs in total for each of the focal tools, and examples from previous semesters were shared. Although the examples were related to different focal tools and activities, those selected contained above-average responses, and most addressed at least three of the four instructional feature categories. Wetzel et al.
(2008) used a survey as well as focus groups in their study of redesigned teacher preparation course activities similar to the type of activities in this course. In future studies, using a similar strategy with questions targeting specific aspects of using the focal tools in future teaching might reveal measurable differences between the two groups resulting from the intervention to increase experiences learning through technology.

Describing multiple instructional features of focal tools. In comparing the number of instructional features described within each of the four categories used for the focal tool reflections, a significant difference between the treatment and control groups was found for the Structural Factors category in the Interactive PowerPoint reflections. A larger number of structural factors were described by the control group (N = 56, M = 1.18, SD = 1.65) compared to the treatment group (N = 51, M = 0.69, SD = 1.01). This suggests the intervention designed to increase participants' experiences learning through an Interactive PowerPoint, prior to creating materials for teaching through an Interactive PowerPoint, had a negative effect on the number of structural factors participants described in relation to technology integration. Within the context of this study, the description of structural factors was considered to be more distant from the core practices of teaching described in the other three categories.

In other research literature, structural factors are most often described as barriers to technology adoption and use, such as not having sufficient time or computers for students to access learning (Ertmer & Ottenbreit-Leftwich, 2010; Hew & Brush, 2007; Park & Ertmer, 2008). "I personally would be more likely to use technology that is easier for my students and me, and less time consuming," stated one control group participant. "If they have limited access to technology or money to afford technology tools, their opportunities to use interactive PowerPoints would be very slim," said another. A few students in both groups described structural factors related to the logistics of teaching, such as planning for substitute teachers: "…teachers never really know if the substitute will be able to teach the material, so if students had a self-guided assignment (Interactive PowerPoint) they could still be learning content…when you couldn't physically be there."

Although the option of adding codes related to a positive or negative participant response was considered, the in situ nature of this study, using previously designed course assignments that encouraged participants to share affordances and constraints, did not lend itself well to this distinction. Analyzing this level of detail within the reflections was beyond the scope of this study, although in the future it would be of interest to look more deeply within the different instructional feature categories for trends and patterns and for effects from increased personal learning experiences through technology. For example, is there a difference in the types of structural factors mentioned by participants that relates to other indicators of future successful technology integration? Structural factors described in a learner-centered reflection would be important to address in future teacher preparation (Ertmer et al., 2012; Polly et al., 2010). Similarly, structural factors related to a specific physical setting may not be as important in teacher preparation.
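The summary statistics reported above for the Interactive PowerPoint structural-factors comparison are sufficient to reproduce a standard two-sample test. The minimal sketch below is illustrative only: whether the study's own analysis used this particular test (here, Welch's t-test) is an assumption, so its output should not be read as the reported result.

from scipy import stats

# Summary statistics for structural factors described in the Interactive
# PowerPoint reflections, as reported above.
control = dict(mean1=1.18, std1=1.65, nobs1=56)
treatment = dict(mean2=0.69, std2=1.01, nobs2=51)

# Welch's two-sample t-test computed from summary statistics alone;
# equal_var=False relaxes the equal-variance assumption, which seems
# prudent given the groups' rather different SDs.
t, p = stats.ttest_ind_from_stats(**control, **treatment, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")

Computing from summary statistics is useful for secondary analyses of published results when the raw feature counts are unavailable.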
It was hoped that the intervention to increase the experiences participants had learning through focal technologies, before using these same technologies in the context of teaching practice, would increase the number of instructional features attended to in all categories of the focal tool reflections. However, no other significant difference related to participation in the control or treatment groups was found for the number of instructional features in the Interactive PowerPoint, Webquest, and Digital Storytelling reflections (Tables 10, 11).

Quality of descriptions used in the TILP. The Technology Integrated Lesson Plan (TILP) is the common course assessment through which students demonstrate mastery of International Society for Technology in Education (ISTE) Standard 2 for teachers, scored on a six-level rubric. Prior to this study, faculty teaching this course had observed that students were able to succeed on the TILP based on the rubric criteria regardless of the self-efficacy, intent to use, and descriptions of quality technology integration they expressed in other areas of the course. No significant difference was found between the overall scores of the control and treatment conditions, as shown in Table 12. Likewise, no significant difference was found between the conditions' scores on individual criterion rows within the rubric. This was the expected outcome, given students' historical success on the institutionally specified assessment rubric. The lack of significant differences between the treatment and control groups in TILP rubric scores indicates that both groups were equally likely to score highly on the different items measured by the assessment rubric.

Limitations

A limitation of this research is that assignment of students to course sections (and thus to treatment and control conditions) was not random. Students choose a course section based on being either an elementary education major (Friday afternoons from 3:00pm to 6:00pm) or a secondary major (Friday evenings from 6:00pm to 9:00pm). The sections are filled sequentially by the registrar's office unless a specific request is made by the student. Hence, the first set of Fridays, with four possible sections, fills first with the students who register earlier, and the second set of Friday sections opens only as the first ones are full. Given the paired sample design, this meant that early-registering students were enrolled in the three control sections and students registering later were in the treatment sections. Research shows that students who register later for classes tend to do more poorly in their education (Moore & Shulock, 2009). Therefore, it is possible more of the high-performing students were in control sections than in treatment sections.

An unforeseen weather factor affected treatment group participants' completion of the post-CTIS, compared to the control groups, as well as their overall time spent in face-to-face sessions. A major snowstorm greatly reduced attendance for treatment groups on the final Friday, when the post-CTIS was administered. Although absentees were given the option to complete the CTIS survey online, participants did not choose to do so, and the sample was reduced from 110 participants giving consent to 76 participants who completed both the pre- and post-CTIS surveys.
It is possible that those students in the treatment sections who felt strongly positive about their ability to perform educational technology related teaching tasks chose to stay home rather than chance driving long distances in the foul weather, which would negatively impact the overall post-CTIS mean for the treatment group.

Another major factor that may have impacted results for all participants is the in situ nature of the study, which gave rise to other influences on study participants. The limited generalizability of interventions studied at a single time point was criticized in earlier research (Kay, 2006a, 2007). This study sought to address that gap, and in doing so, effects of the time spent in the interventions may have been mitigated by the time spent in other semester activities. Figure 5 shows the percentages of time spent in different college-related activities, using the Higher Learning Commission's time estimates for coursework per credit hour. The three total hours of the interventions are not large enough to constitute even 1% of the total time spent by participants during the semester on the other required activities.

[Figure 5. Percentage of time spent by participants in different semester activities: teacher assisting 35%, other coursework 35%, course online time 14%, other face-to-face courses 14%, course face-to-face time 2%, intervention 0% (less than 1%).]
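As a rough arithmetic check on the less-than-1% claim, assume for illustration a 15-credit semester load at roughly 45 hours of work per credit (both figures are assumptions of this sketch, not values stated in the study):

\[
\frac{3\ \text{h (intervention)}}{15\ \text{credits} \times 45\ \text{h/credit}} = \frac{3}{675} \approx 0.44\% < 1\%
\]

Any comparable full-time load yields the same conclusion: the intervention occupied a vanishingly small share of participants' semester time.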
Although this dilution may appear on the surface to impact the treatment and control groups equally, the 35% of time spent in teacher assisting may reflect earlier differences related to registration. Teacher assisting placements are filled as students register, and placements with higher-rated cooperating teachers, or in schools and settings having better relationships with the institution, fill first. Therefore, the treatment group would be more likely to be in settings with less access to functional technology and/or with cooperating teachers who were less supportive of, or knowledgeable about, using technology in their own practice, consistent with descriptions of field experiences in Brenner and Brill (2016). For example, many students shared in class that they had access to interactive whiteboards in their classrooms but had not seen the teacher use them with students. These anecdotal statements are echoed in other large-scale technology use studies, such as Teachers' Use of Educational Technology in U.S. Public Schools: 2009, which found that of those teachers who had access to interactive whiteboards, only 57% reported using them sometimes or often for instruction, and that "twenty-nine percent of teachers reported using computers in general for instruction rarely or never, even though they had access" (Gray et al., 2010). Other access concerns students expressed were in regard to the types of technology available: "I am in an urban school and while there is a computer lab and a set of laptops, they are very hard to access…the laptops are old…half do not work and the others are so slow," reflected Arthur. In future research on participants' field experiences, controlling for factors related to the cooperating teachers' skill with technology integration, as well as access to technology and technology support for teachers and students, is recommended.

One additional limitation of the in situ context of this study concerns the timing of the intervention and measurement. For each focal tool, the intervention experience and the intent-to-use survey submission took place within two sequential units spanning four weeks. The intervention took place in a face-to-face setting, while the remainder of that unit, as well as the entirety of the following unit, took place online. Additionally, students were attending their teacher assisting daily as well as other teaching methods courses that were fully face to face. Therefore, the extent to which other experiences occurring between the intervention and the measurement influenced students' intent to use technologies, as well as their descriptions of technology integration, is unknown. In a future study, adjusting the timing to tightly coordinate the intervention and the measures is recommended.

Finally, this research was conducted in the context of a particular institution and metropolitan area, and any attempts to generalize beyond that setting must be made cautiously. As much as the course itself and the course interventions were designed to be technology-agnostic and representative of the larger picture of educational technology coursework, there are still likely to be differences. This course was held in a room equipped with an interactive whiteboard, and students brought their own devices, conditions which may or may not be similar to other teacher preparation programs. The focal tools were selected for their ability, at the time of this study, to facilitate constructivist, technology-integrated learning, but as with any technologies, affordances and constraints change quickly, making them more or less desirable for educational use.

Research Implications

One hope of this study was to extend research indicating that pedagogical changes within technology learning experiences can positively impact teachers and, in turn, positively impact the future quality of teaching through technology in classrooms. Although the majority of the findings in this study showed no significant differences between control and treatment groups for teacher self-efficacy, intent to use, and descriptions of quality technology integration, research implications emerged in two areas: first, implications related to effects of the particular pedagogical strategies used in teacher preparation; second, implications related to organizational contexts of a preservice teacher educational technology course.

Effects on Preservice Teachers of Pedagogical Changes

My study showed that experiences learning in a traditional lecture/discussion format and experiences learning through a focal technology both successfully increased computer self-efficacy. Other research in general education also shows multiple types of teaching and learning strategies to be effective for learning (Brenner & Brill, 2016; Cuban, 2008; Darling-Hammond & Bransford, 2005; Davies, 2011; Polly et al., 2010; Tanguma et al., 2002; Williams et al., 2009). In future studies, testing multiple pedagogical strategies with similar participants, content, and data could establish whether measurable differences exist between pedagogical practices experienced during an educational technology course, and thereby inform future practice. Additionally, collecting more of the participants' demographic data would help ascertain whether different pedagogical strategies are more or less effective for specific participant characteristics and contexts.
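The within-course growth in computer self-efficacy noted above rests on pre/post comparisons of CTIS composite scores. A minimal sketch of how such a comparison is commonly run follows, assuming a paired-samples t-test (the study's exact procedure is not restated here) and using invented scores rather than the study's data.

import numpy as np
from scipy import stats

# Hypothetical pre/post CTIS composite scores for the same five participants;
# the actual analysis used 76 matched pre/post surveys.
pre = np.array([3.1, 3.4, 2.9, 3.8, 3.5])
post = np.array([3.9, 4.1, 3.6, 4.3, 4.0])

# Paired-samples t-test for within-group growth in self-efficacy; pairing
# respects the fact that each participant serves as their own control.
result = stats.ttest_rel(post, pre)
print(f"mean gain = {np.mean(post - pre):.2f}, "
      f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")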
The specific pedagogical strategy used for this study's interventions incorporated a constructivist approach that students may or may not have already experienced, in addition to the constructs I studied. When a constructivist environment is implemented for the first time, participants tend to report dissatisfaction with the learning experience and a desire for more scaffolding (Adamy & Boulmetis, 2005; Avargil et al., 2012; Williams et al., 2009). One implication for future studies using similar pedagogical strategies would be to restructure the timing and implementation of the interventions to ensure participants are first comfortable learning in a comparable format. One possibility is shifting the interventions to later in the course, after the pedagogical practice has been used and discussed in other learning experiences outside of those used in the study, rather than starting with the first unit. Another possibility for mitigating effects of an unfamiliar pedagogical strategy is to pre-assess participants for prior experiences with similar pedagogical strategies and to address this through a different measure or different groupings.

In choosing focal tools and artifacts for a future study, eliminating digital instructional tools that are most likely familiar from prior participant experiences, while keeping the same intervention, content, and data, might also produce results more indicative of the impact of increasing preservice teachers' experiences learning through similar tools. Also, comparing effects on teacher self-efficacy and intent to use technologies in settings using a TCK→TPACK approach, a PCK→TPACK approach, and a more holistic LT/D approach for TPACK development within an educational technology course could add to what we know about best practices for increasing technology integration.

Organizational Influences on Preservice Teacher Technology Preparation

As in any study of individual teachers, school and education system organizational contexts may also impact findings. In this study of preservice teachers' self-efficacy, intent to use, and ability to describe transformative uses of technology over the course of a college semester, the lack of significant findings may also be a result of contextual features present in the teacher preparation program, teacher assisting classrooms, and surrounding school districts. Returning to the original problem of trying to increase transformative technology use for teaching and learning as called for by educational leaders and policy makers, changing pedagogical practices within an educational technology course may or may not be the best use of limited time and resources to enact sufficient changes to our educational system.

For example, on the post-CTIS surveys taken during the last unit of the class, participants reported self-efficacy with technology integration averaging above four (M = 4.2862) on a five-point Likert scale where five is "strongly agree." Yet group means for intent to use the focal tools were all less than four (agree) on the same five-point scale. In other words, in the context of this course and its institutionally supported technologies, participants reported higher self-efficacy than stated intent to use these same tools in the unknown school contexts of their futures. This finding is similar to that of Lei's (2009) study seeking to understand what technology preparation digital natives need.
Her survey analysis showed strong positive beliefs among participants about having the skills to use technologies in learning, but a more reserved attitude toward using technologies in the classroom. Although the 51 participants in Lei's study were freshmen rather than the upperclassmen studied here, both groups were in teacher education programs and had not yet completed student teaching. In other related studies on equipping preservice teachers for technology integration, a similar difference between feeling skilled enough to use technologies and being ready to integrate technologies into meaningful learning experiences, such as a Webquest, is associated with the need for additional technology training in field experiences, above and beyond educational technology coursework (Gray et al., 2010; Polly et al., 2010; Thomas, 2016; Tondeur et al., 2012).

These findings raise interesting questions about whether any changes within a stand-alone educational technology course as currently structured will be enough to sufficiently prepare teachers to use technologies transformatively in the unknown contexts of their future classrooms. Not only have students historically perceived a stand-alone educational technology course as less valuable than their other teacher preparation courses (Gillingham & Topper, 1999), but this particular technology course is also the only required hybrid course, making it perceptually even further removed from the core practice of teaching. Additionally, some methods faculty who meet face to face with students do not support having a stand-alone technology course or do not model effective use of technology themselves (Hare, Howard, & Pope, 2002; Pellegrino, Goldman, Bertenthal, & Lawless, 2007; Tanguma et al., 2002). As multiple studies show this gap between our ability to teach students the skills they need for technology integration and their comfort with and actual use of it as early-career teachers (Brenner & Brill, 2016; Gray et al., 2010; Hsu, 2013), is it time to rethink more holistically the context of the educational technology course in relation to field experiences and content area courses? More research on educational technology courses in different contexts is suggested, to establish the value of other preparation options such as pairing the course with field experiences, teaching it collaboratively with content/methods courses, or discontinuing the hybrid option (or offering the hybrid option in conjunction with other hybrid courses rather than in isolation). Another possibility, emerging from participant reflections describing the influence of their cooperating teachers, would be additional research in settings where cooperating and preservice teachers experience the same interventions to increase technology integration. Accomplishing this would necessitate developing strong networks of community area educators.

Conclusion

The context of a stand-alone educational technology course provides a study setting where interventions can be more easily constructed and measured than in a K-12 classroom, yet it remains a proxy for the desired outcome of educational system transformation using technology. Although many changes targeting technology integration have been incorporated into teacher preparation programs over the past two decades, preparing future teachers for technology integration has still not resulted in the desired transformations to our educational system (Brenner & Brill, 2016; Buckenmeyer, 2010; Ertmer, 1999; Gray et al., 2010; Pearcy, 2013; Thomas, 2016).
Although a supply of skilled teacher candidates is essential for sustaining successful schools (Darling-Hammond & Bransford, 2005; Darling-Hammond & Sykes, 1999), other organizational factors are important to any school reform implementation. Fouts' (2003) analysis of a decade of Washington State's school reform efforts describes how, under top-down, bureaucratic school reform efforts, an "illusion of change is created through a variety of activities, but…the deep culture of the classroom and school are unaffected" (p. 12). Meaningful and authentic change in schools results from reforms driven by clear and accepted beliefs and philosophies about practice that differ from the status quo. In schools where resources and restructuring change without a corresponding change in teachers' foundational beliefs and philosophies, resistance to change is encouraged (Ellis, 1994; Fouts, 2003; Goodman, 1995). This organizational behavior is described in similar fashion in other research more specific to technology integration and innovation (Ertmer et al., 2012; Frank et al., 2004; Rogers, 1962; Zhao & Frank, 2003).

If preservice teachers deemed competent according to standards from accreditation organizations such as the International Society for Technology in Education (ISTE) and the Council for the Accreditation of Educator Preparation (CAEP) have continually entered our education system without a corresponding measurable increase in technology integration in active classrooms, perhaps this is an avenue researchers need to attend to in future studies. The call to transform our educational system with technology has been increasingly evident in educational policy and national leadership (Brown & Duguid, 1991; Dede, 2011; Gray et al., 2010; Thomas, 2016). Existing research about school reform may point to areas where efforts to change inservice teachers' beliefs and philosophies may have impact, beyond our current efforts to prepare new teachers to meaningfully integrate technology. Through studying the beliefs and philosophies of effective teachers (including their use of technology), policy makers and leaders may find better ways to transform our educational system with technology than those currently described.

APPENDICES

APPENDIX A — Consent for Participation

1. EXPLANATION OF THE RESEARCH and WHAT YOU WILL DO: Thank you for taking the time to complete this consent form survey. It will take you approximately 5 minutes to complete. You are being asked to participate in a research project. Researchers are required to provide a consent form to inform you about the study, to convey that participation is voluntary, to explain risks and benefits of participation, and to empower you to make an informed decision. You should feel free to ask the researchers referenced below any questions you may have. The purpose of this project is to study the effects of different learning experiences on preservice teachers' knowledge and beliefs about teaching with technology. The research is being conducted by Tracy Russo, one of the EDT 370 instructors and a doctoral candidate at Michigan State University. The research will involve examining EDT 370 students' coursework and responses on surveys also given as part of regular coursework. No additional activities or assignments are required for those participating in the research. You do not have to explain why you do or do not choose to participate in this research study. You must be at least 18 years old to participate in this research.
If you consent to participating in this research, you are granting permission for the researchers to examine the following:
a. The Computer Technology Integration Survey, given at the beginning and end of the semester.
b. The self-assessment surveys on each of the major instructional tools used in the course.
c. The reflection paragraphs written as part of your unit deliverables.
d. The Technology Integrated Lesson Plan (TILP) you submit as the major course assessment.

2. YOUR RIGHTS TO PARTICIPATE, SAY NO, OR WITHDRAW: Participation in this research project is completely voluntary and confidential. You have the right to say no. You may change your mind at any time and withdraw. Whether you choose to participate or not will have no effect on any of your evaluations or your grade in the course. To protect your privacy, all documents will be coded with a computer-generated randomization number and any other identifiers destroyed, including any names of students, instructors, or course section number. If you choose not to participate, none of your work will be used in this study.

3. COSTS AND COMPENSATION FOR BEING IN THE STUDY: There are no costs for participating in this study, and you will not receive any compensation. The study will not take any additional time outside of your time spent on the teaching and learning activities in your EDT 370 course during the semester.

4. CONTACT INFORMATION FOR QUESTIONS AND CONCERNS: If you have concerns or questions about this study, such as scientific issues, how to do any part of it, or to report an injury, please contact the researcher Tracy Russo, russotr@gvsu.edu, 616-334-6225, Office 443C DeVos, College of Education, 401 W. Fulton St, Building C, Grand Rapids, Michigan 49504, OR Ralph Putnam, ralphp@msu.edu, 517-353-9285, 511 Erickson Hall, East Lansing, Michigan, 48824-1034.

5. DOCUMENTATION OF INFORMED CONSENT: You indicate your voluntary agreement to participate by completing and electronically submitting this short one-question survey. If you choose not to participate, you can still complete this survey without clicking on the final submit button, though your responses will not be saved and none of your information will be included in this study.

Question 1: Please select either yes or no to the following question, then sign and date your name below.
1. Are you willing to voluntarily participate in this research study on the effects of prior experiences learning with technologies on preservice teachers' beliefs, descriptions, and intent to teach through technology in the future with similar technologies? Note: Whether you choose to participate or not will have no effect on any of your evaluations, as all data will be anonymously recorded for use in the study, and no additional coursework will be required to participate.
A. Yes, I am willing to voluntarily participate in this research study
B. No, I am not willing to voluntarily participate in this research study
Signature of Participant: Date:

APPENDIX B — Course Calendar

APPENDIX C — Computer Technology Integration Survey

The purpose of this survey is for preservice teachers to share their beliefs and confidence regarding the instructional tool used for the unit assessment artifact. This survey will be administered online using Qualtrics online survey tools. All questions use the following 5-point Likert scale:
1 – Strongly agree
2 – Agree
3 – Neutral
4 – Disagree
5 – Strongly disagree
1. I feel confident that I understand computer capabilities well enough to maximize them in my classroom.
2. I feel confident that I have the skills necessary to use the computer for instruction.
3. I feel confident that I can successfully teach relevant subject content with appropriate use of technology.
4. I feel confident in my ability to evaluate software for teaching and learning.
5. I feel confident that I can use correct computer terminology when directing students' computer use.
6. I feel confident I can help students when they have difficulty with the computer.
7. I feel confident I can effectively monitor students' computer use for project development in my classroom.
8. I feel confident I can mentor students in appropriate uses of technology.
9. I feel confident about assigning and grading technology-based projects.
10. I feel confident that I can consistently use educational technology in effective ways.
11. I feel confident that I can provide individual feedback to students during technology use.
12. I feel confident I can regularly incorporate technology into my lessons, when appropriate to student learning.
13. I feel confident about selecting appropriate technology for instruction based on curriculum standards.
14. I feel confident about keeping curricular goals and technology uses in mind when selecting an ideal way to assess student learning.
15. I feel confident about using technology resources (such as spreadsheets, electronic portfolios, etc.) to collect and analyze data from student tests and products to improve instructional practices.
16. I feel confident I am comfortable using technology in my teaching.
17. I feel confident I can be responsive to students' needs during computer use.
18. I feel confident that, as time goes by, my ability to address my students' technology needs will continue to improve.
19. I feel confident that I can develop creative ways to cope with system constraints and continue to teach effectively with technology.
20. I feel confident that I can carry out technology-based projects even when opposed by skeptical colleagues.

APPENDIX D — Reflection Prompts

Each of these three prompts was given as a part of the assignment during the focal unit where the participant used the particular instructional tool to create a teaching-related artifact using course content. Responses were submitted electronically. The Interactive PowerPoint reflection prompt was answered as one of the slides for the assignment. The Webquest and Digital Storytelling reflection prompts were answered in Qualtrics as a part of the same survey asking participants' intent to use that tool.

Prompt #1, Interactive PowerPoint: Reflect on a way you could use interactive PowerPoint in your future classroom. Be as specific as possible, including examples and why you would use this tool rather than other options. Include both affordances and constraints where applicable. This should be about one or two paragraphs in length.

Prompt #1, Webquest: Reflect on a way you could use a Webquest in your future classroom. Be as specific as possible, including examples and why you would use this tool rather than other options. Include both affordances and constraints where applicable. This should be about one or two paragraphs in length. (The box will expand while you type if needed)

Prompt #1, Digital Storytelling: Reflect on a way you could use digital storytelling in your future classroom.
Be as specific as possible, including examples and why you would use this tool rather than other options. Include both affordances and constraints where applicable. This should be about one or two paragraphs in length. (The box will expand while you type if needed)

APPENDIX E — Technology Integrated Lesson Plan Template and Scoring Rubric

Name: | Grade Level: | Content Area: | Topic: | Length of Lesson: | Summary of Lesson:

Learner Analysis
General Characteristics:
List populations that may need special attention:
Specific Entry Competencies (Competency/Skill, When Acquired):
Learning Traits:

Standards and Objectives
Content Standard(s): Include the full standard (number and statement)
Technology Standard(s): Include the full standard (number and statement)
Behavioral Objectives (1–4): Objective, Standard(s) Addressed

Assessment Plan
For each objective (1–4): How will you summatively assess it? [provide any rubrics at end or link to other document]
Accommodations/Modifications for Summative Assessment:

Strategies
Teacher-focused uses: How are you/media presenting content to the student? For each row (Description, Obj., Technology Used), justify the technology: Does it Replace, Amplify, or Transform an activity w/o technology? Is there evidence of the technology/strategy's effectiveness? Has the technology been evaluated for accuracy/credibility/accessibility? [add rows as necessary] How are you checking for understanding (i.e., formative assessment)? Mention specific technologies.

How are students engaging with the content? For each row (Description, Obj., Technology Used), justify the technology using the same three questions. [add rows as necessary] How are you checking for understanding (i.e., formative assessment)? Mention specific technologies.

How are students demonstrating their knowledge of the content? For each row (Description, Obj., Technology Used), justify the technology using the same three questions. [add rows as necessary]
Accommodations/Modifications:

Timeline

Resources
Things to include: links to any resources you are using; links to other Google Docs for grading rubrics.

Grading Rubric
Each criterion is scored Proficient [3], Developing [2], Unsatisfactory [1], or Not provided [0].

Standard and objective alignment (ISTE-T: 2d) (InTASC 7). Proficient: Learner demonstrates an appropriate understanding of how objectives measure mastery of standards. Developing: Learner demonstrates an incomplete/incorrect understanding of how objectives measure mastery of standards. Unsatisfactory: Learner states standards and objectives. Not provided: Learner does not provide standards and/or objectives.

Presentation (teacher/media-centered) strategies are effectively integrated (ISTE-T: 2a) (InTASC 8). Proficient: Learner provides an appropriate rationale on how teacher/media-centered strategies impact learning. Developing: Learner provides an incomplete/incorrect rationale on how teacher/media-centered strategies impact learning. Unsatisfactory: Learner states teacher/media-centered strategies. Not provided: Learner does not provide any teacher/media-centered strategies.

Generative (learner-centered) strategies are effectively integrated
(ISTE-T: 2b) (InTASC 8). Proficient: Learner provides an appropriate rationale on how learner-centered strategies impact learning and are more effective than alternatives. Developing: Learner provides an incomplete/incorrect rationale on how learner-centered strategies impact learning. Unsatisfactory: Learner states learner-centered strategies. Not provided: Learner does not provide any learner-centered strategies.

Instructional media are effectively integrated (ISTE-T: 2a) (InTASC 3). Proficient: Learner provides an appropriate rationale for instructional media integrated in the lesson plan and justification for use over alternative media. Developing: Learner provides an incomplete/incorrect rationale for instructional media integrated in the lesson plan. Unsatisfactory: Learner states instructional media integrated in the lesson plan. Not provided: Learner does not employ any instructional media in the lesson plan.

The needs of diverse learners are considered in the task (ISTE-T: 2c) (InTASC 7). Proficient: Learner demonstrates the ability to appropriately apply resources to target specific learning needs of multiple diverse populations. Developing: Learner demonstrates an incomplete/incorrect application of resources to target specific learning needs of multiple diverse populations. Unsatisfactory: Learner states the presence of diverse learners in the learner analysis. Not provided: Learner does not acknowledge diverse populations in the lesson plan.

A plan for assessment of student learning is included (ISTE-T: 2d) (InTASC 7). Proficient: Learner creates an assessment plan that demonstrates the relationship between formative assessment data, resulting instructional adjustment strategies, and summative assessment. Developing: Learner creates an assessment plan that includes elements of formative and summative assessment. Unsatisfactory: Learner creates an incomplete assessment plan. Not provided: Learner does not provide an assessment plan.

APPENDIX F — Training Instructions Provided to Second Rater

REFERENCES

Abell, S. K. (2008). Twenty years later: Does pedagogical content knowledge remain a useful idea? International Journal of Science Education, 30(10), 1405–1416. https://doi.org/10.1080/09500690802187041

Adamy, P., & Boulmetis, J. (2005). The impact of modeling technology integration on preservice teachers' technology confidence. Journal of Computing in Higher Education, 17(2), 100–120. https://doi.org/10.1007/BF03032700

AECT. (2009). AECT [Organizational]. Retrieved May 15, 2016, from http://aect.site-ym.com/

Alexander, C., Knezek, G., Christensen, R., Tyler-Wood, T., & Bull, G. (2014). The impact of project-based learning on pre-service teachers' technology attitudes and skills. Journal of Computers in Mathematics and Science Teaching, 33(3), 257–282. Retrieved from https://www.learntechlib.org/p/112337/

Anderson, S. E., Groulx, J. G., & Maninger, R. M. (2011). Relationships among preservice teachers' technology-related abilities, beliefs, and intentions to use technology in their future classrooms. Journal of Educational Computing Research, 45(3), 321–338. https://doi.org/10.2190/EC.45.3.d

Archambault, L., & Larson, J. (2015). Pioneering the digital age of instruction: Learning from and about K-12 online teachers. Journal of Online Learning Research, 1(1), 49–83.

Avargil, S., Herscovitz, O., & Dori, Y. J. (2012). Teaching thinking skills in context-based learning: Teachers' challenges and assessment knowledge. Journal of Science Education and Technology, 21(2), 207–225. https://doi.org/10.1007/s10956-011-9302-7

Bakeman, R. (2005). Recommended effect size statistics for repeated measures designs. Behavior Research Methods, 37(3), 379–384.

Bandura, A. (1986).
The explanatory and predictive scope of self-efficacy theory. Journal of Social and Clinical Psychology, 4(3), 359–373. https://doi.org/10.1521/jscp.1986.4.3.359

Beyerbach, B., Walsh, C., & Vanatta, R. (2001). From teaching technology to using technology to enhance student learning: Preservice teachers' changing perceptions of technology infusion. Journal of Technology and Teacher Education, 9(1), 105.

Bolick, C. M., Berson, M. J., Friedman, A. M., & Porfeli, E. J. (2007). Diffusion of technology innovation in the preservice social studies experience: Results of a national survey. Theory & Research in Social Education, 35(2), 174–195. https://doi.org/10.1080/00933104.2007.10473332

Branch, R. (2009). Instructional design: The ADDIE approach. New York: Springer.

Brenner, A. M., & Brill, J. M. (2016). Investigating practices in teacher education that promote and inhibit technology integration transfer in early career teachers. TechTrends, 60(2), 136–144. https://doi.org/10.1007/s11528-016-0025-8

Brickner, D. L. (1995). The effects of first and second-order barriers to change on the degree and nature of computer usage of mathematics teachers: A case study (Doctoral dissertation). Purdue University.

Brown, D., & Warschauer, M. (2006). From the university to the elementary classroom: Students' experiences in learning to integrate technology in instruction. Journal of Technology and Teacher Education, 14(3), 599–621.

Brown, J. S., & Duguid, P. (1991). Organizational learning and communities-of-practice: Toward a unified view of working, learning, and innovation. Organization Science, 2(1), 40–57.

Bruner, J. S. (1956). A study of thinking. New York: Wiley.

Bruner, J. S. (1960). The process of education. Cambridge, MA: Harvard University Press.

Bruner, J. S. (1966). Toward a theory of instruction. Cambridge, MA: Belknap Press of Harvard University.

Brush, T., Glazewski, K. D., & Hew, K. F. (2008). Development of an instrument to measure preservice teachers' technology skills, technology beliefs, and technology barriers. Computers in the Schools, 25(1–2), 112–125. https://doi.org/10.1080/07380560802157972

Buckenmeyer, J. A. (2010). Beyond computers in the classroom: Factors related to technology adoption to enhance teaching and learning. Contemporary Issues in Education Research, 3(4), 27–35. Retrieved from http://search.proquest.com.ezproxy.gvsu.edu/docview/196352151/abstract/DE169C4090094092PQ/1

Burden, J., & Hunt, A. (2010). The start of a new era of teacher-led innovation? Journal of Biological Education, 44(3), 100–101. https://doi.org/10.1080/00219266.2010.9656204

Burke, J. (2000). New directions: Teacher technology standards. Southern Regional Education Board. Retrieved from http://eric.ed.gov/?id=ED459695

Carr-Chellman, A. (2014). Instructional design for teachers: Improving classroom practice (1st ed.). Florence: Routledge.

Chen, R.-J. (2010). Investigating models for preservice teachers' use of technology to support student-centered learning. Computers & Education, 55(1), 32–42. https://doi.org/10.1016/j.compedu.2009.11.015

Corbin, J., & Strauss, A. (2008). Basics of qualitative research: Techniques and procedures for developing grounded theory (3rd ed.). Thousand Oaks, CA: SAGE Publications. https://doi.org/10.4135/9781452230153

Council for the Accreditation of Educator Preparation. (2016). CAEP accreditation handbook [Organizational].
Retrieved July 17, 2017, from http://caepnet.org/accreditation/caepaccreditation/caep-accreditation-handbook

Cuban, L., Kirkpatrick, H., & Peck, C. (2001). High access and low use of technologies in high school classrooms: Explaining an apparent paradox. American Educational Research Journal, 38(4), 813–834.

Cuban, L. (2001). Oversold and underused. Cambridge, MA: Harvard University Press.

Cuban, L. (2008). The perennial reform: Fixing school time. Phi Delta Kappan, 90(4), 240.

Cuban, L. (2010). Rethinking education in the age of technology: The digital revolution and schooling in America. Science Education International, 94, 1125–1127.

Darling-Hammond, L. (1999). America's future: Educating teachers. Education Digest, 64(9), 18.

Darling-Hammond, L., & Bransford, J. (2005). Preparing teachers for a changing world: What teachers should learn and be able to do. San Francisco, CA: Jossey-Bass.

Darling-Hammond, L., & Sykes, G. (1999). Teaching as the learning profession: Handbook of policy and practice. Jossey-Bass.

Darling-Hammond, L., Wei, R. C., Andree, A., Richardson, N., & Orphanos, S. (2009). State of the profession. Journal of Staff Development, 30(2), 42–44, 46–50, 67.

Davies, R. S. (2011). Understanding technology literacy: A framework for evaluating educational technology integration. TechTrends, 55(5), 45–52.

Dede, C. (2000). Emerging influences of information technology on school curriculum. Journal of Curriculum Studies, 32(2), 281–303. https://doi.org/10.1080/002202700182763

Dede, C. (2003a). Multi-user virtual environments. EDUCAUSE Quarterly, 38(3).

Dede, C. (2003b). No cliche left behind: Why education policy is not like the movies. Educational Technology, 43(2), 5.

Dede, C. (2011). Reconceptualizing technology integration to meet the necessity of transformation. Journal of Curriculum & Instruction, 5(1), 4–16. https://doi.org/10.3776/joci.20yy.v5n1p4-16

Dede, C., Honan, J. P., & Peters, L. C. (2005). Scaling up success: Lessons learned from technology-based educational improvement. New York: Jossey-Bass.

Dennis, L. B. (2013). How are teachers integrating technology in K-5 classrooms? Studying student cognitive engagement using the Instructional Practices Inventory-Technology (IPI-T) instrument (Doctoral dissertation). New Mexico State University, Order No. 3679874, ProQuest LLC.

Dewey, J. (1916). Democracy and education: An introduction to the philosophy of education. New York.

Dirkx, J. M. (1998). Transformative learning theory in the practice of adult education: An overview. PAACE Journal of Lifelong Learning, 7, 1.

Dodge, B. (1998). Some thoughts about WebQuests [Institutional].

Eifler, K. E., Greene, T. G., & Carroll, J. B. (2001). Walking the talk is tough: From a single technology course to infusion. The Educational Forum, 65(4), 366.

Ellis, A. K. (1994). Research on school restructuring. Princeton Junction, NJ: Eye On Education.

Ertmer, P. A. (1999). Addressing first- and second-order barriers to change: Strategies for technology integration. Educational Technology Research and Development, 47(4), 47.

Ertmer, P. A. (2003). Transforming teacher education: Visions and strategies. Educational Technology Research and Development, 51(1), 124–128. Retrieved from http://www.jstor.org/stable/30220367

Ertmer, P. A. (2005). Teacher pedagogical beliefs: The final frontier in our quest for technology integration? Educational Technology Research and Development, 53(4), 25–39.

Ertmer, P. A., & Ottenbreit-Leftwich, A. (2013).
Removing obstacles to the pedagogical changes required by Jonassen's vision of authentic technology-enabled learning. Computers & Education, 64, 175–182. https://doi.org/10.1016/j.compedu.2012.10.008

Ertmer, P. A., & Ottenbreit-Leftwich, A. T. (2010). Teacher technology change. Journal of Research on Technology in Education, 42(3), 255–284. https://doi.org/10.1080/15391523.2010.10782551

Ertmer, P. A., Ottenbreit-Leftwich, A. T., Sadik, O., Sendurur, E., & Sendurur, P. (2012). Teacher beliefs and technology integration practices: A critical relationship. Computers & Education, 59(2), 423–435. https://doi.org/10.1016/j.compedu.2012.02.001

Fantilli, R. D., & McDougall, D. E. (2009). A study of novice teachers: Challenges and supports in the first years. Teaching and Teacher Education, 25(6), 814–825. https://doi.org/10.1016/j.tate.2009.02.021

Florida Center for Instructional Technology. (2017). Matrix | TIM [University sponsored]. Retrieved August 4, 2017, from https://fcit.usf.edu/matrix/matrix/

Fouts, J. T. (2003). A decade of reform: A summary of research findings on classroom, school, and district effectiveness in Washington State. Retrieved from http://eric.ed.gov/?id=ED482631

Frank, K. A., Zhao, Y., & Borman, K. (2004). Social capital and the diffusion of innovations within organizations: The case of computer technology in schools. Sociology of Education, 77(2), 148–171.

Gillingham, M. G., & Topper, A. (1999). Technology in teacher preparation: Preparing teachers for the future. Journal of Technology and Teacher Education, 7(4), 303–321.

Goodman, J. (1995). Change without difference: School restructuring in historical perspective. Harvard Educational Review, 65(1), 1.

Gray, L., Thomas, N., & Lewis, L. (2010). Teachers' use of educational technology in US public schools: First look 2009. National Center for Education Statistics. Retrieved from http://eric.ed.gov/?id=ED509514

Green, S., Salkind, N., & Akey, T. (2000). Using SPSS for Windows: Analyzing and understanding data (2nd ed., Vol. 25). Upper Saddle River, NJ: Prentice-Hall.

Greeno, J., Collins, A., & Resnick, L. (1996). Cognition and learning. In D. Berliner & R. Calfee (Eds.), Handbook of educational psychology (pp. 15–46). New York: Macmillan.

Hare, S., Howard, E., & Pope, M. (2002). Technology integration: Closing the gap between what preservice teachers are taught to do and what they can do. Journal of Technology and Teacher Education, 10(2), 191–203.

Harmes, J. C., Welsh, J. L., & Winkelman, R. J. (2016). A framework for defining and evaluating technology integration in the instruction of real-world skills. In S. Ferrara, Y. Rosen, & M. Tager (Eds.), Handbook of research on technology tools for real-world skill development (pp. 137–162). Hershey, PA: IGI Global.

Hew, K. F., & Brush, T. (2007). Integrating technology into K-12 teaching and learning: Current knowledge gaps and recommendations for future research. Educational Technology Research and Development, 55(3), 223–252.

Hsu, P.-S. (2012). Examining the impact of educational technology courses on pre-service teachers' development of technological pedagogical content knowledge. Teaching Education, 23(2), 195.

Hsu, P.-S. (2013). Examining changes of preservice teachers' beliefs about technology integration during student teaching. Journal of Technology and Teacher Education, 21(1), 27–48.

Hughes, J. E. (2000). Teaching English with technology: Exploring teacher learning and practice (Doctoral dissertation).
Retrieved from Michigan State University electronic dissertations.

Hughes, J. E., Ko, Y., Lim, M., & Liu, S. (2015). Preservice teachers' social networking use, concerns, and educational possibilities: Trends from 2008-2012. Journal of Technology and Teacher Education, 23(2), 185–212.

Hughes, J. E., Thomas, R., & Scharber, C. (2006). Assessing technology integration: The RAT – Replacement, Amplification, and Transformation – framework. In Proceedings of Society for Information Technology & Teacher Education International Conference (Vol. 2006, pp. 1616–1620). Retrieved from https://www.learntechlib.org/p/22293/

ISTE. (2016). ISTE standards teachers refresh. Retrieved August 2, 2017, from http://www.iste.org/standards/standards/for-teachers-refresh-2016-lp

Jacobsen, M., Clifford, P., & Friesen, S. (2002). Preparing teachers for technology integration: Creating a culture of inquiry in the context of use. Contemporary Issues in Technology and Teacher Education [Online serial], 2(3).

Jeffs, T., & Morrison, W. (2005). Special education technology addressing diversity: A synthesis of the literature. Journal of Special Education Technology, 20(4), 19–25. https://doi.org/10.1177/016264340502000403

Kay, R. (2006a). Evaluating strategies used to incorporate technology into preservice education: A review of the literature. Journal of Research on Technology in Education, 38(4), 383+.

Kay, R. (2007). The impact of preservice teachers' emotions on computer use: A formative analysis. Journal of Educational Computing Research, 36(4), 455–479. https://doi.org/10.2190/J111-Q132-N166-K249

Kennedy, K., & Archambault, L. (2012). Offering preservice teachers field experiences in K-12 online learning: A national survey of teacher education programs. Journal of Teacher Education, 63(3), 185–200. https://doi.org/10.1177/0022487111433651

Knowles, M. S. (1977). Adult learning processes: Pedagogy and andragogy. Religious Education, 72(2), 202–211.

Koehler, M. J., Mishra, P., Bouck, E. C., DeSchryver, M., Kereluik, K., Shin, T. S., & Wolf, L. G. (2011). Deep-play: Developing TPACK for 21st century teachers. International Journal of Learning Technology, 6(2), 146–163.

Koehler, M. J., Mishra, P., & Cain, W. (2013). What is technological pedagogical content knowledge (TPACK)? The Journal of Education, 193(3), 13–19.

Koehler, M. J., Mishra, P., & Yahya, K. (2007). Tracing the development of teacher knowledge in a design seminar: Integrating content, pedagogy and technology. Computers & Education, 49(3), 740–762. https://doi.org/10.1016/j.compedu.2005.11.012

Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice-Hall.

Kolodner, J. L., Camp, P. J., Crismond, D., Fasse, B., Gray, J., Holbrook, J., … Ryan, M. (2003). Problem-based learning meets case-based reasoning in the middle-school science classroom: Putting Learning by Design(tm) into practice. Journal of the Learning Sciences, 12(4), 495–547. https://doi.org/10.1207/S15327809JLS1204_2

Kumar, S., & Vigil, K. (2011). The net generation as preservice teachers. Journal of Digital Learning in Teacher Education, 27(4), 144–153. https://doi.org/10.1080/21532974.2011.10784671

Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33(1), 159. https://doi.org/10.2307/2529310

Lee, M.-H., & Tsai, C.-C. (2008).
Exploring teachers' perceived self efficacy and technological pedagogical content knowledge with respect to educational use of the World Wide Web. Instructional Science, 38(1), 1–21. https://doi.org/10.1007/s11251-008-9075-4

Lei, J. (2009). Digital natives as preservice teachers. Journal of Computing in Teacher Education, 25(3), 87–97. https://doi.org/10.1080/10402454.2009.10784615

Lenhart, A. (2015, April 9). Teens, social media & technology overview 2015 [Institutional]. Retrieved August 21, 2017, from http://www.pewinternet.org/2015/04/09/teens-socialmedia-technology-2015/

Lowther, D., Smaldino, S., & Russell, J. D. (2008). Instructional technology and media for learning (9th ed.). Upper Saddle River, NJ: Prentice Hall.

MacCallum, R. C., Widaman, K. F., Zhang, S., & Hong, S. (1999). Sample size in factor analysis. Psychological Methods, 4(1), 84–99. https://doi.org/10.1037//1082-989X.4.1.84

Magnusson, S., Krajcik, J., & Borko, H. (1999). Nature, sources, and development of pedagogical content knowledge. In J. Gess-Newsome & N. G. Lederman (Eds.), Examining pedagogical content knowledge: The construct and its implications for science education (pp. 95–132). Boston: Kluwer.

Martin, W., Strother, S., Beglau, M., Bates, L., Reitzes, T., & Culp, K. M. (2010). Connecting instructional technology professional development to teacher and student outcomes. Journal of Research on Technology in Education, 43(1), 53–74.

McDade, L. A. (1988). Knowing the "right stuff": Attrition, gender, and scientific literacy. Anthropology & Education Quarterly, 19(2), 93–114. https://doi.org/10.1525/aeq.1988.19.2.05x1802h

Mezirow, J. (1991). Transformative dimensions of adult learning (1st ed.). San Francisco: Jossey-Bass.

Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017–1054.

Mishra, P., & Koehler, M. J. (2009). Too cool for school? No way! Using the TPACK framework: You can have your hot tools and teach with them, too. Learning & Leading with Technology, 36(7), 14.

Moser, F. Z. (2007). Faculty adoption of educational technology. EDUCAUSE Quarterly, 30(1), 66.

Mouza, C., Karchmer-Klein, R., Nandakumar, R., Yilmaz Ozden, S., & Hu, L. (2014). Investigating the impact of an integrated approach to the development of preservice teachers' technological pedagogical content knowledge (TPACK). Computers & Education, 71, 206–221. https://doi.org/10.1016/j.compedu.2013.09.020

National Commission on Excellence in Education. (1983). A nation at risk: The imperative for educational reform. The Elementary School Journal, 84(2), 113–130.

Nut, J. (2010). Professional educators and the evolving role of ICT in schools: Perspective report. ICT in Schools. Retrieved from http://www.ictliteracy.info/rf.pdf/ICTinSchools.pdf

Olofson, M. W., Swallow, M. J. C., & Neumann, M. D. (2016). TPACKing: A constructivist framing of TPACK to analyze teachers' construction of knowledge. Computers & Education, 95, 188–201. https://doi.org/10.1016/j.compedu.2015.12.010

Park, S. H., & Ertmer, P. A. (2008). Examining barriers in technology-enhanced problem-based learning: Using a performance support systems approach. British Journal of Educational Technology, 39(4), 631–643. https://doi.org/10.1111/j.1467-8535.2008.00858.x

Pea, R. D. (1985). Beyond amplification: Using the computer to reorganize mental functioning. Educational Psychologist, 20(4), 167.

Pearcy, M. (2013). A year of reflection: The more things change.
Pellegrino, J. W., Goldman, S. R., Bertenthal, M., & Lawless, K. (2007). Teacher education and technology: Initial results from the “What Works and Why” project. Yearbook of the National Society for the Study of Education, 106(2), 52–86.
Polly, D., Mims, C., Shepherd, C. E., & Inan, F. (2010). Evidence of impact: Transforming teacher education with Preparing Tomorrow’s Teachers to Teach with Technology (PT3) grants. Teaching and Teacher Education, 26(4), 863–870. https://doi.org/10.1016/j.tate.2009.10.024
Puentedura, R. R. (2006). SAMR: Transformation, technology, and education. Retrieved from http://www.hippasus.com
Richardson, J. W., McLeod, S., Flora, K., Sauers, N. J., Kannan, S., & Sincar, M. (2013). Large-scale 1:1 computing initiatives: An open access database. International Journal of Education and Development Using Information and Communication Technology, 9(1), 4.
Rogers, E. M. (1962). Diffusion of innovations (1st ed.). New York: Free Press of Glencoe.
Romrell, D., Kidder, L. C., & Wood, E. (2014). The SAMR model as a framework for evaluating mLearning. Online Learning, 18(2).
Rosenberg, J. M., & Koehler, M. J. (2015). Context and Technological Pedagogical Content Knowledge (TPACK): A systematic review. Journal of Research on Technology in Education, 47(3), 186–210. https://doi.org/10.1080/15391523.2015.1052663
Rourke, L., & Anderson, T. (2004). Validity in quantitative content analysis. Educational Technology Research and Development, 52(1), 5–18.
Russo, T. E., & Siko, J. P. (2016, in progress). I do, we do, you do: Learning through technology before teaching through technology. Technology, Instruction, Cognition, and Learning.
Salomon, G., & Perkins, D. (2005). Do technologies make us smarter? Intellectual amplification with, of, and through technology. In R. J. Sternberg & D. D. Preiss (Eds.), Intelligence and technology: The impact of tools on the nature and development of human abilities (pp. 71–86). New York: Routledge.
Salomon, G., Perkins, D. N., & Globerson, T. (1991). Partners in cognition: Extending human intelligence with intelligent technologies. Educational Researcher, 20(3), 2–9.
Salpeter, J. (2003, October). 21st century skills: Will our students be prepared? Technology & Learning, 24(3), 17–26.
Siko, J. (2016). The P4 framework for pre-service and in-service teacher technology integration. In Proceedings of Global Learn 2016 (pp. 114–118). Limerick, Ireland: Association for the Advancement of Computing in Education (AACE).
Stewart, K., & Gachago, D. (2016). Being human today: A digital storytelling pedagogy for transcontinental border crossing. British Journal of Educational Technology, 47(3), 528–542. https://doi.org/10.1111/bjet.12450
Storandt, B., Dossin, L., Lacher, C., & Piacentini, A. (2012). Toward an understanding of what works in professional development for online instructors: The case of PBS TeacherLine. Journal of Asynchronous Learning Networks, 16(2), 121–162.
Strudler, N., & Hearrington, D. (2008). Quality support for ICT in schools. In J. Voogt & G. Knezek (Eds.), International handbook of information technology in primary and secondary education (pp. 579–596). Springer Netherlands.
Tanguma, J., Martin, S. S., & Crawford, C. M. (2002). Higher education and technology integration into the learning environment: Results of a survey of teacher preparation faculty. In Proceedings of Society for Information Technology & Teacher Education International Conference 2002 (pp. 736–740). Chesapeake, VA: SITE International Conference. Retrieved from http://eric.ed.gov/?id=ED482906
Taylor, S. J., & Bogdan, R. (1998). Introduction to qualitative research methods: A guidebook and resource. New York: Wiley.
Thomas, S. (2016). Future ready learning: Reimagining the role of technology in education. 2016 National Education Technology Plan. Office of Educational Technology, US Department of Education. Retrieved from https://eric.ed.gov/?id=ED571884
Tondeur, J., van Braak, J., Sang, G., Voogt, J., Fisser, P., & Ottenbreit-Leftwich, A. (2012). Preparing pre-service teachers to integrate technology in education: A synthesis of qualitative evidence. Computers & Education, 59(1), 134–144. https://doi.org/10.1016/j.compedu.2011.10.009
Tschannen-Moran, M., & Hoy, A. W. (2001). Teacher efficacy: Capturing an elusive construct. Teaching and Teacher Education, 17(7), 783–805. https://doi.org/10.1016/S0742-051X(01)00036-1
Tschannen-Moran, M., & Hoy, A. W. (2007). The differential antecedents of self-efficacy beliefs of novice and experienced teachers. Teaching and Teacher Education, 23(6), 944–956. https://doi.org/10.1016/j.tate.2006.05.003
Tuzzio, L. (2007). Factors that influence teachers’ proficiency with and use of educational technology. Central Connecticut State University. Retrieved from http://cdm16627.contentdm.oclc.org.ezproxy.gvsu.edu/cdm/ref/collection/ccsutheses/id/961
Vandewater, E. A., Bickham, D. S., Lee, J. H., Cummings, H. M., Wartella, E. A., & Rideout, V. J. (2005). When the television is always on: Heavy television exposure and young children’s development. American Behavioral Scientist, 48(5), 562–577. https://doi.org/10.1177/0002764204271496
Vygotsky, L. S. (1977). The development of higher psychological functions. Journal of Russian and East European Psychology, 15(3), 60–73. https://doi.org/10.2753/RPO1061-0405150360
Wang, L., Ertmer, P. A., & Newby, T. J. (2004). Increasing preservice teachers’ self-efficacy beliefs for technology integration. Journal of Research on Technology in Education, 36(3), 231–250.
Wells, J., & Lewis, L. (2006). Internet access in U.S. public schools and classrooms: 1994–2005. National Center for Education Statistics. Retrieved from https://eric.ed.gov/?id=ED494307
Wetzel, K., Foulger, T. S., & Williams, M. K. (2008). The evolution of the required educational technology course. Journal of Computing in Teacher Education, 25(2), 67–71.
Wiggins, G. P., & McTighe, J. (1998). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development.
Williams, M. K., Foulger, T. S., & Wetzel, K. (2009). Preparing preservice teachers for 21st century classrooms: Transforming attitudes and behaviors about innovative technology. Journal of Technology and Teacher Education, 17(3), 393–418.
Wood, D. (2001). WebQuests: Pathways for teaching to learn, learning to teach. Retrieved August 19, 2017, from http://technologysource.org/article/webquests/
Yang, C., Tzuo, P., & Komara, C. (2011). Using Webquest as a Universal Design for Learning tool to enhance teaching and learning in teacher preparation programs. Journal of College Teaching and Learning, 8(3), 21–29.
Zhao, Y., & Frank, K. A. (2003). Factors affecting technology uses in schools: An ecological perspective. American Educational Research Journal, 40(4), 807–840.
Zhao, Y., Pugh, K., Sheldon, S., & Byers, J. L. (2002). Conditions for classroom technology innovations. Teachers College Record, 104(3), 482–515.