TEACHER MODERATING AND STUDENT ENGAGEMENT IN SYNCHRONOUS COMPUTER CONFERENCES

By

Shufang Shi

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

Department of Counseling, Educational Psychology and Special Education

2005

ABSTRACT

TEACHER MODERATING AND STUDENT ENGAGEMENT IN SYNCHRONOUS COMPUTER CONFERENCES

By Shufang Shi

Online learning has received a great deal of attention lately, especially in higher education. The bulk of research has focused on asynchronous environments (such as web-based bulletin boards, e-mail systems, etc.). Synchronous communication, by contrast, despite its popularity, has received less research attention. Of particular interest is the manner in which instructors manage the ebb and flow of classroom discussion and how this affects student engagement. This dissertation study attempts to develop a deeper understanding of the relationship between teacher moderating and student engagement.

The primary data source for the study was 44 transcripts collected from 4 groups of college students over 11 weeks of conferences in a moderated synchronous online course taught at a Canadian university. The study used a mixed method design in which the results of the quantitative analysis were used to select cases for qualitative analysis to better understand the substantive processes of engaged collaborative discourse. An important part of the analysis was the development of new constructs and measurement methods to measure teacher moderating behaviors and a range of student engagement variables (behavioral, social-emotional, and intellectual). The quantitative analysis revealed that student intellectual engagement was a function of both students' participation and the number and quality of teacher postings. For the qualitative part of the research, the researcher applied discourse analysis techniques to transcripts of interest in order to discover specifically what was happening with teacher moderating. This provided a unified picture of the complex nature of the interaction process in synchronous learning environments as well as an opportunity to identify and present key themes and practices for effective online moderating.

In summary, the methodologies and findings of this study contribute to a better understanding of how teachers can provide effective online mentoring and scaffolding to facilitate student engagement with each other and with the subject matter. It also contributes to a better understanding of whether and how a community of inquiry develops by means of synchronous computer conferencing and how students can become invested behaviorally, social-emotionally, and intellectually.
This research also informs both research and practice related to the larger goal of improving the quality of online teaching and learning.

Copyright by
SHUFANG SHI
2005

In memory of my father, Xianqian Shi

Dedicated to my son, Michael (DD)

ACKNOWLEDGEMENTS

I am fortunate to have known a host of individuals who have come to my aid, enthusiastically, willingly, knowingly (or not), in my educational and professional sojourn. From these individuals I found a common element - a common mindset - which seemed linked to the elegantly flowing discourse of computer conferencing itself: a commitment to sharing, openness, freedom, exploration and, most of all, devotion to education. In their collective and individual minds, I encountered an ethic to improve the world by means of digital technologies. Most of those I acknowledge here embrace and continue to build upon the view of a connected, learning, interactive world of learners. There is magic in this environment, and we have witnessed some - but by no means all - of that magic.

Among those to whom I am indebted are true pioneers in instructional technology and online conferencing: prophets and seers in an age that fulfills prophecy almost before it is recorded. There are, in this group, leaders and trailblazers in the computer conferencing arena, and others who view this arena as interesting, but - on a personal level - not for them. All have helped me shape my own views and approach to the subject, and all have added to whatever insights I have divined by their scholarship and discipline. For the most part, though, I am grateful to have been part of a fellowship that understands the potential of online collaboration at its most profound levels, and to have been blessed with such active participants.

Dr. Punya Mishra, my dissertation chair and academic advisor: thank you for your unremitting support, your scholarly insights, and your sense of humor. You have contributed immeasurably to my completion of this doctoral program. You provided me with an opportunity to experience an exceptional mentor-student relationship, one that will be a model for working with my own students in the future. For the privilege of working with you I will be forever grateful.

The great precursor for the mindset of my dissertation committee, Dr. Curtis Bonk, from Indiana University: thank you not only for your incredible tele-mentoring but also for your sincere friendship. You have provided me with abundant opportunities to learn and grow in the field of online learning.

Dr. John Dirkx: thank you for your insightful suggestions on online discussion and its social-emotional aspects.

Dr. Susan Florio-Ruane: thank you for your incredible scholarship and your unending support throughout this intellectual endeavor. It is from this constructive, collaborative team teaching with Dr. David Pearson that I truly came to appreciate the socio-cultural perspectives that have become a permanent part of my teaching philosophy.

Dr. Ralph Putnam: thank you for your insightful critiques and suggestions and your support for my research and growth. The opportunity to teach an online educational research course not only apprenticed me in online teaching but also impressed upon me the importance of teacher-student interactions. Your gentle, individualized, and "care-full" approach to guidance and "the care and feeding" of graduate students should be bottled up so that I can use it in the future for my students!

Dr. Mark Reckase: thank you for your incredible scholarship in measurement and your understanding and flexibility. You made my work stronger through your thoughtful suggestions, which were greatly appreciated.

Dr. Yong Zhao and Xi Chen: thank you for providing me with an opportunity to start this long journey. Dr. Zhao brought me to this institution, for which I will be forever grateful; he saw more in me than I was then able to recognize, and he motivated me to do more than I knew was possible. His wife, Xi, has been a constant source of encouragement throughout my sojourn at Michigan State University.

Dr. James Porter and Dr. Ellen Cushman: thank you for your incredible scholarship and the illuminating dialogues in the early stage of my research.

Blaine Morrow, Dr. Chris Wheeler, Dr. Bob Floden, Dr. Jack Schwille, Dr. Carol Sue Englert: thank you for providing me opportunities to work with you and to grow into a professional. Without your support, the completion of this doctoral study would not have been possible.

Sue Barrett, Joni Smith, Sharon Anderson and many other wonderful secretaries: thank you for your hard work and your support.

Mia Lobel and Mike Neubauer: thank you for your support of this project, for access to your interesting LearningByDoing tool, and for sharing your research and data with me. You have forged paths that others will follow, and you make it possible for those who merely watch, record, and analyze to also feel a sense of wonder at your inventiveness and authenticity.

Steven Wang: thank you for offering your mind and heart. Your problem-solving talents, your versatility in multiple disciplines, your pursuit of truth, and all the illuminating dialogues made it possible for me to finish this project. You deserve a doctorate in education as well!

Man Xi: thank you for generously sharing your insights and your work with me in online learning research. You have provided suggestions and inspirations that have added immensely to my understanding of this subject and its potential value to educational technology research.

Tianshu Pan: thank you for offering your talents and patience. You were always there, ready to help.

Linda Chard, Shu-Chuan Kao, Lixiong Gu, Deping Li, Raymond Mapuranga, Leigh Graves Wolf, and Bo Yan, my colleagues and friends: thank you for sharing your talented minds and providing excellent dialogue and support in this intellectual endeavor.

Troy Hicks: thank you for offering me your amazing scholarship and friendship.

Dr. Patricia Wilson and Robert Birch: thank you for providing excellent dialogue that greatly motivated me to accomplish the final writing.

Mrs. Krause: thank you for being a model for me to follow as an educator, and thank you for your friendship. I first came to know your heart and mind by observing your classroom teaching, and your devotion to education has permanently impressed and influenced my research and teaching philosophy. Of all those I have come to know in this country, you have provided the purest example of kindness and unconditional love. There are no words that can properly express my gratitude.

Wei He, my best friend and former colleague: thank you for your warm companionship and for being always there for me.

Haojing Chen, John Klusinske, Sophia Tan, Rachel Tan, Xiuwen Wu, and Haniza Yon: thank you for your sincere friendship, your heart-warming companionship and your encouragement.

My former MA advisor, Prof. Zhaoyan Xi: thank you for being my mentor and a friend.
Your righteous and nurturing personality has always illuminated the dark moments.

My mom and dad: you did not know what a PhD was, but you granted me a direct and sincere character. Your love for me and your faith in me have lighted my road in the pursuit of truth and wisdom.

My brothers and sisters, my best friends and classmates Yong Yi and Xianjun Wei: thank you for your heart-warming support throughout this long journey. You are far away in physical distance but always close to my heart.

My dear son DD has been a tremendous source of love, inspiration, energy, and happiness. I thank him and hug him.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES

CHAPTER 1. PURPOSE OF STUDY
1.1 Statement of the Problem
1.2 Purpose of the Study
1.3 Project Significance
1.4 Overview of Chapters

CHAPTER 2. THEORETICAL PERSPECTIVES
2.1 Socio-cultural Learning Theory
2.2 Characteristics of Synchronous Computer Conferencing
2.3 Online Moderating
2.4 Student Engagement
2.5 Moderating Levels and Student Engagement in Synchronous Computer Conferencing
2.6 Summary

CHAPTER 3. METHOD
3.1 The Origin of the Current Design - Pilot Studies
3.2 Research Context
3.3 Data Collection
3.4 Variables and Their Measures
3.4.1 Teacher Moderating Levels
3.4.2 Student Engagement Variables and Their Measures
3.5 Research Questions
3.6 Research Design
3.6.1 Mixed Method
3.6.2 Transcript Analysis
3.6.3 Units of Analysis
3.6.4 Coding Process
3.6.5 Inter-rater Reliability
3.7 Data Analysis

CHAPTER 4. PRESENTATION AND ANALYSIS OF QUANTITATIVE DATA
4.1 Statistical Analysis Procedures
4.1.1 Combining Variables
4.1.2 The Analysis
4.1.3 Null Hypothesis
4.2 Statistical Analysis Process and Results
4.2.1 Mean Differences of Variables over Weeks
4.2.2 Mean Differences of Variables across Groups
4.2.3 Relationships Between and Among Variables
4.3 Conclusions and Discussion
4.3.1 Changes in Teacher Moderating Levels and Student Engagement over Time
4.3.2 Changes in Teacher Moderating Levels and Student Engagement across Groups
4.3.3 Relationships Between/Among Teacher Moderating Levels and Student Engagement

CHAPTER 5. DISCUSSION OF QUANTITATIVE DATA AND RESULTS
5.1 How Was Student Social-emotional Engagement Related to Student Intellectual Engagement
5.1.1 Relationship Between Student Social-emotional Engagement and Higher-order Thinking and Interactivity
5.1.2 Relationships Between Student Social-emotional Engagement and Student Behavioral Engagement
5.1.3 Relationship Between Student Social-emotional Engagement and Teacher Moderating Levels
5.1.4 Summary - The Core Question Remains: What Was Related to Student Intellectual Engagement
5.2 How Was Student Behavioral Engagement Related to Student Intellectual Engagement
5.2.1 What Student Behavioral Engagement Consists of - Relationship Between Attending and Participation
5.2.2 How Was Attending or Participation Related to Student Intellectual Engagement - Higher-order Thinking or Interactivity, Respectively
5.3 How Are Teacher Moderating Levels Related to Student Intellectual Engagement
5.3.1 Relationship Between Teacher Moderating Levels and Student Social-emotional Engagement
5.3.2 Relationship Between Teacher Moderating Levels and Student Behavioral Engagement
5.3.3 Relationship Between Teacher Moderating Levels and Student Intellectual Engagement
5.4 What Was Related to Student Intellectual Engagement - Comprehensive Factor TRP
5.5 Conclusion

CHAPTER 6. QUALITATIVE ANALYSIS
6.1 Selection Process of the Transcripts and Sections of Transcripts for Qualitative Analysis
6.2 The Qualitative Analysis Procedures
6.3 Revisiting the Roles of Moderators and the Use of Moderating Functions
6.4 Good Moderating Practices - A General Picture
6.5 Good Moderating Practices - Themes
6.5.1 Providing Hooks with Both Ends
6.5.2 Modeling and Tele-mentoring
6.5.3 Confronting and Conflicting
6.5.4 Setting up Norms
6.5.5 Social-emotional Elements

CHAPTER 7. GENERAL DISCUSSION: ACCOMPLISHMENTS AND FUTURE STUDIES
7.1 Review of Goals and Summary of Accomplishments
7.2 Limitations of the Study
7.3 Future Studies

APPENDICES
APPENDIX A. RUBRIC FOR MEASURING TEACHER MODERATING LEVELS
APPENDIX B. RUBRIC FOR MEASURING SOCIAL-EMOTIONAL ENGAGEMENT
APPENDIX C. RUBRIC FOR MEASURING HIGHER-ORDER THINKING
APPENDIX D. RUBRIC FOR MEASURING INTERACTIVITY
APPENDIX E. SUMMARY OF RESEARCH FINDINGS FOR NULL HYPOTHESES

BIBLIOGRAPHY

LIST OF TABLES

Table 1. Mean and Standard Deviation (SD) of Number of Teacher Postings (T) over Weeks
Table 2. Repeated Measure ANOVA for Number of Teacher Postings (T) over Weeks
Table 3. Mean and Standard Deviation (SD) of Rating of Teacher Moderating Levels (R) over Weeks
Table 4. Repeated Measure ANOVA for Rating of Teacher Moderating Levels (R) over Weeks
Table 5. Mean and Standard Deviation (SD) of Attending (A) over Weeks
Table 6. Repeated Measure ANOVA for Attending (A) over Weeks
Table 7. Mean and Standard Deviation (SD) of Participation (P) over Weeks
Table 8. Repeated Measure ANOVA for Participation (P) over Weeks
Table 9. Mean and Standard Deviation (SD) of Social-emotional Engagement (S) over Weeks
Table 10. Repeated Measure ANOVA for Social-emotional Engagement (S) over Weeks
Table 11. Mean and Standard Deviation (SD) of Higher-order Thinking (H) over Weeks
Table 12. Repeated Measure ANOVA for Higher-order Thinking (H) over Weeks
Table 13. Mean and Standard Deviation (SD) of Interactivity (I) over Weeks
Table 14. Repeated Measure ANOVA for Interactivity (I) over Weeks
Table 15. Mean and Standard Deviation (SD) of the Number of Teacher Postings (T) across Groups
Table 16. Repeated Measure ANOVA for Number of Teacher Postings (T) across Groups
Table 17. Mean and Standard Deviation (SD) of Rating of Teacher Moderating Levels (R) across Groups
Table 18. Repeated Measure ANOVA for Rating of Teacher Moderating Levels (R) across Groups
Table 19. Mean and Standard Deviation (SD) of Attending (A) across Groups
Table 20. ANOVA Table for Attending (A) across Groups
Table 21. Mean and Standard Deviation (SD) of Participation (P) across Groups
Table 22. ANOVA Table for Participation (P) across Groups
Table 23. Mean and Standard Deviation (SD) of Social-emotional Engagement (S) across Groups
Table 24. ANOVA Table for Social-emotional Engagement (S) across Groups
Table 25. Mean and Standard Deviation (SD) of Higher-order Thinking (H) across Groups
Table 26. ANOVA Table for Higher-order Thinking (H) across Groups
Table 27. Mean and Standard Deviation (SD) of Interactivity (I) across Groups
Table 28. ANOVA Table for Interactivity (I) across Groups
Table 29. Linear Regression Results of Number of Teacher Postings (T) and Student Social-emotional Engagement (S)
Table 30. Nonlinear Regression Results of the Number of Teacher Postings (T) and Student Social-emotional Engagement (S)
Table 31. Linear Regression Results of Rating of Teacher Moderating Levels (R) and Social-emotional Engagement (S)
Table 32. Nonlinear Regression Results of the Rating of Teacher Moderating Levels (R) and Student Social-emotional Engagement (S)
Table 33. Linear Regression Results of Attending (A) and Social-emotional Engagement (S)
Table 34. Linear Regression Results of Student Social-emotional Engagement (S) and Participation (P)
Table 35. Linear Regression Results of Social-emotional Engagement (S) and Higher-order Thinking (H)
Table 36. Nonlinear Regression Results of Social-emotional Engagement (S) and Higher-order Thinking (H)
Table 37. Linear Regression Results of Social-emotional Engagement (S) and Interactivity (I)
Table 38. Nonlinear Regression Results of Student Social-emotional Engagement (S) and Interactivity (I)
Table 39. Linear Regression Results of Attending (A) and Participation (P)
Table 40. Linear Regression Results of Number of Teacher Postings (T) and Attending (A)
Table 41. Linear Regression Results of the Rating of Teacher Moderating Levels (R) and Attending (A)
Table 42. Nonlinear Regression Results of the Rating of Teacher Moderating Levels (R) and Attending (A)
Table 43. Linear Regression Results of Number of Teacher Postings (T) and Participation (P)
Table 44. Linear Regression Results of Rating of Teacher Moderating Levels (R) and Participation (P)
Table 45. Nonlinear Regression Results of Rating of Teacher Moderating Levels (R) and Participation (P)
Table 46. Linear Regression Results of the Number of Teacher Postings (T) and Higher-order Thinking (H)
Table 47. Linear Regression Results of Rating of Teacher Moderating Levels (R) and Higher-order Thinking (H)
Table 48. Linear Regression Results of Attending (A) and Higher-order Thinking (H)
Table 49. Nonlinear Regression Results of Attending and Higher-order Thinking
Table 50. Linear Regression Results of Participation (P) and Higher-order Thinking (H)
Table 51. Linear Regression Results of Number of Teacher Postings (T) and Interactivity (I)
Table 52. Linear Regression Results of the Rating of Teacher Moderating Levels (R) and Interactivity (I)
Table 53. Linear Regression Results of Attending (A) and Interactivity (I)
Table 54. Nonlinear Regression Results of Attending (A) and Interactivity (I)
Table 55. Linear Regression Results of Participation (P) and Interactivity (I)
Table 56. Linear Regression Results of Comprehensive Factor TRP and Higher-order Thinking (H)
Table 57. Linear Regression Results of Comprehensive Factor TRP and Interactivity (I)
Table 58. Summary of ANOVA Results and Scheffe Post Hoc Results of All the 7 Variables over Weeks
Table 59. Summary of ANOVA Results and Scheffe Post Hoc Results of All the 7 Variables across Groups
Table 60. Summary of Regression Results of Relationships Between Student Social-emotional Engagement and All the Other Variables - T, R, A, P, H, and I

LIST OF FIGURES

Figure 1. Variables and Their Subcategories
Figure 2. NVIVO Tree Nodes
Figure 3. All Variables After Combination
Figure 4. Overview of the Research Design and Data Analysis
Figure 5. An Example - the Descriptive Statistics of Number of Teacher Postings (T-table)
Figure 6. A Quadratic Relationship Between Attending and Higher-order Thinking
Figure 7. A Linear Relationship Between Participation and Higher-order Thinking
Figure 8. A Quadratic Relationship Between Attending and Interactivity
Figure 9. A Linear Relationship Between Participation and Interactivity
Figure 10. A Linear Relationship Between the Comprehensive Factor TRP and Higher-order Thinking
Figure 11. A Linear Relationship Between Comprehensive Factor TRP and Interactivity
Figure 12. Mean Differences of Social-emotional Engagement across Groups
Figure 13. Mean Differences of Attending across Groups
Figure 14. Mean Plots of Student Higher-order Thinking across Groups
Figure 15. Mean Plots of Participation across Groups
Figure 16. Attending and Participation (Behavioral Engagement) as a Link to Connect Individuals to the Group
Figure 17. Student Social-emotional Engagement Was Exclusively Related to Attending and Participation (Behavioral Engagement)
Figure 19. The Whole Picture - What Influenced Student Intellectual Engagement

CHAPTER 1
PURPOSE OF STUDY

1.1 Statement of the Problem

Current theories of learning emphasize the value of dialogue for student engagement and achievement (Bruffee, 1993, 1999; Cazden, 2001). Researchers argue that learning and working with a small group, as opposed to individual activity, may facilitate learning (Bosworth & Hamilton, 1994; Bruffee, 1999; Hamm & Adams, 1992; Johnson & Johnson, 1975; Zhang, 2004). Research has also shown that the nature of classroom discourse depends greatly on the teacher (Anderson, Rourke, Garrison, & Archer, 2001). In face-to-face classrooms, these issues are relatively well understood. However, perceptions of group learning dynamics and online teachers' roles in distance education environments remain quite varied and controversial (Dennen, 2001; Lobel, Neubauer, & Swedburg, 2002a). For instance, although the online instruction literature increasingly emphasizes the importance of moderation and leadership (Anderson et al., 2001; Feenberg, 1989a), the relationship between moderation and student engagement is often unclear.

Computer conferencing is an important part of online learning. ("Computer conferencing" in this study refers to computer-mediated communication (CMC), a generic term now commonly used for a variety of systems that enable people to communicate with others by computers and networks.) It has broadened opportunities for the exchange of ideas, facts, and opinions by enabling one-to-many and many-to-one exchanges (Wagonner, 1992). The capacity of computer conferencing to support collaborative work and interaction has led to an appreciation of computer conferencing as a powerful learning environment (O'Malley, 1995). Computer conferencing has two subcategories: asynchronous and synchronous. Asynchronous computer conferencing refers to electronic bulletin boards, discussion boards, or electronic mail that participants can access at any time. Synchronous computer conferencing in this study refers to "real time" or "chat" programs through which participants communicate at the same time.

Despite the ability to transcend the factor of place while seeking educational opportunities, many online students feel they must learn in isolation. The depth of the students' learning is hampered by the lack of timely interaction with other students. Because it provides immediate communication, a synchronous learning environment supports a flow of learning and thought development that is more akin to the face-to-face classroom learning experience. Synchronous communication has the capacity to deepen the connection between students and instructors while promoting spontaneous learning experiences.
The direct application of real-time tools in the online environment provides the opportunity for "right-now" learning.

The adoption of computer conferencing in higher education has far outpaced our understanding and knowledge of it (Garrison, Anderson & Archer, 2001). Despite the promise of computer conferencing for creating powerful learning spaces and the fact that this technology has been widely adopted, many questions and issues remain (Bonk & Wisher, 2000; Herring, 2003). This lack of knowledge is even more pronounced in the case of synchronous computer conferencing learning environments because most research on distance education has focused on asynchronous systems. Nevertheless, the contextual aspects of learning - real-time social interaction and negotiation with peers, experts, moderators, and instructors - are vital to a student's movement from novice or legitimate peripheral participant to eventual contributor or expert (Bonk & King, 1998; Lave & Wenger, 1991; Orvis, Wisher, Bonk, & Olson, 2002; Wenger, 1998). This study focused on online conferences conducted in a synchronous mode: synchronous computer conferences.

1.2 Purpose of the Study

This dissertation research project is designed to address the above issues within the context of an online college-level course that was based completely on text-based synchronous computer conferencing. This course used a custom-built software system to develop structured discussions around topics under the guidance of teacher moderators. The key research issue was the manner in which engaged collaborative discourse can be developed and maintained through the actions and behaviors of the moderator. Engaged is used here not only in the general sense of participants interacting with each other but also in the sense of engaging with subject matter in a collaborative discourse (Xin, 2002). Discourse is used instead of discussion because it conveys the sense of "the process or power of reasoning," rather than the more social connotation of conversation (Anderson, 2004; Clark & Schaefer, 1989).

Given the above context, and operating within these definitions, this dissertation addresses three key issues:

1) The need to measure different aspects of learner engagement in collaborative discourse, and the relationships among them;

2) The need to understand and identify what factors contribute to student learning through engaged collaborative discourse, particularly the role of teacher moderators in enhancing student learning; and

3) The need to better understand the nature and dynamics of moderated synchronous group discussion per se.

Scholars have identified ways in which computer conferencing can support collaborative discourse and student learning. However, there is minimal theory-based and data-driven research that measures student engagement in learning through collaborative discourse in synchronous computer conferencing environments (Xin, 2002). Measurement is an essential aspect of scientific research (Hiltz, Coppola, Rotter, & Turoff, 2000). Researchers have designed, modified, and adapted numerous measurement methods and instruments for studying learning phenomena in computer conferencing environments, but few of these measures and instruments have been verified by empirical data. Moreover, most of these measures were developed for measuring asynchronous learning environments; few have been applied to synchronous learning environments.
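As a concrete illustration of what such measurement might look like in a synchronous setting, the sketch below shows one hypothetical way that coded transcript postings could be aggregated into weekly, per-group measures corresponding to some of the variables used later in this study (T, number of teacher postings; R, rating of teacher moderating levels; P, participation; H, higher-order thinking; I, interactivity). The column names, the example ratings, and the use of pandas are assumptions made for the sake of the sketch, not the study's actual rubrics or coding tools.

```python
# Hypothetical illustration only: rolling coded postings up into weekly,
# per-group measures. Column names and example ratings are assumptions,
# not the rubrics in Appendices A-D.
import pandas as pd

# Each row is one coded posting from a conference transcript.
units = pd.DataFrame({
    "group":  ["G1", "G1", "G1", "G2", "G2", "G2"],
    "week":   [1, 1, 1, 1, 1, 1],
    "author": ["teacher", "student_a", "student_b",
               "teacher", "student_c", "student_d"],
    "moderating_level": [3, None, None, 4, None, None],  # rated for teacher postings only
    "higher_order":     [None, 2, 4, None, 3, 3],        # rated for student postings only
    "interactivity":    [None, 1, 3, None, 2, 4],
})

weekly = units.groupby(["group", "week"]).agg(
    T=("author", lambda a: (a == "teacher").sum()),   # number of teacher postings
    P=("author", lambda a: (a != "teacher").sum()),   # student participation (postings)
    R=("moderating_level", "mean"),                   # mean teacher moderating rating
    H=("higher_order", "mean"),                       # mean higher-order thinking rating
    I=("interactivity", "mean"),                      # mean interactivity rating
).reset_index()

print(weekly)
```

Aggregating in this way over 4 groups and 11 weeks would yield the 44 week-level observations that the quantitative analyses examine.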
Once student engagement can be adequately measured and interpreted, it becomes possible to focus on the factors that are related to engagement. As part of this increased understanding of the factors that contribute to student engagement in learning, it is vital to know the respective roles that teachers and group dynamics play. The second goal of this study is to form a model of learning through engaged collaborative discourse in computer conferencing. By building such a model, relationships between teacher moderating behaviors and student engagement, as well as relationships among aspects of student engagement, may be disentangled, potentially revealing major factors critical to student learning.

One of the most significant gaps in our knowledge about the use of computer conferencing for learning concerns the relationships between individual thinking processes and group interactions. For instance, Cazden (2001), while discussing discourse in classrooms, writes, "it is never easy to talk about relationships between individual (silent) thinking process and the dyadic or group (often noisy) interactions in the classroom." However, this relationship between individual cognition and group interaction lies at the heart of student learning (Cazden, 2001) and is particularly important if the potential of computer-mediated communication is to be achieved.

To accomplish these three goals, this research applied a mixed method research design to examine the transcripts generated by the moderated synchronous discussions of four groups of students over 11 weeks of an online course. The researcher measured teacher moderating levels (based on the quantity and quality of moderator intervention), developed constructs and sub-constructs of student engagement (seen as consisting of three key aspects: behavioral, social-emotional, and intellectual), and developed coding schemes for measuring student engagement. By means of quantitative analyses, the researcher explored relationships between teacher moderating levels and student engagement variables, and relationships among student engagement variables, as they developed over time. The broad patterns resulting from the quantitative analyses allowed the selection of transcripts and sections of transcripts of interest for qualitative analyses. The qualitative analyses provided a closer look at the nature of the "lived" experience of conferencing, the process of collaborative meaning construction, and the transactional nature of the relationship between teacher moderating and student engagement. By analyzing how various moderating techniques and implementations influence and are influenced by group interactions, and by demonstrating the effect of these on factors related to learning, this study will contribute to the growing body of knowledge related to engaged collaborative discourse in computer conferencing.

1.3 Project Significance

Research related to the underlying processes of synchronous computer conferencing vis-a-vis online moderating contributes to a better understanding of how teachers can provide effective online mentoring and scaffolding to facilitate collaborative student engagement, both in a social sense and with subject matter (Bonk et al., in press). It also contributes to a better understanding of whether and how a community of inquiry develops by means of synchronous computer conferencing in which students are most likely to become invested behaviorally, social-emotionally, and intellectually.
Findings from this research should inform research and practice on the larger goal of improving the quality of online teaching and learning.

1.4 Overview of Chapters

This dissertation is organized into seven chapters. Chapter 1, the current chapter, provides an overview of the problems, needs, and purpose of the study as well as an explanation of the significance and organization of the entire project. Chapter 2 reviews the pertinent literature and provides the theoretical perspectives for this study; perspectives from socio-cultural learning theory, characteristics of synchronous computer conferencing, online moderating, and student engagement are discussed. Chapter 3 describes the methodology used in this study. It provides the research context, research design, and data collection procedures, and it delineates the variables and their measures. It also provides the data analysis procedures, including the use of content analysis with the aid of QSR NVIVO 2.0 software, the data analysis techniques used to answer the research questions, and the qualitative analysis procedures. Chapter 4 presents the quantitative data and analysis results, together with rationales for the null hypotheses. Chapter 5 discusses the results derived from the quantitative analysis; through this discussion, a model of learning through engaged collaborative discourse in the context of synchronous computer conferencing is offered. Chapter 6 presents the qualitative data analysis procedures and results; the results are presented in the form of narratives of good teaching practices supported by examples. Chapter 7 provides a review of the goals of this research and a summary of its accomplishments, limitations, implications, and recommendations.

CHAPTER 2
THEORETICAL PERSPECTIVES

This research draws upon multiple research perspectives. In this chapter the pertinent literature related to socio-cultural learning theory, characteristics of synchronous computer conferencing, online moderating, student engagement, and their relationships will be reviewed. The initial discussion concerns socio-cultural learning theory, in which the project is situated.

2.1 Socio-cultural Learning Theory

The project, situated within a Vygotskian, or socio-cultural, framework (Vygotsky, 1978, 1986; Wertsch, 1985), studied computer conferencing as a medium for providing scaffolded feedback from multiple sources and perspectives (Dennen & Bonk, 2000). It also draws upon the cognitive apprenticeship model espoused by Collins, Brown, and Newman (1989). Vygotsky's theory provides a solid basis for understanding learning as a process of social negotiation and collaborative sense-making, with mentoring as an effective technique to assist students in collaborative activities and knowledge construction (Zhu, 1996). The framework contends that, "while the individual learner is the only one who can construct his or her unique understanding of the world, this understanding emerges in a social context" (Cunningham, Duffy, and Knuth, 1993). An important concept in Vygotsky's theory is his idea that intellectual development takes place between people before being internalized. From this point of view, instruction is more effective when it takes the form of discussions or dialogues in small groups wherein learners interact with peers and with adults or mentors who challenge, support, and scaffold their learning (Zhu, 1996).

Cognitive apprenticeship (Collins, Brown and Newman, 1989) emphasizes real-world problem solving under expert guidance that fosters cognitive and metacognitive skills and processes. The notion of guided experience in cognitive apprenticeship corresponds to the concept of guidance and collaboration in the zone of proximal development (ZPD) introduced by Vygotsky (1978). The ZPD is defined as "the distance between the actual developmental level as determined by independent problem solving and the level of potential development as determined through problem solving under adult guidance or in collaboration with more capable peers" (p. 86). Exposure to the strategies, skills, and ideas of others on a social plane can be individually appropriated and internalized as independent problem-solving skills. The implication is that, to learn better, learners have to be situated in a social and functional context in which the relevant learning skills and knowledge are embedded (Wang & Bonk, 2001). Synchronous computer conferencing can provide such a context, as this discussion will clarify. The nature of collaborative discourse in synchronous computer conferencing is framed within the basic characteristics of the medium, including its affordances and constraints (Tu, 2003).

2.2 Characteristics of Synchronous Computer Conferencing

Computer conferencing is increasingly gaining acceptance and use in many higher education courses delivered or supported by electronic means (Keynes, 2003, p. 361). There are two modes of computer-mediated communication: asynchronous and synchronous. This research focused on synchronous computer conferencing as a Web-based communication system that supports real-time, many-to-many textual interactions. The interactions made possible through synchronous communication technologies allow participants to experience "same-time, same place" or "same-time, any place" collaboration. This collaboration demonstrates the important traits of immediacy, fast planning, problem-solving, scheduling, and decision-making, which can be difficult to replicate in an asynchronous environment (Knolle, 2002; Marjanovic, 1999). Students can ask questions and share information related to presentations, and, when facilitated, the technology can be used to keep students from moving off-topic or distracting others; this use has been shown to be effective (Marjanovic, 1999).

This mode of communication prompted some authors to speculate that it would be an ideal medium to support substantive discussion (Feenberg, 1989b; Kaye, 1992; Roschelle, 1996). The act of encoding ideas in textual format and communicating them to others forces cognitive processing, and the resulting clarity is strongly associated with scholarly practice and effective communication (Feenberg, 1989a; Henri, 1992). At the individual level, it provides time for self-reflection that is critical for higher-order thinking and deep learning (Garrison et al., 2001). At the social level, Feenberg (1989b) points out, "a group which exists through an exchange of written text has the peculiar ability to recall and inspect its entire past" (p. 25). This ability, Feenberg argues, provides the opportunity for the group to work together to advance its collective memory.
This kind of text-based, many-to-many communication may promote reflective and critical thinking, negotiation of meaning, and collaborative construction of knowledge. There is some evidence to support these predictions (Bruffee, 1992, 1993; Hara, Bonk, & Angeli, 2000; Hillman, 1999; Newman, Johnson, Webb, & Cochrane, 1997; Newman, Webb, & Cochrane, 1995).

The ways in which we communicate face-to-face are reconstituted when we move online. The purely textual nature of computer-mediated conferencing constrains interaction in some ways. For instance, some "cues," such as the frequency of an individual's writing and the nature and content of the group's referencing of its past, bear some "family" resemblance to the strategies used in face-to-face groups, which also have collective memory and texts (for example, Burbules, 1993; Florio-Ruane & deTar, 2001). However, not all aspects of meaning that are communicated in speech, especially during face-to-face conversation, can be easily represented in writing, nor do we know how paralinguistic information - e.g., leadership, reluctance, disagreement, and so forth - is conveyed in writing alone. Although investigators have researched the means by which groups accomplish this orally (and in the writing accompanying oral communication, e.g., Florio-Ruane & deTar, 2001; John-Steiner, 2000; Tannen, 1989), we have much to learn about how this is accomplished and interpreted by people working exclusively through written communication via the computer, and especially through synchronous written communication (diSessa, 2001). The sound, tone, and tempo of speech and the non-verbal expression of face-to-face conversation are lost in computer conferencing. This narrowed communication channel challenges participants; as a result, successful communication in such an environment requires conscious effort and skilled coordination and collaboration. This leads to the issue of online moderating and its role in online discussions.

2.3 Online Moderating

To moderate is to preside or to lead (Feenberg, 1989a; Mason, 1991; Paulsen, 1995). Drawing on the idea of discussion as language games (Wittgenstein, 1958), moderating functions play an important role in keeping participants absorbed in the ongoing dialogue "game." Playing at computer conferencing consists of making moves that keep others playing (Xin, 2002). In this way, computer conferencing favors open-ended comments, and this calls for a moderator who provokes and instigates in order to keep the game alive. When a message fails to function as a link, at one end or the other, moderating functions (e.g., recognition, prompting, weaving) are needed to tie up the loose ends and strengthen the link in order to keep the chain of conversation going (Xin, 2002).

Collins, Brown, and Newman's (1989) cognitive apprenticeship, Rogoff's (1990) model of apprenticeship in thinking, and Bruner's adaptation of Vygotsky's "zone of proximal development," including the supportive dialogue within that zone, or "scaffolding" (Ninio & Bruner, 1978), are all analogies employed to illustrate an assistive role for teachers: providing instrumental support to students from their position of greater content knowledge (Bonk & Cunningham, 1998; Garrison & Archer, 2000). What is not clear is how much "scaffolding" is required or appropriate.

The literature on online discussion has tended to favor high levels of moderating. Based on the over-arching ethos of good teaching and learning (The Report of the University of Illinois, 1999) and the limitations of computer conferencing, researchers have often argued for strong online moderating. Studies have shown that when learning based on computer conferencing fails, it is usually because of a lack of teaching presence and appropriate online leadership (Garrison et al., 2001; Gunawardena, Anderson & Lowe, 1997; Harasim, 1990; Hiltz et al., 2000).

However, researchers have identified problems when the instructor exclusively assumes the role of discussion leader (Rourke & Anderson, 2002). One consistently cited issue is the authoritarian presence that the instructor brings to the discussion. Bloxom et al. (1975) and Kremer and McGiness (1998) each warn that this type of presence can inhibit the free exchange of ideas. The ultimate concern is that instructor-led discussions can easily revert to the recitation, or initiate-respond-evaluate, structure of a traditional lecture in which the student is often a passive and unreflective audience member (Rourke & Anderson, 2002). Meanwhile, many corporate training settings favor independent study and self-directed online learning. Some practitioners of online teaching prefer not to moderate online discussions because they think the teacher's intervention may limit students' freedom in the discussion.

The purpose of moderating is to promote student engagement. Student engagement in collaborative discourse through a community of inquiry is the key educational objective (Bloom, 1956; Carroll, 1963) and a key indicator of the ideal learning process in the medium of synchronous computer conferencing. Student engagement is discussed below.

2.4 Student Engagement

All learning requires engagement to attain mastery (Bloom, 1956; Carroll, 1963). Based on Bloom's well-cited taxonomy of educational objectives, student engagement in a learning process embraces three aspects: (1) cognitive, (2) affective, and (3) psychomotor (Bloom, 1956). Such a threefold division is as ancient as Greek philosophy: philosophers and psychologists have repeatedly used similar tripartite organizations such as cognition, conation, and feeling; thinking, willing, and acting; and so on (Krathwohl, Bloom, & Masia, 1964). In his seminal article "Learning Community," Schwab (1975) argued that fruitful conversation requires three kinds of community: (1) community of confidence and trust, (2) affective community, and (3) cognitive community:

There must be community of confidence and (cautious) trust, which arises from the past collaboration in which the usefulness of each to other, and a degree of dependability, have been discovered. There must be affective community, which arises from shared vicissitudes and satisfactions. There must be a cognitive community (p. 39).

Online conferencing similarly requires engagement to reach ideal educational objectives, although the divisions and the definitions of the components of student engagement are not the same as those of the classic taxonomies of learning objectives by Bloom (1956), Carroll (1963), or Schwab (1975). Potentially, synchronous online communication engages students in knowledge sharing, mutual inspiration, interdependence, and active learning through conversation, argument, debate, and discussion among peers, experts, and teachers or moderators (Bonk & Cunningham, 1998; Kaye, 1992).
As Kaye (1992) has stated, the practical reality of collaboration is that it requires a higher order of involvement, or engagement (Schrage, 1990). However, there is a need for more research into the nature of engagement online as well as the activities and contextual features of an engaging online curriculum. Engagement here is synonymous with investment, involvement, or commitment. For the purpose of this research, student engagement is defined as a phenomenon that occurs when students become invested behaviorally, social-emotionally, and intellectually in the collaborative discourse of a community of inquiry through the medium of computer conferencing (cf. Christenson & Menzel, 1998; Sanders & Wiseman, 1990).

By synthesizing a broad literature review beginning with Bloom's educational objectives (1956), Carroll's model of school learning (1963), and Schwab's learning community (1975), and encompassing the more current computer-mediated communication field (Kearsley & Shneiderman, 1998; National Survey of Student Engagement, 2003; Scardamalia & Bereiter, 1996; Scardamalia, Bereiter, McLean, Swallow, & Woodruff, 1989; Shneiderman, 1992, 1993, 1998), this researcher has developed the constructs of student engagement in computer conferencing. The argument will be made that student engagement consists of three sub-constructs: behavioral, social-emotional, and intellectual engagement. Behavioral engagement is derived from Carroll's seminal model of school learning (1963), which states that learning requires involvement to attain mastery. Social-emotional engagement embraces both Schwab's community of confidence and trust and the affective community, while intellectual engagement corresponds to Schwab's cognitive community (1975). The research maintains that these three dimensions of student engagement are closely interrelated; this is an empirical question that will be tested in the present research. The construct of student engagement and each of its sub-constructs are explained in greater detail in the research design section.

2.5 Moderating Levels and Student Engagement in Synchronous Computer Conferencing

The relationship between online moderating and student engagement in a synchronous computer conferencing learning environment is complicated. It may be easy to assume that there is a direct causal relationship between moderator behavior and student engagement (i.e., that the nature of moderator behavior determines the level of student engagement). According to this perspective, the moderator strongly influences the nature of the student experience in synchronous online discussions and determines its relative success or failure. An alternative view might be that student engagement determines the nature of moderator behavior. For instance, a group of highly engaged students who participate in discussion with little encouragement will clearly require little guidance from the moderator to get the discussion going. It is not surprising that some theorists in this area argue for strong moderating intervention while others believe that self-direction on the part of students is more important. Both of these deterministic approaches overlook the dynamic nature of student-instructor interaction in synchronous computer conferences. In an educational context, the development of shared understanding is a complex process mediated by the prior knowledge of the students, their interaction and engagement with each other, the subject matter, and the moves made by the instructor.
The development of dialogue, where newer messages build on earlier messages, can be one indicator of the manner in which shared understandings are constructed by the instructor and the students. Given this, the instructor holds a relatively privileged position in the classroom, and this holds true in virtual classrooms as much as in conventional face-to-face classrooms. Thus the shared construction of knowledge in a virtual classroom can be strongly influenced by the role taken by the instructor. Consequently, the relationship between teacher moderating levels and the relationships among student engagement variables in synchronous computer conferencing are complex and dynamic, and changes in one variable may influence the others.

The complexity of these phenomena has significant implications for the design of any research related to them. It is for this reason that this researcher developed a mixed-methods research design, including both quantitative and qualitative components. The quantitative component involves converting communication content into discrete units and calculating the frequency of occurrence of each unit. It also extends the descriptive results of content analysis to inferential hypothesis testing (Borg & Gall, 1989; Rourke et al., 2001), which is meant to test the relationship between teacher moderating levels and student engagement. By contrast, the qualitative approach is aimed at understanding and clarifying the dynamics of interaction between the students and the instructor. This approach pays close attention to the content of what is discussed and to the intricate give-and-take that characterizes the relationship between moderating behavior and student engagement in a synchronous computer conferencing learning environment.

2.6 Summary

Synchronous computer conferencing is being increasingly implemented in many higher education courses delivered or supported by electronic means (Keynes, 2003). There is a need to understand conferencing in depth in order to provide convincing evidence about the learning that is taking place and how knowledge construction occurs. As Fahy et al. (2001, p. 2) state: "Practitioners and researchers must be able to describe on-line interaction more than impressionistically and measure more effectively than anecdotally." Socio-cultural learning theory supports the role of dialogue and group learning, and the role of a moderator - performed by the instructor - in helping to bridge and extend the zone of proximal development (ZPD) through peer collaborative learning (Zhang, 2004). Synchronous conferencing may facilitate collaborative learning and delayed reflection (Lin, Hmelo, Kinzer, & Secules, 1999; Zhang, 2004). Yet learners face many challenges during online collaboration, owing to the characteristics of this communication technology and its less traditional approach to collaborative learning. Various forms of scaffolding can be provided during the course of online collaboration, and moderation is required to achieve smooth, effective online collaborative learning. Research strongly recommends that structuring, moderating, and scaffolding efforts be provided in peer online collaboration to enhance student engagement, since all learning requires engagement to attain mastery (Zhang, 2004). This study will empirically test the effect of different levels of teacher moderating on different aspects of student engagement: behavioral engagement, social-emotional engagement, and intellectual engagement.
CHAPTER 3
METHOD

This chapter outlines the history of the study, the variables and the measurement of each variable, the research questions, the research context, the research design, data collection, the statistical analysis procedures for the quantitative analysis, the qualitative analysis procedures, and the null hypotheses. Units of analysis and inter-rater reliability will also be addressed.

3.1 The Origin of the Current Design: Pilot Studies

This dissertation research builds on a prior research project and a pilot study. The prior research project was a case study analysis of discussion transcripts, which led to the development of a thread theory, an analytical framework for analyzing synchronous computer conferencing transcripts (Shi, Mishra, Bonk, Tan, & Zhao, under review; Shi & Tan, 2003). That research focused on decrypting the interaction patterns of collaborative discourse and on the teacher's role in such discussions. The project led to reflections on and questions about the nature of student engagement and the role of the teacher in online conferences. In turn, this initial work resulted in a pilot study that formed the basis of this dissertation research.

In the pilot study, the researcher analyzed a small data set, similar to that used in this dissertation. First, the researcher adapted and developed the constructs, indicators, and protocols from the literature as well as the theoretical frameworks. Second, the derived indicators and categories, with their protocols, were applied deductively to the data for preliminary coding in order to test the applicability of the indicators, categories, and protocols. Third, based on local observations (Cronbach, 1975), the categories, indicators, and protocols were modified in accordance with the preliminary coding and preliminary analysis (Shi, Mishra, & Bonk, 2004). Some preliminary findings from the pilot analyses indicated that differences in student engagement occur when group discussions are moderated at different levels. However, the effect of teacher moderating on different aspects of student engagement appeared to vary. Having discovered these trends in a small-scale data set, this researcher decided to examine them in more detail.

As mentioned in Chapter One, this dissertation addresses three key issues. First, there is the need to measure the different aspects of learner engagement with the substantive processes of engaged collaborative discourse and the relationships between and among these aspects of learner engagement. Second, there is the need to understand what factors contribute to student learning, particularly what role teacher moderators play in enhancing student learning through engaged collaborative discourse. Third, there is the need to better understand the nature and dynamics of moderated synchronous group discussion per se.

Much research has examined the effects of collaborative learning on individual learning outcomes; however, researchers have argued that valid assessment of collaborative learning, practically (Dillenbourg, 1999) and theoretically (Perkins, 1993), should look at group - rather than individual - achievement. This study was primarily interested in group performance; it did not investigate individual learning performance. Consequently, all variables were measured in terms of group performance rather than individual performance.
3.2 Research Context

The study was conducted within a synchronous, online, three-credit, university-level undergraduate course entitled "Interpersonal Communications and Relationships." The course was designed to enhance students' understanding of effective communication behavior and to improve their abilities to attend to verbal and non-verbal communication with others, exchange constructive feedback, engage in effective problem-solving, address and deal constructively with conflict, and communicate across such differences as gender, class, and race (Lobel et al., 2002a). In each class session, small "breakout rooms" were used for group discussion that followed a highly detailed agenda.

This course was delivered online, in real time, to 32 students in the fall semester of 2002. The course consisted of eleven consecutive three-hour weekly sessions, taught from 7:00 to 10:00 p.m. EST on Wednesday evenings between September 8 and December 11, 2002. The medium - a synchronous, online eClassroom available over the Internet and designed specifically for experiential "learning-by-doing" pedagogy - used a real-time, interactive, HTML-formatted text, image, and animated messaging system. The eClassroom, consisting of a main "room" and four "breakout rooms" for small eGroup experiential eActivities and eDiscussions, was password-protected, monitored, and archived. Most students logged into the eClassroom from their homes. Text- and image-based lecture materials were posted to the eClassroom in real time, and the "LearningByDoing" eGroup activities offered in this medium facilitated learning through practice and discussion (Lobel et al., 2002a).

One principal instructor and three eGroup co-facilitators/moderators staffed the eClassroom. The virtual classroom consisted of a public main "room" where the whole class met to receive the course content during the first part of each session. When each content delivery session ended, students broke up into groups and attended one of the four "breakout rooms" for small online group activities and discussions during the second part of the session, which usually lasted about an hour. The 32 student participants in the course were randomly divided into four discussion groups at the beginning of the semester. Each small group had a discussion moderator. Importantly, this arrangement remained unchanged for the entire semester (11 weeks). Students wrote weekly eJournals, which served as an asynchronous component of the eCourse; these were e-mailed weekly to their eGroup co-facilitator and the principal instructor for comments on learning progress. All eClassroom activities and interactions took place online, in real time. There were no face-to-face interactions between the students and the instructors during the 11 weeks of the eCourse (Lobel et al., 2002a).

3.3 Data Collection

The prime data source for this study consisted of 44 automatically archived conference transcripts from the online course, each with an average of 350 postings (altogether 44 x 350 = 15,400 messages - a far more comprehensive data set than has been used in previous studies in the literature). In order to better understand the context within which these discussions worked and to help triangulate research results (Patton, 2002), the following additional sources of data were collected:

(1) Field notes. Preliminary analysis of some transcripts from a similar data set suggested that the researcher should use participant observation strategies (Spradley, 1980) to observe the actual online discussions unfolding, thus affording a better understanding of the dynamics of the discussions.
The researcher took field notes while observing each conference - both in the main room and in the breakout rooms. A word-processor window was opened alongside the conferencing windows. The researcher observed the discussions and took notes on what was happening; every 10 minutes, the researcher inserted a separator to mark the passage of time. At the end of each session, the researcher spent a few minutes reflecting on the entire conversation and quickly noted any significant ideas or themes that seemed appropriate. The researcher also conducted participant observation and wrote field notes during the one-hour online meetings of the teaching team held immediately before every class. Throughout these meetings, the teaching staff worked through the class agenda and made sure everyone was "on the same page" about the next class session.

(2) The 11 archived transcripts of the main room (the "public" area) were also examined to better understand the context of the discussions in the breakout rooms.

(3) All class materials were collected. These included the course syllabus, course readings, the classroom activity agendas (developed by the teaching team and delivered to each teaching staff member once a week, two days before class), records of class preparation (the one-hour online meetings of the teaching staff immediately before class), and all of the course assignments. These data were used to help define the context of each conference.

3.4 Variables and Their Measures

The variables in this study fall into two major categories: teacher moderating levels and student engagement. Each of these is further divided into sub-categories, as explained below. Although this is described in greater detail later (in Section 4.1.1, Combining Variables), these subdivisions were based on theoretical and statistical concerns, and after data collection and analysis some of the sub-categories were combined. The numeric value of each variable is a frequency of occurrence. Figure 1 provides a schematic view of all the variables and indicators used prior to re-categorization, and each of them is discussed in greater detail below.

Figure 1. Variables and Their Subcategories. (The figure is a tree diagram showing Teacher Moderating - Number of Teacher Postings (T), Rating of Teacher Moderating Levels (R) - and Student Engagement - Behavioral Engagement: Attending (A), Participation (P); Social-emotional Engagement: Warmth, Cohesion, SE-Interactive; Intellectual Engagement: Higher-order Thinking (H) with Initiation, Negotiation, and Integration, and Interactivity (I) with Declarative, Initiation with a Question, Reactive, and Interactive.)

3.4.1 Teacher Moderating Levels

The literature shows that both the quantity and the quality of teacher moderating matter. Therefore, two dimensions were used to measure teacher moderating levels: (1) the number of teacher postings and (2) a rating of teacher moderating levels. Each of these measures is discussed below. The number of teacher postings is a tally of teacher postings in each transcript. The rating of teacher moderating levels was measured with a model developed by Xin (2002) (Appendix C). In this model, at the minimal level of moderating (Level 1) the moderator opens the discussion, establishes the computer conferencing agenda, and observes conference norms.
At the high end of moderating (Level 5), the moderator strongly weaves and summarizes participants' ideas in addition to performing all of the lower-level moderating functions. A value of 1, 2, 3, 4, or 5 is assigned to each of the subcategories from Level 1 through Level 5 in this study. The rating of teacher moderating levels for each transcript is the sum of the products of level value and level tally, divided by the total number of messages posted by the moderator; the value falls between 1 and 5. The formula is:

R = (1*T1 + 2*T2 + 3*T3 + 4*T4 + 5*T5) / (T1 + T2 + T3 + T4 + T5)

where 1, 2, 3, 4, and 5 are the values assigned to each level, and T1 through T5 are the numbers of teacher postings coded at each level. An illustrative computation of R is sketched at the end of Section 3.4.2.2.

3.4.2 Student Engagement Variables and Their Measures

Student engagement was measured by means of three indicators (sub-constructs): Behavioral Engagement, Social-emotional Engagement, and Intellectual Engagement.

3.4.2.1 Behavioral Engagement

In his model of school learning, Carroll (1963) first proposed the idea of engagement rate, measured as the percentage of the allocated time that students are actually involved in the learning process. The nature of engagement online and the contextual features of an online curriculum have their own characteristics; nevertheless, being invested in the conferencing - attending and being attentive - is the basic requirement of any collaborative learning process. Based on earlier studies in the literature and the more recent studies by Lobel et al. (2002a, 2002b) and Neubauer and Lobel (2003), the researcher defined Behavioral Engagement in a computer conference as a phenomenon in which participants are attentively participating in the collaborative discourse (Lobel et al., 2002a). Within this framework, there are two aspects of student behavioral engagement: Attending and Participation. Attending is "listening" and Participation is "talking." Following Lobel et al. (2002a, 2002b), Attending is quantified as the frequency with which participants actively poll the server for the data generated since their last request. Participation is defined as the state of being related to a larger whole and is quantified as the number of messages containing a communication sent by participants.

3.4.2.2 Social-emotional Engagement

While behavioral engagement is vital to the outcomes of online learning environments - synchronous and asynchronous - the social-emotional engagement of learners is also indicative of student involvement in collaborating in a community of critical inquiry. For the purpose of this research, Social-emotional Engagement is defined as the phenomenon that occurs when students see themselves as part of a group rather than as individuals and therefore make efforts to build cohesion, acquire a sense of belonging, and render mutual support (Rogers, 1970; Rourke et al., 1999; Rourke & Anderson, 2002). Social-emotional engagement is essential to knowledge construction in that it makes group interactions appealing and thus intrinsically rewarding. This, in turn, leads to an increase in academic, social, and institutional integration and results in increased persistence (Anderson & Kanuka, 1997; Garrison et al., 2000; Kanuka & Anderson, 1998; Rourke et al., 1999). Importantly, Brookfield and Preskill (1999) note that critical thinking is facilitated by the social-emotional support of others. Establishing a community of inquiry is therefore closely related to social and emotional interaction and support.

It is not uncommon in computer conferencing to experience a series of superficially related monologues rather than contextualized and personalized dialogues or common-logues (Xin, 2002). This is an indication that participants are not social-emotionally engaged in the conferencing. Given the reliance of computer conferencing on the written word, the establishment of a community of inquiry requires greater attention to social and emotional interaction and support (Garrison et al., 2000). This research adapted the model used by Rourke et al. (1999) and Garrison et al. (2000) to measure social-emotional engagement (shown in Appendix B). The three indicators of social-emotional engagement are Warmth, Cohesion, and Interactive. Warmth includes the use of emoticons (Rourke et al., 1999), complimenting or expressing appreciation, expressing identical feeling, humor, and self-disclosure. Cohesion is exemplified by activities that build and sustain a sense of group commitment and is defined by three indicators: phatics and salutations, vocatives, and addressing the group as "we," "us," "our," or "group" (Bussmann, 1998; Rourke et al., 1999). Interactive includes referring or responding to previous messages. Short et al. (1976) identify "evidence that the other is attending" as a critical feature in the promotion of socially meaningful interaction. Eggins and Slade (1997) add that responses and rejoinders serve several beneficial purposes in conversation: they build and sustain relationships, express a willingness to maintain and prolong contact, and tacitly indicate interpersonal support, encouragement, and acceptance of the initiator (Rourke et al., 1999).
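To make the moderating-level rating defined in Section 3.4.1 concrete, the minimal sketch below computes R from per-level tallies of teacher postings. The function name and the example tallies are illustrative assumptions, not part of Xin's (2002) instrument or of the tools used in this study.

```python
def moderating_rating(level_counts):
    """Weighted mean of moderating levels for one transcript.

    level_counts: five tallies [T1, T2, T3, T4, T5], where Tk is the number of
    teacher postings coded at moderating level k (Xin, 2002).
    Returns R, which falls between 1 and 5.
    """
    if len(level_counts) != 5 or sum(level_counts) == 0:
        raise ValueError("Expected five per-level tallies with at least one posting.")
    weighted = sum(level * count for level, count in zip(range(1, 6), level_counts))
    return weighted / sum(level_counts)

# Hypothetical transcript dominated by Level-2 moderating moves.
print(round(moderating_rating([10, 30, 12, 3, 1]), 2))  # 2.2
```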
It is not uncommon in computer conferencing to experience a series of superficially related monologues rather than contextualized and personalized dialogues or common-logues (Xin, 2002). This is an indication that participants are not social- emotionally engaged in the conferencing. Given the reliance of computer conferencing on the written word, the establishment of a community of inquiry requires greater attention to social and emotional interaction and support (Garrison et al., 2000). This research adapted the model used by Rourke et al (1999) and Garrison et al. (2000) to measure social-emotional engagement (Shown in Appendix B). The three indicators of social-emotional engagement are Warmth, Cohesion, and Interactive. Warmth includes the use of emoticons (Rourke, et al.,1999), complimenting or expressing appreciation, expressing identical feeling, humor, and self-disclosure. Cohesion is exemplified by activities that build and sustain a sense of group commitment. It is defined by three indicators: phatics and salutations, vocatives, and addressing the group as “we”, “us”, “our”, and “group” (Bussmann, 1998; Rourke et al, 1999). Interactive includes referring or responding to previous messages. Short et al. (1976) 27 identify “evidence that the other is attending” as a critical feature in the promotion of socially meaningful interaction. Eggins and Slade (1997) add that responses and rejoinders serve several beneficial purposes in conversation. They build and sustain relationships, express a willingness to maintain and prolong contact, and tacitly indicate interpersonal support, encouragement, and acceptance of the initiator (Rourke, et al., 1999,). 3.4.2.3 Intellectual Engagement Intellectual engagement is the third indicator of student engagement in the collaborative discourse of computer conferencing. Intellectual Engagement is defined as the phenomenon in which participants interact and debate not only with each other but also, together or as individuals, reflect deeply on the issues of the prevailing task or subject matter and undergo cognitive change and grth through engaging in this process (Xin, 2002). Intellectual engagement in a community of inquiry indicates that students are involved in knowledge construction and meaning confirmation through sustained discourse and negotiation (Garrison, et al., 2000). In effect, they are not simply acquiring new knowledge and applications skills, but are negotiating with peers and instructors in deep levels of critical thinking and justification of reasoning (Newman et al., 1997). Higher-order thinking and interactivity are consequently considered key indicators of student Intellectual Engagement. Higher-order Thinking Higher-order Thinking requires sustained critical discourse where dissonance and problems are resolved through the full cycle of the critical thinking process. This 28 generally occurs through interaction between the publicly shared world and the privately shared world (Duffy, Deuber & Hawley, 1998; Garrison, et al., 2000; Xin, 2002). Critical thinking is defined and described by many researchers as a process of achieving understanding and intellectual advancement through logical inquiry or reasoning, critical evaluation, problem solving, and decision making (Garrison, et al., 2000, 2001; Lipman, 1991; Newman, et al, 1997). This process, however, can only be precipitated by interactions that are challenging and/or critical. 
As Brown and Palincsar (1989) noted, "change does not occur when pseudo-consensus, conciliation, or juxtaposed centrations are tolerated" (p. 409). Contradictory perspectives disturb learners' initial impressions of the content and prompt them to process it more thoroughly. Combining and adapting Xin's (2002) coding scheme for engaged collaborative discourse with Garrison et al.'s (2001) coding scheme for cognitive presence, this researcher used the rubric in Appendix C to measure Higher-order Thinking. According to Xin (2000), Harasim (1990), and Garrison et al. (2000), the first phase of critical inquiry is problem initiation and brainstorming, the second phase is problem exploration and investigation, and the third phase is problem solution and integration of ideas. The frequency and percentage of messages in each phase must be tallied and compared for accurate and useful measurement. An ideal conferencing session in which students are intellectually engaged is presumed to be one in which the collaborative inquiry moves through the full cycle of the critical thinking process, from problem initiation to problem exploration, and then on to problem solution and idea integration (Garrison et al., 2000; Xin, 2002). The focus here is on a collaborative process of critical inquiry; that is, the focal point is the higher-order thinking process through collaborative discourse rather than any particular student's individual performance (Garrison et al., 2001; Newman et al., 1997).

Interactivity

Critical thinking and inquiry are not purely personal reflective processes hidden and internal to the mind. Critical thinking can be found at the intersection of (1) personal reflection and deliberation and (2) shared understanding and group negotiation of meaning (Garrison et al., 2000). Shared understanding is manifested, at least in part, by interactivity. According to Rafaeli and Sudweeks (1996), interactivity is "the extent to which messages in a sequence relate to each other, and especially the extent to which later messages recount the relatedness of earlier messages" (p. 3). This study's interactivity measure is adapted from Rafaeli and Sudweeks (1996) and Sarlin, Geisler, and Swan (2003). The coding scheme (Appendix D) is as follows: a declarative message (an initiation or a new idea, a new line of thought, etc.) is coded as Declarative. A question or comment that is unrelated to prior posts is coded as Initiation with a Question (a category added by the researcher to the original coding scheme developed by Rafaeli and Sudweeks, 1996). A response by one poster to a declarative message of another poster is coded as Reactive. Any further follow-up within such an exchange - a response to a response - is coded as Interactive. An ideal conferencing session will be one in which interactive message types outnumber the other kinds (Li, 2001, 2002).

3.5 Research Questions

The quantitative portion of this study was designed to investigate the factors that have an effect on student Intellectual Engagement in a community of inquiry developed by means of synchronous computer conferencing. The qualitative portion explores the process of collaborative meaning construction by the teacher moderator and the group. As delineated in Section 3.4 of this chapter, seven variables have been identified, two of which are related to teacher moderating levels and five of which relate to student engagement (shown in Figures 1, 2, and 3).
The trends and patterns of variation in all seven variables, both over time and across groups, were examined first (Research Questions 1 and 2, respectively). The relationships between and among these seven variables were then investigated in order to identify the critical factors that influence Higher-order Thinking and Interactivity (Question 3). In a separate chapter - Chapter 6 - the meaning construction process under the leadership of teacher moderators (Question 4) is examined.

As shown in the research design and variables (Figures 3 and 4), the relationships between and among these variables are complicated. After several rounds of trial analysis, this researcher arrived at a way of organizing the relationships into four steps that lead to the answer to the core question. These four steps become the four sub-questions of Question 3. The rationale for taking these steps and the procedures for doing so are discussed in Chapter 5, where the results of the quantitative analyses are interpreted. The research questions and sub-questions are as follows:

Research Question 1: Do the seven variables of teacher moderating levels (T and R) and student engagement (A, P, S, H, and I) change over time?

Research Question 2: Are the seven variables different across groups?

Research Question 3: Quantitatively, are these variables related to each other? What variables are related to student Intellectual Engagement (H and I)?
3a. Is student Social-emotional Engagement related to all the other variables?
3b. How do students connect to each other through Behavioral Engagement (A and P)? What is the relationship between Attending and Participation?
3c. Are teacher moderating levels related to student Behavioral Engagement?
3d. What variables are related to student Intellectual Engagement?

Research Question 4: Qualitatively, what does the process of collaborative meaning construction look like? What is the transactional nature of the relationship between teacher moderating levels and student engagement?

Having discussed the variables and research questions, this chapter proceeds to a description of the research design.

3.6 Research Design

3.6.1 Mixed Method

This research applies a mixed method approach - a combination of qualitative and quantitative methodologies. Its quantitative methods involve converting communication content into discrete units and calculating the frequency of occurrence of each unit. They also extend the descriptive results of content analysis to inferential hypothesis testing (Borg & Gall, 1989; Rourke et al., 2001), which is intended to test the relationship between teacher moderating levels and student engagement. The qualitative methodology takes an iterative approach to the development of the constructs. Using the quantitative results as a guide, typical transcripts are selected for further examination. Recursive processing of field notes, course agendas, and other course materials provides rich, "thick" description of the collaborative discourse of the community of critical inquiry (Firestone, 1993; Patton, 2002).

3.6.2 Transcript Analysis

Transcripts of online class group discussions are often the most obvious and easily accessible source of data available for research in computer-mediated communication (Romiszowski & Mason, 1996; Riel & Harasim, 1993).
Rourke and others have argued (Rourke et al., 2001) that these transcripts hold many educational treasures regarding learning in the online environment, treasures that can be released through appropriate content analysis. This research utilizes a technique called quantitative content analysis. (In this study, the researcher uses the terms "content analysis" and "transcript analysis" interchangeably, although they can have different meanings in other contexts where the content being analyzed may not consist of transcripts of online discussion.) Quantitative content analysis was defined by Riffe, Lacy, and Fico (1998) as:

"The systematic and replicable examination of symbols of communication, which have been assigned numeric values according to valid measurement rules using statistical methods, in order to describe communication, draw inferences about its meaning, or infer from the communication to its context, both of production and consumption." (p. 22)

Analysis of synchronous computer conferencing transcripts provides a way to decrypt the interactional patterns of group discussion in order to understand the learning process of the individuals who participate in the discussion. It also elicits data useful for gauging the efficacy of interaction among instructors and students. The analysis of computer conference transcripts can also shed light on how the collaborative learning process can be supported, sustained, or hindered (Henri & Rigault, 1996). Only when we have a better understanding of what is happening in computer conferencing can we offer specific suggestions about how to make use of this medium for learning (Bruce, 1993, 1997; Henri, 1992). This understanding comes only from a finer-grained analysis of the content of the conferencing.

3.6.3 Units of Analysis

An important step in assigning data to categories is determining the units of analysis. Rourke et al. (2001) identified the five most commonly used units of analysis in computer conferencing research: proposition units, sentence units, paragraph units, thematic units, and message units. Of these five, thematic units are the most commonly used by researchers, and message units are reported to be the most practical. Given the brevity of synchronous conferencing messages, this researcher used individual messages as the basic unit of analysis. A message unit is operationally defined as a posted message that is automatically numbered by the system.

3.6.4 Coding Process

Following Xin's (2002) experience, the basic stages of the transcript analysis are divided into "before coding," "coding," and "post coding." The steps taken within each stage are described below.

(1) Before Coding Stage. Following Xin's scheme, the Before Coding Stage mainly allows for coding preparation. The researcher first reviewed all the supporting data, including the course syllabus, course readings, the agenda for each class session, and the field notes taken during participant observation while the course was taught. These data provided a broader context for the transcripts of the conferencing under analysis in the Coding Stage.

(2) Coding Stage. The Coding Stage involved building tree nodes and the actual coding itself. The researcher used the qualitative data analysis software QSR NVIVO 2.0 to manage and organize the coding of an extremely large data set: 44 transcripts, each consisting of about 350 messages. NVIVO allowed the organization of nodes into hierarchical clusters called tree nodes.
Nodes were developed both from the rubrics and by paying attention to categories emerging from the data (Miles & Huberman, 1994). Based on the rubrics for each construct, the researcher built an NVIVO tree with ninety major nodes drawn from the coding rubrics, plus eight free nodes (shown in Figure 2). The eight free nodes were added as "supplementary" nodes when, during trial coding, new categories were found that had not been included in any of the coding schemes. In this research, NVIVO was used mainly as an organizer, despite its usefulness for data analysis; ultimately, the researcher made message-coding decisions manually.

(3) Post Coding Stage. During the Post Coding Stage, the researcher focused on three tasks. The first task was to export the coding results from NVIVO to SPSS for statistical analysis. The second was to normalize the data. Because the length of the conferences varied, comparisons and analyses could not validly be made directly from the raw data. The researcher normalized all the variables and their subcategories to 90-minute conference units, so that the analyses and results were based on conference units of the same duration; the 90-minute block was derived from the mean length of all 44 conferences, which is 93.47 minutes (an illustrative computation of this rescaling appears after Figure 2). The third Post Coding Stage task was to decide which levels of subcategories would be used in assigning variables for statistical analysis. These decisions were made on the basis of statistical results and relevant theories; the rationale and procedures for combining subcategories are discussed in Chapter 4, Section 4.1.1.

Figure 2. NVIVO Tree Nodes. (The figure shows the node hierarchy, which mirrors the constructs: student engagement - behavioral engagement (attending, participation), social-emotional engagement (warmth, cohesion, SE-interactive), and intellectual engagement (interactivity: interactive, declarative, reactive, initiation with solicitation; higher-order thinking: initializing, negotiation and co-construction, integration) - and teacher moderating levels (Level 1 through Level 5).)
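The normalization step described in the Post Coding Stage can be illustrated with a minimal sketch. The function and argument names are illustrative assumptions; in the study itself, the coded frequencies were exported to SPSS, and only the 90-minute target duration below comes from the procedure reported above.

```python
def normalize_to_90_minutes(raw_count, conference_minutes, target_minutes=90):
    """Rescale a raw frequency so conferences of different lengths are comparable.

    raw_count: observed frequency of a variable (e.g., teacher postings) in one
    conference; conference_minutes: the actual duration of that conference.
    """
    if conference_minutes <= 0:
        raise ValueError("Conference duration must be positive.")
    return raw_count * (target_minutes / conference_minutes)

# Example: 72 teacher postings in a 105-minute conference,
# expressed per 90-minute conference unit.
print(round(normalize_to_90_minutes(72, 105), 2))  # 61.71
```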
3.6.5 Inter-rater Reliability

The overriding concern of many educational researchers is whether or not computer conferencing can facilitate higher-order thinking outcomes; educational theorists are beginning to regard these not as overt products or manifest variables, but rather as covert products or latent variables (Rourke et al., 2001). Many educational researchers (including this researcher) are more interested in grappling with important (though hidden) facets of individual and social cognition than in assessing merely that which is most easily measured (Rourke et al., 2001). The Behavioral Engagement variables - Attending and Participation - are manifest variables, while the Social-emotional Engagement and Intellectual Engagement variables are all latent variables.

The current study used transcript analysis involving both extensive and intensive coding of latent variables; therefore, the reliability issue must be specifically addressed. Inter-rater reliability was addressed by asking another rater to code the same transcripts and sections of transcripts as the researcher and, through discussion, to arrive at acceptable reliability - the .70 cut-off point for interpretable and replicable research proposed by Riffe, Lacy, and Fico (1998) (Rourke & Anderson, 2002). The coding decisions of the two coders were then evaluated for inter-rater reliability. Separate inter-rater reliability procedures were conducted for teacher moderating levels and for student engagement.

For teacher moderating levels, the researcher and the other rater each coded the same 6 transcripts (about 13.6 percent of the whole) in three rounds: coding 2 transcripts, then comparing and discussing; coding another 2, comparing and discussing; and coding the last 2, comparing and discussing. The researcher used the simplest and most common method of reporting inter-rater reliability: the percentage agreement statistic (Rourke et al., 2001, p. 11). This statistic reflects the number of agreements per total number of coding decisions. The final percentage agreement between the two raters was 73 percent (the total number of decisions was 1,980).

For student engagement, 11 transcripts were selected from the 44, and 80 messages were selected from each transcript - from the beginning, the middle, and the end. The researcher and the other rater coded all selected sections of discussion. The percentage agreement of the overall scores for all student engagement variables was 61 percent (the total number of decisions was 880 messages). While this seems low, Riffe et al. (1998) maintain that research that is breaking new ground with concepts rich in analytical value may proceed with reliability levels somewhat below the conventional range. This leniency is based on the premise that some measures taken to increase reliability may simultaneously reduce the value of the results, or, in Krippendorff's words, "reliability often gets in the way of validity" (1980, p. 130). Rourke et al. (2001) state that "it is premature to declare a conventional level of acceptability" and that "we feel that the mere act of reporting these figures gives readers sufficient information to interpret results" in communication research (p. 12).
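As an illustration of the percentage agreement statistic used above, the sketch below compares two coders' decisions over the same message units. The category labels and example decisions are hypothetical; only the statistic itself (agreements divided by total coding decisions) follows the procedure described in this section.

```python
def percent_agreement(coder_a, coder_b):
    """Proportion of coding decisions on which two raters agree.

    coder_a, coder_b: equal-length sequences of category codes, one entry per
    message unit (the unit of analysis used in this study).
    """
    if len(coder_a) != len(coder_b) or not coder_a:
        raise ValueError("Coders must rate the same, non-empty set of messages.")
    agreements = sum(a == b for a, b in zip(coder_a, coder_b))
    return agreements / len(coder_a)

# Hypothetical codes for eight messages from two raters.
rater_1 = ["warmth", "cohesion", "interactive", "warmth",
           "cohesion", "warmth", "interactive", "cohesion"]
rater_2 = ["warmth", "cohesion", "interactive", "cohesion",
           "cohesion", "warmth", "reactive", "cohesion"]
print(percent_agreement(rater_1, rater_2))  # 0.75
```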
3.7 Data Analysis

Statistical analyses were used to examine each of the first three research questions; Chapter 4 is devoted to their investigation. Using the quantitative analysis results as a guide, typical transcripts or sections of transcripts were selected for a closer look by means of a qualitative case study method. Field notes, course syllabi, course readings, classroom activity agendas, and other materials that provide rich context for each transcript were used to triangulate the results (Creswell, 2003). The qualitative analysis is designed to answer Question 4, and Chapter 6 is devoted to its investigation.

The quantitative and qualitative methods used in the analysis supplemented each other. Compared to previous studies using transcript (content) analysis, the 44 transcripts formed a large data set, and its statistical analysis could provide the field with broad numeric trends and patterns in the interactions of computer conferences. Detailed qualitative analysis revealed the complicated nature of the conferencing process under particular circumstances and within local contexts (Peshkin, 2000). The qualitative analysis provided insights into the emerging patterns of the possibly transactional nature of teacher moderating behaviors and student engagement. Participant observation supplied "insider information" on the lived experience of participants in the learning environment (Creswell, 2003; Howe & Eisenhart, 1990). While the quantitative method revealed broad patterns of conferencing activities, the qualitative method facilitated local clarification through observation, description, and interpretation of the features of the interactions and the roles of teachers, peers, and tasks (Firestone, 1993; Pellegrino & Goldman, 2002).

To summarize, this chapter has outlined the research context, data collection, research design, and data analysis for this study, and has provided details about the variables and their measures. Detailed data analysis procedures and results are presented in Chapters 4 and 6.

CHAPTER 4
PRESENTATION AND ANALYSIS OF QUANTITATIVE DATA

This study was designed to investigate what factors contribute to student Intellectual Engagement in the collaborative discourse of a community of inquiry through the medium of computer conferencing. The researcher approached this question by disentangling the relationships between teacher moderating levels and student engagement variables and the relationships among the student engagement variables. Before dealing with the major question, the study investigated how these factors changed over time and across groups. This chapter presents the statistical analysis of the data related to the first three major research questions and their sub-questions. The fourth research question, which is related to the qualitative analyses, is examined in Chapter 6. The first section of this chapter discusses the manner in which some of the variables described in Chapter 3 were fine-tuned (and in certain cases combined), followed by a description of the data analysis procedures. The second section presents the data analysis processes and results.

4.1 Statistical Analysis Procedures

4.1.1 Combining Variables

As shown in Figure 1, each variable has layers of subcategories. Some variables were combined in order to best organize and manage the data analysis and to enable comparison of discussion transcripts.

4.1.1.1 Combining Social-emotional Engagement

Statistical analysis shows that the three indicators of student Social-emotional Engagement are highly correlated: the correlations were significant at the .01 level (two-tailed test), and the Pearson product-moment correlations between any two of the three variables range from .729 to .852. It is therefore reasonable to combine these three subcategories into a single category, Social-emotional Engagement. Since the three subcategories were mutually exclusive, the values of the three variables were summed to form the student Social-emotional Engagement variable.
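The correlation check that justifies this combination can be sketched as follows. The tallies below are hypothetical placeholders (the study's actual pairwise correlations fell between .729 and .852); numpy's corrcoef is used as a convenient stand-in for the SPSS computation.

```python
import numpy as np

# Hypothetical per-transcript tallies for the three Social-emotional indicators.
warmth   = np.array([40, 25, 55, 30, 62, 48, 20, 35])
cohesion = np.array([55, 38, 70, 41, 80, 60, 28, 50])
se_inter = np.array([30, 22, 45, 25, 52, 38, 18, 29])

# Pairwise Pearson correlations among the three indicators.
corr = np.corrcoef(np.vstack([warmth, cohesion, se_inter]))
print(np.round(corr, 3))

# If the indicators are highly correlated, sum them into one S score per transcript.
s = warmth + cohesion + se_inter
print(s)
```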
4.1.1.2 Combining Higher-order Thinking

To best organize and manage the data analysis and its interpretation, the researcher explored ways to combine Higher-order Thinking, one of the two subcategories of Intellectual Engagement (the other being Interactivity). As shown in Figure 1 (and Appendix C), Higher-order Thinking consists of three subcategories: (1) Initiation, (2) Negotiation, and (3) Integration. While higher-order thinking involves the whole process of problem initiation, meaning negotiation, and integration, the purpose of a discussion is to enlarge common ground; that is, to reach integration/agreement or to "agree to disagree." Therefore, Integration should carry the most weight among the three subcategories of Higher-order Thinking (Xin, 2002). Nevertheless, the frequency of Integration in discussion is always much smaller than that of Negotiation and Initiation (Anderson et al., 2001), and this was also the case in the data collected for the present study.

To emphasize the significance of Integration in Higher-order Thinking and to make comparisons across transcripts applicable, it is necessary to linearly adjust the scale of Integration upward by multiplying it by a constant, symbolized by ConstH in this study. One way to accomplish this is to adjust the scale of Integration to that of the simple sum of the three subcategories. Since the frequencies of Initiation and Negotiation can legitimately represent their own significance, it is reasonable to add them directly. The combined Higher-order Thinking score is therefore the sum of Initiation, Negotiation, and the adjusted Integration. Expressed as a formula:

H = Ini + Neg + Int * ConstH

where H stands for Higher-order Thinking; Ini for Initiation; Neg for Negotiation; Int for Integration; and ConstH for the constant of Integration scale adjustment. H, Ini, Neg, and Int are the frequencies of the Higher-order Thinking, Initiation, Negotiation, and Integration measures, respectively. ConstH is the maximum value, across transcripts, of the sum of the three subcategories divided by the value of Integration:

ConstH = MAX[(Ini + Neg + Int) / Int]

The constant of Integration scale adjustment (ConstH) in this study - calculated from the above formula - is 18.11. The Integration measure is thus linearly adjusted to the scale of the simple sum of the three subcategories. Though expressed as a formula, the emphasis here is on trends, not on the specific accuracy of the frequency of each subcategory.

4.1.1.3 Combining Interactivity

As shown in Figure 1 (and Appendix D), Interactivity is measured in terms of four subcategories, which must be combined in order to make comparisons of analysis results across transcripts applicable. Critical thinking and inquiry are not purely personal reflective processes hidden and internal to the mind; shared understanding is manifested, at least in part, by interactivity. It is not uncommon in computer conferencing to experience a series of superficially related monologues rather than contextualized and personalized dialogues or common-logues (Xin, 2002) - an indication that participants are not engaged in the conferencing. Declarative messages therefore contribute the least to interactivity. An ideal conferencing session will be one in which there are more interactive message types than other kinds. Interactive messages should carry the most weight among the subcategories of Interactivity, but their frequency is (again) always the smallest. To emphasize the significance of Interactive messages in Interactivity and to make comparisons across transcripts applicable, it is necessary to linearly adjust the scale of the Interactive measure upward by multiplying it by a constant, symbolized by ConstI in this study.
One way to accomplish this is to adjust the scale of the Interactive measure to the simple sum of the three subcategories Interactive, Reactive, and Initiation with a Question, minus Declarative messages, by multiplying the Interactive measure by a constant. This is expressed in the formula:

I = (In + Re + IQ - De) + In * ConstI

where I stands for Interactivity; In for Interactive; Re for Reactive; IQ for Initiation with a Question; De for Declarative; and ConstI for the constant of Interactivity scale adjustment. I, In, Re, IQ, and De are the frequencies of the Interactivity, Interactive, Reactive, Initiation with a Question, and Declarative measures, respectively. ConstI is the maximum value, across transcripts, of the sum of the first three variables minus the Declarative measure, divided by the value of the Interactive measure:

ConstI = MAX[(In + Re + IQ - De) / In]

The constant of Interactivity scale adjustment in this study - calculated from the above formula - is 2.67. The Interactive measure is thus linearly adjusted to the scale of the simple sum of the four subcategories. Although expressed as a formula, the emphasis is on trends, not on the accuracy of the frequency of each subcategory. Below are the variables after combination (cf. Figure 1); a computational sketch of these combinations follows Figure 5.

Figure 3. All Variables after Combination. (The figure shows Teacher Moderating - Number of Teacher Postings (T), Rating of Teacher Moderating Levels (R) - and Student Engagement - Behavioral Engagement (Attending (A), Participation (P)), Social-emotional Engagement (S), and Intellectual Engagement (Higher-order Thinking (H), Interactivity (I)).)

Figure 4. Overview of the Research Design and Data Analysis. (The figure shows the data organized as a Week 1-11 by Group 1-4 matrix of conferences, with each of the seven variables - T, R, A, P, S, H, and I - measured for every conference.)

Figure 5. An Example: the Descriptive Statistics of the Number of Teacher Postings (T-table). (The figure shows the normalized Number of Teacher Postings for each of the four groups in each of the 11 weeks.)
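To make the combination rules of Section 4.1.1 concrete, the minimal sketch below computes the combined Social-emotional Engagement (S), Higher-order Thinking (H, with ConstH), and Interactivity (I, with ConstI) scores from per-transcript tallies, with both constants taken as maxima over the data set, exactly as in the formulas above. All field names and the two example transcripts are hypothetical, and this is not the code used in the study.

```python
def combine_engagement(transcripts):
    """Combine sub-category tallies per Section 4.1.1.

    transcripts: list of dicts with per-transcript frequencies for warmth,
    cohesion, se_interactive (S); ini, neg, integ (H); and interactive,
    reactive, init_q, declarative (I). Assumes every transcript has at least
    one Integration and one Interactive message.
    """
    const_h = max((t["ini"] + t["neg"] + t["integ"]) / t["integ"] for t in transcripts)
    const_i = max(
        (t["interactive"] + t["reactive"] + t["init_q"] - t["declarative"]) / t["interactive"]
        for t in transcripts
    )
    combined = []
    for t in transcripts:
        s = t["warmth"] + t["cohesion"] + t["se_interactive"]
        h = t["ini"] + t["neg"] + t["integ"] * const_h
        i = (t["interactive"] + t["reactive"] + t["init_q"] - t["declarative"]
             + t["interactive"] * const_i)
        combined.append({"S": s, "H": round(h, 2), "I": round(i, 2)})
    return combined

# Two hypothetical transcripts, for illustration only.
data = [
    {"warmth": 40, "cohesion": 55, "se_interactive": 30,
     "ini": 20, "neg": 85, "integ": 6,
     "interactive": 12, "reactive": 60, "init_q": 18, "declarative": 25},
    {"warmth": 25, "cohesion": 38, "se_interactive": 22,
     "ini": 15, "neg": 60, "integ": 4,
     "interactive": 9, "reactive": 45, "init_q": 12, "declarative": 30},
]
print(combine_engagement(data))
```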
4.1.2 The Analysis

To answer the core question - what factors have an effect on student Intellectual Engagement - the student Intellectual Engagement variables (Higher-order Thinking and Interactivity) were treated as dependent variables, and all the remaining variables became independent variables. To answer the core question, it was vital to disentangle the complicated relationships among the independent variables. This meant that, among the independent variables, some would in turn be treated as independent and some as dependent; thus, depending on the question, a variable could be either dependent or independent. Each variable was measured for 11 weeks across four groups, as shown in Figures 4 and 5. Figure 4 offers an overview of the research data organization, while Figure 5 provides an example of descriptive statistics related to one variable - the Number of Teacher Postings for each of the four groups over the 11 weeks.

Repeated measures analysis of variance (ANOVA) was used to test whether there were significant differences in teacher moderating levels and student engagement over time; this addresses Research Question 1. A multivariate analysis of variance (MANOVA) was applied, and then univariate analyses of variance (ANOVAs) were used to test whether there were any significant differences in teacher moderating levels and student engagement across groups; this addresses Research Question 2. Regression analyses were conducted to investigate the relationships between and among variables in order to identify what factors are related to student Intellectual Engagement; these regressions answer Question 3. A computational sketch of the repeated measures procedure follows Section 4.1.3. The research questions had to be "converted" into null hypotheses, and the following section discusses the null hypotheses that guide the quantitative analyses in this chapter.

4.1.3 Null Hypotheses

The study focused on four questions (cited in Chapter 3, Section 3.5). The first set of analyses deals with changes in teacher moderating levels and student engagement over time (Question 1); this question was "converted" into 7 null hypotheses. The second deals with changes in teacher moderating levels and student engagement across groups (Question 2); this question was also "converted" into 7 null hypotheses. The third deals with relationships among the variables (Question 3); this question was "converted" into 21 null hypotheses. The answers to each of the three research questions are investigated by testing these 35 null hypotheses. The answer to Question 4 is examined in Chapter 6.
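For readers who wish to reproduce the style of test reported in the next section, the sketch below runs a repeated measures ANOVA on a long-format table with group as the repeated "subject" and week as the within-subjects factor, mirroring the 4-group by 11-week design (and hence the degrees of freedom of 10 and 30 reported below). The values are random placeholders rather than the study's measures, and the use of pandas and statsmodels' AnovaRM is an assumption of convenience; the study's analyses were carried out in SPSS.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Long-format layout mirroring the design: 4 groups x 11 weeks, one normalized
# score per group per week. The values here are random placeholders.
rng = np.random.default_rng(0)
records = [
    {"group": g, "week": w, "T": rng.normal(60, 15)}
    for g in ["Group1", "Group2", "Group3", "Group4"]
    for w in range(1, 12)
]
data = pd.DataFrame(records)

# Repeated measures ANOVA: does the measure change over the weeks?
result = AnovaRM(data, depvar="T", subject="group", within=["week"]).fit()
print(result)
```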
4.2 Statistical Analysis Process and Results

4.2.1 Mean Differences of Variables over Weeks

This section investigates the changes in teacher moderating levels and student engagement over time, which provides the results that answer Research Question 1. The null hypotheses to be tested are as follows:

Ho 1.1 Number of Teacher Postings does not change over weeks;
Ho 1.2 Rating of Teacher Moderating Levels does not change over weeks;
Ho 1.3 Attending does not change over weeks;
Ho 1.4 Participation does not change over weeks;
Ho 1.5 Social-emotional Engagement does not change over weeks;
Ho 1.6 Higher-order Thinking does not change over weeks; and
Ho 1.7 Interactivity does not change over weeks.

To test the null hypotheses of no difference across time points (weeks), repeated measures analysis of variance (ANOVA) was applied. The level of significance (alpha) was set at .05. The individual results are shown below.

Ho 1.1: The Number of Teacher Postings (T) does not change over weeks

The first null hypothesis tests the assumption that the quantity of teacher moderating does not change over time. The mean Number of Teacher Postings (T) for each week was calculated by averaging the measures of the 4 groups. These means are presented in Table 1. The mean difference in the Number of Teacher Postings (T) as determined by the repeated measures ANOVA was significant (F(10, 30) = 6.42; shown in Table 2). This led to the rejection of the null hypothesis. It can be concluded that over time, the changes in the Number of Teacher Postings were statistically significant.

Table 1. Mean and Standard Deviation (SD) of Number of Teacher Postings (T) over Weeks

           N    Mean    SD
T-Week1    4    25.31    8.35
T-Week2    4    87.11   14.60
T-Week3    4    55.77    7.96
T-Week4    4    82.49   13.16
T-Week5    4    75.46   21.74
T-Week6    4    69.00   19.48
T-Week7    4    42.79    3.54
T-Week8    4    66.13   21.22
T-Week9    4    79.88    9.28
T-Week10   4    52.13   11.70
T-Week11   4    58.14   28.25

Table 2. Repeated Measures ANOVA for Number of Teacher Postings (T) over Weeks

Source      df    SS         MS        F
Week (T)    10    13876.45   1387.65   6.42*, p<.05
Error       30     6483.00    216.10

Ho 1.2: Rating of Teacher Moderating Levels (R) does not change over weeks

This null hypothesis tests the assumption that the quality of teacher moderating does not change over time. The mean Rating of Teacher Moderating Levels (R) for each week was calculated by averaging the measures of the 4 groups. These means are presented in Table 3. The mean difference in the Rating of Teacher Moderating Levels as determined by the repeated measures ANOVA was not significant (F(10, 30) = 1.84, p = .095; shown in Table 4). The null hypothesis, therefore, was retained. It can be concluded that over time, the changes in the Rating of Teacher Moderating Levels were not statistically significant.

Table 3. Mean and Standard Deviation (SD) of Rating of Teacher Moderating Levels (R) over Weeks

           N    Mean   SD
R-Week1    4    2.17   0.15
R-Week2    4    2.38   0.17
R-Week3    4    2.24   0.09
R-Week4    4    2.43   0.14
R-Week5    4    2.40   0.29
R-Week6    4    2.45   0.30
R-Week7    4    2.38   0.22
R-Week8    4    2.43   0.24
R-Week9    4    2.50   0.46
R-Week10   4    2.35   0.20
R-Week11   4    2.53   0.24

Table 4. Repeated Measures ANOVA for Rating of Teacher Moderating Levels (R) over Weeks

Source      df    SS     MS     F
Week (R)    10    .436   .044   1.84, p>.05
Error       30    .710   .024

Ho 1.3: Attending (A) does not change over weeks

This null hypothesis tests the assumption that the frequency of students "listening" to others does not change over time. The mean of Attending (A) for each week was calculated by averaging the Attending measures of the 4 groups. These means are presented in Table 5. The mean difference in Attending over weeks as determined by the repeated measures ANOVA was significant (F(10, 30) = 9.05; shown in Table 6). This led to the rejection of the null hypothesis. It can be concluded that over time, the changes in Attending were statistically significant.

Table 5. Mean and Standard Deviation (SD) of Attending (A) over Weeks

           N    Mean      SD
A-Week1    4     635.17   173.19
A-Week2    4    1316.25   287.68
A-Week3    4    1168.62   103.85
A-Week4    4    2034.75   561.31
A-Week5    4    2017.57   513.46
A-Week6    4    1839.33   242.54
A-Week7    4     840.00   205.55
A-Week8    4    1651.36   161.52
A-Week9    4    2202.62   692.71
A-Week10   4    1360.15   328.75
A-Week11   4    1075.62   287.18

Table 6. Repeated Measures ANOVA for Attending (A) over Weeks

Source      df    SS            MS           F
Week (A)    10    10802595.71   1080259.57   9.05*, p<.05
Error       30     3579318.07    119310.60

Ho 1.4: Participation (P) does not change over weeks

This null hypothesis tests the assumption that the frequency of students' "talk" does not change over time. The mean of Participation (P) for each week was calculated by averaging the Participation measures of the 4 groups. These means are presented in Table 7. The mean difference in Participation over weeks as determined by the repeated measures ANOVA was significant (F(10, 30) = 6.54; shown in Table 8). This led to the rejection of the null hypothesis. It can be concluded that over time, the changes in Participation were statistically significant.
Mean and Standard Deviation (SD) of Participation (P) over Weeks N Mean SD P- Week 1 4 194.01 50.21 P- Week 2 4 372.86 79.10 P- Week 3 4 276.97 37.85 P- Week 4 4 363.81 44.68 P- Week 5 4 360.88 69.38 P- Week 6 4 387.67 58.77 P- Week7 4 221.18 31.12 P- Week 8 4 323.24 98.92 P- Week 9 4 424.88 95.23 P- Week 10 4 280.15 13.51 P- Week 11 4 354.42 69.71 Table 8. Repeated Measure ANOVA for Participation (P) over Weeks Source df SS MS F Week (P) 10 Error 30 208363.56 95571.06 20836.36 6.54*, p<.05 3185.70 52 H0 1.5: Social-emotional Engagement (S) does not change over weeks The null hypothesis tests the assumption that student Social-emotional Engagement does not change over time. The mean of Social-emotional Engagement (S) for each week was calculated by averaging Social-emotional Engagement measures of the 4 groups. These means are presented in Table 9. The mean difference of the Social-emotional Engagement (S) over weeks as determined by the repeated measure ANOVA was not significant (F 10, 3o=1.84, p=.096) (Shown in Table 10). The null hypothesis was, therefore, retained. It can be concluded that the changes in Social-emotional Engagement over weeks were not statistically significant, Table 9. Mean and Standard Deviation (SD) of Social-emotional Engagement (S) over Weeks N Mean SD S-Weekl 4 203.92 51.59 S-Week2 4 220.82 107.94 S-Week3 4 102.18 12.67 S-Week4 4 247.63 100.25 S-Week5 4 213.80 70.25 S-Week6 4 133.17 78.50 S-Week7 4 136.47 43.16 S-Week8 4 182.54 155.55 S-Week9 4 132.00 51.86 S-WeeklO 4 143.78 26.99 S-Weekl l 4 208.27 63.65 Table 10. Repeated Measure ANOVA for Social-emotional Engagement (S) over Weeks Source df SS MS F Week (S) 10 88961.25 8896.13 1.84, p>.05 Error 30 144899.63 4829.99 53 H0 1.6: Higher-order Thinking (H) does not change over weeks The null hypothesis tests the assumption that student Higher-order Thinking does not change over time. The mean of Higher-order Thinking for each week was calculated by averaging the Higher-order Thinking measures of the 4 groups. These means are presented in Table 11. The mean difference of the Attending over weeks as determined by the repeated measure ANOVA was significant (F 10, 30:12.87) (Shown in Table 12). This led to the rejection of the null hypothesis. It can be concluded that over time, the changes in Higher-order Thinking over weeks were statistically significant. Table 11. Mean and Standard Deviation (SD) of Higher-order Thinking (H) over Weeks N Mean SD H-Weekl 4 243.97 186.21 H-Week2 4 923.78 189.88 H-Week3 4 634.99 291.87 H-Week4 4 1036.24 161.95 H-Week5 4 996.62 240.94 H-Week6 4 1246.45 421 .08 H-Week7 4 610.21 115.29 H-Week8 4 533.51 120.29 H-Week9 4 1007.79 1 19.13 H-Weekl 0 4 567.57 174.97 H-Weekll 4 1552.28 337.32 Table 12. Repeated Measure ANOVA for Higher-order Thinking (H) over Weeks Source df SS MS F Week (II) 10 555133612 555133.61 12.87*, p<.05 Error 30 129432980 43144.33 54 Ho 1.7: Interactivity (I) does not change over weeks The null hypothesis tests the assumption that student Interactivity does not change over time. The mean of Interactivity (I) for each week was calculated by averaging the Interactivity measures of the 4 groups. These means are presented in Table 13. The mean difference of the Interactivity over weeks as determined by the repeated measure ANOVA was significant (F 10, 30=10.32) (Shown in Table 14). This led to the rejection of the null hypothesis. It can be concluded that overtime, the changes in Interactivity were statistically significant. Table 13. 
Mean and Standard Deviation (SD) of Interactivity (I) over Weeks

             N    Mean      SD
I-Week 1     4     38.95    23.53
I-Week 2     4    214.07    43.70
I-Week 3     4    203.62    52.65
I-Week 4     4    186.12    56.68
I-Week 5     4    163.27    42.21
I-Week 6     4    317.17    62.65
I-Week 7     4    145.15    27.10
I-Week 8     4    176.98    44.69
I-Week 9     4    331.13    80.24
I-Week 10    4    172.87    11.97
I-Week 11    4    198.92    52.89

Table 14. Repeated Measures ANOVA for Interactivity (I) over Weeks

Source      df    SS           MS          F
Week (I)    10    250579.01    25057.90    10.32*, p<.05
Error       30     72877.11     2429.24

Summary

Here is the summary of the null hypotheses tested.

Null Hypothesis                                                           Finding
Ho 1.1  Number of Teacher Postings does not change over weeks             Rejected
Ho 1.2  Rating of Teacher Moderating Levels does not change over weeks    Failure to reject (p=.095)
Ho 1.3  Attending does not change over weeks                              Rejected
Ho 1.4  Participation does not change over weeks                          Rejected
Ho 1.5  Social-emotional Engagement does not change over weeks            Failure to reject (p=.096)
Ho 1.6  Higher-order Thinking does not change over weeks                  Rejected
Ho 1.7  Interactivity does not change over weeks                          Rejected

The Number of Teacher Postings (T) changed significantly over weeks, while the change in the Rating of Teacher Moderating Levels (R) did not reach a significant level. Among the student engagement variables (A, P, S, H, I), only Social-emotional Engagement (S) did not change significantly over weeks; all the others did.

4.2.2 Mean Differences of Variables across Groups

This section investigates the changes in teacher moderating levels and student engagement across groups, which provides the results for research question #2. As stated in Chapter 3, the null hypotheses tested are as follows:

Ho 2.1 Number of Teacher Postings does not change across groups;
Ho 2.2 Rating of Teacher Moderating Levels does not change across groups;
Ho 2.3 Attending does not change across groups;
Ho 2.4 Participation does not change across groups;
Ho 2.5 Social-emotional Engagement does not change across groups;
Ho 2.6 Higher-order Thinking does not change across groups; and
Ho 2.7 Interactivity does not change across groups.

To test the mean differences across groups for all variables, a multivariate analysis of variance (MANOVA) was performed on all of the dependent variables simultaneously. Levene's Test of Equality of Error Variances showed that the variances of the dependent variables were homogeneous. Because the MANOVA showed significant differences among the groups, with Pillai's Trace significant at alpha .05 (F(3, 24) = 2.91), separate univariate ANOVAs were performed to determine which specific dependent variables differed significantly. Again, the level of significance (alpha) was set at .05. The individual ANOVA results are shown below; a brief sketch of this two-step procedure follows the first test.

Ho 2.1: The Number of Teacher Postings (T) does not change across groups

This null hypothesis tests the assumption that the quantity of teacher moderating does not change across groups. The mean Number of Teacher Postings (T) for each group was calculated by averaging the measures over the 11 weeks. These means are presented in Table 15. The mean difference in the Number of Teacher Postings (T), as determined by ANOVA, was not significant (F(3, 40) = 1.36, p = .267; shown in Table 16). The null hypothesis, therefore, was retained. It can be concluded that the differences in the Number of Teacher Postings across groups were not statistically significant.

Table 15. Mean and Standard Deviation (SD) of the Number of Teacher Postings (T) across Groups

             N     Mean     SD
T-Group 1    11    51.49    21.91
T-Group 2    11    69.09    24.05
T-Group 3    11    64.77    20.27
T-Group 4    11    67.09    23.81
Total        44    63.11    22.85

Table 16. ANOVA Table for Number of Teacher Postings (T) across Groups

Source       df    SS          MS        F
Group (T)     3     2083.33    694.44    1.36, p>.05
Error        40    20359.45    508.99

* Levene's statistic = .37, p = .78
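The two-step procedure just described, Levene's test and a MANOVA followed by univariate ANOVAs, can be sketched as follows. This is a hedged illustration only; the file name weekly_measures.csv and the column names are assumptions, not the study's actual materials.

    import pandas as pd
    import statsmodels.api as sm
    from scipy import stats
    from statsmodels.formula.api import ols
    from statsmodels.multivariate.manova import MANOVA

    df = pd.read_csv("weekly_measures.csv")   # hypothetical file; 44 group-week rows

    # Levene's test of equality of error variances for one dependent variable.
    groups = [g["teacher_postings"].to_numpy() for _, g in df.groupby("group")]
    print(stats.levene(*groups))

    # MANOVA on all seven dependent variables simultaneously; the output
    # includes Pillai's trace for the group effect.
    manova = MANOVA.from_formula(
        "teacher_postings + rating + attending + participation + "
        "social_emotional + higher_order + interactivity ~ C(group)",
        data=df)
    print(manova.mv_test())

    # Follow-up univariate ANOVA for a single dependent variable.
    fit = ols("teacher_postings ~ C(group)", data=df).fit()
    print(sm.stats.anova_lm(fit, typ=1))      # F(3, 40), as in Table 16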
Ho 2.2: Rating of Teacher Moderating Levels (R) does not change across groups

This null hypothesis tests the assumption that the quality of teacher moderating does not change across groups. The mean Rating of Teacher Moderating Levels (R) for each group was calculated by averaging the measures over the 11 weeks. These means are presented in Table 17. The mean difference in the Rating of Teacher Moderating Levels (R), as determined by ANOVA, was significant (F(3, 40) = 14.95, p < .05; shown in Table 18). This led to the rejection of the null hypothesis. It can be concluded that the differences in the Rating of Teacher Moderating Levels across groups were statistically significant.

Table 17. Mean and Standard Deviation (SD) of Rating of Teacher Moderating Levels (R) across Groups

             N     Mean    SD
R-Group 1    11    2.30    0.15
R-Group 2    11    2.35    0.13
R-Group 3    11    2.23    0.14
R-Group 4    11    2.67    0.23
Total        44    2.39    0.24

Table 18. ANOVA Table for Rating of Teacher Moderating Levels (R) across Groups

Source       df    SS      MS      F
Group (R)     3    1.29    0.43    14.95*, p<.05
Error        40    1.15    0.03

* Levene's statistic = .80, p = .50

Ho 2.3: Attending (A) does not change across groups

This null hypothesis tests the assumption that the frequency with which students "listen" to others does not change across groups. The mean of Attending (A) for each group was calculated by averaging the Attending measures over the 11 weeks. These means are presented in Table 19. The mean difference in Attending (A), as determined by ANOVA, was not significant (F(3, 40) = .85, p = .473; shown in Table 20). The null hypothesis, therefore, was retained. It can be concluded that the differences in Attending across groups were not statistically significant.

Table 19. Mean and Standard Deviation (SD) of Attending (A) across Groups

             N     Mean       SD
A-Group 1    11    1665.63    699.74
A-Group 2    11    1331.44    506.51
A-Group 3    11    1545.66    668.26
A-Group 4    11    1326.88    495.41
Total        44    1467.40    596.54

Table 20. ANOVA Table for Attending (A) across Groups

Source       df    SS             MS           F
Group (A)     3      920140.53    306713.51    .85, p>.05
Error        40    14381913.77    359547.84

* Levene's statistic = .75, p = .53

Ho 2.4: Participation (P) does not change across groups

This null hypothesis tests the assumption that the frequency of students' "talk" to others does not change across groups. The mean of Participation (P) for each group was calculated by averaging the Participation measures over the 11 weeks. These means are presented in Table 21. The mean difference in Participation (P), as determined by ANOVA, was not significant (F(3, 40) = 1.77, p = .168; shown in Table 22). The null hypothesis, therefore, was retained. It can be concluded that the differences in Participation across groups were not statistically significant.

Table 21. Mean and Standard Deviation (SD) of Participation (P) across Groups

             N     Mean      SD
P-Group 1    11    306.81    87.76
P-Group 2    11    287.37    95.96
P-Group 3    11    368.48    82.87
P-Group 4    11    331.90    81.35
Total        44    323.64    89.49

Table 22.
AN OVA table for Participation (P) across Groups Source df SS MS F Group (P) 3 40449.33 13483.11 1.77, p>.05 Error 40 303934.62 7598.37 * Levene's statistic = .10, p = .96 Ho 2.5: Social-emotional Engagement does not change across groups The null hypothesis tests the assumption that student Social-emotional Engagement does not change over time. The mean of Social-emotional Engagement (S) across groups was calculated by averaging the measure of Social-emotional Engagement over 11 weeks. These means are presented in Table 23. The mean difference of the Social-emotional Engagement (S) as determined by ANOVA was significant (F 3, 40=3.57, p=.022) (Shown in Table 24). This led to the rejection of the null hypothesis. 61 It can be concluded that across groups, the changes in Social-emotional Engagement were statistically significant. Table 23. Mean and Standard Deviation (SD) of Social-emotional Engagement (S) across Groups N Mean SD S-Groupl 11 185.78 59.58 S-Group 2 11 1 1 1.73 49.68 S-Group 3 11 211.11 98.94 S-Group 4 11 191.23 87.06 Total 44 174.96 83.03 Table 24. ANOVA table for Social-emotional Engagement (S) across Groups Source df SS MS F Group (S) 3 62550.26 20850.09 3.57*, p<.05 Error 40 233860.88 5846.52 "‘ Levene's statistic = 1.21, p = .32 Ho 2.6: Higher-order Thinking (H) does not change across groups The null hypothesis tests the assumption that student Higher-order Thinking does not change across groups. The mean of Higher-order Thinking (H) across groups was calculated by averaging the measure of Higher-order Thinking over 11 weeks. These means are presented in Table 25. The mean difference of Higher-order Thinking (H) as determined by ANOVA was not significant (F 3, 40=1.01, p=.399) (Shown in Table 26). The null hypothesis was therefore retained. 62 It can be concluded that across groups, the changes in Higher-order Thinking were not statistically significant. Table 25. Mean and Standard Deviation (SD) of Higher-order Thinking (H) across Groups N Mean SD H-Groupl 1 1 699.29 318.55 H-Group 2 11 894.18 429.43 H—Group 3 11 812.92 341.38 H-Group 4 11 994.85 531.17 Total 44 850.31 413.80 Table 26. ANOVA table for Higher-order Thinking (H) across Groups Source df SS MS F Group (H) 3 517256.47 172418.82 1.01,p>.05 Error 40 684566593 171141.65 * Levene's statistic = .81, p = .50 Ho 2.7: Interactivity does not change across groups The null hypothesis tests the assumption that student Interactivity (I) does not change across groups. The mean of Interactivity (1) across groups was calculated by averaging the measure of Interactivity (I) over 11 weeks. These means are presented in Table 27. The mean difference of Interactivity (I) as determined by ANOVA was not significant (F3, 4o=.65, p=.591) (Shown in Table 28). The null hypothesis was therefore retained. 63 It can be concluded that across groups, the changes in Interactivity were not statistically significant. Table 27. Mean and Standard Deviation (SD) of Interactivity (1) across Groups I-Groupl I-Group 2 I-Group 3 I-Group 4 Total N 11 11 11 11 44 Mean 270.85 325.23 344.74 356.45 324.32 SD 139.63 137.95 137.06 201.73 154.58 Table 28. 
ANOVA Table for Interactivity (I) across Groups

Source       df    SS           MS          F
Group (I)     3     47400.81    15800.27    .65, p>.05
Error        40    980079.45    24501.99

* Levene's statistic = .12, p = .95

Summary

Here is the summary of the null hypotheses tested.

Null Hypothesis                                                             Finding
Ho 2.1  Number of Teacher Postings does not change across groups            Failure to reject (p=.27)
Ho 2.2  Rating of Teacher Moderating Levels does not change across groups   Rejected
Ho 2.3  Attending does not change across groups                             Failure to reject (p=.47)
Ho 2.4  Participation does not change across groups                         Failure to reject (p=.16)
Ho 2.5  Social-emotional Engagement does not change across groups           Rejected
Ho 2.6  Higher-order Thinking does not change across groups                 Failure to reject (p=.40)
Ho 2.7  Interactivity does not change across groups                         Failure to reject (p=.59)

Among the teacher moderating variables (T and R), the Rating of Teacher Moderating Levels changed significantly across groups, while the Number of Teacher Postings did not. Among the student engagement variables (A, P, S, H, and I), only Social-emotional Engagement (S) changed significantly across groups; none of the other student engagement variables (A, P, H, I) reached a significant level.

4.2.3 Relationships Between and Among Variables

This section investigates the relationships among teacher moderating levels and student engagement variables, as well as the relationships among the student engagement variables themselves. Since these relationships have many dimensions, they are investigated in four separate sections:

3a. Relationships between Social-emotional Engagement and all the other variables;
3b. The relationship within the student Behavioral Engagement variables;
3c. The relationship between Teacher Moderating Levels and the student Behavioral Engagement variables; and finally,
3d. The influence of each variable on student Intellectual Engagement.

To test the null hypotheses regarding the relationships between and among variables, regression analyses were applied. If the linear regression showed a significant contribution, the analysis was complete. If the linear relationship was not significant, a nonlinear regression was applied in order to find the lowest-order relationship between the variables mentioned in the hypotheses. Customarily, linear, quadratic, and cubic relationships were taken into consideration. Again, the alpha level was set at .05. All the results are shown below.

3a. Relationships between Social-emotional Engagement and all other variables

The researcher first looked at the relationships between Social-emotional Engagement and each of the other variables. As stated in Chapter 3, the null hypotheses tested include:

Ho 3a.1 There is no relationship between the Number of Teacher Postings and student Social-emotional Engagement;
Ho 3a.2 There is no relationship between the Rating of Teacher Moderating Levels and student Social-emotional Engagement;
Ho 3a.3 There is no relationship between Attending and Social-emotional Engagement;
Ho 3a.4 There is no relationship between Participation and Social-emotional Engagement;
Ho 3a.5 There is no relationship between Higher-order Thinking and Social-emotional Engagement;
Ho 3a.6 There is no relationship between Interactivity and Social-emotional Engagement.

Each null hypothesis is tested, sequentially, in the following discussion.

Ho 3a.
1: There is no relationship between the number of Teacher Postings and student Social-emotional Engagement. The null hypothesis tests the assumption that the quantity of teacher Moderating levels has no effect on student Social-emotional Engagement. Linear regression analysis demonstrated that the Number of Teacher Postings (T) did not have a significant effect on student Social-emotional Engagement (S) (t=.25, p=.807). Thus the null hypothesis was 66 retained. This led to the conclusion that the Number of Teacher Postings (T) was not linearly related to student Social-emotional Engagement (Shown in Table 29). Although no significant linear relationship between the Number of Teacher Postings (T) and student Social-emotional Engagement (S) was demonstrated, a nonlinear regression was applied in order to verify this result. The nonlinear regression results are shown in Table 30. The nonlinear regression results demonstrated that none of the three types of relationships (linear, quadratic, and cubic) reached a significant level. From the results, it can be reasonably concluded that the quantity of teacher postings is not significantly related to student Social-emotional Engagement. Table 29. Linear regression results of Number of Teacher Postings (T) and Student Social-emotional Engagement (S) B Standard Error t Intercept 166.28 37.56 4.43 T 0.14 0.56 0.25 R2=.001 p>.05, Dependent Variable: Social-emotional Engagement Table 30. Nonlinear Regression Results of the Number of Teacher Postings (T) and Student Social-emotional Engagement (8) 2 Relationship R df F Sig. Quadratic .004 41 .09 .912 Cubic .047 40 .65 .585 67 H0 38.2: There is no relationship between Rating of Teacher Moderating Levels (R) and student Social-emotional Engagement. The null hypothesis tests the assumption that the quality of teacher Moderating levels has no effect on student Social-emotional Engagement. Linear regression results demonstrated that the Rating of Teacher Moderating Levels (R) did not have a significant effect on student Social-emotional Engagement (S) (t=.01, p=.995). The null hypothesis was, therefore, retained and the conclusion reached that the Rating of Teacher Moderating Levels (R) was not linearly related to student Social-emotional Engagement (Shown in Table 31). Although no significant linear relationship between the Teacher Moderating Levels and student Social-emotional Engagement was discovered, a nonlinear regression analysis was applied in order to verify this result. The nonlinear regression results are shown in Table 32. The nonlinear regression results demonstrated that of the three types of relationships (linear, quadratic, and cubic) none reached a statistically significant level. From these results, it can be reasonably concluded that quality of Teacher Moderating Levels is not significantly related to student Social-emotional Engagement. Table 31. Linear Regression Results of Rating of Teacher Moderating Levels (R) and Social-emotional Engagement (S) B Standard Error t Intercept 174.15 129.25 1.35 R .342 53.87 .01 R2=.000 p>.05, Dependent Variable: Social-emotional Engagement 68 Table 32. Nonlinear Regression Results of the Rating of Teacher Moderating Levels (R) and Student Social-emotional Engagement (S) 2 Relationship R df F Sig. Quadratic .029 41 .60 .552 Cubic .034 41 .73 .490 H0 3a.3: There is no relationship between Attending and Social-emotional Engagement. The null hypothesis tests the assumption that Attending has no effect on student Social-emotional Engagement. 
Linear regression results demonstrated that Attending (A) has a significant effect on student Social-emotional Engagement (S) (t=2.16) (Shown in Table 33). This led to the rejection of the null hypothesis. From these results, it can be tentatively concluded that student Attending has an effect on student Social-emotional Engagement. Table 33. Linear Regression Results of Attending (A) and Social-emotional Engagement (3) B Standard Error t Intercept 108.04 32.26 3.35 A .04 .020 2.16* R2=.100 *p<.05, Dependent Variable: Social-emotional Engagement Ho 3a.4: There is no relationship between Social-emotional Engagement and Participation The null hypothesis tests the assumption that student Social-emotional Engagement has no effect on Participation. 69 Linear regression results indicates that Social—emotional Engagement (S) has a significant effect on Participation (P) (t=3.68) (Shown in Table 34). This led to the rejection of the null hypothesis. From these results, it is reasonable to conclude that student Social-emotional Engagement affects student Participation. That is to say, the more social-emotionally engaged students are, the more actively they will participate in the discussion. Table 34. Linear Regression Results of Student Social-emotional Engagement (S) and Participation (P) B Standard Error t Intercept 231.76 27.62 8.39 S .53 .14 3.68* R2=.244 *p<.05, Dependent Variable: Participation H0 3a.5: There is no relationship between Social-emotional Engagement and Higher- order Thinking. The null hypothesis tests the assumption that student Social-emotional Engagement has no effect on student Higher-order Thinking. Linear regression results indicate that Social-emotional Engagement (S) has no significant effect on Higher-order Thinking (H); t=.87, p=.389 (Shown in Table 35). Thus the null hypothesis was retained and the conclusion was reached that the Social- emotional Engagement (S) was not linearly related to student Higher-order Thinking. Although no significant linear relationship between the Social-emotional Engagement (S) and Higher-order Thinking (H) was discovered, a nonlinear regression analysis was applied in order to verify this result. The nonlinear regression results are 70 shown in Table 36. The nonlinear regression results demonstrated that none of the three types of relationships (linear, quadratic, and cubic) reached significant level. From these results, it can be reasonably concluded that student Social-emotional Engagement had no significant effect on student Higher-order Thinking. Table 35. Linear Regression Results of Social-emotional Engagement (S) and Higher- order Thinking (H) B Standard Error t Intercept 735.87 145.60 5.05 S 0.66 0.76 0.87 R2=.018 p>.05, Dependent Variable: Higher-order Thinking Table 36. Nonlinear Regression Results of Social—emotional Engagement (S) and Higher- order Thinking (H) Relationship R (If F Sig. Quadratic .022 41 .46 .632 Cubic .066 40 .94 .433 H0 3a.6: There is no relationship between Social-emotional Engagement and Interactivity. The null hypothesis tests the assumption that student Social-emotional Engagement has no effect on student Interactivity. Linear regression results indicate that student Social-emotional Engagement (S) has no significant effect on Interactivity (I); t=.34, p=.733. The null hypothesis was, therefore, not rejected. Student Social-emotional Engagement (S) is not linearly related to student Higher-order Thinking (Shown in Table 37). 
Although no significant linear relationship could be established between Social-emotional Engagement (S) and Interactivity (I), further analysis using nonlinear regression was conducted. The nonlinear regression results are shown in Table 38. They make clear that none of the three types of relationships (linear, quadratic, cubic) reached a statistically significant level. It is logical to conclude from these data that Social-emotional Engagement (S) does not have a significant effect on student Interactivity.

Table 37. Linear Regression Results of Social-emotional Engagement (S) and Interactivity (I)

            B         Standard Error    t
Intercept   307.33    54.80             5.61
S           0.10      0.29              0.34

R2=.003, p>.05, Dependent Variable: Interactivity (I)

Table 38. Nonlinear Regression Results of Student Social-emotional Engagement (S) and Interactivity (I)

Relationship    R2      df    F      Sig.
Quadratic       .013    41    .27    .768
Cubic           .041    40    .57    .640

Summary

Ho 3a.1  There is no relationship between the Number of Teacher Postings and Social-emotional Engagement             Failure to reject (p=.81)
Ho 3a.2  There is no relationship between the Rating of Teacher Moderating Levels and Social-emotional Engagement    Failure to reject (p=1.0)
Ho 3a.3  There is no relationship between Attending and Social-emotional Engagement                                  Rejected
Ho 3a.4  There is no relationship between Social-emotional Engagement and Participation                              Rejected
Ho 3a.5  There is no relationship between Social-emotional Engagement and Higher-order Thinking                      Failure to reject (p=.39)
Ho 3a.6  There is no relationship between Social-emotional Engagement and Interactivity                              Failure to reject (p=.73)

Among all the relationships between Social-emotional Engagement (S) and the other variables explored in this study (T, R, A, P, H, and I), Social-emotional Engagement (S) had a significant effect only on Participation (P) and was significantly affected only by Attending (A). Neither the Number of Teacher Postings (T) nor the Rating of Teacher Moderating Levels (R) had any effect on student Social-emotional Engagement (S). Social-emotional Engagement (S), likewise, had no significant effect on either Higher-order Thinking (H) or Interactivity (I). This researcher concluded that Social-emotional Engagement influences student Intellectual Engagement through the Behavioral Engagement variables Attending and Participation. Chapter 5 will include further elaboration of this assumption.

3b. Relationship between Attending and Participation

Given the above findings, this researcher looked at how students were connected to one another in their online groups. They "listened" to others through Attending and they "talked" to others through Participation; Attending and Participation therefore became the link connecting students to one another. The research question became: "How is Attending related to Participation?" The resulting null hypothesis is:

Ho 3b There is no relationship between Attending and Participation.

The null hypothesis tests the assumption that student Attending has no effect on student Participation. Linear regression results indicate that Attending has a significant effect on Participation (t = 5.58; shown in Table 39, which is followed by a brief analysis sketch). The null hypothesis was rejected. It appears from these results that Attending has a significant effect on Participation. That is to say, the more students are attentive to what others are saying, the more actively they will participate in the discussion.

Table 39. Linear Regression Results of Attending (A) and Participation (P)

            B         Standard Error    t
Intercept   180.08    27.75             6.49
A           0.10      0.02              5.58*

R2=.425, *p<.05, Dependent Variable: Participation (P)
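The pairwise regression tests in sections 3a through 3d all take the same form: a simple linear regression with a t-test on the slope. The sketch below is an illustrative example of that step, not the study's original script; the file weekly_measures.csv and the column names are assumptions.

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("weekly_measures.csv")   # hypothetical file; 44 group-week rows

    # Simple linear regression of Participation on Attending (cf. Table 39).
    # The t statistic and p-value for the 'attending' slope decide Ho 3b.
    fit = smf.ols("participation ~ attending", data=df).fit()
    print(fit.summary())
    print("R2 =", round(fit.rsquared, 3),
          "t =", round(fit.tvalues["attending"], 2))

Replacing the two variable names in the formula reproduces the other pairwise tests reported in this chapter.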
Summary

Ho 3b  There is no relationship between Attending and Participation    Rejected

Within the student Behavioral Engagement variables, Attending had a significant effect on Participation.

3c. Relationship between Teacher Moderating Levels and Student Behavioral Engagement

How are teacher moderating levels related to student Behavioral Engagement? How are teacher moderating levels related to student Attending and Participation, respectively? Here are the four null hypotheses to be tested:

Ho 3c.1 There is no relationship between the Number of Teacher Postings and Attending;
Ho 3c.2 There is no relationship between the Rating of Teacher Moderating Levels and Attending;
Ho 3c.3 There is no relationship between the Number of Teacher Postings and Participation; and
Ho 3c.4 There is no relationship between the Rating of Teacher Moderating Levels and Participation.

Each hypothesis is examined in order.

Ho 3c.1: There is no relationship between the Number of Teacher Postings and Attending

The null hypothesis tests the assumption that the quantity of teacher moderating has no effect on student Attending. Linear regression results indicate that the Number of Teacher Postings (T) had a significant effect on Attending (A) (t = 2.96; shown in Table 40). The null hypothesis was rejected. The Number of Teacher Postings had a significant effect on student Attending; that is to say, the more the teacher posted, the more attentive students were.

Table 40. Linear Regression Results of Number of Teacher Postings (T) and Attending (A)

            B         Standard Error    t
Intercept   782.72    245.65            3.19
T           10.85     3.67              2.96*

R2=.173, *p<.05, Dependent Variable: Attending (A)

Ho 3c.2: There is no relationship between the Rating of Teacher Moderating Levels and Attending

The null hypothesis tests the assumption that the quality of teacher moderating has no effect on student Attending. Linear regression results demonstrated that the Rating of Teacher Moderating Levels had no significant effect on student Attending (t = .33, p = .807; shown in Table 41). The null hypothesis could not be rejected; the Rating of Teacher Moderating Levels (R), therefore, is not linearly related to student Attending (A). Although no significant linear relationship between the Rating of Teacher Moderating Levels (R) and Attending (A) was found, a nonlinear regression was also applied. The nonlinear regression results, shown in Table 42, indicate that none of the three types of relationships (linear, quadratic, and cubic) reached a statistically significant level. This leads to the conclusion that the quality of teacher moderating does not have a significant effect on student Attending.

Table 41. Linear Regression Results of the Rating of Teacher Moderating Levels (R) and Attending (A)

            B          Standard Error    t
Intercept   1168.01    926.94            1.26
R            125.40    386.38            0.33

R2=.003, p>.05, Dependent Variable: Attending (A)

Table 42. Nonlinear Regression Results of the Rating of Teacher Moderating Levels (R) and Attending (A)

Relationship    R2      df    F      Sig.
Quadratic       .041    41    .87    .428
Cubic           .041    41    .88    .422

Ho 3c.3: There is no relationship between the Number of Teacher Postings and Participation

The null hypothesis tests the assumption that the quantity of teacher moderating has no effect on student Participation.
Linear regression results showed that the Number of Teacher Postings (T) had a significant effect on Participation (P) (t=4.72) (Shown in Table 43). It seems clear from these data that the Number of Teacher Postings has a significant effect on student Participation. Specifically, the more actively the teacher posts, the more actively students are likely to participate in the discussion. 77 Table 43. Linear Regression Results of Number of Teacher Postings (T) and Participation (P) B Standard Error t Intercept 178.14 32.75 5.44 T 2.31 0.49 4.72“ R2=.346 *p<.05, Dependent Variable: Participation (P) 110 3c.4: There is no relationship between the Rating of Teacher Moderating Levels and Participation The null hypothesis tests the assumption that the quality of teacher moderating levels has no effect on student Participation. Linear regression results indicate that the Rating of Teacher Moderating Levels (R) had no significant effect on Participation (P) (t=.65, p=.517) (Shown in Table 44). Although no significant linear relationship between the Rating of Teacher Moderating Levels (R) and Participation (P) was found, a nonlinear regression was performed to “double-check” these results. The nonlinear regression results are displayed in Table 45. They show that none of the three types of relationships (linear, quadratic, and cubic) reached a statistically significant level. Therefore, the quality of teacher moderating levels does not have a significant effect on student Participation. Table 44. Linear Regression Results of Rating of Teacher Moderating Levels (R) and Participation (P) B Standard Error t Intercept 233.48 138.53 1.69 R 37.77 57.74 0.65 R2=.010 78 p>.05, Dependent Variable: Participation Table 45. Nonlinear Regression Results of Rating of Teacher Moderating Levels (R) and Participation (P) 2 Relationship R df F Sig. Quadratic .012 41 .25 .784 Cubic .01 1 41 .24 .790 Summary Here is the summary of the null hypotheses: There is no relationship between the Number of Teacher Rejected H0 3 0.1 Postings and Attending There is no relationship between the Rating of Teacher Failure to reject H0 3c.2 Moderating Levels and Attending. p=.8] There is no relationship between the Number of Teacher Rejected H0 30.3 Postings and Participation H0 3c 4 There is no relationship between the Rating of Teacher Failure to reject Moderating Levels and Participation. p=.52 The Number of Teacher Postings has a significant effect on student Behavioral Engagement - both Attending and Participation; the Rating of Teacher Moderating Levels has no significant effect on either Attending or Participation. 3d. What Is related to Student Intellectual Engagement - Overall Relationships The comprehensive question for this study is: what factors are related to student Intellectual Engagement? The relationship of each variable to Higher-order Thinking and Interactivity, respectively, are important to this discussion. The null hypotheses are listed below. 
79 H03d.1 There is no relationship between the Number of Teacher Postings and Higher-order Thinking; Ho 3d.2 There is no relationship between Rating of Teacher Moderating Levels and Higher-order Thinking; H0 3d.3 There is no relationship between Attending and Higher-order Thinking; Ho3d.4 There is no relationship between Participation and Higher-order Thinking; H0 3d.5 There is no relationship between Number of Teacher Postings and Interactivity; H03d.6 There is no relationship between Rating of Teacher Moderating Levels and Interactivity; H03d.7 There is no relationship between Attending and Interactivity; H0 3d.8 There is no relationship between Participation and Interactivity; H0 3d.9 There is no relation between the comprehensive factor and Higher-order Thinking; and finally, H0 3d.10 There is no relationship between the comprehensive factor and Interactivity. Ho 3d.1: There is no relationship between the Number of Teacher Postings and Higher- order Thinking The null hypothesis tests the assumption that the quantity of teacher Moderating levels has no effect on student Higher-order Thinking. Linear regression results indicate 80 that the Number of Teacher Postings (T) has a statistically significant effect on Higher- order Thinking (H) (t=4.55) (Shown in Table 46). The null hypothesis was rejected. The Number of Teacher Postings had significant effect on student Higher-order Thinking. That is to say, the more the teacher posted, the more likely Higher-order Thinking occurred. Table 46. Linear Regression Results of the Number of Teacher Postings (T) and Higher- order Thinking (H) B Standard Error t Intercept 193.95 153.38 1.27 T 10.40 2.29 4.55”“ R2=.330 *p<.05, Dependent Variable: Higher-order Thinking H03d.2: There is no relationship between the Rating of Teacher Moderating Levels and Higher-order Thinking The null hypothesis tests the assumption that the quality of teacher moderating levels has no effect on student Higher-order Thinking. Linear regression results indicate that the Rating of Teacher Moderating Levels (R) has a statistically significant effect on Higher-order Thinking (H) (t=2.99) (Shown in Table 47). The null hypothesis was rejected. The Rating of Teacher Moderating Levels is related to Higher-order Thinking. That is to say, the higher quality the teacher moderated, the more likely Higher-order Thinking occurred. 81 Table 47. Linear Regression Results of Rating of Teacher Moderating Levels (R) and Higher-order Thinking (H) B Standard Error t Intercept -890.20 584.50 -1.52 R 729.01 243.64 2.99* R2=.176 *p<.05, Dependent Variable: Higher-order Thinking H0 3d.3: There is no relationship between Attending and Higher—order Thinking The null hypothesis tests the assumption that student Attending is not related to student Higher-order Thinking. Linear regression results indicate that the effect of Attending on Higher-order Thinking did not reach a statistically significant level (t=1.88, p=.067) (Shown in Table 48). Although no significant linear relationship between Attending and Higher-order Thinking was discovered, a nonlinear regression was applied to fiirther investigate this relationship. The nonlinear regression results are shown in Table 49. Surprisingly, Attending has a quadratic relationship with Higher-order Thinking (F 1 , 41=3.62, p=.036) (Shown in Table 49 and Figure 6). Thus, there might be an optimal level of Attending in terms of student Higher- Order Thinking. This will be further explored in the discussion in Chapter 5. Table 48. 
Linear Regression Results of Attending (A) and Higher-order Thinking (H)

            B         Standard Error    t
Intercept   566.32    162.55            3.48
A           0.19      0.10              1.88

R2=.078, p>.05, Dependent Variable: Higher-order Thinking

Table 49. Nonlinear Regression Results of Attending and Higher-order Thinking

Relation     df    F        B0        B1        B2
Quadratic    41    3.62*    -34.81    1.0702    -.0003

R2=.150, *p<.05, Dependent Variable: Higher-order Thinking

Figure 6. A Quadratic Relationship Between Attending and Higher-order Thinking (scatter plot of Higher-order Thinking against Attending, with the fitted quadratic curve)

Ho 3d.4: There is no relationship between Participation and Higher-order Thinking

The null hypothesis tests the assumption that student Participation is not related to student Higher-order Thinking. Linear regression results indicated that Participation (P) has a statistically significant effect on Higher-order Thinking (H) (t = 4.93; shown in Table 50 and Figure 7). The null hypothesis was rejected. Participation has a significant effect on Higher-order Thinking; that is to say, the more actively students participate, the more likely Higher-order Thinking will occur.

Table 50. Linear Regression Results of Participation (P) and Higher-order Thinking (H)

            B         Standard Error    t
Intercept   -56.12    190.47            -0.30
P             2.80      0.57             4.93*

R2=.367, *p<.05, Dependent Variable: Higher-order Thinking

Figure 7. A Linear Relationship Between Participation and Higher-order Thinking (scatter plot of Higher-order Thinking against Participation, with the fitted line)

Ho 3d.5: There is no relationship between the Number of Teacher Postings and Interactivity

The null hypothesis tests the assumption that the quantity of teacher moderating is not related to student Interactivity. Linear regression results indicate that the Number of Teacher Postings (T) has a statistically significant effect on Interactivity (I) (t = 4.65; shown in Table 51). The null hypothesis was rejected. Thus, the Number of Teacher Postings (T) has a significant effect on Interactivity (I); in other words, the more the teacher posts, the higher the Interactivity.

Table 51. Linear Regression Results of Number of Teacher Postings (T) and Interactivity (I)

            B        Standard Error    t
Intercept   75.51    56.87             1.33
T            3.94     0.85             4.65*

R2=.340, *p<.05, Dependent Variable: Interactivity

Ho 3d.6: There is no relationship between the Rating of Teacher Moderating Levels and Interactivity

The null hypothesis tests the assumption that the quality of teacher moderating has no effect on student Interactivity. Linear regression results indicate that the Rating of Teacher Moderating Levels (R) has a statistically significant effect on Interactivity (I) (t = 2.63; shown in Table 52). The null hypothesis was rejected. The quality of teacher moderating has a significant effect on student Interactivity: the higher the quality of the teacher's moderating, the higher the Interactivity.

Table 52. Linear Regression Results of the Rating of Teacher Moderating Levels (R) and Interactivity (I)

            B          Standard Error    t
Intercept   -258.15    222.89            -1.16
R            243.97     92.91             2.63*

R2=.141, *p<.05, Dependent Variable: Interactivity (I)
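The curve-estimation step used above for Attending and Higher-order Thinking (Tables 48 and 49), and again below for Attending and Interactivity, compares linear, quadratic, and cubic fits. The sketch below is an illustrative version of that comparison under the same assumed file and column names used in the earlier sketches.

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("weekly_measures.csv")   # hypothetical file; assumed column names

    # Curve estimation: linear, quadratic, and cubic fits of Higher-order
    # Thinking on Attending (cf. Tables 48 and 49).
    fits = {
        "linear":    smf.ols("higher_order ~ attending", data=df).fit(),
        "quadratic": smf.ols("higher_order ~ attending + I(attending**2)",
                             data=df).fit(),
        "cubic":     smf.ols("higher_order ~ attending + I(attending**2) "
                             "+ I(attending**3)", data=df).fit(),
    }
    for name, fit in fits.items():
        print(name, "R2 =", round(fit.rsquared, 3),
              "overall F p-value =", round(fit.f_pvalue, 3))

A negative coefficient on the squared term, as reported in Table 49, is what suggests an optimal level of Attending rather than a strictly increasing relationship.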
Ho 3d.7: There is no relationship between Attending and Interactivity

The null hypothesis tests the assumption that Attending has no effect on student Interactivity. Linear regression results indicate that the effect of Attending (A) on Interactivity (I) does not reach a statistically significant level (t = 1.70, p = .097; shown in Table 53). Although no significant linear relationship between Attending and Interactivity was discovered, further analysis was conducted by applying nonlinear regression; as before, only linear, quadratic, and cubic relationships were taken into consideration. The nonlinear regression results, shown in Table 54 and Figure 8, make clear that Attending has a quadratic relationship with Interactivity (F(1, 41) = 11.07). This led to the rejection of the null hypothesis: Attending has a significant effect on Interactivity. That is to say, the more students are attentive to others, the higher the Interactivity.

Table 53. Linear Regression Results of Attending (A) and Interactivity (I)

            B         Standard Error    t
Intercept   228.06    61.17             3.73
A             0.07     0.04             1.70

R2=.064, p>.05, Dependent Variable: Interactivity

Table 54. Nonlinear Regression Results of Attending (A) and Interactivity (I)

Relation     df    F        B0         B1       B2
Quadratic    41    5.28*    -85.239    .5225    -.0001

R2=.205, *p<.05, Dependent Variable: Interactivity (I)

Figure 8. A Quadratic Relationship Between Attending and Interactivity (scatter plot of Interactivity against Attending, with the fitted quadratic curve)

Ho 3d.8: There is no relationship between Participation and Interactivity

The null hypothesis tests the assumption that Participation is not related to Interactivity. Linear regression results indicate that Participation (P) has a significant effect on Interactivity (I) (t = 6.13; shown in Table 55 and Figure 9). The null hypothesis was rejected. Student Participation has an effect on Interactivity; that is to say, the more actively students participated, the higher the level of Interactivity became.

Table 55. Linear Regression Results of Participation (P) and Interactivity (I)

            B         Standard Error    t
Intercept   -59.91    64.95             -0.92
P             1.19     0.19              6.13*

R2=.472, *p<.05, Dependent Variable: Interactivity

Figure 9. A Linear Relationship Between Participation and Interactivity (scatter plot of Interactivity against Participation, with the fitted line)

Ho 3d.9: There is no relationship between the Comprehensive Factor TRP and Higher-order Thinking

The null hypothesis tests the assumption that the Comprehensive Factor TRP is not related to student Higher-order Thinking. Linear regression results indicated that the comprehensive factor TRP had a significant effect on Higher-order Thinking (H) (t = 6.36; statistics shown in Table 56, curve estimation in Figure 10). The null hypothesis was rejected. Thus, the comprehensive factor TRP has an effect on Higher-order Thinking: the higher the value of this comprehensive factor, the higher the Higher-order Thinking that can be demonstrated. (A brief sketch of this factor appears after Figure 10.)

Table 56. Linear Regression Results of the Comprehensive Factor TRP and Higher-order Thinking (H)

            B         Standard Error    t
Intercept   308.98    96.26             3.21
TRP          10.41     1.64             6.36*

R2=.491, *p<.05, Dependent Variable: Higher-order Thinking

Figure 10. A Linear Relationship Between the Comprehensive Factor TRP and Higher-order Thinking (scatter plot of Higher-order Thinking against the Comprehensive Factor T*R*P, with the fitted line)
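The comprehensive factor TRP is described in the text as the product of the Number of Teacher Postings (T), the Rating of Teacher Moderating Levels (R), and Participation (P). The sketch below is a hedged illustration of how such a factor could be built and regressed on the two Intellectual Engagement variables; the file and column names are assumptions, not the study's actual scripts.

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("weekly_measures.csv")   # hypothetical file; assumed columns

    # Comprehensive factor: the product of T, R, and P for each group-week.
    df["trp"] = df["teacher_postings"] * df["rating"] * df["participation"]

    # Regress each Intellectual Engagement variable on the factor
    # (cf. Tables 56 and 57: R2 of about .49 and .52).
    for outcome in ["higher_order", "interactivity"]:
        fit = smf.ols(f"{outcome} ~ trp", data=df).fit()
        print(outcome, "R2 =", round(fit.rsquared, 3),
              "t(trp) =", round(fit.tvalues["trp"], 2))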
Ho 3d.10: There is no relationship between the Comprehensive Factor TRP and Interactivity

The null hypothesis tests the assumption that the Comprehensive Factor TRP is not related to student Interactivity. Linear regression results indicated that the comprehensive factor TRP had a statistically significant effect on Interactivity (I) (t = 6.77); the statistics are shown in Table 57, and the curve estimation in Figure 11. The null hypothesis was rejected. The comprehensive factor TRP appears to have a significant effect on Interactivity; that is to say, the higher the value of the comprehensive factor, the higher the level of Interactivity.

Table 57. Linear Regression Results of the Comprehensive Factor TRP and Interactivity (I)

            B         Standard Error    t
Intercept   115.80    34.84             3.32
TRP           4.01     0.59             6.77*

R2=.522, *p<.05, Dependent Variable: Interactivity

Figure 11. A Linear Relationship Between the Comprehensive Factor TRP and Interactivity (scatter plot of Interactivity against the Comprehensive Factor T*R*P, with the fitted line)

Summary

Ho 3d.1   There is no relationship between the Number of Teacher Postings and Higher-order Thinking              Rejected
Ho 3d.2   There is no relationship between the Rating of Teacher Moderating Levels and Higher-order Thinking     Rejected
Ho 3d.3   There is no relationship between Attending and Higher-order Thinking                                   Rejected (quadratic)
Ho 3d.4   There is no relationship between Participation and Higher-order Thinking                               Rejected
Ho 3d.5   There is no relationship between the Number of Teacher Postings and Interactivity                      Rejected
Ho 3d.6   There is no relationship between the Rating of Teacher Moderating Levels and Interactivity             Rejected
Ho 3d.7   There is no relationship between Attending and Interactivity                                           Rejected (quadratic)
Ho 3d.8   There is no relationship between Participation and Interactivity                                       Rejected
Ho 3d.9   There is no relationship between the Comprehensive Factor TRP and Higher-order Thinking                Rejected
Ho 3d.10  There is no relationship between the Comprehensive Factor TRP and Interactivity                        Rejected

After investigating all the variables explored in this study (T, R, A, P, S, H, and I; refer to Figure 1), the data indicate that both the teacher and the students in a group influenced student Intellectual Engagement. Both teacher moderating levels (T and R) and student Behavioral Engagement (A and P) had significant effects on Higher-order Thinking; teacher moderating levels (T and R) and student Behavioral Engagement (A and P) likewise had significant effects on Interactivity. Social-emotional Engagement did not have a significant effect on student Intellectual Engagement (H or I). A comprehensive factor, the product of T, R, and P, expresses briefly and efficiently which factors influenced student Intellectual Engagement. This Comprehensive Factor had a statistically significant effect on student Intellectual Engagement, accounting for 49.1% of the variance in Higher-order Thinking and 52.2% of the variance in Interactivity. Details of the Comprehensive Factor are discussed in Chapter 5.

4.3 Conclusions and Discussion

Thirty-five null hypotheses were tested in this chapter in order to answer the three research questions of the study. The results of these tests are collected in a summary table in Appendix E (a brief sketch of how such a summary can be assembled follows). Below are the conclusions and discussion.
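The 35 decisions reported in this chapter can be gathered programmatically into the kind of summary table that Appendix E presents. The sketch below is illustrative only: the decision strings are those stated in the text above, the listing is truncated rather than showing all 35 rows, and the use of pandas is an assumption about tooling, not a description of the study's procedure.

    import pandas as pd

    # A few of the chapter's hypothesis decisions, as stated in the text above;
    # the full Appendix E table would list all 35.
    decisions = {
        "Ho 1.1": "Rejected",           "Ho 1.2": "Failure to reject",
        "Ho 2.1": "Failure to reject",  "Ho 2.2": "Rejected",
        "Ho 3b":  "Rejected",           "Ho 3d.9": "Rejected",
        "Ho 3d.10": "Rejected",
        # ... remaining hypotheses ...
    }
    summary = pd.Series(decisions, name="decision").to_frame()
    print(summary)
    print(summary["decision"].value_counts())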
4.3.1 Changes in Teacher Moderating Levels and Student Engagement over Time

The first research question concerned how each of the seven variables of teacher moderating levels (T and R) and student engagement (A, P, S, H, and I) changed over time. To answer this question, repeated measures analyses of variance (ANOVAs) were applied to test the mean differences of the variables over weeks. It was found that the Number of Teacher Postings (T) changed significantly over weeks, while the change in the Rating of Teacher Moderating Levels (R) did not reach a statistically significant level. Among the student engagement variables (A, P, S, H, I), only Social-emotional Engagement (S) did not change significantly over weeks; all the others did (shown in Table 58).

Scheffe post hoc procedures were applied to the variables that changed significantly over weeks, namely the Number of Teacher Postings (T), Attending (A), Participation (P), Higher-order Thinking (H), and Interactivity (I), in order to identify the weeks in which those variables had extremely low or high values. The Scheffe post hoc results, together with the repeated measures ANOVA results, are shown in Table 58. (A brief sketch of the post hoc computation appears at the end of this subsection.)

Table 58. Summary of ANOVA Results and Scheffe Post Hoc Results for all 7 Variables over Weeks

                           T    R       A    P    S       H     I
ANOVA result               *    n.s.    *    *    n.s.    *     *
Week with lowest value     1    -       1    1    -       1     1
Week with highest value    2    -       9    9    -       11    11

Note: "n.s." stands for "not significant" (p > .05); "*" stands for significant (p < .05). The numbers stand for week numbers; post hoc comparisons were made only for the variables that changed significantly.

The Scheffe post hoc comparisons showed that the measures in week 1 were significantly different from those in weeks 2, 4, 9, and 11, with the lowest values in week 1 and higher values in weeks 2, 4, 9, and 11. For instance, the Number of Teacher Postings (T) in week 1 was the lowest; consequently, Attending (A), Participation (P), Higher-order Thinking (H), and Interactivity (I) were also low in week 1. This was because the first week's class was mainly aimed at self-introduction and the teacher moderator did not participate much; as with many first class sessions, there was no concrete task at hand or problem to solve. In week 2, by contrast, teachers moderated in a much more intensive manner in order to boost student engagement, though the values of the student engagement variables were not necessarily the highest in that week. Both Attending and Participation were very high in week 9. The design of the discussion topic and the class agenda may have contributed to this: the topic for week 9 was to practice listening, responding, and giving feedback using a feedback formula, the discussion was initiated by reading an interesting story that could provoke very different opinions, and students practiced how to "buy and sell" values. Teacher moderating was quite high that week, though not the highest (the highest was week 2). Intellectual engagement, both Higher-order Thinking (H) and Interactivity (I), was very high in week 11.

One noticeable phenomenon from the follow-up post hoc study of mean differences over time was that student intellectual engagement was much higher at the end of the semester than at the beginning. Another noticeable phenomenon was that the relationship between teacher moderating levels and student engagement was not a simple one-to-one causal relationship (that is, it was not the case that when one was high the other was consequently high). This suggests that other confounding factors may have affected this relationship. Further research is needed to investigate the complicated relationship between teacher moderating levels and student engagement.
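The Scheffe comparisons used here and in the next subsection can be computed directly from the ANOVA quantities. The sketch below is an illustrative, hand-rolled version of the between-groups form used for the across-group comparisons (statsmodels ships Tukey's HSD but not Scheffe's test directly); the week-wise comparisons in the subsection above would instead use the repeated-measures error term. The file and column names are assumptions.

    import numpy as np
    import pandas as pd
    from scipy import stats

    # Scheffe pairwise comparison after a one-way ANOVA.
    def scheffe_pairwise(samples, alpha=0.05):
        k = len(samples)
        sizes = [len(s) for s in samples]
        total_n = sum(sizes)
        means = [np.mean(s) for s in samples]
        # Within-group (error) mean square from the ANOVA decomposition.
        sse = sum(((np.asarray(s) - m) ** 2).sum()
                  for s, m in zip(samples, means))
        mse = sse / (total_n - k)
        critical = (k - 1) * stats.f.ppf(1 - alpha, k - 1, total_n - k)
        rows = []
        for i in range(k):
            for j in range(i + 1, k):
                f_ij = (means[i] - means[j]) ** 2 / (mse * (1 / sizes[i] + 1 / sizes[j]))
                rows.append((i + 1, j + 1, round(f_ij, 2), bool(f_ij > critical)))
        return pd.DataFrame(rows, columns=["group_i", "group_j", "F", "significant"])

    df = pd.read_csv("weekly_measures.csv")              # hypothetical file
    rating_by_group = [g["rating"].to_numpy() for _, g in df.groupby("group")]
    print(scheffe_pairwise(rating_by_group))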
4.3.2 Changes in Teacher Moderating Levels and Student Engagement across Groups

The second research question concerned how each of the seven variables of teacher moderating levels and student engagement changed across groups. To answer this question, a multivariate analysis of variance (MANOVA) was applied, followed by univariate analyses of variance (ANOVAs). It was found that the Rating of Teacher Moderating Levels (R) changed significantly across groups, while the change in the Number of Teacher Postings (T) did not reach a statistically significant level. Among the student engagement variables (A, P, S, H, I), only Social-emotional Engagement (S) changed significantly across groups; none of the other student engagement variables (A, P, H, I) reached a statistically significant level (shown in Table 59).

Although the mean differences of those variables (T, A, P, H, and I) did not reach significance, differences did exist. Before applying the MANOVA and ANOVAs, the group mean and standard deviation of each variable were calculated from the measures of the 11 weeks. For the variables that changed significantly over weeks, the standard deviations were very large. Because the MANOVA and ANOVAs take both the group means and the standard deviations into account, with such large standard deviations the mean differences across groups were unlikely to be significant, even though group differences did exist. These differences were "masked" when the 11 weeks were "compressed" into group means.

Scheffe post hoc procedures were applied to the variables that changed significantly across groups, namely the Rating of Teacher Moderating Levels (R) and student Social-emotional Engagement (S). The Scheffe post hoc results, together with the ANOVA results, are shown in Table 59.

Table 59. Summary of ANOVA Results and Scheffe Post Hoc Results for all 7 Variables across Groups

                            T       R    A       P       S    H       I
ANOVA result                n.s.    *    n.s.    n.s.    *    n.s.    n.s.
Group with lowest value     -       3    -       -       2    -       -
Group with highest value    -       4    -       -       3    -       -

Note: "n.s." stands for "not significant" (p > .05); "*" stands for significant (p < .05). The numbers stand for group numbers; post hoc comparisons were made only for the variables that changed significantly.

The Scheffe post hoc comparison demonstrated that Group 4 was significantly different from the other three groups in terms of the Rating of Teacher Moderating Levels (R) and in student Social-emotional Engagement. The implications of these conclusions are explicated in Chapter 5.

4.3.3 Relationships Between and Among Teacher Moderating Levels and Student Engagement

The third research question concerned the relationships between and among the teacher moderating level variables and the student engagement variables, and what factors were related to student Intellectual Engagement. This question was addressed in four steps.

The first step looked at the relationships between Social-emotional Engagement and each of the other variables. Among all the relationships between Social-emotional Engagement (S) and any one of the variables explored in this study (T, R, A, P, H, I), Social-emotional Engagement (S) was shown to only have a significant effect on Participation (P) and was only influenced by Attending (A).
Neither the Number of Teacher Postings (T), nor the Teacher Moderating Levels (R) had a significant effect on Social-emotional Engagement (S); By contrast, Social-emotional Engagement (S) had no significant effect on either Higher-order Thinking (H) or Interactivity (I). It is logical to assume that Social-emotional Engagement influenced student Intellectual Engagement through the Behavioral Engagement variables Attending and Participation. The second step in this analysis focuses on how students were connected to each other in a group. They “listened” to others through Attending and they “talked” to others through Participation. Therefore, Attending and Participation became a bridge to connect one another. Statistical analyses showed that Attending significantly influenced Participation. Teacher Moderating levels relate to student Behavioral Engagement, from these analyses. The third step in this study involves investigation of the relationship between teacher moderating levels and student Attending and Participation. Statistical analyses indicate that both Number of Teacher Postings and the Rating of Teacher Moderating Levels had a statistically significant effect on student Behavioral Engagement--Attending and Participation. The comprehensive question - what factors are related to student Intellectual Engagement- is the fourth step in this analysis. This step examines the relationship of each variable to Higher-order Thinking and Interactivity, respectively. Among all the 96 variables explored in this study (T, R, A, P, S, H and I), both teachers and students in groups jointly influenced student Intellectual Engagement. The comprehensive factor - which is the product of T, R, and P - briefly and efficiently reveals what factors influenced student Intellectual Engagement. This Comprehensive Factor has a statistically significant effect on student Intellectual Engagement: it can explain 49. 1% of Higher-order Thinking and 52.2% of Interactivity. This point will be discussed in detail in Chapter 5. 97 CHAPTER 5 DISCUSSION OF QUANTITATIVE DATA AND RESULTS The purpose of this study was to investigate factors that are related to student Intellectual Engagement in a community of inquiry maintained by synchronous computer conferencing. A key finding was that teacher moderating levels and student participation in the synchronous sessions had a significant effect on student Intellectual Engagement. The process leading to the above conclusion, along with the statistical analyses and results supporting both the rationale for each analytical step and the branch conclusions will be discussed in this chapter. As was shown in Figure 3 in chapter 4, there are seven variables associated with teacher moderating levels and student engagement variables. The core question asked if and how student Intellectual Engagement variables (I-l & I) were influenced by (1) student Social-emotional Engagement (S); (2) by student Behavioral Engagement (A & P); and (3) by teacher moderating levels (T & R). Results of these three sets of relationships will be explained, along with the overall process of disentangling the complicated relationships among student engagement variables and the relationship between teacher moderating levels and student Intellectual Engagement. The first step was to explain how student Social-emotional Engagement is related to student Intellectual Engagement. 
5.1 How Was Student Social-emotional Engagement Related to Student Intellectual Engagement

By definition, Social-emotional Engagement is a measure of the degree to which students view themselves as part of a group rather than as individuals and, consequently, make efforts to build cohesion, acquire a sense of belonging, and render mutual support. The definition of engagement in the literature tends to be general, and it is even broader in the online discussion literature. Consequently, neither the source of the emotional reaction nor its object (what it affects) is clear. As Fredricks, Blumenfeld, and Paris (2004) point out, it may not be clear, for instance, whether students' positive emotions are directed toward academic content, their friends, or the teacher. This investigation identifies both what contributes to student Social-emotional Engagement and what factors are influenced by student Social-emotional Engagement in the process of collaborative meaning construction by a community of inquiry using computer conferencing.

5.1.1 Relationship Between Student Social-emotional Engagement and Higher-order Thinking and Interactivity

Regression analyses showed that student Social-emotional Engagement did not have a significant effect on either Higher-order Thinking or Interactivity. Hypothesis testing of the mean differences in Social-emotional Engagement (shown in Figure 12) and in Higher-order Thinking (shown in Figure 14) also supported this conclusion, as did the group-by-group results. The Social-emotional Engagement of Group 1 was very high, while its Higher-order Thinking was the lowest. The Social-emotional Engagement of Group 2 was the lowest, but its Higher-order Thinking was extremely high. The Social-emotional Engagement of Group 3 was the highest, while its Higher-order Thinking was low. The Social-emotional Engagement of Group 4 was high, and its Higher-order Thinking was also extremely high - in fact, the highest (cf. Figure 14 with Figure 12). From these results, there is insufficient evidence to conclude that Social-emotional Engagement has a direct effect on Higher-order Thinking.

The finding that Social-emotional Engagement was not significantly related to student Intellectual Engagement contradicts the popular assumption in the online learning literature that stresses the importance of student Social-emotional Engagement. This prompts the question: what is the function of student Social-emotional Engagement in the collaborative meaning construction process in synchronous online conferencing? The question inspired exploration of the relationships of student Social-emotional Engagement with all the remaining variables: student Behavioral Engagement (Attending and Participation) and teacher moderating levels (the Number of Teacher Postings and the Rating of Teacher Moderating Levels). The literature on online learning frequently argues that Social-emotional Engagement is an important element that contributes to student Intellectual Engagement (Garrison & Anderson, 2003; Gunawardena, 1995; Gunawardena & Zittle, 1997; Perry & Edwards, 2005; Rourke et al., 1999). This study's analyses showed that none of the relationships between student Social-emotional Engagement and the other variables was significant except those with Attending and Participation. Social-emotional Engagement did not lead to student Intellectual Engagement directly.
The analyses found that teacher moderating levels did not have a significant effect on student Social-emotional Engagement directly; rather, teacher moderating levels had a significant effect on student Behavioral Engagement. It is fair to conclude that teacher moderating levels influenced student Behavioral Engagement, which then led to higher student Social-emotional Engagement. In other words, Behavioral Engagement led to Social-emotional Engagement, which in turn contributed (through Participation) to student Intellectual Engagement; Social-emotional Engagement did not lead to student Intellectual Engagement directly. In short, the moderating behaviors of teachers led to higher Intellectual Engagement but had no direct effect on student Social-emotional Engagement.

5.1.2 Relationships Between Student Social-emotional Engagement and Student Behavioral Engagement

These relationships were analyzed in two sets: the relationship between student Social-emotional Engagement and Attending, and the relationship between student Social-emotional Engagement and Participation. Regression analyses (shown in Table 33) indicated that Attending had a significant linear relationship with Social-emotional Engagement; that is, the more attentive students were to the discussion, the more they saw themselves as part of a group rather than as individuals. In addition, when students were more attentive to the discussion, they were more likely to make efforts to build cohesion, to develop a sense of belonging, and to render mutual support. The conclusion that Social-emotional Engagement and Attending had a linear relationship was also supported by hypothesis testing of the mean differences of student Social-emotional Engagement (shown in Figure 12) and Attending (shown in Figure 13) across groups. The trend of Social-emotional Engagement and the trend of Attending were similar, though not exactly identical (cf. Figure 13 with Figure 12).

However, the conclusion that student Attending had a significant effect on student Social-emotional Engagement is only tentative. In the regression model, the R square is very small (.10), which indicates that other factors were likely related to student Social-emotional Engagement; what those factors were was not identified in this study. Further research is suggested to investigate this result.

The relationship between student Social-emotional Engagement and student Participation was analyzed next. Regression analyses (shown in Table 34) showed that Social-emotional Engagement had a significant linear relationship with Participation; that is, the more social-emotionally engaged students were, the more actively they participated in the discussion. This conclusion confirms trends highlighted in the literature (Garrison & Anderson, 2003; Gunawardena & Zittle, 1997; Perry & Edwards, 2005). It was also supported by hypothesis testing of the mean differences of Participation (Figure 15) and student Social-emotional Engagement (Figure 12) across groups. The trend of Social-emotional Engagement and the trend of Participation were similar but not identical (cf. Figure 15 with Figure 12).
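The regression analyses described in this section, like the linear, quadratic, and cubic screening summarized later in Table 60, amount to fitting curvilinear models of increasing order and inspecting the fit. The sketch below is a minimal, hypothetical illustration of that procedure for the Attending-to-Social-emotional-Engagement relationship; the variable names and synthetic values are placeholders for the study's actual weekly measures, and statsmodels is assumed as the regression library.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)

    # Synthetic stand-ins for 44 weekly measures of Attending (A) and
    # Social-emotional Engagement (S); values are illustrative only.
    A = rng.uniform(1300, 1700, size=44)
    S = 100 + 0.05 * A + rng.normal(0, 25, size=44)

    A_centered = A - A.mean()   # centering reduces collinearity among A, A^2, A^3

    for degree, label in [(1, "linear"), (2, "quadratic"), (3, "cubic")]:
        X = sm.add_constant(np.column_stack([A_centered ** d
                                             for d in range(1, degree + 1)]))
        fit = sm.OLS(S, X).fit()
        # The p-value of the highest-order term shows whether the added curvature
        # improves on the lower-order model; R-squared shows variance explained.
        print(f"{label:9s} R2={fit.rsquared:.3f}  p(highest term)={fit.pvalues[-1]:.3f}")

On the study's data, a screen of this kind is what yields the small linear R square of about .10 for the Attending-to-Social-emotional-Engagement relationship and the nonsignificant higher-order terms reported in Table 60.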
5.1.3 Relationship Between Student Social-emotional Engagement and Teacher Moderating Levels

The analyses showed that neither the Number of Teacher Postings (T) (shown in Tables 29 & 30) nor the Rating of Teacher Moderating Levels (R) (shown in Tables 31 & 32) significantly influenced student Social-emotional Engagement.

Figure 12. Mean Differences of Social-emotional Engagement across Groups
Figure 13. Mean Differences of Attending across Groups
Figure 14. Mean Plots of Student Higher-order Thinking across Groups
Figure 15. Mean Plots of Participation across Groups

5.1.4 Summary - The Core Question Remains: What Was Related to Student Intellectual Engagement

There is rich discussion in the literature of the importance of student Social-emotional Engagement for student learning and intellectual engagement (Garrison & Anderson, 2003; Gunawardena & Zittle, 1997; Perry & Edwards, 2005). In fact, much of the research literature on online group discussion presumes that Social-emotional Engagement is essential to critical thinking (Anderson & Garrison, 1995; Anderson, Garrison & Archer, 2001). The literature suggests that Social-emotional Engagement is essential to knowledge construction because it makes group interactions appealing and thus intrinsically rewarding, leading to an increase in academic, social, and institutional integration and resulting in increased persistence (Anderson & Kanuka, 1997; Kanuka & Anderson, 1998; Garrison et al., 2000; Rourke et al., 1999). Some researchers maintain that critical thinking is facilitated by the socio-emotional support of others (Brookfield & Preskill, 1999).

The analyses of the relationships between Social-emotional Engagement and all the other variables - Behavioral Engagement (A & P), teacher moderating levels (T & R), and Higher-order Thinking and Interactivity (H & I) - showed that Social-emotional Engagement had significant relationships only with Behavioral Engagement. It did not have a significant relationship with any of the teacher moderating level variables or with student Intellectual Engagement. The relationships between Social-emotional Engagement and all the other variables are summarized in Table 60.

Table 60. Summary of Regression Results for the Relationships between Student Social-emotional Engagement and All Other Variables (T, R, A, P, H, and I)

        Linear   Quadratic   Cubic
    T   n.s.     n.s.        n.s.
    R   n.s.     n.s.        n.s.
    A   *        n.s.        n.s.
    P   *        n.s.        n.s.
    H   n.s.     n.s.        n.s.
    I   n.s.     n.s.        n.s.

Note: "n.s." stands for "not significant," i.e., p > .05; "*" stands for significant, i.e., p < .05.

Statistical analysis at this stage showed that student Social-emotional Engagement had a significant relationship only with student Behavioral Engagement. In the small group communities of collaborative discourse (in Figure 16, each group is represented by its first letter, G), student Behavioral Engagement functioned as the link that connected individuals, both the teacher and all students, to the group's community of inquiry. Students "listen" to others (Attending), that is, they take "input" from the group; and students "talk" to others (Participation), that is, they contribute "output" to the group.
This listening, or Attending, had the most evident effect on student Social-emotional Engagement (A -> S), which in turn directly influenced how actively students "talked," or contributed "output," to the group through Participation (S -> P). The relationship between Attending (A), Social-emotional Engagement (S), and Participation (P) is shown in Figure 17. It seems reasonable to speculate that student Behavioral Engagement indirectly influenced student Intellectual Engagement through the link of Social-emotional Engagement. The finding that Social-emotional Engagement did not significantly influence student Intellectual Engagement was a step closer to answering the core question. The statistical results on the relationships of student Behavioral Engagement with student Intellectual Engagement, and of teacher moderating levels with student Intellectual Engagement, presented in the following sections, successively clarify this issue and suggest tentative answers about what contributed to student Intellectual Engagement. The following discussion examines the relationship between student Behavioral Engagement and student Intellectual Engagement.

5.2 How Was Student Behavioral Engagement Related to Student Intellectual Engagement

Before exploring the relationship between student Behavioral Engagement and Intellectual Engagement, it may be useful to re-examine the assumption made in the last section: student Behavioral Engagement served as the link connecting individuals to the group in the collaborative discourse of meaning construction in online conferencing. This "link" idea helps clarify what Behavioral Engagement consists of and how individuals are connected to the group through Attending and Participation.

Figure 16. Attending and Participation (Behavioral Engagement) as a Link to Connect Individuals to the Group
Figure 17. Student Social-emotional Engagement Was Exclusively Related to Attending and Participation (Behavioral Engagement)

5.2.1 What Student Behavioral Engagement Consists of - Relationship Between Attending and Participation

Based on Lobel et al.'s (2002) study, Attending is "listening" and Participation is "talking." Each individual listened to others' talk, and each also talked to the group. In this sense, Attending and Participation served as a link that connected individuals to the group. The link was bi-directional: through Attending, individuals received input, while through Participation, individuals contributed output (shown in Figures 16 & 17). Without this link, through which input was received and output was sent, individuals would be unable to connect and there would be no community of inquiry.

The characteristics of online communication make Behavioral Engagement - Attending and Participation - indispensable. In face-to-face environments, one does not need to engage in "vocal participation" to show that one is paying attention, and "non-vocal participation" does not necessarily imply inattention. Online interactions in general, and the e-classroom in particular, engender a new type of communication: parallel communication, in which time to communicate is no longer linear and serial but holistic and parallel (Garcia & Jacobs, 1999; Herring, 1999; Lobel et al., 2002a, 2002b). Individuals post to the whole group (Participation), and a "pool" is thereby created; individuals also obtain information from that pool (Attending).
By adding knowledge to the class narrative, the community is established; from the community, individuals acquire knowledge and experience intellectual change. Attending and Participation constitute interaction, and interaction creates mutual and reciprocal action or influence. In the virtual environment, where all paralinguistic features are invisible, one needs to produce "vocal participation" to show that one is paying attention. Taking the class discussion as a whole, if nobody speaks up, there is no Participation, there is nothing to "listen to," and the class has nothing to attend to; no community will be established and nobody can benefit from it. As obvious as it may seem, then, it is imperative that students and teachers communicate online. Moreover, these interchanges must lead to meaning-making and higher-order thinking in order to have any effect on student Intellectual Engagement.

Statistical analyses (shown in Table 39) showed a significant linear relationship between Attending and Participation: the correlation between the two was .65 (significant at the 0.01 level, two-tailed). This means that the more one listens and pays attention, the more ideas are stimulated and the more one has to say. However, given the time limits of a specific class session in a synchronous setting, Attending constrains Participation. For example, if one "listens" too much, one will not have time to talk (contributing to low Participation). Of course, one can talk without listening to others, but this makes it less likely that such postings will be interactive; the discussion would then be more like monologues than dialogues or common-logues (Xin, 2002).

The finding of a linear relationship between Attending and Participation differs from what Lobel et al. (2002a, 2002b) found. Lobel et al. discovered no apparent correlation between the measures of "Attending" and "Participating." The findings of this study, then, offer a different perspective on these behaviors and show a significant positive connection between Attending and Participation. At the same time, the regression result that Attending and Participation have a significant linear relationship may oversimplify the complicated nature of online discussion and, thus, the complicated relationship between Attending and Participation. Further research is needed to verify this relationship.

5.2.2 How Was Attending or Participation Related to Student Intellectual Engagement - Higher-order Thinking or Interactivity, Respectively

Regression analyses showed that the relationship between Attending and Higher-order Thinking was quadratic (shown in Tables 48 & 49 and Figure 6), whereas the relationship between Participation and Higher-order Thinking was linear (shown in Table 50 and Figure 7). Symmetrically, the relationship between Attending and Interactivity was quadratic (shown in Tables 53 & 54 and Figure 8), whereas the relationship between Participation and Interactivity was linear (shown in Table 55 and Figure 9). These linear and quadratic relationships provide a strong reason to postulate that there may be an optimal level of Attending with respect to Intellectual Engagement and Participation. This interesting finding is elaborated below: first the relationship between Participation and Intellectual Engagement, and then the quadratic Attending relationship, for which a computational sketch appears immediately below.
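The sketch that follows illustrates, under stated assumptions, how a quadratic fit of Higher-order Thinking on Attending and its implied optimum might be computed. The Attending counts and Higher-order Thinking scores are synthetic stand-ins for the 44 transcript-level measures; only the observed range of 385 to 2,743 group Attending events, the 90-minute session length, and the group size of 8 are taken from the study, and numpy's polyfit is assumed for the curve fit.

    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic stand-ins for the 44 transcripts: group Attending counts per
    # 90-minute session and a Higher-order Thinking score with an inverted-U
    # (quadratic) dependence on Attending.  Values are illustrative only.
    attending = rng.uniform(385, 2743, size=44)
    higher_order = 900 - 0.0004 * (attending - 1750) ** 2 + rng.normal(0, 40, 44)

    # Fit H = b2*A^2 + b1*A + b0 and locate the vertex, i.e., the implied optimum.
    b2, b1, b0 = np.polyfit(attending, higher_order, deg=2)
    optimum = -b1 / (2 * b2)                      # vertex of the fitted parabola
    print(f"estimated optimal group Attending: {optimum:.0f} events per session")

    # Translate a group-level optimum into a per-person polling interval:
    # a 90-minute session is 5,400 seconds shared across 8 group members.
    session_seconds, group_size = 90 * 60, 8
    per_person_events = optimum / group_size
    print(f"about one 'poll' every {session_seconds / per_person_events:.0f} seconds per person")

With the reported optimum of roughly 1,750 group Attending events, the same arithmetic gives about one poll every 25 seconds per person, or two to three per minute, matching the figures discussed below.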
First, regression analyses showed that, within the limits of this study's contexts, the higher the Participation, the higher the Higher-order Thinking and the Interactivity - that is, the higher the level of Intellectual Engagement.

Second, the quadratic relationship between Attending and Intellectual Engagement is of interest. The nonlinear regression results showed that if Attending was too low - if students did not "listen" or failed to pay attention - then Intellectual Engagement was also low. If Attending was too high - if students only "listened" without "talking" - then Participation was low; as a result, Intellectual Engagement was also low, because Participation has a significant linear relationship with the two indicators of Intellectual Engagement. Across the 44 discussion transcripts, the minimum Attending for a group of 8 people in a normalized time period of 90 minutes was 385 events, and the maximum was 2,743. At the maximum, each individual "listened," "polled the server," or scrolled the bar roughly once every 16 seconds; at that frequency of Attending, one would not have much time to participate in the group discussion. The nonlinear regression results indicate that the optimal level of Attending for a group over a 90-minute period (in terms of Higher-order Thinking) is about 1,750 events. Stated another way, the optimal level of Attending in a synchronous online learning session, within the limitations of this particular study, meant that an individual "listened," "polled the server," or scrolled the bar about every 25 seconds, or two to three times per minute. Students who "listened" more than that were not contributing to the conversation and, consequently, had lower Intellectual Engagement levels.

It is necessary to point out that the purpose of the above quantitative analyses was more to explore a research direction - to measure these variables and to quantify the relationships between and among them - than to provide definitive statistics. Within the limitations of this study, some of the R squares are not large enough to support firm conclusions about the relationships among the variables.

5.3 How Are Teacher Moderating Levels Related to Student Intellectual Engagement

As explained in the sections above, the three student engagement variables - Behavioral, Social-emotional, and Intellectual Engagement - were closely interrelated. Therefore, to answer the question of how teacher moderating levels were related to student Intellectual Engagement, it is legitimate to discuss how teacher moderating levels were related to the other two student engagement variables: Social-emotional Engagement and Behavioral Engagement.

5.3.1 Relationship Between Teacher Moderating Levels and Student Social-emotional Engagement

The relationship between teacher moderating levels and student Social-emotional Engagement was discussed in Section 5.1.3. The findings showed that teacher moderating levels did not significantly influence student Social-emotional Engagement.

5.3.2 Relationship Between Teacher Moderating Levels and Student Behavioral Engagement

There is a rich literature describing the functions of teacher moderating in online interaction (Feenberg, 1989; Feenberg & Xin, 2002; Gunawardena & Anderson, 1997); however, there has been little research on how the quantity and quality of teacher moderating behaviors, respectively and collectively, influence student Behavioral Engagement.
This dissertation research investigated and helped fill this research gap. The regression analyses showed that the Number of Teacher Postings (T) had a significant effect on both student Attending (shown in Table 40) and Participation (shown in Table 43). This may be because a higher number of teacher postings conveyed a stronger teaching presence (Anderson et al., 2001), and students then paid more attention and participated more actively. Benefits of leaving students to study independently in the online environment were not evidenced in this study; rather, the active participation of moderators increased student Behavioral Engagement.

The regression analyses also showed that the quality of teacher postings did not have a significant effect on either Attending (shown in Tables 41 & 42) or Participation (shown in Tables 44 & 45). In the context of this study, the quality of teacher postings affected neither the frequency with which students scrolled the bar or "polled the server" to read newly posted messages and review earlier ones, nor the frequency with which students posted. This finding is contrary to expectations: the quality of teacher postings had no significant effect on student Behavioral Engagement.

To summarize, this investigation showed how the quantity and quality of teacher postings, respectively and collectively, influenced student Attending and Participation. The quantity of teacher postings mattered but, surprisingly, the quality of teacher postings did not matter in terms of student Behavioral Engagement. Investigating the quantity of teacher postings in a similar context, Lobel et al. (2002) found that the instructor did not need to interact with the students much more than one post per minute to facilitate a discussion that exhibited high rates of individual participation.

5.3.3 Relationship Between Teacher Moderating Levels and Student Intellectual Engagement

There is an interesting relationship between the Number of Teacher Postings and student Intellectual Engagement. Regression analyses showed that the Number of Teacher Postings had a linear relationship with both Higher-order Thinking (Table 46) and Interactivity (Table 51). A higher number of teacher postings might have produced a stronger sense of teaching presence and, as a result, increased student Intellectual Engagement. With R squares of .330 (Higher-order Thinking) and .340 (Interactivity), a tentative conclusion can be drawn that the Number of Teacher Postings explains 33 percent of the variation in student Higher-order Thinking and 34 percent of the variation in Interactivity.

How did the Rating of Teacher Moderating Levels relate to student Intellectual Engagement? Regression analyses showed that the Rating of Teacher Moderating Levels had a linear relationship with both Higher-order Thinking (Table 47) and Interactivity (Table 52). This finding supports prevailing perspectives in the literature on the importance of teacher moderation, although few previous studies have reported supporting evidence through hypothesis testing as the present study did. In fact, most perspectives in the synchronous (as well as asynchronous) conferencing literature on the effect of the Rating of Teacher Moderating Levels on student Intellectual Engagement rest on theoretical postulation rather than hypothesis testing.
This study shows that a higher Rating of Teacher Moderating Levels resulted in better direction of the discussion and higher Intellectual Engagement.

To summarize, this section investigated how teacher moderating levels - both the number of teacher postings and the quality of teacher moderating - influenced the three aspects of student engagement (Social-emotional, Behavioral, and Intellectual), respectively. Statistical results showed that neither the number of teacher postings nor the quality of teacher moderating influenced student Social-emotional Engagement, a component of online learning that had previously been theorized as very important to intellectual achievement. Statistical results also showed that the number of teacher postings had a significant effect on student Behavioral Engagement, while the quality of teacher moderating did not. Finally, the analyses showed that both the number of teacher postings and the quality of teacher moderating had a significant effect on student Intellectual Engagement.

5.4 What Was Related to Student Intellectual Engagement - Comprehensive Factor TRP

There are seven variables related to teacher moderating levels (T & R) and student engagement (A, P, S, H & I) (shown in Figures 1, 2 & 3). The core issue of this project was to determine if and how the other student engagement variables (A, P, and S) and teacher moderating levels (T and R) influence student Intellectual Engagement (H & I). In investigating the relationship between each variable and student Intellectual Engagement, the analyses considered, but were not limited to, the sole relationship of each variable with student Intellectual Engagement; other relationships were investigated because the variables were closely interrelated (shown in Figure 18). Higher teacher moderating levels - in terms of both quantity and quality - contributed to higher student Behavioral Engagement, which consequently contributed to higher Social-emotional Engagement, ultimately leading to higher student Intellectual Engagement.

After all the branch analyses were performed (in Section 4.2.3), a comprehensive factor was sought that could most efficiently express what influenced student Intellectual Engagement through the collaborative discourse of a community of critical thinking in the medium of synchronous computer conferencing. Statistical analyses revealed such a comprehensive factor: the product of the Number of Teacher Postings (T), the Rating of Teacher Moderating Levels (R), and student Participation (P). The comprehensive factor consisted of T, R, and P, yet excluded A and S, which had previously been theorized as related, directly and indirectly, to Intellectual Engagement.

Figure 18. The Whole Picture - What Influenced Student Intellectual Engagement

Student Social-emotional Engagement did not have a significant effect on student Intellectual Engagement; hence, it was excluded from the comprehensive factor. Attending had a quadratic relationship with Intellectual Engagement; however, Attending, as "listening," was more consumption than contribution, which means that its effect on Higher-order Thinking was indirect rather than direct. Therefore, Attending was also excluded from the comprehensive factor.
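Before examining why T, R, and P were retained, the sketch below shows, purely as a hypothetical illustration, how such a comprehensive factor can be constructed and tested: the product term is added as a new column and regressed against an Intellectual Engagement indicator, and its R square is compared with those of the individual factors. The column names and values are synthetic stand-ins for the study's measures, and statsmodels is assumed for the regressions.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(3)

    # Hypothetical weekly group-level measures; names mirror the study's variables
    # but every value is synthetic.
    df = pd.DataFrame({
        "T": rng.uniform(10, 60, 44),      # number of teacher postings
        "R": rng.uniform(1, 5, 44),        # rating of teacher moderating level
        "P": rng.uniform(280, 380, 44),    # student participation
    })
    df["TRP"] = df["T"] * df["R"] * df["P"]             # the comprehensive factor
    df["H"] = 0.02 * df["TRP"] + rng.normal(0, 60, 44)  # higher-order thinking (synthetic)

    def r_squared(y, x):
        """R square of a simple linear regression of y on x."""
        return sm.OLS(df[y], sm.add_constant(df[x])).fit().rsquared

    # Compare the variance explained by each factor alone with the product term.
    for predictor in ["T", "R", "P", "TRP"]:
        print(f"H ~ {predictor}: R2 = {r_squared('H', predictor):.3f}")

On the study's data, the analogous comparison is what yields the reported R squares of about .49 (Higher-order Thinking) and .52 (Interactivity) for the product term, versus smaller values for each factor alone.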
Given that Social-emotional Engagement (S) and Attending (A) were excluded from the comprehensive factor, it is important to examine why and how T, R, and P were included. The Number of Teacher Postings (T), the Rating of Teacher Moderating Levels (R), and Participation (P) each had a significant effect on student Intellectual Engagement. This led to the speculation that the product of the three (T*R*P) could serve as a comprehensive factor that efficiently represents all three and expresses how they jointly influence student Intellectual Engagement. Regression analyses (shown in Tables 56 & 57) showed that the product of the three had significant linear relationships with Intellectual Engagement. The R squares of T*R*P with Higher-order Thinking (.491) and with Interactivity (.522) were much larger than those of any of the three factors individually, which in turn supported the legitimacy of the comprehensive factor.

5.5 Conclusion

Student Intellectual Engagement is influenced by the product, or combination, of the Number of Teacher Postings, the Rating of Teacher Moderating Levels, and student Participation. The product of T, R, and P can be seen as an index of teacher-student participation weighted by the quality of that participation. The product of the Number of Teacher Postings and the Rating of Teacher Moderating Levels (T*R) measures the overall impact of teacher moderating, both quantity and quality; the product of this overall impact and student participation (T*R*P) provides an overall measure of teacher-student participation. The more actively moderators posted in a synchronous online learning conference, combined with higher quality moderating, the more active the student participation and, consequently, the higher the levels of Higher-order Thinking and Interactivity. Put briefly, the higher the TRP, the better the student Intellectual Engagement.

The teacher's goal should not merely be to have social-emotionally engaged students, but rather to have students attend to each other's thoughts and ideas and actively participate as a group. When teachers moderate, the quantity and quality of their moderating should focus on students Attending to each other, which will increase their Social-emotional Engagement and their willingness to participate actively. Rather than simply trying to create a safe or comfortable environment, teachers who get students listening and responding to each other will be rewarded with higher Intellectual Engagement. In contrast to the teacher-student I-R-E (Initiation-Response-Evaluation) classroom discourse model described by Cazden (2001), these data showed that Intellectual Engagement was brought about by effective teacher moderating, not by simple initiation-response discussion. Thus, to ensure that students reach higher Intellectual Engagement, teachers need to facilitate a common-logue that engages the whole group.

CHAPTER 6
QUALITATIVE ANALYSIS

The core issue of the study was to investigate the factors that affect student intellectual engagement. The study used a mixed method approach, exploring the core issue by examining the relationship between teacher moderating levels and student engagement on the one hand, and the relationships among student engagement variables on the other. The quantitative results confirmed the effect of teacher moderating levels on student engagement.
However, these results did not show specifically how the moderating functions worked in the collaborative meaning construction process. A descriptive discourse analysis of the transcripts was needed to unpack these "hows." The broad patterns resulting from the quantitative analyses guided the selection of the transcripts, and the sections of transcripts, of interest for the qualitative analyses.

This chapter consists of five sections. Section 6.1 presents the process used to select the transcripts and sections of transcripts of interest. Section 6.2 presents the qualitative analysis procedures. Section 6.3 revisits moderating functions in general. Section 6.4 presents the analysis of a whole transcript (one of the 44 transcripts) to provide a unified picture of the interactive process of synchronous online discussion. Section 6.5 discusses themes emerging from the qualitative analysis.

6.1 Selection Process of the Transcripts and Sections of Transcripts for Qualitative Analysis

Findings from the quantitative analysis in Chapter 4 showed that the number of teacher postings (T), the rating of teacher moderating levels (R), and Participation (P) comprehensively impacted student intellectual engagement, i.e., Higher-order Thinking (H) and Interactivity (I). Expressed another way:

H = Linear(T*R*P); I = Linear(T*R*P)

The higher the product of T, R, and P, the higher H and I were - that is, the higher the student intellectual engagement. Based on these quantitative results, three weeks - weeks 2, 6, and 9 - were selected for close analysis because in those weeks T, R, and P were relatively high, and H and I were consequently high. Also selected were weeks 3, 7, and 10, in which T, R, and P were relatively low, and H and I were consequently low. The beginning week (week 1) and the end week (week 11) were excluded because they were special weeks: the first week was an introductory week and the last was a summary week. For each week, both the high quality moderating group (Group 4) and the low quality moderating group (Group 1) were examined and compared.

Of these three variables, the most interesting one from a qualitative point of view is R, the Rating of Teacher Moderating Levels, that is, the quality of teacher moderating. For the quantitative analysis this was coded into five levels, but such coding can result in a loss of meaning, whereas T (the Number of Teacher Postings) and P (student Participation) are merely raw counts. It is therefore reasonable to "tease out" R in the qualitative analysis and examine how the instructor moderated the conferences and what effects the moderating exerted on the quality of the collaborative discourse.
The supporting data also included agendas for each class session — separate agendas both for the public main room lecturette and activities and agenda for the breakout room small group conferencing activities. These data provided a broader context for the transcripts under analysis. The second phase occurred during the synchronous computer sessions. During this phase of the study, the researcher was a participant observer of both the synchronous online discussion sessions and the one-hour pre-class preparation meetings held by the teaching team immediately before class every week. Field notes were taken during the observation. Development of these field notes was unique. For instance, using an alarm clock as a reminder, the researcher jotted down what was going on every ten minutes during the three-hour class session each week. At the end of each class session, the researcher made summary reflection notes. 121 The third phase of the qualitative analysis took place during the coding process for the quantitative analysis. Because the coding process involved careful scrutiny and decision-making about which category each posting would be coded and assigned, the researcher utilized the process and made hundreds of pages of memos. These memos included impressions and perceptions of each conference. From these memos some perceptions and loosely defined themes emerged. The fourth phase of qualitative analysis was an intensely purposeful analysis of the transcripts selected based on the quantitative analysis results using computer- mediated discourse analysis, an approach to researching online interactive behavior (Herring, 2003). The basic goal of this discourse analysis is to identify patterns in discourse that are demonstrably present, but that may not be immediately obvious to the casual observer or to the discourse participants themselves. Through discourse analysis, some patterns and themes emerged and the researcher identified themes that were mainly related to teacher moderating behaviors and student intellectual engagement. These are presented in Section 6.4. 6.3 Revisiting the Roles of Moderators and the Use of Moderating Functions What is a computer conferencing moderator? Winograd provides this definition: A moderator even in this educational setting wears many hats: lecturer, tutor, facilitator, mentor, assistant, provocateur, observer, host, and participant. A moderator is a generalist who is sensitive to the individuals and dynamics that make up the conference and through this sensitivity can decide when a conference is doing well or poorly and deciding on action to take if a conference is going awry (Winograd, 2002). Obviously, a moderator needs to know when to wear which hat and how to perform the role accordingly. There is increasing literature discussing the role of the 122 moderator (Berge & Collins, 1995; Rohfeld, & Hiemstra, 1995), moderating functions (Feenberg, 1989b), and online teaching presence (Anderson, et al. 2001). Based on a broad literature review, Xin (2002) compiled a list of moderating fimctions and these functions were incorporated in the rubric used to measure the quality of teacher moderating levels in this study (Shown in Appendix A). These functions include opening comments, setting norms, and setting agenda; referring, recognition, prompting, and delegating; and assessing; meta-commenting, and weaving. The rating for moderating levels in the study was based on each one of these tasks and the quality of performance of functions. 
There are differences in quality in the use of moderating functions. For example, the use of recognition and prompting can be simply cheering and soliciting, or it can be supplied with context and real substance concerning the topic of discussion (Xin, 2000) The effective use of moderating functions addresses a central problem or concern of computer conferencing: namely, online leadership. The effective use of online moderating functions supports and facilitates student engagement and ensures that a healthy context is established and maintained where learning progress is made through sustained dialogue. On the social-emotional side, the use of moderating functions attempts to sustain class dialogue while, at the same time, maintaining the social milieu needed to encourage democratic participation and interaction. On the knowledge construction side, because moderating functions encapsulate cognitive acts, the effective use of them necessarily fulfils an intellectual role. Through the exercise of moderating functions, the moderator helps learners engage with the subject matter, deepen their understanding, and work together toward idea integration and convergence (Xin, 2002) 123 The parameters of the online course in this study dictate that all the conferences are structured conferences requiring the active participation of a subject matter expert. This subject matter expert is expected to provide both direct and indirect instruction by interjecting comments, referring students to information resources, and organizing activities that allow the students to construct the content in their own minds and personal contexts. Although the conferences were all structured - with syllabi and agenda - the moderator still had extensive roles and responsibilities to ensure that students were learning the material. There were time pressures, a plethora of ideas and comments to monitor and respond to, and the need to capture one’s thoughts and ideas in fairly pithy and understandable postings. Certainly, moderating in such a context is not an easy job. In the qualitative analysis, the researcher will reveal best moderating practices through analysis of a whole transcript and then through analyses of chunks of transcripts. 6.4 Good Moderating Practices - A General Picture While a general analysis examined the semester-long discussion, a detailed analysis of a single transcript was conducted to examine the role of an instructor as an expert moderator. These transcript analyses explored the methods of moderating used in the conference as well as the effect of the moderating on the patterns of the electronic discussions and knowledge construction (Keynes, 2003). The transcript selected for analysis was Transcript # 2 September 15, Group 4. The quantitative findings showed that Group 4 was the group with the highest quality of moderating, the highest product of TRP, and the highest Higher-order Thinking and Interactivity; that is, the highest Intellectual Engagement. 124 Below is a brief synopsis of the context of this conference. As was mentioned in the research context section of the methodology chapter, each class session lasted three hours. The class first met in the main public room for about an hour and half for a lecturette and theory processing and then went to breakout rooms for small group discussion for about an hour and half. The lecturette in the main public room was about a theory - Shutz’s Inclusion Needs. 
The objectives of the small group discussion were to experience and observe how one would negotiate membership in a group and to discuss strategies for improving interpersonal effectiveness through identifying effective and less effective elements. Major discussion questions were developed by the one-hour pre-class preparation involving the whole teaching team. These questions were to serve as guides for the moderated discussion. For students in each of the four breakout rooms, there were six discussion questions. The discussion questions given to students were: 1. What are some of the typical behaviors you use in order to include yourself into a group? 2. What are some of the ways you include others into a group? 3. What behaviors would you like to practice more of and/or less of? 4. What do you know of the impact you have on others? Others have on you? 5. What are some of the norms or ground-rules you need to have in a relationship with others, for you to include yourself to your satisfaction? 6. What relationship norms are negotiable for you? What norms are not negotiable? In this presentation of the excerpts, the original format of all the postings has been preserved, but some characteristics that were digital graphics (such as pictures, emoticons and flash applets) were not preserved either because it was not possible to preserve them 125 or because of space limitations. Typos that might impede understanding were changed. Names of participants were changed to protect the rights of human subjects. The moderator’s messages are marked with serial numbers. Following each numbered moderator posting is discussion of the strategies embodied in that posting. Finally, the effect of each of the moderator’s postings on student intellectual engagement and on student behavioral and social-emotional engagement were observed and enumerated. #1 1186 Mon, Sep 15 9:04pm -- Amy Amy hello all 120 ( Students said hellos to each other-mainly Social-emotional messages. 134 #2 135 Mon, Sep 15 9:08pm -- Amy Amy John: suffer with grace, i suppose so here are the questions again 1. What are some of the typical behaviors you use in order to include yourself into a group? 2. What are some of the ways you include others into a group? 3. What behaviors would you like to practice more of and/or less of? 4. What do you know of the impact you have on others? Others have on you? 5. What are some of the norms or ground-rules you need to have in a relationship, for you to include yourself to your satisfaction? 6. What relationship norms are negotiable for you? What norms are not negotiable? 6 The posting number started from 118 because the system began archiving well before the small group discussion formally started when some group members just stopped by and said hello to each other while the whole class was still meeting and conferencing in the main room. 126 The first part of the moderator Amy’s posting was to answer in a teasing way one student’s posting where he jokingly asked how he would survive being the only male in the group. Amy posted questions for the group in the second part of this message - this was agenda-setting. 136 l The discussion went lively and it was well on-task. Each individual posted more than one message to the questions that they might have responses to offer. 151 #3 152 Mon, Sep 15 9:12pm -- Amy AmY 1.i smile my head off make jokes pay attention to what people are saying, and not saying, and then build on that 2. i convey to people that they matter to me that i am interested in them 3. 
i would like to feel my heart beating a little slower in new situations... i achieve that thru breathing... i would like to practice walking up to people, instead of waiting for them to walk up to me that way, i don’t get stuck with all the people who chose me and lose out on the people whom i found interesting Amy here was modeling. She clearly articulated her answers to the questions one by one. Please also be aware that she split her answers into two parts to make it easier for students to process. Too big a chunk of information may cause “cognitive load” for student information processing. Moderators in other groups did not chunk the questions as Amy did and the ensuing discussion did not flow as well as in this group. 153 Students continued to exchange ideas based on the five questions posted by the moderator in posting #135. The effect of the moderator’s modeling hadn’t showed up yet. 156 127 #4. 157 Mon, Sep 15 9:14pm -- Amy Amy my best trick is being genuine authentic is low maintenance my dad had a saying in a new relationship, always put your worse foot forward he meant, just be real people will like that besides, you will not need to keep up appearances you can just be This message is a continuation of her previous posting and Amy here continued modeling. It was easily seen that the moderator’s message, through self-disclosure, was very emotion-arousing and very provocative. The effect of this posting was obvious as shown below. 158 In fact, Amy’s previous posting - posting#4 stimulated very hot discussion: 13 postings were made and they were all on-task and were coded into one of the subcategories of Higher-order Thinking. The Interactivity was also very high: all messages were coded as either reactive or interactive messages. 171 #5 172 Mon, Sep 15 9:16pm -- Amy Amy Cheryl : Cathy: breathing helps! Rose: yes and if you look around as a rule, it is the genuine people who are most included and liked Amy here confirmed individuals’ opinions. Individuals needed nodding, confirming and applauding. l 3 I Each group member spoke up and reached very high integration - an important indicator of higher-order thinking. 178 128 #6 179 Mon, Sep 15 9:17pm -- Amy Amy John: Liz: telling the truth seems to be the function of the relationship the more intimate, the more 'into me see' there is? The message is a well-reflected answer to the message #168 (#168: Mon, Sep 15 9:16pm - Liz yeah, it's important to be honest, but sometimes you can't tell the whole truth to people) where Liz had hesitation and puzzlement about being real and telling the truth. Amy’s message might have extended the “zone of proximal development” for Liz through settling her cognitive conflicts and perturbations, thereby maximizing cognitive growth and development (?) 180 The discussion was still very live with each group member participating. i All the postings were well on-task and were coded into high-order thinking and Interactivity categories. 
1 87 #7 188 Mon, Sep 15 9:20pm -- Amy Amy one ground rule i seem to need is for me and the other to understand that my truth is not meant to hurt my truth is just that mine and the truth as i see it at the moment in exchange, i will not take anything you do personally this does not mean that i will not get affected but it is not personal my truth is ONLY about me, as yours is only about you if we can agree to discuss, explore these truth, but not kill one so the other can exist, i am there Cathy i think that is my best foot Amy articulated her strongly personal perspective which was built upon students’ previous postings as well as her own previous postings. The moderator brought in not only content knowledge but also attitudes and skills to the critical discourse. These efforts made by Amy took the discussion to deeper levels of reflection. 129 189 The discussion was very lively and students exchanged ideas on the topic at hand and applauded each other like “gimme a high five girls! !” In other words, students are involved and engaged behaviorally, social-emotionally, and intellectually. 196 #8 197 Mon, Sep 15 9:22pm -- Amy Amy Annal: maybe it is more trust in myself than it is confidence when i trust myself to cope, i am ok i don’t mean like it, even understand it, i mean just cope.. i can embark on anything Shirley: thank you i have a desire in me to include you Again, Amy worked within the ZPD (zone of proximal development) of Nikki - she modified what Nikki said and articulated what Nikki was unable to say. The effect of self-disclosure - “I don’t mean like it, even understand it, I mean just cope. . .” was obvious in the following messages. One of the students said in message #200 “Amy- that’s something I’m working on... like with my boyfriend...” she communicated in the way Amy modeled - also using self-disclosure. It is also worth noting the warmth Amy expressed to one student (i.e., Shirley) who in an earlier message expressed her frustration about her slow computer: “I have a desire to include you”. Amy moderated on both task level and social emotional level. 198 l Students were on task, revealed higher-order thinking and high Interactivity. 200 #9 201 Mon, Sep 15 9:23pm -- Amy Am 1. What are some of the typical behaviors you use in order to include yourself into a group? 2. What are some of the ways you include others into a group? 3. What behaviors would you like to practice more of and/or less of? 130 4. What do you know of the impact you have on others? Others have on you? 5. What are some of the norms or ground-rules you need to have in a relationship, for you to include yourself to your satisfaction? 6. What relationship norms are negotiable for you? What norms are not negotiable? When this thread seemed to run out of energy after several rounds of discussion, Amy re-posted the major questions to reiterate and refresh. This is something like adding fuel to the discussion - to stimulate new ideas. Below is the effect of this reposting. 202 Whereas several messages showed warmth and cohesions (applause to each other), all the others were deep reflections. For instance, the message posted in #206 “sometimes we ask our friends questions, but we don't really want to hear the answer” and still another student emulated the role of a moderator and strongly weaved and wrapped it up in #208: #208 Mon, Sep 15 9:25pm -- John John to all the questions above V honesty, respect, truth and self-confidence are the key attitudes that we should have 210 #10. 
211 Mon, Sep 15 9:25pm -- Amy Ami' Rose: in my practice i see people who held in their truths they were taught it was toxic it'll make people cry, mad and dangerous so they swallowed their truth said to themselves i better not say this, i better not feel that well, duh if it is toxic then keeping it in is the worse thing to do and it is so the body shows up and hands them the bill sorrow on all of us Amy dug into deep concerns or issues here. Once again, as an expert, she articulated what a novice was unable to articulate, helping students connect their life experience with theory. 212 131 actively engaged in constructing a better understanding of the issue at l The discussion continued to be lively. Students exchanged views and hand. 215 #11 216 Mon, Sep 15 9:26pm -- Amy 1 Amy Shirley: any comments on a question, on what people are posting? In this instance, Amy was pushing some of the lurking or quiet participants by directly calling those persons to task. Some participants’ action of “free riding” and “social loafing” and failure to contribute can affect others’ enthusiasm and motivation in the course of collaborative learning. Moderators need to deal with this. Here, Amy used a simple way by just calling the person. #12 217 Mon, Sep 15 9:27pm -- Amy Amy why would my truth be a gun at your head to change? Once again, Amy pushed the group to think deeper - another round of adding fuel when she noticed the group running out of energy. 218 The discussion became lively and the students’ postings were deep 1 reflections and interactive. 222 #13 223 Mon, Sep 15 9:28pm -- Amy Amy in terms of inclusion an analogy occurred to me if i am masquerading as a horse, when i am really a bunny, because i decided being a bunny is not good enough, then i am neither a good horse nor a good bunny and then who am i? Amy used analogy/metaphor to illustrate what and how she wanted the discussion to ensue. Analogies and metaphors are good hooks for both the existing discussion as well as for stimulating new ideas and connections. On the one end, it connected to the 132 previous messages about “inclusion,” while, on the other end, it provoked and instigated new discussions, debates, and controversies. The charm of the metaphor made the discussion an extremely lively one. 224 Amy’s humor stimulated other humorous messages. Message #224 said “Amy: a donkey! Lol”and message #225 is a deep reflection on a previous message. 225 #14 226 Mon, Sep 15 9:30pm -- Amy Arny Rose: yep John: double yep Annal: triple yep Amy’s humor. She confirmed what students said in a humorous way. 227 Individuals were well on task and provided answers to the moderator’s question, in a similar humorous tone. One said “you're an incompetent animal” and the other said “Amy: actually, you will be yourself and no one will be close to look like you.” 228 #15 229 Mon, Sep 15 9:30pm -- Amy AmY Cathy: mule? Earlier, a group member, Cathy, posted an answer to Amy’s question “. . .then i am neither a good horse nor a good bunny, and then who am i?” and Cathy said “donkey!” Amy interpreted in a humorous way “mule?” meaning “stubbornness” and Cathy replied later in Message #236 “Amy: not a mule: a horse with long ears = looks like a rabbit = a donkey...” 133 230 Again, students were well on-task. The message #230 answered to a previous posting and message #231 answered to Amy’s horse bunny question and said “Amy, you’re a bunny”, which showed warmth in a humorous way as Amy did. 
231 #16 232 Mon, Sep 15 9:30pm -- Amy Amy John: i would be a failed unlived self Amy was observing the group members’ reflections and when all members posted their ideas, it was time to wrap up; Amy did so in a timely manner. 233 l Individuals continued the horse and bunny topic, all flavored with Amy’s humor. 234 #17 235 Mon, Sep 15 9:31pm -- Amy Amy Rose: sweet hop hop The moderator was showing warmth, attention and humor to one individual student, but in reality the whole group became the recipient. 237 Students continued reflecting on the horse and bunny question and the l messages all indicated higher-order thinking and Interactivity. One said “it's really important to be yourself. So often people announce that but then don't do it themselves.” 238 #18 240 Mon, Sep 15 9:32pm -- Amy Amy so i have a question what do we each need to feel like we belong in this group? When the discussion went deep enough and the current thread ran out of energy, Amy added new directions for the discussion. She provided a hook with both ends, this 134 time, putting more weight on the end that intended to elicit future responses-“I have a question ...?” Amy here actually articulated the major question/objective of the whole discussion. Amy posted this question after the “inclusion” topic was discussed thoroughly, which was timely and fortuitous. What’s more, she made the question relate to their (the group’s) present online experience “what do we each need to feel like we belong in this group?” This question activated several other rounds of extremely heated and lively discussion. With Amy using different moderating strategies skillfully, students stayed well on-task and produced sharp and deep reflections, together with informal banters and elements of humor as lubricants. All of these elements are reflective of students being engaged behaviorally, social-emotionally, and intellectually. Due to space concerns, the analysis of subsequent discussions will be omitted. The above section demonstrates the process of meaning construction by the community with the leadership of the moderator. Through the analyses of transcripts, some themes emerged and these themes were labeled “good moderating practices.” For example, the moderating levels of the examples were coded high and were also observed high. Along these same lines, the consequent results of these exemplary moderating practices - student engagement - were also high both quantitatively and qualitatively. While this does not provide an exhaustive list of moderating functions, it does serve to highlight some observations of good moderating practices and how they affected the meaning construction process where scenarios of learning were seen to take place. 135 6.5 Good Moderating Practices - Themes During the process of the analyses (the four phases described in Section 6.3), themes of effective moderating strategies emerged. The themes were organized into five major categories and each theme will be presented in a three-part format: 1) the theme - the structuring and moderating efforts that were actually provided by the instructor during the course of the online collaboration; 2) theories that underpin the theme; and 3) supporting examples followed by a brief discussion on how these efforts may have impacted the subsequent discussion. 
The five themes are as follows:

1) Providing hooks with both ends
2) Modeling and tele-mentoring
3) Confronting and conflicting
4) Setting up norms
5) Mixing in social-emotional elements

6.5.1 Providing Hooks with Both Ends

Some researchers (Feenberg, 1989b) use sports and language games as a metaphor to illustrate the satisfaction of playing an engaged dialogue game. "Play" at online discussion consists of making moves that keep others playing. Therefore, to sustain the dialogue game, every message fulfills a double goal: (1) communicating something and (2) evoking a future response (Feenberg & Xin, 2002). In this vein, each message functions as a link that at one end connects to one or multiple previous messages and, at the other end, provides a hook for creating future messages. One way of connecting to previous messages is called "weaving" (Xin, 2002). Weaving is a key function performed from time to time to thematize or ground the communication, on the one hand, and to insert order or structure into the discussion or synthesize accomplishments, on the other. Weaving functions "as a plateau or landing area in the middle of a climb or intellectual journey. It allows climbers (i.e. students) to look back on the various trails and trials they have gone through, reflect upon their achievements made, and prepare for the next advance" (Xin, 2002). Below, the researcher presents both positive and negative examples. This discussion allows inspection of the effect of postings with or without hooks.

Example 1

The topic of this class was the Myers-Briggs (MBTI) personality type preference and how one's own MBTI personality type preference can affect interpersonal relationships. In the postings prior to this excerpt, students talked about the differences among the personality types, and that thread was fairly extensive - about ten messages. At this point, the moderator (i.e., Amy) posted a message that not only strongly weaved what was discussed in the previous messages but also provided a hook for future messages.

678 Mon, Oct 27 9:41pm -- Amy
so we agree that there were no differences in wanting to be good and fun people the differences are in how we go about this so what do you see are the implications of these differences?
(Transcript #5, October 27, Group 4)

However, providing a hook did not always activate discussion on the topic. After the message was posted, it was perhaps not processed well or interpreted properly, and the topic "implications of these differences" did not get developed. One possible reason was that these students might have had difficulties processing this question. Therefore, the moderator used an example to interpret the question, shown in message #693.

693 Mon, Oct 27 9:44pm -- Amy
what if your parents are big time organized people and your style is to go with the flow what are the implications of these preferences for you
(Transcript #5, October 27, Group 4)

After this particular posting, the discussion developed, but not as much as might be expected, because the discussion was drawing to an end and, not surprisingly, students could not stay well focused. It is also helpful to review moderating postings without hooks, or postings with hooks that had only one end - postings that either only solicited without providing context or related materials, or only summed up previous messages. What effects did such postings produce?
Example 2

207 Mon, Nov 3 8:11pm -- Jodi
Marie: #197 What would you need to get the same feeling in a f2f class? Renee: What is the meaning of your message-4?

215 Mon, Nov 3 8:12pm -- Jodi
Renee: Can you articulate more?

219 Mon, Nov 3 8:13pm -- Jodi
Arlene: #216 Why do you think that is?
(Transcript #7, November 3, Group 1)

The topic of this discussion was students' feelings about the absence of moderators. In the postings prior to this excerpt, students talked about their feelings. The moderator in this group posted messages without hooks, with hooks that were very flat and weak, or with hooks that had only one end and functioned merely as "soliciting without providing context and related materials." Furthermore, using serial numbers of postings as a reference did not work well, because the flow of messages was so quick that it was not convenient or practical for students to scroll back and forth to address a moderator's question. The effects of these postings were not obvious: post #207 was not addressed at all, post #215 also was not addressed, and post #219 was picked up but without deep reflection. Here are more examples of hooks with only one end.

Example 3

283 Mon, Nov 3 8:27pm -- Lindsey (Life is good)
Joyce: that is a great observation... "So I think people have underestimated their skills. I believe this group would average a 4 in most of those questions"
(Transcript #7, November 3, Group 1)

Moderator Lindsey summarized without suggesting next steps. Postings like this were "flat" - they did not weave with other postings or provoke further discussion - and consequently they activated no further responses.

Finally, we can observe how moderator Amy strongly weaved and wrapped up to finish her class with a pleasant conclusion in which she praised the participants.

Example 4

288 Mon, Oct 20 10:05pm -- Amy
ok i'm aware of the time just want to say how impressed i am again with this group we did a bunch of totally new and bewildering activities used the whiteboard, filled in questionnaires, without java and so on and you were all troopers i feel so proud for all of you and i want to thank you for being so open and accepting, as i remind you that we are all learning here, as we keep pushing that envelope i bow to each of you
(Transcript #5, October 20, Group 4)

6.5.2 Modeling and Tele-mentoring

As a relatively new learning method, online collaboration is itself a learning process that needs scaffolding from capable experts, both to smooth the process and to guide the content learning toward effective online collaborative learning (Zhang, 2004, p. 16). Instructors are expected to provide support in the collaborative learning process by motivating students, monitoring and regulating performance, providing reflections, modeling, moderation, and scaffolding (Brandon & Hollingshead, 1999; Brown & Palincsar, 1989; Jonassen, 1999; Zhang, 2004). Vygotsky proposed that learning occurs in social activities (Driscoll, 1994; Vygotsky, 1978) and that complex, higher-order thinking gradually develops through social interactions with others in the culture (Gredler, 1997; Vygotsky, 1978). According to socio-cultural theorists, people learn from mediations and scaffolding, which are offered within one's zone of proximal development (ZPD) by experts or more capable peers (Bonk & Cunningham, 1998; Gredler, 1997; Wertsch, 1985).
Vygotsky defined the ZPD as the distance between a person's independent competency and that obtained with assistance from an expert or in collaboration with more capable peers (Wertsch, 1985). Such a distance can be bridged and extended through scaffolding efforts, as external assistance is gradually reduced and the learner finally achieves independent competency in the task (Gredler, 1997).

In this particular study, there were various degrees of effectiveness in performing moderating functions such as recognition and prompting. The mere performance of recognition and prompting, without involving the real substance of the subject matter, did not always generate positive effects (i.e., increased participation and interaction). As Xin (2002) observed, just being a cheerleader is not enough. It sometimes worked at the beginning of a seminar; however, the effect diminished quickly if there was no real intellectual substance combined with the cheering and soliciting. When the performance of moderating functions was coupled with deep engagement with real issues related to the topic, participants were drawn into the discourse.

Example 5

132 Mon, Sep 29 9:30pm -- Amy
i like the fact that it is an 'i statement' it describes without evaluating or judging what i observed ie. you are driving at 150 miles/hour and not you are driving like a maniac then, i get to say what i feel that's not negotiable if i say i feel scared, no one can tell me i don't, or shouldn't then i like the part where i get to elaborate on my reasons, though this part is not always necessary finally, i like the part where i can tell you what i need i sure did not like it at first criticized it, refused to use it consistently till my friend said ah, i see... you really don't wish to be heard, right?
(Transcript #4, September 29, Group 4)

In message #123, Amy had posted new questions in order to bring the discussion to a deeper level (these questions were not included in the original agenda; Amy raised them according to the situation, as some students felt frustrated when beginning to discuss the formula). After most group members responded to the questions, Amy posted her own way of looking at the formula, using personal experience and reasoning, in message #132. She was demonstrating and modeling, perhaps within the zones of proximal development of some of the individuals.

Example 6

187 Mon, Sep 29 9:50pm -- Philippe
mom, i am frustrated that we seem to miscommunicate as to what you need me to do to help out with dad. i feel like there is more that i can do, but i feel that you do not communicate this to me clearly. i would like to do what i can, but i need you to help me to understand what this is.

193 Mon, Sep 29 9:54pm -- Amy
Philippe: notice the 'you statement' you are making how may you change that, i.e. mom, when we discuss the type of help you need from me, i feel frustrated because i am not clear as to what you think i could be doing and i need you to be clear about what you think and say?
(Transcript #4, September 29, Group 4)

Students were asked to put forward a formulation based on the formula given. Group member Philippe did so in message #187. Moderator Amy gave concrete suggestions to individuals through modeling in message #193. The following is a similar example.

Example 7

349 Mon, Sep 22 8:44pm -- Philippe
Cheryl: no way, i don't think you come across as a pessimist.
There's soooooooooooooo much to take in, so much going on, and your picture reflects that

363 Mon, Sep 22 8:48pm -- Amy
Philippe: what seems to be missing in this environment are the eye balls we all imagine are out there judging us of course, those eyeballs rarely bother, being too busy worrying about their eye balls but face2face, we imagine people see exactly what we wish to hide here, there is a sense of perceived anonymity and safety no eyeballs you're at home have more time to think here also...
(Transcript #4, September 29, Group 4)

Message #363, posted by moderator Amy, was intended to answer the message above - message #349 - and a few other messages in which Philippe and other group members discussed how people tend to behave on the Internet but could not clearly articulate their reasoning. Amy clarified what the students wanted to say but were apparently unable to articulate. In this sense, students' ZPDs were bridged. Based on this scenario, it appears that to moderate well, one needs not only effective scaffolding skills but also sufficient knowledge of the area and the ability to offer reflective comments and critical thinking or analyses.

6.5.3 Confronting and Conflicting

Social cognitive conflict theory (Clement & Nastasi, 1988; Piaget, 1977) provides insights into how online discussion can serve as a valuable contribution to learning. The underlying assumption of this theory is that knowledge is motivated, organized, and communicated in the context of social interaction. Doise and Mugny (1984) argued that when individuals operate on each other's reasoning, they become aware of contradictions between their logic and that of their partner. The struggle to resolve these contradictions propels them to new and higher levels of understanding. Research by Bearison (1982) as well as Perret-Clermont, Perret, and Bell (1989) supports the assertion that the conflict embedded in a social situation may be more significant in facilitating cognitive development than the conflict of the individual focusing alone (Rourke & Anderson, 2002). In Rourke and Anderson's (2002) study, interviewed students claimed that the additional perspectives offered by others in the form of opinions, personal experiences, and analogies added to their understanding of the content and made it more concrete. Contradictory perspectives disturb learners' initial impressions of the content and prompt them to process it more thoroughly. This latter process, however, can only be precipitated by challenging and critical interactions. As Brown (1989) notes, "change does not occur when pseudo-consensus, conciliation, or juxtaposed centrations are tolerated" (p. 409). There is little argument that learning may be defined as the progressive modification of ideas and behaviors through interpersonal interaction.

There were times in this study when students became frustrated and complained. Is it better for the instructor moderator to confront these reactions or to ignore or avoid them? Moderator Amy's practices provided some insight into this question.

Example 8

170 Mon, Nov 3 7:59pm -- Olga
Rose: are they doing it again? This class is slow I'm starting to get annoyed... I'm only on 3hrs of sleep for 2 nights now...

176 Mon, Nov 3 8:00pm -- Amy
hmmm a trick? well it was more like providing you with an experience of possible discomfort the main risk is that you would get pissed at us, but hey, we were willing to live with that so if there was a trick, excuse me, but it is on us?
190 Mon, Nov 3 8:02pm -- Amy
Ofelia: i would love to explain i dont know which part you are not understanding though.
(Transcript #7, November 3, Group 4)

As an experiment, the moderators did not arrive on time in order to see how students would react. Later, when the truth was revealed, some students complained, saying it was a trick and they did not like it. Moderator Amy reacted by confronting the complaints. The effect was that students reached understanding (or were pacified) and the discussion returned to task-oriented issues. In other groups, complaints about being tricked were not addressed by the moderators, resulting in stifled or digressive discussions.

Example 9

148 Mon, Oct 20 9:33pm -- Philippe
i think this was kind of a dumb assignment. i mean, all the questions were basically just different ways of re-wording the same question, and i'm just not convinced that the results are very meaningful

158 Mon, Oct 20 9:34pm -- Amy
Philippe: i'm not a fan of questionnaires myself yet this one is actually a very good one, in as much as it has very high internal validity and is used in many selection processes both in academy and in corporations go figure. i would suggest we get past what we don't like though and look at what is useful about this whole issue of learning and learning style.
(Transcript #5, October 20, Group 4)

Here is another example of a student complaint. One student complained about the assignment in message #148, calling it "dumb" and not meaningful. In posting #158, the moderator handled the complaint by voicing her opinion and suggesting a more positive reaction: to find what was useful about the whole experience.

It is extremely interesting that some active individuals defended and debated quite different and conflicting ideas. They noted where their viewpoints differed from their peers' as well as from the moderator's; in fact, there were also occasions where they agreed to disagree. As they assumed or appropriated roles that the moderator modeled, they began to share the role of the moderator.

Example 10

501 Mon, Sep 22 9:51pm -- Gabriel
Brandie, i would tend to think in the ways of "well he got what he deserved" which might not be the RIGHT thing to do.

507 Mon, Sep 22 9:52pm -- Gabriel
Tracy: that doesn't sound too healthy. Don't you think that sometimes if you consciously behave the way you do, people will start to think that you're getting annoying?

512 Mon, Sep 22 9:53pm -- Samantha
Tracy: I voice my opinion a lot too, but you have to know when to keep it closed sometimes ....... it CAN get you in trouble...

526 Mon, Sep 22 9:56pm -- Gabriel
But hold on, all this THEORY is nice and dandy but is this the way the world really works? I would think not. I would think the world works with 'survival of the fittest in mind'. Those who can empower others and order others around always seem to win?

529 Mon, Sep 22 9:56pm -- Evangelos
Brandie makes a good point. If you agree with the 2 people (in a cheating situation) it probably wouldn't bother most people as much. I, personally, feel that no one should be belittled even when they do something like cheating

535 Mon, Sep 22 9:58pm -- Samantha
Tracy: personal situations, personal reactions....doesn't mean either of us is wrong ...... (just saying!)

545 Mon, Sep 22 10:00pm -- Gabriel
Myrna: Yes. I'm sure it doesn't ALWAYS work that way, but the world is a competitive Arena first, a democratic society second.
553 Mon, Sep 22 10:02pm -- Brandie
Myrna: No i don't think it sounds selfish to respect yourself...hmmmm...but to put priorities in me before others does sound selfish..

555 Mon, Sep 22 10:03pm -- Gabriel
But is simply being AWARE only a way to excuse your cowardice and non-action?
(Transcript #3, September 22, Group 2)

6.5.4 Setting up Norms

As the focus changes from "teaching" to active "learning," the instructor must take substantial responsibility for fostering a learner-centered, peer collaborative learning environment. Group dynamics contribute to students' performance in collaborative learning and to their satisfaction with the learning experience (Bosworth & Hamilton, 1994). Some participants' "free riding," "social loafing," and failure to contribute, however, can damage others' enthusiasm and motivation in the course of collaborative learning. In addition, the feeling of "talking in a vacuum" in online collaboration, frustrations with technology, and other factors make online collaboration a challenge for many participants (Flannery, 1994; Zhang, 2004). What did expert moderators do to activate the participation of all group members? Here is one example.

Example 11

467 Mon, Sep 22 9:37pm -- Amy
be fun to count all the languages between us another thing that would be good, for the rest of the semester, if we all agreed to some protocol like for example when it comes to taking turns, how about we use the room menu? whomever is first there, goes first and so on that way, the Johari window of the group would enlarge some we will all know that this is how we do an activity i need feedback does this make sense?
(Transcript #3, September 22, Group 4)

Here, at the beginning of the second part of the conference, moderator Amy was setting up norms for the discussion. She proposed that people take turns. Apparently, students did not at first understand her directions. She stopped some off-task discussion in message #474, posted the main discussion topic in message #491, and then clarified it in message #492. After Amy set up the norms and gave clear direction and guidance, the discussion did not apparently need as much prodding but nevertheless continued in an active and lively manner.

6.5.5 Social-emotional Elements

"Conversation, fastidious Goddess, loves blood better than brick, and feasts most subtly on the human will." (Virginia Woolf, Mrs. Dalloway)

In the virtual environment, as in the face-to-face environment, students naturally showed affective reactions - interest, boredom, happiness, sadness, and anxiety (Fredricks et al., 2004). The social dimension is a crucial factor in determining the "climate" of conferences, that is, the willingness of people to contribute and engage seriously, and hence the effectiveness of the discussion (Keynes, 2003). One of the major moderating efforts was to motivate the group in the online collaboration process by showing warmth, care, and encouragement. In addition to constantly checking on task progress, the instructor also needed to provide motivational moderation by recognizing individuals who showed active collaboration while at the same time encouraging those who were inert or less active toward more active participation.

Example 12

366 Mon, Sep 22 8:50pm -- Amy
Ofelia: smile yes i earn my living with such things John: lol good job! Fiona are you here?
(Transcript #3, September 22, Group 4)

Example 13

495 Mon, Sep 22 9:41pm -- Amy
Rose: oh dear you are tired we just had 10 minutes or so hugging you so lets go
(Transcript #3, September 22, Group 4)

It is difficult for the quantitative analyses to find significant effects of teacher moderating levels on student social-emotional engagement because of the various limitations of the measures. However, it is useful and informative to observe the efforts that moderators made to facilitate student Social-emotional Engagement. The above are only two of several pertinent examples.

Summary

Using the quantitative analysis results as a guide, the researcher identified transcripts and sections of transcripts for qualitative analysis. Putting the transcripts and sections of interest in both their broader and immediate context, the descriptive discourse analyses provided a general picture of the interactive process of synchronous online discussion through the analysis of an entire transcript and the analysis of sections of transcripts. The themes emerging from the qualitative analysis, together with the supporting theories and practices, showed the manner in which instructors manage the ebb and flow of synchronous discussion and how this affects student engagement.

CHAPTER 7
GENERAL DISCUSSION: ACCOMPLISHMENTS AND FUTURE STUDIES

7.1 Review of Goals and Summary of Accomplishments

Extensive attention has been paid to the use of interactive online technologies in higher education. This particular study looked at just one of them: synchronous conferencing tools. The primary goal of this study was to address three key issues in synchronous online conferencing: (1) to measure the different aspects of learner engagement with the substantive processes of engaged collaborative discourse; (2) to understand what factors contribute to student learning (particularly, what role teacher moderators play in enhancing student intellectual engagement through engaged collaborative discourse); and (3) to better understand the nature and dynamics of moderated synchronous group discussion. The realization of the three goals is summarized below.

Measurement is a key aspect of scientific research. Through a broad literature review and pilot studies, this study fulfilled the first goal: it developed new constructs of student Engagement and its three sub-constructs: (1) Behavioral Engagement; (2) Social-Emotional Engagement; and (3) Intellectual Engagement. It also designed, modified, and adapted various measurement methods and instruments and verified them by means of empirical data from a synchronous computer conferencing learning environment. The overall research assumption was that if student engagement can be adequately measured and interpreted, then it is possible to focus on the factors that affect engagement.

This research fulfilled the second goal of the study - to form a model of learning through engaged collaborative discourse in computer conferencing. By building such a model, the relationships between teacher moderating behaviors and student engagement, and the relationships among the aspects of student engagement, were disentangled and clarified. The model revealed that student Intellectual Engagement is influenced by a comprehensive factor: the product of the number of teacher postings, the quality of teacher moderating levels, and student participation. This factor is an index of teacher-student participation and of the importance of participation.
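In hedged form, and using symbols introduced here only for illustration (they are not the dissertation's own notation), the comprehensive factor and the linear relationship it enters into can be sketched as follows:

```latex
% Hypothetical notation for the comprehensive factor described above:
%   T = number of teacher postings in a conference
%   R = rating of teacher moderating levels (Appendix A rubric)
%   P = student participation
%   E = student Intellectual Engagement (Higher-order Thinking and Interactivity)
\[
  \mathrm{TRP} = T \times R \times P
\]
\[
  E \approx \beta_0 + \beta_1\,\mathrm{TRP}, \qquad \beta_1 > 0
\]
```

The coefficients are placeholders and are not reported in this summary; the sketch only makes explicit that the three measures enter the model as a single multiplicative index rather than as separate additive terms.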
Within the limitations of this study, it was found that the more actively moderators posted, the higher the quality of teacher moderating. Equally important, the more active the student participation in a synchronous online learning conference, the more elevated the level of Higher-order Thinking and Interactivity; i.e., the more students were intellectually engaged.

The third goal of the study was to better understand the nature and dynamics of moderated synchronous group discussion as it relates to individual cognition and group interaction. While such a goal is hard to achieve, it lies at the heart of student learning (Cazden, 2001). By combining a naturalistic inquiry with a descriptive discourse analysis of the transcripts, the study provided a unified picture of the interactional process of synchronous online discussion through the analysis of an entire transcript. The themes emerging from the qualitative analysis, together with the supporting theories and practices, uncovered the underlying processes of synchronous computer conferencing in relation to online moderating. Using a mixed method approach, this study developed innovative methodological and interpretive frameworks that bridged the gap between quantitative and qualitative methodologies in the data analysis of computer-mediated communication.

These methodologies and findings may contribute to a better understanding of how teachers can provide effective online mentoring and scaffolding to facilitate student engagement with each other and with the subject matter (Bonk et al., in press). They may also contribute to a better understanding of whether and how a community of inquiry develops by means of synchronous computer conferencing in which students are most likely to become invested behaviorally, social-emotionally, and intellectually. Findings from this research should inform research and practice on the larger goal of improving the quality of online teaching and learning.

7.2 Limitations of the Study

The study was conducted in a specific context: a synchronous, online, three-credit university-level course entitled "Interpersonal Communications and Relationships," delivered through a tool called LearningByDoing, a real-time, interactive, HTML-formatted text, image, and animated messaging system. Within the limitations of its context, the study revealed that the product of the number of teacher postings, the quality/rating of teacher moderating levels, and student participation comprehensively influenced student intellectual engagement, and that this relationship is a linear one. The study also revealed an optimal level of Attending - the frequency with which students "listen to" or read the group postings - for student intellectual engagement (this curvilinear relationship is sketched in equation form below). The hypothesis that there may exist an optimal level of teacher moderating, however, was not verified by the data. Future studies can investigate further both what factors are involved here and how such factors jointly influence student intellectual engagement. Is there an optimal level of teacher moderation frequency? How does it relate to the optimal level of listening/reading (Attending) by students? What factors bear most directly on active participation?

The online conferences examined in this study were structured conferences moderated by instructors. The course had its own unique subject matter, tasks, and structure. Differences in any of these aspects might generate different needs for moderation (Zhang & Ge, 2003).
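An "optimal level of Attending" implies a curvilinear rather than linear relationship, consistent with the quadratic results reported for hypotheses Ho 3d.3 and Ho 3d.7 in Appendix E. A minimal sketch, again in hypothetical notation rather than the dissertation's own:

```latex
% A = Attending (frequency of reading the group postings)
% H = Higher-order Thinking (Interactivity can be treated analogously)
\[
  H \approx \alpha_0 + \alpha_1 A + \alpha_2 A^2, \qquad \alpha_2 < 0
\]
% With a negative quadratic term, intellectual engagement peaks at
\[
  A^{*} = -\frac{\alpha_1}{2\,\alpha_2}
\]
```

The coefficients are not given in this chapter; the point of the sketch is only that "optimal Attending" corresponds to the maximum of an inverted-U curve.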
According to Zhang and Ge (2003), different tasks may generate different needs for moderation, and the effects of the same approach on other types of tasks may vary. Investigating the relative effects of moderating approaches on online collaboration in other content areas or disciplines (such as sociology, telecommunications, business, informatics, and science) could add to the understanding of teacher moderation within synchronous conferencing.

The study made significant efforts to adapt instruments that can measure both manifest and latent variables and to address, to some extent, the validity and reliability issues of those measures. However, improvements or advancements in synchronous coding schemes are still needed. The study also made significant advances in developing and combining important constructs such as social-emotional engagement, higher-order thinking, and Interactivity. In addition, it elucidated new rationales, methods, and formulae for such combinations. These methods and formulae might be deemed pioneering in the synchronous conferencing research arena while serving as a springboard for further research in this area.

As this study was primarily interested in group performance, it did not investigate individual learning performance. However, the group performances may not necessarily reflect the possible effects of teacher moderating levels on individual engagement and individual performance. To understand the complicated nature of the relationships between teacher moderating levels and student engagement, research needs to look at all aspects of online collaborative learning simultaneously: the individuals, the group, the team task, and the delivery media (Zhang & Ge, 2003).

7.3 Future Studies

If synchronous conferencing begins to impact teaching and learning at even one-tenth the degree to which asynchronous conferencing has reshaped higher education courses during the past decade, there will be a tremendous need to understand student engagement and participation, and teacher facilitation and moderation, in such environments. Already, numerous indications from corporate training suggest that synchronous forms of learning can play a significant role in adult learning. There are also many recent research results from the social presence and online learning community literature indicating that online students in higher education want and expect more direct and timely interactions with instructors and other students. As they begin to demand more synchronous opportunities, research such as the present study can better inform how, when, and where to embed real-time virtual learning experiences.

This study is only one look at online synchronous moderation. It provides a humble starting point for future empirical studies. To understand the dynamics of synchronous online conferencing, research must consider all aspects of online collaborative learning simultaneously. In view of these facts, some suggestions and recommendations are relevant.

The current study was a quasi-experimental research project. The assignment of group membership and moderators used some randomization. In theory, a true randomization would have involved randomly assigning individuals to controlled or pre-selected moderating conditions. Future studies might attempt to control teacher moderating levels to examine the effects of moderating on student engagement.
Future studies might also observe students as they progress through a second or third course with this tool, i.e., conduct a longitudinal study. The primary data used for this study were automatically archived transcripts. Future studies can collect richer data - such as surveys, interviews, focus groups, and course products - to help build a deeper understanding of the issues and problems underlying synchronous online learning. It might also be possible to have students retrospectively reflect on their chat transcripts or watch and comment on a replay of their synchronous chat sessions. Instructors, too, might be involved in such retrospective analyses.

This study was based on one kind of technology - a synchronous conferencing tool that has its own unique features, options, and limitations. There is an enormous variety of conferencing tools, both asynchronous and synchronous. Even commonly used and debated synchronous tools such as Breeze, CCCConfer, Centra, HorizonWimba, Interwise, LiveMeeting, Macromedia, NetMeeting, and WebEx may provide learning environments with vastly different affordances and constraints. The study was also based on one level of technology application: it occurred totally online, without any face-to-face meetings. Future studies may investigate discussions in varied online settings, for instance synchronous, asynchronous, and blended environments. Studies of blended learning may add to the understanding of synchronous learning and of online teaching and learning in general.

This study linked both the processes and the educational objectives of computer conferencing to student behavioral engagement, social-emotional engagement, and intellectual engagement. As such, it fills a significant gap in the synchronous conferencing literature. Eventually, research in this area can extend to online training programs and curricula. The results of the study may help researchers and practitioners develop better protocols for moderating online discussions. Such knowledge is essential if online learning (particularly synchronous conferencing) is to achieve its full potential.

APPENDICES

APPENDIX A
RUBRIC FOR MEASURING TEACHER MODERATING LEVELS

Level 1: Open discussion; set conference norms; and observe conference norms.

Level 2: In addition to performing Level 1 functions: respond to prompts; acknowledge other contributions with simple confirmation; solicit contributions without providing context and related materials; refer to materials without explanation; and/or make easy meta comments.

Level 3: In addition to performing Level 2 functions: recognize other contributions with elaboration; solicit with context and related materials provided; refer to materials with explanation; raise new relevant topics; and/or perform significant meta commenting.

Level 4: In addition to performing Level 3 functions: weave in light forms; assess; and/or delegate.

Level 5: In addition to performing Level 4 functions: weave in strong forms.

(Xin, 2002)
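The rubric above is applied by human coders. As a purely hypothetical sketch (the data structure, function names, and aggregation rule below are invented for illustration and are not the dissertation's procedure), per-message codes could then be tallied into the two teacher-side variables used in the quantitative model: the number of teacher postings and a rating of teacher moderating levels.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class TeacherMessage:
    """One teacher posting, hand-coded 1-5 against the Appendix A rubric."""
    conference_id: str
    moderating_level: int  # 1 = open/observe norms ... 5 = weave in strong forms

def teacher_variables(messages: list, conference_id: str) -> dict:
    """Tally one conference's teacher postings into two model variables:
    the count of teacher postings and a rating of moderating level.
    The rating here is the mean coded level, an illustrative choice only."""
    coded = [m.moderating_level for m in messages if m.conference_id == conference_id]
    if not coded:
        return {"teacher_postings": 0, "moderating_rating": 0.0}
    return {"teacher_postings": len(coded), "moderating_rating": mean(coded)}

# Example: three hypothetically coded postings from one weekly conference.
sample = [
    TeacherMessage("group4_week3", 2),  # simple acknowledgement
    TeacherMessage("group4_week3", 4),  # weaving in light form
    TeacherMessage("group4_week3", 5),  # strong weave that wraps up a thread
]
print(teacher_variables(sample, "group4_week3"))
# {'teacher_postings': 3, 'moderating_rating': 3.6666666666666665}
```

Taking the mean of the coded levels is only one of several defensible aggregation choices; a maximum or a modal weekly level would serve the same illustrative purpose.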
APPENDIX B
RUBRIC FOR MEASURING SOCIAL-EMOTIONAL ENGAGEMENT

Category: Warmth
- Expression of emotions: Conventional or unconventional expressions of emotion, including repetitious punctuation, conspicuous capitalization, and emoticons.
- Use of humor: Teasing, cajoling, irony, understatements, sarcasm.
- Showing personal care: Expressing care for an individual's feelings and for aspects of life outside of class.
- Expression of compliment, appreciation, encouragement, and agreement: Complimenting others or the contents of others' messages; showing appreciation, encouragement, and agreement.
- Self-disclosure: Presenting details of life outside of class, or expressing vulnerability.

Category: Cohesion
- Vocatives: Addressing or referring to participants by name.
- Addressing or referring to the group using inclusive pronouns: Addressing the group as "we," "us," "our," and "group."
- Phatics, salutations: Communication that serves a purely social function; greetings, closures.

Category: Interactive (SE-Interactive)
- Continuing a thread: Continuing with or replying to one or multiple previous messages rather than starting a new thread.

(Adapted from Rourke, Anderson, Garrison, & Archer, 1999)

APPENDIX C
RUBRIC FOR MEASURING HIGHER-ORDER THINKING

Phase I - Initiation (Problem Initiation and Brainstorming)
- Indicators: Sense of puzzlement; recognizing the problem.
- Socio-cognitive processes: Asking questions; messages that take discussion in a new direction; presenting background information that culminates in a question.

Phase II - Negotiation (Problems Investigation and Meaning Co-construction)
- Divergence within the online community: Unsubstantiated contradiction of previous ideas.
- Divergence within a single message: Many different ideas/themes presented in one message.
- Information exchange: Personal narratives/descriptions/facts (not used as evidence to support a conclusion).
- Suggestions for consideration: Author explicitly characterizes message as exploration, e.g., "does that seem about right?" "Am I way off the mark?"
- Brainstorming: Adds to established points but does not systematically defend/justify/develop the addition.
- Leaps to conclusion: Offers unsupported opinions.

Phase III - Integration (Problem Solving and Idea Integration)
- Convergence among group members: Reference to a previous message followed by sustained agreement, e.g., "I agree because..."
- Convergence within a single message: Justified, developed, defensible, yet tentative hypotheses.
- Connecting ideas, synthesis: Integrating information from various sources - textbook, articles, personal experience.
- Creating solutions: Explicit characterization of a message as a solution by the participant.

(Adapted from Xin, 2002, and Garrison et al., 2001)

APPENDIX D
RUBRIC FOR MEASURING INTERACTIVITY

- Initiation with a question: A question that is unrelated to a prior post.
- Declarative: An initiation of a new idea; a new line of thought.
- Reactive: A response by one poster to a declarative message of another poster.
- Interactive: Any additional follow-up to an interactive message.

(Adapted from Rafaeli and Sudweeks, 1996, and Sarlin, Geisler, and Swan, 2003)

APPENDIX E
SUMMARY OF RESEARCH FINDINGS FOR NULL HYPOTHESES

Ho 1.1: The number of Teacher Postings does not change over weeks. Finding: Rejected.
Ho 1.2: The Rating of Teacher Moderating Levels does not change over weeks. Finding: Failure to reject (p = .095).
Ho 1.3: Attending does not change over weeks. Finding: Rejected.
Ho 1.4: Participation does not change over weeks. Finding: Rejected.
Ho 1.5: Social-emotional Engagement does not change over weeks. Finding: Failure to reject (p = .096).
Ho 1.6: Higher-order Thinking does not change over weeks. Finding: Rejected.
Ho 1.7: Interactivity does not change over weeks. Finding: Rejected.
Ho 2.1: The number of Teacher Postings does not change across groups. Finding: Failure to reject (p = .27).
Ho 2.2: The Rating of Teacher Moderating Levels does not change across groups. Finding: Rejected.
Ho 2.3: Attending does not change across groups. Finding: Failure to reject (p = .47).
Ho 2.4: Participation does not change across groups. Finding: Failure to reject (p = .16).
Ho 2.5: Social-emotional Engagement does not change across groups. Finding: Rejected.
Ho 2.6: Higher-order Thinking does not change across groups. Finding: Failure to reject (p = .40).
Ho 2.7: Interactivity does not change across groups. Finding: Failure to reject (p = .59).
Ho 3a.1: There is no relationship between the Number of Teacher Postings and Social-emotional Engagement. Finding: Failure to reject (p = .81).
Ho 3a.2: There is no relationship between the Rating of Teacher Moderating Levels and Social-emotional Engagement. Finding: Failure to reject (p = 1.0).
Ho 3a.3: There is no relationship between Attending and Social-emotional Engagement. Finding: Rejected.
Ho 3a.4: There is no relationship between Social-emotional Engagement and Participation. Finding: Rejected.
Ho 3a.5: There is no relationship between Social-emotional Engagement and Higher-order Thinking. Finding: Failure to reject (p = .39).
Ho 3a.6: There is no relationship between Social-emotional Engagement and Interactivity. Finding: Failure to reject (p = .73).

SUMMARY OF RESEARCH FINDINGS FOR NULL HYPOTHESES (cont'd)

Ho 3b: There is no relationship between Attending and Participation. Finding: Rejected.
Ho 3c.1: There is no relationship between the Number of Teacher Postings and Attending. Finding: Rejected.
Ho 3c.2: There is no relationship between the Rating of Teacher Moderating Levels and Attending. Finding: Failure to reject (p = .81).
Ho 3c.3: There is no relationship between the Number of Teacher Postings and Participation. Finding: Rejected.
Ho 3c.4: There is no relationship between the Rating of Teacher Moderating Levels and Participation. Finding: Failure to reject (p = .52).
Ho 3d.1: There is no relationship between the Number of Teacher Postings and Higher-order Thinking. Finding: Rejected.
Ho 3d.2: There is no relationship between the Rating of Teacher Moderating Levels and Higher-order Thinking. Finding: Rejected.
Ho 3d.3: There is no relationship between Attending and Higher-order Thinking. Finding: Rejected (quadratic).
Ho 3d.4: There is no relationship between Participation and Higher-order Thinking. Finding: Rejected.
Ho 3d.5: There is no relationship between the Number of Teacher Postings and Interactivity. Finding: Rejected.
Ho 3d.6: There is no relationship between the Rating of Teacher Moderating Levels and Interactivity. Finding: Rejected.
Ho 3d.7: There is no relationship between Attending and Interactivity. Finding: Rejected (quadratic).
Ho 3d.8: There is no relationship between Participation and Interactivity. Finding: Rejected.
Ho 3d.9: There is no relationship between the Comprehensive factor TRP and Higher-order Thinking. Finding: Rejected.
Ho 3d.10: There is no relationship between the Comprehensive factor TRP and Interactivity. Finding: Rejected.

BIBLIOGRAPHY

Anderson, T. (2005). Teaching in an online learning context. In T. Anderson & F. Elloumi (Eds.) (2004), Theory and practice of online learning. Athabasca, CA: Athabasca University.
Anderson, T., & Garrison, D. R. (1995). Critical thinking in distance education: Developing critical communities in an audio teleconference context. Higher Education, 29, 183-199.
Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5(2).
Angeli, C., Bonk, C., & Hara, N. (2000). Content analysis of online discussion in an applied educational psychology course. Instructional Science, 28(2), 115-152. Retrieved February 12, 2005 from http://crlt.indiana.edu/publications/techreport.pdf
Bloom, B. S. (Ed.)
(1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York; Toronto: Longmans, Green.
Bloxom, M., Caul, W., Fristoe, M., & Thomson, W. (1975). On the use of student led discussion groups. Educational Forum, 39, 223-230.
Bonk, C. J., & Cunningham, D. J. (1998). Searching for learner-centered, constructivist, and sociocultural components of collaborative educational learning tools. In C. J. Bonk & K. S. King (Eds.), Electronic collaborators: Learner-centered technologies for literacy, apprenticeship, and discourse (pp. 25-50). Mahwah, NJ: Erlbaum.
Bonk, C. J., & King, K. S. (1998). Computer conferencing and collaborative writing tools: Starting a dialogue about student dialogue. In C. J. Bonk & K. S. King (Eds.), Electronic collaborators: Learner-centered technologies for literacy, apprenticeship, and discourse (pp. 3-23). Mahwah, NJ: Lawrence Erlbaum Associates.
Bonk, C. J., Wisher, R. A., & Nigrelli, M. L. (in press). Learning communities, communities of practice: Principles, technologies, and examples. To appear in K. Littleton, D. Faulkner, & D. Miell (Eds.), Learning to collaborate, collaborating to learn. NOVA Science.
Bonk, C. J., & Wisher, R. A. (2000). Applying collaborative and e-learning tools to military distance learning: A research framework (Technical Report 1107). Alexandria, VA: U.S. Army Research Institute for the Behavioral and Social Sciences.
Borg, W., & Gall, M. (1989). The methods and tools of observational research. In W. Borg & M. Gall (Eds.), Educational research: An introduction (5th ed., pp. 473-530). London: Longman.
Bosworth, K., & Hamilton, S. J. (Eds.) (1994). Collaborative learning: Underlying processes and effective techniques. Jossey-Bass.
Brookfield, S. A., & Preskill, S. (1999). Discussion as a way of teaching: Tools and techniques for democratic classrooms. San Francisco: Jossey-Bass.
Brown, A., & Palincsar, A. (1989). Guided cooperative learning and individual knowledge acquisition. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 393-451). Hillsdale, NJ: Erlbaum.
Bruffee, K. A. (1992). Collaborative learning and the "conversation of mankind." In A. S. Goodsell, M. R. Maher, & V. Tinto (Eds.), Collaborative learning: A sourcebook for higher education. University Park, PA: National Center on Postsecondary Teaching, Learning, and Assessment.
Bruffee, K. A. (1993). Education as conversation. In Collaborative learning: Higher education, interdependence, and authority of knowledge. Baltimore: Johns Hopkins University Press.
Bruffee, K. A. (1999). Collaborative learning: Higher education, interdependence, and the authority of knowledge (2nd ed.). Baltimore: Johns Hopkins University Press.
Bruner, J. (1960). The process of education. Cambridge, MA: Harvard University Press.
Burbules, N. C. (1993). Dialogue in teaching: Theory and practice. New York: Teachers College, Columbia University.
Bussmann, H. (1998). Phatic communication. In G. Trauth, K. Kazzazi & K. Kazzazi (Eds.), Routledge dictionary of language and linguistics. London: Routledge.
Carroll, J. (1963). A model of school learning. Teachers College Record, 64, 723-733.
Cazden, C. B. (2001). Classroom discourse: The language of teaching and learning. Portsmouth, NH: Heinemann.
Christenson, L., & Menzel, K. (1998). The linear relationship between student reports of teacher immediacy behaviors and perceptions of state motivation, and of cognitive, affective, and behavioral learning.
Communication Education, 47, 82-90.
Clark, H., & Schaefer, E. F. (1989). Contributing to discourse. Cognitive Science, 13, 259-294.
Collins, A., Brown, J. S., & Newman, S. E. (1989). Cognitive apprenticeship: Teaching the craft of reading, writing and mathematics. In L. B. Resnick (Ed.), Knowing, learning and instruction: Essays in honor of Robert Glaser (pp. 453-494). Hillsdale, NJ: Erlbaum.
Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed method approaches. Thousand Oaks: Sage Publications.
Cronbach, L. (1975). Beyond the two disciplines of scientific psychology. American Psychologist (Feb.), 116-127.
Cunningham, D. J., Duffy, T. M., & Knuth, R. (1993). The textbook of the future. In C. McKnight, A. Dillon & J. Richardson (Eds.), Hypertext: A psychological perspective. Ellis Horwood.
Dennen, V. P. (2001). The design and facilitation of asynchronous discussion activities in Web-based courses. Unpublished doctoral dissertation, Indiana University, Bloomington, IN.
Dillenbourg, P. (1999). Introduction: What do you mean by "collaborative learning"? In P. Dillenbourg (Ed.), Collaborative learning: Cognitive and computational approaches (pp. 1-19). UK: Pergamon.
diSessa, A. A. (2001). Changing minds: Computers, learning, and literacy. Cambridge, MA: The MIT Press.
Duffy, T., Dueber, B., & Hawley, C. (1998). Critical thinking in a distributed environment: A pedagogical base for the design of conferencing systems. In C. Bonk & K. King (Eds.), Electronic collaborators: Learner-centered technologies for literacy, apprenticeship, and discourse (pp. 51-78). New Jersey: Lawrence Erlbaum Associates.
Eggins, S., & Slade, D. (1997). Analysing casual conversation. Washington, DC: Cassell.
Garcia, A. C., & Jacobs, J. B. (1999). The eyes of the beholder: Understanding the turn-taking system in quasi-synchronous computer-mediated communication. Research on Language and Social Interaction, 32(4), 337-367.
Garrison, D. R., & Archer, W. (2000). A transactional perspective on teaching and learning: A framework for adult and higher education. Oxford, UK: Pergamon. http://www.atl.ualberta.ca/cmc/publications.html
Garrison, D. R., & Anderson, T. (2003). E-learning in the 21st century: A framework for research and practice. London: Routledge Falmer.
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 1-19. http://www.atl.ualberta.ca/cmc/publications.html
Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7-23. http://www.atl.ualberta.ca/cmc/publications.html
Gunawardena, C. N. (1995). Social presence theory and implications for interaction and collaborative learning in computer conferences. International Journal of Educational Telecommunications, 1(2/3), 147-166.
Gunawardena, C. N., Lowe, C. A., & Anderson, T. (1997). Analysis of a global online debate and the development of an interaction analysis model for examining social construction of knowledge in computer conferencing. Journal of Educational Computing Research, 17, 395-429.
Gunawardena, C. N., & Zittle, F. (1997). Social presence as a predictor of satisfaction within a computer mediated conferencing environment. American Journal of Distance Education, 11(3), 8-25.
Fahy, P. J. (2001). Addressing some common problems in transcript analysis.
International Review of Research in Open and Distance Learning, 1(2). Retrieved April 15, 2005 from http://www.irrodl.org/content/v1.2/research.html#Fahy
Feenberg, A. (1989a). A user's guide to the pragmatics of computer mediated communication. Semiotica, 75(3/4), 257-278.
Feenberg, A. (1989b). The written world. In R. Mason & A. Kaye (Eds.), Mindweave: Communication, computers, and distance education (pp. 22-39). Oxford: Pergamon Press.
Feenberg, A., & Xin, M. C. (2002). A teacher's guide to moderating online discussion forums: From theory to practice. Available: http://www.textweaver.org/modmanual.htm
Firestone, W. A. (1993). Alternative arguments for generalizing from data as applied to qualitative research. Educational Researcher, 22(4), 16-23.
Flannery, J. L. (1994). Teachers as co-conspirator: Knowledge and authority in collaborative learning. In K. Bosworth & S. J. Hamilton (Eds.), Collaborative learning: Underlying processes and effective techniques (pp. 15-23). San Francisco: Jossey-Bass Publishers.
Florio-Ruane, S., with deTar, J. (2001). Teacher education and the cultural imagination: Autobiography, conversation, and narrative. Mahwah, NJ: Erlbaum.
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1).
Hamm, M., & Adams, D. (1992). The collaborative dimensions of learning. Norwood, NJ: Ablex Publishing.
Harasim, L. (1990). Online education: An environment for collaboration and intellectual amplification. In L. M. Harasim (Ed.), Online education: Perspectives on a new environment (pp. 39-63). New York, NY: Praeger Publishers.
Hawisher, G. E. (1992). Electronic meeting of minds: Research, electronic conferences, and composition studies. In G. Hawisher & P. LeBlanc (Eds.), Re-imagining computers and composition: Teaching and research in the virtual age. Portsmouth, NH: Boynton/Cook.
Herring, S. (1999). Interactional coherence in CMC. Journal of Computer-Mediated Communication, 4(4). Retrieved February 12, 2005 from http://www.ascusc.org/jcmc/vol4/issue4/herring.html
Herring, S. C. (2003). Computer-mediated discourse analysis: An approach to researching online behavior. In S. A. Barab, R. Kling, & J. H. Gray (Eds.), Designing for virtual communities in the service of learning. New York: Cambridge University Press. Retrieved April 15, 2005 from http://ella.slis.indiana.edu/~herring/cmda.html
Hillman, D. C. A. (1999). A new method for analyzing patterns of interaction. The American Journal of Distance Education, 13(2), 37-47.
Hiltz, S. R., Coppola, N., Rotter, N., & Turoff, M. (2000). Measuring the importance of collaborative learning for the effectiveness of ALN: A multi-measure, multi-method approach. Journal of Asynchronous Learning Networks, 4(2). Retrieved October 10, 2004 from http://www.alnresearch.org/
Howe, K., & Eisenhart, M. (1990). Standards for qualitative (and quantitative) research: A prolegomenon. Educational Researcher, 19(4), 2-9.
Howell-Richardson, C., & Mellar, H. (1996). A methodology for the analysis of patterns of participation within computer-mediated communication courses. Instructional Science, 24, 47-69.
Kanuka, H., & Anderson, T. (1998). Online social interchange, discourse, and knowledge construction. Journal of Distance Education, 13(1), 57-74.
Kaye, A. (1992). Learning together apart. In A. R. Kaye (Ed.), Collaborative learning through computer conferencing (NATO ASI Series, Vol. 90).
Berlin: Springer-Verlag, pp. 1-24.
Kearsley, G., & Shneiderman, B. (1998). Engagement theory: A framework for technology-based teaching and learning. Educational Technology, September-October 1998, 20-23.
Keynes, M. (2003). Sharpening the focus: Methodological issues in analysing on-line conferences. Technology, Pedagogy and Education, 12(3).
Knolle, J. W. (2002). Identifying the best practices for using Horizonlive to teach in the synchronous online environment. Unpublished doctoral dissertation, California State University, Chico.
Krathwohl, D., Bloom, B., & Masia, B. (1956). Taxonomy of educational objectives. Handbook II: Affective domain. New York: David McKay.
Kremer, J., & McGuinness, C. (1998). Cutting the cord: Student-led discussion groups in higher education. Education and Training, 40(2), 44-49.
Krippendorf, K. (1980). Content analysis: An introduction to its methodology. London: Sage.
Johnson, D., & Johnson, R. (1975). Learning together and alone. Englewood Cliffs, NJ: Prentice-Hall.
John-Steiner, V. (2000). Creative collaboration. New York: Oxford University Press.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge, England: Cambridge University Press.
Lemke, J. L. (1982). Classroom communication of science. Final report to NSF/RISE, April. (ED 222 346).
Li, Q. (2001). Development of the collaborative learning measure in CMC. Journal of Educational Technology Systems, 30(1), 19-41.
Li, Q. (2002). Exploration of collaborative learning and communication in an educational environment using computer-mediated communication. Journal of Research on Technology in Education, 34(4), 503-516.
Lin, X., Hmelo, C., Kinzer, C., & Secules, T. (1999). Designing technology to support reflection. Educational Technology Research and Development, 47(3), 43-62.
Lipman, M. (1991). Thinking in education. Cambridge: Cambridge University Press.
Lobel, M., Neubauer, M., & Swedburg, R. (2002a). The eClassroom used as a teacher's training laboratory to measure the impact of group facilitation on attending, participation, interaction, and involvement. International Review of Research in Open and Distance Learning, October 2002. Retrieved September 23, 2004 from http://www.irrodl.org/content/v3.2/lns.html
Lobel, M., Neubauer, M., & Swedburg, R. (2002b). Elements of group interaction in a real-time synchronous online learning-by-doing classroom without F2F participation. Journal of the United States Distance Learning Association, 16(2). Retrieved April 20, 2004 from http://www.usdla.org/html/journal/APR02_Issue/article01.html
Marjanovic, O. (1999). Learning and teaching in a synchronous collaborative environment. Journal of Computer Assisted Learning, 15, 129-138.
Mason, R. (1991). Moderating educational computer conferencing. DEOSNEWS, 1(19), 1-11. Retrieved June 20, 2003 from http://pchfstud1.hsh.no/hfalg/litteratur/jenssen/deosnews/mason.htm
National Survey of Student Engagement (2003). The College Student Report. Retrieved May 14, 2004 from http://www.indiana.edu/~nsse/
Neubauer, M., & Lobel, M. (2003). The Learningbydoing eClassroom. Learning Technology (publication of the IEEE Computer Society Learning Technology Task Force, LTTF), July 2003. Retrieved May 12, 2004 from http://lttf.ieee.org/learn_tech/issues/july2003/#8
Newman, D. R., Johnson, C., Webb, B., & Cochrane, C. (1997). Evaluating the quality of learning in computer supported co-operative learning.
Journal of the American Society for Information Science, 48(6), 484-495.
Newman, D. R., Webb, B., & Cochrane, C. (1995). How to measure critical thinking in face-to-face and computer supported learning through content analysis. Interpersonal Computing and Technology Journal, 3(2), 56-77.
Ninio, A., & Bruner, J. (1978). The achievement and antecedents of labeling. Journal of Child Language, 5, 1-15.
O'Malley, C. (Ed.). (1995). Computer supported collaborative learning. Berlin: Springer-Verlag.
Orvis, K. L., Wisher, R. A., Bonk, C. J., & Olson, T. (2002). Communication patterns during synchronous Web-based military training in problem solving. Computers in Human Behavior, 18(6), 783-795. (Special journal issue on computer-based assessment of problem solving.)
Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). London: Sage.
Paulsen, M. P. (1995). Moderating educational computer conferences. In Z. L. Berge & M. P. Collins (Eds.), Computer-mediated communication and the on-line classroom in distance education. Cresskill, NJ: Hampton Press.
Pellegrino, J. W., & Goldman, S. R. (2002). Be careful what you wish for - you may get it: Educational research in the spotlight. Educational Researcher, 31(8), 15-17.
Perry, B., & Edwards, R. N. (2005). Exemplary online educators: Creating a community of inquiry. Turkish Online Journal of Distance Education, 6(2). Retrieved April 8, 2005 from http://tojde.anadolu.edu.tr/tojde18/articles/article6.htm
Perkins, D. N. (1993). Persons-plus: A distributed view of thinking and learning. In G. Salomon (Ed.), Distributed cognitions: Psychological and educational considerations (pp. 88-110). Cambridge, MA: Cambridge University Press.
Peshkin, A. (2000). The nature of interpretation in qualitative research. Educational Researcher, 29(9), 5-9.
Potter, J., & Levine-Donnerstein, D. (1999). Rethinking validity and reliability in content analysis. Journal of Applied Communication Research, 27, 258-284.
The Report of the University of Illinois Teaching at an Internet Distance Seminar (1999). Teaching at an Internet distance: The pedagogy of online teaching and learning. University of Illinois Faculty Seminar, Urbana-Champaign, IL: The University of Illinois. Retrieved January 12, 2003 from http://www.vpaa.uillinois.edu/tid/report
Riffe, D., Lacy, S., & Fico, F. (1998). Analyzing media messages using quantitative content analysis in research. Mahwah, NJ: Laurence Erlbaum.
Rogers, C. (1970). Encounter groups. London: Allen Lane, The Penguin Press.
Rogoff, B. (1990). Apprenticeship in thinking. New York: Cambridge University Press.
Rohfeld, R. W., & Hiemstra, R. (1995). Moderating discussions in the electronic classroom. In Z. L. Berge & M. P. Collins (Eds.), Computer-mediated communication and the on-line classroom in distance education. Cresskill, NJ: Hampton Press. Retrieved July 16, 2003 from http://www.emoderators.com/moderators/rohfeld.html
Roschelle, J. (1996). Learning by collaborating: Convergent conceptual change. In T. Koschmann (Ed.), CSCL: Theory and practice of an emerging paradigm (pp. 209-248). Mahwah, NJ: L. Erlbaum Associates.
Rourke, L., & Anderson, T. (2002). Exploring social communication in computer conferencing. Journal of Interactive Learning Research, 13(3), 259-275. Retrieved on July 16, 2003 from: http://dl.aace.org/9253

Rourke, L., & Anderson, T. (2002). Using peer teams to lead online discussions. Journal of Interactive Media in Education, 2002(1). Retrieved on February 2, 2004 from: www-jime.open.ac.uk/2002/1

Rourke, L., Anderson, T., Garrison, D. R., & Archer, W. (1999). Assessing social presence in asynchronous, text-based computer conferencing. Journal of Distance Education, 14(3), 51-70.

Rourke, L., Anderson, T., Garrison, D. R., & Archer, W. (2001). Methodological issues in the content analysis of computer conference transcripts. Journal of Artificial Intelligence in Education, 12.

Sanders, J., & Wiseman, R. (1990). The effect of verbal and nonverbal teacher immediacy on perceived cognitive, emotional, and behavioral learning in the multicultural classroom. Communication Education, 39, 341-353.

Sarlin, D., Deisler, C., & Swan, K. (2003). Responsible selection: Synchronous or asynchronous communication tools and patterns of engagement in online courses. Paper presented at the Annual Conference of the American Educational Research Association, Chicago, IL.

Scardamalia, M., & Bereiter, C. (1996). Computer support for knowledge-building communities. In T. Koschmann (Ed.), CSCL: Theory and practice of an emerging paradigm. Mahwah, NJ: Lawrence Erlbaum.

Scardamalia, M., Bereiter, C., McLean, R., Swallow, J., & Woodruff, E. (1989). Computer supported intentional learning environments. Journal of Educational Computing Research, 5(1).

Schrage, M. (1990). Shared minds: The new technologies of collaboration. New York: Random House.

Schwab, J. J. (1975). On learning community: Education and the state. The Center Magazine, 8(3), 30-44. (Social interaction and learning engagement).

Shi, S., & Tan, S. (2003). Threads: Woven into a picture of postmodern style. Paper presented at the Annual Conference of the American Educational Research Association (AERA), Chicago, IL.

Shi, S., Mishra, P., Bonk, C. J., Tan, S., & Zhao, Y. (under review). Thread theory - A framework applied to content analysis of synchronous computer-mediated communication data. Journal of Computer-Mediated Communication.

Shi, S., Mishra, P., & Bonk, C. J. (2004). Linkage between instructor moderation and student behavioral engagement in synchronous computer conferences. Proceedings of the Association for Educational Communications and Technology (AECT) 2004 International Convention, Chicago, IL.

Shneiderman, B. (1992). Education by engagement and construction: A strategic education initiative for a multimedia renewal of American education. In E. Barrett (Ed.), The social creation of knowledge: Multimedia and information technologies in the university. Cambridge, MA: MIT Press.

Shneiderman, B. (1993). Education by engagement and construction: Experiences in the AT&T Teaching Theater. In H. Maurer (Ed.), Educational Multimedia and Hypermedia Annual, 1993 (Ed-Media 93, Orlando, FL, June 23-26, 1993), 471-479. Charlottesville, VA: AACE. Retrieved on March 15, 2003 from: http://www.cs.umd.edu/hcil/pubs/tech-reports.shtml#1993

Shneiderman, B. (1998). Relate-Create-Donate: A teaching/learning philosophy for the cyber-generation. Computers & Education, 31, 25-39.

Short, J., Williams, E., & Christie, B. (1976). The social psychology of telecommunications. Toronto, ON: Wiley.

Spradley, J. (1980). Participant observation. New York: Holt, Rinehart and Winston.
Tannen, D. (1989). Talking voices: Repetition, dialogue, and imagery in conversational discourse. Cambridge, England; New York: Cambridge University Press.

Tu, C. H. (2003). Factors impacting online collaborative learning community. Paper presented at the Annual Conference of the American Educational Research Association (AERA), April 21-25, Chicago, IL.

Vygotsky, L. S. (1978). Mind in society. Cambridge, MA: Harvard University Press.

Waggoner, M. D. (Ed.). (1993). Empowering networks. New Jersey: Educational Technology Publications.

Wenger, E. (1998). Communities of practice: Learning as a social system. The Systems Thinker, 9(5), 1-5.

Wittgenstein, L. (1958). Philosophical investigations (3rd ed.). New York: Macmillan.

Winograd, D. (2000). Guidelines for moderating online educational computer conferences. Retrieved on July 8, 2003 from: http://www.emoderators.com/moderators/winograd.html

Xin, M. (2002). Validity centered design for the domain of engaged collaborative discourse in computer conferencing. Brigham Young University. Unpublished doctoral dissertation.

Zhang, K. (2004). Effects of peer-controlled or externally structured and moderated online collaboration on group problem solving processes and related individual attitudes in well-structured and ill-structured small group problem solving in a hybrid course. Pennsylvania State University. Unpublished doctoral dissertation.

Zhang, K., & Ge, X. (in press). The dynamics of online collaborative learning: Team task, group development, peer relationship, and communication media. In A. D. de Figueiredo & A. Afonso (Eds.), Managing learning in virtual settings: The role of context. Idea Group.

Zhu, E. (1996). Meaning negotiation, knowledge construction, and mentoring in a distance learning course. In Proceedings of Selected Research and Development Presentations at the 1996 National Convention of the Association for Educational Communications and Technology (18th, Indianapolis, IN). Available from ERIC: ED 397849.