TEACHING PRESENCE IN A FULLY ONLINE ASYNCHRONOUS UNDERGRADUATE MATHEMATICS COURSE AND ITS IMPACT ON SOCIAL AND COGNITIVE PRESENCE

By

Robert Andrew Elmore

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

Mathematics Education—Doctor of Philosophy

2022

ABSTRACT

TEACHING PRESENCE IN A FULLY ONLINE ASYNCHRONOUS UNDERGRADUATE MATHEMATICS COURSE AND ITS IMPACT ON SOCIAL AND COGNITIVE PRESENCE

By

Robert Andrew Elmore

The number of fully online asynchronous undergraduate mathematics courses is growing rapidly, making it imperative that the instructional choices made by instructors and their effects on students’ opportunities to learn in the online learning environment be further explored. Therefore, this research aims to understand instructors' choices when teaching an online undergraduate mathematics course and how these decisions impact students' communication opportunities. This research organized the instructors' decisions and their impacts on students using the community of inquiry framework. The three categories of the community of inquiry framework (teaching presence, social presence, and cognitive presence) were analyzed through course artifacts, an instructor interview, student interviews, student surveys, and course usage data. The primary analysis was performed using the interviews, with the other data sources providing further detail and explanation. Four claims were generated while analyzing these data sources. Claim 1 posits that students tend to have singular preferences among the course’s direct instructional elements. Claim 2 proffers that students who chose to work with others report having positive experiences, and those who decided not to work with others report not needing help, with one exception. Claim 3 states that meaningful contact points can be created between the instructor and student using surveys and personalized mass emails; however, most students describe learning mathematics in Math 101 as not making them feel part of a learning community. Claim 4 posits that elements of the teaching presence were more likely to foster participation if they were associated with a grade. The results of this study have implications for both the research and practice communities. The current study’s results imply that—even though sizes of online mathematics classes may still grow—there are ways instructors can facilitate high levels of social processes using mass email, surveys, cooperative learning groups, and other online tools. These specific tools should be studied and evaluated for their effects on social presence and cognitive presence at a mass scale. The present study suggests four specific practices, available today, with which instructors should familiarize themselves: (a) prescribe opportunities for students to communicate with each other, such as assignments completed in cooperative learning groups; (b) communicate with your students through personalized means (e.g., emails, surveys, and Zoom sessions); (c) use feedback from surveys to inform your future teaching practice; and (d) ensure that students observe your communication and direct instruction by tying them to elements associated with grades.

Keywords: teaching presence, social presence, cognitive presence, online mathematics learning.
Copyright by
ROBERT ANDREW ELMORE
2022

ACKNOWLEDGEMENTS

Lao Tzu is credited with saying, “The journey of a thousand miles begins with one step.” This quote describes my journey through the Program in Mathematics Education at Michigan State University, and that first step was a phone call placed to the program office in the summer of 2017. Therefore, the first person I must thank is Lisa Keller. If she had not answered the phone that day, I do not think I would have completed that first step. The same thank you goes to Freda Cruél and Kelly Fenn, along with the thank you to Lisa. My chances of success would have dwindled significantly without them.

I thank all others in the program alongside me but give special thanks to my adoptive cohort, Chuck Fessler, Valentin Kuechle, and Merve Kursav. Their support, words of encouragement, and fellowship helped me persevere along this journey. I also must thank those in my cohort, Sarah Castle, Rileigh Luczak, Sunyonug Park, and Jonathan Gregg. Their conversations made the PrIME community feel like a family. Outside of my program, I also must thank Brittney Dillman for her continued support and guidance through the program.

I commend Ralph Putnam and Jack Smith for their guidance throughout my graduate journey. Ralph’s perspective and continued guidance as a professor and program director have always been invaluable. Jack’s unwavering support and understanding while pushing for continued growth are hard to describe and are priceless. Both Jack and Ralph made me stretch to become a better researcher and student.

Throughout my dissertation research, I cannot express how supportive Vincent Melfi, Monica Karunakaran, and Aman Yadav have been. I thank Vincent Melfi for his careful work guiding me through the program’s twists and turns. Monica Karunakaran has been extraordinary; her ability to make you feel like you are doing the right thing and that you need to do something different, simultaneously, is unmatched. And Aman Yadav, I truly appreciated his help and perspective from another department. My dissertation research would not have been as good without him.

Shiv Karunakaran helped me so much that he must get his own paragraph. I cannot imagine that there exists a better graduate or dissertation advisor. Shiv goes out of his way to make your vision possible. He has a fantastic gift of seeing things from your perspective and helping you achieve your goals. Even though his own family, research, and teaching keep him busy, he always seems to find a way to make time and find resources for his graduate students. Go Shivites!

Finally, I must thank my parents, Rod and Joanne Elmore, for encouraging me and setting an excellent example with their lives dedicated to education. I cannot thank them enough. However, beyond all others, I must thank those who helped me carry the burden the most, my wonderful wife Hilary and son Preston. They have made many sacrifices over the past five years: the long nights spent waiting for me to get home from East Lansing, putting up with the constant fixture of me reading or typing away at my computer, and the endless conversations about mathematics education research. Thank you for helping me carry on through this program. Thank you all for the support; I could not have done it without all of you pulling and pushing me across the finish line.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
CHAPTER 1: RATIONALE
  Current Study’s Conception
  How to Improve Online Learning
  How to Study Online Learning Environments
  Research Questions
CHAPTER 2: LITERATURE REVIEW
  Instructional Strategies for Teaching Undergraduate Mathematics
    Active Learning Instructional Strategies
    Active Learning Techniques are not Bulletproof
    Research on the Prevalence of Active Learning Instructional Strategies
  Research on Online Undergraduate Mathematics Courses
    Comparison of Student Success Between Face-to-face and Online
    Instructional Choices Impact on Online Courses
  How Student Characteristics Impact Success
    Research in Face-To-Face Classrooms Can Inform Online Instruction
  The Community of Inquiry Framework
    Teaching Presence
    Social Presence
    Cognitive Presence
  Conceptual Framework
CHAPTER 3: METHOD
  Selection of Participants
  Description of Participants
    Description of Instructional Team
    Description of Student Population
    Description of Interview Participants
  Data Collection
    Course Artifacts
    Survey Data
    Course Usage Data
    Interview Data
  Data Analysis
    Analysis of Course Artifacts
    Analysis of Instructor Interview
    Analysis of Student Interviews
    Analysis of Student Surveys
    Analysis of Course Usage
CHAPTER 4: RESULTS
  Claims
    Claim 1: Students tend to have singular preferences of course’s direct instructional elements
    Claim 2: Students who chose to work with others report having positive experiences, those who chose not to work with others report not needing help, with one exception
    Claim 3: Meaningful contact points can be created between instructor and student using surveys and personalized mass emails; however, most describe learning mathematics in Math 101 as not making them feel a part of a learning community
    Claim 4: Elements of the teaching presence were more likely to foster participation if they were associated with a grade
  Answering the Research Questions
    Answer Research Question 1
    Answer Research Question 2
    Answer Research Question 3
CHAPTER 5: DISCUSSION, IMPLICATIONS, AND CONCLUSION
  Discussion
    Connections to Active Learning Research
    Connections to the Online Learning Environment Research
    Connections to the Community of Inquiry Framework
  Implications
  Limitations and Future Research
  Concluding Remarks
APPENDICES
  APPENDIX A: Survey Questions
  APPENDIX B: Student Interview Guide
  APPENDIX C: Instructor Interview Guide
  APPENDIX D: Recruitment Emails
  APPENDIX E: Project Timeline
REFERENCES

LIST OF TABLES

Table 2–1 Community of inquiry framework used for coding
Table 3–1 Names of interview participants
Table 3–2 Data collection timeline
Table 3–3 Connection between each data source, claims, and research questions
Table 3–4 Community of inquiry coding scheme
Table 4–1 Description of selected survey results
Table 4–2 Description of selected survey results
Table 4–3 Usage of graded versus non-graded course elements

LIST OF FIGURES

Figure 2–1 Elements of the Community of Inquiry Framework

CHAPTER 1: RATIONALE

The number of undergraduate mathematics courses being taught either fully or partially online is increasing, in part, due to their low cost, efficiency, and convenience for students (Kaser & Hauk, 2016; Trenholm, Alcock, & Robinson, 2016), and more recently due to calls for social distancing during the coronavirus pandemic. While this movement has been noticeable in terms of the increase in the number of online courses, research on instruction in these undergraduate mathematics courses is sparse (Trenholm, Peschke, & Chinnappan, 2019). For instance, Trenholm and colleagues (2019) found in their review of the literature that most of the research on online undergraduate mathematics courses has focused primarily on achievement measures, such as pre- versus post-test comparisons or final course grades, with outcomes showing much unexplained variability. Research on such achievement measures is essential for comparing the efficacy of different delivery methods.
However, now that this research has largely taken place and online courses are widespread, it is imperative to look closely at the quality of instruction for students taking these courses. It has been found that, in many cases, mathematics instructors teaching online courses must develop their instructional strategies from experience and their knowledge of best practices designed for face-to-face instruction (Baran, Correia, & Thompson, 2011; Trenholm, Peschke, et al., 2019), like the research and practice in active learning (Freeman et al., 2014) and cooperative group-based learning (Fullilove & Treisman, 1990; Treisman, 1992), or by following the guidance of organizations such as Quality Matters (2020). The research that has been published on what instruction and learning look like in an online undergraduate mathematics learning environment reinforces the inclination from my own experience that instructors are attempting to replicate established practices from the face-to-face classroom (Trenholm et al., 2016; Trenholm, Peschke, et al., 2019), such as using videos in place of lectures and discussion boards in place of classroom discussion (Draus, Curran, & Trempus, 2014; Hegeman, 2015; Trenholm, Hajek, et al., 2019). Given the importance of offering high-quality mathematics instruction in the online environment, it is imperative that the instructional choices and their effects on students’ learning opportunities in the fully online asynchronous undergraduate mathematics learning environment be further explored.

At this point, I would like to describe some of the terms used while making a case for this study’s importance in this chapter. There are many ways classes utilize the internet today. These utilizations may range from a face-to-face class using the internet supplementally—with most instruction inside a physical classroom—to a fully online course using the internet as its only means of meeting and communication. Terms like fully online, partially online, blended, hybrid, or face-to-face are often used to describe where classes fit on this vast spectrum of internet usage. From this point forward, face-to-face will refer to a class that meets face-to-face for some or all of the instruction, and online will refer to a class that does not meet in any physical way. Furthermore, online classes structure their communication using the internet in two distinct ways: synchronously and asynchronously. Synchronous refers to courses where everyone meets and participates simultaneously, as in a face-to-face class or online using a video conferencing service. Asynchronous refers to classes where no class meetings are held with everyone in attendance at the same time. These asynchronous courses are typically offered online and offer students a variety of learning activities. From this point forward, face-to-face will imply synchronous in-person instruction, and online will indicate asynchronous online instruction. This distinction is essential and will become more apparent as the present research is described; however, at this point, I would like to state that I have chosen to focus this research specifically on online classes.
Current Study’s Conception

Now that some general terms have been discussed, I will describe how I arrived at my strategy to study instructors' instructional choices and their effects on students’ opportunities to learn in the online undergraduate mathematics learning environment. Like many of the instructors teaching courses like those described in the studies above, I was asked to teach undergraduate mathematics online early in my career. While designing my first class for the online modality in 2006, I attempted to mimic each teaching strategy I had learned and used while teaching in the traditional classroom. These strategies focused on getting students to work together in groups and discuss their problem-solving techniques, and on presenting course content in a lecture format with me up at the board, writing definitions, demonstrating examples, and answering questions. Because of these experiences, I developed for my online class video lectures, online quizzes, and discussion boards for students to participate in, so that students would perform well on the course’s three exams, which were given face-to-face using the college’s proctoring service. Although I was unaware of it at the time, I now understand that the strategies I was using are described in the mathematics education literature in the areas of active learning (Freeman et al., 2014), peer instruction (Vickrey, Rosploch, Rahmanian, Pilarz, & Stains, 2015), inquiry-based learning (Aditomo, Goodyear, Bliuc, & Ellis, 2013; Edelson, Gordin, & Pea, 1996; Freeman et al., 2014), and cooperative learning groups (Felder & Brent, 1996; Fullilove & Treisman, 1990; Springer, Stanne, & Donovan, 1999). Another resource that influenced these decisions came from the college's distance education department, which kept up on the general literature for best practices in teaching online. While I am not sure where they were gathering their ideas then, the same department currently adopts rubrics and strategies from Quality Matters (2020).

In the previous paragraph, I described how I came to understand teaching online. Now, I will explain how each of these instructional influences from the literature on teaching in the face-to-face classroom, namely active learning (Freeman et al., 2014), peer instruction (Vickrey et al., 2015), inquiry-based learning (Aditomo et al., 2013; Edelson et al., 1996; Freeman et al., 2014), and cooperative learning groups (Felder & Brent, 1996; Fullilove & Treisman, 1990; Springer et al., 1999), has informed instructors’ strategies in the online classroom. Following that, I will explain how these critical aspects of teaching best fit teaching in the online environment and how I chose the community of inquiry framework (Garrison, 2017) for the present study.

Active learning (Freeman et al., 2014) has been a popular instructional strategy because it has been found to increase performance on examinations by almost one-half of a standard deviation as compared to lecturing, which in turn has been shown to increase failure rates by 55% relative to active learning (Freeman et al., 2014).
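As a brief aside, these two figures are easy to misread, so a worked reading may help; the values used below (average failure rates of 21.8% under active learning versus 33.8% under traditional lecturing, and a weighted effect size of 0.47 standard deviations on examinations) are the averages reported in Freeman et al. (2014). The 55% is a relative, not absolute, increase in failure rates:

\[
\frac{33.8\% - 21.8\%}{21.8\%} \approx 0.55,
\]

that is, failure rates under lecturing were about 55% higher than under active learning, while the examination gain under active learning, \(d \approx 0.47\), is the “almost one-half of a standard deviation” referenced above.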
Active learning has been described as any learning activity that “requires students to do meaningful learning activities and think about what they are doing” (Prince, 2004, p. 1). While many authors have written about active learning and its benefits without defining it directly (Freeman et al., 2014; E. Johnson, Keller, & Fukawa-Connelly, 2018), some of the aspects of active learning that can be gleaned from this research are that it takes place in the classroom and requires students to engage with content in a role other than passively listening to and watching a lecture. From this view of active learning, many activities fit its definition, such as having students discuss problems, ask questions, work in groups, and complete mathematical problems that encourage group collaboration. These are all examples of how active learning can be demonstrated in the undergraduate mathematics classroom. Examples of active learning strategies include peer instruction (Vickrey et al., 2015), inquiry-based learning (Aditomo et al., 2013; Edelson et al., 1996; Freeman et al., 2014), and cooperative learning groups (Felder & Brent, 1996; Fullilove & Treisman, 1990; Springer et al., 1999). Each of these strategies, and how they relate to the online classroom, will be discussed in greater detail in the next chapter.

The concept of active learning in the face-to-face classroom can be found in the research on peer instruction (Vickrey et al., 2015), inquiry-based learning (Aditomo et al., 2013; Edelson et al., 1996; Freeman et al., 2014), and cooperative learning groups (Felder & Brent, 1996; Fullilove & Treisman, 1990; Springer et al., 1999). Peer instruction, first introduced by Eric Mazur in 1991 (Crouch & Mazur, 2001), gives students opportunities to work together and instruct each other. This type of instruction is an active learning instructional strategy because it engages every student in the room. In contrast, the traditional lecture method only engages the few eager students who ask questions. Working cooperatively in groups, as described in the research on the Mathematical Workshop Program (MWP) (Fullilove & Treisman, 1990), has shown benefits of having students work together similar to those of peer instruction. Notably, the types of questions asked during peer instruction and the MWP have been studied and affect the outcome (Crouch & Mazur, 2001). These questions need to be complex and meaningful to engage students in thought (Crouch & Mazur, 2001; Treisman, 1992), like those found in the literature on inquiry-based learning (Prince & Felder, 2007). Inquiry-based learning is an inductive teaching method motivated by solving problems, analyzing data, or testing hypotheses (Prince & Felder, 2007). The problems created for inquiry-based learning are intended to combine the tenets of engaging the students in learning activities and reflecting on their learning, with the added dimension that students are engaged in problem- or project-based tasks (Aditomo et al., 2013). Inquiry-based learning can be implemented within many active learning instructional strategies, including peer instruction and the MWP (Aditomo et al., 2013). At a basic level, implementing the MWP using inquiry-based learning problems could be called cooperative learning. Cooperative learning has been defined as having students work together toward a common goal while maintaining individual assessment (D. Johnson, Johnson, & Smith, 1998). Similarly, it has also been discussed as team-based learning, where students work together on complex problems in teams (Felder & Brent, 1996).
In either case, these strategies have been shown to have positive results in the face-to-face classroom; however, how these strategies and the research on them change when their adaptations are implemented in the online classroom is the open question that this research attempts to sort out. In the following few paragraphs, I will describe the teaching strategies that have been found in online undergraduate mathematics classrooms.

Having discussed in the previous paragraphs active learning instructional strategies and the factors that impact their implementation in the undergraduate mathematics face-to-face classroom, I will turn to research on online mathematics courses. As stated earlier, there is not much research on online undergraduate mathematics instruction (Shea & Bidjerano, 2018; Trenholm, Peschke, et al., 2019). Of the research conducted on online undergraduate mathematics courses, most has examined how successful online undergraduate mathematics courses are compared to face-to-face courses (Shea & Bidjerano, 2018; Trenholm, Hajek, et al., 2019). The research on teaching undergraduate mathematics online showed online courses to have lower success rates than traditional face-to-face courses (Shea & Bidjerano, 2018; Trenholm, Peschke, et al., 2019). Nevertheless, enrollment in online courses increased by 30% at colleges and universities between 2012 and 2015 (Trenholm, Peschke, et al., 2019). The conclusion that I draw from this research is that the increase in enrollment, combined with the less than desirable achievement of online undergraduate mathematics courses, points to further rationale for improving the quality of online undergraduate mathematics instruction.

Trenholm and colleagues (2016) identified the frequency of communication with the instructor as a significant difference between face-to-face learning environments, where it was plentiful, and online learning environments, where it was scarce. Other research suggests that clear communication is an essential factor impacting the quality of online instruction by giving the student more interaction with the iterative process of formative feedback (Testone, 2019). These points about communication emphasize the importance of high-quality—detailed—assessment feedback in the online classroom. Not having personal feedback or the perceived presence of an instructor was found to harm student success (Testone, 2019). Testone (2019), through personal experience, suggests that clear communication from the instructor is an essential piece of online instruction. However, Trenholm and colleagues (2016) found that much of the communication given to students was computer-based. Even in the discussion boards that students primarily used to communicate, there was little communication among students or with their instructor.

In a face-to-face classroom, the instructor is present during the lecture; in the online environment, the adaptation of this popular face-to-face instructional method is the lecture video. Lectures are a widely used instructional method in face-to-face undergraduate mathematics courses and are usually thought of as the antithesis of active learning; however, as we discussed with peer instruction, there are ways to place active learning within a lecture format. In contrast, when lecture videos are used to substitute for face-to-face lectures in the online learning environment (Draus et al., 2014; Hegeman, 2015; Trenholm, Hajek, et al., 2019), there is no option for increasing active learning in the same ways.
However, their use has been found to positively impact students’ grade performance in online undergraduate mathematics classes, increasing grades by 3.2% compared to not having recorded lectures at all (Hegeman, 2015). Interestingly, Hegeman (2015) found in their university study that the addition of instructor-created lecture videos positively affected students’ perceived value of the course and increased their engagement in discussion boards. Draus and colleagues (2014) found similar positive results comparing instructor-created videos to publisher-created videos. These findings suggest that more research needs to be done on how online classes can position students more actively in their learning.

How to Improve Online Learning

Previously, I discussed how instructional strategies in the face-to-face classroom inform online undergraduate mathematics courses by highlighting some of the obstacles in the online learning environment. Additionally, I proffer that one of the ways to improve online undergraduate mathematics instruction is to focus on what makes active learning strategies successful in the face-to-face classroom and provide mechanisms in the online environment that offer students the same opportunities. Now, I will draw parallels between these bodies of research that might help mathematics educators focus their research on the instructional strategies used in online instruction. The two similarities that I will focus on are the importance of creating opportunities for students to rephrase and synthesize their mathematical problem-solving strategies and the crucial roles that the instructor plays in online instruction.

I posit that the research in active learning converges around the need to create an environment for students to actively engage in problem-solving activities (Fong & Visher, 2013). Much of this movement has manifested in peer instruction (Vickrey et al., 2015) or asking students to work in cooperative learning groups (Felder & Brent, 1996). In both of these instructional strategies, designing questions congruent with inquiry-based learning has been recommended and is widely used (Aditomo et al., 2013). The key to their success has been the requirement that students rephrase and synthesize their mathematical problem-solving strategies while discussing them with their classmates (Kogan & Laursen, 2014). Specifically, it is this rephrasing and synthesizing process that is at the heart of the success of each of these three active learning instructional strategies in the face-to-face classroom, and it is this rephrasing and synthesizing process that must be attended to when creating new and innovative online undergraduate mathematics instructional strategies. In support of this claim, I provide two examples, from the research, of how current online instructional strategies have been shown to have a positive impact on students’ learning. The first example is online discussion boards, and the second example is recorded video lectures. Online discussion boards have been one of the strategies found in the research (Trenholm, Peschke, et al., 2019) that attempt to create space, like the space cooperative learning groups offer in the face-to-face environment, for students to rephrase and synthesize their mathematical ideas in online undergraduate mathematics courses.
However, I argue that the feedback students receive on their rephrased ideas from other students or the instructor—as important as it has been found to be (Trenholm, Alcock, & Robinson, 2015)—comes too late to reinforce students’ learning in the online classroom as effectively as it does in the face-to-face classroom (Springer et al., 1999). This lateness is due to the asynchronous nature of online learning. This example illustrates how challenging online instruction is and how face-to-face instructional strategies may not be replicated effectively in the online learning environment.

To further complicate how instructional strategies play out in different learning environments, it has been found that video lectures have a slightly positive impact on students’ performance in online undergraduate mathematics classes (Hegeman, 2015). Puzzlingly, these same video lectures lower success rates when provided to face-to-face courses (Hegeman, 2015). Hegeman (2015) posits, and I would agree, that this may be due to students using lecture videos as a replacement for face-to-face lectures instead of a supplement. This problem would not arise in an online undergraduate mathematics course because these would be the only lectures the students are provided. Still, it seems puzzling that a student supplied with only video lectures in an online undergraduate mathematics course would do better than a student choosing to rely only on video lectures in a face-to-face undergraduate mathematics course. Interestingly, this provides us with an example of an instructional strategy working in one environment and not in another, supporting our assertion that instructional strategies may not be used universally across both the face-to-face and online learning environments. These findings point to the need both for increased feedback from the instructor in the online learning environment and for space to be created for students to rephrase and synthesize their mathematical problem-solving strategies in an online learning environment.

How to Study Online Learning Environments

I have now made my case that popular and successful instructional strategies, like those found in the active learning literature (Freeman et al., 2014), have had mixed results when applied to the online learning environment, and that more research on these high-quality strategies in the online environment is imperative to the success of the online undergraduate mathematics classroom. I will now state my case for studying the online undergraduate mathematics learning environment in the present study.

Previously, I identified what I suspect are two of the most critical impact factors that online instructional strategies need to employ to have a positive impact on student success: increased communication and space for students to rephrase and synthesize their mathematical problem-solving strategies. For this research, I required a framework that describes each of the elements that make up the online learning environment. Through conversations with wise researchers and a literature review, it became apparent that the best framework to situate my research in was the community of inquiry framework (Garrison, 2017). This framework conceptualizes the learning environment into three main categories: teaching presence, social presence, and cognitive presence.
These categories capture my research well because they account for each aspect I deemed necessary in the previous argument. When designing and implementing their instructional strategies online, an instructor's choices of what to include in their course make up the teaching presence. The communication opportunities and the space for students to rephrase and synthesize their mathematical problem-solving approaches are essential features that make up the social presence. Furthermore, the impacts that these instructional strategies and communications have on students' learning make up the cognitive presence. With each of these aspects accounted for, I will now describe, in the next section, this study's research questions and how they are positioned in the community of inquiry framework (Garrison, 2017).

Research Questions

This research aims to understand instructors' choices when teaching an online undergraduate mathematics course, how these decisions impact students' communication opportunities, and how these affect student learning. For this research, I organize the instructors' decisions and their impacts on students using the community of inquiry framework. This framework will be discussed in more detail later in the literature review; however, it is important to briefly describe its three major components (teaching presence, social presence, and cognitive presence) here. Teaching presence is “the design, facilitation, and direction of cognitive and social processes to realize personally meaningful and educationally worthwhile learning outcomes” (Anderson, Rourke, Garrison, & Archer, 2001, p. 5). Social presence is the ability of participants to identify with a group, communicate openly in a trusting environment, and develop personal and affective relationships progressively by projecting their individual personalities (Garrison, Cleveland-Innes, & Fung, 2010). Cognitive presence is “the extent to which learners can construct and confirm meaning through sustained reflection and discourse in a critical community of inquiry” (Anderson et al., 2001, p. 11). This research on instructors' choices, how these decisions impact students' communication opportunities, and how these affect student learning will focus on answering the following three research questions.

1. How does teaching presence manifest itself in an asynchronous online undergraduate mathematics course?
2. How do students report interacting with asynchronous online undergraduate mathematics courses’ activities, assessments, and resources, and how does this impact their social presence?
3. How do these students report their cognitive presence being influenced by the asynchronous online undergraduate mathematics course’s activities, assessments, and resources?

This research will contribute to the research and teaching practice communities in answering these questions. First, it will demonstrate the usefulness of applying the community of inquiry framework to an online asynchronous undergraduate mathematics course. Second, it will contribute to the practice of teaching online undergraduate mathematics courses by uncovering some of the impacts that instructors’ choices of activities, assessments, and resources have on students' social and cognitive presence in the online undergraduate mathematics learning environment. These contributions will help researchers and instructors progress in studying and increasing the quality of online undergraduate mathematics instruction.
CHAPTER 2: LITERATURE REVIEW

This research aims to understand instructors' choices when teaching an online undergraduate mathematics course and how these decisions impact students. For this research, I organize instructors' decisions and their impacts on students using the community of inquiry framework (Garrison, 2017). This framework will be discussed in more detail later in this literature review; however, it is important to briefly describe its major components (teaching presence, social presence, and cognitive presence), along with the study’s three research questions. Teaching presence is “the design, facilitation and direction of cognitive and social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes” (Anderson et al., 2001, p. 5). Social presence is the ability of participants to identify with a group, communicate openly in a trusting environment, and develop personal and affective relationships progressively by projecting their individual personalities (Garrison et al., 2010). Cognitive presence is “the extent to which learners can construct and confirm meaning through sustained reflection and discourse in a critical community of inquiry” (Anderson et al., 2001, p. 11). This research on instructors' choices and their impact on students will focus on answering the following three research questions.

1. How does teaching presence manifest itself in an asynchronous online undergraduate mathematics course?
2. How do students report interacting with asynchronous online undergraduate mathematics courses’ activities, assessments, and resources, and how does this impact their social presence?
3. How do these students report their cognitive presence being influenced by the asynchronous online undergraduate mathematics course’s activities, assessments, and resources?

This research contributes to the research and teaching practice communities in answering these questions. First, it demonstrates the usefulness of applying the community of inquiry framework to an online asynchronous undergraduate mathematics course. Second, it contributes to the practice of teaching online undergraduate mathematics courses by uncovering some of the impacts that instructors’ choices of activities, assessments, and resources have on students' social presence and cognitive presence in the online undergraduate mathematics learning environment. These contributions will help researchers and instructors progress in studying and increasing the quality of online undergraduate mathematics instruction.

Focusing on the purpose of this study, to gain a deeper understanding of instructors' choices when teaching an online undergraduate mathematics course and how these decisions impact students using the community of inquiry framework and its three categories (teaching, social, and cognitive presence), I will review the literature in three foundational areas: (a) instructional strategies, (b) research in online mathematics learning environments, and (c) the community of inquiry framework. Stated differently, it is essential that this review cover research on instructional strategies in undergraduate mathematics learning environments, both online and face-to-face, along with how these environments have been studied using the present study’s chosen framework. After this review, the current study’s conceptual framework will be presented.
Instructional Strategies for Teaching Undergraduate Mathematics

As described in the previous chapter, instructional strategies from the face-to-face classroom have been found to influence how mathematics is taught in the online learning environment (Trenholm et al., 2016; Trenholm, Peschke, et al., 2019). Therefore, it is essential to investigate these strategies to gain at least a brief understanding of the research surrounding them. Research analyzing instructional strategies in the undergraduate mathematics face-to-face classroom can easily be found in the literature. Notably, many of the instructional strategies hailed as best practices for the face-to-face classroom in the literature are rooted in active learning because they have been found to increase the performance of students in science and mathematics compared with traditional lectures (Freeman et al., 2014). Though active learning instructional strategies (Freeman et al., 2014) have been observed and studied in the undergraduate mathematics face-to-face classroom, it is important to emphasize that these approaches are still largely absent from many undergraduate mathematics classrooms. Instead, many instructors use the traditionally predominant lecture instructional strategy (Walczyk, Ramsey, & Zha, 2007). I purport that the diversity of the instructor population, in both their level and focus of education and their employment status, impacts the use of active learning instructional strategies in the face-to-face classroom, and that similar impacts may be found on the use of instructional strategies in the online environment. In this first section, I will illustrate some of the research on instructional strategies in three main parts: (a) some of the most popular active learning instructional strategies that can be found in the research, (b) what strategies have been found in the classroom, and (c) what factors might keep these best practices from being implemented in the classroom.

Active Learning Instructional Strategies

Active learning (Freeman et al., 2014) has been found to increase performance on examinations by almost one-half of a standard deviation as compared to lecturing, which has been shown to increase failure rates by 55% (Freeman et al., 2014). Active learning has been described as any learning activity that “requires students to do meaningful learning activities and think about what they are doing” (Prince, 2004, p. 1). It is also important to note that active learning is usually positioned in opposition to what is thought of as traditional lecture, where students passively watch instructor presentations with little to no interaction between instructor and students or among the students themselves. Others have defined active learning as any classroom activity that requires students to be “actively engaged in problem-solving rather than listening to a lecture” (Fong & Visher, 2013, p. 13). While many authors have written about active learning and its benefits without defining it directly (Freeman et al., 2014; E. Johnson, Keller, et al., 2018), some of the aspects of active learning that can be gleaned from this research are that it takes place in the classroom and requires students to engage with content in a role other than passively listening to and watching a lecture. From this view of active learning, many activities fit its definition, such as having students discuss problems, ask questions, work in groups, and complete mathematical problems that encourage group collaboration.
These are all examples of how active learning can be demonstrated in the undergraduate mathematics classroom. In this section, I will focus on the three research-based instructional strategies that I found to be most prevalent in the research, each exemplifying active learning: peer instruction (Vickrey et al., 2015), inquiry-based learning (Aditomo et al., 2013; Edelson et al., 1996; Freeman et al., 2014), and cooperative learning groups (Felder & Brent, 1996; Fullilove & Treisman, 1990; Springer et al., 1999). The first active learning instructional strategy that I will examine is peer instruction. Once these active learning instructional strategies have been laid out, I will end the section by addressing how prevalent these instructional strategies have been in the undergraduate mathematics classroom and how the diversity of the instructor population might impede their use.

Peer instruction. Eric Mazur first introduced the concept of peer instruction in 1991 (Crouch & Mazur, 2001). Through a review of the literature, Vickrey and colleagues (2015) describe peer instruction as generally following several steps. First, the instructor gives a form of lecture, and then the instructor poses conceptual questions designed to focus on students’ misconceptions. After time has been provided, students answer using polling methods like electronic clickers or flashcards. After students give their answers, they are given time in class to discuss their solutions with fellow students, each trying to convince their peers why their conceptualization of the problem is correct. The round of peer instruction ends with students being allowed to change their answers, followed by the instructor leading a discussion about how to conceptualize the problem. Other authors have described peer instruction more briefly as an instructional strategy where students are given a series of short presentations, each followed by a set of conceptual questions with time given to discuss their answers with peers (Crouch & Mazur, 2001).

Peer instruction has been conceptualized as engaging students by asking them to apply the core concepts of the mathematical content being presented and to explain those concepts to their peers (Crouch & Mazur, 2001). It has been found that the implementation of peer instruction improves attitudes among both instructors and students, increases persistence among students, and increases students’ abilities to solve conceptual and quantitative problems as compared to the traditional lecture (Crouch & Mazur, 2001). This type of instruction is an active learning instructional strategy because it engages every student in the room. In contrast, the traditional lecture method only engages the few eager students who ask questions. It is also important to note that the types of questions asked during peer instruction matter. These questions need to be complex and meaningful to engage students in thought (Crouch & Mazur, 2001). In these paragraphs, I discussed the research-based instructional strategy of peer instruction. In the following paragraphs, I will discuss inquiry-based learning.

Inquiry-based learning. Another approach that researchers and practitioners have used to increase the effectiveness of their instruction in undergraduate mathematics classrooms is inquiry-based learning. Inquiry-based learning is an inductive teaching method where learning is motivated by solving problems, analyzing data, or testing a hypothesis (Prince & Felder, 2007).
The problems created for inquiry-based learning are intended to combine the tenets of engaging the students in learning activities and reflecting on their learning, with the added dimension that students are engaged in problem- or project-based tasks (Aditomo et al., 2013). Prince and Felder (2007) define these problem- or project-based tasks to be a part of inquiry-based learning, defining inquiry-based learning as “presenting students with a specific challenge, such as experimental data to interpret, a case study to analyze, or a complex real-world problem to solve” (p. 14). This method drives students to engage in solving problems that can be found in the real world. It positions the students in an active learning environment because it typically requires them to think, work cooperatively with others, and reflect on their learning (Aditomo et al., 2013). Inquiry-based learning can be implemented in many active learning instructional strategies, including small student-centered groups and large instructor-led groups (Aditomo et al., 2013).

Inquiry-based learning has been found to promote collaboration and deep engagement between students and mathematical ideas (Laursen, Hassi, Kogan, & Weston, 2014). Research has found that students taking an undergraduate mathematics course taught using inquiry-based learning do at least as well as students taking the same course taught using traditional methods, and that students’ performance is maintained in subsequent courses (Kogan & Laursen, 2014). This maintained performance is notable because a common critique of inquiry-based learning is that, because solving these problems takes a lot of class time, students must not be learning as much content as in classes taught using traditional methods (Kogan & Laursen, 2014). An additional benefit of inquiry-based learning is that it may help close the gender gap in mathematics (Kogan & Laursen, 2014). In one study, female students reported having more confidence in their mathematical and scientific abilities after learning in an inquiry-based learning course as compared to a traditional course (Laursen et al., 2014); however, this has not been visible in all studies (E. Johnson, Andrews-Larson, et al., 2018). Johnson and colleagues (2018) found that inquiry-based instruction widened the gender gap. More generally, students who have taken a subject using the inquiry-based model tend to choose courses using the inquiry-based model for subsequent courses (Kogan & Laursen, 2014). Thus, inquiry-based teaching methods do not lead to a drop in achievement, are favored over traditionally taught courses by students familiar with inquiry-based learning, and could positively impact women's confidence in mathematics.

Inquiry-based learning instructional strategies usually result in students collaborating (Kogan & Laursen, 2014). This collaboration often happens in cooperative learning groups—another popular and heavily studied active learning instructional strategy that has been shown to improve learning (Springer et al., 1999). Having established inquiry-based learning, I will focus on the active learning instructional strategy of cooperative learning groups in the following few paragraphs.

Cooperative groups and the mathematical workshop program. A widely used form of active learning is the integration of cooperative learning groups.
The use of cooperative learning groups in undergraduate mathematics instruction has been shown to improve learning as measured by instructor-created exams (Springer et al., 1999), especially among minority students (Fullilove & Treisman, 1990; Treisman, 1992), and to lead “to more favorable attitudes between men and women” (Springer et al., 1999, p. 40). Cooperative learning groups have been described as having students work together to pursue a common goal while being assessed individually (Prince, 2004). I contend that the best way to describe cooperative learning groups is to illustrate how they were noticed in a study conducted at Berkeley between 1975 and 1976 (Treisman, 1992).

During that study of calculus students, Treisman (1992) noticed that working in cooperative learning communities was the critical difference between the success of Chinese American students and the struggles of African American students, even though both groups applied similar time and effort to learning mathematics. Specifically, even though both African American and Chinese American students reportedly worked on their calculus eight hours per week, Chinese American students performed at the top, and African American students performed at the bottom, of those enrolled in calculus. Upon further investigation, Treisman (1992) discovered that it was not the amount of time spent studying calculus but rather how the time was spent. Chinese American students spent much of their time studying in a group, whereas African American students rarely studied in groups (Fullilove & Treisman, 1990). This research led to the creation of the Mathematical Workshop Program (MWP) (Fullilove & Treisman, 1990).

The MWP was designed to emulate what Treisman (1992) witnessed when observing Chinese American students by immersing students in challenging group work activities. In the MWP, groups of five to seven students were created, and they were asked to do carefully designed problems together for two hours twice a week (Fullilove & Treisman, 1990). The worksheets designed for the MWP were stated to be an integral part of the program (Fullilove & Treisman, 1990). These problems were not unlike problems developed and implemented as part of inquiry-based learning. Specifically, Fullilove and Treisman (1990, p. 468) reported that the selected problems fit into five categories:

(1) “old chestnuts” that appear frequently on examinations but rarely on homework assignments
(2) “monkey wrenches”—problems designed to reveal deficiencies either in the students’ mathematical backgrounds or in their understanding of a basic concept
(3) problems that introduce students to motivating examples or counterexamples that shed light on or delimit major course concepts and theorems
(4) problems designed to deepen the students’ understanding of and facility with mathematical language
(5) problems designed to help students master what is known, in MWP parlance, as “street mathematics”—the computational tricks and shortcuts known to many of the best students which are neither mentioned in the textbook nor taught explicitly by the instructors

These types of questions provide a dynamic and rich environment for students to interact with calculus. The MWP was shown to be successful in “promoting high levels of academic performance among African American mathematics students” (Fullilove & Treisman, 1990, p. 476), and it was also shown to be effective at other universities (Fullilove & Treisman, 1990).
Formally, the MWP evolved into what is currently known as the Emerging Scholars Program (Treisman, 2008). At a basic level, the MWP combined inquiry-based learning problems with students working cooperatively in groups. This combination is often referred to as cooperative learning. Cooperative learning has been defined as having students work together toward a common goal while maintaining individual assessment (D. Johnson et al., 1998). Similarly, it has also been discussed as team-based learning, where students work together on complex problems (Felder & Brent, 1996). In either case, cooperative learning has been commended because it requires students to reflect on problem-solving and to rephrase their approach to solving problems when discussing their problem-solving strategies with other students in the group (D. Johnson et al., 1998). Cooperative learning is a successful instructional strategy. However, it has been noted that it is hard to tease out the positive effects that small cooperative groups have on student learning because cooperative learning is often implemented alongside other forms of active learning; even so, Prince (2004) was able to state that it positively affects the development of interpersonal skills.

Active Learning Techniques are not Bulletproof

Although there is considerable evidence for the benefits of active learning, not all research has found active learning to positively affect students (Sonnert, Sadler, Sadler, & Bressoud, 2015). Sonnert and colleagues (2015) studied how pedagogical choices affect students' attitudes towards mathematics by using multivariate regression analyses on a large data set that was part of the Characteristics of Successful Programs in College Calculus study that took place in 2009. Sonnert and colleagues found that the use of "Ambitious teaching (e.g., group work, word problems, flipped reading, student explanations of thinking) had a small negative impact on student attitudes" while "instructors who employed generally accepted good teaching practices (e.g., clarity in presentation and answering questions, useful homework, fair exams, help outside of class), were found to have the most positive impact, particularly with students who began with weaker initial mathematics attitudes" (p. 29). While Sonnert and colleagues conclude that ambitious teaching—which included group work, word problems, and student explanation of their thinking—negatively affects students' attitudes, they do not explain why they defined ambitious teaching to have those characteristics. I posit that Sonnert and colleagues' claims about active learning negatively impacting student attitudes would change if they had studied each element in their definition of ambitious teaching independently. In this section, I reviewed the faults that some researchers have claimed about active learning's effect on students' attitudes. In the next section, I will discuss the prevalence of active learning instructional strategies and the barriers to instructors' use of these strategies.

Research on the Prevalence of Active Learning Instructional Strategies

As outlined in the previous section, research on active learning strategies and their benefits can be found in the literature; however, research on instructional strategies in undergraduate mathematics classrooms shows active learning strategies are not widely implemented (Walcyzk et al., 2007).
I proffer that, out of the many factors that could be keeping the implementation of these strategies low, the diversity of both the educational background and the employment of college instructors is at the heart of their low use. Undergraduate mathematics courses are taught by instructors who are part-time graduate assistants, adjunct professors, full-time professors, or full-time tenure track professors (Ehrenberg & Zhang, 2005), making this population diverse in employment appointments. Furthermore, the educational background of instructors varies from a bachelor's degree to a graduate degree in mathematics, mathematics education, or a closely related field. These diversities leave this instructor population with a wide array of educational experiences. In the following paragraphs, I argue that this wide array of employment appointments and educational backgrounds impacts the use of active learning instructional strategies because it produces diverse experience with, and knowledge of, these strategies.

Teaching mathematics has been found to vary from instructor to instructor, among course sections, and across colleges (E. Johnson, Keller, et al., 2018; Walcyzk et al., 2007). A survey study of abstract algebra instructors' uses of pedagogical teaching methods found that instructors use many different instructional strategies (E. Johnson, Keller, et al., 2018). Most notable is that 85% of survey respondents reported that lecturing was their primary teaching method (E. Johnson, Keller, et al., 2018); therefore, only 15% of the college abstract algebra instructors surveyed reported using a mixture of pedagogical methods that includes active learning as their primary teaching method. Other research supports the claim that most instructors use lecturing as their primary instructional strategy (Walcyzk et al., 2007). Walcyzk and colleagues (2007) posit that (a) instructors not being directly evaluated on the quality of their instruction, (b) traditional assessments (e.g., exams and assignments) and curricular materials still being designed without regard to active learning, and (c) the lack of professional development opportunities in active learning instructional strategies all act as obstacles to the implementation of these strategies. Other research echoes the call for more professional development opportunities, reporting that instructors have said they are not using other methods because of their lack of knowledge and awareness of them (E. Johnson, Keller, et al., 2018). It has been shown that, given both the opportunity for professional development and incentives to improve teaching, instructors will adopt active learning instructional strategies (Walcyzk et al., 2007). This has been identified in research through two strong correlations: (a) a correlation between opportunities for professional development in teaching effectiveness and greater use of active learning instructional strategies, and (b) a correlation between colleges that tie a portion of instructor advancement to teaching effectiveness and more prevalent use of active learning instructional strategies (Walcyzk et al., 2007). In summary, I have outlined the most prevalent active learning instructional strategies and argued that, despite the positive impact these strategies have on students, they are not widely used (E. Johnson, Keller, et al., 2018), in part because of the diversity in both the education and employment of the instructor population.
The following section will present the current research in online undergraduate mathematics instruction.

Research on Online Undergraduate Mathematics Courses

Previously, I discussed active learning instructional strategies and the factors that impact their implementation in the undergraduate mathematics classroom, with the focus thus far being wholly on the traditional face-to-face classroom where all instruction is conducted between instructors and students who are physically in the same room. Now, I will examine research on online undergraduate mathematics courses. This leap into the world of online mathematics instruction requires discussing the different forms online mathematics can take. Therefore, at this point, it is important to recall from chapter one that face-to-face refers to a class that meets face-to-face for some or all of its instruction, and online refers to a class that does not meet in any physical way. Furthermore, these online classes utilize the internet in two very distinct ways: synchronously and asynchronously. I will use synchronous to refer to classes where everyone meets and participates simultaneously, as in a face-to-face class or online using a video conferencing service. I will use asynchronous to refer to classes where there are no class meetings held with everyone in attendance at the same time. These asynchronous courses are typically offered online and offer students a variety of learning activities. Thus, face-to-face will imply synchronous in-person instruction, and online will indicate asynchronous online instruction.

As stated earlier, there is not much research in online undergraduate mathematics instruction. I arrived at this conclusion after talking with colleagues, reviewing online journals, and searching using the Google Scholar search engine. There are also references to the lack of research on online undergraduate mathematics in the literature (Shea & Bidjerano, 2018; Trenholm, Peschke, et al., 2019). Of the research conducted on online undergraduate mathematics courses, most has examined how successful online undergraduate mathematics courses are compared to face-to-face courses (Shea & Bidjerano, 2018; Trenholm, Hajek, et al., 2019). There has also been some research on how instructional choices impact these online courses' quality (Baran et al., 2011; Engelbrecht & Harding, 2005; Testone, 2019; Trenholm et al., 2016). Finally, how students' characteristics impact their success (Cho & Heron, 2015; Glass & Sue, 2008) has also shown up in the literature. To review this foundation of research, I will organize it into the following three parts: (a) comparison of student success between face-to-face and online, (b) how instructional choices impact online courses, and (c) how student characteristics impact success.

Comparison of Student Success Between Face-to-face and Online

There have been concerns from both instructors and researchers about teaching students online. Concurrently, from my own experience in recent years, there has been more pressure placed on institutions to increase their retention and degree completion rates while keeping costs low, to say nothing of the huge motivation caused by the coronavirus pandemic. These concerns and motivations have led to research comparing online to face-to-face courses (Shea & Bidjerano, 2018; Trenholm, Peschke, et al., 2019).
Thus, most of the literature in online undergraduate mathematics has been found to compare online learning with face-to-face learning (Trenholm, Peschke, et al., 2019) to ensure that this leap into online instruction does not hurt students' achievement. In the following paragraphs, I will examine both of these points by introducing two studies that capture the research that has been done in online learning (Shea & Bidjerano, 2018; Trenholm, Peschke, et al., 2019).

Trenholm, Peschke, and colleagues (2019) conducted a large-scale meta-analysis of the research on online undergraduate mathematics courses taught from 2000 to 2015. Trenholm, Peschke, and colleagues' review of the literature revealed results consistent with my findings that research in online mathematics instruction is scant. They found an array of articles focused on performance differences between online and face-to-face courses; however, only some of these—a mere 2.2%—were focused on online instruction. Many were studies interested in the use of some form of online learning software, such as computer-aided homework systems. While computer-aided homework systems are an essential feature of the online learning environment, I understand this research to be considerably different from research on online courses. The bulk of their meta-analysis focused on challenging the popular claim that there is no statistical difference between the effects of online undergraduate mathematics instruction and face-to-face instruction on student achievement as determined by grades and pass/fail rates (Trenholm, Peschke, et al., 2019). The result of their study was that the claim of no statistical difference between online courses and face-to-face courses, using grades or pass/fail rates, is unfounded and that online students have lower levels of success compared to their face-to-face counterparts.

In addition to the Trenholm, Peschke, and colleagues (2019) meta-analysis, Shea and Bidjerano (2018) studied degree completion rates and how they compared at 30 community colleges in New York. This study was not conducted solely on undergraduate mathematics courses but on online undergraduate courses in all subjects at these colleges. Shea and Bidjerano focused on the ratio of online courses to face-to-face courses that a student takes and which ratio would result in the highest probability of degree completion; they found that students taking roughly 40% of their coursework online had the highest chance of completing their degree. Furthermore, this completion rate lowered with higher proportions of online coursework. Shea and Bidjerano's findings differ from Trenholm, Peschke, and colleagues' finding that online students have lower success levels than their face-to-face counterparts. Shea and Bidjerano, studying across disciplines, highlight that students taking some of their coursework online, roughly 40%, increased their success in degree completion. The contrast between the two studies is interesting and leads me to conjecture that different disciplines have different levels of success in their online offerings.

In summary, I have found that there is not much research on teaching undergraduate mathematics online and that the existing research showed online courses to have lower success rates than traditional face-to-face courses (Shea & Bidjerano, 2018; Trenholm, Peschke, et al., 2019). Nevertheless, enrollment in these courses increased by 30% at colleges and universities between 2012 and 2015 (Trenholm, Peschke, et al., 2019).
The conclusion that I draw from this research is that the increase in enrollment, combined with the less than desirable achievement of online undergraduate mathematics courses, points to the necessity of improving the quality of online undergraduate mathematics instruction, which is the present study's focus. The next section will discuss some of the research on characteristics that impact quality in online undergraduate mathematics.

Instructional Choices' Impact on Online Courses

As I have discussed in the previous paragraphs, little research has been done on instruction in online undergraduate mathematics courses (Trenholm et al., 2016). This research suggests that teaching mathematics online is difficult because it takes many different skills and technologies, each with its own affordances and constraints (Engelbrecht & Harding, 2005). It has also been identified that the instructor's choices of assessment feedback and modality of instruction have significant effects on these courses' success (Testone, 2019; Trenholm et al., 2015). This section will discuss current online instruction and some research on online instruction (Engelbrecht & Harding, 2005; Testone, 2019; Trenholm et al., 2015, 2016). These articles show that instructors' level and focus of education and their employment status impact the implementation of active learning instructional strategies in the face-to-face classroom; they also parallel some of the issues found in the implementation of online learning.

Online instruction requires different skills. It has been found that online instruction requires a different set of skills than teaching face-to-face courses (Engelbrecht & Harding, 2005) and that many of the instructors who do teach online are trying to replicate the instructional practices they use in their face-to-face courses (Trenholm et al., 2016). Trenholm and colleagues (2016) found this by interviewing over 70 undergraduate mathematics instructors, asking them to compare their instruction in their online courses versus their face-to-face courses. They report that many of these instructors spent time trying to replicate practices from their face-to-face courses in their online courses, such as replacing lectures with videos and group work with online discussion boards, but did not see the same student achievement in terms of grades and pass/fail rates (Trenholm, Peschke, et al., 2019). The online and traditional face-to-face environments are different. While it is understandable that these instructors chose to replicate practices from their face-to-face classroom (e.g., using discussion boards as a way of replicating group work), it is also understandable that these practices may not work the same way in an online course, because the two environments offer different amounts of interaction among students and between students and the instructor.

Communication has a significant impact on learning. Trenholm and colleagues (2016) identified frequent communication with the instructor as one of the major differences between the face-to-face learning environment, which had a lot, and the online learning environment, which had very little. Other research suggests that clear communication is a significant factor impacting the quality of online instruction by giving the student more interaction with the iterative process of formative feedback (Testone, 2019).
These points about communication emphasize the importance of high-quality—detailed—assessment feedback in the online classroom. As referenced before, Trenholm and colleagues (2015) investigated, through a large-scale survey of 70 undergraduate mathematics instructors, the use of assessment feedback in online undergraduate mathematics courses. In many of these courses, students' assessment feedback came from web-based homework that offered instant feedback (Trenholm et al., 2015). In these cases, the feedback was given by a computer program, was short and unspecific, and was not aimed at furthering the student's knowledge. For example, students would know whether they answered a problem correctly, but they would not be given feedback that would help them answer the problem correctly on their next attempt (Trenholm et al., 2015). Not having personal feedback or the perceived presence of an instructor was found to have a negative impact on student success (Testone, 2019). Testone (2019), drawing on personal experience, suggests that clear communication from the instructor is an essential piece of online instruction; however, Trenholm and colleagues (2016) found that much of the communication given to students is computer-based and that even the discussion boards students primarily use to communicate offer little communication with each other or with their instructor.

In a face-to-face classroom, the instructor is present during the lecture; in the online classroom, the common adaptation of this popular face-to-face instructional method is the lecture video. Lectures are a widely used instructional method in face-to-face undergraduate mathematics courses and are usually thought of as the antithesis of active learning; however, there are ways to place active learning within a lecture format. For example, peer instruction, in which students use clickers and interact with their peers, is one tool that has been designed to increase the level of active learning in large lecture formats. This example of how active learning can be injected into a traditional lecture to increase student participation allows us to see what is lost when a lecture is delivered through a recorded video format; the difference in success between videos and in-person lectures might be because lecture videos do not provide these valuable opportunities for student interaction.

Lecture videos in online courses. Lecture videos are used widely in online undergraduate mathematics courses (Draus et al., 2014; Hegeman, 2015; Trenholm, Hajek, et al., 2019). While their use has been shown to have a negative impact on students' grade performance in traditional face-to-face courses, because students were choosing to view the lecture instead of attending class (Trenholm, Hajek, et al., 2019), they have been found to have a slightly positive impact on students' grade performance in online undergraduate mathematics classes, increasing grades by 3.2% (Hegeman, 2015). Interestingly, Hegeman (2015) found in their university study that the addition of instructor-created lecture videos had a positive effect on students' perceived value of the course and increased their engagement in discussion boards, even though Hegeman did not consider the 3.2% increase in grades to be very significant. Draus and colleagues (2014) found similar positive results comparing instructor-created videos to publisher-created videos.
The instructor-created videos had a positive impact on students' engagement and performance (Draus et al., 2014). These findings show that lecture videos created by the instructor help improve performance in online classes; however, it needs to be emphasized that, when compared to attending traditional lectures, instructor-created videos still did not provide the same quality of instruction (Trenholm, Hajek, et al., 2019). This suggests that more research needs to be done on how instruction in online classes can position the student more actively in their learning.

How Student Characteristics Impact Success

Although it may seem unnecessary to state, it has been found that students play a more prominent role in their own success in online undergraduate mathematics courses than in traditional face-to-face courses (Cho & Heron, 2015; Glass & Sue, 2008). One of the factors found in student success in online courses was the self-regulated nature of online learning, where a student's success as a self-regulated learner refers to the student's ability to "set goals, plan ahead, and consistently monitor and reflect on their learning process" (Cho & Heron, 2015, p. 81). Cho and Heron (2015) analyzed students in remedial online undergraduate mathematics courses and found that the level of self-regulated learning was the most important factor impacting student success. Self-regulated learning factors were found to have a much more significant impact on students' success than the instructor's instructional choices (Cho & Heron, 2015). Another study, on university business students taking an online mathematics course, showed that students' participation increased when the course had an online homework system that provided instant feedback; still, students also mentioned that the lack of partial credit was disappointing (Glass & Sue, 2008). Glass and Sue (2008) report in their discussion that required discussion boards increased student participation in the course and found that this also increased student satisfaction. I conjecture that the increased interaction between instructors and students in these discussion boards leads to higher participation and success rates because it increases how much feedback students receive from other students and the instructor and demands that students paraphrase their thinking into words. Furthermore, Cho and Heron (2015) recommended that instructors build opportunities for interaction early in their online mathematics courses to help develop students' self-confidence. This is consistent with findings from other studies about feedback having an essential role in students' success (Trenholm et al., 2015).

In this section, I have outlined research in online undergraduate mathematics. This research has supported my statements that, while the performance of online undergraduate mathematics courses has been studied, instructional strategies in online undergraduate mathematics courses have received little attention. Furthermore, research on instructional strategies and their lack of use in face-to-face classrooms lends researchers a roadmap for researching the use of online instructional strategies. The next section will discuss how research from the face-to-face classroom might inform online mathematics instruction.

Research in Face-To-Face Classrooms Can Inform Online Instruction

Previously, I discussed research in both instructional strategies in the face-to-face classroom and online undergraduate mathematics courses.
By highlighting some of the obstacles active learning instructional strategies have faced in becoming popular in the face-to-face classroom, I argue that, because many of the same obstacles exist for the online learning environment, they might impact the implementation of instructional strategies there too. Additionally, I proffer that one of the ways to improve online undergraduate mathematics instruction is to focus on what makes active learning strategies successful in the face-to-face classroom and to provide mechanisms in the online environment that offer students the same opportunities. Now, I will draw parallels between these research bodies that might help illuminate the importance of studying the choices instructors make in their teaching presence and how these choices impact students' social and cognitive presences. The two parallels that I will focus on are how important it is to create opportunities for students to rephrase and synthesize their mathematical problem-solving strategies and the crucial roles that the instructor plays in online instruction.

I posit that the research in active learning converges around the need to create an environment for students to actively engage in problem-solving activities (Fong & Visher, 2013). Much of this movement has manifested in the use of peer instruction (Vickrey et al., 2015) or in asking students to work in cooperative learning groups (Felder & Brent, 1996). In both of these instructional strategies, designing questions congruent with inquiry-based learning has been recommended and is widely used (Aditomo et al., 2013). The key to their success has been the requirement that students rephrase and synthesize their mathematical problem-solving strategies while discussing them with their classmates (Kogan & Laursen, 2014). I proffer that this rephrasing and synthesizing process is at the heart of the success of each of these three active learning instructional strategies, and it is this process that must be attended to when creating new and innovative online undergraduate mathematics instructional strategies.

Now, I will provide two examples of how current instructional strategies used in online undergraduate mathematics courses do not share their face-to-face counterparts' impact. The first example is the use of online discussion boards, and the second example is recorded video lectures. Online discussion boards have been one of the strategies found in the research (Trenholm, Peschke, et al., 2019) that attempts to create space, like the space cooperative learning groups offer in the face-to-face environment, for students to rephrase and synthesize their mathematical ideas in online undergraduate mathematics courses. However, I argue that the feedback students receive on their rephrased ideas from other students or the instructor—as vital as it has been found to be (Trenholm et al., 2015)—comes too late to reinforce students' learning in the online classroom as effectively as it does in the face-to-face classroom (Springer et al., 1999) due to the asynchronous nature of online learning. This example illustrates how challenging online instruction is and how face-to-face instructional strategies may not be replicable in the online learning environment. Therefore, other mechanisms need to be created for students to work collaboratively.
This challenge is enough on its own; however, it must not be forgotten that a large and complex piece of this puzzle is the role of the mathematics instructor and their adoption of research-based instructional strategies. The role of the instructor in mathematics courses has been pointed to in both bodies of research. In the face-to-face classroom, the research suggests that the instructor limit their lecture time to make opportunities for students to discuss their problem-solving ideas (Freeman et al., 2014); in the online learning environment, communication between instructor and student, such as formative assessment feedback, has been identified as necessary for increasing student success (Testone, 2019).

To further complicate how instructional strategies play out in different learning environments, it has been found that video lectures have a slightly positive impact on students' performance in online undergraduate mathematics classes (Hegeman, 2015), while these same video lectures lower success rates when provided to face-to-face courses (Hegeman, 2015). Hegeman (2015) posits, and I agree, that this may be because students use lecture videos as a replacement for face-to-face lectures instead of as a supplement, which would not be the case in an online undergraduate mathematics course where the videos are the only lectures students are provided. It still seems puzzling that a student supplied with only video lectures in an online undergraduate mathematics course would do better than a student choosing to rely only on video lectures in a face-to-face undergraduate mathematics course. This provides an example of an instructional strategy working in one environment and not in another, supporting my assertion that instructional strategies may not be universally usable in both the face-to-face and online learning environments. These findings point to the need both for increased feedback from the instructor in the online learning environment and for space to be created for students to rephrase and synthesize their mathematical problem-solving strategies in the online learning environment.

In this second foundational section, I have presented research conducted in the online learning environment, in mathematics and otherwise, and how it points to two of the most critical factors that online instructional strategies need to employ to have a positive impact on student success: an increase in instructor feedback and communication, and the creation of a place for students to rephrase and synthesize their mathematical problem-solving strategies. I also showed how the replication of face-to-face instructional strategies in online learning environments has been found to have different success outcomes in online undergraduate mathematics courses. In the next and last foundational section of this chapter, I will describe the community of inquiry framework and how it assists the present study's focus of discovering how the decisions instructors make when designing their teaching presence impact the opportunities that are created for students to communicate and learn in the social presence and cognitive presence.

The Community of Inquiry Framework

First, to best understand the community of inquiry framework used in the present study, it is important to have at least a brief understanding of its roots and history.
The ideas in the framework that was developed for the online learning environment (Akyol, Garrison, & Ozden, 2009) were initially presented as the community of inquiry by Lipman (2003) to describe learning in the face-to-face classroom. Moreover, these ideas have been linked closely to Dewey (1933) (Kennedy, 2012). Dewey (1933) described the learning environment as habit formation (Kennedy, 2012), which requires three main components as part of its method: (a) establishing conditions, (b) arousing curiosity, and (c) promoting the flow of suggestion (Dewey, 1933). The establishment of these conditions is an essential piece that allows the formation of intellectual thought and inquiry (Dewey, 1933). Although there is not a one-to-one correspondence between Dewey's three categories and Lipman's (2003) three categories that make up the community of inquiry framework, it is clear that Lipman draws upon Dewey's philosophy of learning (Kennedy, 2012). The categories that will remain a primary focus of the present research are teaching presence, social presence, and cognitive presence (Anderson et al., 2001; Lipman, 2003). Although each of these will receive attention below, it is crucial to establish their lineage now. Lipman created each of these categories to describe the environment in which learning takes place in the face-to-face classroom. Later work brought these ideas to research on blended and fully online learning environments (Akyol, Garrison, et al., 2009; Anderson et al., 2001; Garrison et al., 2010). These ideas culminated in the framework used for the present study (Akyol & Garrison, 2008; Garrison, 2017; Garrison, Cleveland-Innes, Koole, & Kappelman, 2006). Please see figure 2-1 below for a more detailed illustration of how these three categories are perceived to interrelate.

Figure 2-1 Elements of the Community of Inquiry Framework

Note. The figure depicts the three elements of the community of inquiry framework intersecting to make up the educational experience: teaching, social, and cognitive presence. The intersection between teaching presence and cognitive presence is direct instruction, between cognitive presence and social presence is creating meaning, and between social presence and teaching presence is facilitating discourse.

Most importantly, I organize the instructors' decisions for this research and their impacts on students using the community of inquiry framework (Garrison, 2017). This framework will now be discussed in more detail, with each of its major components (teaching, social, and cognitive presence) as the focus. Recall each definition that the present study uses. Teaching presence is "the design, facilitation and direction of cognitive and social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes" (Anderson et al., 2001, p. 5). Social presence is the ability of participants to identify with a group, communicate openly in a trusting environment, and develop personal and affective relationships progressively by projecting their individual personalities (Garrison et al., 2010). Cognitive presence is "the extent to which learners can construct and confirm meaning through sustained reflection and discourse in a critical community of inquiry" (Anderson et al., 2001, p. 11). Now, I will describe research on each below.
Teaching Presence

The instructor plays an essential role in creating a learning environment through how they design the course, facilitate learning and communication, and present content. The community of inquiry framework attributes these actions to the teaching presence (Akyol, Garrison, et al., 2009). Teaching presence has been described as "the design, facilitation and direction of cognitive and social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes" (Anderson et al., 2001, p. 5). It is perceived to influence social and cognitive presence (Akyol & Garrison, 2008). This perceived influence prescribes some crucial responsibilities to teaching presence: the design and facilitation of instruction to promote social processes in order to achieve learning outcomes (Anderson et al., 2001; Garrison et al., 2010). The design and facilitation include creating curriculum content and learning activities and establishing timelines (Garrison et al., 2010). Facilitation of social processes is the "monitoring and managing purposeful collaboration and reflection" (Garrison et al., 2010). Altogether, the course design and the facilitation of social processes aid in "ensuring that the community reaches the intended learning outcome" (Garrison et al., 2010). The course design in an online classroom, represented as a website, describes the instructional tools and activities the students interact with while in the online classroom (Wertz, 2022). Teaching presence has further been noted as an important and influential piece in students' satisfaction and learning experiences (Swan & Ice, 2010).

Social Presence

Social presence was one of the earliest foci of online learning research because communication among students and community building were doubted the most in the online learning environment (Garrison et al., 2010). After all, it was not entirely accepted that students would be able to communicate with each other effectively in meaningful ways. This skepticism persists in the culture today and is one of the foci of the current study. Garrison and colleagues (2009) describe social presence as "the ability of participants to identify with the community (e.g., course of study), communicate purposefully in a trusting environment, and develop inter-personal relationships by way of projecting their individual personalities" (p. 32). While most researchers (Deris, Zakaria, & Wan Mansor, 2012; Tirado Morueta, Maraver López, Hernando Gómez, & Harris, 2016; Wertz, 2022) have adopted the previous definition to some extent, others researching the online learning environment have defined social presence more specifically to their needs as "the degree to which participants in computer-mediated communication feel affectively connected one to another" (Swan & Ice, 2010, p. 1). The question of how social presence is affected by online learning environments is still largely unanswered (Wertz, 2022).

Cognitive Presence

Cognitive presence is shaped by the teaching presence and social presence (Garrison et al., 2010) and refers to a student's ability to demonstrate knowledge by conferring meaning (Anderson et al., 2001). Most accept Garrison and colleagues' (2009) definition of cognitive presence in the online learning environment as being "conceptualized as the extent to which learners are able to construct and confirm meaning through sustained reflection and discourse" (Arbaugh et al., 2008; Swan & Ice, 2010).
To be studied, cognitive presence is broken down into four linear phases: (a) triggering event, (b) exploration, (c) integration, and (d) resolution (Garrison et al., 2006). Each of these describes an event that takes place while a student is learning a new concept. When the student comes across a new idea or concept, this creates a triggering event. The student then must explore their new and current knowledge to find helpful information. This information must then be integrated into the concept to come to a resolution (Tirado Morueta et al., 2016).

When taken together, the three elements of cognitive presence, social presence, and teaching presence make up the community of inquiry framework used in the present study. Table 2-1 below shows the framework described in the categories used for coding. This table was adopted for the current research from a refined coding scheme used by Garrison, Cleveland-Innes, Koole, and Kappelman (2006) for transcripts.

Table 2-1 Community of inquiry framework used for coding

Elements             Categories                 Example Indicators
Cognitive Presence   Triggering event           Sense of puzzlement
                     Exploration                Information exchange
                     Integration                Connecting ideas
                     Resolution                 Apply new ideas
Social Presence      Affective                  Expressing emotion
                     Open communication         Risk-free expression
                     Group cohesion             Encouraging collaboration
Teaching Presence    Design and organization    Setting curriculum and methods
                     Facilitating discourse     Sharing personal meaning
                     Direct instruction         Focusing discussion

Note. Adopted from "Revisiting Methodological Issues in Transcript Analysis: Negotiated Coding and Reliability," by Garrison, Cleveland-Innes, Koole, and Kappelman (2006)

Conceptual Framework

The literature reviewed in this chapter suggests that students learn best through meaningful engagement with content, each other, and the instructor. The literature on active learning instructional strategies provides examples of ways instructors can create these opportunities for meaningful engagement in the face-to-face learning environment. Furthermore, I proffer that the unifying principle that makes each of these active learning instructional strategies successful is the creation of opportunities for students to rephrase and synthesize their mathematical ideas. I conceptualize that these opportunities are created in an online undergraduate mathematics course through the instructor's decisions when they choose what activities, assessments, and resources will make up the elements of their virtual classroom.

To aid in my analysis of the choices instructors make in their virtual classrooms and how they impact students, I will view these choices, and their impacts, through the community of inquiry framework (Garrison, 2017). This positions the instructor's choice of activities, assessments, and resources as elements of the teaching presence in the online undergraduate mathematics learning environment. Furthermore, this teaching presence creates the setting for social presence. And together, teaching presence and social presence provide opportunities for students' cognitive presence to be impacted positively through their interactions in social discourse and with the course's activities, assessments, and resources.
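To make the coding scheme in Table 2-1 above concrete, the minimal sketch below shows one way its elements, categories, and indicators could be represented programmatically when coding transcripts. It is purely illustrative: the nested dictionary mirrors Table 2-1, but the helper function and code labels (e.g., code_excerpt, "CP:Exploration") are hypothetical conveniences of my own, not part of the analysis tooling actually used in the present study.

```python
# Illustrative representation of the Table 2-1 coding scheme. The nested
# dictionary mirrors the table; the helper validates that a code applied to
# a transcript excerpt actually belongs to the framework.
COI_SCHEME = {
    "Cognitive Presence": {
        "Triggering event": "Sense of puzzlement",
        "Exploration": "Information exchange",
        "Integration": "Connecting ideas",
        "Resolution": "Apply new ideas",
    },
    "Social Presence": {
        "Affective": "Expressing emotion",
        "Open communication": "Risk-free expression",
        "Group cohesion": "Encouraging collaboration",
    },
    "Teaching Presence": {
        "Design and organization": "Setting curriculum and methods",
        "Facilitating discourse": "Sharing personal meaning",
        "Direct instruction": "Focusing discussion",
    },
}

def code_excerpt(element: str, category: str) -> str:
    """Build a short transcript code (e.g., 'CP:Exploration'), rejecting
    any element/category pair not defined in Table 2-1."""
    if category not in COI_SCHEME.get(element, {}):
        raise ValueError(f"{category!r} is not a category of {element!r}")
    initials = "".join(word[0] for word in element.split())  # 'CP', 'SP', 'TP'
    return f"{initials}:{category}"

print(code_excerpt("Social Presence", "Open communication"))  # SP:Open communication
```

Validating each code against the scheme in this way is one simple guard for keeping transcript coding consistent with the framework, which matters when multiple coders negotiate codes as Garrison and colleagues (2006) describe.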
Therefore, one way to think of this research in terms of the community of inquiry framework would be to say that I am studying what the teaching presence looks like in an online undergraduate mathematics learning environment and how that teaching presence impacts the creation of the social presence and cognitive presence that students' learning requires. Therefore, this research on instructors' choices and their impact on students will focus on answering the following three research questions.

1. How does teaching presence manifest itself in an asynchronous online undergraduate mathematics course?
2. How do students report interacting with asynchronous online undergraduate mathematics courses' activities, assessments, and resources, and how does this impact their social presence?
3. How do these students report their cognitive presence being influenced by the asynchronous online undergraduate mathematics course's activities, assessments, and resources?

In answering these questions, this research contributes to both the research and teaching practice communities. First, it demonstrates the usefulness of applying the community of inquiry framework to an online undergraduate mathematics course. Second, it contributes to the practice of teaching online undergraduate mathematics courses by uncovering some of the impacts that instructors' choices of activities, assessments, and resources have on students' social and cognitive presence in the online undergraduate mathematics learning environment. These contributions will help researchers and instructors progress in studying and increasing the quality of online undergraduate mathematics instruction. In the next chapter, I will describe the method by which I carried out this research.

CHAPTER 3: METHOD

This research aims to understand the choices instructors make to represent their teaching presence when teaching an online undergraduate mathematics course and how these decisions impact students' social and cognitive presence. This research is focused on answering the following three research questions.

1. How does teaching presence manifest itself in an asynchronous online undergraduate mathematics course?
2. How do students report interacting with asynchronous online undergraduate mathematics courses' activities, assessments, and resources, and how does this impact their social presence?
3. How do these students report their cognitive presence being influenced by the asynchronous online undergraduate mathematics course's activities, assessments, and resources?

These questions, when answered together, are designed to support an understanding of how teaching presence manifests itself in a fully online asynchronous undergraduate mathematics learning environment and how its presence impacts students' social and cognitive presence. To answer these research questions, I collected data from three sources beyond course artifacts such as the syllabus: (a) a selection of questions from the course surveys, (b) course usage data, and (c) instructor and student interviews. The instructor and student interviews were the study's leading source of data. The course artifacts, course usage data, and survey question data were selected to support the interview data by triangulating the arguments used to support each of the four claims and, ultimately, answering each of the three research questions. The four claims will be mentioned briefly in this chapter, and they will be discussed in detail in chapter 4.
These data sources were collected from the instructor and students of a specific introductory undergraduate mathematics course taught online in the Spring 2021 semester at a large midwestern university. In the following subsections, I will describe the participant population and why the participants were selected, each of the data sources and why they were chosen, and how each of the collected data sources was analyzed.

Selection of Participants

This research aims to understand how instructors' choices represent their teaching presence when teaching an online undergraduate mathematics course and how these decisions impact students' social and cognitive presence. To aid in this understanding, I sought an undergraduate mathematics course with high enrollment that was intentionally created for fully online asynchronous instruction. I felt that it was important that the course was intentionally created for asynchronous online instruction because that would give me the best chance to collect reports from students exposed to a variety of instructional elements designed for asynchronous learning environments. The alternative, a course that was not intentionally developed for an asynchronous learning environment, might only offer a representation of students responding to instructional elements likely created for a face-to-face classroom and then implemented in an online environment. I also sought a course with high enrollment to maximize the number of data points collected and to give myself a more extensive selection pool of willing interview participants.

The chosen course was recently developed for fully online asynchronous instruction in response to the Covid-19 pandemic. I will identify this course as Math 101 to align with its characteristic of being a typical introductory mathematics course for non-mathematics majors that one might find at any college or university. From conversations with one of the professors involved in the course's development, I surmised that the course had been designed for asynchronous online learning in response to the Covid-19 pandemic by a small group of instructors at a large midwestern university. This recent purposeful redesign gave the course, in my perspective, a higher chance of having a variety of instructional elements created for students. Furthermore, the Fall 2020 enrollment for the course was 929 students. This high enrollment solidified my feeling that there would likely be a high number of students enrolled during the data collection period, the Spring 2021 semester. This intentional development and high student enrollment made this course a good fit for my study because it gave me an opportunity to compare students' reports of different instructional elements and how these elements impact their social and cognitive presence.

Moreover, it is essential to note that all students enrolled in this course participated in the same format. Typically, at large universities, courses are offered in different formats, giving students a choice to select whichever format they prefer: online, face-to-face, or other blended and creative options. Although it could easily be argued that giving students the choice of format is beneficial for them, under the circumstances of my research, the nonexistence of this choice is beneficial because the group of participants is not limited to those students who self-selected the online format in the face of many other choices.
Instead, this group of participants comprises all students taking Math 101, giving the data the best opportunity for diverse perspectives.

Description of Participants

Participants in this study are the instructional team involved in Math 101 and the students who took Math 101 in the Spring 2021 semester. As discussed in the following data collection section, each of these participant groups was studied at various levels. All the student population's actions in the course are contained in the course usage data. A subset of the student population is represented in the survey data—those who chose to complete the surveys. Moreover, another subset of the student population is represented in the student interview data, the richest source. The primary instructor of record was interviewed, and the remainder of the instructional team is represented only by secondhand accounts or website biographies found on the online course website.

Description of Instructional Team

The instructional team comprises a female lead instructor supported by a small team of undergraduate students. The lead instructor, who will be referred to as the instructor from here on, is a full-time faculty member at the institution with over ten years of teaching experience and a master's degree in mathematics. The undergraduate students were made up of past students who had successfully completed Math 101 and other students who are mathematics majors at the university. Of the undergraduate students, seven were female, and four were male. Although only mentioned by the instructor in the instructor interview, other full-time faculty members and graduate students were involved in the development and assessment of Math 101. This group was not described in detail, but at least some of its members were faculty and graduate students in the department of mathematics education.

Description of Student Population

The students in Math 101 are best described as first- and second-year, non-mathematics-major students at the university. Although this is a reasonable description of the population, some did not fit this description, such as some third- and fourth-year students. The university undergraduate population, of which these Math 101 students are a subset, can be described in general as having approximately thirty-six thousand undergraduate students; 51% female and 49% male; 67% white, 7.7% black or African American, 9.5% international, 6% Asian, 5% Hispanic, 3% multi-ethnic, and 1% other or unknown; and having over 81% of its population between the ages of 18 and 24 ("Michigan State University demographics and diversity report," 2021).

Description of Interview Participants

The interview participants were students in Math 101 who were willing and able to meet for an interview conducted through Zoom. In terms of time in college, they ranged from first-year students in their second semester to fifth-year students in their last semester of college. There were sixteen total student interview participants, of whom seven were male and nine were female. Furthermore, as determined in the interview process, thirteen were from the United States, and three were international students: two from China and one from Greece. Of the students from the United States, one was a black male; all others were white. The pseudonyms for each interview participant are given in table 3-1 below.
Table 3-1 Names of interview participants

Kent      Pearl     Jennifer   John
Bree      Suzy      Joe        Lucas
Beverly   Shawn     Olivia     Feng
Sam       Cheng     Hilary     Joanne

Data Collection

Available course artifacts and three types of data were collected to inform the research questions: a selection of the course survey questions, course usage data, and instructor and student interviews. The course survey questions give a proxy for how students respond to different teaching presence elements such as assessments and resources. The course usage data provided the number of students who accessed each of the course elements. The instructor interview provided an understanding of the motivation that went into selecting the teaching elements. Furthermore, the student interviews gave a detailed understanding of how the different activities, assessments, and resources impact the students' social and cognitive presence. This research was approved by the Institutional Review Board and was determined to be exempt. Further detail on how these data sources support the research questions and how these data were collected is described below. A timeline for data collection can be found in table 3-2 below, followed by a description of how each data source was collected. A general timeline for the entire research project can be found in Appendix E.

Table 3-2 Data collection timeline

January 15 (Survey Data): I checked in with the instructional team to ensure that each of the questions on the surveys was going to be deployed.
February 24 (Interview Data): I interviewed the instructor through Zoom.
March 15 – April 9 (Interview Data): Using one of the completed course surveys, I identified willing student interview participants and, through email, recruited them for an interview.
March 16 – April 16 (Interview Data): Student interviews were conducted.
May 3 (Course Usage Data): I contacted the university's technology team to pull course usage data from Math 101's online course learning platform.
May 12 (Survey Data & Course Artifacts): I downloaded the completed survey data from the Qualtrics site and the course artifacts from the Math 101 website.
May 14 (Course Usage Data): The university's technology team delivered the course usage data from Math 101's online course learning platform.

Course Artifacts

After the course was complete, on May 12, 2021, I downloaded some course artifacts: the syllabus and a course overview. The syllabus was downloaded as a Portable Document Format (PDF) file; however, to save an overview of the course, I screen-recorded a video while scanning through each of the areas of the Math 101 course that students could visit. These areas included the prominent announcement messaging pages linked at the top of the course's website, a top-level scan of each week, and a deeper scan into many of the weeks so they could be referenced later.

Survey Data

Students in Math 101 participated in weekly surveys as part of their course requirements. Survey responses were gathered from all survey respondents using web-based software called Qualtrics. A list of the survey questions used in this research can be found in Appendix A. Below, I describe why these survey questions were selected to support research question 2 and give a brief description of what was learned from the pilot data collection.

Survey question selection. Each of the survey questions is listed in Appendix A.
These survey questions were selected from the existing body of survey questions in the course because they focus on how students report interacting with Math 101's activities, assessments, and resources. Moreover, the selected questions ask students how they describe and feel about their engagement with the content of the course, other students in the course, and the instructional team. These questions, taken together with the student interviews, develop a description of how these elements of the teaching presence impact students' social and cognitive presence in Math 101. As part of Math 101, there were many questions that students were asked during the surveys. When I inspected each survey, I looked for all questions that were even remotely related to my research questions. Many questions were unrelated, such as those asking students whether they felt the grading in the course was fair or whether they appreciated that their assignments were due on Wednesdays. Instead, I searched for and made notes of questions pertinent to the present research, such as those asking students if they worked in groups on the project, how they communicated with other students, and if they communicated with their instructor. A more detailed list of the questions I considered and collected can be found in Appendix A. As you will read in chapter 4, only a subset of the survey questions considered and collected were used to support the research analysis.

Pilot data collection. In preparation for this dissertation research, I gave a subset of the survey questions to the students in Math 101 at the studied university. This administration of the survey questions allowed me to gain confidence in the distribution and response rate of the survey because of its high participation rate: 85% of the 929 students enrolled in Math 101, roughly 790 students, responded to the survey questions in the Fall 2020 semester. This pilot data was helpful in gaining an understanding of student participation and the administrative details of how the surveys would be deployed, as well as how I was going to be able to access and download the results. Specifically, as you will read in the section on student interviews, I learned from this pilot data that I would be able to successfully identify students who were willing to be contacted to participate in a student interview using part of a survey given early in the semester. This allowed me to avoid sending out separate messages to all students as an outsider, making for more reliable communication as part of the natural communication of Math 101.

The surveys were conducted and collected as part of Math 101's process as determined by the instructional team, with the addition of asking students if they were willing to participate in a short interview. I accessed these surveys twice throughout the semester: first, on March 15, 2021, to determine which students elected to be contacted for an interview, and second, on May 12, 2021, to download each of the survey data files from the Qualtrics website.

Course Usage Data

The course usage data was used to give an idea of which elements of the teaching presence were accessed by most students, to help answer research question 3. The university's technology team gathered this data from the learning management system's course page for each of the Spring 2021 sections of the Math 101 course once the semester was completed. It was delivered as one Excel file on May 14, 2021.
The Excel file consists of three columns: the first indicated the name of the course element associated with each link, the second indicated how many students had access to the link, and the third indicated how many students accessed it. As can be surmised from this description, these counts of how many students accessed each course element were collected in aggregate across all sections of Math 101 and are not specific to any student. This data source was selected after conversations with researchers and with the university's technology team that has access to these data, because it was the only identified way to observe student activity in a course that is fully asynchronous. The course usage data assisted in answering research question 2 by indicating the quantity and duration of access of each of the course elements.

Interview Data

Each of the interviews was conducted using Zoom video conferencing software. Each interview participant received a link to the Zoom meeting five to seven minutes before their scheduled interview time. After an appropriate amount of time for greetings, interview participants were asked to confirm audibly that they were giving permission for the interview's audio and video to be recorded. Then, the Zoom transcription feature was turned on to automatically transcribe the audio into text. After each interview, all the data (i.e., video, audio, and text files) were processed by the Zoom software and saved to my computer. The following paragraphs detail how the student interview data and the instructor interview data were collected.

The student interview data were collected between March 16 and April 16, 2021, using the process described in the previous section. This time window was intentionally planned for the latter half of the Math 101 course so students had time to better understand how they felt about their interactions with Math 101's teaching presence. During the planning phase of this research, it was determined that I would start by interviewing eight students and continue until data saturation had been reached; data saturation would be reached when I felt that a reasonably accurate picture of how students describe their experience had been achieved and no new descriptions of student experience materialized in the most recent interviews. However, after seeing how many students were willing to be interviewed and how easily these interviews could be scheduled, this strategy changed to interviewing all willing participants. Therefore, students were randomly selected from the pool of students who indicated that they were willing to participate in a short interview and placed into four contact groups (Group 1, Group 2, Group 3, and Group 4). Group 1 had nineteen students, Group 2 had forty students, Group 3 had thirty-seven students, and Group 4 had twelve students. The division of all willing participants into four smaller groups was done to accommodate the large number of students who indicated that they were willing to participate.
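As a concrete illustration of this random assignment, here is a minimal sketch in Python. The participant identifiers, the helper name, and the seed are hypothetical, and the dissertation does not state how the uneven group sizes were chosen, so they are simply passed in as given.

```python
import random

def assign_contact_groups(participants, group_sizes, seed=None):
    """Randomly order the willing participants, then slice them into
    contact groups of the given sizes (hypothetical helper; the study
    does not describe the exact mechanism that was used)."""
    if sum(group_sizes) != len(participants):
        raise ValueError("group sizes must account for every participant")
    rng = random.Random(seed)
    pool = participants[:]   # copy so the original list is untouched
    rng.shuffle(pool)        # a random ordering gives a random selection
    groups, start = [], 0
    for size in group_sizes:
        groups.append(pool[start:start + size])
        start += size
    return groups

# Example with the group sizes reported in the study (19 + 40 + 37 + 12 = 108).
willing = [f"student_{i}" for i in range(108)]   # placeholder identifiers
group1, group2, group3, group4 = assign_contact_groups(
    willing, group_sizes=[19, 40, 37, 12], seed=2021)
```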
On March 15th, just before the first week of interviews, Group 1 was sent the initial email directing students to set up a time for an interview. On March 19th, just before the second week, students in Group 2 were sent the initial email, and students who had not responded from Group 1 were sent the follow-up email. Just before the third week, on March 26th, students in Group 3 were sent the initial email, students who had not responded from Group 2 were sent the follow-up email, and students who had not responded from Group 1 were sent the final email. This pattern continued until each group had received each email in the progression, with the last final emails going out to Group 4 on April 9, 2021. Each of these emails can be found in Appendix D. The responses from the interviews were used to help answer research questions two and three.

The student interview questions can be found in Appendix B. These interview questions were designed first to establish a rapport, then to get the student talking about their experience in Math 101 in general, and finally to target each of the three presences in the community of inquiry framework (i.e., teaching presence, social presence, and cognitive presence) specifically. Rapport was built by asking how the student was doing in general and in Math 101 specifically. An open-ended question like, "can you walk me through a typical week in Math 101?" was used to get the student thinking about their class experience and talking about that experience. Teaching presence was followed up on after the question about the student's typical week if it was felt they were missing anything or had more to add about their experiences. Social presence was followed up on more specifically by asking the students how they communicated with the instructional team and with other students and how these communications made them feel. Finally, questions about which elements were beneficial for their learning and what helped the most when they were struggling were asked as a follow-up on their cognitive presence. Ultimately, the goal of each interview was to get the student participant talking about their experiences in Math 101 in relation to their interactions with the course elements, the instructional team, and other students. The responses to the student interview questions were designed to support answering research questions 1, 2, and 3.

The instructor interview was conducted on February 24, 2021, and the data were collected using the same process described at the beginning of this section. Although there were many instructional team members, only one instructor taught Math 101 in the Spring 2021 semester. The instructor interview questions can be found in Appendix C. The questions were designed to establish rapport and then focus on identifying the assessments and learning activities, the motivations for selecting them, and a description of the communication opportunities between the instructional team and students and among students in Math 101. In doing so, these elements helped establish the teaching presence of the course.
Because this was an interview with an instructor, I felt that the questions could be straightforward, such as, "what assessments do you use in Math 101?" The responses from the interview were used to help answer research question 1 and, partly, research questions 2 and 3.

Data Analysis

In the previous section, I described how I collected Math 101 course artifacts and data from three different sources, (a) student survey questions, (b) course usage data, and (c) instructor and student interviews, to answer my research questions. This section describes how I analyzed the collected data to support four different claims created while analyzing the data and how these four claims and their arguments were used to answer the research questions. I will organize each of the following analysis sections by data source, starting with a restatement of the description for each data source. Then, I will briefly describe how these analyses were used to create each of the four claims. For ease of reference, a table showing the connection between the claims, data sources, and research questions can be found in Table 3-3 below. Please note that the claims are discussed in detail in chapter 4; in this chapter, they are only described briefly and are included to show the connection between how the data were analyzed and how the research questions were answered.

Table 3-3 Connection between each data source, claims, and research questions
- Course Artifacts. Goal of analysis: to reference, in more detail, each course element. Claim argument use: Claim 1, Claim 4. RQ supported: RQ 1.
- Instructor Interview. Goal of analysis: to determine elements of the teaching presence and their intended effect on social presence and cognitive presence. Claim argument use: Claim 4. RQ supported: RQ 1, RQ 2, RQ 3.
- Student Interviews. Goal of analysis: to determine how the teaching presence elements affected social presence and cognitive presence. Claim argument use: Claim 1, Claim 2, Claim 3, Claim 4. RQ supported: RQ 1, RQ 2, RQ 3.
- Student Surveys. Goal of analysis: to determine how the teaching presence elements affected social presence and cognitive presence. Claim argument use: Claim 1, Claim 2, Claim 3. RQ supported: RQ 2, RQ 3.
- Course Usage Data. Goal of analysis: to determine the level of use of each of the elements in the teaching presence. Claim argument use: Claim 4. RQ supported: RQ 1.

Analysis of Course Artifacts

Each course artifact, such as the syllabus and the video of the Math 101 course website, was opened and read. While reading these course artifacts, I made notes of each instructional element. These notes were used to confirm details mentioned in the other data sources and were then checked against each presence profile for accuracy and to glean more detail. These presence profiles are discussed in more detail in the following two sections, analysis of instructor interview and analysis of student interviews.

Analysis of Instructor Interview

Research question 1 aims at understanding what the teaching presence looks like in Math 101. As discussed in chapter 2, teaching presence in an online course is represented by the activities, assessments, and resources an instructor provides in the design elements. Therefore, the instructor interview was intended to gain a deeper understanding of the elements that make up the teaching presence in Math 101 and their intended impact on the social and cognitive presences. Specifically, the instructor interview aimed at two things, (a) ensuring that I had an accurate record of what the teaching presence looks like and (b) exploring the instructor's reasoning for the choices they made when designing each element of the course's teaching presence, with an eye on how those choices were intended to impact social presence and cognitive presence.

The analysis of the instructor interview was done in two stages. First, starting with the coding scheme found in Table 3-4 below, I used inductive coding to code the transcripts into three separate rubrics: teaching presence, social presence, and cognitive presence. Each of these rubrics was kept in a Microsoft Excel file and contained each of the categories outlined in Table 3-4. Then each rubric's category was transferred into a corresponding profile.
Each of these profiles was kept in a Microsoft Word file where each of the categories was titled, followed by the transcript snippets. These transcript snippets were still direct quotes from the instructor interview, with timestamps still attached. At this time, short descriptions of each teaching presence element were written in the teaching presence profile because I felt that I had a relatively good grasp of what each of the teaching presence elements was. However, descriptions were not yet written in either the social presence profile or the cognitive presence profile because I did not feel that the elements of those presences had been described well enough for me to understand them clearly. Nothing more was done with these profiles until after the student interviews were brought through a similar process, described in the next section, analysis of student interviews. Below I give examples detailing how the instructor interview was coded during the initial inductive coding phase and then a short description of how the teaching presence profile was created in the more deductive axial coding phase.

During the instructor interview, I asked for a description of each of the course's design elements that make up the course's teaching presence, followed by a question asking why the design element was chosen. The interview protocol can be found in Appendix C. Then, I analyzed each of the responses by coding them into each of the three rubrics that I had created to represent the coding scheme suggested by Garrison, Cleveland-Innes, Koole, and Kappelman (2006). The community of inquiry coding scheme (Garrison et al., 2006, p. 5) is shown in Table 3-4 below.

Table 3-4 Community of inquiry coding scheme
- Cognitive Presence: Triggering event (sense of puzzlement); Exploration (information exchange); Integration (connecting ideas); Resolution (apply new ideas).
- Social Presence: Affective (expressing emotion); Open communication (risk-free expression); Group cohesion (encouraging collaboration).
- Teaching Presence: Design and organization (setting curriculum and methods); Facilitating discourse (sharing personal meaning); Direct instruction (focusing discussion).
Note. Adopted from "Revisiting Methodological Issues in Transcript Analysis: Negotiated Coding and Reliability," by Garrison, Cleveland-Innes, Koole, and Kappelman (2006).

Each of the rubrics created for the three elements (presences) has categories corresponding to the category titles and example indicators found in Table 3-4 above. Each notable statement was placed in one or more rubric boxes when analyzing the interview data. Here is an example from one of the instructor's responses, taken from the instructor interview transcript.

Interviewer: Are there other ways that [students] can collaborate with each other?

Instructor: There's the projects, they could opt into groups, and they're collaborating and then how they're earning that, optional, 10% by filling out a form at the end where they're reporting back on how they contributed to the group and how the other group members contributed, what technology they used to collaborate, and then they kind of rate each other on like respect for each other's ideas and things like that. So, that's how they're earning that, optional, 10% is by reporting back at the end.
Instructor: Other than that, to be honest, I would say there's probably not that much by the way that students are interacting with each other, is just through basically like a question forum where they could post a question and then hope for a response.

This excerpt generated the following three codes:
- "There's the projects, they could opt into groups, and they're collaborating and then how they're earning that, optional, 10%." was transferred to the teaching presence rubric under facilitating discourse and the social presence rubric under group cohesion.
- "…what technology they used to collaborate" was transferred to the social presence rubric under group cohesion.
- "Other than that, to be honest, I would say there's probably not that much by the way that students are interacting with each other, is just through basically like a question forum where they could post a question and then hope for a response." was transferred to the teaching presence rubric under design and organization and under facilitating discourse.

As described above, after this inductive coding process was complete for the instructor interview, each of the codes was transferred to the Microsoft Word documents containing the profiles, and a short description was written at the top of each of the teaching presence element categories. For instance, the teaching presence profile began with a description of each of the course elements in each of the three teaching presence categories (design and organization, facilitating discourse, and direct instruction). These were further broken down into subcategories. For example, under design and organization, a subcategory read "graded assignments." That subcategory listed: WebWork Homework, Snapshot Quiz, Projects, Collaboration, Reflections, Survey Participation, and Final Exam. These categories were designed to organize all the coded information into usable, synthesized pieces of information. At a less granular level, descriptions were also written about each parent category itself. An early memorandum on design and organization looked like this:

Design and organization refers to the general organization of the course's design structure. Math 101 is organized in a web-based course delivery system called Desire2Learn (D2L). Several resources are listed at the top of the course page, including links to the syllabus, get help, technology, and a scavenger hunt. Weekly modules follow this, each containing an introduction, task list, videos with PowerPoints and filled-in PowerPoints, online homework, and a quiz. Some weeks also included a project and survey.

After these short descriptions were written, these presence profiles were set aside while the student interviews were analyzed.
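To make the rubric-and-profile bookkeeping concrete, here is a minimal sketch in Python, rather than the Excel and Word files actually used, of how coded transcript snippets could be organized by presence and category. The category names come from Table 3-4; the data structure, function name, and timestamp are illustrative assumptions, not the study's instrument.

```python
from collections import defaultdict

# Categories from the community of inquiry coding scheme (Table 3-4).
CODING_SCHEME = {
    "teaching presence": ["design and organization", "facilitating discourse",
                          "direct instruction"],
    "social presence": ["affective", "open communication", "group cohesion"],
    "cognitive presence": ["triggering event", "exploration", "integration",
                           "resolution"],
}

# Each rubric maps a category to a list of time-stamped transcript snippets.
rubrics = {presence: defaultdict(list) for presence in CODING_SCHEME}

def code_snippet(snippet, timestamp, codes):
    """File one transcript snippet under every (presence, category) code
    assigned to it; a single snippet may carry multiple codes."""
    for presence, category in codes:
        assert category in CODING_SCHEME[presence], "unknown category"
        rubrics[presence][category].append((timestamp, snippet))

# The first code from the instructor excerpt above, filed in two rubrics.
code_snippet(
    "There's the projects, they could opt into groups...",
    "00:14:32",  # illustrative timestamp, not taken from the transcript
    [("teaching presence", "facilitating discourse"),
     ("social presence", "group cohesion")],
)
```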
Analysis of Student Interviews

As described in the data collection section, the student interviews were designed to investigate each of the three presences from the student's perspective. Specifically, the student interviews aimed at (a) ensuring that I had an accurate record of what the teaching presence looks like to students and (b) exploring how the teaching presence elements affected students' social presence and cognitive presence in Math 101. The analysis of the student interviews was done in two stages, like the instructor interview. First, starting with the coding scheme found in Table 3-4, I used inductive coding to code the transcripts into three separate rubrics: teaching presence, social presence, and cognitive presence. Each of these rubrics was kept in a Microsoft Excel file and contained each of the categories outlined in the table. This was done for each of the student interviews. After completing these student interviews, I transferred each rubric's categories to the corresponding profiles, the same profiles that already contained the information gathered and created by coding the instructor interview. Now, each of these profiles contained transcript snippets from each of the student interviews, sorted into each category within the profile. These transcript snippets were still direct quotes from the student interviews with timestamps in place. This preservation of the students' words was intentional so that quoting them would be easier when writing chapter 4. At this time, short descriptions of the teaching presence elements, social presence elements, and cognitive presence elements were written in each of the presence profiles because, having now coded all the interview data, I felt that I had a good grasp of how each of the elements appeared. Below I give examples detailing how the student interviews were coded in the initial inductive coding phase.

During the student interviews, I asked for a description of each of the course's design elements that make up the course's teaching presence and questions about how these elements affected the student's social presence and cognitive presence. The interview protocol for the student interviews can be found in Appendix B. Then, I analyzed each of the responses by coding them into each of the three rubrics that I had created to represent the coding scheme suggested by Garrison, Cleveland-Innes, Koole, and Kappelman (2006), shown in Table 3-4. Each of the rubrics created for the three presences has categories corresponding to the category titles and example indicators found in Table 3-4 above. When analyzing the interview data, each applicable statement was placed in one or more rubric boxes. Here is an example taken from one of the student interview transcripts. In this case, the student being interviewed is Pearl.

Interviewer: Is there anything that makes you feel like you're a part of the class or learning community?

Pearl: Um, I know we don't really do much, I guess so, it's like, the group projects and I like that [the instructor] gives us the option to work with people. I chose to work with the same group that I did for the last project because I really liked them and how they worked and how we were really good at collaborating. And like I said, I also know that we have a GroupMe, I'm pretty sure I left that, because it was…well, I don't really want to say it, but it really wasn't collaboration.

This excerpt generated the following three codes:
- "I know we don't really do much, I guess so, it's like, the group projects and I like that [the instructor] gives us the option to work with people." was transferred to the teaching presence rubric under facilitating discourse and the social presence rubric under group cohesion.
- "I chose to work with the same group that I did for the last project because I really liked them and how they worked and how we were really good at collaborating." was transferred to the social presence rubric under group cohesion.
- "I also know that we have a GroupMe, I'm pretty sure I left that, because it was…well, I don't really want to say it, but it really wasn't collaboration." was transferred to the social presence rubric under group cohesion.

As described above, after this initial inductive coding process was complete for the student interviews, each of the codes was transferred to the Microsoft Word documents containing the profiles, and a short description of each of the presence elements was written at the top of each of the categories. For instance, the social presence profile began with a description of the course elements in each of the three teaching presence categories that were designed to encourage social processes. These were further broken down into subcategories of types of social processes (e.g., affective communication) and where each of these social processes occurred (e.g., during the project). For example, under group cohesion, a subcategory read "projects." That subcategory listed each of the time-stamped mentions taken from the student interviews. These categories were designed to organize all the coded information into usable, synthesized pieces of information.

Memorandum creation for all presence profiles. Once the instructor interview and all the student interviews were inductively coded and transferred to the appropriate presence profiles, and the surveys and usage data were incorporated, the more deductive axial coding began, starting with teaching presence, continuing to social presence, and finishing with cognitive presence. Each profile, which at this point contained only bulleted transcript snippets with some minor descriptions, was read. While reading, notes were made (some mental and some physical) about whether each of the category elements was sufficiently organized and supported. After this polishing of the organization was complete, more detailed descriptions were written for each category and element based on the transcript snippets. Routinely, when coming upon unclear transcript snippets while writing these descriptions, I would go back to the interview data to check for clarity; often, this included watching the video recording. The resulting descriptions had varying levels of detail depending on what the data supported. Here are two examples from the presence profiles.

This first example was taken from the social presence profile. It is from the category describing the instances found in the interviews of students describing how the surveys affected them:

The surveys are not mentioned many times by students. When mentioned, students are mostly positive about them. Once, a student expresses that he does not like them but makes sure to note that he understands why they are there and that they might be helpful for other students. Among the students who report about the surveys positively, the reasons range from enjoying completing the surveys to being affected positively. Student 2 describes that the surveys make her feel heard.

This is an example of a category with a smaller amount of data available. Note that names had not yet been assigned at this point in the data analysis, so Pearl was still being referred to as Student 2.
As early as possible, to protect student privacy, I changed all the names to Student 1, Student 2, and so on. Then, as writing and analysis continued, these naming placeholders were replaced by assigned pseudonyms.

Here is an example of a description that was better supported by interview data. This example is from the teaching presence profile's introductory description of the course:

Math 101 is organized into weekly modules, each starting on Monday and finishing the next Wednesday (nine days later). This gives students ten days for each module and creates some overlap between modules, and most students mention liking these timelines. The instructor designed each module to be alike for consistency. Each contains an introduction page, task list, instructional video, lecture notes, lecture notes that are filled out, web-based homework, quizzes, and projects. There are course resources separate from the modules, such as a syllabus, forum, message center, email list, and Zoom sessions. These course elements will be discussed below in three different categories: design and organization, facilitating discourse, and direct instruction.

After each of these presence profiles was written, they were used to create the claims. I sought patterns across all the profiles by first looking for items that occurred in the data more than once or were otherwise interesting. Then, I highlighted where each of these items occurred throughout each profile. Once this was done, each of these items was titled a possible claim. Notes were then made as to which research questions were supported by each possible claim to ensure that my backing of these claims would answer the research questions. Each interview transcript, the survey data, and the course usage data were then revisited to look for further support for each possible claim now that they had been identified. Once all possible claims were supported as much as they could be by the data, they were ranked by strength. After this ranking was complete, it was determined that the top four claims were supported well enough for commitment; all the other, less supported claims were discarded. An example of a less supported and therefore discarded claim is possible claim 5, which posited that students who were having trouble in Math 101 hesitated to reach out for help. This claim was generated from a single interview and therefore had little support. However, since the notion seemed interesting, which was why it was identified in the first iteration, much of the detail of this case is presented along with claim 2 as an exception. Here is a list of the four claims that were further studied. Each of these claims is examined in detail in chapter 4.

Claim 1: Students tend to have singular preferences of the course's direct instructional elements.

Claim 2: Students who chose to work with others report having positive experiences; those who chose not to work with others report not needing help, with one exception.

Claim 3: Meaningful contact points can be created between instructor and student using surveys and personalized mass emails; however, most describe learning mathematics in Math 101 as not making them feel a part of a learning community.

Claim 4: Elements of the teaching presence were more likely to foster participation if associated with a grade.

Analysis of Student Surveys

Each of the surveys that was collected was opened and inspected.
These data files were collected in a report format from Qualtrics and downloaded as Microsoft Excel files. The spreadsheets in these Excel files were arranged with each survey question appearing in the first row across the columns. Once I identified all the questions that pertained to the present research, I copied each of the columns associated with those questions to a separate Excel file titled Analysis of Surveys. Once this was complete, I tallied each response category for each question. For yes or no questions, the tally read out how many students answered yes and how many answered no. For other questions with multiple response options, a total was counted for each category. In one instance, when the survey allowed students to respond with a selection of multiple items, each instance in which an item was present was counted. In all cases, the total number of responses to each question was also counted using the COUNTA Excel function. Below are two examples from the Group Work Feedback Survey Part II showing how the data from the surveys were analyzed.

For the first example, I will show how the question "Did you select to work with a group for any project in Math 101?" was analyzed. This question allowed students to answer by selecting yes or no. The analysis of this question was done by finding three values: the number of students who answered yes, the number of students who answered no, and the total number of students who answered the question. To find the number of students who answered yes, I created the Excel function =COUNTIF(B13:B600,"yes"). This function counted each time "yes" appeared in column B from rows 13 to 600. A similar function was used to count all the "no" responses, =COUNTIF(B13:B600,"no"). Then all the responses were counted using the Excel function =COUNTA(B13:B600).

For the second example, I will show how the question "How did you communicate with your group members? (Select all that apply)" was analyzed. This question allowed students to answer by selecting each option that applied from the following list: SnapChat, Email, Texting, iMessenger, GroupMe, Google Docs, Spartan365, and I did not join a group. The analysis of this question was done by finding nine different values: how many students responded to the question, along with how many selected each of the eight given choices. Calculating the number of students who answered the question was done in the same way described in the previous example. Calculating how many students selected each choice was done using an Excel function of the form =COUNTIF(D$13:D$600,"*SnapChat*"). In this instance, the function counts how many responses contain the phrase SnapChat. After all the calculations were complete, I transferred the values to the appropriate presence profiles. Later, once the claims had been developed, these values were included in the documents where my claims were being written.
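For readers who work outside Excel, here is a minimal sketch of the same tallying logic in Python with pandas. The file name and column names are placeholders, not the actual Qualtrics export, and the counts mirror the COUNTIF and COUNTA calculations described above.

```python
import pandas as pd

# Placeholder file and column names; the real Qualtrics export differs.
df = pd.read_excel("Analysis of Surveys.xlsx")

# Yes/no question: count "yes", "no", and total non-empty responses,
# mirroring =COUNTIF(...,"yes"), =COUNTIF(...,"no"), and =COUNTA(...).
answers = df["worked_with_group"].dropna().astype(str).str.strip().str.lower()
n_yes = (answers == "yes").sum()
n_no = (answers == "no").sum()
n_total = answers.size

# Select-all-that-apply question: count each option wherever it appears
# inside the response string, mirroring =COUNTIF(...,"*SnapChat*").
options = ["SnapChat", "Email", "Texting", "iMessenger",
           "GroupMe", "Google Docs", "Spartan365", "I did not join a group"]
comms = df["group_communication"].dropna().astype(str)
option_counts = {opt: int(comms.str.contains(opt, regex=False).sum())
                 for opt in options}

print(n_yes, n_no, n_total)
print(option_counts)
```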
Analysis of Course Usage

The raw course usage Microsoft Excel file was copied and saved as Analysis of Course Usage. The Analysis of Course Usage file was then sorted so that the row elements with the highest number of students accessing them were at the top and those with the lowest were at the bottom. After the ordering, it appeared that, generally, all the graded assignments were at the top. Upon further review, it was clear that the course links associated with graded assessments made up a large majority of the top section of rows, with only a couple of exceptions, such as the weekly task lists. After noticing the relationship between graded assessments and the ordering of the list, I decided to calculate and compare the likelihood of a link being accessed in two groups: links associated with graded assessments and links not associated with graded assessments. To do this, I first highlighted each of the elements associated with a graded assessment so I could easily discern them from the links not associated with one. Then I wrote an Excel formula to sum the total number of students who accessed each element associated with a graded assignment. This number was then divided by the total number of links used in the calculation, giving the average number of students who accessed each of the links associated with graded assignments. The same was done for all the elements not associated with a graded assessment. These calculations were used to determine the percentage of students who accessed elements associated with graded assignments and the percentage who accessed elements not associated with graded assignments. These percentages, along with their quotient showing how many times more likely a student is to access a course element associated with a grade than one not associated with a grade, were then transferred to the Word file where I was keeping notes for claim 4.
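As an illustration of this calculation, here is a minimal sketch in Python. The element names, access counts, and graded/ungraded flags are hypothetical stand-ins for the actual usage file, which is not reproduced here; only the arithmetic mirrors the description above.

```python
import pandas as pd

# Hypothetical stand-in for the course usage export: one row per course
# link, with enrolled students who had access, students who accessed it,
# and a flag for whether the link is tied to a graded assessment.
usage = pd.DataFrame({
    "element":    ["WebWork HW 1", "Snapshot Quiz 1", "Task List 1", "Intro Page 1"],
    "had_access": [550, 550, 550, 550],   # assumed equal across links here
    "accessed":   [530, 510, 480, 190],
    "graded":     [True, True, False, False],
})

# Average number of students accessing a link within each group,
# mirroring the sum-divided-by-number-of-links calculation above.
avg = usage.groupby("graded")["accessed"].mean()

# Percentage of students who accessed each group's links, and the quotient
# showing how many times more likely graded links were to be accessed.
pct = avg / usage["had_access"].iloc[0] * 100
quotient = pct[True] / pct[False]
print(pct.round(1))
print(round(quotient, 2))
```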
CHAPTER 4: RESULTS

This research aims to understand instructors' choices when teaching an online undergraduate mathematics course and how these decisions impact students. For this research, I have organized instructors' decisions and their impacts on students using the community of inquiry framework (Garrison, 2017). This framework has been discussed in more detail in chapter 2; however, it is essential to briefly recall its three major components here: teaching presence, social presence, and cognitive presence. Teaching presence is "the design, facilitation and direction of cognitive and social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes" (Anderson et al., 2001, p. 5). Social presence is the ability of participants to identify with a group, communicate openly in a trusting environment, and develop personal and affective relationships progressively by projecting their individual personalities (Garrison et al., 2010). Cognitive presence is defined as "the extent to which learners are able to construct and confirm meaning through sustained reflection and discourse in a critical community of inquiry" (Anderson et al., 2001, p. 11). This research on the choices an instructor makes and their impact on students is focused on answering the following three research questions.

1. How does teaching presence manifest itself in an asynchronous online undergraduate mathematics course?
2. How do students report interacting with asynchronous online undergraduate mathematics courses' activities, assessments, and resources, and how does this impact their social presence?
3. How do these students report their cognitive presence being influenced by the asynchronous online undergraduate mathematics course's activities, assessments, and resources?

This chapter is focused on answering these questions with the data I collected during the data collection phase of this research in the spring of 2021, as described in chapter 3. To answer these questions, I will separate this chapter into two main sections. The first section will describe each of four claims supported by the data. In each claim, I will demonstrate a relative consensus amongst interview participants for the statements that I am making; by relative consensus, I mean that most participants agree and that none offer a contradicting opinion. Then, I will present data from the other data sources that support each claim. The second section will be dedicated to answering the three research questions using the arguments developed for each claim in the first section, along with other supportive data when necessary. It is worth noting that many of the claims address more than one research question, and thus each answer to a research question may have support from more than one claim.

Claims

Each claim presented below will consist of a statement of the claim followed by a brief description. The brief description will include markers indicating which parts of the claim support particular research questions. Following each claim's brief opening statement, I will describe the construction of my argument supporting the claim and how the claim was discovered in the collected data, as described in chapter 3, followed by my argument for the claim as supported by the study's data.

Claim 1: Students tend to have singular preferences of the course's direct instructional elements

Individual students gravitate towards one form of direct instruction (e.g., videos, PowerPoints, or homework attempts) to gain an understanding of the course content (addressing research question 1). These direct instruction elements are part of the teaching presence. Efficiency is a popular primary consideration when choosing a preferred form of direct instruction, specific to the individual student's preferences (addressing research question 1). Students report going to their preferred direct instructional element (usually preferred because it can be quickly accessed) as their primary information source during the exploration phase after encountering an obstacle, before choosing to communicate with others (addressing research questions 1 and 3). The exploration phase is an element of the cognitive presence.

Claim 1 comprises three main elements: 1) individual students gravitate towards one direct instructional element (e.g., videos, PowerPoints, or homework attempts) to gain an understanding of the course content, 2) efficiency is the primary consideration when choosing a preferred form of direct instruction, and this is specific to the individual student's preferences, and 3) students report going to their preferred, quickly accessed direct instructional element during the exploration phase after encountering an obstacle, as their first choice before choosing to communicate with others.

The first interview participant, Kent, described all three elements of the first claim during his interview. Kent stated that he did not have time that could be wasted on the instructional videos, and he elaborated that he learns the course material by reviewing the filled-in notes and then completing each graded assignment.
When I asked a follow-up question about using the other elements in the course, he clarified that he did tour all the elements in the first week but settled on using the filled-in notes because of their "efficiency" for learning, calling them "genius." Furthermore, when Kent was asked to describe what he did when a graded assignment asked him to complete something he was not able to recall, he said that he would either look it up in the provided filled-in notes or, if it were not there, run a search on Google. He indicated that the information was usually contained in the filled-in notes. Kent often conveyed that efficiency was an important reason for his choices and that the filled-in notes were his resource, as when he said, "it is all about time for me when I need something, boom, I try to get to it quick." Kent's description of what resources he utilized to learn the material served as the piece of evidence that illuminated claim 1 because Kent was the most vocal and adamant about how he chose his preferred instructional element. Once I had read Kent's interview, it brought together the points made by the other interview participants, as I describe below. Now that I have described Kent's case to help illustrate what claim 1 is, I describe the claim and its support in more detail. As stated before, claim 1 is broken up into three elements; support for these elements is discussed below.

Element one states that individual students gravitate towards one direct instructional element (e.g., videos, PowerPoints, or homework attempts) to gain an understanding of the course content. To support this element, I will first present the majority evidence confirming the element, followed by an affirmation of the lack of opposition.

Individual students gravitating towards one form of direct instruction was found in thirteen of the sixteen interviews, with the other three not mentioning their opinion on their individual learning. The first four example participants that epitomize this claim are John, Suzy, Jennifer, and Cheng. Jennifer and Cheng described their approach to learning Math 101's material as watching each of the instructional videos while taking notes in their notebooks, their preferred method after attempting a couple of other methods in the first two weeks of the semester. Jennifer stated, "I started out printing off the blank notes from the modules, but after a couple of weeks, I stopped doing that and started using my notebook instead because I like having all of my stuff for class in one binder." Cheng used his notebook from the start and therefore never had to make a change like Jennifer did. Jennifer remarked, "I may have used the notes if they were already printed, like for purchase from the bookstore," but this was not the case. Interestingly, even though Jennifer's chosen method included watching the instructional videos and taking notes, she stated that she wished the videos were not asynchronous so she would have the ability to ask questions.

Joe and Lucas both skipped the ungraded direct instructional resources and only did the graded elements in the course.
Joe stated, "I don't go to Zoom, but I guess I might, because then I would be able to ask questions…you can't ask questions when you are watching one of the videos, but that is the only thing that I have done so far." Lucas simply said that he only requires a bit of a refresher when learning the Math 101 material and that the graded assignments sufficiently provided this refresher.

Feng, Bree, and Sam first skip all the ungraded direct instructional resources and attempt to complete the graded elements of the course, making their approach first appear like Joe and Lucas's. The difference is that Feng, Bree, and Sam all stated that they knew they were skipping those ungraded direct instructional elements with the idea in mind that they might have to return to them once they had identified what knowledge they needed to fill the gaps in their previous knowledge. Sam's comment was the best example of this description: "I already know a lot of the material, so I like to get started and see what I can do before I go back and look at stuff when I need to…that's the quickest way to complete each week's work." All three described returning to the direct instructional elements as rare but still acknowledged the return as part of their plan. When they need to return to those instructional elements, Feng stated that he uses Google, while Bree and Sam described returning to topic-specific videos.

Shawn, Olivia, and Hilary seemed to utilize only the videos for their learning. Shawn stated, "I hate the videos, but it is the best thing for me to use to learn." Oddly, Shawn said both that he would like the videos to be longer, like the full lecture length he is familiar with from his classroom experience, and that he finds the shorter videos painfully dull. Shawn suggested that "Easter Eggs," or fun solutions to homework problems, should be scattered throughout the videos to make them more interesting and less tedious. Olivia and Hilary simply described that they watched the videos as their sole source of information and did not take notes. The only exception is when Hilary stated, "I sometimes jot down a formula so I can use it later, but beyond that…no…I don't use notes."

In the interviews, Pearl, Beverly, and Joanne did not mention using any of the direct instructional elements. Although they each acknowledged that the elements exist, they simply did not state an opinion on what they did or what they found useful for themselves. Instead, they each made general statements like Pearl's: "all the elements are great because different students might use different ones." Since these participants never spoke of their disposition towards any of the direct instructional elements, they cannot be included as evidence supportive or unsupportive of element one.

Furthermore, the course usage and survey data neither support nor contradict element one. The course usage data indicate the number of students that clicked a course link and the average time spent in each course link. Since these data are link-specific and not student-specific, they cannot be used to determine, with any certainty, whether each student had a singular preference. The survey data do not include any questions pertaining to element one.

Element two states that efficiency is the primary consideration when choosing a preferred form of direct instruction, which is specific to the individual student's preferences.
To support this element, I will first present the majority evidence confirming the element, followed by an affirmation of the lack of opposition. If you recall, Kent was the participant who thoroughly explained his motivation for efficiency, and, surprisingly, others did too. An explanation of this nature requires a participant to be very comfortable because they are discussing motives that they might feel are counter to what the instructor wants them to do. Even though the admission of efficiency might be difficult for a participant to explain, seven of them did, with another three explaining the uselessness of some of the instructional elements. It is important to note that these opinions shared by the participants are not an indictment of any of the course elements; they are merely emphasized here to show the existence of participants' motivation to be efficient.

Kent, John, Feng, Bree, Sam, Joe, and Lucas each described efficiency as a reason why they made decisions about what instructional elements they utilized. Kent explained that his choice is focused on "whatever gives him the information the fastest." John described playing the lecture videos while taking some notes and working on the homework. Each of the others explained that they do not need to worry about the non-graded elements of the course because it is not worth their time. Hilary, Sam, and Cheng described different elements as not being useful. Hilary and Sam both stated that the homework seemed unrelated to the coursework and that they wished it were not included in the course, and Cheng explained that the introduction pages were unnecessary.

Furthermore, no other examples were found in the interviews stating that decisions were made because of efficiency; however, there were also no examples of participants asking for additional instructional elements unless they too leaned towards efficiency. An example of this was found in two participant interviews, John's and Sam's. Both John and Sam voiced the need for Zoom lectures, with John stating, "I think the Zoom sessions should be mandatory for students." On the surface, these statements seem not to support element two; however, John and Sam's motivations for these Zoom lectures do. They explain that being in a Zoom lecture would take the same amount of time as a video lecture but would have the added benefit of having their questions answered simultaneously, and they would be able to watch others get their questions answered. This description of converting time spent on videos into time spent on videos and questions together is motivated by efficiency. As stated in the previous section, this element is not contradicted by any interview participants. It is also not addressed in the surveys or the course usage data.

Element three states that students report going to their preferred, quickly accessed information source during the exploration phase after encountering an obstacle as their first choice before choosing to communicate with others. To support this element, I will first present the majority evidence confirming the element, followed by an affirmation of the lack of opposition. Thirteen of the sixteen interview participants described what they did during the exploration phase after encountering an obstacle, with some consulting their preferred informational source first and then reaching out to others, and others simply reaching out to others straight away. Sam, Beverly, John, Pearl, and Kent described that they reviewed their preferred instructional element when encountering an obstacle.
Sam rewatched a video, Kent looked at slides, and the others reviewed their notes. Beverly, Suzy, and Kent then described that they would reach out to the instructional team for help if their first step did not resolve their obstacle. Beverly stated, "if I get stuck, I send the professor an email…she gets back really quickly." Cheng, Olivia, Lucas, Shawn, Joe, Bree, and Jennifer reported going straight to communicating with others when encountering an obstacle. Cheng, Bree, and Jennifer posted questions on GroupMe. Cheng said, "I once had a question about turning in an assignment…it was a tech type question…I asked students on GroupMe and I received an answer quickly." Olivia asked her roommate (Bree also asks her roommate when available), and Shawn and Joe reached out to the instructional team through email or attended Zoom office hours.

As stated in the previous section, this element is not contradicted by any interview participants. It is partially addressed in the surveys and is not addressed in the course usage data. The survey results, found in Table 4-1 below, indicate that most students feel that WebWork and the video lectures contribute most to their learning in Math 101. While these data do not speak specifically to our claim that students select a preferred instructional element, students are certainly willing to describe one when prompted.

Table 4-1 Description of selected survey results
Question: What elements of the class most contribute to your learning?
- WebWork: 152
- Video: 105
- Notes: 49
- SnapShot: 45
- Projects: 20
- Zoom: 15
- Piazza: 10

These data helped generalize the comments from our student interviews onto the broader Math 101 student population in two ways. First, the survey helps identify students who reported having an instructional element that they selected as most beneficial for their learning; note, however, that the survey did not offer an option allowing students to report that they found none of the instructional elements helpful. Second, and somewhat less clearly, the data support the idea that there are at least popular elements that students describe as beneficial to their learning in Math 101.

In summary, each of the three main elements of claim 1 is supported by the interview data and not contradicted by any other data source. That individual students gravitate towards one form of direct instruction (e.g., videos, PowerPoints, or homework attempts) to gain an understanding of the course content was supported, with many participants choosing videos and others choosing filled-in notes. That efficiency is the primary consideration when choosing a preferred form of direct instruction is supported by several participants directly stating this as their motivation, with a few students expressing this motivation in other ways. That students go to their preferred, quickly accessed information source during the exploration phase after encountering an obstacle as their first choice before choosing to communicate with others is supported by an overwhelming number of interview participants explaining that they either reach out to others for help or refer to their preferred method of learning the course material (e.g., notes, slides, or videos). With all three of these elements supported by the interviews, claim 1 seems sound.
Claim 2: Students who chose to work with others report having positive experiences; those who chose not to work with others report not needing help, with one exception

Claim 2's statement about students working together supports research question 2 because of its focus on the categories that describe social presence elements, namely affective, open communication, and group cohesion. Many of the interview participants reported not working with others in a meaningful way (i.e., they had no contact with others or only exchanged a few messages using GroupMe or Piazza) because they felt that it was unnecessary since they were confident with the course content (addressing research question 2). Students who chose to work with others reported doing so because they needed help or wanted more contact with others (addressing research question 2). These students experienced positive impacts on their learning and wellbeing through receiving answers to their questions, developing deeper understanding through discussion, or feeling part of a learning community (addressing research questions 2 and 3). Positive impacts on a student's learning are situated within the cognitive presence. These two groups of students were well represented in the interview data, with one interesting exception: a student who desperately needed help but would not reach out to peers.

Claim 2 proposes that students largely fall into two groups in terms of communication: 1) students who report not working with others in a meaningful way because they felt it was unnecessary, and 2) students who did report working with others because they needed help or wanted more contact, and who reported positive impacts on their understanding of the course content or their feeling of connection with a learning community. Combined, these two groups account for all the participants except one. That one participant needed help but did not want to reach out to others. I proffer that a successful argument for claim 2 must show a relative consensus amongst the two groups of students described in the claim, students who work with others and those who do not, and that any contradiction can be explained by outside influences.

The first two interview participants, Kent and Pearl, embodied the first element of this claim. Kent and Pearl worked together in this class on projects and homework; they also met on Zoom or communicated through a text messaging application to discuss struggles that they were experiencing in their coursework and life. They became friends who depended on each other throughout this time, in and out of class. Both participants described this academic and social relationship as meaningful to themselves and, in their own opinion, to the other. Kent and Pearl's description of how they worked together and communicated with each other served as the piece of evidence that illuminated claim 2 and made me look for other examples within the data of this occurring and not occurring. Evidence of each was found in the student interviews and the instructor interview. The instructor reported that about 279 students, or 51%, requested to work together on the projects. Conversely, this example also highlights the instances when students chose not to work together.

As stated before, claim 2 describes two distinct groups of students found in the interview data. First, I will describe the opportunities for communication within the course and how each of those opportunities was used and talked about by the interview participants.
This piece is important because it helps show how these elements of the teaching presence affected communication. Then, I will conclude this section with an example of a student, Joe, whose choice not to communicate was not linked to an element of the teaching presence.

As stated above, claim 2 proposes that students largely fall into two groups in terms of communication. To support this claim, I will present evidence confirming that there were opportunities for students to communicate in Math 101 and show how students chose to react to these communication opportunities. Finally, a description of the one counter-example (i.e., a student who knew they needed help but chose not to communicate with classmates) will be given, with a rationale for why it may be the exception that proves the rule or, at least, does not disprove it.

Opportunities for student communication. Students in Math 101 expressed in both the student interviews and the student surveys that they had the ability to communicate and felt comfortable sharing their ideas and questions. Specifically, none of the students interviewed said they did not feel comfortable communicating; however, most did not communicate with others during the course except about their project. Some described their reason for not communicating as not needing to: they felt comfortable enough with the mathematics content and their ability to complete the course tasks independently that they did not reach out to anyone from class. Notably, a handful of students did reach out to others.

Math 101 had several activities that promote communication or provide an opportunity for it. Through inspection of the D2L course and the instructor interview, these opportunities were found to include projects, Piazza and messaging applications, email, and assignment submissions. Students were given the option to work together on the course projects. Working together is typically required in other semesters, but students were given the option because of the Covid-19 pandemic. There is a collaboration grade on the syllabus, but for the observed semester it was simply a larger project grade for those students who worked alone. The last portion of the project contains a reflection that gives students the ability to reflect on and communicate their ideas to themselves and the instructional team. The instructional team encouraged the use of Piazza for students to post questions, which are then answered by other students or the instructional team. Students were also encouraged to sign up for a messaging application called GroupMe, which was intended to help students communicate with each other. Communication in general was also encouraged by the first assignment, which asks students to download a Microsoft Word file, answer a few questions, and then create and upload the file as a PDF. This early-semester assignment is done partly to foster the relationship between the instructional team and the students. The instructional team implemented further communication: the instructor sends out emails about every two weeks, some of which are progress notes while others are informational.
Finally, there are also opportunities for students to sign up to meet with instructional team members using Zoom or to attend one of the pre-scheduled group Zoom meetings. Students mentioned these opportunities for communication in their interviews. Below, I will discuss how students described their communication generally, using GroupMe, attending the Zoom sessions, using email and Piazza, and during the projects. Moreover, where data are available, I will present how these interactions affected students' creation of friendships and feelings about being a part of a learning community.

General communication. Students reported that while there were plenty of opportunities to interact with their classmates, some felt that there should be more. Joe stated that he would have liked more interactive assignments in the class and that the worst thing about Math 101 was the lack of human interaction. Joe also described that this feeling of a lack of human interaction was not unique to Math 101; it was like the feeling of isolation that he felt in all his asynchronous online courses. Joe suggested that a class meeting on Zoom once or twice a week would be very helpful for him. He acknowledged that there were optional Zoom meetings but did not feel that those were as adequate as mandatory class meeting time. Another student, John, suggested that presentations might be a good way to increase communication in Math 101 and that they were something he liked about his other courses. Notably, John chose not to communicate with other students because he did not feel that there was a community culture for him to do so.

GroupMe communication. Six of the students who were interviewed mentioned GroupMe. Joanne explained that the GroupMe for the course was "very helpful for getting their questions answered quickly," while others, Bree and Cheng, described instances where they posted questions to classmates on GroupMe about projects and technical issues, like Cheng's remark, "when I could not figure out how to turn in an assignment early on in the semester, I posted on GroupMe and received a lot of answers quickly." Bree liked the feature that allowed communication using pictures of questions and said that the communication made her feel very comfortable. On the other hand, four students reported not liking GroupMe: Shawn, Beverly, Jennifer, and Pearl. Shawn explained that he knew there was a GroupMe but that he would rather work alone, stating, "oh yeah, I looked at GroupMe, I rather work through things on my own." Beverly only used it a couple of times before turning off the notifications because there were too many students: "this makes communication annoying and not useful." Finally, Pearl had the harshest words for GroupMe when she described it as not being there for student communication, just a long, annoying list.

Zoom communication. Students generally indicated that they were happy that meeting with someone from the instructional team on Zoom was an option. Even though some of the students reported not using Zoom, none of them indicated that they did not like that it was available. The positive opinions about using Zoom had a large range.
Olivia, Kent, and Shawn reported going to Zoom office hours and setting up Zoom appointments with the instructional team whenever they needed help, and Shawn stated, "I wish they were required for all students so I could interact with them and hear their questions too." This point is echoed by John, who said that he "wished Zoom sessions were required or highly recommended for students." Beverly recognized that she needed the ability to ask questions freely and said, "I like asking questions during Zoom" and "Zoom calls are the only thing that makes it feel like a real class, other classes do not have that option, and I feel more disconnected there." Furthermore, Beverly explained, "I usually skip all of the other things in class, like the videos, and only go to the Zoom, I feel open to ask questions there, and I don't feel dumb about it." Of the students who liked Zoom, Bree offered the critique that she wished she did not have to wait, stating, "it would be great if I could get help on Zoom right when I needed it." Other students reported not attending the Zoom sessions at all. Jennifer, Joanne, and John had not gone to any Zooms at the time of the interview, which was conducted after the halfway point in the semester. Joanne said that she "like[s] that they are available" and, as noted above, John wished that they were required. Suzy and Joe both said that they did not attend any of the sessions but would if they needed to. Joe described not having any questions, and when he did, he said he would watch one of the videos.

Email communication. All the interview participants indicated that they liked the emails that were sent out by the instructional team and that they felt comfortable reaching out to the instructional team through email. Some noted that these emails were exceptionally helpful in making them feel like a member of a class. Pearl said, "the emails make me feel seen because they are personalized, they even have my name at the top." Olivia also remarked how nice it was to receive a personalized email from her professor. Many other students, Kent, Jennifer, Suzy, Joe, Lucas, Beverly, Feng, and Cheng, described a similar situation when they had reached out to the instructional team. These situations were all question-based: questions about the logistics of turning in assignments, technical issues, assignment clarification, or help with the course content. Although each of these instances of students reaching out with questions was different, they can all be characterized by Kent's example, where he said, "yeah, whenever I have a question, I email the professor, and I cannot believe how quickly they get back with me." Joanne and Sam had not sent any emails but stated that they had never had a question or a reason to do so, with Joanne saying, "everything in this class is pretty easy, but if I did have a question, I would ask."

Piazza communication. Only six students mentioned Piazza during the student interviews. Two of these six, Bree and Pearl, indicated that they did not use it, with Pearl suggesting that the class switch from Piazza to another online platform called PackBack. Only one of the six, John, posed a question; his question was about a technical issue he had while trying to submit his first assignment. John received quick and helpful comments from his peers. Jennifer, Beverly, and Joanne reported using Piazza by scrolling through to see if their question had already been asked and answered in the forum.
Each of these three reported that Piazza was not their go-to place to get questions answered and that they only looked at it once or twice.

Project-related communication. The projects moved many students to communicate with each other. As stated before, 279 of 547 students, or 51%, chose to work together, as reported by the instructor. I will describe students' dispositions towards working together on the projects by arranging the interview participants in three different groups: 1) students who worked together, 2) students who chose to work with others and then did not, and 3) students who chose not to work with others.

First, Kent, Pearl, Shawn, and Feng chose to work with others. As was already described in the opening of this claim, Kent and Pearl worked together and developed a relationship where they met on Zoom and helped each other with different things. Pearl described, "we mainly met on Zoom or communicated through SMS chat for the project and just to check in with each other to see how things were going with the class and, you know, the struggles of the pandemic." Kent specified that "working together was great; everyone in my group struggle with different things, so we help each other." Feng worked with the roommate he had before the pandemic started, even though his roommate was in the United States and he was now living in China. Shawn said that he chose to work with others on the project, and he described his choice like this: "I like that everything in this class can be done in one day except the Zoom time and group member time, and I really only chose to work in groups because it was encouraged."

Joanne, Hilary, Shawn, Beverly, and Lucas all described electing to work with others for the projects and then abandoning the group or not communicating with their group during the group projects. Hilary and Shawn both explained that they briefly worked with others during the group project but did not communicate outside of those projects. Moreover, the communication during the projects was brief. Hilary stated, "I worked in partners for the project, but not really other than that…we only communicated briefly using GroupMe." Joanne worked with another student and described the experience, stating, "yeah, I did have a partner for the first two projects, it was kind of weird though, I reached out to them a couple of times and did not really receive much communication back, so I really just worked alone, but they were in my group." Beverly and Lucas each had experiences similar to Joanne's: they worked briefly with another student on project one, with little communication, and then never worked with a partner again.

Finally, using the reported number of students who chose to work in groups on the project, 279 out of 547, it can be deduced that 268 students chose not to work together. The interview participants also represented students who chose not to work with others. Sam, Olivia, John, and Jennifer all chose not to work with others on the projects and instead worked alone. Olivia, John, and Jennifer responded similarly, answering "no," "I worked alone," and "I did not sign up to work in groups for the projects," respectively. Sam gave more detail when he stated, "I liked the option to work in groups on the project, but I did not want to burden others, so I decided to work alone."

Surveys. The communication aspect of the surveys was mentioned by two students, Sam and Pearl.
They both described the surveys as a point of contact with their professor. Sam described that he had only communicated with his professor through the surveys and the comments that he received on his graded projects. Pearl explained, and reiterated during the interview, that the surveys were important to her because they "gave me a voice and really made me feel heard."

In all, the instructional elements of the course and the communication opportunities described in the interviews gave students multiple points of communication contact. Furthermore, each of the descriptions given by students shows that they either chose to use the communication opportunities or not, and that, if they did, they largely had a positive experience.

Joe stood out as an interview participant because his situation was unique compared to the others. He is an example of a student needing to reach out for help but hesitating. Joe described having a hard time in Math 101, falling behind, and not being able to, as he states, "bring myself to do the work." This struggle that Joe described became the central focus of our conversation during the interview. Interestingly, because of the pandemic, Joe lived at home with his parents during his first year of college. Joe described his father as a professor who was very supportive of his academics. Joe's experience in Math 101 may not be unique among all Math 101 students, but it certainly was among the interview participants. It is likely that students who are behind, struggling, or otherwise do not feel good about their performance would not volunteer to be interviewed; however, this is only conjecture. Specifically, Joe said that he had trouble working on anything and felt that he should know the material. He stated, "I know what I have to do, I just can't seem to do it" and "I have not reached out to fellow students because I don't think they can help." As you may recall, Joe was referenced earlier in my argument for claim 2, where he described a feeling of a lack of human interaction in Math 101 and noted that this feeling was not unique but was like the feeling of isolation that he felt in all his asynchronous online courses. Furthermore, Joe was the participant who suggested that a class meeting on Zoom once or twice a week would be very helpful but who also acknowledged that there were optional Zoom meetings and did not feel that those were as adequate as mandatory class meeting time.

To me, Joe is a case of a student that I have seen many times in my teaching career. Perhaps I have even been Joe from time to time. His description of "knowing what to do" leading to him not reaching out for help seems so familiar. Now, I suggest a notion that might help Joe. This notion is not a claim on its own; however, it is linked to Joe and is supported during the discussion of claim 4. That Joe continued to describe needing to communicate with others but being unable to bring himself to do so may give more importance to required assignments. As you will see in claim 4, required assignments motivate more participation, and this motivation to participate is precisely what Joe described requiring.
Claim 3: Meaningful contact points can be created between instructor and student using surveys and personalized mass emails; however, most describe learning mathematics in Math 101 as not making them feel a part of a learning community

Numerous students reported feeling a part of a learning community when they received emails from the instructional team and when they were asked for their opinions and experiences through surveys (addressing research question 1 and research question 2). Feeling a part of a learning community is an important aspect of social presence. Although two students felt negatively about the surveys, with one saying they "were a waste of their time" and the other saying "I did not get anything from them," both described feeling that the surveys might be meaningful to others in the class. However, many interview participants reported not choosing to communicate with others in the class and said that they did not feel like they were a part of a class compared to what they were accustomed to in a traditional face-to-face course. Interestingly, two of the students who will be described in the argument below sought help in the class from friends instead of classmates.

Claim 3 has two main points: 1) personalized mass emails and surveys initiated by the instructional team may create meaningful contact points with students, and 2) most students do not feel like they are a part of their math class compared to a traditional face-to-face course. To support this claim, I will first describe instances where students described how personalized mass emails and surveys affected them in positive ways. I propose that these positive accounts from students in Math 101 about the mass emails and surveys are sufficient to support the first point in claim 3. The second point in claim 3 will be shown through most students describing not feeling a part of the class.

My attention was drawn to the first point in claim 3 early in the interview phase of data collection. Going into the interviews, I was curious whether students felt like they were a part of a classroom community like the ones described in the literature about social presence in asynchronous online courses. This curiosity was satisfied during the first two interviews with Kent and Pearl. As you may recall, Kent and Pearl were in the same project group, and this group connection grew into a meaningful relationship, one in which Kent and Pearl exchanged many forms of communication, texts, emails, and conversations over Zoom, about the class, school, and coping with the pandemic. Furthermore, both Kent and Pearl described emailing and speaking with the instructional team. Pearl elaborated that the communication through email that she received from the instructor made her feel seen and that the course's surveys made her feel heard. Kent described email as his preferred way to reach out for help and said that the emails he had received from the instructional team made him feel comfortable doing so. These early accounts from Kent and Pearl were enough to highlight the connection and importance of these meaningful contact points, how they can be created between instructor and student using surveys and personalized mass emails, and how these contact points are sometimes described as creating feelings of belonging. On the other hand, the second point in claim 3 emerged across most student interviews, both while they were being conducted and during the analysis process.
In the following paragraphs, I will describe instances when students commented about the instructional team's emails and surveys. Then, I will support these comments by analyzing the survey results that pertained to these forms of communication.

First, Kent, Pearl, Suzy, Cheng, and Shawn all reported that the communication in Math 101 helped them feel that they were a part of an educational community. As described before, Kent and Pearl spoke highly of their communication with the instructional team. Kent stated, "whenever I need help, I have no problem reaching out to the professor because she makes me feel invited to do so." This invitation that Kent is speaking about came from the mass emails that the instructional team had sent out. Kent clarified this when I asked him directly; he responded, "well, she is always sending us emails with all the ways to contact her for help." Furthermore, Pearl talked about how the emails made her feel like an individual important enough to be given a personalized email from the professor; Pearl did not realize that the emails were sent en masse. She described this by saying, "I can't believe that the professor has enough time to send out these emails…she must have hundreds of students to contact." This may seem trivial, but feeling important, and not just one of hundreds of students, allows students to feel more comfortable reaching out and communicating when they need help.

Other students described the communication in Math 101 positively as well, although perhaps not as emphatically as Pearl and Kent. Suzy said, "all of the points of contact make this course feel like a 'real' course." Suzy elaborated that she, too, feels comfortable reaching out to others because there are so many ways to do so if she needs to. Suzy stated, "I can send an email or drop into and hang out in a Zoom." She did not report reaching out or using Zoom; however, she described that having Zoom as an option made her feel like a real class was going on. Cheng and Shawn did not mention the emails or surveys being meaningful; however, they both described going to the Zoom sessions. Cheng stated, "the weekly Zoom sessions that we have in this class make it like a traditional course." Moreover, "I usually attend these Zoom sessions in the middle of the night, I'm in China, then I go to bed right after." These accounts show how meaningful these connection points can be.

As described before, many students reported not communicating with others in the course or not taking advantage of some of the opportunities to communicate. Sam had the most to say about these opportunities and how they compared with a face-to-face course in relation to social presence. It is important to note that Sam was also the most inquisitive about the nature of my research, so he was given a short description of teaching presence, social presence, and cognitive presence.
Sam described that he had not talked with any other student or anyone from the instructional team because "the class is really easy," and he described his belief that the present research would "find that the only negative impact to shifting to asynchronous is on the social presence…you just aren't going to have any of the normal impromptu meetings before and after class." He went on to say, "I haven't attended any of the Zoom sessions, but I do not think I would be asking questions there, I wouldn't want to burden anyone with them…it does not feel like that in a regular classroom." This sentiment was not unique to Sam; Joanne, Jennifer, John, Suzy, Joe, and Lucas all reported not working with others or communicating with anyone from the instructional team. Most of them answered no, even when I followed up about specific instances. They also mostly reported that they did not care to communicate with others. When asked if not being in communication with others bothered them, Joanne said, "no, it does not bother me, it would…maybe… be bothersome." Hilary remarked, "Um, I don't really feel like it's a class environment, but it doesn't really bother me. I actually like how it's set up because I know what I have to do every week, and like I like a schedule, so like having all of the stuff that I know is due on a specific [day] without having the need to communicate with others." During a rewatch of the interviews of Joanne, Sam, Jennifer, John, Suzy, Joe, and Lucas, it seemed to me, by their demeanor, that these students did not need any help in Math 101; the content was easy, and therefore communication with others would simply have been another task, a task that was perceived as not being beneficial. Even Joanne, when she remarked about communication possibly being bothersome, seemed to be commenting from a place of confidence in her ability to understand, or come to an understanding of, the Math 101 content.

On the other hand, some of the students who did not communicate with others in the class expressed that they wished they had someone to communicate with about their mathematics, or that they did have others to communicate with outside of class. John wished that he had more opportunities to communicate, and he expressed, "I feel really disconnected in this class…like I'm all alone." To me, these students, who need more support but are unwilling to reach out for that support, are the ones instructors need to think about the most.

Finally, two students did not communicate with others in the class because they had more immediate forms of communication that they could utilize while working through the Math 101 content. Olivia and Hilary both gave these types of reasons. Hilary stated, "well, I didn't really reach out…I think when I was doing my work…like, I was with a friend, not from class, and I asked her if she knew anything about [my class], and she kind of helped me a little bit here and there." Similarly, Olivia noted, "I don't work with others from this class, but I do from my poli sci class, and sometimes I have asked them for help in math." These statements point to the importance of helping students build connections with other students so each student can enjoy the benefits of having people they feel comfortable reaching out to when struggling, like Olivia and Hilary. The opinions about communicating with others were also reflected in the survey.
The survey results, found in Table 4-2 below, tend towards students stating that Math 101 provided them with ample opportunities to communicate with the instructional team and with other students. Furthermore, they confirm that students were more likely not to attend Math 101's Zoom sessions or to ask questions on Piazza; however, there was an indication that most students would still choose to look at Piazza to see if their question had already been answered.

Table 4-2
Description of selected survey results

Question: There are ample opportunities to ask members of your instructional team questions.
  Strongly Agree: 234 | Somewhat Agree: 124 | Neither Agree nor Disagree: 63 | Somewhat Disagree: 17 | Strongly Disagree: 5

Question: There are ample opportunities to ask questions of your classmates in class.
  Strongly Agree: 137 | Somewhat Agree: 136 | Neither Agree nor Disagree: 95 | Somewhat Disagree: 48 | Strongly Disagree: 27

Question: Have you attended any of the weekly Zoom help sessions?
  Yes: 92 | No: 377

Question: Have you used Piazza to ask any questions this semester?
  Yes: 119 | No: 347

Question: Have you used Piazza to read other students' questions and answers?
  Yes: 299 | No: 167

These data help generalize the comments from the student interviews onto the broader Math 101 student population in two ways. First, they clearly show that most students reported ample opportunities to communicate with the instructional team. Second, and less clearly, they support the interview participants' stated requirements that communication be timely and that they not bother others. This requirement is evident in most students indicating that they do not post questions on Piazza, even though many students do look for answers that are already there.

In summary, given what we have found in these data, there were meaningful contact points created between the instructor and student using surveys and personalized mass emails. Moreover, in some cases, these contact points were described as helping the student feel seen, heard, and part of their Math 101 learning community. However, most students generally described learning mathematics in Math 101 as not making them feel a part of a learning community. These students felt that they did not need to communicate with others or had other places to turn when they needed help. A small subset, two students, did indicate that they did not communicate with others when they should have. It is this group that requires the most attention from researchers and instructors.
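Before moving on to the final claim, one brief note on reading Table 4-2 above: to compare the two Likert items directly, the raw counts can be collapsed into agreement percentages. The short sketch below does this; the item labels, and the choice to collapse the two "agree" categories into one, are mine for illustration and were not part of the study's analysis.

```python
# Counts from Table 4-2, in the order: Strongly Agree, Somewhat Agree,
# Neither Agree nor Disagree, Somewhat Disagree, Strongly Disagree.
items = {
    "ask instructional team": [234, 124, 63, 17, 5],
    "ask classmates": [137, 136, 95, 48, 27],
}

for name, counts in items.items():
    total = sum(counts)                # respondents for this item
    agree = counts[0] + counts[1]      # collapse the two "agree" categories
    print(f"{name}: n = {total}, agreement = {agree / total:.0%}")

# Printed result (rounded):
#   ask instructional team: n = 443, agreement = 81%
#   ask classmates: n = 443, agreement = 62%
```

Both items had 443 respondents, and agreement was noticeably higher for the instructional-team item (81%) than for the classmates item (62%).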
Claim 4: Elements of the teaching presence were more likely to foster participation if they were associated with a grade

This claim is the most straightforward of the claims put forward in this chapter. In general, it would be hard to find someone who would not agree that students generally participate more in activities that affect their grades than in those that do not (supporting research question 1 and research question 2). I propose that even though this claim may seem obvious and predictable, it must still be studied because of its importance in an asynchronous online course. Differing from a face-to-face course or an online course offered synchronously, asynchronous courses offer students more freedom to choose how they participate in their learning. Furthermore, as we saw in the previous claim, claim 3, students sometimes do not choose the most beneficial path, even when they know they should.

Here is a short argument, constructed from the study's course usage data in Table 4-3 below, that students are more likely to participate in activities that affect their grades.

Table 4-3
Usage of graded versus non-graded course elements

  Average percent of students who accessed links associated with a graded assignment: 89.7%
  Average percent of students who accessed links associated with a non-graded assignment: 41.4%

Note: These data indicate that students were 2.17 times (89.7 / 41.4 ≈ 2.17) more likely to access an element of Math 101 that was associated with a graded assignment as compared to an ungraded assignment.

Answering the Research Questions

The findings described in the arguments for the claims were provided, in detail, in the previous section. This current section will use those previously laid-out arguments to answer this study's three research questions. When necessary to specifically address a research question, more detail from the data may be given beyond the previously provided arguments. Now, each research question will be restated, with its answer, supported by the arguments put forth in the claims and the collected data, following.

Answer Research Question 1

First research question: How does teaching presence manifest itself in an asynchronous online undergraduate mathematics course?

Recall that teaching presence is "the design, facilitation and direction of cognitive and social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes" (Anderson et al., 2001, p. 5). In the following answer to research question 1, I will describe the design and facilitation of Math 101, how cognitive and social processes were directed by the instructor, and, most importantly, how these elements manifested (i.e., appeared to the student).

First, numerous data on Math 101 were collected, and teaching presence was described, in part, by each data source. Many course activities, assignments, and resources can be seen in the class. These include links to a syllabus, descriptions of upcoming weeks, task lists, instructional videos, PowerPoints, practice exercise documents, WebWork homework, SnapShot quizzes, projects, surveys, the grade book, articles, and multiple options for communication (i.e., Zoom meetings, Piazza, GroupMe, email, and PackBack). During the instructor interview, each of these elements was described as having the main purpose of helping students realize the course's educationally worthwhile learning outcomes by facilitating cognitive and social processes. The instructor stated, "each week was designed to be very consistent, so students will understand what they need to do and not get confused." Furthermore, she explained that each week starts with a task list, then continues with lecture videos and PowerPoints accompanied by notes options (some with blanks for students to fill in and some filled out). She stated, "this is where students learn the content." Then students are directed to complete online homework through a web-based system called WebWork. The instructor remarked that this was very purposefully designed to make sure students learned the course content: "I like the WebWork, I know that it is not that popular, but to me, this is where students show if they know how to do the math." Then there are other assignments designed more for students to work together on real-world problems. The quizzes are called SnapShots.
She described the SnapShots as being directed towards cognitive presence, stating that they are "real-world type application scenarios where students must read an article and answer questions." She also remarked, "these are quizzes but not really, I would love it if the students would do them together so they would communicate." This description of directing social processes continues when the instructor talks about student communication and the projects. She says, "this semester, because of Covid, we let them choose whether or not they worked in groups on the projects…usually, they have to work in groups." Later, it was described that students were always encouraged through email to come to Zoom sessions and Zoom office hours, to ask questions, and to communicate with each other using the Piazza forum or GroupMe.

In the previous paragraph, I described the elements in the Math 101 online classroom and how these elements were directed towards facilitating students' cognitive and social processes. Now, I will describe how these manifested in the course. Specifically, how did these affect students, from the students' points of view and actions? From the argument for claim 1, developed using the student interview data and survey data, students tend to gravitate towards one form of direct instruction (e.g., videos, PowerPoints, or homework attempts) to gain an understanding of the course content. Moreover, efficiency is a popular primary consideration when choosing a preferred form of direct instruction, which is specific to the individual student's preferences. This is true even for students who report going to their preferred, quickly accessed information source during the exploration phase, after encountering an obstacle, as their first choice before choosing to communicate with others. Numerous examples from the data showed students choosing these preferred sources: some students watched all the instructional videos; some only looked at the filled-in notes; others skipped the direct instruction elements altogether. While claim 3 shows that students take part in selected elements of the teaching presence, claim 4 shows that students are just over twice as likely to participate in the graded elements of the course. In Math 101, these include the WebWork homework, SnapShot quizzes, projects, reflections, survey participation, and the final exam.

Furthermore, there were some reports of the direction of the social processes being successful for some students; however, many chose not to communicate. From claim 3's argument, numerous students reported feeling a part of a learning community when they received emails from the instructional team and when they were asked for their opinions and experiences through surveys. Although two students felt negatively about the surveys, with one saying they "were a waste of their time" and the other saying "I did not get anything from them," both described feeling that the surveys might be meaningful to others in the class. However, many interview participants reported not choosing to communicate with others in the class and said that they did not feel like they were a part of a class compared to what they were accustomed to, a traditional face-to-face course. In totality, the teaching presence in Math 101 manifested elements whose purpose was to help students realize the course's educationally worthwhile learning outcomes by facilitating both cognitive and social processes, and sometimes these purposes were realized in student experiences.
Answer Research Question 2

Second research question: How do students report interacting with asynchronous online undergraduate mathematics courses' activities, assessments, and resources, and how does this impact their social presence?

In the following answer to research question 2, I will describe how students reported interacting with Math 101's activities, assessments, and resources and how these impacted their social presence by drawing on the arguments made for claim 2, claim 3, and claim 4. First, recall that social presence is the ability of participants to identify with a group, communicate openly in a trusting environment, and develop personal and affective relationships progressively by projecting their individual personalities (Garrison et al., 2010).

From claim 2, it was described that many of the interview participants reported not working with others in a meaningful way (i.e., had no contact with others or only exchanged a few messages using GroupMe or Piazza) because they felt that it was unnecessary since they were confident with the course content. These points of view of students choosing not to work with others were exemplified and described by Olivia, John, and Jennifer. Students who chose to work with others reported doing so because they needed help or wanted more contact with others. Moreover, the students who chose to work with others reported experiencing positive impacts on their learning and well-being by receiving answers to their questions, developing deeper understanding through discussion, or feeling a part of a learning community. Pearl and Kent were the most demonstrative of these characteristics because they reported developing a relationship that could be described as personal and affective while being able to communicate openly in a trusted environment. Altogether, these two groups of students were well represented in the interview data and survey data.

Furthermore, from claim 3, meaningful contact points can be created between instructor and student using surveys and personalized mass emails. Some students reported feeling a part of a learning community when they received emails from the instructional team and when they were asked for their opinions and experiences through surveys. Although two students felt negatively about the surveys, with one saying they "were a waste of their time" and the other saying "I did not get anything from them," both described feeling that the surveys might be meaningful to others in the class. However, many interview participants reported not choosing to communicate with others in the class and said that they did not feel like they were a part of a learning community compared to what they were accustomed to, a traditional face-to-face course. Many of these students, as was described in the argument for claim 3, did feel comfortable enough to reach out, but they simply did not have any reason to. One student, Joe, is an example of a student who described needing help and not reaching out to his classmates. Although Joe explained later that he does reach out to the instructor, he would have liked some of the optional opportunities for communication, like the Zoom sessions, to be mandatory for a grade, which would have encouraged him to stay on track. In claim 4's argument, it is shown that students were 2.17 times more likely to access an element of Math 101 if the element was associated with a graded assignment as compared to an ungraded assignment.
This finding from claim 4 supports the notion that if Zoom sessions and collaboration on projects were mandatory, then students would be more likely to participate in them and thus create more instances where students may develop personal and affective relationships. This, in turn, would affect social presence positively.

Answer Research Question 3

Third research question: How do these students report their cognitive presence being influenced by the asynchronous online undergraduate mathematics course's activities, assessments, and resources?

First, recall that cognitive presence is defined as "the extent to which learners are able to construct and confirm meaning through sustained reflection and discourse in a critical community of inquiry" (Anderson et al., 2001, p. 11). Furthermore, cognitive presence can be categorized into triggering events, exploration, integration, and resolution (Akyol & Garrison, 2008). Throughout the next few paragraphs, I will answer research question 3 to the extent possible from the study's data. Support for this research question comes from claim 1 and claim 2.

From claim 1's argument, it was shown that students reported choosing a preferred form of direct instruction and that they go to this preferred, quickly accessed information source during the exploration phase, after encountering a triggering event, as their first choice before choosing to communicate with others. It was also discussed during the argument for claim 2 that many of the interview participants reported not working with others in a meaningful way (i.e., had no contact with others or only exchanged a few messages using GroupMe or Piazza) because they felt that it was unnecessary since they were confident with the course content. However, students who chose to work with others reported doing so because they needed help or wanted more contact with others. The choice that these students made to communicate with others when needing help places the social connections that they have made at the center of their process for resolving triggering events within the cognitive presence. Encouragingly, these students reported experiencing positive impacts on their learning and wellbeing from these communication contacts: receiving answers to their questions, developing deeper understanding through discussion, or feeling a part of a learning community.

Together, these claims show that Math 101's teaching presence provided the necessary opportunities to direct students to engage in the cognitive presence. The data showed that some students experienced triggering events while learning and engaging with elements directed towards learning. Moreover, students who experienced these triggering events sometimes chose to communicate with others or to investigate their preferred direct instructional element (e.g., notes, Google, and videos) as part of the exploration phase, with many students reporting, and no one contradicting, that they were able to come to a resolution.

CHAPTER 5: DISCUSSION, IMPLICATIONS, AND CONCLUSION

This research aims to understand instructors' choices to represent their teaching presence when teaching an online undergraduate mathematics course and how these decisions impact students' social and cognitive presence. This research is focused on answering the following three research questions.

1. How does teaching presence manifest itself in an asynchronous online undergraduate mathematics course?
2. How do students report interacting with asynchronous online undergraduate mathematics courses' activities, assessments, and resources, and how does this impact their social presence?

3. How do these students report their cognitive presence being influenced by the asynchronous online undergraduate mathematics course's activities, assessments, and resources?

These questions, answered together, were designed to support an understanding of how teaching presence manifests itself in a fully online asynchronous undergraduate mathematics learning environment and how its presence impacts students' social presence and cognitive presence. Specifically, how may these two presences, social and cognitive, be affected positively by the instructor's choices when representing their teaching presence? To answer these research questions, I collected data from three data sources beyond course artifacts such as the syllabus: (a) a selection of questions from the course surveys, (b) course usage data, and (c) instructor and student interviews. Four claims were generated while analyzing these data sources, and the research questions were answered. In this chapter, I will discuss how the data, claims, and answers to the research questions from the present study connect with current research on instructional strategies and online instruction in undergraduate mathematics learning environments. The chapter will conclude with what I ascertain to be the study's limitations, suggestions for future research, and my concluding remarks.

Discussion

I will first summarize the present study's key findings to start this discussion. This study made four claims based on the collected data. Claim 1 posits that students tend to have singular preferences among the course's direct instructional elements. This, in turn, helped answer research question 1, pertaining to how teaching presence manifests, and research question 3, how cognitive presence is affected. In simpler terms, students pick something they like and stick with it whenever they interact with the teaching presence. This is true whether they are trying to learn from direct instruction in the teaching presence or need something to refer to in the exploration phase after getting stuck during the cognitive presence.

Claim 2 proffers that students who chose to work with others report having positive experiences, while those who chose not to work with others report not needing help, with one exception. Data and arguments from this claim helped answer research question 2 and research question 3 about how social presence and cognitive presence are affected, respectively. Students working together is a key element of social presence, and promoting these social processes is one of the goals that the teaching presence is charged with supporting (Garrison, 2017; Kovanović et al., 2018). Social presence has been a focal point for some research because of the thought that online courses would have the most significant negative impact on students' ability to communicate with each other, thus harming their social presence (Garrison et al., 2010; Testone, 2019). Understanding this key area is paramount to the success of teaching mathematics online.

Claim 3 states that meaningful contact points can be created between the instructor and student using surveys and personalized mass emails; however, most students describe learning mathematics in Math 101 as not making them feel a part of a learning community.
The arguments from this claim support the answers to research question 1 and research question 2 and contribute to understanding the relationship between teaching presence and social presence. Specifically, this claim highlights how an element from the teaching presence can positively influence social presence. Moreover, this teaching presence element was of low cost to the instructor and high value to some students because the communication initiated by the instructor was composed once but replicated in a personalized way for all students by addressing each student by name in the body of the emails. This is important because, as stated in the previous paragraph, researchers have doubted that communication takes place in meaningful ways in the online classroom (Garrison et al., 2010). One of the student participants, Sam, even voiced this doubt when he described social presence as the thing that would be impacted most negatively when moving a class online. Therefore, this claim serves as a glimmer of hope to those wondering whether students' social presence can be positively impacted by communication with an instructional team, especially when the ratio of instructional team members to students was twelve to five hundred forty-seven.

Finally, claim 4 posits that elements of the teaching presence were more likely to foster participation if they were associated with a grade. The present study's course usage data indicated that students were 2.17 times more likely to access an element of Math 101 associated with a graded assignment than an ungraded assignment. This result, above all others, seems the most predictable and surprising at the same time. It seems reasonable that students would spend more time interacting with online activities, or course links in this case, that were associated with their grades. However, over twice as likely still seems surprising. Because the teaching presence is represented as a collection of links on a website, and students choose which links they interact with by clicking them, instructors need to be conscious of the ways in which they choose to communicate with students.

Now that these findings have been summarized, I will make connections between them and the current research. Three general areas of research will be discussed and connected to: research on active learning (Felder & Brent, 1996; Freeman et al., 2014; Prince, 2004; Prince & Felder, 2007; Treisman, 1992; Vickrey et al., 2015; Zientek, Yetkiner Ozel, Fong, & Griffin, 2013), the online mathematics learning environment (Draus et al., 2014; Engelbrecht & Harding, 2005; Hegeman, 2015; Trenholm et al., 2016), and the community of inquiry framework (Akyol, Arbaugh, et al., 2009; Anderson et al., 2001; Garrison, 2017; Garrison et al., 2010; Kovanović et al., 2018; Swan & Ice, 2010).

Connections to Active Learning Research

Active learning is not something that can be readily found in the literature on online mathematics instruction. Typically, the term is used to describe learning that differs from the stand-and-deliver traditional lectures most students can envision from their experiences in the face-to-face classroom, while still situated in the face-to-face classroom. Moreover, in at least some research, the term active learning is used without being defined directly (Freeman et al., 2014; E.
Johnson, Keller, et al., 2018), although describing active learning as being "actively engaged in problem-solving rather than listening to a lecture" (Fong & Visher, 2013, p. 13) seems reasonable. Notably, there are many examples of instructional strategies that would be considered active learning in the face-to-face classroom, like the ones described in chapter 2: peer instruction (Vickrey et al., 2015), inquiry-based learning (Aditomo et al., 2013; Edelson et al., 1996; Freeman et al., 2014), and cooperative learning groups (Felder & Brent, 1996; Fullilove & Treisman, 1990; Springer et al., 1999). While all these bodies of work come from research done in the face-to-face classroom, I will draw the connections that can be made to these strategies from the present study in the online learning environment. The results from the present study did not describe or make any claims about the strategy of peer instruction; however, the student interviews presented in the arguments for the claims in chapter 4 did mention problems that could be described as inquiry-based learning and activities like those found in the literature about cooperative learning groups. The four course elements that were well represented in the data and the claims were the video lectures, the Zoom sessions, the Snapshots, and the projects. These serve as the connections to the active learning literature that I will focus on in the following few paragraphs.

First, I will focus on the video lectures and the Zoom sessions. Students described the video lectures in Math 101 as short videos of problem demonstrations performed by someone on the instructional team. This form of direct instruction contained in the teaching presence would not be considered a form of active learning (Fong & Visher, 2013; Freeman et al., 2014) because it asks students to passively listen to a lecture without the ability to ask questions or discuss problems. This makes the videos kin to the traditional stand-and-deliver lecture, with the added limitation that no questions can be asked during the lecture; rather, questions must be submitted through some other written, electronic means, such as email or the Piazza question forum. On the other hand, the Zoom sessions were an environment that allowed for the demonstration of mathematical problems and the ability for students to ask questions instantaneously. This opportunity for interaction would indicate a higher amount of active learning than the lecture videos if active learning were thought of as a spectrum with different levels, as it is sometimes described in the literature (E. Johnson, Keller, et al., 2018).

Second, both the projects and the Snapshot quizzes serve as examples of inquiry-based learning (Prince & Felder, 2007). Prince and Felder (2007) describe inquiry-based learning as solving problems, analyzing data, or testing a hypothesis. The present study's data showed that these activities were involved in completing Math 101's Snapshot quizzes and projects. In either case, student descriptions supporting these instances can be found in the arguments made in chapter 4, although they were not central to any of the present study's claims or research questions. More connected, the projects fit the description of cooperative learning groups (Felder & Brent, 1996) because they involve students working on complex problems together while maintaining individual assessments (D. Johnson et al., 1998).
While this format of the project was only present in those instances where students in Math 101 elected to work together in groups, it was shown to have a beneficial impact like the positive effects described in the research on cooperative learning groups (Prince, 2004). Prince (2004) describes cooperative learning as a successful instructional strategy that has been shown to have a positive impact on learning and the development of interpersonal skills. While it is difficult to assess these claims and how they compare to the present study's data, it would be reasonable to state that they were supported because students were interacting with each other. Take the case of Kent and Pearl, who described the growth of a friendship that was about more than their mathematical work; it included talking about life and coping with the pandemic. This is an example of how the group projects in Math 101 created the environment necessary to positively affect interpersonal skills during a special time, a pandemic, and using a newly studied medium, the online learning environment. Specifically, this study's findings on social presence demonstrate that it is possible to create a rich active learning environment, of the cooperative learning group variety, in an online course. Still, this leaves the question of how to make cooperative learning groups more prevalent in the online learning environment. It would be necessary to address some students' inclinations not to engage with others, an inclination epitomized by another participant's (Sam's) statement, "I liked the option to work in groups on the project, but I did not want to burden others, so I decided to work alone."

Connections to the Online Learning Environment Research

The present study can draw connections to three research areas on how instructional choices impact students in an online undergraduate mathematics course: the different skills required to instruct in this modality, the importance and impact of communication in online learning, and the use of videos to replace in-person lectures. A prevalent theme in the literature is that many instructors who teach online try to replicate the instructional practices they use in their face-to-face courses (Trenholm et al., 2016). The present study confirms that this claim remains relevant. For instance, the replication of face-to-face lectures, quizzes, projects, and tests was all present in Math 101 as instructional videos, Snapshot quizzes, projects, and online tests. Each showed varying levels of success in its ability to replicate the best features of the corresponding face-to-face element. These findings are consistent with those of Trenholm and colleagues (2016), who interviewed over 70 undergraduate mathematics instructors and asked them to compare their face-to-face courses with their online courses.

Understanding that the replication of face-to-face instructional strategies for the online learning environment was present in the current study's data on Math 101 is essential. More important, however, is how each of these elements was replicated and whether any of the present study's results indicate that these instructional strategies implemented in the teaching presence had a positive impact on social presence and cognitive presence (Garrison et al., 2010).
Of the results from the present study, the instructional videos, as described in the previous section, seemed to relate closely to traditional face-to-face lectures because they provided no interaction. This level of interaction indicates a low level of social process facilitation. On the other hand, the projects indicated a high level of social process among those students who chose to work with others. These results demonstrate that replication of face-to-face instructional strategies is still likely to be found in the online learning environment and that such replication is not enough to support a claim of high instructional quality when using the metric of encouraging cognitive and social processes (Garrison et al., 2010) or active learning (Freeman et al., 2014). Furthermore, the present study's data describing the social connections created among students who worked together on projects suggest a way to design an instructional strategy that increases social processes and thus increases social presence.

Another finding from the present study concerning social presence was the data describing how some students were affected by the instructional team's mass emails and surveys. The student interview participant Pearl described that these mass emails and surveys made her feel a part of the class and that she felt seen and heard. This notion was described in more detail in the argument for claim 3 in chapter 4. Testone (2019) explains that not having personal feedback, or the perceived presence of an instructor, was found to negatively impact student success. The example from the present study shows that, perhaps, positive feedback can come from these communication points on a mass scale, something that may become more important in the future if the current financial constraints continue to impact higher education (Kaser & Hauk, 2016).

This leads to another facet of teaching mathematics online: online instruction requires different skills (Engelbrecht & Harding, 2005). Technology-specific skills were noted throughout the data and arguments presented in chapter 4. Creating quality online instructional videos, quizzes, and projects, and sending out mass personalized emails and surveys, are all part of the skillset required for instructing a course online and differ from the skills required to instruct face-to-face. Of these skills, the ability to send out personalized mass emails and surveys was found in the present study's data. I personally have taught for eighteen years, and I do not know how to send out a personalized mass email; however, I do know how to send out an online survey. My personal skillset serves as an example of how an instructor's skillset may affect their ability or inclination to apply different strategies that require technological skills. Whether it be understanding how to upload a video or program a mass email, this study indicates that these skills were valuable and impacted the teaching presence's ability to affect social processes as part of the social presence.

Trenholm and colleagues (2016) found that many of the modes of communication provided to students were computer-generated and that even the discussion boards that students mostly used to communicate provided very few examples of students communicating with each other or with their instructor. Furthermore, Trenholm and colleagues found that this lack of communication had a negative impact on students' perception of the course.
However, the present study's results suggest that mass communication may positively affect students when approached differently. I propose that the difference is whether the student perceives that personal communication is taking place. For example, the communication that Trenholm and colleagues describe was generated by a computer program and, perhaps, was easily recognized by the students as such. In contrast, when the mass email communication in Math 101 was described in the student interviews, the students explained that the emails felt personal and that they envisioned the instructor personally writing to them because their name was in the greeting. During the instructor interview, the instructor explained that she could type an email using a generic placeholder in the text that would be automatically replaced with each student's first name as associated with their email address. This example describes a fine line between what students may feel is personal communication and what they do not. Moreover, this line may move over time as technology evolves and the general understanding of what can and cannot be personalized communication changes. For instance, many would concede that when chatbots first began staffing the help sections of business websites, they may have been more easily mistaken for humans than they are today, now that their use is widespread.

Finally, Cho and Heron (2015) recommended that instructors build opportunities for interaction early in their online mathematics course to help build students' self-confidence. This last note is important because the instructional team for Math 101 provided these opportunities for students to interact and build confidence. An example of this was described in the instructor interview and by the student interview participant, Cheng, both presented in chapter 4. The instructional team for Math 101 created an early assignment that required students to perform some relatively simple technical tasks: download a file, convert the file to PDF, attach the file to an email, and send it to the instructional team. The instructor described the motivation behind this task as three-fold: first, to help students overcome technical issues early in the semester; second, to build their confidence by completing a task; and third, to help them create a relationship with someone on the instructional team. This example from the present study's data exemplifies Cho and Heron's recommendation.

Connections to the Community of Inquiry Framework

The community of inquiry framework describes educational experiences using three interrelated categories: teaching presence, social presence, and cognitive presence (Anderson et al., 2001; Lipman, 2003). These three categories were present in the study's research questions and collected data. The first outcome the present study concluded about the community of inquiry framework is that all the data were represented by one of the three community of inquiry categories and that each of the categories was represented in the data. This result demonstrates the good fit between the community of inquiry framework and the present study's aim to understand the instructors' choices when teaching an online undergraduate mathematics course and how these decisions impact students. Now, I will describe how each category represented in the present study's data connects to those found and described in the research.
Teaching presence describes the actions taken by the instructor to facilitate learning and communication (Akyol, Garrison, et al., 2009). Teaching presence has been described as "the design, facilitation and direction of cognitive and social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes" (Anderson et al., 2001, p. 5), and it is perceived to influence both the social presence and cognitive presence (Akyol & Garrison, 2008). The present study's results concluded that these descriptions of teaching presence were accurate; however, perhaps the sentiment provided by Wertz (2022) is more succinct when applying the community of inquiry framework to an online learning environment. Wertz characterizes teaching presence in an online classroom as represented by the instructional tools and activities the students interact with while in the online classroom environment. This characterization fits the online classroom well because 100% of the teaching presence takes place via electronic means.

Teaching presence was further noted in the research literature as an essential and influential piece of students' satisfaction with their learning experiences (Swan & Ice, 2010). Swan and Ice's (2010) sentiment is consistent with the present study's results. As described in chapter 4, many of the students explained that they were happy with the elements that made up the teaching presence of Math 101; some had ideas for improvement, but none voiced any dissatisfaction. This study thus offers further support for Swan and Ice's finding that teaching presence plays an important and influential role in students' satisfaction with their learning experience.

Akyol and Garrison (2008) state that teaching presence is perceived to influence the social presence and cognitive presence. The present study suggests that these findings (Akyol & Garrison, 2008) are valid. Each of the present study's four claims states how the teaching presence impacted the social and cognitive presence. Claim 1 and claim 4 describe how students chose to interact with the teaching presence, which leads to the impacts described by claim 2 and claim 3. Importantly, communication initiated by the instructional team influenced the social processes by fostering affective communication among students. Encouraging students to work together in groups allowed some students to build relationships through which they could communicate meaning in discussion with their groupmates. This finding offers support to the notion that teaching presence influences social presence and cognitive presence (Akyol & Garrison, 2008) and, further, that it is charged with "the design, facilitation and direction of cognitive and social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes" (Anderson et al., 2001, p. 5).

Social presence was one of the earliest foci of online learning research because it depended on communication among students and community building, both of which were doubted to be successfully replicable in the online learning environment (Garrison et al., 2010). After all, it was not fully accepted that students would be able to communicate with each other effectively and in meaningful ways online. This skepticism persists in the culture today and is one of the foci of the current study.
This skepticism was even echoed by Sam, one of the student interview participants in the present study, after he had heard me briefly explain the study's research focus. Like Garrison and colleagues (2010), Sam expressed that social presence was the least likely of the presences to be demonstrated in the online environment. This belief was due, in part, to the online nature of communication at the time and to the struggles with implementing the communication aspect of active learning even in the face-to-face classroom (Walcyzk et al., 2007). The results of the present study indicated that some students were able to share meaningful communication when they chose to communicate. Garrison and colleagues (2009) describe social presence as "the ability of participants to identify with the community (e.g., course of study), communicate purposefully in a trusting environment, and develop interpersonal relationships by way of projecting their individual personalities" (p. 32). Most of the data presented in chapter 4 would seem to suggest that Sam and Garrison and colleagues were correct in their assumptions, because most of the student interview participants, by any of the metrics described above from the research, chose not to communicate with others, including the instructional team. However, the present study indicates that those who did choose to communicate with others were able to do so in positive ways.

Another place the present study can connect is the research by Swan and Ice (2010), who focused their definition of social presence on the online learning environment. Thus, they defined social presence, more specifically, as "the degree to which participants in computer-mediated communication feel affectively connected one to another" (Swan & Ice, 2010, p. 1). This specific definition works well, in its scope, within the present study. It offers a way to ask the more specific question that may be more pertinent to computer-mediated communication: does the communication in the online course allow students to feel affectively connected to one another? The present study's results indicate that, using this definition, computer-mediated communication did allow students to feel affectively connected to one another. This connection is evident in the relationship described by Kent and Pearl. Therefore, the present research supports the definition provided by Swan and Ice. However, the present study's data also confirm the need for the broader definition of social presence typically used by most researchers (Anderson et al., 2001; Deris et al., 2012; Tirado Morueta et al., 2016; Wertz, 2022).

Cognitive presence has been seen to be shaped by the teaching presence and social presence (Garrison et al., 2010) and refers to a student's ability to demonstrate knowledge by conferring meaning (Anderson et al., 2001). Most accept Garrison and colleagues' (2009) conceptualization of cognitive presence in the online learning environment as "the extent to which learners are able to construct and confirm meaning through sustained reflection and discourse" (Arbaugh et al., 2008; Swan & Ice, 2010). Cognitive presence can be broken down into four linear phases: (a) triggering event, (b) exploration, (c) integration, and (d) resolution (Garrison et al., 2006). Each of these describes the events that take place while a student is learning a new concept. When the student comes across a new concept or obstacle, this creates a triggering event.
The student then must explore new and current knowledge to find helpful information. This information must then be integrated into the concept for the student to come to a resolution (Tirado Morueta et al., 2016). The present study cannot offer anything that contradicts these definitions or assumptions of cognitive presence. Instead, it can only detail how students interacted with the teaching presence and social presence when faced with a triggering event during the exploration phase. The study's results suggest that students would often go to their chosen source of instruction (e.g., instructional videos, Google, and notes), while others might reach out for help using Piazza, GroupMe, or, more often, someone convenient, like a roommate or friend. In either case, efficiency seemed to be one of the factors that students considered when negotiating the exploration phase within the cognitive presence category. Another, smaller group of data presented in chapter 4 showed that some students chose to skip each of the direct instructional elements and go straight to the homework so they could confirm their content knowledge. This example depicts a student going straight to the resolution phase because they could successfully integrate the homework tasks with their prior knowledge (Garrison et al., 2006).

Implications

This study was designed to support an understanding of how teaching presence manifests itself in a fully online asynchronous undergraduate mathematics learning environment and how it impacts students' social presence and cognitive presence. In other words, it asks the question: what impacts does the general design of an online undergraduate mathematics course have on students in terms of their social communication and learning? As such, the results of this study have implications for both the research and practice communities. I will now describe these implications in two sections: a section describing the implications for the research community and a section describing the implications for the practice community. The implications for the practice community will be presented, in part, through a personal example.

First, the implications for the research community are very optimistic. Starting with the research surrounding active learning, many researchers have described active learning strategies as not widely implemented (Walcyzk et al., 2007) and noted that much of the communication required for these active learning strategies is challenging to establish in the online learning environment (Garrison et al., 2010). This paints a bleak picture for the online learning environment; however, the present study offers hope for these strategies being implemented successfully online. Many of the findings from the present study offer examples of affective communication being created through cooperative learning groups in an online mathematics course and through personalized mass emails. Moreover, the present study has implications for research on the online learning environment. The current study's results imply that even though sizes of online mathematics classes may still grow, there may be ways the teaching presence can facilitate high levels of social processes using mass email, surveys, cooperative learning groups, and other online tools. Thus, these specific tools should be studied and evaluated for their effects on social presence and cognitive presence on the mass scale.
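As a concrete aside before turning to the practitioner implications, the mail-merge mechanism the instructor described earlier, a generic placeholder automatically replaced with each student's first name, can be implemented with very little code. The sketch below is a minimal, hypothetical Python example, not the tool used in Math 101; the roster, SMTP host, addresses, and message text are all assumptions made for illustration.

```python
import smtplib
from email.message import EmailMessage
from string import Template

# Hypothetical roster; in practice this would come from an LMS export.
roster = [
    {"first_name": "Pearl", "email": "pearl@example.edu"},
    {"first_name": "Kent", "email": "kent@example.edu"},
]

# A generic placeholder ($first_name) is substituted per student,
# mirroring the mechanism the instructor described in her interview.
body_template = Template(
    "Hi $first_name,\n\n"
    "Great work on this week's project. Remember that the survey "
    "closes Friday.\n\n"
    "Best,\nYour Math 101 instructional team"
)

with smtplib.SMTP("smtp.example.edu") as server:  # assumed mail server
    for student in roster:
        msg = EmailMessage()
        msg["Subject"] = "Math 101 | Weekly check-in"
        msg["From"] = "math101@example.edu"
        msg["To"] = student["email"]
        msg.set_content(body_template.substitute(first_name=student["first_name"]))
        server.send_message(msg)
```

The essential idea is only the per-student substitution that makes a mass message read as personal; an instructor would substitute their own roster and institutional mail server.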
Second, this research has many implications for the practitioner community. It implies that it is crucial and worthwhile for online mathematics instructors to stay well versed in online tools and instructional strategies that can be implemented to create a high-quality online learning environment for students. This need for instructors to stay up to date includes tools that have not been created yet; however, the present study suggests four specific things, available today, that instructors should familiarize themselves with: (a) prescribe opportunities for students to communicate with each other, such as having assignments that are completed in cooperative learning groups; (b) communicate with your students through personalized means (e.g., emails, surveys, and Zoom sessions); (c) use feedback from surveys to inform your future teaching practice; and (d) ensure that your communication and direct instruction are observed by tying them to elements associated with grades.

Here is an example of how the current research could impact my own practice. Currently, I am the instructor of approximately one to two hundred students taking a course like the one in the present study. After analyzing the implications, I plan to implement student surveys to give students a chance to voice their opinions. This will allow students to feel more a part of the course and allow me to make improvements based on their opinions. I also plan to request data from my college's technology team. These data, like this study's course usage data, should be available to all instructors and would help me better understand which parts of my teaching presence students are participating with. Then, I will be able to assess how I have organized the information in the course to ensure that it is being observed by as many students as possible. Finally, I will continue to offer students timely feedback and cooperative group assignments, and I will start learning how to create and send out personalized mass emails.

Limitations and Future Research

The present study had many limitations, some that could have been avoided and some that could not have been. This section describes the present study's limitations and how identifying them may inform future research. First, this study deliberately focused on understanding, from the students' points of view, how the instructor's design of the teaching presence impacted the social presence and cognitive presence. The focus, therefore, was to gain an understanding of the students' perspectives in their own words. This was done mainly using student interviews, with some support from student surveys. This choice was made because it seemed to offer the best balance between the rich data of the student interviews and briefer data, such as a student's reply to a multiple-choice survey. This points to one of the most considerable limitations of the present study: its generalizability. From this study, one can feel very confident about how an interview participant feels about a particular aspect of Math 101 and, through the supportive data that the survey offered, somewhat confident about how prevalent some of these expressed opinions might be across all the students in Math 101. However, as the scope of generalization grows, the confidence in each of these statements weakens. Therefore, I propose that future research focusing on specific aspects of the current study be carried out using surveys.
Now that more examples of how students express their opinions have been discovered, survey questions could be designed to capture the essence of what the interview participants described so that more students' opinions can be captured more accurately.

Another of the study's limitations was due to the student population and the course design. First, the participant population was from a large midwestern university and not a small community college. These are markedly different populations for many reasons. To name one, the university has acceptance requirements based on students' prior educational experiences and performance. In contrast, community colleges are primarily open-door institutions with no educational barriers to entry. This point alone makes a strong enough case to be cautious about generalizing the present study's findings to the community college population. Therefore, studies like this one should be done focusing on community college populations. Then researchers may be able to draw parallels between the two populations in terms of how the choices instructors make in their teaching presence impact the social presence and cognitive presence.

Finally, the design of this study did not focus on several of the different aspects of cognitive presence (Akyol, Garrison, et al., 2009; Garrison et al., 2010). During the study's planning phase, the questions and attention given to the cognitive presence were deliberately framed around how students described their reactions to getting stuck while learning the course content. However, now that the present study has been conducted, it has become clear that a more detailed approach to understanding how teaching presence and social presence impact cognitive presence is warranted. Therefore, a future study designed around the four phases of the cognitive presence, (a) triggering event, (b) exploration, (c) integration, and (d) resolution (Garrison et al., 2006), should investigate more closely how students progress through each phase.

Concluding Remarks

I personally had one underlying thought from the beginning of the current research: what will I be able to add to the current body of research through student interviews that a large quantitative, more generalizable study may not? The present study proved to be a necessary investigation of how students are affected by instructors' instructional choices by presenting empirical data that express these effects in students' own words. Even though these community of inquiry categories have been studied before using student surveys, it was only through these student interviews that I was able to discover, in detail, how the choices the instructor made affected how students felt. Using student interviews as the primary source was key to discovering that students choose a primary, go-to instructional element and that students can be affected positively by personalized mass emails. These are unique details that a larger-scale approach may miss. However, the present study may aid larger-scale studies by offering descriptions of these students' opinions in the detail that only an interview can provide.

APPENDICES

APPENDIX A: Survey Questions

Math 101 Survey Questions

Example Survey Opening Questions

The following questions inquire about how well each statement describes how you engage with Math 101. There are no right or wrong answers, so please do not hesitate to be honest.
The results of this survey will be used to try to improve courses like Math 101. This survey will take approximately 5-10 minutes to complete. You will be given full credit for completing this survey. The course instructor will not see or have access to the results of this survey at any point. The instructor will only be given a list of names of students who have completed the survey so that they may receive full credit. Any responses submitted by students will only be used in the aggregate. That is, before any of the responses are used for any purpose, all the names of the students will be removed.

Please provide your MSU email. __________________________________________________

Would you be willing to be contacted for a short (1 hour long) interview? _________________

Questions from Math 101 Course Usage Survey

Do you watch the weekly Math 101 content videos on D2L? These are the videos that explain the mathematical concepts every week.
Yes
No, I don't feel the need to since I rely on the written notes.
No, I didn't know there were videos that explain the content.

Please write down any feedback you have to improve the Math 101 content videos.

Have you attended any of the weekly Zoom help sessions? Yes/No
If you answered no, then please provide some feedback on why not.

Have you used Piazza to ask a question this semester? Yes/No

Have you used Piazza to read other students' questions and answers? Yes/No

Please provide any feedback that may improve the organization of the Math 101 D2L site.

Please provide any feedback on anything your other courses are doing that Math 101 is not doing and that you would like us to do.

Questions from Online Student Engagement Survey

For each of the following statements, choose the description that most accurately reflects your point of view while specifically reflecting on your recent experiences in Math 101. (Response options for each item: Very characteristic of me, Moderately characteristic of me, Not really characteristic of me, Not at all characteristic of me, Characteristic of me.)

Staying up on modules (readings, videos, etc.)
Engaging in conversations online (chat, discussions, email)
Taking good notes over readings, PowerPoints, or video lectures
Having fun in online chats, discussions, or via email with the instructor or other students
Participating actively in small-group discussion forums (e.g., Piazza, PackBack, etc.)
Helping fellow students
Doing well on the Snapshots/quizzes
Getting to know other students in the class
Posting in the class discussion forum regularly

Questions from Mid Semester Student Course Feedback Survey

The course assessments (e.g., WebWork, Snapshots) deepen your understanding of the learning objectives and/or key concepts.
Strongly disagree, Somewhat disagree, Neither agree nor disagree, Somewhat agree, Strongly agree

The course content delivery (e.g., notes, content videos) deepens your understanding of the learning objectives and/or key concepts.
Strongly disagree, Somewhat disagree, Neither agree nor disagree, Somewhat agree, Strongly agree

The course content delivery (e.g., homework, lectures, or discussions) prepares you for the assessments (e.g., WebWork, Snapshots).
Strongly disagree, Somewhat disagree, Neither agree nor disagree, Somewhat agree, Strongly agree

How well are the questions on assessments (e.g., WebWork, Snapshots) aligned with course content delivery (e.g., homework, lectures, discussions)?
They are not aligned.
They are generally not aligned.
They are neither aligned nor misaligned.
They are generally aligned.
They are tightly aligned.

How much time each week do you spend outside of the classroom to work on assignments, readings, and/or studying?
Less than 1 hour
1-3 hours
4-6 hours
7-10 hours
More than 10 hours

There are ample opportunities to ask members of your instructional team questions.
Strongly disagree, Somewhat disagree, Neither agree nor disagree, Somewhat agree, Strongly agree

There are ample opportunities to ask questions of your classmates in class.
Strongly disagree, Somewhat disagree, Neither agree nor disagree, Somewhat agree, Strongly agree

The use of technology (e.g., D2L, video, online resources) enhances your understanding of key concepts.
Strongly disagree, Somewhat disagree, Neither agree nor disagree, Somewhat agree, Strongly agree

Which of the following resources do you use to contact your instructors? (Select all that apply)
Weekly Zoom sessions
Email
Piazza forum
One on one appointment with instructors
Other

Which of the following resources do you use to contact your fellow students? (Select all that apply)
Weekly Zoom sessions
Email
Piazza forum
One on one appointment with instructors
Texting
GroupMe
Other

The resources available to you (e.g., D2L, help room, office hours) are sufficient to help you succeed.
Strongly disagree, Somewhat disagree, Neither agree nor disagree, Somewhat agree, Strongly agree

Rate your overall satisfaction with the instructional team.
Extremely dissatisfied
Somewhat dissatisfied
Neither satisfied nor dissatisfied
Somewhat satisfied
Extremely satisfied

Rate your overall level of learning in this course.
I am not learning anything
I am learning very little
I learn a few things now and then
I learn new things often
I am learning new things almost every day

What elements of the class most contribute to your learning?

If someone told you they were taking this course next semester, what advice would you give them about the class?

What could be added or changed to make this class work better for you, or to help you learn more?

Optional: general comments

Questions from D2L Usage Survey Part II

How often do you access the Math 101 D2L course page?
Once a week
Multiple times a week
Every day of the week
Less than once a week and only when an assignment is due
I haven't been to the page yet

Have you been able to find and use the weekly content videos and explanations of content on D2L? Yes/No
If you answered no, please explain what the issue is:

Have you been able to contact someone from the instructional team to ask questions about any aspect of the course? Yes/No/Not applicable
If you answered no, please explain what the issue is:

Are there specific aspects of the Math 101 D2L course page (e.g., content, organization, flow) that you find particularly useful? Yes/No
If you answered yes, please explain what you find useful about the D2L course page.
If you answered no, please explain what you do not find useful and any suggestions to improve the D2L course page.

Questions from End of Course Survey

What is one thing you think the weekly video lessons did well?
What is one way you think the weekly video lessons could be improved?
Any other comments about the weekly video lessons.

Questions from Group Work Feedback Survey Part II

Did you select to work with a group for any project in Math 101?
How did you communicate with your group members? (Select all that apply)

APPENDIX B: Student Interview Guide

Student Interview Guide

Begin the interview by introducing yourself.
Explain the project and how the interview data will be used. Then ask for consent to record the Zoom session, indicating that this will record both audio and video. Then begin asking the interview questions with their follow-ups. A sample introduction can be found below.

Hi, I am Bob. I am a graduate student in the mathematics education program here. Thank you for agreeing to meet with me today for this interview. I would like to first introduce and explain my research and then ask you for consent to interview you. My research is interested in how activities, assessments, and resources impact students in an online undergraduate mathematics class. Since you are enrolled in Math 101, I am really interested in how you describe your experience with the course. The answers that you share with me will help me understand how you feel about your interaction with Math 101's activities, assessments, or resources. I will be the only person who reviews this interview, and I will describe your answers in my research by referring to you with an alias. Do you have any questions about this research before we begin the questions? Do you have any suggestions as to what alias you would like me to use for you?

1. The audio and video for this interview are being recorded through the Zoom conferencing software. Do you consent to this recording? Please indicate verbally.
2. How are you doing today?
3. How are your classes this semester?
4. How do you feel about how Math 101 is going?
5. How do you engage with Math 101 online? Can you walk me through a typical week as it relates to Math 101? (Keep an eye out for Teaching, Social, and Cognitive Presence.)
a. What do you have to do exactly? Are there any assignments? Are there optional activities?
6. Follow up about Teaching Presence
a. Explain your experience with the instruction that you described in the course.
i. Assignments, activities, or resources?
7. Follow up about Social Presence
a. What were some of the activities that made you feel like you were a part of the class?
b. Please describe how you communicated with your instructor or other students in this course.
i. Did this communication help you?
ii. Which of these ways to communicate were helpful? Why?
iii. Which of these ways to communicate were not helpful? Why?
iv. Did you feel like you got to know the instructor or the other students in the course?
v. Did the communication opportunities help you feel more comfortable learning the Math 101 content?
8. Follow up about Cognitive Presence
a. What was the most beneficial thing for your learning the Math 101 content?
b. What about this made it beneficial?
c. Can you describe an example of a topic that you struggled with in this course and how this helped you learn the topic?
9. What was the least beneficial thing in this class for your learning?
a. What about this made it not as beneficial?
10. Have you taken any other online classes? Was there anything that you found useful in those courses that you think would be helpful in this course? In what ways did that thing help you?

APPENDIX C: Instructor Interview Guide

Instructor Interview Guide

Begin the interview by introducing yourself. Explain the project and how the interview data will be used. Then ask for consent to record the Zoom session, indicating that this will record both audio and video. Then begin asking the interview questions with their follow-ups. A sample introduction can be found below.

Hi, I am Bob. I am a graduate student in the mathematics education program here.
Thank you for agreeing to meet with me today for this interview. I would like to first introduce and explain my research and then ask you for consent to interview you. My research is interested in how activities, assessments, and resources impact students in an online undergraduate mathematics class. Since you are the instructor of Math 101, I am really interested in how you describe the elements of the course and your motivations for their design and inclusion. I will be the only person who reviews this interview, and I will describe your answers in my research by referring to you with an alias. Do you have any questions about this research before we begin the questions?

1. The audio and video for this interview are being recorded through the Zoom conferencing software. Do you consent to this recording? Please indicate verbally.
2. How are you doing today?
3. What are the assessments in Math 101?
a. What were the motivations for the design and inclusion of these assessments?
4. What are the learning activities in Math 101?
a. What were the motivations for the design and inclusion of these activities?
5. What resources are provided in the course (videos, text, etc.)?
a. What were the motivations for the design and inclusion of these resources?
6. What are the expected ways for communication between you and the students?
7. What are the expected ways for communication between the students?

APPENDIX D: Recruitment Emails

Recruitment Emails

Initial Email

Subject: Math 101 | Setting Up Time for Interview

Hi,

I cannot wait to speak with you about your Math 101 experience. Earlier this semester you indicated on a survey that you were willing to share your thoughts about Math 101. Do you have time to meet on Zoom for a 30-minute chat? Your opinions and experience will help shape the future of online math courses at MSU and other colleges and universities. Please respond to this email with times that work best for you. If none of these times work, please suggest a time.

Monday at 2 pm
Tuesday at 2 pm
Wednesday at 2 pm
Thursday at 2 pm
Friday at 11 am

Have a fantastic day,
Bob Elmore
Doctoral Candidate
Program in Mathematics Education
Michigan State University

Follow-up Email

Subject: Math 101 | Setting Up Time for Interview - Does this week work?

Hi,

I understand that last week did not work with your schedule; however, if you would still like to share your thoughts about Math 101, I would love to hear them. Do you have time to meet on Zoom for a 30-minute chat? Your opinions and experience will help shape the future of online math courses at MSU and other colleges and universities. Please respond to this email with times that work best for you. If none of these times work, please suggest a time.

Monday at 2 pm
Tuesday at 2 pm
Wednesday at 2 pm
Thursday at 2 pm
Friday at 11 am

Have a fantastic day,
Bob Elmore
Doctoral Candidate
Program in Mathematics Education
Michigan State University

Final Email

Subject: Math 101 | Setting Up Time for Interview – Last Chance

Hi,

I understand that last week did not work with your schedule; however, if you would still like to share your thoughts about Math 101, I would love to hear them. Do you have time to meet on Zoom for a 30-minute chat? Your opinions and experience will help shape the future of online math courses at MSU and other colleges and universities. Please respond to this email with times that work best for you. If none of these times work, please suggest a time. Also, this will be my last email bugging you to chat.
I hope this semester goes well and have a wonderful Spring!

Monday at 2 pm
Tuesday at 2 pm
Wednesday at 2 pm
Thursday at 2 pm
Friday at 11 am

Have a fantastic day,
Bob Elmore
Doctoral Candidate
Program in Mathematics Education
Michigan State University

APPENDIX E: Project Timeline

Project Timeline

December 15, 2020: Dissertation Proposal Defended
February 23, 2021: Dissertation Proposal Approved
Spring 2021: Collected Data
  January 25 – May 6: Students Complete Surveys
  February 24: Instructor Interviewed
  March 15 – April 9: Recruited Students
  March 16 – April 16: Interviewed Students
  May 3: Contacted University's Technology Team for Usage Data
  May 12: Downloaded the Survey Data
  May 12: Downloaded Syllabus and Recorded Video of Math 101 Website
  May 14: Received Course Usage Data
Summer 2021: Analyzed Data
  May 19 – June 30: First Level Coding of all Interview Data into Rubrics
  July 8 – July 28: Second Level Coding of all Rubrics into Profiles
  July 29 – August 31: Analyzed Course Artifacts, Surveys, and Course Usage Data

REFERENCES

Aditomo, A., Goodyear, P., Bliuc, A. M., & Ellis, R. A. (2013). Inquiry-based learning in higher education: Principal forms, educational objectives, and disciplinary variations. Studies in Higher Education, 38(9), 1239–1258. https://doi.org/10.1080/03075079.2011.616584

Akyol, Z., Arbaugh, J. B., Cleveland-Innes, M., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. (2009). A response to the review of the community of inquiry framework. International Journal of E-Learning & Distance Education, 23(2), 123–136.

Akyol, Z., Garrison, D. R., & Ozden, M. Y. (2009). Development of a community of inquiry in online and blended learning contexts. Procedia - Social and Behavioral Sciences, 1(1), 1834–1838. https://doi.org/10.1016/j.sbspro.2009.01.324

Akyol, Z., & Garrison, R. (2008). The development of a community of inquiry over time in an online course: Understanding the progression and integration of social, cognitive and teaching presence. Online Learning, 12(3–4), 3–22. https://doi.org/10.24059/olj.v12i3-4.1680

Anderson, T., Rourke, L., Garrison, R., & Archer, W. (2001). Assessing teaching presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5(2), 1–17.

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the community of inquiry framework using a multi-institutional sample.
Internet and Higher Education, 11(3–4), 133–136. https://doi.org/10.1016/j.iheduc.2008.06.003

Baran, E., Correia, A. P., & Thompson, A. (2011). Transforming online teaching practice: Critical analysis of the literature on the roles and competencies of online teachers. Distance Education, 32(3), 421–439. https://doi.org/10.1080/01587919.2011.610293

Cho, M. H., & Heron, M. L. (2015). Self-regulated learning: The role of motivation, emotion, and use of learning strategies in students' learning experiences in a self-paced online mathematics course. Distance Education, 36(1), 80–99. https://doi.org/10.1080/01587919.2015.1019963

Crouch, C. H., & Mazur, E. (2001). Peer Instruction: Ten years of experience and results. American Journal of Physics, 69(9), 970–977. https://doi.org/10.1119/1.1374249

Deris, F. D., Zakaria, M. H., & Wan Mansor, W. F. A. (2012). Teaching presence in online course for part-time undergraduates. Procedia - Social and Behavioral Sciences, 66, 255–266.

Dewey, J. (1933). How we think. Lexington, MA: D.C. Heath.

Draus, P., Curran, M., & Trempus, M. (2014). The influence of instructor-generated video content on student satisfaction with and engagement in asynchronous online classes. Journal of Online Learning and Teaching, 10(2), 240–254.

Edelson, D. C., Gordin, D. N., & Pea, R. D. (1999). Addressing the challenges of inquiry-based learning through technology and curriculum design. Journal of the Learning Sciences, 8(3–4), 391–450. https://doi.org/10.1080/10508406.1999.9672075

Ehrenberg, R., & Zhang, L. (2005). Do tenured and tenure-track faculty matter? The Journal of Human Resources, 40(3), 647–659.

Engelbrecht, J., & Harding, A. (2005). Teaching undergraduate mathematics on the internet. Educational Studies in Mathematics, 58(2), 253–276. https://doi.org/10.1007/s10649-005-6457-2

Felder, R. M., & Brent, R. (1996). Navigating the bumpy road to student-centered instruction. College Teaching, 44(2), 43–47. https://doi.org/10.1080/87567555.1996.9933425

Fong, K., & Visher, M. (2013). Math redesign at Broward College. In Fast forward: A case study of two community college programs designed to accelerate students through developmental math (pp. 11–25). https://doi.org/10.1177/0306422013500187

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111

Fullilove, R. E., & Treisman, P. U. (1990). Mathematics achievement among African American undergraduates at the University of California, Berkeley: An evaluation of the mathematics workshop program. The Journal of Negro Education, 59(3), 463. https://doi.org/10.2307/2295577

Garrison, D. (2017). E-learning in the 21st century: A community of inquiry framework for research and practice (3rd ed.). New York, NY: Routledge.

Garrison, D., Cleveland-Innes, M., & Fung, T. (2010). Exploring causal relationships among teaching, cognitive and social presence: Student perceptions of the community of inquiry framework. Internet and Higher Education, 13(1–2), 31–36. https://doi.org/10.1016/j.iheduc.2009.10.002

Garrison, D., Cleveland-Innes, M., Koole, M., & Kappelman, J. (2006). Revisiting methodological issues in transcript analysis: Negotiated coding and reliability. Internet and Higher Education, 9(1), 1–8. https://doi.org/10.1016/j.iheduc.2005.11.001

Glass, J., & Sue, V. (2008).
Student preferences, satisfaction, and perceived learning in an online mathematics class. MERLOT Journal of Online Learning and Teaching, 4(3), 325–338.

Hegeman, J. S. (2015). Using instructor-generated video lectures in online mathematics courses improves student learning. Online Learning, 19(3), 70–87.

Johnson, D., Johnson, R., & Smith, K. (1998). Cooperative learning returns to college: What evidence is there that it works? Change, 30(4), 26–35.

Johnson, E., Andrews-Larson, C., Keene, K., Keller, R., Fortune, N., & Melhuish, K. (2018). Inquiry and inequity in the undergraduate mathematics classroom. In Proceedings of the 40th Annual Conference of the North American Chapter of the International Group for the Psychology of Mathematics Education. Greenville, SC.

Johnson, E., Keller, R., & Fukawa-Connelly, T. (2018). Results from a survey of abstract algebra instructors across the United States: Understanding the choice to (not) lecture. International Journal of Research in Undergraduate Mathematics Education, 4, 254–285. https://doi.org/10.1007/s40753-017-0058-1

Kaser, J., & Hauk, S. (2016). To be or not to be an online math instructor? MathAMATYC Educator, 7(3), 41–47.

Kennedy, N. S. (2012). Lipman, Dewey, and philosophical inquiry in the mathematics classroom. Education and Culture, 28(2), 81–94. https://doi.org/10.1353/eac.2012.0012

Kogan, M., & Laursen, S. L. (2014). Assessing long-term effects of inquiry-based learning: A case study from college mathematics. Innovative Higher Education, 39(3), 183–199. https://doi.org/10.1007/s10755-013-9269-9

Kovanović, V., Joksimović, S., Poquet, O., Hennis, T., Čukić, I., de Vries, P., … Gašević, D. (2018). Exploring communities of inquiry in massive open online courses. Computers and Education, 119, 44–58. https://doi.org/10.1016/j.compedu.2017.11.010

Laursen, S. L., Hassi, M., Kogan, M., & Weston, T. (2014). Benefits for women and men of inquiry-based learning in college mathematics: A multi-institution study. Journal for Research in Mathematics Education, 45(4), 406–418. https://doi.org/10.5951/jresematheduc.45.4.0406

Lipman, M. (2003). Thinking in education (2nd ed.). New York, NY: Cambridge.

Michigan State University demographics and diversity report. (2021). Retrieved February 1, 2022, from https://www.collegefactual.com/colleges/michigan-state-university/student-life/diversity/#overview

Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223–232. https://doi.org/10.1002/j.2168-9830.2004.tb00809.x

Prince, M., & Felder, R. (2007). The many faces of inductive teaching and learning. Journal of College Science Teaching, 36(5), 14–20.

Shea, P., & Bidjerano, T. (2018). Online course enrollment in community college and degree completion: The tipping point. The International Review of Research in Open and Distributed Learning, 19(2), 282–293.

Sonnert, G., Sadler, P. M., Sadler, S. M., & Bressoud, D. M. (2015). The impact of instructor pedagogy on college calculus students' attitude toward mathematics. International Journal of Mathematical Education in Science and Technology, 46(3), 370–387. https://doi.org/10.1080/0020739X.2014.979898

Springer, L., Stanne, M. E., & Donovan, S. S. (1999). Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: A meta-analysis. Review of Educational Research, 69(1), 21–51. https://doi.org/10.3102/00346543069001021

Swan, K., & Ice, P. (2010).
The community of inquiry framework ten years later: Introduction to the special issue. Internet and Higher Education, 13(1–2), 1–4. https://doi.org/10.1016/j.iheduc.2009.11.003

Testone, S. (2019). Quality online developmental math courses: The instructor's role. Research and Teaching in Developmental Education, 19(2), 59–63.

Tirado Morueta, R., Maraver López, P., Hernando Gómez, Á., & Harris, V. W. (2016). Exploring social and cognitive presences in communities of inquiry to perform higher cognitive tasks. Internet and Higher Education, 31, 122–131. https://doi.org/10.1016/j.iheduc.2016.07.004

Treisman, U. (1992). Studying students studying calculus: A look at the lives of minority mathematics students in college. The College Mathematics Journal, 23(5), 362–372.

Treisman, U. (2008). Emerging scholars program. Making the Connection: Research and Teaching in Undergraduate Mathematics Education, 18(73), 205.

Trenholm, S., Alcock, L., & Robinson, C. (2015). An investigation of assessment and feedback practices in fully asynchronous online undergraduate mathematics courses. International Journal of Mathematical Education in Science and Technology, 46(8), 1197–1221. https://doi.org/10.1080/0020739X.2015.1036946

Trenholm, S., Alcock, L., & Robinson, C. (2016). The instructor experience of fully online tertiary mathematics: A challenge and an opportunity. Journal for Research in Mathematics Education, 47(2), 147–161.

Trenholm, S., Hajek, B., Robinson, C. L., Chinnappan, M., Albrecht, A., & Ashman, H. (2019). Investigating undergraduate mathematics learners' cognitive engagement with recorded lecture videos. International Journal of Mathematical Education in Science and Technology, 50(1), 3–24. https://doi.org/10.1080/0020739X.2018.1458339

Trenholm, S., Peschke, J., & Chinnappan, M. (2019). A review of fully online undergraduate mathematics instruction through the lens of large-scale research (2000-2015). PRIMUS, 29(10), 1080–1100. https://doi.org/10.1080/10511970.2018.1472685

Vickrey, T., Rosploch, K., Rahmanian, R., Pilarz, M., & Stains, M. (2015). Research-based implementation of peer instruction: A literature review. CBE Life Sciences Education, 14(1), 1–11. https://doi.org/10.1187/cbe.14-11-0198

Walcyzk, J., Ramsey, L., & Zha, P. (2007). Obstacles to instruction innovation according to college science and mathematics faculty. Journal of Research in Science Teaching, 44(1), 85–106. https://doi.org/10.1002/tea.20119

Wertz, R. E. H. (2022). Learning presence within the community of inquiry framework: An alternative measurement survey for a four-factor model. Internet and Higher Education, 52, 100832. https://doi.org/10.1016/j.iheduc.2021.100832

Zientek, L. R., Yetkiner Ozel, Z. E., Fong, C. J., & Griffin, M. (2013). Student success in developmental mathematics courses. Community College Journal of Research and Practice, 37(12), 990–1010. https://doi.org/10.1080/10668926.2010.491993