THREE CONSTRUCTS EXAMINED: THEORETICAL FORCES THAT COULD AFFECT RETENTION IN ONLINE COLLEGE CLASSES

By

Ruth Jay Shillair

A THESIS

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Telecommunication, Information Studies and Media - Master of Arts

2013

ABSTRACT

THREE THEORETICAL CONSTRUCTS EXAMINED: THEORETICAL FORCES THAT COULD AFFECT RETENTION IN ONLINE COLLEGE CLASSES

By Ruth Jay Shillair

Online education appears to meet the need of providing educational opportunities for a growing proportion of the population in a way that is economical and scalable. It has become strategic for many institutions of higher education, especially community colleges. These colleges serve populations that are often at greater risk of attrition from classes. Therefore, understanding the theoretical forces that affect student persistence is important to guide policy decisions in all aspects of the administration, design, and teaching of these classes. Previous studies showed strong connections between the theoretical constructs of self-efficacy, usability, and social presence and student persistence in online classes. However, massive and rapid changes in technology acceptance, Internet accessibility, and student expectations call for a reexamination of these constructs to see if they are still key factors in student persistence. Students in the online writing and English classes (N=706) of a large urban community college were invited to participate in a survey, and the completed surveys (N=49) were analyzed to look for correlations between self-efficacy in online education, usability of the Learning Management System, and social presence among students who persisted in the classes (N=43) and those who dropped (N=6). In contrast to previous research, little difference was found between persisters and non-persisters on any of the constructs studied. Qualitative analysis of comments found varying frustration levels in all three areas, even among students who were persisting. The decision to drop or persist appears to be closely tied to strategic choices made by the students.

Copyright by RUTH JAY SHILLAIR 2013

DEDICATION

To my patient and supportive husband Tom: thank you for all you do, and most of all for who you are. Also, to my children and their spouses as well as my grandchildren: thank you for your prayers and encouragement. Hugs to all.

ACKNOWLEDGEMENTS

I owe a great deal to my advisor, Dr. Johannes Bauer. Despite his incredibly busy schedule and global travels, he was always so kind as to take the time to share great insights into my research area and discuss upcoming trends. I am thankful that he taught me that economic philosophies and issues underlie almost all decisions, especially in education. I am grateful to Dr. Carrie Heeter, who, while being amazingly gentle and encouraging, gave me a passion for understanding usability and seeing the power of deep qualitative analysis. I am also very grateful to Dr. Constantinos Coursaris, who helped me understand the potential of analyzing data in multiple ways to tease out previously unseen connections. Finally, thank you to Rachel Iseler; she always seems to know exactly whom to talk to and what to do at each step of the way.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES

INTRODUCTION
    Purpose of this research
    Background of the Issue
        Growing need for higher education
        Strategic Nature of Online Education
        Promise and Problems with Online Education
    Importance of Student Retention
    Past research on persistence in online classes
    Questions for Research
    Basic Factors for Measurement
        Student Self-Efficacy in the Online Educational Environment
        Usability of the Educational Interface
        Social Presence in the Virtual Classroom
    Common Theories of Student Persistence

METHODOLOGY
    Research Design
        Choice of Institution
        Choice of Sample Population
    Instrumentation
    Survey Questions
    Human Subjects approval
    Data Analysis

RESULTS
    Overall Demographic Information
        Demographics of all participants
    Demographics and Persistence
        Grade Point Average
        Ethnic/Racial Background
        Gender
        First Person
        Distance
        Class Enrolled
    Self-Efficacy in Online Classes
    Usability of the Learning Management System (LMS)
    Social Presence Measures
    Overall Satisfaction Levels
    Student Reported Reasons for Dropping
    Qualitative Analysis of the Comments
        Demographics of Commenters
        Comments about Social Presence of the Instructor (Communication)
        Comments about Self-Efficacy (or expressing concern about lack of efficacy)
        Comments about Usability of the System as utilized by the instructor and Comments about Usability of the LMS
        Comments about Support Services
        Comments about Social Presence of fellow students
    Other Findings

LIMITATIONS

DISCUSSION
    Persistence and Self-Efficacy
    Persistence and Usability of Instructional Interface
    Persistence and Social Presence
    Further Discussion
    Suggestions for further research
    Final Remarks

APPENDICES
    APPENDIX A: COPY OF THE SURVEY INSTRUMENT
    APPENDIX B: COPY OF THE CODEBOOK

WORKS CITED

LIST OF TABLES

Table 1 Approximate GPA of all participants
Table 2 Ethnic/Racial Background of Participants
Table 3 Citizenship
Table 4 Gender of Participants
Table 5 First person in family to attend college
Table 6 Still Enrolled in class
Table 7 Distance from College
Table 8 Class Enrollment
Table 9 Self-Efficacy in Online Classes bivariate correlations
Table 10 Usability of the LMS Questions Correlations
Table 11 Usability per instructor's utilization of the LMS
Table 12 Social Presence of Fellow Students Correlations
Table 13 GPA of Commenters
Table 14 Gender of Commenters
Table 15 Mean Distance in Miles from Commenters by category
Table 16 Categories of Comments
Table 17 Satisfaction with Instructor's Social Presence
Table 18 Commenter's Self-Efficacy Scores
Table 19 Comments About Usability of the System
Table 20 Satisfaction with Instructor's Utilization of the LMS
Table 21 Satisfaction with Student Services (Computer Help, Tutoring, and Writing Center)
Table 22 Code for Quantitative Analysis of Responses

LIST OF FIGURES

Figure 1 Computer Mediation Points
Figure 2 Shannon-Weaver Theory Applied to Online Education
Figure 3 Approximate GPA
Figure 4 Ethnic/Racial Background
Figure 5 Citizenship
Figure 6 Gender of Participants
Figure 7 First Person in Family to Attend College
Figure 8 Still Enrolled in Class
Figure 9 Distance from College in Miles
Figure 10 Class Enrollment of Participants
Figure 11 Social Presence of the Instructor
Figure 12 Class Satisfaction Levels
Figure 13 GPA of Commenters
Figure 14 Gender of Commenters
Figure 15 Distance in Miles of Commenters
Figure 16 Satisfaction with Instructor's Social Presence
Figure 17 Commenter's Self-Efficacy Scores
Figure 18 Comments About Usability of the System
Figure 19 Satisfaction with the Instructor's Utilization of the LMS
Figure 20 Satisfaction with Student Services (Computer Help, Tutoring and Writing Center)
Figure 21 Students Who Use a Mobile Device to Access Online Class
Figure 22 Students Who Use Mobile Devices Also Use
Figure 23 Number of Devices Used to Access Class
Figure 24 Desired Instructor Response Time

INTRODUCTION

Purpose of this research

The growth and ubiquity of computers and the digital revolution have made profound changes in almost every aspect of daily life. As a result of growing mechanization and the use of robotics during the mid to late twentieth century, the labor force shifted from production positions in manufacturing to professional, technical, and service workers. This change accompanied a massive investment by businesses in Information and Communications Technology (ICT). Business investment in computers, information technology, and software grew at double-digit rates from the 1950s through the late 1990s. For example, in 1996, business expenditures for ICT were $29,200 per worker in the telecommunications industry; this compares to an investment of only $7,600 per worker for real estate and office space during the same period (Fisk, 2003). By 1999, the professional, technical, and service sector employed 78% of all workers in the United States (Fisk, 2003). Mechanization, computers, and robotics continue to change the modern workplace. As discussed by the Massachusetts Institute of Technology (MIT) economists Acemoglu and Autor, the rapid diffusion of new technologies has produced a tremendous shift in the demand for certain jobs, with low-skill workers suffering a loss of opportunities while the remaining positions have experienced "significant declines in real wages" (Acemoglu & Autor, 2010). As machines replace even more low-skilled positions, it is of growing importance to find ways to improve education to better utilize human capital. Therefore, having a well-educated populace is strategic not only for the individuals who otherwise face a lifetime of limited opportunities, but also for the future of a community or a nation (Means et al, 2010; Beerkens, 2003).
As a result, a larger percentage of the population is entering the higher educational system, including those who, demographically or academically, were not traditionally college-bound. At the same time that educational institutions are faced with meeting this burgeoning population, they are also facing challenges in controlling costs. Institutions often meet these dual challenges by offering classes in the online environment; however, the very students who must negotiate these innovative learning spaces, the non-traditional college students, have the lowest retention rates in online classes (Herbert, 2006). Therefore, it is important to look closely at ways to better understand how to 1) develop systems that attract and engage students, 2) design interfaces that not just enable, but enhance, learning, and 3) direct educators toward effective pedagogical practices in online education. To better understand the forces that affect student retention in online classes, a measurement instrument was developed to ascertain the impact of three theoretical constructs that correspond to each of the three major foci of development, design, and direction. To better develop systems that attract and engage students, it is essential to understand their levels of confidence and comfort in working on educational materials in an online format; therefore, self-efficacy in online learning will be measured as part of this research. Secondly, toward designing an interface that enhances learning, the students' evaluation of the usability of the current learning management system, as well as the usability of the instructors' choices in the utilization of that interface, will also be measured. Thirdly, the students' perception of direction and communication, otherwise known as social presence, between the students and the instructor as well as among the students and their classmates will be measured. Hopefully, by evaluating these relationships, new insights can be gleaned, and new innovations and policies that help improve persistence in online classes can be developed.

Finding solutions to meet the critical needs in education by utilizing technological innovation will require cross-discipline cooperation and thorough research. Institutions and government entities should be careful to make policies that are founded on solidly researched principles, to assure that crucial learning objectives are still met while harnessing the power and potential of Information and Communications Technology (ICT) in education. The first step in this multi-faceted process is to examine the forces that encourage student retention in online classes.

Background of the Issue

Growing need for higher education

Because business and industry face a growing need for a population with advanced educational skills, the number of students enrolling in college-level classes is projected to increase. Enrollment in institutions of higher education increased by 37% between 2000 and 2010; by 2010 there were 21 million students enrolled in colleges and universities (U.S. Department of Education, 2012). According to the National Center for Education Statistics, in the 2009-2010 academic year over 940,000 bachelor's degrees were granted in the United States, and that number is projected to increase to 1,160,000 by 2021 (National Center for Education Statistics, 2011).
To meet this increase in demand would normally require a massive investment in expanding the basic infrastructure of universities across the nation. Yet at the same time that higher education is more crucial than ever, most governmental entities are cutting funding. A study by the National Association of Student Financial Aid Administrators found that even though enrollments in higher education had grown by 12.5% during 2008-2012, state and local funding dropped by 7% in 2010 and a further 3.7% in 2011 ("State and local", 2012). These rates of declining support, combined with the increases in enrollment, have reduced state and local support per full-time-equivalent student to its lowest level in 25 years ("State and Local", 2012). Reductions varied across states, but the draconian cuts in some areas came as a result of declining state revenues. As an example, funding in the state of Minnesota was reduced by 35% from 2000-2010, while the national average was a 20% decline (Hawkins, 2012). This puts a greater burden on the student to cover the increasing gap between governmental support and the actual costs of providing an education.

Strategic Nature of Online Education

In an era of tight budgets and decreasing governmental support, efficiency and optimal utilization of resources are key to being able to offer higher education to an increasing percentage of the population at a reasonable price. According to a report sponsored by the Bill and Melinda Gates Foundation, to meet the projected need for an additional 1,000,000 college graduates by the year 2020 at today's level of degree productivity, the government would have to invest at least $52 billion more in higher education per year (Auguste et al, 2010). However, in this era of tight budgets, even though higher education is a critical and strategic investment, even the increases needed to merely maintain the status quo are not likely. Therefore, it is imperative for institutions to find ways to improve productivity; as Auguste et al (2010) emphasize, the goal is "to produce more graduates for the same total expenditures without compromising the quality of degrees awarded or reducing access" (Auguste et al, 2010). A potential key to this improvement in productivity could be more extensive use of online classes: the analysis sponsored by the Gates Foundation found that online education could be up to 48% more cost effective than its traditional classroom counterpart (Tutty & Ratliff, 2012).

To meet the increased student load, many educational institutions are already putting more classes into the online environment. The online format is scalable and flexible. It allows more classes to be added within a fairly short time and with little capital outlay, yet it also allows college administrators to reduce capacity quickly if demand drops. For example, the administrative nightmare of consolidating several low-enrollment traditional sections that meet at various places and times is simplified if the sections are online. Also, since online students study the material when and where they want, scheduling classes for optimal times or locations becomes a moot point. Therefore, many universities see online classes as having high strategic importance.
The Babson research group found that 65.5% of the institutions surveyed agreed with the statement, "Online education is critical to the long-term strategy of my institution" (Allen & Seaman, 2011). Most universities now offer online classes; the Pew Internet and American Life Project found that 89% of four-year public universities offer online classes and 91% of two-year private and public colleges offer online sections (Taylor, Parker, Lenhart & Patten, 2011). Online education is also growing in acceptance by students: in 2010, in the United States, there were "over 6.1 million students taking at least one online course... an increase of 560,000 from the previous year" (Allen & Seaman, 2011).

Promise and Problems with Online Education

For some students, especially those who work part time or have family responsibilities, the online environment offers many advantages. Students can go over lecture material and do assignments at times that are convenient to them. Also, the computer-mediated learning environment allows students to repeat lecture points as needed for personal review. Often the same professors teach both the traditional face-to-face classes and the online sections, so the students have the opportunity to learn from the same instructor and cover the same material without the difficulties and inconveniences they may face in coming in to a traditional class.

Younger undergraduate students usually feel comfortable with technology; they are sometimes called digital natives, since they have grown up with the Internet, computers, smartphones, and other forms of computer-mediated communication (Prensky, 2001). It is estimated that by the time students graduate from college they have spent about 10,000 hours of their lives playing video games and only 5,000 hours reading; one might conclude that these students may actually learn better in the digital environment (Prensky, 2001). The other tremendous opportunity that online education brings is that it can reach beyond geographic boundaries and allow students to continue their studies even if they are physically located far away from the institution. This medium can therefore offer institutions the potential to reach new market segments. Especially given the need to offer continuing educational opportunities to large portions of the population, using technology to provide information in the online environment can bring solutions that are potentially effective, economical, and scalable (Kenney, Hermens, & Clarke, 2004).

Given that online classes offer scalability, economy of delivery, the possibility to control costs, and the ability to reach students who have time constraints or other commitments, it seems as though online instruction would be welcomed and embraced by institutions and students alike, especially since careful meta-analysis of studies shows that the potential for learning outcomes from online or hybrid sections is the same as that of traditional face-to-face classrooms (Corey et al, 2012). Furthermore, direct studies also found that outcomes could be similar to those of traditional classroom instruction (Fortune, Spielman & Pangelinan, 2011). Yet, despite the potential advantages of online classes, outcomes are often rather disappointing, with online students much more likely to drop classes (McFadden, 2009).
The rate of attrition can be significantly high: a four-year study of 51,000 community college students in the state of Washington found an 8% lower completion rate for online sections, and this gap rose to 15% if the student took remedial courses (Brown, 2011). Other studies show retention rates as low as 20% of those originally enrolled, with college administrators surveyed estimating that online sections have retention levels 10-20% lower than similar face-to-face sections (Tutty & Ratliff, 2012). This phenomenon occurs at all levels of higher education, surprisingly even at the graduate level. For example, a study of MBA students at a major university found that some sections had a drop rate four times that of the same face-to-face class (Patterson & McFadden, 2009).

Importance of Student Retention

Student retention and completion is a serious concern; the National Center for Education Statistics reports that only 36% of enrolled college students at four-year institutions complete their bachelor's degree within four years, and even when that window is extended to six years, only 57.5% are able to graduate (National Center for Education Statistics, 2007). The non-completion rates grow even worse when those entering two-year institutions are counted in the analysis. These institutions tend to serve non-traditional students: returning adults, minorities, and the disadvantaged; approximately 42% of their student body are the first in their family to attend college, so their mission to bridge the gap and help make education accessible is crucial (Clay, 2012). A report published by the National Student Clearinghouse Research Center in November 2012, analyzing over 600,000 students across the nation, found that of those who started at a two-year institution, after six years only 23.9% had completed an associate's degree at that institution, and only an additional 9.4% had completed a degree at a four-year institution after getting an associate's degree (Shapiro et al, 2012). This issue has sparked concern in both public and private arenas, to the point that the 2010 U.S. Department of Education budget included a $2.5 billion, five-year program called the Access and Completion Incentive Fund to help find and support new initiatives that help students, particularly the disadvantaged, to complete college (The White House, 2010).

Student attrition is not only problematic at the national and regional levels; at the institutional level it can also cause serious financial loss. Many institutions have enrollment thresholds at which they will run a class; if enrollment is below that threshold, the class is not profitable to run and will be cancelled (McDonald, 1995). If students enroll in online sections and commitments are made to run the class, and subsequently a significant number drop the class within the drop/add period, the institution must refund those students' money and run the section at a loss. Another consequence of the last-minute shuffling and cancellation of classes is that students are unable to complete their degrees in two or even three years. There are concerns that the cancellation of classes might be a contributing factor to community colleges' low graduation rates (Schneider & Lin, 2012). It is to every educational institution's advantage to encourage students who enroll in classes to stay enrolled and successfully complete the academic goals within those classes.
By discovering some of the key characteristics of the experiences of students who persist in online classes, interventions could be put in place to help students persevere (Parker, 1999). Also, for the individual who chooses to drop an online class there are multiple levels of "cost". Students who enroll in an online class and then choose to drop it before completing the material may lose tuition, fees, or, even more importantly, time. They may have to wait until the next semester to attempt a required class again: "The longer they wait to graduate and get a job, those are extra years of their careers...not making money" (Lukerson, 2013). Logically, the higher the frustration a student faces in attempting to complete a degree, and the longer the time elapsed until completion, the more likely the student is to drop out of any educational program entirely. Those with incomplete degrees face a harder time competing for jobs or getting a better position.

Past research on persistence in online classes

Since college completion is a serious issue, particularly for students from non-traditional backgrounds, finding ways to help improve retention and completion of coursework is important. Also, since online classes are strategically significant for many institutions, improving retention and learning outcomes for all students is of particular concern. There are many studies done at the macro level, looking at societal trends, cultural and pedagogical changes, and demographic and economic changes that are influencing overall college retention rates (Shachar & Neumann, 2010; Thomas, Cooper & Quinn, 2003; Hermanowicz, 2003; Moxley, Najor-Durack, & Dumbridge, 2001; Crosling, Thomas, & Heagney, 2008; Welsh, 2007; Mancuso, 2008). However, the pool of research looking specifically at online classes and the reasons why certain individuals persist while others in the same class drop is much smaller. Even more difficult to find is research examining the students facing the highest risk of dropping out: community college students taking online classes (Muse, 2003). Research in this area is important because community college students are at the highest risk of attrition, and online sections have a lower rate of persistence (Nakajima, Dembo, & Mossler, 2012). Another reason to research online student retention at the community college level is that about 50% of all online college classes are offered through community colleges (Johnson & Berge, 2012). Therefore, focused research on this demographic of students and how to improve their retention is needed. Yet looking simply at the demographic risk factors for student retention in online classes gives an incomplete picture of possible solutions to improve retention and learning. There is also a need to research students' perceptions of the design and usability of the online system itself, and how the system is utilized in practice.

Online education, in its infancy, was simply treated as a new mode for the delivery of distance education, where lessons and learning material were mailed to the student and the student would complete the materials individually and mail the results back to the instructor (Shachar & Neumann, 2010; Bramble & Lu, 2011). The initial advantage of the online mode was simply speed: delivery and response times were shortened from weeks and days to hours and minutes.
The basic pedagogy of most instructors did not change with online systems: a series of lessons was prepared and posted in the students' content management system, and the student would work through the checklist of activities to complete the class (Steinbronn, 2007). Additionally, early systems for delivering online educational content were often designed by universities primarily as an efficient way to simply offer traditional class material for retrieval by students in an online environment (Coates & Baldwin, 2005). Later these systems were expanded for use as entire class delivery systems. Yet even something that appears as straightforward as this scenario can have complex implications when serving a critical function, such as working as a conduit for higher education (Coates & Baldwin, 2005). Some of the early reported points of frustration with, and attrition from, online classes were difficulties with the technology and the inability to easily access information (Shrank, 2009). However, as computers have become more ubiquitous and Internet access has improved, these problems of basic access should be re-examined, particularly to see if they are still barriers to at-risk students, since students' skills, comfort with, and expectations of technology have changed rapidly (Roberts, 2005).

A further possible factor affecting student retention in online classes is the pedagogical practice of the instructors. Even with massive changes in the availability of technology, the growth of adoption, and the capabilities of the online environment, there is often little overall change in how most online classes are administered from the standpoint of the instructor (Ray, 2009). Instructors commonly use pedagogical practices that have been used for millennia in traditional instructional spaces, rather than radically altering the design of their instruction to a new paradigm, such as an andragogically based format that would give more responsibility to learners in online environments (Gibbons & Wentworth, 2001). This lack of pedagogical change is often not neglect on the part of the instructors, but due to the fact that training for teaching in online environments usually consists purely of the technological "how to use" of the interface, rather than instruction on effective pedagogy in these new environments (Gerard et al, 2011; Ray, 2009; Bailey & Card, 2009). Also, a growing body of research in Information and Communications Technology (ICT) and Human-Computer Interaction (HCI) is starting to better understand how humans interact in these spaces, and how computer-mediated communication (CMC), even though in many ways the same as more traditional forms of communication, is fundamentally different (Kiesler, Siegel, & McGuire, 1984; Olson & Olson, 2003). The different aspects of student demographics and preparation, design of the interface, and usability of the computer system utilized for class delivery should be carefully examined to look for ways to improve usability and "bake in" effective pedagogical practices in the design and implementation of the systems used to develop and administer online education. This is even more crucial for disadvantaged or non-traditional students, as they may be less familiar with educational expectations; with improved design they could potentially not only persist in classes, but flourish and succeed.
Another growing issue that needs to be carefully examined is the increased use of mobile devices by students. These devices radically change the visibility and usability of systems designed for desktop use. Designing learning and content management systems that work robustly on mobile devices could better engage students and greatly improve retention, even among at-risk student populations. Therefore, this research attempts to fill that gap by looking holistically at students as they come into the online class, their interaction with the computer interface, their interaction with the instructor, and their interaction with each other. Most previous research has looked primarily at one aspect at a time, such as student demographics (Welsh, 2007; Patterson & McFadden, 2012; Hart, 2012), satisfaction levels (Muse, 2003; Auguste et al, 2010), engagement levels (Saltmarsh & Sutherland-Smith, 2010; Sundar & Marathe, 2010; Artino, 2009), or student motivation and preparation (Mancuso, 2008; Huckabee, 2010; Welsh, 2007). This research attempts to include all of these important aspects: students' sense of being capable of handling the online learning environment, often defined as self-efficacy; students' sense of the usability of the interface; and their sense of connection with classmates and the instructor. The findings have the potential to help build bridges as both computer interface developers and educators work across disciplines to generate solutions that truly enhance and facilitate online education.

Questions for Research

After a review of relevant research in this area, several basic constructs were found to frequently be factors in student retention; however, these constructs had not been studied in conjunction on the same test population. These three theoretical constructs revolve around self-efficacy, usability, and social presence. The specific research questions are: 1) if students have a higher sense of self-efficacy in online learning, will they be more likely to persist in online classes; 2) if students feel that the computer interface is favorable in usability, are they more likely to persist in the class; 3) if students feel the instructor has integrated the course materials in a usable format, are they more likely to persist; 4) if the students feel a sense of social presence with their instructors, are they more likely to persist; and 5) if the students feel a sense of social presence with their fellow classmates, apart from their instructor, are they more likely to persist? Other issues examined in this research are demographic information (e.g., gender, GPA, and type of device used to access the class) and how these aspects may also be factors in student retention. To answer these questions, current literature was carefully reviewed to incorporate current research findings, classic measurement instruments were utilized and adapted to develop a measurement tool, and a location was chosen to measure these forces in the target population. The hope is to reach a population that is often overlooked in research studies, the community college student, in order to gain insights to design solutions and develop policies that will enhance the learning process and improve retention levels of even the most at-risk student populations.
This research seeks to examine the following basic hypotheses:

H1: Students with a high sense of self-efficacy will be more likely to persist in the class.

H2: Students who feel the Learning Management System (LMS) is usable are more likely to persist in the class.

H3: Students who feel the instructor has utilized the affordances of the LMS are more likely to persist in the class.

H4: Students who have a high sense of social presence with the instructor are more likely to persist in the class.

H5: Students' sense of social presence with their fellow students will not be significant in their choice to persist in the class.

Basic Factors for Measurement

The first step toward creating designs that will enhance, not just allow, online education is to understand the current reasons why students drop online classes and why other students persist in those same classes. By looking at the differences, areas can hopefully be pinpointed where changes can be made to help improve student retention. Students who enrolled in and dropped an online class at any point in the process will be surveyed; additionally, students who are persisting in those same classes will be surveyed to measure how they perceive their experience in the class. The answers to the surveys will be closely examined to look for the different factors that might be indicative of persistence or dropping.

There are several key points where ICT is used to interact with the student. These are the critical points where communication and interaction are strongly affected by the computer-mediated environment, the design of the interface, and the utilization of that environment. Specifically, these are the interfaces between: the student and the instructional material, the student and the instructor, the student and other students, the instructor and the instructional material, the instructor and the students, and the other students with both the instructional material and the instructor. These interfaces are the points where any weaknesses in communication will cause the process of learning to be less than optimal.

[Figure 1: Computer Mediation Points. A diagram of the computer-mediated connections among the instructor, the instructional activities, the student, and classmates. For interpretation of the references to color in this and all other figures, the reader is referred to the electronic version of this thesis.]

These points closely correspond to Shannon and Weaver's classic model of communication, where "noise" marks the points at which the encoding and decoding process of communication is most likely to be degraded (Shannon & Weaver, 1949). In Shannon and Weaver's model, the encoding and decoding concepts dealt with telecommunication issues and the loss of data in the transmission of telephone signals. In the online classroom environment there are similar points of "noise" or interference where communication is likely to break down and the student is more likely to disengage from the class. Using this medium, the online class, takes extra effort to overcome the "noise" and maintain the communication link needed to learn the material. In this research, three critical points where the student is likely to face frustration, and where student attrition may increase, are closely examined using the theoretical constructs that are the basis of this research.
These critical measurement points include: the student's self-efficacy in the online environment as they come into the class; the usability of both the interface itself and of how the instructor has utilized this technology; and the social presence, or connection, between the student and the instructor as well as between the student and their fellow students.

[Figure 2: Shannon-Weaver Theory Applied to Online Education. A diagram linking the student, the computer-mediated Learning Management System, and the instructor/fellow students, with "noise" at each link: self-efficacy of taking online education (confidence in the ability to do online education and to work with the technology); usability (the ability to perform tasks associated with the class work); and social presence (the connection to the instructor and fellow students).]

In designing an instrument for this research that could effectively measure the critical forces affecting retention in online environments, current literature and research in this area were carefully examined. Indexed peer-reviewed materials were gathered to look for thematic insight into potential causes of attrition and to find potential solutions for improving retention.

Student Self-Efficacy in the Online Educational Environment

The initial point of possible "noise," or difficulty to overcome for the online student, is the use of technology for the purpose of education. The student might face difficulty in utilizing the technology; this is potentially problematic for non-traditional college students or those from disadvantaged backgrounds. There are concerns about the digital divide, where those from different societal backgrounds have widely different skill sets in utilizing the Internet and technology for gathering information, and that this divide might further marginalize those who need access to educational material (Van Deursen & Van Dijk, 2010). Even with rapid changes in access to the Internet across all spectrums of society, there still are significant differences in strategic Internet access skills, especially among those with lower educational levels (Van Deursen & Van Dijk, 2010). Research into generational differences in navigating Internet usage for gathering information found that, after the initial learning of basic skills such as email and search, there were few conclusive differences between those who grew up using digital devices and those who learned later in life (Salajan, Schonwetter, & Cleghorn, 2010). These findings, along with other research, hint that difficulties in access are complex and often related to educational background and economic conditions (Van Deursen & Van Dijk, 2010; Salajan, Schonwetter, & Cleghorn, 2010; Prensky, 2001). Therefore, the student who enrolls in an online class needs a special skill set to utilize all the technology required to successfully interact in the class, which includes: the computing device, Internet access, navigating the content management system, and logging into any necessary components of the class.

Previous research in overall student retention has discovered that one of the key elements is the student's sense of self-efficacy (Wang & Newlin, 2001; Hodges, 2008; Yi & Im, 2004). Wang and Newlin's (2001) meta-analysis found a strong correlation between self-efficacy and student persistence, even when studied at different achievement levels, on different topics, or with different models of research.
Additionally, a more focused literature review that addressed persistence in online and distance education at the tertiary level found several key indicators of persistence: comfort with online coursework, the flexibility of the online environment, commitment to goals, GPA, feedback and interactions that felt meaningful, relevance of course material to the student's life, self-efficacy, social presence, and support (Hart, 2012). Of all of these factors, the one most commonly mentioned in the literature reviewed was that of self-efficacy. Research specifically looking at the community college student demographic also found that self-efficacy, which was closely correlated with students' overall GPA, was a key predictor of student retention (Nakajima, Dembo, & Mossler, 2012).

Even though self-efficacy is obviously important in student retention, it is a broad concept that needs to be carefully defined and operationalized in order to measure it for this particular research. According to Bandura's (2006) definitions of self-efficacy, a student who felt capable of handling the technical skill set needed for functioning in the class would have self-efficacy in this area. This is not a measure of actually doing something; it is rather the self-perception that the subject has the capability to achieve a task (Bandura, 2006). Therefore, to guide the design of a tool that would measure self-efficacy accurately, Bandura's (2006) guidelines for constructing self-efficacy scales were utilized, with questions specifically tailored toward self-efficacy in online education and the specific challenges faced in that environment. However, even though the impact of self-efficacy in online environments is well proven, it is not usually combined with other constructs to see whether this attitude remains significant in persistence when measured at the same time as other crucial constructs.
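Operationalizing a construct in this way typically reduces, in practice, to scoring a block of Likert-type survey items into a per-student composite and checking that the items hang together. The following is a minimal sketch in Python of one common approach; the item wording, responses, and reliability check are illustrative assumptions and are not drawn from the actual survey instrument used in this study.

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Internal-consistency (reliability) estimate for a block of
        Likert items; items is a respondents-by-items matrix of ratings."""
        k = items.shape[1]                         # number of items in the scale
        item_vars = items.var(axis=0, ddof=1)      # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical 1-5 ratings on four self-efficacy items (e.g., "I am
    # confident I can submit assignments through the LMS"), one row per student.
    efficacy_items = np.array([
        [4, 5, 4, 4],
        [2, 1, 2, 3],
        [5, 5, 4, 5],
        [3, 3, 2, 3],
    ])

    composite = efficacy_items.mean(axis=1)  # per-student self-efficacy score
    print("composite scores:", composite)
    print("Cronbach's alpha:", round(cronbach_alpha(efficacy_items), 2))

A separate item block of this kind can be scored for each construct in an instrument like this one (self-efficacy, usability of the LMS, the instructor's utilization of the LMS, and social presence), yielding one composite per construct per student.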
Usability of the Educational Interface

The second key point of interaction and potential for "noise" is the computer interface itself and how the instructor utilizes this interface. The computer interface that educational institutions commonly use to manage educational content is called either a Content Management System (CMS) or a Learning Management System (LMS). Some researchers are rather adamant about the basic design differences: CMSs simply facilitate the instructor's ability to place course materials online, monitor student performance, and set a framework for communication between the student and the instructor as well as the student and his or her classmates (Watson & Watson, 2007). LMSs, on the other hand, are geared more toward closely following the learning process, with monitors to measure whether individual or institutional goals are being met (Watson & Watson, 2007). However, in the general literature these terms are used interchangeably, and even major systems such as Blackboard, while referred to in academic literature as LMSs, identify themselves as CMSs in their own literature (Watson & Watson, 2007). No matter how it is defined, most institutions use some sort of CMS or LMS for students to access class content for all of their classes: traditional, hybrid, or online. This system is the default, and sometimes the only, system used by both faculty and staff to post class content and monitor student activity.

The usability of the system itself is one of the keys to the ease with which the student can access the information needed to succeed (Shrank, 2009; Minocha, 2009). The use of information and communications technology (ICT) for accessing educational material is essential to the ability to participate in online education. Usability is also key to lowering frustration levels: a Nielsen Norman Group study (2010) of college students from selective universities around the world measured their ability to use websites for specific tasks and found that even these highly educated users were likely to pass over cumbersome or difficult-to-use sites and lost patience quickly. Even though the college students were very goal-oriented in their use of the Internet, when faced with a site that was difficult to "decode", they would move on (Nielsen, 2010). Additional research on the usage of LMSs in online education by those less adept at dealing with ICT concluded that students with limited ICT backgrounds were able to learn the LMS if they had personal help to get past critical error points. Overall, however, a lack of ICT skills, especially in dealing with the LMS, was seen as an issue that could potentially hamper efforts to be more inclusive in online education (Pretorious & Van Biljon, 2010).

Therefore, another important factor that needs to be operationalized and measured is the usability of the LMS interface itself. Since all of the interactions that normally take place within a classroom or between classmates are done in a computer-mediated environment, the ease of using that interface affects student frustration levels and can potentially contribute to a decision to drop a class. However, determining clear measures for usability is challenging, because there are many aspects to consider when examining a particular LMS or CMS product (Joo & Lee, 2011). Yet despite the wide variety of possible aspects to consider, some of the most universally adopted measures are those developed by Jakob Nielsen (1994) in explaining the basic heuristics that are essential for ICT usability. The specific publication that guided the usability design of the measurement instrument for this research study is his work on college students and the Web (Nielsen, 2010).

Yet measuring the usability of the LMS or CMS system itself is not sufficient for understanding its effectiveness in learning, because even the best-designed systems are not effective if their affordances are not used as designed, or if the organization of the class is not developed to the level where it is useful for the target audience. Several research studies have looked at how the LMS affects the pedagogical choices of instructors. One team of researchers said, "LMS are not pedagogically neutral technologies, but rather through their very design, they influence and guide teaching. As the systems become more incorporated into everyday academic practices, they will work to shape and even define teachers' imaginations, expectations and behaviors" (Coates, James, & Baldwin, 2005). These researchers are concerned about the tendency of institutions and instructors to simply adapt their teaching pedagogies to the already developed systems, rather than to take pedagogical "best practices" and design systems that enhance the learning process.
Furthermore, the technologies used within the discipline of teaching deeply affect institutional policies: "these kinds of technologies are productive of the cultural practices, institutional ethos and broader educational discourses within whose terms the academic self is in turn produced" (Saltmarsh & Sutherland-Smith, 2010). Many students have been enrolled in more than one online class; therefore, some have experienced how instructors effectively or ineffectively utilize the LMS to optimize the affordances of the environment. Accordingly, several questions on the research instrument probe the student's perceptions of usability as it pertains to the instructor's use of the LMS for the class.

Social Presence in the Virtual Classroom

The concept of social presence in the online class environment is of great interest in recent studies and is well established as an important factor in student retention (Swan et al., 2012; Russell & Curtis, 2012; Crim, 2006; Gunawardena & Zittle, 1997; Huckabee, 2010; Dabbagh & Kitsantas, 2012; Daniels & Stupnisky, 2012; McKerlich et al., 2011; Brinthaupt et al., 2011). Social presence, or immediacy, can be fostered either through interactions with the instructor (e.g., e-mail, responses to communications) or through interactions with fellow students. Social presence or immediacy with the instructor has frequently been seen as a critical factor in student satisfaction and engagement in online classes (Johnson & Card, 2007; Crim, 2006; Gunawardena & Zittle, 1997). The sense of "temporal immediacy," a timely response to questions, comments, and assignments, is seen as key to a connection that goes beyond the actual words or feedback exchanged (Johnson & Card, 2007). With respect to student-to-student interaction, an instructor can design a learning environment that encourages and supports student-to-student discussion, guide by example, and act as a facilitator to build an overall sense of social presence (Ng, Cheung, & Hew, 2012). On the other hand, the social presence that could potentially be built between students, according to research by Ke (2012), has not been found to be a strong factor in "student knowledge construction"; therefore, following Self-Determination Theory, this aspect of social presence would probably not be a significant factor in student persistence. Yet however it is measured, the sense of social presence in computer-mediated environments such as online classes has been found to account for about 60% of the variance in student satisfaction (Gunawardena & Zittle, 1997).

Common Theories of Student Persistence

In looking at theories of student persistence, the most commonly used constructs are Bean's Model of Student Departure and Tinto's Student Integration Model (Cabrera, Castaneda, & Nora, 1992; Braxton, Milem, & Sullivan, 2000; Wilging & Johnson, 2004; Osborn, 2000). From Bean's model, the four primary variables that predict student attrition are: poor academic performance, often closely connected to low achievement levels in high school; intent to leave, often determined by low satisfaction levels and lack of utility; background, which includes demographic variables such as age, gender, and enrollment status; and environmental variables, such as finances, outside work responsibilities, and family responsibilities (Bean & Metzner, 1985).
Additionally, many student retention policies are developed around Tinto's Student Integration Model, which sees student integration (positive social interactions with staff and fellow students, as well as positive institutional experiences) as strongly contributing to student retention (Cabrera et al., 1992; Wilging & Johnson, 2004; Welsh, 2007; Achiles et al., 2011; Nakajima, Dembo, & Mossler, 2012; Muse, 2003). Two of the constructs under study in this research, self-efficacy and usability, are closely tied to elements of Bean's model. A student's prior academic performance, in this case completing classes in an online environment, gives a measure for the perception of self-efficacy. Also, the usability of the LMS interface, and how it is utilized by the instructor, is closely tied to the psychological outcomes that influence the "intent to leave" (Bean & Metzner, 1985). The third construct under examination, social presence, is closely tied to Tinto's model, which includes social integration and positive interactions with both faculty and fellow students (Achiles et al., 2011; Herbert, 2006). Other factors, such as academic performance (GPA), distance from the college, gender, and being the first person in the family to attend college, will also be examined.

METHODOLOGY

Research Design

In designing this research study, there were several key elements to choose in order to better understand how the constructs of self-efficacy, usability, and social presence affect student retention in online instruction. Previous studies in online education were primarily done on traditional students attending four-year research institutions; these institutions often have different student demographics than community colleges, so their data may not represent the special needs of the non-traditional student (Nakajima, Dembo, & Mossler, 2012). These studies may also be biased in that their student populations had already demonstrated high enough academic achievement and self-efficacy to matriculate at a selective institution. Also, these institutions would probably have a student population that was very comfortable with technology, so usability would not be such a crucial hindrance to student retention. Finally, previous studies were frequently done at residential institutions, where online social presence would not be as critical to helping the student feel connected to both the institution and the class (Nakajima, Dembo, & Mossler, 2012). Therefore, even though it would be more challenging to get access to a student population from a non-research institution, selecting a community college for the sample population provides a test group that might not have high levels of self-efficacy. Along with the importance of selecting an institution with a student population more likely to find online classes challenging, it was important to find specific classes to study that would have a diverse student population. Previous studies focused on the interactions of one or two classes in highly specialized fields might deal with student populations that are very familiar with working in various computer environments; such students would not be as deeply affected by usability as students less familiar with using ICT for education. Therefore, sampling student experiences from a general education class that is required of all majors would yield a diverse and representative sample from the entire student body.
Another advantage of using a class that is required for all majors is that it avoids majors that have a built-in sense of community and social presence that goes beyond the actual classroom because of a shared interest in a particular field. It was determined that the best selection of classes for research would be the online Writing and English classes. These are required for all majors and are needed to transfer to any four-year institution. They are also highly interactive classes that benefit greatly from a sense of social presence, as students revise papers based on feedback from the instructor and peer review.

Choice of Institution

A large Midwestern community college in Michigan was selected for this research. This institution is ideal for research targeting a diverse student body for many reasons: it is located in a city that had large numbers of people formerly employed in auto manufacturing, many of whose plants closed, so adults are returning for education in different fields; it is in an urban area and has a significant minority population (17.7%); and, with an enrollment of approximately 20,000 students in 2012, it has a large student body (Michigan Department of Treasury, 2012). Since this community college has a high number of non-traditional students, they often face the challenges of working to support a family, caring for children or other family members, and living a significant distance from the college. All of these factors make the availability of high-quality online classes even more strategic, and the need for insights into student experience more imperative.

This college has instituted many proactive measures over the years to aid student retention and success. It has a computer help desk, accessed by phone, to help with technical issues, including access to the Learning Management System (LMS). Additionally, tutoring services give students a resource for regularly scheduled help in many specific topics. There is also a writing center offering one-to-one interactions with trained writing assistants. Before signing up for online classes, students complete a simple questionnaire to self-assess their ability to succeed in an online class; this is to help students understand the demands an online class will make in both time and task management. Finally, all instructors are required to take basic training in how to use the institutional LMS (Desire2Learn); however, this training is simply in how to utilize the system, not specialized pedagogical training for working in online educational environments.

Along with many other public educational institutions, this institution has faced many challenges in endeavoring to provide a quality educational experience for students. These include declining revenues, a student population that is increasingly diverse and in need of preparation and support for succeeding in college-level classes, and, most importantly for online education, a faculty that often struggles to be fully engaged. About 80% of the faculty is adjunct, and there are very limited funds for professional development (Bergeron, 2012). Since adjunct faculty often work at another job or at multiple institutions, it is very difficult to have the interaction that promotes communication of online pedagogical standards or informs instructors of the latest research findings on improving the student learning process.
All of the factors mentioned (student demographics, institutional policies for improving achievement levels, financial pressures, and diversity in faculty) combine to make this college a choice that could offer insights into how the constructs under consideration affect student retention in online classes.

Choice of Sample Population

The online Writing and English classes in the spring semester of 2013 were selected for this research. The classes surveyed included all the online sections of: Class A, Pre-College Writing (remedial writing to prepare students for college-level writing); Class B, Composition I (basic college-level research writing); Class C, Composition II (writing an academic argument); Class D, Writing about Literature (basic writing class using English literature as a source for topical analysis); Class E, Writing about Literature & Ideas (literature analysis and argumentative writing); Class F, Honors Composition I; and Class G, Honors Composition II. However, there were no online sections of Class D, Class F, or Class G during the period examined. For the face-to-face sections of these classes, 2,630 students were enrolled and 177 dropped, an attrition rate of 6.73%. The online sections of these same classes had 706 students enrolled, with 70 dropping, a drop rate of 9.92%.

Instrumentation

There are many methods to effectively sample the opinions and gain insights from the experiences of students. Given that the goal for this research was to get a sample of online students, who might live at various distances from campus and have busy schedules, the method that would be least intrusive was an online survey. This method, although very easy to complete, often suffers from non-response, or from participants who are self-selecting because of strong opinions about a particularly bad or exceptionally good experience (Groves et al., 2009). To get better participation and more even representation, an incentive, which research has found to increase participation, was offered: the possibility of winning one of two $25 gift certificates from a local general merchandise store (Groves et al., 2009). To avoid ineligible responses, survey links were sent only to the community college students who had enrolled at any time in the classes under consideration in the spring of 2013; both the students who had dropped and those who had persisted were sent the survey link. Also, to avoid oversampling, the IP addresses and email addresses of the respondents were screened for duplication, then deleted to protect privacy after the prize announcement was made. The computerized self-administered questionnaire (CSAQ) allowed privacy, so the students could respond honestly about their experience without worrying about their grades being affected. The survey instrument was sent at the end of March, about 75% of the way through the class. By this point in the semester, all the students who were going to drop the class would probably have already done so. Also, by this point the students would be able to report on their interactions with fellow classmates and the instructors in a knowledgeable way. However, there is the likelihood that students who had issues with the usability of the interface might have forgotten their early frustration. A little over a week after the first email telling the students about the survey, a second email was sent as a reminder.
The initial email link was sent out on March 28, 2013, yielding 30 completed surveys and 2 incomplete surveys. The second email was sent out on April 8, 2013, yielding an additional 19 completed surveys, a 63% increase in valid surveys. The two incomplete surveys consisted only of agreement to the consent form, with no additional answers given; they were discarded and not included in any of the totals. Overall, of the 706 students invited to participate, 49 valid surveys were completed, for a participation rate of almost 7%.

Survey Questions

To assure that the questions truly measure the constructs under consideration, tested instruments were used with only minimal modification. The questions assessing self-efficacy in online education were adapted from Bandura's "Guide for Constructing Self-Efficacy Scales" (Bandura, 2006). The questions assessing utilitarian usability come from Jakob Nielsen's usability measures (Nielsen, 1994). The bipolar terms using "semantic differential adjective pairs" to assess feelings of satisfaction with the overall online class have been used by Ajzen & Fishbein (1977), Spreng et al. (1996), and Coursaris et al. (2007, 2012). The measures for social presence are adapted from research by Johnson and Card (2007) on the effect of student immediacy behavior and by Coursaris et al. (2012) on the Integrated Model of User Satisfaction. To increase validity, several questions rephrased the same item, and one question incorporated reverse scoring. Demographic questions were taken from the college's admission questionnaire. A question was added to determine what types of devices the students were using to access their class materials; it allowed respondents to select all of the devices they used to access online class content, to better discover future class design implications.

Human Subjects Approval

The study was submitted to Michigan State University for evaluation and approval by the Human Research Protection Program, Institutional Review Board (IRB) (application #i042417) in the fall of 2012 and was assigned number x12-1302. It was determined exempt, and once wording for the consent form was modified, full approval was granted. Before the study was run at the Midwestern community college, I met with the executive director of its Institutional Effectiveness, Research, and Planning department and also received approval for running the research study with the selected students. All approval documentation is in the appendix.

Data Analysis

The survey answers were downloaded into an Excel sheet and converted into data that could be easily analyzed using Excel and SPSS. Each section of questions evaluating one particular construct included space for comments, to gather qualitative data from the students. Several students gave detailed comments expressing their feelings about their experiences in the class and their suggestions for future improvement. A codebook was developed to closely analyze these comments according to the three theoretical constructs under consideration. The coding process for the comments is discussed in more detail in the Qualitative Analysis of the Comments section. The data was analyzed using Excel and SPSS. The codebook is included in the appendix.
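For readers who want to reproduce this kind of pipeline outside Excel and SPSS, the following is a minimal sketch (not the author's actual scripts) of the three analyses used throughout the Results section: scale reliability via Cronbach's alpha, Pearson correlations among items, and an independent-samples t test comparing persisters with non-persisters. The file and column names (survey.csv, se_q1 through se_q5, persisted) are hypothetical.

import pandas as pd
from scipy import stats

def cronbach_alpha(items: pd.DataFrame) -> float:
    # Cronbach's alpha: one column per scale item, one row per respondent.
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

df = pd.read_csv("survey.csv")               # hypothetical survey export
df["se_q5"] = 8 - df["se_q5"]                # re-key a reverse-scored 7-point item
scale = df[["se_q1", "se_q2", "se_q3", "se_q4", "se_q5"]]

print(cronbach_alpha(scale))                 # reliability of the scale
print(scale.corr(method="pearson"))          # inter-item Pearson correlations

# Independent-samples t test on the scale mean: persisters vs. droppers.
persisters = scale[df["persisted"] == 1].mean(axis=1)
droppers = scale[df["persisted"] == 0].mean(axis=1)
t, p = stats.ttest_ind(persisters, droppers)  # pooled-variance t test
print(f"t = {t:.3f}, p = {p:.3f}")

With 43 persisters and 6 non-persisters, a test of this form yields the t(47) statistics reported in the Results section.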
RESULTS

Overall Demographic Information

The basic demographic information for all of the completed and valid surveys (N=49) is shown in the following tables. The mean GPA was in the 3.00-3.49 bracket, with a standard deviation of .957. A significant percentage of students, 34%, were the first person in their immediate family to attend college. The zip codes showed that most of the participants were from the same area as the college: 63.3% live less than twenty miles away; however, 4% report their home as being more than 50 miles from campus. One of those students reported taking the classes from Detroit, a distance of 81 miles. The overall average one-way distance for the students was 15.1 miles. Females were 67% of the respondents, and 86% were white. By the time of the survey, 12% of the respondents had dropped the class.

Table 1 Approximate GPA of all participants

Approximate GPA        Frequency   Percent
1.50-1.99                      1      2.0%
2.00-2.49                      2      4.1%
2.50-2.99                      4      8.2%
3.00-3.49                     16     32.7%
3.50-4.00                     22     44.9%
Prefer Not to Answer           4      8.2%

Figure 3 Approximate GPA (chart)

Table 2 Ethnic/Racial Background of Participants

Background                  Frequency   Percent
Black or African American           3      6.1%
White                              42     85.7%
Two or more races                   1      2.0%
Prefer Not to Answer                3      6.1%

Figure 4 Ethnic/Racial Background (chart)

Table 3 Citizenship

Citizenship            Frequency   Percent
U.S. Citizen                  48       98%
Prefer Not to Answer           1        2%

Figure 5 Citizenship (chart)

Table 4 Gender of Participants

Gender                 Frequency   Percent
Male                          15     30.6%
Female                        33     67.3%
Prefer Not to Answer           1      2.0%

Figure 6 Gender of Participants (chart)

Table 5 First person in family to attend college

Response               Frequency   Percent
Yes                           17     34.7%
No                            30     61.2%
Prefer Not to Answer           2      4.1%

Figure 7 First Person in Family to Attend College (chart)

Table 6 Still Enrolled in class

Response   Frequency   Percent
Yes               43     87.8%
No                 6     12.2%

Figure 8 Still Enrolled in Class (chart)

Table 7 Distance from College

Miles                  # of Students   Percentage
0-10                              24        49.0%
10.1-20.0                          7        14.3%
20.1-30.0                          7        14.3%
30.1-40.0                          5        10.2%
40.1-50.0                          0         0.0%
50.1-60.0                          1         2.0%
> 60.0                             1         2.0%
Prefer not to answer               4         8.2%

Figure 9 Distance from College in Miles (chart)

Table 8 Class Enrollment

Class     # of Students   Percent
Class A               4      8.2%
Class B              25     51.0%
Class C               4      8.2%
Class E              16     32.7%

Figure 10 Class Enrollment of Participants (chart)

The questions dealing with the different constructs under analysis (self-efficacy, usability of the LMS as well as the instructor's utilization of the LMS, and social presence) were checked using Cronbach's alpha to assess reliability. Each set of questions was then further analyzed for correlation using Pearson's correlation for significance. The groupings shown to have reliability and significance were then tested against the direct variable of dropping or persisting in the class using an independent-samples t test. In light of what had already been found in the previous research reviewed, the surprising results reflect the tremendous speed at which achievement, attitudes, and expectations are changing in online education.
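For reference (a standard presentation of these statistics, not text reproduced from the survey materials), Cronbach's alpha for a scale of $k$ items and the pooled-variance independent-samples t test take the forms

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right), \qquad t = \frac{\bar{X}_1 - \bar{X}_2}{s_p\sqrt{\frac{1}{n_1} + \frac{1}{n_2}}}, \quad s_p^{2} = \frac{(n_1 - 1)s_1^{2} + (n_2 - 1)s_2^{2}}{n_1 + n_2 - 2},$$

where $\sigma^{2}_{Y_i}$ is the variance of item $i$, $\sigma^{2}_{X}$ is the variance of the total score, and the t statistic has $n_1 + n_2 - 2$ degrees of freedom. With 43 persisters and 6 non-persisters, $43 + 6 - 2 = 47$, which matches the df in the t(47) results reported below.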
Demographics and Persistence

Grade Point Average. A t test failed to reveal a statistically reliable difference between the mean GPA bracket of the students who persisted in the class (M = 6.20, s = .992) and those who dropped (M = 6.60, s = .548), t(43) = .879, p = .384, α = .05. The mean GPA of those dropping the class was actually slightly higher than that of those who persisted.

Ethnic/Racial Background. A t test failed to reveal a statistically reliable difference in the ethnic or racial background of the students who persisted in the class (M = 4.90, s = .632) and those who dropped (M = 5.00, s = .000), t(44) = -.384, p = .703, α = .05. The ethnic composition of the students who persisted was more diverse than that of the students who dropped.

Gender. A t test failed to find a statistically reliable difference between the mean gender coding of the students who persisted in the class (M = 1.64, s = .485) and those who dropped (M = 2.00, s = .000), t(46) = -1.787, p = .080, α = .05. All of the students who dropped and participated in the survey were female; however, because of the small sample size of those who dropped, this cannot reliably be deemed a factor.

First Person. A t test failed to reveal a statistically reliable difference between the students who persisted in the class (M = 1.72, s = .549) and those who dropped (M = 1.50, s = .548) in being the first person in their family to attend college, t(47) = .942, p = .360, α = .05.

Distance. A t test failed to reveal a statistically reliable difference between the mean distance from the college of the students who persisted in the class (M = 17.19, s = 17.05) and those who dropped (M = 11.27, s = 9.58), t(43) = .826, p = .413, α = .05.

Class Enrolled. A t test failed to reveal a statistically reliable difference between the mean class enrolled for those who persisted (M = 2.91, s = 1.461) and those who dropped (M = 3.50, s = 1.64), t(47) = .919, p = .363, α = .05. Of the participating classes, only Class C and Class E had students who dropped.

Overall, none of the demographic variables produced a significantly reliable measure to predict the likelihood of persisting in a class.

Self-Efficacy in Online Classes

The two Previous Experience items (previously enrolled in an online class, previously completed an online class) were strongly correlated, r = .950, with a Cronbach's alpha of .974. However, a t test failed to reveal a statistically reliable difference in mean Previous Experience between the students who persisted in the class (M = 1.28, s = .554) and those who dropped (M = 1.25, s = .418), t(47) = .148, p = .883, α = .05. After the questions about previous experience in an online class, there was a set of questions dealing with self-efficacy in the online class environment. Within this set are measures of both Self-Efficacy in Computer Comfort (2 items; Cronbach's alpha .765) and Self-Efficacy with Online Classes (3 items; Cronbach's alpha .826).
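As a quick consistency check (my arithmetic, not a computation reported in the survey results): for a two-item scale such as Previous Experience above, Cronbach's alpha reduces to the Spearman-Brown form, which recovers the reported alpha of .974 from the inter-item correlation of .950:

$$\alpha_{2\text{-item}} = \frac{2r}{1 + r} = \frac{2(.950)}{1.950} \approx .974.$$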
The Self-Efficacy in Computer Comfort measure, examined with a t test, failed to reveal a statistically reliable difference between the mean of the students who persisted in the class (M = 6.51, s = .736) and those who dropped (M = 6.83, s = .418), t(47) = 1.04, p = .303, α = .05. The measures for Self-Efficacy in Online Classes, in a t test, also did not produce a statistically reliable difference between the mean of the students who persisted (M = 6.05, s = .981) and those who dropped (M = 6.28, s = .574), t(47) = .542, p = .590, α = .05. When all of the self-efficacy measures were combined and analyzed using Pearson's correlation, the questions were found to be correlated with each other, with one exception (comfortable with computers and able to get homework done on time).

Table 9 Self-Efficacy in Online Classes bivariate correlations (N = 49)

                                         (1)      (2)      (3)      (4)      (5)
(1) Feel comfortable working
    with computers                        1     .681**    .256    .403**   .365**
(2) Can access information that
    I need from computers              .681**     1      .353*    .557**   .472**
(3) Feel I can finish homework
    by deadlines                        .256    .353*      1      .696**   .728**
(4) Feel I can get myself to do
    my assignments                     .403**   .557**   .696**     1      .610**
(5) I know where to go for help
    if needed                          .365**   .472**   .728**   .610**     1

** Correlation is significant at the 0.01 level (2-tailed).
* Correlation is significant at the 0.05 level (2-tailed).

These scale questions for Self-Efficacy in the Online Environment (5 items) had a Cronbach's alpha of .814. However, when this scale was examined with a t test, it failed to reveal a statistically reliable difference between the means of the students who persisted in the class (M = 6.24, s = .775) and those who dropped (M = 6.50, s = .414), t(47) = .809, p = .422, α = .05. The students overall showed a high level of self-efficacy: the mean for those who persisted in the class was 6.23 ± .77 out of a possible 7, and the mean for those who dropped was 6.5 ± .41.

Usability of the Learning Management System (LMS)

The survey instrument contained questions both about the usability of the LMS and about the usability of the class material as a result of the instructor's utilization of the LMS. These are two distinct aspects of usability in the online class interface. The LMS itself has certain affordances and specific limitations. Students who are new to the LMS might have trouble negotiating how the interface works, while those who have worked with it before might be able to negotiate the space effortlessly. The other measures deal with student perception of how the system was utilized by the instructors. Technically savvy instructors can either take advantage of the potential affordances of the system or find ways to work around its limitations. On the other hand, the instructor's use of the LMS or organization of materials may actually detract from the usability of the LMS. When looking at the usability measures of the LMS, there was a significant correlation between the various questions, even at the .01 level. These questions also had a Cronbach's alpha of .828.
Table 10 Usability of the LMS questions correlations (Pearson's)

                                           (1)      (2)      (3)      (4)
(1) Learning how to access class
    material was easy                       1     .666**   .496**   .794**
(2) The course's LMS helped me
    complete my class assignments        .666**     1      .309*    .604**
(3) Because of how the LMS works I
    often struggle with accessing what
    I am supposed to do (reverse scored) .496**   .309*      1      .554**
(4) The course's LMS features the
    professor used in the class were
    easy to use                          .794**   .604**   .554**     1

** Correlation is significant at the 0.01 level (2-tailed).
* Correlation is significant at the 0.05 level (2-tailed).

The usability of the LMS drew a wide range of survey answers: even though the mean was "somewhat easy" to use, the variance was 2.8 and the standard deviation 1.68. However, when this scale was examined with a t test, it failed to reveal a statistically reliable difference between the means of the students who persisted in the class (M = 5.18, s = 1.66) and those who dropped (M = 4.94, s = 1.02), t(47) = .316, p = .754, α = .05. The questions dealing with the instructors' utilization of the LMS were also found to be closely correlated, with the Pearson's results shown in Table 11 and a Cronbach's alpha of .907.

Table 11 Usability per instructor's utilization of the LMS (Pearson's correlations)

                                          (1)      (2)      (3)
(1) The instructor utilized many
    features of the LMS                    1     .718**   .718**
(2) The instructor's organization of
    the class materials is effective    .718**     1      .868**
(3) Because of how the instructor
    utilized the LMS I can efficiently
    access my online class material     .718**   .868**     1

** Correlation is significant at the 0.01 level (2-tailed).

Yet again, when this scale was examined with a t test, it failed to reveal a statistically reliable difference between the means of the students who persisted in the class (M = 5.10, s = 1.29) and those who dropped (M = 5.71, s = 1.61), t(47) = 1.037, p = .305, α = .05. So neither the usability of the system nor the instructors' utilization of the LMS alone was correlated with persistence in an online class.

Social Presence Measures

Two aspects of social presence were measured: the social presence of the instructor and the social presence of fellow students. The questions measuring the social presence of the instructor (2 items) were checked for correlation and found to have a Cronbach's alpha of .795 and a Pearson's correlation of .660, significant at the .01 level. When a t test was run, however, these questions were not found to show a statistically reliable difference between the means of the students who persisted in the class (M = 4.17, s = 1.79) and those who dropped (M = 4.33, s = 2.16), t(47) = .199, p = .843, α = .05. Several of the students were highly dissatisfied with the immediacy of the instructor even though they were persisting in the class. This contributed to social presence not emerging as a significant factor in predicting persistence.

Figure 11 Social Presence of the Instructor (frequency by satisfaction level; chart)

However, when the means of the social presence of the instructor were compared, using a t test, across the bipolar adjectives that indicated student satisfaction, there was a significant difference: the mean for the students who were satisfied with the class was M = 5.0, s = 1.77, and for those who were not satisfied, M = 3.4, s = 1.77, t(47) = 3.48, p = .001, α = .05.
This showed the power of social presence in determining satisfaction, even when the student continues to persist in the class. The questions assessing the students' sense of social presence with their classmates were also tested for correlation, found to have a Cronbach's alpha of .853, and found to correlate as shown in Table 12.

Table 12 Social Presence of Fellow Students correlations (Pearson's)

                                         (1)      (2)      (3)      (4)      (5)
(1) When I posted a comment my
    classmates responded to me in
    a reasonable time                     1     .625**   .422**   .518**   .577**
(2) I enjoyed interacting with my
    online classmates                  .625**     1      .479**   .432**   .515**
(3) I have shared personal information
    with my online classmates          .422**   .479**     1      .850**   .431**
(4) My classmates share personal
    information about themselves       .518**   .432**   .850**     1      .545**
(5) My classmates express their
    agreement or disagreement with
    what I post                        .577**   .515**   .431**   .545**     1

** Correlation is significant at the 0.01 level (2-tailed).

When the social presence of fellow students questions were examined with a t test, they were not found to show a statistically reliable difference between the means of the students who persisted in the class (M = 4.85, s = 1.25) and those who dropped (M = 4.43, s = 1.33), t(47) = .753, p = .455, α = .05. This finding supported H5, that the social presence of peers would not be a significant factor in persistence.

Overall Satisfaction Levels

In examining the students' responses to the bipolar adjective pairs that evaluate their emotional responses to the class, the adjective pairs were first examined for their correlation; the Cronbach's alpha was .972 and the pairs were highly intercorrelated. When the measures were examined with a t test, the adjective pairs were not found to show a statistically reliable difference between the means of the students who persisted in the class (M = 4.47, s = 1.73) and those who dropped (M = 3.83, s = 1.47), t(47) = .865, p = .391, α = .05. The overall spread of the bipolar adjectives shows a slightly better than neutral overall feeling toward the class.

Figure 12 Class Satisfaction Levels (frequency by bipolar adjective; chart)

Student Reported Reasons for Dropping

Overall, none of the constructs measured was, alone, statistically significant in determining whether a student persisted in the class. Therefore only the last hypothesis was supported by the data: that the social presence of fellow students would not be significant in persistence. The other hypotheses were surprisingly not as robust as in earlier research. In examining the reasons given by the six students who dropped the class, two were dropped administratively, which could be for any of a number of reasons, such as a class being cancelled or non-payment of tuition. However, since all of these students reported that they had accessed the material for the class and had formed opinions about organization and satisfaction levels, they evidently decided to drop after the period in which students could freely add or drop classes and had to get administrative assistance to drop. This means they probably lost a significant portion, if not all, of the tuition they had paid toward the class. One student indicated they no longer needed the class, and another stated they could not keep up with the assignments because their schedule had changed.
One student commented that they asked to be dropped, making statements indicating the decision resulted from feeling the instructor was not competent. The remaining students who dropped the class did not indicate why, but they did fill out all the other questions dealing with the constructs being measured.

Qualitative Analysis of the Comments

Demographics of Commenters

After each section of questions measuring one construct under examination, there was an open area for the students to add comments. Each section usually gathered a few comments closely related to the questions just asked. Overall, the response for each section was fairly modest, with only two or three comments. However, the comment section after the bipolar adjectives probing how the class made the student feel ("Overall, this online class made me feel...") drew a very intense response, with 13 students, almost 27% of the total participants in the study, and these commenters sometimes gave fairly detailed and insightful comments. In the entire survey, 29 different comments from 17 students were recorded. These were analyzed and coded with respect to the constructs under consideration: self-efficacy in the online class, usability of the interface, the instructor's utilization of the interface, the social presence of fellow students, and the social presence of the instructor. After breaking each comment down into smaller units of concern, these units were rated using the same seven-point scale as the survey questions: 1) very dissatisfied; 2) dissatisfied; 3) somewhat dissatisfied; 4) neutral; 5) somewhat satisfied; 6) satisfied; and 7) very satisfied. Each comment therefore often produced several measurable factors, which could be analyzed for patterns and relationships. This process provided 97 measurable factors.

Of the commenters who chose to share their gender, 71% were female and 23% male. Even though 18% reported being the first person in their family to attend college, those who left comments were fairly successful in their studies, with self-reported GPAs averaging in the 3.00 to 3.49 range. However, not all the students were high achievers; several reported a GPA below 3.00. As can be expected in this type of survey, those who have had extremely good or extremely bad experiences are more likely to add comments. Yet the comments seemed fairly restrained, clear, and thoughtful given the opportunity to vent. They were all clearly written, sharing insights without derogatory terms or profanity. The students discussed a wide range of aspects of the class: their struggles with the interface, their appreciation of the opportunity to study online, their satisfaction with the instructor, their frustration with fellow students, or their frustration with the instructor. The distances of the commenting students from campus were fairly consistent with the overall enrollment of the class, with many living fairly close to campus. This would imply that distance from campus was not a major factor in their decision to enroll in the online class. These distances are shown in Table 15 and Figure 15.

Table 13 GPA of Commenters (coded scale)

Alpha value (for confidence interval)    0.02
Count                                      16
Mean                                   6.3125
Variance                               0.7625
Standard Deviation                    0.87321
Mean Standard Error                    0.2183
Minimum                                     4
Maximum                                     7
Range                                       3
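The summary statistics reported in Table 13, and in the comment tables that follow (count, mean, variance, standard deviation, standard error of the mean, minimum, maximum, range, and mode), are simple to reproduce. A minimal sketch, with purely illustrative scores rather than the thesis data:

import statistics as st

codes = [1, 1, 2, 1, 4, 7, 1, 2, 1, 3, 1, 2]    # illustrative 1..7 comment codes

n = len(codes)
print(n, st.mean(codes))
print(st.variance(codes), st.stdev(codes))       # sample variance and SD (ddof = 1)
print(st.stdev(codes) / n ** 0.5)                # standard error of the mean
print(min(codes), max(codes), max(codes) - min(codes), st.mode(codes))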
Figure 13 GPA of Commenters (chart)

Table 14 Gender of Commenters

Gender                 Frequency
Male                           4
Female                        12
Prefer Not to Answer           1

Figure 14 Gender of Commenters (chart: 23% male, 71% female, 6% prefer not to answer)

Table 15 Mean Distance in Miles of Commenters by Category

Category               Mean Distance (miles)
All who enrolled                        16.4
Those who persisted                    17.29
Those who dropped                      11.27
Those who commented                    15.35

Figure 15 Distance in Miles of Commenters (chart)

Apart from a few comments about fellow students ("group projects for online classes doesn't work") and the very supportive comments about tutoring services, the computer help services, and the writing center ("I have used tutoring and the resources at the writing center and it was always helpful!"), almost all of the comments were about the instructor or the LMS. The particular LMS used was only in its second semester of use at this college, so it is not surprising that there were usability issues and that not all of the instructors had extensive experience in utilizing its possible affordances; since the system is constantly being upgraded, the serious issues reported, such as incompatibility with Windows 8 and frequent outages, will hopefully be improved. By far, most of the comments were about the communication (social presence) and organization (usability) issues that resulted from the instructors' choices in pedagogy.

Table 16 Categories of Comments

Types of Comments                                              Number   Percentage
Comments about Social Presence of the Instructor
(communication)                                                    24        32.0%
Comments about Self-Efficacy (or expressing concern
about lack of efficacy)                                            22        29.3%
Comments about Usability of the System as utilized
by the instructor                                                  13        17.3%
Comments about Usability of the LMS                                 9        12.0%
Comments about Support Services                                     6         8.0%
Comments about Social Presence of fellow students                   1         1.3%

Comments about Social Presence of the Instructor (Communication)

By far the most frequently mentioned factor in the comments was the instructor. The comments ranged from "My instructor was terrible," "my instructor is not on top of things," and "...is the worst instructor I have had…" to "my professor makes it easy to use" and "my teacher is fantastic." Since the classes surveyed were all fairly similar in theme and process, and the LMS used was identical, the ultimate factor in satisfaction with the class appears to be the utilization of the affordances of the online format by both the instructor and the student. Many of those who left comments expressed their efforts to communicate and achieve learning goals, yet felt left dangling by the instructors. One student put it rather succinctly: "Some instructors I have had do not give feed back in a timely manner and are rather lazy about the fact that it is an online class. I don't feel they give it the same respect they do a face-to-face class. They think that just because they don't see me twice a week I don't need to hear from them. I think that instructors should be held to the same kind of deadlines I am held to when giving feedback. They need to make sure that I am given the information I need long before it is too late to be useful." This student had previously completed online classes and reported a GPA in the 3.49-4.00 range, so they obviously had the experience to judge what could be done in the environment and had seen a range of responses.
They commented that "some instructors," acknowledging that this was not "all" instructors, were not as effective in teaching online, and they recognized there is a range of dedication among instructors. On the other hand, there were also comments giving great praise to instructors: "My teacher is fantastic. She has kept everything extremely organized and communicates with fast responses." This particular student had never taken an online class before, so she got a new laptop specifically for the class, then discovered that her new Windows 8 laptop was not compatible with the new LMS, Desire2Learn (D2L). She ended up having to go to her parents' house or access her class material on her smartphone because of the compatibility problems between that version of D2L and Windows at the time. So even though her ratings of the usability of the system were understandably very low, her overall comments and satisfaction with the class were high because of the quick responses from her instructor and the instructor's organization of the class material.

Yet despite the few positive comments about instructors, most comments dealt with frustration over what students felt were slow response times. After the comments were evaluated for how strongly the student expressed satisfaction or dissatisfaction with the social presence of the instructor, the overall comments showed extremely high levels of frustration with poor communication and a lack of response or accessibility. Comments included, "My instructor has poor communication skills. She did not return emails or post grades in a timely manner," and "This class has been a real struggle what with my instructor…not responding when I have emailed her. Her average response time is about two weeks, which is usually after an assignment is due." Even considering the few comments that praised the instructor's responsiveness, the overall consensus was that the sense that they could communicate quickly with the instructor was the key issue for the students commenting. When coded, most of the comments were very dissatisfied (1), and even though some were neutral (4) or very satisfied (7), the mean was dissatisfied (2.17).

Table 17 Satisfaction with Instructor's Social Presence

Alpha value (for confidence interval)    0.02
Count                                      24
Mean                                     2.17
Variance                                 4.49
Standard Deviation                       2.12
Mean Standard Error                      0.43
Minimum                                     1
Maximum                                     7
Range                                       6
Mode                                        1

Figure 16 Satisfaction with Instructor's Social Presence (chart)

Comments about Self-Efficacy (or expressing concern about lack of efficacy)

Students who left comments often showed a great deal of efficacy in working in the online environment. Their comments indicated that they were looking for efficiency, professional communication levels, and timely feedback. Comments were coded as follows: very low sense of self-efficacy (1), low self-efficacy (2), somewhat low self-efficacy (3), neutral (4), somewhat high self-efficacy (5), high self-efficacy (6), and very high self-efficacy (7). Statements such as "professors should be held to a timeframe just as I am" show the student's own sense of efficiency. One wrote, "My instructor is not on top of things! She often displays poor grammar…", which shows the student's own command of grammar and effective strategies, especially in an online writing or English class. This student's comments were coded as "very high self-efficacy."
Some students who ended up persisting in the class started out with a very low sense of self-efficacy, yet were able to self-organize and persist: "I was very nervous at first about an online class, but once I located and printed out all my information and kept it organized in a binder it got easier." This comment was coded as "somewhat low self-efficacy." The overall self-efficacy of the commenters was coded at a mean of 6.32, or "high self-efficacy."

Table 18 Commenters' Self-Efficacy Scores

Alpha value (for confidence interval)    0.02
Count                                      22
Mean                                     6.32
Variance                                 0.89
Standard Deviation                       0.95
Mean Standard Error                      0.20
Minimum                                     3
Maximum                                     7
Range                                       4
Mode                                        7

Figure 17 Commenters' Self-Efficacy Scores (chart)

Comments about Usability of the System as Utilized by the Instructor and Comments about Usability of the LMS

In analyzing the comments that referred to the usability of the online class, comments that dealt strictly with the LMS interface, such as "My biggest complaint has to do with D2L technical problems this semester," were separated from comments such as "…my instructor not knowing how to use D2L…" and "The teacher didn't organize the assignments the best she could." If a comment mentioned the LMS specifically, it was coded as Usability of the Interface. If a comment specifically discussed usability levels due to the affordances or problems caused by the instructor's utilization of the LMS, it was coded as Satisfaction with Instructor's Utilization of the LMS. Overall, the comments on the usability of the system reflected both extremes: high satisfaction with the LMS and great frustration with it. Specific operating systems seem to have had serious functionality issues that caused problems, while other systems seemed to have no problem at all. Because of the wide range of experiences, the mean for LMS usability was a neutral 4. On the other hand, even though there was high praise for some instructors, a majority of the comments expressed dissatisfaction with the instructor's utilization of the system, producing a mean of 2.07, or dissatisfied.

Table 19 Comments About Usability of the System

Alpha value (for confidence interval)    0.02
Count                                       9
Mean                                     4.56
Variance                                 5.78
Standard Deviation                       2.40
Mean Standard Error                      0.80
Minimum                                     1
Maximum                                     7
Range                                       6

Figure 18 Comments About Usability of the System (chart)

Table 20 Satisfaction with Instructor's Utilization of the LMS

Alpha value (for confidence interval)    0.02
Count                                      14
Mean                                     2.07
Variance                                 4.69
Standard Deviation                       2.16
Mean Standard Error                      0.58
Minimum                                     1
Maximum                                     7
Range                                       6
Mode                                        1

Figure 19 Satisfaction with the Instructor's Utilization of the LMS (chart)

Comments about Support Services

Students' comments about support services, such as the computer help desk, tutoring services, and the writing center, were either very positive or indicated that the student was going to use the service soon. Comments such as "The Writing Center is a great resource!" and "I have used tutoring and the resources at the writing center and it was always helpful!" were both coded as very satisfied. The latter comment was coded as positive for both the tutoring services and the writing center, since both were mentioned.
Since, according to the students' zip codes, some of the online students live far from campus, coming in to campus for personal help might be very difficult and would show a great deal of dedication. There were only a few comments, but overall their mean was a 6, or satisfied. Students who commented that they had an appointment coming up but had not yet utilized the service were not coded.

Table 21 Satisfaction with Student Services (Computer Help, Tutoring, and Writing Center)

Alpha value (for confidence interval)    0.02
Count                                       4
Mean                                        6
Variance                                    4
Standard Deviation                          2
Mean Standard Error                         1
Minimum                                     3
Maximum                                     7
Range                                       4
Mode                                        7

Figure 20 Satisfaction with Student Services (Computer Help, Tutoring and Writing Center) (chart)

Comments about Social Presence of Fellow Students

There was only one comment about the social presence of classmates, and it was fairly neutral: "our discussion boards were very assignment based, not much chit chat." It was therefore coded a "4"; since there was only one comment, there are no averages to report.

The comments revealed a great deal about the students' experiences and gave further insight into the overall survey results. They also showed that there have been massive and rapid changes in students' technological comfort and expectations in the past few years. Generally, most of the students showed comfort in working in the online environment. Some indicated a preference for online classes because of the convenience they offer. Yet the majority of the comments dealt with dissatisfaction with the online class simply because of slow or poor response from the instructor. The pedagogical choices that instructors make in terms of response times, organization, instruction, and clearly communicated expectations seem to make the difference between "My instructor was terrible," from a student who dropped the class, and "My teacher is fantastic." The key is the rest of the statement from the student with a "fantastic" teacher: "She has kept everything extremely organized and communicates with fast responses." Even more surprising is that this statement came from the student who had the most usability difficulties, having bought a brand-new computer with the latest operating system specifically for this class and then finding out that the LMS was not compatible with it. This shows the tremendous power of the personal connection a student develops when they feel they can communicate quickly and easily with their instructor. Further analysis of the comments, and of how they illuminate the answers from the rest of the respondents, appears in the discussion section.

Other Findings

Institutional initiatives to improve student learning and experience seem to be very successful. Those seeking help from the writing center, tutoring services, and the computer help desk appear to be very satisfied overall, and the qualitative feedback showed a high level of student appreciation. One very interesting finding is the high percentage of students who access their online class via mobile devices. Almost half of the students (45%) used a mobile device (smartphone) to access their online class materials, and they used multiple devices to do so. All of the students who used a mobile device to access class materials also used other devices, such as laptops, desktops, or computers at the college's computer labs.
Figure 21 Students Who Use a Mobile Device to Access Online Class (chart: 45% yes, 55% no)

Figure 22 Devices That Students Who Use Mobile Devices Also Use (chart)

Figure 23 Number of Devices Used to Access Class (chart)

The students were also asked what they felt was a reasonable time for getting a response from their instructor when they asked a question. The results showed an overall expectation of 24 hours, which is fairly consistent with best practices for online instructors (Reed, 2013).

Figure 24 Desired Instructor Response Time (chart: less than 6 hours, 6 hours, 12 hours, 18 hours, 1 day, 2 days)

LIMITATIONS

One of the major limitations of this survey is the small number of students who dropped the class and participated in the survey (N=6). Proportionally, they are fairly representative of the entire population surveyed, but the number who voluntarily dropped the class is very small. Because of the sample size, only bivariate correlations were examined. These cannot establish any causal relations; there are potential interactions between the variables that could be examined through correlation analysis, but more elaborate statistical analysis of the data was not seen as meaningful given the small number of participating students who had dropped the class. Also, the students who had extremely low self-efficacy in online education, or who had usability issues, might not even check their student email or feel comfortable responding to an online survey. Therefore, to capture the insights of the most vulnerable students, a short survey might be integrated into the drop process. These students have much to gain from furthering their education, and the flexibility of online education offers great potential, yet they are difficult to reach, especially through the very medium they find challenging.

An additional limitation, self-selection, is always an issue with any voluntary survey, because students who had particularly bad or especially good experiences are the most likely to respond. The potential reward appears to have helped bring in students with more temperate opinions, since the means of many questions are fairly moderate. Additionally, the students from the classes surveyed (e.g., Composition I, English 122) did not respond equally, so some classes are overrepresented.

There are always many limitations to data collected by an online survey. Some of these issues, such as sampling ineligible units or duplication, were reduced because only those enrolled in the classes were sent links to the surveys, and IP addresses as well as email addresses for the contest entries were checked to help prevent duplication.

One of the greatest limitations of any research in online education is trying to measure theoretical constructs and their effects while so many elements of online education are changing at tremendous speed. It is like trying to carefully describe something that is morphing in front of our eyes. New LMSs are becoming available, smartphones and tablets are replacing students' PCs and desktops, and Massive Open Online Courses (MOOCs) are becoming the topic du jour. Students' abilities and expectations also appear to be changing rapidly (Kim & Bonk, 2006). Surprisingly, even though many of the students who responded to the survey reported very low levels of satisfaction, they were mostly still persisting in the class.
One student who dropped and offered comments stated she dropped specifically because "my instructor was terrible." She also indicated that she intended to take the class again with a different instructor. That particular student had a GPA in the 3.00-3.50 range; the drop could therefore have been strategic, to protect her overall GPA.

DISCUSSION

It would be very easy to come to clear conclusions if there were an obvious "smoking gun" of one or two clear constructs that predict student persistence. This would allow policy-makers to implement programs or processes that would easily and predictably improve retention. Yet this research shows that many different factors affect students' choices about staying in a class. The comments given by the students who persisted in the class were very enlightening; they tended to support, and give more insight into, the survey questions derived from proven measures. Even though students are sometimes not satisfied with their online class, many persist in it. This is probably strongly related to the fact that a section from this set of classes is required for almost every major and for transfer students. The strategic value of completing the class might be a major factor in student persistence, even when satisfaction is low and the constructs of self-efficacy, usability, and social presence are also low or marginal. This research surfaced the potential factor of strategic value in predicting student persistence: if students feel that the benefit gained from completing the class outweighs the difficulties encountered in the process, they will persist even if they are not satisfied with the class itself. This research also suggests that, because of the massive changes in technology acceptance and improvements in LMS design, the factors of self-efficacy in online environments and usability, which were strong predictors of student persistence five to eight years ago, are no longer major factors. The details of how each construct was found to influence persistence are discussed in the following sections.

Persistence and Self-Efficacy

Earlier studies predicted that a major factor in persistence for this student body would be high self-efficacy in online classes (Wang & Newlin, 2002; Yi & Im, 2004). Yet in this case the mean scores of those who dropped the class (M = 6.28) were actually higher than those of the students who persisted (M = 6.05). Even though the findings were not statistically significant, this is definitely not the difference expected. Most of the students who responded to the survey reported that they felt confident working with computers in an online environment (M = 6.27), although four students did report that they were less than somewhat comfortable in the online environment. For those who responded to the survey, the decision to drop the class did not appear to be related to their own levels of self-efficacy in using a computer but rather, according to their comments and survey answers, to strategic choices made to get a different instructor (2) or to changing external circumstances (2).

Persistence and Usability of the Instructional Interface

The relationship between the usability of the interface and the instructor's utilization of the interface also had no statistical relationship with a student's persistence. This was also the area with the widest range of variability in the satisfaction levels of the students.
Some students commented on how much they enjoyed the system and how well it worked. Others had tremendous problems with the LMS not working with their particular computer operating system. The institution studied had started using a new LMS only the previous semester, so the initial technical issues should have been somewhat settled by the second semester of use. However, one student wrote, "My biggest complaint has to do with D2L technical problems this semester. Students are at the mercy of the system to complete their assignments and get instruction." When a student had usability problems, the key to overall satisfaction was often that either the instructor or someone from student services (the help desk, tutoring services, or the writing center) helped them through the difficulty.

Instructors' choices seemed to have a major influence on students' perception of usability. One student commented that their instructor posted the same material in several ways, which that student found very confusing. Additionally, some students commented that the instructor did not seem to know how to use the LMS. This section was very revealing in the wide range of experiences reported, which indicates that the instructors probably had inadequate training for teaching in the online environment.

Additionally, since almost half of the students accessed class materials on a mobile phone or tablet, there is potential for new affordances in online education, such as further developing immediacy or preparing learning modules that can be accessed at any time. These potential innovations could allow students to make the learning process something that happens throughout the day rather than only at a specific time when they go to a desktop computer.

In comparing the findings of this research with previous usability research, students seem to have rapidly adapted to the use of technology in education. They have sometimes become even savvier than their instructors. This is quite a change from studies based on data from five to six years ago, when usability standards were sometimes focused on keeping things simple enough to handle at modem speeds of 28.8 kbps (Mancuso, 2008). Since over 71% of the students in this research study had taken online classes previously, they had experience with what could be done in an online class. They were like experienced consumers and were not happy when the instructor showed low levels of competence with the LMS; the mean of the comments dealing with instructor usability was 2.07 ± 2.16, which means the commenters were dissatisfied with how the instructor utilized the LMS in teaching the class.

Persistence and Social Presence

Even though this construct was not found to be statistically significant for student persistence, 32% of the comments focused on students' attitudes towards their instructor's social presence and immediacy. The overall mean for satisfaction with the instructor's social presence was 4.19 ± 1.81; this compares to a mean of 6.29 ± .74 for self-efficacy, 5.14 ± 1.68 for usability of the LMS, 4.79 ± 1.25 for fellow students' social presence, and 5.17 ± 1.34 for the instructor's usability of the LMS. Issues of instructor immediacy, response time, and perceived expertise in the online environment appear to be important to students and are significantly tied to student satisfaction levels.
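The group comparison reported in the next paragraph is an independent-samples t-test on class satisfaction scores, grouped by satisfaction with the instructor's social presence. A minimal sketch of that kind of computation follows; the scores are hypothetical placeholders, not the study's data.

```python
# Minimal sketch of an independent-samples t-test comparing overall class
# satisfaction between students satisfied vs. unsatisfied with the
# instructor's social presence. Scores are hypothetical placeholders.
from scipy import stats

satisfied_with_sp = [5.5, 4.8, 6.0, 5.2, 4.9, 5.7, 5.1]      # hypothetical
unsatisfied_with_sp = [3.0, 3.8, 2.9, 4.1, 3.2, 3.5, 3.6]    # hypothetical

result = stats.ttest_ind(satisfied_with_sp, unsatisfied_with_sp)
# degrees of freedom for the pooled-variance test: n1 + n2 - 2
df = len(satisfied_with_sp) + len(unsatisfied_with_sp) - 2
print(f"t({df}) = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```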
As reported earlier, students with high satisfaction with their instructor's social presence were also satisfied with the class (M = 5.0, s = 1.77), while those who were not satisfied with the instructor's social presence were not satisfied with the class (M = 3.4, s = 1.77), t(47) = 3.48, p = .001, α = .05. Students who greatly enjoyed the class left very supportive comments, such as, "My teacher is fantastic. She kept everything extremely organized and communicates with fast responses." This student summed up the two aspects of social presence that students commented on most strongly: 1) communicating with fast responses, and 2) being extremely organized. Organization seems to give students a feeling of order so they know what to do, and fast answers provide the immediacy needed to feel connected to the class. Social presence was definitely the construct that drew the most deeply felt comments. A key to improving social presence is giving instructors training, as well as allowing the extra time they need to respond to questions and interact with students as part of their teaching time.

Further Discussion

Since many students who report being unhappy with their online class experience are still persisting, class persistence seems to be primarily a personal strategic decision. Given that this is a required class, and dropping would mean a loss of time and money, they make a strategic decision to persist. Going to college to gain personal advantage is probably a more purposeful decision for the community college student. A traditional student might go to college because it is simply the "next" thing expected in a young adult's life. They may spend several years studying at the university before actually settling into a major or realizing just how strategic a college education can potentially be. Community college students are often working while going to school, returning adults, or students who have chosen a community college for financial or academic reasons. For them to persist, they may have to see the personal advantage that continuing their education will bring, and see the value in persisting in a class.

With online education being such a strategic element for the future, it is imperative to continue to investigate and evaluate what theoretical forces are affecting students and instructors in this dynamic and constantly changing environment. I have done exploratory qualitative interviews with three instructors who teach online. These interviews revealed a wide variety of preparation for working in this environment. None of those interviewed had formal training in online methodologies, either from their institution or during the process of earning their degree. They were often unsure of what methods would enhance the learning process for the student. A research project on training and course development found that a "large percentage of instructors are not receiving any training in pedagogy or technology prior to instructing their first online course" (Ray, 2009).

Adding to the frustration for some instructors is that certain standard pedagogies that work well in face-to-face sections, such as scaffolding and Vygotsky's Zone of Proximal Development, have the potential to work extremely well online. In the traditional classroom, when the instructor steps back to let the students do more of the work, the students still see the instructor watching and giving support through body language and attentiveness.
In the online environment, however, if the instructor "steps back" and does not post comments, the students do not know if the instructor is even there. On the other hand, the danger of using every technology possible, and of over-involvement by the instructor, can also lead to frustration and confusion for both the students and the instructor. This becomes a "cart-before-the-horse" situation where technology is used for its own sake and not for pedagogical purposes (Brinthaupt et al., 2011). This research, like much of the other research discussed in the literature review, shows that the paradigm used for teaching, especially in the dynamically changing online environment, needs to be reexamined.

Suggestions for further research

The choice of students to persist even when they are very unsatisfied with a class suggests the type of strategic choices that might be analyzed using game theory. As University of British Columbia professor Kevin Leyton-Brown (2012) said, "game theory is (important in)...modeling self-interested participants and the ways that they strategically interact with each other". Here, the participants are the students, the instructors, and even the degree-granting institution. Using models to analyze students' strategic decisions to drop or persist in classes might help develop academic initiatives that communicate the benefits of completing their education to students more effectively. Additionally, since the students surveyed had a fairly high mean self-efficacy in online education, it would be worthwhile to also study the self-efficacy levels of instructors in the online environment and possible correlations with student learning and persistence. Some research indicates that instructors want to learn more about pedagogical discussions and improved learning strategies for the online environment, yet they are uncertain of the latest research findings (Ray, 2009).

The development of mobile applications and ubiquitous computing is changing the concepts of usability for educational content. Research needs to be done on what methods and strategies bring effective learning with these new devices and spaces. Also needed is the development of security and verification processes so that it can be reasonably proven that a student is honestly doing their own work. This is a continuing concern: educational credentials are a valuable commodity, which makes online education a target for falsification.

The learning efficacy of LMSs also needs further research. Social presence with the instructor is a key issue in student satisfaction, yet LMSs often have clunky and tedious interfaces for things as elemental as class discussions. Research and development should be ongoing for new tools, designed by educators for the use of educators, which can enhance and facilitate social presence in a way that is delightful to use.

Final Remarks

Online education has tremendous potential for reaching populations that previously could not access higher education. It also has the economic appeal of rapid scalability and reduced expenses. Policy makers in institutions should continue to support systems that are working: support services and usability improvements. Areas with vast potential include ongoing training for instructors in current trends and facilitating their engagement in online environments. Many of the initiatives started even a few years ago, such as improving LMS usability, are starting to pay off in relation to student persistence.
Hopefully, as new pedagogies are developed that are effective for achieving learning goals in online spaces, persistence and college completion will increase.

APPENDICES

APPENDIX A: COPY OF THE SURVEY INSTRUMENT

Survey administered through SurveyMonkey with skip logic to avoid unnecessary questions. Breaks in the page and the heading "Online Education Research" indicate these skip logic points. The circular check points indicate a single choice and the square boxes indicate that multiple choices are allowed.

Online Education Research

1. Online Education Research Consent Form

The following is a survey of your online class experience. You are getting this survey because at some point you were enrolled in an online class this semester. We would like to hear from you, even if you dropped the class before you started it. The questions pertain to the online writing class that you enrolled in this semester; if you enrolled and then changed sections, we would like to hear about the last class you enrolled in. This research is to help improve the delivery of future online instruction. Your feedback will be very helpful.

Your participation in this survey should take about ten minutes or less. You may stop at any time, but completing the survey will provide the most helpful information. Your opinions, experience, and any demographic information you choose to share with us will be compiled and used to help researchers understand ways to improve online classes. If you choose to participate, two $25 gift certificates to Meijer will be awarded from the pool of submitted surveys. Your contact information for awarding the prize will be kept separate from the survey answers. All of your responses are confidential and will not be shared with your instructor.

This study is part of a master's research project for the Telecommunications, Information Studies and Media department at Michigan State University. If you have any questions or concerns please contact Ruth Shillair, 434 Farm Lane Road, 300 Bessey Hall, Michigan State University, East Lansing, MI 48824, or email msuonlineresearch@gmail.com. Again, participation is entirely voluntary and you may refuse to answer any questions or withdraw at any time without penalty. By clicking accept, you indicate that you are at least 18 years of age and you agree to participate in the survey.

o Accept
o Decline

Online Education Research

2. Online class from the list below that I enrolled in this semester (if enrolled in more than one, select the last one enrolled in)
o WRIT 117
o WRIT 121
o WRIT 122
o ENG 121
o ENG 122
o WRIT 131
o WRIT 132

3. Have you ever enrolled in an online class before?
o Yes
o No

4. Have you ever completed an online class before?
o Yes
o No

Online Education Research

5. The following is a set of questions about how comfortable you are with computers and online education. Please check the box that most closely matches how you feel about each statement. (Each statement is rated on a seven-point scale: Strongly Disagree, Disagree, Somewhat Disagree, Neither Agree nor Disagree, Somewhat Agree, Agree, Strongly Agree.)
I feel comfortable when working with computers or a mobile device
I feel I can access information that I need from computers
I feel I can finish my homework assignments by deadlines
I feel I can get myself to do my assignments
I know where to go for help if I need more help with my assignments

Online Education Research

6. Did you stay enrolled long enough to log in to the class site?
o Yes
o No
7. What type of device(s) do you use to access your online class materials?
□ mobile phone
□ mobile tablet (iPad, Samsung Galaxy, MS Surface, etc.)
□ laptop computer
□ desktop computer (at home or work)
□ LCC computers
□ Other (please specify)

8. Did you have any trouble logging in to your online class?
o Yes
o No

9. Did you need any technical assistance in accessing your online class?
o Yes
o No

Online Education Research

10. How hard was it to get help?
o Very difficult
o Difficult
o Somewhat difficult
o Neutral
o Somewhat easy
o Easy

11. Was the person you contacted able to help resolve the problem?
o Yes
o No

12. Comments

Online Education Research

13. Thinking about the online management system used in this online class (Desire2Learn). (Each statement is rated on the seven-point scale from Strongly Disagree to Strongly Agree.)
Learning how to access my class material was easy
The course's online learning management system interface helped me to complete my class assignments successfully
Because of how the online learning management system works, I often struggle with accessing what I am supposed to do in the class
I am able to find the assignments easily
The instructor clearly communicated his expectations for each assignment
The course's online learning management system features the professor used in delivering class material was easy to use
Because of how the instructor has utilized the online learning management system, I can efficiently access my online class material

14. Overall the online class made me feel- (Each adjective pair is rated on a seven-point scale: Strongly agree with the first choice, Agree with the first choice, Somewhat agree with the first choice, Neutral, Somewhat agree with the second choice, Agree with the second choice, Strongly agree with the second choice.)
Terrible / Delighted
Frustrated / Contented
Unhappy / Gratified
Sad / Joyful
Dissatisfied / Satisfied
Displeased / Pleased

15. Comments

Online Education Research

16. In thinking of my online class- (rated on the seven-point scale from Strongly Disagree to Strongly Agree)
I feel that my professor responds in a reasonable time to comments
I would like it if my professor responded more quickly to my comments

17. How quickly would you feel is a reasonable time for your professor to respond to your comments and/or questions?
Time Frame

Online Education Research

18. In thinking about this online class and your interaction with your classmates- (rated on the seven-point scale from Strongly Disagree to Strongly Agree)
When I posted a comment on our class site my classmates responded to me in a reasonable time
I enjoy interacting with my online classmates
I have shared personal information with my online classmates
My classmates share personal information about themselves
My classmates express their agreement or disagreement with what I post

19. Comments

Online Education Research

20. If I needed more help with my assignments I know where to go for help
o Yes
o No

21. Did you ever try to use tutoring services?
o Yes
o No

Online Education Research

22. Was it easy to get an appointment?
o Yes
o No

23. Was the session(s) helpful?
o Yes
o No
o Other

24. Comments

Online Education Research
25. Did you ever try to use the Writing Center service?
o Yes
o No

Online Education Research

26. Was it easy to get an appointment?
o Yes
o No

27. Was the session(s) helpful?
o Yes
o No
o Other

28. Comments

Online Education Research

29. Are you still in the class?
o Yes
o No

Online Education Research

30. Were you administratively (either by the professor or the system) dropped for any reason?
o Yes
o No

Online Education Research

31. Did you drop the class before starting work on the material?
o Yes
o No

Online Education Research

32. Reasons for dropping the class (Select as many as apply)
□ I was able to get into a face-to-face section
□ I didn't need the class
□ I was too busy
□ I didn't have the technical equipment I needed
□ I heard negative things about the class from other students
□ Other

33. Check the reason that most influenced your decision
o I was able to get into a face-to-face section
o I didn't need the class
o I was too busy
o I didn't have the technical equipment I needed
o I heard negative things about the class from other students
o I heard negative things about the professor from other students
o Other

Online Education Research

34. Did you drop the class before completing it?
o Yes
o No

Online Education Research

35. Reasons for dropping the class
□ I was having technical problems
□ I didn't understand how the course was organized
□ I couldn't keep up with the assignments
□ The professor didn't give good feedback
□ The other students were not very helpful
□ My schedule changed and I was too busy
□ Other

36. Select the reason that most influenced your decision
o I was having technical problems
o I didn't understand how the course was organized
o I couldn't keep up with the assignments
o The professor didn't answer my questions
o The professor didn't give good feedback
o The other students were not very helpful
o My schedule changed and I was too busy
o Other

Online Education Research

37. Any other comments you would like to offer about your online education experiences, either for this class or other online classes you have been in.

38. Your Zip Code

39. Are you the first person in your immediate family to attend college?
o Yes
o No
o Prefer not to answer

40. What is your approximate GPA?
o .00-.99
o 1.00-1.49
o 1.50-1.99
o 2.00-2.49
o 2.50-2.99
o 3.00-3.49
o 3.50-4.00

41. Gender
o Male
o Female
o Prefer not to answer

42. Ethnic/Racial Background
o American Indian or Alaska Native
o Native Hawaiian or other Pacific Islander
o Black or African American
o Asian
o White
o Hispanic, Latino
o Two or more races
o Other
o Prefer not to answer

43. Citizenship
o U.S. Citizen
o Permanent Resident
o Refugee
o Immigrant
o Political Asylum
o Other (examples: H4, TPS, B2)
o Prefer not to answer

44. Thank you so much for your participation in this research. Would you like to enter the drawing for a $25 gift certificate from Meijer?
o Yes
o No

45. Please enter your email. You will only be contacted if you win. Your email will not be shared or used for any purpose other than to distribute the winning gift certificates.

Thank you again for your help in this research.
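As a brief illustration of how responses collected with this instrument could be recoded for analysis, following the value mappings in the codebook that appears next, here is a minimal pandas sketch. The data values are hypothetical placeholders; CompCmft and UblLMSUse are variable IDs taken from the codebook, and the reverse-scoring of UblLMSUse mirrors the codebook's reverse-score validity check.

```python
# Minimal sketch of recoding raw survey exports to the codebook's numeric
# values. Responses shown are hypothetical placeholders, not actual data.
import pandas as pd

LIKERT = {
    "Strongly Disagree": 1, "Disagree": 2, "Somewhat Disagree": 3,
    "Neither Agree nor Disagree": 4, "Somewhat Agree": 5,
    "Agree": 6, "Strongly Agree": 7,
}

df = pd.DataFrame({
    "CompCmft": ["Agree", "Strongly Agree", "Somewhat Disagree"],
    "UblLMSUse": ["Disagree", "Agree", "Neither Agree nor Disagree"],
})

df["CompCmft"] = df["CompCmft"].map(LIKERT)
# UblLMSUse is negatively worded, so it is reverse-scored (8 - value),
# as the codebook's reverse-score validity check specifies.
df["UblLMSUse"] = 8 - df["UblLMSUse"].map(LIKERT)
print(df)
```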
APPENDIX B: COPY OF THE CODEBOOK

THREE THEORETICAL CONSTRUCTS EXAMINED: THEORETICAL FORCES THAT COULD AFFECT RETENTION IN ONLINE COLLEGE CLASSES

CODEBOOK

Ruth Shillair
Michigan State University

Presented to the faculty of Michigan State University, Communication Arts & Sciences, in partial fulfillment of the requirements for the Master's Degree in Telecommunications, Information Studies and Media.

Table 22 Code for Quantitative Analysis of Responses

Unless otherwise noted, agreement items use the seven-point scale: Strongly Disagree=01, Disagree=02, Somewhat Disagree=03, Neither Agree nor Disagree=04, Somewhat Agree=05, Agree=06, Strongly Agree=07.

Original question or value name | Variable ID | Values | Notes
RespondentID | RespondID | | Unique survey ID #
CollectorID | CollectID | | Irrelevant: survey overall ID
StartDate | StartDate | | Date survey was started
EndDate | EndDate | | Date survey was ended
IPAddress | IPAddress | | Irrelevant, but shows lack of duplication
Online Education Survey Consent Form | CnsntForm | Decline=00, Accept=01 |
Online class from the list below that I enrolled in this semester (if enrolled in more than one, select the last one enrolled in) | ClassName | WRIT 117=01, WRIT 121=02, WRIT 122=03, ENG 121=04, ENG 122=05, WRIT 131=06, WRIT 132=07 |
Have you ever enrolled in an online class before? | PrevEnroll | Yes=01, No=02 | Measuring Self-Efficacy with online classes (experience)
Have you ever completed an online class before? | PrevCompl | Yes=01, No=02 | Measuring Self-Efficacy with online classes (experience)
I feel comfortable when working with computers or a mobile device | CompCmft | 7-point agreement scale | Measuring Self-Efficacy with online classes (computing devices)
I feel I can access information that I need from computers | InfoAccss | 7-point agreement scale | Measuring Self-Efficacy with online classes (computing devices)
I feel I can finish my homework assignments by deadlines | HmwrkDdln | 7-point agreement scale | Measuring Self-Efficacy with Education
I feel I can get myself to do my assignments | SelfMotiv | 7-point agreement scale | Measuring Self-Efficacy with Education
I know where to go for help if I need more help with my assignments | SelfDirect | 7-point agreement scale | Measuring Self-Efficacy with Education
Did you stay enrolled long enough to log in to the class site? | LogIn | Yes=01, No=02 | Filtering question to assess if dropped before any interaction with class
What type of device(s) do you use to access your online class materials? (cumulative number) | DevNumber | One device=01, Two devices=02, Three devices=03, Four devices=04, Five devices=05 | Question allows multiple device answers to collect all devices used; the sum of devices used
What type of device(s) do you use to access your online class materials? | DevMobile | Yes=01, No=99 | If the student accesses class materials on a mobile phone
What type of device(s) do you use to access your online class materials? | DevTablet | Yes=01, No=99 | If the student accesses class materials on a tablet
What type of device(s) do you use to access your online class materials? | DevLaptop | Yes=01, No=99 | If the student accesses class materials on a laptop
What type of device(s) do you use to access your online class materials? | DevDesktp | Yes=01, No=99 | If the student accesses class materials on a desktop (home or office)
What type of device(s) do you use to access your online class materials? | DevMCCDktp | Yes=01, No=99 | If the student accesses class materials on a college computer
Did you have any trouble logging in to your online class? | TechLgIn | Yes=01, No=02 | Measuring Usability of Interface
Did you need any technical assistance in accessing your online class? | TechUse | Yes=01, No=02 | Measuring Use of Technical Assistance
How hard was it to get help? | TechEase | Very Difficult=01, Difficult=02, Somewhat Difficult=03, Neutral=04, Somewhat Easy=05, Easy=06, Very Easy=07, Not applicable=99 | Measuring Use of Technical Assistance
Was the person you contacted able to help resolve the problem? | TechHlpfnss | Yes=01, No=02, Not applicable=99 | Measuring Use of Technical Assistance
Comments | CmntTA | Not applicable=99; technical usability comment reflecting some struggle with use (somewhat difficult)=3 | Question to encourage any qualitative feedback about technical assistance; located within this section to encourage feedback about this specific topic
Learning how to access my class material was easy | UsblIntAccss | 7-point agreement scale | Measuring Usability of Interface
The course's online learning management system interface helped me to complete my class assignments successfully | UblLMSInt | 7-point agreement scale | Measuring Usability of Interface (Learning Management System, LMS)
Because of how the online learning management system works, I often struggle with accessing what I am supposed to do in the class | UblLMSUse | 7-point agreement scale, REVERSE SCORED (Strongly Disagree=07 through Strongly Agree=01) | Measuring Usability of Interface; validity check for all questions (checking for participants just checking the same box)
I was able to find the assignments easily | UblLMSOrg | 7-point agreement scale | Measuring Usability of Interface
The instructor utilized many features of the online learning management system | UblInsFeat | 7-point agreement scale | Measuring Usability of Instructor Utilization (pedagogical choices for online education)
The instructor's organization of the class material is effective | UblInsOrg | 7-point agreement scale | Measuring Usability of Instructor Utilization (pedagogical choices for online education)
The instructor clearly communicated his expectations for each assignment | UblInsComm | 7-point agreement scale | Measuring Usability of Instructor Utilization (pedagogical choices for online education)
The course's online learning management system features the professor used in delivering class material was easy to use | UblInsUtlz01 | 7-point agreement scale | Measuring Usability of Instructor Utilization (pedagogical choices for online education)
Because of how the instructor has utilized the online learning management system, I can efficiently access my online class material | UblInsUtlz02 | 7-point agreement scale | Measuring Usability of Instructor Utilization; VALIDITY CHECK with previous question to check whether participants are reading carefully (basically the same measure, worded slightly differently)
Overall the online class made me feel: Terrible/Delighted | UsbSat01 | Strongly agree with 1st choice=01, Agree with 1st choice=02, Somewhat agree with 1st choice=03, Neutral=04, Somewhat agree with the 2nd choice=05, Agree with the 2nd choice=06, Strongly agree with the 2nd choice=07 | Bipolar measures ("semantic differential adjective pairs") to measure satisfaction levels with the usability of the course; design from the Integrated Satisfaction Model (Coursaris et al., 2007, 2012; Ajzen & Fishbein, 1977)
Overall the online class made me feel: Frustrated/Contented | UsbSat02 | Same bipolar 01-07 scale | Same bipolar satisfaction measure (Coursaris et al., 2007, 2012; Ajzen & Fishbein, 1977)
Overall the online class made me feel: Unhappy/Gratified | UsbSat03 | Same bipolar 01-07 scale | Same bipolar satisfaction measure (Coursaris et al., 2007, 2012; Ajzen & Fishbein, 1977)
Overall the online class made me feel: Sad/Joyful | UsbSat04 | Same bipolar 01-07 scale | Same bipolar satisfaction measure
(Coursaris et al., 2007, 2012; Ajzen & Fishbein, 1977)
Overall the online class made me feel: Dissatisfied/Satisfied | UsbSat05 | Same bipolar 01-07 scale | Same bipolar satisfaction measure (Coursaris et al., 2007, 2012; Ajzen & Fishbein, 1977)
Overall the online class made me feel: Displeased/Pleased | UsbSat06 | Same bipolar 01-07 scale | Same bipolar satisfaction measure (Coursaris et al., 2007, 2012; Ajzen & Fishbein, 1977)
Comments | CmtUsbSat | The answers to this question were subdivided by how they related to the three constructs under evaluation | Question to encourage any qualitative feedback about usability and satisfaction levels
Comments after Usability/Satisfaction: Self-Efficacy | CmUsbSSatSE | Very insecure about the class=01, Insecure=02, Somewhat insecure=03, Neutral=04, Somewhat confident=05, Confident=06, Very confident about the class=07, No comment=99 | Comments that show self-confidence in the class and ability to self-direct, even if there is dissatisfaction with how either the professor or classmates are participating
Comments after Usability/Satisfaction: Usability of the LMS | CmUsbSSatUSSys | Very frustrated with the system=01, Frustrated=02, Somewhat frustrated=03, Neutral=04, Somewhat happy=05, Happy=06, Very happy with the LMS system=07, No comment=99 | Comments dealing with the usability of the system; some had very strong comments and deep frustration. A student could be unhappy with the LMS and yet happy with the usability that the instructor enabled: "My teacher is fantastic. She has kept everything extremely organized and communicates with fast responses. I have a windows 8 laptop, and have had a hard time with D2L being user friendly.
I bought a computer for this class and have to go use my parents' computer or a school computer due to the complications from windows 8."
Comments after Usability/Satisfaction: Usability per the instructor's organization | CmUsbSatUSIns | Very frustrated with the instructor's organization=01, Frustrated=02, Somewhat frustrated=03, Neutral=04, Somewhat happy=05, Happy=06, Very happy with the instructor's organization=07, No comment=99 | Comments expressing satisfaction or frustration with the professor's use of the system: "My biggest complaint has to do with D2L technical problems this semester. Students are at the mercy of the system to complete their assignments and get instruction. I have had moments of frustration. But overall, I believe online writing classes add wonderfully to the process of learning. My instructor is helpful, but miscommunications have occurred do to the online environment. I am benefiting greatly from the experience and will continue to take online classes in the future."
Comments after Usability/Satisfaction: Social Presence | CmUsbSSatSP | Very frustrated with the instructor's communications (social presence)=01, Frustrated=02, Somewhat frustrated=03, Neutral=04, Somewhat happy=05, Happy=06, Very happy with the instructor's communications (social presence)=07, No comment=99 | Note: all comments were directed towards either high satisfaction or high dissatisfaction with social presence and interaction with the instructor. There were no comments pertaining to the social presence of fellow students in this section.
I feel that my professor responds in a reasonable time to my comments | SPProfEml01 | 7-point agreement scale | Social presence and email response time satisfaction at current levels; level of immediacy with instructor
I would like it if my professor responded more quickly to my comments | SPProfEm02 | 7-point agreement scale | Social presence and email response time preferences; level of immediacy with instructor
How quickly would you feel is a reasonable time for your professor to respond to your comments or questions? | SPProfEm03 | Less than 6 hours=01, 6 hours=02, 12 hours=03, 18 hours=04, 1 day=05, 2 days=06, 3 days=07, More than 3 days=08 | Exploring student expectations for email response satisfaction levels with instructor
When I posted a comment on our class site my classmates responded to me in a reasonable time | SPStud01 | 7-point agreement scale | Social presence and levels of immediacy with classmates
I enjoy interacting with my online classmates | SPStud02 | 7-point agreement scale | Social presence and comfort of interaction with classmates
I have shared personal information with my online classmates | SPStud03 | 7-point agreement scale | Social presence with classmates: deep sharing
My classmates share personal information about themselves | SPStud04 | 7-point agreement scale | Social presence with classmates: deep sharing
My classmates express their agreement or disagreement with what I post | SPStud05 | 7-point agreement scale | Social presence with classmates
Comments | CmtSPCom | Very frustrated with the instructor's communications (social presence)=01, Frustrated=02, Somewhat frustrated=03, Neutral=04, Somewhat happy=05, Happy=06, Very happy with the instructor's communications (social presence)=07, No comment=99 | Question to encourage any qualitative feedback about social presence and communication issues. All the answers, except one, were directed at social presence and communication issues with the instructor. The one exception was neutral about communications, so it was not coded; there were not enough findings to create a new category, but the comment will be addressed in the findings. A typical comment was: "This class has been a real struggle what with my instructor not knowing how to use D2L and not responding when I have emailed her. Her average response time is about two weeks, which is usually after an assignment is due. This is very aggravating." This would show very frustrated with communications/social presence.
If I needed more help with my assignments I know where to go for help | SelfDirect02 | Yes=01, No=02 | Validity question for SelfDirect
Did you ever try to use tutoring services? | TSUse | Yes=01, No=02 | Dealing with institutional best practices and supplemental services
Was it easy to get an appointment? | TSEase | Yes=01, No=02, Not applicable=99 | Dealing with institutional best practices and supplemental services
Was the session(s) helpful? | TSHlpflnss | Yes=01, No=02, Other=03, Not applicable=99 | Dealing with institutional best practices and supplemental services
Comments | CmtTS | Very unhappy with their service=01, Unhappy=02, Somewhat unhappy=03, Neither happy nor unhappy=04, Somewhat happy=05, Happy=06, Very happy with their service=07, Haven't used the service yet / not applicable=99 | Question to encourage any qualitative feedback about tutoring services
Did you ever try to use the Writing Center services? | WCUse | Yes=01, No=02 | Dealing with institutional best practices and supplemental services
Was it easy to get an appointment? | WCEase | Yes=01, No=02, Not applicable=99 | Dealing with institutional best practices and supplemental services
Was the session(s) helpful? | WCHlpflnss | Yes=01, No=02, Other=03, Not applicable=99 | Dealing with institutional best practices and supplemental services
Comments | CmtWC | Very unhappy with their service=01, Unhappy=02, Somewhat unhappy=03, Neither happy nor unhappy=04, Somewhat happy=05, Happy=06, Very happy with their service=07, Haven't used the service yet / not applicable=99 | Question to encourage any qualitative feedback about the Writing Center
Are you still in the class? | StuRtn | Yes=01, No=02 | Retention check
Were you administratively (either by the professor or the system) dropped for any reason? | StuDrpAdmin | Yes=01, No=02, Not applicable=99 | Administrative drops: non-payment of tuition, loss of scholarship, non-attendance, other issues
Did you drop the class before starting work on the material? | StuDrpSelfBf | Yes=01, No=02, Not applicable=99 | Drops before starting the material at all, to find those who preferred to be in a different section for various reasons
Reasons for dropping the class (select as many as apply) | StuDrpBfRsnAll | I was able to get into a face-to-face section=01, I didn't need the class=02, I was too busy=03, I didn't have the technical equipment I needed=04, I heard negative things about the class from other students=05, I heard negative things about the professor from other students=06, Not applicable=99 | Drops before starting the material at all; to find all reasons
Check the reason that most influenced your decision | StuDrpBfRsnPr | I was able to get into a face-to-face section=01, I didn't need the class=02, I was too busy=03, I didn't have the technical equipment I needed=04, I heard negative things about the class from other students=05, I heard negative things about the professor from other students=06, Other=07, Not applicable=99 | Drops before starting the material at all; to find the primary reason
Did you drop the class before completing it? | StuDrpSlfDrng | Yes=01, No=02, Not applicable=99 | Drops during the class
Reasons for dropping the class | StuDrpDrngAll | I was having technical problems=01, I didn't understand how the course was organized=02, I couldn't keep up with the assignments=03, The professor didn't give good feedback=04, The other students were not very helpful=05, My schedule changed and I was too busy=06, Other=07, Not applicable=99 | Drops during the class; to find all reasons
Select the reason that most influenced your decision | StuDrpDrngPr | Same values as StuDrpDrngAll | Drops during the class; to find the primary reason
Any other comments you would like to offer about your online education experiences, either for this class or other online classes you have been in | CmnAll | Coded into the three constructs under consideration | Question to encourage any qualitative feedback about the online educational experience
(coded from CmnAll) | CmnAllSE | Very insecure about the class=01, Insecure=02, Somewhat insecure=03, Neutral=04, Somewhat confident=05, Confident=06, Very confident about the class=07, No comment=99 | Final comments relating to a student's sense of self-efficacy in online classes
(coded from CmnAll) | CmnAllUS | Very frustrated with the instructor's organization=01, Frustrated=02, Somewhat frustrated=03, Neutral=04, Somewhat happy=05, Happy=06, Very happy with the instructor's organization=07, No comment=99 | Final comments relating to the usability of the class due to the instructor's utilization
(coded from CmnAll) | CmnAllSPStu | Very frustrated with fellow students' communications (social presence)=01, Frustrated=02, Somewhat frustrated=03, Neutral=04, Somewhat happy=05, Happy=06, Very happy with fellow students' communications (social presence)=07, No comment=99 | Final comments relating to the social presence/communication with other students
(coded from CmnAll) | CmnAllSpIns | Very frustrated with the instructor's communications (social presence)=01, Frustrated=02, Somewhat frustrated=03, Neutral=04, Somewhat happy=05, Happy=06, Very happy with the instructor's communications (social presence)=07, No comment=99 | Final comments relating to the social presence/communications of the instructor. Example: "I like online classes but I think professors should be held to a timeframe just as I am. If I post an assignment I would like to get feedback within a week rather than whenever they get to it or not at all." This would express strong dissatisfaction with social interaction at the same time as strong self-efficacy in online classes.
Your Zip Code | Zip | Prefer not to answer=99 | Demographics: financial
Are you the first person in your immediate family to attend college? | DemFstPrsn | Yes=01, No=02, Prefer not to answer=99 | Demographics: background
What is your approximate GPA? | DemGPA | .00-.99=01, 1.00-1.49=02, 1.50-1.99=03, 2.00-2.49=04, 2.50-2.99=05, 3.00-3.49=06, 3.50-4.00=07, Prefer not to answer=99 | Demographics: GPA
Gender | DemGen | Male=01, Female=02, Prefer not to answer=99 | Demographics: gender
Ethnic/Racial Background | DemRacial | American Indian or Alaska Native=01, Native Hawaiian or Other Pacific Islander=02, Black or African American=03, Asian=04, White=05, Hispanic, Latino=06, Two or more races=07, Other=08, Prefer not to answer=99 | Demographics: racial
Citizenship | DemCitz | U.S. Citizen=01, Permanent Resident=02, Refugee=03, Immigrant=04, Political Asylum=05, Other (examples: H4, TPS, B2)=06, Prefer not to answer=99 | Demographics: citizenship (ESOL)
Thank you so much for your participation in this research. Would you like to enter the drawing for a $25 gift certificate from Meijer? | DemCert | Yes=01, No=02 | Demographics: motivation for participation

WORKS CITED

Acemoglu, D., & Autor, D. (2010). Skills, tasks and technologies: Implications for employment and earnings. National Bureau of Economic Research, Working Paper No. 16082. Retrieved from http://www.nber.org/papers/w16082

Achilles, W., Byrd, K., Felder-Strauss, J., Franklin, P., & Janowich, J. (2011). Engaging students through communication and contact: Outreach can positively impact your students and you. MERLOT Journal of Online Learning and Teaching, 7(1), 128–133.

Ajzen, I., & Fishbein, M. (1977). Attitude-behavior relations: A theoretical analysis and review of empirical research. Psychological Bulletin, 84(5), 888–918.

Allen, E., & Seaman, J. (2011). Going the distance: Online education in the United States, 2011. Babson Survey Research Group. Retrieved from http://sloanconsortium.org/publications/survey/going_distance_2011

Archibald, R. B., & Feldman, D. H. (2010). Why Does College Cost So Much? New York, NY: Oxford University Press.

Artino, A. R. (2009). Online learning: Are subjective perceptions of instructional context related to academic success? The Internet and Higher Education, 12(3-4), 117–125. doi:10.1016/j.iheduc.2009.07.003

Auguste, B. G., Cota, A., Jayaram, K., & Laboissiere, M. C. A. (2010). Winning by degrees: The strategies of highly productive higher-education institutions (p. 66). New York, NY.
Retrieved from http://mckinseyonsociety.com/downloads/reports/Education/Winning by degrees report fullreport v5.pdf

Bailey, C. J., & Card, K. A. (2009). Effective pedagogical practices for online teaching: Perception of experienced instructors. The Internet and Higher Education, 12(3-4), 152–155. doi:10.1016/j.iheduc.2009.08.002

Bandura, A. (2006). Guide for constructing self-efficacy scales. In Self-Efficacy Beliefs of Adolescents (pp. 307–337). Information Age Publishing.

Bean, J. P., & Metzner, B. S. (1985). A conceptual model of nontraditional undergraduate student attrition. Review of Educational Research, 55(4), 485–540. Retrieved from http://www.jstor.org/stable/1170245

Beerkens, E. (2003). Globalisation and higher education research. Journal of Studies in International Education, 7(2), 193–206. doi:10.1177/1028315303251395

Bergeron, J. (2012). Integrating adjunct faculty into our student success initiative. Lansing Community College: Strategic Planning Process. Retrieved from http://web.lcc.edu/strategy/2012/01/16/integrating-adjunct-faculty-into-our-student-successinitiative/

Bramble, N., & Lu, Y. J. (2011). Accreditation of tuition-free online universities and peer-to-peer postsecondary learning communities: A framework for protecting students and expanding access to higher education. Yale Information Society Project Working Paper Series.

Braxton, J. M., Milem, J. F., & Sullivan, A. S. (2000). The influence of active learning on the college student departure process. The Journal of Higher Education, 71(5), 570–590.

Brinthaupt, T. M., Fisher, L. S., Gardner, J. G., Raffo, D. M., & Woodard, J. B. (2011). What the best online teachers should do. MERLOT Journal of Online Learning and Teaching, 7(4), 515–524.

Brown, R. (2011, July 18). Community-college students perform worse online than face to face. The Chronicle of Higher Education. Retrieved from http://chronicle.com/article/Community-College-Students/128281/

Cabrera, A. F., Castaneda, M. B., Nora, A., & Hengstler, D. (1992). The convergence between two theories of college persistence. The Journal of Higher Education, 63(2), 143.

Caudill, J. G. (2007). Mobile computing: Parallel developments. International Review of Research in Open and Distance Learning, 8(2), 13. Retrieved from http://works.bepress.com/cgi/viewcontent.cgi?article=1003&context=jason_caudill

Clay, R. (2012). Diversity at community colleges. Monitor on Psychology, 43(8), 38. Retrieved from http://www.apa.org/monitor/2012/09/diversity.aspx

Coates, H., James, R., & Baldwin, G. (2005). A critical examination of the effects of learning management systems on university teaching and learning. Tertiary Education and Management, 11(1), 19–36. doi:10.1080/13583883.2005.9967137

Corey, K., Keller, J., O'Leary, B., Richards, A., Coddington, R., & Smallwood, S. (2012). College completion: Who graduates from college, who doesn't, and why it matters. The Chronicle of Higher Education. Retrieved from http://collegecompletion.chronicle.com/college-stats/

Coursaris, C. K., Hassanein, K., Head, M. M., & Bontis, N. (2007). The impact of distraction on the usability and the adoption of mobile devices for wireless data services. Proceedings of the European Conference on Information Systems, HCI Track, St. Gallen, Switzerland.

Coursaris, C. K., Hassanein, K., Head, M. M., & Bontis, N. (2012). The impact of distractions on the usability and intention to use mobile devices for wireless data services. Computers in Human Behavior, 28(4), 1439–1449.

Coursaris, C. K., & van Osch, W. (2012).
An integrated model of user satisfaction (IMUS): Disrupting the dichotomy of utilitarian versus hedonic system performance. Manuscript submitted for publication.

Coursaris, C. K., & Kripintris, K. (2012). Web aesthetics and usability: An empirical study on the effects of white space. International Journal of E-Business Research, 8(1), 35–53.

Crim, S. J. (2006). An examination of social presence in an online learning environment. University of Louisville.

Crosling, G., Thomas, L., & Heagney, M. (Eds.) (2008). Improving student retention in higher education: The role of teaching and learning. New York: Routledge.

Dabbagh, N., & Kitsantas, A. (2012). Personal Learning Environments, social media, and self-regulated learning: A natural formula for connecting formal and informal learning. The Internet and Higher Education, 15(1), 3–8. doi:10.1016/j.iheduc.2011.06.002

Daniels, L. M., & Stupnisky, R. H. (2012). Not that different in theory: Discussing the control-value theory of emotions in online learning environments. The Internet and Higher Education, 15(3), 222–226. doi:10.1016/j.iheduc.2012.04.002

Fisk, D. M. (2003). American labor in the 20th century. U.S. Bureau of Labor Statistics, Compensation and Working Conditions Online (pp. 1–9).

Fortune, M. F., Spielman, M., & Pangelinan, D. T. (2011). Students' perceptions of online or face-to-face learning and social media in hospitality, recreation and tourism. MERLOT Journal of Online Learning and Teaching, 7(1), 1–16.

Gerard, L. F., Varma, K., Corliss, S. B., & Linn, M. C. (2011). Professional development for technology-enhanced inquiry science. Review of Educational Research, 81(3), 408–448. doi:10.3102/0034654311415121

Gibbons, H. S., & Wentworth, G. P. (2001). Andragogical and pedagogical training differences for online instructors. Online Journal of Distance Learning Administration, 4(3), 1–5. Retrieved from http://www.westga.edu/~distance/ojdla/fall43/gibbons_wentworth43.html

Groves, R. M., Fowler, F. J., Couper, M. P., Lepkowski, J. M., Singer, E., & Tourangeau, R. (2009). Survey Methodology. Hoboken, NJ: Wiley & Sons.

Gunawardena, C. N., & Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer-mediated conferencing environment. American Journal of Distance Education, 11(3), 8–26. Retrieved from http://dx.doi.org/10.1080/08923649709526970

Hart, C. (2012). Factors associated with student persistence in an online program of study: A review of the literature. Journal of Interactive Online Learning, 11(1).

Hawkins, B. (2012, July 12). Prepare now for Minnesota's future higher-ed needs, Itasca report advises. MinnPost.
Herbert, M. (2006). Staying the Course: A Study in Online Student Satisfaction and Retention. Online Journal of Distance Learning Administration, 9(5). Retrieved from http://fsweb.bainbridge.edu/qep/Files/TeachingRes/Staying the Course.pdf

Hodges, C. B. (2008). Self-Efficacy in the Context of Online Learning Environments: A Review of the Literature and Directions for Research. Performance Improvement Quarterly, 20(3/4), 7.

Huckabee, S. B. (2010). Environmental and psychological factors contributing to student achievement in a high school online mediated credit recovery program (Doctoral dissertation). Gardner-Webb University.

Jamrisko, M., & Kolet, I. (2012, Aug. 15). Cost of college degree in U.S. soars 12 fold: Chart of the day. Bloomberg. Retrieved from http://www.bloomberg.com/news/2012-08-15/cost-of-college-degree-in-u-s-soars-12-fold-chart-of-the-day.html

Johnson, E. J., & Card, K. (2007). The Effects of Instructor and Student Immediacy Behaviors in Writing Improvement and Course Satisfaction in a Web-based Undergraduate Course. MountainRise, the International Journal of the Scholarship of Teaching and Learning.

Johnson, S. G., & Berge, Z. (2012). Online education in the community college. Community College Journal of Research and Practice, 36(11), 897–902. doi:10.1080/10668920903323948

Joo, S., & Lee, J. Y. (2011). Measuring the usability of academic digital libraries: Instrument development and validation. The Electronic Library, 29(4), 523–537. doi:10.1108/02640471111156777

Ke, F. (2013). Online interaction arrangements on quality of online interactions performed by diverse learners across disciplines. The Internet and Higher Education, 16, 14–22. doi:10.1016/j.iheduc.2012.07.003

Kenney, J., Hermens, A., & Clarke, T. (2004). The political economy of e-learning educational development: Strategies, standardization and scalability. Education & Training, 46(6/7), 370.

Kiesler, S., Siegel, J., & McGuire, T. W. (1984). Social psychological aspects of computer-mediated communication. American Psychologist, 39(10), 1123–1134. doi:10.1037/0003-066X.39.10.1123

Leyton-Brown, K. (2012). Game Theory Introduction [Video file]. Retrieved from https://class.coursera.org/gametheory-2012-002/class/index

Luckerson, V. (2013, Jan. 10). The Myth of the Four-Year College Degree. Time Business & Money: Educational Financing. Retrieved from http://business.time.com/2013/01/10/the-myth-of-the-4-year-college-degree/?iid=obinsite

Mancuso, S. M. (2008). A qualitative study of the barriers to participation in Web-based environments among learners at the community college level. Capella University.

McDonald, J. (1995, Feb. 18). Colleges Cancel Dozens of Courses because of Low Enrollment. Los Angeles Times. Retrieved from http://articles.latimes.com/1995-02-18/local/me-33324_1_ventura-college

McKerlich, R., Riis, M., Anderson, T., & Eastman, B. (2011). Student Perceptions of Teaching Presence, Social Presence, and Cognitive Presence in a Virtual World. MERLOT Journal of Online Learning and Teaching, 7(3), 324–336.

Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. U.S. Department of Education, Office of Planning, Evaluation, and Policy Development, Policy and Program Studies Service. Retrieved from http://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf
Michigan Department of Treasury. (2012). Lansing Community College: Michigan postsecondary handbook profile page. Retrieved from http://www.michigan.gov/documents/Lansing_Community_College_151090_7.pdf

Moxley, D., Najor-Durack, A., & Dumbrigue, C. (2001). Keeping students in higher education: Successful practices & strategies for retention. Sterling, VA: Kogan Page.

Muse, H. E. (2003). The Web-based community college student: An examination of factors that lead to success and risk. The Internet and Higher Education, 6(3), 241–261. doi:10.1016/S1096-7516(03)00044-7

Nakajima, M. A., Dembo, M. H., & Mossler, R. (2012). Student Persistence in Community Colleges. Community College Journal of Research and Practice, 36(8), 591–613. doi:10.1080/10668920903054931

National Center for Education Statistics. (2007). College & Career Tables Library. Institute of Education Sciences. Retrieved from http://nces.ed.gov/datalab/tableslibrary/viewtable.aspx?tableid=4540

National Center for Education Statistics. (2011). Projections of Education Statistics to 2021: Table 33. Institute of Education Sciences. Retrieved from http://nces.ed.gov/programs/projections/projections2021/tables/table_33.asp

Ng, C. S. L., Cheung, W. S., & Hew, K. F. (2012). Interaction in asynchronous discussion forums: Peer facilitation techniques. Journal of Computer Assisted Learning, 28(3), 280–294. doi:10.1111/j.1365-2729.2011.00454.x

Nielsen, J. (1994). Enhancing the explanatory power of usability heuristics. Conference Companion on Human Factors in Computing Systems – CHI '94, 210. doi:10.1145/259963.260333

Nielsen, J. (2010). College students on the web. Nielsen Norman Group: Evidence-Based User Experience Research, Training, and Consulting. Retrieved from http://www.nngroup.com/articles/college-students-on-the-web/

Olson, G. M., & Olson, J. S. (2003). Human-computer interaction: Psychological aspects of the human use of computing. Annual Review of Psychology, 54, 491–516. doi:10.1146/annurev.psych.54.101601.145044

Osborn, V. (2000). Identifying at-risk students: An assessment instrument for distributed learning courses in higher education. University of North Texas. Retrieved from http://search.proquest.com.proxy1.cl.msu.edu/docview/304612885/fulltextPDF/139C18D7D012E149321/1?accountid=12598

Parker, A. (1999). A Study of Variables that Predict Dropout from Distance Education. International Journal of Educational Technology, 1(2), 1–10.

Patterson, B., & McFadden, C. (2012). Attrition in Online and Campus Degree Programs. Online Journal of Distance Learning Administration, 1–9. Retrieved from http://www.westga.edu/~distance/ojdla/summer122/patterson112.html

Prensky, M. (2001). Digital Natives, Digital Immigrants Part 1. On the Horizon, 9(5), 1–6. doi:10.1108/10748120110424816

Pretorius, M., & van Biljon, J. (2010). Learning management systems: ICT skills, usability and learnability. Interactive Technology and Smart Education, 7(1), 30–43. doi:10.1108/17415651011031635

Ray, J. (2009). Faculty Perspective: Training and Course Development for the Online Classroom. Journal of Online Learning and Teaching, 5(2).

Roberts, G. (2005). Technology and learning expectations of the Net generation. In D. Oblinger & J. Oblinger (Eds.), Educating the Net Generation (pp. 32–38). Educause.
Russell, V., & Curtis, W. (2012). Comparing a large- and small-scale online language course: An examination of teacher and learner perceptions. The Internet and Higher Education, 16, 1–13. doi:10.1016/j.iheduc.2012.07.002

Sadeh, N. (2002). M-Commerce: Technologies, Services and Business Models. Canada and USA: John Wiley and Sons.

Salajan, F. D., Schönwetter, D. J., & Cleghorn, B. M. (2010). Student and faculty intergenerational digital divide: Fact or fiction? Computers & Education, 55(3), 1393–1403. doi:10.1016/j.compedu.2010.06.017

Saltmarsh, S., & Sutherland-Smith, W. (2010). S(t)imulating learning: Pedagogy, subjectivity and teacher education in online environments. London Review of Education, 8(1), 15–24. doi:10.1080/14748460903557613

Schneider, M., & Yin, L. M. (2012, April 3). Completion matters: The high cost of low community college graduation rates. American Enterprise Institute. Retrieved from http://www.aei.org/outlook/education/higher-education/community-colleges/completion-matters-the-high-cost-of-community-college-graduation-rates/

Shachar, M., & Neumann, Y. (2010). Twenty years of research on the academic performance differences between traditional and distance learning: Summative meta-analysis and trend examination. Journal of Online Learning and Teaching, 6(2).

Shannon, C. E., & Weaver, W. (1949). The mathematical theory of communication. Champaign, IL: University of Illinois Press. Retrieved from http://ezproxy.msu.edu/login?url=http://search.proquest.com/docview/615198423?accountid=12598

Shrank, P. (2009). Usability Issues That Impact Online Learning. Faculty Focus: Focused on Today's Higher Education Professional. Retrieved from http://www.facultyfocus.com/articles/instructional-design/usability-issues-that-impact-online-learning/

Song, G. (2005). Transcending e-Government: A case of mobile government in Beijing. The First European Conference on Mobile Government (pp. 1–9). Brighton, England.

Spreng, R. A., MacKenzie, S. B., & Olshavsky, R. W. (1996). A reexamination of the determinants of consumer satisfaction. Journal of Marketing, 60(3), 240–253.

State and local higher education funding hits 25-year low. (2012). National Association of Student Financial Aid Administrators. Retrieved from http://www.nasfaa.org/research/News/State_and_Local_Higher_Education_Funding_Hits_25-Year_Low.aspx

Steinbronn, P. E. (2007). Faculty perceived utility of methods and instructional strategies used in online and traditional teaching environments. Drake University.

Sundar, S. S., & Marathe, S. S. (2010). Personalization versus Customization: The Importance of Agency, Privacy, and Power Usage. Human Communication Research, 36(3), 298–322. doi:10.1111/j.1468-2958.2010.01377.x

Swan, K., Matthews, D., Bogle, L., Boles, E., & Day, S. (2012). Linking online course design and implementation to learning outcomes: A design experiment. The Internet and Higher Education, 15(2), 81–88. doi:10.1016/j.iheduc.2011.07.002

Taylor, P., Parker, K., Lenhart, A., & Patten, E. (2011, August 28). The digital revolution and higher education: College presidents, public differ on value of online learning. Pew Internet & American Life Project: Social & Demographic Trends. Retrieved from http://www.pewsocialtrends.org/files/2011/08/online-learning.pdf

The White House. (2012). Higher education: Knowledge and skills for the jobs of the future. Retrieved from http://www.whitehouse.gov/issues/education/higher-education
The White House: Office of Management and Budget. (2010). A new era of responsibility: Renewing America's promise. U.S. Department of Education 2010 Budget (p. 2).

Thomas, L., Cooper, M., & Quinn, J. (Eds.). (2003). Improving completion rates among disadvantaged students. Stoke on Trent, UK: Trentham Books.

Tutty, J., & Ratliff, J. (2012). Techniques for Improving Online Community College Completion Rates: Narrow the Path? Community College Journal of Research and Practice, 36(11), 916–920. doi:10.1080/10668926.2012.692300

U.S. Department of Education. (2012). Enrollment. Institute of Education Sciences: National Center for Education Statistics. Retrieved from http://nces.ed.gov/fastfacts/display.asp?id=98

Van Deursen, A., & Van Dijk, J. (2010). Internet skills and the digital divide. New Media & Society, 13(6), 893–911. doi:10.1177/1461444810386774

Vedder, R. (2012). 12 Reasons College Costs Keep Rising. The Fiscal Times. Retrieved from http://www.thefiscaltimes.com/Articles/2012/06/18/12-Reasons-College-Costs-Keep-Rising.asp

Wang, A. Y., & Newlin, M. H. (2002). Predictors of web-student performance: The role of self-efficacy and reasons for taking an on-line class. Computers in Human Behavior, 18(2), 151–163. doi:10.1016/S0747-5632(01)00042-5

Watson, W. R., & Watson, S. L. (2007). An Argument for Clarity: What are Learning Management Systems, What are They Not, and What Should They Become? TechTrends, 51(2), 28–34.

Welsh, J. B. (2007). Identifying factors that predict student success in a community college online distance learning course. University of North Texas. Retrieved from http://search.proquest.com.proxy1.cl.msu.edu/docview/304833843/139C18D7D012E149321/3?accountid=12598

Willging, P. A., & Johnson, S. D. (2004). Factors that influence students' decision to dropout of online courses. Journal of Asynchronous Learning Networks, 8(4), 105–118.

Yi, M. Y., & Im, K. S. (2004). Predicting Computer Task Performance. Journal of Organizational and End User Computing, 16(2), 20–37. doi:10.4018/joeuc.2004040102