Published: 25 January 2021

Online education in the post-COVID era

Barbara B. Lockee

Nature Electronics volume 4, pages 5–6 (2021)


The coronavirus pandemic has forced students and educators across all levels of education to rapidly adapt to online learning. The impact of this — and the developments required to make it work — could permanently change how education is delivered.

The COVID-19 pandemic has forced the world into the ubiquitous use of virtual learning. And while online and distance learning has been used before to maintain continuity in education, such as in the aftermath of earthquakes1, the scale of the current crisis is unprecedented. Speculation has now also begun about what its lasting effects will be and what education may look like in the post-COVID era. For some, an immediate retreat to the traditions of the physical classroom is required. But for others, the forced shift to online education is a moment of change and a time to reimagine how education could be delivered2.


Looking back

Online education has traditionally been viewed as an alternative pathway, one particularly well suited to adult learners seeking higher education opportunities. The emergence of the COVID-19 pandemic, however, has required educators and students across all levels of education to adapt quickly to virtual courses. (The term ‘emergency remote teaching’ was coined in the early stages of the pandemic to describe the temporary nature of this transition3.) In some cases, instruction shifted online, then returned to the physical classroom, and then shifted back online due to further surges in the rate of infection. In other cases, instruction was offered using a combination of remote delivery and face-to-face teaching: that is, students could attend online or in person (referred to as the HyFlex model4). In either case, instructors had to figure out how to make it work, considering the affordances and constraints of the specific learning environment to create learning experiences that were both feasible and effective.

The use of varied delivery modes does, in fact, have a long history in education. Mechanical (and later electronic) teaching machines have provided individualized learning programmes since the 1950s and the work of B. F. Skinner5, who proposed using technology to walk individual learners through carefully designed sequences of instruction with immediate feedback indicating the accuracy of their response. Skinner’s notions formed the first formalized representations of programmed learning, or ‘designed’ learning experiences. Then, in the 1960s, Fred Keller developed a personalized system of instruction6, in which students first read assigned course materials on their own, followed by one-on-one assessment sessions with a tutor, gaining permission to move ahead only after demonstrating mastery of the instructional material. Occasional class meetings were held to discuss concepts, answer questions and provide opportunities for social interaction. The personalized system of instruction was designed on the premise that initial engagement with content could be done independently, then discussed and applied in the social context of a classroom.

These predecessors to contemporary online education leveraged key principles of instructional design — the systematic process of applying psychological principles of human learning to the creation of effective instructional solutions — to consider which methods (and their corresponding learning environments) would effectively engage students to attain the targeted learning outcomes. In other words, they considered what choices about the planning and implementation of the learning experience can lead to student success. Such early educational innovations laid the groundwork for contemporary virtual learning, which itself incorporates a variety of instructional approaches and combinations of delivery modes.

Online learning and the pandemic

Fast forward to 2020, and various further educational innovations have made the universal adoption of remote learning a possibility. One key challenge is access. Here, extensive problems remain, including the lack of Internet connectivity in some locations, especially rural ones, and competing needs among family members for the use of home technology. However, creative solutions have emerged to provide students and families with the facilities and resources needed to engage in and successfully complete coursework7. For example, school buses have been used to provide mobile hotspots, class packets have been sent by mail, and instructional presentations have been aired on local public broadcasting stations. The year 2020 has also seen increased availability and adoption of electronic resources and activities that can now be integrated into online learning experiences. Synchronous online conferencing systems, such as Zoom and Google Meet, have allowed experts from anywhere in the world to join online classrooms8 and have allowed presentations to be recorded for individual learners to watch at a time most convenient for them. Furthermore, the importance of hands-on, experiential learning has led to innovations such as virtual field trips and virtual labs9. A capacity to serve learners of all ages has thus now been established, and the next generation of online education can move from an enterprise that largely serves adult learners and higher education to one that increasingly serves younger learners in primary and secondary education (ages 5 to 18).

The COVID-19 pandemic is also likely to have a lasting effect on lesson design. The constraints of the pandemic provided an opportunity for educators to consider new strategies to teach targeted concepts. Though this rethinking of instructional approaches was forced and hurried, the experience has served as a rare chance to reconsider the strategies that best facilitate learning within the affordances and constraints of the online context. In particular, greater variance in teaching and learning activities will continue to call into question the use of ‘seat time’ as the standard on which educational credits are based10: lengthy Zoom sessions are seldom instructionally necessary and are not aligned with the psychological principles of how humans learn. Interaction is important for learning, but forced interaction among students for the sake of interaction is neither motivating nor beneficial.

While the blurring of the lines between traditional and distance education has been noted for several decades11, the pandemic has quickly advanced the erasure of these boundaries. Less single-mode and more multi-mode delivery (and thus more educator choice) is becoming the norm, due to enhanced infrastructure and developed skill sets that allow people to move across different delivery systems12. The well-established best practices of hybrid or blended teaching and learning13 have served as a guide for the new combinations of instructional delivery that have developed in response to the shift to virtual learning. The use of multiple delivery modes is likely to remain, and will be a feature employed with learners of all ages14,15. Future iterations of online education will no longer be bound to the traditions of single teaching modes, as educators can support pedagogical approaches from a menu of instructional delivery options, a mix that has been supported by previous generations of online educators16.

Also significant are the changes to how learning outcomes are determined in online settings. Many educators have altered the ways in which student achievement is measured, eliminating assignments and changing assessment strategies altogether17. Such alterations include determining learning through strategies that leverage the online delivery mode, such as interactive discussions, student-led teaching and the use of games to increase motivation and attention. Specific changes that are likely to continue include flexible or extended deadlines for assignment completion18, more student choice regarding measures of learning, and more authentic experiences that involve the meaningful application of newly learned skills and knowledge19; for example, team-based projects that involve multiple creative and social media tools in support of collaborative problem solving.

In response to the COVID-19 pandemic, technological and administrative systems for implementing online learning, and the infrastructure that supports its access and delivery, had to adapt quickly. While access remains a significant issue for many, extensive resources have been allocated and processes developed to connect learners with course activities and materials, to facilitate communication between instructors and students, and to manage the administration of online learning. Paths for greater access and opportunities to online education have now been forged, and there is a clear route for the next generation of adopters of online education.

Before the pandemic, the primary purpose of distance and online education was providing access to instruction for those otherwise unable to participate in a traditional, place-based academic programme. As its purpose has shifted to supporting continuity of instruction, its audience, as well as the wider learning ecosystem, has changed. It will be interesting to see which aspects of emergency remote teaching remain in the next generation of education, when the threat of COVID-19 is no longer a factor. But online education will undoubtedly find new audiences. And the flexibility and learning possibilities that have emerged from necessity are likely to shift the expectations of students and educators, diminishing further the line between classroom-based instruction and virtual learning.

References

1. Mackey, J., Gilmore, F., Dabner, N., Breeze, D. & Buckley, P. J. Online Learn. Teach. 8, 35–48 (2012).

2. Sands, T. & Shushok, F. The COVID-19 higher education shove. Educause Review https://go.nature.com/3o2vHbX (16 October 2020).

3. Hodges, C., Moore, S., Lockee, B., Trust, T. & Bond, M. A. The difference between emergency remote teaching and online learning. Educause Review https://go.nature.com/38084Lh (27 March 2020).

4. Beatty, B. J. (ed.) Hybrid-Flexible Course Design Ch. 1.4 https://go.nature.com/3o6Sjb2 (EdTech Books, 2019).

5. Skinner, B. F. Science 128, 969–977 (1958).

6. Keller, F. S. J. Appl. Behav. Anal. 1, 79–89 (1968).

7. Darling-Hammond, L. et al. Restarting and Reinventing School: Learning in the Time of COVID and Beyond (Learning Policy Institute, 2020).

8. Fulton, C. Information Learn. Sci. 121, 579–585 (2020).

9. Pennisi, E. Science 369, 239–240 (2020).

10. Silva, E. & White, T. Change: The Magazine of Higher Learning 47, 68–72 (2015).

11. McIsaac, M. S. & Gunawardena, C. N. in Handbook of Research for Educational Communications and Technology (ed. Jonassen, D. H.) Ch. 13 (Simon & Schuster Macmillan, 1996).

12. Irvine, V. The landscape of merging modalities. Educause Review https://go.nature.com/2MjiBc9 (26 October 2020).

13. Stein, J. & Graham, C. Essentials for Blended Learning Ch. 1 (Routledge, 2020).

14. Maloy, R. W., Trust, T. & Edwards, S. A. Variety is the spice of remote learning. Medium https://go.nature.com/34Y1NxI (24 August 2020).

15. Lockee, B. J. Appl. Instructional Des. https://go.nature.com/3b0ddoC (2020).

16. Dunlap, J. & Lowenthal, P. Open Praxis 10, 79–89 (2018).

17. Johnson, N., Veletsianos, G. & Seaman, J. Online Learn. 24, 6–21 (2020).

18. Vaughan, N. D., Cleveland-Innes, M. & Garrison, D. R. Assessment in Teaching in Blended Learning Environments: Creating and Sustaining Communities of Inquiry (Athabasca Univ. Press, 2013).

19. Conrad, D. & Openo, J. Assessment Strategies for Online Learning: Engagement and Authenticity (Athabasca Univ. Press, 2018).


Author information

Authors and affiliations

School of Education, Virginia Tech, Blacksburg, VA, USA

Barbara B. Lockee


Corresponding author

Correspondence to Barbara B. Lockee.

Ethics declarations

Competing interests

The author declares no competing interests.


About this article

Lockee, B. B. Online education in the post-COVID era. Nat Electron 4, 5–6 (2021). https://doi.org/10.1038/s41928-020-00534-0





  • Research article
  • Open access
  • Published: 02 December 2020

Integrating students’ perspectives about online learning: a hierarchy of factors

  • Montgomery Van Wart 1 ,
  • Anna Ni 1 ,
  • Pamela Medina 1 ,
  • Jesus Canelon 1 ,
  • Melika Kordrostami 1 ,
  • Jing Zhang 1 &

International Journal of Educational Technology in Higher Education volume 17, Article number: 53 (2020)


This article reports on a large-scale (n = 987) exploratory factor analysis study incorporating various concepts identified in the literature as critical success factors for online learning from the students’ perspective, and then determines their hierarchical significance. Seven factors were identified as significant and reliable: Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Online Social Comfort, Online Interactive Modality, and Social Presence. Regression analysis indicates that the minimal factors for enrollment in future classes, when students consider convenience and scheduling, were Basic Online Modality, Cognitive Presence, and Online Social Comfort. Students who accepted or embraced online courses on their own merits wanted a minimum of Basic Online Modality, Teaching Presence, Cognitive Presence, Online Social Comfort, and Social Presence. Students who preferred face-to-face classes and demanded a comparable experience valued Online Interactive Modality and Instructional Support more highly. Recommendations for online course design, policy, and future research are provided.

Introduction

While there are different perspectives of the learning process, such as learning achievement and faculty perspectives, students’ perspectives are especially critical since students are ultimately the raison d’être of the educational endeavor (Chickering & Gamson, 1987). More pragmatically, students’ perspectives provide invaluable, first-hand insights into their experiences and expectations (Dawson et al., 2019). The student perspective is especially important when new teaching approaches are used and when new technologies are being introduced (Arthur, 2009; Crews & Butterfield, 2014; Van Wart, Ni, Ready, Shayo, & Court, 2020). With the renewed interest in “active” education in general (Arruabarrena, Sánchez, Blanco, et al., 2019; Kay, MacDonald, & DiGiuseppe, 2019; Nouri, 2016; Vlachopoulos & Makri, 2017) and the flipped classroom approach in particular (Flores, del-Arco, & Silva, 2016; Gong, Yang, & Cai, 2020; Lundin et al., 2018; Maycock, 2019; McGivney-Burelle, 2013; O’Flaherty & Phillips, 2015; Tucker, 2012), along with extraordinary shifts in technology, the student perspective on online education is profoundly important. Students’ perceptions of quality integrate their own sense of learning achievement, satisfaction with the support they receive, technical proficiency of the process, intellectual and emotional stimulation, comfort with the process, and sense of learning community. Which factors students perceive as constituting quality online teaching, however, has not been as clear as it might be, for at least two reasons.

First, it is important to note that the overall online learning experience for students is also composed of non-teaching factors, which we briefly mention. Three such factors are (1) convenience, (2) learner characteristics and readiness, and (3) antecedent conditions that may foster teaching quality but are not directly responsible for it. (1) Convenience is an enormous non-quality factor for students (Artino, 2010), one that has driven up online demand around the world (Fidalgo, Thormann, Kulyk, et al., 2020; Inside Higher Education and Gallup, 2019; Legon & Garrett, 2019; Ortagus, 2017). This is important since satisfaction with online classes is frequently somewhat lower than with face-to-face classes (Macon, 2011). However, the literature generally supports the relative equivalence of face-to-face and online modes regarding learning achievement criteria (Bernard et al., 2004; Nguyen, 2015; Ni, 2013; Sitzmann, Kraiger, Stewart, & Wisher, 2006; see Xu & Jaggars, 2014 for an alternate perspective). These contrasts are exemplified in a recent study of business students, in which online students using a flipped classroom approach outperformed their face-to-face peers, but ironically rated instructor performance lower (Harjoto, 2017). (2) Learner characteristics, such as self-regulation in an active learning model, comfort with technology, and age, among others, affect both receptiveness to and readiness for online instruction (Alqurashi, 2016; Cohen & Baruth, 2017; Kintu, Zhu, & Kagambe, 2017; Kuo, Walker, Schroder, & Belland, 2013; Ventura & Moscoloni, 2015). (3) Finally, numerous antecedent factors may lead to improved instruction but are not themselves directly perceived by students, such as instructor training (Brinkley-Etzkorn, 2018) and the sources of faculty motivation (e.g., incentives, recognition, social influence, and voluntariness) (Wingo, Ivankova, & Moss, 2017). Important as these factors are, mixing them with perceptions of quality tends to obfuscate the quality factors directly perceived by students.

Second, while student perceptions of quality are used in innumerable studies, our overall understanding still needs to integrate them more holistically. Many studies use student perceptions of the quality and overall effectiveness of individual tools and strategies in online contexts, such as mobile devices (Drew & Mann, 2018), small groups (Choi, Land, & Turgeon, 2005), journals (Nair, Tay, & Koh, 2013), simulations (Vlachopoulos & Makri, 2017), and video (Lange & Costley, 2020). Such studies, however, cannot provide the overall context and comparative importance. Some studies have examined the overall learning experience of students with exploratory lists, but have mixed non-quality factors with quality-of-teaching factors, making it difficult to discern the instructor’s versus contextual roles in quality (e.g., Asoodar, Vaezi, & Izanloo, 2016; Bollinger & Martindale, 2004; Farrell & Brunton, 2020; Hong, 2002; Song, Singleton, Hill, & Koh, 2004; Sun, Tsai, Finger, Chen, & Yeh, 2008). Technology adoption studies also fall into this category by essentially aggregating all teaching quality into the single category of performance (Al-Gahtani, 2016; Artino, 2010). Some studies have used high-level teaching-oriented models, primarily the Community of Inquiry model (le Roux & Nagel, 2018), but empirical support has been mixed (Arbaugh et al., 2008), and its elegance (i.e., relying on only three factors) has not provided much insight to practitioners (Anderson, 2016; Cleveland-Innes & Campbell, 2012).

Research questions

Although the number of empirical studies related to student perceptions of quality factors has increased, integration of the studies and concepts explored remains fragmented and confusing. It is important to have an empirical view of what students value in a single comprehensive study and, also, to know whether there is a hierarchy of factors, ranging from students who are least to most critical of the online learning experience. This research study has two research questions.

The first research question is: What are the significant factors in creating a high-quality online learning experience from students’ perspectives? That is important to know because it should have a significant effect on the instructor’s design of online classes. The goal of this research question is to identify a more articulated and empirically supported set of factors capturing the full range of student expectations.

The second research question is: Is there a priority or hierarchy of factors related to students’ perceptions of online teaching quality that relates to their decisions to enroll in online classes? For example, is it possible to distinguish the factors critical to enrollment decisions when students are primarily motivated by convenience and scheduling flexibility (a minimum threshold)? Do these factors differ for students with a genuine acceptance of the general quality of online courses (a moderate threshold)? And which factors are important to the students who are most critical of online course delivery (the highest threshold)?
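The analysis pipeline these questions imply (exploratory factor extraction over survey items, followed by regression of an enrollment-related outcome on factor scores) can be sketched with synthetic data. Everything below is an illustrative assumption: the item count, the simulated responses, the Kaiser eigenvalue-greater-than-one retention rule, and the outcome variable are hypothetical stand-ins, not the study's actual instrument or procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the survey: n respondents answering k Likert-style items.
n, k = 987, 12
latent = rng.normal(size=(n, 3))                      # three underlying "quality" factors
loadings_true = rng.normal(scale=0.8, size=(3, k))
items = latent @ loadings_true + rng.normal(size=(n, k))

# Principal-component-style extraction: eigendecompose the item correlation
# matrix and keep components with eigenvalue > 1 (the Kaiser criterion).
corr = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]                     # sort eigenvalues descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
n_factors = int(np.sum(eigvals > 1.0))

# Unrotated factor scores per respondent, then regress a hypothetical
# "willingness to enroll again" outcome on those scores via least squares.
z = (items - items.mean(axis=0)) / items.std(axis=0)
scores = z @ eigvecs[:, :n_factors]
outcome = scores @ rng.normal(size=n_factors) + rng.normal(scale=0.5, size=n)
X = np.column_stack([np.ones(n), scores])             # intercept + factor scores
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(n_factors, beta.shape)
```

A production analysis would add a rotation step (e.g., varimax) and significance testing, but the two-stage structure — extract factors, then regress decisions on them — is the point of the sketch.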

This article next reviews the literature on online education quality, focusing on the student perspective, and presents the factors derived from it. The research methods section discusses the study structure and methods. Demographic data related to the sample come next, followed by the results, discussion, and conclusion.

Literature review

Online education is much discussed (Prinsloo, 2016; Van Wart et al., 2019; Zawacki-Richter & Naidu, 2016), but its perception is substantially influenced by where you stand and what you value (Otter et al., 2013; Tanner, Noser, & Totaro, 2009). Accrediting bodies care about meeting technical standards, proof of effectiveness, and consistency (Grandzol & Grandzol, 2006). Institutions care about reputation, rigor, student satisfaction, and institutional efficiency (Jung, 2011). Faculty care about subject coverage, student participation, faculty satisfaction, and faculty workload (Horvitz, Beach, Anderson, & Xia, 2015; Mansbach & Austin, 2018). For their part, students care about learning achievement (Marks, Sibley, & Arbaugh, 2005; O’Neill & Sai, 2014; Shen, Cho, Tsai, & Marra, 2013), but also view online education as a function of their enjoyment of classes, instructor capability and responsiveness, and comfort in the learning environment (e.g., Asoodar et al., 2016; Sebastianelli, Swift, & Tamimi, 2015). It is this last perspective, of students, upon which we focus.

It is important to note that students do not sign up for online classes solely based on perceived quality. Perceptions of quality derive from notions of the capacity of online learning when ideal, relative to both learning achievement and satisfaction/enjoyment, and from perceptions about the likelihood and experience of classes living up to expectations. Students also sign up because of convenience and flexibility, and because of personal notions of suitability about learning. Convenience and flexibility are enormous drivers of online registration (Lee, Stringer, & Du, 2017; Mann & Henneberry, 2012). Even when students say they prefer face-to-face classes to online ones, many enroll in online classes and re-enroll in the future if the experience meets minimum expectations. This study examines the threshold expectations of students when they are considering taking online classes.

When discussing students’ perceptions of quality, there is little clarity about the actual range of concepts because no integrated empirical studies exist comparing the major factors found throughout the literature. Rather, there are practitioner-generated lists of micro-competencies, such as the Quality Matters consortium for higher education (Quality Matters, 2018), or broad frameworks encompassing many aspects of quality beyond teaching (Open and Distant Learning Quality Council, 2012). While checklists are useful for practitioners and accreditation processes, they do not provide robust theoretical bases for scholarly development. Overarching frameworks are heuristically useful, but not for pragmatic purposes or theory building. The most prominent theoretical framework used in the online literature is the Community of Inquiry (CoI) model (Arbaugh et al., 2008; Garrison, Anderson, & Archer, 2003), which divides instruction into teaching, cognitive, and social presence. As with many deductive theories, however, the supporting evidence is mixed (Rourke & Kanuka, 2009), especially regarding the importance of social presence (Annand, 2011; Armellini & De Stefani, 2016). Conceptually, the problem is not so much with the narrow articulation of cognitive or social presence; cognitive presence is how the instructor provides opportunities for students to interact with material in robust, thought-provoking ways, and social presence refers to building a community of learning that incorporates student-to-student interactions. However, teaching presence includes everything else the instructor does: structuring the course, providing lectures, explaining assignments, creating rehearsal opportunities, supplying tests, grading, answering questions, and so on. These challenges become even more prominent in the online context. While the lecture as a single medium is paramount in face-to-face classes, it fades as the primary vehicle in online classes with increased use of detailed syllabi, electronic announcements, recorded and synchronous lectures, 24/7 communications related to student questions, and so on. Amassing the pedagogical and technological elements related to teaching under a single concept provides little insight.

In addition to the CoI model, numerous concepts are suggested in single-factor empirical studies focusing on quality from a student’s perspective, with overlapping conceptualizations and nonstandardized naming conventions. Seven distinct factors are derived here from the literature on student perceptions of online quality: Instructional Support, Teaching Presence, Basic Online Modality, Social Presence, Online Social Comfort, Cognitive Presence, and Interactive Online Modality.
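For factors like these to be usable in a factor analysis, each multi-item scale must be internally consistent, which is conventionally checked with Cronbach's alpha. A minimal numpy sketch of the computation, using a hypothetical four-item scale rather than the study's actual survey items:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
n = 987
# Hypothetical 4-item scale: items share one latent trait plus noise,
# so they should be internally consistent (alpha well above the 0.7 rule of thumb).
trait = rng.normal(size=n)
scale_items = trait[:, None] + rng.normal(scale=0.6, size=(n, 4))
print(round(cronbach_alpha(scale_items), 2))
```

Items that do not share a latent trait would drive the ratio of summed item variances to total variance toward one, pushing alpha toward zero and flagging the scale as unreliable.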

Instructional support

Instructional Support refers to students’ perceptions of the techniques used by the instructor for input, rehearsal, feedback, and evaluation. Specifically, this entails providing detailed instructions, designed use of multimedia, and a balance between repetitive class features (for ease of use) and varied techniques (to prevent boredom). Instructional Support is often included as an element of Teaching Presence, but is also labeled “structure” (Lee & Rha, 2009; So & Brush, 2008) and instructor facilitation (Eom, Wen, & Ashill, 2006). A prime example of the difference between face-to-face and online education is the extensive use of the “flipped classroom” (Maycock, 2019; Wang, Huang, & Schunn, 2019), in which students move to rehearsal activities faster and more frequently than in traditional classrooms, with less instructor lecture (Jung, 2011; Martin, Wang, & Sadaf, 2018). It has been consistently supported as an element of student perceptions of quality (Espasa & Meneses, 2010).

Teaching presence

Teaching Presence refers to students’ perceptions of the quality of communication in lectures, directions, and individual feedback, including encouragement (Jaggars & Xu, 2016; Marks et al., 2005). Specifically, instructor communication is clear, focused, and encouraging, and instructor feedback is customized and timely. If Instructional Support is what an instructor does in designing the course before it begins, then Teaching Presence is what the instructor does while the class is conducted, in carrying out those plans and in responding to specific circumstances. For example, a course could be well designed but poorly delivered because the instructor is distracted, or a course could be poorly designed but an instructor might make up for the deficit by spending time and energy on elaborate communications and ad hoc teaching techniques. Teaching Presence is especially important for student satisfaction (Sebastianelli et al., 2015; Young, 2006) and is also referred to as instructor presence (Asoodar et al., 2016), learner–instructor interaction (Marks et al., 2005), and staff support (Jung, 2011). As with Instructional Support, it has been consistently supported as an element of student perceptions of quality.

Basic online modality

Basic Online Modality refers to the competent use of basic online class tools: online grading, navigation methods, the online grade book, and the announcements function. It is frequently lumped together with instructional quality (Artino, 2010), service quality (Mohammadi, 2015), instructor expertise in e-teaching (Paechter, Maier, & Macher, 2010), and similar terms. As a narrowly defined concept, it is sometimes called technology (Asoodar et al., 2016; Bollinger & Martindale, 2004; Sun et al., 2008). The only empirical study that did not find Basic Online Modality (as technology) significant was Sun et al. (2008). Because Basic Online Modality is addressed with basic instructor training, some studies assert the importance of training (e.g., Asoodar et al., 2016).

Social presence

Social Presence refers to students’ perceptions of the quality of student-to-student interaction. Social Presence focuses on the quality of shared learning and collaboration among students, such as in threaded discussion responses (Garrison et al., 2003; Kehrwald, 2008). Much emphasized but challenged in the CoI literature (Rourke & Kanuka, 2009), it has mixed support in the online literature. While some studies found Social Presence or related concepts to be significant (e.g., Asoodar et al., 2016; Bollinger & Martindale, 2004; Eom et al., 2006; Richardson, Maeda, Lv, & Caskurlu, 2017), others found Social Presence insignificant (Joo, Lim, & Kim, 2011; So & Brush, 2008; Sun et al., 2008).

Online social comfort

Online Social Comfort refers to the instructor’s ability to provide an environment in which anxiety is low and students feel comfortable interacting, even when expressing opposing viewpoints. While numerous studies have examined anxiety (e.g., Liaw & Huang, 2013; Otter et al., 2013; Sun et al., 2008), only one found anxiety insignificant (Asoodar et al., 2016); many others have not examined the concept.

Cognitive presence

Cognitive Presence refers to the engagement of students such that they perceive they are stimulated by the material and instructor to reflect deeply and critically, and to seek to understand different perspectives (Garrison et al., 2003). The instructor provides instructional materials and facilitates an environment that piques interest, encourages reflection, and enhances inclusiveness of perspectives (Durabi, Arrastia, Nelson, Cornille, & Liang, 2011). Cognitive Presence includes enhancing the applicability of material for students’ potential or current careers. Cognitive Presence is supported as significant in many online studies (e.g., Artino, 2010; Asoodar et al., 2016; Joo et al., 2011; Marks et al., 2005; Sebastianelli et al., 2015; Sun et al., 2008). Further, while many instructors perceive that cognitive presence is diminished in online settings, neuroscientific studies indicate this need not be the case (Takamine, 2017). While numerous studies failed to examine Cognitive Presence, this review found no studies that discounted its significance for students.

Interactive online modality

Interactive Online Modality refers to the “high-end” use of online functionality: that is, the instructor makes skilled use of interactive online class tools such as video lectures, videoconferencing, and small group discussions. It is often included in concepts such as instructional quality (Artino, 2010; Asoodar et al., 2016; Mohammadi, 2015; Otter et al., 2013; Paechter et al., 2010) or engagement (Clayton, Blumberg, & Anthony, 2018). While individual methods have been investigated (e.g., Durabi et al., 2011), high-end engagement methods as a group have not.

Other independent variables affecting perceptions of quality include age, undergraduate versus graduate status, gender, ethnicity/race, discipline, educational motivation of students, and previous online experience. The effect of age has been found to be small or insignificant, but more notable effects have been reported for level of study, with graduate students reporting higher “success” (Macon, 2011) and community college students having greater difficulty with online classes (Legon & Garrett, 2019; Xu & Jaggars, 2014). The effects of ethnicity and race have also been small or insignificant. Some situational variations and student preferences can be captured by paying attention to disciplinary differences (Arbaugh, 2005; Macon, 2011). Student motivation has been reported to be significant in completion and achievement, with better students doing equally well across face-to-face and online modes, and weaker students facing greater completion and achievement challenges (Clayton et al., 2018; Lu & Lemonde, 2013).

Research methods

To examine the various quality factors, we apply a critical success factor methodology, initially introduced to schools of business research in the 1970s. In 1981, Rockhart and Bullen codified an approach embodying principles of critical success factors (CSFs) as a way to identify the information needs of executives, detailing steps for the collection and analysis of data to create a set of organizational CSFs (Rockhart & Bullen, 1981). CSFs describe the underlying or guiding principles that must be incorporated to ensure success.

Utilizing this methodology, CSFs in the context of this paper define key areas of instruction and design essential for an online class to be successful from a student’s perspective. Instructors implicitly know and consider these areas when setting up an online class and designing and directing activities and tasks important to achieving learning goals. CSFs make explicit those things good instructors may intuitively know and (should) do to enhance student learning. When made explicit, CSFs not only confirm the knowledge of successful instructors, but tap their intuition to guide and direct the accomplishment of quality instruction for entire programs. In addition, CSFs are linked with goals and objectives, helping generate a small number of truly important matters an instructor should focus attention on to achieve different thresholds of online success.

After a comprehensive literature review, an instrument was created to measure students’ perceptions of the importance of techniques and indicators leading to quality online classes. Items were designed to capture the major factors in the literature. The instrument was piloted during the 2017–18 academic year with a sample of 397 students, facilitating an exploratory factor analysis that led to important preliminary findings (reference withheld for review). Based on the pilot, survey items were added and refined to include seven groups of quality teaching factors, two groups of items related to students’ overall acceptance of online classes, and a variable on their future online class enrollment. Demographic information was gathered to determine the effects of age, year in program, major, distance from university, number of online classes taken, high school experience with online classes, and communication preferences on students’ levels of acceptance of online classes.

This paper draws evidence from a sample of students enrolled in educational programs at Jack H. Brown College of Business and Public Administration (JHBC), California State University San Bernardino (CSUSB). The JHBC offers a wide range of online courses for undergraduate and graduate programs. To ensure comparable learning outcomes, online classes and face-to-face classes of a certain subject are similar in size—undergraduate classes are generally capped at 60 and graduate classes at 30, and often taught by the same instructors. Students sometimes have the option to choose between both face-to-face and online modes of learning.

A Qualtrics survey link was sent out by 11 instructors to students who were unlikely to be cross-enrolled in classes during the 2018–19 academic year. 1 Approximately 2500 students were contacted, with some instructors providing class time to complete the anonymous survey. All students, whether they had taken an online class or not, were encouraged to respond. Nine hundred eighty-seven students responded, representing a 40% response rate. Although drawn from a single business school, it is a broad sample representing students from several disciplines—management, accounting and finance, marketing, information decision sciences, and public administration, as well as both graduate and undergraduate programs of study.

The sample is young, with 78% of students under 30. It includes almost no lower division students (i.e., freshmen and sophomores), 73% upper division students (i.e., juniors and seniors), and 24% graduate students (master’s level). Only 17% reported having taken a hybrid or online class in high school. There was a wide range of exposure to university-level online courses, with 47% reporting having taken one to four classes and 21% reporting no online class experience. Reflecting a Hispanic-serving institution, 54% self-identified as Latino, 18% White, and 13% Asian and Pacific Islander. The five largest majors were accounting & finance (25%), management (21%), master of public administration (16%), marketing (12%), and information decision sciences (10%). Seventy-four percent work full- or part-time. See Table  1 for demographic data.

Measures and procedure

To increase the reliability of evaluation scores, composite evaluation variables were formed after an exploratory factor analysis of individual evaluation items. A principal component method with Quartimin (oblique) rotation was applied to explore the factor structure of student perceptions of online teaching CSFs. Items with correlation coefficients greater than .30 were retained, a commonly accepted threshold in factor analysis. A simple least-squares regression analysis was applied to test the significance levels of factors on students’ impressions of online classes.
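The extraction step can be sketched in a few lines of Python. The two-cluster synthetic data, the item count, and the omission of the oblique Quartimin rotation are illustrative assumptions rather than the study's actual analysis; only the principal-component extraction and the 0.30 retention threshold follow the procedure described above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for survey responses: 200 students x 6 items,
# built so items 0-2 and items 3-5 form two correlated clusters.
f1 = rng.normal(size=(200, 1))
f2 = rng.normal(size=(200, 1))
X = np.hstack([f1 + 0.3 * rng.normal(size=(200, 3)),
               f2 + 0.3 * rng.normal(size=(200, 3))])

# Principal-component extraction on the item correlation matrix.
R = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]            # sort components by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Unrotated loadings for the first two components.
loadings = eigvecs[:, :2] * np.sqrt(eigvals[:2])

# Retain items whose largest loading magnitude exceeds the 0.30
# threshold (an oblique Quartimin rotation would follow in the
# full analysis; it is omitted in this sketch).
retained = np.abs(loadings).max(axis=1) > 0.30
print(retained.sum(), "of", len(retained), "items retained")
```

In practice a dedicated package would perform the rotated solution; the point here is only the extract-then-threshold logic.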

Exploratory factor constructs

Using a threshold loading of 0.3 for items, 37 items loaded on seven factors. All factors were logically consistent. The first factor, with eight items, was labeled Teaching Presence. Items included providing clear instructions, staying on task, clear deadlines, and customized feedback on strengths and weaknesses. Teaching Presence items all related to instructor involvement during the course as a director, monitor, and learning facilitator. The second factor, with seven items, aligned with Cognitive Presence. Items included stimulating curiosity, opportunities for reflection, helping students construct explanations posed in online courses, and the applicability of material. The third factor, with six items, aligned with Social Presence, defined as providing student-to-student learning opportunities. Items included getting to know course participants for a sense of belonging, forming impressions of other students, and interacting with others. The fourth factor, with six new items as well as two (“interaction with other students” and “a sense of community in the class”) shared with the third factor, was Instructional Support, which related to the instructor’s role in providing students a cohesive learning experience. Items included providing sufficient rehearsal, structured feedback, techniques for communication, a navigation guide, a detailed syllabus, and coordinating student interaction and creating a sense of online community. This factor also included enthusiasm, which students generally interpreted as a robustly designed course rather than animation in a traditional lecture. The fifth factor was labeled Basic Online Modality and focused on the basic technological requirements for a functional online course. Three items included allowing students to make online submissions, use of online gradebooks, and online grading. A fourth item was the use of online quizzes, viewed by students as mechanical practice opportunities rather than small tests, and a fifth was navigation, a key component of Online Modality. The sixth factor, loaded on four items, was labeled Online Social Comfort. Items here included comfort discussing ideas online, comfort disagreeing, developing a sense of collaboration via discussion, and considering online communication an excellent medium for social interaction. The final factor was called Interactive Online Modality because it included items for “richer” communications or interactions, whether one- or two-way. Items included videoconferencing, instructor-generated videos, and small group discussions. Taken together, these seven factors explained 67% of the variance, which is considered in the acceptable range in social science research for a robust model (Hair, Black, Babin, & Anderson, 2014). See Table  2 for the full list.

To test factor reliability, the Cronbach alpha of each variable was calculated. All produced values greater than 0.7, the standard threshold for reliability, except for system trust, which was therefore dropped. To gauge students’ sense of factor importance, all items were mean averaged. Factor means (lower means indicating higher importance to students) ranged from 1.5 to 2.6 on a 5-point scale. Basic Online Modality was most important, followed by Instructional Support and Teaching Presence. Students deemed Cognitive Presence, Social Online Comfort, and Online Interactive Modality less important. The least important for this sample was Social Presence. Table  3 arrays the critical success factor means, standard deviations, and Cronbach alphas.
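The reliability statistic used here is standard: Cronbach's alpha is α = (k/(k−1))·(1 − Σσ²ᵢ/σ²ₜ), where k is the number of items, σ²ᵢ the per-item variances, and σ²ₜ the variance of the summed scale. A minimal sketch follows; the four-item Likert responses are made-up numbers, not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative 5-point Likert responses for a four-item factor
# (hypothetical respondents, not the survey sample).
scores = np.array([
    [5, 4, 5, 4],
    [4, 4, 4, 3],
    [2, 2, 3, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 3],
    [1, 2, 1, 2],
])
alpha = cronbach_alpha(scores)
print(round(alpha, 2))  # comfortably above the 0.7 threshold used in the study
```

A factor whose alpha falls below 0.7, like the system trust items here, would be dropped from further analysis.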

To determine whether particular subgroups of respondents viewed factors differently, a series of ANOVAs were conducted using factor means as dependent variables. Six demographic variables were used as independent variables: graduate vs. undergraduate, age, work status, ethnicity, discipline, and past online experience. To determine strength of association of the independent variables to each of the seven CSFs, eta squared was calculated for each ANOVA. Eta squared indicates the proportion of variance in the dependent variable explained by the independent variable. Eta squared values greater than .01, .06, and .14 are conventionally interpreted as small, medium, and large effect sizes, respectively (Green & Salkind, 2003 ). Table  4 summarizes the eta squared values for the ANOVA tests with Eta squared values less than .01 omitted.
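For a one-way ANOVA, eta squared is simply the between-group sum of squares over the total sum of squares. The sketch below uses invented factor scores for three hypothetical respondent groups; only the formula and the small/medium/large cutoffs follow the text.

```python
import numpy as np

def eta_squared(groups):
    """Eta squared for a one-way ANOVA: SS_between / SS_total."""
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    ss_total = ((all_vals - grand_mean) ** 2).sum()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    return ss_between / ss_total

# Hypothetical factor means (lower = more important) for three
# respondent groups; the numbers are illustrative only.
grad = np.array([1.4, 1.6, 1.5, 1.7, 1.5])
upper = np.array([1.8, 2.0, 1.9, 2.1, 1.7])
other = np.array([1.9, 2.2, 2.0, 1.8, 2.1])

es = eta_squared([grad, upper, other])
# Interpret against the conventional cutoffs cited in the paper:
# .01 small, .06 medium, .14 large (Green & Salkind, 2003).
print(round(es, 3))
```

Values under .01, as in the omitted cells of Table 4, indicate the grouping variable explains essentially none of the variance in the factor score.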

While no significant differences in factor means occurred among students across disciplines in the College, all five other independent variables had some small effect on some or all CSFs. Graduate students tended to rate Online Interactive Modality, Instructional Support, Teaching Presence, and Cognitive Presence higher than undergraduates. Older students valued Online Interactive Modality more. Full-time working students rated all factors except Social Online Comfort slightly higher than part-timers and non-working students. Latino and White students rated Basic Online Modality and Instructional Support higher; Asian and Pacific Islander students rated Social Presence higher. Students who had taken more online classes rated all factors higher.

In addition to factor scores, two variables were constructed to capture students’ resultant impressions, labeled online experience. Both were logically consistent, with a Cronbach’s α greater than 0.75. The first variable, with six items, labeled “online acceptance,” included items such as “I enjoy online learning,” “My overall impression of hybrid/online learning is very good,” and “the instructors of online/hybrid classes are generally responsive.” The second variable, labeled “face-to-face preference,” combines four items, including enjoying, learning, and communicating more in face-to-face classes, as well as perceiving greater fairness and equity there. In addition to these two constructed variables, a one-item variable was also used subsequently in the regression analysis: “online enrollment.” That question asked: if hybrid/online classes are well taught and available, how much of your course selection going forward would be online?

Regression results

As noted above, two constructed variables and one item were used as dependent variables for purposes of regression analysis. They were online acceptance, F2F preference, and the selection of online classes. In addition to seven quality-of-teaching factors identified by factor analysis, control variables included level of education (graduate versus undergraduate), age, ethnicity, work status, distance to university, and number of online/hybrid classes taken in the past. See Table  5 .

When eta squared values for ANOVA significance were examined for the control factors, only one approached a medium effect: graduate versus undergraduate status had a .05 effect on Online Interactive Modality, meaning graduate students were more sensitive to interactive modality than undergraduates. Multiple regression analyses of critical success factors and online impressions were conducted to compare the conditions under which factors were significant. The only consistently significant control factor was the number of online classes taken: the more classes students had taken online, the more inclined they were to take future classes. Level of program, age, ethnicity, and working status did not significantly affect students’ choice or overall acceptance of online classes.
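The regression step can be sketched with an ordinary least-squares fit. The predictor names, coefficients, and simulated responses below are illustrative assumptions standing in for the survey's factor scores and controls; the design-matrix-plus-lstsq pattern is the generic technique, not the authors' exact model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Illustrative predictors: two factor scores and one control
# (number of prior online classes); all simulated data.
basic_modality = rng.normal(2.0, 0.5, n)
cognitive_presence = rng.normal(2.2, 0.5, n)
prior_online = rng.integers(0, 8, n)

# Simulated outcome ("online acceptance") that rises with each
# predictor, plus noise; the true slopes are 0.5, 0.3, and 0.1.
acceptance = (0.5 * basic_modality + 0.3 * cognitive_presence
              + 0.1 * prior_online + rng.normal(0, 0.3, n))

# Design matrix with an intercept column, solved by least squares.
X = np.column_stack([np.ones(n), basic_modality,
                     cognitive_presence, prior_online])
coef, *_ = np.linalg.lstsq(X, acceptance, rcond=None)
print(np.round(coef, 2))  # intercept followed by three slopes
```

With a sample this size the recovered slopes sit close to the generating values, mirroring how a consistently positive coefficient (here, prior online classes) shows up across model specifications.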

The least restrictive condition was online enrollment (Table  6 ). That is, students might not feel online courses were ideal, but because of convenience and scheduling might enroll in them if minimum threshold expectations were met. When considering online enrollment three factors were significant and positive (at the 0.1 level): Basic Online Modality, Cognitive Presence, and Online Social Comfort. These least-demanding students expected classes to have basic technological functionality, provide good opportunities for knowledge acquisition, and provide comfortable interaction in small groups. Students who demand good Instructional Support (e.g., rehearsal opportunities, standardized feedback, clear syllabus) are less likely to enroll.

Online acceptance was more restrictive (see Table  7 ). This variable captured the idea that students not only enrolled in online classes out of necessity, but with an appreciation of the positive attributes of online instruction, which balanced the negative aspects. When this standard was applied, students expected not only Basic Online Modality, Cognitive Presence, and Online Social Comfort, but expected their instructors to be highly engaged virtually as the course progressed (Teaching Presence), and to create strong student-to-student dynamics (Social Presence). Students who rated Instructional Support higher are less accepting of online classes.

Another restrictive condition was catering to the needs of students who preferred face-to-face classes (see Table  8 ). That is, they preferred face-to-face classes even when online classes were well taught. Unlike students more accepting of, or more likely to enroll in, online classes, this group rates Instructional Support as critical to enrolling, rather than a negative factor when absent. Again different from the other two groups, these students demand appropriate interactive mechanisms (Online Interactive Modality) to enable richer communication (e.g., videoconferencing). Student-to-student collaboration (Social Presence) was also significant. This group also rated Cognitive Presence and Online Social Comfort as significant, but only in their absence. That is, these students were most attached to direct interaction with the instructor and other students rather than to specific teaching methods. Interestingly, Basic Online Modality and Teaching Presence were not significant. Our interpretation is that this student group, the most critical of online classes for their loss of physical interaction, is beyond being concerned with mechanical technical interaction and demands higher levels of interactivity and instructional sophistication.

Discussion and study limitations

Some past studies have used robust empirical methods to identify a single factor or a small number of factors related to quality from a student’s perspective, but have not sought to be relatively comprehensive. Others have used a longer series of itemized factors, but have used less robust methods and have not tied those factors back to the literature. This study has used the literature to develop a relatively comprehensive list of items focused on quality teaching in a single rigorous protocol. That is, while a beta test had identified five coherent factors, substantial changes to the current survey sharpened the focus on quality factors rather than antecedent factors and better articulated the array of factors often lumped under the mantle of “teaching presence.” In addition, the study has examined these factors based on threshold expectations: from minimal, such as when flexibility is the driving consideration, to modest, such as when students want a “good” online class, to high, when students demand an interactive virtual experience equivalent to face-to-face.

Exploratory factor analysis identified seven factors that were reliable, coherent, and significant under different conditions. In order of students’ overall sense of importance, they are: Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Social Online Comfort, Interactive Online Modality, and Social Presence. Students are most concerned with the basics of a course first, that is, technological functionality and instructor competence. Next they want engagement and virtual comfort. Social Presence, while valued, is the least critical from this overall perspective.

The factor analysis is quite consistent with the range of factors identified in the literature, pointing to the fact that students can differentiate among different aspects of what have been clumped together as larger concepts, such as teaching presence. Essentially, the instructor’s role in quality can be divided into her/his command of basic online functionality, good design, and good presence during the class. The instructor’s command of basic functionality is paramount. Because so much of an online class must be built in advance, the quality of the class design is rated more highly than the instructor’s role in facilitating the class. Taken as a whole, the instructor’s role in traditional teaching elements is primary, as we would expect it to be. Cognitive presence, especially the pertinence of the instructional material and its applicability to student interests, has always been found significant when studied, and it was highly rated here as a single factor. Finally, the degree to which students feel comfortable with the online environment and enjoy the learner-learner aspect has been less supported in empirical studies; it was found significant here but rated lowest among the quality factors.

Regression analysis paints a more nuanced picture, depending on student focus. It also helps explain some of the heterogeneity of previous studies, depending on what the dependent variables were. If convenience and scheduling are critical and students are less demanding, the minimum requirements are Basic Online Modality, Cognitive Presence, and Online Social Comfort. That is, students expect an instructor who knows how to use an online platform, delivers useful information, and provides a comfortable learning environment. However, they do not expect poor design. Nor do they expect much in terms of quality teaching presence, learner-to-learner interaction, or interactive teaching.

When students are signing up for critical classes, or when they have both F2F and online options, they have a higher standard. That is, they not only expect the factors relevant to enrolling in noncritical classes, but they also expect good Teaching and Social Presence. Students who simply need a class may be willing to teach themselves a bit more, but students who want a good class expect a highly present instructor in terms of responsiveness and immediacy. “Good” classes must not only create a comfortable atmosphere but, in social science classes at least, must provide strong learner-to-learner interactions as well. At the time of the research, most students believed that you can have a good class without high interactivity via pre-recorded video and videoconferencing. That may, or may not, change over time as technology thresholds of various video media become easier to use, more reliable, and more commonplace.

The most demanding students are those who prefer F2F classes because of learning style preferences, poor past experiences, or both. Such students (seem to) assume that a worthwhile online class has basic functionality and that the instructor provides a strong presence. They are also critical of the absence of Cognitive Presence and Online Social Comfort. They want strong Instructional Support and Social Presence. But in addition, and uniquely, they expect Online Interactive Modality, which provides as much verisimilitude to the traditional classroom as possible. More than the other two groups, these students crave human interaction in the learning process, both with the instructor and with other students.

These findings shed light on the possible ramifications of the COVID-19 aftermath. Many universities around the world jumped from relatively low levels of online instruction in the beginning of spring 2020 to nearly 100% by mandate by the end of the spring term. The question becomes, what will happen after the mandate is removed? Will demand resume pre-crisis levels, will it increase modestly, or will it skyrocket? Time will be the best judge, but the findings here would suggest that the ability/interest of instructors and institutions to “rise to the occasion” with quality teaching will have as much effect on demand as students becoming more acclimated to online learning. If in the rush to get classes online many students experience shoddy basic functional competence, poor instructional design, sporadic teaching presence, and poorly implemented cognitive and social aspects, they may be quite willing to return to the traditional classroom. If faculty and institutions supporting them are able to increase the quality of classes despite time pressures, then most students may be interested in more hybrid and fully online classes. If instructors are able to introduce high quality interactive teaching, nearly the entire student population will be interested in more online classes. Of course students will have a variety of experiences, but this analysis suggests that those instructors, departments, and institutions that put greater effort into the temporary adjustment (and who resist less), will be substantially more likely to have increases in demand beyond what the modest national trajectory has been for the last decade or so.

There are several study limitations. First, the study does not include a sample of non-respondents. Non-responders may have a somewhat different profile. Second, the study draws from a single college and university. The profile derived here may vary significantly by type of student. Third, some survey statements may have led respondents to rate quality based upon experience rather than assess the general importance of online course elements. “I felt comfortable participating in the course discussions,” could be revised to “comfort in participating in course discussions.” The authors weighed differences among subgroups (e.g., among majors) as small and statistically insignificant. However, it is possible differences between biology and marketing students would be significant, leading factors to be differently ordered. Emphasis and ordering might vary at a community college versus research-oriented university (Gonzalez, 2009 ).

Availability of data and materials

We will make the data available.

Al-Gahtani, S. S. (2016). Empirical investigation of e-learning acceptance and assimilation: A structural equation model. Applied Computing and Informatics , 12 , 27–50.


Alqurashi, E. (2016). Self-efficacy in online learning environments: A literature review. Contemporary Issues in Education Research (CIER) , 9 (1), 45–52.

Anderson, T. (2016). A fourth presence for the Community of Inquiry model? Retrieved from https://virtualcanuck.ca/2016/01/04/a-fourth-presence-for-the-community-of-inquiry-model/ .

Annand, D. (2011). Social presence within the community of inquiry framework. The International Review of Research in Open and Distributed Learning , 12 (5), 40.

Arbaugh, J. B. (2005). How much does “subject matter” matter? A study of disciplinary effects in on-line MBA courses. Academy of Management Learning & Education , 4 (1), 57–73.

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. Internet and Higher Education , 11 , 133–136.

Armellini, A., & De Stefani, M. (2016). Social presence in the 21st century: An adjustment to the Community of Inquiry framework. British Journal of Educational Technology , 47 (6), 1202–1216.

Arruabarrena, R., Sánchez, A., Blanco, J. M., et al. (2019). Integration of good practices of active methodologies with the reuse of student-generated content. International Journal of Educational Technology in Higher Education , 16 , #10.

Arthur, L. (2009). From performativity to professionalism: Lecturers’ responses to student feedback. Teaching in Higher Education , 14 (4), 441–454.

Artino, A. R. (2010). Online or face-to-face learning? Exploring the personal factors that predict students’ choice of instructional format. Internet and Higher Education , 13 , 272–276.

Asoodar, M., Vaezi, S., & Izanloo, B. (2016). Framework to improve e-learner satisfaction and further strengthen e-learning implementation. Computers in Human Behavior , 63 , 704–716.

Bernard, R. M., et al. (2004). How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research , 74 (3), 379–439.

Bollinger, D., & Martindale, T. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-Learning , 3 (1), 61–67.

Brinkley-Etzkorn, K. E. (2018). Learning to teach online: Measuring the influence of faculty development training on teaching effectiveness through a TPACK lens. The Internet and Higher Education , 38 , 28–35.

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin , 3 , 7.

Choi, I., Land, S. M., & Turgeon, A. J. (2005). Scaffolding peer-questioning strategies to facilitate metacognition during online small group discussion. Instructional Science , 33 , 483–511.

Clayton, K. E., Blumberg, F. C., & Anthony, J. A. (2018). Linkages between course status, perceived course value, and students’ preferences for traditional versus non-traditional learning environments. Computers & Education , 125 , 175–181.

Cleveland-Innes, M., & Campbell, P. (2012). Emotional presence, learning, and the online learning environment. The International Review of Research in Open and Distributed Learning , 13 (4), 269–292.

Cohen, A., & Baruth, O. (2017). Personality, learning, and satisfaction in fully online academic courses. Computers in Human Behavior , 72 , 1–12.

Crews, T., & Butterfield, J. (2014). Data for flipped classroom design: Using student feedback to identify the best components from online and face-to-face classes. Higher Education Studies , 4 (3), 38–47.

Dawson, P., Henderson, M., Mahoney, P., Phillips, M., Ryan, T., Boud, D., & Molloy, E. (2019). What makes for effective feedback: Staff and student perspectives. Assessment & Evaluation in Higher Education , 44 (1), 25–36.

Drew, C., & Mann, A. (2018). Unfitting, uncomfortable, unacademic: A sociological reading of an interactive mobile phone app in university lectures. International Journal of Educational Technology in Higher Education , 15 , #43.

Durabi, A., Arrastia, M., Nelson, D., Cornille, T., & Liang, X. (2011). Cognitive presence in asynchronous online learning: A comparison of four discussion strategies. Journal of Computer Assisted Learning , 27 (3), 216–227.

Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education , 4 (2), 215–235.

Espasa, A., & Meneses, J. (2010). Analysing feedback processes in an online teaching and learning environment: An exploratory study. Higher Education , 59 (3), 277–292.

Farrell, O., & Brunton, J. (2020). A balancing act: A window into online student engagement experiences. International Journal of Educational Technology in High Education , 17 , #25.

Fidalgo, P., Thormann, J., Kulyk, O., et al. (2020). Students’ perceptions on distance education: A multinational study. International Journal of Educational Technology in High Education , 17 , #18.

Flores, Ò., del-Arco, I., & Silva, P. (2016). The flipped classroom model at the university: Analysis based on professors’ and students’ assessment in the educational field. International Journal of Educational Technology in Higher Education , 13 , #21.

Garrison, D. R., Anderson, T., & Archer, W. (2003). A theory of critical inquiry in online distance education. Handbook of Distance Education , 1 , 113–127.

Gong, D., Yang, H. H., & Cai, J. (2020). Exploring the key influencing factors on college students’ computational thinking skills through flipped-classroom instruction. International Journal of Educational Technology in Higher Education , 17 , #19.

Gonzalez, C. (2009). Conceptions of, and approaches to, teaching online: A study of lecturers teaching postgraduate distance courses. Higher Education , 57 (3), 299–314.

Grandzol, J. R., & Grandzol, C. J. (2006). Best practices for online business education. International Review of Research in Open and Distance Learning, 7(1), 1–18.

Green, S. B., & Salkind, N. J. (2003). Using SPSS: Analyzing and understanding data (3rd ed.). Upper Saddle River: Prentice Hall.

Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2014). Multivariate data analysis: Pearson new international edition . Essex: Pearson Education Limited.

Harjoto, M. A. (2017). Blended versus face-to-face: Evidence from a graduate corporate finance class. Journal of Education for Business , 92 (3), 129–137.

Hong, K.-S. (2002). Relationships between students’ instructional variables with satisfaction and learning from a web-based course. The Internet and Higher Education , 5 , 267–281.

Horvitz, B. S., Beach, A. L., Anderson, M. L., & Xia, J. (2015). Examination of faculty self-efficacy related to online teaching. Innovative Higher Education, 40, 305–316.

Inside Higher Education and Gallup. (2019). The 2019 survey of faculty attitudes on technology. Author .

Jaggars, S. S., & Xu, D. (2016). How do online course design features influence student performance? Computers and Education , 95 , 270–284.

Joo, Y. J., Lim, K. Y., & Kim, E. K. (2011). Online university students’ satisfaction and persistence: Examining perceived level of presence, usefulness and ease of use as predictor in a structural model. Computers & Education , 57 (2), 1654–1664.

Jung, I. (2011). The dimensions of e-learning quality: From the learner’s perspective. Educational Technology Research and Development , 59 (4), 445–464.

Kay, R., MacDonald, T., & DiGiuseppe, M. (2019). A comparison of lecture-based, active, and flipped classroom teaching approaches in higher education. Journal of Computing in Higher Education , 31 , 449–471.

Kehrwald, B. (2008). Understanding social presence in text-based online learning environments. Distance Education , 29 (1), 89–106.

Kintu, M. J., Zhu, C., & Kagambe, E. (2017). Blended learning effectiveness: The relationship between student characteristics, design features and outcomes. International Journal of Educational Technology in Higher Education , 14 , #7.

Kuo, Y.-C., Walker, A. E., Schroder, K. E., & Belland, B. R. (2013). Interaction, internet self-efficacy, and self-regulated learning as predictors of student satisfaction in online education courses. Internet and Higher Education, 20, 35–50.

Lange, C., & Costley, J. (2020). Improving online video lectures: Learning challenges created by media. International Journal of Educational Technology in Higher Education , 17 , #16.

le Roux, I., & Nagel, L. (2018). Seeking the best blend for deep learning in a flipped classroom – Viewing student perceptions through the Community of Inquiry lens. International Journal of Educational Technology in Higher Education, 15, #16.

Lee, H.-J., & Rha, I. (2009). Influence of structure and interaction on student achievement and satisfaction in web-based distance learning. Educational Technology & Society , 12 (4), 372–382.

Lee, Y., Stringer, D., & Du, J. (2017). What determines students’ preference of online to F2F class? Business Education Innovation Journal , 9 (2), 97–102.

Legon, R., & Garrett, R. (2019). CHLOE 3: Behind the numbers . Published online by Quality Matters and Eduventures. https://www.qualitymatters.org/sites/default/files/research-docs-pdfs/CHLOE-3-Report-2019-Behind-the-Numbers.pdf

Liaw, S.-S., & Huang, H.-M. (2013). Perceived satisfaction, perceived usefulness and interactive learning environments as predictors of self-regulation in e-learning environments. Computers & Education , 60 (1), 14–24.

Lu, F., & Lemonde, M. (2013). A comparison of online versus face-to-face teaching delivery in statistics instruction for undergraduate health science students. Advances in Health Science Education, 18, 963–973.

Lundin, M., Bergviken Rensfeldt, A., Hillman, T., Lantz-Andersson, A., & Peterson, L. (2018). Higher education dominance and siloed knowledge: a systematic review of flipped classroom research. International Journal of Educational Technology in Higher Education , 15 (1).

Macon, D. K. (2011). Student satisfaction with online courses versus traditional courses: A meta-analysis. Dissertation, Northcentral University, CA.

Mann, J., & Henneberry, S. (2012). What characteristics of college students influence their decisions to select online courses? Online Journal of Distance Learning Administration , 15 (5), 1–14.

Mansbach, J., & Austin, A. E. (2018). Nuanced perspectives about online teaching: Mid-career senior faculty voices reflecting on academic work in the digital age. Innovative Higher Education , 43 (4), 257–272.

Marks, R. B., Sibley, S. D., & Arbaugh, J. B. (2005). A structural equation model of predictors for effective online learning. Journal of Management Education , 29 (4), 531–563.

Martin, F., Wang, C., & Sadaf, A. (2018). Student perception of facilitation strategies that enhance instructor presence, connectedness, engagement and learning in online courses. Internet and Higher Education , 37 , 52–65.

Maycock, K. W. (2019). Chalk and talk versus flipped learning: A case study. Journal of Computer Assisted Learning , 35 , 121–126.

McGivney-Burelle, J. (2013). Flipping calculus. PRIMUS: Problems, Resources, and Issues in Mathematics Undergraduate Studies, 23(5), 477–486.

Mohammadi, H. (2015). Investigating users’ perspectives on e-learning: An integration of TAM and IS success model. Computers in Human Behavior , 45 , 359–374.

Nair, S. S., Tay, L. Y., & Koh, J. H. L. (2013). Students’ motivation and teachers’ teaching practices towards the use of blogs for writing of online journals. Educational Media International , 50 (2), 108–119.

Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons. MERLOT Journal of Online Learning and Teaching , 11 (2), 309–319.

Ni, A. Y. (2013). Comparing the effectiveness of classroom and online learning: Teaching research methods. Journal of Public Affairs Education , 19 (2), 199–215.

Nouri, J. (2016). The flipped classroom: For active, effective and increased learning – Especially for low achievers. International Journal of Educational Technology in Higher Education , 13 , #33.

O’Neill, D. K., & Sai, T. H. (2014). Why not? Examining college students’ reasons for avoiding an online course. Higher Education , 68 (1), 1–14.

O'Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education , 25 , 85–95.

Open & Distance Learning Quality Council (2012). ODLQC standards. England: Author. https://www.odlqc.org.uk/odlqc-standards

Ortagus, J. C. (2017). From the periphery to prominence: An examination of the changing profile of online students in American higher education. Internet and Higher Education , 32 , 47–57.

Otter, R. R., Seipel, S., Graef, T., Alexander, B., Boraiko, C., Gray, J., … Sadler, K. (2013). Comparing student and faculty perceptions of online and traditional courses. Internet and Higher Education , 19 , 27–35.

Paechter, M., Maier, B., & Macher, D. (2010). Online or face-to-face? Students’ experiences and preferences in e-learning. Internet and Higher Education , 13 , 292–329.

Prinsloo, P. (2016). (re)considering distance education: Exploring its relevance, sustainability and value contribution. Distance Education , 37 (2), 139–145.

Quality Matters (2018). Specific review standards from the QM higher education rubric (6th ed.). MD: MarylandOnline.

Richardson, J. C., Maeda, Y., Lv, J., & Caskurlu, S. (2017). Social presence in relation to students’ satisfaction and learning in the online environment: A meta-analysis. Computers in Human Behavior , 71 , 402–417.

Rockhart, J. F., & Bullen, C. V. (1981). A primer on critical success factors . Cambridge: Center for Information Systems Research, Massachusetts Institute of Technology.

Rourke, L., & Kanuka, H. (2009). Learning in Communities of Inquiry: A review of the literature. The Journal of Distance Education / Revue de l'éducation à distance, 23(1), 19–48. Athabasca University Press. Retrieved August 2, 2020 from https://www.learntechlib.org/p/105542/ .

Sebastianelli, R., Swift, C., & Tamimi, N. (2015). Factors affecting perceived learning, satisfaction, and quality in the online MBA: A structural equation modeling approach. Journal of Education for Business , 90 (6), 296–305.

Shen, D., Cho, M.-H., Tsai, C.-L., & Marra, R. (2013). Unpacking online learning experiences: Online learning self-efficacy and learning satisfaction. Internet and Higher Education , 19 , 10–17.

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology , 59 (3), 623–664.

So, H. J., & Brush, T. A. (2008). Student perceptions of collaborative learning, social presence and satisfaction in a blended learning environment: Relationships and critical factors. Computers & Education , 51 (1), 318–336.

Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. The Internet and Higher Education , 7 (1), 59–70.

Sun, P. C., Tsai, R. J., Finger, G., Chen, Y. Y., & Yeh, D. (2008). What drives a successful e-learning? An empirical investigation of the critical factors influencing learner satisfaction. Computers & Education , 50 (4), 1183–1202.

Takamine, K. (2017). Michelle D. Miller: Minds online: Teaching effectively with technology. Higher Education, 73, 789–791.

Tanner, J. R., Noser, T. C., & Totaro, M. W. (2009). Business faculty and undergraduate students’ perceptions of online learning: A comparative study. Journal of Information Systems Education , 20 (1), 29.

Tucker, B. (2012). The flipped classroom. Education Next , 12 (1), 82–83.

Van Wart, M., Ni, A., Ready, D., Shayo, C., & Court, J. (2020). Factors leading to online learner satisfaction. Business Educational Innovation Journal , 12 (1), 15–24.

Van Wart, M., Ni, A., Rose, L., McWeeney, T., & Worrell, R. A. (2019). Literature review and model of online teaching effectiveness integrating concerns for learning achievement, student satisfaction, faculty satisfaction, and institutional results. Pan-Pacific Journal of Business Research, 10(1), 1–22.

Ventura, A. C., & Moscoloni, N. (2015). Learning styles and disciplinary differences: A cross-sectional study of undergraduate students. International Journal of Learning and Teaching , 1 (2), 88–93.

Vlachopoulos, D., & Makri, A. (2017). The effect of games and simulations on higher education: A systematic literature review. International Journal of Educational Technology in Higher Education , 14 , #22.

Wang, Y., Huang, X., & Schunn, C. D. (2019). Redesigning flipped classrooms: A learning model and its effects on student perceptions. Higher Education , 78 , 711–728.

Wingo, N. P., Ivankova, N. V., & Moss, J. A. (2017). Faculty perceptions about teaching online: Exploring the literature using the technology acceptance model as an organizing framework. Online Learning , 21 (1), 15–35.

Xu, D., & Jaggars, S. S. (2014). Performance gaps between online and face-to-face courses: Differences across types of students and academic subject areas. Journal of Higher Education , 85 (5), 633–659.

Young, S. (2006). Student views of effective online teaching in higher education. American Journal of Distance Education , 20 (2), 65–77.

Zawacki-Richter, O., & Naidu, S. (2016). Mapping research trends from 35 years of publications in Distance Education. Distance Education, 37(3), 245–269.


Acknowledgements

No external funding.

Author information

Authors and Affiliations

Development for the JHB College of Business and Public Administration, 5500 University Parkway, San Bernardino, California, 92407, USA

Montgomery Van Wart, Anna Ni, Pamela Medina, Jesus Canelon, Melika Kordrostami, Jing Zhang & Yu Liu


Contributions

Equal. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Montgomery Van Wart .

Ethics declarations

Competing interests

We have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Van Wart, M., Ni, A., Medina, P. et al. Integrating students’ perspectives about online learning: a hierarchy of factors. Int J Educ Technol High Educ 17 , 53 (2020). https://doi.org/10.1186/s41239-020-00229-8


Received : 29 April 2020

Accepted : 30 July 2020

Published : 02 December 2020

DOI : https://doi.org/10.1186/s41239-020-00229-8


  • Online education
  • Online teaching
  • Student perceptions
  • Online quality
  • Student presence


How Effective Is Online Learning? What the Research Does and Doesn’t Tell Us


Editor’s Note: This is part of a series on the practical takeaways from research.

The times have dictated school closings and the rapid expansion of online education. Can online lessons replace in-school time?

Clearly online time cannot provide many of the informal social interactions students have at school, but how will online courses do in terms of moving student learning forward? Research to date gives us some clues and also points us to what we could be doing to support students who are most likely to struggle in the online setting.

The use of virtual courses among K-12 students has grown rapidly in recent years. Florida, for example, requires all high school students to take at least one online course. Online learning can take a number of different forms. Often people think of Massive Open Online Courses, or MOOCs, where thousands of students watch a video online and fill out questionnaires or take exams based on those lectures.


Most online courses, however, particularly those serving K-12 students, have a format much more similar to in-person courses. The teacher helps to run virtual discussion among the students, assigns homework, and follows up with individual students. Sometimes these courses are synchronous (teachers and students all meet at the same time) and sometimes they are asynchronous (non-concurrent). In both cases, the teacher is supposed to provide opportunities for students to engage thoughtfully with subject matter, and students, in most cases, are required to interact with each other virtually.


Online courses provide opportunities for students. Students in a school that doesn’t offer statistics classes may be able to learn statistics with virtual lessons. If students fail algebra, they may be able to catch up during evenings or summer using online classes, and not disrupt their math trajectory at school. So, almost certainly, online classes sometimes benefit students.

In comparisons of online and in-person classes, however, online classes aren’t as effective as in-person classes for most students. Only a little research has assessed the effects of online lessons for elementary and high school students, and even less has used the “gold standard” method of comparing the results for students assigned randomly to online or in-person courses. Jessica Heppen and colleagues at the American Institutes for Research and the University of Chicago Consortium on School Research randomly assigned students who had failed second semester Algebra I to either face-to-face or online credit recovery courses over the summer. Students’ credit-recovery success rates and algebra test scores were lower in the online setting. Students assigned to the online option also rated their class as more difficult than did their peers assigned to the face-to-face option.
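A comparison like Heppen's can be illustrated with a toy calculation. The sketch below is hypothetical (the pass counts are invented, not the study's data), but it shows the kind of question such a randomized comparison asks: is the gap in credit-recovery success rates between the two randomly assigned groups larger than chance alone would produce?

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test: is the gap between two success rates
    larger than sampling noise alone would produce?"""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Invented counts for illustration only (not Heppen et al.'s data):
# 215 of 300 face-to-face students recover the credit vs. 180 of 300 online.
z, p = two_proportion_z(215, 300, 180, 300)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
```

With random assignment, a small p-value supports reading the gap as an effect of the delivery mode rather than of pre-existing differences between the groups.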

Most of the research on online courses for K-12 students has used large-scale administrative data, looking at otherwise similar students in the two settings. One of these studies, by June Ahn of New York University and Andrew McEachin of the RAND Corp., examined Ohio charter schools; I did another with colleagues looking at Florida public school coursework. Both studies found evidence that online coursetaking was less effective.

About this series


This essay is the fifth in a series that aims to put the pieces of research together so that education decisionmakers can evaluate which policies and practices to implement.

The conveners of this project—Susanna Loeb, the director of Brown University’s Annenberg Institute for School Reform, and Harvard education professor Heather Hill—have received grant support from the Annenberg Institute for this series.

To suggest other topics for this series or join in the conversation, use #EdResearchtoPractice on Twitter.


It is not surprising that in-person courses are, on average, more effective. Being in person with teachers and other students creates social pressures and benefits that can help motivate students to engage. Some students do as well in online courses as in in-person courses, some may actually do better, but, on average, students do worse in the online setting, and this is particularly true for students with weaker academic backgrounds.

Students who struggle in in-person classes are likely to struggle even more online. While the research on virtual schools in K-12 education doesn’t address these differences directly, a study of college students that I worked on with Stanford colleagues found very little difference in learning for high-performing students in the online and in-person settings. On the other hand, lower-performing students performed meaningfully worse in online courses than in in-person courses.

But just because students who struggle in in-person classes are even more likely to struggle online doesn’t mean that’s inevitable. Online teachers will need to consider the needs of less-engaged students and work to engage them. Online courses might be made to work for these students on average, even if they have not in the past.

Just like in brick-and-mortar classrooms, online courses need a strong curriculum and strong pedagogical practices. Teachers need to understand what students know and what they don’t know, as well as how to help them learn new material. What is different in the online setting is that students may have more distractions and less oversight, which can reduce their motivation. The teacher will need to set norms for engagement—such as requiring students to regularly ask questions and respond to their peers—that are different than the norms in the in-person setting.

Online courses are generally not as effective as in-person classes, but they are certainly better than no classes. A substantial research base developed by Karl Alexander at Johns Hopkins University and many others shows that students, especially students with fewer resources at home, learn less when they are not in school. Right now, virtual courses are allowing students to access lessons and exercises and interact with teachers in ways that would have been impossible if an epidemic had closed schools even a decade or two earlier. So we may be skeptical of online learning, but it is also time to embrace and improve it.

A version of this article appeared in the April 01, 2020 edition of Education Week as How Effective Is Online Learning?



Original Research Article

A Comparative Analysis of Student Performance in an Online vs. Face-to-Face Environmental Science Course From 2009 to 2016


  • Department of Biology, Fort Valley State University, Fort Valley, GA, United States

A growing number of students are now opting for online classes. They find the traditional classroom modality restrictive, inflexible, and impractical. In this age of technological advancement, schools can now provide effective classroom teaching via the Web. This shift in pedagogical medium is forcing academic institutions to rethink how they want to deliver their course content. The overarching purpose of this research was to determine which teaching method proved more effective over the 8-year period. The scores of 548 students, 401 traditional students and 147 online students, in an environmental science class were used to determine which instructional modality generated better student performance. In addition to the overarching objective, we examined score variabilities between genders and classifications to determine if teaching modality had a greater impact on specific groups. No significant difference in student performance between online and face-to-face (F2F) learners was found overall, with respect to gender, or with respect to class rank. These data demonstrate the ability to similarly translate environmental science concepts for non-STEM majors in both traditional and online platforms irrespective of gender or class rank. A potential exists for increasing the number of non-STEM majors engaged in citizen science using the flexibility of online learning to teach environmental science core concepts.

Introduction

The advent of online education has made it possible for students with busy lives and limited flexibility to obtain a quality education. As opposed to traditional classroom teaching, Web-based instruction has made it possible to offer classes worldwide through a single Internet connection. Although it boasts several advantages over traditional education, online instruction still has its drawbacks, including limited communal synergies. Still, online education seems to be the path many students are taking to secure a degree.

This study compared the effectiveness of online vs. traditional instruction in an environmental studies class. Using a single indicator, we attempted to see if student performance was affected by instructional medium. This study sought to compare online and F2F teaching on three levels—pure modality, gender, and class rank. Through these comparisons, we investigated whether one teaching modality was significantly more effective than the other. Although there were limitations to the study, this examination was conducted to provide us with additional measures to determine if students performed better in one environment over another (Mozes-Carmel and Gold, 2009).
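The overall modality comparison described above can be sketched as a two-sample test on final scores. The code below is a minimal illustration with invented data: the group sizes (401 face-to-face, 147 online) come from the abstract, but the scores are simulated, and the test shown (Welch's t with a normal approximation to the p-value, reasonable at these sample sizes) is an assumption, not necessarily the authors' exact procedure.

```python
import math
import random
import statistics

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    mean_a, mean_b = statistics.fmean(sample_a), statistics.fmean(sample_b)
    var_a, var_b = statistics.variance(sample_a), statistics.variance(sample_b)
    se = math.sqrt(var_a / len(sample_a) + var_b / len(sample_b))
    return (mean_a - mean_b) / se

def two_sided_p(t):
    """Normal approximation to the two-sided p-value (fine for large n)."""
    return 2 * (1 - 0.5 * (1 + math.erf(abs(t) / math.sqrt(2))))

# Simulated course scores: both groups drawn from the same distribution,
# mirroring only the paper's group sizes (401 F2F, 147 online).
random.seed(42)
f2f_scores = [random.gauss(75, 10) for _ in range(401)]
online_scores = [random.gauss(75, 10) for _ in range(147)]

t = welch_t(f2f_scores, online_scores)
print(f"t = {t:.3f}, two-sided p = {two_sided_p(t):.3f}")
```

A p-value above 0.05 here corresponds to the paper's finding of no significant difference; the same comparison can be repeated within gender or class-rank subgroups.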

The methods, procedures, and operationalization tools used in this assessment can be expanded upon in future quantitative, qualitative, and mixed method designs to further analyze this topic. Moreover, the results of this study serve as a backbone for future meta-analytical studies.

Origins of Online Education

Computer-assisted instruction is changing the pedagogical landscape as an increasing number of students are seeking online education. Colleges and universities are now touting the efficiencies of Web-based education and are rapidly implementing online classes to meet student needs worldwide. One study reported “increases in the number of online courses given by universities have been quite dramatic over the last couple of years” ( Lundberg et al., 2008 ). Think tanks are also disseminating statistics on Web-based instruction. “In 2010, the Sloan Consortium found a 17% increase in online students from the years before, beating the 12% increase from the previous year” ( Keramidas, 2012 ).

Contrary to popular belief, online education is not a new phenomenon. The first correspondence and distance learning programs were initiated in the mid-1800s by the University of London. Because this model of learning depended on the postal service, it did not reach America until the late nineteenth century: in 1873, what is considered the first official correspondence educational program, known as the “Society to Encourage Home Studies,” was established in Boston, Massachusetts. Since then, non-traditional study has grown into the online instructional modality of today. Technological advancement undoubtedly improved the speed and accessibility of distance learning courses; students worldwide can now attend classes from the comfort of their own homes.

Qualities of Online and Traditional Face to Face (F2F) Classroom Education

Online and traditional education share many qualities. Students are still required to attend class, learn the material, submit assignments, and complete group projects, while teachers still have to design curricula, maximize instructional quality, answer class questions, motivate students to learn, and grade assignments. Despite these basic similarities, there are many differences between the two modalities. Traditionally, classroom instruction is known to be teacher-centered and requires passive learning by the student, while online instruction is often student-centered and requires active learning.

In teacher-centered, or passive learning, the instructor usually controls classroom dynamics. The teacher lectures and comments, while students listen, take notes, and ask questions. In student-centered, or active learning, the students usually determine classroom dynamics as they independently analyze the information, construct questions, and ask the instructor for clarification. In this scenario, the teacher, not the student, is listening, formulating, and responding ( Salcedo, 2010 ).

In education, change comes with questions. Despite all current reports championing online education, researchers are still questioning its efficacy. Research is still being conducted on the effectiveness of computer-assisted teaching. Cost-benefit analysis, student experience, and student performance are now being carefully considered when determining whether online education is a viable substitute for classroom teaching. This decision process will most probably carry into the future as technology improves and as students demand better learning experiences.

Thus far, “literature on the efficacy of online courses is expansive and divided” (Driscoll et al., 2012). Some studies favor traditional classroom instruction, stating “online learners will quit more easily” and “online learning can lack feedback for both students and instructors” (Atchley et al., 2013). Because of these shortcomings, student retention, satisfaction, and performance can be compromised. Like traditional teaching, distance learning also has its apologists, who aver that online education produces students who perform as well as or better than their traditional classroom counterparts (Westhuis et al., 2006).

The advantages and disadvantages of both instructional modalities need to be fully fleshed out and examined to truly determine which medium generates better student performance. Both modalities have been proven to be relatively effective, but, as mentioned earlier, the question to be asked is if one is truly better than the other.

Student Need for Online Education

With technological advancement, learners now want quality programs they can access from anywhere and at any time. Because of these demands, online education has become a viable, alluring option to business professionals, stay-at home-parents, and other similar populations. In addition to flexibility and access, multiple other face value benefits, including program choice and time efficiency, have increased the attractiveness of distance learning ( Wladis et al., 2015 ).

First, prospective students want to be able to receive a quality education without having to sacrifice work time, family time, and travel expense. Instead of having to be at a specific location at a specific time, online educational students have the freedom to communicate with instructors, address classmates, study materials, and complete assignments from any Internet-accessible point ( Richardson and Swan, 2003 ). This type of flexibility grants students much-needed mobility and, in turn, helps make the educational process more enticing. According to Lundberg et al. (2008) “the student may prefer to take an online course or a complete online-based degree program as online courses offer more flexible study hours; for example, a student who has a job could attend the virtual class watching instructional film and streaming videos of lectures after working hours.”

Moreover, more study time can lead to better class performance—more chapters read, better quality papers, and more group project time. Studies on the relationship between study time and performance are limited; however, it is often assumed the online student will use any surplus time to improve grades ( Bigelow, 2009 ). It is crucial to mention the link between flexibility and student performance as grades are the lone performance indicator of this research.

Second, online education also offers more program choices. With traditional classroom study, students are forced to take courses only at universities within feasible driving distance or move. Web-based instruction, on the other hand, grants students electronic access to multiple universities and course offerings ( Salcedo, 2010 ). Therefore, students who were once limited to a few colleges within their immediate area can now access several colleges worldwide from a single convenient location.

Third, with online teaching, students who usually don't participate in class may now voice their opinions and concerns. As they are not in a classroom setting, quieter students may feel more comfortable partaking in class dialogue without being recognized or judged. This, in turn, may increase average class scores ( Driscoll et al., 2012 ).

Benefits of Face-to-Face (F2F) Education via Traditional Classroom Instruction

The other modality, classroom teaching, is a well-established instructional medium in which teaching style and structure have been refined over several centuries. Face-to-face instruction has numerous benefits not found in its online counterpart ( Xu and Jaggars, 2016 ).

First and perhaps most importantly, classroom instruction is extremely dynamic. Traditional classroom teaching provides real-time face-to-face instruction and sparks innovative questions. It also allows for immediate teacher response and more flexible content delivery. Online instruction dampens the learning process because students must limit their questions to blurbs, then grant the teacher and fellow classmates time to respond (Salcedo, 2010). Over time, however, online teaching will probably improve, enhancing classroom dynamics and bringing students face-to-face with their peers/instructors. For now, though, face-to-face instruction provides dynamic learning attributes not found in Web-based teaching (Kemp and Grieve, 2014).

Second, traditional classroom learning is a well-established modality. Some students are opposed to change and view online instruction negatively. These students may be technophobes, more comfortable with sitting in a classroom taking notes than sitting at a computer absorbing data. Other students may value face-to-face interaction, pre- and post-class discussions, communal learning, and organic student-teacher bonding (Roval and Jordan, 2004). They may see the Internet as an impediment to learning. If not comfortable with the instructional medium, some students may shun classroom activities; their grades might slip and their educational interest might vanish. Students, however, may eventually adapt to online education. With more universities employing computer-based training, students may be forced to take only Web-based courses. Even so, some students will continue to prefer classroom intimacy.

Third, face-to-face instruction doesn't rely upon networked systems. In online learning, the student is dependent upon access to an unimpeded Internet connection. If technical problems occur, online students may not be able to communicate, submit assignments, or access study material. This problem, in turn, may frustrate the student, hinder performance, and discourage learning.

Fourth, campus education provides students with both accredited staff and research libraries. Students can rely upon administrators to aid in course selection and provide professorial recommendations. Library technicians can help learners edit their papers, locate valuable study material, and improve study habits. Research libraries may provide materials not accessible by computer. In all, the traditional classroom experience gives students important auxiliary tools to maximize classroom performance.

Fifth, traditional classroom degrees trump online educational degrees in terms of hiring preferences. Many academic and professional organizations do not consider online degrees on par with campus-based degrees ( Columbaro and Monaghan, 2009 ). Often, prospective hiring bodies think Web-based education is a watered-down, simpler means of attaining a degree, often citing poor curriculums, unsupervised exams, and lenient homework assignments as detriments to the learning process.

Finally, research shows online students are more likely to quit class if they do not like the instructor, the format, or the feedback. Because they work independently, relying almost wholly upon self-motivation and self-direction, online learners may be more inclined to withdraw from class if they do not get immediate results.

The classroom setting provides more motivation, encouragement, and direction. Even if a student wanted to quit during the first few weeks of class, he/she may be deterred by the instructor and fellow students. F2F instructors may be able to adjust the structure and teaching style of the class to improve student retention (Kemp and Grieve, 2014). With online teaching, instructors are limited to electronic correspondence and may not pick up on verbal and non-verbal cues.

Both F2F and online teaching have their pros and cons. More studies comparing the two modalities to achieve specific learning outcomes in participating learner populations are required before well-informed decisions can be made. This study examined the two modalities over eight (8) years on three different levels. Based on the aforementioned information, the following research questions resulted.

RQ1: Are there significant differences in academic performance between online and F2F students enrolled in an environmental science course?

RQ2: Are there gender differences between online and F2F student performance in an environmental science course?

RQ3: Are there significant differences between the performance of online and F2F students in an environmental science course with respect to class rank?

The results of this study are intended to edify teachers, administrators, and policymakers on which medium may work best.

Methodology

Participants

The study sample consisted of 548 FVSU students who completed the Environmental Science class between 2009 and 2016. The final course grades of the participants served as the primary comparative factor in assessing performance differences between online and F2F instruction. Of the 548 total participants, 147 were online students and 401 were traditional students; this disparity was considered a limitation of the study. Of the 548 students, 246 were male and 302 were female. The study included students from all four class ranks: 187 freshmen, 184 sophomores, 76 juniors, and 101 seniors. This was a convenience, non-probability sample, so the composition of the study set was left to the discretion of the instructor. No special preferences or weights were given to students based upon gender or rank. Each student was considered a single, discrete entity or statistic.
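The subgroup counts reported above can be cross-checked with a few lines of code; each grouping should partition the full sample of 548 students (all numbers are taken directly from the text):

```python
# Subgroup counts reported in the Participants section.
N = 548
modality = {"online": 147, "f2f": 401}
gender = {"male": 246, "female": 302}
rank = {"freshman": 187, "sophomore": 184, "junior": 76, "senior": 101}

# Each grouping should sum to the full sample size.
for name, groups in [("modality", modality), ("gender", gender), ("rank", rank)]:
    total = sum(groups.values())
    assert total == N, f"{name} counts sum to {total}, expected {N}"
print("all subgroup counts sum to", N)
```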

All sections of the course were taught by a full-time biology professor at FVSU. The professor had over 10 years of teaching experience in both online and F2F modalities. The professor was considered an outstanding tenured instructor with strong communication and management skills.

The F2F class met twice weekly in an on-campus classroom. Each class lasted 1 h and 15 min. The online class covered the same material as the F2F class, but was delivered wholly online using the Desire2Learn (D2L) e-learning system. Online students were expected to spend as much time studying as their F2F counterparts; however, no tracking measure was implemented to gauge e-learning study time. The professor combined textbook learning, lecture and class discussion, collaborative projects, and assessment tasks to engage students in the learning process.

This study did not differentiate between part-time and full-time students. Therefore, many part-time students may have been included in this study. This study also did not differentiate between students registered primarily at FVSU or at another institution. Therefore, many students included in this study may have used FVSU as an auxiliary institution to complete their environmental science class requirement.

Test Instruments

In this study, student performance was operationalized by final course grades. The final course grade was derived from test, homework, class participation, and research project scores. The four aforementioned assessments were valid and relevant; they were useful in gauging student ability and generating objective performance measurements. The final grades were converted from numerical scores to traditional GPA letters.
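The paper does not specify the numerical-to-letter conversion scale, so the cutoffs in the sketch below are an assumed standard 10-point scale, purely for illustration of the final step described above:

```python
def to_letter(score: float) -> str:
    """Convert a numerical course score to a letter grade.

    The 10-point cutoffs here are an assumption for illustration;
    the study's actual conversion scale is not given in the text.
    """
    if score >= 90:
        return "A"
    if score >= 80:
        return "B"
    if score >= 70:
        return "C"
    if score >= 60:
        return "D"
    return "F"

print(to_letter(85))  # prints "B"
```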

Data Collection Procedures

The sample of 548 student grades was obtained from FVSU's Office of Institutional Research Planning and Effectiveness (OIRPE). The OIRPE released the grades to the instructor with the expectation that the instructor would maintain confidentiality and not disclose said information to third parties. After the data was obtained, the instructor analyzed and processed the data through SPSS software to calculate specific values. These converted values were subsequently used to draw conclusions and evaluate the hypotheses.

Summary of the Results

The chi-square analysis showed no significant difference in student performance between online and face-to-face (F2F) learners [χ2(4, N = 548) = 6.531, p > 0.05]. The independent sample t-test showed no significant difference in student performance between online and F2F learners with respect to gender [t(145) = 1.42, p = 0.122]. The two-way ANOVA showed no significant difference in student performance between online and F2F learners with respect to class rank (Girard et al., 2016).

Research Question 1

The first research question investigated whether there was a statistically significant difference in academic performance between online and F2F students.

To investigate the first research question, we used a traditional chi-square test. The chi-square analysis is particularly useful for this type of comparison because it indicates whether the relationship between teaching modality and performance observed in our sample can be extended to the larger population. The test yields a numerical result that can be used to determine whether there is a statistically significant difference between the two groups.
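As a sketch of this analysis outside SPSS, SciPy's `chi2_contingency` runs the same test on a modality × grade contingency table. The cell counts below are hypothetical, chosen only so the row totals match the study's group sizes (401 F2F, 147 online); the study's actual cell counts are in its Table 2:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: F2F, online; columns: A, B, C, D, F (hypothetical counts).
observed = np.array([
    [28, 100, 150, 70, 53],   # F2F row sums to 401
    [16,  40,  50, 25, 16],   # online row sums to 147
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.3f}")
if p > 0.05:
    print("fail to reject the null: no significant difference by modality")
```

With a 2 × 5 table the test has (2 − 1)(5 − 1) = 4 degrees of freedom, matching the df = 4 reported in the study.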

Table 1 shows the mean and SD by modality and by gender. It is a general breakdown of numbers to visually elucidate any differences between scores and deviations. The mean final score for both modalities is similar, with F2F learners at 69.35 and online learners at 68.64; both groups had fairly similar SDs. A stronger difference can be seen between the GPAs earned by men and women: men had a 3.23 mean GPA, while women had a 2.90 mean GPA. The SDs for both groups were almost identical. Even though the 0.33 difference may look fairly insignificant, it should be noted that a 3.23 is approximately a B+ while a 2.90 is approximately a B. Given a categorical range of only A to F, a plus differential can be considered significant.


Table 1. Means and standard deviations for the 8-semester Environmental Science data set.

The mean grade for men in the environmental science classes (M = 3.23, N = 246, SD = 1.19) was higher than the mean grade for women (M = 2.90, N = 302, SD = 1.20) (see Table 1).

First, a chi-square analysis was performed using SPSS to determine whether there was a statistically significant difference in grade distribution between online and F2F students. Students enrolled in the F2F class earned the larger share of the A's awarded (63.60%, vs. 36.40% for online students). Table 2 displays grade distribution by course delivery modality. The difference in student performance was not statistically significant, χ2(4, N = 548) = 6.531, p > 0.05. Table 3 shows the gender differences in student performance between online and F2F students.


Table 2. Contingency table for students' academic performance (N = 548).


Table 3. Gender × performance crosstabulation.

Table 2 shows the performance of online and F2F students by grade category. As can be seen, F2F students generated the highest raw counts in each grade category. However, this disparity was mostly due to the higher number of F2F students in the study: there were 401 F2F students as opposed to just 147 online students. When grades are viewed as a percentage of each modality's enrollment, the differences between respective learners are smaller (Tanyel and Griffin, 2014). For example, F2F learners earned 28 A's (63.60% of the total A's earned) while online learners earned 16 A's (36.40%). However, when the A grade is viewed with respect to total learners in each modality, 28 of the 401 F2F students (7.0%) earned A's, as compared to 16 of 147 (10.9%) online learners. By this measure, online learners scored relatively higher in this grade category. The latter measure (grade total as a percent of modality total) is a better reflection of respective performance levels.
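The two ways of reading the A-grade counts can be made explicit in code: the share of all A's awarded (which favors the larger F2F group) versus the share of each modality's own enrollment (the fairer comparison). The counts come from the paragraph above:

```python
a_f2f, a_online = 28, 16      # A's earned per modality (Table 2)
n_f2f, n_online = 401, 147    # enrollment per modality

total_a = a_f2f + a_online

# Share of all A's awarded: dominated by the bigger group.
print(f"F2F share of A's:    {a_f2f / total_a:.1%}")     # ~63.6%
print(f"online share of A's: {a_online / total_a:.1%}")  # ~36.4%

# Share of each modality's own enrollment: online comes out ahead.
print(f"F2F A rate:    {a_f2f / n_f2f:.1%}")
print(f"online A rate: {a_online / n_online:.1%}")
```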

Given a critical value of 9.49 (d.f. = 4, α = 0.05), the computed chi-square of 6.531 fell below the rejection threshold. The corresponding p-value of 0.163 was greater than our 0.05 significance level. We therefore failed to reject the null hypothesis: there is no statistically significant difference between the two groups in terms of performance scores.
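The reported decision can be reproduced from the test statistic alone: `scipy.stats.chi2.sf` gives the upper-tail p-value for χ² = 6.531 at 4 degrees of freedom, and `chi2.ppf` gives the 0.05 critical value (the statistic and df are taken from the text):

```python
from scipy.stats import chi2

statistic, df, alpha = 6.531, 4, 0.05

p_value = chi2.sf(statistic, df)      # upper-tail probability
critical = chi2.ppf(1 - alpha, df)    # 0.05 critical value at df = 4

print(f"p = {p_value:.3f}")                # ~0.163, as reported
print(f"critical value = {critical:.3f}")  # ~9.488
print("significant" if statistic > critical else "not significant")
```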

Research Question 2

The second research question asked whether online and F2F student performance varied with respect to gender. Table 3 shows the gender differences in student performance between online and face-to-face students. We used a chi-square test, with alpha equal to 0.05 as the criterion for significance, to determine whether online and F2F student performance differed with respect to gender. The chi-square result shows that there is no statistically significant difference between men and women in terms of performance.

Research Question 3

The third research question asked whether online and F2F student performance varied with respect to class rank.

Table 4 shows the mean scores and standard deviations of freshman, sophomore, junior, and senior students for both online and F2F modalities. To test the third hypothesis, we used a two-way ANOVA. The ANOVA is a useful tool for this hypothesis because it tests the differences between multiple means: instead of testing specific pairwise differences, it generates a much broader picture of average differences. The ANOVA showed no significant difference between online and F2F learners with respect to class rank. Therefore, we fail to reject the null hypothesis.
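For readers without SPSS, a balanced two-way ANOVA (modality × class rank) can be computed directly with NumPy sums of squares and SciPy's F distribution. The data below are synthetic, drawn from a single distribution so that, as in the study, no effect should be expected; the cell size and score scale are assumptions for illustration only:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical balanced design: 2 modalities x 4 class ranks, n = 10 per cell.
a, b, n = 2, 4, 10
y = rng.normal(loc=2.9, scale=0.8, size=(a, b, n))  # synthetic GPA-like scores

grand = y.mean()
mean_a = y.mean(axis=(1, 2))   # modality means
mean_b = y.mean(axis=(0, 2))   # class-rank means
mean_ab = y.mean(axis=2)       # cell means

# Sums of squares for the balanced two-way layout.
ss_a = n * b * ((mean_a - grand) ** 2).sum()
ss_b = n * a * ((mean_b - grand) ** 2).sum()
ss_ab = n * ((mean_ab - mean_a[:, None] - mean_b[None, :] + grand) ** 2).sum()
ss_e = ((y - mean_ab[:, :, None]) ** 2).sum()

df_a, df_b = a - 1, b - 1
df_ab, df_e = df_a * df_b, a * b * (n - 1)

ms_e = ss_e / df_e
f_a = (ss_a / df_a) / ms_e     # main effect: modality
f_b = (ss_b / df_b) / ms_e     # main effect: class rank
f_ab = (ss_ab / df_ab) / ms_e  # modality x rank interaction

p_a = stats.f.sf(f_a, df_a, df_e)
p_b = stats.f.sf(f_b, df_b, df_e)
p_ab = stats.f.sf(f_ab, df_ab, df_e)

print(f"modality:    F = {f_a:.3f}, p = {p_a:.3f}")
print(f"class rank:  F = {f_b:.3f}, p = {p_b:.3f}")
print(f"interaction: F = {f_ab:.3f}, p = {p_ab:.3f}")
```

A useful sanity check on this decomposition is that the four sums of squares add back up to the total sum of squares.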


Table 4. Descriptive analysis of student performance by class rank and gender.

The results of the ANOVA, presented in Table 5, show there is no significant difference in performance between online and F2F students with respect to class rank.


Table 5. Analysis of variance (ANOVA) for online and F2F learners by class rank.


Discussion and Social Implications

The results of the study show there is no significant difference in performance between online and traditional classroom students with respect to modality, gender, or class rank in a science concepts course for non-STEM majors. Although there were sample size issues and study limitations, this assessment shows that online learners and classroom learners perform at the same level. This conclusion indicates that teaching modality may not matter as much as other factors. Given the relatively sparse data comparing pedagogical modalities for specific student populations, this study could be considered innovative. In the current literature, we have not found a study of this nature comparing online and F2F non-STEM majors with respect to three separate factors (medium, gender, and class rank) and the ability to learn science concepts and achieve learning outcomes. Previous studies have compared online learning vs. F2F learning on other factors, including specific courses, costs, and qualitative analyses, but rarely with respect to outcomes tied to the population characteristics of a specific science concepts course over many years (Liu, 2005).

In a study evaluating the transformation of a graduate level course for teachers, academic quality of the online course and learning outcomes were evaluated. The study evaluated the ability of course instructors to design the course for online delivery and develop various interactive multimedia models at a cost-savings to the respective university. The online learning platform proved effective in translating information where tested students successfully achieved learning outcomes comparable to students taking the F2F course ( Herman and Banister, 2007 ).

Another study evaluated the similarities and differences in F2F and online learning in a non-STEM course, “Foundations of American Education,” along with overall course satisfaction among students enrolled in either modality. F2F and online course satisfaction was analyzed both qualitatively and quantitatively. When quantitative feedback was analyzed, online course satisfaction was lower than F2F satisfaction; when qualitative data were used, course satisfaction was similar between modalities (Werhner, 2010). The course satisfaction data and feedback were used to suggest a number of recommendations for effective online learning in the specific course. The researcher concluded that there was no difference in the learning success of students enrolled in the online vs. F2F course, stating that “in terms of learning, students who apply themselves diligently should be successful in either format” (Dell et al., 2010). The author's conclusion presumes that the “issues surrounding class size are under control and that the instructor has a course load that makes the intensity of the online course workload feasible,” with the authors concluding that the workload for online courses is greater than for F2F courses (Stern, 2004).

In “A Meta-Analysis of Three Types of Interaction Treatments in Distance Education,” Bernard et al. (2009) conducted a meta-analysis evaluating three types of instructional and/or media conditions designed into distance education (DE) courses known as interaction treatments (ITs)—student–student (SS), student–teacher (ST), or student–content (SC) interactions—to other DE instructional/interaction treatments. The researchers found that a strong association existed between the integration of these ITs into distance education courses and achievement compared with blended or F2F modalities of learning. The authors speculated that this was due to increased cognitive engagement based in these three interaction treatments ( Larson and Sung, 2009 ).

Other studies evaluating students' preferences (but not efficacy) for online vs. F2F learning found that students prefer online learning when it is offered, depending on the course topic and the online course technology platform (Ary and Brune, 2011). F2F learning was preferred when courses were offered late morning or early afternoon 2–3 days per week. A significant preference for online learning was found across all undergraduate course topics (American history and government, humanities, natural sciences, social and behavioral sciences, diversity, and international dimension) except English composition and oral communication. Students also expressed a preference for analytical and quantitative thought courses, though not with statistically significant results (Mann and Henneberry, 2014). In this research study, we examined three hypotheses comparing online and F2F learning. In each case, the null hypothesis was accepted: at no level of examination did we find a significant difference between online and F2F learners. This finding is important because it suggests that traditional-style teaching, with its heavy emphasis on interpersonal classroom dynamics, may one day be replaced by online instruction. According to Daymont and Blau (2008), online learners, regardless of gender or class rank, learn as much from electronic interaction as they do from personal interaction. Kemp and Grieve (2014) also found that both online and F2F learning for psychology students led to similar academic performance. Given the cost efficiencies and flexibility of online education, the use of Web-based instructional systems may rise rapidly.

A number of studies support the economic benefits of online vs. F2F learning, despite differences in social constructs and educational support provided by governments. In a study by Li and Chen (2012), higher education institutions benefited the most from two of four outputs (research outputs and distance education), with teaching via distance education at both the undergraduate and graduate levels more profitable than F2F teaching at higher education institutions in China. Zhang and Worthington (2017) reported an increasing cost benefit for the use of distance education over F2F instruction at 37 Australian public universities over the 9 years from 2003 to 2012. Maloney et al. (2015) and Kemp and Grieve (2014) also found significant savings in higher education when using online learning platforms vs. F2F learning. In the West, the cost efficiency of online learning has been demonstrated by several research studies (Craig, 2015). Studies by Agasisti and Johnes (2015) and Bartley and Golek (2004) both found the cost benefits of online learning significantly greater than those of F2F learning at U.S. institutions.

Knowing there is no significant difference in student performance between the two mediums, institutions of higher education may make the gradual shift away from traditional instruction; they may implement Web-based teaching to capture a larger worldwide audience. If administered correctly, this shift to Web-based teaching could lead to a larger buyer population, more cost efficiencies, and more university revenue.

The social implications of this study should be touted; however, several concerns regarding generalizability need to be taken into account. First, this study focused solely on students from an environmental studies class for non-STEM majors. The ability to effectively prepare students for scientific professions without hands-on experimentation has been contested. As a course that functions to communicate scientific concepts but does not require a laboratory-based component, these results may not translate into similar performance by students in an online STEM course for STEM majors, or in an online course with an online laboratory-based co-requisite, when compared to students taking traditional STEM courses for STEM majors. A few studies suggest the landscape may be changing with the ability to effectively train students in STEM core concepts via online learning. Biel and Brame (2016) reported successfully translating the academic success of F2F undergraduate biology courses to online biology courses. However, of the large-scale courses they analyzed, two showed F2F sections outperforming online sections, and three found no significant difference. A study by Beale et al. (2014) comparing F2F learning with hybrid learning in an embryology course found no difference in overall student performance; additionally, the bottom quartile of students showed no differential effect of the delivery method on examination scores. Further, a study by Lorenzo-Alvarez et al. (2019) found that radiology education in an online learning platform resulted in academic outcomes similar to F2F learning. Larger-scale research is needed to determine the effectiveness of STEM online learning and outcomes assessments, including workforce development results.

In our research study, it is possible the study participants may have been more knowledgeable about environmental science than about other subjects. Therefore, it should be noted this study focused solely on students taking this one particular class. Given the results, this course presents a unique potential for increasing the number of non-STEM majors engaged in citizen science using the flexibility of online learning to teach environmental science core concepts.

Second, the operationalization measure of “grade” or “score” to determine performance level may be lacking in scope and depth. The grades received in a class may not necessarily show actual ability, especially if the weights were adjusted to heavily favor group tasks and writing projects. Other performance indicators may be better suited to properly assess student performance. A single exam containing both multiple choice and essay questions may be a better operationalization indicator of student performance, as it would provide both a quantitative and a qualitative measure of subject matter comprehension.

Third, the nature of the student sample must be further dissected. It is possible the online students in this study may have had more time than their counterparts to learn the material and generate better grades ( Summers et al., 2005 ). The inverse holds true, as well. Because this was a convenience non-probability sampling, the chances of actually getting a fair cross section of the student population were limited. In future studies, greater emphasis must be placed on selecting proper study participants, those who truly reflect proportions, types, and skill levels.

This study was relevant because it addressed an important educational topic; it compared two student groups on multiple levels using a single operationalized performance measure. More studies of this nature, however, need to be conducted before truly positing that online and F2F teaching generate the same results. Future studies need to eliminate spurious causal relationships and increase generalizability; this will maximize the chances of generating definitive, untainted results. This scientific inquiry and comparison into online and traditional teaching will undoubtedly garner more attention in the coming years.

Our study compared learning via F2F vs. online learning modalities in teaching an environmental science course additionally evaluating factors of gender and class rank. These data demonstrate the ability to similarly translate environmental science concepts for non-STEM majors in both traditional and online platforms irrespective of gender or class rank. The social implications of this finding are important for advancing access to and learning of scientific concepts by the general population, as many institutions of higher education allow an online course to be taken without enrolling in a degree program. Thus, the potential exists for increasing the number of non-STEM majors engaged in citizen science using the flexibility of online learning to teach environmental science core concepts.

Limitations of the Study

The limitations of the study centered around the nature of the sample group, student skills/abilities, and student familiarity with online instruction. First, because this was a convenience, non-probability sample, the independent variables were not adjusted for real-world accuracy. Second, student intelligence and skill level were not taken into consideration when separating out comparison groups; the F2F learners in this study may have been more capable than the online students, or vice versa. This limitation also applies to gender and class rank differences (Friday et al., 2006). Finally, there may have been familiarity issues between the two sets of learners. Experienced traditional classroom students now taking Web-based courses may be daunted by the technical aspects of the modality; they may not have had the necessary preparation or experience to e-learn efficiently, leading to lowered scores (Helms, 2014). In addition to comparing online and F2F instructional efficacy, future research should also analyze blended teaching methods, evaluating whether blended courses for non-STEM majors impart basic STEM concepts more effectively than either pure style.

Data Availability Statement

The datasets generated for this study are available on request to the corresponding author.

Ethics Statement

The studies involving human participants were reviewed and approved by Fort Valley State University Human Subjects Institutional Review Board. Written informed consent for participation was not required for this study in accordance with the national legislation and the institutional requirements.

Author Contributions

JP provided substantial contributions to the conception of the work, acquisition and analysis of data for the work, and is the corresponding author on this paper who agrees to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved. FJ provided substantial contributions to the design of the work, interpretation of the data for the work, and revised it critically for intellectual content.

Funding

This research was supported in part by funding from the National Science Foundation, Awards #1649717, 1842510, 1900572, and 1939739 to FJ.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

The authors would like to thank the reviewers for their detailed comments and feedback that assisted in the revising of our original manuscript.

Agasisti, T., and Johnes, G. (2015). Efficiency, costs, rankings and heterogeneity: the case of US higher education. Stud. High. Educ. 40, 60–82. doi: 10.1080/03075079.2013.818644


Ary, E. J., and Brune, C. W. (2011). A comparison of student learning outcomes in traditional and online personal finance courses. MERLOT J. Online Learn. Teach. 7, 465–474.


Atchley, W., Wingenbach, G., and Akers, C. (2013). Comparison of course completion and student performance through online and traditional courses. Int. Rev. Res. Open Dist. Learn. 14, 104–116. doi: 10.19173/irrodl.v14i4.1461

Bartley, S. J., and Golek, J. H. (2004). Evaluating the cost effectiveness of online and face-to-face instruction. Educ. Technol. Soc. 7, 167–175.

Beale, E. G., Tarwater, P. M., and Lee, V. H. (2014). A retrospective look at replacing face-to-face embryology instruction with online lectures in a human anatomy course. Am. Assoc. Anat. 7, 234–241. doi: 10.1002/ase.1396


Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkesh, M. A., et al. (2009). A meta-analysis of three types of interaction treatments in distance education. Rev. Educ. Res. 79, 1243–1289. doi: 10.3102/0034654309333844


Keywords: face-to-face (F2F), traditional classroom teaching, web-based instructions, information and communication technology (ICT), online learning, desire to learn (D2L), passive learning, active learning

Citation: Paul J and Jefferson F (2019) A Comparative Analysis of Student Performance in an Online vs. Face-to-Face Environmental Science Course From 2009 to 2016. Front. Comput. Sci. 1:7. doi: 10.3389/fcomp.2019.00007

Received: 15 May 2019; Accepted: 15 October 2019; Published: 12 November 2019.

Copyright © 2019 Paul and Jefferson. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Jasmine Paul, paulj@fvsu.edu

  • Open access
  • Published: 16 September 2021

Online learning during COVID-19 produced equivalent or better student course performance as compared with pre-pandemic: empirical evidence from a school-wide comparative study

  • Meixun Zheng 1 ,
  • Daniel Bender 1 &
  • Cindy Lyon 1  

BMC Medical Education volume 21, Article number: 495 (2021)


The COVID-19 pandemic forced dental schools to close their campuses and move didactic instruction online. The abrupt transition to online learning, however, has raised several issues that have not been resolved. While several studies have investigated dental students’ attitude towards online learning during the pandemic, mixed results have been reported. Additionally, little research has been conducted to identify and understand factors, especially pedagogical factors, that impacted students’ acceptance of online learning during campus closure. Furthermore, how online learning during the pandemic impacted students’ learning performance has not been empirically investigated. In March 2020, the dental school studied here moved didactic instruction online in response to government-issued stay-at-home orders. This first-of-its-kind comparative study examined students’ perceived effectiveness of online courses during summer quarter 2020, explored pedagogical factors impacting their acceptance of online courses, and empirically evaluated the impact of online learning on students’ course performance during the pandemic.

The study employed a quasi-experimental design. Participants were 482 pre-doctoral students in a U.S. dental school. Students’ perceived effectiveness of online courses during the pandemic was assessed with a survey. Students’ course grades for online courses during summer quarter 2020 were compared with those of a control group who received face-to-face instruction for the same courses before the pandemic in summer quarter 2019.

Survey results revealed that most online courses were well accepted by the students, and 80 % of them wanted to continue with some online instruction post pandemic. Regression analyses revealed that students’ perceived engagement with faculty and classmates predicted their perceived effectiveness of the online course. More notably, Chi-square tests demonstrated that in 16 out of the 17 courses compared, the online cohort during summer quarter 2020 was equally or more likely to get an A course grade than the analogous face-to-face cohort during summer quarter 2019.

Conclusions

This is the first empirical study in dental education to demonstrate that online courses during the pandemic could achieve equivalent or better student course performance than the same pre-pandemic in-person courses. The findings fill in gaps in literature and may inform online learning design moving forward.


Introduction

Research across disciplines has demonstrated that well-designed online learning can lead to students’ enhanced motivation, satisfaction, and learning [ 1 , 2 , 3 , 4 , 5 , 6 , 7 ]. A report by the U.S. Department of Education [ 8 ], based on examinations of comparative studies of online and face-to-face versions of the same course from 1996 to 2008, concluded that online learning could produce learning outcomes equivalent to or better than face-to-face learning. The more recent systematic review by Pei and Wu [ 9 ] provided additional evidence that online learning is at least as effective as face-to-face learning for undergraduate medical students.

To take advantage of the opportunities presented by online learning, thought leaders in dental education in the U.S. have advocated for the adoption of online learning in the nation’s dental schools [ 10 , 11 , 12 ]. However, digital innovation has been a slow process in academic dentistry [ 13 , 14 , 15 ]. In March 2020, the COVID-19 pandemic brought unprecedented disruption to dental education by necessitating online learning. In accordance with stay-at-home orders to prevent the spread of the virus, dental schools around the world closed their campuses and moved didactic instruction online.

The abrupt transition to online learning, however, has raised several concerns and questions. First, while several studies have examined dental students’ online learning satisfaction during the pandemic, mixed results have been reported. Some studies have reported students’ positive attitude towards online learning [ 15 , 16 , 17 , 18 , 19 , 20 ]. Sadid-Zadeh et al. [ 18 ] found that 99 % of the surveyed dental students at University of Buffalo, in the U.S., were satisfied with live web-based lectures during the pandemic. Schlenz et al. [ 15 ] reported that students in a German dental school had a favorable attitude towards online learning and wanted to continue with online instruction in their future curriculum. Other studies, however, have reported students’ negative online learning experience during the pandemic [ 21 , 22 , 23 , 24 , 25 , 26 ]. For instance, dental students at Harvard University felt that learning during the pandemic had worsened and engagement had decreased [ 23 , 24 ]. In a study with medical and dental students in Pakistan, Abbasi et al. [ 21 ] found that 77 % of the students had negative perceptions about online learning and 84 % reported reduced student-instructor interactions.

In addition to these mixed results, little attention has been given to factors affecting students’ acceptance of online learning during the pandemic. With the likelihood that online learning will persist post pandemic [ 27 ], research in this area is warranted to inform online course design moving forward. In particular, prior research has demonstrated that one of the most important factors influencing students’ performance in any learning environment is a sense of belonging, the feeling of being connected with and supported by the instructor and classmates [ 28 , 29 , 30 , 31 ]. Unfortunately, this aspect of the classroom experience has suffered during school closure. While educational events can be held using a video conferencing system, virtual peer interaction on such platforms has been perceived by medical trainees to be not as easy and personal as physical interaction [ 32 ]. The pandemic highlights the need to examine instructional strategies most suited to the current situation to support students’ engagement with faculty and classmates.

Furthermore, there is considerable concern from the academic community about the quality of online learning. Pre-pandemic, some faculty and students were already skeptical about the value of online learning [ 33 ]. The longer the pandemic lasts, the more they may question the value of online education, asking: Can online learning during the pandemic produce learning outcomes that are similar to face-to-face learning before the pandemic? Despite the documented benefits of online learning prior to the pandemic, the actual impact of online learning during the pandemic on students’ academic performance is still unknown due to reasons outlined below.

On one hand, several factors beyond the technology used could influence the effectiveness of online learning, one of which is the teaching context [ 34 ]. The sudden transition to online learning posed many challenges to faculty and students. Faculty may not have had adequate time to carefully design online courses to take full advantage of the possibilities of the online format. Some faculty had no prior online teaching experience and faced a steeper learning curve in adopting online teaching methods [ 35 ]. Students may have been at risk of increased anxiety due to concerns about contracting the virus, on-time graduation, finances, and employment [ 36 , 37 ], which may have negatively impacted learning performance [ 38 ]. Therefore, whether online learning during the pandemic could produce learning outcomes similar to those of online learning implemented during more normal times remains to be determined.

Most existing studies on online learning in dental education during the pandemic have only reported students’ satisfaction. The actual impact of the online format on academic performance has not been empirically investigated. The few studies that have examined students’ learning outcomes have only used students’ self-reported data from surveys and focus groups. According to Kaczmarek et al. [ 24 ], 50 % of the participating dental faculty at Harvard University perceived student learning to have worsened during the pandemic and 70 % of the students felt the same. Abbasi et al. [ 21 ] reported that 86 % of medical and dental students in a Pakistan college felt that they learned less online. While student opinions are important, research has demonstrated a poor correlation between students’ perceived learning and actual learning gains [ 39 ]. As we continue to navigate the “new normal” in teaching, students’ learning performance needs to be empirically evaluated to help institutions gauge the impact of this grand online learning experiment.

Research purposes

In March 2020, the University of the Pacific Arthur A. Dugoni School of Dentistry, in the U.S., moved didactic instruction online to ensure the continuity of education during building closure. This study examined students’ acceptance of online learning during the pandemic and the factors that influenced it, focusing on instructional practices pertaining to students’ engagement/interaction with faculty and classmates. Another purpose of this study was to empirically evaluate the impact of online learning during the pandemic on students’ actual course performance by comparing it with that of a pre-pandemic cohort. To understand the broader impact of the institutional-wide online learning effort, we examined all online courses offered in summer quarter 2020 (July to September) that had a didactic component.

This is the first empirical study in dental education to evaluate students’ learning performance during the pandemic. The study aimed to answer the following three questions.

How well was online learning accepted by students during the summer quarter 2020 pandemic interruption?

How did instructional strategies, centered around students’ engagement with faculty and classmates, impact their acceptance of online learning?

How did online learning during summer quarter 2020 impact students’ course performance as compared with a previous analogous cohort who received face-to-face instruction in summer quarter 2019?

This study employed a quasi-experimental design. The study was approved by the university’s institutional review board (#2020-68).

Study context and participants

The study was conducted at the Arthur A. Dugoni School of Dentistry, University of the Pacific. The program runs on a quarter system. It offers a 3-year accelerated Doctor of Dental Surgery (DDS) program and a 2-year International Dental Studies (IDS) program for international dentists who have obtained a doctoral degree in dentistry from a country outside the U.S. and want to practice in the U.S. Students advance throughout the program in cohorts. IDS students take some courses together with their DDS peers. All three DDS classes (D1/DDS 2023, D2/DDS 2022, and D3/DDS 2021) and both IDS classes (I1/IDS 2022 and I2/IDS 2021) were invited to participate in the study. The number of students in each class was: D1 = 145, D2 = 143, D3 = 143, I1 = 26, and I2 = 25. This resulted in a total of 482 student participants.

During campus closure, faculty delivered remote instruction in various ways, including live online classes via Zoom [ 40 ], self-paced online modules on the school’s learning management system Canvas [ 41 ], or a combination of live and self-paced delivery. For self-paced modules, students studied assigned readings and/or viewings such as videos and pre-recorded slide presentations. Some faculty also developed self-paced online lessons with SoftChalk [ 42 ], a cloud-based platform that supports gamified learning through the insertion of various mini learning activities. The SoftChalk lessons were integrated with Canvas [ 41 ], and faculty could monitor students’ progress. After students completed the pre-assigned online materials, some faculty held virtual office hours or live online discussion sessions for students to ask questions and discuss key concepts.

Data collection and analysis

Student survey.

Students’ perceived effectiveness of summer quarter 2020 online courses was evaluated by the school’s Office of Academic Affairs in lieu of the regular course evaluation process. A total of 19 courses for DDS students and 10 courses for IDS students were evaluated. An 8-question survey developed by the researchers (Additional file 1 ) was administered online in the last week of summer quarter 2020. Course directors invited students to take the survey during live online classes. The survey introduction stated that taking the survey was voluntary and that their anonymous responses would be reported in aggregated form for research purposes. Students were invited to continue with the survey if they chose to participate; otherwise, they could exit the survey. The number of students in each class who took the survey was as follows: D1 ( n  = 142; 98 %), D2 ( n  = 133; 93 %), D3 ( n  = 61; 43 %), I1 ( n  = 23; 88 %), and I2 ( n  = 20; 80 %). This resulted in a total of 379 (79 %) respondents across all classes.

The survey questions used a 4-point scale: Strongly Disagree (1 point), Disagree (2 points), Agree (3 points), and Strongly Agree (4 points). Students were asked to rate each online course by responding to four statements: “ I could fully engage with the instructor and classmates in this course ”; “ The online format of this course supported my learning ”; “ Overall this online course is effective ”; and “ I would have preferred face-to-face instruction for this course ”. For the first three survey questions, a higher mean score indicated a more positive attitude toward the online course. For the fourth question, a higher mean score indicated that more students would have preferred face-to-face instruction for the course. Two additional survey questions asked students to select their preferred online delivery method for fully online courses during the pandemic from three given choices (synchronous online/live, asynchronous online/self-paced, and a combination of both), and to report whether they wanted to continue with some online instruction post pandemic. Finally, two open-ended questions at the end of the survey allowed students to comment on the aspects of the online format that they found helpful and to provide suggestions for improvement. For the purpose of this study, we focused on the quantitative data from the Likert-scale questions.
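As an illustration of the scoring just described, a per-course item mean on the 4-point scale can be computed as follows (the response data shown are hypothetical, not the study’s):

```python
# Map 4-point Likert responses to scores and compute a course's mean rating.
# The example responses are illustrative only, not the study's data.
SCALE = {"Strongly Disagree": 1, "Disagree": 2, "Agree": 3, "Strongly Agree": 4}

def mean_rating(responses):
    """Mean score on the 1-4 scale for one survey item in one course."""
    scores = [SCALE[r] for r in responses]
    return sum(scores) / len(scores)

responses = ["Agree", "Strongly Agree", "Agree", "Disagree", "Strongly Agree"]
m = mean_rating(responses)  # (3 + 4 + 3 + 2 + 4) / 5 = 3.2
```

A course mean approaching or above 3 on this scale corresponds to the “well accepted” threshold used in the results below.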

Descriptive data such as the mean scores were reported for each course. Regression analyses were conducted to examine the relationship between instructional strategies focusing on students’ engagement with faculty and classmates, and their overall perceived effectiveness of the online course. The independent variable was student responses to the question “ I could fully engage with the instructor and classmates in this course ”, and the dependent variable was their answer to the question “ Overall, this online course is effective .”
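A minimal sketch of this per-course regression, using hypothetical ratings and a hand-rolled least-squares fit (the study’s statistical software is not named here), where r² is the effect size reported in the results:

```python
# Simple linear regression of overall course effectiveness (dependent variable)
# on perceived engagement with faculty and classmates (independent variable),
# mirroring the per-course analysis described above. The ratings are hypothetical.
def linregress(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    # r^2: proportion of variance in effectiveness explained by engagement
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    r2 = 1 - ss_res / ss_tot
    return slope, intercept, r2

# Hypothetical 4-point ratings from ten students in one course.
engagement    = [4, 3, 4, 2, 3, 4, 1, 2, 3, 4]
effectiveness = [4, 3, 4, 2, 3, 3, 2, 2, 3, 4]
slope, intercept, r2 = linregress(engagement, effectiveness)
# For this toy data: slope = 0.7, r2 ~ 0.82 (a "high" effect size in the paper's terms)
```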

Student course grades

Using Chi-square tests, student course grade distributions (A, B, C, D, and F) for summer quarter 2020 online courses were compared with those of a previous cohort who received face-to-face instruction for the same courses in summer quarter 2019. Note that as a result of the school’s pre-doctoral curriculum redesign implemented in July 2019, some courses offered in summer quarter 2020 were new courses offered for the first time. Because these new courses had no previous face-to-face version to compare to, they were excluded from data analysis. For some other courses, while course content remained the same between 2019 and 2020, the sequence of course topics within the course had changed. These courses were also excluded from data analysis.

After these exclusions, a total of 17 “comparable” courses were included in data analysis (see the subsequent section). For these courses, the instructor, course content, and course goals were the same in both 2019 and 2020. The assessment methods and grading policies also remained the same through both years. For exams and quizzes, multiple-choice questions were the dominant format in both years. While some exam questions in 2020 differed from those in 2019, faculty reported that the overall exam difficulty level was similar. The main difference in assessment was testing conditions. The 2019 cohort took computer-based exams in the physical classroom with faculty proctoring, and the 2020 cohort took exams at home with remote proctoring to ensure exam integrity. The remote proctoring software monitored each student during the exam through a web camera on their computer/laptop and flagged suspicious activities in the recorded video for faculty review after exam completion.
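The grade comparison can be sketched with a Pearson chi-square statistic. The counts below are hypothetical, and the full A-F distributions used in the study are collapsed here to A vs. not-A for brevity:

```python
# Chi-square test comparing the share of A grades between a 2020 online cohort
# and a 2019 face-to-face cohort for one course. Counts are hypothetical;
# the actual analysis compared full A-F grade distributions.
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    total = sum(row)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

#              A grade, not A
cohort_2020 = [90, 50]   # online
cohort_2019 = [70, 70]   # face-to-face
stat = chi_square_2x2([cohort_2020, cohort_2019])
# With 1 degree of freedom, stat > 3.841 rejects equal A-grade rates at alpha = 0.05,
# i.e., the two cohorts are not "equally likely" to earn an A in this toy example.
significant = stat > 3.841
```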

Students’ perceived effectiveness of online learning

Table  1 summarizes data on DDS students’ perceived effectiveness of each online course during summer quarter 2020. For the survey question “ Overall, this online course is effective ”, the majority of courses received a mean score that was approaching or over 3 points on the 4-point scale, suggesting that online learning was generally well accepted by students. Despite overall positive online course experiences, for many of the courses examined, there was an equal split in student responses to the question “ I would have preferred face-to-face instruction for this course .” Additionally, for students’ preferred online delivery method for fully online courses, about half of the students in each class preferred a combination of synchronous and asynchronous online learning (see Fig.  1 ). Finally, the majority of students wanted faculty to continue with some online instruction post pandemic: D1 class (110; 78.60 %), D2 class (104; 80 %), and D3 class (49; 83.10 %).

While most online courses received favorable ratings, some variations did exist among courses. For D1 courses, “ Anatomy & Histology ” received lower ratings than others. This could be explained by its lab component, which didn’t lend itself as well to the online format. For D2 courses, several of them received lower ratings than others, especially for the survey question on students’ perceived engagement with faculty and classmates.

figure 1

DDS students’ preferred online delivery method for fully online courses

Table  2 summarizes IDS students’ perceived effectiveness of each online course during summer quarter 2020. For the survey question “ Overall, this online course is effective ”, all courses received a mean score that was approaching or over 3 points on the 4-point scale, suggesting that online learning was well accepted by students. For the survey question “ I would have preferred face-to-face instruction for this course ”, for most online courses examined, the percentage of students who would have preferred face-to-face instruction was similar to that of students who preferred online instruction for the course. Like their DDS peers, about half of the IDS students in each class also preferred a combination of synchronous and asynchronous online delivery for fully online courses (See Fig.  2 ). Finally, the majority of IDS students (I1, n = 18, 81.80 %; I2, n = 16, 84.20 %) wanted to continue with some online learning after the pandemic is over.

figure 2

IDS students’ preferred online delivery method for fully online courses

Factors impacting students’ acceptance of online learning

For all 19 online courses taken by DDS students, regression analyses indicated a significant positive relationship between students’ perceived engagement with faculty and classmates and their perceived effectiveness of the course (P < 0.01 for all courses). The ranges of effect size (r²) were: D1 courses (0.26 to 0.50), D2 courses (0.39 to 0.65), and D3 courses (0.22 to 0.44), indicating moderate to high correlations across courses.

For 9 out of the 10 online courses taken by IDS students, there was a significant positive relationship between students’ perceived engagement with faculty and classmates and their perceived effectiveness of the course (P < 0.01 across courses). The ranges of effect size (r²) were: I1 courses (0.35 to 0.77) and I2 courses (0.47 to 0.63), indicating consistently high correlations across courses. The only course in which students’ perceived engagement with faculty and classmates did not predict perceived effectiveness of the course was “ Integrated Clinical Science III (ICS III) ”, which the I2 class took together with their D3 peers.

Impact of online learning on students’ course performance

Chi-square test results (Table  3 ) indicated that in 4 out of the 17 courses compared, the online cohort during summer quarter 2020 was more likely to receive an A grade than the face-to-face cohort during summer quarter 2019. In 12 of the courses, the online cohort was equally likely to receive an A grade as the face-to-face cohort. In the remaining course, the online cohort was less likely to receive an A grade than the face-to-face cohort.

Students’ acceptance of online learning during the pandemic

Survey results revealed that students had generally positive perceptions about online learning during the pandemic and the majority of them wanted to continue with some online learning post pandemic. Overall, our findings supported several other studies in dental [ 18 , 20 ], medical [ 43 , 44 ], and nursing [ 45 ] education that have also reported students’ positive attitudes towards online learning during the pandemic. In their written comments in the survey, students cited enhanced flexibility as one of the greatest benefits of online learning. Some students also commented that typing questions in the chat box during live online classes was less intimidating than speaking in class. Others explicitly stated that not having to commute to/from school provided more time for sleep, which helped with self-care and mental health. Our findings are in line with previous studies which have also demonstrated that online learning offered higher flexibility [ 46 , 47 ]. Meanwhile, consistent with findings of other researchers [ 19 , 21 , 46 ], our students reported difficulty engaging with faculty and classmates in several online courses.

There were some variations among individual courses in students’ acceptance of the online format. One factor that could partially account for the observed differences was instructional strategies. In particular, our regression analysis results demonstrated a positive correlation between students’ perceived engagement with faculty and classmates and their perceived overall effectiveness of the online course. Other aspects of course design might also have influenced students’ overall rating of the online course. For instance, some D2 students commented that the requirements of the course “ Integrated Case-based Seminars (ICS II) ” were not clear and that assessment did not align with lecture materials. It is important to remember that communicating course requirements clearly and aligning course content and assessment are principles that should be applied in any course, whether face-to-face or online. Our results highlighted the importance of providing faculty training on basic educational design principles and online learning design strategies. Furthermore, the nature of the course might also have impacted student ratings. For example, D1 course “ Anatomy and Histology ” had a lab component, which did not lend itself as well to the online format. Many students reported that it was difficult to see faculty’s live demonstration during Zoom lectures, which may have resulted in a lower student satisfaction rating.

As for students’ preferred online delivery method for fully online courses during the pandemic, about half of them preferred a combination of synchronous and asynchronous online learning. In light of this finding, as we continue with remote learning until public health directives allow a return to campus, we will encourage faculty to integrate these two online delivery modalities. Finally, in view of the result that over 80 % of the students wanted to continue with some online instruction after the pandemic, the school will advocate for blended learning in the post-pandemic world [ 48 ]. For future face-to-face courses on campus after the pandemic, faculty are encouraged to deliver some content online to reduce classroom seat time and make learning more flexible. Taken together, our findings not only add to the overall picture of the current situation but may inform learning design moving forward.

Role of online engagement and interaction

To reiterate, we found that students’ perceived engagement with faculty and classmates predicted their perceived overall effectiveness of the online course. This aligns with the larger literature on best practices in online learning design. Extensive research prior to the pandemic has confirmed that the effectiveness of online learning is determined by a number of factors beyond the tools used, including students’ interactions with the instructor and classmates [ 49 , 50 , 51 , 52 ]. Online students may feel isolated due to reduced or absent interaction [ 53 , 54 ]. Therefore, in designing online learning experiences, it is important to remember that learning is a social process [ 55 ]. Faculty’s role is not only to transmit content but also to promote the different types of interactions that are an integral part of the online learning process [ 33 ]. The online teaching model in which faculty upload materials online but teach them in the same way as in the physical classroom, without special effort to engage students, does not make the best use of the online format. Putting the “sage on the screen” during a live class meeting on a video conferencing system is no different from the “sage on the stage” in the physical classroom: both provide limited space for engagement. Such a one-way monologue squanders the potential that online learning presents.

In light of the critical role that social interaction plays in online learning, faculty are encouraged to use the interactive features of online learning platforms to provide clear channels for student-instructor and student-student interactions. In the open-ended comments, students highlighted several instructional strategies that they perceived to be helpful for learning. For live online classes, these included conducting breakout room activities, using the chat box to facilitate discussions, polling, and integrating gameplay with apps such as Kahoot!® [56]. For self-paced classes, students appreciated that faculty held virtual office hours or subsequent live online discussion sessions to reinforce understanding of the pre-assigned materials.

Quality of online education during the pandemic

This study provided empirical evidence in dental education that it was possible to ensure the continuity of education without sacrificing quality during the forced migration to distance learning upon building closure. To reiterate, in all but one online course offered in summer quarter 2020, students were equally or more likely to earn an A grade than the face-to-face cohort from summer quarter 2019. Even for courses with less student support for the online format (e.g., the D1 course “Anatomy and Histology”), there was a significant increase in the number of students who earned an A grade in 2020 compared with the previous year. The reduced capacity for technical training during the pandemic may have left more study time for didactic content. Overall, our results resonate with several pre-pandemic studies in health sciences education showing that the quality of learning is comparable in face-to-face and online formats [9, 57, 58]. For the only course (Integrated Case-based Seminars, ICS II) in which the online cohort performed worse than the face-to-face cohort, as mentioned earlier, students reported that assessment was not aligned with course materials and that course expectations were not clear. This might explain why students’ course performance was not as strong as expected.
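A difference in A-grade rates between two cohorts of this kind is commonly tested with a chi-square test of independence on a 2×2 contingency table. The sketch below is a generic illustration with invented counts, not the study’s actual data or reported analysis.

```python
# Illustrative sketch: chi-square test of independence comparing
# A-grade rates in two cohorts (e.g., a 2019 face-to-face cohort vs.
# a 2020 online cohort). The counts used below are hypothetical.

def chi_square_2x2(a_1, n_1, a_2, n_2):
    """Chi-square statistic for a 2x2 table of A vs. non-A grades
    across two cohorts of sizes n_1 and n_2."""
    table = [
        [a_1, n_1 - a_1],  # cohort 1: A grades, non-A grades
        [a_2, n_2 - a_2],  # cohort 2: A grades, non-A grades
    ]
    total = n_1 + n_2
    row_totals = [n_1, n_2]
    col_totals = [table[0][0] + table[1][0], table[0][1] + table[1][1]]
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Hypothetical example: 60/140 A grades in 2019 vs. 85/140 in 2020.
stat = chi_square_2x2(60, 140, 85, 140)
# 3.841 is the critical value for df=1 at alpha = 0.05
print(round(stat, 2), stat > 3.841)  # → 8.94 True
```

With these made-up counts the statistic exceeds the df=1 critical value, i.e., the grade distributions of the two cohorts would differ significantly.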

Limitations

This study used a pre-existing control group from the previous year. There may have been individual differences between students in the online and face-to-face cohorts, such as motivation, learning style, and prior knowledge, that could have impacted the observed outcomes. Additionally, even though course content and assessment methods were largely the same in 2019 and 2020, changes in other aspects of the courses could have impacted students’ performance. Some faculty may have been more compassionate in grading (e.g., more flexible with assignment deadlines) in summer quarter 2020, given the hardship students experienced during the pandemic. On the other hand, remote proctoring in summer quarter 2020 may have heightened some students’ exam anxiety, knowing that they were being monitored through a webcam. The existence and magnitude of the effects of these factors need to be further investigated.

The present study only examined the correlation between students’ perceived online engagement and their perceived overall effectiveness of the online course. Other factors that might impact students’ acceptance of the online format need to be researched in future studies. Another future direction is to examine how students’ perceived online engagement correlates with their actual course performance. Because the survey data collected for the present study are anonymous, we cannot match students’ perceived online engagement data with their course grades to run this additional analysis. It should also be noted that this study focused on didactic online instruction. Future studies might examine how technical training was impacted during the COVID building closure. It was also out of the scope of this study to examine how student characteristics, especially high and low academic performance as reflected by individual grades, affect students’ online learning experience and performance. We plan to conduct a follow-up study to examine which group of students is most impacted by the online format. Finally, this study was conducted in a single dental school, so the findings may not be generalizable to other schools and disciplines. Future studies could be conducted in other schools or disciplines to compare results.

This study revealed that dental students had generally favorable attitudes towards online learning during the COVID-19 pandemic and that their perceived engagement with faculty and classmates predicted their acceptance of the online course. Most notably, this is the first study in dental education to demonstrate that online learning during the pandemic could achieve similar or better learning outcomes than face-to-face learning before the pandemic. Our findings contribute to the literature on online learning in health sciences education during the COVID-19 pandemic, and the results can inform learning design as we re-envision the future of online education.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Bello G, Pennisi MA, Maviglia R, Maggiore SM, Bocci MG, Montini L, et al. Online vs live methods for teaching difficult airway management to anesthesiology residents. Intensive Care Med. 2005; 31 (4): 547–552.


Ruiz JG, Mintzer MJ, Leipzig RM. The impact of e-learning in medical education. Acad Med. 2006; 81(3): 207–12.

Kavadella A, Tsiklakis K, Vougiouklakis G, Lionarakis A. Evaluation of a blended learning course for teaching oral radiology to undergraduate dental students. Eur J Dent Educ. 2012; 16(1): 88–95.

de Jong N, Verstegen DL, Tan FS, O’Connor SJ. A comparison of classroom and online asynchronous problem-based learning for students undertaking statistics training as part of a public health master’s degree. Adv Health Sci Educ. 2013; 18(2):245–64.

Hegeman JS. Using instructor-generated video lectures in online mathematics coursesimproves student learning. Online Learn. 2015;19(3):70–87.

Gaupp R, Körner M, Fabry G. Effects of a case-based interactive e-learning course on knowledge and attitudes about patient safety: a quasi-experimental study with third-year medical students. BMC Med Educ. 2016; 16(1):172.

Zheng M, Bender D, Reid L, Milani J. An interactive online approach to teaching evidence-based dentistry with Web 2.0 technology. J Dent Educ. 2017; 81(8): 995–1003.

Means B, Toyama Y, Murphy R, Bakia M, Jones K. Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. U.S. Department of Education, Office of Planning, Evaluation and Policy Development. Washington D.C. 2009.


Pei L, Wu H. Does online learning work better than offline learning in undergraduate medical education? A systematic review and meta-analysis. Med Educ Online. 2019; 24(1):1666538.

Andrews KG, Demps EL. Distance education in the U.S. and Canadian undergraduate dental curriculum. J Dent Educ. 2003; 67(4):427–38.

Kassebaum DK, Hendricson WD, Taft T, Haden NK. The dental curriculum at North American dental institutions in 2002–03: a survey of current structure, recent innovations, and planned changes. J Dent Educ. 2004; 68(9):914–931.

Haden NK, Hendricson WD, Kassebaum DK, Ranney RR, Weinstein G, Anderson EL, et al. Curriculum changes in dental education, 2003–09. J Dent Educ. 2010; 74(5):539–57.

DeBate RD, Cragun D, Severson HH, Shaw T, Christiansen S, Koerber A, et al. Factors for increasing adoption of e-courses among dental and dental hygiene faculty members. J Dent Educ. 2011; 75 (5): 589–597.

Saeed SG, Bain J, Khoo E, Siqueira WL. COVID-19: Finding silver linings for dental education. J Dent Educ. 2020; 84(10):1060–1063.

Schlenz MA, Schmidt A, Wöstmann B, Krämer N, Schulz-Weidner N. Students’ and lecturers’ perspective on the implementation of online learning in dental education due to SARS-CoV-2 (COVID-19): a cross-sectional study. BMC Med Educ. 2020;20(1):1–7.

Donn J, Scott JA, Binnie V, Bell A. A pilot of a virtual Objective Structured Clinical Examination in dental education. A response to COVID-19. Eur J Dent Educ. 2020; https://doi.org/10.1111/eje.12624

Hung M, Licari FW, Hon ES, Lauren E, Su S, Birmingham WC, Wadsworth LL, Lassetter JH, Graff TC, Harman W, et al. In an era of uncertainty: impact of COVID-19 on dental education. J Dent Educ. 2020; 85 (2): 148–156.

Sadid-Zadeh R, Wee A, Li R, Somogyi‐Ganss E. Audience and presenter comparison of live web‐based lectures and traditional classroom lectures during the COVID‐19 pandemic. J Prosthodont. 2020. doi: https://doi.org/10.1111/jopr.13301

Wang K, Zhang L, Ye L. A nationwide survey of online teaching strategies in dental education in China. J Dent Educ. 2020; 85 (2): 128–134.

Rad FA, Otaki F, Baqain Z, Zary N, Al-Halabi M. Rapid transition to distance learning due to COVID-19: Perceptions of postgraduate dental learners and instructors. PLoS One. 2021; 16(2): e0246584.

Abbasi S, Ayoob T, Malik A, Memon SI. Perceptions of students regarding E-learning during Covid-19 at a private medical college. Pak J Med Sci. 2020; 36: 57–61.

Al-Azzam N, Elsalem L, Gombedza F. A cross-sectional study to determine factors affecting dental and medical students’ preference for virtual learning during the COVID-19 outbreak. Heliyon. 6(12). 2020. doi: https://doi.org/10.1016/j.heliyon.2020.e05704

Chen E, Kaczmarek K, Ohyama H. Student perceptions of distance learning strategies during COVID-19. J Dent Educ. 2020. doi: https://doi.org/10.1002/jdd.12339

Kaczmarek K, Chen E, Ohyama H. Distance learning in the COVID-19 era: Comparison of student and faculty perceptions. J Dent Educ. 2020. https://doi.org/10.1002/jdd.12469

Sarwar H, Akhtar H, Naeem MM, Khan JA, Waraich K, Shabbir S, et al. Self-reported effectiveness of e-learning classes during COVID-19 pandemic: A nation-wide survey of Pakistani undergraduate dentistry students. Eur J Dent. 2020; 14 (S01): S34-S43.

Al-Taweel FB, Abdulkareem AA, Gul SS, Alshami ML. Evaluation of technology‐based learning by dental students during the pandemic outbreak of coronavirus disease 2019. Eur J Dent Educ. 2021; 25(1): 183–190.

Elangovan S, Mahrous A, Marchini L. Disruptions during a pandemic: Gaps identified and lessons learned. J Dent Educ. 2020; 84 (11): 1270–1274.

Goodenow C. Classroom belonging among early adolescent students: Relationships to motivation and achievement. J Early Adolesc.1993; 13(1): 21–43.

Goodenow C. The psychological sense of school membership among adolescents: Scale development and educational correlates. Psychol Sch. 1993; 30(1): 79–90.

St-Amand J, Girard S, Smith J. Sense of belonging at school: Defining attributes, determinants, and sustaining strategies. IAFOR Journal of Education. 2017; 5(2):105–19.

Peacock S, Cowan J. Promoting sense of belonging in online learning communities of inquiry at accredited courses. Online Learn. 2019; 23(2): 67–81.

Chan GM, Kanneganti A, Yasin N, Ismail-Pratt I, Logan SJ. Well‐being, obstetrics and gynecology and COVID‐19: Leaving no trainee behind. Aust N Z J Obstet Gynaecol. 2020; 60(6): 983–986.

Hodges C, Moore S, Lockee B, Trust T, Bond A. The difference between emergency remote teaching and online learning. Educause Review. 2020; 27: 1–12.

Means B, Bakia M, Murphy R. Learning online: What research tells us about whether, when and how. Routledge. 2014.

Iyer P, Aziz K, Ojcius DM. Impact of COVID-19 on dental education in the United States. J Dent Educ. 2020; 84(6): 718–722.

Machado RA, Bonan PRF, Perez DEDC, Martelli Júnior H. COVID-19 pandemic and the impact on dental education: Discussing current and future perspectives. Braz Oral Res. 2020; 34: e083.

Wu DT, Wu KY, Nguyen TT, Tran SD. The impact of COVID-19 on dental education in North America-Where do we go next? Eur J Dent Educ. 2020; 24(4): 825–827.

de Oliveira Araújo FJ, de Lima LSA, Cidade PIM, Nobre CB, Neto MLR. Impact of Sars-Cov-2 and its reverberation in global higher education and mental health. Psychiatry Res. 2020; 288:112977. doi: https://doi.org/10.1016/j.psychres.2020.112977

Persky AM, Lee E, Schlesselman LS. Perception of learning versus performance as outcome measures of educational research. Am J Pharm Educ. 2020; 84(7): ajpe7782.

Zoom®. Zoom Video Communications, San Jose, CA, USA. https://zoom.us/

Canvas®. Instructure, Inc. Salt Lake City, UT, USA. https://www.instructure.com/canvas

SoftChalk®. SoftChalk LLC. San Antonio, TX, USA. https://www.softchalkcloud.com/

Agarwal S, Kaushik JS. Student’s perception of online learning during COVID pandemic. Indian J Pediatr. 2020; 87: 554–554.

Khalil R, Mansour AE, Fadda WA, Almisnid K, Aldamegh M, Al-Nafeesah A, et al. The sudden transition to synchronized online learning during the COVID-19 pandemic in Saudi Arabia: a qualitative study exploring medical students’ perspectives. BMC Med Educ. 2020; 20(1): 1–10.

Riley E, Capps N, Ward N, McCormack L, Staley J. Maintaining academic performance and student satisfaction during the remote transition of a nursing obstetrics course to online instruction. Online Learn. 2021; 25(1), 220–229.

Amir LR, Tanti I, Maharani DA, Wimardhani YS, Julia V, Sulijaya B, et al. Student perspective of classroom and distance learning during COVID-19 pandemic in the undergraduate dental study program Universitas Indonesia. BMC Med Educ. 2020; 20(1):1–8.

Dost S, Hossain A, Shehab M, Abdelwahed A, Al-Nusair L. Perceptions of medical students towards online teaching during the COVID-19 pandemic: a national cross-sectional survey of 2721 UK medical students. BMJ Open. 2020; 10(11).

Graham CR, Woodfield W, Harrison JB. A framework for institutional adoption and implementation of blended learning in higher education. Internet High Educ. 2013; 18 : 4–14.

Sing C, Khine M. An analysis of interaction and participation patterns in online community. J Educ Techno Soc. 2006; 9(1): 250–261.

Bernard RM, Abrami PC, Borokhovski E, Wade CA, Tamim RM, Surkes MA, et al. A meta-analysis of three types of interaction treatments in distance education. Rev Educ Res. 2009; 79(3): 1243–1289.

Fedynich L, Bradley KS, Bradley J. Graduate students’ perceptions of online learning. Res High Educ. 2015; 27.

Tanis CJ. The seven principles of online learning: Feedback from faculty and alumni on its importance for teaching and learning. Res Learn Technol. 2020; 28 . https://doi.org/10.25304/rlt.v28.2319

Dixson MD. Measuring student engagement in the online course: The Online Student Engagement scale (OSE). Online Learn. 2015; 19 (4).

Kwary DA, Fauzie S. Students’ achievement and opinions on the implementation of e-learning for phonetics and phonology lectures at Airlangga University. Educ Pesqui. 2018; 44 .

Vygotsky LS. Mind in society: The development of higher psychological processes. Cambridge (MA): Harvard University Press. 1978.

Kahoot!®. Oslo, Norway. https://kahoot.com/

Davis J, Chryssafidou E, Zamora J, Davies D, Khan K, Coomarasamy A. Computer-based teaching is as good as face to face lecture-based teaching of evidence-based medicine: a randomised controlled trial. BMC Med Educ. 2007; 7(1): 1–6.

Davis J, Crabb S, Rogers E, Zamora J, Khan K. Computer-based teaching is as good as face to face lecture-based teaching of evidence-based medicine: a randomized controlled trial. Med Teach. 2008; 30(3): 302–307.


Acknowledgements

Not applicable.

Authors’ information

MZ is an Associate Professor of Learning Sciences and Senior Instructional Designer at School of Dentistry, University of the Pacific. She has a PhD in Education, with a specialty on learning sciences and technology. She has dedicated her entire career to conducting research on online learning, learning technology, and faculty development. Her research has resulted in several peer-reviewed publications in medical, dental, and educational technology journals. MZ has also presented regularly at national conferences.

DB is an Assistant Dean for Academic Affairs at School of Dentistry, University of the Pacific. He has an EdD degree in education, with a concentration on learning and instruction. Over the past decades, DB has been overseeing and delivering faculty pedagogical development programs to dental faculty. His research interest lies in educational leadership and instructional innovation. DB has co-authored several peer-reviewed publications in health sciences education and presented regularly at national conferences.

CL is Associate Dean of Oral Healthcare Education, School of Dentistry, University of the Pacific. She has a Doctor of Dental Surgery (DDS) degree and an EdD degree with a focus on educational leadership. Her professional interest lies in educational leadership, oral healthcare education innovation, and faculty development. CL has co-authored several publications in peer-reviewed journals in health sciences education and presented regularly at national conferences.

Author information

Authors and Affiliations

Office of Academic Affairs, Arthur A. Dugoni School of Dentistry, University of the Pacific, San Francisco, CA, USA

Meixun Zheng, Daniel Bender & Cindy Lyon


Contributions

MZ analyzed the data and wrote the initial draft of the manuscript. DB and CL both provided assistance with research design, data collection, and reviewed and edited the manuscript. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Meixun Zheng .

Ethics declarations

Ethics approval and consent to participate

The study was approved by the institutional review board at University of the Pacific in the U.S. (#2020-68). Informed consent was obtained from all participants. All methods were carried out in accordance with relevant guidelines and regulations.

Consent for publication

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1:

Survey of online courses during COVID-19 pandemic.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article.

Zheng, M., Bender, D. & Lyon, C. Online learning during COVID-19 produced equivalent or better student course performance as compared with pre-pandemic: empirical evidence from a school-wide comparative study. BMC Med Educ 21 , 495 (2021). https://doi.org/10.1186/s12909-021-02909-z


Received: 31 March 2021

Accepted: 26 August 2021

Published: 16 September 2021

DOI: https://doi.org/10.1186/s12909-021-02909-z


  • Dental education
  • Online learning
  • COVID-19 pandemic
  • Instructional strategies
  • Interaction
  • Learning performance

BMC Medical Education

ISSN: 1472-6920


Impact of online classes on the satisfaction and performance of students during the pandemic period of COVID 19

1 Chitkara College of Hospitality Management, Chitkara University, Chandigarh, Punjab, India

Varsha Singh

Arun Aggarwal

2 Chitkara Business School, Chitkara University, Chandigarh, Punjab, India

The aim of the study is to identify the factors affecting students’ satisfaction and performance regarding online classes during the COVID-19 pandemic and to establish the relationships between these variables. The study is quantitative in nature, and the data were collected through an online survey from 544 respondents studying business management (B.B.A. or M.B.A.) or hotel management courses at Indian universities. Structural equation modeling was used to test the proposed hypotheses. The results show that the four independent factors used in the study, viz. quality of instructor, course design, prompt feedback, and students’ expectations, positively impact students’ satisfaction, and that students’ satisfaction in turn positively impacts students’ performance. For educational management, these four factors are essential for achieving a high level of satisfaction and performance in online courses. The study was conducted during the COVID-19 pandemic to assess the effect of online teaching on students’ performance.

Introduction

Coronaviruses are a group of viruses that cause illnesses such as cough, cold, sneezing, fever, and respiratory symptoms (WHO, 2019). COVID-19 is a contagious disease, spreading rapidly among human beings; it is caused by a new strain that originated in Wuhan, China, in December 2019. Coronaviruses circulate in animals, but some of these viruses can transmit between animals and humans (Perlman & McIntosh, 2020). As of March 28, 2020, according to the MoHFW, a total of 909 confirmed COVID-19 cases (862 Indians and 47 foreign nationals) had been reported in India (Centers for Disease Control and Prevention, 2020). Officially, no vaccine or medicine had been approved to stop the spread of COVID-19 (Yu et al., 2020). The influence of the COVID-19 pandemic on the education system led to widespread closures of schools and colleges worldwide. On March 24, India declared a country-wide lockdown of schools and colleges (NDTV, 2020) to prevent transmission of the coronavirus among students (Bayham & Fenichel, 2020). School closures in response to the COVID-19 pandemic have shed light on several issues affecting access to education. Because COVID-19 is soaring, huge numbers of children, youths, and adults cannot attend schools and colleges (UNESCO, 2020). Lah and Botelho (2012) contended that the effect of school closures on students’ performance is unclear.

School closures may also affect students through the disruption of teacher and student networks, leading to poor performance. Bridge (2020) reported that schools and colleges are moving towards educational technologies for student learning to avoid strain during the pandemic. Hence, the present study’s objective is to develop and test a conceptual model of students’ satisfaction with online teaching during COVID-19, when both students and teachers have no option other than the online platform for uninterrupted learning and teaching.

UNESCO recommends distance learning programs and open educational applications during school closures caused by COVID-19, so that schools and teachers can continue teaching their pupils and limit the interruption of education. Therefore, many institutes have opted for online classes (Shehzadi et al., 2020).

As a versatile platform for learning and teaching processes, the e-learning framework has been increasingly used (Salloum & Shaalan, 2018). E-learning is defined as a new paradigm of online learning based on information technology (Moore et al., 2011). In contrast to traditional learning, academics, educators, and other practitioners are eager to know how e-learning can produce better outcomes and academic achievements. The answer can be sought only by analyzing student satisfaction and performance.

Many comparative studies have been carried out to explore whether face-to-face or traditional teaching methods are more productive, or whether online or hybrid learning is better (Lockman & Schirmer, 2020; Pei & Wu, 2019; González-Gómez et al., 2016). Results of these studies show that students perform much better in online learning than in traditional learning. Henriksen et al. (2020) highlighted the problems faced by educators while shifting from offline to online teaching. In the past, several research studies on online learning have explored student satisfaction, acceptance of e-learning, distance learning success factors, and learning effectiveness (Sher, 2009; Lee, 2014; Yen et al., 2018). However, scant literature is available on the factors that affect students’ satisfaction and performance in online classes during the COVID-19 pandemic (Rajabalee & Santally, 2020). In the present study, the authors propose that course design, quality of the instructor, prompt feedback, and students’ expectations are four prominent determinants of students’ learning outcomes and satisfaction during online classes (Lee, 2014).

Course design refers to curriculum knowledge, program organization, instructional goals, and course structure (Wright, 2003). A well-planned course design increases pupils’ satisfaction with the system (Almaiah & Alyoussef, 2019). Mtebe and Raisamo (2014) proposed that effective course design helps improve performance by building learners’ knowledge and skills (Khan & Yildiz, 2020; Mohammed et al., 2020). If a course is not designed effectively, it might lead to low usage of e-learning platforms by teachers and students (Almaiah & Almulhem, 2018); conversely, an effectively designed course leads to higher acceptance of the e-learning system by students and to better performance (Mtebe & Raisamo, 2014). Hence, to prepare these courses for online learning, many instructors teaching blended courses for the first time are likely to require a complete overhaul of their courses (Bersin, 2004; Ho et al., 2006).

The second factor, instructor quality, plays an essential role in students’ satisfaction with online classes. Instructor quality refers to a professional who understands students’ educational needs, has unique teaching skills, and understands how to meet students’ learning needs (Luekens et al., 2004). Marsh (1987) developed five instruments for measuring instructor quality, the main one being the Students’ Evaluation of Educational Quality (SEEQ), which delineates the instructor’s quality. SEEQ is considered one of the most commonly used and unanimously embraced methods (Grammatikopoulos et al., 2014) and is a very useful student-feedback instrument for measuring instructor quality (Marsh, 1987).

The third factor that improves students’ satisfaction is prompt feedback (Kinicki et al., 2004). Feedback is the information given by lecturers and tutors about students’ performance; within this context, feedback is a “consequence of performance” (Hattie & Timperley, 2007, p. 81). In education, “prompt feedback can be described as knowing what you know and what you do not related to learning” (Simsek et al., 2017, p. 334). Christensen (2014) studied the link between feedback and performance and introduced the positivity ratio, a mechanism that plays an important role in explaining performance through feedback. It has been found that prompt feedback helps develop a strong linkage between faculty and students, which ultimately leads to better learning outcomes (Simsek et al., 2017; Chang, 2011).

The fourth factor is students’ expectations. Appleton-Knapp and Krentler (2006) measured the impact of students’ expectations on their performance and pinpointed that student expectations are important. When students’ expectations are met, their satisfaction is higher (Bates & Kaye, 2014); these findings were backed by the earlier “Student Satisfaction Index Model” (Zhang et al., 2008). When students’ expectations are not fulfilled, however, learning and satisfaction with the course may be lower. Student satisfaction is defined as students’ ability to compare the desired benefit with the observed effect of a particular product or service (Budur et al., 2019). Students with high grade expectations show higher satisfaction than those with lower grade expectations.

The scrutiny of the literature shows that although different researchers have examined the factors affecting student satisfaction, none of the studies has examined the effect of course design, quality of the instructor, prompt feedback, and students’ expectations on students’ satisfaction with online classes during the COVID-19 pandemic. Therefore, this study explores the factors that affect students’ satisfaction and performance regarding online classes during the pandemic. The pandemic compelled educational institutions to move online, a format with which teachers and learners were not acquainted, and students were not mentally prepared for such a shift. This research therefore examines which factors affect students and how students perceived these changes, as reflected in their satisfaction level.
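The study tests these predictor-satisfaction relationships with structural equation modeling. As a greatly simplified, purely illustrative stand-in (SEM additionally models latent constructs and measurement error, which plain regression does not), one can estimate how observed predictor scores relate to a satisfaction score with ordinary least squares via the normal equations. All variable names and numbers below are invented for illustration; they are not the study’s data or its actual analysis.

```python
# Illustrative sketch: OLS regression of a satisfaction score on two
# hypothetical predictors (instructor quality, course design), solving
# the normal equations (X'X) b = X'y with Gaussian elimination.

def ols(X, y):
    """Return OLS coefficients; X rows must include a leading 1 for
    the intercept. X'X is positive definite, so no pivoting is needed."""
    n, p = len(X), len(X[0])
    # Build the normal equations A = X'X, b = X'y.
    A = [[sum(X[i][j] * X[i][k] for i in range(n)) for k in range(p)]
         for j in range(p)]
    b = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]
    # Forward elimination.
    for col in range(p):
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    coef = [0.0] * p
    for r in range(p - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][c] * coef[c]
                              for c in range(r + 1, p))) / A[r][r]
    return coef

# Hypothetical 1-5 ratings: each row is [intercept, instructor quality,
# course design]; y is the satisfaction rating.
X = [[1, 4, 3], [1, 5, 4], [1, 2, 2], [1, 3, 4], [1, 4, 5], [1, 1, 2]]
satisfaction = [4, 5, 2, 4, 5, 1]
print([round(c, 2) for c in ols(X, satisfaction)])
```

Positive fitted coefficients on real data would correspond to support for hypotheses like H1 and H2 below; the study itself estimates such paths jointly within an SEM framework.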

This paper is structured as follows: the second section describes the theoretical framework and the linkages among the research variables, from which the research hypotheses were framed. The third section presents the research methodology, following APA guidelines. The outcomes of the empirical analysis are then discussed. Lastly, the paper concludes with a discussion and proposes implications for future studies.

Theoretical framework

Achievement goal theory (AGT) is commonly used to understand students’ performance; it was proposed by four scholars, Carole Ames, Carol Dweck, Martin Maehr, and John Nicholls, in the late 1970s (Elliot, 2005). Elliott and Dweck (1988, p. 11) state that “an achievement goal involves a program of cognitive processes that have cognitive, affective and behavioral consequence”. The theory suggests that students’ motivation and achievement-related behaviors can be understood from the purposes and reasons they adopt while engaged in learning activities (Dweck & Leggett, 1988; Ames, 1992; Urdan, 1997). Some studies hold that there are four approaches to achieving a goal: mastery-approach, mastery-avoidance, performance-approach, and performance-avoidance (Pintrich, 1999; Elliot & McGregor, 2001; Schwinger & Stiensmeier-Pelster, 2011; Hansen & Ringdal, 2018; Mouratidis et al., 2018). The environment also affects the performance of students (Ames & Archer, 1988). Traditionally, classroom teaching has been an effective method to achieve these goals (Ames & Archer, 1988; Ames, 1992; Clayton et al., 2010); in the modern era, however, internet-based teaching is also an effective tool for delivering lectures, and web-based applications are becoming modern classrooms (Azlan et al., 2020). The following sections discuss the relationships between the independent and dependent variables (Fig. 1).

Fig. 1 Proposed Model

Hypotheses development

Quality of the instructor and satisfaction of the students

An instructor with high enthusiasm for students' learning has a positive impact on their satisfaction. Quality of the instructor is one of the most critical measures of student satisfaction, leading to the outcome of the education process (Munteanu et al., 2010; Arambewela & Hall, 2009; Ramsden, 1991). If the teacher delivers the course effectively and influences the students to do better in their studies, this process leads to student satisfaction and enhances learning (Ladyshewsky, 2013). Furthermore, the instructor's understanding of learners' needs also ensures student satisfaction (Kauffman, 2015). Hence, the hypothesis that the quality of the instructor significantly affects the satisfaction of the students was included in this study.

  • H1: The quality of the instructor positively affects the satisfaction of the students.

Course design and satisfaction of students

A course's technological design strongly influences students' learning and satisfaction through their course expectations (Liaw, 2008; Lin et al., 2008). Active course design yields more effective student outcomes than traditional design (Black & Kassaye, 2014). Learning style is essential to effective course design (Wooldridge, 1995): while creating an online course design, it is essential to keep in mind that the experience must serve students with different learning styles. Similarly, Jenkins (2015) highlighted that course design attributes can be developed and employed to enhance student success. Hence, the hypothesis that course design significantly affects students' satisfaction was included in this study.

  • H2: Course design positively affects the satisfaction of students.

Prompt feedback and satisfaction of students

The emphasis in this study is on understanding the influence of prompt feedback on satisfaction. Feedback gives students information about the effectiveness of their performance (Chang, 2011; Grebennikov & Shah, 2013; Simsek et al., 2017). Prompt feedback enhances the student learning experience (Brownlee et al., 2009) and boosts satisfaction (O'Donovan, 2017). Prompt feedback is also a self-evaluation tool for students (Rogers, 1992) by which they can improve their performance. Eraut (2006) highlighted the impact of feedback on future practice and student learning development, and good feedback practice benefits both student learning and teachers seeking to improve students' learning experience (Yorke, 2003). Hence, the hypothesis that prompt feedback significantly affects satisfaction was included in this study.

  • H3: Prompt feedback positively affects the satisfaction of the students.

Expectations and satisfaction of students

Expectation is a crucial factor that directly influences the satisfaction of the student. Expectation Disconfirmation Theory (EDT) (Oliver, 1980) has been utilized to determine the level of satisfaction based on expectations (Schwarz & Zhu, 2015). Understanding students' expectations is an effective way to improve their satisfaction (Brown et al., 2014), and recognizing student expectations makes it possible to raise satisfaction levels (ICSB, 2015). Finally, the positive approach used in many online learning classes has been shown to place high expectations on learners (Gold, 2011) and has led to successful outcomes. Hence, the hypothesis that the expectations of the students significantly affect satisfaction was included in this study.

  • H4: Expectations of the students positively affect their satisfaction.

Satisfaction and performance of the students

Zeithaml (1988) describes satisfaction as the outcome of the performance of any educational institute. According to Kotler and Clarke (1986), satisfaction is the desired outcome of any aim that gratifies an individual's desire. Quality interactions between instructor and students lead to student satisfaction (Malik et al., 2010; Martínez-Argüelles et al., 2016), and teaching quality and course material enhance student satisfaction through successful outcomes (Sanderson, 1995). Satisfaction relates to student performance in terms of motivation, learning, assurance, and retention (Biner et al., 1996). Mensink and King (2020) described performance as the conclusion of student-teacher efforts, showing the students' interest in their studies. The critical element in education is students' academic performance (Rono, 2013); it is therefore considered the central pillar around which the entire education system rotates. Narad and Abdullah (2016) concluded that students' academic performance determines academic institutions' success and failure.

Singh et al. (2016) asserted that student academic performance directly influences a country's socio-economic development, and Farooq et al. (2011) highlight that students' academic performance is the primary concern of all faculties. Additionally, the main foundation of knowledge gaining and skill improvement is students' academic performance. According to Narad and Abdullah (2016), regular evaluation or examination over a specific period is essential for assessing students' academic performance and achieving better outcomes. Hence, the hypothesis that satisfaction significantly affects the performance of the students was included in this study.

  • H5: Students’ satisfaction positively affects the performance of the students.

Satisfaction as mediator

Sibanda et al. (2015) applied goal theory to examine the factors influencing students' academic performance, highlighting the significance students attach to their satisfaction and academic achievement. According to this theory, students perform well if they know about the factors that impact their performance. Among the variables above, the institutional factors that influence student satisfaction and, through it, performance include course design and quality of the instructor (DeBourgh, 2003; Lado et al., 2003), as well as prompt feedback and expectations (Fredericksen et al., 2000). Hence, the hypothesis that quality of the instructor, course design, prompt feedback, and student expectations significantly affect students' performance through satisfaction was included in this study.

  • H6: Quality of the instructor, course design, prompt feedback, and students' expectations affect the students' performance through satisfaction.
  • H6a: Students' satisfaction mediates the relationship between quality of the instructor and students' performance.
  • H6b: Students' satisfaction mediates the relationship between course design and students' performance.
  • H6c: Students' satisfaction mediates the relationship between prompt feedback and students' performance.
  • H6d: Students' satisfaction mediates the relationship between students' expectations and students' performance.

Participants

In this cross-sectional study, data were collected from 544 respondents studying management (B.B.A. or M.B.A.) or hotel management courses, selected through purposive sampling. Descriptive statistics show that 48.35% of the respondents were MBA or BBA students and the rest were hotel management students. Male students made up 71% of the sample and female students 29%, so the proportion of males was almost double that of females. The students' ages ranged from 18 to 35. The dominant group, aged 18 to 22, comprised undergraduate students (94%); the remaining 6% were postgraduate students.

The research instrument consists of two sections. The first section covers demographic variables such as discipline, gender, age group, and education level (undergraduate or postgraduate). The second section measures the six factors: instructor's quality, course design, prompt feedback, student expectations, satisfaction, and performance. These attributes were taken from previous studies (Yin & Wang, 2015; Bangert, 2004; Chickering & Gamson, 1987; Wilson et al., 1997). "Instructor quality" was measured through the seven-item scale developed by Bangert (2004). The "course design" (six items) and "prompt feedback" (five items) scales were also adapted from Bangert (2004). The "students' expectations" scale consists of five items: four adapted from Bangert (2004) and one taken from Wilson et al. (1997). Students' satisfaction was measured with six items taken from Bangert (2004), Wilson et al. (1997), and Yin and Wang (2015). "Students' performance" was measured through the six-item scale developed by Wilson et al. (1997). These variables were assessed on a five-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree). Only students from India took part in the survey. A total of thirty-four questions were asked to check the effect of the first four variables on students' satisfaction and performance. For full details of the questionnaire, see Appendix Table 6.

The study used a descriptive research design. The factors "instructor quality, course design, prompt feedback and students' expectation" were the independent variables, students' satisfaction was the mediator, and students' performance was the dependent variable.

In this cross-sectional research, the respondents were selected through judgment sampling. They were informed about the objective of the study and the information-gathering process, they were assured of the confidentiality of the data, and no incentive was given to them for participating. The information used for this study was gathered through an online survey: the questionnaire was built in Google Forms and circulated through email. Students were also asked to write the name of their college, and fifteen colleges across India took part. The data were collected during the COVID-19 pandemic, during the total lockdown in India. This was an apt time to collect data on the topic because all colleges across India were running online classes; students therefore had enough time to understand the instrument and respond to the questionnaire effectively. A total of 615 questionnaires were circulated, of which 574 were returned. Thirty responses were excluded as unengaged, leaving 544 questionnaires for the present investigation. Both male and female students across different age groups and courses, i.e., undergraduate and postgraduate students of management and hotel management, were part of the sample.
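The paper excludes thirty "unengaged" responses without stating the screening rule. A minimal sketch, assuming a common straight-lining screen (a respondent giving the identical answer to every Likert item shows zero variance), on hypothetical toy data:

```python
import numpy as np

def drop_unengaged(responses):
    """Drop respondents whose answers show no variation across items
    (straight-lining), a common screen for unengaged survey responses."""
    responses = np.asarray(responses, dtype=float)
    sds = responses.std(axis=1)   # per-respondent spread across items
    keep = sds > 0                # zero spread -> same answer everywhere
    return responses[keep], int((~keep).sum())

# Toy data: 4 respondents x 5 Likert items; respondent 2 straight-lines "3".
data = [
    [1, 2, 4, 5, 3],
    [3, 3, 3, 3, 3],   # unengaged: identical answer on every item
    [5, 4, 4, 2, 1],
    [2, 2, 3, 4, 4],
]
cleaned, n_dropped = drop_unengaged(data)
print(cleaned.shape, n_dropped)  # (3, 5) 1
```

Other screens (e.g., response-time cutoffs or attention-check items) are equally plausible; the source does not specify which was used.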

Exploratory factor analysis (EFA)

To analyze the data, SPSS and AMOS software were used. First, to extract the distinct factors, an exploratory factor analysis (EFA) was performed using VARIMAX rotation on the sample of 544. The exploratory analysis rendered six distinct factors. Factor one was named quality of instructor, with items such as "The instructor communicated effectively", "The instructor was enthusiastic about online teaching", and "The instructor was concerned about student learning". Factor two was labeled course design, with items such as "The course was well organized", "The course was designed to allow assignments to be completed across different learning environments.", and "The instructor facilitated the course effectively". Factor three was labeled prompt feedback, with items such as "The instructor responded promptly to my questions about the use of Webinar" and "The instructor responded promptly to my questions about general course requirements". The fourth factor was students' expectations, with items such as "The instructor provided models that clearly communicated expectations for weekly group assignments" and "The instructor used good examples to explain statistical concepts". The fifth factor was students' satisfaction, with items such as "The online classes were valuable" and "Overall, I am satisfied with the quality of this course". The sixth factor was performance of the student, with items such as "The online classes has sharpened my analytic skills" and "Online classes really tries to get the best out of all its students". These six factors explained 67.784% of the total variance. To validate the factors extracted through EFA, the researchers performed confirmatory factor analysis (CFA) through AMOS. Finally, structural equation modeling (SEM) was used to test the hypothesized relationships.
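The study ran its EFA in SPSS; the same VARIMAX-rotated extraction can be sketched in Python. A minimal, self-contained illustration on synthetic data (two latent factors, three items each — the factor structure, loadings, and noise levels here are assumptions for the demo, not the study's data):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 544  # sample size used in the study

# Two latent factors driving two blocks of three Likert-style items each.
f = rng.normal(size=(n, 2))
loadings_true = np.array([
    [0.90, 0.00], [0.80, 0.10], [0.85, 0.00],   # items for factor 1
    [0.00, 0.90], [0.10, 0.80], [0.00, 0.85],   # items for factor 2
])
items = f @ loadings_true.T + rng.normal(scale=0.4, size=(n, 6))

# EFA with VARIMAX rotation, as in the paper (factor count fixed at 2 here;
# the study extracted six factors from 34 items).
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(items)
loadings = fa.components_.T   # items x factors loading matrix
print(np.round(np.abs(loadings), 2))
```

Each item should load most strongly on its own factor, reproducing the "distinct factors" pattern the paper reports for its six constructs.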

Measurement model

Table 1 summarizes the findings of the EFA and CFA: the EFA rendered six distinct factors, and the CFA validated them. Table 2 shows that the proposed measurement model achieved good convergent validity (Aggarwal et al., 2018a, b). The results of the confirmatory factor analysis showed that the values of the standardized factor loadings were statistically significant at the 0.05 level. Further, the measurement model showed acceptable fit indices: CMIN = 710.709; df = 480; CMIN/df = 1.481, p < 0.001; Incremental Fit Index (IFI) = 0.979; Tucker-Lewis Index (TLI) = 0.976; Goodness of Fit Index (GFI) = 0.928; Adjusted Goodness of Fit Index (AGFI) = 0.916; Comparative Fit Index (CFI) = 0.978; Root Mean Square Residual (RMR) = 0.042; and Root Mean Squared Error of Approximation (RMSEA) = 0.030, which is satisfactory.
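The CMIN/df ratio reported above is simply the model chi-square divided by its degrees of freedom; a quick arithmetic check of the reported values:

```python
# Relative chi-square: CMIN/df, using the values reported for the
# measurement model (CMIN = 710.709, df = 480).
cmin, df = 710.709, 480
ratio = cmin / df
print(round(ratio, 3))  # 1.481, matching the reported CMIN/df
```

Values below the conventional cutoffs (commonly 2 or 3) indicate acceptable fit, which is consistent with the paper's conclusion.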

Factor Analysis

Author’s Compilation

Validity analysis of measurement model

Author’s compilation

AVE is the Average Variance Extracted, CR is Composite Reliability

The bold diagonal value represents the square root of AVE

The Average Variance Extracted (AVE), according to the acceptable index, should be higher than the squared correlations between the latent variable and all other variables. Discriminant validity is confirmed (Table 2), as the square root of each AVE is greater than the corresponding inter-construct correlation coefficients (Hair et al., 2006). Additionally, discriminant validity exists when there is a low correlation between each variable's measurement indicators and all other variables except the one with which it is theoretically associated (Aggarwal et al., 2018a, b; Aggarwal et al., 2020). The results of Table 2 show that the measurement model achieved good discriminant validity.
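The AVE, composite reliability (CR), and Fornell-Larcker comparison underlying Table 2 can be computed directly from standardized loadings. A minimal sketch with hypothetical loadings and a hypothetical inter-construct correlation (not the study's values):

```python
import numpy as np

def ave_cr(loadings):
    """Average Variance Extracted and Composite Reliability for one
    construct, computed from its standardized factor loadings."""
    lam = np.asarray(loadings, dtype=float)
    ave = np.mean(lam ** 2)                       # mean squared loading
    cr = lam.sum() ** 2 / (lam.sum() ** 2 + np.sum(1 - lam ** 2))
    return ave, cr

# Hypothetical standardized loadings for two constructs.
ave_a, cr_a = ave_cr([0.82, 0.79, 0.85, 0.80])
ave_b, cr_b = ave_cr([0.75, 0.81, 0.78])

corr_ab = 0.52  # hypothetical inter-construct correlation
# Fornell-Larcker: sqrt(AVE) of each construct must exceed the correlation.
print(round(ave_a, 3), round(cr_a, 3))
print(bool(np.sqrt(ave_a) > corr_ab and np.sqrt(ave_b) > corr_ab))  # True
```

This is the same check the paper applies: the bold diagonal of Table 2 (square roots of AVE) must dominate the off-diagonal correlations.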

Structural model

To test the proposed hypotheses, the researchers used structural equation modeling, a multivariate statistical analysis technique that combines factor analysis and multiple regression analysis. It is used to analyze the structural relationships between measured variables and latent constructs.

Table 3 presents the structural model's fit indices with all variables put together: CMIN/DF is 2.479, and all the model fit values are within the acceptable range, meaning the model attained a good fit. Furthermore, other fit indices such as GFI = 0.982 and AGFI = 0.956 are also supportive (Schumacker & Lomax, 1996; Marsh & Grayson, 1995; Kline, 2005).

Criterion for model fit

Hence, the model fitted the data successfully. All co-variances among the variables and regression weights were statistically significant ( p  < 0.001).

Table 4 presents the relationships between the exogenous, mediator, and endogenous variables, viz. quality of instructor, prompt feedback, course design, students' expectations, students' satisfaction, and students' performance. The first four factors have a positive relationship with satisfaction, which in turn positively affects students' performance. Results show that the instructor's quality has a positive relationship with students' satisfaction with online classes (SE = 0.706, t-value = 24.196; p < 0.05). Hence, H1 was supported. The second factor, course design, has a positive relationship with students' satisfaction (SE = 0.064, t-value = 2.395; p < 0.05). Hence, H2 was supported. The third factor is prompt feedback, and the results show that feedback has a positive relationship with students' satisfaction (SE = 0.067, t-value = 2.520; p < 0.05). Hence, H3 was supported. The fourth factor is students' expectations; the results show a positive relationship between students' expectations and students' satisfaction with online classes (SE = 0.149, t-value = 5.127; p < 0.05). Hence, H4 was supported. The SEM results show that, among quality of instructor, prompt feedback, course design, and students' expectations, the factor that most affected students' satisfaction was instructor's quality (SE = 0.706), followed by students' expectations (SE = 0.149) and prompt feedback (SE = 0.067); the factor that least affected students' satisfaction was course design (SE = 0.064). Finally, Table 4 shows that students' satisfaction has a positive effect on students' performance (SE = 0.186, t-value = 2.800; p < 0.05). Hence, H5 was supported.

Structural analysis

Table 5 shows that students' satisfaction partially mediates the positive relationship between the instructor's quality and student performance; hence, H6(a) was supported. Further, the mediation analysis showed that satisfaction also partially mediates the positive relationship between course design and students' performance; hence, H6(b) was supported. However, satisfaction fully mediates the positive relationship between prompt feedback and student performance; hence, H6(c) was supported. Finally, Table 5 shows that satisfaction partially mediates the positive relationship between students' expectations and students' performance; hence, H6(d) was supported.

Mediation Analysis
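The logic of a mediation test like H6(c) (prompt feedback → satisfaction → performance) can be sketched with a bootstrap of the indirect effect a×b. The data below are simulated with assumed path coefficients (0.5 and 0.4), not the study's data, and the paper's AMOS-based procedure is replaced here by plain OLS paths:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 544

# Simulated chain: feedback -> satisfaction -> performance
# (assumed a-path = 0.5, b-path = 0.4; purely illustrative).
feedback = rng.normal(size=n)
satisfaction = 0.5 * feedback + rng.normal(scale=0.8, size=n)
performance = 0.4 * satisfaction + rng.normal(scale=0.8, size=n)

def indirect_effect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                     # a: X -> M slope
    X = np.column_stack([np.ones_like(x), m, x])   # Y on M controlling for X
    b = np.linalg.lstsq(X, y, rcond=None)[0][1]    # b: M -> Y | X slope
    return a * b

# Percentile bootstrap of the indirect effect.
boot = []
for _ in range(1000):
    idx = rng.integers(0, n, n)                    # resample with replacement
    boot.append(indirect_effect(feedback[idx], satisfaction[idx],
                                performance[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(bool(lo > 0))  # CI excludes zero -> mediation supported in this toy data
```

A 95% confidence interval for a×b that excludes zero is the usual bootstrap criterion for a significant indirect effect; full vs. partial mediation then depends on whether the direct path remains significant.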

In the present study, the authors evaluated the different factors directly linked with students' satisfaction and performance in online classes during COVID-19. Owing to the global pandemic, all colleges and universities were shifted to online mode by their respective governments. No one knew how long the pandemic would last, and hence the teaching method was shifted to online mode. Even though some educators were not tech-savvy, they updated themselves to battle the unexpected circumstances (Pillai et al., 2021). The present study's results will help educators increase students' satisfaction and performance in online classes, and the current research assists educators in understanding the different factors required for online teaching.

Comparing the current research with past studies, previous work has examined the factors affecting student satisfaction in the conventional schooling framework, whereas the present study was conducted during India's lockdown period to identify the prominent factors that drive students' satisfaction with online classes. The study also explored the direct linkage between students' satisfaction and their performance. The findings indicate that instructor's quality is the most prominent factor affecting students' satisfaction during online classes. This means that the instructor needs to be very efficient during lectures and needs to understand students' psychology to deliver the course content effectively. If the teacher delivers the course content properly, it affects students' satisfaction and performance. The teachers' perspective is critical because their enthusiasm leads to better quality in the online learning process.

The present study highlighted that the second most prominent factor affecting students' satisfaction in online classes is students' expectations. Students may have certain expectations of the classes; if the instructor understands those expectations and customizes the course design accordingly, the students can be expected to perform better in the examinations. The third factor that affects students' satisfaction is feedback. After delivering the course, appropriate feedback should be collected by the instructors to plan future courses, and it also helps shape future strategies (Tawafak et al., 2019). There must be a proper feedback system for improvement, because feedback is the real image of the course content. The last factor that affects students' satisfaction is course design. The course content needs to be designed effectively so that students can easily understand it. If the instructor plans the course so that the students understand the content without any problems, it leads to satisfaction, and the students can perform better in exams. In some situations the course content is difficult to deliver online, such as practical components, i.e., recipes of dishes or practical demonstrations in the lab. In such situations, the instructor needs to be more creative in designing and delivering the course content so that it positively impacts students' overall satisfaction with online classes.

Overall, the students agreed that online teaching was valuable for them, even though the online mode of classes was their first experience during the COVID-19 pandemic (Agarwal & Kaushik, 2020; Rajabalee & Santally, 2020). Some previous studies suggest that technology-supported courses have a positive relationship with students' performance (Cho & Schelzer, 2000; Harasim, 2000; Sigala, 2002). On the other hand, demographic characteristics also play a vital role in understanding online course performance. According to the APA Work Group of the Board of Educational Affairs (1997), the learner-centered principles suggest that students must be willing to invest the time required to complete individual course assignments, and online instructors must be enthusiastic about developing genuine instructional resources that actively connect learners and encourage them toward proficient performance. For better performance in studies, teachers and students have equal responsibility: when learners face a problem understanding a concept, they need to seek solutions from the instructor (Bangert, 2004). Thus, we can conclude that instructor quality, students' expectations, prompt feedback, and effective course design significantly impact students' online learning process.

Implications of the study

The results of this study have numerous significant practical implications for educators, students, and researchers. The study also contributes to the literature by demonstrating that multiple factors jointly shape student satisfaction and performance in online classes during the COVID-19 pandemic. This study differs from previous studies (Baber, 2020; Ikhsan et al., 2019; Eom & Ashill, 2016), none of which examined the effect of students' satisfaction on their perceived academic performance. Previous empirical findings have highlighted the importance of examining the factors affecting student satisfaction (Maqableh & Jaradat, 2021; Yunusa & Umar, 2021), but none has examined the combined effect of course design, quality of instructor, prompt feedback, and students' expectations on students' satisfaction with online classes during the pandemic. The present study tries to fill this research gap.

The first essential contribution of this study is that the instructor's facilitating role and competence affect the level of satisfaction of the students (Gray & DiLoreto, 2016). Instructors who taught online courses during the pandemic carried an extra obligation: they had to adapt to a changing climate, polish their technical skills throughout the process, and foster new students' technical knowledge in this environment. The present study's findings indicate that instructor quality is a significant determinant of student satisfaction in online classes amid a pandemic. In higher education, the teacher's standard refers to the instructor's specific individual characteristics before entering the class (Darling-Hammond, 2010); these include instructor content knowledge, pedagogical knowledge, inclination, and experience. More significantly, at that level, deep understanding can be conveyed by those with substantial technical expertise in the areas they are teaching (Martin, 2021). Secondly, the present study contributes to the profession of education by illustrating a realistic approach that can be used to recognize students' expectations effectively. The primary expectation of most students before joining a university is employment, and instructors have agreed that they should do more to fulfill students' employment expectations (Gorgodze et al., 2020). The instructor can then balance these expectations to improve student satisfaction. The results can be used to continually improve and build courses, as well as to make policy decisions that improve education programs.
Thirdly, the results show how online course designers and instructors can structure online courses more efficiently, including design features that minimize negative and maximize positive emotion, contributing to greater student satisfaction (Martin et al., 2018). The findings suggest that course design has a substantial positive influence on student performance in online classes: the course design needs to present essential details such as course content, educational goals, course structure, and course output in a consistent manner so that students find the e-learning system beneficial, which enables them to use the system and in turn supports their performance (Almaiah & Alyoussef, 2019). Lastly, the results indicate that instructors should respond to questions promptly and provide timely feedback on assignments, facilitating techniques that help students in online courses through improved instructor participation, interaction, and understanding (Martin et al., 2018). Feedback can help students focus on the performance aspects that enhance their learning.

Limitations and future scope of the study

The data collected in this study were cross-sectional in nature, which makes it difficult to establish causal relationships between the variables; future research can use a longitudinal design to handle this limitation. Further, the data were collected from only one type of respondent, the students, so the results cannot be generalized to other samples. Future research can also include the perspectives of teachers and policy makers for greater generalizability. The current research is limited to theory classes; it could be extended to check students' performance in practical classes. The study covers Indian students only; data collected from various countries could give better comparative results on students' perspectives. This study is limited to checking the performance of students, so future work could examine the performance of teachers under similar conditions. Students may also face issues such as limited internet access, disturbance due to low signals, or home-environment problems such as disturbance by family members, which may lead to negative performance. The above-mentioned points can be incorporated into future research.

Declarations

Not applicable.

The authors declare no conflict of interest, financial or otherwise.

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Ram Gopal, Email: [email protected] .

Varsha Singh, Email: [email protected] .

Arun Aggarwal, Email: [email protected] .

  • Agarwal S, Kaushik JS. Student's perception of online learning during COVID pandemic. The Indian Journal of Pediatrics. 2020;87:554–554. doi: 10.1007/s12098-020-03327-7.
  • Aggarwal A, Dhaliwal RS, Nobi K. Impact of structural empowerment on organizational commitment: The mediating role of women's psychological empowerment. Vision. 2018;22(3):284–294. doi: 10.1177/0972262918786049.
  • Aggarwal A, Goyal J, Nobi K. Examining the impact of leader-member exchange on perceptions of organizational justice: The mediating role of perceptions of organizational politics. Theoretical Economics Letters. 2018;8(11):2308–2329. doi: 10.4236/tel.2018.811150.
  • Aggarwal A, Chand PA, Jhamb D, Mittal A. Leader-member exchange, work engagement and psychological withdrawal behaviour: The mediating role of psychological empowerment. Frontiers in Psychology. 2020;11:1–17. doi: 10.3389/fpsyg.2020.00423.
  • Almaiah MA, Almulhem A. A conceptual framework for determining the success factors of e-learning system implementation using Delphi technique. Journal of Theoretical and Applied Information Technology. 2018;96(17):5962–5976.
  • Almaiah MA, Alyoussef IY. Analysis of the effect of course design, course content support, course assessment and instructor characteristics on the actual use of E-learning system. IEEE Access. 2019;7:171907–171922. doi: 10.1109/ACCESS.2019.2956349.
  • Ames C. Classrooms: Goals, structures, and student motivation. Journal of Educational Psychology. 1992;84:261–271. doi: 10.1037/0022-0663.84.3.261.
  • Ames C, Archer J. Achievement goals in the classroom: Students' learning strategies and motivational processes. Journal of Educational Psychology. 1988;80:260–267. doi: 10.1037/0022-0663.80.3.260.
  • APA Work Group of the Board of Educational Affairs. Learner-centered psychological principles: A framework for school reform and redesign. American Psychological Association; 1997.
  • Appleton-Knapp S, Krentler KA. Measuring student expectations and their effects on satisfaction: The importance of managing student expectations. Journal of Marketing Education. 2006;28(3):254–264. doi: 10.1177/0273475306293359.
  • Arambewela R, Hall J. An empirical model of international student satisfaction. Asia Pacific Journal of Marketing and Logistics. 2009;21(4):555–569. doi: 10.1108/13555850910997599.
  • Azlan AA, Hamzah MR, Sern TJ, Ayub SH, Mohamad E. Public knowledge, attitudes and practices towards COVID-19: A cross-sectional study in Malaysia. PLoS One. 2020;15(5):e0233668. doi: 10.1371/journal.pone.0233668.
  • Baber H. Determinants of students' perceived outcome and satisfaction in online learning during the pandemic of COVID-19. Journal of Education and e-Learning Research. 2020;7(3):285–292. doi: 10.20448/journal.509.2020.73.285.292.
  • Bangert AW. The seven principles of good practice: A framework for evaluating on-line teaching. The Internet and Higher Education. 2004;7(3):217–232. doi: 10.1016/j.iheduc.2004.06.003.
  • Bates EA, Kaye LK. "I'd be expecting caviar in lectures": The impact of the new fee regime on undergraduate students' expectations of higher education. Higher Education. 2014;67(5):655–673. doi: 10.1007/s10734-013-9671-3.
  • Bayham J, Fenichel EP. The impact of school closure for COVID-19 on the US healthcare workforce and the net mortality effects. 2020. Available at SSRN: 10.2139/ssrn.3555259.
  • Bersin J. The blended learning book: Best practices, proven methodologies and lessons learned. Pfeiffer Publishing; 2004.
  • Biner PM, Summers M, Dean RS, Bink ML, Anderson JL, Gelder BC. Student satisfaction with interactive telecourses as a function of demographic variables and prior telecourse experience. Distance Education. 1996;17(11):33–43. doi: 10.1080/0158791960170104.
  • Black GS, Kassaye WW. Do students learning styles impact student outcomes in marketing classes? Academy of Educational Leadership Journal. 2014;18(4):149–162.
  • Bridge S. Opinion: How edtech will keep our students on track during covid-19. ArabianBusiness.com. 2020. Retrieved from https://search.proquest.com/docview/2377556452?accountid=147490. Accessed 12 Oct 2020.
  • Brown SA, Venkatesh V, Goyal S. Expectation confirmation in information systems research: A test of six competing models. MIS Quarterly. 2014;38(3):729–756. doi: 10.25300/MISQ/2014/38.3.05.
  • Brownlee J, Walker S, Lennox S, Exley B, Pearce S. The first year university experience: Using personal epistemology to understand effective learning and teaching in higher education. Higher Education. 2009;58(5):599–618. doi: 10.1007/s10734-009-9212-2.
  • Budur T, Faraj KM, Karim LA. Benchmarking operations strategies via hybrid model: A case study of café-restaurant sector. Amozonia Investiga. 2019; 8 :842–854. [ Google Scholar ]
  • Centers for Disease Control and Prevention (2020). Coronavirus disease 2019 (COVID-19): Reducing stigma. Retrieved November 26, 2020, from: https://www.cdc.gov/coronavirus/2019-ncov/about/related-stigma.html .
  • Chang N. Pre-service Teachers' views: How did E-feedback through assessment facilitate their learning? Journal of the Scholarship of Teaching and Learning. 2011; 11 (2):16–33. [ Google Scholar ]
  • Chickering AW, Gamson ZF. Seven principles for good practice in undergraduate education. AAHE Bulletin. 1987; 39 (7):3–7. [ Google Scholar ]
  • Cho W, Schelzer C. Just in-time education: Tools for hospitality managers of the future? International Journal of Contemporary Hospitality Management. 2000; 12 (1):31–36. doi: 10.1108/09596110010305000. [ CrossRef ] [ Google Scholar ]
  • Christensen AL. Feedback, affect, and creative behavior: A multi-level model linking feedback to performance. Arizona State University; 2014. [ Google Scholar ]
  • Clayton K, Blumberg F, Auld DP. The relationship between motivation, learning strategies and choice of environment whether traditional or including an online component. British Journal of Educational Technology. 2010; 41 (3):349–364. doi: 10.1111/j.1467-8535.2009.00993.x. [ CrossRef ] [ Google Scholar ]
  • Darling-Hammond, L. (2010).  Evaluating teacher effectiveness: How teacher performance assessments can measure and improve teaching . Washington, DC: Center for American Progress
  • DeBourgh GA. Predictors of student satisfaction in distance-delivered graduate nursing courses: What matters most? Journal of Professional Nursing. 2003; 19 :149–163. doi: 10.1016/S8755-7223(03)00072-3. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Dweck C, Leggett E. A social–cognitive approach to motivation and personality. Psychological Review. 1988; 95 :256–273. doi: 10.1037/0033-295X.95.2.256. [ CrossRef ] [ Google Scholar ]
  • Elliot AJ. A conceptual history of the achievement goal construct. In: Elliot A, Dweck C, editors. Handbook of competence and motivation. Guilford Press; 2005. pp. 52–72. [ Google Scholar ]
  • Elliot A, McGregor H. A 2 _ 2 achievement goal framework. Journal of Personality and Social Psychology. 2001; 80 :501–519. doi: 10.1037/0022-3514.80.3.501. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Elliott ES, Dweck CS. Goals: An approach to motivation and achievement. Journal of Personality and Social Psychology. 1988; 54 (1):5. doi: 10.1037/0022-3514.54.1.5. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Eom SB, Ashill N. The determinants of students' perceived learning outcomes and satisfaction in university online education: An update. Decision Sciences Journal of Innovative Education. 2016; 14 (2):185–215. doi: 10.1111/dsji.12097. [ CrossRef ] [ Google Scholar ]
  • Eraut, M., (2006). Feedback. Learning in Health and Social Care. Volume-5, issue-3. Pg 111–118. Retrieved from https://edservices.wiley.com/how-student-feedback-creates-better-online- learning/ . Accessed 23 Oct 2020.
  • Farooq MS, Chaudhry AH, Shafiq M, Berhanu G. Factors affecting students' quality of academic performance: A case of secondary school level. Journal of Quality and Technology Management. 2011; 7 :1–14. [ Google Scholar ]
  • Fredericksen E, Shea P, Pickett A. Factors influencing student and faculty satisfaction in the SUNY learning network. State University of New York; 2000. [ Google Scholar ]
  • Gold, S. (2011). A constructivist approach to online training for online teachers. Journal of Aysnchronous Learning Networks, 5 (1), 35–57.
  • González-Gómez D, Jeong JS, Rodríguez DA. Performance and perception in the flipped learning model: An initial approach to evaluate the effectiveness of a new teaching methodology in a general science classroom. Journal of Science Education and Technology. 2016; 25 (3):450–459. doi: 10.1007/s10956-016-9605-9. [ CrossRef ] [ Google Scholar ]
  • Gorgodze S, Macharashvili L, Kamladze A. Learning for earning: Student expectations and perceptions of university. International Education Studies. 2020; 13 (1):42–53. doi: 10.5539/ies.v13n1p42. [ CrossRef ] [ Google Scholar ]
  • Grammatikopoulos, V., Linardakis, M., Gregoriadis, A., & Oikonomidis, V. (2014). Assessing the Students' evaluations of educational quality (SEEQ) questionnaire in Greek higher education. Higher Education., 70 (3).
  • Gray JA, DiLoreto M. The effects of student engagement, student satisfaction, and perceived learning in online learning environments. International Journal of Educational Leadership Preparation. 2016; 11 (1):n1. [ Google Scholar ]
  • Grebennikov L, Shah S. Monitoring trends in student satisfaction. Tertiary Education and Management. 2013; 19 (4):301–322. doi: 10.1080/13583883.2013.804114. [ CrossRef ] [ Google Scholar ]
  • Hair JF, Black WC, Babin BJ, Anderson RE, Tatham RL. Multivariate data analysis 6th edition. Pearson Prentice Hall. New Jersey. Humans: Critique and reformulation. Journal of Abnormal Psychology. 2006; 87 :49–74. [ PubMed ] [ Google Scholar ]
  • Hansen G, Ringdal R. Formative assessment as a future step in maintaining the mastery-approach and performance-avoidance goal stability. Studies in Educational Evaluation. 2018; 56 :59–70. doi: 10.1016/j.stueduc.2017.11.005. [ CrossRef ] [ Google Scholar ]
  • Harasim L. Shift happens: Online education as a new paradigm in learning. The Internet and Higher Education. 2000; 3 (1):41–61. doi: 10.1016/S1096-7516(00)00032-4. [ CrossRef ] [ Google Scholar ]
  • Hattie J, Timperley H. The power of feedback. Review of Educational Research. 2007; 77 (1):81–112. doi: 10.3102/003465430298487. [ CrossRef ] [ Google Scholar ]
  • Henriksen D, Creely E, Henderson M. Folk pedagogies for teacher transitions: Approaches to synchronous online learning in the wake of COVID-19. Journal of Technology and Teacher Education. 2020; 28 (2):201–209. [ Google Scholar ]
  • Ho A, Lu L, Thurmaier K. Testing the reluctant Professor's hypothesis: Evaluating a blended-learning approach to distance education. Journal of Public Affairs Education. 2006; 12 (1):81–102. doi: 10.1080/15236803.2006.12001414. [ CrossRef ] [ Google Scholar ]
  • ICSB (2015). Addressing undergraduate entrepreneurship student expectations: An exploratory study. International Council for Small Business (ICSB). Retrieved from https://search.proquest.com/docview/1826918813?accountid=147490 . Accessed 20 Oct 2020.
  • Ikhsan, R. B., Saraswati, L. A., Muchardie, B. G., & Susilo, A. (2019). The determinants of students' perceived learning outcomes and satisfaction in BINUS online learning. Paper presented at the 2019 5th International Conference on New Media Studies (CONMEDIA). IEEE.
  • Jenkins, D. M. (2015). Integrated course design: A facelift for college courses. Journal of Management Education, 39 (3), 427–432.
  • Kauffman, H. (2015). A review of predictive factors of student success in and satisfaction with online learning. Research in Learning Technology, 23 .
  • Khan NUS, Yildiz Y. Impact of intangible characteristics of universities on student satisfaction. Amazonia Investiga. 2020; 9 (26):105–116. doi: 10.34069/AI/2020.26.02.12. [ CrossRef ] [ Google Scholar ]
  • Kinicki AJ, Prussia GE, Wu BJ, McKee-Ryan FM. A covariance structure analysis of employees' response to performance feedback. Journal of Applied Psychology. 2004; 89 (6):1057–1069. doi: 10.1037/0021-9010.89.6.1057. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Kline RB. Principles and practice of structural equation modeling. 2. The Guilford Press; 2005. [ Google Scholar ]
  • Kotler, P., & Clarke, R. N. (1986). Marketing for health care organizations . Prentice Hall.
  • Lado N, Cardone-Riportella C, Rivera-Torres P. Measurement and effects of teaching quality: An empirical model applied to masters programs. Journal of Business Education. 2003; 4 :28–40. [ Google Scholar ]
  • Ladyshewsky RK. Instructor presence in online courses and student satisfaction. International Journal for the Scholarship of Teaching and Learning. 2013; 7 :1. doi: 10.20429/ijsotl.2013.070113. [ CrossRef ] [ Google Scholar ]
  • Lah, K., & G. Botelho. (2012). Union Opts to Continue Chicago Teachers Strike; Mayor Takes Fight to Court. http://articles.cnn.com/2012-09-16/us/us_illinois-chicago-teachersstrike_1_chicago-teachers-union-union-president-karen-lewis-union-delegates .
  • Lee J. An exploratory study of effective online learning: Assessing satisfaction levels of graduate students of mathematics education associated with human and design factors of an online course. The International Review of Research in Open and Distance Learning. 2014; 15 (1):111–132. doi: 10.19173/irrodl.v15i1.1638. [ CrossRef ] [ Google Scholar ]
  • Liaw S-S. Investigating students' perceived satisfaction, behavioral intention, and effectiveness of e-learning: A case study of the blackboard system. Computers & Education. 2008; 51 (2):864–873. doi: 10.1016/j.compedu.2007.09.005. [ CrossRef ] [ Google Scholar ]
  • Lin Y, Lin G, Laffey JM. Building a social and motivational framework for understanding satisfaction in online learning. Journal of Educational Computing Research. 2008; 38 (1):1–27. doi: 10.2190/EC.38.1.a. [ CrossRef ] [ Google Scholar ]
  • Lockman AS, Schirmer BR. Online instruction in higher education: Promising, research-based, and evidence-based practices. Journal of Education and e-Learning Research. 2020; 7 (2):130–152. doi: 10.20448/journal.509.2020.72.130.152. [ CrossRef ] [ Google Scholar ]
  • Luekens, M.T., Lyter, D.M., and Fox, E.E. (2004). Teacher attrition and mobility: Results from the teacher follow-up survey, 2000–01 (NCES 2004–301). National Center for Education Statistics, U.S. Department of Education . Washington, DC. https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2004301 .
  • Malik, M. E., Danish, R. Q., & Usman, A. (2010). The impact of service quality on students’ satisfaction in higher education institutes of Punjab. Journal of Management Research, 2 (2), 1–11.
  • Maqableh, M., & Jaradat, M. (2021). Exploring the determinants of students’ academic performance at university level: The mediating role of internet usage continuance intention. Education and Information Technologies . 10.1007/s10639-021-10453-y. [ PMC free article ] [ PubMed ]
  • Marsh HW. Students' evaluations of university teaching: Research findings, methodological issues, and directions for future research. International Journal of Educational Research. 1987; 11 :253–388. doi: 10.1016/0883-0355(87)90001-2. [ CrossRef ] [ Google Scholar ]
  • Marsh, H. W., & Grayson, D. (1995). Latent variable models of multitrait-multimethod data.Marsh, H. W., & Grayson, D. (1995). Latent variable models of multitrait-multimethod data. In R. H. Hoyle (Ed.), Structural equation modeling: Concepts, issues, and applications (p. 177–198). Sage Publications, Inc.
  • Martin, A. M. (2021). Instructor qualities and student success in higher education online courses. Journal of Digital Learning in Teacher Education, 37 (1), 65–80.
  • Martínez-Argüelles, M. J., & Batalla-Busquets, J. M. (2016). Perceived service quality and student loyalty in an online university. International Review of Research in Open and Distributed Learning, 17 (4), 264–279.
  • Martin F, Wang C, Sadaf A. Student perception of helpfulness of facilitation strategies that enhance instructor presence, connectedness, engagement, and learning in online courses. The Internet and Higher Education. 2018; 37 :52–65. doi: 10.1016/j.iheduc.2018.01.003. [ CrossRef ] [ Google Scholar ]
  • Mensink PJ, King K. Student access of online feedback is modified by the availability of assessment marks, gender and academic performance. British Journal of Educational Technology. 2020; 51 (1):10–22. doi: 10.1111/bjet.12752. [ CrossRef ] [ Google Scholar ]
  • Mohammed SS, Suleyman C, Taylan B. Burnout determinants and consequences among university lecturers. Amazonia Investiga. 2020; 9 (27):13–24. doi: 10.34069/AI/2020.27.03.2. [ CrossRef ] [ Google Scholar ]
  • Moore JL, Dickson-Deane C, Galyen K. E-learning, online learning, and distance learning environments: Are they the same? Internet Higher Educ. 2011; 14 (2):129–135. doi: 10.1016/j.iheduc.2010.10.001. [ CrossRef ] [ Google Scholar ]
  • Mouratidis, A., Michou, A., Demircioğlu, A. N., & Sayil, M. (2018). Different goals, different pathways to success: Performance-approach goals as direct and mastery-approach goals as indirect predictors of grades in mathematics. Learning and Individual Differences, 61 , 127–135.
  • Mtebe JS, Raisamo R. A model for assessing learning management system success in higher education in sub-Saharan countries. The Electronic Journal of Information Systems in Developing Countries. 2014; 61 (1):1–17. doi: 10.1002/j.1681-4835.2014.tb00436.x. [ CrossRef ] [ Google Scholar ]
  • Munteanu C, Ceobanu C, Bobâlca C, Anton O. An analysis of customer satisfaction in a higher education context. The International Journal of Public Sector Management. 2010; 23 (2):124. doi: 10.1108/09513551011022483. [ CrossRef ] [ Google Scholar ]
  • Narad A, Abdullah B. Academic performance of senior secondary school students: Influence of parental encouragement and school environment. Rupkatha Journal on Interdisciplinary Studies in Humanities. 2016; 8 (2):12–19. doi: 10.21659/rupkatha.v8n2.02. [ CrossRef ] [ Google Scholar ]
  • NDTV (2020). Schools Closed, Travel To Be Avoided, Says Centre On Coronavirus: 10 Points. NDTV.com. Retrieved March 18, 2020.
  • O'donovan B. How student beliefs about knowledge and knowing influence their satisfaction with assessment and feedback. Higher Education. 2017; 74 (4):617–633. doi: 10.1007/s10734-016-0068-y. [ CrossRef ] [ Google Scholar ]
  • Oliver RL. A congitive model of the antecedents and consequences of satisfaction decisions. JMR, Journal of Marketing Research (Pre-1986) 1980; 17 (000004):460. doi: 10.1177/002224378001700405. [ CrossRef ] [ Google Scholar ]
  • Pei L, Wu H. Does online learning work better than offline learning in undergraduate medical education? A systematic review and meta-analysis. Medical Education Online. 2019; 24 (1):1666538. doi: 10.1080/10872981.2019.1666538. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Perlman S, & Mclntosh K. (2020). Coronaviruses, including severe acute respiratory syndrome (SARS) and Middle East respiratory syndrome (MERS). In: J.E Benett, R. Dolin,  M. J. Blaser (Eds.), Mandell, Douglas, and Bennett's principles and practice of infectious diseases. 9th ed . Philadelphia, PA: Elsevier: chap 155.
  • Pillai, K. R., Upadhyaya, P., Prakash, A. V., Ramaprasad, B. S., Mukesh, H. V., & Pai, Y. (2021). End-user satisfaction of technology-enabled assessment in higher education: A coping theory perspective. Education and Information Technologies . 10.1007/s10639-020-10401-2.
  • Pintrich P. The role of motivation in promoting and sustaining self-regulated learning. International Journal of Educational Research. 1999; 31 :459–470. doi: 10.1016/S0883-0355(99)00015-4. [ CrossRef ] [ Google Scholar ]
  • Rajabalee, Y. B., & Santally, M. I. (2020). Learner satisfaction, engagement and performances in an online module: Implications for institutional e-learning policy. Education and Information Technologies . 10.1007/s10639-020-10375-1. [ PMC free article ] [ PubMed ]
  • Ramsden PA. Performance indicator of teaching quality in higher education: The course experience questionnaire. Studies in Higher Education. 1991; 16 (2):129–150. doi: 10.1080/03075079112331382944. [ CrossRef ] [ Google Scholar ]
  • Rogers J. Adults learning. 3. Open University Press; 1992. [ Google Scholar ]
  • Rono, R. (2013). Factors affecting pupils' performance in public primary schools at Kenya certificate of primary education examination (Kcpe) in Emgwen Division, Nandi District, Kenya (Doctoral dissertation, University of Nairobi) .
  • Salloum, S. A. and Shaalan, K. (2018). Investigating students' acceptance of e-learning system in higher educational environments in the UAE: Applying the extended technology acceptance model (TAM), Ph.D. dissertation, Brit. Univ. Dubai, Dubai, United Arab Emirates, 2018.
  • Sanderson G. Objectives and evaluation. In: Truelove S, editor. Handbook of training and development. 2. Blackwell; 1995. pp. 113–144. [ Google Scholar ]
  • Schumacker RE, Lomax RG. A beginner's guide to structural equation modeling. L. L. Erlbaum Associates; 1996. [ Google Scholar ]
  • Schwarz C, Zhu Z. The impact of student expectations in using instructional tools on student engagement: A look through the expectation disconfirmation theory lens. Journal of Information Systems Education. 2015; 26 (1):47–58. [ Google Scholar ]
  • Schwinger M, Stiensmeier-Pelster J. Performance-approach and performance-avoidance classroom goals and the adoption of personal achievement goals. British Journal of Educational Psychology. 2011; 81 (4):680–699. doi: 10.1111/j.2044-8279.2010.02012.x. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Sher, A. (2009). Assessing the relationship of student-instructor and student-student interaction to student learning and satisfaction in web-based online learning environment. Journal of Interactive Online Learning, 8 (2).
  • Sibanda L, Iwu CG, Benedict OH. Factors influencing academic performance оf university students. Демографія та соціальна економіка 2015; 2 :103–115. [ Google Scholar ]
  • Sigala M. The evolution of internet pedagogy: Benefits for tourism and hospitality education. Journal of Hospitality, Leisure Sport and Tourism Education. 2002; 1 (2):29–45. [ Google Scholar ]
  • Simsek, U., Turan, I., & Simsek, U. (2017). Social studies teachers‟ and teacher candidates” perceptions on prompt feedback and communicate high expectations. PEOPLE: International Journal of Social Sciences, 3 (1), 332, 345.
  • Singh, S. P., Malik, S., & Singh, P. (2016). Factors affecting academic performance of students. Paripex-Indian Journal of Research, 5 (4), 176–178.
  • Shehzadi, S., Nisar, Q. A., Hussain, M. S., Basheer, M. F., Hameed, W. U., & Chaudhry, N. I. (2020). The role of digital learning toward students' satisfaction and university brand image at educational institutes of Pakistan: a post-effect of COVID-19. Asian Education and Development Studies, 10 (2), 276–294.
  • Tawafak RM, Romli AB, Alsinani M. E-learning system of UCOM for improving student assessment feedback in Oman higher education. Education and Information Technologies. 2019; 24 (2):1311–1335. doi: 10.1007/s10639-018-9833-0. [ CrossRef ] [ Google Scholar ]
  • UNESCO (2020). United nations educational, scientific and cultural organization. COVID19 educational disruption and response. UNESCO, Paris, France.  https://en.unesco.org/themes/education-emergencies/coronavirus-school-closures . Accessed 17 Nov 2020.
  • Urdan, T. (1997). Achievement goal theory: Past results, future directions. Advances in Motivation and Achievement, 10 , 99–141.
  • Wilson KL, Lizzio A, Ramsden P. The development, validation and application of the course experience questionnaire. Studies in Higher Education. 1997; 22 (1):33–53. doi: 10.1080/03075079712331381121. [ CrossRef ] [ Google Scholar ]
  • Wooldridge, B. (1995). Increasing the effectiveness of university/college instruction: Integrating the results of learning style research into course design and delivery. In R. R. Simms and S. J. Simms (Eds.), the Importance of Learning Styles. Westport, CT: Greenwood Press, 49–67.
  • World Health Organization (2019). https://www.who.int/health-topics/coronavirus#tab=tab_1 , Retrieved 29 March 2020.
  • Wright CR. Criteria for evaluating the quality of online courses. Alberta distance Educ. Training Assoc. 2003; 16 (2):185–200. [ Google Scholar ]
  • Yen SC, Lo Y, Lee A, Enriquez J. Learning online, offline, and in-between: Comparing student academic outcomes and course satisfaction in face-to-face, online, and blended teaching modalities. Education and Information Technologies. 2018; 23 (5):2141–2153. doi: 10.1007/s10639-018-9707-5. [ CrossRef ] [ Google Scholar ]
  • Yin H, Wang W. Assessing and improving the quality of undergraduate teaching in China: The course experience questionnaire. Assessment & Evaluation in Higher Education. 2015; 40 (8):1032–1049. doi: 10.1080/02602938.2014.963837. [ CrossRef ] [ Google Scholar ]
  • Yorke M. Formative assessment in higher education: Moves towards theory and the enhancement of pedagogic practice. Higher Education. 2003; 45 (4):477–501. doi: 10.1023/A:1023967026413. [ CrossRef ] [ Google Scholar ]
  • Yu F, Du L, Ojcius DM, Pan C, Jiang S. Measures for diagnosing and treating infections by a novel coronavirus responsible for a pneumonia outbreak originating in Wuhan, China. Microbes and Infection. 2020; 22 (2):74–79. doi: 10.1016/j.micinf.2020.01.003. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Yunusa AA, Umar IN. A scoping review of critical predictive factors (CPFs) of satisfaction and perceived learning outcomes in E-learning environments. Education and Information Technologies. 2021; 26 (1):1223–1270. doi: 10.1007/s10639-020-10286-1. [ CrossRef ] [ Google Scholar ]
  • Zeithaml VA. Consumer perceptions of price, quality, and value: A means-end model and synthesis of evidence. Journal of Marketing. 1988; 52 (3):2–22. doi: 10.1177/002224298805200302. [ CrossRef ] [ Google Scholar ]
  • Zhang L, Han Z, Gao Q. Empirical study on the student satisfaction index in higher education. International Journal of Business and Management. 2008; 3 (9):46–51. [ Google Scholar ]

EED 565 Transforming Engineering Education for Future Researchers

3 Credit Hours

This 3-credit-hour engineering education course welcomes graduate students from the College of Engineering, the College of Education, and other social science and STEM disciplines. Students will learn how to develop robust mentoring relationships in research environments. Built on the principles of learning communities and functioning as a community of practice, the course aims not only to develop research skills and apply them to the identification and conduct of research, but also to foster excellence in mentorship.

You will develop the technical skills needed for mentorship, such as designing research questions, documenting and presenting research findings, and writing research proposals, as well as essential mentoring practices: how to serve as a leader in your research community and how to develop your personal mentorship style.

Prerequisite

BS in Engineering, Education or Sciences.

Course Objectives

By the end of the course, students will be able to:

  • Describe the attributes of effective mentoring and research for graduate students in engineering education
  • Identify and review engineering education scholarship while understanding ethical considerations in research
  • Identify effective mentoring practices in engineering
  • Identify the skills and knowledge that engineering students need in order to prepare for successful research careers in engineering education
  • Identify and describe effective mentoring across diverse and inclusive environments
  • Develop and present a research proposal in engineering

Course Requirements

Research paper, proposal, presentation and project.

Course Outline

  • Introduction to Research and Mentoring in Engineering Education
  • Effective Mentoring Practices
  • Diversity in STEM
  • Reading and Writing Scholarship in Engineering Education
  • Research Ethics
  • Developing Research Identity and Professional Development Skills

A list of engineering education resources will be recommended in the syllabus. Journal articles will also be provided through NCSU libraries.

Created: 4/18/2024


A Unified General Education Pathway


"...the transfer process is still unnecessarily complex, confusing and difficult for the majority of students to navigate." — Assembly Bill 928, The Student Transfer Achievement Reform (STAR) Act 2021

More than 50% of CSU students are transfer students, arriving primarily from the California Community Colleges system. In an effort to simplify their pathway to a four-year degree, the Student Transfer Achievement Reform Act (AB 928) creates a singular, lower-division General Education (GE) pattern for both California State University and University of California transfer admissions. This pattern, called Cal-GETC, was approved by all three higher education intersegmental partners via the Intersegmental Committee of Academic Senates in spring 2023. When Cal-GETC is implemented in fall 2025, it will become the only transfer GE pattern offered by California community colleges.

The STAR Act is meant to support student success and equity, helping to ease access, simplify advisement across segments, eliminate barriers and carve a clear path to a four-year degree across California's educational segments.

Recognizing that a growing share of first-time, first-year students arrive at the CSU with college credit (60% of CSU first-year applicants have earned college credit), the Chancellor's Office has recommended a unified pathway. Historically, the CSU has had one unified GE pattern for all students—CSU GE Breadth. Changes to Title 5 California Code of Regulations ensure the CSU continues to provide one unified GE pattern whether students enroll as first-time, first-year students or transfer students.

GE Informational Webinar, April 15, 2024

An informational webinar, hosted by Interim Associate Vice Chancellor of Academic and Faculty Programs Laura Massa and Assistant Vice Chancellor and State University Dean Brent Foster, was held on Monday, April 15, 2024. Questions posed in this webinar will be posted shortly.

On March 27, 2024, the CSU Board of Trustees approved proposed changes to Title 5 CSU General Education that modify CSU GE Breadth to mirror the Cal-GETC pattern and units.

The Chancellor’s Office will support campuses and faculty through the implementation process, including resources to support faculty release, written guidance and stipends for faculty effort during off-contract periods. Each campus will determine the application of units that are not included in Cal-GETC.

Changes to CSU General Education

The update to CSU GE removes five units from the GE pattern. It does this by:

  • Including a one-unit laboratory for Biological or Physical Sciences
  • Not including one of three Arts or Humanities courses (in Area C)
  • Not including Area E, Lifelong Learning and Self-Development

The five units removed from GE will be returned to campuses to determine how to utilize.

About the Student Transfer Achievement Reform Act

Authored by Assemblymember Marc Berman and approved in 2021, Assembly Bill 928 consolidates two existing general education pathways for California Community College students into a single pathway to either the CSU or UC system. It also requires that community colleges place incoming students on an Associate Degree for Transfer (ADT) pathway, if one exists for their major, on or before August 1, 2024.

Key Terms and Definitions

What is Cal-GETC? Cal-GETC is a new GE pattern that will be implemented in fall 2025. As a result of its implementation, California Community Colleges will no longer offer the current CSU GE Breadth and Intersegmental General Education Transfer Curriculum (IGETC) patterns.

What is IGETC? The Intersegmental General Education Transfer Curriculum, or IGETC, is designed for the community college student who wants to be eligible to transfer to either the CSU or the UC systems. 

What is CSU GE Breadth? CSU GE Breadth is the current General Education pattern for all CSU students whether they are first-time first-year students or transfer students. Following the approval of the CSU Board of Trustees on March 27, 2024, starting in fall 2025 CSU GE will mirror Cal-GETC in areas and units.

What is an ADT? The Associate Degree for Transfer (ADT) guarantees priority admission to the CSU, though not necessarily to a particular campus or major, for California Community College students who meet the CSU's minimum eligibility requirements. Students earn a two-year associate degree (no more than 60 units) that is fully transferable toward a CSU bachelor's degree.

Additional Resources

GE Informational Seminar May 2023

AB 928 Bill Text

ADT Intersegmental Implementation Committee

The Intersegmental Committee of the Academic Senates (ICAS)

Frequently Asked Questions


Python Basics for Online Research Specialization on Coursera

Created by UC Davis Continuing and Professional Education and hosted on the Coursera platform, this online course consists of pre-recorded video lectures, auto-graded and peer-reviewed assignments, and community discussion forums. The specialization is self-paced and designed to help you master a specific career skill in as little as 4-6 months.

If you do not yet code and want to learn, this Specialization aims to soften Python's learning curve. It has four main objectives:

  • To inspire you to code
  • To help you think in code
  • To teach you technical concepts to code
  • To give you useful examples of things to do in code

Learning to code has a steep learning curve, which is why this Specialization emphasizes motivation. You have to want to learn to code and stick with it through multiple learning activities and your own experimentation, research, and practice. This Specialization alone will not teach you to code. It will, however, get you started with a mindset for coding, an understanding of Python's technical concepts, and an appreciation of what can be done with Python to access and interact with data on the Internet. These skills are increasingly essential for researchers.

The wealth of data now available to researchers who can use Python and other tools to access it is transforming academic disciplines, including the social sciences. But there is a gap between the questions about human nature that we know Internet data can cast light on and the raw, messy reality of code and data. Each course in this Specialization includes code demonstrations that you run yourself, showing how Python can bridge that gap and help us discover things about ourselves, our friends, each other, and society as we interact with the Internet in code.
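To give a flavour of what "accessing and interacting with data on the Internet" involves, here is a minimal, self-contained sketch using only the Python standard library. The API endpoint, query parameters, and JSON payload are invented for illustration; they are not part of the course material:

```python
import json
from urllib.parse import urlencode, urlunparse

# Build a query URL for a hypothetical search API
# (the host and parameters are illustrative only).
params = {"q": "online education", "format": "json", "page": 1}
url = urlunparse(("https", "api.example.org", "/search", "", urlencode(params), ""))
print(url)  # https://api.example.org/search?q=online+education&format=json&page=1

# Parse the kind of JSON payload such an API might return.
payload = '{"results": [{"title": "Online education in the post-COVID era", "year": 2021}]}'
data = json.loads(payload)
for item in data["results"]:
    print(item["year"], item["title"])
```

Building URLs with `urlencode` rather than string concatenation handles the escaping of spaces and special characters automatically, which is one of the first practical habits a course like this instils.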

Instruction Method Online class

Section Notes

Enrollments are accepted on a continuous basis. Complete the course at your own pace. This Specialization consists of four courses on the Coursera platform:

  • Python Basics: Interacting with the Internet
  • Python Basics: Retrieving Online Data
  • Python Basics: Automation and Bots
  • Python Basics: Problem Solving with Code

For more information on Coursera online courses, including enrollment policies and technical requirements, please visit https://coursera.org/about/terms.


Enhancing online participation in education: quarter century of research

  • Published: 24 August 2022
  • Volume 10, pages 663–687 (2023)

Cite this article

  • Andrina Granić   ORCID: orcid.org/0000-0002-4266-3406 1  

490 Accesses

3 Citations

1 Altmetric


Online participation in education is not a new research topic, but the field still lacks a succinct account of approaches for enhancing online participation. This paper presents a systematic review of representative literature on online participation in education. All subscribed resources in Web of Science were systematically searched with no time-frame limit, and a total of 30 relevant studies published between 1996 and 2020 were selected. The proposed classification of approaches that demonstrably enhance and encourage online participation is based on the empirical results and evidence reported in each analysed study. The key findings revealed that communication support (such as discussion forums, blogs, and chats) and teaching strategies (such as flipped learning, personalized communication, and gamification) were the predominant approaches shown to enhance online participation. Facilitative strategies and tools (such as online peer voting, wikis, and blogs), along with other support mechanisms (such as social media and group awareness tools), were also regarded as valuable means of encouraging participation. Furthermore, in the great majority of the analysed research, university students were the participants (in terms of data collection), while half of the studies relied on a Learning Management System as the supporting technology. The review also exposed significant empirical evidence supporting the argument that students' learning performance can be enhanced by improving online participation. Further research directions and potential challenges are offered as well.
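The selection step the abstract describes, a systematic search filtered down to 30 studies published between 1996 and 2020, can be illustrated with a toy screening function. The records and inclusion keyword below are invented for illustration; they are not the paper's actual dataset or criteria:

```python
# Toy screening step for a systematic review: keep candidate records that
# fall inside the review's time frame and match the topic keyword.
records = [
    {"title": "Online participation and student grades", "year": 2005},
    {"title": "Gamification of e-participation", "year": 2020},
    {"title": "Classroom seating arrangements", "year": 2010},
    {"title": "Early hypertext systems", "year": 1991},
]

def include(record, start=1996, end=2020, keyword="participation"):
    """Apply the (invented) inclusion criteria to one record."""
    return start <= record["year"] <= end and keyword in record["title"].lower()

selected = [r for r in records if include(r)]
print(len(selected))  # 2 of the 4 candidates pass screening
```

In a real review this filter would be only the first pass; the 30 studies in the paper were then analysed and classified by hand on the basis of their reported empirical results.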



Data availability

The data of the systematic review consist of articles published in journals and conference proceedings. Many of these are freely available online; others can be accessed for a fee or through a subscription.


Author information

Authors and Affiliations

Department of Computer Science, Faculty of Science, University of Split, Rudera Boskovica 33, 21000, Split, Croatia

Andrina Granić


Corresponding author

Correspondence to Andrina Granić .

Ethics declarations

Conflict of interest

The author declares no conflict of interest.

Ethical approval

No ethics review was required to undertake this literature review.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article

Granić, A. Enhancing online participation in education: quarter century of research. J. Comput. Educ. 10, 663–687 (2023). https://doi.org/10.1007/s40692-022-00238-8

Download citation

Received: 12 November 2021

Revised: 20 May 2022

Accepted: 04 August 2022

Published: 24 August 2022

Issue Date: December 2023

DOI: https://doi.org/10.1007/s40692-022-00238-8


  • Online participation
  • Classification of approaches
  • Systematic review


Partisan divides over K-12 education in 8 charts

Proponents and opponents of teaching critical race theory attend a school board meeting in Yorba Linda, California, in November 2021. (Robert Gauthier/Los Angeles Times via Getty Images)

K-12 education is shaping up to be a key issue in the 2024 election cycle. Several prominent Republican leaders, including GOP presidential candidates, have sought to limit discussion of gender identity and race in schools , while the Biden administration has called for expanded protections for transgender students . The coronavirus pandemic also brought out partisan divides on many issues related to K-12 schools .

Today, the public is sharply divided along partisan lines on topics ranging from what should be taught in schools to how much influence parents should have over the curriculum. Here are eight charts that highlight partisan differences over K-12 education, based on recent surveys by Pew Research Center and external data.

Pew Research Center conducted this analysis to provide a snapshot of partisan divides in K-12 education in the run-up to the 2024 election. The analysis is based on data from various Center surveys and analyses conducted from 2021 to 2023, as well as survey data from Education Next, a research journal about education policy. Links to the methodology and questions for each survey or analysis can be found in the text of this analysis.

Most Democrats say K-12 schools are having a positive effect on the country , but a majority of Republicans say schools are having a negative effect, according to a Pew Research Center survey from October 2022. About seven-in-ten Democrats and Democratic-leaning independents (72%) said K-12 public schools were having a positive effect on the way things were going in the United States. About six-in-ten Republicans and GOP leaners (61%) said K-12 schools were having a negative effect.

A bar chart that shows a majority of Republicans said K-12 schools were having a negative effect on the U.S. in 2022.

About six-in-ten Democrats (62%) have a favorable opinion of the U.S. Department of Education , while a similar share of Republicans (65%) see it negatively, according to a March 2023 survey by the Center. Democrats and Republicans were more divided over the Department of Education than most of the other 15 federal departments and agencies the Center asked about.

A bar chart that shows wide partisan differences in views of most federal agencies, including the Department of Education.

In May 2023, after the survey was conducted, Republican lawmakers scrutinized the Department of Education’s priorities during a House Committee on Education and the Workforce hearing. The lawmakers pressed U.S. Secretary of Education Miguel Cardona on topics including transgender students’ participation in sports and how race-related concepts are taught in schools, while Democratic lawmakers focused on school shootings.

Partisan opinions of K-12 principals have become more divided. In a December 2021 Center survey, about three-quarters of Democrats (76%) expressed a great deal or fair amount of confidence in K-12 principals to act in the best interests of the public. A much smaller share of Republicans (52%) said the same. And nearly half of Republicans (47%) had not too much or no confidence at all in principals, compared with about a quarter of Democrats (24%).

A line chart showing that confidence in K-12 principals in 2021 was lower than before the pandemic — especially among Republicans.

This divide grew between April 2020 and December 2021. While confidence in K-12 principals declined significantly among people in both parties during that span, it fell by 27 percentage points among Republicans, compared with an 11-point decline among Democrats.

Democrats are much more likely than Republicans to say teachers’ unions are having a positive effect on schools. In a May 2022 survey by Education Next , 60% of Democrats said this, compared with 22% of Republicans. Meanwhile, 53% of Republicans and 17% of Democrats said that teachers’ unions were having a negative effect on schools. (In this survey, too, Democrats and Republicans include independents who lean toward each party.)

A line chart showing that, from 2013 to 2022, Republicans' and Democrats' views of teachers' unions grew further apart.

The 38-point difference between Democrats and Republicans on this question was the widest since Education Next first asked it in 2013. However, the gap has exceeded 30 points in four of the last five years for which data is available.
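The partisan gaps quoted throughout this analysis are simple percentage-point differences between the two parties' shares. A quick sketch, using figures from the surveys cited above:

```python
def pp_gap(share_a: int, share_b: int) -> int:
    """Percentage-point gap between two parties' shares on the same question."""
    return abs(share_a - share_b)

# Teachers' unions having a positive effect (Education Next, May 2022):
# 60% of Democrats vs. 22% of Republicans.
print(pp_gap(60, 22))  # 38

# Federal government has too much influence on local schools (fall 2022):
# 52% of Republican parents vs. 20% of Democratic parents.
print(pp_gap(52, 20))  # 32
```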

Republican and Democratic parents differ over how much influence they think governments, school boards and others should have on what K-12 schools teach. About half of Republican parents of K-12 students (52%) said in a fall 2022 Center survey that the federal government has too much influence on what their local public schools are teaching, compared with two-in-ten Democratic parents. Republican K-12 parents were also significantly more likely than their Democratic counterparts to say their state government (41% vs. 28%) and their local school board (30% vs. 17%) have too much influence.

A bar chart showing that Republican and Democratic parents have different views of the influence government, school boards, parents and teachers have on what schools teach.

On the other hand, more than four-in-ten Republican parents (44%) said parents themselves don’t have enough influence on what their local K-12 schools teach, compared with roughly a quarter of Democratic parents (23%). A larger share of Democratic parents – about a third (35%) – said teachers don’t have enough influence on what their local schools teach, compared with a quarter of Republican parents who held this view.

Republican and Democratic parents don’t agree on what their children should learn in school about certain topics. Take slavery, for example: While about nine-in-ten parents of K-12 students overall agreed in the fall 2022 survey that their children should learn about it in school, they differed by party over the specifics. About two-thirds of Republican K-12 parents said they would prefer that their children learn that slavery is part of American history but does not affect the position of Black people in American society today. On the other hand, 70% of Democratic parents said they would prefer for their children to learn that the legacy of slavery still affects the position of Black people in American society today.

A bar chart showing that, in 2022, Republican and Democratic parents had different views of what their children should learn about certain topics in school.

Parents are also divided along partisan lines on the topics of gender identity, sex education and America’s position relative to other countries. Notably, 46% of Republican K-12 parents said their children should not learn about gender identity at all in school, compared with 28% of Democratic parents. Those shares were much larger than the shares of Republican and Democratic parents who said that their children should not learn about the other two topics in school.

Many Republican parents see a place for religion in public schools , whereas a majority of Democratic parents do not. About six-in-ten Republican parents of K-12 students (59%) said in the same survey that public school teachers should be allowed to lead students in Christian prayers, including 29% who said this should be the case even if prayers from other religions are not offered. In contrast, 63% of Democratic parents said that public school teachers should not be allowed to lead students in any type of prayers.

Bar charts that show nearly six-in-ten Republican parents, but fewer Democratic parents, said in 2022 that public school teachers should be allowed to lead students in prayer.

In June 2022, before the Center conducted the survey, the Supreme Court ruled in favor of a football coach at a public high school who had prayed with players at midfield after games. More recently, Texas lawmakers introduced several bills in the 2023 legislative session that would expand the role of religion in K-12 public schools in the state. Those proposals included a bill that would require the Ten Commandments to be displayed in every classroom, a bill that would allow schools to replace guidance counselors with chaplains, and a bill that would allow districts to mandate time during the school day for staff and students to pray and study religious materials.

Mentions of diversity, social-emotional learning and related topics in school mission statements are more common in Democratic areas than in Republican areas. K-12 mission statements from public schools in areas where the majority of residents voted Democratic in the 2020 general election are at least twice as likely as those in Republican-voting areas to include the words “diversity,” “equity” or “inclusion,” according to an April 2023 Pew Research Center analysis.

A dot plot showing that public school district mission statements in Democratic-voting areas mention some terms more than those in areas that voted Republican in 2020.

Also, about a third of mission statements in Democratic-voting areas (34%) use the word “social,” compared with a quarter of those in Republican-voting areas, and a similar gap exists for the word “emotional.” Like diversity, equity and inclusion, social-emotional learning is a contentious issue between Democrats and Republicans, even though most K-12 parents think it’s important for their children’s schools to teach these skills. Supporters argue that social-emotional learning helps address mental health needs and student well-being, but some critics consider it emotional manipulation and want it banned.

In contrast, there are broad similarities in school mission statements outside of these hot-button topics. Similar shares of mission statements in Democratic and Republican areas mention students’ future readiness, parent and community involvement, and providing a safe and healthy educational environment for students.
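Concretely, the mission-statement comparison reduces to computing, for each term, the share of statements in each group of districts that mention it. Below is a minimal standard-library sketch of that kind of keyword-share count; the mission statements are invented for illustration (the Center's actual analysis used real district mission statements):

```python
import re

def term_share(statements, term):
    """Fraction of statements containing `term` as a whole word (case-insensitive)."""
    pattern = re.compile(r"\b" + re.escape(term) + r"\b", re.IGNORECASE)
    return sum(1 for s in statements if pattern.search(s)) / len(statements)

# Invented mission statements standing in for the two groups of districts.
dem_area = [
    "We value diversity, equity and inclusion for every learner.",
    "Fostering social and emotional growth in a safe environment.",
    "Preparing students for future readiness and community involvement.",
]
rep_area = [
    "Preparing students for future readiness and career success.",
    "A safe and healthy environment with strong community involvement.",
]

for term in ("diversity", "equity", "social", "safe"):
    print(term,
          round(term_share(dem_area, term), 2),
          round(term_share(rep_area, term), 2))
```

The whole-word regex matters: a plain substring search would count "adversity" as a mention of "diversity."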


ABOUT PEW RESEARCH CENTER  Pew Research Center is a nonpartisan fact tank that informs the public about the issues, attitudes and trends shaping the world. It conducts public opinion polling, demographic research, media content analysis and other empirical social science research. Pew Research Center does not take policy positions. It is a subsidiary of  The Pew Charitable Trusts .

Copyright 2024 Pew Research Center


COMMENTS

  1. (PDF) Research on Online Learning

    sense of online learning. The altered learning environments created by web-based technologies not only eliminate barriers of time, space and arguably learning styles, providing increased access ...

  2. A systematic review of research on online teaching and learning from

    1. Introduction. Online learning has been on the increase in the last two decades. In the United States, though higher education enrollment has declined, online learning enrollment in public institutions has continued to increase (Allen & Seaman, 2017), and so has the research on online learning. There have been review studies conducted on specific areas of online learning such as innovations ...

  3. PDF ONLINE LEARNING EXPERIENCES AND SATISFACTION OF STUDENTS ON THE ...

    Online education becomes a viable and stimulating method for instructional and service delivery by providing students with great flexibility in the current situation. ... research objectives. Since all of the respondents are distance learning students, it was forwarded to their official institutional email addresses and ...

  4. Impact of online classes on the satisfaction and performance of

    The aim of the study is to identify the factors affecting students' satisfaction and performance regarding online classes during the pandemic period of COVID-19 and to establish the relationship between these variables. The study is quantitative in nature, and the data were collected through an online survey from 544 respondents who were studying business management (B.B.A. or M.B.A.) or ...

  5. The effects of online education on academic success: A meta ...

    The purpose of this study is to analyze the effect of online education, which has been used extensively since the beginning of the pandemic, on student achievement. In line with this purpose, a meta-analysis of the related studies focusing on the effect of online education on students' academic achievement in several countries between the years 2010 and 2021 was carried out. Furthermore, this ...

  6. Development of a new model on utilizing online learning platforms to

    This research aims to explore and investigate potential factors influencing students' academic achievements and satisfaction with using online learning platforms. This study was constructed based on Transactional Distance Theory (TDT) and Bloom's Taxonomy Theory (BTT). This study was conducted on 243 students using online learning platforms in higher education. This research utilized a ...

  7. Combining the Best of Online and Face-to-Face Learning: Hybrid and

    Blended learning, as defined by Dziuban et al. (2004), is an instructional method that combines the efficiency and socialization opportunities of the traditional face-to-face classroom with the digitally enhanced learning possibilities of the online mode of delivery. Characteristics of this approach include (a) student-centered teaching where each and every student has to be actively involved in ...

  8. Online education in the post-COVID era

    The COVID-19 pandemic has forced the world to engage in the ubiquitous use of virtual learning. And while online and distance learning has been used before to maintain continuity in education ...

  9. Exploring students' learning experience in online education: analysis

    The research presented in this article is from a case study developed at the Open University of Catalonia (UOC), located in Spain. The UOC was founded as an online university and has 25 years' experience in offering online education. The main objective of the research was to explore and understand the academic and personal trajectories of the ...

  10. Integrating students' perspectives about online learning: a hierarchy

    This article reports on a large-scale (n = 987) exploratory factor analysis study incorporating various concepts identified in the literature as critical success factors for online learning from the students' perspective, and then determines their hierarchical significance. Seven factors--Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Online Social ...

  11. Online and face‐to‐face learning: Evidence from students' performance

    1.1. Related literature. Online learning is a form of distance education which mainly involves internet‐based education where courses are offered synchronously (i.e. live sessions online) and/or asynchronously (i.e. students access course materials online in their own time, which is associated with the more traditional distance education).

  12. The effects of online education on academic success: A meta-analysis

    According to the study of Bernard et al. (2004), this meta-analysis focuses on the activities done in online education lectures. As a result of the research, an overall effect size close to zero was found for online education utilizing more than one generation of technology for students at different levels.

  13. How Effective Is Online Learning? What the Research ...

    The use of virtual courses among K-12 students has grown rapidly in recent years. Florida, for example, requires all high school students to take at least one online course.

  14. Frontiers

    The mean grade for men in the environmental online classes (M = 3.23, N = 246, SD = 1.19) was higher than the mean grade for women in the classes (M = 2.9, N = 302, SD = 1.20) (see Table 1). First, a chi-square analysis was performed using SPSS to determine if there was a statistically significant difference in grade distribution between online and F2F students.

  15. PDF Online Vs. Face-to-Face: A Comparison of Student Outcomes with ...

    Online educational opportunities have blossomed as parents, students, college and university administrators and state and federal legislatures try to grapple with the problem of increasing education costs. The potential advantages of offering courses online are numerous: There is a perception that online classes are a more cost-

  16. Online learning during COVID-19 produced ...

    Research across disciplines has demonstrated that well-designed online learning can lead to students' enhanced motivation, satisfaction, and learning [1,2,3,4,5,6,7]. A report by the U.S. Department of Education, based on examinations of comparative studies of online and face-to-face versions of the same course from 1996 to 2008, concluded that online learning could produce learning ...

  17. A Survey on the Effectiveness of Online Teaching-Learning ...

    Online teaching-learning methods have been followed by world-class universities for more than a decade to cater to the needs of students who stay far away from universities/colleges. But during the COVID-19 pandemic period, online teaching-learning helped almost all universities, colleges, and affiliated students. An attempt is made to find the effectiveness of online teaching-learning ...

  18. Using Artificial Intelligence Tools in K-12 Classrooms

    The research described in this report was funded by the Bill & Melinda Gates Foundation and conducted by RAND Education and Labor. This report is part of the RAND research report series. RAND reports present research findings and objective analysis that address the challenges facing the public and private sectors.

  19. Impact of online classes on the satisfaction and performance of

    The current research is limited to theory classes only; therefore, it can be extended to check students' performance in practical classes. The study covers Indian students only; thus, collecting data from various countries could give better comparative results for understanding students' perspectives.

  20. EED 565 Transforming Engineering Education for Future Researchers

    BS in Engineering, Education or Sciences. Course Objectives. By the end of the course students will be able to: Describe the attributes of effective mentoring and research for graduate students in engineering education; Identify and review engineering education scholarship while understanding ethical considerations in research, identify ...

  22. A Unified General Education Pathway

    More than 50% of CSU students are transfer students, arriving primarily from the California Community Colleges system. In an effort to simplify their pathway to a four-year degree, the Student Transfer Achievement Reform Act (AB 928) creates a singular, lower-division General Education (GE) pattern for both California State University and University of California transfer admissions.

  23. Python Basics for Online Research Specialization on Coursera

    Python Basics for Online Research Specialization on Coursera Online. Created by UC Davis Continuing and Professional Education and hosted on the Coursera platform, this is an online course consisting of pre-recorded video lectures, auto-graded and peer-reviewed assignments and community discussion forums. ... It has four main objectives: To ...

  24. Key facts about US students with disabilities, for Disability Pride

    July is both Disability Pride Month and the anniversary of the Americans with Disabilities Act. To mark these occasions, Pew Research Center used federal education data from the National Center for Education Statistics to learn more about students who receive special education services in U.S. public schools. In this analysis, students with disabilities include those ages 3 to 21 who are ...

  25. (PDF) A comparative study on effectiveness of online and offline

    The objective of the study is to assess the effectiveness of online and offline learning in higher education. The sudden outbreak of Covid-19 in various parts of the world in 2020 has severely ...

  26. Enhancing online participation in education: quarter century of research

    Online participation in education is not a new research topic, but the field still lacks a succinct account of approaches for enhancing online participation. This paper presents a systematic review of representative literature dealing with online participation in education. All subscribed resources in Web of Science were systematically searched with no time frame limit, and a total of 30 ...

  27. How Democrats, Republicans differ over K-12 education

    Pew Research Center conducted this analysis to provide a snapshot of partisan divides in K-12 education in the run-up to the 2024 election. The analysis is based on data from various Center surveys and analyses conducted from 2021 to 2023, as well as survey data from Education Next, a research journal about education policy.

  28. World Online Ranking of Best Chemistry Scientists

    On April 19, 2024, Research.com released its third annual report on the top chemists. This study provides a ranking of researchers and academics making major contributions to progress in this field. Our objective in presenting this study is to raise the profile of leading chemistry specialists and assist researchers...
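Several of the entries above name standard quantitative tools: the Frontiers entry describes a chi-square test on grade distributions, and the meta-analysis entries summarize standardized effect sizes "close to zero." Both statistics are simple to compute; the sketch below uses only the standard library, and every number in it is invented for illustration rather than taken from any cited study:

```python
import math

def chi_square(table):
    """Pearson chi-square statistic for a 2D contingency table of counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference (Cohen's d) using the pooled standard deviation."""
    pooled_var = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(pooled_var)

# Invented 2x2 table: rows = online / face-to-face, columns = pass / fail.
stat = chi_square([[180, 66], [230, 72]])
print(round(stat, 3), stat > 3.841)  # 3.841 is the 0.05 critical value at 1 df

# Invented group summaries: online vs face-to-face exam scores.
print(round(cohens_d(74.0, 10.0, 120, 73.5, 9.5, 115), 3))
```

An effect size near zero, as the meta-analysis entries report, means the group means differ by only a small fraction of their pooled standard deviation; likewise a chi-square statistic below the critical value means the pass/fail split does not differ detectably between delivery modes.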