SYSTEMATIC REVIEW article

A systematic review of the effectiveness of online learning in higher education during the COVID-19 pandemic period

Wentao Meng

  • 1 Department of Basic Education, Beihai Campus, Guilin University of Electronic Technology Beihai, Beihai, Guangxi, China
  • 2 School of Sports and Arts, Harbin Sport University, Harbin, Heilongjiang, China
  • 3 School of Music, Harbin Normal University, Harbin, Heilongjiang, China
  • 4 School of General Education, Beihai Vocational College, Beihai, Guangxi, China
  • 5 School of Economics and Management, Beihai Campus, Guilin University of Electronic Technology, Guilin, Guangxi, China

Background: The effectiveness of online learning in higher education during the COVID-19 pandemic period is a debated topic, but a systematic review on this topic has been absent.

Methods: The present study implemented a systematic review of 25 selected articles to comprehensively evaluate online learning effectiveness during the pandemic period and identify factors that influence such effectiveness.

Results: It was concluded that past studies failed to reach a consensus over online learning effectiveness and that research results are largely shaped by how learning effectiveness was assessed, e.g., self-reported online learning effectiveness, longitudinal comparison, and randomized controlled trials (RCTs). Meanwhile, a set of factors that positively or negatively influence the effectiveness of online learning were identified, including infrastructure factors, instructional factors, the lack of social interaction, negative emotions, flexibility, and convenience.

Discussion: Although the effectiveness of online learning during the pandemic period is debated, it is generally believed that the pandemic brought many challenges and difficulties to higher education and that these challenges and difficulties were more prominent in developing countries. In addition, this review critically assesses limitations in past research, develops pedagogical implications, and proposes recommendations for future research.

1 Introduction

1.1 Research background

The COVID-19 pandemic, which first broke out in early 2020, has considerably reshaped the higher education landscape globally. To restrain viral transmission, universities around the world locked down, and teaching and learning activities were transferred to online platforms. Although online learning is a relatively mature learning model and is increasingly integrated into higher education, the sudden and unprepared transition to wholly online learning caused by the pandemic posed formidable challenges to higher education stakeholders, e.g., policymakers, instructors, and students, especially at the early stage of the pandemic (García-Morales et al., 2021; Grafton-Clarke et al., 2022). Correspondingly, the effectiveness of online learning during the pandemic period is still questionable, as online learning during this period had some unique characteristics, e.g., the lack of preparation, the sudden and unprepared transition, the huge scale of implementation, and social distancing policies (Sharma et al., 2020; Rahman, 2021; Tsang et al., 2021; Hollister et al., 2022; Zhang and Chen, 2023). This question is more prominent in developing or underdeveloped countries because of insufficient Internet access, network problems, the lack of electronic devices, and poor network infrastructure (Adnan and Anwar, 2020; Muthuprasad et al., 2021; Rahman, 2021; Chandrasiri and Weerakoon, 2022).

Learning effectiveness is a key consideration in education as it reflects the extent to which learning and teaching objectives are achieved and learners' needs are satisfied (Joy and Garcia, 2000; Swan, 2003). Prior to the pandemic, online learning had generally been proven effective in higher education contexts (Kebritchi et al., 2017). ICTs have fundamentally reshaped the process of learning as they allow learners to learn anywhere and anytime, interact with others efficiently and conveniently, and freely acquire a large volume of learning materials online (Kebritchi et al., 2017; Choudhury and Pattnaik, 2020). Such benefits, however, may be offset by the challenges brought about by the pandemic. Many empirical studies worldwide have investigated the effectiveness of online learning, but there is currently no systematic review of these studies that comprehensively evaluates online learning effectiveness and identifies the factors that influence it.

At present, although the vast majority of countries have implemented reopening policies to deal with the pandemic and higher education institutes have resumed offline teaching and learning, assessing the effectiveness of online learning during the pandemic period via a systematic review is still essential. First, it is necessary to summarize, learn from, and reflect on the lessons and experiences of online learning practices during the pandemic period to offer implications for future practices and research. Second, the review of online learning research carried out during the pandemic period is likely to generate interesting knowledge because of the unique research context. Third, higher education institutes still need a contingency plan for emergency online learning to deal with potential crises in the future, e.g., wars, pandemics, and natural disasters. A systematic review of research on the effectiveness of online learning during the pandemic period offers valuable knowledge for designing such a contingency plan.

1.2 Related concepts

1.2.1 Online learning

Online learning should not be simply understood as learning on the Internet or the integration of ICTs with learning, because it is a systematic framework consisting of a set of pedagogies, technologies, implementations, and processes (Kebritchi et al., 2017; Choudhury and Pattnaik, 2020). Choudhury and Pattnaik (2020, p. 2) summarized prior definitions of online learning and provided a comprehensive and up-to-date definition, i.e., online learning refers to "the transfer of knowledge and skills, in a well-designed course content that has established accreditations, through an electronic media like the Internet, Web 4.0, intranets and extranets." Online learning differs from traditional learning not only because of technological differences, but also because of differences in social development and pedagogies (Camargo et al., 2020). Online learning has also considerably reshaped the patterns by which knowledge is stored, shared, and transferred and skills are practiced, as well as the way stakeholders (e.g., teachers and students) interact (Desai et al., 2008; Anderson and Hajhashemi, 2013). In addition, online learning has altered educational objectives and learning requirements. Memorizing knowledge was traditionally viewed as vital to learning, but it is now less important since required knowledge can be conveniently searched for and acquired on the Internet, while the reflection on and application of knowledge has become more important (Gamage et al., 2023). Online learning also demands learners' self-regulated learning ability more than traditional learning does, because the online learning environment imposes less external regulation and provides more autonomy and flexibility (Barnard-Brak et al., 2010; Wong et al., 2019). The above differences imply that traditional pedagogies may not apply to online learning.

There are a variety of online learning models according to differences in learning methods, processes, outcomes, and the application of technologies (Zeitoun, 2008). As ICTs can be used either as the foundation of learning or as an auxiliary means, online learning can be classified into assistant, blended, and wholly online models. Here, assistant online learning refers to the scenario where online learning technologies are used to supplement and support traditional learning; blended online learning refers to the integration or mixture of online and offline methods; and wholly online learning refers to the exclusive use of the Internet for learning (Arkorful and Abaidoo, 2015). The present review focuses on wholly online learning because it is interested in the COVID-19 pandemic context, where learning activities were fully switched to online platforms.

1.2.2 Learning effectiveness

Learning effectiveness can be broadly defined as the extent to which learning and teaching objectives have been effectively and efficiently achieved via educational activities (Swan, 2003) or the extent to which learners' needs are satisfied by learning activities (Joy and Garcia, 2000). It is a multi-dimensional construct because learning objectives and needs are highly diverse (Joy and Garcia, 2000; Swan, 2003). Assessing learning effectiveness is a key challenge in educational research, and researchers generally use a set of subjective and objective indicators, e.g., examination scores, assignment performance, perceived effectiveness, student satisfaction, learning motivation, engagement in learning, and learning experience (Rajaram and Collins, 2013; Noesgaard and Ørngreen, 2015). Prior research on the effectiveness of online learning has examined diverse learning outcomes, e.g., satisfaction, perceived effectiveness, motivation, and learning engagement, and there is no consensus over which outcomes are valid indicators of learning effectiveness. The present study adopts a broad definition of learning effectiveness and considers various learning outcomes that are closely associated with learning objectives and needs.

1.3 Previous review research

Up to now, online learning during the COVID-19 pandemic period has attracted considerable attention from academia, and there is a substantial body of related review research. Some reviews analyzed the trends and major topics in related research. Pratama et al. (2020) tracked the trend of using online meeting applications in online learning during the pandemic period based on a systematic review of 12 articles. It was reported that the use of these applications showed a rising trend and helped promote learning and teaching processes. However, this review was descriptive and failed to identify the problems and limitations of these applications. Zhang et al. (2022) implemented a bibliometric review to provide a holistic view of research on online learning in higher education during the COVID-19 pandemic period. They concluded that the majority of research focused on identifying the use of strategies and technologies, psychological impacts brought by the pandemic, and student perceptions. Meanwhile, collaborative learning, hands-on learning, discovery learning, and inquiry-based learning were the most frequently discussed instructional approaches. In addition, chemical and medical education were found to be the most investigated disciplines. This review hence offered a relatively comprehensive landscape of related research in the field. However, since it was a bibliometric review, it merely analyzed the surface characteristics of past articles without a detailed analysis of their research contributions. Bughrara et al. (2023) categorized the major research topics in the field of online medical education during the pandemic period via a scoping review. A total of 174 articles were included in the review, and seven major topics were identified: students' mental health, stigma, student vaccination, use of telehealth, students' physical health, online modifications and educational adaptations, and students' attitudes and knowledge. Overall, the review comprehensively reveals the major topics in the focused field.

Some scholars believed that online learning during the pandemic period brought about many problems, while both students and teachers encountered many challenges. García-Morales et al. (2021) implemented a systematic review to identify the challenges encountered by higher education in an online learning scenario during the pandemic period. A total of seven studies were included, and it was found that higher education suddenly transferred to online learning and that many technologies and platforms were used to support online learning. However, this transition was hasty and forced by the extreme situation. Thus, various stakeholders in learning and teaching (e.g., students, universities, and teachers) encountered difficulties in adapting to this sudden change. To deal with these challenges, universities need to utilize the potential of technologies, improve the learning experience, and meet students' expectations. The major limitation of García-Morales et al.'s (2021) review is its small sample. Meanwhile, García-Morales et al. (2021) also failed to systematically categorize the various types of challenges. Stojan et al. (2022) investigated the changes to medical education brought about by the shift to online learning in the COVID-19 pandemic context, as well as the lessons and impacts of these changes, via a systematic review. A total of 56 articles were included in the analysis, and it was reported that small groups and didactics were the most prevalent instructional methods. Although learning engagement was often interactive, teachers mainly integrated technologies to amplify and replace, rather than transform, learning. Based on this, they argued that the use of asynchronous and synchronous formats promoted online learning engagement and offered self-directed and flexible learning. The major limitation of this review is that the article is somewhat descriptive and lacks a critical evaluation of the problems of online learning.

Review research has also focused on the changes and impacts brought by online learning during the pandemic period. Camargo et al. (2020) implemented a meta-analysis of seven empirical studies regarding online learning methods during the pandemic period to evaluate feasible online learning platforms, effective online learning models, and the optimal duration of online lectures, as well as the perceptions of teachers and students in the online learning process. Overall, it was concluded that the shift from offline to online learning is feasible and that effective online learning needs a well-trained and integrated team to identify students' and teachers' needs, respond in a timely manner, and support them via digital tools. In addition, the pandemic brought varying degrees of difficulty to online learning. An obvious limitation of this review is its overly small sample (N = 7), which offers very limited information while the review attempts to answer as many as four questions. Grafton-Clarke et al. (2022) investigated the innovations/adaptations implemented, their impacts, and the reasons for their selection in the shift to online learning in medical education during the pandemic period via a systematic review of 55 articles. The major adaptations implemented include the rapid shift to the virtual space, pre-recorded videos or live streaming of surgical procedures, remote adaptations for clinical visits, and multidisciplinary ward rounds and team meetings. Major challenges encountered by students and teachers include the need for technical resources, faculty time, and devices, the shortage of standardized telemedicine curricula, and the lack of personal interactions. Based on this, they criticized the quality of online medical education. Tang (2023) explored the impact of the pandemic on primary, secondary, and tertiary education via a systematic review of 41 articles. It was reported that the majority of these impacts are negative, e.g., learning loss among learners, difficulties with assessment and experiential learning in the virtual environment, limitations in instruction, technology-related constraints, the lack of learning materials and resources, and deteriorated psychosocial well-being. These negative impacts are amplified by the unequal distribution of resources and by disparities in socioeconomic status, ethnicity, gender, physical conditions, and learning ability. Overall, this review comprehensively criticizes the problems brought about by online learning during the pandemic period.

Very little review research has evaluated students' responses to online learning during the pandemic period. For instance, Salas-Pilco et al. (2022) evaluated engagement in online learning in Latin American higher education during the COVID-19 pandemic period via a systematic review of 23 studies. They considered three dimensions of engagement: affective, cognitive, and behavioral. They described the characteristics of learning engagement and proposed suggestions for enhancing engagement, including improving Internet connectivity, providing professional training, transforming higher education, ensuring quality, and offering emotional support. A key limitation of the review is that the authors focused on describing the characteristics of engagement without identifying the factors that influence it.

A synthesis of previous review research offers some implications. First, although learning effectiveness is an important consideration in educational research, review research on this topic is scarce, and hence there is a lack of comprehensive knowledge regarding the extent to which online learning was effective during the COVID-19 pandemic period. Second, according to past reviews that summarized the major topics of related research, e.g., Bughrara et al. (2023) and Zhang et al. (2022), the effectiveness of online learning is not a major topic in prior empirical research, and hence the author of this article argues that this topic has not received due attention from researchers. Third, some review research has identified many problems in online learning during the pandemic period, e.g., García-Morales et al. (2021) and Stojan et al. (2022). Many of these problems were caused by the sudden and rapid shift to online learning as well as the unique context of the pandemic. These problems may undermine the effectiveness of online learning. However, the extent to which these problems influence online learning effectiveness is still under-investigated.

1.4 Purpose of the review research

The research is carried out based on a systematic review of past empirical research to answer the following two research questions:

Q1: To what extent is online learning in higher education effective during the COVID-19 pandemic period?

Q2: What factors shape the effectiveness of online learning in higher education during the COVID-19 pandemic period?

2 Research methodology

2.1 Literature review as a research methodology

Regardless of discipline, all academic research activities should be related to and based on existing knowledge. As a result, scholars must identify related research on the topic of interest, critically assess the quality and content of existing research, and synthesize available results (Linnenluecke et al., 2020). However, this task is increasingly challenging for scholars because of the exponential growth of academic knowledge, which makes it difficult to stay at the forefront and keep up with state-of-the-art research (Snyder, 2019). Correspondingly, the literature review, as a research methodology, is more relevant than ever (Snyder, 2019; Linnenluecke et al., 2020). A well-implemented review provides a solid foundation for facilitating theory development and advancing knowledge (Webster and Watson, 2002). Here, a literature review is broadly defined as a more or less systematic way of collecting and synthesizing past studies (Tranfield et al., 2003). It allows researchers to integrate perspectives and results from a large body of past research and is able to address research questions unanswered by any single study (Snyder, 2019).

There are generally three types of literature review: meta-analysis, bibliometric review, and systematic review (Snyder, 2019). A meta-analysis refers to a statistical technique for integrating results from a large volume of empirical research (mainly quantitative research) to compare, identify, and evaluate patterns, relationships, agreements, and disagreements generated by research on the same topic (Davis et al., 2014). This study does not adopt a meta-analysis for two reasons. First, research on the effectiveness of online learning in the context of the COVID-19 pandemic has only been published since 2020, and currently there is a limited volume of empirical evidence. If this study adopted a meta-analysis, the sample size would be small, resulting in limited statistical power. Second, as mentioned above, there are a variety of indicators, e.g., motivation, satisfaction, experience, test score, and perceived effectiveness (Rajaram and Collins, 2013; Noesgaard and Ørngreen, 2015), that reflect different aspects of online learning effectiveness. The use of diversified effectiveness indicators increases the difficulty of carrying out a meta-analysis.

A bibliometric review refers to the analysis of a large volume of empirical research in terms of publication characteristics (e.g., year, journal, and citation), theories, methods, research questions, countries, and authors (Donthu et al., 2021), and it is useful for tracing the trend, distribution, relationships, and general patterns of research published on a focused topic (Wallin, 2005). A bibliometric review does not fit the present study for two reasons. First, research on online learning effectiveness during the pandemic has less than four years of history; hence the volume of relevant research is limited and the publication trend is currently unclear. Second, this study is interested in the inner content and results of published articles, rather than their external characteristics.

A systematic review is a method and process of critically identifying and appraising research in a specific field based on predefined inclusion and exclusion criteria to test a hypothesis, answer a research question, evaluate problems in past research, identify research gaps, and/or point out avenues for future research (Liberati et al., 2009; Moher et al., 2009). This type of review is particularly suitable for the present study, as there are still many unanswered questions regarding the effectiveness of online learning in the pandemic context, a need to indicate future research directions, a lack of a summary of relevant research in this field, and a scarcity of critical appraisals of problems in past research.

Adopting a systematic review methodology brings multiple benefits to the present study. First, it is helpful for distinguishing what needs to be done from what has been done, identifying major contributions made by past research, finding gaps in past research, avoiding fruitless research, and providing insights for future research in the focused field (Linnenluecke et al., 2020). Second, it is also beneficial for identifying new research directions, needs for theory development, and potential solutions to limitations in past research (Snyder, 2019). Third, this methodology helps scholars efficiently gain an overview of valuable research results and theories generated by past research, which inspires their research designs, ideas, and perspectives (Callahan, 2014).

Commonly, a systematic review can be either author-centric or theme-centric (Webster and Watson, 2002), and the present review is theme-centric. Specifically, an author-centric review focuses on works published by a certain author or group of authors and summarizes the major contributions made by the author(s) (Webster and Watson, 2002). This type of review is problematic in terms of the incompleteness of its research conclusions in a specific field and its descriptive nature (Linnenluecke et al., 2020). A theme-centric review is more common: a researcher guides readers through themes, concepts, and interesting phenomena according to a certain logic (Callahan, 2014). A theme in such a review can be further structured into several related sub-themes, and this type of review helps researchers gain a comprehensive understanding of relevant academic knowledge (Papaioannou et al., 2016).

2.2 Research procedures

This study follows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guideline ( Liberati et al., 2009 ) to implement a systematic review. The guideline indicates four phases of performing a systematic review, including (1) identifying possible research, (2) abstract screening, (3) assessing full-text for eligibility, and (4) qualitatively synthesizing included research. Figure 1 provides a flowchart of the process and the number of articles excluded and included in each phase.


Figure 1 . PRISMA flowchart concerning the selection of articles.

This study uses multiple academic databases to identify possible research, e.g., Academic Search Complete, IGI Global, ACM Digital Library, Elsevier (SCOPUS), Emerald, IEEE Xplore, Web of Science, Science Direct, ProQuest, Wiley Online Library, Taylor and Francis, and EBSCO. Since the COVID-19 pandemic broke out in January 2020, this study limits the literature search to articles published from January 2020 to August 2023. During this period, online learning was highly prevalent in schools globally, and a considerable volume of articles was published investigating various aspects of online learning. Keywords used for searching possible research include pandemic, COVID, SARS-CoV-2, 2019-nCoV, coronavirus, online learning, e-learning, electronic learning, higher education, tertiary education, universities, learning effectiveness, learning satisfaction, learning engagement, and learning motivation. Aside from searching the databases, this study also manually checks the reference lists of relevant articles and uses Google Scholar to identify other articles that have cited them.
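For illustration, keyword groups of the kind listed above are typically combined into a Boolean search string, with synonyms joined by OR and concept groups joined by AND. The sketch below is hypothetical: the exact query syntax differs across databases, and the grouping shown is an assumption rather than the query actually submitted.

```python
# Hypothetical sketch of assembling a Boolean search string from
# keyword groups; actual database syntax varies.
pandemic_terms = ["pandemic", "COVID", "SARS-CoV-2", "2019-nCoV", "coronavirus"]
mode_terms = ["online learning", "e-learning", "electronic learning"]
context_terms = ["higher education", "tertiary education", "universities"]
outcome_terms = ["learning effectiveness", "learning satisfaction",
                 "learning engagement", "learning motivation"]

def or_group(terms):
    """Join synonyms with OR, quoting multi-word phrases."""
    quoted = [f'"{t}"' if " " in t else t for t in terms]
    return "(" + " OR ".join(quoted) + ")"

# Concept groups are ANDed together so that every hit touches all four concepts.
query = " AND ".join(or_group(g) for g in
                     [pandemic_terms, mode_terms, context_terms, outcome_terms])
print(query)
```

In practice, a search string like this would be adapted to each database's field codes (e.g., title/abstract restriction) before being run.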

2.3 Inclusion and exclusion criteria

Articles included in the review had to meet the following criteria. First, articles had to be written in English and published in peer-reviewed journals; English was chosen because it is the dominant language of the articles indexed by the specified search engines. Second, the research must have been carried out in an online learning context. Third, the research must have collected and analyzed empirical data. Fourth, the research must have been implemented in a higher education context and during the pandemic period. Fifth, the outcome variable must be related to learning effectiveness, and included studies must have reported quantitative results for online learning effectiveness; the outcome variable had to be measured with data collected from students, rather than from other individuals (e.g., instructors). For instance, the study by Rahayu and Wirza (2020) used teacher perception as a measurement of online learning effectiveness and was hence excluded from the sample. According to the above criteria, a total of 25 articles were included in the review.
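The five criteria above amount to a conjunctive screening checklist: an article is included only if every criterion holds. A minimal sketch of that logic follows; the field names and the example record are illustrative assumptions, not data from the review.

```python
# Hypothetical sketch: the five inclusion criteria as a screening predicate.
# Field names are illustrative, not taken from the review's data extraction form.
def meets_inclusion_criteria(article):
    return all([
        article["language"] == "English" and article["peer_reviewed"],   # criterion 1
        article["context"] == "online learning",                         # criterion 2
        article["has_empirical_data"],                                   # criterion 3
        article["setting"] == "higher education"
            and article["during_pandemic"],                              # criterion 4
        article["outcome_source"] == "students",                         # criterion 5
    ])

# A record failing only criterion 5 (outcome measured via teachers),
# analogous to the excluded Rahayu and Wirza (2020) study.
candidate = {"language": "English", "peer_reviewed": True,
             "context": "online learning", "has_empirical_data": True,
             "setting": "higher education", "during_pandemic": True,
             "outcome_source": "teachers"}
print(meets_inclusion_criteria(candidate))  # prints False: excluded
```

Because the criteria are conjunctive, a single failed check (here, the outcome source) is enough to exclude a study.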

2.4 Data extraction and analysis

Content analysis is performed on the included articles, and an inductive approach is used to answer the two research questions. First, to understand the basic characteristics of the 25 articles/studies, the researcher summarizes their types, research designs, and samples and categorizes them into several groups. The researcher carefully reads the full text of these articles and codes valuable pieces of content. In this process, an inductive approach is used, and key themes in these studies are extracted and summarized. Second, the researcher further categorizes these studies into different groups according to their similarities and differences in research findings. In this way, these studies are broadly categorized into three groups, i.e., (1) ineffective, (2) neutral, and (3) effective. Based on this, the research answers the first research question and indicates the percentage of studies that evidenced online learning as effective in the COVID-19 pandemic context. The researcher also discusses how online learning is effective by analyzing the learning outcomes it brings. Third, the researcher analyzes and compares the characteristics of the three groups of studies and extracts key themes relevant to the conditional effectiveness of online learning. Based on this, the researcher identifies factors that influence the effectiveness of online learning in a pandemic context. In this way, the two research questions are adequately answered.

3 Research results and discussion

3.1 Study characteristics

Table 1 shows the statistics of the 25 studies, while Table 2 provides a summary of them. Overall, these studies varied greatly in terms of research design, research subjects, contexts, measurements of learning effectiveness, and, ultimately, research findings. Approximately half of the studies were published in 2021, and the number of studies declined in 2022 and 2023, which may be attributed to the fact that universities gradually implemented reopening policies after 2020. China contributed the largest number of studies (N = 5), followed by India (N = 4) and the United States (N = 3). The sample sizes of the majority of studies (88.0%) ranged between 101 and 500. As this review excluded qualitative studies, all included studies adopted either a purely quantitative design (88.0%) or a mixed design (12.0%). The majority of the studies were cross-sectional (72%), and a few (8%) were experimental.


Table 1 . Statistics of studies included in the review.


Table 2 . A summary of studies reviewed.

3.2 The effectiveness of online learning

Overall, the 25 studies generated mixed results regarding the effectiveness of online learning during the pandemic period. Nine (36%) studies reported online learning as effective; 13 (52%) studies reported it as ineffective, and the remaining 3 (12%) studies produced neutral results. However, it should be noted that the results generated by these studies are not directly comparable, as they used different approaches to evaluate the effectiveness of online learning. According to the approach to evaluating online learning effectiveness, these studies are categorized into four groups: (1) cross-sectional evaluation of online learning effectiveness without a comparison with offline learning and without a control group (N = 14; 56%), (2) cross-sectional comparison of the effectiveness of online learning with offline learning, without a control group (N = 7; 28%), (3) longitudinal comparison of the effectiveness of online learning with offline learning, without a control group (N = 2; 8%), and (4) randomized controlled trial (RCT) with a control group (N = 2; 8%).
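The overall verdict percentages reported above follow directly from the counts (9, 13, and 3 out of 25). A minimal tally, using the counts stated in the text, reproduces them:

```python
# Tallying the effectiveness verdicts of the 25 reviewed studies.
# The counts come from the review's text; the code is purely illustrative.
from collections import Counter

verdicts = ["effective"] * 9 + ["ineffective"] * 13 + ["neutral"] * 3
counts = Counter(verdicts)

for verdict, n in counts.items():
    print(f"{verdict}: {n} ({n / len(verdicts):.0%})")
# effective: 9 (36%)
# ineffective: 13 (52%)
# neutral: 3 (12%)
```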

The first group of studies asked students to report the extent to which they perceived online learning as effective, had achieved expected learning outcomes through online learning, or were satisfied with the online learning experience or outcomes, without a comparison with offline learning. Six of the 14 studies reported online learning as ineffective, including Adnan and Anwar (2020), Hong et al. (2021), Mok et al. (2021), Baber (2022), Chandrasiri and Weerakoon (2022), and Lalduhawma et al. (2022). Five of the 14 studies reported online learning as effective, including Almusharraf and Khahro (2020), Sharma et al. (2020), Mahyoob (2021), Rahman (2021), and Haningsih and Rohmi (2022). In addition, 3 of the 14 studies reported neutral results, including Cranfield et al. (2021), Tsang et al. (2021), and Conrad et al. (2022). It should be noted that this measurement approach is problematic in three respects. First, researchers used various survey instruments to measure learning effectiveness without reaching a consensus on a widely accepted instrument. As a result, these studies measured different aspects of learning effectiveness, and hence their results may be incomparable. Second, these studies relied on students' self-reports to evaluate learning effectiveness, which can be subjective and inaccurate. Third, even if students perceived online learning as effective, this does not imply that online learning is more effective than offline learning, because of the absence of a comparison baseline.

The second group of studies asked students to compare online learning with offline learning to evaluate learning effectiveness. Interestingly, all 7 studies, including Alawamleh et al. (2020) , Almahasees et al. (2021) , Gonzalez-Ramirez et al. (2021) , Muthuprasad et al. (2021) , Selco and Habbak (2021) , Hollister et al. (2022) , and Zhang and Chen (2023) , reported that participants perceived online learning as less effective than offline learning. It should be noted that these results were specific to the COVID-19 pandemic context, in which strict social distancing policies were implemented; they should therefore be interpreted as meaning that online learning during the school lockdown period was perceived as less effective than offline learning during the pre-pandemic period. A key problem with the measurement of learning effectiveness in these studies is subjectivity: students’ self-reported effectiveness of online learning relative to offline learning may be influenced by many pandemic-related factors, e.g., negative emotions such as fear, loneliness, and anxiety.

Only two studies, Chang et al. (2021) and Fyllos et al. (2021) , implemented a longitudinal comparison of the effectiveness of online learning with offline learning. Interestingly, both reported that participants perceived online learning as more effective than offline learning, which contradicts the second group of studies. In both studies, the same group of students participated in offline learning and online learning successively and rated the effectiveness of each learning approach. The two studies arose from a coincidence of timing: the researchers unexpectedly encountered the pandemic, and subsequently school lockdown, while investigating learning effectiveness, which enabled them to compare the effectiveness of offline and online learning. However, this research design has three key problems. First, the content taught in the online and offline learning periods differed, so the evaluations of learning effectiveness in the two periods are not comparable. Second, self-reported learning effectiveness is subjective. Third, students are likely to obtain better scores in online examinations than in offline examinations, because online examinations invite more cheating and are less fair than offline examinations. As reported by Fyllos et al. (2021) , the examination score after online learning was significantly higher than after offline learning, and Chang et al. (2021) reported that participants generally believed offline examinations to be fairer than online examinations.

Lastly, only two studies, Jiang et al. (2023) and Shirahmadi et al. (2023) , implemented an RCT design, which is more persuasive, objective, and accurate than the designs reviewed above. Indeed, implementing an RCT to evaluate the effectiveness of online learning was a formidable challenge during the pandemic period because of viral transmission and social distancing policies. Both studies reported that online learning was more effective than offline learning during the pandemic period. However, it is questionable to what extent such results were affected by health- and safety-related issues: it is reasonable to infer that students perceived online learning as safer than offline learning during the pandemic period, and such perceptions may have affected learning effectiveness.

Overall, it is difficult to conclude whether online learning was effective during the pandemic period. Nevertheless, it is possible to identify factors that shape the effectiveness of online learning, which are discussed in the next section.

3.3 Factors that shape online learning effectiveness

Infrastructure factors were reported as the most salient determinants of online learning effectiveness. Research from developed countries tended to generate more positive results for online learning than research from less developed countries, a pattern confirmed by the cross-country comparative study of Cranfield et al. (2021) . Indeed, online learning requires the support of ICT infrastructure, and hence ICT-related factors, e.g., Internet connectivity, technical issues, network speed, and the accessibility of digital devices, considerably influence the effectiveness of online learning ( García-Morales et al., 2021 ; Grafton-Clarke et al., 2022 ). Prior review research, e.g., Tang (2023) , also suggested that the unequal distribution of resources and unequal socioeconomic status intensified the problems brought about by online learning during the pandemic period. Salas-Pilco et al. (2022) recommended improving Internet connectivity to increase students’ engagement in online learning during the pandemic period.

The study by Adnan and Anwar (2020) is one of the most cited works in this field. They reported that online learning was ineffective in Pakistan because of problems with Internet access stemming from monetary and technical issues, which hindered students from carrying out online learning activities. Likewise, research from India by Lalduhawma et al. (2022) indicated that online learning was ineffective because of poor network connectivity, slow data speeds, low data limits, and the high cost of devices. As a result, online learning during the COVID-19 pandemic may have widened the education gap between developed and developing countries because of the latter’s infrastructure disadvantages. More attention to online learning infrastructure problems in developing countries is needed.

Instructional factors, e.g., course management and design, instructor characteristics, instructor-student interaction, assignments, and assessments, were found to affect online learning effectiveness ( Sharma et al., 2020 ; Rahman, 2021 ; Tsang et al., 2021 ; Hollister et al., 2022 ; Zhang and Chen, 2023 ). Although these instructional factors are well documented as significant drivers of learning effectiveness in the traditional learning literature, they took on unique characteristics during the pandemic. Neither students nor teachers were well prepared for wholly online instruction and learning in 2020, and hence they encountered many problems in course management and design, learning interactions, assignments, and assessments ( Stojan et al., 2022 ; Tang, 2023 ). The review by García-Morales et al. (2021) likewise suggested that various stakeholders in learning and teaching had difficulty adapting to the sudden, hasty, and forced transition from offline to online learning. Consequently, these instructional factors became salient in affecting online learning effectiveness.

Many studies highlighted the negative role of the lack of social interaction caused by social distancing in affecting online learning effectiveness ( Almahasees et al., 2021 ; Baber, 2022 ; Conrad et al., 2022 ; Hollister et al., 2022 ). Baber (2022) argued that people give more importance to saving lives than to socializing in the online environment, and hence social interactions in learning were considerably reduced by social distancing norms. The negative impact of the lack of social interaction on online learning effectiveness is reflected in two aspects. First, from a constructivist view, interaction is an indispensable element of learning because knowledge is actively constructed by learners in social interactions ( Woo and Reeves, 2007 ); consequently, the lack of social interaction reduced online learning effectiveness during the pandemic period. Second, the lack of social interaction brought many negative emotions, e.g., feelings of isolation, loneliness, anxiety, and depression ( Alawamleh et al., 2020 ; Gonzalez-Ramirez et al., 2021 ; Selco and Habbak, 2021 ), and such negative emotions undermine online learning effectiveness.

Negative emotions caused by the pandemic and school lockdown were also found to be detrimental to online learning effectiveness. In this context, many students reportedly experienced negative emotions, e.g., feelings of isolation, exhaustion, loneliness, and distraction ( Alawamleh et al., 2020 ; Gonzalez-Ramirez et al., 2021 ; Selco and Habbak, 2021 ). Such negative emotions, as mentioned above, reduce online learning effectiveness.

Several factors were also found to increase online learning effectiveness during the pandemic period, e.g., convenience and flexibility ( Hong et al., 2021 ; Muthuprasad et al., 2021 ; Selco and Habbak, 2021 ). Students with strong self-regulated learning abilities gain more benefits from convenience and flexibility in online learning ( Hong et al., 2021 ).

Overall, although the effectiveness of online learning during the pandemic period remains debated, it is generally believed that the pandemic brought many challenges and difficulties to higher education. Meanwhile, the majority of students prefer offline learning to online learning. The above challenges and difficulties were more prominent in developing countries than in developed countries.

3.4 Pedagogical implications

The results of this systematic review offer several pedagogical implications. First, online learning requires the support of ICT infrastructure, and infrastructure defects strongly undermine learning effectiveness ( García-Morales et al., 2021 ; Grafton-Clarke et al., 2022 ). Given that online learning is increasingly integrated into higher education ( Kebritchi et al., 2017 ) regardless of the presence of a pandemic, governments globally should increase investment in learning-related ICT infrastructure in higher education institutes. Meanwhile, schools should consider whether students can afford digital devices and network fees when implementing online learning activities, and it is important to offer material support to students in poor economic circumstances. Infrastructure issues are more prominent in developing countries because of limited monetary resources and a poor infrastructure base; thus, international collaboration and aid are recommended to address them.

Second, since the lack of social interaction is a key factor that reduces online learning effectiveness, it is important to increase social interactions during the implementation of online learning activities. On the one hand, both students and instructors are encouraged to utilize network technologies to promote inter-individual interactions. On the other hand, the two parties are also encouraged to engage in offline interaction activities if the risk is acceptable.

Third, special attention should be paid to students’ emotions during the online learning process, as online learning may bring many negative emotions that undermine learning effectiveness ( Alawamleh et al., 2020 ; Gonzalez-Ramirez et al., 2021 ; Selco and Habbak, 2021 ). In addition, higher education institutes should prepare a contingency plan for emergency online learning to deal with potential crises in the future, e.g., wars, pandemics, and natural disasters.

3.5 Limitations and suggestions for future research

There are several limitations in past research on online learning effectiveness during the pandemic period. The first is a lack of rigor in assessing learning effectiveness: there is a scarcity of empirical research with an RCT design, which is considered accurate, objective, and rigorous for assessing pedagogical models ( Torgerson and Torgerson, 2001 ). The scarcity of RCT research makes it difficult to accurately assess the effectiveness of online learning and compare it with offline learning. Second, widely accepted criteria for assessing learning effectiveness are absent, and past empirical studies used diverse procedures, techniques, instruments, and criteria for measuring online learning effectiveness, making research results difficult to compare. Third, learning effectiveness is a multi-dimensional construct, but its multidimensionality was largely ignored by past research; it is therefore difficult to evaluate which dimensions of learning effectiveness are promoted or undermined by online learning, and difficult to compare the results of different studies. Finally, there is very limited knowledge about differences in online learning effectiveness between subjects. Subjects that depend on lab-based work (e.g., experimental physics, organic chemistry, and cell biology) are likely less suited to online learning than subjects that depend on desk-based work (e.g., economics, psychology, and literature).

To address these limitations, several recommendations for future research on online learning effectiveness are offered. First, future research is encouraged to adopt an RCT design and collect a large sample to objectively, rigorously, and accurately quantify the effectiveness of online learning. Second, scholars are encouraged to develop a new framework that assesses learning effectiveness comprehensively; this framework should cover multiple dimensions of learning effectiveness and have strong generalizability. Finally, future research is recommended to compare the effectiveness of online learning across subjects.

4 Conclusion

This study carried out a systematic review of 25 empirical studies published between 2020 and 2023 to evaluate the effectiveness of online learning during the COVID-19 pandemic period. According to how online learning effectiveness was assessed, the 25 studies were categorized into four groups. The first group employed a cross-sectional design and assessed online learning based on students’ perceptions, without a control group; fewer than half of these studies reported online learning as effective. The second group also employed a cross-sectional design and asked students to compare the effectiveness of online learning with offline learning; all of these studies reported that online learning was less effective than offline learning. The third group, comprising only 2 studies, employed a longitudinal design comparing the effectiveness of online learning with offline learning without a control group; both reported that online learning was more effective than offline learning. The fourth group, also comprising only 2 studies, employed an RCT design; both studies reported online learning as more effective than offline learning.

Overall, it is difficult to conclude whether online learning was effective during the pandemic period because of the diversified research contexts, methods, and approaches of past research. Nevertheless, the review identifies a set of factors that positively or negatively influence the effectiveness of online learning, including infrastructure factors, instructional factors, the lack of social interaction, negative emotions, flexibility, and convenience. Although the effectiveness of online learning during the pandemic period remains debated, it is generally believed that the pandemic brought many challenges and difficulties to higher education. Meanwhile, the majority of students prefer offline learning to online learning. In addition, developing countries face more challenges and difficulties in online learning because of monetary and infrastructure issues.

The findings of this review offer significant pedagogical implications for online learning in higher education institutes, including enhancing the development of ICT infrastructure, providing material support for students with poor economic status, enhancing social interactions, paying attention to students’ emotional status, and preparing a contingency plan of emergency online learning.

The review also identifies several limitations in past research regarding online learning effectiveness during the pandemic period, including the lack of rigor in assessing learning effectiveness, the absence of accepted criteria for assessing learning effectiveness, the neglect of the multidimensionality of learning effectiveness, and limited knowledge about the difference in online learning effectiveness between different subjects.

To address these limitations, several recommendations for future research on online learning effectiveness are offered. First, future research is encouraged to adopt an RCT design and collect a large sample to objectively, rigorously, and accurately quantify the effectiveness of online learning. Second, scholars are encouraged to develop a new framework that assesses learning effectiveness comprehensively; this framework should cover multiple dimensions of learning effectiveness and have strong generalizability. Finally, future research is recommended to compare the effectiveness of online learning across subjects.

It should be noted that this review is not without limitations. First, only studies that quantitatively measured online learning effectiveness were included, so many other studies (e.g., qualitative studies) investigating factors that influence online learning effectiveness were excluded, resulting in a relatively small sample and an incomplete synthesis of past research contributions. Second, since this review was qualitative, it was difficult to accurately quantify the level of online learning effectiveness.

Data availability statement

The original contributions presented in the study are included in the article/supplementary material, further inquiries can be directed to the corresponding author.

Author contributions

WM: Writing – original draft, Writing – review & editing. LY: Writing – original draft, Writing – review & editing. CL: Writing – review & editing. NP: Writing – review & editing. XP: Writing – review & editing. YZ: Writing – review & editing.

Funding

The author(s) declare that no financial support was received for the research, authorship, and/or publication of this article.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Adnan, M., and Anwar, K. (2020). Online learning amid the COVID-19 pandemic: Students' perspectives. J. Pedagogical Sociol. Psychol. 1, 45–51. doi: 10.33902/JPSP.2020261309


Alawamleh, M., Al-Twait, L. M., and Al-Saht, G. R. (2020). The effect of online learning on communication between instructors and students during Covid-19 pandemic. Asian Educ. Develop. Stud. 11, 380–400. doi: 10.1108/AEDS-06-2020-0131

Almahasees, Z., Mohsen, K., and Amin, M. O. (2021). Faculty’s and students’ perceptions of online learning during COVID-19. Front. Educ. 6:638470. doi: 10.3389/feduc.2021.638470

Almusharraf, N., and Khahro, S. (2020). Students satisfaction with online learning experiences during the COVID-19 pandemic. Int. J. Emerg. Technol. Learn. (iJET) 15, 246–267. doi: 10.3991/ijet.v15i21.15647

Anderson, N., and Hajhashemi, K. (2013). Online learning: from a specialized distance education paradigm to a ubiquitous element of contemporary education. In 4th international conference on e-learning and e-teaching (ICELET 2013) (pp. 91–94). IEEE.


Arkorful, V., and Abaidoo, N. (2015). The role of e-learning, advantages and disadvantages of its adoption in higher education. Int. J. Instructional Technol. Distance Learn. 12, 29–42.

Baber, H. (2022). Social interaction and effectiveness of the online learning–a moderating role of maintaining social distance during the pandemic COVID-19. Asian Educ. Develop. Stud. 11, 159–171. doi: 10.1108/AEDS-09-2020-0209

Barnard-Brak, L., Paton, V. O., and Lan, W. Y. (2010). Profiles in self-regulated learning in the online learning environment. Int. Rev. Res. Open Dist. Learn. 11, 61–80. doi: 10.19173/irrodl.v11i1.769

Bughrara, M. S., Swanberg, S. M., Lucia, V. C., Schmitz, K., Jung, D., and Wunderlich-Barillas, T. (2023). Beyond COVID-19: the impact of recent pandemics on medical students and their education: a scoping review. Med. Educ. Online 28:2139657. doi: 10.1080/10872981.2022.2139657


Callahan, J. L. (2014). Writing literature reviews: a reprise and update. Hum. Resour. Dev. Rev. 13, 271–275. doi: 10.1177/1534484314536705

Camargo, C. P., Tempski, P. Z., Busnardo, F. F., Martins, M. D. A., and Gemperli, R. (2020). Online learning and COVID-19: a meta-synthesis analysis. Clinics 75:e2286. doi: 10.6061/clinics/2020/e2286

Choudhury, S., and Pattnaik, S. (2020). Emerging themes in e-learning: a review from the stakeholders’ perspective. Comput. Educ. 144:103657. doi: 10.1016/j.compedu.2019.103657

Chandrasiri, N. R., and Weerakoon, B. S. (2022). Online learning during the COVID-19 pandemic: perceptions of allied health sciences undergraduates. Radiography 28, 545–549. doi: 10.1016/j.radi.2021.11.008

Chang, J. Y. F., Wang, L. H., Lin, T. C., Cheng, F. C., and Chiang, C. P. (2021). Comparison of learning effectiveness between physical classroom and online learning for dental education during the COVID-19 pandemic. J. Dental Sci. 16, 1281–1289. doi: 10.1016/j.jds.2021.07.016

Conrad, C., Deng, Q., Caron, I., Shkurska, O., Skerrett, P., and Sundararajan, B. (2022). How student perceptions about online learning difficulty influenced their satisfaction during Canada's Covid-19 response. Br. J. Educ. Technol. 53, 534–557. doi: 10.1111/bjet.13206

Cranfield, D. J., Tick, A., Venter, I. M., Blignaut, R. J., and Renaud, K. (2021). Higher education students’ perceptions of online learning during COVID-19—a comparative study. Educ. Sci. 11, 403–420. doi: 10.3390/educsci11080403

Desai, M. S., Hart, J., and Richards, T. C. (2008). E-learning: paradigm shift in education. Education 129, 1–20.

Davis, J., Mengersen, K., Bennett, S., and Mazerolle, L. (2014). Viewing systematic reviews and meta-analysis in social research through different lenses. SpringerPlus 3, 1–9. doi: 10.1186/2193-1801-3-511

Donthu, N., Kumar, S., Mukherjee, D., Pandey, N., and Lim, W. M. (2021). How to conduct a bibliometric analysis: an overview and guidelines. J. Bus. Res. 133, 264–269. doi: 10.1016/j.jbusres.2021.04.070

Fyllos, A., Kanellopoulos, A., Kitixis, P., Cojocari, D. V., Markou, A., Raoulis, V., et al. (2021). University students perception of online education: is engagement enough? Acta Informatica Medica 29, 4–9. doi: 10.5455/aim.2021.29.4-9

Gamage, D., Ruipérez-Valiente, J. A., and Reich, J. (2023). A paradigm shift in designing education technology for online learning: opportunities and challenges. Front. Educ. 8:1194979. doi: 10.3389/feduc.2023.1194979

García-Morales, V. J., Garrido-Moreno, A., and Martín-Rojas, R. (2021). The transformation of higher education after the COVID disruption: emerging challenges in an online learning scenario. Front. Psychol. 12:616059. doi: 10.3389/fpsyg.2021.616059

Gonzalez-Ramirez, J., Mulqueen, K., Zealand, R., Silverstein, S., Mulqueen, C., and BuShell, S. (2021). Emergency online learning: college students' perceptions during the COVID-19 pandemic. Coll. Stud. J. 55, 29–46.

Grafton-Clarke, C., Uraiby, H., Gordon, M., Clarke, N., Rees, E., Park, S., et al. (2022). Pivot to online learning for adapting or continuing workplace-based clinical learning in medical education following the COVID-19 pandemic: a BEME systematic review: BEME guide no. 70. Med. Teach. 44, 227–243. doi: 10.1080/0142159X.2021.1992372

Haningsih, S., and Rohmi, P. (2022). The pattern of hybrid learning to maintain learning effectiveness at the higher education level post-COVID-19 pandemic. Eurasian J. Educ. Res. 11, 243–257. doi: 10.12973/eu-jer.11.1.243

Hollister, B., Nair, P., Hill-Lindsay, S., and Chukoskie, L. (2022). Engagement in online learning: student attitudes and behavior during COVID-19. Front. Educ. 7:851019. doi: 10.3389/feduc.2022.851019

Hong, J. C., Lee, Y. F., and Ye, J. H. (2021). Procrastination predicts online self-regulated learning and online learning ineffectiveness during the coronavirus lockdown. Personal. Individ. Differ. 174:110673. doi: 10.1016/j.paid.2021.110673

Jiang, P., Namaziandost, E., Azizi, Z., and Razmi, M. H. (2023). Exploring the effects of online learning on EFL learners’ motivation, anxiety, and attitudes during the COVID-19 pandemic: a focus on Iran. Curr. Psychol. 42, 2310–2324. doi: 10.1007/s12144-022-04013-x

Joy, E. H., and Garcia, F. E. (2000). Measuring learning effectiveness: a new look at no-significant-difference findings. JALN 4, 33–39.

Kebritchi, M., Lipschuetz, A., and Santiague, L. (2017). Issues and challenges for teaching successful online courses in higher education: a literature review. J. Educ. Technol. Syst. 46, 4–29. doi: 10.1177/0047239516661713

Lalduhawma, L. P., Thangmawia, L., and Hussain, J. (2022). Effectiveness of online learning during the COVID-19 pandemic in Mizoram. J. Educ. e-Learning Res. 9, 175–183. doi: 10.20448/jeelr.v9i3.4162

Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gotzsche, P. C., Ioannidis, J. P., et al. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. Ann. Intern. Med. 151:W-65. doi: 10.7326/0003-4819-151-4-200908180-00136

Linnenluecke, M. K., Marrone, M., and Singh, A. K. (2020). Conducting systematic literature reviews and bibliometric analyses. Aust. J. Manag. 45, 175–194. doi: 10.1177/0312896219877678

Mahyoob, M. (2021). Online learning effectiveness during the COVID-19 pandemic: a case study of Saudi universities. Int. J. Info. Commun. Technol. Educ. (IJICTE) 17, 1–14. doi: 10.4018/IJICTE.20211001.oa7

Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., and PRISMA Group (2009). Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann. Intern. Med. 151, 264–269. doi: 10.3736/jcim20090918

Mok, K. H., Xiong, W., and Bin Aedy Rahman, H. N. (2021). COVID-19 pandemic’s disruption on university teaching and learning and competence cultivation: student evaluation of online learning experiences in Hong Kong. Int. J. Chinese Educ. 10:221258682110070. doi: 10.1177/22125868211007011

Muthuprasad, T., Aiswarya, S., Aditya, K. S., and Jha, G. K. (2021). Students’ perception and preference for online education in India during COVID-19 pandemic. Soc. Sci. Humanities open 3:100101. doi: 10.1016/j.ssaho.2020.100101

Noesgaard, S. S., and Ørngreen, R. (2015). The effectiveness of e-learning: an explorative and integrative review of the definitions, methodologies and factors that promote e-learning effectiveness. Electronic J. E-learning 13, 278–290.

Papaioannou, D., Sutton, A., and Booth, A. (2016). Systematic approaches to a successful literature review. London: Sage.

Pratama, H., Azman, M. N. A., Kassymova, G. K., and Duisenbayeva, S. S. (2020). The trend in using online meeting applications for learning during the period of pandemic COVID-19: a literature review. J. Innovation in Educ. Cultural Res. 1, 58–68. doi: 10.46843/jiecr.v1i2.15

Rahayu, R. P., and Wirza, Y. (2020). Teachers’ perception of online learning during pandemic covid-19. Jurnal penelitian pendidikan 20, 392–406. doi: 10.17509/jpp.v20i3.29226

Rahman, A. (2021). Using students’ experience to derive effectiveness of COVID-19-lockdown-induced emergency online learning at undergraduate level: evidence from Assam. India. Higher Education for the Future 8, 71–89. doi: 10.1177/2347631120980549

Rajaram, K., and Collins, B. (2013). Qualitative identification of learning effectiveness indicators among mainland Chinese students in culturally dislocated study environments. J. Int. Educ. Bus. 6, 179–199. doi: 10.1108/JIEB-03-2013-0010

Salas-Pilco, S. Z., Yang, Y., and Zhang, Z. (2022). Student engagement in online learning in Latin American higher education during the COVID-19 pandemic: a systematic review. Br. J. Educ. Technol. 53, 593–619. doi: 10.1111/bjet.13190

Selco, J. I., and Habbak, M. (2021). Stem students’ perceptions on emergency online learning during the covid-19 pandemic: challenges and successes. Educ. Sci. 11:799. doi: 10.3390/educsci11120799

Sharma, K., Deo, G., Timalsina, S., Joshi, A., Shrestha, N., and Neupane, H. C. (2020). Online learning in the face of COVID-19 pandemic: assessment of students’ satisfaction at Chitwan medical college of Nepal. Kathmandu Univ. Med. J. 18, 40–47. doi: 10.3126/kumj.v18i2.32943

Shirahmadi, S., Hazavehei, S. M. M., Abbasi, H., Otogara, M., Etesamifard, T., Roshanaei, G., et al. (2023). Effectiveness of online practical education on vaccination training in the students of bachelor programs during the Covid-19 pandemic. PLoS One 18:e0280312. doi: 10.1371/journal.pone.0280312

Snyder, H. (2019). Literature review as a research methodology: an overview and guidelines. J. Bus. Res. 104, 333–339. doi: 10.1016/j.jbusres.2019.07.039

Stojan, J., Haas, M., Thammasitboon, S., Lander, L., Evans, S., Pawlik, C., et al. (2022). Online learning developments in undergraduate medical education in response to the COVID-19 pandemic: a BEME systematic review: BEME guide no. 69. Med. Teach. 44, 109–129. doi: 10.1080/0142159X.2021.1992373

Swan, K. (2003). Learning effectiveness online: what the research tells us. Elements of quality online education, practice and direction 4, 13–47.

Tang, K. H. D. (2023). Impacts of COVID-19 on primary, secondary and tertiary education: a comprehensive review and recommendations for educational practices. Educ. Res. Policy Prac. 22, 23–61. doi: 10.1007/s10671-022-09319-y

Torgerson, C. J., and Torgerson, D. J. (2001). The need for randomised controlled trials in educational research. Br. J. Educ. Stud. 49, 316–328. doi: 10.1111/1467-8527.t01-1-00178

Tranfield, D., Denyer, D., and Smart, P. (2003). Towards a methodology for developing evidence-informed management knowledge by means of systematic review. Br. J. Manag. 14, 207–222. doi: 10.1111/1467-8551.00375

Tsang, J. T., So, M. K., Chong, A. C., Lam, B. S., and Chu, A. M. (2021). Higher education during the pandemic: the predictive factors of learning effectiveness in COVID-19 online learning. Educ. Sci. 11:446. doi: 10.3390/educsci11080446

Wallin, J. A. (2005). Bibliometric methods: pitfalls and possibilities. Basic Clin. Pharmacol. Toxicol. 97, 261–275. doi: 10.1111/j.1742-7843.2005.pto_139.x

Webster, J., and Watson, R. T. (2002). Analyzing the past to prepare for the future: writing a literature review. MIS Q. 26, 13–23.

Wong, J., Baars, M., Davis, D., Van Der Zee, T., Houben, G. J., and Paas, F. (2019). Supporting self-regulated learning in online learning environments and MOOCs: a systematic review. Int. J. Human–Computer Interaction 35, 356–373. doi: 10.1080/10447318.2018.1543084

Woo, Y., and Reeves, T. C. (2007). Meaningful interaction in web-based learning: a social constructivist interpretation. Internet High. Educ. 10, 15–25. doi: 10.1016/j.iheduc.2006.10.005

Zeitoun, H. (2008). E-learning: Concept, Issues, Application, Evaluation . Riyadh: Dar Alsolateah Publication.

Zhang, L., Carter, R. A. Jr., Qian, X., Yang, S., Rujimora, J., and Wen, S. (2022). Academia's responses to crisis: a bibliometric analysis of literature on online learning in higher education during COVID-19. Br. J. Educ. Technol. 53, 620–646. doi: 10.1111/bjet.13191

Zhang, Y., and Chen, X. (2023). Students’ perceptions of online learning in the post-COVID era: a focused case from the universities of applied sciences in China. Sustainability 15:946. doi: 10.3390/su15020946

Keywords: COVID-19 pandemic, higher education, online learning, learning effectiveness, systematic review

Citation: Meng W, Yu L, Liu C, Pan N, Pang X and Zhu Y (2024) A systematic review of the effectiveness of online learning in higher education during the COVID-19 pandemic period. Front. Educ. 8:1334153. doi: 10.3389/feduc.2023.1334153

Received: 06 November 2023; Accepted: 27 December 2023; Published: 17 January 2024.

Copyright © 2024 Meng, Yu, Liu, Pan, Pang and Zhu. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Lei Yu, [email protected]

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.


Educational Research Review

Examining research on the impact of distance and online learning: a second-order meta-analysis study.

  • Analyzed 19 first-order meta-analyses on the impact of distance education.
  • Found a statistically significant average effect size of distance education on educational outcomes.
  • Higher education studies had a statistically significantly larger effect size than K-12 education studies.



Assessing the impact of online-learning effectiveness and benefits in knowledge management, the antecedent of online-learning strategies and motivations: an empirical study.

1. Introduction

2. Literature Review and Research Hypothesis
2.1. Online-Learning Self-Efficacy
2.2. Online-Learning Monitoring
2.3. Online-Learning Confidence in Technology
2.4. Online-Learning Willpower
2.5. Online-Learning Attitude
2.6. Online-Learning Motivation
2.7. Online-Learning Strategies and Online-Learning Effectiveness
2.8. Online-Learning Effectiveness
3. Research Method
3.1. Instruments
3.2. Data Analysis and Results
4.1. Reliability and Validity Analysis
4.2. Hypothesis Result
5. Discussion
6. Conclusions
7. Limitations and Future Directions
Author Contributions, Institutional Review Board Statement, Informed Consent Statement, Data Availability Statement, Conflicts of Interest

  • UNESCO. COVID-19 Educational Disruption and Response ; UNESCO: Paris, France, 2020. [ Google Scholar ]
  • Moore, D.R. E-learning and the science of instruction: Proven guidelines for consumers and designers of multimedia learning. Educ. Technol. Res. Dev. 2006 , 54 , 197–200. [ Google Scholar ] [ CrossRef ]
  • McDonald, E.W.; Boulton, J.L.; Davis, J.L. E-learning and nursing assessment skills and knowledge–An integrative review. Nurse Educ. Today 2018 , 66 , 166–174. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Homan, S.R.; Wood, K. Taming the mega-lecture: Wireless quizzing. Syllabus Sunnyvale Chatsworth 2003 , 17 , 23–27. [ Google Scholar ]
  • Emran, M.A.; Shaalan, K. E-podium technology: A medium of managing knowledge at al buraimi university college via mlearning. In Proceedings of the 2nd BCS International IT Conference, Abu Dhabi, United Arab Emirates, 9–10 March 2014; pp. 1–4. [ Google Scholar ]
  • Tenório, T.; Bittencourt, I.I.; Isotani, S.; Silva, A.P. Does peer assessment in on-line learning environments work? A systematic review of the literature. Comput. Hum. Behav. 2016 , 64 , 94–107. [ Google Scholar ] [ CrossRef ]
  • Sheshasaayee, A.; Bee, M.N. Analyzing online learning effectiveness for knowledge society. In Information Systems Design and Intelligent Applications ; Bhateja, V., Nguyen, B., Nguyen, N., Satapathy, S., Le, D.N., Eds.; Springer: Singapore, 2018; pp. 995–1002. [ Google Scholar ]
  • Panigrahi, R.; Srivastava, P.R.; Sharma, D. Online learning: Adoption, continuance, and learning outcome—A review of literature. Int. J. Inform. Manag. 2018 , 43 , 1–14. [ Google Scholar ] [ CrossRef ]
  • Al-Rahmi, W.M.; Alias, N.; Othman, M.S.; Alzahrani, A.I.; Alfarraj, O.; Saged, A.A. Use of e-learning by university students in Malaysian higher educational institutions: A case in Universiti Teknologi Malaysia. IEEE Access 2018 , 6 , 14268–14276. [ Google Scholar ] [ CrossRef ]
  • Al-Rahmi, W.M.; Yahaya, N.; Aldraiweesh, A.A.; Alamri, M.M.; Aljarboa, N.A.; Alturki, U. Integrating technology acceptance model with innovation diffusion theory: An empirical investigation on students’ intention to use E-learning systems. IEEE Access 2019 , 7 , 26797–26809. [ Google Scholar ] [ CrossRef ]
  • Gunawan, I.; Hui, L.K.; Ma’sum, M.A. Enhancing learning effectiveness by using online learning management system. In Proceedings of the 6th International Conference on Education and Technology (ICET), Beijing, China, 18–20 June 2021; pp. 48–52. [ Google Scholar ]
  • Nguyen, P.H.; Tangworakitthaworn, P.; Gilbert, L. Individual learning effectiveness based on cognitive taxonomies and constructive alignment. In Proceedings of the IEEE Region 10 Conference (Tencon), Osaka, Japan, 16–19 November 2020; pp. 1002–1006. [ Google Scholar ]
  • Pee, L.G. Enhancing the learning effectiveness of ill-structured problem solving with online co-creation. Stud. High. Educ. 2020 , 45 , 2341–2355. [ Google Scholar ] [ CrossRef ]
  • Kintu, M.J.; Zhu, C.; Kagambe, E. Blended learning effectiveness: The relationship between student characteristics, design features and outcomes. Int. J. Educ. Technol. High. Educ. 2017 , 14 , 1–20. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Wang, M.H.; Vogel, D.; Ran, W.J. Creating a performance-oriented e-learning environment: A design science approach. Inf. Manag. 2011 , 48 , 260–269. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Hew, K.F.; Cheung, W.S. Students’ and instructors’ use of massive open online courses (MOOCs): Motivations and challenges. Educ. Res. Rev. 2014 , 12 , 45–58. [ Google Scholar ] [ CrossRef ]
  • Bryant, J.; Bates, A.J. Creating a constructivist online instructional environment. TechTrends 2015 , 59 , 17–22. [ Google Scholar ] [ CrossRef ]
  • Lee, M.C. Explaining and predicting users’ continuance intention toward e-learning: An extension of the expectation–confirmation model. Comput. Educ. 2010 , 54 , 506–516. [ Google Scholar ] [ CrossRef ]
  • Lin, K.M. E-Learning continuance intention: Moderating effects of user e-learning experience. Comput. Educ. 2011 , 56 , 515–526. [ Google Scholar ] [ CrossRef ]
  • Huang, E.Y.; Lin, S.W.; Huang, T.K. What type of learning style leads to online participation in the mixed-mode e-learning environment? A study of software usage instruction. Comput. Educ. 2012 , 58 , 338–349. [ Google Scholar ]
  • Chu, T.H.; Chen, Y.Y. With good we become good: Understanding e-learning adoption by theory of planned behavior and group influences. Comput. Educ. 2016 , 92 , 37–52. [ Google Scholar ] [ CrossRef ]
  • Bandura, A. Self-efficacy: Toward a unifying theory of behavioral change. Psychol. Rev. 1977 , 84 , 191–215. [ Google Scholar ] [ CrossRef ]
  • Torkzadeh, G.; Van Dyke, T.P. Development and validation of an Internet self-efficacy scale. Behav. Inform. Technol. 2001 , 20 , 275–280. [ Google Scholar ] [ CrossRef ]
  • Saadé, R.G.; Kira, D. Computer anxiety in e-learning: The effect of computer self-efficacy. J. Inform. Technol. Educ. Res. 2009 , 8 , 177–191. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Tucker, J.; Gentry, G. Developing an E-Learning strategy in higher education. In Proceedings of the SITE 2009–Society for Information Technology & Teacher Education International Conference, Charleston, SC, USA, 2–6 March 2009; pp. 2702–2707. [ Google Scholar ]
  • Wang, Y.; Peng, H.M.; Huang, R.H.; Hou, Y.; Wang, J. Characteristics of distance learners: Research on relationships of learning motivation, learning strategy, self-efficacy, attribution and learning results. Open Learn. J. Open Distance Elearn. 2008 , 23 , 17–28. [ Google Scholar ] [ CrossRef ]
  • Mahmud, B.H. Study on the impact of motivation, self-efficacy and learning strategies of faculty of education undergraduates studying ICT courses. In Proceedings of the 4th International Postgraduate Research Colloquium (IPRC) Proceedings, Bangkok, Thailand, 29 October 2009; pp. 59–80. [ Google Scholar ]
  • Yusuf, M. Investigating relationship between self-efficacy, achievement motivation, and self-regulated learning strategies of undergraduate Students: A study of integrated motivational models. Procedia Soc. Behav. Sci. 2011 , 15 , 2614–2617. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • De la Fuente, J.; Martínez-Vicente, J.M.; Peralta-Sánchez, F.J.; GarzónUmerenkova, A.; Vera, M.M.; Paoloni, P. Applying the SRL vs. ERL theory to the knowledge of achievement emotions in undergraduate university students. Front. Psychol. 2019 , 10 , 2070. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Ahmadi, S. Academic self-esteem, academic self-efficacy and academic achievement: A path analysis. J. Front. Psychol. 2020 , 5 , 155. [ Google Scholar ]
  • Meyen, E.L.; Aust, R.J.; Bui, Y.N. Assessing and monitoring student progress in an E-learning personnel preparation environment. Teach. Educ. Spec. Educ. 2002 , 25 , 187–198. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Dunlosky, J.; Kubat-Silman, A.K.; Christopher, H. Training monitoring skills improves older adults’ self-paced associative learning. Psychol. Aging 2003 , 18 , 340–345. [ Google Scholar ] [ CrossRef ]
  • Zhang, H.J. Research on the relationship between English learning motivation. Self-monitoring and Test Score. Ethnic Educ. Res. 2005 , 6 , 66–71. [ Google Scholar ]
  • Rosenberg, M.J. E-Learning: Strategies for Delivering Knowledge in the Digital Age ; McGraw-Hill: New York, NY, USA, 2001. [ Google Scholar ]
  • Bhat, S.A.; Bashir, M. Measuring ICT orientation: Scale development & validation. Educ. Inf. Technol. 2018 , 23 , 1123–1143. [ Google Scholar ]
  • Achuthan, K.; Francis, S.P.; Diwakar, S. Augmented reflective learning and knowledge retention perceived among students in classrooms involving virtual laboratories. Educ. Inf. Technol. 2017 , 22 , 2825–2855. [ Google Scholar ] [ CrossRef ]
  • Hu, X.; Yelland, N. An investigation of preservice early childhood teachers’ adoption of ICT in a teaching practicum context in Hong Kong. J. Early Child. Teach. Educ. 2017 , 38 , 259–274. [ Google Scholar ] [ CrossRef ]
  • Fraillon, J.; Ainley, J.; Schulz, W.; Friedman, T.; Duckworth, D. Preparing for Life in a Digital World: The IEA International Computer and Information Literacy Study 2018 International Report ; Springer: New York, NY, USA, 2019. [ Google Scholar ]
  • Huber, S.G.; Helm, C. COVID-19 and schooling: Evaluation, assessment and accountability in times of crises—Reacting quickly to explore key issues for policy, practice and research with the school barometer. Educ. Assess. Eval. Account. 2020 , 32 , 237–270. [ Google Scholar ] [ CrossRef ]
  • Eickelmann, B.; Gerick, J. Learning with digital media: Objectives in times of Corona and under special consideration of social Inequities. Dtsch. Schule. 2020 , 16 , 153–162. [ Google Scholar ]
  • Shehzadi, S.; Nisar, Q.A.; Hussain, M.S.; Basheer, M.F.; Hameed, W.U.; Chaudhry, N.I. The role of e-learning toward students’ satisfaction and university brand image at educational institutes of Pakistan: A post-effect of COVID-19. Asian Educ. Dev. Stud. 2020 , 10 , 275–294. [ Google Scholar ] [ CrossRef ]
  • Miller, E.M.; Walton, G.M.; Dweck, C.S.; Job, V.; Trzesniewski, K.; McClure, S. Theories of willpower affect sustained learning. PLoS ONE 2012 , 7 , 38680. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Moriña, A.; Molina, V.M.; Cortés-Vega, M.D. Voices from Spanish students with disabilities: Willpower and effort to survive university. Eur. J. Spec. Needs Educ. 2018 , 33 , 481–494. [ Google Scholar ] [ CrossRef ]
  • Koballa, T.R., Jr.; Crawley, F.E. The influence of attitude on science teaching and learning. Sch. Sci. Math. 1985 , 85 , 222–232. [ Google Scholar ] [ CrossRef ]
  • Chao, C.Y.; Chen, Y.T.; Chuang, K.Y. Exploring students’ learning attitude and achievement in flipped learning supported computer aided design curriculum: A study in high school engineering education. Comput. Appl. Eng. Educ. 2015 , 23 , 514–526. [ Google Scholar ] [ CrossRef ]
  • Stefan, M.; Ciomos, F. The 8th and 9th grades students’ attitude towards teaching and learning physics. Acta Didact. Napocensia. 2010 , 3 , 7–14. [ Google Scholar ]
  • Sedighi, F.; Zarafshan, M.A. Effects of attitude and motivation on the use of language learning strategies by Iranian EFL University students. J. Soc. Sci. Humanit. Shiraz Univ. 2007 , 23 , 71–80. [ Google Scholar ]
  • Megan, S.; Jennifer, H.C.; Stephanie, V.; Kyla, H. The relationship among middle school students’ motivation orientations, learning strategies, and academic achievement. Middle Grades Res. J. 2013 , 8 , 1–12. [ Google Scholar ]
  • Nasser, O.; Majid, V. Motivation, attitude, and language learning. Procedia Soc. Behav. Sci. 2011 , 29 , 994–1000. [ Google Scholar ]
  • Özhan, Ş.Ç.; Kocadere, S.A. The effects of flow, emotional engagement, and motivation on success in a gamified online learning environment. J. Educ. Comput. Res. 2020 , 57 , 2006–2031. [ Google Scholar ] [ CrossRef ]
  • Wang, A.P.; Che, H.S. A research on the relationship between learning anxiety, learning attitude, motivation and test performance. Psychol. Dev. Educ. 2005 , 21 , 55–59. [ Google Scholar ]
  • Lin, C.H.; Zhang, Y.N.; Zheng, B.B. The roles of learning strategies and motivation in online language learning: A structural equation modeling analysis. Comput. Educ. 2017 , 113 , 75–85. [ Google Scholar ] [ CrossRef ]
  • Deschênes, M.F.; Goudreau, J.; Fernandez, N. Learning strategies used by undergraduate nursing students in the context of a digital educational strategy based on script concordance: A descriptive study. Nurse Educ. Today 2020 , 95 , 104607. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Jerusalem, M.; Schwarzer, R. Self-efficacy as a resource factor in stress appraisal processes. In Self-Efficacy: Thought Control of Action ; Schwarzer, R., Ed.; Hemisphere Publishing Corp: Washington, DC, USA, 1992; pp. 195–213. [ Google Scholar ]
  • Zimmerman, B.J. Becoming a self-regulated learner: An overview. Theory Pract. 2002 , 41 , 64–70. [ Google Scholar ] [ CrossRef ]
  • Pintrich, P.R.; Smith, D.A.F.; García, T.; McKeachie, W.J. A Manual for the Use of the Motivated Strategies Questionnaire (MSLQ) ; University of Michigan, National Center for Research to Improve Post Secondary Teaching and Learning: Ann Arbor, MI, USA, 1991. [ Google Scholar ]
  • Knowles, E.; Kerkman, D. An investigation of students attitude and motivation toward online learning. InSight Collect. Fac. Scholarsh. 2007 , 2 , 70–80. [ Google Scholar ] [ CrossRef ]
  • Hair, J.F., Jr.; Black, W.C.; Babin, B.J.; Anderson, R.E. Multivariate Data Analysis: A Global Perspective , 7th ed.; Pearson Education International: Upper Saddle River, NJ, USA, 2010. [ Google Scholar ]
  • Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981 , 18 , 39–50. [ Google Scholar ] [ CrossRef ]
  • Hair, J.F., Jr.; Hult, G.T.M.; Ringle, C.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM) ; Sage: Los Angeles, CA, USA, 2016. [ Google Scholar ]
  • Kiliç-Çakmak, E. Learning strategies and motivational factors predicting information literacy self-efficacy of e-learners. Aust. J. Educ. Technol. 2010 , 26 , 192–208. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Zheng, C.; Liang, J.C.; Li, M.; Tsai, C. The relationship between English language learners’ motivation and online self-regulation: A structural equation modelling approach. System 2018 , 76 , 144–157. [ Google Scholar ] [ CrossRef ]
  • May, M.; George, S.; Prévôt, P. TrAVis to enhance students’ self-monitoring in online learning supported by computer-mediated communication tools. Int. J. Comput. Inform. Syst. Ind. Manag. Appl. 2011 , 3 , 623–634. [ Google Scholar ]
  • Rafart, M.A.; Bikfalvi, A.; Soler, J.; Poch, J. Impact of using automatic E-Learning correctors on teaching business subjects to engineers. Int. J. Eng. Educ. 2019 , 35 , 1630–1641. [ Google Scholar ]
  • Lee, P.M.; Tsui, W.H.; Hsiao, T.C. A low-cost scalable solution for monitoring affective state of students in E-learning environment using mouse and keystroke data. In Intelligent Tutoring Systems ; Cerri, S.A., Clancey, W.J., Papadourakis, G., Panourgia, K., Eds.; Springer: Berlin, Germany, 2012; pp. 679–680. [ Google Scholar ]
  • Metz, D.; Karadgi, S.S.; Müller, U.J.; Grauer, M. Self-Learning monitoring and control of manufacturing processes based on rule induction and event processing. In Proceedings of the 4th International Conference on Information, Process, and Knowledge Management eKNOW, Valencia, Spain, 21–25 November 2012; pp. 78–85. [ Google Scholar ]
  • Fitch, J.L.; Ravlin, E.C. Willpower and perceived behavioral control: Intention-behavior relationship and post behavior attributions. Soc. Behav. Pers. Int. J. 2005 , 33 , 105–124. [ Google Scholar ] [ CrossRef ]
  • Sridharan, B.; Deng, H.; Kirk, J.; Brian, C. Structural equation modeling for evaluating the user perceptions of e-learning effectiveness in higher education. In Proceedings of the ECIS 2010: 18th European Conference on Information Systems, Pretoria, South Africa, 7–9 June 2010. [ Google Scholar ]
  • Tarhini, A.; Hone, K.; Liu, X. The effects of individual differences on e-learning users’ behaviour in developing countries: A structural equation model. Comput. Hum. Behav. 2014 , 41 , 153–163. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • de Leeuw, R.A.; Logger, D.N.; Westerman, M.; Bretschneider, J.; Plomp, M.; Scheele, F. Influencing factors in the implementation of postgraduate medical e-learning: A thematic analysis. BMC Med. Educ. 2019 , 19 , 300. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Erenler, H.H.T. A structural equation model to evaluate students’ learning and satisfaction. Comput. Appl. Eng. Educ. 2020 , 28 , 254–267. [ Google Scholar ] [ CrossRef ]
  • Fee, K. Delivering E-learning: A complete strategy for design, application and assessment. Dev. Learn. Organ. 2013 , 27 , 40–52. [ Google Scholar ] [ CrossRef ]
  • So, W.W.N.; Chen, Y.; Wan, Z.H. Multimedia e-Learning and self-regulated science learning: A study of primary school learners’ experiences and perceptions. J. Sci. Educ. Technol. 2019 , 28 , 508–522. [ Google Scholar ] [ CrossRef ]


Variables                  Category                Frequency   Percentage
Gender                     Male                    243         51.81
                           Female                  226         48.19
Education program level    Undergraduate program   210         44.78
                           Master program          154         32.84
                           Doctoral program        105         22.39
Online learning tools      Smartphone              255         54.37
                           Computer/PC             125         26.65
                           Tablet                  89          18.98
Online learning media      Google Meet             132         28.14
                           Microsoft Teams         99          21.11
                           Zoom                    196         41.79
                           Others                  42          8.96
Construct                                        Item   Factor Loading   AVE    Composite Reliability   Cronbach's Alpha
Online Learning Benefit (LBE)                    LBE1   0.88             0.68   0.86                    0.75
                                                 LBE2   0.86
                                                 LBE3   0.71
Online-learning effectiveness (LEF)              LEF1   0.83             0.76   0.90                    0.84
                                                 LEF2   0.88
                                                 LEF3   0.90
Online-learning motivation (LMT)                 LMT1   0.86             0.77   0.91                    0.85
                                                 LMT2   0.91
                                                 LMT3   0.85
Online-learning strategies (LST)                 LST1   0.90             0.75   0.90                    0.84
                                                 LST2   0.87
                                                 LST3   0.83
Online-learning attitude (OLA)                   OLA1   0.89             0.75   0.90                    0.84
                                                 OLA2   0.83
                                                 OLA3   0.87
Online-learning confidence-in-technology (OLC)   OLC1   0.87             0.69   0.87                    0.76
                                                 OLC2   0.71
                                                 OLC3   0.89
Online-learning monitoring (OLM)                 OLM1   0.88             0.75   0.89                    0.83
                                                 OLM2   0.91
                                                 OLM3   0.79
Online-learning self-efficacy (OLS)              OLS1   0.79             0.64   0.84                    0.73
                                                 OLS2   0.81
                                                 OLS3   0.89
Online-learning willpower (OLW)                  OLW1   0.91             0.69   0.87                    0.77
                                                 OLW2   0.84
                                                 OLW3   0.73
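As a quick consistency check, the AVE and composite-reliability figures in the measurement table can be recomputed from the reported standardized loadings using the standard formulas. This is a sketch, not a computation from the paper: the loadings below are the LBE items from the table, and small rounding differences (e.g., 0.67 vs. the reported 0.68 for AVE) are expected.

```python
# AVE = mean of squared standardized loadings.
# CR  = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
# where each item's error variance is 1 - loading^2.

def ave(loadings):
    return sum(l * l for l in loadings) / len(loadings)

def composite_reliability(loadings):
    s = sum(loadings)
    error = sum(1 - l * l for l in loadings)
    return s * s / (s * s + error)

lbe = [0.88, 0.86, 0.71]  # Online Learning Benefit items LBE1-LBE3 from the table
print(round(ave(lbe), 2), round(composite_reliability(lbe), 2))  # → 0.67 0.86
```

The same two functions reproduce the remaining constructs' AVE and CR columns from their loadings to within rounding.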
       LBE    LEF    LMT    LST    OLA    OLC    OLM    OLS    OLW
LBE
LEF    0.82
LMT    0.81   0.80
LST    0.80   0.84   0.86
OLA    0.69   0.63   0.78   0.81
OLC    0.76   0.79   0.85   0.79   0.72
OLM    0.81   0.85   0.81   0.76   0.63   0.83
OLS    0.71   0.59   0.69   0.57   0.56   0.69   0.75
OLW    0.75   0.75   0.80   0.74   0.64   0.81   0.80   0.79
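Assuming the matrix above reports inter-construct correlations (a Fornell–Larcker lower triangle, which is an assumption here), discriminant validity holds for a construct when the square root of its AVE exceeds its correlation with every other construct. A minimal sketch, using AVEs from the measurement table and a few illustrative correlations from the matrix:

```python
import math

# AVEs taken from the measurement table above.
ave = {"LBE": 0.68, "LEF": 0.76, "LMT": 0.77, "LST": 0.75,
       "OLA": 0.75, "OLC": 0.69, "OLM": 0.75, "OLS": 0.64, "OLW": 0.69}

# A few inter-construct correlations from the lower triangle above.
corr = {("LEF", "LBE"): 0.82, ("LMT", "LST"): 0.86,
        ("OLC", "LMT"): 0.85, ("OLM", "LEF"): 0.85}

def passes(construct):
    """Fornell-Larcker: sqrt(AVE) must exceed every listed correlation."""
    root = math.sqrt(ave[construct])
    return all(r <= root for pair, r in corr.items() if construct in pair)

print(passes("LMT"))  # → True  (sqrt(0.77) ≈ 0.88 > 0.86 and 0.85)
print(passes("OLC"))  # → False (sqrt(0.69) ≈ 0.83 < 0.85)
```

With these numbers the check flags OLC, whose correlation with LMT (0.85) slightly exceeds its √AVE (≈0.83); whether that is a real concern depends on whether the matrix is a correlation table or an HTMT table.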
Item   LBE    LEF    LMT    LST    OLA    OLC    OLM    OLS    OLW
LBE1   0.88   0.76   0.87   0.66   0.54   0.79   0.78   0.63   0.74
LBE2   0.86   0.68   0.74   0.63   0.57   0.75   0.91   0.73   0.79
LBE3   0.71   0.54   0.59   0.71   0.63   0.55   0.50   0.36   0.53
LEF1   0.63   0.83   0.72   0.65   0.51   0.62   0.69   0.46   0.57
LEF2   0.77   0.88   0.78   0.71   0.55   0.73   0.78   0.52   0.69
LEF3   0.72   0.90   0.80   0.83   0.57   0.72   0.76   0.58   0.69
LMT1   0.88   0.76   0.87   0.66   0.54   0.79   0.78   0.63   0.74
LMT2   0.79   0.89   0.91   0.79   0.62   0.73   0.88   0.61   0.67
LMT3   0.72   0.65   0.85   0.77   0.89   0.72   0.67   0.59   0.69
LST1   0.61   0.63   0.68   0.90   0.78   0.64   0.57   0.39   0.57
LST2   0.74   0.59   0.72   0.87   0.78   0.68   0.61   0.48   0.63
LST3   0.72   0.90   0.80   0.83   0.57   0.72   0.76   0.58   0.69
OLA1   0.72   0.65   0.85   0.79   0.89   0.72   0.67   0.59   0.69
OLA2   0.51   0.48   0.55   0.59   0.83   0.58   0.47   0.42   0.43
OLA3   0.52   0.44   0.55   0.70   0.87   0.55   0.43   0.39   0.47
OLC1   0.78   0.70   0.73   0.65   0.53   0.87   0.77   0.65   0.91
OLC2   0.51   0.53   0.57   0.62   0.75   0.71   0.46   0.39   0.47
OLC3   0.81   0.73   0.78   0.69   0.55   0.89   0.80   0.66   0.75
OLM1   0.79   0.89   0.91   0.79   0.62   0.73   0.88   0.61   0.69
OLM2   0.86   0.68   0.74   0.63   0.57   0.75   0.91   0.73   0.79
OLM3   0.69   0.55   0.57   0.47   0.39   0.67   0.79   0.61   0.73
OLS1   0.41   0.23   0.35   0.28   0.39   0.41   0.40   0.69   0.49
OLS2   0.45   0.41   0.48   0.38   0.43   0.48   0.52   0.81   0.49
OLS3   0.75   0.66   0.72   0.60   0.49   0.69   0.77   0.89   0.82
OLW1   0.78   0.70   0.73   0.65   0.53   0.87   0.77   0.65   0.91
OLW2   0.75   0.65   0.71   0.59   0.51   0.69   0.77   0.87   0.84
OLW3   0.57   0.49   0.54   0.59   0.57   0.57   0.53   0.39   0.73
Hypothesis   Path        Standardized Path Coefficient   t-Value   Result
H1           OLS → LST   0.29 ***                        2.14      Accepted
H2           OLM → LST   0.24 ***                        2.29      Accepted
H3           OLC → LST   0.28 ***                        1.99      Accepted
H4           OLC → LMT   0.36 ***                        2.96      Accepted
H5           OLW → LMT   0.26 ***                        2.55      Accepted
H6           OLA → LMT   0.34 ***                        4.68      Accepted
H7           LMT → LST   0.71 ***                        4.96      Accepted
H8           LMT → LEF   0.60 ***                        5.89      Accepted
H9           LST → LEF   0.32 ***                        3.04      Accepted
H10          LEF → LBE   0.81 ***                        23.6      Accepted
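The standardized path coefficients above can be chained to obtain implied indirect effects — standard path-analysis arithmetic, sketched below rather than a computation reported by the paper itself. For example, motivation (LMT) reaches learning benefit (LBE) only through effectiveness (LEF):

```python
import math

# Standardized direct paths from the hypothesis table above.
paths = {("LMT", "LEF"): 0.60, ("LST", "LEF"): 0.32,
         ("LEF", "LBE"): 0.81, ("LMT", "LST"): 0.71}

def indirect(*chain):
    """Product of path coefficients along a chain of constructs."""
    return round(math.prod(paths[(a, b)] for a, b in zip(chain, chain[1:])), 3)

print(indirect("LMT", "LEF", "LBE"))         # 0.60 * 0.81 → 0.486
print(indirect("LMT", "LST", "LEF", "LBE"))  # 0.71 * 0.32 * 0.81 → 0.184
```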

Share and Cite

Hongsuchon, T.; Emary, I.M.M.E.; Hariguna, T.; Qhal, E.M.A. Assessing the Impact of Online-Learning Effectiveness and Benefits in Knowledge Management, the Antecedent of Online-Learning Strategies and Motivations: An Empirical Study. Sustainability 2022 , 14 , 2570. https://doi.org/10.3390/su14052570



  • Open access
  • Published: 09 January 2024

Online vs in-person learning in higher education: effects on student achievement and recommendations for leadership

  • Bandar N. Alarifi
  • Steve Song

Humanities and Social Sciences Communications, volume 11, Article number: 86 (2024)


This study is a comparative analysis of online distance learning and traditional in-person education at King Saud University in Saudi Arabia, focusing on how educational modality affects student achievement. The study is motivated by the rapid shift toward online learning, highlighted by the educational changes during the COVID-19 pandemic. By analyzing the final test scores of freshman students in five core courses across the 2020 (in-person) and 2021 (online) academic years, the research provides empirical insights into the efficacy of online versus traditional education. Initial observations suggested that students in online settings scored lower in most courses. However, after adjusting for variables such as gender, class size, and admission scores using multiple linear regression, a more nuanced picture emerged: three courses showed better performance in the 2021 online cohort, one favored the 2020 in-person group, and one was unaffected by teaching format. The study emphasizes the need for a nuanced, data-driven strategy when integrating online learning into higher education systems, highlighting that the success of an educational methodology is highly contingent on context. This finding calls on educational administrators and policymakers to exercise careful, informed judgment when adopting online learning modalities, thoroughly evaluating how different subjects and instructional approaches interact with online formats and the variable effects these may have on learning outcomes. Such an approach ensures that decisions about implementing online education rest on a comprehensive understanding of its diverse, context-specific impacts and aim to optimize educational effectiveness and student success.


Introduction

The year 2020 marked an extraordinary period, characterized by the global disruption caused by the COVID-19 pandemic. Governments and institutions worldwide had to adapt to unforeseen challenges across various domains, including health, economy, and education. In response, many educational institutions quickly transitioned to distance teaching (also known as e-learning, online learning, or virtual classrooms) to ensure continued access to education for their students. However, despite this rapid and widespread shift to online learning, a comprehensive examination of its effects on student achievement in comparison to traditional in-person instruction remains largely unexplored.

In research examining student outcomes in the context of online learning, the prevailing trend is the consistent observation that online learners often achieve less favorable results when compared to their peers in traditional classroom settings (e.g., Fischer et al., 2020 ; Bettinger et al., 2017 ; Edvardsson and Oskarsson, 2008 ). However, it is important to note that a significant portion of research on online learning has primarily focused on its potential impact (Kuhfeld et al., 2020 ; Azevedo et al., 2020 ; Di Pietro et al., 2020 ) or explored various perspectives (Aucejo et al., 2020 ; Radha et al., 2020 ) concerning distance education. These studies have often omitted a comprehensive and nuanced examination of its concrete academic consequences, particularly in terms of test scores and grades.

Given the dearth of research on the academic impact of online learning, especially in the context of COVID-19, the present study aims to address that gap by assessing the effectiveness of distance learning compared to in-person teaching in five required freshman-level courses at King Saud University, Saudi Arabia. To accomplish this objective, the current study compared the final exam results of 8297 freshman students who took the five courses in person in 2020 to those of their 8425 first-year counterparts who took the same courses at the same institution in 2021 in an online format.

The final test results of the five courses (i.e., University Skills 101, Entrepreneurship 101, Computer Skills 101, Computer Skills 102, and Fitness and Health Culture 101) were examined, accounting for potential confounding factors such as gender, class size, and admission scores, which past research has found to be correlated with student achievement (e.g., Meinck and Brese, 2019; Jepsen, 2015). Additionally, as the preparatory year at King Saud University is divided into five tracks (health, nursing, science, business, and humanity), the study classified students based on their respective disciplines.

Motivation for the study

The rapid expansion of distance learning in higher education, particularly highlighted during the recent COVID-19 pandemic (Volk et al., 2020 ; Bettinger et al., 2017 ), underscores the need for alternative educational approaches during crises. Such disruptions can catalyze innovation and the adoption of distance learning as a contingency plan (Christensen et al., 2015 ). King Saud University, like many institutions worldwide, faced the challenge of transitioning abruptly to online learning in response to the pandemic.

E-learning has gained prominence in higher education due to technological advancements, offering institutions a competitive edge (Valverde-Berrocoso et al., 2020 ). Especially during conditions like the COVID-19 pandemic, electronic communication was utilized across the globe as a feasible means to overcome barriers and enhance interactions (Bozkurt, 2019 ).

Distance learning, characterized by flexibility, becomes crucial when traditional in-person classes are hindered by unforeseen circumstances such as those posed by COVID-19 (Arkorful and Abaidoo, 2015). Scholars argue that it allows students to learn at their own pace, often referred to as self-directed learning (Hiemstra, 1994) or self-education (Gadamer, 2001). Additional advantages include accessibility, cost-effectiveness, and flexibility (Sadeghi, 2019).

However, distance learning is not immune to its own set of challenges. Technical impediments, encompassing network issues, device limitations, and communication hiccups, represent formidable hurdles (Sadeghi, 2019 ). Furthermore, concerns about potential distractions in the online learning environment, fueled by the ubiquity of the internet and social media, have surfaced (Hall et al., 2020 ; Ravizza et al., 2017 ). The absence of traditional face-to-face interactions among students and between students and instructors is also viewed as a potential drawback (Sadeghi, 2019 ).

Given the evolving understanding of the pros and cons of distance learning, this study aims to contribute to the existing literature by assessing the effectiveness of distance learning, specifically in terms of student achievement, as compared to in-person classroom learning at King Saud University, one of Saudi Arabia’s largest higher education institutions.

Academic achievement: in-person vs online learning

The primary driving force behind the rapid integration of technology in education has been its emphasis on student performance (Lai and Bower, 2019 ). Over the past decade, numerous studies have undertaken comparisons of student academic achievement in online and in-person settings (e.g., Bettinger et al., 2017 ; Fischer et al., 2020 ; Iglesias-Pradas et al., 2021 ). This section offers a concise review of the disparities in academic achievement between college students engaged in in-person and online learning, as identified in existing research.

A number of studies point to the superiority of traditional in-person education over online learning in terms of academic outcomes. For example, Fischer et al. ( 2020 ) conducted a comprehensive study involving 72,000 university students across 433 subjects, revealing that online students tend to achieve slightly lower academic results than their in-class counterparts. Similarly, Bettinger et al. ( 2017 ) found that students at for-profit online universities generally underperformed when compared to their in-person peers. Supporting this trend, Figlio et al. ( 2013 ) indicated that in-person instruction consistently produced better results, particularly among specific subgroups like males, lower-performing students, and Hispanic learners. Additionally, Kaupp’s ( 2012 ) research in California community colleges demonstrated that online students faced lower completion and success rates compared to their traditional in-person counterparts (Fig. 1 ).

Figure 1: Student achievement on the final tests in the five courses by year, compared using independent-samples t-tests; the results show a statistically significant drop in test scores from 2020 (in person) to 2021 (online) for all courses except CT_101.

In contrast, other studies present evidence of online students outperforming their in-person peers. For example, Iglesias-Pradas et al. ( 2021 ) conducted a comparative analysis of 43 bachelor courses at Telecommunication Engineering College in Malaysia, revealing that online students achieved higher academic outcomes than their in-person counterparts. Similarly, during the COVID-19 pandemic, Gonzalez et al. ( 2020 ) found that students engaged in online learning performed better than those who had previously taken the same subjects in traditional in-class settings.

Expanding on this topic, several studies have reported mixed results when comparing the academic performance of online and in-person students, with various student and instructor factors emerging as influential variables. Chesser et al. ( 2020 ) noted that student traits such as conscientiousness, agreeableness, and extraversion play a substantial role in academic achievement, regardless of the learning environment—be it traditional in-person classrooms or online settings. Furthermore, Cacault et al. ( 2021 ) discovered that online students with higher academic proficiency tend to outperform those with lower academic capabilities, suggesting that differences in students’ academic abilities may impact their performance. In contrast, Bergstrand and Savage ( 2013 ) found that online classes received lower overall ratings and exhibited a less respectful learning environment when compared to in-person instruction. Nevertheless, they also observed that the teaching efficiency of both in-class and online courses varied significantly depending on the instructors’ backgrounds and approaches. These findings underscore the multifaceted nature of the online vs. in-person learning debate, highlighting the need for a nuanced understanding of the factors at play.

Theoretical framework

Constructivism is a well-established learning theory that places learners at the forefront of their educational experience, emphasizing their active role in constructing knowledge through interactions with their environment (Duffy and Jonassen, 2009 ). According to constructivist principles, learners build their understanding by assimilating new information into their existing cognitive frameworks (Vygotsky, 1978 ). This theory highlights the importance of context, active engagement, and the social nature of learning (Dewey, 1938 ). Constructivist approaches often involve hands-on activities, problem-solving tasks, and opportunities for collaborative exploration (Brooks and Brooks, 1999 ).

In the realm of education, subject-specific pedagogy emerges as a vital perspective that acknowledges the distinctive nature of different academic disciplines (Shulman, 1986 ). It suggests that teaching methods should be tailored to the specific characteristics of each subject, recognizing that subjects like mathematics, literature, or science require different approaches to facilitate effective learning (Shulman, 1987 ). Subject-specific pedagogy emphasizes that the methods of instruction should mirror the ways experts in a particular field think, reason, and engage with their subject matter (Cochran-Smith and Zeichner, 2005 ).

When applying these principles to the design of instruction for online and in-person learning environments, the significance of adapting methods becomes even more pronounced. Online learning often requires unique approaches due to its reliance on technology, asynchronous interactions, and potential for reduced social presence (Anderson, 2003 ). In-person learning, on the other hand, benefits from face-to-face interactions and immediate feedback (Allen and Seaman, 2016 ). Here, the interplay of constructivism and subject-specific pedagogy becomes evident.

Online learning. In an online environment, constructivist principles can be upheld by creating interactive online activities that promote exploration, reflection, and collaborative learning (Salmon, 2000 ). Discussion forums, virtual labs, and multimedia presentations can provide opportunities for students to actively engage with the subject matter (Harasim, 2017 ). By integrating subject-specific pedagogy, educators can design online content that mirrors the discipline’s methodologies while leveraging technology for authentic experiences (Koehler and Mishra, 2009 ). For instance, an online history course might incorporate virtual museum tours, primary source analysis, and collaborative timeline projects.

In-person learning. In a traditional brick-and-mortar classroom setting, constructivist methods can be implemented through group activities, problem-solving tasks, and in-depth discussions that encourage active participation (Jonassen et al., 2003 ). Subject-specific pedagogy complements this by shaping instructional methods to align with the inherent characteristics of the subject (Hattie, 2009). For instance, in a physics class, hands-on experiments and real-world applications can bring theoretical concepts to life (Hake, 1998 ).

In sum, the fusion of constructivism and subject-specific pedagogy offers a versatile approach to instructional design that adapts to different learning environments (Garrison, 2011 ). By incorporating the principles of both theories, educators can tailor their methods to suit the unique demands of online and in-person learning, ultimately providing students with engaging and effective learning experiences that align with the nature of the subject matter and the mode of instruction.

Course description

The Self-Development Skills Department at King Saud University (KSU) offers five mandatory freshman-level courses. These courses aim to foster advanced thinking skills and cultivate scientific research abilities in students. They do so by imparting essential skills, identifying higher-level thinking patterns, and facilitating hands-on experience in scientific research. The design of these classes is centered around aiding students’ smooth transition into university life. Brief descriptions of these courses are as follows:

University Skills 101 (CI 101) is a three-hour credit course designed to nurture essential academic, communication, and personal skills among all preparatory year students at King Saud University. The primary goal of this course is to equip students with the practical abilities they need to excel in their academic pursuits and navigate their university lives effectively. CI 101 comprises 12 sessions and is an integral part of the curriculum for all incoming freshmen, ensuring a standardized foundation for skill development.

Fitness and Health 101 (FAJB 101) is a one-hour credit course. FAJB 101 focuses on self-development skills related to health and physical fitness, covering personal health, nutrition, sports, preventive care, psychological wellbeing, reproductive health, and first aid. The course aims to motivate students' learning through entertainment, sports activities, and physical exercises that help them maintain their health. This course is required for all incoming freshman students at King Saud University.

Entrepreneurship 101 (ENT 101) is a one-hour credit course. ENT 101 aims to develop students' entrepreneurship skills. The course provides students with the knowledge and skills to generate ideas and innovations and transform them into practical commercial projects in business settings. The entrepreneurship course consists of 14 sessions and is taught only to students in the business track.

Computer Skills 101 (CT 101) is a three-hour credit course. It provides students with basic computer skills, e.g., components, operating systems, applications, and communication backup. The course also explores data visualization, introductory modern programming with algorithms, and information security. CT 101 is taught in all tracks except the human track.

Computer Skills 102 (CT 102) is a three-hour credit course. It equips students with the IT skills to use computers efficiently, develops their research and scientific skills, and builds their capability to design basic educational software. CT 102 focuses on widely used software such as Microsoft Office. This course is taught only to students in the human track.

Structure and activities

These courses range from one to three credit hours. A one-hour credit means that students attend one hour of class each week during the academic semester; the same arrangement applies to two- and three-credit-hour courses. The types of activities in each course are shown in Table 1.

At King Saud University, each semester spans 15 weeks in duration. The total number of semester hours allocated to each course serves as an indicator of its significance within the broader context of the academic program, including the diverse tracks available to students. Throughout the two years under study (i.e., 2020 and 2021), course placements (fall or spring), course content, and the organizational structure remained consistent and uniform.

Participants

The study’s data come from the test scores of a cohort of 16,722 first-year college students enrolled at King Saud University in Saudi Arabia over two academic years: 2020 and 2021. Among these students, 8297 were engaged in traditional, in-person learning in 2020, while 8425 had transitioned to online instruction for the same courses in 2021 due to the COVID-19 pandemic. In 2020, the student population consisted of 51.5% females and 48.5% males; in 2021, these proportions reversed, with female students accounting for 48.5% and male students 51.5% of the total participants.

Regarding student enrollment in the five courses, Table 2 provides a detailed breakdown by average class size, admission scores, and the number of students enrolled in the courses during the two years covered by this study. While the total number of students in each course remained relatively consistent across the two years, there were noticeable fluctuations in average class sizes. Specifically, four out of the five courses experienced substantial increases in class size, with some nearly doubling in size (e.g., ENT_101 and CT_102), while one course (CT_101) showed a reduction in its average class size.

It must be noted that while some students enrolled in up to three different courses within the same academic year, none repeated the same exam in both years. Specifically, students who failed their courses in 2020 were required to complete them in summer sessions and were consequently excluded from this study’s dataset. To ensure clarity and precision, the analysis focused exclusively on student test scores to evaluate and compare the academic effectiveness of online and traditional in-person learning. This approach was chosen to provide a clear, direct comparison of the educational impacts associated with each teaching format.

Descriptive analyses of the final exam scores for the two years (2020 and 2021) were conducted. Additionally, student outcomes in the 2020 in-person classes were compared with those of their 2021 online-platform peers using an independent-samples t-test. Subsequently, to address potential disparities between the two groups arising from variables such as gender, class size, and admission scores (which serve as an indicator of students’ academic aptitude and pre-enrollment knowledge), multiple regression analyses were conducted. In these multivariate analyses, outcomes of both the in-person and online cohorts were assessed within their respective tracks. By controlling for these essential variables linked to student performance, the study aimed to ensure a comprehensive and equitable evaluation.

Study instrument

The study obtained students’ final exam scores for the years 2020 (in-person) and 2021 (online) from the school’s records office through their examination management system. In the preparatory year at King Saud University, final exams for all courses are developed by committees composed of faculty members from each department. To ensure valid comparisons, the final exam questions, crafted by departmental committees of professors, remained consistent and uniform for the two years under examination.

Table 3 provides a comprehensive assessment of the reliability of all five tests included in the analysis. The tests exhibit a strong degree of internal consistency, with Cronbach’s alpha coefficients ranging from 0.77 to 0.86, underscoring their reliability and suitability for the study’s objectives.
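Cronbach’s alpha, the internal-consistency statistic reported here, can be computed directly from an item-score matrix. The following Python sketch uses toy data; both the function and the scores are illustrative, not drawn from the study’s tests.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of test items
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Toy data: 5 respondents answering 3 items that track each other closely,
# so alpha should land well above the common 0.7 rule-of-thumb threshold.
scores = np.array([
    [2, 3, 2],
    [4, 4, 5],
    [3, 3, 3],
    [5, 4, 5],
    [1, 2, 1],
])
alpha = cronbach_alpha(scores)
```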

In terms of test validity, content validity was ensured through a thorough review by university subject-matter experts, resulting in test items that align well with the content domain and learning objectives. Criterion-related validity was established by correlating students’ admission test scores with their final required freshman test scores in the five subject areas, showing a moderate, acceptable relationship (0.37 to 0.56) between the final test scores and the external admissions test. Finally, construct validity was established through reviews by experienced subject instructors and guidance from university subject experts, leading to improvements in test content and affirming the effectiveness of the final tests in assessing students’ subject knowledge at the end of their coursework.
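The criterion-related validity coefficient described above is a Pearson correlation between the two score series. A minimal sketch with hypothetical paired scores (the values below are invented for illustration; the study’s own coefficients ranged from 0.37 to 0.56):

```python
import numpy as np

# Hypothetical paired scores for 8 students: admission test vs final exam.
admission = np.array([78, 85, 62, 90, 70, 88, 65, 80], dtype=float)
final_exam = np.array([72, 80, 60, 86, 68, 78, 70, 75], dtype=float)

# Pearson r serves as the criterion-related validity coefficient.
r = np.corrcoef(admission, final_exam)[0, 1]
```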

Collectively, these validity and reliability measures affirm the soundness and integrity of the final subject tests, establishing their suitability as effective assessment tools for evaluating students’ knowledge in their five mandatory freshman courses at King Saud University.

After obtaining research approval from the Research Committee at King Saud University, the coordinators of the five courses (CI_101, ENT_101, CT_101, CT_102, and FAJB_101) supplied the researchers with the final exam scores of all first-year preparatory year students at King Saud University for the initial semester of the academic years 2020 and 2021. The sample encompassed all students who had completed these five courses during both years, resulting in a total of 16,722 students forming the final group of participants.

Limitations

Several limitations warrant acknowledgment in this study. First, the research was conducted within a well-resourced major public university. As such, the experiences with online classes at other types of institutions (e.g., community colleges, private institutions) may vary significantly. Additionally, the limited data pertaining to in-class teaching practices and the diversity of learning activities across different courses represents a gap that could have provided valuable insights for a more thorough interpretation and explanation of the study’s findings.

To compare student achievement in the final tests in the five courses by year, independent-samples t-tests were conducted. Table 4 shows a statistically significant drop in test scores from 2020 (in person) to 2021 (online) for all courses except CT_101. The largest decline was in CT_102 (3.58 points) and the smallest in CI_101 (0.18 points).
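A comparison of this kind can be reproduced in outline with an independent-samples t-test. The sketch below uses simulated scores rather than the study’s data, and Welch’s variant, which does not assume equal variances across cohorts; the cohort means and spreads are assumptions chosen only to mimic a small year-over-year drop.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated final-exam scores for two cohorts (illustrative only):
# the 2021 online cohort is drawn with a slightly lower mean.
scores_2020 = rng.normal(loc=80.0, scale=8.0, size=2000)  # in person
scores_2021 = rng.normal(loc=78.0, scale=8.0, size=2000)  # online

# Welch's t-test (equal_var=False) compares the two cohort means.
t_stat, p_value = stats.ttest_ind(scores_2020, scores_2021, equal_var=False)
```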

However, such simple comparison of means between the two years (via t -tests) by subjects does not account for the differences in gender composition, class size, and admission scores between the two academic years, all of which have been associated with student outcomes (e.g., Ho and Kelman, 2014 ; De Paola et al., 2013 ). To account for such potential confounding variables, multiple regressions were conducted to compare the 2 years’ results while controlling for these three factors associated with student achievement.

Table 5 presents the regression results, illustrating the variation in final exam scores between 2020 and 2021, while controlling for gender, class size, and admission scores. Importantly, these results diverge significantly from the outcomes obtained through independent-sample t -test analyses.

Taking into consideration the variables mentioned earlier, students in the 2021 online cohort demonstrated superior performance compared to their 2020 in-person counterparts in CI_101, FAJB_101, and CT_101, with score advantages of 0.89, 0.56, and 5.28 points, respectively. Conversely, in the case of ENT_101, online students in 2021 scored 0.69 points lower than their 2020 in-person counterparts. With CT_102, there were no statistically significant differences in final exam scores between the two cohorts of students.
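Covariate adjustment of this kind can be sketched as an ordinary least squares model with an online (year) dummy plus the three controls. The data below are simulated with an assumed adjusted online advantage of 2 points, purely to illustrate the specification; none of the coefficients correspond to the study’s estimates.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000

# Simulated student records (illustrative only, not the study's data).
df = pd.DataFrame({
    "online": rng.integers(0, 2, n),            # 1 = 2021 online cohort
    "female": rng.integers(0, 2, n),            # 1 = female student
    "class_size": rng.integers(20, 120, n),
    "admission": rng.normal(80.0, 5.0, n),
})

# Generate outcomes with a built-in +2-point adjusted online effect.
df["final_score"] = (
    40.0
    + 0.5 * df["admission"]       # stronger applicants score higher
    + 2.0 * df["online"]          # assumed adjusted online advantage
    - 0.01 * df["class_size"]
    + rng.normal(0.0, 5.0, n)     # residual noise
)

# The coefficient on `online` is the 2021-vs-2020 difference after
# controlling for gender, class size, and admission scores.
model = smf.ols(
    "final_score ~ online + female + class_size + admission", data=df
).fit()
online_effect = model.params["online"]
```

A specification of this shape, fitted course by course, corresponds to the adjusted comparisons reported in Table 5.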

The study sought to assess the effectiveness of distance learning compared to in-person learning in the higher education setting in Saudi Arabia. We analyzed the final exam scores of 16,722 first-year college students in King Saud University in five required subjects (i.e., CI_101, ENT_101, CT_101, CT_102, and FAJB_101). The study initially performed a simple comparison of mean scores by tracks by year (via t -tests) and then a number of multiple regression analyses which controlled for class size, gender composition, and admission scores.

Overall, the study’s more in-depth findings using multiple regression painted a wholly different picture than the results obtained using t -tests. After controlling for class size, gender composition, and admissions scores, online students in 2021 performed better than their in-person instruction peers in 2020 in University Skills (CI_101), Fitness and Health (FAJB_101), and Computer Skills (CT_101), whereas in-person students outperformed their online peers in Entrepreneurship (ENT_101). There was no meaningful difference in outcomes for students in the Computer Skills (CT_102) course for the two years.

These findings raise a question: why do we observe minimal differences (less than a one-point gain or loss) in student outcomes in courses like University Skills, Fitness and Health, Entrepreneurship, and Advanced Computer Skills based on the mode of instruction? Is it possible that when subjects are primarily at a basic or introductory level, as is the case with these courses, the mode of instruction has a limited impact as long as the concepts are effectively communicated in a manner familiar and accessible to students?

In today’s digital age, one could argue that students in more developed countries, such as Saudi Arabia, generally possess the skills and capabilities to effectively engage with materials presented in both in-person and online formats. However, there is a notable exception in the Basic Computer Skills course, where the online cohort outperformed their in-person counterparts by more than 5 points. Insights from interviews with the instructors of this course suggest that this result may be attributed to the course’s basic and conceptual nature, coupled with the availability of instructional videos that students could revisit at their own pace.

Given that students enter this course with varying levels of computer skills, self-paced learning may have allowed them to cover course materials at their preferred speed, concentrating on less familiar topics while swiftly progressing through concepts they already understood. The advantages of such self-paced learning have been documented by scholars like Tullis and Benjamin ( 2011 ), who found that self-paced learners often outperform those who spend the same amount of time studying identical materials. This approach allows learners to allocate their time more effectively according to their individual learning pace, providing greater ownership and control over their learning experience. As such, in courses like introductory computer skills, it can be argued that becoming familiar with fundamental and conceptual topics may not require extensive in-class collaboration. Instead, it may be more about exposure to and digestion of materials in a format and at a pace tailored to students with diverse backgrounds, knowledge levels, and skill sets.

Further investigation is needed to understand more fully why some classes benefitted from online instruction while others did not. It could be posited that some content areas are more conducive to an in-person (or online) format than others. Alternatively, the differing results of the two modes of learning may have been driven by students of varying academic abilities and engagement, with low-achieving students being more vulnerable to the limitations of online learning (e.g., Kofoed et al., 2021). Whatever the reasons, the results of the current study would be enriched by a more in-depth analysis of the factors associated with these different forms of learning. Moreover, although the picture is not clear-cut, the current study does provide additional evidence against dire consequences for student learning (at least in the higher education setting) from a sudden increase in online learning, while showcasing possible benefits of its wider use.

Based on the findings of this study, we recommend that educational leaders adopt a measured approach to online learning—a stance that neither fully embraces nor outright denounces it. The impact on students’ experiences and engagement appears to vary depending on the subjects and methods of instruction, sometimes hindering, other times promoting effective learning, while some classes remain relatively unaffected.

Rather than taking a one-size-fits-all approach, educational leaders should be open to exploring the nuances behind these outcomes. This involves examining why certain courses thrived with online delivery, while others either experienced a decline in student achievement or remained largely unaffected. By exploring these differentiated outcomes associated with diverse instructional formats, leaders in higher education institutions and beyond can make informed decisions about resource allocation. For instance, resources could be channeled towards in-person learning for courses that benefit from it, while simultaneously expanding online access for courses that have demonstrated improved outcomes through its virtual format. This strategic approach not only optimizes resource allocation but could also open up additional revenue streams for the institution.

Considering the enduring presence of online learning, both before the pandemic and through its accelerated adoption due to COVID-19, there is an increasing need for institutions of learning and scholars in higher education, as well as other fields, to prioritize the study of its effects and optimal utilization. This study, which compares student outcomes between two cohorts exposed to in-person and online instruction (before and during COVID-19) at the largest university in Saudi Arabia, represents a meaningful step in this direction.

Data availability

The datasets generated during and/or analyzed during the current study are available from the corresponding author upon reasonable request.

Allen IE, Seaman J (2016) Online report card: Tracking online education in the United States . Babson Survey Group

Anderson T (2003) Getting the mix right again: an updated and theoretical rationale for interaction. Int Rev Res Open Distrib Learn , 4 (2). https://doi.org/10.19173/irrodl.v4i2.149

Arkorful V, Abaidoo N (2015) The role of e-learning, advantages and disadvantages of its adoption in higher education. Int J Instruct Technol Distance Learn 12(1):29–42


Aucejo EM, French J, Araya MP, Zafar B (2020) The impact of COVID-19 on student experiences and expectations: Evidence from a survey. Journal of Public Economics 191:104271. https://doi.org/10.1016/j.jpubeco.2020.104271


Azevedo JP, Hasan A, Goldemberg D, Iqbal SA, and Geven K (2020) Simulating the potential impacts of COVID-19 school closures on schooling and learning outcomes: a set of global estimates. World Bank Policy Research Working Paper

Bergstrand K, Savage SV (2013) The chalkboard versus the avatar: Comparing the effectiveness of online and in-class courses. Teach Sociol 41(3):294–306. https://doi.org/10.1177/0092055X13479949


Bettinger EP, Fox L, Loeb S, Taylor ES (2017) Virtual classrooms: How online college courses affect student success. Am Econ Rev 107(9):2855–2875. https://doi.org/10.1257/aer.20151193

Bozkurt A (2019) From distance education to open and distance learning: a holistic evaluation of history, definitions, and theories. Handbook of research on learning in the age of transhumanism , 252–273. https://doi.org/10.4018/978-1-5225-8431-5.ch016

Brooks JG, Brooks MG (1999) In search of understanding: the case for constructivist classrooms . Association for Supervision and Curriculum Development

Cacault MP, Hildebrand C, Laurent-Lucchetti J, Pellizzari M (2021) Distance learning in higher education: evidence from a randomized experiment. J Eur Econ Assoc 19(4):2322–2372. https://doi.org/10.1093/jeea/jvaa060

Chesser S, Murrah W, Forbes SA (2020) Impact of personality on choice of instructional delivery and students’ performance. Am Distance Educ 34(3):211–223. https://doi.org/10.1080/08923647.2019.1705116

Christensen CM, Raynor M, McDonald R (2015) What is disruptive innovation? Harv Bus Rev 93(12):44–53

Cochran-Smith M, Zeichner KM (2005) Studying teacher education: the report of the AERA panel on research and teacher education. Choice Rev Online 43 (4). https://doi.org/10.5860/choice.43-2338

De Paola M, Ponzo M, Scoppa V (2013) Class size effects on student achievement: heterogeneity across abilities and fields. Educ Econ 21(2):135–153. https://doi.org/10.1080/09645292.2010.511811

Dewey, J (1938) Experience and education . Simon & Schuster

Di Pietro G, Biagi F, Costa P, Karpinski Z, Mazza J (2020) The likely impact of COVID-19 on education: reflections based on the existing literature and recent international datasets. Publications Office of the European Union, Luxembourg

Duffy TM, Jonassen DH (2009) Constructivism and the technology of instruction: a conversation . Routledge, Taylor & Francis Group

Edvardsson IR, Oskarsson GK (2008) Distance education and academic achievement in business administration: the case of the University of Akureyri. Int Rev Res Open Distrib Learn, 9 (3). https://doi.org/10.19173/irrodl.v9i3.542

Figlio D, Rush M, Yin L (2013) Is it live or is it internet? Experimental estimates of the effects of online instruction on student learning. J Labor Econ 31(4):763–784. https://doi.org/10.3386/w16089

Fischer C, Xu D, Rodriguez F, Denaro K, Warschauer M (2020) Effects of course modality in summer session: enrollment patterns and student performance in face-to-face and online classes. Internet Higher Educ 45:100710. https://doi.org/10.1016/j.iheduc.2019.100710

Gadamer HG (2001) Education is self‐education. J Philos Educ 35(4):529–538

Garrison DR (2011) E-learning in the 21st century: a framework for research and practice . Routledge. https://doi.org/10.4324/9780203838761

Gonzalez T, de la Rubia MA, Hincz KP, Comas-Lopez M, Subirats L, Fort S, & Sacha GM (2020) Influence of COVID-19 confinement on students’ performance in higher education. PLOS One 15 (10). https://doi.org/10.1371/journal.pone.0239490

Hake RR (1998) Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. Am J Phys 66(1):64–74. https://doi.org/10.1119/1.18809

Article   ADS   Google Scholar  

Hall ACG, Lineweaver TT, Hogan EE, O’Brien SW (2020) On or off task: the negative influence of laptops on neighboring students’ learning depends on how they are used. Comput Educ 153:1–8. https://doi.org/10.1016/j.compedu.2020.103901

Harasim L (2017) Learning theory and online technologies. Routledge. https://doi.org/10.4324/9780203846933

Hiemstra R (1994) Self-directed learning. In WJ Rothwell & KJ Sensenig (Eds), The sourcebook for self-directed learning (pp 9–20). HRD Press

Ho DE, Kelman MG (2014) Does class size affect the gender gap? A natural experiment in law. J Legal Stud 43(2):291–321

Iglesias-Pradas S, Hernández-García Á, Chaparro-Peláez J, Prieto JL (2021) Emergency remote teaching and students’ academic performance in higher education during the COVID-19 pandemic: a case study. Comput Hum Behav 119:106713. https://doi.org/10.1016/j.chb.2021.106713

Jepsen C (2015) Class size: does it matter for student achievement? IZA World of Labor . https://doi.org/10.15185/izawol.190

Jonassen DH, Howland J, Moore J, & Marra RM (2003) Learning to solve problems with technology: a constructivist perspective (2nd ed). Columbus: Prentice Hall

Kaupp R (2012) Online penalty: the impact of online instruction on the Latino-White achievement gap. J Appli Res Community Coll 19(2):3–11. https://doi.org/10.46569/10211.3/99362

Koehler MJ, Mishra P (2009) What is technological pedagogical content knowledge? Contemp Issues Technol Teacher Educ 9(1):60–70

Kofoed M, Gebhart L, Gilmore D, & Moschitto R (2021) Zooming to class?: Experimental evidence on college students’ online learning during COVID-19. SSRN Electron J. https://doi.org/10.2139/ssrn.3846700

Kuhfeld M, Soland J, Tarasawa B, Johnson A, Ruzek E, Liu J (2020) Projecting the potential impact of COVID-19 school closures on academic achievement. Educ Res 49(8):549–565. https://doi.org/10.3102/0013189x20965918

Lai JW, Bower M (2019) How is the use of technology in education evaluated? A systematic review. Comput Educ 133:27–42

Meinck S, Brese F (2019) Trends in gender gaps: using 20 years of evidence from TIMSS. Large-Scale Assess Educ 7 (1). https://doi.org/10.1186/s40536-019-0076-3

Radha R, Mahalakshmi K, Kumar VS, Saravanakumar AR (2020) E-Learning during lockdown of COVID-19 pandemic: a global perspective. Int J Control Autom 13(4):1088–1099

Ravizza SM, Uitvlugt MG, Fenn KM (2017) Logged in and zoned out: How laptop Internet use relates to classroom learning. Psychol Sci 28(2):171–180. https://doi.org/10.1177/095679761667731

Article   PubMed   Google Scholar  

Sadeghi M (2019) A shift from classroom to distance learning: advantages and limitations. Int J Res Engl Educ 4(1):80–88

Salmon G (2000) E-moderating: the key to teaching and learning online . Routledge. https://doi.org/10.4324/9780203816684

Shulman LS (1986) Those who understand: knowledge growth in teaching. Edu Res 15(2):4–14

Shulman LS (1987) Knowledge and teaching: foundations of the new reform. Harv Educ Rev 57(1):1–22

Tullis JG, Benjamin AS (2011) On the effectiveness of self-paced learning. J Mem Lang 64(2):109–118. https://doi.org/10.1016/j.jml.2010.11.002

Valverde-Berrocoso J, Garrido-Arroyo MDC, Burgos-Videla C, Morales-Cevallos MB (2020) Trends in educational research about e-learning: a systematic literature review (2009–2018). Sustainability 12(12):5153

Volk F, Floyd CG, Shaler L, Ferguson L, Gavulic AM (2020) Active duty military learners and distance education: factors of persistence and attrition. Am J Distance Educ 34(3):1–15. https://doi.org/10.1080/08923647.2019.1708842

Vygotsky LS (1978) Mind in society: the development of higher psychological processes. Harvard University Press

Download references

Author information

Authors and affiliations

Department of Sports and Recreation Management, King Saud University, Riyadh, Saudi Arabia

Bandar N. Alarifi

Division of Research and Doctoral Studies, Concordia University Chicago, 7400 Augusta Street, River Forest, IL, 60305, USA


Contributions

Dr. Bandar Alarifi collected and organized data for the five courses and wrote the manuscript. Dr. Steve Song analyzed and interpreted the data regarding student achievement and revised the manuscript. These authors jointly supervised this work and approved the final manuscript.

Corresponding author

Correspondence to Bandar N. Alarifi.

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

This study was approved by the Research Ethics Committee at King Saud University on 25 March 2021 (No. 4/4/255639). This research does not involve the collection or analysis of data that could be used to identify participants (including email addresses or other contact details). All information is anonymized and the submission does not include images that may identify the person. The procedures used in this study adhere to the tenets of the Declaration of Helsinki.

Informed consent

This article does not contain any studies with human participants performed by any of the authors.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.


About this article

Cite this article

Alarifi, B.N., Song, S. Online vs in-person learning in higher education: effects on student achievement and recommendations for leadership. Humanit Soc Sci Commun 11, 86 (2024). https://doi.org/10.1057/s41599-023-02590-1


Received: 07 June 2023

Accepted: 21 December 2023

Published: 09 January 2024

DOI: https://doi.org/10.1057/s41599-023-02590-1

