
  • Open access
  • Published: 10 January 2022

Predicting students’ performance in e-learning using learning process and behaviour data

  • Feiyue Qiu 1 ,
  • Guodao Zhang 2 ,
  • Xin Sheng 1 ,
  • Lei Jiang 1 ,
  • Lijia Zhu 1 ,
  • Qifeng Xiang 1 ,
  • Bo Jiang 3 &
  • Ping-kuo Chen 4  

Scientific Reports volume  12 , Article number:  453 ( 2022 ) Cite this article


  • Computational science
  • Computer science
  • Scientific data

E-learning is achieved by the deep integration of modern education and information technology, and plays an important role in promoting educational equity. With the continuous expansion of user groups and application areas, it has become increasingly important to effectively ensure the quality of e-learning. Currently, one of the methods to ensure the quality of e-learning is to use mutually independent e-learning behaviour data to build a learning performance predictor to achieve real-time supervision and feedback during the learning process. However, this method ignores the inherent correlation between e-learning behaviours. Therefore, we propose the behaviour classification-based e-learning performance (BCEP) prediction framework, which selects the features of e-learning behaviours, uses feature fusion with behaviour data according to the behaviour classification model to obtain the category feature values of each type of behaviour, and finally builds a learning performance predictor based on machine learning. In addition, because existing e-learning behaviour classification methods do not fully consider the process of learning, we also propose an online behaviour classification model based on the e-learning process called the process-behaviour classification (PBC) model. Experimental results with the Open University Learning Analytics Dataset (OULAD) show that the learning performance predictor based on the BCEP prediction framework has a good prediction effect, and the performance of the PBC model in learning performance prediction is better than traditional classification methods. We construct an e-learning performance predictor from a new perspective and provide a new solution for the quantitative evaluation of e-learning classification methods.


Introduction

E-learning has become a typical form of education 1 and an important part of the development of Internet-based education. Since the onset of the COVID-19 pandemic, e-learning has been widely adopted worldwide owing to its high temporal and spatial flexibility, low knowledge acquisition threshold, and rich learning resources. However, in this mode, teachers cannot easily perceive the learning status of their learners 2 , and questions about the quality of e-learning have been raised. Research on learning performance prediction provides a basis for teachers to adjust their teaching methods for students who may have problems by predicting students’ performance on future exams, reducing the risk of students failing a course and ensuring the quality of e-learning. A large number of empirical studies investigating the relationship between e-learning behaviour and learning performance have shown that learners’ e-learning behaviour has an important impact on learning performance. Therefore, in recent years, learning performance prediction based on learning process data has received widespread attention. The measurement, collection and analysis of learning process data to predict learning performance 3 can help teachers adjust teaching strategies in time and play a supervisory and early-warning role during students’ learning processes 4 .

Research notes that e-learning behaviour data are important to understanding e-learning processes. E-learning behaviour data refer to the data generated by learners through the behavioural activities they perform on e-learning platforms or in online teaching organizations. These data describe learners’ activity records during the learning process, including the number of platform logins, the number of resource accesses, the number of forum discussion participations and other behavioural data 5 . Concurrently, e-learning behaviour involves little private information, and such data are convenient to collect and use, which has an important impact on e-learning performance 6 . Therefore, researchers have conducted in-depth research on e-learning behaviour 7 and constructed different learning performance predictors based on it 8 . Learning performance prediction is usually a binary classification task that divides students into two groups, “passed” or “failed”, to predict the possibility of passing a future test 9 . Because prediction is a primary strength of machine learning technology, machine learning methods are often used to train learning performance prediction models in a straightforward way 10 , 11 . Although this type of predictor can achieve good prediction results, it has certain limitations. First, low generalizability and high computational complexity: a large amount of e-learning behaviour data of different dimensions is captured and recorded during the e-learning process, so constructing an e-learning performance predictor directly on these data is prone to overfitting and high computational complexity. Feature selection can be used to retain key learning behaviours and reduce model operating costs, which is of practical significance for online platforms that aim to provide high-accuracy, low-latency learning performance prediction services.
Second, the single input method of e-learning behaviour data: e-learning predictors generally use e-learning behaviour data directly as input variables. Few predictors consider the combined effect of learning behaviour data of the same type (i.e., perform feature fusion processing) before using it for training. Last, key learning behaviour indicators are not standardized, and those identified by different researchers are different. This field of study has failed to identify key behaviour indicators that can be used to effectively predict learning performance 12 , 13 ; thus, the results of prediction models are frequently affected by platform-specific learning behaviours, which limits the models’ transferability.

To solve these problems, we propose the behaviour classification-based e-learning performance prediction framework (BCEP prediction framework), summarize classic e-learning behaviour classification methods, analyse the e-learning process in detail, propose the process-behaviour classification model (PBC model), and construct an e-learning performance predictor based on the PBC model. The primary contributions of this article are as follows. First, the BCEP prediction framework proposed in this paper includes four steps: data cleaning, behaviour classification, feature fusion, and model training. Compared with general learning performance predictors, this framework yields more accurate predictions of student achievement, reduces computational complexity during training, and increases the model’s transferability and versatility in application. Second, the PBC model proposed in this paper divides e-learning behaviours into four categories. The learning performance predictor based on the PBC model is shown to perform markedly better than existing typical classification methods.

This article is organized into 7 sections; the remainder of its content is arranged as follows. Section 2 summarizes the development status of e-learning performance prediction, focusing on its prediction indicators and methods. Section 3 describes the BCEP prediction framework in detail. Section 4 reviews existing learning behaviour classification models and designs a new e-learning behaviour classification method, the PBC model. Section 5 describes the experiments used to verify the effectiveness of the BCEP prediction framework and the PBC model. In Section 6, the experimental results are systematically analysed and discussed. Last, Section 7 provides conclusions and prospects for future research.

Related work

Prediction indicators of e-learning performance

E-learning performance predictors generally draw on two kinds of indicators: tendency indicators and behavioural performance indicators 14 . Tendency indicators are inherent attributes of the learners themselves, primarily static data that are generally collected before the start of the course or semester; socioeconomic status 15 , historical academic records 16 and gender 17 are common propensity indicators. Many researchers have used propensity indicators to develop learning early-warning models to predict students’ learning in a course, a semester, and other stages. Although the predictors established by these studies achieve good performance, they ignore the role of learning behaviour records. For example, many studies used students’ historical performance or demographic data that were not related to learning. Although such studies can predict learning performance through learner characteristics, this approach ignores the fact that most tendency indicators are beyond the control of students and teachers, and it overlooks changes in students’ behaviour over the course of the curriculum 18 . In addition, there is a privacy problem with propensity indicators, as personal data collected by educational institutions cannot be shared publicly. Behavioural performance indicators (i.e., the dynamic indicators reflected by learners in the learning process 19 , 20 , 21 ) generally do not have such problems. E-learning behaviour data can accurately describe the time and energy that students spend on a specific course, such as the frequency of access to course materials 22 and the frequency of online discussions 23 . Some studies have also tried to use a combination of the two kinds of indicators to predict learning 24 but encountered problems of increased computational cost.

The accumulation of educational big data and the emergence of new methods of connecting and exchanging information have laid the foundation for e-learning behaviour research. Learners’ learning behaviour data are important when analysing changes in learners’ behaviour, preferences, and ability levels 25 , which promotes related research on learning performance prediction based on learning behaviour. Learning input theory explains the relationship between learning behaviour and learning performance 26 and states that learning behaviour is a key factor affecting learning performance and an important indicator for predicting learning performance 27 . Concurrently, many studies have confirmed that there is a significant correlation between student online activities and academic performance 28 , 29 , and observing learning activities at a finer-grained level can strengthen the grasp of learning conditions and promote constructive learning 30 . Therefore, many researchers have explored the correlation between e-learning behaviour and learning performance, and used e-learning behaviour to predict learning performance. For example, Qi 31 reported that there is a significant positive correlation between learners’ e-learning behaviour and the learning effect. Liang et al. 32 recorded student data through a curriculum management system and used regression analysis to find a correlation between learning experience, learning behaviour and learning performance. Comer et al. 33 found that in the e-learning environment, collaborative communication behaviour will deepen students’ understanding of knowledge and encourage students to achieve certain learning achievements. Kokoç and Altun 34 used learning interaction data to predict the learning performance of online learners and found that the access behaviour of learning content, books, forums, and course activities can significantly affect learning outcomes. 
Some studies looked for a relationship between a certain behavioural activity or several behavioural activities and learning performance. Zheng et al. 35 found that there is a positive correlation between the number of logins and the final grades of students. Qureshi et al. 36 used a questionnaire survey method to find that cooperative learning and learning participation play an intermediary role between social factors and learning performance, and verified that collaborative learning behaviours promote learning performance in e-learning. Shen 37 noted that the proportion of learners’ homework completion and video completion rate in e-learning affect learning.

In the literature on learning performance prediction using learning behaviours, analysis of e-learning behaviour is frequently limited to independent e-learning behaviours. Few studies have explored the internal associations and differences between e-learning behaviours by specifically categorizing and analysing them. In previous studies that used learning behaviour classifications as primary prediction indicators, researchers only used independent e-learning behaviour data as the input for predictor training instead of fused data based on the learning behaviour classification, which reduced the value of the classification.

Prediction algorithm of e-learning performance

In e-learning performance prediction research, the selection of predictive indicators occupies an important position, and prediction methods also play a key role, particularly feature selection and algorithm selection, which can markedly affect the prediction effect. Therefore, it is necessary to identify relevant research and applications of machine learning and feature selection.

An increasing number of studies have confirmed that, when constructing predictive models, more data points cannot always guarantee higher predictive ability: unnecessary features reduce the generalizability of the model and increase its computational cost. For example, Akram et al. 38 used ten prediction algorithms to identify students with learning difficulties from assignment submission behaviour and found that the prediction performance of all algorithms decreased as the number of input features increased. It is thus necessary to select behavioural features that are meaningful for learning performance from the sample data and then input them into the model for training; thus, feature selection is necessary 39 . Three methods can be used for feature selection: the filter method, the wrapper method, and the embedded method. Madichetty et al. 40 verified that the selection of key features is helpful for classification prediction. The filter method has the advantages of strong independence from the machine learning algorithm, fast running speed and low computational complexity, but it has difficulty completely removing redundant features when many redundant features are highly relevant to the target 41 . Wrapper methods can be independent of the machine learning model but typically have high computational costs 42 . The embedded method embeds feature selection into another algorithm and selects features during the training process, which can effectively improve the efficiency of model learning 43 .

In addition, machine learning algorithms have unique advantages in solving classification problems. For example, Huang and Lin et al. 44 proposed a multimodal information perception method for flexible manipulators based on machine learning to recognize gestures and object shapes, sizes and weights, and compared the recognition accuracy of optical sensor information (OSI), pressure sensor information (PSI) and dual sensor information (DSI). They found that the KNN algorithm with DSI performed better than the others with regard to recognition accuracy. Cao and Zhang et al. 45 improved a deep learning method, used the multitask cascaded convolutional network (MTCNN) method to locate the faces of cartoon characters, performed face detection and facial feature point detection, and recognized the emotion of cartoon-style images. Muhammad and Liu et al. 46 extended the application of machine learning to language recognition and translation by sharing dictionary embeddings between the parent language and the child language without using back-translation or manual noise injection, and proposed a language-independent hybrid transfer learning (HTL) method to solve the problem of data sparseness in low-resource languages (LRLs). Machine learning technology has gradually emerged in the development of learning analytics, facilitating the collection and analysis of student and environmental data 47 . In recent years, many machine learning classification algorithms have been applied to the field of learning performance prediction. For example, Jiang et al. 48 built a predictor based on logistic regression, which combined students’ first week of homework performance and social interaction behaviour to predict learners’ performance in the course. Aziz et al. 49 selected five parameters, race, gender, family income, college enrolment mode and grade point average, and used the naïve Bayes classifier to predict grade point average.
Ahuja and Kankane 50 used the K-nearest neighbour algorithm to predict the results of students’ academic acquisition based on the previous academic performance and non-academic factors of college students. Asif et al. 51 used the decision tree algorithm to predict students’ performance at the end of the four-year study plan. Jie-ping et al. 52 proposed a performance prediction method that combined fuzzy clustering and support vector machine regression based on students’ historical performance and behavioural habits.

Behaviour-based classification of the e-learning performance prediction framework

Researchers typically use the behaviour of each e-learning category as an independent predictor of the performance of e-learning to build predictive models. However, different e-learning behaviours have potential correlations and can be classified into different behaviour categories according to different rules. This research innovatively constructs a learning performance predictor from the perspective of behaviour categories and proposes the behaviour classification-based E-learning performance prediction framework (BCEP prediction framework).

The BCEP prediction framework describes the complete process of implementing a learning performance predictor through e-learning behaviour categories, as shown in Fig.  1 . The prediction framework includes four core links: (1) data pre-processing, which cleans and converts the original e-learning behaviour data obtained from the e-learning platform to produce standardized e-learning behaviour data; (2) feature selection, which is performed on the pre-processed e-learning behaviour data to obtain the key e-learning behaviours; (3) feature fusion, which classifies the key learning behaviours according to specific rules, constructs a collection of behaviour categories, and then performs feature fusion to obtain the category feature value of each type of e-learning behaviour; and (4) model training, which builds an e-learning performance predictor based on a variety of machine learning algorithms.

figure 1

Behavior-based classification of e-learning performance prediction framework.

(1) Data pre-processing

The quality of e-learning behaviour data directly affects the accuracy of predictive models. Therefore, the first step is to clean the e-learning behaviour data obtained from the e-learning platform. There is no unified process for data cleaning, but the method should be selected according to the real situation of the data to manage missing values, duplicate values, and abnormal values. Concurrently, e-learning behaviours recorded by e-learning platforms are often not of a single dimension, e-learning behaviour data of different dimensions are often not numerically comparable, and feature selection cannot be performed. The proposed framework solves this problem by standardizing e-learning behaviour data in different dimensions with Z scores.

We define the original e-learning behaviour set \(B\left\{ b_{1}, b_{2}, \ldots , b_{n}\right\}\) and the standard e-learning behaviour set \(B^{\prime }\left\{ b_{1}^{\prime }, b_{2}^{\prime }, \ldots , b_{n}^{\prime }\right\}\) , where \(b_{n}\) is the n-th e-learning behaviour recorded by the e-learning platform, and \(b_{n}^{\prime }\) is the n-th e-learning behaviour after standardization. Concurrently, the original e-learning behaviour data \(d_{nm}\) and the standard e-learning behaviour data \(d_{nm}^{\prime }\) are defined, where n indexes the e-learning behaviour, and m indexes the data point of the current e-learning behaviour. For example, \(d_{12}\) is the second data point of the first type of e-learning behaviour recorded by the e-learning platform. The formula for \(d_{nm}^{\prime }\) is the standard Z score:

$$d_{nm}^{\prime } = \frac{d_{nm} - \mu _{b_{n}}}{\sigma _{b_{n}}}$$

where \(\mu _{b_{n}}\) is the average value of the n-th type of e-learning behaviour data, and \(\sigma _{b_{n}}\) is the standard deviation of the n-th type of e-learning behaviour data.
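As a minimal sketch of this standardization step (using NumPy, with made-up behaviour counts in place of real platform data), each behaviour column is Z-scored independently:

```python
import numpy as np

def standardize_behaviours(D):
    """Column-wise Z-score standardization of an (m students x n behaviours)
    count matrix: each value becomes (d - mean) / std of its behaviour column,
    matching the formula above."""
    D = np.asarray(D, dtype=float)
    mu = D.mean(axis=0)      # per-behaviour mean
    sigma = D.std(axis=0)    # per-behaviour standard deviation
    return (D - mu) / sigma

# Hypothetical data: 3 students x 2 behaviours (e.g. logins, resource accesses)
D = [[10, 200], [20, 100], [30, 300]]
D_std = standardize_behaviours(D)
```

After standardization every behaviour column has mean 0 and unit standard deviation, so behaviours of different dimensions become numerically comparable.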

(2) Feature selection

Feature selection can select relevant features that are beneficial to the training model from all features, thereby reducing the feature dimension and improving the generalizability, operating efficiency and interpretability of the model. This framework uses the variance filtering method to perform feature selection on standardized e-learning behaviour data. The variance filtering method uses the variance of each feature itself to filter the features. The smaller the variance of the feature, the lower the difference of the sample on this feature, and the smaller the distinguishing effect of the feature on the sample. The threshold is an important parameter of the variance filtering method, which represents the threshold of variance; thus, features with variance less than the threshold will be discarded.

We define the characteristic value set of e-learning behaviour \(V\left\{ v_{1}, v_{2}, \ldots , v_{n}\right\}\) , where \(v_{n}\) is the characteristic value (the sample variance) of the n-th e-learning behaviour, and its formula is as follows:

$$v_{n} = \frac{1}{m}\sum _{j=1}^{m}\left( d_{nj}^{\prime } - \mu _{b_{n}^{\prime }}\right) ^{2}$$

where \(\mu _{b_{n}^{\prime }}\) represents the average value of the n-th standard e-learning behaviour data. The elements of V are traversed and compared with the variance threshold. If the current e-learning behaviour feature value is greater than the threshold, the corresponding e-learning behaviour is added to the key e-learning behaviour set; otherwise, it is discarded.
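A minimal NumPy sketch of the variance-filtering step (the behaviour names, data values and threshold below are illustrative, not taken from the paper):

```python
import numpy as np

def variance_filter(D, behaviour_names, threshold=0.1):
    """Keep only behaviours whose sample variance exceeds the threshold."""
    D = np.asarray(D, dtype=float)
    variances = D.var(axis=0)          # one variance per behaviour column
    keep = variances > threshold
    kept_names = [n for n, k in zip(behaviour_names, keep) if k]
    return kept_names, D[:, keep]

names = ["login", "forum", "quiz"]     # hypothetical behaviours
D = [[1.0, 0.0, 5.0],
     [3.0, 0.0, 1.0],
     [5.0, 0.0, 9.0]]
kept, D_key = variance_filter(D, names)
# The constant 'forum' column has zero variance and is discarded;
# it cannot distinguish students, so it carries no predictive signal.
```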

(3) Feature fusion

First, according to the e-learning behaviour classification model, the key e-learning behaviours are divided into different e-learning behaviour clusters. We assume that the classification model M is composed of n types of e-learning behaviour categories (i.e., \(M\left\{ C_{1}, C_{2}, \ldots , C_{n}\right\}\) ). After dividing the e-learning behaviour categories, n e-learning behaviour clusters are generated, and each e-learning behaviour cluster includes a varying number of e-learning behaviours, such as \(C_{1}\left\{ b_{1}, b_{2}, \ldots , b_{n}\right\}\) , where \(b_{n}\) is the n-th e-learning behaviour that meets the criterion of \(C_{1}\) .

Then, feature fusion is performed on each e-learning behaviour cluster to obtain the corresponding category feature value. Taking \(C_{1}\left\{ b_{1}, b_{2}, \ldots , b_{n}\right\}\) as an example, the calculation formula of its category feature value is as follows:

In Eq. 3 , \(V_{b_{i}}\) is the characteristic value of behaviour \(b_{i}\) , \(\lambda =0\) means the student has passed the curriculum, and \(\lambda =1\) means the student has failed. Similarly, we construct the feature value set of the e-learning behaviour categories \(V_{C}\left\{ V_{C_{1}}, V_{C_{2}}, \ldots , V_{C_{i}}\right\}\) , where \(V_{C_{i}}\) is the category feature value of the \(C_{i}\) e-learning behaviour category.
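Since Eq. 3 is not reproduced in this excerpt, the sketch below stands in with a simple per-category mean fusion; both the cluster-to-behaviour mapping and the feature values are hypothetical:

```python
import numpy as np

# Hypothetical behaviour clusters produced by a classification model M.
clusters = {
    "C1": ["homepage", "glossary"],
    "C2": ["oucontent", "forumng"],
}

# Per-student feature values for each key behaviour (illustrative numbers,
# two students per behaviour).
features = {
    "homepage": np.array([0.2, 1.5]),
    "glossary": np.array([0.4, 0.5]),
    "oucontent": np.array([1.0, 2.0]),
    "forumng":  np.array([3.0, 0.0]),
}

def fuse(clusters, features):
    """Mean-fuse behaviour features within each cluster into one category
    feature value per student (a stand-in for the paper's Eq. 3)."""
    return {c: np.mean([features[b] for b in behaviours], axis=0)
            for c, behaviours in clusters.items()}

V_C = fuse(clusters, features)
```

The result is one fused value per category per student, replacing the individual behaviour columns as predictor inputs.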

(4) Model training

In the model training session, classic machine learning methods such as SVC, Naïve Bayes, KNN and Softmax are selected, and the e-learning behaviour category feature value set \(V_{C}\) is used as the feature data to train the e-learning performance prediction model. After many iterations, the best e-learning performance prediction model is selected to predict the e-learning performance of e-learners.
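A scikit-learn sketch of this training-and-selection step, using synthetic data in place of the fused category features (LogisticRegression stands in for the Softmax classifier; the variant labels mirror the paper's SVC (R)/(L) and KNN (U)/(D) naming):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for the fused category features V_C with binary
# pass/fail labels.
X, y = make_classification(n_samples=400, n_features=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "SVC (R)": SVC(kernel="rbf"),
    "SVC (L)": SVC(kernel="linear"),
    "Naive Bayes": GaussianNB(),
    "KNN (U)": KNeighborsClassifier(weights="uniform"),
    "KNN (D)": KNeighborsClassifier(weights="distance"),
    "Softmax": LogisticRegression(max_iter=1000),
}

# Fit each classifier and keep the one with the best held-out accuracy.
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te)
          for name, m in models.items()}
best = max(scores, key=scores.get)
```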

E-learning process—behaviour classification model

The e-learning behaviour classification model is an important component of the BCEP prediction framework that directly affects the prediction effect of the e-learning performance prediction model. This paper summarizes the current mainstream e-learning behaviour classification methods, as shown in Table  1 .

Table  1 shows that most researchers use interactive objects as the basis for the classification of learning behaviours. The primary interactive objects include learning systems, resource content, learning communities, and learners themselves. However, when learners are in different stages of learning, they often engage in different learning behaviours, even though the interactive objects may be the same. The classification method based on interactive objects therefore does not fully consider the process of learning. Based on these ideas, this study constructed an e-learning behaviour classification model based on the e-learning process, the process-behaviour classification model (PBC model), as shown in Fig.  2 .

figure 2

The process-behaviour classification model (PBC model).

The e-learning process primarily includes the learning stage, the knowledge acquisition stage, the interactive reflection stage and the learning consolidation stage 5 . The learning stage is the preparation process for learners to officially start e-learning; the knowledge acquisition stage is the most important e-learning process and is also the process by which learners initially acquire knowledge; the interactive reflection stage is a process in which learners interact with teachers and peers and reflect on themselves during the interaction; and the stage of learning consolidation is the process by which learners consolidate internalized knowledge. The model is centred on online learners, and according to the e-learning process, learning behaviour is divided into learning preparation behaviour (LPB), knowledge acquisition behaviour (KAB), interactive learning behaviour (ILB), and learning consolidation behaviour (LCB).

Learning preparation behaviour (LPB ) occurs during the learning stage and is the most basic behaviour of learners in e-learning. Specifically, LPB includes behaviours such as logging in to the learning platform, accessing the primary page of the course, and accessing the course activity interface.

Knowledge acquisition behaviour (KAB) occurs during the knowledge acquisition stage and is the behaviour of online learners directly acquiring knowledge. KAB primarily includes activities such as browsing course content resources, participating in course activities, watching course videos, and accessing resource links.

Interactive learning behaviour (ILB) occurs in the interactive reflection stage and is one of the key learning behaviours in e-learning. ILB has been proven to have a positive effect on the continuity and effectiveness of e-learning 61 . Its specific manifestations include participating in seminars, posting in forums, replying to forum threads, asking teachers questions, etc.

Learning consolidation behaviour (LCB) occurs in the stage of learning consolidation and refers to the behaviour of learners to strengthen the degree of knowledge mastery, primarily including proposing postclass reflections and completing postclass tests.
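The four categories above can be represented as a simple lookup table; the assignment of OULAD-style activity names below is illustrative, not the paper's exact mapping:

```python
# Hypothetical mapping of e-learning activity types to PBC categories.
PBC_MODEL = {
    "LPB": ["homepage", "glossary"],            # learning preparation
    "KAB": ["oucontent", "resource", "url"],    # knowledge acquisition
    "ILB": ["forumng", "oucollaborate"],        # interactive learning
    "LCB": ["quiz", "questionnaire"],           # learning consolidation
}

def category_of(behaviour):
    """Return the PBC category of a behaviour, or None if unclassified."""
    for cat, behaviours in PBC_MODEL.items():
        if behaviour in behaviours:
            return cat
    return None
```

Such a table is all the BCEP framework needs from a classification model: it determines which behaviour clusters are fused in the feature fusion step.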

Experimental design

Experiments are used to compare the prediction performance of predictors built in the traditional framework and in the BCEP prediction framework based on the PBC model proposed in this study, to verify the effectiveness of the proposed framework. The predictors use six machine learning methods: SVC (R), SVC (L), Naïve Bayes, KNN (U), KNN (D) and Softmax 10 . We selected the accuracy rate, F1-score and Kappa coefficient as the quantitative indicators to evaluate prediction performance. To fully verify the BCEP prediction framework, the evaluation indicators also include the time required to complete the prediction.
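For binary pass/fail labels, the three quantitative indicators can be computed directly; a NumPy sketch (the example labels are made up):

```python
import numpy as np

def accuracy(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float((y_true == y_pred).mean())

def f1_score(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = int(((y_true == 1) & (y_pred == 1)).sum())
    fp = int(((y_true == 0) & (y_pred == 1)).sum())
    fn = int(((y_true == 1) & (y_pred == 0)).sum())
    return 2 * tp / (2 * tp + fp + fn)

def cohen_kappa(y_true, y_pred):
    """Kappa = (p_o - p_e) / (1 - p_e): observed agreement corrected
    for the agreement expected by chance."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    p_o = float((y_true == y_pred).mean())
    p_e = sum(float((y_true == c).mean()) * float((y_pred == c).mean())
              for c in (0, 1))
    return (p_o - p_e) / (1 - p_e)

# Illustrative labels: 1 = passed, 0 = failed.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
```

Unlike accuracy, the Kappa coefficient discounts chance agreement, which matters when the pass/fail classes are imbalanced.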

We used a XiaoXin AirPlus14 laptop to build the experimental environment, which consists of an AMD Ryzen 5600U processor, an NVIDIA GeForce MX450 graphics card and a 500-GB hard disk. In terms of software, the experiments were programmed in Python on the JupyterLab platform under the Windows 10 operating system.

Data sources

The Open University Learning Analytics Dataset (OULAD) 62 is considered one of the most comprehensive international open datasets in terms of e-learning data diversity, including student demographic data and interaction data between students and the virtual learning environment (VLE). The Open University developed this dataset to support research in the field of learning analytics by collecting and analysing learner data to provide personalized guidance and optimize learning resources. The dataset contains 7 course modules (AAA–GGG), 22 courses, and the e-learning behaviour data and learning performance data of 32,593 students. The preparation of the dataset included collection, selection, and anonymization: SAS technology was used to create a data warehouse to collect the data, data containing student information from 2013 to 2014 were selected, and the data were finally anonymized. Its design types are time series design, data integration objective and observation design; its measurement type is learning behaviour; its technology type is digital curation; and its factor type is temporal_interval. In this experiment, the DDD course module with the most sample data was selected, and the e-learning data of 6,272 learners who participated in the DDD course were used as the data source for training and verifying the e-learning performance predictor. When learning DDD courses, learners performed 12 types of e-learning behaviours, as shown in Table  2 .

Experimental design for validation of the BCEP prediction framework

(1) Experimental program

This experiment sets up three experimental groups to compare and verify the effectiveness of the BCEP prediction framework. The difference between the three experimental groups lies in the feature data used to train the learning performance predictor. Group 1 uses online behaviour data that have only undergone data pre-processing as characteristic data. Group 2 uses data that have undergone feature selection but not feature fusion as characteristic data. Group 3 follows the BCEP prediction framework and uses data that have undergone both feature selection and feature fusion as characteristic data. In this experiment, 6 machine learning methods were selected, 18 learning performance predictors were constructed based on the three types of feature data above, and the effectiveness of the BCEP prediction framework was verified by comprehensively comparing the prediction results of the 18 learning performance predictors.

The experimental groups that use feature selection apply the variance filtering method to the 12 online learning behaviours in the dataset, and 8 of them are selected as the feature data for constructing the learning performance predictor according to the variance threshold. The feature data of the three experimental groups after feature selection are shown in Table 3:
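The variance-filtering step can be sketched with scikit-learn's `VarianceThreshold`; the toy matrix and the threshold of 1.0 below are illustrative assumptions, not the paper's actual data or cut-off.

```python
import numpy as np
from sklearn.feature_selection import VarianceThreshold

# Toy behaviour matrix: rows = learners, columns = behaviour click counts
# (two informative behaviours and one near-constant behaviour).
X = np.array([[12.0,  3.0, 1.0],
              [40.0,  9.0, 1.0],
              [ 7.0, 30.0, 1.0],
              [55.0, 14.0, 2.0]])

# Drop behaviours whose variance falls below the (illustrative) threshold.
selector = VarianceThreshold(threshold=1.0)
X_sel = selector.fit_transform(X)
kept = selector.get_support()  # boolean mask of behaviours that survive
```

The near-constant third column carries almost no information about learners and is removed, shrinking the feature dimension before any model is trained.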

Before feature fusion, the experimental group classifies e-learning behaviours with the PBC model and then performs feature fusion on the behaviour clusters to obtain behaviour category feature values. These category feature values are then used as the feature data for building the learning performance predictor. A schematic diagram of the learning behaviour classification of Group 3 is shown in Fig.  3 :

Figure 3. Schematic diagram of e-learning behaviour classification in Group 3.

In this experiment, the number and dimensions of data after feature fusion in the 3 experimental groups are shown in Table  4 .
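The fusion step above can be sketched as collapsing each behaviour cluster into a single category feature, here by summing clicks. The cluster-to-behaviour mapping below is hypothetical (the real assignment is the PBC model's, per Fig. 3), and summation is only one plausible fusion rule.

```python
import pandas as pd

# Hypothetical mapping from the 8 selected behaviours to 4 behaviour
# categories; the actual assignment is given by the PBC model (Fig. 3).
CLUSTERS = {
    "knowledge_acquisition": ["resource", "oucontent"],
    "interaction": ["forumng", "oucollaborate"],
    "assessment": ["quiz", "externalquiz"],
    "navigation": ["homepage", "subpage"],
}

def fuse_features(behaviour_counts: pd.DataFrame) -> pd.DataFrame:
    """Sum the click counts inside each cluster into one category feature."""
    fused = pd.DataFrame(index=behaviour_counts.index)
    for category, members in CLUSTERS.items():
        cols = [c for c in members if c in behaviour_counts.columns]
        fused[category] = behaviour_counts[cols].sum(axis=1)
    return fused
```

Whatever the exact rule, the output has one column per behaviour category, which is how Group 3 reduces the feature dimension from 8 to 4.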

Experimental design for the validation of the PBC model

The classification of e-learning behaviours is generally limited to theoretical research, and its role and scientific validity in learning performance prediction are difficult to verify. This experiment follows the learning performance prediction framework based on behaviour classification and selects three representative behaviour classification models for comparison with the PBC model: the Moore 53 e-learning behaviour classification (a total of three types of behaviour), the Wu 60 e-learning behaviour classification (a total of four types of behaviour) and the Peng 56 e-learning behaviour classification (a total of five types of behaviour). We use 6 classic machine learning methods to construct 24 learning performance predictors and verify the effectiveness of the PBC model by comprehensively comparing the prediction results of the 24 learning performance predictors.

(2) Behaviour classification

First, feature selection is performed on 12 e-learning behaviours in the original dataset, and 8 e-learning behaviours are selected according to the variance threshold. Then, the 8 e-learning behaviours are classified according to the classification methods of the 4 experimental groups, and the classification results are shown in Fig.  4 :

Figure 4. Online student behaviour classification: ( a ) PBCM, ( b ) Moore, ( c ) Wu, and ( d ) Peng.

Predictor implementation and evaluation

This experiment selects six machine learning algorithms that are currently widely used in the field of learning performance prediction: SVC (R), SVC (L), naïve Bayes, KNN (U), KNN (D) and softmax. We divided the original data into a 70:30% split for training and testing 63 , and the expected output value of the predictor was "qualified" or "unqualified".
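Under the assumption that the six algorithms correspond to standard scikit-learn estimators (RBF and linear SVC, Gaussian naïve Bayes, uniform- and distance-weighted k-NN, and logistic regression for softmax), the 70:30 protocol can be sketched as follows, with synthetic stand-in features in place of the fused OULAD data:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic stand-in for the 4 fused behaviour features and the
# qualified/unqualified label.
X, y = make_classification(n_samples=300, n_features=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "SVC(R)": SVC(kernel="rbf"),
    "SVC(L)": SVC(kernel="linear"),
    "NaiveBayes": GaussianNB(),
    "KNN(U)": KNeighborsClassifier(weights="uniform"),
    "KNN(D)": KNeighborsClassifier(weights="distance"),
    "softmax": LogisticRegression(max_iter=1000),  # logistic regression (softmax)
}
scores = {name: model.fit(X_tr, y_tr).score(X_te, y_te)
          for name, model in models.items()}
```

Each experimental group trains all six estimators on its own feature data, yielding the 18 (and later 24) predictors compared below.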

Common indicators that are used to evaluate predictors include accuracy (ACC), F1-score (F1), and the Kappa coefficient (K). ACC is considered to be the most commonly used measurement index, which refers to the proportion of the number of correctly classified samples to the total number of samples, and its formula is as follows:

\(ACC = \frac{T_{p} + T_{n}}{totaldata}\)

where \(T_{p}\) is the number of positive samples correctly predicted, \(T_{n}\) is the number of negative samples correctly predicted, and totaldata is the total number of samples.

However, because the accuracy rate alone cannot fully evaluate the prediction model, further analysis of the F1-score (F1) and Kappa coefficient is required. With precision \(P\) and recall \(R\), the formula of F1 is as follows:

\(F1 = \frac{2 \times P \times R}{P + R}\)

The formula of Kappa is as follows:

\(K = \frac{P_{o} - P_{e}}{1 - P_{e}}\)

where \(P_{o}\) is the observed coincidence ratio, and \(P_{e}\) is the coincidence ratio due to randomness.
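All three indicators are available in scikit-learn; a small worked example on a toy prediction vector:

```python
from sklearn.metrics import accuracy_score, cohen_kappa_score, f1_score

y_true = [1, 1, 1, 0, 0, 1, 0, 1]   # 1 = qualified, 0 = unqualified
y_pred = [1, 1, 0, 0, 0, 1, 1, 1]

acc = accuracy_score(y_true, y_pred)       # (Tp + Tn) / totaldata
f1 = f1_score(y_true, y_pred)              # 2PR / (P + R)
kappa = cohen_kappa_score(y_true, y_pred)  # (Po - Pe) / (1 - Pe)
```

Here Tp = 4 and Tn = 2 out of 8 samples, so ACC = 0.75; precision and recall are both 0.8, giving F1 = 0.8; and with Po = 0.75 and Pe = 0.53125, Kappa ≈ 0.467.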

Result analysis and discussion

This section presents the experimental results, including ACC, F1, Kappa, and the prediction time of each experimental group as calculated by the six machine learning methods. The BCEP prediction framework and the PBC model are verified by analysing these data.

BCEP prediction framework validation

We designed 3 control groups (with different feature data) and used 6 common machine learning algorithms to build 3 types of learning performance predictors (18 in total) to assess the effectiveness of the BCEP prediction framework based on the prediction effects of the predictors. The ACC, F1, and Kappa of the three types of learning performance predictors are shown in Figs. 5, 6, and 7, respectively. To verify the role of feature selection, we also compared the time each experimental group needed to complete the prediction task, as shown in Fig. 8.

Figure 5. Accuracy of the three types of prediction models.

Figure 6. F1-score of the three types of prediction models.

Figure 7. Kappa of the three types of prediction models.

Figure 5 describes the accuracy of the 6 algorithms (SVC(R), SVC(L), naïve Bayes, KNN(U), KNN(D), and softmax) with the three different data processing methods. The accuracy rate of Group 1 is distributed between 89.7% and 91.65%, that of Group 2 is between 89.15% and 91.00%, and that of Group 3 is between 95.44% and 97.40%. The accuracy of Group 3 is also higher than that of the other two experimental groups with all six algorithms, which means that the prediction accuracy based on the proposed BCEP prediction framework is the highest. Experimental results show that the F1-score of Group 1 is between 0.9280 and 0.9423, that of Group 2 is between 0.9246 and 0.9374, and that of Group 3 is between 0.9685 and 0.9818; thus, Group 3 has the highest F1-score. The Kappa of Group 1 is between 0.7473 and 0.7916, that of Group 2 is between 0.7310 and 0.7820, and that of Group 3 is between 0.8865 and 0.9364; thus, Group 3 achieves markedly higher Kappa values. Lastly, the computational time required for Group 1 under each algorithm is 0.0139 s to 0.1489 s, that for Group 2 is 0.0070 s to 0.1199 s, and that for Group 3 is 0.0050 s to 0.1080 s; thus, Group 2 requires less computation time than Group 1, and Group 3 obtains its prediction results fastest under each algorithm.

In addition, we can compare the indicators of Groups 1, 2, and 3 using Figs. 5, 6, 7, and 8. Although the prediction performance of Groups 1 and 2 varies by algorithm, in general, after applying the feature selection strategy, Group 2 reduces the feature dimension from 12 to 8, its prediction effect remains close to that of Group 1, and its speed increases by 23.56%. These results show that the feature selection strategy can reduce the predictor's training parameters while maintaining its predictive performance, thereby reducing the time complexity of the operation. Group 3 builds on Group 2: following the idea of behaviour classification, it adopts a feature fusion strategy to further reduce the feature dimension from 8 to 4, and all of its indicators for each of the 6 machine learning algorithms are better than those of Groups 1 and 2. Further analysis of Figs. 5, 6, and 7 shows that the accuracy of Group 3 increased by 5.8% and 6.1% on average compared to Groups 1 and 2, respectively; the F1-score increased by 4.06% and 4.24%; Kappa markedly increased by 14.24% and 15.03%; and the computation time decreased by 41.57% and 23.56%. The learning performance prediction framework based on behaviour classification proposed in this paper is thus effective in real scenarios. Building a learning performance predictor with this framework reduces the dimensionality of the feature data and also markedly improves prediction performance compared to traditional methods that build predictors using only data pre-processing or feature selection strategies.

PBC model validity verification

The feature fusion link is a critical step of the proposed framework, and the learning behaviour classification model directly determines the effect of feature fusion. We thus designed 3 comparative experiments (with different learning behaviour classification models), built 4 types of learning performance predictors (24 in total) based on 6 common machine learning algorithms, and analysed the proposed PBC model based on the prediction effects of these predictors. The accuracy, F1-score, and Kappa of the four types of learning performance predictors are shown in Figs. 9, 10, and 11.

Figure 8. Computation time required for each of the three types of prediction models.

Figure 9. Accuracy of the four types of prediction models.

Figure 10. F1-score of the four types of prediction models.

Figure 11. Kappa of the four types of prediction models.

Figure 9 describes the prediction accuracy of the four groups of experiments (PBC model group, Moore group, Wu group, Peng group). The accuracy of the PBC model group is between 95.44% and 97.40%; that of the Moore group is between 94.25% and 96.42%; that of the Wu group is between 95.01% and 96.10%; and that of the Peng group is between 90.89% and 95.34%. The PBC model group thus achieved higher accuracy with the naïve Bayes, KNN (U), and KNN (D) algorithms. Figure 10 shows the F1-score results of the four sets of experiments. The F1-score of the six predictors based on the PBC model is between 0.9685 and 0.9818; that based on the Moore group is between 0.9606 and 0.9749; that based on the Wu group is between 0.9654 and 0.9727; and that based on the Peng group is between 0.9388 and 0.9677. The six algorithm models based on the PBC model all have higher F1-scores; the Moore-based and Wu-based F1-score performances are equivalent; and the Peng-based F1-score performance is the worst. Figure 11 shows the Kappa values of the comparative experiment. The Kappa interval of the PBC model group is 0.8865 to 0.9364; that of the Moore group is 0.8550 to 0.9126; that of the Wu group is 0.8764 to 0.9043; and that of the Peng group is 0.7881 to 0.8844. Thus, except for the SVC(R) algorithm, the Kappa of the PBC model group is higher than that of the other groups for every algorithm, and the indicators of the PBC model group are the most stable.

Further analysis of Figs. 9, 10, and 11 shows that the average accuracy rate of the PBC model group is 0.65%, 0.60%, and 2.02% higher than that of the Moore group, Wu group, and Peng group, respectively, and the upper (lower) limits of the accuracy increase by 0.98% (1.19%), 1.34% (0.33%), and 2.06% (4.56%); the average F1-score is higher by 0.45%, 0.41%, and 1.44%, and the upper (lower) limits of the F1-score increase by 0.69% (0.79%), 0.90% (0.32%), and 1.41% (3.48%); the average Kappa is higher by 1.61%, 1.48%, and 4.86%, respectively, and the upper (lower) limits of Kappa increase by 2.38% (3.15%), 3.20% (1.00%), and 5.20% (9.84%). Thus, the predictor constructed based on the PBC model achieves the best accuracy, F1-score, and Kappa values; its prediction performance is better than that of the Moore and Wu classification methods and markedly better than that of the Peng classification method. Therefore, when performing learning performance prediction tasks, using the PBC model to divide learning behaviours is effective and superior to the alternatives.

The learning performance predictor is an effective tool to ensure the quality of e-learning, and how to build one with high versatility and high accuracy has become a research hotspot in e-learning. This paper starts from the novel perspective of behaviour classification, introduces a learning behaviour feature fusion strategy into the traditional method, proposes the BCEP prediction framework, and proposes the PBC model based on a summary of existing e-learning behaviour classification methods. Experimental results on the OULAD dataset show that the BCEP prediction framework performs markedly better than the traditional learning performance predictor construction method, and the learning performance predictor constructed with this framework is accurate and stable. Subsequent experiments showed that the PBC model proposed in this paper, as the feature fusion strategy of the prediction framework, is effective and superior to other e-learning behaviour classification methods, and to a certain extent it also provides a new feasible scheme for quantitatively evaluating the pros and cons of e-learning behaviour classification methods.

In future work, we plan to build learning performance predictors for different e-learning platforms using the framework proposed in this article. By recording and analysing the performance of these predictors in real-world applications, we plan to optimize the proposed BCEP prediction framework. Concurrently, considering that the prediction targets for e-learning should be more diversified, in addition to the e-learning performance mentioned in this article, we plan to use similar methods to predict e-learning emotions to achieve better online supervision and early warning through multiangle prediction results and to ensure the quality of online learners’ learning.

Giannakos, M. N. & Vlamos, P. Educational webcasts’ acceptance: Empirical examination and the role of experience. Br. J. Educ. Technol. 44 , 125–143. https://doi.org/10.1111/j.1467-8535.2011.01279.x (2013).


Qu, S., Li, K., Wu, B., Zhang, X. & Zhu, K. Predicting student performance and deficiency in mastering knowledge points in moocs using multi-task learning. Entropy 21 , 1216. https://doi.org/10.3390/e21121216 (2019).


Gasevic, D., Siemens, G. & Rose, C. P. Guest editorial: Special section on learning analytics. IEEE Trans. Learn. Technol. 10 , 3–5. https://doi.org/10.1109/tlt.2017.2670999 (2017).

Shu, Y., Jiang, Q. & Zhao, W. Accurate alerting and prevention of online learning crisis: An empirical study of a model. Dist. Educ. China https://doi.org/10.13541/j.cnki.chinade.2019.08.004 (2019).

Sun, Y. Characteristics analysis of online learning behavior of distance learners in open university. China Educ. Technol. 2 , 64–71 (2015).


Cohen, A. Analysis of student activity in web-supported courses as a tool for predicting dropout. Etr&D-Educ. Technol. Res. Dev. 65 , 1285–1304. https://doi.org/10.1007/s11423-017-9524-3 (2017).

Lin, J. Moocs learner characteristics and study effect analysis research. China Audio-vis. Educ. 2 , 2 (2013).

Balakrishnan, G. Predicting student retention in massive open online courses using hidden Markov models. Digit. Collect. 2 , 2 (2013).

Joksimović, S. et al. How do we model learning at scale? A systematic review of research on MOOCs. Rev. Educ. Res. 88 (1), 43–86. https://doi.org/10.3102/0034654317740335 (2017).

Coussement, K., Phan, M., De Caigny, A., Benoit, D. F. & Raes, A. Predicting student dropout in subscription-based online learning environments: The beneficial impact of the logit leaf model. Decis. Support Syst. 135 , 113325. https://doi.org/10.1016/j.dss.2020.113325 (2020).

Kotsiantis, S., Pierrakeas, C. & Pintelas, P. Preventing student dropout in distance learning using machine learning techniques. Appl. Artif. Intell. 18 , 411–426. https://doi.org/10.1080/08839510490442058 (2003).

Lei, Z. & Tong, D. The prediction of academic achievement and analysis of group characteristics for mooc learners based on data mining. Chongqing Higher Educ. Res. 2 , 1–13 (2021).

Yang Zong, H. Z. & Hongtao, S. A logistic regression analysis of learning behaviors and learning outcomes in moocs. Dist. Educ. China https://doi.org/10.13541/j.cnki.chinade.20160527.002 (2016).

Fan, Y. & Wang, Q. Prediction of academic performance and risk: A review of literature on predicative indicators in learning analytics. Dist. Educ. China https://doi.org/10.13541/j.cnki.chinade.2018.01.001 (2018).

Romero, C., Cerezo, R., Bogarín, A. & Sànchez-Santillán, M. Educational process mining: A tutorial and case study using moodle data sets. Data Min. Learn. Anal. Appl. Educ. Res. 2 , 1–28 (2016).

Nawang, H., Makhtar, M. & Shamsudin, S. Classification model and analysis on students’ performance. J. Fundam. Appl. Sci. 9 , 869–885. https://doi.org/10.4314/jfas.v9i6s.65 (2017).

Keogh, E. J. & Mueen, A. Curse of dimensionality. In Encyclopedia of Machine Learning and Data Mining 314–315. https://doi.org/10.1007/978-1-4899-7687-1_192 (2017).

Hooshyar, D., Pedaste, M. & Yang, Y. Mining educational data to predict students’ performance through procrastination behavior. Entropy 22 , 12. https://doi.org/10.3390/e22010012 (2020).


Du, X., Yang, J., Shelton, B. E., Hung, J. & Zhang, M. A systematic meta-review and analysis of learning analytics research. Behav. Inf. Technol. 40 , 49–62. https://doi.org/10.1080/0144929X.2019.1669712 (2021).

Shelton, B. E., Yang, J., Hung, J.-L. & Du, X. Two-stage predictive modeling for identifying at-risk students. In Innovative Technologies and Learning, ICITL 2018 , vol. 11003 of Lecture Notes in Computer Science , 578–583. https://doi.org/10.1007/978-3-319-99737-7_61 (Springer, 2018).

Lagus, J., Longi, K., Klami, A. & Hellas, A. Transfer-learning methods in programming course outcome prediction. ACM Trans. Comput. Educ. https://doi.org/10.1145/3152714 (2018).

Marquez-Vera, C. et al. Early dropout prediction using data mining: A case study with high school students. Expert. Syst. 33 , 107–124. https://doi.org/10.1111/exsy.12135 (2016).

Marbouti, F., Diefes-Dux, H. & Madhavan, K. Models for early prediction of at-risk students in a course using standards-based grading. Comput. Educ. 103 , 1–15. https://doi.org/10.1016/j.compedu.2016.09.005 (2016).

Zhao, L. et al. Academic performance prediction based on multisource, multifeature behavioral data. IEEE Access 9 , 5453–5465. https://doi.org/10.1109/access.2020.3002791 (2021).

Kumar, K. & Vivekanandan, V. Advancing learning through smart learning analytics: A review of case studies. Asian Assoc. Open Universities J. (2018).

Yao, Z. A review of the student engagement theory. J. Shunde Polytechnic 16 , 44–52 (2018).

Ma, Z., Su, S. & Zhang, T. Research on the e-learning behavior model based on the theory of learning engagement–taking the course of ”the design and implementation of network teaching platform” as an example. Modern Educational Technology 27 , 74–80 (2017).

Agudo-Peregrina, Á. F., Iglesias-Pradas, S., Conde-González, M. Á. & Hernández-García, Á. Can we predict success from log data in VLEs? Classification of interactions for learning analytics and their relation with performance in VLE-supported F2F and online learning. Comput. Hum. Behav. 31 , 542–550. https://doi.org/10.1016/j.chb.2013.05.031 (2014).

Gomez-Aguilar, D. A., Hernandez-Garcia, A., Garcia-Penalvo, J. & Heron, R. Tap into visual analysis of customization of grouping of activities in elearning. Comput. Hum. Behav. 47 , 60–67. https://doi.org/10.1016/j.chb.2014.11.001 (2015).

Kumar, V. S., Pinnell, C. & Paulmani, G. Analytics in Authentic Learning 75–89 (Springer, Berlin, 2018).

Guo, F. & Liu, Q. A study on the correlation between online learning behavior and learning effect–based on the teaching practice of the flipped classroom of blackboard. Higher Educ. Sci. https://doi.org/10.1007/978-981-10-5930-8_6 (2018).

Liang, D., Jia, J., Wu, X., Miao, J. & Wang, A. Analysis of learners’ behaviors and learning outcomes in a massive open online course. Knowl. Manag. E-Learn. Int. J. 6 , 281–298 (2014).

Comer, K. & Clark, C. Peer-to-peer writing in introductory-level moocs. Writing to learn and learning to write across the disciplines. Int. Rev. Res. Open Dist. Learn. 15 , 26–82 (2014).

Kokoç, M. & Altun, A. Effects of learner interaction with learning dashboards on academic performance in an e-learning environment. Behav. Inf. Technol. 40 , 161–175. https://doi.org/10.1080/0144929X.2019.1680731 (2021).

Zheng, B., Lin, C.-H. & Kwon, J. B. The impact of learner-, instructor-, and course-level factors on online learning. Comput. Educ. https://doi.org/10.1016/j.compedu.2020.103851 (2020).

Qureshi, M. A., Khaskheli, A., Qureshi, J. A., Raza, S. A. & Yousufi, S. Q. Factors affecting students’ learning performance through collaborative learning and engagement. Interact. Learn. Environ. https://doi.org/10.1080/10494820.2021.1884886 (2021).

Shen, X., Liu, M., Wu, J. & Dong, X. Towards a model for evaluating students’ online learning behaviors and learning performance. Dist. Educ. China. https://doi.org/10.13541/j.cnki.chinade.2020.10.001 (2020).

Akram, A. et al. Predicting students’ academic procrastination in blended learning course using homework submission data. IEEE Access 7 , 102487–102498. https://doi.org/10.1109/access.2019.2930867 (2019).

Chaity, et al. Feature representations using the reflected rectified linear unit (RReLU) activation. Big Data Mining Anal. 3 , 20–38 (2020).

Madichetty, S. & Sridevi, M. Comparative study of statistical features to detect the target event during disaster. Big Data Mining Anal. 3 , 39–48. https://doi.org/10.26599/BDMA.2019.9020021 (2020).

Saha, S., Ghosh, M., Ghosh, S., Sen, S. & Sarkar, R. Feature selection for facial emotion recognition using cosine similarity-based harmony search algorithm. Appl. Sci. 10 , 2816. https://doi.org/10.3390/app10082816 (2020).


Zigeng, W., Xiao, S. & Rajasekaran R. Novel and efficient randomized algorithms for feature selection. Big Data Mining Anal. 3 , 56–72. https://doi.org/10.26599/BDMA.2020.9020005 (2020).

Chen, L. & Xia, M. A context-aware recommendation approach based on feature selection. Appl. Intell. https://doi.org/10.1007/s10489-020-01835-9 (2020).

Huang, H., Lin, J., Wu, L., Fang, B. & Sun, F. Machine learning-based multi-modal information perception for soft robotic hands. Tsinghua Sci. Technol. 25 , 255–269 (2019).

Cao, Q., Zhang, W. & Zhu, J. Deep learning-based classification of the polar emotions of moe-style cartoon pictures. Tsinghua Sci. Technol. 26 , 275–286 (2021).

Muhammad, M., Liu, Y., Sun, M. & Luan, H. Enriching the transfer learning with pre-trained lexicon embedding for low-resource neural machine translation. Tsinghua Sci. Technol. 26 , 2 (2020).

Vieira, C., Parsons, P. & Byrd, V. Visual learning analytics of educational data: A systematic literature review and research agenda. Comput. Educ. 122 , 119–135. https://doi.org/10.1016/j.compedu.2018.03.018 (2018).

Jiang, S., Williams, A. E., Schenke, K., Warschauer, M. & O’Dowd, D. K. Predicting MOOC performance with week 1 behavior. In Proceedings of the 7th International Conference on Educational Data Mining, EDM 2014, London, UK, July 4–7, 2014 , 273–275 (International Educational Data Mining Society (IEDMS), 2014).

Aziz, A. A., Ahmad, F. I. & Hassan, H. A framework for students’ academic performance analysis using naïve Bayes classifier. Jurnal Teknologi 75 , 2 (2015).

Ahuja, R. & Kankane, Y. Predicting the probability of student’s degree completion by using different data mining techniques. 2017 Fourth International Conference on Image Information Processing 474–477, https://doi.org/10.1109/ICIIP.2017.8313763 (2017).

Asif, R., Merceron, A., Ali, S. A. & Haider, N. G. Analyzing undergraduate students’ performance using educational data mining. Comput. Educ. 113 , 177–194. https://doi.org/10.1016/j.compedu.2017.05.007 (2017).

Shen, H., Ju, S. & Sun, J. Performance prediction based on fuzzy clustering and support vector regression. J. East China Normal Univ. 2 , 66–73 (2019).

Moore, M. G. Three types of interaction. Am. J. Dist. Educ. 3 , 1–6. https://doi.org/10.1080/08923648909526659 (1989).

Hillman, D. C., Willis, D. J. & Gunawardena, C. N. Learner-interface interaction in distance education: An extension of contemporary models and strategies for practitioners. Am. J. Dist. Educ. 8 , 30–42. https://doi.org/10.1080/08923649409526853 (1994).

Hirumi, A. A framework for analyzing, designing, and sequencing planned elearning interactions. Quart. Rev. Dist. Educ. 3 , 141–60 (2002).

Peng, W., Yang, Z. & Huang, K. Analysis of online learning behavior and research on its model. China Educ. Technol. 2 , 31–35 (2006).

Malikowski, S. R., Thompson, M. E. & Theis, J. G. A model for research into course management systems: Bridging technology and learning theory. J. Educ. Comput. Res. 36 , 149–73. https://doi.org/10.2190/1002-1t50-27g2-h3v7 (2007).

Veletsianos, G., Collier, A. & Schneider, E. Digging deeper into learners’ experiences in moocs: Participation in social networks outside of moocs, notetaking and contexts surrounding content consumption. Br. J. Educ. Technol. 46 , 570–587. https://doi.org/10.1111/bjet.12297 (2015).

Wu, L., Lao, C., Liu, Q. & Cheng, Y. Online learning behavior analysis model and its application in network learning space. Mod. Educ. Technol. 28 , 46–53. https://doi.org/10.3969/j.issn.1009-8097.2018.06.007 (2018).

Wu, F. & Tian, H. Mining meaningful features of learning behavior: Research on prediction framework of learning outcomes. Open Educ. Res. 25 , 75–82. https://doi.org/10.13966/j.cnki.kfjyyj.2019.06.008 (2019).

Gayman, C. M., Hammonds, F. & Rost, K. A. Interteaching in an asynchronous online class. Scholarsh. Teach. Learn. Psychol. 4 , 231. https://doi.org/10.1037/stl0000126 (2018).

Kuzilek, J., Hlosta, M. & Zdrahal, Z. Open University Learning Analytics dataset. Sci. Data 4 , 170171. https://doi.org/10.1038/sdata.2017.171 (2017).

Wong, T. & Yeh, P. Reliable accuracy estimates from k-fold cross validation. IEEE Trans. Knowl. Data Eng. 32 , 1586–1594. https://doi.org/10.1109/TKDE.2019.2912815 (2019).


This work was funded by the National Natural Science Foundation of China under Grants 71872131 and 61977058, in part by the Science and Technology Program of Zhejiang Province (2018C01080), and by the STU Scientific Research Initiation Grant (SRIG) under Grant 20007.

Author information

Authors and affiliations.

College of Education, Zhejiang University of Technology, Hangzhou, 310023, China

Feiyue Qiu, Xin Sheng, Lei Jiang, Lijia Zhu & Qifeng Xiang

College of Computer Science and Technology, Zhejiang University of Technology, Hangzhou, 310023, China

Guodao Zhang

Department of Educational Information Technology, East China Normal University, Shanghai, 200062, China

Bo Jiang

Business School and Research Institute for Guangdong-Taiwan Business Cooperation, Shantou University, Shantou, 515000, China

Ping-kuo Chen


Contributions

Conceptualization, G.Z. and P.C.; methodology, X.S.; software, X.S.; validation, L.Z. and Q.X.; formal analysis, X.S. and G.Z.; investigation, G.Z.; resources, G.Z. and X.S.; data curation, L.J.; writing original draft preparation, X.S. and G.Z.; writing review and editing, X.S. and G.Z.; visualization, L.Z. and Q.X.; supervision, G.Z.; project administration, G.Z.; funding acquisition, F.Q. and B.J. All authors have read and agreed to the published version of the manuscript.

Corresponding authors

Correspondence to Xin Sheng or Ping-kuo Chen .

Ethics declarations

Competing interests.

The authors declare no competing interests.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Qiu, F., Zhang, G., Sheng, X. et al. Predicting students’ performance in e-learning using learning process and behaviour data. Sci Rep 12 , 453 (2022). https://doi.org/10.1038/s41598-021-03867-8

Download citation

Received : 05 August 2021

Accepted : 09 December 2021

Published : 10 January 2022

DOI : https://doi.org/10.1038/s41598-021-03867-8



By submitting a comment you agree to abide by our Terms and Community Guidelines . If you find something abusive or that does not comply with our terms or guidelines please flag it as inappropriate.

Quick links

  • Explore articles by subject
  • Guide to authors
  • Editorial policies

Sign up for the Nature Briefing: AI and Robotics newsletter — what matters in AI and robotics research, free to your inbox weekly.

e learning research topics 2021

Systematic Literature Review of E-Learning Capabilities to Enhance Organizational Learning

  • Open access
  • Published: 01 February 2021
  • Volume 24 , pages 619–635, ( 2022 )

Cite this article

You have full access to this open access article

e learning research topics 2021

  • Michail N. Giannakos 1 ,
  • Patrick Mikalef 1 &
  • Ilias O. Pappas   ORCID: orcid.org/0000-0001-7528-3488 1 , 2  

20k Accesses

31 Citations

Explore all metrics

E-learning systems are receiving ever increasing attention in academia, business and public administration. Major crises, like the pandemic, highlight the tremendous importance of the appropriate development of e-learning systems and its adoption and processes in organizations. Managers and employees who need efficient forms of training and learning flow within organizations do not have to gather in one place at the same time or to travel far away to attend courses. Contemporary affordances of e-learning systems allow users to perform different jobs or tasks for training courses according to their own scheduling, as well as to collaborate and share knowledge and experiences that result in rich learning flows within organizations. The purpose of this article is to provide a systematic review of empirical studies at the intersection of e-learning and organizational learning in order to summarize the current findings and guide future research. Forty-seven peer-reviewed articles were collected from a systematic literature search and analyzed based on a categorization of their main elements. This survey identifies five major directions of the research on the confluence of e-learning and organizational learning during the last decade. Future research should leverage big data produced from the platforms and investigate how the incorporation of advanced learning technologies (e.g., learning analytics, personalized learning) can help increase organizational value.


1 Introduction

E-learning covers the integration of information and communication technology (ICT) in environments with the main goal of fostering learning (Rosenberg and Foshay 2002 ). The term “e-learning” is often used as an umbrella term to portray several modes of digital learning environments (e.g., online, virtual learning environments, social learning technologies). Digitalization seems to challenge numerous business models in organizations and raises important questions about the meaning and practice of learning and development (Dignen and Burmeister 2020 ). Among other things, the digitalization of resources and processes enables flexible ways to foster learning across an organization’s different sections and personnel.

Learning has long been associated with formal or informal education and training. However, organizational learning is much more than that. It can be defined as “a learning process within organizations that involves the interaction of individual and collective (group, organizational, and inter-organizational) levels of analysis and leads to achieving organizations’ goals” (Popova-Nowak and Cseh 2015), with a focus on the flow of knowledge across the different organizational levels (Oh 2019). Flow of knowledge, or learning flow, is the way in which new knowledge flows from the individual to the organizational level (i.e., feed forward) and vice versa (i.e., feedback) (Crossan et al. 1999; March 1991). Learning flow and the respective processes constitute the cornerstone of an organization’s learning activities (e.g., from physical training meetings to digital learning resources), they are directly connected to the psycho-social experiences of an organization’s members, and they eventually lead to organizational change (Crossan et al. 2011). Organizational learning is extremely important because it is associated with the process of creating value from an organization’s intangible assets. Moreover, it combines notions from several different domains, such as organizational behavior, human resource management, artificial intelligence, and information technology (El Kadiri et al. 2016).

A growing body of literature lies at the intersection of e-learning and organizational learning. However, there is limited work on the qualities of e-learning and the potential of its qualities to enhance organizational learning (Popova-Nowak and Cseh 2015 ). Blockages and disruptions in the internal flow of knowledge is a major reason why organizational change initiatives often fail to produce their intended results (Dee and Leisyte 2017 ). In recent years, several models of organizational learning have been published (Berends and Lammers 2010 ; Oh 2019 ). However, detailed empirical studies indicate that learning does not always proceed smoothly in organizations; rather, the learning meets interruptions and breakdowns (Engeström et al. 2007 ).

Discontinuities and disruptions are common phenomena in organizational learning (Berends and Lammers 2010 ), and they stem from various causes. For example, organizational members’ low self-esteem, unsupportive technology and instructors (Garavan et al. 2019 ), and even crises like the Covid-19 pandemic can result in demotivated learners and overall unwanted consequences for their learning (Broadbent 2017 ). In a recent conceptual article, Popova-Nowak and Cseh ( 2015 ) emphasized that there is a limited use of multidisciplinary perspectives to investigate and explain the processes and importance of utilizing the available capabilities and resources and of creating contexts where learning is “attractive to individual agents so that they can be more engaged in exploring ways in which they can contribute through their learning to the ongoing renewal of organizational routines and practices” (Antonacopoulou and Chiva 2007 , p. 289).

Despite the importance of e-learning, the lack of systematic reviews in this area significantly hinders research on the highly promising value of e-learning capabilities for efficiently supporting organizational learning. This gap leaves practitioners and researchers in uncharted territories when faced with the task of implementing e-learning designs or deciding on their digital learning strategies to enhance the learning flow of their organizations. Hence, in order to derive meaningful theoretical and practical implications, as well as to identify important areas for future research, it is critical to understand how the core capabilities pertinent to e-learning possess the capacity to enhance organizational learning.

In this paper, we define e-learning enhanced organizational learning (eOL) as the utilization of digital technologies to enhance the process of improving actions through better knowledge and understanding in an organization. In recent years, a significant body of research has focused on the intersection of e-learning and organizational learning (e.g., Khandakar and Pangil 2019 ; Lin et al. 2019 ; Menolli et al. 2020 ; Turi et al. 2019 ; Xiang et al. 2020 ). However, there is a lack of systematic work that summarizes and conceptualizes the results in order to support organizations that want to move from being information-based enterprises to being knowledge-based ones (El Kadiri et al. 2016 ). In particular, recent technological advances have led to an increase in research that leverages e-learning capacities to support organizational learning, from virtual reality (VR) environments (Costello and McNaughton 2018 ; Muller Queiroz et al. 2018 ) to mobile computing applications (Renner et al. 2020 ) to adaptive learning and learning analytics (Zhang et al. 2019 ). These studies support different skills, consider different industries and organizations, and utilize various capacities while focusing on various learning objectives (Garavan et al. 2019 ). Our literature review aims to tease apart these particularities and to investigate how these elements have been utilized over the past decade in eOL research. Therefore, in this review we aim to answer the following research questions (RQs):

RQ1: What is the status of research at the intersection of e-learning and organizational learning, seen through the lens of areas of implementation (e.g., industries, public sector), technologies used, and methodologies (e.g., types of data and data analysis techniques employed)?

RQ2: How can e-learning be leveraged to enhance the process of improving actions through better knowledge and understanding in an organization?

Our motivation for this work is based on the emerging developments in the area of learning technologies that have created momentum for their adoption by organizations. This paper provides a review of research on e-learning capabilities to enhance organizational learning with the purpose of summarizing the findings and guiding future studies. This study can provide a springboard for other scholars and practitioners, especially in the area of knowledge-based enterprises, to examine e-learning approaches by taking into consideration the prior and ongoing research efforts. Therefore, in this paper we present a systematic literature review (SLR) (Kitchenham and Charters 2007 ) on the confluence of e-learning and organizational learning that uncovers initial findings on the value of e-learning to support organizational learning while also delineating several promising research streams.

The rest of this paper is organized as follows. In the next section, we present the related background work. The third section describes the methodology used for the literature review and how the studies were selected and analyzed. The fourth section presents the research findings derived from the data analysis based on the specific areas of focus. In the fifth section, we discuss the findings, the implications for practice and research, and the limitations of the selected methodological approach. In the final section, we summarize the conclusions from the study and make suggestions for future work.

2 Background and Related Work

2.1 E-learning Systems

E-learning systems provide solutions that deliver knowledge and information, facilitate learning, and increase performance by developing appropriate knowledge flow inside organizations (Menolli et al. 2020 ). Putting into practice and appropriately managing technological solutions, processes, and resources are necessary for the efficient utilization of e-learning in an organization (Alharthi et al. 2019 ). Examples of e-learning systems that have been widely adopted by various organizations are Canvas, Blackboard, and Moodle. Such systems provide innovative services for students, employees, managers, instructors, institutions, and other actors to support and enhance the learning processes and facilitate efficient knowledge flow (Garavan et al. 2019 ). Functionalities, such as creating modules to organize mini course information and learning materials or communication channels such as chat, forums, and video exchange, allow instructors and managers to develop appropriate training and knowledge exchange (Wang et al. 2011 ). Nowadays, the utilization of various e-learning capabilities is a commodity for supporting organizational and workplace learning. Such learning refers to training or knowledge development (also known in the literature as learning and development, HR development, and corporate training: Smith and Sadler-Smith 2006 ; Garavan et al. 2019 ) that takes place in the context of work.

Previous studies have focused on evaluating e-learning systems that utilize various models and frameworks. In particular, the development of maturity models, such as the e-learning capability maturity model (eLCMM), addresses technology-oriented concerns (Hammad et al. 2017 ) by overcoming the limitations of the domain-specific models (e.g., game-based learning: Serrano et al.  2012 ) or more generic lenses such as the e-learning maturity model (Marshall 2006 ). The aforementioned models are very relevant since they focus on assessing the organizational capabilities for sustainably developing, deploying, and maintaining e-learning. In particular, the eLCMM focuses on assessing the maturity of adopting e-learning systems and adds a feedback building block for improving learners’ experiences (Hammad et al. 2017 ). Our proposed literature review builds on the previously discussed models, lenses, and empirical studies, and it provides a review of research on e-learning capabilities with the aim of enhancing organizational learning in order to complement the findings of the established models and guide future studies.

E-learning systems can be categorized into different types, depending on their functionalities and affordances. One very popular e-learning type is the learning management system (LMS), which includes a virtual classroom and collaboration capabilities and allows the instructor to design and orchestrate a course or a module. An LMS can be either proprietary (e.g., Blackboard) or open source (e.g., Moodle). These two types differ in their features, costs, and the services they provide; for example, proprietary systems prioritize assessment tools for instructors, whereas open-source systems focus more on community development and engagement tools (Alharthi et al. 2019 ). In addition to LMS, e-learning systems can be categorized based on who controls the pace of learning; for example, an institutional learning environment (ILE) is provided by the organization and is usually used for instructor-led courses, while a personal learning environment (PLE) is proposed by the organization and is managed personally (i.e., learner-led courses). Many e-learning systems use a hybrid version of ILE and PLE that allows organizations to have either instructor-led or self-paced courses.

Besides the controlled e-learning systems, organizations have been using environments such as social media (Qi and Chau 2016 ), massive open online courses (MOOCs) (Weinhardt and Sitzmann 2018 ) and other web-based environments (Wang et al. 2011 ) to reinforce their organizational learning potential. These systems have been utilized through different types of technology (e.g., desktop applications, mobile) that leverage the various capabilities offered (e.g., social learning, VR, collaborative systems, smart and intelligent support) to reinforce the learning and knowledge flow potential of the organization. Although there is a growing body of research on e-learning systems for organizational learning due to the increasingly significant role of skills and expertise development in organizations, the role and alignment of the capabilities of the various e-learning systems with the expected competency development remains underexplored.

2.2 Organizational Learning

There is a large body of research on the utilization of technologies to improve the process and outcome dimensions of organizational learning (Crossan et al. 1999). Most studies have focused on the learning process and on the added value that new technologies can offer by replacing some of the face-to-face processes with virtual processes or by offering new, technology-mediated phases to the process (Menolli et al. 2020). Lau (2015) highlighted how VR capabilities can enhance organizational learning, describing the new challenges and frameworks needed in order to effectively utilize this potential. In the same vein, Zhang et al. (2017) described how VR influences reflective thinking and considered its indirect value to overall learning effectiveness. In general, contemporary research has investigated how novel technologies and approaches have been utilized to enhance organizational learning, and it has highlighted both the promises and the limitations of the use of different technologies within organizations.

In many organizations, alignment with the established infrastructure and routines, and adoption by employees, are core elements for effective organizational learning (Wang et al. 2011). Strict policies, low digital competence, and operational challenges are some of the elements that hinder e-learning adoption by organizations (Garavan et al. 2019). Wang (2018) demonstrated the importance of organizational, managerial, and job support for utilizing individual and social learning in order to increase the adoption of organizational learning. Other studies have focused on the importance of communication through different social channels to develop understanding of new technology, to overcome the challenges employees face when engaging with new technology, and, thereby, to support organizational learning (Menolli et al. 2020). By considering the related work in the area of organizational learning, we identified a gap in aligning an organization’s learning needs with the capabilities offered by the various technologies. Thus, systematic work is needed to review e-learning capabilities and how these capabilities can efficiently support organizational learning.

2.3 E-learning Systems to Enhance Organizational Learning

When considering the interplay between e-learning systems and organizational learning, we observed that a major challenge for today’s organizations is to switch from being information-based enterprises to become knowledge-based enterprises (El Kadiri et al. 2016 ). Unidirectional learning flows, such as formal and informal training, are important but not sufficient to cover the needs that enterprises face (Manuti et al. 2015 ). To maintain enterprises’ competitiveness, enterprise staff have to operate in highly intense information and knowledge-oriented environments. Traditional learning approaches fail to substantiate learning flow on the basis of daily evidence and experience. Thus, novel, ubiquitous, and flexible learning mechanisms are needed, placing humans (e.g., employees, managers, civil servants) at the center of the information and learning flow and bridging traditional learning with experiential, social, and smart learning.

Organizations consider lack of skills and competences as being the major knowledge-related factors hampering innovation (El Kadiri et al. 2016 ). Thus, solutions need to be implemented that support informal, day-to-day, and work training (e.g., social learning, collaborative learning, VR/AR solutions) in order to develop individual staff competences and to upgrade the competence affordances at the organizational level. E-learning-enhanced organizational learning has been delivered primarily in the form of web-based learning (El Kadiri et al. 2016 ). More recently, the TEL tools portfolio has rapidly expanded to make more efficient joint use of novel learning concepts, methodologies, and technological enablers to achieve more direct, effective, and lasting learning impacts. Virtual learning environments, mobile-learning solutions, and AR/VR technologies and head-mounted displays have been employed so that trainees are empowered to follow their own training pace, learning topics, and assessment tests that fit their needs (Costello and McNaughton 2018 ; Mueller et al. 2011 ; Muller Queiroz et al. 2018 ). The expanding use of social networking tools has also brought attention to the contribution of social and collaborative learning (Hester et al. 2016 ; Wei and Ram 2016 ).

Contemporary learning systems supporting adaptive, personalized, and collaborative learning expand the tools available in eOL and contribute to the adoption, efficiency, and general prospects of the introduction of TEL in organizations (Cheng et al. 2011). In recent years, eOL has emphasized how enterprises share knowledge internally and externally, with particular attention being paid to systems that leverage collaborative learning and social learning functionalities (Qi and Chau 2016; Wang 2011). This is the essence of computer-supported collaborative learning (CSCL). The CSCL literature has developed a framework that combines individual learning, organizational learning, and collaborative learning, facilitated by adequate learning flows, from which effective enterprise learning emerges (Goggins et al. 2013), as shown in Fig. 1.

Fig. 1 Representation of the combination of enterprise learning and knowledge flows (adapted from Goggins et al. 2013)

Establishing efficient knowledge and learning flows is a primary target for future data-driven enterprises (El Kadiri et al. 2016 ). Given the involved knowledge, the human resources, and the skills required by enterprises, there is a clear need for continuous, flexible, and efficient learning. This can be met by contemporary learning systems and practices that provide high adoption, smooth usage, high satisfaction, and close alignment with the current practices of an enterprise. Because the required competences of an enterprise evolve, the development of competence models needs to be agile and to leverage state-of-the art technologies that align with the organization’s processes and models. Therefore, in this paper we provide a review of the eOL research in order to summarize the findings, identify the various capabilities of eOL, and guide the development of organizational learning in future enterprises as well as in future studies.

3 Methodology

To answer our research questions, we conducted an SLR, which is a means of evaluating and interpreting all available research relevant to a particular research question, topic area, or phenomenon of interest. An SLR has the capacity to present a fair evaluation of a research topic by using a trustworthy, rigorous, and auditable methodology (Kitchenham and Charters 2007). The guidelines used (Kitchenham and Charters 2007) were derived from three existing guides adopted by medical researchers. Therefore, we adopted SLR guidelines that follow transparent and widely accepted procedures (especially in the areas of software engineering, information systems, and e-learning), minimize potential researcher bias, and support reproducibility (Kitchenham and Charters 2007). Besides the minimization of bias and support for reproducibility, an SLR allows us to provide information about the impact of some phenomenon across a wide range of settings, contexts, and empirical methods. Another important advantage is that, if the selected studies give consistent results, SLRs can provide evidence that the phenomenon is robust and transferable (Kitchenham and Charters 2007).

3.1 Article Collection

Several procedures were followed to ensure a high-quality review of the literature of eOL. A comprehensive search of peer-reviewed articles was conducted in February 2019 (short papers, posters, dissertations, and reports were excluded), based on a relatively inclusive range of key terms: “organizational learning” & “elearning”, “organizational learning” & “e-learning”, “organisational learning” & “elearning”, and “organisational learning” & “e-learning”. Publications were selected from 2010 onwards, because we identified significant advances since 2010 (e.g., MOOCs, learning analytics, personalized learning) in the area of learning technologies. A wide variety of databases were searched, including SpringerLink, Wiley, ACM Digital Library, IEEE Xplore, Science Direct, SAGE, ERIC, AIS eLibrary, and Taylor & Francis. The selected databases were aligned with the SLR guidelines (Kitchenham and Charters 2007 ) and covered the major venues in IS and educational technology (e.g., a basket of eight IS journals, the top 20 journals in the Google Scholar IS subdiscipline, and the top 20 journals in the Google Scholar Educational Technology subdiscipline). The search process uncovered 2,347 peer-reviewed articles.
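The search combined two spelling variants of each key term. As an illustration only (not the authors' actual tooling), the four query strings listed above can be generated programmatically:

```python
from itertools import product

# Spelling variants of the two key terms, as listed in the review's search protocol.
org_variants = ['"organizational learning"', '"organisational learning"']
el_variants = ['"elearning"', '"e-learning"']

# Cartesian product yields the four boolean query strings used across databases.
queries = [f'{org} AND {el}' for org, el in product(org_variants, el_variants)]
for q in queries:
    print(q)
```

Each query would then be issued against every database (SpringerLink, Wiley, ACM Digital Library, and so on), with results merged and deduplicated before screening.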

3.2 Inclusion and Exclusion Criteria

The selection phase determines the overall validity of the literature review, and thus it is important to define specific inclusion and exclusion criteria. As Dybå and Dingsøyr ( 2008 ) specified, the quality criteria should cover three main issues – namely, rigor, credibility, and relevance – that need to be considered when evaluating the quality of the selected studies. We applied eight quality criteria informed by the proposed Critical Appraisal Skills Programme (CASP) and related works (Dybå and Dingsøyr 2008 ). Table 1 presents these criteria.

Therefore, studies were eligible for inclusion if they focused on eOL. The aforementioned criteria were applied in stages 2 and 3 of the selection process (see Fig. 2), when we assessed the papers based on their titles and abstracts and then read the full papers. In March 2020, we performed an additional search (stage 4), following the same process, for papers published after the initial search period (i.e., 2010–February 2019). The additional search returned seven papers. Figure 2 summarizes the stages of the selection process.

Fig. 2 Stages of the selection process

3.3 Analysis

Each collected study was analyzed based on the following elements: study design (e.g., experiment, case study), area (e.g., IT, healthcare), technology (e.g., wiki, social media), population (e.g., managers, employees), sample size, unit of analysis (individual, firm), data collections (e.g., surveys, interviews), research method, data analysis, and the main research objective of the study. It is important to highlight that the articles were coded based on the reported information, that different authors reported information at different levels of granularity (e.g., an online system vs. the name of the system), and that in some cases the information was missing from the paper. Overall, we endeavored to code the articles as accurately and completely as possible.

The coding process was iterative, with regular consensus meetings between the two researchers involved. The primary coder prepared the initial coding for a number of articles, and both coders reviewed and agreed on the coding in order to reach the final codes presented in the Appendix. Disagreements between the coders and inexplicit aspects of the reviewed papers were discussed and resolved in regular consensus meetings. Although this process did not yield reliability indices (e.g., Cohen’s kappa), it did provide a certain reliability in terms of consistency of the coding and what Krippendorff (2018) described as reliability: “the degree to which members of a designated community concur on the readings, interpretations, responses to, or uses of given texts or data”. This is considered acceptable research practice (McDonald et al. 2019).
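Since the review mentions Cohen's kappa as the reliability index that was not computed, a minimal sketch of how it could be derived for two coders may be useful; the label sequences below are invented for illustration and are not data from the review.

```python
from collections import Counter

def cohen_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders labelling the same set of items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: proportion of items both coders labelled identically.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement under independence, from each coder's label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical study-design codes assigned by two coders to six articles:
a = ["survey", "survey", "case", "experiment", "survey", "case"]
b = ["survey", "case",   "case", "experiment", "survey", "case"]
print(round(cohen_kappa(a, b), 3))  # 0.739
```

Values above roughly 0.6 are conventionally read as substantial agreement, which is why consensus-based coding, as used here, is often paired with such an index when sample sizes permit.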

4 Findings

In this section, we present the detailed results of the analysis of the 47 papers. Analysis of the studies was performed using non-statistical methods that considered the variables reported in the Appendix. This section is followed by an analysis and discussion of the categories.

4.1 Sample Size and Population Involved

The categories related to the sample of the articles included the number of participants in each study (size), their position (e.g., managers, employees), and the area/topic covered by the study. The majority of the studies involved employees (29), with few studies involving managers (6), civil servants (2), learning specialists (2), clients, and researchers. Regarding the sample size, approximately half of the studies (20) were conducted with fewer than 100 participants; some (12) can be considered large-scale studies (more than 300 participants); and only a few (9) can be considered small scale (fewer than 20 participants). In relation to the area/topic of the study, most studies (11) were conducted in the context of the IT industry, but there was also good coverage of other important areas (i.e., healthcare, telecommunications, business, public sector). Interestingly, several studies either did not define the area or were implemented in a generic context (sector-agnostic studies, n = 10), and some studies were implemented in a multi-sector context (e.g., participants from different sections or companies, n = 4).
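The sample-size bands used above (fewer than 20 participants as small scale, more than 300 as large scale) can be expressed as a small helper; the study sizes below are hypothetical examples, not figures from the review.

```python
def size_category(n_participants):
    """Bucket a study's sample size using the review's thresholds:
    small scale (< 20), large scale (> 300), medium otherwise."""
    if n_participants < 20:
        return "small"
    if n_participants > 300:
        return "large"
    return "medium"

# Hypothetical sample sizes for a handful of studies:
sizes = [12, 85, 450, 150, 19, 301]
print([size_category(s) for s in sizes])
```

Applying such a function to the coded Appendix data is all that is needed to reproduce the counts per band reported in this subsection.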

4.2 Research Methods

When assessing the status of research for an area, one of the most important aspects is the methodology used. By “method” in the Appendix , we refer to the distinction between quantitative, qualitative, and mixed methods research. In addition to the method, in our categorization protocol we also included “study design” to refer to the distinction between survey studies (i.e., those that gathered data by asking a group of participants), experiments (i.e., those that created situations to record beneficial data), and case studies (i.e., those that closely studied a group of individuals).

Based on this categorization, the Appendix shows that the majority of the papers were quantitative (34), with seven qualitative and six mixed-methods studies. Regarding the study design, most of the studies were survey studies (26), 13 were case studies, and fewer were experiments (8). For most studies, the individual participant (40) was the unit of analysis, with few studies having the firm as the unit of analysis, and only one study using the training session as the unit of analysis. Regarding the measures used in the studies, most utilized surveys (39), with 11 using interviews, and only a few using field notes from focus groups (2) or log files from the systems (2). Only eight studies used different measures to triangulate or extend their findings. Most articles used structural equation modeling (SEM) to analyze their data (17), with 13 studies employing descriptive statistics, seven using content analysis, nine using regression analysis or analyses of variance/covariance, and one study using social network analysis (SNA).

4.3 Technologies

Concerning the technology used, most of the studies (17) did not study a specific system, referring instead in their investigation to a generic e-learning or technological solution. Several studies (9) named web-based learning environments, without describing the functionalities of the identified system. Other studies focused on online learning environments (4), collaborative learning systems (3), social learning systems (3), smart learning systems (2), podcasting (2), with the rest of the studies using a specific system (e.g., a wiki, mobile learning, e-portfolios, Second Life, web application).

4.4 Research Objectives

The research objectives of the studies could be separated into six main categories. The first category focuses on the intention of the employees to use the technology (9); the second focuses on the performance of the employees (8); the third focuses on the value/outcome for the organization (4); the fourth focuses on the actual usage of the system (7); the fifth focuses on employees’ satisfaction (4); and the sixth focuses on the ability of the proposed system to foster learning (9). In addition to these six categories, we also identified studies that focused on potential barriers for eOL in organizations (Stoffregen et al. 2016 ), the various benefits associated with the successful implementation of eOL (Liu et al. 2012 ), the feasibility of eOL (Kim et al. 2014 ; Mueller et al. 2011 ), and the alignment of the proposed innovation with the other processes and systems in the organization (Costello and McNaughton 2018 ).

4.5 E-learning Capabilities in Various Organizations and for Various Objectives

The technology used has an inherent role for both the organization and the expected eOL objective. E-learning systems are categorized based on their functionalities and affordances. Based on the information reported in the selected papers, we ranked them based on the different technologies and functionalities (e.g., collaborative, online, smart). To do so, we focused on the main elements described in the selected paper; for instance, a paper that described the system as wiki-based or indicated that the system was Second Life was ranked as such, rather than being added to collaborative systems or social learning respectively. We did this because we wanted to capture all the available information since it gave us additional insights (e.g., Second Life is both a social and a VR system).

To investigate the connection between the various technologies used to enhance organizational learning and their application in the various organizations, we utilized the coding (see Appendix ) and mapped the various e-learning technologies (or their affordances) with the research industries to which they applied (Fig.  3 ). There was occasionally a lack of detailed information about the capabilities of the e-learning systems applied (e.g., generic, or a web application, or an online system), which limited the insights. Figure 3 provides a useful mapping of the confluence of e-learning technologies and their application in the various industries.

Fig. 3 Association of the different e-learning technologies with the industries to which they are applied in the various studies. Note: The size of the circles depicts the frequency of studies, with the smallest circle representing one study and the largest representing six studies. The mapping is extracted from the data in the Appendix, which outlines the papers that belong in each of the circles.
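The technology-by-industry mapping visualized in Fig. 3 amounts to a frequency cross-tabulation of coded pairs. A minimal sketch, with invented (technology, industry) codes standing in for the Appendix data:

```python
from collections import Counter

# Hypothetical (technology, industry) pairs coded from reviewed papers;
# the real mapping comes from the Appendix of the article.
coded_pairs = [
    ("web-based", "IT"), ("web-based", "IT"), ("wiki", "IT"),
    ("VR", "healthcare"), ("social", "business"), ("web-based", "public sector"),
]

# Frequency of each technology-industry combination, i.e. the circle
# sizes in a bubble-chart mapping such as Fig. 3.
freq = Counter(coded_pairs)
for (tech, industry), n in sorted(freq.items()):
    print(f"{tech:12s} x {industry:14s}: {n}")
```

The same tabulation, with objectives in place of industries, would yield the mapping shown in Fig. 4.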

To investigate the connection between the various technologies used to enhance organizational learning and their intended objectives, we utilized the coding of the articles (see Appendix ) and mapped the various e-learning technologies (or their affordances) with the intended objectives, as reported in the various studies (Fig.  4 ). The results in Fig.  4 show the objectives that are central in eOL research (e.g., performance, fostering learning, adoption, and usage) as well as those objectives on which few studies have focused (e.g., alignment, feasibility, behavioral change). In addition, the results also indicate the limited utilization of the various e-learning capabilities (e.g., social, collaborative, smart) to achieve objectives connected with those capabilities (e.g., social learning and behavioral change, collaborative learning, and barriers).

Fig. 4

Association of the different e-learning technologies with the objectives investigated in the various studies. Note: The size of the circles depicts the frequency of studies, with the smallest circle representing one study and the largest representing five studies. The mapping is extracted from the data in the Appendix , which outlines the papers that belong in each of the circles

5 Discussion

After reviewing the 47 identified articles in the area of eOL, we can observe that all the works acknowledge the importance of the affordances offered by different e-learning technologies (e.g., remote collaboration, anytime-anywhere access), the importance of the relationship between eOL and employees’ satisfaction and performance, and the benefits associated with organizational value and outcome. Most of the studies agree that eOL provides employees, managers, and even clients with opportunities to learn in a more differentiated manner compared to formal and face-to-face learning. However, how the organization adopts these capabilities and puts them into practice to achieve its goals is a complex and challenging procedure that seems to be underexplored.

Several studies (Lee et al. 2015a ; Muller Queiroz et al. 2018 ; Tsai et al. 2010 ) focused on the positive effect of perceived managerial support, perceived usefulness, perceived ease of use, and other technology acceptance model (TAM) constructs of the e-learning system in supporting all three levels of learning (i.e., individual, collaborative, and organizational). Another interesting dimension highlighted by many studies (Choi and Ko 2012 ; Khalili et al. 2012 ; Yanson and Johnson 2016 ) is the role of socialization in the adoption and usage of the e-learning systems that offer these capabilities. Building connections and creating a shared learning space in the e-learning system is challenging but also critical for the learners (Yanson and Johnson 2016 ). This is consistent with the expectancy-theoretical explanation of how social context impacts on employees’ motivation to participate in learning (Lee et al. 2015a ; Muller Queiroz et al. 2018 ).

The organizational learning literature suggests that e-learning may be more appropriate for the acquisition of certain types of knowledge than others (e.g., procedural vs. declarative, or hard skills vs. soft skills); however, there is no empirical evidence for this (Yanson and Johnson 2016). To advance eOL research, there is a need for a significant move to address complex, strategic skills by including learning and development professionals (Garavan et al. 2019) and by developing strategic relationships. Another important element is to utilize e-learning technology that addresses and integrates organizational, individual, and social perspectives in eOL (Wang 2011). This is also evident in our literature review, since we found only a limited number of specialized e-learning systems in domain areas that have traditionally benefited from such technology. For instance, although there were studies that utilized VR environments (Costello and McNaughton 2018; Muller Queiroz et al. 2018) and video-based learning systems (Wei et al. 2013; Wei and Ram 2016), there was limited focus in contemporary eOL research on how specific affordances of the various environments that are used in organizations (e.g., Carnetsoft, Outotec HSC, and Simscale for simulations of working environments; or Raptivity, YouTube, and FStoppers to gain specific skills and how-to knowledge) can benefit the intended goals or be integrated with the unique qualities of the organization (e.g., IT, healthcare).

For the design and the development of the eOL approach, the organization needs to consider the alignment of individual learning needs, organizational objectives, and the necessary resources (Wang  2011 ). To achieve this, it is advisable for organizations to define the expected objectives, catalogue the individual needs, and select technologies that have the capacity to support and enrich learners with self-directed and socially constructed learning practices in the organization (Wang  2011 ). This needs to be done by taking into consideration that on-demand eOL is gradually replacing the classic static eOL curricula and processes (Dignen and Burmeister 2020 ).

Another important dimension of eOL research is the lens used to approach effectiveness. The selected papers approached effectiveness with various objectives, such as fostering learning, usage of the e-learning system, employees’ performance, and the added organizational value (see Appendix). To measure these indices, various metrics (quantitative, qualitative, and mixed) have been applied. The quantitative dimensions emphasize employees’ satisfaction and system usage (e.g., Menolli et al. 2020; Turi et al. 2019), as well as managers’ perceived gained value and benefits (e.g., Lee et al. 2015b; Xiang et al. 2020) and firms’ perceived effective utilization of eOL resources (López-Nicolás and Meroño-Cerdán 2011). The qualitative dimensions focus on usage, feasibility, and experience at different levels within an organization, based on interviews, focus groups, and observations (Costello and McNaughton 2018; Michalski 2014; Stoffregen et al. 2016). However, it is not always clear how eOL effectiveness has been measured, nor the extent to which eOL is well aligned with, and strategically impactful on, delivering the strategic agenda of the organization (Garavan et al. 2019).

Research on digital technologies is developing rapidly, and big data and business analytics have the potential to pave the way for organizations’ digital transformation and sustainable development (Mikalef et al. 2018 ; Pappas et al. 2018 ); however, our review finds surprisingly limited use of big data and analytics in eOL. Despite contemporary e-learning systems adopting data-driven mechanisms, as well as advances in learning analytics (Siemens and Long 2011 ), the results of our analysis indicate that learner-generated data in the context of eOL are used in only a few studies to extract very limited insights with respect to the effectiveness of eOL and the intended objectives of the respective study (Hung et al. 2015 ; Renner et al. 2020 ; Rober and Cooper 2011 ). Therefore, eOL research needs to focus on data-driven qualities that will allow future researchers to gain deeper insights into which capabilities need to be developed to monitor the effectiveness of the various practices and technologies, their alignment with other functions of the organization, and how eOL can be a strategic and impactful vehicle for materializing the strategic agenda of the organization.

5.1 Status of eOL Research

The current review suggests that, while the efficient implementation of eOL entails certain challenges, there is also great potential for improving employees’ performance as well as overall organizational outcome and value. There are also opportunities for improving organizations’ learning flow, which might not be feasible with formal learning and training. In order to construct the main research dimensions of eOL research and to look more deeply at the research objectives of the studies (the information we coded as objectives in the Appendix), we performed a content analysis and grouped the research objectives. This enabled us to summarize the contemporary research on eOL according to five major categories, each of which is described further below. As the research objectives of the published works show, the research on eOL conducted during the last decade has particularly focused on the following five directions.

Investigating the capabilities of different technologies in different organizations.

Research has particularly focused on how easy the technology is to use, on how useful it is, or on how well aligned/integrated it is with other systems and processes within the organization. In addition, studies have used different learning technologies (e.g., smart, social, personalized) to enhance organizational learning in different contexts and according to different needs. However, most works have focused on affordances such as remote training and the development of static courses or modules to share information with learners. Although a few studies have utilized contemporary e-learning systems (see Appendix ), even in these studies there is a lack of alignment between the capabilities of those systems (e.g., open online course, adaptive support, social and collaborative learning) and the objectives and strategy of the organization (e.g., organizational value, fostering learning).

Enriching the learning flow and learning potential in different levels within an organization.

The reviewed work has emphasized how different factors contribute to different levels of organizational learning, and it has focused on practices that address individual, collaborative, and organizational learning within the structure of the organization. In particular, most of the reviewed studies recognize that organizational learning occurs at multiple levels: individual, team (or group), and organization. In other words, although each of the studies carried out an investigation within a given level (except for Garavan et al. 2019 ), there is a recognition and discussion of the different levels. Therefore, the results align with the 4I framework of organizational learning that recognizes how learning across the different levels is linked by social and psychological processes: intuiting, interpreting, integrating, and institutionalizing (the 4Is) (Crossan et al. 1999 ). However, most of the studies focused on the institutionalizing-intuiting link (i.e., top-down feedback); moreover, no studies focused on contemporary learning technologies and processes that strengthen the learning flow (e.g., self-regulated learning).

Identifying critical aspects for effective eOL.

There is a considerable number of predominantly qualitative studies that focus on potential barriers to eOL implementation, as well as on the risks and requirements associated with the feasibility and successful implementation of eOL. In the same vein, research has emphasized the importance of the alignment of eOL (both in processes and in technologies) within the organization. These critical aspects for effective eOL are sometimes the main objectives of the studies (see Appendix). However, most of the elements relating to the effectiveness of eOL were measured with questionnaires and interviews with employees and managers, and very little work was conducted on how to leverage the digital technologies employed in eOL, big data, and analytics in order to monitor the effectiveness of eOL.

Implementing employee-centric eOL.

In most of the studies, the main objective was to increase employees’ adoption, satisfaction, and usage of the e-learning system. In addition, several studies focused on the e-learning system’s ability to improve employees’ performance, increase the knowledge flow in the organization, and foster learning. Most of the approaches were employee-centric, with only a small number of studies focusing on managers and the firm in general. However, employees were seen as static entities within the organization, with limited work investigating how eOL-based training exposes employees to new knowledge, broadens their skills repertoire, and thereby has tremendous potential for fostering innovation (Lin and Sanders 2017).

Achieving goals associated with the value creation of the organization.

A considerable number of studies utilized the firm (rather than the individual employee) as the unit of analysis. Such studies focused on how the implementation of eOL can increase employee performance, organizational value, and customer value. Although this is extremely helpful in furthering knowledge about eOL technologies and practices, a more granular investigation of the different e-learning systems and processes to address the various goals and strategies of the organization would enable researchers to extract practical insights on the design and implementation of eOL.

5.2 Research Agenda

By conducting an SLR and documenting the eOL research of the last decade, we have identified promising themes of research that have the potential to further eOL research and practice. To do so, we define a research agenda consisting of five thematic areas of research, as depicted in the research framework in Fig.  5 , and we provide some suggestions on how researchers could approach these challenges. In this visualization of the framework, on the left side we present the organizations as they were identified from our review (i.e., area/topic category in the Appendix ) and the multiple levels where organizational learning occurs (Costello and McNaughton 2018 ). On the right side, we summarize the objectives as they were identified from our review (i.e., the objectives category in the Appendix ). In the middle, we depict the orchestration that was conducted and how potential future research on eOL can improve the orchestration of the various elements and accelerate the achievement of the intended objectives. In particular, our proposed research agenda includes five research themes discussed in the following subsections.

Fig. 5

E-learning capabilities to enhance organizational research agenda

5.2.1 Theme 1: Couple E-learning Capabilities With the Intended Goals

The majority of the eOL studies either investigated a generic e-learning system using the umbrella term “e-learning” or did not provide enough details about the functionalities of the system (in most cases, it was simply defined as an online or web system). This indicates the very limited focus of the eOL research on the various capabilities of e-learning systems. In other words, the literature has been very detailed on the organizational value and employees’ acceptance of the technology, but less detailed on the capabilities of this technology that need to be put into place to achieve the intended goals and strategic agenda. However, the capabilities of e-learning systems and their use are not one-size-fits-all, and the intended goals (to obtain certain skills and competences) and employees’ needs and backgrounds play a determining role in the selection of the e-learning system (Al-Fraihat et al. 2020).

Only in a very few studies (Mueller et al. 2011; Renner et al. 2020) were the capabilities of the e-learning solutions (e.g., mobile learning, VR) utilized, and the results were found to contribute significantly to the intended goals. The intended knowledge can be procedural, declarative, or a general competence (e.g., presentation, communication, or leadership skills), and its particularities and pedagogical needs (e.g., a need for summative/formative feedback or for social learning support) should guide the selection of the e-learning system and the respective capabilities. Therefore, future research needs to investigate how the various capabilities offered by contemporary learning systems (e.g., assessment mechanisms, social learning, collaborative learning, personalized learning) can be utilized to adequately reinforce the intended goals (e.g., to train personnel to use a new tool, to improve presentation skills).

5.2.2 Theme 2: Embrace the Particularities of the Various Industries

Organizational learning entails sharing knowledge and enabling opportunities for growth at the individual, group, team, and organizational levels. Contemporary e-learning systems provide the medium to substantiate the necessary knowledge flow within organizations and to support employees’ overall learning. From the selected studies, we can infer that eOL research is either conducted in an industry-agnostic context (either generic or not properly reported) or focuses on the IT industry (see Appendix). However, the few studies that provide results from different industries (Garavan et al. 2019; Lee et al. 2014) indicate that companies have different practices, processes, and expectations, and that employees have different needs and perceptions with regard to e-learning systems and eOL in general. Such particularities influence the perceived dimensions of a learning organization. Some industries noted that eOL promoted the development of their learning organizations, whereas others reported that eOL did not seem to contribute to their development as a learning organization (Yoo and Huang 2016). Therefore, it is important that the implementation of organizational learning embraces the particularities of the various industries, and future research needs to identify how industry-specific characteristics can inform the design and development of organizational learning in promoting an organization’s goals and agenda.

5.2.3 Theme 3: Utilize E-learning Capabilities to Implement Employee-centric Approaches

For efficient organizational learning to be implemented, the processes and technologies need to recognize that learning is linked by social and psychological processes (Crossan et al. 1999 ). This allows employees to develop learning in various forms (e.g., social, emotional, personalized) and to develop elements such as self-awareness, self-control, and interpersonal skills that are vital for the organization. Looking at the contemporary eOL research, we notice that the exploration of e-learning capabilities to nurture the aforementioned elements and support employee-centric approaches is very limited (e.g., personalized technologies, adaptive assessment). Therefore, future research needs to collect data to understand how e-learning capabilities can be utilized in relation to employees’ needs and perceptions in order to provide solutions (e.g., collaborative, social, adaptive) that are employee-centric and focused on development, and that have the potential to move away from standard one-size-fits-all e-learning solutions to personalized and customized systems and processes.

5.2.4 Theme 4: Employ Analytics-enabled eOL

There is a lot of emphasis on measuring, via various qualitative and quantitative metrics, the effectiveness of eOL implemented at different levels in organizations. However, most of these metrics come from surveys and interviews that capture employees’ and managers’ perceptions of various aspects of eOL (e.g., fostering of learning, organizational value, employees’ performance), and very few studies utilize analytics (Hung et al. 2015 ; Renner et al. 2020 ; Rober and Cooper 2011 ). Given how digital technologies, big data, and business analytics pave the way towards organizations’ digital transformation and sustainable development (Mikalef et al. 2018 ; Pappas et al. 2018 ), and considering the learning analytics affordances of contemporary e-learning systems (Siemens and Long 2011 ), future work needs to investigate how learner/employee-generated data can be employed to inform practice and devise more accurate and temporal effectiveness metrics when measuring the importance and impact of eOL.
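As an illustration of this data-driven direction, even very simple temporal engagement metrics can be derived from system logs rather than surveys. The log schema and metrics below are hypothetical, sketched only to make the idea concrete:

```python
from collections import defaultdict
from datetime import date

# Hypothetical e-learning platform log entries: (employee_id, day, module).
# Real systems would export such records as clickstream or session logs.
logs = [
    ("e01", date(2024, 3, 1), "safety-101"),
    ("e01", date(2024, 3, 1), "safety-101"),
    ("e01", date(2024, 3, 4), "leadership"),
    ("e02", date(2024, 3, 2), "safety-101"),
]

# Two simple engagement metrics per employee: distinct active days and
# distinct modules touched, which could complement survey-based
# effectiveness measures with temporal, behavioral evidence.
active_days = defaultdict(set)
modules_seen = defaultdict(set)
for emp, day, module in logs:
    active_days[emp].add(day)
    modules_seen[emp].add(module)

for emp in sorted(active_days):
    print(f"{emp}: {len(active_days[emp])} active days, "
          f"{len(modules_seen[emp])} modules")
```

Metrics of this kind could then be tracked over time and related to the intended objectives (e.g., performance, fostering learning) instead of relying solely on perception data.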

5.2.5 Theme 5: Orchestrate the Employees’ Needs, Resources, and Objectives in eOL Implementation

While considerable effort has been directed towards the various building blocks of eOL implementation, such as resources (intangible, tangible, and human skills) and employees’ needs (e.g., vision, growth, skills development), little is known so far about the processes and structures necessary for orchestrating those elements in order to achieve an organization’s intended goals and to materialize its overall agenda. In other words, eOL research has been very detailed on some of the elements that constitute efficient eOL, but less so on the interplay of those elements and how they need to be put into place. Prior literature on strategic resource planning has shown that competence in orchestrating such elements is a prerequisite to successfully increasing business value (Wang et al. 2012 ). Therefore, future research should not only investigate each of these elements in silos, but also consider their interplay, since it is likely that organizations with similar resources will exert highly varied levels in each of these elements (e.g., analytics-enabled, e-learning capabilities) to successfully materialize their goals (e.g., increase value, improve the competence base of their employees, modernize their organization).

5.3 Implications

Several implications for eOL have been revealed in this literature review. First, most studies agree that employees’ or trainees’ experience is extremely important for the successful implementation of eOL. Thus, keeping them in the design and implementation cycle of eOL will increase eOL adoption and satisfaction as well as reduce the risks and barriers. Another important implication addressed by some studies relates to the capabilities of the e-learning technologies, with easy-to-use, useful, and social technologies resulting in more efficient eOL (e.g., higher adoption and performance). Thus, it is important for organizations to incorporate these functionalities in the platform and reinforce them with appropriate content and support. This should not only benefit learning outcomes, but also provide the networking opportunities for employees to broaden their personal networks, which are often lost when companies move from face-to-face formal training to e-learning-enabled organizational learning.

5.4 Limitations

This review has some limitations. First, we had to make some methodological decisions (e.g., selection of databases, the search query) that might lead to certain biases in the results. However, we tried to avoid such biases by considering all the major databases and following the steps indicated by Kitchenham and Charters (2007). Second, the selection of empirical studies and the coding of the papers might pose another possible bias. However, the focus was clearly on the empirical evidence, the terminology employed (“e-learning”) is an umbrella term that covers the majority of the work in the area, and the coding of papers was checked by two researchers. Third, some elements of the papers were not described accurately, leading to some missing information in the coding of the papers. However, the amount of missing information was very small and could not affect the results significantly. Finally, we acknowledge that the selected methodology (Kitchenham and Charters 2007) includes potential biases (e.g., false negatives and false positives), and that different, equally valid methods (e.g., Okoli and Schabram 2010) might have been used and have resulted in slightly different outcomes. Nevertheless, despite the limitations of the selected methodology, it is a well-accepted and widely used literature review method in both software engineering and information systems (Boell and Cecez-Kecmanovic 2014), providing certain assurance of the results.
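Agreement between two coders, as used in our cross-checking of the paper coding, is commonly quantified with Cohen's kappa (cf. McDonald et al. 2019). The following is a minimal stdlib sketch; the category labels and data are illustrative, not our actual coding:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders labeling the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed proportion of agreement.
    po = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's marginal distribution.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    pe = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (po - pe) / (1 - pe)

# Illustrative objective labels assigned to eight papers by two coders.
a = ["adoption", "performance", "adoption", "value",
     "adoption", "value", "performance", "adoption"]
b = ["adoption", "performance", "value", "value",
     "adoption", "value", "adoption", "adoption"]
print(round(cohens_kappa(a, b), 3))  # 0.6
```

Values above roughly 0.6 are conventionally read as substantial agreement, though acceptable thresholds depend on the coding task.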

6 Conclusions and Future Work

We have presented an SLR of 47 contributions in the field of eOL over the last decade. With respect to RQ1, we analyzed the papers from different perspectives, such as research methodology, technology, industries, employees, and intended outcomes in terms of organizational value, employees’ performance, usage, and behavioral change. The detailed landscape is depicted in the Appendix and Figs. 3 and 4, with the results indicating the limited utilization of the various e-learning capabilities (e.g., social, collaborative) to achieve objectives connected with those capabilities (e.g., social learning and behavioral change, collaborative learning and overcoming barriers).

With respect to RQ2, we categorized the main findings of the selected papers into five areas that reflect the status of eOL research, and we have discussed the challenges and opportunities emerging from the current review. In addition, we have synthesized the extracted challenges and opportunities and proposed a research agenda consisting of five elements that provide suggestions on how researchers could approach these challenges and exploit the opportunities. Such an agenda will strengthen how e-learning can be leveraged to enhance the process of improving actions through better knowledge and understanding in an organization.

A number of suggestions for further research have emerged from reviewing prior and ongoing work on eOL. One recommendation for future researchers is to clearly describe the eOL approach by providing detailed information about the technologies and materials used, as well as the organizations. This will allow meta-analyses to be conducted and it will also identify the potential effects of a firm’s size or area on the performance and other aspects relating to organizational value. Future work should also focus on collecting and triangulating different types of data from different sources (e.g., systems’ logs). The reviewed studies were conducted mainly by using survey data, and they made limited use of data coming from the platforms; thus, the interpretations and triangulation between the different types of collected data were limited.

Al-Fraihat, D., Joy, M., & Sinclair, J. (2020). Evaluating E-learning systems success: An empirical study. Computers in Human Behavior, 102 , 67–86.


Alharthi, A. D., Spichkova, M., & Hamilton, M. (2019). Sustainability requirements for eLearning systems: A systematic literature review and analysis. Requirements Engineering, 24 (4), 523–543.

Alsabawy, A. Y., Cater-Steel, A., & Soar, J. (2013). IT infrastructure services as a requirement for e-learning system success. Computers & Education, 69 , 431–451.

Antonacopoulou, E., & Chiva, R. (2007). The social complexity of organizational learning: The dynamics of learning and organizing. Management Learning, 38 , 277–295.

Berends, H., & Lammers, I. (2010). Explaining discontinuity in organizational learning: A process analysis. Organization Studies, 31 (8), 1045–1068.

Boell, S. K., & Cecez-Kecmanovic, D. (2014). A hermeneutic approach for conducting literature reviews and literature searches. Communications of the Association for Information Systems, 34 (1), 12.


Bologa, R., & Lupu, A. R. (2014). Organizational learning networks that can increase the productivity of IT consulting companies. A case study for ERP consultants. Expert Systems with Applications, 41 (1), 126–136.

Broadbent, J. (2017). Comparing online and blended learner’s self-regulated learning strategies and academic performance. The Internet and Higher Education, 33 , 24–32.

Cheng, B., Wang, M., Moormann, J., Olaniran, B. A., & Chen, N. S. (2012). The effects of organizational learning environment factors on e-learning acceptance. Computers & Education, 58 (3), 885–899.

Cheng, B., Wang, M., Yang, S. J., & Peng, J. (2011). Acceptance of competency-based workplace e-learning systems: Effects of individual and peer learning support. Computers & Education, 57 (1), 1317–1333.

Choi, S., & Ko, I. (2012). Leveraging electronic collaboration to promote interorganizational learning. International Journal of Information Management, 32 (6), 550–559.

Costello, J. T., & McNaughton, R. B. (2018). Integrating a dynamic capabilities framework into workplace e-learning process evaluations. Knowledge and Process Management, 25 (2), 108–125.

Crossan, M. M., Lane, H. W., & White, R. E. (1999). An organizational learning framework: From intuition to institution. Academy of Management Review, 24 , 522–537.

Crossan, M. M., Maurer, C. C., & White, R. E. (2011). Reflections on the 2009 AMR decade award: Do we have a theory of organizational learning? Academy of Management Review, 36 (3), 446–460.

Dee, J., & Leisyte, L. (2017). Knowledge sharing and organizational change in higher education. The Learning Organization, 24 (5), 355–365. https://doi.org/10.1108/TLO-04-2017-0034

Dignen, B., & Burmeister, T. (2020). Learning and development in the organizations of the future. Three pillars of organization and leadership in disruptive times (pp. 207–232). Cham: Springer.


Dybå, T., & Dingsøyr, T. (2008). Empirical studies of agile software development: A systematic review. Information and Software Technology, 50 (9–10), 833–859.

El Kadiri, S., Grabot, B., Thoben, K. D., Hribernik, K., Emmanouilidis, C., Von Cieminski, G., & Kiritsis, D. (2016). Current trends on ICT technologies for enterprise information systems. Computers in Industry, 79 , 14–33.

Engeström, Y., Kerosuo, H., & Kajamaa, A. (2007). Beyond discontinuity: Expansive organizational learning remembered. Management Learning, 38 (3), 319–336.

Gal, E., & Nachmias, R. (2011). Online learning and performance support in organizational environments using performance support platforms. Performance Improvement, 50 (8), 25–32.

Garavan, T. N., Heneghan, S., O’Brien, F., Gubbins, C., Lai, Y., Carbery, R., & Grant, K. (2019). L&D professionals in organisations: much ambition, unfilled promise. European Journal of Training and Development, 44 (1), 1–86.

Goggins, S. P., Jahnke, I., & Wulf, V. (2013). Computer-supported collaborative learning at the workplace . New York: Springer.


Hammad, R., Odeh, M., & Khan, Z. (2017). ELCMM: An e-learning capability maturity model. In Proceedings of the 15th International Conference (e-Society 2017) (pp. 169–178).

Hester, A. J., Hutchins, H. M., & Burke-Smalley, L. A. (2016). Web 2.0 and transfer: Trainers’ use of technology to support employees’ learning transfer on the job. Performance Improvement Quarterly, 29 (3), 231–255.

Hung, Y. H., Lin, C. F., & Chang, R. I. (2015). Developing a dynamic inference expert system to support individual learning at work. British Journal of Educational Technology, 46 (6), 1378–1391.

Iris, R., & Vikas, A. (2011). E-Learning technologies: A key to dynamic capabilities. Computers in Human Behavior, 27 (5), 1868–1874.

Jia, H., Wang, M., Ran, W., Yang, S. J., Liao, J., & Chiu, D. K. (2011). Design of a performance-oriented workplace e-learning system using ontology. Expert Systems with Applications, 38 (4), 3372–3382.

Joo, Y. J., Lim, K. Y., & Park, S. Y. (2011). Investigating the structural relationships among organisational support, learning flow, learners’ satisfaction and learning transfer in corporate e-learning. British Journal of Educational Technology, 42 (6), 973–984.

Kaschig, A., Maier, R., Sandow, A., Lazoi, M., Barnes, S. A., Bimrose, J., … Schmidt, A. (2010). Knowledge maturing activities and practices fostering organisational learning: results of an empirical study. In European Conference on Technology Enhanced Learning (pp. 151–166). Berlin: Springer.

Khalili, A., Auer, S., Tarasowa, D., & Ermilov, I. (2012). SlideWiki: Elicitation and sharing of corporate knowledge using presentations. International Conference on Knowledge Engineering and Knowledge Management (pp. 302–316). Berlin: Springer.

Khandakar, M. S. A., & Pangil, F. (2019). Relationship between human resource management practices and informal workplace learning. Journal of Workplace Learning, 31 (8), 551–576.

Kim, M. K., Kim, S. M., & Bilir, M. K. (2014). Investigation of the dimensions of workplace learning environments (WLEs): Development of the WLE measure. Performance Improvement Quarterly, 27 (2), 35–57.

Kitchenham, B., & Charters, S. (2007). Guidelines for performing systematic literature reviews in software engineering. Technical Report EBSE-2007-01, 2007 . https://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=35909B1B280E2032BF116BDC9DCB71EA? .

Krippendorff, K. (2018). Content analysis: an introduction to its methodology. Thousand Oaks: Sage Publications.

Lai, H. J. (2017). Examining civil servants’ decisions to use Web 2.0 tools for learning, based on the decomposed theory of planned behavior. Interactive Learning Environments, 25 (3), 295–305.

Lau, K. (2015). Organizational learning goes virtual? A study of employees’ learning achievement in stereoscopic 3D virtual reality. The Learning Organization, 22 (5), 289–303.

Lee, J., Choi, M., & Lee, H. (2015a). Factors affecting smart learning adoption in workplaces: Comparing large enterprises and SMEs. Information Technology and Management, 16 (4), 291–302.

Lee, J., Kim, D. W., & Zo, H. (2015b). Conjoint analysis on preferences of HRD managers and employees for effective implementation of m-learning: The case of South Korea. Telematics and Informatics, 32 (4), 940–948.

Lee, J., Zo, H., & Lee, H. (2014). Smart learning adoption in employees and HRD managers. British Journal of Educational Technology, 45 (6), 1082–1096.

Lin, C. H., & Sanders, K. (2017). HRM and innovation: A multi-level organizational learning perspective. Human Resource Management Journal, 27 (2), 300–317.

Lin, C. Y., Huang, C. K., & Zhang, H. (2019). Enhancing employee job satisfaction via e-learning: The mediating role of an organizational learning culture. International Journal of Human–Computer Interaction, 35 (7), 584–595.

Liu, Y. C., Huang, Y. A., & Lin, C. (2012). Organizational factors’ effects on the success of e-learning systems and organizational benefits: An empirical study in Taiwan. The International Review of Research in Open and Distributed Learning, 13 (4), 130–151.

López-Nicolás, C., & Meroño-Cerdán, ÁL. (2011). Strategic knowledge management, innovation and performance. International Journal of Information Management, 31 (6), 502–509.

Manuti, A., Pastore, S., Scardigno, A. F., Giancaspro, M. L., & Morciano, D. (2015). Formal and informal learning in the workplace: A research review. International Journal of Training and Development, 19 (1), 1–17.

March, J. G. (1991). Exploration and exploitation in organizational learning. Organization Science, 2 (1), 71–87.

Marshall, S. (2006). New Zealand Tertiary Institution E-learning Capability: Informing and Guiding eLearning Architectural Change and Development. Report to the ministry of education . NZ: Victoria University of Wellington.

McDonald, N., Schoenebeck, S., & Forte, A. (2019). Reliability and inter-rater reliability in qualitative research: Norms and guidelines for CSCW and HCI practice. In Proceedings of the ACM on Human–Computer Interaction, 3(CSCW) (pp. 1–23).

Menolli, A., Tirone, H., Reinehr, S., & Malucelli, A. (2020). Identifying organisational learning needs: An approach to the semi-automatic creation of course structures for software companies. Behaviour & Information Technology, 39 (11), 1140–1155.

Michalski, M. P. (2014). Symbolic meanings and e-learning in the workplace: The case of an intranet-based training tool. Management Learning, 45 (2), 145–166.

Mikalef, P., Pappas, I. O., Krogstie, J., & Giannakos, M. (2018). Big data analytics capabilities: A systematic literature review and research agenda. Information Systems and e-Business Management, 16 (3), 547–578.

Mitić, S., Nikolić, M., Jankov, J., Vukonjanski, J., & Terek, E. (2017). The impact of information technologies on communication satisfaction and organizational learning in companies in Serbia. Computers in Human Behavior, 76 , 87–101.

Mueller, J., Hutter, K., Fueller, J., & Matzler, K. (2011). Virtual worlds as knowledge management platform—A practice-perspective. Information Systems Journal, 21 (6), 479–501.

Muller Queiroz, A. C., Nascimento, A. M., Tori, R., Brashear Alejandro, T., Veloso de Melo, V., de Souza Meirelles, F., & da Silva Leme, M. I. (2018). Immersive virtual environments in corporate education and training. In AMCIS 2018. Proceedings. https://aisel.aisnet.org/amcis2018/Education/Presentations/12/ .

Navimipour, N. J., & Zareie, B. (2015). A model for assessing the impact of e-learning systems on employees’ satisfaction. Computers in Human Behavior, 53 , 475–485.

Oh, S. Y. (2019). Effects of organizational learning on performance: The moderating roles of trust in leaders and organizational justice. Journal of Knowledge Management, 23, 313–331.

Okoli, C., & Schabram, K. (2010). A guide to conducting a systematic literature review of information systems research. Sprouts: Working Papers on Information Systems, 10 (26), 1–46.

Pappas, I. O., Mikalef, P., Giannakos, M. N., Krogstie, J., & Lekakos, G. (2018). Big data and business analytics ecosystems: paving the way towards digital transformation and sustainable societies. Information Systems and e-Business Management, 16, 479–491.

Popova-Nowak, I. V., & Cseh, M. (2015). The meaning of organizational learning: A meta-paradigm perspective. Human Resource Development Review, 14 (3), 299–331.

Qi, C., & Chau, P. Y. (2016). An empirical study of the effect of enterprise social media usage on organizational learning. In Pacific Asia Conference on Information Systems (PACIS'16). Proceedings , Paper 330. http://aisel.aisnet.org/pacis2016/330 .

Renner, B., Wesiak, G., Pammer-Schindler, V., Prilla, M., Müller, L., Morosini, D., … Cress, U. (2020). Computer-supported reflective learning: How apps can foster reflection at work. Behaviour & Information Technology, 39 (2), 167–187.

Rober, M. B., & Cooper, L. P. (2011, January). Capturing knowledge via an “Intrapedia”: A case study. In 2011 44th Hawaii International Conference on System Sciences (pp. 1–10). New York: IEEE.

Rosenberg, M. J., & Foshay, R. (2002). E-learning: Strategies for delivering knowledge in the digital age. Performance Improvement, 41 (5), 50–51.

Serrano, Á., Marchiori, E. J., del Blanco, Á., Torrente, J., & Fernández-Manjón, B. (2012). A framework to improve evaluation in educational games. The IEEE Global Engineering Education Conference (pp. 1–8). Marrakesh, Morocco.

Siadaty, M., Jovanović, J., Gašević, D., Jeremić, Z., & Holocher-Ertl, T. (2010). Leveraging semantic technologies for harmonization of individual and organizational learning. In European Conference on Technology Enhanced Learning (pp. 340–356). Berlin: Springer.

Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, 46 (5), 30.

Škerlavaj, M., Dimovski, V., Mrvar, A., & Pahor, M. (2010). Intra-organizational learning networks within knowledge-intensive learning environments. Interactive Learning Environments, 18 (1), 39–63.

Smith, P. J., & Sadler-Smith, E. (2006). Learning in organizations: Complexities and diversities . London: Routledge.

Stoffregen, J. D., Pawlowski, J. M., Ras, E., Tobias, E., Šćepanović, S., Fitzpatrick, D., … Friedrich, H. (2016). Barriers to open e-learning in public administrations: A comparative case study of the European countries Luxembourg, Germany, Montenegro and Ireland. Technological Forecasting and Social Change, 111 , 198–208.

Subramaniam, R., & Nakkeeran, S. (2019). Impact of corporate e-learning systems in enhancing the team performance in virtual software teams. In Smart Technologies and Innovation for a Sustainable Future (pp. 195–204). Berlin: Springer.

Tsai, C. H., Zhu, D. S., Ho, B. C. T., & Wu, D. D. (2010). The effect of reducing risk and improving personal motivation on the adoption of knowledge repository system. Technological Forecasting and Social Change, 77 (6), 840–856.

Turi, J. A., Sorooshian, S., & Javed, Y. (2019). Impact of the cognitive learning factors on sustainable organizational development. Heliyon, 5 (9), e02398.

Wang, M. (2011). Integrating organizational, social, and individual perspectives in Web 2.0-based workplace e-learning. Information Systems Frontiers, 13 (2), 191–205.

Wang, M. (2018). Effects of individual and social learning support on employees’ acceptance of performance-oriented e-learning. In E-Learning in the Workplace (pp. 141–159). Springer. https://doi.org/10.1007/978-3-319-64532-2_13 .

Wang, M., Ran, W., Liao, J., & Yang, S. J. (2010). A performance-oriented approach to e-learning in the workplace. Journal of Educational Technology & Society, 13 (4), 167–179.

Wang, M., Vogel, D., & Ran, W. (2011). Creating a performance-oriented e-learning environment: A design science approach. Information & Management, 48 (7), 260–269.

Wang, N., Liang, H., Zhong, W., Xue, Y., & Xiao, J. (2012). Resource structuring or capability building? An empirical study of the business value of information technology. Journal of Management Information Systems, 29 (2), 325–367.

Wang, S., & Wang, H. (2012). Organizational schemata of e-portfolios for fostering higher-order thinking. Information Systems Frontiers, 14 (2), 395–407.

Wei, K., & Ram, J. (2016). Perceived usefulness of podcasting in organizational learning: The role of information characteristics. Computers in Human Behavior, 64 , 859–870.

Wei, K., Sun, H., & Li, H. (2013). On the driving forces of diffusion of podcasting in organizational settings: A case study and propositions. In PACIS 2013. Proceedings , 217. http://aisel.aisnet.org/pacis2013/217 .

Weinhardt, J. M., & Sitzmann, T. (2018). Revolutionizing training and education? Three questions regarding massive open online courses (MOOCs). Human Resource Management Review, 29 (2), 218–225.

Xiang, Q., Zhang, J., & Liu, H. (2020). Organisational improvisation as a path to new opportunity identification for incumbent firms: An organisational learning view. Innovation, 22 (4), 422–446. https://doi.org/10.1080/14479338.2020.1713001 .

Yanson, R., & Johnson, R. D. (2016). An empirical examination of e-learning design: The role of trainee socialization and complexity in short term training. Computers & Education, 101 , 43–54.

Yoo, S. J., & Huang, W. D. (2016). Can e-learning system enhance learning culture in the workplace? A comparison among companies in South Korea. British Journal of Educational Technology, 47 (4), 575–591.

Zhang, X., Jiang, S., Ordóñez de Pablos, P., Lytras, M. D., & Sun, Y. (2017). How virtual reality affects perceived learning effectiveness: A task–technology fit perspective. Behaviour & Information Technology, 36 (5), 548–556.

Zhang, X., Meng, Y., de Pablos, P. O., & Sun, Y. (2019). Learning analytics in collaborative learning supported by Slack: From the perspective of engagement. Computers in Human Behavior, 92 , 625–633.


Open Access funding provided by NTNU Norwegian University of Science and Technology (incl St. Olavs Hospital - Trondheim University Hospital).

Author information

Authors and Affiliations

Norwegian University of Science and Technology, Trondheim, Norway

Michail N. Giannakos, Patrick Mikalef & Ilias O. Pappas

University of Agder, Kristiansand, Norway

Ilias O. Pappas


Corresponding author

Correspondence to Ilias O. Pappas.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Giannakos, M.N., Mikalef, P. & Pappas, I.O. Systematic Literature Review of E-Learning Capabilities to Enhance Organizational Learning. Inf Syst Front 24 , 619–635 (2022). https://doi.org/10.1007/s10796-020-10097-2

Download citation

Accepted : 09 December 2020

Published : 01 February 2021

Issue Date : April 2022

DOI : https://doi.org/10.1007/s10796-020-10097-2


Keywords

  • Organizational learning
  • Literature review
  • Learning environments


Systematic Literature Review of E-Learning Capabilities to Enhance Organizational Learning


Abstract

E-learning systems are receiving ever increasing attention in academia, business and public administration. Major crises, like the pandemic, highlight the tremendous importance of the appropriate development of e-learning systems and their adoption and processes in organizations. Managers and employees who need efficient forms of training and learning flow within organizations do not have to gather in one place at the same time or to travel far away to attend courses. Contemporary affordances of e-learning systems allow users to perform different jobs or tasks for training courses according to their own scheduling, as well as to collaborate and share knowledge and experiences that result in rich learning flows within organizations. The purpose of this article is to provide a systematic review of empirical studies at the intersection of e-learning and organizational learning in order to summarize the current findings and guide future research. Forty-seven peer-reviewed articles were collected from a systematic literature search and analyzed based on a categorization of their main elements. This survey identifies five major directions of the research on the confluence of e-learning and organizational learning during the last decade. Future research should leverage big data produced from the platforms and investigate how the incorporation of advanced learning technologies (e.g., learning analytics, personalized learning) can help increase organizational value.

Introduction

E-learning covers the integration of information and communication technology (ICT) in environments with the main goal of fostering learning (Rosenberg and Foshay 2002 ). The term “e-learning” is often used as an umbrella term to portray several modes of digital learning environments (e.g., online, virtual learning environments, social learning technologies). Digitalization seems to challenge numerous business models in organizations and raises important questions about the meaning and practice of learning and development (Dignen and Burmeister 2020 ). Among other things, the digitalization of resources and processes enables flexible ways to foster learning across an organization’s different sections and personnel.

Learning has long been associated with formal or informal education and training. However, organizational learning is much more than that. It can be defined as “a learning process within organizations that involves the interaction of individual and collective (group, organizational, and inter-organizational) levels of analysis and leads to achieving organizations’ goals” (Popova-Nowak and Cseh 2015) with a focus on the flow of knowledge across the different organizational levels (Oh 2019). Flow of knowledge, or learning flow, is the way in which new knowledge flows from the individual to the organizational level (i.e., feed forward) and vice versa (i.e., feedback) (Crossan et al. 1999; March 1991). Learning flow and the respective processes constitute the cornerstone of an organization’s learning activities (e.g., from physical training meetings to digital learning resources), they are directly connected to the psycho-social experiences of an organization’s members, and they eventually lead to organizational change (Crossan et al. 2011). Organizational learning is extremely important because it is associated with the process of creating value from an organization’s intangible assets. Moreover, it combines notions from several different domains, such as organizational behavior, human resource management, artificial intelligence, and information technology (El Kadiri et al. 2016).

A growing body of literature lies at the intersection of e-learning and organizational learning. However, there is limited work on the qualities of e-learning and the potential of those qualities to enhance organizational learning (Popova-Nowak and Cseh 2015). Blockages and disruptions in the internal flow of knowledge are a major reason why organizational change initiatives often fail to produce their intended results (Dee and Leisyte 2017). In recent years, several models of organizational learning have been published (Berends and Lammers 2010; Oh 2019). However, detailed empirical studies indicate that learning does not always proceed smoothly in organizations; rather, it encounters interruptions and breakdowns (Engeström et al. 2007).

Discontinuities and disruptions are common phenomena in organizational learning (Berends and Lammers 2010 ), and they stem from various causes. For example, organizational members’ low self-esteem, unsupportive technology and instructors (Garavan et al. 2019 ), and even crises like the Covid-19 pandemic can result in demotivated learners and overall unwanted consequences for their learning (Broadbent 2017 ). In a recent conceptual article, Popova-Nowak and Cseh ( 2015 ) emphasized that there is a limited use of multidisciplinary perspectives to investigate and explain the processes and importance of utilizing the available capabilities and resources and of creating contexts where learning is “attractive to individual agents so that they can be more engaged in exploring ways in which they can contribute through their learning to the ongoing renewal of organizational routines and practices” (Antonacopoulou and Chiva 2007 , p. 289).

Despite the importance of e-learning, the lack of systematic reviews in this area significantly hinders research on the highly promising value of e-learning capabilities for efficiently supporting organizational learning. This gap leaves practitioners and researchers in uncharted territories when faced with the task of implementing e-learning designs or deciding on their digital learning strategies to enhance the learning flow of their organizations. Hence, in order to derive meaningful theoretical and practical implications, as well as to identify important areas for future research, it is critical to understand how the core capabilities pertinent to e-learning possess the capacity to enhance organizational learning.

In this paper, we define e-learning enhanced organizational learning (eOL) as the utilization of digital technologies to enhance the process of improving actions through better knowledge and understanding in an organization. In recent years, a significant body of research has focused on the intersection of e-learning and organizational learning (e.g., Khandakar and Pangil 2019 ; Lin et al. 2019 ; Menolli et al. 2020 ; Turi et al. 2019 ; Xiang et al. 2020 ). However, there is a lack of systematic work that summarizes and conceptualizes the results in order to support organizations that want to move from being information-based enterprises to being knowledge-based ones (El Kadiri et al. 2016 ). In particular, recent technological advances have led to an increase in research that leverages e-learning capacities to support organizational learning, from virtual reality (VR) environments (Costello and McNaughton 2018 ; Muller Queiroz et al. 2018 ) to mobile computing applications (Renner et al. 2020 ) to adaptive learning and learning analytics (Zhang et al. 2019 ). These studies support different skills, consider different industries and organizations, and utilize various capacities while focusing on various learning objectives (Garavan et al. 2019 ). Our literature review aims to tease apart these particularities and to investigate how these elements have been utilized over the past decade in eOL research. Therefore, in this review we aim to answer the following research questions (RQs):

  • RQ1: What is the status of research at the intersection of e-learning and organizational learning, seen through the lens of areas of implementation (e.g., industries, public sector), technologies used, and methodologies (e.g., types of data and data analysis techniques employed)?
  • RQ2: How can e-learning be leveraged to enhance the process of improving actions through better knowledge and understanding in an organization?

Our motivation for this work is based on the emerging developments in the area of learning technologies that have created momentum for their adoption by organizations. This paper provides a review of research on e-learning capabilities to enhance organizational learning with the purpose of summarizing the findings and guiding future studies. This study can provide a springboard for other scholars and practitioners, especially in the area of knowledge-based enterprises, to examine e-learning approaches by taking into consideration the prior and ongoing research efforts. Therefore, in this paper we present a systematic literature review (SLR) (Kitchenham and Charters 2007 ) on the confluence of e-learning and organizational learning that uncovers initial findings on the value of e-learning to support organizational learning while also delineating several promising research streams.
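An SLR protocol in the Kitchenham and Charters tradition defines search terms and explicit inclusion/exclusion criteria, and then screens each candidate record against them. As a purely illustrative sketch (the record fields and criteria below are hypothetical examples, not the actual protocol of this review), the screening step can be expressed as a simple filter:

```python
# Illustrative sketch of SLR screening: apply inclusion/exclusion criteria
# to candidate records. Fields and criteria are hypothetical, for exposition only.

def include(record):
    """Return True if a candidate record meets all inclusion criteria."""
    in_window = 2010 <= record["year"] <= 2020          # example publication window
    peer_reviewed = record["peer_reviewed"]             # peer-reviewed venues only
    on_topic = ("e-learning" in record["keywords"]
                and "organizational learning" in record["keywords"])
    return in_window and peer_reviewed and on_topic

candidates = [
    {"year": 2015, "peer_reviewed": True,
     "keywords": {"e-learning", "organizational learning"}},
    {"year": 2008, "peer_reviewed": True,
     "keywords": {"e-learning", "organizational learning"}},   # outside window
    {"year": 2018, "peer_reviewed": False,
     "keywords": {"e-learning", "organizational learning"}},   # not peer-reviewed
]

selected = [r for r in candidates if include(r)]
print(len(selected))  # 1 record survives screening
```

In practice this step is typically performed by multiple coders, with inter-rater agreement checked before the final corpus is fixed.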

The rest of this paper is organized as follows. In the next section, we present the related background work. The third section describes the methodology used for the literature review and how the studies were selected and analyzed. The fourth section presents the research findings derived from the data analysis based on the specific areas of focus. In the fifth section, we discuss the findings, the implications for practice and research, and the limitations of the selected methodological approach. In the final section, we summarize the conclusions from the study and make suggestions for future work.

Background and Related Work

E-learning Systems

E-learning systems provide solutions that deliver knowledge and information, facilitate learning, and increase performance by developing appropriate knowledge flow inside organizations (Menolli et al. 2020 ). Putting into practice and appropriately managing technological solutions, processes, and resources are necessary for the efficient utilization of e-learning in an organization (Alharthi et al. 2019 ). Examples of e-learning systems that have been widely adopted by various organizations are Canvas, Blackboard, and Moodle. Such systems provide innovative services for students, employees, managers, instructors, institutions, and other actors to support and enhance the learning processes and facilitate efficient knowledge flow (Garavan et al. 2019 ). Functionalities, such as creating modules to organize mini course information and learning materials or communication channels such as chat, forums, and video exchange, allow instructors and managers to develop appropriate training and knowledge exchange (Wang et al. 2011 ). Nowadays, the utilization of various e-learning capabilities is a commodity for supporting organizational and workplace learning. Such learning refers to training or knowledge development (also known in the literature as learning and development, HR development, and corporate training: Smith and Sadler-Smith 2006 ; Garavan et al. 2019 ) that takes place in the context of work.

Previous studies have focused on evaluating e-learning systems that utilize various models and frameworks. In particular, the development of maturity models, such as the e-learning capability maturity model (eLCMM), addresses technology-oriented concerns (Hammad et al. 2017 ) by overcoming the limitations of the domain-specific models (e.g., game-based learning: Serrano et al.  2012 ) or more generic lenses such as the e-learning maturity model (Marshall 2006 ). The aforementioned models are very relevant since they focus on assessing the organizational capabilities for sustainably developing, deploying, and maintaining e-learning. In particular, the eLCMM focuses on assessing the maturity of adopting e-learning systems and adds a feedback building block for improving learners’ experiences (Hammad et al. 2017 ). Our proposed literature review builds on the previously discussed models, lenses, and empirical studies, and it provides a review of research on e-learning capabilities with the aim of enhancing organizational learning in order to complement the findings of the established models and guide future studies.

E-learning systems can be categorized into different types, depending on their functionalities and affordances. One very popular e-learning type is the learning management system (LMS), which includes a virtual classroom and collaboration capabilities and allows the instructor to design and orchestrate a course or a module. An LMS can be either proprietary (e.g., Blackboard) or open source (e.g., Moodle). These two types differ in their features, costs, and the services they provide; for example, proprietary systems prioritize assessment tools for instructors, whereas open-source systems focus more on community development and engagement tools (Alharthi et al. 2019 ). In addition to LMS, e-learning systems can be categorized based on who controls the pace of learning; for example, an institutional learning environment (ILE) is provided by the organization and is usually used for instructor-led courses, while a personal learning environment (PLE) is proposed by the organization and is managed personally (i.e., learner-led courses). Many e-learning systems use a hybrid version of ILE and PLE that allows organizations to have either instructor-led or self-paced courses.

Besides the controlled e-learning systems, organizations have been using environments such as social media (Qi and Chau 2016), massive open online courses (MOOCs) (Weinhardt and Sitzmann 2018) and other web-based environments (Wang et al. 2011) to reinforce their organizational learning potential. These systems have been utilized through different types of technology (e.g., desktop applications, mobile) that leverage the various capabilities offered (e.g., social learning, VR, collaborative systems, smart and intelligent support) to reinforce the learning and knowledge flow potential of the organization. Although there is a growing body of research on e-learning systems for organizational learning, due to the increasingly significant role of skills and expertise development in organizations, the role and alignment of the capabilities of the various e-learning systems with the expected competency development remain underexplored.

Organizational Learning

There is a large body of research on the utilization of technologies to improve the process and outcome dimensions of organizational learning (Crossan et al. 1999). Most studies have focused on the learning process and on the added value that new technologies can offer by replacing some of the face-to-face processes with virtual processes or by offering new, technology-mediated phases to the process (Menolli et al. 2020). Lau (2015) highlighted how VR capabilities can enhance organizational learning, describing the new challenges and frameworks needed to utilize this potential effectively. In the same vein, Zhang et al. (2017) described how VR influences reflective thinking and considered its indirect value to overall learning effectiveness. In general, contemporary research has investigated how novel technologies and approaches have been utilized to enhance organizational learning, and it has highlighted both the promises and the limitations of the use of different technologies within organizations.

In many organizations, alignment with the established infrastructure and routines, as well as adoption by employees, are core elements of effective organizational learning (Wang et al. 2011). Strict policies, low digital competence, and operational challenges are some of the elements that hinder e-learning adoption by organizations (Garavan et al. 2019). Wang (2018) demonstrated the importance of organizational, managerial, and job support for utilizing individual and social learning in order to increase the adoption of organizational learning. Other studies have focused on the importance of communication through different social channels to develop understanding of new technology, to overcome the challenges employees face when engaging with new technology, and, thereby, to support organizational learning (Menolli et al. 2020). By considering the related work in the area of organizational learning, we identified a gap in aligning an organization’s learning needs with the capabilities offered by the various technologies. Thus, systematic work is needed to review e-learning capabilities and how these capabilities can efficiently support organizational learning.

E-learning Systems to Enhance Organizational Learning

When considering the interplay between e-learning systems and organizational learning, we observed that a major challenge for today’s organizations is to switch from being information-based enterprises to become knowledge-based enterprises (El Kadiri et al. 2016 ). Unidirectional learning flows, such as formal and informal training, are important but not sufficient to cover the needs that enterprises face (Manuti et al. 2015 ). To maintain enterprises’ competitiveness, enterprise staff have to operate in highly intense information and knowledge-oriented environments. Traditional learning approaches fail to substantiate learning flow on the basis of daily evidence and experience. Thus, novel, ubiquitous, and flexible learning mechanisms are needed, placing humans (e.g., employees, managers, civil servants) at the center of the information and learning flow and bridging traditional learning with experiential, social, and smart learning.

Organizations consider lack of skills and competences to be the major knowledge-related factor hampering innovation (El Kadiri et al. 2016). Thus, solutions need to be implemented that support informal, day-to-day, and work training (e.g., social learning, collaborative learning, VR/AR solutions) in order to develop individual staff competences and to upgrade the competence affordances at the organizational level. E-learning-enhanced organizational learning has been delivered primarily in the form of web-based learning (El Kadiri et al. 2016). More recently, the technology-enhanced learning (TEL) tools portfolio has rapidly expanded to make more efficient joint use of novel learning concepts, methodologies, and technological enablers to achieve more direct, effective, and lasting learning impacts. Virtual learning environments, mobile-learning solutions, and AR/VR technologies and head-mounted displays have been employed so that trainees are empowered to follow their own training pace, learning topics, and assessment tests that fit their needs (Costello and McNaughton 2018; Mueller et al. 2011; Muller Queiroz et al. 2018). The expanding use of social networking tools has also brought attention to the contribution of social and collaborative learning (Hester et al. 2016; Wei and Ram 2016).

Contemporary learning systems supporting adaptive, personalized, and collaborative learning expand the tools available in eOL and contribute to the adoption, efficiency, and general prospects of the introduction of TEL in organizations (Cheng et al. 2011 ). In recent years, eOL has emphasized how enterprises share knowledge internally and externally, with particular attention being paid to systems that leverage collaborative learning and social learning functionalities (Qi and Chau 2016 ; Wang  2011 ). This is the essence of computer-supported collaborative learning (CSCL). The CSCL literature has developed a framework that combines individual, organizational, and collaborative learning, facilitated by establishing adequate learning flows through which effective enterprise learning emerges (Goggins et al. 2013 ), as depicted in Fig.  1 .

Fig. 1 Representation of the combination of enterprise learning and knowledge flows (adapted from Goggins et al. 2013 )

Establishing efficient knowledge and learning flows is a primary target for future data-driven enterprises (El Kadiri et al. 2016 ). Given the involved knowledge, the human resources, and the skills required by enterprises, there is a clear need for continuous, flexible, and efficient learning. This can be met by contemporary learning systems and practices that provide high adoption, smooth usage, high satisfaction, and close alignment with the current practices of an enterprise. Because the required competences of an enterprise evolve, the development of competence models needs to be agile and to leverage state-of-the-art technologies that align with the organization’s processes and models. Therefore, in this paper we provide a review of the eOL research in order to summarize the findings, identify the various capabilities of eOL, and guide the development of organizational learning in future enterprises as well as in future studies.

Methodology

To answer our research questions, we conducted an SLR, which is a means of evaluating and interpreting all available research relevant to a particular research question, topic area, or phenomenon of interest. An SLR has the capacity to present a fair evaluation of a research topic by using a trustworthy, rigorous, and auditable methodology (Kitchenham and Charters 2007 ). The guidelines used (Kitchenham and Charters 2007 ) were derived from three existing guides adopted by medical researchers. Therefore, we adopted SLR guidelines that follow transparent and widely accepted procedures (especially in the area of software engineering and information systems, as well as in e-learning), minimize potential bias (researchers), and support reproducibility (Kitchenham and Charters 2007 ). Besides the minimization of bias and support for reproducibility, an SLR allows us to provide information about the impact of some phenomenon across a wide range of settings, contexts, and empirical methods. Another important advantage is that, if the selected studies give consistent results, SLRs can provide evidence that the phenomenon is robust and transferable (Kitchenham and Charters 2007 ).

Article Collection

Several procedures were followed to ensure a high-quality review of the literature of eOL. A comprehensive search of peer-reviewed articles was conducted in February 2019 (short papers, posters, dissertations, and reports were excluded), based on a relatively inclusive range of key terms: “organizational learning” & “elearning”, “organizational learning” & “e-learning”, “organisational learning” & “elearning”, and “organisational learning” & “e-learning”. Publications were selected from 2010 onwards, because we identified significant advances since 2010 (e.g., MOOCs, learning analytics, personalized learning) in the area of learning technologies. A wide variety of databases were searched, including SpringerLink, Wiley, ACM Digital Library, IEEE Xplore, Science Direct, SAGE, ERIC, AIS eLibrary, and Taylor & Francis. The selected databases were aligned with the SLR guidelines (Kitchenham and Charters 2007 ) and covered the major venues in IS and educational technology (e.g., a basket of eight IS journals, the top 20 journals in the Google Scholar IS subdiscipline, and the top 20 journals in the Google Scholar Educational Technology subdiscipline). The search process uncovered 2,347 peer-reviewed articles.
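The four key-term combinations described above can be enumerated programmatically. The sketch below is an illustrative reconstruction, not the authors’ actual search tooling; the concrete query syntax varied per database provider:

```python
# Illustrative reconstruction of the four key-term combinations used in the
# search (British/American spelling x hyphenation variants of "e-learning").
spellings = ['"organizational learning"', '"organisational learning"']
elearning_terms = ['"elearning"', '"e-learning"']

# Cartesian product of the two variant lists yields the four search queries.
queries = [f"{s} AND {e}" for s in spellings for e in elearning_terms]
for q in queries:
    print(q)
```

Running each resulting query against every selected database and merging the hits is what produced the initial pool of 2,347 peer-reviewed articles.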

Inclusion and Exclusion Criteria

The selection phase determines the overall validity of the literature review, and thus it is important to define specific inclusion and exclusion criteria. As Dybå and Dingsøyr ( 2008 ) specified, the quality criteria should cover three main issues – namely, rigor, credibility, and relevance – that need to be considered when evaluating the quality of the selected studies. We applied eight quality criteria informed by the proposed Critical Appraisal Skills Programme (CASP) and related works (Dybå and Dingsøyr 2008 ). Table 1 presents these criteria.

Quality criteria

Therefore, studies were eligible for inclusion if they were focused on eOL. The aforementioned criteria were applied in stages 2 and 3 of the selection process (see Fig.  2 ), when we assessed the papers based on their titles and abstracts, and read the full papers. From March 2020, we performed an additional search (stage 4) following the same process for papers published after the initial search period (i.e., 2010–February 2019). The additional search returned seven papers. Figure 2 summarizes the stages of the selection process.

Fig. 2 Stages of the selection process

Each collected study was analyzed based on the following elements: study design (e.g., experiment, case study), area (e.g., IT, healthcare), technology (e.g., wiki, social media), population (e.g., managers, employees), sample size, unit of analysis (individual, firm), data collections (e.g., surveys, interviews), research method, data analysis, and the main research objective of the study. It is important to highlight that the articles were coded based on the reported information, that different authors reported information at different levels of granularity (e.g., an online system vs. the name of the system), and that in some cases the information was missing from the paper. Overall, we endeavored to code the articles as accurately and completely as possible.

The coding process was iterative, with regular consensus meetings between the two researchers involved. The primary coder prepared the initial coding for a number of articles, and both coders reviewed and agreed on the coding in order to reach the final codes presented in the Appendix . Disagreements between the coders and inexplicit aspects of the reviewed papers were discussed and resolved in regular consensus meetings. Although this process did not provide reliability indices (e.g., Cohen’s kappa), it did provide a certain reliability in terms of consistency of the coding and what Krippendorff ( 2018 ) described as reliability: “the degree to which members of a designated community concur on the readings, interpretations, responses to, or uses of given texts or data”, which is considered acceptable research practice (McDonald et al. 2019 ).
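Had the two coders’ labels been retained per article, Cohen’s kappa could have been computed directly from them. The following sketch uses hypothetical study-design labels (not the actual coding data) to show the computation:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    who labeled the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed proportion of items on which the coders agree.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement by chance, from each coder's label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical labels for six articles from two coders.
a = ["survey", "case study", "experiment", "survey", "survey", "case study"]
b = ["survey", "case study", "survey", "survey", "survey", "case study"]
print(round(cohens_kappa(a, b), 3))  # -> 0.7
```

Values above roughly 0.6 are conventionally read as substantial agreement, which is the kind of index the consensus-meeting procedure substitutes for.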

Results

In this section, we present the detailed results of the analysis of the 47 papers. Analysis of the studies was performed using non-statistical methods that considered the variables reported in the Appendix . This section is followed by an analysis and discussion of the categories.

Sample Size and Population Involved

The categories related to the sample of the articles included the number of participants in each study (size), their position (e.g., managers, employees), and the area/topic covered by the study. The majority of the studies involved employees (29), with few studies involving managers (6), civil servants (2), learning specialists (2), clients, and researchers. Regarding the sample size, approximately half of the studies (20) were conducted with fewer than 100 participants; some (12) can be considered large-scale studies (more than 300 participants); and only a few (9) can be considered small scale (fewer than 20 participants). In relation to the area/topic of the study, most studies (11) were conducted in the context of the IT industry, but there was also good coverage of other important areas (i.e., healthcare, telecommunications, business, public sector). Interestingly, several studies either did not define the area or were implemented in a generic context (sector-agnostic studies, n = 10), and some studies were implemented in a multi-sector context (e.g., participants from different sections or companies, n = 4).

Research Methods

When assessing the status of research for an area, one of the most important aspects is the methodology used. By “method” in the Appendix , we refer to the distinction between quantitative, qualitative, and mixed methods research. In addition to the method, in our categorization protocol we also included “study design” to refer to the distinction between survey studies (i.e., those that gathered data by asking a group of participants), experiments (i.e., those that created situations to record beneficial data), and case studies (i.e., those that closely studied a group of individuals).

Based on this categorization, the Appendix shows that the majority of the papers were quantitative (34) and qualitative (7), with few studies (6) utilizing mixed methods. Regarding the study design, most of the studies were survey studies (26), 13 were case studies, and fewer were experiments (8). For most studies, the individual participant (40) was the unit of analysis, with few studies having the firm as the unit of analysis, and only one study using the training session as a unit of analysis. Regarding the measures used in the studies, most utilized surveys (39), with 11 using interviews, and only a few studies using field notes from focus groups (2) and log files from the systems (2). Only eight studies involved researchers using different measures to triangulate or extend their findings. Most articles used structural equation modeling (SEM) (17) to analyze their data, with 13 studies employing descriptive statistics, seven using content analysis, nine using regression analysis or analyses of variances/covariance, and one study using social network analysis (SNA).

Technologies

Concerning the technology used, most of the studies (17) did not study a specific system, referring instead in their investigation to a generic e-learning or technological solution. Several studies (9) named web-based learning environments, without describing the functionalities of the identified system. Other studies focused on online learning environments (4), collaborative learning systems (3), social learning systems (3), smart learning systems (2), podcasting (2), with the rest of the studies using a specific system (e.g., a wiki, mobile learning, e-portfolios, Second Life, web application).

Research Objectives

The research objectives of the studies could be separated into six main categories. The first category focuses on the intention of the employees to use the technology (9); the second focuses on the performance of the employees (8); the third focuses on the value/outcome for the organization (4); the fourth focuses on the actual usage of the system (7); the fifth focuses on employees’ satisfaction (4); and the sixth focuses on the ability of the proposed system to foster learning (9). In addition to these six categories, we also identified studies that focused on potential barriers for eOL in organizations (Stoffregen et al. 2016 ), the various benefits associated with the successful implementation of eOL (Liu et al. 2012 ), the feasibility of eOL (Kim et al. 2014 ; Mueller et al. 2011 ), and the alignment of the proposed innovation with the other processes and systems in the organization (Costello and McNaughton 2018 ).

E-learning Capabilities in Various Organizations and for Various Objectives

The technology used plays an inherent role in both the organization and the expected eOL objective. E-learning systems are categorized based on their functionalities and affordances. Based on the information reported in the selected papers, we grouped them according to the different technologies and functionalities (e.g., collaborative, online, smart). To do so, we focused on the main elements described in the selected paper; for instance, a paper that described the system as wiki-based or indicated that the system was Second Life was categorized as such, rather than being added to collaborative systems or social learning respectively. We did this because we wanted to capture all the available information, since it gave us additional insights (e.g., Second Life is both a social and a VR system).

To investigate the connection between the various technologies used to enhance organizational learning and their application in the various organizations, we utilized the coding (see Appendix ) and mapped the various e-learning technologies (or their affordances) with the research industries to which they applied (Fig.  3 ). There was occasionally a lack of detailed information about the capabilities of the e-learning systems applied (e.g., generic, or a web application, or an online system), which limited the insights. Figure 3 provides a useful mapping of the confluence of e-learning technologies and their application in the various industries.

Fig. 3 Association of the different e-learning technologies with the industries to which they are applied in the various studies. Note: The size of the circles depicts the frequency of studies, with the smallest circle representing one study and the largest representing six studies. The mapping is extracted from the data in the Appendix , which outlines the papers that belong in each of the circles
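The frequency mapping behind such a figure amounts to a cross-tabulation of the coded (technology, industry) pairs, where each count becomes a circle size. The sketch below uses hypothetical codings rather than the actual Appendix data:

```python
from collections import Counter

# Hypothetical coding rows: one (technology, industry) pair per reviewed paper.
codings = [
    ("web-based", "IT"), ("web-based", "healthcare"),
    ("collaborative", "IT"), ("social", "telecommunications"),
    ("web-based", "IT"), ("smart", "public sector"),
]

# Cross-tabulate: each distinct pair's count is the circle size in the plot.
freq = Counter(codings)
for (tech, industry), count in sorted(freq.items()):
    print(f"{tech:>13} x {industry:<18} {count}")
```

The same tabulation, run over the technology and objective codes instead of industries, would reproduce the mapping reported for the research objectives.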

To investigate the connection between the various technologies used to enhance organizational learning and their intended objectives, we utilized the coding of the articles (see Appendix ) and mapped the various e-learning technologies (or their affordances) with the intended objectives, as reported in the various studies (Fig.  4 ). The results in Fig.  4 show the objectives that are central in eOL research (e.g., performance, fostering learning, adoption, and usage) as well as those objectives on which few studies have focused (e.g., alignment, feasibility, behavioral change). In addition, the results also indicate the limited utilization of the various e-learning capabilities (e.g., social, collaborative, smart) to achieve objectives connected with those capabilities (e.g., social learning and behavioral change, collaborative learning, and barriers).

Fig. 4 Association of the different e-learning technologies with the objectives investigated in the various studies. Note: The size of the circles depicts the frequency of studies, with the smallest circle representing one study and the largest representing five studies. The mapping is extracted from the data in the Appendix , which outlines the papers that belong in each of the circles

Discussion

After reviewing the 47 identified articles in the area of eOL, we can observe that all the works acknowledge the importance of the affordances offered by different e-learning technologies (e.g., remote collaboration, anytime anywhere), the importance of the relationship between eOL and employees’ satisfaction and performance, and the benefits associated with organizational value and outcome. Most of the studies agree that eOL provides employees, managers, and even clients with opportunities to learn in a more differentiated manner, compared to formal and face-to-face learning. However, how the organization adopts these capabilities and puts them into practice to achieve its goals is a complex and challenging procedure that seems to be underexplored.

Several studies (Lee et al. 2015a ; Muller Queiroz et al. 2018 ; Tsai et al. 2010 ) focused on the positive effect of perceived managerial support, perceived usefulness, perceived ease of use, and other technology acceptance model (TAM) constructs of the e-learning system in supporting all three levels of learning (i.e., individual, collaborative, and organizational). Another interesting dimension highlighted by many studies (Choi and Ko 2012 ; Khalili et al. 2012 ; Yanson and Johnson 2016 ) is the role of socialization in the adoption and usage of the e-learning systems that offer these capabilities. Building connections and creating a shared learning space in the e-learning system is challenging but also critical for the learners (Yanson and Johnson 2016 ). This is consistent with the expectancy-theoretical explanation of how social context impacts on employees’ motivation to participate in learning (Lee et al. 2015a ; Muller Queiroz et al. 2018 ).

The organizational learning literature suggests that e-learning may be more appropriate for the acquisition of certain types of knowledge than others (e.g., procedural vs. declarative, or hard-skills vs. soft-skills); however, there is no empirical evidence for this (Yanson and Johnson 2016 ). To advance eOL research, there is a need for a significant move to address complex, strategic skills by including learning and development professionals (Garavan et al. 2019 ) and by developing strategic relationships. Another important element is to utilize e-learning technology that addresses and integrates organizational, individual, and social perspectives in eOL (Wang  2011 ). This is also identified in our literature review since we found only limited specialized e-learning systems in domain areas that have traditionally benefited from such technology. For instance, although there were studies that utilized VR environments (Costello and McNaughton 2018 ; Muller Queiroz et al. 2018 ) and video-based learning systems (Wei et al. 2013 ; Wei and Ram 2016 ), there was limited focus in contemporary eOL research on how specific affordances of the various environments that are used in organizations (e.g., Carnetsoft, Outotec HSC, and Simscale for simulations of working environments; or Raptivity, YouTube, and FStoppers to gain specific skills and how-to knowledge) can benefit the intended goals or be integrated with the unique qualities of the organization (e.g., IT, healthcare).

For the design and the development of the eOL approach, the organization needs to consider the alignment of individual learning needs, organizational objectives, and the necessary resources (Wang  2011 ). To achieve this, it is advisable for organizations to define the expected objectives, catalogue the individual needs, and select technologies that have the capacity to support and enrich learners with self-directed and socially constructed learning practices in the organization (Wang  2011 ). This needs to be done by taking into consideration that on-demand eOL is gradually replacing the classic static eOL curricula and processes (Dignen and Burmeister 2020 ).

Another important dimension of eOL research is the lenses used to approach effectiveness. The selected papers approached effectiveness with various objectives, such as fostering learning, usage of the e-learning system, employees’ performance, and the added organizational value (see Appendix ). To measure these indices, various metrics (quantitative, qualitative, and mixed) have been applied. The quantitative dimensions emphasize employees’ satisfaction and system usage (e.g., Menolli et al. 2020 ; Turi et al. 2019 ), as well as managers’ perceived gained value and benefits (e.g., Lee et al. 2015b ; Xiang et al. 2020 ) and firms’ perceived effective utilization of eOL resources (López-Nicolás and Meroño-Cerdán 2011 ). The qualitative dimensions focus on usage, feasibility, and experience at different levels within an organization, based on interviews, focus groups, and observations (Costello and McNaughton 2018 ; Michalski 2014 ; Stoffregen et al. 2016 ). However, it is not always clear how eOL effectiveness has been measured, nor the extent to which eOL is well aligned with and strategically impactful on delivering the strategic agenda of the organization (Garavan et al. 2019 ).

Research on digital technologies is developing rapidly, and big data and business analytics have the potential to pave the way for organizations’ digital transformation and sustainable development (Mikalef et al. 2018 ; Pappas et al. 2018 ); however, our review finds surprisingly limited use of big data and analytics in eOL. Despite contemporary e-learning systems adopting data-driven mechanisms, as well as advances in learning analytics (Siemens and Long 2011 ), the results of our analysis indicate that learner-generated data in the context of eOL are used in only a few studies to extract very limited insights with respect to the effectiveness of eOL and the intended objectives of the respective study (Hung et al. 2015 ; Renner et al. 2020 ; Rober and Cooper 2011 ). Therefore, eOL research needs to focus on data-driven qualities that will allow future researchers to gain deeper insights into which capabilities need to be developed to monitor the effectiveness of the various practices and technologies, their alignment with other functions of the organization, and how eOL can be a strategic and impactful vehicle for materializing the strategic agenda of the organization.

Status of eOL Research

The current review suggests that, while the efficient implementation of eOL entails certain challenges, there is also great potential for improving employees’ performance as well as overall organizational outcome and value. There are also opportunities for improving organizations’ learning flow, which might not be feasible with formal learning and training. In order to construct the main research dimensions of eOL research and to look more deeply at the research objectives of the studies (the information we coded as objectives in the Appendix ), we performed a content analysis and grouped the research objectives. This enabled us to summarize the contemporary research on eOL according to five major categories, each of which is described further below. As the research objectives of the published work show, the research on eOL conducted during the last decade has particularly focused on the following five directions.

Research has particularly focused on how easy the technology is to use, on how useful it is, or on how well aligned/integrated it is with other systems and processes within the organization. In addition, studies have used different learning technologies (e.g., smart, social, personalized) to enhance organizational learning in different contexts and according to different needs. However, most works have focused on affordances such as remote training and the development of static courses or modules to share information with learners. Although a few studies have utilized contemporary e-learning systems (see Appendix ), even in these studies there is a lack of alignment between the capabilities of those systems (e.g., open online course, adaptive support, social and collaborative learning) and the objectives and strategy of the organization (e.g., organizational value, fostering learning).

The reviewed work has emphasized how different factors contribute to different levels of organizational learning, and it has focused on practices that address individual, collaborative, and organizational learning within the structure of the organization. In particular, most of the reviewed studies recognize that organizational learning occurs at multiple levels: individual, team (or group), and organization. In other words, although each of the studies carried out an investigation within a given level (except for Garavan et al. 2019 ), there is a recognition and discussion of the different levels. Therefore, the results align with the 4I framework of organizational learning that recognizes how learning across the different levels is linked by social and psychological processes: intuiting, interpreting, integrating, and institutionalizing (the 4Is) (Crossan et al. 1999 ). However, most of the studies focused on the institutionalizing-intuiting link (i.e., top-down feedback); moreover, no studies focused on contemporary learning technologies and processes that strengthen the learning flow (e.g., self-regulated learning).

There is a considerable number of predominantly qualitative studies that focus on potential barriers to eOL implementation as well as on the risks and requirements associated with the feasibility and successful implementation of eOL. In the same vein, research has emphasized the importance of alignment of eOL (both in processes and in technologies) within the organization. These critical aspects for effective eOL are sometimes the main objectives of the studies (see Appendix ). However, most of the elements relating to the effectiveness of eOL were measured with questionnaires and interviews with employees and managers, and very little work was conducted on how to leverage the digital technologies employed in eOL, big data, and analytics in order to monitor the effectiveness of eOL.

In most of the studies, the main objective was to increase employees’ adoption, satisfaction, and usage of the e-learning system. In addition, several studies focused on the e-learning system’s ability to improve employees’ performance, increase the knowledge flow in the organization, and foster learning. Most of the approaches were employee-centric, with a small number of studies focusing on managers and the firm in general. However, employees were seen as static entities within the organization, with limited work investigating how eOL-based training exposes employees to new knowledge, broadens their skills repertoire, and offers tremendous potential for fostering innovation (Lin and Sanders 2017 ).

A considerable number of studies utilized the firm (rather than the individual employee) as the unit of analysis. Such studies focused on how the implementation of eOL can increase employee performance, organizational value, and customer value. Although this is extremely helpful in furthering knowledge about eOL technologies and practices, a more granular investigation of the different e-learning systems and processes to address the various goals and strategies of the organization would enable researchers to extract practical insights on the design and implementation of eOL.

Research Agenda

By conducting an SLR and documenting the eOL research of the last decade, we have identified promising themes of research that have the potential to further eOL research and practice. To do so, we define a research agenda consisting of five thematic areas of research, as depicted in the research framework in Fig.  5 , and we provide some suggestions on how researchers could approach these challenges. In this visualization of the framework, on the left side we present the organizations as they were identified from our review (i.e., area/topic category in the Appendix ) and the multiple levels where organizational learning occurs (Costello and McNaughton 2018 ). On the right side, we summarize the objectives as they were identified from our review (i.e., the objectives category in the Appendix ). In the middle, we depict the orchestration that was conducted and how potential future research on eOL can improve the orchestration of the various elements and accelerate the achievement of the intended objectives. In particular, our proposed research agenda includes five research themes discussed in the following subsections.

Fig. 5 E-learning capabilities to enhance organizational research agenda

Theme 1: Couple E-learning Capabilities With the Intended Goals

The majority of the eOL studies either investigated a generic e-learning system using the umbrella term “e-learning” or did not provide enough details about the functionalities of the system (in most cases, it was simply defined as an online or web system). This indicates the very limited focus of the eOL research on the various capabilities of e-learning systems. In other words, the literature has been very detailed on the organizational value and employees’ acceptance of the technology, but less detailed on the capabilities of this technology that need to be put in place to achieve the intended goals and strategic agenda. However, the capabilities of the e-learning systems and their use are not one-size-fits-all, and the intended goals (to obtain certain skills and competences) and employees’ needs and backgrounds play a determining role in the selection of the e-learning system (Al-Fraihat et al. 2020 ).

Only in a very few studies (Mueller et al. 2011 ; Renner et al. 2020 ) were the capabilities of the e-learning solutions (e.g., mobile learning, VR) utilized, and the results were found to significantly contribute to the intended goals. The intended knowledge can be procedural, declarative, or a general competence (e.g., presentation, communication, or leadership skills), and its particularities and the pedagogical needs of the intended knowledge (e.g., a need for summative/formative feedback or for social learning support) should guide the selection of the e-learning system and the respective capabilities. Therefore, future research needs to investigate how the various capabilities offered by contemporary learning systems (e.g., assessment mechanisms, social learning, collaborative learning, personalized learning) can be utilized to adequately reinforce the intended goals (e.g., to train personnel to use a new tool, to improve presentation skills).

Theme 2: Embrace the Particularities of the Various Industries

Organizational learning entails sharing knowledge and enabling opportunities for growth at the individual, group, team, and organizational levels. Contemporary e-learning systems provide the medium to substantiate the necessary knowledge flow within organizations and to support employees’ overall learning. From the selected studies, we can infer that eOL research is either conducted in an industry-agnostic context (either generic or not properly reported) or focused on the IT industry (see Appendix ). However, when looking at the few studies that provide results from different industries (Garavan et al. 2019 ; Lee et al. 2014 ), companies indicate that there are different practices, processes, and expectations, and that employees have different needs and perceptions with regards to e-learning systems and eOL in general. Such particularities influence the perceived dimensions of a learning organization. Some industries noted that eOL promoted the development of their learning organizations, whereas others reported that eOL did not seem to contribute to their development as a learning organization (Yoo and Huang 2016 ). Therefore, it is important that the implementation of organizational learning embraces the particularities of the various industries, and future research needs to identify how industry-specific characteristics can inform the design and development of organizational learning in promoting an organization’s goals and agenda.

Theme 3: Utilize E-learning Capabilities to Implement Employee-centric Approaches

For efficient organizational learning to be implemented, the processes and technologies need to recognize that learning is linked by social and psychological processes (Crossan et al. 1999 ). This allows employees to develop learning in various forms (e.g., social, emotional, personalized) and to develop elements such as self-awareness, self-control, and interpersonal skills that are vital for the organization. Looking at the contemporary eOL research, we notice that the exploration of e-learning capabilities to nurture the aforementioned elements and support employee-centric approaches is very limited (e.g., personalized technologies, adaptive assessment). Therefore, future research needs to collect data to understand how e-learning capabilities can be utilized in relation to employees’ needs and perceptions in order to provide solutions (e.g., collaborative, social, adaptive) that are employee-centric and focused on development, and that have the potential to move away from standard one-size-fits-all e-learning solutions to personalized and customized systems and processes.

Theme 4: Employ Analytics-enabled eOL

There is a lot of emphasis on measuring, via various qualitative and quantitative metrics, the effectiveness of eOL implemented at different levels in organizations. However, most of these metrics come from surveys and interviews that capture employees’ and managers’ perceptions of various aspects of eOL (e.g., fostering of learning, organizational value, employees’ performance), and very few studies utilize analytics (Hung et al. 2015; Renner et al. 2020; Rober and Cooper 2011). Given how digital technologies, big data, and business analytics pave the way towards organizations’ digital transformation and sustainable development (Mikalef et al. 2018; Pappas et al. 2018), and considering the learning analytics affordances of contemporary e-learning systems (Siemens and Long 2011), future work needs to investigate how learner/employee-generated data can be employed to inform practice and devise more accurate and temporal effectiveness metrics when measuring the importance and impact of eOL.

Theme 5: Orchestrate the Employees’ Needs, Resources, and Objectives in eOL Implementation

While considerable effort has been directed towards the various building blocks of eOL implementation, such as resources (intangible, tangible, and human skills) and employees’ needs (e.g., vision, growth, skills development), little is known so far about the processes and structures necessary for orchestrating those elements in order to achieve an organization’s intended goals and to materialize its overall agenda. In other words, eOL research has been very detailed on some of the elements that constitute efficient eOL, but less so on the interplay of those elements and how they need to be put into place. Prior literature on strategic resource planning has shown that competence in orchestrating such elements is a prerequisite to successfully increasing business value (Wang et al. 2012). Therefore, future research should not only investigate each of these elements in silos, but also consider their interplay, since it is likely that organizations with similar resources will exert highly varied levels in each of these elements (e.g., analytics-enabled, e-learning capabilities) to successfully materialize their goals (e.g., increase value, improve the competence base of their employees, modernize their organization).

Implications

Several implications for eOL have been revealed in this literature review. First, most studies agree that employees’ or trainees’ experience is extremely important for the successful implementation of eOL. Thus, keeping them in the design and implementation cycle of eOL will increase eOL adoption and satisfaction as well as reduce the risks and barriers. Another important implication addressed by some studies relates to the capabilities of the e-learning technologies, with easy-to-use, useful, and social technologies resulting in more efficient eOL (e.g., higher adoption and performance). Thus, it is important for organizations to incorporate these functionalities in the platform and reinforce them with appropriate content and support. This should not only benefit learning outcomes, but also provide the networking opportunities for employees to broaden their personal networks, which are often lost when companies move from face-to-face formal training to e-learning-enabled organizational learning.

Limitations

This review has some limitations. First, we had to make some methodological decisions (e.g., selection of databases, the search query) that might lead to certain biases in the results. However, we tried to avoid such biases by considering all the major databases and following the steps indicated by Kitchenham and Charters (2007). Second, the selection of empirical studies and coding of the papers might pose another possible bias. However, the focus was clearly on the empirical evidence, the terminology employed (“e-learning”) is an umbrella term that covers the majority of the work in the area, and the coding of papers was checked by two researchers. Third, some elements of the papers were not described accurately, leading to some missing information in the coding of the papers. However, the amount of missing information was very small and could not affect the results significantly. Finally, we acknowledge that the selected methodology (Kitchenham and Charters 2007) includes potential biases (e.g., false negatives and false positives), and that different, equally valid methods (e.g., Okoli and Schabram 2010) might have been used and have resulted in slightly different outcomes. Nevertheless, despite the limitations of the selected methodology, it is a well-accepted and widely used literature review method in both software engineering and information systems (Boell and Cecez-Kecmanovic 2014), providing certain assurance of the results.

Conclusions and Future Work

We have presented an SLR of 47 contributions in the field of eOL over the last decade. With respect to RQ1, we analyzed the papers from different perspectives, such as research methodology, technology, industries, employees, and intended outcomes in terms of organizational value, employees’ performance, usage, and behavioral change. The detailed landscape is depicted in the Appendix and Figs. 3 and 4, with the results indicating the limited utilization of the various e-learning capabilities (e.g., social, collaborative) to achieve objectives connected with those capabilities (e.g., social learning and behavioral change, collaborative learning and overcoming barriers).

With respect to RQ2, we categorized the main findings of the selected papers into five areas that reflect the status of eOL research, and we have discussed the challenges and opportunities emerging from the current review. In addition, we have synthesized the extracted challenges and opportunities and proposed a research agenda consisting of five elements that provide suggestions on how researchers could approach these challenges and exploit the opportunities. Such an agenda will strengthen how e-learning can be leveraged to enhance the process of improving actions through better knowledge and understanding in an organization.

A number of suggestions for further research have emerged from reviewing prior and ongoing work on eOL. One recommendation for future researchers is to clearly describe the eOL approach by providing detailed information about the technologies and materials used, as well as the organizations. This will allow meta-analyses to be conducted and it will also identify the potential effects of a firm’s size or area on the performance and other aspects relating to organizational value. Future work should also focus on collecting and triangulating different types of data from different sources (e.g., systems’ logs). The reviewed studies were conducted mainly by using survey data, and they made limited use of data coming from the platforms; thus, the interpretations and triangulation between the different types of collected data were limited.

Biographies

Michail N. Giannakos is a Professor of Interaction Design and Learning Technologies at the Department of Computer Science of NTNU, and Head of the Learner-Computer Interaction lab (https://lci.idi.ntnu.no/). His research focuses on the design and study of emerging technologies in online and hybrid education settings, and their connections to student and instructor experiences and practices. Giannakos has co-authored more than 150 manuscripts published in peer-reviewed journals and conferences (including Computers & Education, Computers in Human Behavior, IEEE TLT, Behaviour & Information Technology, BJET, ACM TOCE, CSCL, Interact, C&C, and IDC, to mention a few) and has served as an evaluator for the EC and the US-NSF. He has served in various organization committees (e.g., general chair, associate chair) and program committees, as well as editor and guest editor of highly recognized journals (e.g., BJET, Computers in Human Behavior, IEEE TOE, IEEE TLT, ACM TOCE). He has worked on several research projects funded by diverse sources such as the EC, Microsoft Research, The Research Council of Norway (RCN), the US-NSF, the German agency for international academic cooperation (DAAD), and the Cheng Endowment. Giannakos is also a recipient of a Marie Curie/ERCIM fellowship and the Norwegian Young Research Talent award, and he is one of the outstanding academic fellows of NTNU (2017–2021).

Patrick Mikalef is an Associate Professor in Data Science and Information Systems at the Department of Computer Science. In the past, he has been a Marie Skłodowska-Curie post-doctoral research fellow working on the research project “Competitive Advantage for the Data-driven Enterprise” (CADENT). He received his B.Sc. in Informatics from the Ionian University, his M.Sc. in Business Informatics from Utrecht University, and his Ph.D. in IT Strategy from the Ionian University. His research interests focus on the strategic use of information systems and IT-business value in turbulent environments. He has published work in international conferences and peer-reviewed journals, including the Journal of Business Research, British Journal of Management, Information and Management, Industrial Management & Data Systems, and Information Systems and e-Business Management.

Ilias O. Pappas is an Associate Professor of Information Systems at the Department of Information Systems, University of Agder (UiA), Norway. His research and teaching activities include data science and digital transformation, social innovation and social change, user experience in different contexts, as well as digital marketing, e-services, and information technology adoption. He has published articles in peer-reviewed journals and conferences, including the Journal of Business Research, European Journal of Marketing, Computers in Human Behavior, Information & Management, Psychology & Marketing, International Journal of Information Management, and Journal of Systems and Software. Pappas has been a Guest Editor for the journals Information & Management, Technological Forecasting and Social Change, Information Systems Frontiers, Information Technology & People, and Information Systems and e-Business Management. Pappas is a recipient of ERCIM and Marie Skłodowska-Curie fellowships.

Survey = survey study; Exp. = experiment; CaseSt = case study; ND = non-defined; MGM = management; Telec. = telecommunication; Bsn = business; Univ. = university; Cons. = consulting; Public = public sector; Ent. = enterprise; Web = Web-based; KRS = knowledge repository system; OERs = open educational resources; SL = Second Life; Mg. = managers; Empl = employees; Stud = students; Res. = researchers; Learn. = learning specialists; Individ. = individual; Surv. = surveys; Int. = interviews; FG = focus groups; Log = log files; Obs. = observations; Reg. = regression analysis; Descr. = descriptive statistics; A-VA = analysis of variances/covariance; CA = content analysis; ItU = intention to use; Sat. = satisfaction; OV = organizational value; Per. = performance; Flearn = foster learning; Benef. = benefits; Align. = alignment; Feas. = feasibility; Barr. = barriers; Beh. = behavioral change

Open Access funding provided by NTNU Norwegian University of Science and Technology (incl St. Olavs Hospital - Trondheim University Hospital).

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Michail N. Giannakos, Email: [email protected].

Patrick Mikalef, Email: [email protected] .

Ilias O. Pappas, Email: [email protected] .

  • Al-Fraihat D, Joy M, Sinclair J. Evaluating E-learning systems success: An empirical study. Computers in Human Behavior. 2020;102:67–86. doi: 10.1016/j.chb.2019.08.004.
  • Alharthi AD, Spichkova M, Hamilton M. Sustainability requirements for eLearning systems: A systematic literature review and analysis. Requirements Engineering. 2019;24(4):523–543. doi: 10.1007/s00766-018-0299-9.
  • Alsabawy AY, Cater-Steel A, Soar J. IT infrastructure services as a requirement for e-learning system success. Computers & Education. 2013;69:431–451. doi: 10.1016/j.compedu.2013.07.035.
  • Antonacopoulou E, Chiva R. The social complexity of organizational learning: The dynamics of learning and organizing. Management Learning. 2007;38:277–295. doi: 10.1177/1350507607079029.
  • Berends H, Lammers I. Explaining discontinuity in organizational learning: A process analysis. Organization Studies. 2010;31(8):1045–1068. doi: 10.1177/0170840610376140.
  • Boell SK, Cecez-Kecmanovic D. A hermeneutic approach for conducting literature reviews and literature searches. Communications of the Association for Information Systems. 2014;34(1):12.
  • Bologa R, Lupu AR. Organizational learning networks that can increase the productivity of IT consulting companies: A case study for ERP consultants. Expert Systems with Applications. 2014;41(1):126–136. doi: 10.1016/j.eswa.2013.07.016.
  • Broadbent J. Comparing online and blended learner’s self-regulated learning strategies and academic performance. The Internet and Higher Education. 2017;33:24–32. doi: 10.1016/j.iheduc.2017.01.004.
  • Cheng B, Wang M, Moormann J, Olaniran BA, Chen NS. The effects of organizational learning environment factors on e-learning acceptance. Computers & Education. 2012;58(3):885–899. doi: 10.1016/j.compedu.2011.10.014.
  • Cheng B, Wang M, Yang SJ, Peng J. Acceptance of competency-based workplace e-learning systems: Effects of individual and peer learning support. Computers & Education. 2011;57(1):1317–1333. doi: 10.1016/j.compedu.2011.01.018.
  • Choi S, Ko I. Leveraging electronic collaboration to promote interorganizational learning. International Journal of Information Management. 2012;32(6):550–559. doi: 10.1016/j.ijinfomgt.2012.03.002.
  • Costello JT, McNaughton RB. Integrating a dynamic capabilities framework into workplace e-learning process evaluations. Knowledge and Process Management. 2018;25(2):108–125. doi: 10.1002/kpm.1565.
  • Crossan MM, Lane HW, White RE. An organizational learning framework: From intuition to institution. Academy of Management Review. 1999;24:522–537. doi: 10.2307/259140.
  • Crossan MM, Maurer CC, White RE. Reflections on the 2009 AMR decade award: Do we have a theory of organizational learning? Academy of Management Review. 2011;36(3):446–460. doi: 10.5465/amr.2010.0544.
  • Dee J, Leisyte L. Knowledge sharing and organizational change in higher education. The Learning Organization. 2017;24(5):355–365. doi: 10.1108/TLO-04-2017-0034.
  • Dignen B, Burmeister T. Learning and development in the organizations of the future. In: Three pillars of organization and leadership in disruptive times. Cham: Springer; 2020. pp. 207–232.
  • Dybå T, Dingsøyr T. Empirical studies of agile software development: A systematic review. Information and Software Technology. 2008;50(9–10):833–859. doi: 10.1016/j.infsof.2008.01.006.
  • El Kadiri S, Grabot B, Thoben KD, Hribernik K, Emmanouilidis C, Von Cieminski G, Kiritsis D. Current trends on ICT technologies for enterprise information systems. Computers in Industry. 2016;79:14–33. doi: 10.1016/j.compind.2015.06.008.
  • Engeström Y, Kerosuo H, Kajamaa A. Beyond discontinuity: Expansive organizational learning remembered. Management Learning. 2007;38(3):319–336. doi: 10.1177/1350507607079032.
  • Gal E, Nachmias R. Online learning and performance support in organizational environments using performance support platforms. Performance Improvement. 2011;50(8):25–32. doi: 10.1002/pfi.20238.
  • Garavan TN, Heneghan S, O’Brien F, Gubbins C, Lai Y, Carbery R, Grant K. L&D professionals in organisations: Much ambition, unfilled promise. European Journal of Training and Development. 2019;44(1):1–86.
  • Goggins SP, Jahnke I, Wulf V. Computer-supported collaborative learning at the workplace. New York: Springer; 2013.
  • Hammad R, Odeh M, Khan Z. ELCMM: An e-learning capability maturity model. In: Proceedings of the 15th International Conference (e-Society 2017); 2017. pp. 169–178.
  • Hester AJ, Hutchins HM, Burke-Smalley LA. Web 2.0 and transfer: Trainers’ use of technology to support employees’ learning transfer on the job. Performance Improvement Quarterly. 2016;29(3):231–255. doi: 10.1002/piq.21225.
  • Hung YH, Lin CF, Chang RI. Developing a dynamic inference expert system to support individual learning at work. British Journal of Educational Technology. 2015;46(6):1378–1391. doi: 10.1111/bjet.12214.
  • Iris R, Vikas A. E-Learning technologies: A key to dynamic capabilities. Computers in Human Behavior. 2011;27(5):1868–1874. doi: 10.1016/j.chb.2011.04.010.
  • Jia H, Wang M, Ran W, Yang SJ, Liao J, Chiu DK. Design of a performance-oriented workplace e-learning system using ontology. Expert Systems with Applications. 2011;38(4):3372–3382. doi: 10.1016/j.eswa.2010.08.122.
  • Joo YJ, Lim KY, Park SY. Investigating the structural relationships among organisational support, learning flow, learners’ satisfaction and learning transfer in corporate e-learning. British Journal of Educational Technology. 2011;42(6):973–984. doi: 10.1111/j.1467-8535.2010.01116.x.
  • Kaschig A, Maier R, Sandow A, Lazoi M, Barnes SA, Bimrose J, … Schmidt A. Knowledge maturing activities and practices fostering organisational learning: Results of an empirical study. In: European Conference on Technology Enhanced Learning. Berlin: Springer; 2010. pp. 151–166.
  • Khalili A, Auer S, Tarasowa D, Ermilov I. SlideWiki: Elicitation and sharing of corporate knowledge using presentations. In: International Conference on Knowledge Engineering and Knowledge Management. Berlin: Springer; 2012. pp. 302–316.
  • Khandakar MSA, Pangil F. Relationship between human resource management practices and informal workplace learning. Journal of Workplace Learning. 2019;31(8):551–576. doi: 10.1108/JWL-04-2019-0049.
  • Kim MK, Kim SM, Bilir MK. Investigation of the dimensions of workplace learning environments (WLEs): Development of the WLE measure. Performance Improvement Quarterly. 2014;27(2):35–57. doi: 10.1002/piq.21170.
  • Kitchenham B, Charters S. Guidelines for performing systematic literature reviews in software engineering. Technical Report EBSE-2007-01; 2007. https://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=35909B1B280E2032BF116BDC9DCB71EA?
  • Krippendorff K. Content analysis: An introduction to its methodology. Thousand Oaks: Sage Publications; 2018.
  • Lai HJ. Examining civil servants’ decisions to use Web 2.0 tools for learning, based on the decomposed theory of planned behavior. Interactive Learning Environments. 2017;25(3):295–305. doi: 10.1080/10494820.2015.1121879.
  • Lau K. Organizational learning goes virtual? A study of employees’ learning achievement in stereoscopic 3D virtual reality. The Learning Organization. 2015;22(5):289–303. doi: 10.1108/TLO-11-2014-0063.
  • Lee J, Choi M, Lee H. Factors affecting smart learning adoption in workplaces: Comparing large enterprises and SMEs. Information Technology and Management. 2015;16(4):291–302. doi: 10.1007/s10799-014-0201-5.
  • Lee J, Kim DW, Zo H. Conjoint analysis on preferences of HRD managers and employees for effective implementation of m-learning: The case of South Korea. Telematics and Informatics. 2015;32(4):940–948. doi: 10.1016/j.tele.2015.04.010.
  • Lee J, Zo H, Lee H. Smart learning adoption in employees and HRD managers. British Journal of Educational Technology. 2014;45(6):1082–1096. doi: 10.1111/bjet.12210.
  • Lin CH, Sanders K. HRM and innovation: A multi-level organizational learning perspective. Human Resource Management Journal. 2017;27(2):300–317. doi: 10.1111/1748-8583.12127.
  • Lin CY, Huang CK, Zhang H. Enhancing employee job satisfaction via e-learning: The mediating role of an organizational learning culture. International Journal of Human–Computer Interaction. 2019;35(7):584–595. doi: 10.1080/10447318.2018.1480694.
  • Liu YC, Huang YA, Lin C. Organizational factors’ effects on the success of e-learning systems and organizational benefits: An empirical study in Taiwan. The International Review of Research in Open and Distributed Learning. 2012;13(4):130–151. doi: 10.19173/irrodl.v13i4.1203.
  • López-Nicolás C, Meroño-Cerdán ÁL. Strategic knowledge management, innovation and performance. International Journal of Information Management. 2011;31(6):502–509. doi: 10.1016/j.ijinfomgt.2011.02.003.
  • Manuti A, Pastore S, Scardigno AF, Giancaspro ML, Morciano D. Formal and informal learning in the workplace: A research review. International Journal of Training and Development. 2015;19(1):1–17. doi: 10.1111/ijtd.12044.
  • March JG. Exploration and exploitation in organizational learning. Organization Science. 1991;2(1):71–87. doi: 10.1287/orsc.2.1.71.
  • Marshall S. New Zealand Tertiary Institution E-learning Capability: Informing and Guiding eLearning Architectural Change and Development. Report to the Ministry of Education. NZ: Victoria University of Wellington; 2006.
  • McDonald N, Schoenebeck S, Forte A. Reliability and inter-rater reliability in qualitative research: Norms and guidelines for CSCW and HCI practice. Proceedings of the ACM on Human–Computer Interaction. 2019;3(CSCW):1–23.
  • Menolli A, Tirone H, Reinehr S, Malucelli A. Identifying organisational learning needs: An approach to the semi-automatic creation of course structures for software companies. Behaviour & Information Technology. 2020;39(11):1140–1155.
  • Michalski MP. Symbolic meanings and e-learning in the workplace: The case of an intranet-based training tool. Management Learning. 2014;45(2):145–166. doi: 10.1177/1350507612468419.
  • Mikalef P, Pappas IO, Krogstie J, Giannakos M. Big data analytics capabilities: A systematic literature review and research agenda. Information Systems and e-Business Management. 2018;16(3):547–578. doi: 10.1007/s10257-017-0362-y.
  • Mitić S, Nikolić M, Jankov J, Vukonjanski J, Terek E. The impact of information technologies on communication satisfaction and organizational learning in companies in Serbia. Computers in Human Behavior. 2017;76:87–101. doi: 10.1016/j.chb.2017.07.012.
  • Mueller J, Hutter K, Fueller J, Matzler K. Virtual worlds as knowledge management platform—A practice-perspective. Information Systems Journal. 2011;21(6):479–501. doi: 10.1111/j.1365-2575.2010.00366.x.
  • Muller Queiroz, A. C., Nascimento, M., Tori, A., Alejandro, R. Brashear, Veloso, T., de Melo, V., de Souza Meirelles, F., & da Silva Leme, M. I. Immersive virtual environments in corporate education and training. In: AMCIS 2018. https://aisel.aisnet.org/amcis2018/Education/Presentations/12/
  • Navimipour NJ, Zareie B. A model for assessing the impact of e-learning systems on employees’ satisfaction. Computers in Human Behavior. 2015;53:475–485. doi: 10.1016/j.chb.2015.07.026.
  • Oh SY. Effects of organizational learning on performance: The moderating roles of trust in leaders and organizational justice. Journal of Knowledge Management. 2019;23:313–331. doi: 10.1108/JKM-02-2018-0087.
  • Okoli C, Schabram K. A guide to conducting a systematic literature review of information systems research. Sprouts: Working Papers on Information Systems. 2010;10(26):1–46.
  • Pappas IO, Mikalef P, Giannakos MN, Krogstie J, Lekakos G. Big data and business analytics ecosystems: Paving the way towards digital transformation and sustainable societies. Information Systems and e-Business Management. 2018;16:479–491. doi: 10.1007/s10257-018-0377-z.
  • Popova-Nowak IV, Cseh M. The meaning of organizational learning: A meta-paradigm perspective. Human Resource Development Review. 2015;14(3):299–331. doi: 10.1177/1534484315596856.
  • Qi C, Chau PY. An empirical study of the effect of enterprise social media usage on organizational learning. In: Pacific Asia Conference on Information Systems (PACIS 2016) Proceedings, Paper 330; 2016. http://aisel.aisnet.org/pacis2016/330
  • Renner B, Wesiak G, Pammer-Schindler V, Prilla M, Müller L, Morosini D, Cress U. Computer-supported reflective learning: How apps can foster reflection at work. Behaviour & Information Technology. 2020;39(2):167–187. doi: 10.1080/0144929X.2019.1595726.
  • Rober MB, Cooper LP. Capturing knowledge via an “Intrapedia”: A case study. In: 2011 44th Hawaii International Conference on System Sciences. New York: IEEE; 2011. pp. 1–10.
  • Rosenberg MJ, Foshay R. E-learning: Strategies for delivering knowledge in the digital age. Performance Improvement. 2002;41(5):50–51. doi: 10.1002/pfi.4140410512.
  • Serrano Á, Marchiori EJ, del Blanco Á, Torrente J, Fernández-Manjón B. A framework to improve evaluation in educational games. In: IEEE Global Engineering Education Conference, Marrakesh, Morocco; 2012. pp. 1–8.
  • Siadaty M, Jovanović J, Gašević D, Jeremić Z, Holocher-Ertl T. Leveraging semantic technologies for harmonization of individual and organizational learning. In: European Conference on Technology Enhanced Learning. Berlin: Springer; 2010. pp. 340–356.
  • Siemens G, Long P. Penetrating the fog: Analytics in learning and education. EDUCAUSE Review. 2011;46(5):30.
  • Škerlavaj M, Dimovski V, Mrvar A, Pahor M. Intra-organizational learning networks within knowledge-intensive learning environments. Interactive Learning Environments. 2010;18(1):39–63. doi: 10.1080/10494820802190374.
  • Smith PJ, Sadler-Smith E. Learning in organizations: Complexities and diversities. London: Routledge; 2006.
  • Stoffregen JD, Pawlowski JM, Ras E, Tobias E, Šćepanović S, Fitzpatrick D, Friedrich H. Barriers to open e-learning in public administrations: A comparative case study of the European countries Luxembourg, Germany, Montenegro and Ireland. Technological Forecasting and Social Change. 2016;111:198–208. doi: 10.1016/j.techfore.2016.06.030.
  • Subramaniam R, Nakkeeran S. Impact of corporate e-learning systems in enhancing the team performance in virtual software teams. In: Smart Technologies and Innovation for a Sustainable Future. Berlin: Springer; 2019. pp. 195–204.
  • Tsai CH, Zhu DS, Ho BCT, Wu DD. The effect of reducing risk and improving personal motivation on the adoption of knowledge repository system. Technological Forecasting and Social Change. 2010;77(6):840–856. doi: 10.1016/j.techfore.2010.01.011.
  • Turi JA, Sorooshian S, Javed Y. Impact of the cognitive learning factors on sustainable organizational development. Heliyon. 2019;5(9):e02398. doi: 10.1016/j.heliyon.2019.e02398.
  • Wang M. Integrating organizational, social, and individual perspectives in Web 2.0-based workplace e-learning. Information Systems Frontiers. 2011;13(2):191–205.
  • Wang M. Effects of individual and social learning support on employees’ acceptance of performance-oriented e-learning. In: E-Learning in the Workplace. Springer; 2018. pp. 141–159. doi: 10.1007/978-3-319-64532-2_13.
  • Wang M, Ran W, Liao J, Yang SJ. A performance-oriented approach to e-learning in the workplace. Journal of Educational Technology & Society. 2010;13(4):167–179.
  • Wang M, Vogel D, Ran W. Creating a performance-oriented e-learning environment: A design science approach. Information & Management. 2011;48(7):260–269. doi: 10.1016/j.im.2011.06.003.
  • Wang N, Liang H, Zhong W, Xue Y, Xiao J. Resource structuring or capability building? An empirical study of the business value of information technology. Journal of Management Information Systems. 2012;29(2):325–367. doi: 10.2753/MIS0742-1222290211.
  • Wang S, Wang H. Organizational schemata of e-portfolios for fostering higher-order thinking. Information Systems Frontiers. 2012;14(2):395–407. doi: 10.1007/s10796-010-9262-0.
  • Wei K, Ram J. Perceived usefulness of podcasting in organizational learning: The role of information characteristics. Computers in Human Behavior. 2016;64:859–870. doi: 10.1016/j.chb.2016.08.003.
  • Wei K, Sun H, Li H. On the driving forces of diffusion of podcasting in organizational settings: A case study and propositions. In: PACIS 2013 Proceedings, Paper 217; 2013. http://aisel.aisnet.org/pacis2013/217
  • Weinhardt JM, Sitzmann T. Revolutionizing training and education? Three questions regarding massive open online courses (MOOCs). Human Resource Management Review. 2018;29(2):218–225.
  • Xiang Q, Zhang J, Liu H. Organisational improvisation as a path to new opportunity identification for incumbent firms: An organisational learning view. Innovation. 2020;22(4):422–446. doi: 10.1080/14479338.2020.1713001.
  • Yanson R, Johnson RD. An empirical examination of e-learning design: The role of trainee socialization and complexity in short term training. Computers & Education. 2016;101:43–54. doi: 10.1016/j.compedu.2016.05.010.
  • Yoo SJ, Huang WD. Can e-learning system enhance learning culture in the workplace? A comparison among companies in South Korea. British Journal of Educational Technology. 2016;47(4):575–591. doi: 10.1111/bjet.12240.
  • Zhang X, Jiang S, Ordóñez de Pablos P, Lytras MD, Sun Y. How virtual reality affects perceived learning effectiveness: A task–technology fit perspective. Behaviour & Information Technology. 2017;36(5):548–556. doi: 10.1080/0144929X.2016.1268647.
  • Zhang X, Meng Y, de Pablos PO, Sun Y. Learning analytics in collaborative learning supported by Slack: From the perspective of engagement. Computers in Human Behavior. 2019;92:625–633. doi: 10.1016/j.chb.2017.08.012.

e learning research topics 2021

A Systematic Review of the Research Topics in Online Learning During COVID-19: Documenting the Sudden Shift

  • Min Young Doo Kangwon National University http://orcid.org/0000-0003-3565-2159
  • Meina Zhu Wayne State University
  • Curtis J. Bonk Indiana University Bloomington

Since most schools and learners had no choice but to learn online during the pandemic, online learning became the mainstream learning mode rather than a substitute for traditional face-to-face learning. Given this enormous change in online learning, we conducted a systematic review of 191 of the most recent online learning studies published during the COVID-19 era. The systematic review results indicated that the themes regarding “courses and instructors” became popular during the pandemic, whereas most online learning research has focused on “learners” pre-COVID-19. Notably, the research topics “course and instructors” and “course technology” received more attention than prior to COVID-19. We found that “engagement” remained the most common research theme even after the pandemic. New research topics included parents, technology acceptance or adoption of online learning, and learners’ and instructors’ perceptions of online learning.




Covid-19 and E-Learning: An Exploratory Analysis of Research Topics and Interests in E-Learning During the Pandemic


E-Learning Research Trends in Higher Education in Light of COVID-19: A Bibliometric Analysis

Affiliations

  • 1 University of Bisha, Bisha, Saudi Arabia.
  • 2 University of Oum El Bouaghi, Oum El Bouaghi, Algeria.
  • 3 Binghamton University, Binghamton, NY, United States.
  • 4 University of Rochester, Rochester, NY, United States.
  • PMID: 35308075
  • PMCID: PMC8929398
  • DOI: 10.3389/fpsyg.2021.762819

This paper provides a broad bibliometric overview of the important conceptual advances published during COVID-19 within "e-learning in higher education." E-learning as a concept has been widely used in the academic and professional communities and was widely adopted as an educational approach during COVID-19. This article starts with a literature review of e-learning. Diverse subjects have appeared on the topic of e-learning, which is indicative of the dynamic and multidisciplinary nature of the field. These include analyses of the most influential authors, of models and networks for bibliometric analysis, and of progress in the most critical current research areas. A bibliometric review analyzes data from 602 studies published (2020-2021) in the Web of Science (WoS) database to fully understand this field. The data were examined using VOSviewer, CiteSpace, and KnowledgeMatrix Plus to extract networks and bibliometric indicators for keywords, authors, organizations, and countries. The study reached several conclusions within higher education. Many converging terms or sub-fields of e-learning in higher education included distance learning, interactive learning, online learning, virtual learning, computer-based learning, digital learning, and blended learning (hybrid learning). This research is mainly focused on pedagogical techniques, particularly e-learning and collaborative learning, but these are not the only trends developing in this area. The sub-fields of artificial intelligence, machine learning, and deep learning constitute new research directions for e-learning in light of COVID-19 and are suggestive of new approaches for further analysis.
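The keyword networks that tools such as VOSviewer and CiteSpace visualize start from a simple computation: counting how often pairs of keywords co-occur across article records. A minimal sketch of that step, using toy records rather than the study's actual WoS data:

```python
# Minimal keyword co-occurrence count - the raw input for bibliometric
# network maps. The records below are illustrative, not the study's data.
from collections import Counter
from itertools import combinations

records = [
    {"e-learning", "covid-19", "higher education"},
    {"e-learning", "machine learning", "covid-19"},
    {"higher education", "covid-19", "blended learning"},
]

pairs = Counter()
for keywords in records:
    # each unordered keyword pair within one record adds one co-occurrence
    for a, b in combinations(sorted(keywords), 2):
        pairs[(a, b)] += 1

# the heaviest edges in the resulting co-occurrence network
print(pairs.most_common(3))
```

Real bibliometric pipelines add thresholding (minimum occurrence counts) and normalization of edge weights before clustering the network into the sub-fields the abstract describes.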

Keywords: COVID-19; Web of Science (WoS) database; bibliometric analysis; e-learning; higher education.

Copyright © 2022 Brika, Chergui, Algamdi, Musa and Zouaghi.

Research Article

Academic student satisfaction and perceived performance in the e-learning environment during the COVID-19 pandemic: Evidence across ten countries

Contributed equally to this work with: Damijana Keržič, Jogymol Kalariparampil Alex, Roxana Pamela Balbontín Alvarado, Denilson da Silva Bezerra, Maria Cheraghi, Beata Dobrowolska, Adeniyi Francis Fagbamigbe, MoezAlIslam Ezzat Faris, Thais França, Belinka González-Fernández, Luz Maria Gonzalez-Robledo, Fany Inasius, Sujita Kumar Kar, Kornélia Lazányi, Florin Lazăr, Juan Daniel Machin-Mastromatteo, João Marôco, Bertil Pires Marques, Oliva Mejía-Rodríguez, Silvia Mariela Méndez Prado, Alpana Mishra, Cristina Mollica, Silvana Guadalupe Navarro Jiménez, Alka Obadić, Daniela Raccanello, Md Mamun Ur Rashid, Dejan Ravšelj, Nina Tomaževič, Chinaza Uleanya, Lan Umek, Giada Vicentini, Özlem Yorulmaz, Ana-Maria Zamfir, Aleksander Aristovnik


  • Published: October 20, 2021
  • https://doi.org/10.1371/journal.pone.0258807

The outbreak of the COVID-19 pandemic has dramatically shaped higher education and seen the distinct rise of e-learning as a compulsory element of the modern educational landscape. Accordingly, this study highlights the factors which have influenced how students perceive their academic performance during this emergency changeover to e-learning. The empirical analysis is performed on a sample of 10,092 higher education students from 10 countries across 4 continents, surveyed online during the pandemic’s first wave. A structural equation model revealed that the quality of e-learning was mainly derived from service quality, the teacher’s active role in the process of online education, and the overall system quality, while the students’ digital competencies and online interactions with their colleagues and teachers were slightly less important factors. The impact of e-learning quality on the students’ performance was strongly mediated by their satisfaction with e-learning. In general, the model gave quite consistent results across countries, gender, study fields, and levels of study. The findings provide a basis for policy recommendations to help decision-makers incorporate e-learning issues in the current and any similar future circumstances.

Citation: Keržič D, Alex JK, Pamela Balbontín Alvarado R, Bezerra DdS, Cheraghi M, Dobrowolska B, et al. (2021) Academic student satisfaction and perceived performance in the e-learning environment during the COVID-19 pandemic: Evidence across ten countries. PLoS ONE 16(10): e0258807. https://doi.org/10.1371/journal.pone.0258807

Editor: Dejan Dragan, Univerza v Mariboru, SLOVENIA

Received: July 21, 2021; Accepted: October 5, 2021; Published: October 20, 2021

Copyright: © 2021 Keržič et al. This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: The data presented in this study are available in Supporting Information (see S1 Dataset ).

Funding: This research and the APC were funded by the Slovenian Research Agency grant number P5-0093. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Competing interests: The authors have declared that no competing interests exist.

Introduction

COVID-19, as a global public health crisis, has been brutal on the economy, education and food security of people all around the world, regardless of national boundaries. Among the affected sectors, tertiary education suffered one of the worst disruptions: while most countries tried to keep their essential economic activities running during the lockdown periods, this did not extend to higher education institutions (HEIs), which were closed completely after the suspension of face-to-face activities in an effort to prevent the virus spreading among their students and staff and, in turn, the general population.

Nevertheless, HEIs have continued to offer education by using various digital media, e-learning platforms and video conferencing systems. The result is that e-learning has become a compulsory educational process. Many HEIs were even encountering this mode of delivery for the first time, making the transition particularly demanding for them since no time was available to organize and adapt to the new educational landscape. Both teachers and students today find themselves in a new environment, where some seem better at adapting than others. This means the quality of teaching and learning calls for special consideration. In this article, the term “e-learning” refers to all forms of delivery for teaching and learning purposes that rely on different information communication technologies (ICTs) during the COVID-19 lockdown.

To understand COVID-19’s impact on the academic sphere, especially on students’ learning effectiveness, we explored the factors influencing how students have perceived their academic performance since HEIs cancelled their onsite classes. Students’ satisfaction in e-learning environments has been studied ever since the new mode of delivery via ICT first appeared (e.g. [ 1 ]), with researchers having tried to reveal factors that shape success with the implementation of e-learning systems (e.g. [ 2 – 4 ]), yet hitherto little attention has been paid to this topic in the current pandemic context. This study thus aims to fill this gap by investigating students’ e-learning experience in this emergency shift. Therefore, the questions we address in the paper are:

  • R1: Which factors have contributed to students’ greater satisfaction with the e-learning during the COVID-19 pandemic?
  • R2: Are there any differences in the factors influencing the quality of e-learning across countries, gender, and fields of study?
  • R3: How does the students’ satisfaction with the transition to e-learning during the COVID-19 pandemic relate to their academic performance?

According to previous research and considering the new circumstances (e.g. [ 5 – 7 ]), we propose a model for explaining students’ perceived academic performance. In order to identify relevant variables positively affecting students’ performance, we use data from the multi-country research study “Impacts of the COVID-19 Pandemic on Life of Higher Education Students”, coordinated by the Faculty of Public Administration, University of Ljubljana, Slovenia [ 8 ]. Structural equation modelling (SEM) is applied to explore the causal relationships among latent concepts, measured by several observed items. Since the SEM approach has a long history of successful applications in research, especially in the social sciences [ 9 , 10 ] and also in the educational context [ 11 ], it offers a suitable statistical framework that allows us to define a conceptual model containing interrelated variables connected to e-learning’s effect on students’ performance [ 9 , 10 ].
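The core relationship the SEM tests, that satisfaction mediates the effect of e-learning quality on perceived performance, can be illustrated in miniature with ordinary least squares on synthetic data. This is a toy sketch of the mediation logic only, not the authors' actual model, data, or estimation method (a full SEM estimates measurement and structural paths jointly over latent variables):

```python
# Toy mediation illustration: quality -> satisfaction -> performance.
# Synthetic data with full mediation built in; OLS stands in for SEM.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
quality = rng.normal(size=n)                                      # X
satisfaction = 0.8 * quality + rng.normal(scale=0.5, size=n)      # mediator M
performance = 0.7 * satisfaction + rng.normal(scale=0.5, size=n)  # outcome Y

def ols(predictors, y):
    """Least-squares coefficients, with an intercept prepended."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

total = ols([quality], performance)[1]                 # c:  X -> Y
direct = ols([quality, satisfaction], performance)[1]  # c': X -> Y given M
indirect = total - direct                              # mediated portion

print(f"total={total:.2f} direct={direct:.2f} indirect={indirect:.2f}")
```

Because the data are generated with full mediation, the direct effect shrinks toward zero once satisfaction is controlled for, mirroring the paper's finding that satisfaction strongly mediates the quality-performance link.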

This study significantly contributes to understanding of students’ satisfaction and performance in the online environment. The research findings may be of interest to higher education planners, teachers, support services and students all around the world.

E-learning and the COVID-19 pandemic

According to the International Association of Universities (IAU), over 1.5 billion students and young people around the globe have been affected by the suspension of school and university classes due to the pandemic [ 12 ]. Thus, to maintain continuity in learning while working on containing the pandemic, countries have had to rely heavily on the e-learning modality, which may be defined as learning experiences with the assistance of online technologies. However, most HEIs were unprepared to effectively deal with the abrupt switch from on-site classes to online platforms, whether due to infrastructure unavailability or the lack of suitable pedagogic projects [ 13 , 14 ]. To understand the mechanism and depth of the effects of COVID-19, many research studies have been carried out across the world.

Before COVID-19, as new technologies developed, different e-learning modalities like blended learning and massive open online courses gradually spread around the world over recent decades [ 15 , 16 ]. Hence, e-learning was deeply rooted in adequate planning and instructional design based on the available theories and models. It should be noted at the outset that what has been installed at many HEIs during the pandemic cannot even be considered e-learning, but emergency remote teaching, which is not necessarily as efficient and effective as a well-established and strategically organized system [ 17 ]. Still, online platforms such as MS Teams, Moodle, Google Classroom, and Blackboard are in use all over the world. Although e-learning offers some educational continuity when it comes to academic learning, technical education has suffered doubly since the social distancing requirements have disrupted the implementation of both practical and work-based learning activities, which are critical for educational success [ 18 ].

According to Puljak et al. [ 19 ], while students have mostly been satisfied with how they have adapted to e-learning, they have missed the lectures and personal communication with their teachers. They declared that e-learning could not replace regular learning experiences; only 18.9% of students were interested in e-learning exclusively in the long run. Inadequate readiness among teachers and students to abruptly switch from face-to-face teaching to a digital platform has been reported [ 20 ].

The closure of universities and schools due to the COVID-19 pandemic has led to several adverse consequences for students, such as interrupted learning, giving students fewer opportunities to grow and develop [ 21 ]. This shift has resulted in various psychological changes among both students and teachers [ 22 ] and greatly affected their performance. The tutoring system in higher education is an established model of support, advice, and guidance for students, with the purpose of improving motivation and success and preventing drop-out. Pérez-Jorge et al. [ 23 ] studied the effectiveness of the university tutoring system during the COVID-19 pandemic. The relation between tutor and student is based on collaboration and communication, which required adapting quickly to the new situation using different communication technologies. The research focused on four different forms of tutoring: in person, by e-mail, via virtual tutoring (Hangout/Google Meet) and via WhatsApp. They pointed out that synchronous models and frequent daily communication are essential for an effective and successful tutoring system; WhatsApp, with synchronous communication by messages and video calls, was the form with which students were most satisfied and from which they gained the most.

The goal of shifting teaching and learning over to online platforms is to minimize in-person interactions to reduce the risk of acquiring COVID-19 through physical contact. The form of interaction has accordingly moved from offline to online mode: students interact with each other on online platforms, both within their close groups and in larger groups [ 24 , 25 ]. Many clinical skills are learned through direct interactions with patients and caregivers, an area that has been badly affected by the switch to e-learning platforms [ 26 – 28 ].

Student satisfaction with e-learning

Student satisfaction has been shown to be a reliable proxy for measuring the success of implementing ICT-based initiatives in e-learning environments. Scholars have documented a strong relationship between how students perceive their academic performance and how satisfied students are with their e-learning environments [ 1 , 29 – 31 ].

The literature reveals important antecedents related to students’ satisfaction with e-learning training, such as online interactions [ 32 , 33 ], computer efficiency [ 34 , 35 ], online skills [ 36 ], teacher support [ 34 , 37 , 38 ], course design [ 29 , 39 ], teacher feedback [ 40 ], quality of information and activity [ 1 ] and technical support [ 34 , 36 , 41 ]. During the COVID-19 pandemic, environmental aspects like temperature, lighting and noise have been identified as significant determinants of students’ e-learning performance [ 42 ].

Sun et al. [ 1 ] consider the effect of overall quality–as a holistic construct–on satisfaction with the e-learning system. Their research identifies several quality factors that facilitate e-learning, associated with: learners (mental health, self-efficacy and attitude of the learner), teachers/instructors (attitude and response timeliness of the teacher), technology (quality of technology and the Internet), curriculum (quality and flexibility of the curriculum), design (usefulness and complexity of the design) and environment (interactiveness and assessment diversity). This pandemic has challenged HEIs around the world since e-learning requires physical equipment such as computers, servers, learning and communication platforms, as well as software applications, operating systems and experts in the use of these technologies. Moreover, teachers must possess sufficient digital competencies if they are to use ICT effectively in the learning process.

One of the most relevant factors related to success in implementing e-learning is how online education is conducted [ 19 ]. This includes receiving timely feedback, teachers’ efforts to be organized, delivering online lectures (and recording them), adapting instructions to this learning model, and helping students follow the courses and look for feedback on their experiences. In some cases, students have not been appropriately guided to follow their courses and have been overloaded with too many assignments, while there has been general concern about the lack or loss of practical instruction, which has thus not been entirely covered in their e-learning experiences.

According to Chopra et al. [ 37 ], timely feedback and responses to students’ actions are key to effective online delivery. Another study also found a positive association between e-service and information quality and students’ satisfaction [ 43 ]. Based on interviews with teachers and students from Jordan, Almaiah et al. [ 44 ] found that it is crucial to analyse students’ and teachers’ use and adoption of systems. The critical challenges they identified included: (1) change management and students’ and teachers’ resistance, since many prefer traditional learning; (2) ICT literacy; (3) students’ self-efficacy and motivation; and (4) technical issues around systems’ accessibility, availability, usability, reliability, personalization, and service quality. Perceived ease of use might benefit students’ performance and their efficacy while using e-learning systems; it influences both system adoption and perceived usefulness, and was clearly an important aspect since many participants complained that the e-learning system implemented was neither easy to use nor flexible, which affected their experience regarding technical issues.

An Indian study reports a decline in teacher–student interaction when teaching moved across to online platforms [ 22 ]. Hence, greater autonomy is required from students, along with self-regulation and skills to learn online for effective learning [ 45 ].

Yet, students’ expertise in computer use and different learning platforms deeply influences their participation in e-learning [ 34 ]. Similarly, Wu et al. [ 35 ] emphasize the lack of adequate computer skills as an important impediment to effective online delivery. It is important to note that not only a lack of soft skills but also inadequate hardware can obstruct e-learning. The Hungarian Rectors’ Conference [ 46 ], on the basis of 42 Hungarian HEIs’ responses, reported that experiences with e-learning were generally positive. Still, the main issues involved the lack of technical preparation and equipment; in particular, many students did not have adequate equipment or Internet access. Students’ satisfaction with e-learning was also reported to be higher among students in developed countries than among their counterparts in developing ones [ 26 ]. Similarly, resource-scarce settings struggle with the unavailability of digital platforms for education, limited Internet access, poor Internet speed, the high cost of Internet and inadequate expertise to work via digital platforms [ 14 ]. The infrastructure resources in developing countries are incomparable to those in developed ones because of a lack of technological infrastructure for e-learning, such as computers, connectivity and electricity, on top of deficient skills and limited active participation of both students and teachers due to insufficient ICT literacy [ 47 ].

To strengthen e-learning, the following strategies have been suggested as useful:

  • To use a wide variety of learning strategies [ 48 ].
  • To use tools that allow students to collaboratively build knowledge, discuss, co-construct and interact with the content [ 49 ].
  • To incorporate social media in e-learning so as to provide an adequate and more engaging learning space [ 50 ].
  • To use flexible and scaffolded online resources so as to acquire new technical skills that may be useful for future working opportunities [ 51 ].
  • To provide adequate technological infrastructure and equipment for e-learning [ 26 ].

Students’ satisfaction and performance

Several comprehensive models have also been developed for studying e-learning performance. The technology acceptance model (TAM) provides an easy way to assess the effects of two beliefs–perceived usefulness and perceived ease of use–on users’ intention to utilize a certain technology, hence providing a good prediction of students’ participation and involvement in e-learning, which in turn influences their performance [ 52 ].

Rizun and Strzelecki [ 53 ] employed an extension of the TAM, which suggests that acceptance of e-learning is related to enjoyment and self-efficacy. According to DeLone and McLean [ 54 ], system usage–the degree to which an individual uses the capabilities of a given information system in terms of frequency, nature and duration of use–has a direct connection with users’ satisfaction and their online performance. By applying DeLone and McLean’s Model (D&M model) of Information Systems Success, Aldholay et al. [ 55 ] were able to show that system, service and information quality related to e-learning have significant positive effects on system usage, which thereby predicts user satisfaction and has a positive impact on performance.

Recently, Al-Fraihat et al. [ 41 ] used a multidimensional and comprehensive model and found seven types of quality factors that influence the success of e-learning systems, namely: technical system quality, information quality, service quality, education system quality, support system quality, learner quality, and instructor quality as antecedents of perceived satisfaction, perceived usefulness, use and benefits of e-learning. Moreover, Baber [ 56 ] relates students’ perception of their learning outcomes and their satisfaction to factors like students’ motivation, course structure, the instructor’s knowledge and facilitation.

Cidral et al. [ 34 ] proposed 11 different constructs of effective e-learning, among which we can mention individual skills, system requirements, and interaction-focused elements. System use and user satisfaction were shown to exert the greatest positive impact on individuals’ performance through e-learning. In a similar study, Hassanzadeh et al. [ 57 ] identified the following factors as responsible for success with e-learning: use of the system, loyalty to the system, benefits of using the system, intention to use, technical system quality, service quality, user satisfaction, goal achievement, and content and information quality.

Rashid and Yadav [ 58 ] draw attention to several critical issues that may affect the effectiveness of e-learning: students’ possibility to have access to and to afford e-learning technologies; the need for educators to be properly trained in the use of the technologies; teachers’ autonomy and trust; and the quality of the communication among higher education stakeholders. Moreover, Deloitte [ 59 ] highlights the importance of institutional support in the successful delivery of e-learning.

Constructs of the conceptual model and research hypotheses

This study proposes a conceptual model for analysing students’ perceived academic performance during the period of the COVID-19 pandemic, which forced the transition from on-site to on-line teaching and learning. In this research, we combine the theoretical results of previous studies on e-learning with the emergency changeover to various online modes of delivery in response to the pandemic lockdown. The proposed conceptual model builds on the model of students’ satisfaction with e-learning suggested by Sun et al. [ 1 ] as well as the D&M model [ 60 ], which was used to describe different information systems’ success, including the e-learning system [ 41 ]. Cidral et al. [ 34 ] studied similar key aspects of quality e-learning systems.

In the conceptual model, we propose a second-order multidimensional construct, E-learning Quality , composed of five components. Based on the literature [ 1 , 34 , 37 ], the construct connects three aspects of quality: learner, teacher and system.

Two factors associated with students’ satisfaction corresponding to the learner dimension are included in our proposed model: Home Infrastructure and Computer Skills . The rapid transition to online study meant students were relocated to a home environment where many did not enjoy suitable conditions to study, both a quiet place and digital equipment with access to (high-performance) Internet, which is indispensable for effective online study. Therefore, the latent variable Home Infrastructure covers the ICT conditions at home, i.e. having one’s own computer or access to one, the required software, a webcam, and a stable (and fast) Internet connection [ 37 ]. The greater the students’ previous knowledge and experience in using digital media, the easier the transition to e-learning has been. Computer Skills describe students’ expertise in using computers and different learning platforms, which is particularly important for active participation in the online delivery mode [ 34 , 35 ].

The teacher dimension refers to the organization of teaching in a new e-learning environment. Studies show the organization and delivery of study material is important for student satisfaction and performance. Three constructs related to teachers are defined in the model. Mode of Delivery corresponds to the different forms used in online lectures, tutorials or practical classes providing learning materials and assignments, such as videoconference, audio recording, forum or e-mail [ 57 ]. Teachers play a valuable active role in the online environment by guiding students through the learning contents and providing them with timely responses and information. Equally important are prepared assignments that encourage and motivate students to independently learn at home. Online Instruction focuses on teachers’ active role and attitude to online teaching. The construct is explained by Information Quality and two other aspects assessed in our questionnaire, namely preparing regular assignments and being open to students’ suggestions [ 34 , 41 , 61 ]. Information Quality measures teachers’ responsiveness to the students, such as timely feedback or answering questions in an e-learning environment [ 34 , 37 ]. We also propose a second-order construct System Quality , composed of learner and teacher dimensions: Home Infrastructure and Mode of Delivery .

Previous studies reveal that IT service support has a positive influence on users’ perceptions of their satisfaction with the system. As the transition to online study happened quickly and without prior training, the support of both the IT and the administrative service is vital for ensuring that students are satisfied with their new learning environment [ 34 , 37 , 41 , 57 , 61 , 62 ]. In our model, Service Quality refers to the aspect of administrative, technical and learning assistance. To compensate for the lack of social contact while studying from home, various forms of online interactions are possible. Teacher–student or student–student interactions were shown to be important factors of satisfaction with the e-learning system [ 34 , 41 , 61 ]. The construct Online Interactions describes how often a student communicates with colleagues from the course, the teachers or the administrative staff.

To summarize, E-learning Quality is a multidimensional construct of five components: Students’ Computer Skills ; System Quality , which reflects the Mode of Delivery and Home Infrastructure ; Online Instruction , assessed through Information Quality ; Online Service Quality ; and Online Interactions with colleagues, teachers and staff. We hypothesize:

  • H1: Students’ Computer Skills is correlated with Home Infrastructure .

During the COVID-19 pandemic, teaching and learning were completely implemented in the online environment and thus we include the quality dimension, which measures several important aspects of the e-learning system: system quality, information quality, service quality, learner digital quality and interaction quality. Models measuring the success of the information system (also the e-learning system) are usually based on the D&M model, where user satisfaction and the quality dimension play an important role [ 34 , 41 , 57 , 61 ]. The construct Perceived Student Satisfaction is manifested by students’ satisfaction with the organization of e-learning (i.e. lectures, tutorials, seminars, mentorships) and with the support of the teachers and the student counselling service [ 34 , 57 ]. Perceived Student Performance aims to capture students’ benefits of using an e-learning system. It measures students’ opinion of their performance and whether it has worsened with the transition to the online learning mode [ 34 , 41 , 57 ]. The proposed model’s structural part includes three constructs: E-learning Quality , Perceived Student Satisfaction and Perceived Student Performance . We may reasonably assume the quality of the e-learning system has a positive effect on satisfaction with the online education environment, leading to greater use of the system and thus to improved student performance. It is unlikely that one can perform well without using the system.

This leads to three hypotheses being proposed:

  • H2: E-learning Quality has a positive effect on Perceived Student Satisfaction .
  • H3: Perceived Student Satisfaction has a positive effect on Perceived Student Performance .
  • H4: E-learning Quality has an indirect (mediated by Perceived Student Satisfaction ) positive effect on Perceived Student Performance .
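H4 is a mediation hypothesis: in SEM, the indirect effect is conventionally estimated as the product of the two structural paths involved. As a minimal illustrative sketch (the coefficients below are made up for demonstration, not the paper's fitted estimates):

```python
def indirect_effect(beta_quality_to_satisfaction, beta_satisfaction_to_performance):
    """Product-of-paths estimate of the indirect effect (H4):
    E-learning Quality -> Perceived Student Satisfaction -> Perceived Student Performance."""
    return round(beta_quality_to_satisfaction * beta_satisfaction_to_performance, 3)

# Illustrative standardized path coefficients (hypothetical values):
# quality -> satisfaction = 0.8, satisfaction -> performance = 0.7
print(indirect_effect(0.8, 0.7))  # → 0.56
```

In practice the significance of such an indirect effect is assessed with bootstrapped confidence intervals rather than the point product alone.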

Therefore, we propose the conceptual model presented in Fig 1, with construct descriptions given in Table 1.

https://doi.org/10.1371/journal.pone.0258807.g001

https://doi.org/10.1371/journal.pone.0258807.t001

Materials and methods

Design and procedure.

The data for this study come from a very comprehensive and large-scale global student survey entitled “Impacts of the COVID-19 Pandemic on Life of Higher Education Students”, aimed at examining how students perceive the impacts of the pandemic’s first wave in early 2020 on various aspects of their lives on a global level [ 8 ]. This project was originally promoted by the Faculty of Public Administration, University of Ljubljana (Slovenia), which, thanks to the support of international partners, was able to disseminate it worldwide. The online questionnaire was adapted and extended from the European Students’ Union [ 63 ] survey. It comprised 39 questions, mainly closed-ended (see S1 Questionnaire ). It focused on socio-demographic, geographic and other aspects pertaining to the life of university students, such as academic online work and life, social life, emotional life, personal circumstances, changes in habits, the roles and measures of institutions, as well as personal reflections on COVID-19 [ 64 ]. Initially, the online questionnaire was designed in English and later translated into six different languages (Italian, North Macedonian, Portuguese, Romanian, Spanish, Turkish). The translation of the questionnaire was carried out by native speakers proficient in English. The web-based survey was launched via the open-source web application 1KA (One Click Survey; www.1ka.si ) on 5 May 2020 and remained open until 15 June 2020, that is, in a period when most nations were experiencing the onerous restrictions imposed by the lockdown. Participation in the study reached global proportions, exceeding the milestone of 30,000 responses submitted by students from more than 130 countries on six continents. The entire dataset was first analysed by Aristovnik et al. [ 8 ].

Participants

The survey was intended for all higher education students at least 18 years of age, representing the target population of this study. The sampling technique used is non-probabilistic, specifically convenience sampling through university communication systems around the world and social media. The students were informed about the details of the study and gave their informed consent before participating. Due to this study’s specific focus on academic online work and life, it only includes student data with respect to selected parts of the questionnaire. However, since the respondents were not obliged to complete the questionnaire in full, the number of respondents varied across questions. Accordingly, a complete-case-analysis approach was applied to mitigate missing data issues [ 65 ]. With the assumption of “missing completely at random”, meaning the complete cases are a random sample of the originally identified set of cases, a complete-case approach is the most common method for handling missing data in many research fields, including educational and epidemiologic research [ 66 , 67 ]. In order to assure a more robust analysis and perform reliable comparisons on the national level, this study focuses on the 10 countries (Chile, Ecuador, India, Italy, Mexico, Poland, Portugal, Romania, Slovenia, Turkey) that provided at least 500 answers with regard to different aspects of students’ academic life.

The final dataset consisted of 10,092 participants or students enrolled in HEIs, of whom 92% were attending a full-time study course. They were at least 18 years old, with a median age of 23 years (IQR [21.0, 24.0]), and about two-thirds of them (67%) being female. Most respondents (82%) were pursuing a bachelor’s degree, 16% a master’s degree, and 2% a doctoral course. Twelve percent were majoring in a study course in the Arts and Humanities, 37% in the Social Sciences, 32% in the Applied Sciences and 19% in the Natural and Life Sciences. Detailed information on the sample, i.e. the number of respondents and participants’ sociodemographic characteristics by country, is given in Table 1 .

This study primarily focuses on how COVID-19 has affected different aspects of students’ academic life. Specifically, students reported their experiences with the organization of teaching and administrative services, along with their satisfaction, expectations and perceived impacts on their university career. This involves a total of 34 survey items, representing a basis for measuring the 9 latent constructs used in our proposed conceptual model. Individual satisfaction and concern levels were measured on a 5-point Likert scale, from 1 (lowest value) to 5 (highest value) [ 68 ]. A more detailed description, including the set of measuring items and their characteristics, is found in Table 2 .

https://doi.org/10.1371/journal.pone.0258807.t002

Ethical considerations

All participants were informed about the details of the study and gave their informed consent before participating. By clicking on a button ‘next page’ participants agreed to participate in the survey. Study participation was anonymous and voluntary, and students could withdraw from the study without any consequences. For data-protection reasons, the online survey was open to people aged 18 or over and enrolled in a higher education institution. The procedures of this study comply with the provisions of the Declaration of Helsinki regarding research on human participants. Ethical Committees of several of the higher education institutions involved approved this study, such as the University of Verona (protocol number: 152951), ISPA–Instituto Universitário (Ethical Clearance Number: I/035/05/2020), University of Arkansas (IRB protocol number: 2005267431), Walter Sisulu University (Ethical Clearance Number: REC/ST01/2020) and Fiji National University (CHREC ID: 252.20).

Data analysis

We implemented the SEM using the lavaan package (v.0.6.4, [ 69 ]) in the R statistical environment (v.4.0.2, [ 70 ]). A two-step approach was followed: in the first step, we checked the fit of the measurement model for all the latent variables; in the second step, we checked the fit of the structural model. The Comparative Fit Index (CFI), Tucker-Lewis Index (TLI), Root Mean Square Error of Approximation (RMSEA) and Standardized Root Mean Square Residual (SRMR) were used as goodness-of-fit indices. The fit was deemed appropriate for CFI and TLI above .90, and for RMSEA and SRMR values below .06 and .08, respectively (e.g. [ 71 , 72 ]).
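lavaan reports these indices directly via its fitMeasures output; as a sketch of how they are defined, the standard formulas compute CFI and TLI from the chi-squares of the fitted and baseline (independence) models, and RMSEA from the fitted model's chi-square, degrees of freedom and sample size:

```python
import math

def fit_indices(chi2, df, chi2_null, df_null, n):
    """Standard SEM goodness-of-fit indices from chi-square statistics.

    chi2, df           -- fitted model chi-square and degrees of freedom
    chi2_null, df_null -- baseline (independence) model values
    n                  -- sample size
    """
    # CFI: relative reduction in noncentrality versus the baseline model
    cfi = 1 - max(chi2 - df, 0) / max(chi2_null - df_null, chi2 - df, 0)
    # TLI: compares chi-square/df ratios, penalizing model complexity
    tli = ((chi2_null / df_null) - (chi2 / df)) / ((chi2_null / df_null) - 1)
    # RMSEA: misfit per degree of freedom, scaled by sample size
    rmsea = math.sqrt(max(chi2 - df, 0) / (df * (n - 1)))
    return {"CFI": round(cfi, 3), "TLI": round(tli, 3), "RMSEA": round(rmsea, 3)}
```

SRMR is computed from the residual correlation matrix rather than the chi-square, so it is omitted from this sketch.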

We assessed the reliability of the first-order and second-order factors with McDonald’s omega ( ω ) and ω L2 , respectively, and convergent validity with Average Variance Extracted (AVE) using the semTools package (v.0.5.3, [ 73 ]). Omega and AVE values above .70 and .50 were indicative of good reliability and convergent validity, respectively [ 72 , 74 , 75 ].
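Given a factor's standardized loadings (and assuming a congeneric model with uncorrelated measurement errors), ω and AVE reduce to simple closed-form expressions; semTools computes them from the fitted lavaan object, but a minimal sketch is:

```python
def omega_ave(loadings):
    """McDonald's omega and Average Variance Extracted from standardized
    factor loadings, assuming uncorrelated measurement errors."""
    total = sum(loadings)
    error_var = sum(1 - l ** 2 for l in loadings)  # standardized residual variances
    omega = total ** 2 / (total ** 2 + error_var)  # shared variance over total variance
    ave = sum(l ** 2 for l in loadings) / len(loadings)  # mean squared loading
    return round(omega, 2), round(ave, 2)
```

For example, four indicators each loading .80 give ω = .88 and AVE = .64, both above the .70/.50 cut-offs used here.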

Invariance analysis was performed [ 72 ] by comparing the difference in the fit of a series of sequentially constrained models, from configural (Conf), intercepts (Intercpt) and loadings (Load) to means (Means) and regression coefficients (Regr). Invariance was assumed for a nonsignificant Δ χ 2 or, preferentially, a ΔCFI greater than -.01 between two sequentially constrained models.
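The ΔCFI decision rule can be sketched as follows: a CFI drop of .01 or more when adding constraints counts as evidence against invariance at that step.

```python
def invariance_flags(cfis):
    """Compare sequentially constrained models by their CFIs (configural
    model first). A step is flagged invariant when the CFI drops by less
    than .01 relative to the previous, less constrained model."""
    return [(curr - prev) > -0.01 for prev, curr in zip(cfis, cfis[1:])]

# E.g. configural .950, loadings .949, intercepts .930:
print(invariance_flags([0.950, 0.949, 0.930]))  # → [True, False]
```

This mirrors the comparisons reported later for the country, gender and area-of-study analyses.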

Preliminary analyses

Factor loadings and factor reliabilities for the first- and second-order constructs used in the model are given in Table 3 . All factor loadings for the first-order constructs were statistically significant for p < .001 and larger than the usual .50 cut-off value. Reliability, as measured by McDonald’s ω , ranged from .67 (for Online Instruction ) to .94 (for Mode of Delivery ). The second-order constructs have lower reliability values, which is explained by the reduced number of indicators in some of these constructs. For the first-order constructs, AVE ranged from .55 (for Online Interactions ) to .80 (for Mode of Delivery ). As seen from the reliability measures, the second-order constructs, especially the ones with few indicators, displayed lower AVE. Moreover, in Fig 2 , we show the path coefficients calculated for each hypothesis.

https://doi.org/10.1371/journal.pone.0258807.g002

https://doi.org/10.1371/journal.pone.0258807.t003

Model of student perceived performance

The overall model under the e-learning regime due to the COVID-19 pandemic is depicted in Fig 2 . The estimated model had a good fit to the 10,092 students from the 10 countries that provided more than 500 valid responses ( χ 2 (519) = 5213.6, p < .001, CFI = .990, TLI = .989, RMSEA = .063, SRMR = .049) with all structural paths significant at p < .001. The model explained 55% (R 2 = .55, p < .001) of the students’ perceived performance. Major determinants of E-learning Quality were Service Quality ( β = .96, p < .001) and overall System Quality ( β = .90, p < .001). Online Interactions with colleagues and teachers ( β = .34, p < .001) and the students’ Computer Skills ( β = .46, p < .001) had a lower impact on the e-learning system’s overall quality.

Country invariance

The analysis of invariance revealed configural invariance (CFI = .900, TLI = .900, RMSEA = .070, SRMR = .060) for the 10 countries. However, no weak measurement invariance (equal loadings between countries) was observed (Δ χ 2 Load (243) = 510.93, p < .001; ΔCFI Load = -.03). Thus, the proposed conceptual model was fit to the 10 participating countries individually. Table 4 summarizes the structural standardized coefficients and fit indices obtained for each country.

https://doi.org/10.1371/journal.pone.0258807.t004

Overall, the models displayed an acceptable fit for all countries (CFI and TLI greater than or equal to .850; for most countries, RMSEA and SRMR were less than or equal to .05 and .06, respectively). The model explained from 35% (India) to 59% (Portugal) of the variation in Perceived Student Performance within countries. The overall mean explained variance was 42%.

Gender and areas of study invariance

Invariance analysis of the model revealed strong metric invariance for gender according to the ΔCFI criteria (ΔCFI Load = -.001; ΔCFI Intercpt = -.001), but not for the Δ χ 2 criteria (Δ χ 2 Load (26) = 62.253; p < .001; Δ χ 2 Intercpt (26) = 79.824; p < .001). However, the inflation of χ 2 for large sample sizes is well known, so recent research has adopted different criteria, including the ΔCFI as described in the methods section. Using the ΔCFI criteria for gender, invariance was also observed for factor means (ΔCFI Means < -.001) and structural regression coefficients (ΔCFI Regr = -.001).

The model displayed strong metric invariance for the areas of study (Arts and Humanities, Social Sciences, Applied Sciences, Natural Sciences) according to the ΔCFI criteria (ΔCFI Load = -.002; ΔCFI Intercpt = -.003). Using the same criteria, invariance was also observed for factor means (ΔCFI Means = -.001) and structural regression coefficients (ΔCFI Regr < -.001).

Therefore, we conclude that the model is invariant for gender and areas of study, implying that we can apply it for both genders and all four areas of study.

Discussion

The goal of this research was to analyse which factors influenced students’ perceived academic performance after their academic activities switched over to the online mode, as imposed by the lockdown in response to COVID-19 in 2020. To this end, a global study including 62 countries was conducted. In this paper, we present the results of the 10 countries that provided more than 500 valid responses.

The study results show that the impact of computer skills on e-learning quality is less influential than other factors like system quality, which is the most determinative factor. These results are aligned with previous studies (e.g. [ 34 , 37 ]), which found that system quality is positively related to a user’s perceived satisfaction, but contrary to Al-Fraihat et al. [ 41 ], who did not detect any significant impact of system quality. Our data also show that different modes of delivery positively influenced system quality. On the other hand, even though the quality and diversity of the home infrastructure revealed some impact on system quality, it is a less determinative factor. These results suggest that students respond better to diversity in learning formats, while having suitable home infrastructure appears to matter less.

As concerns online instruction, we found that it is one of the three major determinants of e-learning quality and, therefore, of students’ perceived satisfaction and performance. Online instruction can be assessed by the construct information quality, as well as by considering other factors like the teacher’s active role and attitude to online teaching, preparation of regular assignments and openness to the students’ suggestions [ 34 , 41 , 61 ]. Information quality can be explained by teachers’ responsiveness to the students, such as timely feedback or answering questions in an e-learning environment [ 19 , 34 , 37 ].

The active role of teachers and their responsiveness and feedback seem crucial for the students’ satisfaction with the online instruction since the teacher/instructor is a key element of success with the e-learning environment [ 76 ]. Sun et al. [ 1 ] investigated the instructor’s role in the success of e-learning, focusing on two specific indicators: instructor response timeliness and instructor attitude to e-learning. They found a positive and significant relationship between these aspects and the satisfaction of students. Similar findings were outlined by Cidral et al. [ 34 ], who documented a positive relationship between instructor attitude to e-learning and user satisfaction. In addition, Al-Fraihat et al. [ 41 ] and Mtebe and Raphael [ 77 ] established a positive relationship between the instructor’s quality and students’ perceived satisfaction with an e-learning system. Moreover, the quality of information provided by the instructor/teacher has been considered to be a determinant of perceived satisfaction in previous studies that support our findings [ 29 , 37 , 41 , 43 , 78 , 79 ]. According to Al-Fraihat et al. [ 41 ], it is essential to provide students with clear, updated and sufficient information and quality content.

Regarding online service quality, we found that it was a major determinant of the students’ perceived e-learning quality. This allows us to infer that administrative, technical and learning assistance through tutors and the library is very important for students’ greater satisfaction and, in consequence, students’ higher perceived satisfaction and performance. This result is contrary to Cidral et al. [ 34 ], yet consistent with the findings of Al-Fraihat et al. [ 41 ], Hassanzadeh et al. [ 57 ] and Chopra [ 37 ], who state that providing quality services might increase the level of satisfaction, making it crucial to have personnel available to support students with their technical issues and satisfy their needs, generating positive feelings towards the e-learning system.

The construct online interactions describes how often a student communicates with colleagues from the course, the teachers or the administrative staff [ 34 , 41 ]. This factor was considered to be one of the least determinative of overall satisfaction-learning quality and, consequently, least able to explain the conceptual model of perceived student performance. It seems the new emergency remote teaching and learning scenario [ 17 ] has affected the frequency of student interactions with colleagues and teachers [ 19 , 22 ], which may explain why it is less important for perceived e-learning quality. Our results suggest these interactions are still needed for a successful student performance in an e-learning environment, although they are less determinative than other factors.

The first hypothesis (H1), concerning the influence of students’ computer skills on e-learning quality and the intercorrelation between students’ computer skills and the quality and variety of the IT infrastructure at home, was confirmed, although the correlation is only moderate. In other words, students who possess different digital media and better-quality infrastructure at home had greater digital competencies, which then favoured their perceived e-learning quality and, thus, the students’ perceived satisfaction and performance under the e-learning mode.

Taking all five dimensions of e-learning quality into consideration, the second hypothesis (H2) is also confirmed because this factor (e-learning quality) has a very strong positive effect on perceived student satisfaction. Students who are more satisfied with the quality of their e-learning experience are generally more satisfied with their education, which in turn positively influences their perceived academic performance (see H3). Students more satisfied with their online education also perform better at school. This result highlights the role of students’ satisfaction in their academic performance [60]. At the same time, we may infer that students who use the online learning mode more frequently perceive their educational performance as higher.

The last hypothesis (H4) is also confirmed. E-learning quality has an indirect (mediated by perceived student satisfaction) positive effect on perceived student performance. The overarching research question of our study is thereby confirmed: the better the quality of the e-learning system, the more satisfied students are with their academic performances.
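The mediation logic behind H4 can be made concrete with a small numeric sketch. In a simple mediation model, the indirect effect of the predictor on the outcome is the product of the two mediated paths, and the total effect adds the direct path. The coefficients below are hypothetical placeholders for illustration, not estimates from this study:

```python
# Hypothetical illustration of H4: e-learning quality -> satisfaction -> performance.
# All coefficient values here are made-up placeholders, not the study's estimates.

def mediation_effects(a, b, c_prime):
    """Decompose a simple three-variable mediation model.

    a       : path from predictor (e-learning quality) to mediator (satisfaction)
    b       : path from mediator (satisfaction) to outcome (performance)
    c_prime : direct path from predictor to outcome
    """
    indirect = a * b            # effect transmitted through the mediator
    total = c_prime + indirect  # total effect of the predictor on the outcome
    return indirect, total

# Placeholder standardized coefficients (illustrative only):
indirect, total = mediation_effects(a=0.80, b=0.50, c_prime=0.10)
print(round(indirect, 2))  # 0.4 -> quality affects performance mostly via satisfaction
print(round(total, 2))     # 0.5
```

A large indirect component relative to the direct path, as in this toy example, is what "strongly mediated by satisfaction" means in the model.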

Regarding the country comparisons (see Table 2), and bearing in mind the overall model’s lack of invariance, the results show that, irrespective of country differences, students’ perceived satisfaction is largely predicted by the quality of the school’s service and the quality of the overall system. However, it is worth discussing some of the outliers shown in Table 2.

Concerning computer skills, one can observe a significant difference between India and the other countries. This result might have been influenced by the fact that the majority of Indian students participating in the study have a technical background, pursuing Engineering or Medical Sciences; hence, their proficiency in computing is expected to be high. On the other hand, among Romanian students the impact of computer skills on e-learning quality is the lowest, which may be explained by the structure of the Romanian sample, comprising more Social Sciences students who prefer face-to-face interactions over the use of different online teaching platforms, which have increased their workload compared to the previous situation.

While examining the results for the construct system quality, we see that, although most countries show similar structural standardized coefficients, Portugal has a slightly higher coefficient than the rest of the countries. This result might be caused by the fact that Portugal had already been through a process of creating a very strong online higher education infrastructure [80], meaning the students’ transition to this modality has been quite smooth and they do not seem to perceive any significant change.

With respect to online interactions, India has a significantly higher coefficient than the corresponding values for the other countries. This result may be explained by the fact that the average university class size in India is 150–250 students, making it very difficult for the students to interact with each other or with the teachers in a personal way. In the new e-learning scenario, teachers are more available for flexible consultation time. In addition, many of the teaching strategies that lecturers are relying on encourage collaborative work. Yet, in contrast, Slovenia has the lowest coefficient for this factor, which can be attributed to the fact that, even before the pandemic outbreak, e-learning was widespread in higher education, including blended learning, and thus the students do not consider that online interactions have increased or changed due to the pandemic.

Regarding online instruction, Turkey has the lowest coefficient of the 10 countries. The high number of Turkish students per academic, which exceeds the OECD average [81], makes it difficult for academics to give individual feedback to all of their students.

Regarding gender and areas of study, the proposed model proved to be invariant for both factors, which confirms its relevance in explaining students’ perceived academic performance through the quality of the e-learning infrastructure as mediated by students’ perceived satisfaction.

Although no significant difference in the results is found by gender, the number of female participants is remarkably higher than for males in all countries. Although the causes of this result lie beyond the scope of this study, it would be worth analysing them in future research.

Conclusions

Our study has provided insights into latent factors explaining students’ perceived academic performance during the first wave of the COVID-19 pandemic, which forced the transition to online education. The results confirmed all of the hypotheses, and the proposed conceptual model was shown to be reliable.

According to the study results, the quality of e-learning during the COVID-19 pandemic’s first wave was mainly derived from service quality with administrative, technical and learning assistance through tutors and the library, teachers’ active role in the process of online education with their responsiveness and timely feedback, and overall system quality with the mode of delivery and IT infrastructure. Students’ digital competencies and online interactions with colleagues and teachers were shown to be slightly less important factors, yet still statistically significant. Moreover, our study shows that the impact of e-learning quality on student performance is strongly mediated by student satisfaction with e-learning.

Understanding the factors that influenced students’ performance after the urgent introduction of e-learning may be important for decision-makers and all those involved in implementation in any future new similar circumstances. Thus, the results of our study imply a clear strategy for education, research and policy. Investment in the development of digital competencies, of both students and academic staff, together with initiatives supporting research and interdisciplinary innovative collaboration within the scope of different aspects of higher online education, are recommended and should be encouraged.

Limitations

This study has several limitations that should be considered. First, the convenience sampling methodology limits the generalizability of the results. The calculated results are based on a sample that includes students from 10 countries, although European countries prevail. It is clear that the countries are at different levels of economic development and have differently organized and developed higher education systems. Further, no data come from low-income countries, where students might have problems with an Internet connection and access to appropriate equipment [82, 83]. In addition, to access the online questionnaire students first needed electronic devices and an Internet connection, which could cause selection bias.

Another important limitation of this study is the time in which the data were collected. Not all countries were in the same pandemic phase or lockdown period, which might impact the student responses. Therefore, our study does not give a full picture of the students’ perceived satisfaction and performance during e-learning in the time of the first wave of the pandemic.

Future work

Future research could attempt to cluster countries by their economic development level given that e-learning quality and students’ perceived satisfaction and performance with online education depend on IT technology development and IT tools’ access and affordability [ 83 ]. In the future, studies should include representative countries on all levels of development and economic growth to further test the proposed model and look for differences in the area of students’ perceived satisfaction and performance with e-learning. This may help generate evidence for policymakers to invest in developing online education infrastructure in low- and middle-income countries.

Further, although digitalization in HEIs has been confirmed as significant and essential for the higher education system’s functioning during the lockdown [ 84 ] and e-learning has offered some kind of continuity of academic education, it does not meet all of the needs for practical and work-based learning, e.g. in medical and health or technical sciences education, especially when viewed in the long run [ 85 – 87 ]. In future research, more emphasis should be placed on analysing students’ perceived satisfaction and performance with online education in the context of differences between fields of study, particularly in relation to the nature of education (theoretical vs. practical) and the competencies that are supposed to be developed during education.

Future research may also consider differences between local and international students’ perceived satisfaction and performance. According to the EMN/OECD report [ 88 ], the COVID-19 pandemic has imposed more difficult situations on international students than local students in terms of psychological and financial issues. This may well impact their academic outcomes. Such analysis could also compare the adaptation to the online education environment of students whose training is in their mother language and students for which the training is in a second language.

Finally, the survey is based on the subjective opinions of students, also with regard to their academic performance. Therefore, to objectify the results, further research analysing the relationship between students’ satisfaction with online education and their learning outcomes expressed in the form of grades may reveal interesting results. Namely, recent analyses suggest that students have been receiving higher grades during the pandemic compared to the on-site education before the pandemic, which may increase their satisfaction with e-education [82].

Supporting information

S1 Questionnaire.

https://doi.org/10.1371/journal.pone.0258807.s001

S1 Dataset.

https://doi.org/10.1371/journal.pone.0258807.s002

Acknowledgments

We wish to thank all the numerous international partners with data collection: Yusuf Alpayadin; Sorin Gabriel Anton; Roberto Burro; Silvia Cantele; Özkan Cikrikci; Michela Cortini; Manuel Gradim de Oliveira Gericota; Iusein Ervin; Stefania Fantinelli; Paulo Ferrinho; Sandeep Grover; Aleksandar Kešeljević; Poliana Leru; Piotr Major; João Matias; Marek Milosz; Andronic Octavian; Izabela Ostoj; Justyna Podgórska-Bednarz; Vijayalakshmi Reddy; Maya Roche; Ana Sofia Rodrigues; Piotr Rzymski; Oana Săndulescu; Rinku Sanjeev; Ganesh Kamath Sevagur; Parag Suresh Amin; Rajanikanta Swain. We would also like to thank anonymous global survey participants for their valuable insights into the lives of students, which they shared selflessly. Finally, we would like to acknowledge the CovidSocLab project ( http://www.covidsoclab.org/ ) as a working platform for collaboration.

  • 11. Khine MS. Application of structural equation modeling in educational research and practice. Rotterdam: SensePublishers; 2013.
  • 12. IUA. COVID-19: Higher education challenges and responses. International Association of Universities. 2020. Available from: https://www.iau-aiu.net/COVID-19-Higher-Education-challenges-and-responses
  • 18. OECD. Education at a Glance 2020. OECD indicators. OECD Publishing. 2020. https://doi.org/10.1787/69096873-en
  • 45. Schleicher A. The impact of COVID-19 on education: Insights from education at a glance 2020. OECD. 2020. Available from: https://www.oecd.org/education/the-impact-of-covid-19-on-education-insights-education-at-a-glance-2020.pdf
  • 46. Hungarian Rectors’ Conference. Hungarian response to COVID-19. In Regional/National Perspectives on the Impact of COVID-19 on Higher Education (pp. 25–30). International Association of Universities, 2020. Available from: https://www.iau-aiu.net/IMG/pdf/iau_covid-19_regional_perspectives_on_the_impact_of_covid-19_on_he_july_2020_.pdf
  • 47. Aung TN, Khaing SS. Challenges of implementing e-learning in developing countries: A review. In: Zin T.; Lin J.W.; Pan J.S.; Tin P.; Yokota M., editors. Genetic and evolutionary computing: Advances in intelligent systems and computing. Springer. 2016; 388, pp. 405–411. https://doi.org/10.1007/978-3-319-23207-2_41
  • 59. Deloitte. Understanding the impact of COVID-19 on higher education institutions. 2020. Available from: https://www2.deloitte.com/ie/en/pages/covid-19/articles/covid-19-on-higher-education.html
  • 63. European Students’ Union. ESU’s survey on student life during the Covid-19 pandemic. 2020. Available from: https://eua.eu/partners-news/492-esu%E2%80%99s-survey-on-student-life-during-the-Covid-19-pandemic.html
  • 70. R Core Team. R: A language and environment for statistical computing. R Foundation for Statistical Computing. 2020. Available from: https://www.R-project.org/
  • 72. Marôco J. Análise de equações estruturais: Fundamentos teóricos, software & aplicações. (2nd ed.). Report Number. 2014.
  • 73. Jorgensen TD, Pornprasertmanit S, Schoemann A, Rosseel Y. semTools: Useful tools for structural equation modeling. R package version 0.5–3. 2020. Available from: https://CRAN.R-project.org/package=semTools
  • 80. Krueger K. Reinventing learning in Portugal: An ecosystem approach report of the 2013 CoSN Delegation to Portugal. CoSN. 2013. Available from: https://cosn.org/sites/default/files/pdf/ReinventingLearning_Portugal_April14.pdf
  • 81. Eurostat. Tertiary education statistics. 2020. Available from: https://ec.europa.eu/eurostat/statistics-explained/index.php?title=Tertiary_education_statistics&oldid=507549
  • 88. EMN/OECD. Impact of COVID-19 on international students in EU and OECD member states–EMN-OECD Inform. Brussels: European Migration Network. 2020. Available from: https://ec.europa.eu/home-affairs/sites/homeaffairs/files/00_eu_inform2_students_final_en.pdf
  • Research article
  • Open access
  • Published: 01 October 2021

Adaptive e-learning environment based on learning styles and its impact on development students' engagement

  • Hassan A. El-Sabagh   ORCID: orcid.org/0000-0001-5463-5982 1 , 2  

International Journal of Educational Technology in Higher Education volume  18 , Article number:  53 ( 2021 ) Cite this article


Adaptive e-learning is viewed as a stimulus to support learning and improve student engagement, so designing appropriate adaptive e-learning environments contributes to personalizing instruction to reinforce learning outcomes. The purpose of this paper is to design an adaptive e-learning environment based on students' learning styles and study the impact of the adaptive e-learning environment on students’ engagement. This research also attempts to outline and compare the proposed adaptive e-learning environment with a conventional e-learning approach. The paper is based on mixed research methods: a development method was used in designing the adaptive e-learning environment, and a quasi-experimental research design in conducting the research experiment. The student engagement scale is used to measure the following affective and behavioral factors of engagement: skills, participation/interaction, performance, and emotional engagement. The results revealed that engagement in the experimental group was statistically significantly higher than in the control group. These experimental results imply the potential of an adaptive e-learning environment to engage students in learning. Several practical recommendations follow from this paper: how to design a base for adaptive e-learning based on learning styles and its implementation; how to increase the impact of adaptive e-learning in education; and how to raise the cost efficiency of education. The proposed adaptive e-learning approach and the results can help e-learning institutes in designing and developing more customized and adaptive e-learning environments to reinforce student engagement.

Introduction

In recent years, educational technology has advanced at a rapid rate. Once learning experiences are customized, e-learning content becomes richer and more diverse (El-Sabagh & Hamed, 2020; Yang et al., 2013). E-learning produces constructive learning outcomes, as it allows students to actively participate in learning at any time and any place (Chen et al., 2010; Lee et al., 2019). Recently, adaptive e-learning has become an approach that is widely implemented by higher education institutions. The adaptive e-learning environment (ALE) is an emerging research field that deals with a development approach to fulfil students' learning styles by adapting the learning environment within the learning management system (LMS) to change the way e-content is delivered. Adaptive e-learning is a learning process in which the content is taught or adapted based on the students' learning styles or preferences (Normadhi et al., 2019; Oxman & Wong, 2014). By offering customized content, adaptive e-learning environments improve the quality of online learning; the customized environment should be adaptable to the needs and learning styles of each student in the same course (Franzoni & Assar, 2009; Kolekar et al., 2017). Adaptive e-learning changes the level of instruction dynamically based on student learning styles and personalizes instruction to enhance or accelerate a student's success. Directing instruction to each student's strengths and content needs can minimize course dropout rates and increase student outcomes and the speed at which they are accomplished. The personalized learning approach focuses on providing an effective, customized, and efficient path of learning so that every student can participate in the learning process (Hussein & Al-Chalabi, 2020). Learning styles, on the other hand, represent an important issue in learning in the twenty-first century, with students expected to participate actively in developing self-understanding as well as engagement with their environment (Klasnja-Milicevic et al., 2011; Nuankaew et al., 2019; Truong, 2016).

In current conventional e-learning environments, instruction has traditionally followed a “one style fits all” approach, which means that all students are exposed to the same learning procedures. This type of learning does not take into account the different learning styles and preferences of students. Currently, the development of e-learning systems has accommodated and supported personalized learning, in which instruction is fitted to a student's individual needs and learning styles (Beldagli & Adiguzel, 2010; Benhamdi et al., 2017; Pashler et al., 2008). Some personalized approaches let students choose content that matches their personality (Hussein & Al-Chalabi, 2020). The delivery of course materials is an important issue in personalized learning. Moreover, designing a well-designed, effective, adaptive e-learning system represents a challenge due to the complexity of adapting to the different needs of learners (Alshammari, 2016). It is also claimed that shifting to adaptive e-learning environments can reinforce students' engagement. However, a learning environment cannot be considered adaptive if it is not flexible enough to accommodate students' learning styles (Ennouamani & Mahani, 2017).

On the other hand, while student engagement has become a central issue in learning, it is also an indicator of educational quality and of whether active learning occurs in classes (Lee et al., 2019; Nkomo et al., 2021; Robinson & Hullinger, 2008). Veiga et al. (2014) suggest that there is a need for further research on engagement because assessing students’ engagement is a predictor of learning and academic progress. It is important to clarify the distinction between causal factors such as the learning environment and outcome factors such as achievement. Accordingly, student engagement is an important research topic because it affects a student's final grade and course dropout rate (Staikopoulos et al., 2015).

The Umm Al-Qura University strategic plan, through the common first-year deanship, has focused on best practices that increase students' higher-order skills. These skills include communication skills, problem-solving skills, research skills, and creative thinking skills. Although the UQU action plan involves improving these skills through common first-year academic programs, students' learning skills need to be further encouraged and engaged (Umm Al-Qura University Agency, 2020). Based on the author's experience, the conventional methods of instruction in the "learning skills" course present the content to all students in one style, relying on understanding the content regardless of the diversity of their learning styles.

According to some studies (Alshammari & Qtaish, 2019; Lee & Kim, 2012; Shih et al., 2008; Verdú et al., 2008; Yalcinalp & Avc, 2019), little attention is paid to the needs and preferences of individual learners, and as a result, all learners are treated in the same way. More research into the impact of educational technologies on developing skills and performance among different learners is recommended. This “one-style-fits-all” approach implies that all learners are expected to use the same learning style as prescribed by the e-learning environment. Subsequently, a review of the literature revealed that an adaptive e-learning environment can affect learning outcomes, which helps fill the identified gap. In conclusion, adaptive e-learning environments rely on the learner's preferences and learning style as the basis for adaptation.

To confirm the above, the author conducted an exploratory study via an open interview that included some questions with a sample of 50 students in the learning skills department of the common first year. The questions asked about the difficulties students face when taking the "learning skills" course and their preferred way of receiving course content. Most students (88%) agreed that the way content is presented does not vary according to their individual differences and that they suffer from a lack of personalized learning compatible with their learning style. Students (82%) agreed that they lack adaptive educational content that helps them become engaged in the learning process. Accordingly, the author formulated the research problem.

This research supplements the existing body of knowledge on the subject. It is considered significant because it improves understanding of the challenges involved in designing adaptive environments based on the learning-styles parameter. Subsequently, this paper is structured as follows: the next section presents the related work cited in the literature, followed by the research methodology, then data collection, results, discussion, and finally, some conclusions and future trends.

Theoretical framework

This section provides a brief review of the literature on adaptive e-learning environments based on learning styles.

Adaptive e-learning environments based on learning styles

The employment of adaptive e-learning in higher education has been slow to evolve, and the challenges behind the slow implementation still exist. The learning management system offers the same tools to all learners, although individual learners need different materials based on their learning style and preferences (Beldagli & Adiguzel, 2010; Kolekar et al., 2017). An adaptive e-learning environment requires evaluating the learner's preferred learning style, either before course delivery (e.g., through an online quiz) or during course delivery (e.g., by tracking student reactions) (DeCapua & Marshall, 2015).

In e-learning environments, adaptation is constructed as a series of well-designed processes to fit the instructional materials. The adaptive e-learning framework attempts to match instructional content to the learners' needs and styles. According to Qazdar et al. (2015), adaptive e-learning (AEL) environments rely on constructing a model of each learner's needs, preferences, and styles. It is well recognized that such adaptive behavior can increase learners' development and performance, thus enriching the quality of the learning experience (Shi et al., 2013). The following features of adaptive e-learning environments can be identified: diversity, interactivity, adaptability, feedback, performance, and predictability. Although adaptive-framework taxonomies and characteristics relate to various elements, adaptive learning includes at least three: a model of the structure of the content to be learned with detailed learning outcomes (a content model); a method of assessing the student's expertise based on success and interpreting student strengths (a learner model); and a method of matching the instructional materials and delivering them in a customized way (an instructional model) (Ali et al., 2019). The number of adaptive e-learning studies has increased over the last few years, and adaptive e-learning is likely to grow at an accelerating pace at all levels of instruction (Hussein & Al-Chalabi, 2020; Oxman & Wong, 2014).
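The three elements named above (content model, learner model, instructional model) can be sketched minimally in code. All names, topics, and file names below are hypothetical illustrations, not taken from any cited system:

```python
# Minimal sketch of a three-element adaptive architecture (hypothetical data).

# Content model: a learning outcome mapped to material variants per style.
CONTENT_MODEL = {
    "note-taking basics": {
        "visual": "concept-map.png",
        "aural": "lecture-audio.mp3",
        "read_write": "handout.pdf",
        "kinesthetic": "practice-worksheet.pdf",
    },
}

# Learner model: what the system knows about an individual student.
learner = {"id": "student_01", "style": "visual", "mastery": {"note-taking basics": 0.3}}

def instructional_model(topic, learner):
    """Match materials to the learner's style; fall back to a default variant."""
    variants = CONTENT_MODEL[topic]
    return variants.get(learner["style"], variants["read_write"])

print(instructional_model("note-taking basics", learner))  # concept-map.png
```

A real system would update the learner model from assessment results and track delivery history, but the division of responsibilities stays the same.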

Many studies have affirmed the power of adaptive e-learning in delivering e-content to learners in a way that fits their needs and learning styles, which helps improve students' acquisition of knowledge and experience and develops their higher-order thinking skills (Ali et al., 2019; Behaz & Djoudi, 2012; Chun-Hui et al., 2017; Daines et al., 2016; Dominic et al., 2015; Mahnane et al., 2013; Vassileva, 2012). Learning style is recognized as an important student characteristic and a vital influence in learning, and it is frequently used as a foundation to generate personalized learning experiences (Alshammari & Qtaish, 2019; El-Sabagh & Hamed, 2020; Hussein & Al-Chalabi, 2020; Klasnja-Milicevic et al., 2011; Normadhi et al., 2019; Ozyurt & Ozyurt, 2015).

Learning style is a key parameter in designing adaptive e-learning environments. Individuals differ in their learning styles when interacting with the content presented to them, and many studies have emphasized the relationship between e-learning and learning styles in keeping learners motivated, consequently improving learning outcomes (Ali et al., 2019; Alshammari, 2016; Alzain et al., 2018a, b; Liang, 2012; Mahnane et al., 2013; Nainie et al., 2010; Velázquez & Assar, 2009). The term "learning style" refers to the process by which the learner organizes, processes, represents, and combines information and stores it in his or her cognitive repertoire, then retrieves information and experiences in a style that reflects his or her way of communicating them (Fleming & Baume, 2006; Jaleel & Thomas, 2019; Jonassen & Grabowski, 2012; Klasnja-Milicevic et al., 2011; Nuankaew et al., 2019; Pashler et al., 2008; Willingham et al., 2015; Zhang, 2017). The concept of learning style is founded on the fact that students vary in their styles of receiving knowledge and thought, which helps them recognize and combine information in their minds, as well as acquire experiences and skills (Naqeeb, 2011). The extensive scholarly literature on learning styles contains few strong experimental findings (Truong, 2016), and few findings on the effect of adapting instruction to learning style. There are many models of learning styles (Aldosarim et al., 2018; Alzain et al., 2018a, 2018b; Cletus & Eneluwe, 2020; Franzoni & Assar, 2009; Willingham et al., 2015), including the VARK model, which is one of the most well-known models used to classify learning styles. The VARK questionnaire offers better insight into information-processing preferences (Johnson, 2009). Fleming and Baume (2006) developed the VARK model, which consists of four preferred learning types: the letter "V" stands for the visual style, "A" for the auditory style, "R/W" (read/write) for the reading/writing style, and "K" for the kinesthetic, or practical, style. Moreover, VARK distinguishes the visual category further into graphical and textual, or visual and read/write, learners (Murphy et al., 2004; Leung et al., 2014; Willingham et al., 2015). The four categories of the VARK Learning Style Inventory are shown in Fig. 1 below.

Figure 1: VARK learning styles

According to the VARK model, learners are classified into four groups representing basic learning styles based on their responses to 16 questions; each question has four potential responses, and each answer corresponds to one of the extremes of the dimension (Hussain, 2017; Silva, 2020; Zhang, 2017), which supports instructors who use it to create effective courses for students. According to Fleming and Baume (2006), visual learners prefer to receive instructional materials and submit assignments using tools such as maps, graphs, images, and other symbols. Read–write learners prefer written textual learning materials; they use glossaries, handouts, textbooks, and lecture notes. Aural learners, on the other hand, prefer to learn through spoken materials, dialogue, lectures, and discussions. Direct practice and learning by doing are preferred by kinesthetic learners (Becker et al., 2007; Fleming & Baume, 2006; Willingham et al., 2015). As a result, this research work aims to provide a comprehensive discussion of how these individual parameters can be applied in adaptive e-learning environment practices. Dominic et al. (2015) presented a framework for an adaptive educational system that personalized learning content based on student learning styles (the Felder-Silverman learning model) and other factors such as the learner's competency level in the learning subject. This framework allowed students to follow their own adaptive learning content paths based on filling in the ILS questionnaire, additionally providing a customized framework that can automatically respond to students' learning styles and suggest online activities with complete personalization. Similarly, El Bachari et al. (2011) attempted to determine a student's unique learning style and then adapt instruction to that individual's interests.
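The questionnaire logic described above can be sketched as a simple scoring routine. This is a hypothetical simplification (the real VARK instrument also recognizes multimodal profiles and allows multiple selections per question); the answer labels below are invented for illustration:

```python
# Hypothetical scoring sketch for a VARK-style questionnaire: each of the
# 16 answers maps to one style letter; the dominant style is the most frequent.
from collections import Counter

def classify_vark(answers):
    """answers: list of 16 labels drawn from {'V', 'A', 'R', 'K'}.

    Returns the dominant style letter and the per-style counts.
    """
    counts = Counter(answers)
    style, _ = counts.most_common(1)[0]
    return style, dict(counts)

# Example responses (made up): mostly kinesthetic choices.
answers = ["K", "V", "K", "A", "K", "R", "K", "K",
           "V", "K", "A", "K", "R", "K", "V", "K"]
style, counts = classify_vark(answers)
print(style)   # K
print(counts)  # e.g. {'K': 9, 'V': 3, 'A': 2, 'R': 2}
```

An adaptive environment would then use the returned style to pick the matching content variant for the learner.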
Adaptive e-learning focused on learner experience and learning style has a higher degree of perceived usability than a non-adaptive e-learning system, according to Alshammari et al. ( 2015 ). This can also improve learners' satisfaction, engagement, and motivation, thus improving their learning.

According to the findings of (Akbulut & Cardak, 2012; Alshammari & Qtaish, 2019; Alzain et al., 2018a, b; Shi et al., 2013; Truong, 2016), adaptation based on a combination of learning style and information level yields significantly better learning gains. Researchers have recently begun to focus on how to personalize e-learning experiences using personal characteristics such as the student's preferred learning style. Personal learning challenges are addressed by adaptive learning programs, which provide learners with courses fitted to their specific needs, such as their learning styles.

  • Student engagement

Previous research has emphasized that student participation is a key factor in overcoming academic problems such as poor academic performance, isolation, and high dropout rates (Fredricks et al., 2004). Student participation is vital to student learning, especially in an online environment where students may feel isolated and disconnected (Dixson, 2015). Student engagement is the degree to which students consciously engage with a course's materials, other students, and the instructor. Student engagement is significant for keeping students involved in the course and, as a result, in their learning (Barkley & Major, 2020; Lee et al., 2019; Rogers-Stacy et al., 2017). Extensive research has been conducted to investigate the degree of student engagement in web-based learning systems and traditional education systems, for instance by using a variety of methods and input features to test the relationship between student data and student participation (Hussain et al., 2018). Guo et al. (2014) examined students' participation while they watched videos; the input characteristics of the study were based on how long students watched and how often they responded to the assessment.

Atherton et al. (2017) found a correlation between the use of course materials and student performance; greater use of course content is expected to lead to better grades. Pardo et al. (2016) found that students' interaction with interactive learning activities has a significant impact on their test scores. According to previous research, course results are positively correlated with student participation. For example, Atherton et al. (2017) explained that students who regularly accessed learning materials online obtained higher test scores and passed exams. Other studies have shown that students with higher levels of participation in questionnaires and course activities tend to perform well (Mutahi et al., 2017).

Skills, emotion, participation, and performance, according to Dixson (2015), are the factors of online learning engagement. Skills engagement includes behaviors such as practicing on a daily basis, paying attention while listening and reading, and taking notes. Emotion refers to how learners feel about learning, such as how much they want to learn. Participation refers to how learners act in class, such as in chat, discussion, or conversation. Performance is a result, such as a good grade or a good test score. In general, engagement indicates that students spend time and energy on learning materials and skills, interact constructively with others in the classroom, and participate emotionally in learning in at least one way (that is, they are motivated by an idea and willing to learn and interact). Student engagement is produced through personal attitudes, thoughts, and behaviors, through communication with others, and through a certain level of thought, effort, and feeling while studying. Therefore, the student engagement scale attempts to measure what students are doing (thinking actively), how they relate to their learning, and how they relate to content, faculty members, and other learners, covering the factors shown in Fig. 2 (skills, participation/interaction, performance, and emotions). Hence, previous research has moved beyond comparing online and face-to-face classes to investigating ways to improve online learning (Dixson, 2015; Gaytan & McEwen, 2007; Lévy & Wakabayashi, 2008; Mutahi et al., 2017). Learning effort, involvement in activities, interaction, and learning satisfaction, according to reviews of previous research on student engagement, are significant measures of student engagement in learning environments (Dixson, 2015; Evans et al., 2017; Lee et al., 2019; Mutahi et al., 2017; Rogers-Stacy et al., 2017).
These results point to several features of e-learning environments that can be used as measures of student participation. Successful and engaged online learners learn actively, have the psychological inspiration to learn, make good use of prior experience, and make successful use of online technology. Furthermore, they have excellent communication abilities and are adept at both cooperative and self-directed learning (Dixson, 2015 ; Hong, 2009 ; Nkomo et al., 2021 ).

figure 2

Engagement factors

Overview of designing the adaptive e-learning environment

To answer the first research question, the paper follows the ADDIE instructional design model: analysis, design, development, implementation, and evaluation. The adaptive learning environment offers an interactive, decentralized media environment that takes into account individual differences among students. Moreover, the environment can spread the culture of self-learning, attract students, and increase their engagement in learning.

Any learning environment intended to accomplish a specific goal should be consistent, to increase students' motivation to learn, so that students receive content that is personalized to their specific requirements rather than one-size-fits-all content. As a result, a set of instructional design standards for designing an adaptive e-learning framework based on learning styles was developed according to the following diagram (Fig. 3).

figure 3

The ID (model) of the adaptive e-learning environment

According to the previous figure, the analysis phase included identifying the course materials and learning tools (syllabus and course plan modules) used for the study. The learning objectives were drawn from the high-level learning objectives (C4–C6: analysis, synthesis, evaluation).

The design phase included writing SMART objectives; the learning materials were written within the module plans. To support adaptive learning, four content paths were identified, along with the learning models, processes, and evaluation. Course structure and navigation were planned, and the adaptive structural design identified the relationships between the different components, such as introduction units, learning materials, and quizzes, and determined the materials for the four paths. The course instructional materials were identified according to the following figure (Fig. 4).

figure 4

Adaptive e-course design

The development phase included preparing and selecting the media for the e-course according to each content path in the adaptive e-learning environment. During this process, the author completed the storyboard and selected the media to be included on each page of the storyboard. A category of instructional media was developed for each path (Fig. 5).

figure 5

Roles and deployment diagram of the adaptive e-learning environment

The author developed a learning styles questionnaire delivered via a mobile app ( https://play.google.com/store/apps/details?id=com.pointability.vark ). The students then accessed the adaptive e-course modules based on their learning styles.

The implementation phase involved the professional validation of the course instructional materials. Expert validation was used to evaluate the consistency of the course materials (syllabi and modules), covering student learning activities, learning implementation capability, and student reactions to the modules. The learner's behaviors, errors, navigation, and learning process were continuously used to improve the learner's modules based on the data gathered about the learner.

The evaluation phase included five e-learning specialists who reviewed the adaptive e-learning environment, after which the framework was revised based on their recommendations and feedback. The evaluation covered content assessment and media evaluation in three forms: instructional design, interface design, and usage design. Learner validation of the proposed framework was divided into two sections. In pilot testing, the proposed environment was tested by ten learners who represented the sample in the first phase; each learner's behavior was observed, questions were answered, and learning control, media access, and time spent learning were all verified.

Research methodology

Research purpose and questions

This research aims to investigate the impact of designing an adaptive e-learning environment on the development of students' engagement. The research conceptual framework is illustrated in Fig. 6. The main research question is "What is the impact of an adaptive e-learning environment based on (VARK) learning styles on developing students' engagement?" Accordingly, there are two sub-questions: (a) "What is the instructional design of the adaptive e-learning environment?" and (b) "What is the impact of adaptive e-learning based on (VARK) learning styles on developing students' engagement (skills, participation, performance, emotional) in comparison with conventional e-learning?"

figure 6

The conceptual framework (model) of the research questions

Research hypotheses

The research aims to verify the validity of the following hypothesis:

There is no statistically significant difference between the mean scores of the students in the experimental group, who were exposed to the adaptive e-learning environment, and the scores of the control group, who were exposed to the conventional e-learning environment, in the pre-application of the students' engagement scale.

There is a statistically significant difference at the (0.05) level between the mean scores of the students in the experimental group (adaptive e-learning) and the scores of the control group (conventional e-learning) in the post-application of the students' engagement factors, in favor of the experimental group.

Research design

This research used a quasi-experimental pretest–posttest design. The independent and dependent research variables are shown in the following Fig. 7.

figure 7

Research "Experimental" design

Both groups were informed of the learning activity tracks. The experimental group was instructed to use the adaptive learning environment to accomplish the learning goals; the control group, on the other hand, was exposed to the conventional e-learning environment without the adaptive e-learning parameters.

Research participants

The population of the study consisted of students aged 17–18 years studying the "learning skills" course in the common first-year deanship. All participants were enrolled in the first term of the 2019–2020 academic year and were taught by the same instructors. The research sample included two classes (118 students) selected randomly from the learning skills department. One class was randomly assigned as the control group (N = 58; 31 males and 27 females), and the other was assigned as the experimental group (N = 60; 36 males and 24 females). The following Table 1 shows the distribution of the student sample ("demographic data").

The instructional materials had not been presented to the students before. The control group attended the conventional e-learning class, where they were provided with the learning environment for the "learning skills" course without the adaptive e-learning parameter based on learning styles. The experimental group used adaptive e-learning based on learning styles to learn the same course instructional materials within the e-course. Moreover, all student participants were required to read the guidelines and indicate their readiness to participate in the research experiment by giving their permission.

Research instruments

In this research, the measuring tools included the VARK questionnaire and the students' engagement scale, which covers the following factors: skills, participation/interaction, performance, and emotional. The pre-post scale was designed to assess the level of student engagement in the "learning skills" course before and after participating in the experiment.

VARK questionnaire

Questionnaires are a common method for collecting data in education research (McMillan & Schumacher, 2006). The VARK questionnaire was organized electronically and distributed to the students through the developed mobile app and registered on the UQU system. The questionnaire consisted of 16 multiple-choice items classified into four main factors (kinesthetic, auditory, visual, and R/W).

Reliability and validity of the VARK questionnaire

For reliability analysis, Cronbach's alpha was used to evaluate internal consistency. Internal consistency was calculated through the correlation of each item with the factor to which it belongs and the correlations among the other factors. Values of 0.70 and above are normally recognized as high reliability (Hinton et al., 2014). The Cronbach's alpha coefficient for the VARK questionnaire was 0.83, indicating that the questionnaire was reliable and suitable for further research.
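The internal-consistency statistic used here can be computed directly from its definition, alpha = k/(k-1) × (1 − Σ item variances / variance of totals). The item scores below are invented toy data for illustration, not the study's actual responses.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (rows = items)."""
    k = len(items)
    # Total score per respondent across all items
    totals = [sum(resp) for resp in zip(*items)]
    item_var = sum(pvariance(col) for col in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Toy data: 4 items answered by 6 respondents on a 5-point scale
items = [
    [3, 4, 3, 5, 4, 2],
    [3, 5, 3, 4, 4, 2],
    [2, 4, 4, 5, 3, 2],
    [3, 4, 3, 5, 5, 1],
]
print(round(cronbach_alpha(items), 2))  # 0.92
```

Items that rise and fall together across respondents inflate the variance of the totals relative to the sum of item variances, which is what pushes alpha toward 1.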

Students' engagement scale

The engagement scale was developed after a review of the literature on student engagement. The Dixson scale was used to measure student engagement; it consists of four major factors (skills, participation/interaction, performance, emotional). The author adapted the original Dixson scale in the following steps. The Dixson scale, which consisted of 48 statements, was translated into Arabic by the author. After consulting with experts, the instrument was reduced to 27 items, adapted to the university learning environment. The scale is rated on a 5-point scale.

The final version of the engagement scale comprised four factors. The skills engagement factor included ten items measuring keeping up with and reading instructional materials and exerting effort. The participation/interaction engagement factor involved five items measuring having fun and regularly engaging in group discussion. The performance engagement factor included five items measuring test performance and receiving a successful score. The emotional engagement factor involved seven items measuring whether or not the course was interesting. Students could access the engagement scale from the following link: http://bit.ly/2PXGvvD . Consequently, the objective of the scale is to measure common first-year students' possession of the basic engagement factors before and after instruction with adaptive e-learning compared with conventional e-learning.

Reliability and validity of the engagement scale

The alpha coefficients of the scale factor scores were computed. All four subscales showed a strong degree of internal consistency (0.80–0.87), indicating strong reliability. The overall reliability of the instruments used in this study was calculated using Cronbach's alpha, with a value of 0.81, meaning that the instruments were reliable. The scale was applied to a pilot sample of 20 students who were not part of the experimental sample, and the correlation coefficients ranged from 0.74 to 0.82, indicating a degree of validity that permits the instrument's use. The instruments used in this research therefore demonstrated strong validity and reliability, allowing for an accurate assessment of students' engagement in learning. Table 2 shows the correlation coefficients and Cronbach's alpha for the engagement scale.

To verify content validity, the scale was presented to specialists to obtain their views on the clarity of the linguistic formulation and its suitability for measuring students' engagement, and to suggest whatever modifications they deemed appropriate.

Research procedures

To test the homogeneity and equivalence of the two groups, the validity of the first hypothesis was examined, which stated: "There is no statistically significant difference between the mean scores of the students in the experimental group, who were exposed to the adaptive e-learning environment, and the scores of the control group, who were exposed to the conventional e-learning environment, in the pre-application of the students' engagement scale." The author applied the engagement scale to both groups beforehand, and the pre-application scores were examined to verify the equivalence of the two groups (experimental and control) in terms of students' engagement.

An independent-samples t-test was calculated on the engagement scale to confirm the homogeneity of the two classes before the experiment. The t-values were not significant at the 0.05 significance level, meaning that the two groups were homogeneous in terms of the students' engagement scale before the experiment.

Since there was no significant difference between the mean scores of the two groups (p > 0.05), the findings presented in Table 3 showed no significant difference between the experimental and control groups in engagement as a whole or in each student engagement factor separately. The findings showed that the two classes were similar before the start of the research experiment.
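The homogeneity check above rests on an independent-samples t statistic, which can be sketched from first principles. The scores below are hypothetical (the actual groups had 58 and 60 students); a |t| below the critical value for the given degrees of freedom indicates the groups did not differ significantly.

```python
import math
from statistics import mean, variance

def independent_t(a, b):
    """Student's independent-samples t statistic with pooled variance."""
    na, nb = len(a), len(b)
    # Pooled variance weights each group's sample variance by its df
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    t = (mean(a) - mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2  # t statistic, degrees of freedom

# Hypothetical pre-test engagement scores for two small groups
control = [70, 72, 68, 75, 71]
experimental = [69, 73, 70, 74, 70]
t, df = independent_t(experimental, control)
print(round(t, 3), df)  # small |t| relative to the critical value -> homogeneous
```

With equal group means, as in this toy example, t is exactly 0, the clearest case of "no pre-existing difference".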

Learner content path in adaptive e-learning environment

The previously well-designed processes are the foundation for adaptation in e-learning environments. Entries for accommodating materials were identified, including classification by learning style: kinesthetic, auditory, visual, and R/W. The present study covered the first semester of the 2019/2020 academic year. The course was divided into modules that concentrated on various topics; eleven of the modules included the adaptive learning exercise. The exercises and quizzes were assigned to specific textbook modules. To reduce irrelevant variation, all course objects covered the same content, had the same learning outcomes, and were taught by the same instructor.

The experimental group, in which students were asked to bring smartphones, was shown how to download the adaptive learning application. A special account was created for each student, followed by access to the channel designed through the application, and the students were provided with instructions and training on entering the application with the appropriate default element of the developed learning objects. The control group, meanwhile, used the variety of instructional materials in the same course.

In this adaptive e-course, students in the experimental group were presented with a questionnaire and asked to answer its questions via the developed mobile app. They were provided with four choices per question and were allowed to answer freely. The correct answer appears in the students' response results, but the learning module is marked as incomplete. If a student chooses to respond to a question, the correct answer is displayed immediately, regardless of the student's response.

Figure 8 illustrates a visual example of learning style identification through responding to the VARK questionnaire. The learning process experienced by the students in this adaptive learning environment is shown in Fig. 4. Students opened the adaptive course link by tapping the app " https://play.google.com/store/apps/details?id=com.pointability.vark ", which displayed the appropriate positioning of both the learning skills course and the students' current status. It directed students to the learning skills they were interested in learning more about. Once students reached a specific situation in the e-learning environment, they could access relevant digital instructional materials. Students were then able to progress through the various styles offered by the proposed method, giving them greater flexibility in their learning pace.

figure 8

Visual example of learning style identification and the adaptive e-learning course process

The flowchart diagram below illustrates the learner's path in the adaptive e-learning environment, depending on the (VARK) learning styles (visual, auditory, kinesthetic, reading/writing) (Fig. 9).

figure 9

Student learning path

According to the previous design model of the adaptive framework, the students responded to the learning styles questionnaire. Based on each student's results, students were directed to one of the "Visual", "Aural", "Read-Write", or "Kinesthetic" paths. At the beginning, when ready, each student also responded to the engagement scale online at their own pace.

Based on the results, the system produced an individualized learning plan to fill in the gaps identified by the VARK questionnaire's initial results. The learner model represents important learner characteristics such as personal information, knowledge level, and learning preferences. Pre and post measurements were performed for both the experimental and control groups; only the experimental group was exposed to the treatment (using the adaptive learning environment).

The second question states: "What is the impact of adaptive e-learning based on (VARK) learning styles on developing students' engagement (skills, participation/interaction, performance, emotional) in comparison with conventional e-learning?"

To address it, the validity of the second research hypothesis was tested, which states: "There is a statistically significant difference at the (0.05) level between the mean scores of the students in the experimental group (adaptive e-learning) and the scores of the control group (conventional e-learning) in the post-application of the students' engagement factors, in favor of the experimental group." To test the hypothesis, the arithmetic means, standard deviations, and t-test values were calculated for the results of the two research groups on the engagement scale factors.

Table 4 indicates that students in the experimental group had significantly higher mean post-test engagement scores (engagement factor items) than students in the control group (p < 0.05).

The experiment was performed to evaluate the impact of the proposed adaptive e-learning environment. Independent-samples t-tests were used to measure the two groups' prior behavioral engagement related to the topic of this research. The findings subsequently indicated that the experimental group students achieved higher learning achievement than those taught using the conventional e-learning approach.

To estimate the effect size of the independent variable on the dependent variable, Cohen's d was used to investigate whether adaptive learning can significantly improve students' engagement. According to Cohen (1992), an effect size (ES) of 0.20 is small, 0.50 is medium, and 0.80 is large. In the post-test of the student engagement scale, the effect size between the students' scores in the experimental and control groups was calculated (d and r) using the means and standard deviations. According to the findings, Cohen's d = 0.826 and effect-size r = 0.401. An ES of 0.826 means that the mean of the treated group is at about the 79th percentile of the control group (a large effect). Effect sizes can also be described as the average percentile rank of the average treated learner compared to the average untreated learner: an ES of 0.0 means the mean of the treated group is at the 50th percentile of the untreated group, while an ES of 0.8 places it at the 79th percentile. Given that effect size is a significant factor in determining the strength of the research, the results showed that the dependent variable was strongly influenced in the four behavioral engagement factors: skills, performance, participation/interaction, and emotional.
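The effect-size arithmetic reported above can be reproduced in outline. The group means and standard deviations below are hypothetical placeholders chosen to land near the reported d ≈ 0.826 (the paper does not give the raw summaries, and its r = 0.401 was likely computed with an unequal-n formula); the sketch uses the common equal-n pooled-SD definition of d and the conversion r = d/√(d² + 4).

```python
import math

def cohens_d(m1, s1, m2, s2):
    """Cohen's d from group means and SDs (equal-n pooled SD)."""
    s_pooled = math.sqrt((s1 ** 2 + s2 ** 2) / 2)
    return (m1 - m2) / s_pooled

def d_to_r(d):
    """Convert d to effect-size r, assuming equal group sizes."""
    return d / math.sqrt(d ** 2 + 4)

# Hypothetical post-test summaries: experimental vs. control group
d = cohens_d(m1=82.0, s1=9.5, m2=74.0, s2=9.9)
print(round(d, 2), round(d_to_r(d), 2))  # 0.82 0.38
```

Under this equal-n conversion, d ≈ 0.82 maps to r ≈ 0.38, slightly below the paper's 0.401, which is consistent with a different (unequal-n) pooling having been used in the study.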

Discussions and limitations

This section discusses the impact of the adaptive e-learning environment on the development of student engagement. This paper aimed to design an adaptive e-learning environment based on learning style parameters. The findings revealed four factors correlated with student engagement in e-learning: skills, participation/interaction, performance, and emotional. The engagement factors are significant because they affect learning outcomes (Nkomo et al., 2021). Every factor's items correspond to cognitive process-related activities. The participation/interaction factor, for example, referred to interactions with the content, peers, and instructors. As a result, student engagement in e-learning can be predicted by interactions with content, peers, and instructors. The results are in line with previous research, which found that customized learning materials are important for increasing students' engagement. Adaptive e-learning based on learning styles places a strong emphasis on behavioral engagement, in which students manage their learning while actively participating in online classes, with instruction adapted to each learning style; this leads to improved learning outcomes (Al-Chalabi & Hussein, 2020; Chun-Hui et al., 2017; Hussein & Al-Chalabi, 2020; Pashler et al., 2008). The experimental findings of this research showed that students who learned through adaptive e-learning based on learning styles learned more, as learning styles are treated in this research as one of the generally assumed concerns and a reference for adapting the e-content path. Students in the experimental group reported that the adaptive e-learning environment was very interesting and able to attract their attention. These students also indicated that the adaptive e-learning environment was particularly useful because it provided opportunities for them to recall the learning content, thus enhancing their overall learning impression.
This may explain why students in the experimental group performed well in class and showed more enthusiasm than students in the control group. This research compared an adaptive e-learning environment to a conventional e-learning approach with respect to engagement in a learning skills course, through instructional content delivery and assessment. It can also be noticed that the experimental group had higher participation than the control group, indicating that the BB activities were better adapted to the students' learning styles. Previous studies have agreed on the effectiveness of adaptive learning; it provides students with quality opportunities adapted to their learning styles and preferences (Alshammari, 2016; Hussein & Al-Chalabi, 2020; Roy & Roy, 2011; Surjono, 2014). However, it should be noted that this study is restricted to one aspect of content adaptation and its factors, namely adapting learning materials based on learning styles; other considerations include content-dependent adaptation. These findings are consistent with other studies, such as (Alshammari & Qtaish, 2019; Chun-Hui et al., 2017), which have revealed the effectiveness of the adaptive e-learning environment. This research differs from others in that it reflects on Umm Al-Qura University as a case study, the VARK learning style selection, the engagement factors, and the closed learning management framework (BB).

The findings of the study revealed that adaptive content has a positive impact on individuals' achievement and student engagement when adapted to their learning styles (kinesthetic, auditory, visual, read/write). Several factors have contributed to this: the design of the adaptive e-content for learning skills depended on introducing an ideal learning environment for learners and providing support for adapting learning to each learning style, encouraging learners to learn directly, supporting knowledge building, and making the learning process enjoyable. Ali et al. (2019) confirmed this, indicating that education should be adapted to each individual's learning style, needs, and characteristics. The adaptive e-content design allows different learners to engage with knowledge by presenting information and skills in a logical sequence based on the adaptive e-learning framework, taking into account its capabilities as well as the diversity of its sources across the web; this is consistent with the findings of (Alshammari & Qtaish, 2019).

Accordingly, the previous results are due to the following: the good design of the adaptive e-learning environment in light of learning styles and educational preferences, according to its instructional design (ID) standards; the provision of adaptive content that suits the learners' needs, characteristics, and learning style; the diversity of course content elements (texts, static images, animations, and video); the variety of tests and activities; the diversity of methods of reinforcement, feedback, and support from the instructor and peers according to the learning style; and the environment's ease of use, its multiple and varied learning sources, and the ability to return to the same point after leaving the environment.

Several studies have shown that using adaptive e-learning technologies allows students to improve their learning knowledge and further enhance their engagement in terms of skills, performance, interaction, and emotion (Ali et al., 2019; Graf & Kinshuk, 2007; Murray & Pérez, 2015); nevertheless, Murray and Pérez (2015) revealed that adaptive learning environments have a limited impact on learning outcomes.

The limited empirical findings on the efficacy of adapting teaching to learning style are mixed. Chun-Hui et al. (2017) demonstrated that adaptive e-learning technologies can be beneficial to students' learning and development. According to these findings, adaptive e-learning can be considered a valuable method for learning because it can attract students' attention and promote their participation in educational activities (Ali et al., 2019); however, only a few recent studies have focused on how adaptive e-learning based on learning styles fits into diverse cultural programs (Benhamdi et al., 2017; Pashler et al., 2008).

The experimental results revealed that the proposed environment significantly increased students' learning achievements as compared to the conventional e-learning classroom (without adaptive technology). This means that the proposed environment's adaptation could increase students' engagement in the learning process. There is also evidence that an adaptive environment positively impacts other aspects of quality such as student engagement (Murray & Pérez, 2015 ).

Conclusions and implications

Although this field of research has stimulated much interest in recent years, there are still some unanswered questions. This study identifies and fills some research gaps by developing an active adaptive e-learning environment that has been shown to increase student engagement. The study aimed to design an adaptive e-learning environment for performing interactive learning activities in a learning skills course. The main findings revealed a significant difference in learning outcomes as well as positive results for adaptive e-learning students, indicating that it may be a helpful learning method for higher education; the study thereby also contributes to the current adaptive e-learning literature. The findings revealed that adaptive e-learning based on learning styles could help students stay engaged; consequently, adaptive e-learning based on learning styles increased student engagement significantly. According to research, each student's learning style is unique, and students prefer to use different types of instructional materials and activities. Furthermore, students' preferences have an impact on the effectiveness of learning. As a result, the most effective learning environment should adjust its output to the needs of the students. The development of high-quality instructional materials and activities that are adapted to students' learning styles will help them participate and be more motivated. In conclusion, learning styles are a good starting point for creating instructional materials based on learning theories.

This study's results have important educational implications for future research on the effect of adaptive e-learning on student engagement. First, the findings may provide data to support the development and improvement of adaptive environments used in blended learning. Second, the results emphasize the need for more quasi-experimental and descriptive research to better understand the benefits and challenges of incorporating adaptive e-learning in higher education institutions. Third, they indicate that using an adaptive model in an adaptive e-learning environment will encourage, motivate, and engage students in active learning and facilitate their knowledge construction, rather than having them take in information passively. Fourth, new research is needed to design effective environments in which adaptive learning can be used in higher education institutions to increase academic performance and motivation in the learning process. Finally, the study shows that adaptive e-learning allows students to learn individually, which improves their learning and knowledge of course content, such as extending their knowledge of learning skills course topics beyond what they could learn in a conventional e-learning classroom.

Contribution to research

The study is intended to provide empirical evidence of the effect of adaptive e-learning on student engagement factors. It also has practical implications for higher education stakeholders: it offers university faculty members learning approaches that improve student engagement, as well as a framework for designing personalized learning environments based on learning styles in various learning situations and for designing more adaptive e-learning environments.

Research implication

Students are more likely to enjoy learning when instruction matches their preferred learning styles and provides a variety of instructional materials, such as references, interactive media, videos, podcasts, storytelling, simulations, animations, problem-solving, games, and accessible educational tools, in an e-learning environment; different learning strategies can also be accommodated. The current study enables other researchers to conduct future work on the "adaptive e-learning" approach throughout the instructional process, at different phases of learning, and in various e-courses. Meanwhile, the proposed environment's positive impact on student engagement makes it attractive for future educational applications. Further research on learning styles in different university colleges could provide a foundation for designing adaptive e-courses based on students' learning styles and direct more future research on learning styles.
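To make the adaptation described above concrete, the sketch below shows one way an environment might map a learner's dominant VARK preference (Fleming & Baume, 2006) to instructional material types. It is a minimal, hypothetical illustration, not the study's actual system; the function name and the mapping are invented for this example:

```python
# Hypothetical mapping from dominant VARK style to material types;
# the style names follow the VARK model, the material lists are illustrative.
VARK_MATERIALS = {
    "visual": ["interactive media", "animation", "simulation"],
    "aural": ["podcasts", "storytelling", "recorded discussion"],
    "read_write": ["references", "articles", "written problem-solving"],
    "kinesthetic": ["games", "hands-on simulation", "lab activities"],
}

def recommend_materials(scores):
    """Pick materials for the learner's highest-scoring VARK dimension.

    `scores` maps each style to a questionnaire score (e.g. from the VARK
    instrument); ties fall back to the first style listed in the dict.
    """
    dominant = max(scores, key=scores.get)
    return dominant, VARK_MATERIALS[dominant]

style, materials = recommend_materials(
    {"visual": 9, "aural": 4, "read_write": 6, "kinesthetic": 5}
)
# `style` is the dominant dimension; `materials` lists matching resources
```

A real adaptive environment would of course combine such a rule with knowledge level, progress data, and instructor-designed content, but the core idea of conditioning material selection on a learner profile is the same.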

Implications for practice or policy:

  • Adaptive e-learning focused on learning styles would help students become more engaged.
  • Proving the efficacy of an adaptive e-learning environment via comparison with conventional e-learning.

Availability of data and materials

The author confirms that the data supporting the findings of this study are based on the research tools prepared and explained by the author and available at the links given in the research instruments sub-section. The data analysis that supports the findings is available from the corresponding author on request.

References

Akbulut, Y., & Cardak, C. (2012). Adaptive educational hypermedia accommodating learning styles: A content analysis of publications from 2000 to 2011. Computers & Education. https://doi.org/10.1016/j.compedu.2011.10.008 .


Al-Chalabi, H., & Hussein, A. (2020). Analysis & implementation of personalization parameters in the development of computer-based adaptive learning environment. SAR Journal of Science and Research, 3(1), 3–9. https://doi.org/10.18421/SAR31-01 .

Aldosari, M., Aljabaa, A., Al-Sehaibany, F., & Albarakati, S. (2018). Learning style preferences of dental students at a single institution in Riyadh, Saudi Arabia, evaluated using the VARK questionnaire. Advances in Medical Education and Practice. https://doi.org/10.2147/AMEP.S157686 .

Ali, N., Eassa, F., & Hamed, E. (2019). Personalized learning style for adaptive e-learning system. International Journal of Advanced Trends in Computer Science and Engineering, 223–230. Retrieved June 26, 2020 from http://www.warse.org/IJATCSE/static/pdf/file/ijatcse4181.12019.pdf .

Alshammari, M., & Qtaish, A. (2019). Effective adaptive e-learning systems according to learning style and knowledge level. JITE Research, 18 , 529–547. https://doi.org/10.28945/4459 .

Alshammari, M. (2016). Adaptation based on learning style and knowledge level in e-learning systems, Ph.D. thesis , University of Birmingham.  Retrieved April 18, 2019 from http://etheses.bham.ac.uk//id/eprint/6702/ .

Alshammari, M., Anane, R., & Hendley, R. (2015). Design and Usability Evaluation of Adaptive E-learning Systems based on Learner Knowledge and Learning Style. Human-Computer Interaction Conference- INTERACT , Vol. (9297), (pp. 157–186). https://doi.org/10.1007/978-3-319-22668-2_45 .

Alzain, A., Clack, S., Jwaid, A., & Ireson, G. (2018a). Adaptive education based on learning styles: Are learning style instruments precise enough. International Journal of Emerging Technologies in Learning (iJET), 13 (9), 41–52. https://doi.org/10.3991/ijet.v13i09.8554 .

Alzain, A., Clark, S., Ireson, G., & Jwaid, A. (2018b). Learning personalization based on learning style instruments. Advances in Science Technology and Engineering Systems Journal . https://doi.org/10.25046/aj030315 .

Atherton, M., Shah, M., Vazquez, J., Griffiths, Z., Jackson, B., & Burgess, C. (2017). Using learning analytics to assess student engagement and academic outcomes in open access enabling programs”. Journal of Open, Distance and e-Learning, 32 (2), 119–136.

Barkley, E., & Major, C. (2020). Student engagement techniques: A handbook for college faculty . Jossey-Bass . 10:047028191X.


Becker, K., Kehoe, J., & Tennent, B. (2007). Impact of personalized learning styles on online delivery and assessment. Campus-Wide Information Systems . https://doi.org/10.1108/10650740710742718 .

Behaz, A., & Djoudi, M. (2012). Adaptation of learning resources based on the MBTI theory of psychological types. IJCSI International Journal of Computer Science, 9 (2), 135–141.

Beldagli, B., & Adiguzel, T. (2010). Illustrating an ideal adaptive e-learning: A conceptual framework. Procedia - Social and Behavioral Sciences, 2 , 5755–5761. https://doi.org/10.1016/j.sbspro.2010.03.939 .

Benhamdi, S., Babouri, A., & Chiky, R. (2017). Personalized recommender system for e-Learning environment. Education and Information Technologies, 22 , 1455–1477. https://doi.org/10.1007/s10639-016-9504-y .

Chen, P., Lambert, A., & Guidry, K. (2010). Engaging online learners: The impact of Web-based learning technology on college student engagement. Computers & Education, 54 , 1222–1232.

Wu, C.-H., Chen, Y.-S., & Chen, T. C. (2017). An adaptive e-learning system for enhancing learning performance: Based on dynamic scaffolding theory. Eurasia Journal of Mathematics, Science and Technology Education. https://doi.org/10.12973/ejmste/81061 .

Cletus, D., & Eneluwe, D. (2020). The impact of learning style on student performance: Mediate by personality. International Journal of Education, Learning and Training. https://doi.org/10.24924/ijelt/2019.11/v4.iss2/22.47 .

Cohen, J. (1992). Statistical power analysis. Current Directions in Psychological Science., 1 (3), 98–101. https://doi.org/10.1111/1467-8721.ep10768783 .

Daines, J., Troka, T. and Santiago, J. (2016). Improving performance in trigonometry and pre-calculus by incorporating adaptive learning technology into blended models on campus. https://doi.org/10.18260/p.25624 .

DeCapua, A. & Marshall, H. (2015). Implementing a Mutually Adaptive Learning Paradigm in a Community-Based Adult ESL Literacy Class. In M. Santos & A. Whiteside (Eds.). Low Educated Second Language and Literacy Acquisition. Proceedings of the Ninth Symposium (pps. 151-171). Retrieved Nov. 14, 2020 from https://www.researchgate.net/publication/301355138_Implementing_a_Mutually_Adaptive_Learning_Paradigm_in_a_Community-Based_Adult_ESL_Literacy_Class .

Dixson, M. (2015). Measuring student engagement in the online course: The online student engagement scale (OSE). Online Learning . https://doi.org/10.24059/olj.v19i4.561 .

Dominic, M., Xavier, B., & Francis, S. (2015). A Framework to Formulate Adaptivity for Adaptive e-Learning System Using User Response Theory. International Journal of Modern Education and Computer Science, 7 , 23. https://doi.org/10.5815/ijmecs.2015.01.04 .

El Bachari, E., Abdelwahed, E. H., & El Adnani, M. (2011). E-learning personalization based on dynamic learners' preference. International Journal of Computer Science and Information Technology, 3(3), 200–216. https://doi.org/10.5121/ijcsit.2011.3314 .

El-Sabagh, H. A., & Hamed, E. (2020). The Relationship between Learning-Styles and Learning Motivation of Students at Umm Al-Qura University. Egyptian Association for Educational Computer Journal . https://doi.org/10.21608/EAEC.2020.25868.1015 ISSN-Online: 2682-2601.

Ennouamani, S., & Mahani, Z. (2017). An overview of adaptive e-learning systems. Eighth International ConfeRence on Intelligent Computing and Information Systems (ICICIS) . https://doi.org/10.1109/INTELCIS.2017.8260060 .

Evans, S., Steele, J., Robertson, S., & Dyer, D. (2017). Personalizing post titles in the online classroom: A best practice? Journal of Educators Online, 14 (2), 46–54.

Fleming, N., & Baume, D. (2006). Learning styles again: VARKing up the Right Tree! Educational Developments, 7 , 4–7.

Franzoni, A., & Assar, S. (2009). Student learning style adaptation method based on teaching strategies and electronic media. Journal of Educational Technology & Society , 12(4), 15–29. Retrieved March 21, 2020, from http://www.jstor.org/stable/jeductechsoci.12.4.15 .

Fredricks, J., Blumenfeld, P., & Paris, A. (2004). School Engagement: Potential of the Concept . State of the Evidence: Review of Educational Research. https://doi.org/10.3102/00346543074001059 .


Gaytan, J., & McEwen, M. (2007). Effective Online Instructional and Assessment Strategies. American Journal of Distance Education, 21 (3), 117–132. https://doi.org/10.1080/08923640701341653 .

Graf, S. & Kinshuk. K. (2007). Providing Adaptive Courses in Learning Management Systems with respect to Learning Styles. Proceeding of the World Conference on eLearning in Corporate. Government. Healthcare. and Higher Education (2576–2583). Association for the Advancement of Computing in Education (AACE). Retrieved January 18, 2020 from  https://www.learntechlib.org/primary/p/26739/ . ISBN 978-1-880094-63-1.

Guo, P. J., Kim, J., & Rubin, R. (2014). How video production affects student engagement: An empirical study of MOOC videos. Proceedings of the First ACM Conference on Learning @ Scale, March 2014 (pp. 41–50). https://doi.org/10.1145/2556325.2566239 .

Hinton, P. R., Brownlow, C., McMurray, I., & Cozens, B. (2014). SPSS Explained (2nd ed., pp. 339–354). Routledge Taylor & Francis Group.

Hong, S. (2009). Developing competency model of learners in distance universities. Journal of Educational Technology., 25 , 157–186.

Hussain, I. (2017). Pedagogical implications of VARK model of learning. Journal of Literature, Languages and Linguistics, 38 , 33–37.

Hussain, M., Zhu, W., Zhang, W., & Abidi, S. (2018). Student engagement predictions in an e-learning system and their impact on student course assessment scores. Computational Intelligence, and Neuroscience. https://doi.org/10.1155/2018/6347186 .

Hussein, A., & Al-Chalabi, H. (2020). Pedagogical Agents in an Adaptive E-learning System. SAR Journal of Science and Research., 3 , 24–30. https://doi.org/10.18421/SAR31-04 .

Jaleel, S., & Thomas, A. (2019). Learning styles theories and implications for teaching learning . Horizon Research Publishing. 978-1-943484-25-6.

Johnson, M. (2009). Evaluation of learning style for first-year medical students. International Journal for the Scholarship of Teaching and Learning. https://doi.org/10.20429/ijsotl.2009.030120 .

Jonassen, D. H., & Grabowski, B. L. (2012). Handbook of individual differences, learning, and instruction. Routledge . https://doi.org/10.1016/0022-4405(95)00013-C .

Klasnja-Milicevic, A., Vesin, B., Ivanovic, M., & Budimac, Z. (2011). E-Learning personalization based on hybrid recommendation strategy and learning style identification. Computers & Education, 56 (3), 885–899. https://doi.org/10.1016/j.compedu.2010.11.001 .

Kolekar, S. V., Pai, R. M., & Manohara Pai, M. M. (2017). Prediction of learner’s profile based on learning styles in adaptive e-learning system. International Journal of Emerging Technologies in Learning, 12 (6), 31–51. https://doi.org/10.3991/ijet.v12i06.6579 .

Lee, J., & Kim, D. (2012). Adaptive learning system applying Bruner's EIS theory. International Conference on Future Computer Supported Education, IERI Procedia, 2, 794–801. https://doi.org/10.1016/j.ieri.2012.06.173 .

Lee, J., Song, H.-D., & Hong, A. (2019). Exploring factors, and indicators for measuring students’ sustainable engagement in e-learning. Sustainability, 11 , 985. https://doi.org/10.3390/su11040985 .

Leung, A., McGregor, M., Sabiston, D., & Vriliotis, S. (2014). VARK learning styles and student performance in principles of Micro-vs. Macro-Economics. Journal of Economics and Economic Education Research, 15 (3), 113.

Lévy, P. & Wakabayashi, N. (2008). User's appreciation of engagement in service design: The case of food service design. Proceedings of International Service Innovation Design Conference 2008 - ISIDC08 . Busan, Korea. Retrieved October 28, 2019 from https://www.researchgate.net/publication/230584075 .

Liang, J. S. (2012). The effects of learning styles and perceptions on application of interactive learning guides for web-based. Proceedings of Australasian Association for Engineering Education Conference AAEE . Melbourne, Australia. Retrieved October 22, 2019 from https://aaee.net.au/wpcontent/uploads/2018/10/AAEE2012-Liang.-Learning_styles_and_perceptions_effects_on_interactive_learning_guide_application.pdf .

Mahnane, L., Laskri, M. T., & Trigano, P. (2013). A model of adaptive e-learning hypermedia system based on thinking and learning styles. International Journal of Multimedia and Ubiquitous Engineering, 8 (3), 339–350.

Markey, M. K. & Schmit, K, J. (2008). Relationship between learning style Preference and instructional technology usage. Proceedings of American Society for Engineering Education Annual Conference & Expodition . Pittsburgh, Pennsylvania. Retrieved March 15, 2020 from https://peer.asee.org/3173 .

McMillan, J., & Schumacher, S. (2006). Research in education: Evidence-based inquiry . Pearson.

Murphy, R., Gray, S., Straja, S., & Bogert, M. (2004). Student learning preferences and teaching implications: Educational methodologies. Journal of Dental Education, 68 (8), 859–866.

Murray, M., & Pérez, J. (2015). Informing and performing: A study comparing adaptive learning to traditional learning. Informing Science: The International Journal of an Emerging Transdiscipline, 18, 111–125. Retrieved February 4, 2021 from http://www.inform.nu/Articles/Vol18/ISJv18p111-125Murray1572.pdf .

Mutahi, J., Kinai, A., Bore, N., Diriye, A., & Weldemariam, K. (2017). Studying engagement and performance with learning technology in an African classroom. Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 148–152). Vancouver, Canada.

Nainie, Z., Siraj, S., Abuzaiad, R. A., & Shagholi, R. (2010). Hypothesized learners’ technology preferences based on learning styles dimensions. The Turkish Online Journal of Educational Technology, 9 (4), 83–93.

Naqeeb, H. (2011). Learning styles as perceived by learners of English as a foreign language in the English Language Center of the Arab American University—Jenin, Palestine. An-Najah Journal of Research, 25, 2232.

Nkomo, L. M., Daniel, B. K., & Butson, R. J. (2021). Synthesis of student engagement with digital technologies: a systematic review of the literature. International Journal of Educational Technology in Higher Education . https://doi.org/10.1186/s41239-021-00270-1 .

Normadhi, N. B., Shuib, L., Nasir, H. N. M., Bimba, A., Idris, N., & Balakrishnan, V. (2019). Identification of personal traits in adaptive learning environment: Systematic literature review. Computers & Education, 130 , 168–190. https://doi.org/10.1016/j.compedu.2018.11.005 .

Nuankaew, P., Nuankaew, W., Phanniphong, K., Imwut, S., & Bussaman, S. (2019). Students model in different learning styles of academic achievement at the University of Phayao, Thailand. International Journal of Emerging Technologies in Learning (iJET)., 14 , 133. https://doi.org/10.3991/ijet.v14i12.10352 .

Oxman, S. & Wong, W. (2014). White Paper: Adaptive Learning Systems. DV X Innovations DeVry Education Group. Retrieved December 14, 2020 from shorturl.at/hnsS8 .

Ozyurt, Ö., & Ozyurt, H. (2015). Learning style-based individualized adaptive e-learning environments: Content analysis of the articles published from 2005 to 2014. Computers in Human Behavior, 52 , 349–358. https://doi.org/10.1016/j.chb.2015.06.020 .

Pardo, A., Han, F., & Ellis, R. (2016). Exploring the relation between self-regulation, online activities, and academic performance: a case study. Proceedings of Sixth International Conference on Learning Analytics & Knowledge , (pp. 422-429). https://doi.org/10.1145/2883851.2883883 .

Pashler, H., McDaniel, M., Rohrer, D., & Bjork, R. (2008). Learning styles: concepts and evidence. Psychology Faculty Publications., 9 (3), 105–119. https://doi.org/10.1111/j.1539-6053.2009.01038.x .

Qazdar, A., Cherkaoui, C., Er-Raha, B., & Mammass, D. (2015). AeLF: Mixing adaptive learning system with learning management system. International Journal of Computer Applications., 119 , 1–8. https://doi.org/10.5120/21140-4171 .

Robinson, C., & Hullinger, H. (2008). New benchmarks in higher education: Student engagement in online learning. Journal of Education for Business, 84 , 101–109.

Rogers-Stacy, C., Weister, T., & Lauer, S. (2017). Nonverbal immediacy behaviors and online student engagement: Bringing past instructional research into the present virtual classroom. Communication Education, 66 (1), 37–53.

Roy, S., & Roy, D. (2011). Adaptive e-learning system: a review. International Journal of Computer Trends and Technology (IJCTT), 1 (1), 78–81. ISSN:2231-2803.

Shi, L., Cristea, A., Foss, J., Qudah, D., & Qaffas, A. (2013). A social personalized adaptive e-learning environment: a case study in topolor. IADIS International Journal on WWW/Internet., 11 , 13–34.

Shih, M., Feng, J., & Tsai, C. (2008). Research and trends in the field of e-learning from 2001 to 2005: A content analysis of cognitive studies in selected journals. Computers & Education, 51 (2), 955–967. https://doi.org/10.1016/j.compedu.2007.10.004 .

Silva, A. (2020). Towards a Fuzzy Questionnaire of Felder and Solomon for determining learning styles without dichotomic in the answers. Journal of Learning Styles, 13 (15), 146–166.

Staikopoulos, A., Keeffe, I., Yousuf, B., et al. (2015). Enhancing student engagement through personalized motivations. Proceedings of the IEEE 15th International Conference on Advanced Learning Technologies (pp. 340–344). Hualien, Taiwan. https://doi.org/10.1109/ICALT.2015.116 .

Surjono, H. D. (2014). The evaluation of Moodle-based adaptive e-learning system. International Journal of Information and Education Technology, 4 (1), 89–92. https://doi.org/10.7763/IJIET.2014.V4.375 .

Truong, H. (2016). Integrating learning styles and adaptive e-learning system: current developments, problems, and opportunities. Computers in Human Behavior, 55 (2016), 1185–1193. https://doi.org/10.1016/j.chb.2015.02.014 .

Umm Al-Qura University Agency for Educational Affairs (2020). Common first-year Deanship, at Umm Al-Qura University. Retrieved February 3, 2020 from https://uqu.edu.sa/en/pre-edu/70021 .

Vassileva, D. (2012). Adaptive e-learning content design and delivery based on learning style and knowledge level. Serdica Journal of Computing, 6 , 207–252.

Veiga, F., Robu, V., Appleton, J., Festas, I., & Galvao, D. (2014). Students' engagement in school: Analysis according to self-concept and grade level. Proceedings of the EDULEARN14 Conference, 7th–9th July 2014 (pp. 7476–7484). Barcelona, Spain. Available online at: http://hdl.handle.net/10451/12044 .

Velázquez, A., & Assar, S. (2009). Student learning styles adaptation method based on teaching strategies and electronic media. Educational Technology & Society, 12, 15–29.

Verdú, E., Regueras, L., & De Castro, J. (2008). An analysis of the research on adaptive Learning: The next generation of e-learning. WSEAS Transactions on Information Science and Applications, 6 (5), 859–868.

Willingham, D., Hughes, E., & Dobolyi, D. (2015). The scientific status of learning styles theories. Teaching of Psychology., 42 (3), 266–271. https://doi.org/10.1177/0098628315589505 .

Yalcinalp & Avcı. (2019). Creativity and emerging digital educational technologies: A systematic review. The Turkish Online Journal of Educational Technology, 18 (3), 25–45.

Yang, J., Huang, R., & Li, Y. (2013). Optimizing classroom environment to support technology enhanced learning. In A. Holzinger & G. Pasi (Eds.), Human-computer interaction and knowledge discovery in complex (pp. 275–284). Berlin: Springer.

Zhang, H. (2017). Accommodating different learning styles in the teaching of economics: with emphasis on fleming and mills¡¯s sensory-based learning style typology. Applied Economics and Finance, 4 (1), 72–78.


Acknowledgements

The author would like to thank the Deanship of Scientific Research at Umm Al-Qura University for the continuous support. This work was supported financially by the Deanship of Scientific Research at Umm Al-Qura University to Dr.: Hassan Abd El-Aziz El-Sabagh. (Grant Code: 18-EDU-1-01-0001).

Author information

Hassan A. El-Sabagh is an assistant professor in the E-Learning Deanship and head of the Instructional Programs Department, Umm Al-Qura University, Saudi Arabia, where he has worked since 2012. He has extensive experience in the field of e-learning and educational technologies, having served primarily at the Educational Technology Department of the Faculty of Specific Education, Mansoura University, Egypt since 1997. In 2011, he earned a Ph.D. in Educational Technology from Dresden University of Technology, Germany. He has over 14 papers published in international journals/conference proceedings, as well as serving as a peer reviewer in several international journals. His current research interests include eLearning Environments Design, Online Learning; LMS-based Interactive Tools, Augmented Reality, Design Personalized & Adaptive Learning Environments, and Digital Education, Quality & Online Courses Design, and Security issues of eLearning Environments. (E-mail: [email protected]; [email protected]).

Authors and Affiliations

E-Learning Deanship, Umm Al-Qura University, Mecca, Saudi Arabia

Hassan A. El-Sabagh

Faculty of Specific Education, Mansoura University, Mansoura, Egypt


Contributions

The author read and approved the final manuscript.

Corresponding author

Correspondence to Hassan A. El-Sabagh.

Ethics declarations

Competing interests

The author declares that there is no conflict of interest.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

El-Sabagh, H.A. Adaptive e-learning environment based on learning styles and its impact on development students' engagement. Int J Educ Technol High Educ 18 , 53 (2021). https://doi.org/10.1186/s41239-021-00289-4


Received : 24 May 2021

Accepted : 19 July 2021

Published : 01 October 2021

DOI : https://doi.org/10.1186/s41239-021-00289-4


Keywords

  • Adaptive e-Learning
  • Learning style
  • Learning impact



The Hottest Topics in Edtech in 2021


For a few years now, we’ve shared the hottest edtech trends of the year based on the topics resonating with educators at the annual ISTE conference. Although the topics themselves often don’t change much from year to year, the approach to them does. But 2020 was a year like no other, and thus new topics emerged on the list and others moved up a few notches. 

Digital citizenship, professional learning and social-emotional learning still made the list like they did the year before, but they took on new urgency as schooling moved online. Meanwhile, topics like e-sports, online learning design and creativity were new to the list. 

All these topics will be well represented at ISTELive 21 this year. The fully online conference will run for four days, June 26-30. Here’s a look at the trending topics and why they are especially important now.

1. Digital citizenship

Digital citizenship has been a hot topic for educators for nearly a decade — but it has quickly evolved in the past two years — especially in the past year as remote and hybrid learning has shifted learning online. 

In the beginning, digital citizenship was focused on safety, security and legality (protect your passwords, keep your identity secret, and cite sources when using intellectual property). Now the focus is on making sure students feel empowered to use digital tools and platforms to do good in the world — and that they do so responsibly. 

The DigCitCommit movement was born out of this shift to focus on the opportunities of the digital world rather than the dangers. DigCitCommit breaks down digital citizenship into five focus areas: 

  • Inclusive: Open to multiple viewpoints and respectful in digital interactions.
  • Informed: Evaluating the accuracy, perspective and validity of digital media and social posts.
  • Engaged: Using technology for civic engagement, problem-solving and being a force for good.
  • Balanced: Prioritizing time and activities online and off to promote mental and physical health.
  • Alert: Being aware of online actions and their consequences, knowing how to be safe and ensuring others are safe online.

Look for digital citizenship sessions at ISTELive 21 that focus on global collaboration, media literacy and social justice projects. 

2. Online learning design

One of the biggest challenges educators have faced in responding to the pandemic has been how to effectively move lessons that were designed for an in-person classroom online. Many educators around the world had to make that transition in less than a week in spring 2020 and, in some cases, less than a day. 

What many discovered immediately was that you can't simply upload worksheets to Google Classroom and expect the same learning success.

Michele Eaton, author of the book The Perfect Blend: A Practical Guide to Designing Student-Centered Learning, says good in-person teaching doesn't equate with good online teaching.

“I have a strong belief that if all we ever do is replicate what we do face to face, then online learning will just be a cheap imitation of the classroom experience,” she says. In her post, 4 tips for creating successful online content, Eaton outlines ways educators can design online lessons that are interactive, reduce cognitive load and build in formative assessments.

Look for ISTELive 21 sessions that focus on online learning strategies and ideas for the hybrid classroom. Also check out ISTE's Summer Learning Academy, a course designed to help educators take what they learned from teaching in online and hybrid settings and move to the next level.

3. Equity and inclusion

The COVID-19 pandemic exposed many of the ugly inequities that have existed in education for a long time. It also created a few new ones. When school moved online, many young learners and students with disabilities were unable to access learning without parental help, which was often unavailable because parents were working.

The lack of devices and bandwidth hampered many rural and low-income students. Most districts were able to secure funding to get hotspots, laptops or tablets into the hands of students who needed them, but those solutions were not always ideal: hotspots were at times unreliable, and devices fell into disrepair. Because of these and other problems, many teachers reported a high percentage of missing students, those who never showed up online.

Patricia Brown, an instructional technology coach for Ladue School district in Missouri, said the pandemic has been a watershed moment. In the blog, COVID-19 Thrusts Digital Equity to the Forefront , Brown shares some of the complexities of the inequities wrought by the pandemic.

 “It’s definitely bringing some attention to things that a lot of people have been talking about and nobody was listening to,” Brown said. “Now, when it affects people in their own communities, they are realizing they don’t have it together like they thought they had it together. People are having their eyes opened.”

Those inequities aren’t just limited to ensuring students have devices and internet access. Brown says there are multiple dimensions of digital equity. One focus is on the need for professional learning and providing support for teachers, students and families.

“When we talk about equity, we can talk a lot about devices and curriculum, but we also have to think about the basic needs that our kids and our families have,” Brown said. “We need to think about those basic needs, whether that’s providing lunches or breakfasts, or social-emotional resources for families or having counselors and social workers available,” Brown said. “That’s part of equity, too, providing what is needed for your population or for your community.”

4. Social-emotional learning and cultural competence

We’ve lumped these two important topics together because much of the anxiety and trauma students have faced during the pandemic relate to both. Social-emotional learning, or SEL, involves the skills required to manage emotions, set goals and maintain positive relationships, which are necessary for learning but also a tall order for students facing a barrage of COVID-related issues like family job loss, stressed parents and the illness or death of friends or relatives. 

The pandemic has caused enormous emotional stress and trauma to students across the board, but the emotional effects have disproportionately affected students of color, English language learners and students in other marginalized groups. 

That’s why in order to help students process their emotions, it’s important for educators to have cultural competence, which is the ability to understand, communicate with and interact with people across cultures. 

In the blog, 3 Ways Teachers Can Integrate SEL Into Online Learning , educator Jorge Valenzuela writes that “dealing with the fallout from the coronavirus pandemic has caused multiple traumas — which have been heightened by news and graphic images of the murder of George Floyd and the outrage and fear that followed.” 

That is why he says all educators should seek out cultural competence training in addition to learning about restorative justice, trauma-informed teaching and culturally responsive teaching. 

5. Professional development

Teacher professional development, especially related to edtech, is nothing new, of course. But the pandemic changed that, too. No longer are teachers attending daylong face-to-face lectures at the district office or out-of-town seminars and events. 

Because of social distancing, the urgency to quickly learn new skills, and increasingly tight budgets, many educators have formed professional learning communities within their schools and districts. Some of these are grouped by grade level, others by content area. In her post, 4 Benefits of an Active Professional Learning Community, Jennifer Serviss explores how PLCs enhance teaching and learning.

Many educators have sought PD online — some for the first time. Those used to attending conferences in person might feel at sea trying to plan for and navigate a virtual conference. In her post, 10 Tips for Getting the Most out of a Virtual PD Event, Nicole Zumpano, a regional edtech coordinator, shares ideas for making the most out of virtual PD.

It can seem daunting to choose the most worthwhile online conferences and courses in a learning landscape flooded with choices. Probably the best way to select: Look to the trusted sources. ISTE offers online courses and a slate of virtual events to prepare educators for the future of learning. 

6. Esports

Esports — aka competitive video gaming — has exploded as a form of entertainment in the past decade, and now it’s naturally finding its way into schools, clubs and after-school programs. Many educators are embracing esports as a way to engage hard-to-reach students who don’t necessarily gravitate to athletic sports or academic pursuits. Research indicates that 40% of students involved in esports have never participated in school activities.

Esports also promote interest in STEM careers and are a pipeline to jobs in the burgeoning esports industry.

Kevin Brown, an esports specialist with the Orange County Department of Education in California, says educators can tap esports in the classroom to support just about every subject because esports connect student interests to learning in a positive way.

Brown says esports have seen explosive growth in the last few years. The North America Scholastic Esports Federation started as a regional program in Southern California with 25 clubs and 38 teams. In 2½ years, it has grown to include more than 1,000 clubs and 11,000 students in North America.

Many educators mistakenly believe that if they aren’t gamers themselves, they can’t incorporate esports in the curriculum or organize a club. Not true, says Joe McAllister, an education esports expert for CDW who helps schools and districts get programs off the ground. 

He often sees reluctance from people who say, “Oh, I don't really play video games.”

“That’s OK. Do you enjoy kids growing and learning and providing them structure? Of course, that’s what teachers do,” he said. “The content and strategy for the games, that’s all out there on YouTube and Twitch. Most students will bring that to the table.”

Esports was the topic of a daylong series of events at ISTE20 Live in December and will be a focus again at ISTELive 21. In the meantime, check out the ISTE Jump Start Guide "Esports in Schools."

7. Augmented and virtual reality

Pokemon Go may have introduced the terms virtual and augmented reality to a majority of educators in 2017, but there’s a lot more learning potential in AR/VR than chasing around imaginary creatures. The game that took the world by storm has faded in popularity these days, but AR/VR has not.

The reason for that, says Jaime Donally, author of the ISTE book The Immersive Classroom: Create Customized Learning Experiences With AR/VR, is that AR/VR deepens learning. It allows students to see the wonders of the world up close and grants them access to experiences they wouldn’t be able to get any other way, such as an incredibly detailed 3D view of the human body or a front-row seat to unfolding world events.

The technology is becoming more affordable and sophisticated all the time, allowing students to do more than consume AR/VR experiences. They can actually create them. 

Most of the AR experiences in the past 10 years involved using a trigger image to superimpose an object or video on top. The trigger image is similar to a barcode telling the mobile device precisely what to add to the image. Newer AR technology eliminates the trigger image and places objects in your space by surface tracking. In the past four years, this technology has been included on most mobile devices, using ARKit for the Apple platform and ARCore for Android, Donally explained, which opens up even more possibilities for students and educators.

8. Project-based learning

At first blush, it seemed like project-based learning, or PBL, would be one of those educational strategies that would have to go by the wayside during remote and online learning. After all, you can’t really organize collaborative projects when students are not together in the same room, right?

“Wrong,” says Nicholas Provenzano, a middle school technology teacher and makerspace director in Michigan.

When the pandemic hit, Provenzano was teaching an innovation and design class, and it wasn’t immediately clear how he could teach that class remotely. He decided to implement genius hour, the ultimate PBL strategy. Genius hour is an instructional approach that allows students to decide what they want to learn and how they want to learn it. The teacher’s job is to support the student by offering resources and helping them understand complex material.

He told his students to create something using the resources they had at home. One student submitted images demonstrating his ability to build a side table that he designed himself. 

Another student hydro-dipped some shoes and then created a website to demonstrate the process.

“This approach to personalized learning was a huge success in my middle school class just like it was in my high school class,” Provenzano says in the video. “The emphasis on personalization increases engagement, but more importantly, it builds the skills necessary to be lifelong learners long after they leave our classrooms.”

Learn how to infuse project-based learning in your classroom by enrolling in the ISTE U course, Leading Project-Based Learning With Technology .

9. Creativity  

Of course creativity is nothing new. Cave drawings dating back to the late Stone Age continue to awe and inspire us, as do the ivory, stone and shell artifacts created by ancient peoples. Nevertheless, creativity is considered a hot topic because educators are embracing more creative and less traditional methods for students to demonstrate skills and content knowledge. 

Tim Needles, an art teacher from Smithtown High School in New York, loves to show teachers how to incorporate creativity into all topic areas. In his video “Digital Drawing Tools for Creative Online Learning,” he demonstrates how to “draw with code,” using the Code.org lesson called Artist. It merges math and computer science with art.

Needles, who has presented at ISTE’s Creative Constructor Lab, is also a big fan of sketchnoting, a method of taking notes by drawing pictures. Sketchnoting is not just a fun way to get information on paper; it’s a proven strategy backed by learning science to help students recall information.

Nichole Carter, author of Sketchnoting in the Classroom, says that sketchnoting is not about drawing the perfect piece of art; it’s about getting the content on the page. That’s why she says it’s important for teachers to help students improve their visual vocabulary.

These nine topics represent a mere fraction of the content you'll find at ISTELive 21. Register today to ensure the best registration price, then return to the site in March to browse the program.

Diana Fingal is director of editorial content for ISTE. 


10 Irreversible E-Learning Trends for 2021

These are 10 of the most noteworthy e-learning trends that will shape the online education industry in the years to come.

  • By Erin Wilson
  • Jan 26, 2021


Remote learning, distance education: spending hours in front of the computer screen. Watching your teachers lecture and sitting through prerecorded classes. Learning to separate home and school. Managing interactive projects over Zoom and internet connectivity problems.

E-learning has added an entirely new dimension to the age-old concept of learning.

Recently, there has been a massive increase in remote learning at all levels — which has allowed educators to understand e-learning’s effectiveness in new ways. In some cases, the implementation of e-learning has been an absolute mess, while in other cases online learning opened a whole new world of opportunities.

Going into 2021, many organizations are already catching on to the popular e-learning trends. Some of these trends are likely to be fads. These kinds of trends in e-learning are short-lived and will disappear as quickly as they arose.

Yet, other trends have shown remarkable resilience. They tend to utilize new and revolutionary technologies, increase retention rates in learners, and make online education more reachable for students of all backgrounds. These are the types of trends that are irreversible, and any company or organization that fails to keep up with them will be bound to get left in the dust.

These are 10 irreversible e-learning trends.

Irreversible E-Learning Trends

Mobile learning (m-learning).

Mobile phones

According to Learning House, 67% of students will finish online coursework using a mobile device, and 87% of potential students use a mobile device to search for a new e-learning course. Students want to learn anywhere, using just their phones.

Over 50% of online traffic comes from mobile devices (not including tablets). The most popular software for e-Learning began with desktop access: the Learning Management System (LMS).

These software systems began with administrative functions, like attendance and grading, and evolved into platforms to deliver learning content. As e-Learning trends move into mobile learning, the LMS is forced to change its systems.

As mobile-only internet access increases, e-Learning will follow industries like entertainment streaming, shopping, and even navigation. E-learning will trend toward better accommodating mobile learning and user experience.

Microlearning

The human attention span is relatively short. “Bite-sized chunks” of news, entertainment, and social content are becoming the most popular trends. E-learning will include microlearning programs to accommodate learners’ short attention spans and various learning styles.

Longer modules tend to be boring for students. On the other hand, microlearning in 5-minute videos maintains the students’ attention, compared with 40-minute prerecorded videos.

Microlearning courses let students focus on specific skills for a few minutes at a time — allowing students to retain more information.

These short-form lessons create an opportunity for students who are short on time to continue their education. If you only have a few minutes, you can consume entire “micro-lessons” in your free time through microlearning. Plus, you can complete short quizzes and lesson reviews quickly.

This format allows you to earn credentials that increase your qualifications and expand your employment opportunities. Microlearning is a trend prevalent in professional learning and development.

Most e-learners retain short-form microlearning content better. These courses are more accessible to a wider variety of learning styles, and they can help learners with attention difficulties such as ADD and ADHD.

Virtual reality (VR) and augmented reality (AR)


Virtual reality and augmented reality are taking over education at an unprecedented pace. The global virtual reality market is already valued at $6.1 billion and is set to reach $20.9 billion by 2025. Together with this massive worldwide growth of VR technologies, e-learning applications for it are bound to grow at a similar pace.

Educators around the world know there are many different learning styles. E-learning has primarily focused on learners who do well by reading text or watching video-based instruction — and learn the material well in those formats.

Technology for augmented reality (AR) and virtual reality (VR) has come a long way in a short time. There are many learners not served by reading or watching videos. AR and VR help present interactive videos — teaching a wider spectrum of learner types, such as visual and physical learners.

AR and VR have the potential to introduce e-Learning where in-person instruction is most popular. They add a spatial and relevant aspect to the learning experience that lacks depth when presented through text or video-based courses:

  • Chemistry experiments come to life on your kitchen table, without the mess
  • Visualize how a plant will grow through its life cycle
  • Medical students can experience and practice surgery
  • Walk through an entire museum without setting foot inside

AR and VR create an environment that increases students’ excitement, accessibility, and interactivity, creating more broadly attainable learning outcomes. They’re powerful tools to help learners of all styles further their education.

Increased accessibility for all online students

An increase in accessibility enables all learners to control and enjoy their learning paths.

One of the advantages of e-Learning is that virtual learning experiences are adaptable. The flexibility of e-Learning makes it accessible to students of all learning styles. We’ll see trends broadening the accessibility of e-Learning — such as the addition of platforms and learning modifications.

User-generated content

E-learning platforms provide a unique way for students to share their knowledge, expertise, and interests with each other and their instructors. By letting their students crowd-source and curate content, instructors increase student engagement and ownership in their education.

When students find videos that help them make sense of a challenging subject — they can share those resources with the class. If an instructor wants to teach creative problem solving and team building, they create class assignments that allow students to learn about a topic collaboratively.

This is an effective tactic for increasing learner engagement because students are curating their own learning content. Instead of being “taught at,” they’re empowered to take part in social learning.

Gamification & game-based learning (GBL)

E-learning can take advantage of its online platform — and truly ramp up the interactive “gaming experience” for students. Adding games into e-Learning has been proven to increase the retention of learning material and increase student performance.

Especially compared to sitting through a lecture or reading a book.

Gamification is a trend that adds fun to the learning experience. It provides a sense of satisfaction as you complete tasks and levels.

Instructional designers will add gamification to their online courses. This helps ensure that students are well engaged and highly interactive with their education, which improves their learning experiences. Distractions in e-Learning are ample – gamifying education is a sure-fire trend to combat them.
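Gamification mechanics like the tasks and levels described above can be sketched in a few lines. This is a hypothetical illustration — the point values, thresholds, and names below are invented, not taken from any particular course platform:

```python
# Hypothetical sketch of a points-and-levels gamification mechanic:
# learners earn points for completed tasks and advance through levels
# at fixed point thresholds.
LEVEL_THRESHOLDS = [0, 100, 250, 500]  # points required for levels 1-4 (invented values)

def level_for(points):
    """Return the highest level whose threshold the learner has reached."""
    level = 0
    for i, needed in enumerate(LEVEL_THRESHOLDS, start=1):
        if points >= needed:
            level = i
    return level

class LearnerProgress:
    def __init__(self):
        self.points = 0

    def complete_task(self, reward):
        """Award points for a finished task and return the resulting level."""
        self.points += reward
        return level_for(self.points)

p = LearnerProgress()
p.complete_task(60)                   # 60 points  -> level 1
current_level = p.complete_task(60)   # 120 points -> level 2
```

The feedback loop is the point of the design: each completed task immediately reports a visible level, which is the "sense of satisfaction as you complete tasks and levels" the trend describes.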

Personalized learning

Traditionally, education has been standardized, with a rigid structure and minimal personalization for students. e-Learning and artificial intelligence are powerful tools for creating an individualized, interactive learning environment.

e-Learning software is engineered to collect information about each learner’s habits and skills. Using adaptive e-Learning, artificial intelligence supplements the teacher’s efforts, learning to prompt study habits individually for each student.

The LMS suggests additional help for students struggling with multiplication — while a student who does well with the material will receive advanced learning suggestions. By creating a personalized learning pathway, this e-Learning trend can boost engagement.

Not only does the trend of personalized learning increase student engagement — but it improves student performance as well. As the software gets more sophisticated, it will offer suggestions earlier, before a teacher might see a student falling behind, boosting that student’s success.
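The routing rule in the multiplication example above can be sketched as a tiny recommender. This is a hypothetical illustration: the thresholds and function names are invented, and a real adaptive LMS would use much richer learner models than a score average:

```python
# Hypothetical sketch of an adaptive-learning rule: average a learner's
# recent quiz scores (0-1) and route them to remedial, standard, or
# advanced material. Thresholds are invented for illustration.
def recommend(scores, low=0.6, high=0.9):
    """Pick a learning path from recent quiz scores."""
    if not scores:
        return "standard"          # no data yet: default path
    avg = sum(scores) / len(scores)
    if avg < low:
        return "remedial"          # extra help, e.g. multiplication practice
    if avg >= high:
        return "advanced"          # enrichment suggestions
    return "standard"

recommend([0.4, 0.5])    # struggling learner -> "remedial"
recommend([0.95, 0.92])  # strong learner    -> "advanced"
```

Because the rule runs on every new score, it can flag a struggling student after a couple of low quizzes — which is the "offer suggestions earlier, before a teacher might see a student falling behind" behaviour described above.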

Virtual conferences

e-Learning has the power to bring entire communities of people together. People with similar interests and education attend virtual presentations to learn about their favorite subjects. As popular in-person events moved online in 2020, virtual conferences shifted to the forefront.

With better technology to broadcast interactive videos, virtual conferences provide all the benefits of their physical counterparts. Small groups of attendees can gather online to discuss what they’ve learned. This builds community, allowing students to share trends of what they’ve been learning.

If an e-learner cannot attend an event because it’s too far from their physical location, a virtual conference allows them to participate. Thus, you can continue your education and network with other attendees virtually.

Collaborative e-learning

e-Learning will encourage community through the unique collaboration of its collective learners. When students engage with content and each other, “curriculum as a community” develops. Students learn as much from each other as they do from official instruction.

This trend focuses on an essential factor of learning: social interaction. e-Learning will transform students’ close associations into community-wide learning opportunities.

Online learning platforms can include built-in networking options. Students are then able to connect individually, in small groups, or as a whole. This facilitates relationship building in a way that mimics in-person learning and engages the learners in natural learning practices.

With instructor-led discussions, challenging ideas, and each student’s unique experiences, students can genuinely experience collaborative learning through their community.

Social learning

The most ambitious way for educators to create online learning communities is to harness the power of social media. Since most people spend time on social platforms, it’s a great way to integrate education and society.

By creating discussion through social media hashtags, using forums in pre-existing spaces, and challenging students to use the platforms for their education — instructors in e-Learning spaces bring learning into online social communities.

More e-Learning instructors will use social learning to increase students’ awareness of current issues. By funneling student usage tendencies into learning from the community at large, instructors will create lifelong learners. e-Learning will effectively utilize the community that is already thriving.

Bringing together people of different backgrounds, experiences, and lifestyles defines the education experience for many people. As e-Learning expands into 2021, we expect to see instructors and students recreating online the communities found in physical learning spaces.

e-Learning will continue to grow throughout 2021. We’ll see the most effective and constructive trends in e-Learning continue to build. Are you looking to further your education in 2021? Check out our course reviews today to see which one is the best for you.

Are you an educator looking to expand your repertoire in e-Learning? Start here to learn even more about e-Learning, and we’ll help prepare you for the future of education.



Thematic exploration of educational research after the COVID pandemic through topic modelling

  • Indu Bala
  • Lewis Mitchell


The Journal of Applied Learning and Teaching is known for its focus on innovative practices in learning and teaching in higher education. In this study, we utilised BERTopic modelling to investigate trends and research within the journal. Our objective was to analyse thematic structures and identify emerging trends in a vast academic research corpus. BERTopic modelling enabled us to categorise academic texts into distinct topics, revealing underlying patterns and themes. Our analysis unveiled various topics, showcasing the journal’s interdisciplinary nature. Particularly, articles from January 2021 to December 2023 shed light on global trends in learning and teaching amidst significant changes in the post-COVID era. We identified 17 frequent topics, categorised into four major thematic groups: Technology and Digital Learning in Education, Healthcare and Clinical Training, Educational Strategies and Outcomes, and Pandemic-Driven Social and Compassion Aspects in Education. We examined these themes and presented the findings, highlighting challenges and opportunities in higher education. This comprehensive analysis serves as a roadmap for future research, guiding scholars and practitioners in advancing applied learning and teaching.
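The final step the abstract describes — grouping 17 discovered topics into four thematic groups — can be sketched as a keyword-overlap lookup. This is a simplified stand-in: the keyword sets below are invented for illustration, and BERTopic itself derives topics from transformer embeddings and clustering rather than hand-written keyword lists:

```python
# Simplified sketch of grouping discovered topics (represented here by
# their top keywords) into the four thematic groups named in the study.
# The keyword sets are invented for illustration only.
THEMES = {
    "Technology and Digital Learning in Education": {"online", "digital", "technology", "ai"},
    "Healthcare and Clinical Training": {"clinical", "nursing", "medical", "patient"},
    "Educational Strategies and Outcomes": {"assessment", "curriculum", "outcomes", "pedagogy"},
    "Pandemic-Driven Social and Compassion Aspects in Education": {"pandemic", "wellbeing", "compassion", "equity"},
}

def assign_theme(topic_keywords):
    """Pick the theme whose keyword set overlaps most with a topic's keywords."""
    kw = set(topic_keywords)
    overlaps = {theme: len(words & kw) for theme, words in THEMES.items()}
    return max(overlaps, key=overlaps.get)

assign_theme(["online", "digital", "moocs"])
# -> "Technology and Digital Learning in Education"
```

In the actual pipeline this categorisation would happen after topic extraction, once each topic's representative terms are known; the sketch only illustrates that last mapping step.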


  • Majority of Latinos Say Skin Color Impacts Opportunity in America and Shapes Daily Life
  • 4. Measuring the racial identity of Latinos

Table of Contents

  • 1. Half of U.S. Latinos experienced some form of discrimination during the first year of the pandemic
  • 2. For many Latinos, skin color shapes their daily life and affects opportunity in America
  • 3. Latinos divided on whether race gets too much or too little attention in the U.S. today
  • Acknowledgments
  • Methodology
  • Appendix: Additional tables

How we measured racial identity among Hispanics

The survey used the following four questions to assess the racial identity of Latinos:

What is your race or origin?

  • Black or African American
  • Asian or Asian American
  • Two or more races
  • Some other race or origin

How would most people describe you, if, for example, they walked past you on the street? Would they say you are …

  • Hispanic or Latino
  • Native American or Indigenous (the native peoples of the Americas such as Mayan, Quechua or Taino)
  • Native Hawaiian or other Pacific Islander
  • Mixed race or multiracial

In your own words, if you could describe your race or origin in any way you wanted, how would you describe yourself?

Which of these most closely matches your own skin color, even if none of them is exactly right? (If this question makes you uncomfortable, you may skip it.)


The most widely employed method to measure racial and ethnic identity comes from the U.S. Census Bureau. It is a two-part question, first asking about Hispanic identity and then asking about racial identity, and it is the standard method used in decennial censuses and in surveys conducted by the bureau. It is also the standard often used by polling organizations, marketers, local governments and many others.

Alternative measures can capture other dimensions of racial and ethnic identity not necessarily captured by the Census Bureau’s format. For example, one’s skin color can shape opportunities and can be at the heart of discrimination experiences no matter what race one identifies with. In addition, how others see you, such as when passing each other on the street, can shape one’s life experiences. And sometimes directly asking one to describe their racial identity can reveal a personal view of identity unencumbered by the framing of survey questions.

Pew Research Center’s 2021 National Survey of Latinos explored four approaches to measuring racial identity – the Census Bureau’s two-question method; an assessment of how respondents believe others see them when passing them on the street (street race); an open-ended question asking respondents to describe their race and origin in their own words; and self-assessed skin color. Responses across all these measures do not necessarily align – a respondent may indicate their race is White in the Census Bureau’s method but also indicate their street race is Latino (and not White). These differences in responses reflect the nuances of racial identity, contextual factors and the experiences associated with them. This chapter explores these four alternative measures and the responses of Latino adults.

The Census Bureau’s standard method for measuring race and ethnicity

Majority of Latinos say their race is White in two-question race and ethnicity format

In current Census Bureau data collections like the 2020 decennial census and surveys like the American Community Survey, racial and ethnic identity is asked about in a two-part question. First, respondents are asked if they are Hispanic or Latino, and then in a second question they are asked their race. Currently, the Hispanic category is described in census forms and surveys as an ethnic origin and not a race, with respondents given explicit instructions indicating so.

The Pew Research Center survey replicated the Census Bureau’s format, asking about race separately from Hispanic ethnicity. Asked about their race in this way, more than half of Hispanics in the survey identified their race as White (58%), with the next largest share selecting the “some other race” category (27%), 8% selecting two or more races, and 2% selecting Black or African American. Foreign-born Hispanics were more likely than their U.S.-born counterparts to select the “some other race” category, while U.S.-born Hispanics were more likely than foreign-born Hispanics to select multiple races. For both groups, though, more than half say their race is White.

These findings echo those of earlier Pew Research Center surveys of Hispanic adults, as well as Census Bureau findings from the 2010 decennial census and other surveys. Yet the findings from this survey by the Center, conducted in March 2021, differ from those revealed by the Census Bureau from the 2020 decennial census. The wording of the 2020 census race question differed markedly from the Center’s question and from previous decennial census surveys, which could account for why results varied greatly. In the 2020 census, for the first time, respondents were prompted to write in origins or ethnicities for all racial groups; this was not offered to the Center’s survey respondents.

According to the bureau, about four-in-ten Hispanics (42%) marked their race as “some other race” in the 2020 census without marking any other response, the single largest set of responses among the nation’s 62.1 million Hispanics. (An analysis of the 2010 decennial census results showed that most responses coded as “some other race” were write-ins of Hispanic ancestries or ethnicities.) This was followed by one-third (33%) who selected two or more racial groups and 20% who selected White as their race.

A separate Pew Research Center survey from 2020 found Hispanic adults were more likely than White or Black adults to say the 2020 decennial census two-part race and ethnicity questions do not reflect their identity well: 23% of Hispanic adults said census race and ethnicity questions reflect how they see their race and origin either “not too well” (17%) or “not at all well” (5%). This compares with 15% of White adults and 16% of Black adults who said the same.

Latinos’ skin color reflects the diversity within the group

For Latinos and non-Latinos alike, skin color is an important dimension of identity that can affect their daily lives. To measure this dimension of race, the survey asked Latino respondents to identify the skin color that best resembled their own using a version of the Yadon-Ostfeld scale. Respondents were shown 10 skin colors that ranged from fair to dark. Eight-in-ten Latinos selected one of the four lightest skin colors, with the second-lightest color the most common (28%), followed by the third (21%) and fourth lightest (17%). By contrast, only 3% of Latino respondents in total selected one of the four darkest skin colors.

For purposes of the analysis in this report, Hispanics are grouped into two categories. The “lighter skin” color group consisted of those who chose the four lightest skin colors (80%), while the “darker skin” color group included those who chose the six darker skin colors (15%). (Another 5% of respondents did not indicate their skin color.) While there were enough Hispanics who chose each of the lightest four skin colors to analyze separately, there were no significant differences in the opinions or experiences of discrimination among them due to their skin color. (The number of Hispanics who chose the five darkest skin tones was too small to analyze each separately.)
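The recoding described above can be sketched directly. This assumes the scale is coded 1 (lightest) through 10 (darkest); the function name and "no answer" label are ours for illustration, not Pew's:

```python
# Sketch of the report's recoding of the 10-point Yadon-Ostfeld scale
# into the two analysis groups: the four lightest colors form the
# "lighter skin" group, the six darker colors the "darker skin" group,
# and missing responses are kept as their own category.
def skin_color_group(rating):
    if rating is None:
        return "no answer"
    if 1 <= rating <= 4:
        return "lighter skin"
    if 5 <= rating <= 10:
        return "darker skin"
    raise ValueError("rating must be 1-10 or None")

groups = [skin_color_group(r) for r in [2, 4, 7, None]]
# -> ["lighter skin", "lighter skin", "darker skin", "no answer"]
```

Collapsing the scale this way trades granularity for sample size — as the report notes, too few respondents chose each of the darkest tones to analyze them separately.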

Among Latinos, those who rated their skin as lighter were more likely to be older than 50 (35%) than those who rated their skin as darker (23%). Latinos with lighter skin were also more likely to be women (52%) than Latinos with darker skin (42%).

The distribution of skin color among U.S. Hispanics

Most Latinos say others would describe them as Latino when walking past them on the street

How others would describe Latinos when walking down the street

Similar to skin color, the way others perceive Latinos when interacting with them is another manner in how racial identity can be shaped. In the survey, respondents were asked how most people would describe them if they walked past them on the street.

Seven-in-ten Hispanic adults said that most people would describe them as Hispanic when walking past them on the street. The foreign born were the most likely to say this (75%), compared with 68% of the second generation and 55% of the third or higher generation.

Fewer than two-in-ten Latinos (17%) say others would view them as White when walking past them, with those born in the U.S. being more likely to say this (28% of at least third-generation Latinos and 20% of second-generation Latinos) than Latino immigrants (13%).

A smaller share (12%) say others view them as belonging to another racial group such as Asian, Black or Indigenous.

Asked to describe their race or origin, most Latinos say they are Hispanic or Latino or give their country of origin

In an open-ended question, most Hispanics identify their race as Hispanic or link it to their country or region of origin

As a fourth measure of racial identity, the survey asked Latinos how they would describe their race or origin in their own words. The most common responses in this open-ended format were the pan-ethnic terms Hispanic, Latino or Latinx (28%) or responses that linked respondents' racial origin to the country or region of their ancestors (28%). A smaller share identified their race or origin as American, either as a single answer or in combination with another response (11%), while 9% identified their race as White and 9% mentioned another racial group such as Asian, Black or Indigenous.

There were some differences in the way Hispanics identified their race depending on their immigrant roots. Fully one-third of the foreign born (33%) used the pan-ethnic terms Hispanic, Latino or Latinx to identify their race, while 23% of the U.S. born did so. Among the U.S. born, those with at least one immigrant parent – second generation Hispanics – were also more likely than those without any immigrant parents – third or higher generation – to use the pan-ethnic terms to describe their race (27% vs. 19% respectively).

Conversely, those born in the U.S., regardless of the place of birth of their parents, were more likely to describe their race or origin as American or having been born in the U.S. (19% of the U.S. born vs. 5% of the foreign born).

Those in the third or higher generation were more likely than those from the second generation to describe their race as White (14% vs. 6%). In addition, third or higher generation Hispanics were more likely than Hispanic immigrants or those with at least one immigrant parent to mention another racial group such as Black or Asian in their response (17% compared with 7% of foreign born and 8% of second generation).

How the four racial identity measures correlate with each other

There is some overlap in the responses to the four racial identity questions, particularly when looking at just two of the four measures. For example, nearly all respondents who say most people see them as White when passing them on the street (95%) chose one of the four lightest skin colors (1-4). By comparison, 79% of those who say they would be viewed as Latino by passersby selected one of the four lightest skin colors and 69% who say they would be perceived as belonging to another racial group did the same.

Similarly, 94% of those who said their race was White in the open-ended question chose one of the four lightest skin colors. About eight-in-ten of those who described their race as Hispanic in the open-ended question (83%) or linked it to a Hispanic country or region of origin (80%) also chose one of the four lightest skin colors. Meanwhile, 74% of Hispanics who mentioned another racial group like Black or Asian selected one of the four lightest skin colors.

Among Hispanics who characterized their race as White in the Census Bureau’s standard two-part question, 86% selected one of the four lightest skin colors. By comparison, about seven-in-ten of those who identified their race as “some other race” (72%) or chose another race group (68%) selected one of the four lightest skin colors.

There were other similarities in how respondents characterized their race across the four different questions included in the survey, but the overlap between similar categories was considerably smaller. For example, among those who marked their race as White in the standard two-part race question, only 25% said others would describe them as White walking down the street, and only 14% described their race as White in the open-ended question. Instead, respondents who selected White in the Census Bureau's standard two-part question were more likely to say others view them as Hispanic (69%), or to use a pan-ethnic term (30%) or a country or Hispanic origin (27%) when asked to describe their race in their own words.

Most common combination of answers to the four racial identity measures

The table shows the degree to which responses to the four different ways we asked about race correlate with each other. As can be seen, there is not much overlap across the four measures in the most common responses. Looking across all four measures at once, only 5% of Hispanics identified their race as Hispanic or Latino in the open-ended question, said others viewed them as Hispanic when walking past them, selected the "some other race" option in the standard two-part question and selected one of the four lightest skin colors of the 10 given. Similarly, only 4% of Hispanics described their race as White in the open-ended question, said others viewed them as White when walking past them, selected White in the standard two-part question and selected one of the four lightest skin colors.

Report materials: American Trends Panel Wave 86
Copyright 2024 Pew Research Center

