
Vol No: 5 Issue No: 2 eISSN:
1Dr. S. Kirubakaran, Senior Assistant Professor, Department of Community Medicine, Government Medical College, Omandurar Government Estate, Chennai, Tamil Nadu, India.
2Department of Surgery, SRIHER, Chennai, Tamil Nadu, India
3Department of Anatomy, Government Medical College, Omandurar Government Estate, Chennai, Tamil Nadu, India
4Department of Community Medicine, Government Medical College, Omandurar Government Estate, Chennai, Tamil Nadu, India
5Department of Community Medicine, Government Medical College, Omandurar Government Estate, Chennai, Tamil Nadu, India
*Corresponding Author:
Dr. S. Kirubakaran, Senior Assistant Professor, Department of Community Medicine, Government Medical College, Omandurar Government Estate, Chennai, Tamil Nadu, India. Email: kirubasam01@gmail.com
Abstract
Background: Adaptive e-learning personalizes instructional content and delivery based on real-time learner data to improve engagement and learning outcomes. Although many studies document its effectiveness in formal educational settings, little is known about whether it can be applied, or has been tried, in short-term intensive training programs such as online workshops.
Aim: To evaluate the role and effectiveness of adaptive e-learning in improving learning outcomes during online workshops and to explore participants' experiences and perceptions of adaptive e-learning.
Methods: This study combined quantitative and qualitative approaches. It involved 74 participants, including faculty members and postgraduate students enrolled in an online workshop. The quantitative phase assessed learning outcomes through pre- and post-tests. All participants were then randomly assigned to two groups: one received feedback on the pre-test results (the "Record" group), while the other did not (the "No record" group). In the qualitative phase, we used in-depth interviews to explore participants' experiences.
Results: Both groups showed significant post-test improvements. The ‘Record group’ performed significantly better than the ‘No record’ group (P = 0.002), suggesting that feedback enhanced learning outcomes. Six qualitative themes emerged: personalized learning, e-library, technology-driven learning, self-directed learning, reinforcement, and overall effectiveness. Participants reported high satisfaction with the adaptive learning experience.
Conclusion: Adaptive e-learning supports learner engagement, and learning outcomes improve significantly when feedback is provided. Adaptive e-learning should be implemented in short-term professional development workshops.
Introduction
Adaptive e-learning is a rapidly advancing educational approach that personalizes instruction by adjusting content and delivery according to real-time learner data, thereby improving engagement and learning outcomes.1 A major benefit of adaptive e-learning is that it tailors the learning process to individual preferences, abilities, and rates of progress, supporting both cognitive and emotional development. Artificial Intelligence (AI) and machine learning enable highly adaptable systems that customize learning environments dynamically, an asset well suited to the diverse learner characteristics encountered in online workshops.2
Recent studies indicate that adaptive e-learning is effective across several educational environments. A large body of research highlights the role of learning styles in adaptive systems, where personalized learning experiences are regarded as critical to enhancing knowledge acquisition and higher-order thinking skills among students.1 One of the most commonly used methods is the Visual, Auditory, Reading/Writing, and Kinesthetic (VARK) model, which matches learning material to the individual's style to enhance motivation and retention.3 AI-based adaptive learning frameworks have been shown to make real-time, data-driven adjustments to instructional strategies, improving engagement and academic performance while limiting learner dissatisfaction.2,4 Promising as these results are, notable gaps remain in understanding the effects of adaptive e-learning in short-term, intensive training contexts, particularly online workshops. Most existing studies focus on formal educational settings, such as universities, and do not examine learner experiences and outcomes in professional development programs or workshops. Moreover, learners' subjective experiences within these adaptive systems are crucial for fine-tuning adaptive technologies to better meet learner needs.
Two specific knowledge gaps exist in this field: evidence on the long-term retention and application of skills acquired through adaptive e-learning in short-term workshop settings is limited, and the learner experience in personalized adaptive e-learning environments, especially in non-traditional settings such as workshops, is under-explored. Accordingly, this study aimed to evaluate the role and effectiveness of adaptive e-learning in improving learning outcomes during online workshops and to explore participants' experiences and perceptions of adaptive e-learning in these contexts.
Materials & Methods
Study Design
This study employed an explanatory mixed-method design, combining both quantitative and qualitative approaches. The first phase was a cross-sectional quantitative study, followed by a qualitative phase.
Study Setting
The study was conducted within the Department of Community Medicine.
Study Duration
The study spanned a period of five months.
Sample Size and Participants
A total of 74 participants, consisting of faculty members and postgraduate students from the Department of Community Medicine, and other researchers enrolled in an online workshop on systematic review and meta-analysis, took part in the study.
Phase-1: Quantitative Phase
Study Design: The quantitative phase was a cross-sectional study focused on assessing participants' learning outcomes through pre- and post-tests.
Study Procedure: The online workshop was divided into two sessions. The first session was conducted live, while the second session involved recorded video content. There was a one-week interval between the two sessions. Participants were randomly assigned to two groups. Group A received the recorded video content immediately after the live session, while Group B received the recorded content only after completing the post-test. Pre-tests were conducted four days prior to the start of each session, and post-tests were administered four days after the completion of each session. The study assessed the impact of the timing of video content delivery on learning outcomes by comparing post-test scores between the two groups. Participant engagement, including attendance in the live session and the frequency and number of video views, was monitored using Google Drive analytics.
Data Collection Tool: Data was collected using a structured Google Forms questionnaire. The pre- and post-tests included standardized questions designed to evaluate participants' understanding of the material.
Statistical Analysis: Descriptive statistics, including mean values, frequencies, and percentages, were calculated to summarize demographic data and participant engagement metrics. Paired t-tests were conducted to compare the mean pre- and post-test scores within each group to determine whether significant learning had occurred. Independent t-tests were used to compare the mean post-test scores between Group A and Group B, assessing the effect of the timing of video content delivery on learning outcomes. Chi-square tests were employed to examine associations between categorical variables such as attendance and engagement.
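The two t-tests described above can be sketched in a few lines. The functions below implement the standard paired-samples and pooled-variance independent-samples t statistics; the score lists are invented for illustration only and are not the study's data.

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-samples t statistic: t = mean(d) / (sd(d) / sqrt(n))."""
    d = [b - a for a, b in zip(pre, post)]
    return mean(d) / (stdev(d) / math.sqrt(len(d)))

def independent_t(x, y):
    """Independent-samples t statistic using the pooled-variance (equal-variance) form."""
    nx, ny = len(x), len(y)
    sp2 = ((nx - 1) * stdev(x) ** 2 + (ny - 1) * stdev(y) ** 2) / (nx + ny - 2)
    return (mean(x) - mean(y)) / math.sqrt(sp2 * (1 / nx + 1 / ny))

# Hypothetical scores for illustration; within-group pre vs post change.
pre = [5, 6, 4, 7, 6, 8, 5, 6]
post = [6, 7, 6, 8, 7, 9, 6, 8]
print(f"paired t = {paired_t(pre, post):.2f}")

# Hypothetical between-group comparison of post-test scores.
group_a_post = [7, 8, 6, 9, 7, 8, 7, 8]
group_b_post = [5, 6, 6, 7, 5, 6, 6, 7]
print(f"independent t = {independent_t(group_a_post, group_b_post):.2f}")
```

In practice the study's analyses would have been run in a statistical package; this sketch only makes the underlying formulas explicit.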
Phase-2: Qualitative Phase
Study Design: The qualitative phase of the study aimed to explore participants' experiences and perceptions of the online workshop, using an in-depth interview approach.
Study Procedure: Qualitative data was gathered through in-depth interviews, employing the K-Textual Dialogue technique. Participants from both groups in Phase-1 were invited to share their insights regarding their learning experiences, engagement with the workshop, and the impact of the timing of video content. The interviews focused on capturing participants' reflections on the workshop structure, their perceived learning benefits, and any challenges they encountered during the process.
Data Analysis: The interviews were recorded, transcribed verbatim, and subjected to content analysis. ATLAS.ti software was used to code and categorize the data, with themes generated through a thematic analysis approach. The analysis focused on identifying common themes related to participants' perceptions of the timing of video content, engagement, and overall learning experiences. Manual content analysis was also performed for validation, and QDA Miner software was used to generate word clouds, highlighting the most frequently discussed topics.
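The word clouds mentioned above rest on a simple word-frequency count. The study used QDA Miner for this step; the sketch below is only a minimal stdlib analogue, and the interview excerpts are invented for illustration, not the study's transcripts.

```python
import re
from collections import Counter

# Hypothetical interview excerpts; the real transcripts are not reproduced here.
transcripts = [
    "The personalized feedback helped me focus on weak areas.",
    "Access to the e-library made self-paced revision easy.",
    "Recorded videos let me revisit difficult concepts.",
]

# A tiny illustrative stopword list; real tools ship much larger ones.
STOPWORDS = {"the", "to", "on", "me", "made", "let"}

# Tokenize, drop stopwords, and count term frequencies (the word-cloud input).
words = [w for t in transcripts for w in re.findall(r"[a-z\-]+", t.lower())]
freq = Counter(w for w in words if w not in STOPWORDS)
print(freq.most_common(5))
```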
Ethical Considerations
Institutional ethical clearance was obtained for the study. Informed consent was secured from all participants before their involvement in both quantitative and qualitative phases. The confidentiality and anonymity of participants were maintained throughout the study.
Results
The sample consisted of 74 participants, nearly evenly distributed by gender, with 39 males (52.7%) and 35 females (47.3%). These participants were drawn from three distinct categories: 48 postgraduates (64.9%), 19 faculty members (25.7%), and 7 research scientists (9.5%) (Table 1).
A paired samples t-test was conducted to compare the pre- and post-test scores within the two groups: participants who did not receive a record of their scores (referred to as the "No record" group) and participants who were provided with a record ("Record" group). For the "No record" group (n = 36), the mean pre-test score was 6.08 (SD = 1.96), which increased to 7.08 (SD = 1.84) in the post-test. This improvement was statistically significant, t(35) = -3.07, P = 0.004, indicating that the participants who did not receive a record of their pre-test scores still showed significant learning gains after the intervention. In the "Record" group (n = 38), the mean pre-test score was 6.42 (SD = 1.87), rising to 7.34 (SD = 1.96) in the post-test. This difference was also statistically significant, t(37) = -2.57, P = 0.014, suggesting that participants who received a record of their scores demonstrated a significant improvement in their post-test results (Table 2).
In the "No record" group, the mean score increased from 6.08 to 7.08 from pre- to post-test, while in the "Record" group, the mean score increased from 6.42 to 7.34. The median scores also showed a notable improvement in both groups, with the "Record" group moving from a median of 6.00 in the pre-test to 8.00 in the post-test (Table 3).
An independent samples t-test was conducted to compare the pre-test and post-test scores between the "Record" and "No record" groups. The results indicated no significant difference between the two groups in the pre-test scores, t(72) = 0.758, P = 0.451, suggesting that both groups had similar baseline knowledge before the intervention. However, there was a statistically significant difference in the post-test scores between the groups, t(72) = 3.179, P = 0.002. Specifically, the "Record" group performed significantly better in the post-test (M = 7.34, SD = 1.96) compared to the "No record" group (M = 5.86, SD = 2.04) (Table 4).
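The reported between-group statistic can be checked directly from the summary values quoted above (means, SDs, and group sizes) using the pooled-variance t formula. This is a verification sketch, not part of the study's analysis pipeline.

```python
import math

def t_from_summary(m1, s1, n1, m2, s2, n2):
    """Independent-samples t computed from group means, SDs, and sizes (pooled variance)."""
    sp2 = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

# Post-test summary statistics as reported in the text (Table 4):
# Record group M = 7.34, SD = 1.96, n = 38; No record group M = 5.86, SD = 2.04, n = 36.
t = t_from_summary(7.34, 1.96, 38, 5.86, 2.04, 36)
print(f"t(72) = {t:.2f}")
```

The result agrees with the reported t(72) = 3.179 up to rounding of the published means and SDs.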
The group-level descriptive statistics further underscore the significant difference in post-test scores between the "Record" and "No record" groups. While both groups showed similar pre-test scores (M = 6.42 for the "Record" group vs M = 6.08 for the "No record" group), the "Record" group demonstrated substantially greater improvement in post-test performance (Table 5).
In the qualitative phase of the study, a thematic analysis was conducted based on participant feedback. Six key categories emerged from the qualitative data analysis, each reflecting participants' experiences and perceptions regarding the intervention. These categories provide insight into the various aspects of the learning process that participants found impactful.
- Personalized Learning: Many participants emphasized the importance of personalized learning, stating that the tailored approach allowed them to engage with the content more effectively. They noted that the intervention provided flexibility to focus on areas where they needed more support, improving both their motivation and performance.
- E-Library: The availability of an electronic library (E-library) was highlighted as a critical resource. Participants appreciated the easy access to a wide range of learning materials, which supported self-paced learning and allowed them to revisit topics as needed.
- Technology-Driven Learning: Participants frequently mentioned the technology-driven nature of the intervention as a positive aspect. The use of digital tools and platforms facilitated a more interactive and engaging learning experience, enabling them to track their progress and receive instant feedback.
- Self-Directed Learning: The intervention encouraged a self-directed learning approach, where participants could take charge of their own learning journey. This category underscored the value of autonomy, with participants noting that they could structure their study time according to their individual preferences, promoting deeper understanding and retention of the material.
- Reinforcement: Reinforcement was another theme that emerged, reflecting how the intervention helped solidify participants’ knowledge through repetition and practice. Participants appreciated that the intervention incorporated mechanisms to review and reinforce key concepts, which contributed to their overall improvement in understanding.
- Effectiveness: Overall, participants viewed the intervention as effective. They reported noticeable improvements in their performance and expressed satisfaction with the structure and content. The combination of personalized, technology-driven, and self-directed learning was perceived as highly beneficial for achieving their educational goals.
These six themes (Personalized Learning, E-Library, Technology-Driven Learning, Self-Directed Learning, Reinforcement, and Effectiveness) collectively demonstrate how the intervention addressed various dimensions of the learning experience, contributing to both improved outcomes and positive perceptions among participants.
Discussion
The study found that both groups improved significantly from pre- to post-test, but the "Record" group, which received feedback on its pre-test scores, showed greater gains: its post-test scores were significantly higher than those of the "No record" group. The paired samples t-tests confirmed significant within-group improvements, and the independent samples t-test revealed a significant between-group difference in post-test performance. These results indicate that providing a record of pre-test scores had a marked effect on post-test performance and suggest that detailed feedback plays a critical role in enhancing overall performance. The qualitative analysis revealed six key themes: personalized learning, e-library access, technology-driven learning, self-directed learning, reinforcement, and overall effectiveness. Participants found the intervention highly beneficial, highlighting its ability to enhance engagement, autonomy, and knowledge retention.
Our study involved 74 participants from a single medical department, including both faculty and postgraduate students. In contrast, Cook et al. focused on larger cohorts drawn from various medical specialties, which may limit the granularity of insights available from department-specific studies like ours.5 Our findings showed significant improvements in post-test scores, especially among participants who received feedback on their pre-test scores. This aligns with existing literature reporting positive trends, though many studies, such as Van der Vleuten et al., focus on long-term retention rather than immediate post-intervention improvements.6 Participants in our study reported high levels of satisfaction with the intervention's structure and content.
Our study found that providing participants with a record of their pre-test scores significantly enhanced their post-test performance, aligning with other studies that emphasize the importance of feedback in improving learning outcomes. For instance, Janelli and Lipnevich found that detailed feedback following pre-tests significantly improved post-test performance among Massive Open Online Course (MOOC) participants, particularly when feedback was elaborate rather than simple correct/incorrect responses.7 This supports our finding that detailed feedback enhances learning gains, with the "Record" group performing better than the "No record" group in the post-test. Similarly, a study in a clinical setting demonstrated that feedback during formative testing improved study behaviour and performance, although the effect varied with the intensity of app use in the intervention group; students who engaged more deeply with the feedback saw more significant performance improvements.8 These findings echo the self-directed learning theme from our qualitative analysis, where participants valued the autonomy and personalized learning the intervention offered.
However, differences emerge when examining long-term retention. A previous study suggests that while feedback-driven interventions are effective for immediate performance improvements, their impact on long-term retention is more variable: some studies show strong retention of knowledge and skills over several months, while others report a decline.9 This contrasts with our study, which focuses primarily on short-term post-intervention gains, with no emphasis on long-term retention. In terms of subjective satisfaction, our findings regarding the positive perception of the intervention's structure and content contribute valuable insight. Work-based feedback interventions, such as those in clinical education, tend to prioritize objective measures like performance scores, with less attention given to participants' satisfaction and perceived effectiveness.10
This study, while offering valuable insights into the effectiveness of adaptive e-learning in online workshops, faces several limitations. The small sample size, drawn exclusively from a single department (Community Medicine), restricts the generalizability of the findings to other disciplines and broader populations. The short-term focus on immediate pre- and post-test improvements does not allow conclusions regarding long-term knowledge retention or the sustained application of skills. The study's specific context, an online workshop, further limits the relevance of these findings to other educational settings, such as corporate training or formal academic programs. External factors such as participants' prior knowledge, learning habits, and external engagement were not controlled and may have influenced the results. Engagement metrics, while monitored, were not deeply analyzed to assess the quality of engagement, leaving gaps in understanding how adaptive e-learning drives active participation. Qualitative data, gathered through in-depth interviews, also risked response and interviewer biases, potentially skewing the themes of participant satisfaction and challenges. The absence of a control group using traditional e-learning methods prevents a direct comparison of adaptive e-learning's effectiveness relative to more conventional approaches. The study did not examine long-term real-world application or professional impact, leaving its practical relevance in professional settings unexplored. Future research should expand the sample size and include participants from a variety of disciplines to improve the generalizability of findings. Longitudinal studies that assess long-term retention and practical application of skills are essential for understanding the lasting impact of adaptive e-learning. Additionally, incorporating a control group using traditional e-learning methods would allow direct comparisons of effectiveness.
More comprehensive engagement metrics should be analyzed, including attention to the quality of interaction and external learning activities. Reducing potential biases in qualitative data collection through alternative methods, such as anonymous surveys or multiple interviewers, may provide more accurate insights into learner experiences. Examining how adaptive e-learning influences real-world outcomes, such as career progression or practical skill application, would deepen the understanding of its effectiveness in professional development contexts.
Conclusion
This study found that adaptive e-learning has a considerable impact on learning outcomes, especially when feedback on pre-test performance is provided. Participants who received their pre-test records performed notably better in post-tests than those who did not, suggesting that feedback is essential for optimizing learning gains. Qualitative analysis revealed that participants valued personalized learning, technology-driven tools, and self-directed approaches as the factors driving engagement and satisfaction. Adaptive e-learning should be extended to similar short-term professional workshops, with emphasis on mechanisms for personalized feedback. In addition, extending this work to other disciplines and using long-term follow-up assessments would provide a fuller understanding of lasting effects. Institutions should therefore integrate adaptive systems into diverse learning contexts to ensure that participants learn effectively, gain autonomy, and understand material better. Efficient delivery of online workshops will also make it possible to adapt learning content to learners' differing needs.
Suggestions
- Use an individualized feedback mechanism: Because participants who received feedback on their pre-test results showed statistically significant improvement in their post-test scores, organizers of online workshops are strongly recommended to include a robust system that provides detailed, personalized feedback to learners.
- Emphasize self-directed learning and E-library access: The high participant satisfaction attributed to E-library access and self-directed learning suggests that workshops should be designed to promote learner autonomy and access to digital resources.
- Adopt a technology-driven learning environment: The positive response to the use of technology in learning shows the value of using digital tools and platforms effectively to make online workshops interactive and engaging.
- Integrate reinforcement strategies: "Reinforcement" emerged as an important qualitative theme, suggesting that online workshop materials and designs should deliberately include repetition and practice opportunities to strengthen learning and deepen understanding.
- Apply adaptive e-learning to short-term professional training: The study's findings support the use of adaptive e-learning methods in short-term professional training sessions, given their beneficial effects on learner engagement and performance.
- Use adaptive systems across domains: Adaptive e-learning can be implemented in all disciplines, with emphasis on more efficient and adaptable delivery of learning material for different learner requirements.
Conflicts of Interest
Nil
Supporting File
References
- El-Sabagh HA. Adaptive e-learning environment based on learning styles and its impact on development students' engagement. International Journal of Educational Technology in Higher Education 2021;18(1):53. Available from: https://doi.org/10.1186/s41239-021-00289-4.
- Gligorea I, Cioca M, Oancea R, et al. Adaptive learning using artificial intelligence in e-learning: A literature review. Education Sciences 2023;13(12):1216. Available from: https://www.mdpi.com/2227-7102/13/12/1216.
- Fleming ND, Mills C. Not another inventory, rather a catalyst for reflection. To Improve the Academy 1992;11(1):137-155.
- Halkiopoulos C, Gkintoni E. Leveraging AI in E-learning: Personalized learning and adaptive assessment through cognitive neuropsychology—A systematic analysis. Electronics 2024;13(18):3762. Available from: https://www.mdpi.com/2079-9292/13/18/3762.
- Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: A systematic review and meta-analysis. JAMA 2011;306(9). Available from: http://jama.jamanetwork.com/article.aspx?doi=10.1001/jama.2011.1234.
- Van Der Vleuten CPM, Schuwirth LWT. Assessing professional competence: from methods to programmes. Med Educ 2005;39(3):309-17. Available from: https://onlinelibrary.wiley.com/doi/10.1111/j.1365-2929.2005.02094.x.
- Janelli M, Lipnevich AA. Effects of pre-tests and feedback on performance outcomes and persistence in Massive Open Online Courses. Computers & Education 2021;161:104076. Available from: https://linkinghub.elsevier.com/retrieve/pii/S0360131520302748.
- Thijssen DHJ, Hopman MTE, Van Wijngaarden MT, et al. The impact of feedback during formative testing on study behaviour and performance of (bio)medical students: a randomised controlled study. BMC Med Educ 2019;19(1):97. Available from: https://bmcmededuc.biomedcentral.com/articles/10.1186/s12909-019-1534-x.
- Alharbi A, Nurfianti A, Mullen RF, et al. The effectiveness of simulation-based learning (SBL) on students' knowledge and skills in nursing programs: a systematic review. BMC Med Educ 2024;24(1):1099. Available from: https://doi.org/10.1186/s12909-024-06080-z.
- Ossenberg C, Mitchell M, Henderson A. Impact of a work-based feedback intervention on student performance during clinical placements in acute care healthcare settings: a quasi-experimental protocol for the REMARK programme. BMJ Open 2020;10(6):e034945. Available from: https://bmjopen.bmj.com/lookup/doi/10.1136/bmjopen-2019-034945.